I found a memory leak while testing my app on an iOS device. Look at the code below:
- (void)_startReceive
// Starts a connection to download the current URL.
{
// Open a CFFTPStream for the URL.
CFReadStreamRef ftpStream = CFReadStreamCreateWithFTPURL(NULL, (CFURLRef) url);
assert(ftpStream != NULL);
self.networkStream = (NSInputStream *) ftpStream;
self.networkStream.delegate = self;
[self.networkStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:RUNLOOPMODEL];
[self.networkStream open];
CFRelease(ftpStream);
}
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
// An NSStream delegate callback that's called when events happen on our
// network stream.
{
if (self.networkStream == nil) { //EXC_BAD_ACCESS(code = 1,address=......)
NSLog(@"here");
}
switch (eventCode) {
case NSStreamEventOpenCompleted: {
} break;
case NSStreamEventHasBytesAvailable: {
NSInteger bytesRead;
uint8_t buffer[LISTDOCBUFFER];
......
}
I use this code to make an FTP request for document information. But only sometimes (about one time in eight) does the EXC_BAD_ACCESS happen at the line I noted, and it never happened when testing on the iOS simulator. I want to know the possible reason and how to fix it.
The reason could be anything, but it is most likely invalid memory management. You can analyze your project in Xcode (Product > Analyze) to see where the memory problem is actually happening, or run Profile (Product > Profile) to hunt for particular leaks with Instruments. Check out this link; it is a really good topic on how to debug memory-related issues.
After type-casting ftpStream to NSInputStream you release it (CFRelease(ftpStream)) and then use it again in if (self.networkStream == nil). Do not call CFRelease() on ftpStream; instead, release the NSInputStream once you are done with it.
Related
I am working on an app that lets users upload videos to our FTP server.
So far almost everything is done, but I have met one issue: after users upload videos (.MOV), I fail to open and play the files.
The error message QuickTime Player returns is "can't open because the movie's file format is not recognized".
In my code, I let users select videos using ALAssetsLibrary.
Then I load the video into an ALAsset object and, before starting the upload, load the video into an NSInputStream from the ALAsset. Here is the code:
ALAssetRepresentation *rep = [currentAsset defaultRepresentation];
Byte *buffer = (Byte*)malloc(rep.size);
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
iStream = [NSInputStream inputStreamWithData:data];
[iStream open];
The next step is to set up an NSOutputStream, open it, and handle the upload with the following code:
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
switch (eventCode) {
case NSStreamEventNone:
{
break;
}
case NSStreamEventOpenCompleted:
{
//opened connection
NSLog(@"opened connection");
break;
}
case NSStreamEventHasBytesAvailable:
{
// should never happen for the output stream
[self stopSendWithStatus:@"should never happen for the output stream"];
break;
}
case NSStreamEventHasSpaceAvailable:
{
// If we don't have any data buffered, go read the next chunk of data.
// (buffer is an instance variable -- e.g. uint8_t buffer[65536]; -- it must
// persist across events, or bytes read but not yet written would be lost)
NSInteger bufferSize = 65536;
if (bufferOffset == bufferLimit) {
NSInteger bytesRead = [iStream read:buffer maxLength:bufferSize];
if (bytesRead == -1) {
[self stopSendWithStatus:@"file read error"];
} else if (bytesRead == 0) {
[self stopSendWithStatus:nil];
} else {
bufferOffset = 0;
bufferLimit = bytesRead;
}
}
// If we're not out of data completely, send the next chunk.
if (bufferOffset != bufferLimit) {
NSInteger bytesWritten = [oStream write:&buffer[bufferOffset] maxLength:bufferLimit - bufferOffset];
if (bytesWritten == -1) {
[self stopSendWithStatus:@"file write error"];
} else {
bufferOffset += bytesWritten;
}
}
//NSLog(@"available");
break;
}
case NSStreamEventErrorOccurred:
{
//stream open error
[self stopSendWithStatus:[[aStream streamError] description]];
break;
}
case NSStreamEventEndEncountered: //ignore
NSLog(#"end");
break;
}
}
No error occurs; the video file does upload to the FTP server with the correct size and name, but it just can't be opened.
Does anybody have a clue?
I have made an NSInputStream implementation for streaming ALAsset objects: POSInputStreamLibrary. It doesn't read the whole 1 GB video into memory as your solution does; it reads the movie in chunks instead. Of course, that is not the only feature of POSBlobInputStream. More info at my GitHub repository.
I know this probably isn't the answer you're looking for, but you should NOT let users upload files to your web server over a direct FTP connection. It's insecure and slow compared with REST.
Instead, why not write a tiny bit of PHP to handle the upload, and POST the file from the app? Here:
$uploaddir = 'uploads/';
$file = basename($_FILES['file']['name']);
$uploadfile = $uploaddir . $file;
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile);
I also recommend using AFNetworking to handle the POST request http://afnetworking.com/
First of all, I guess you meant to reduce memory use by converting the ALAsset to an NSInputStream rather than to NSData. But you convert it to NSData first and then convert that NSData to an NSInputStream; that doesn't make sense and won't reduce memory use, because you have already loaded the whole video into memory as NSData.
So if you want to transfer your video via a stream to reduce memory pressure (or you have no choice because your video is 2 GB or more), you should use CFStreamCreateBoundPair to upload the file chunk by chunk; see the Apple iOS Developer Library passage quoted below:
For large blocks of constructed data, call CFStreamCreateBoundPair to create a pair of streams, then call the setHTTPBodyStream: method to tell NSMutableURLRequest to use one of those streams as the source for its body content. By writing into the other stream, you can send the data a piece at a time.
I have a Swift version of converting an ALAsset to an NSInputStream via CFStreamCreateBoundPair on GitHub. The key point is exactly what the documentation describes. Another reference is this question.
Hope it helps.
I have an app that allows automatic login for a user. What my boss wants is that when the app launches, it tests connectivity to our server (not just whether there is a WiFi or 3G network). I borrowed Apple's Reachability sample, and it works.
The problem is that it takes too long, especially at launch. On a local wireless network without an internet connection, it took almost half a minute. And since auto-login is called in -viewDidLoad, the UI didn't load for that half minute. Taking that long is simply not acceptable; worse, if the app takes too long to launch, iOS may shut it down.
Furthermore, I have a lot of web service calls in my app. I understand that each call can fail, because an iPhone/iPad can lose or regain connectivity even when the user takes a short walk from one place to another. I personally don't like it, but it's what my boss wants, and I'll probably have to test the connection quite often in the app.
So what I'm looking for is either a way to detect connectivity really fast (within seconds) or a way to do it behind the scenes without affecting the user experience.
Does anybody have suggestions?
Thank you for taking the time to read my question.
Present an intermediate loading view with presentModalViewController: while the test takes place, and run the actual test with performSelectorInBackground:. Then signal back to the main thread with performSelectorOnMainThread::
- (void)viewDidLoad
{
[super viewDidLoad];
// Make this a hidden member
loadingViewController = [LoadingViewController new];
[self presentModalViewController:loadingViewController animated:NO];
[self performSelectorInBackground:@selector(testConnectivity) withObject:nil];
}
- (void)testConnectivity
{
// Do expensive testing stuff
[self performSelectorOnMainThread:@selector(testCompleted) withObject:nil waitUntilDone:NO];
}
- (void)testCompleted
{
[loadingViewController dismissViewControllerAnimated:YES completion:nil];
loadingViewController = nil;
}
Note that the overall user experience of waiting 30 seconds for the application to start is rather poor, and connectivity often changes while an app is in use, so even if you run the test at every startup it's unlikely to be reliable. But if it is what your boss wants, I suffer with you. ;)
For what it's worth, here is the method I use..
- (BOOL)checkServerAvailability {
bool success = false;
const char *host_name = [@"host" cStringUsingEncoding:NSASCIIStringEncoding];
SCNetworkReachabilityRef reachability =
SCNetworkReachabilityCreateWithName(NULL, host_name);
SCNetworkReachabilityFlags flags;
success = SCNetworkReachabilityGetFlags(reachability, &flags);
bool isAvailable = success && (flags & kSCNetworkFlagsReachable)
&& !(flags & kSCNetworkFlagsConnectionRequired);
CFRelease(reachability);
return isAvailable;
}
Here is the log output of my server-check method:
2012-07-11 11:29:04.892 ASURecycles[1509:f803] starting server check...
2012-07-11 11:29:04.894 ASURecycles[1509:f803] creating reachability...
2012-07-11 11:29:04.894 ASURecycles[1509:f803] creating flags...
2012-07-11 11:29:04.913 ASURecycles[1509:f803] checking for success of reachability, assigning to flags..
2012-07-11 11:29:04.913 ASURecycles[1509:f803] checking our flags to determine network status...
2012-07-11 11:29:04.913 ASURecycles[1509:f803] not available
As you can see, fractions of a second. Of course, our server is undergoing trouble at the moment, but I think your issue may be a server issue, not a framework issue. You could always try executing the server check on another thread, though.
As I said in the comments, you should use Hampus Nilsson's approach and perform the request in the background anyway.
Regarding your 30-second problem, I found this on another blog:
- (BOOL)isHostAvailable:(NSString*)hostName
{
// this should check the host, but it does not work in the simulator (i.e. it returns YES when it should be NO)
SCNetworkReachabilityRef reachability = SCNetworkReachabilityCreateWithName(NULL, [hostName cStringUsingEncoding:NSASCIIStringEncoding]);
SCNetworkReachabilityFlags flags;
BOOL success = SCNetworkReachabilityGetFlags(reachability, &flags);
if (reachability) {
CFRelease(reachability);
}
if ( ( success && (flags & kSCNetworkFlagsReachable) && !(flags & kSCNetworkFlagsConnectionRequired) ) == NO) {
return NO;
}
// we know at least the network is up, second check for a known page
NSData *dataReply;
NSURLResponse *response;
NSError *error = nil;
// create the request
NSString *urlString = [NSString stringWithFormat:@"http://%@/index.php", hostName];
NSURLRequest *theRequest=[NSURLRequest requestWithURL:[NSURL URLWithString:urlString]
cachePolicy:NSURLRequestReloadIgnoringLocalCacheData
timeoutInterval:8.0];
// Make the connection
dataReply = [NSURLConnection sendSynchronousRequest:theRequest returningResponse:&response error:&error];
if (response != nil) {
NSLog(@"SNNetworkController.isHostAvailable %@", response);
return YES;
} else {
// inform the user that the download could not be made
NSLog(@"SNNetworkController.isHostAvailable %@ %@", response, error);
return NO;
}
}
This will perform the request with a timeout of 8 seconds.
Edit:
ASIHTTPRequest example:
ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:url];
[request setNumberOfTimesToRetryOnTimeout:3];
[request setTimeOutSeconds:20.0];
[request setRequestMethod:@"POST"];
[request startAsynchronous];
I've been trying and trying to get Apple's SimpleFTPSample to work for me, but I just can't. Here's the code I've been using:
BOOL success;
CFWriteStreamRef ftpStream;
NSLog(@"Hello");
// assert(self.networkStream == nil); // don't tap send twice in a row!
// assert(self.fileStream == nil); // ditto
// First get and check the URL.
url = [NSURL URLWithString:hostname];
success = (url != nil);
NSLog(@"After first success");
if (success) {
NSLog(@"In the success if");
// Add the last part of the file name to the end of the URL to form the final
// URL that we're going to put to.
if(fileName) NSLog(@"Filename.");
if(url) NSLog(@"url");
url = [NSMakeCollectable(
CFURLCreateCopyAppendingPathComponent(NULL, (CFURLRef) url,
//(CFStringRef) [fileName lastPathComponent],
(CFStringRef)pngPath,
false)
) autorelease];
[url retain];
NSLog(@"After url=");
success = (url != nil);
assert(success);
NSLog(@"End of success if");
}
// If the URL is bogus, let the user know. Otherwise kick off the connection.
if ( ! success) {
NSLog(@"Invalid URL");
} else {
// Open a stream for the file we're going to send. We do not open this stream;
// NSURLConnection will do it for us.
self.fileStream = [NSInputStream inputStreamWithFileAtPath:pngPath];
assert(self.fileStream != nil);
[self.fileStream open];
// Open a CFFTPStream for the URL.
ftpStream = CFWriteStreamCreateWithFTPURL(NULL, (CFURLRef) url);
if(!ftpStream) NSLog(@"ftpStream is null");
assert(ftpStream != NULL);
self.networkStream = (NSOutputStream *) ftpStream;
#pragma unused (success) //Adding this to appease the static analyzer.
success = [self.networkStream setProperty:@"JEFF" forKey:(id)kCFStreamPropertyFTPUserName];
assert(success);
success = [self.networkStream setProperty:@"jeffrack" forKey:(id)kCFStreamPropertyFTPPassword];
assert(success);
self.networkStream.delegate = self;
[self.networkStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[self.networkStream open];
// Have to release ftpStream to balance out the create. self.networkStream
// has retained this for our persistent use.
CFRelease(ftpStream);
// Tell the UI we're sending.
[self _sendDidStart];
}
}
In this example, hostname is the IP address of the FTP server. The problem I keep getting is that at my NSLog(@"After url=") line, url keeps being null, and I can't figure out why it is created as null. Also, pngPath is the NSString path where the image I'm trying to upload is located. Does anybody know a simple way of editing their sample so that it works? I've been at this for a couple of weeks and just can't get it working.
Also, I can't find anything in the API for sending in passive mode; does anybody know how I could make that work too? Thanks!
Operations with NSString are problematic for scheme prefixes ("ftp://", for instance), because as soon as you modify the NSString, any run of "/" characters gets collapsed into a single "/".
In fact, it becomes problematic whenever you use any "pathComponent" or "path" method of NSString; those return not an NSString but an NSPathStore2 object, and it is that class (NSPathStore2) that does funky things with "/".
Then, when you try to build an NSURL from it, it fails because the start of the address has become "ftp:/myhost/mypath/myfile.ext": one "/" is missing after "ftp:".
To avoid this problem, here's what I did, quickly:
NSString *myPath = [[@"ftp:%2F%2FmyFtpDomain/myPath"
stringByAppendingPathComponent:@"myFilename.ext"]
stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
I escaped the "/" characters in the address with their escape code %2F at creation time, and unescaped them at the end.
For passive mode, use the kCFStreamPropertyFTPUsePassiveMode property of your NSStream:
[_outputStream setProperty:( _passiveMode ? (id)kCFBooleanTrue : (id)kCFBooleanFalse ) forKey:(id)kCFStreamPropertyFTPUsePassiveMode]
All,
I've narrowed it down to this point by commenting code out, setting breakpoints, etc. The program crashes at the marked code.
-(void) initNetworkCommunication
{
CFReadStreamRef readStream;
CFWriteStreamRef writeStream;
CFStreamCreatePairWithSocketToHost(NULL, (CFStringRef)@"192.168.17.1", 2004, &readStream, &writeStream);
inputStream = (NSInputStream *)readStream;
outputStream = (NSOutputStream *)writeStream;
[inputStream setDelegate:self];
[outputStream setDelegate:self];
[inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[outputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[inputStream open];//WHY MUST YOU CRASH HERE
[outputStream open];//WHY MUST YOU CRASH HERE ALSO!!?!?
NSLog(@"She be opened, sir!");
}
It doesn't crash if I comment out both of those lines, but it crashes if I comment out only one of them (i.e., each of them causes the crash on its own). There is no information posted in the debugger either. All it does is send me to main.m and show
"Thread 1: Program received signal: EXC_BAD_ACCESS".
Thanks for the help in advance!
Edit: here is my delegate method, but it doesn't even log the stream event when a stream is opened.
- (void)stream:(NSStream *)theStream handleEvent:(NSStreamEvent)streamEvent {
NSLog(@"stream event %i", streamEvent); //this doesn't post in the log when the stream is opened...
switch (streamEvent) {
case NSStreamEventOpenCompleted:
NSLog(@"Stream opened");
break;
case NSStreamEventHasBytesAvailable:
if (theStream == inputStream) {
uint8_t buffer[1024];
int len;
while ([inputStream hasBytesAvailable]) {
len = [inputStream read:buffer maxLength:sizeof(buffer)];
if (len > 0) {
NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
if (nil != output) {
NSLog(@"server said: %@", output);
//[self messageReceived:output];
}
}
}
}
break;
case NSStreamEventErrorOccurred:
NSLog(@"Can not connect to the host!");
break;
case NSStreamEventEndEncountered:
[theStream close];
[theStream removeFromRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
//[theStream release];
theStream = nil;
break;
default:
NSLog(@"Unknown event");
}
}
What is happening is that the instance of the delegate class is being deallocated (causing EXC_BAD_ACCESS in the run loop), either because you didn't retain it or because you are using ARC (pretty likely) and hold no reference to it.
The solution is either to retain the delegate class, approximately like so:
SomeStreamDelegate *theDelegate = [[SomeStreamDelegate alloc] init];
[theDelegate retain];
Or, if you do have ARC enabled, make an instance variable in the class where you alloc the delegate and store your instance there. That way ARC will not dealloc it, because the instance variable counts as a strong reference.
If you are using ARC, cast the streams like this:
inputStream = (__bridge NSInputStream *)readStream;
outputStream = (__bridge NSOutputStream *)writeStream;
This should prevent the crash. Also keep in mind that if your streams are owned by a thread other than the main thread, that thread's run loop needs to be run manually (call -run) after opening the streams.
When I placed this in my view controller (and not in a separate class), it worked perfectly.
I had a similar issue where my app would crash in the -stream:handleEvent: callback with a huge streamEvent number. I resolved it by making sure I initialize the NSStream objects (input and output) AND open the connection to the server in the -init method of the NetworkClient object my view controller uses.
Try this once:
NSInputStream *inputStream = objc_unretainedObject(readStream);
It may be a casting issue.
I'm trying to download several images in response to a single HTTP request. On the server side (Java) I'm using O'Reilly's multipart response, and in my iPhone simulator I'm getting my data in didReceiveData (approximately one call per image) after a call to didReceiveResponse (also approximately one call per image) in my delegate.
The problem is this "approximately"... Has anyone ever managed to handle multipart/x-mixed-replace correctly with the iPhone SDK? If yes, what is the best strategy here? Should I play with the expected length, on the server side or on the client side? Should I wait until I've received everything? Even that doesn't seem enough, as the calls to didReceiveData happen in a random order (I'm requesting picture1, picture2 and I sometimes receive picture2, picture1 even though the order is respected on the server side!). Should I temporize between pictures on the server side?
Or should I drop multipart/x-mixed-replace? What would be the easiest alternative then?
That's a lot of questions, but I'm really stuck here. Thanks for your help!
I'm not sure what your final use for the images is, but the intended purpose of the multipart/x-mixed-replace content type is for each received part to completely replace the previously received responses. Think of it like frames of a video: only one picture is displayed at a time, and the previous ones are discarded.
Temporizing is almost never a foolproof solution. Especially on the iPhone, you're going to encounter an unimaginable variety of network situations, and relying on a magic-number delay between frames will probably still fail some of the time.
Since you control the server, I'd recommend dropping the multipart response. Just make sure that when you send multiple requests to the server, you don't block the main thread of your iPhone app; use NSOperations or an alternative HTTP library (like ASIHTTPRequest) to make your image fetches asynchronous.
I did that successfully using the code below. The important thing is to create two buffers to receive your data. If you use only one, you will get concurrent-access problems (stream access vs. JPEG codec access) and corrupted JPEG data.
Do not hesitate to ask me for more details.
- (IBAction)startDowload:(id)sender {
NSURL *url = [NSURL URLWithString:@"http://210.236.173.198/axis-cgi/mjpg/video.cgi?resolution=320x240&fps=5"];
NSMutableURLRequest *req = [NSMutableURLRequest requestWithURL:url];
[req setHTTPMethod:@"GET"];
/*
I create 2 NSMutableData buffers. This points is very important.
I swap them each frame.
*/
receivedData = [[NSMutableData data] retain];
receivedData2 = [[NSMutableData data] retain];
currentData = receivedData;
urlCon = [[NSURLConnection alloc] initWithRequest:req delegate:self];
noImg = YES;
frame = 0;
}
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSHTTPURLResponse *)response
{
// this method is called when the server has determined that it
// has enough information to create the NSURLResponse
// it can be called multiple times, for example in the case of a
// redirect, so each time we reset the data.
// receivedData is declared as an instance variable elsewhere
UIImage *_i;
@try
{
_i = [UIImage imageWithData:currentData];
}
@catch (NSException *e)
{
NSLog(@"%@", [e description]);
}
@finally
{
}
[imgView setImage:_i];
[imgView setNeedsDisplay];
[[self view] setNeedsDisplay];
/*
Buffers swap
*/
if (currentData == receivedData)
{
currentData = receivedData2;
}
else
{
currentData = receivedData;
}
[currentData setLength:0];
}
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
// append the new data to the currentData (NSData buffer)
[currentData appendData:data];
}