How to get errors/warnings in [UIImage initWithData:] - iPhone

I have an MJPEG stream over RTSP/UDP from which I want to generate JPEGs for a UIImageView with [UIImage initWithData:]. Most of the time this works well, but sometimes I get corrupt images and log messages like:
ImageIO: <ERROR> JPEGCorrupt JPEG data: premature end of data segment
My question is: how can I detect at runtime that such a message has occurred? Unfortunately initWithData: has no error output; is there any other way?
Thank you.
Edit: in this case, initWithData: does return a valid UIImage object, not nil!

There is a similar thread to this one on Stack Overflow: Catching error: Corrupt JPEG data: premature end of data segment.
Their solution is to check for the header bytes FF D8 and the trailing bytes FF D9. So, if you have image data in an NSData, you can check it like so:
- (BOOL)isJPEGValid:(NSData *)jpeg {
    if ([jpeg length] < 4) return NO;
    const uint8_t *bytes = (const uint8_t *)[jpeg bytes]; // unsigned, so the 0xFF comparisons behave
    if (bytes[0] != 0xFF || bytes[1] != 0xD8) return NO;
    if (bytes[[jpeg length] - 2] != 0xFF || bytes[[jpeg length] - 1] != 0xD9) return NO;
    return YES;
}
Then, to check whether JPEG data is invalid, just write:
if (![self isJPEGValid:myData]) {
    NSLog(@"Do something here");
}
Hope this helps!

The initWithData: method should return nil in such cases.
Try:
UIImage *myImage = [[UIImage alloc] initWithData:imgData];
if (!myImage) {
    // problem
}

I've encountered the same problem in this exact situation.
It turned out that I was passing an instance of NSMutableData to a global queue for decoding. During decoding, the data in the NSMutableData was overwritten by the next frame received from the network.
I've fixed the errors by passing a copy of the data. It might be better to use a buffer pool to improve performance:
NSData *dataCopy = [_receivedData copy];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    ZAssert([self isJPEGValid:dataCopy], @"JPEG data is invalid"); // should never happen
    UIImage *image = [UIImage imageWithData:dataCopy];
    dispatch_async(dispatch_get_main_queue(), ^{
        // show image
    });
});
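For the buffer-pool idea, a minimal sketch (illustrative; FramePool is a made-up class, manual retain/release to match the era). The network layer would check out a recycled NSMutableData for each incoming frame and check it back in once decoding finishes, avoiding a fresh allocation per frame:
@interface FramePool : NSObject {
    NSMutableArray *_idle; // buffers not currently being decoded
}
- (NSMutableData *)checkout;
- (void)checkin:(NSMutableData *)buffer;
@end

@implementation FramePool
- (id)init {
    if ((self = [super init])) {
        _idle = [[NSMutableArray alloc] init];
    }
    return self;
}
- (void)dealloc {
    [_idle release];
    [super dealloc];
}
- (NSMutableData *)checkout {
    @synchronized (self) {
        if ([_idle count] == 0) {
            return [NSMutableData data]; // pool empty: allocate a new buffer
        }
        NSMutableData *buffer = [[[_idle lastObject] retain] autorelease];
        [_idle removeLastObject];
        [buffer setLength:0]; // reset for the next frame
        return buffer;
    }
}
- (void)checkin:(NSMutableData *)buffer {
    @synchronized (self) {
        [_idle addObject:buffer];
    }
}
@end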

Upload video to FTP from iDevice

I am working on an app that lets users upload videos to our FTP server.
So far almost everything is done, but I have hit one issue: after users upload videos (.MOV), I cannot open and play the files.
The error message that QuickTime Player returns is "can't open because the movie's file format is not recognized".
In my code, I let users select videos using ALAssetsLibrary.
I then load the video into an ALAsset object and, before starting the upload, load the video into an NSInputStream from the ALAsset. Here is the code:
ALAssetRepresentation *rep = [currentAsset defaultRepresentation];
Byte *buffer = (Byte *)malloc(rep.size);
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
iStream = [NSInputStream inputStreamWithData:data];
[iStream open];
The next step is to set up an NSOutputStream, open it, and handle the upload with the following code:
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode) {
        case NSStreamEventNone:
        {
            break;
        }
        case NSStreamEventOpenCompleted:
        {
            // opened connection
            NSLog(@"opened connection");
            break;
        }
        case NSStreamEventHasBytesAvailable:
        {
            // should never happen for the output stream
            [self stopSendWithStatus:@"should never happen for the output stream"];
            break;
        }
        case NSStreamEventHasSpaceAvailable:
        {
            // If we don't have any data buffered, go read the next chunk of data.
            NSInteger bufferSize = 65535;
            uint8_t *buffer = malloc(bufferSize);
            if (bufferOffset == bufferLimit) {
                NSInteger bytesRead = [iStream read:buffer maxLength:bufferSize];
                if (bytesRead == -1) {
                    [self stopSendWithStatus:@"file read error"];
                } else if (bytesRead == 0) {
                    [self stopSendWithStatus:nil];
                } else {
                    bufferOffset = 0;
                    bufferLimit = bytesRead;
                }
            }
            // If we're not out of data completely, send the next chunk.
            if (bufferOffset != bufferLimit) {
                NSInteger bytesWritten = [oStream write:&buffer[bufferOffset] maxLength:bufferLimit - bufferOffset];
                if (bytesWritten == -1) {
                    [self stopSendWithStatus:@"file write error"];
                } else {
                    bufferOffset += bytesWritten;
                }
            }
            //NSLog(@"available");
            break;
        }
        case NSStreamEventErrorOccurred:
        {
            // stream open error
            [self stopSendWithStatus:[[aStream streamError] description]];
            break;
        }
        case NSStreamEventEndEncountered: // ignore
            NSLog(@"end");
            break;
    }
}
No error occurs, and the video file does upload to FTP with the correct size and name, but I just can't open it.
Does anybody have any clue?
I have made an NSInputStream implementation for streaming ALAsset objects - POSInputStreamLibrary. It doesn't read the whole 1 GB video into memory as your solution does, but reads the movie in chunks instead. Of course this is not the only feature of POSBlobInputStream. More info at my GitHub repository.
I know this probably isn't the answer you're looking for, but you should NOT use a direct FTP connection to let users upload files to your web server. It's insecure and slow compared with REST.
Instead, why not write a tiny bit of PHP to handle the upload, and POST the file from the app? For example:
$uploaddir = 'uploads/';
$file = basename($_FILES['file']['name']);
$uploadfile = $uploaddir . $file;
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile); // move the POSTed temp file into place
I also recommend using AFNetworking to handle the POST request: http://afnetworking.com/
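A rough sketch of such a POST with AFNetworking's 1.x-era API (the base URL, upload path, field name, and videoData are placeholders; check the docs for your AFNetworking version):
AFHTTPClient *client = [[AFHTTPClient alloc] initWithBaseURL:[NSURL URLWithString:@"http://example.com"]];
NSMutableURLRequest *request =
    [client multipartFormRequestWithMethod:@"POST"
                                      path:@"/upload.php"
                                parameters:nil
                constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        // Attach the movie bytes as the 'file' field the PHP script reads.
        [formData appendPartWithFileData:videoData
                                    name:@"file"
                                fileName:@"video.mov"
                                mimeType:@"video/quicktime"];
    }];
AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:request];
[client enqueueHTTPRequestOperation:operation];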
First of all, I guess you meant to reduce memory pressure by converting the ALAsset to an NSInputStream rather than NSData. But you convert it to NSData first and then wrap that NSData in an NSInputStream; that doesn't make sense and won't reduce memory use, because you have already loaded the whole video into memory as NSData.
So if you want to transfer your video via a stream in order to reduce memory pressure (or you have no choice, because your video is 2 GB or more), you should use CFStreamCreateBoundPair to upload the file chunk by chunk; see the excerpt from the Apple iOS Developer Library below:
For large blocks of constructed data, call CFStreamCreateBoundPair to create a pair of streams, then call the setHTTPBodyStream: method to tell NSMutableURLRequest to use one of those streams as the source for its body content. By writing into the other stream, you can send the data a piece at a time.
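A minimal sketch of that approach (untested; rep is the ALAssetRepresentation from the question and uploadURL is a placeholder): a background block pumps chunks from the asset into the write stream while the URL loading system drains the read stream as the request body.
CFReadStreamRef readStream = NULL;
CFWriteStreamRef writeStream = NULL;
CFStreamCreateBoundPair(kCFAllocatorDefault, &readStream, &writeStream, 64 * 1024);

NSInputStream *bodyStream = (__bridge_transfer NSInputStream *)readStream;
NSOutputStream *producer = (__bridge_transfer NSOutputStream *)writeStream;

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
[request setHTTPMethod:@"POST"];
[request setHTTPBodyStream:bodyStream]; // the URL loading system reads this end

[producer open];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    uint8_t chunk[64 * 1024];
    long long offset = 0;
    long long size = rep.size;
    while (offset < size) {
        // Pull the next chunk straight from the asset, never the whole file.
        NSUInteger bytesRead = [rep getBytes:chunk fromOffset:offset length:sizeof(chunk) error:NULL];
        if (bytesRead == 0) break;
        NSUInteger written = 0;
        while (written < bytesRead) {
            // An unscheduled stream blocks here until the reader makes room.
            NSInteger n = [producer write:chunk + written maxLength:bytesRead - written];
            if (n <= 0) { offset = size; break; } // write error: give up
            written += n;
        }
        offset += bytesRead;
    }
    [producer close];
});
// Then start the request (e.g. with NSURLConnection) while the producer runs.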
I have a Swift version of converting an ALAsset to an NSInputStream via CFStreamCreateBoundPair on GitHub. The key point is just as the documentation describes. Another reference is this question.
I hope this is helpful for you.

iOS - Returning a variable from a block

I have an iOS project which pulls images through a data source method. I would like to pull the image from the assets library (and the code chunk below does this just fine).
However, I need this data source method to return a UIImage, but when I use the assets library methods to get the image, the image is returned in a result block. Simply putting return image in the result block obviously does not work.
Does anyone have any idea how I can have the method return a UIImage from inside the result block? I have seen several other SO questions about returning images within blocks, but they all say to call another method. I, unfortunately, can't do that because this method is a Nimbus data source method which must return a UIImage.
Any help or advice would be greatly appreciated! Code below:
- (UIImage *)photoAlbumScrollView:(NIPhotoAlbumScrollView *)photoAlbumScrollView
                     photoAtIndex:(NSInteger)photoIndex
                        photoSize:(NIPhotoScrollViewPhotoSize *)photoSize
                        isLoading:(BOOL *)isLoading
          originalPhotoDimensions:(CGSize *)originalPhotoDimensions {
    __block UIImage *image = nil;
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:[_photos objectAtIndex:photoIndex]
                   resultBlock:^(ALAsset *asset) {
                       ALAssetRepresentation *rep = [asset defaultRepresentation];
                       CGImageRef imageRef = [rep fullScreenImage];
                       if (imageRef) {
                           image = [UIImage imageWithCGImage:imageRef];
                       }
                   }
                  failureBlock:^(NSError *error) {
                      //return nil;
                  }];
    return image;
}
You should keep an array with an entry for each image. When this data source method is first called for an index, you will not yet have an image there, so kick off the asset call and return a placeholder image. When the block returns, replace the placeholder with the asset image returned in the block. You may need to do this on the main queue using GCD.
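A rough sketch of that idea (imageCache and placeholderImage are illustrative names, and the Nimbus didLoadPhoto:atIndex:photoSize: callback should be checked against your Nimbus version):
- (UIImage *)photoAlbumScrollView:(NIPhotoAlbumScrollView *)photoAlbumScrollView
                     photoAtIndex:(NSInteger)photoIndex
                        photoSize:(NIPhotoScrollViewPhotoSize *)photoSize
                        isLoading:(BOOL *)isLoading
          originalPhotoDimensions:(CGSize *)originalPhotoDimensions {
    UIImage *cached = [self.imageCache objectForKey:[NSNumber numberWithInteger:photoIndex]];
    if (cached) {
        return cached; // already fetched on an earlier pass
    }
    *isLoading = YES;
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:[_photos objectAtIndex:photoIndex]
             resultBlock:^(ALAsset *asset) {
                 UIImage *image =
                     [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage]];
                 dispatch_async(dispatch_get_main_queue(), ^{
                     // Cache it, then tell Nimbus the real photo has arrived.
                     [self.imageCache setObject:image
                                         forKey:[NSNumber numberWithInteger:photoIndex]];
                     [photoAlbumScrollView didLoadPhoto:image
                                                atIndex:photoIndex
                                              photoSize:NIPhotoScrollViewPhotoSizeOriginal];
                 });
             }
            failureBlock:nil];
    return self.placeholderImage; // e.g. a gray stand-in until the block fires
}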
So I think I have a solution to your problem. The idea is to make use of a suspended dispatch queue, since you can block on it - it gives you a way to block a thread until something happens. It may require that your data source call NOT be made on the main thread, but you are going to have to play with this. Let's assume the object implementing photoAlbumScrollView is called 'obj'.
1) obj creates a serial dispatch queue (called queue)
2) the data source sends the [obj photoAlbumScrollView] message
3) photoAlbumScrollView does what it does now, but before returning suspends and waits on the queue
4) the final block resumes the queue, letting the wait finish
The code:
__block UIImage *image = nil;
ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
dispatch_queue_t queue = dispatch_queue_create("com.myApp.assetFetch", DISPATCH_QUEUE_SERIAL);
[assetslibrary assetForURL:[_photos objectAtIndex:photoIndex]
               resultBlock:^(ALAsset *asset) {
                   ALAssetRepresentation *rep = [asset defaultRepresentation];
                   CGImageRef imageRef = [rep fullScreenImage];
                   if (imageRef) {
                       image = [UIImage imageWithCGImage:imageRef];
                   }
                   dispatch_resume(queue);
               }
              failureBlock:^(NSError *error) {
                  dispatch_resume(queue);
              }];
dispatch_suspend(queue);
dispatch_sync(queue, ^{ NSLog(@"UNSUSPEND!"); }); // ultimately a block with just a ';' in it
dispatch_release(queue);
return image;
I obviously did not test this, but it or something close to it should work, assuming again that you can run this on a background thread and not on the main thread.

I am using the ZXingObjC library to read a barcode from an image, but it's not returning any result. The code I am using is below:

- (void)decode:(CVImageBufferRef)BufferRef
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    CGImageRef videoFrameImage = [ZXCGImageLuminanceSource createImageFromBuffer:BufferRef];
    CGImageRef rotatedImage = [self rotateImage:videoFrameImage degrees:0.0f];
    [NSMakeCollectable(videoFrameImage) autorelease];

    // Decoding:
    ZXLuminanceSource *source = [[[ZXCGImageLuminanceSource alloc] initWithCGImage:rotatedImage] autorelease];
    [NSMakeCollectable(rotatedImage) autorelease];
    ZXBinaryBitmap *bitmap = [ZXBinaryBitmap binaryBitmapWithBinarizer:[ZXHybridBinarizer binarizerWithSource:source]];

    NSError *error = nil;
    // There are a number of hints we can give to the reader, including
    // possible formats, allowed lengths, and the string encoding.
    ZXDecodeHints *hints = [ZXDecodeHints hints];
    ZXMultiFormatReader *reader = [ZXMultiFormatReader reader];
    ZXResult *result = [reader decode:bitmap
                                hints:hints
                                error:&error];
    if (result)
    {
        // The coded result as a string. The raw data can be accessed with
        // result.rawBytes and result.length.
        NSString *contents = result.text;

        // The barcode format, such as a QR code or UPC-A
        ZXBarcodeFormat format = result.barcodeFormat;
    }
    else
    {
        // Use error to determine why we didn't get a result, such as a barcode
        // not being found, an invalid checksum, or a format inconsistency.
    }
    [pool drain];
}
I ran into this problem, and it was easily resolved by downsampling the image that the camera generates. Apparently it's too high-resolution for the library to extract the barcode. Reducing my UIImage to 640x480 before calling [image CGImage] made everything work perfectly.
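A minimal downsampling sketch with UIKit (the 640x480 target comes from this answer; the scale factor of 1.0 is an assumption):
// Redraw the camera image into a smaller bitmap context.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(640, 480), NO, 1.0);
[image drawInRect:CGRectMake(0, 0, 640, 480)];
UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Feed smallImage.CGImage to ZXCGImageLuminanceSource instead of the original.
Note this ignores aspect ratio; in practice you would scale proportionally to avoid distorting the barcode.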
You've commented that you should use the error to determine why you didn't get a result, but you don't actually do that. Look at what's in error.
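For example, something as simple as this (standard NSError fields) would surface the failure reason:
if (!result) {
    NSLog(@"ZXing decode failed: %@ (domain %@, code %ld)",
          [error localizedDescription], [error domain], (long)[error code]);
}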

iOS NSArray inserting object at an index

I'm looking at the crash log for a device that's testing an app and I see the following lines...
objc_exception_throw + 33
[__NSArrayM insertObject:atIndex:] +187
The code where this happens is below. appData is an NSDictionary, and I'm expecting imageUrl to be a URL to a PNG file on the internet.
for (int i = 1; i <= [self getNumberOfScreenshots]; i++) {
    pathToUrl = @"screenshot_";
    pathToUrl = [pathToUrl stringByAppendingString:[[NSNumber numberWithInt:i] stringValue]];
    imageUrl = [self.appData valueForKey:pathToUrl];
    imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:imageUrl]];
    [NSMutableArrayObj addObject:imageData];
}
What would cause this type of error? The error happens very rarely. Could it be that imageData is sometimes nil because it fails to download the PNG image from the URL, so the exception is thrown when I try to add it to NSMutableArrayObj?
Thanks!
There may be two reasons (you have not described the error more specifically):
1) you are not actually allocating memory for the array;
2) you are inserting a nil value into the array. To stop inserting nil, do this:
if (imageData) {
    [NSMutableArrayObj addObject:imageData];
}
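Putting that guard into the original loop would look something like this (a sketch; names taken from the question):
for (int i = 1; i <= [self getNumberOfScreenshots]; i++) {
    NSString *key = [NSString stringWithFormat:@"screenshot_%d", i];
    NSString *imageUrl = [self.appData valueForKey:key];
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:imageUrl]];
    if (imageData) {
        [NSMutableArrayObj addObject:imageData];
    } else {
        NSLog(@"Download failed for %@, skipping", key); // avoid the NSArray nil exception
    }
}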

Truncated Core Data NSData objects

I am saving arrays of doubles in an NSData* object that is persisted as a binary property in a Core Data (SQLite) data model. I am doing this to store sampled data for graphing in an iPhone app. Sometimes, when there are more than 300 doubles in the binary object, not all of the doubles get saved to disk. When I quit and relaunch my app, there may be as few as 25 data points persisted, or as many as 300.
I am using NSSQLitePragmasOption with synchronous = FULL, and this may be making a difference. It is hard to tell, as the bug is intermittent.
Given the warnings about performance problems as a result of using synchronous = FULL, I am seeking advice and pointers.
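For reference, the pragma gets set when adding the persistent store, along these lines (a sketch; coordinator and storeURL stand for whatever your Core Data stack already uses):
NSDictionary *pragmas = [NSDictionary dictionaryWithObject:@"FULL" forKey:@"synchronous"];
NSDictionary *options = [NSDictionary dictionaryWithObject:pragmas forKey:NSSQLitePragmasOption];
NSError *error = nil;
[coordinator addPersistentStoreWithType:NSSQLiteStoreType
                          configuration:nil
                                    URL:storeURL
                                options:options
                                  error:&error];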
Thanks.
Edit: here is the code.
The (as yet unrealized) intent of -addToCache: is to add each new datum to the cache but only flush it into the Data object periodically.
From Data.m
@dynamic dataSet; // NSData * attribute of the Data entity

- (void)addDatum:(double_t)datum
{
    DLog(@"-[Data addDatum:%f]", datum);
    [self addToCache:datum];
}

- (void)addToCache:(double_t)datum
{
    if (cache == nil)
    {
        cache = [NSMutableData dataWithData:[self dataSet]];
        [cache retain];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    DLog(@"-[Data addToCache:%f] ... [cache length] = %d; cache = %p", datum, [cache length], cache);
    [self flushCache];
}

- (void)wrapup
{
    DLog(@"-[Data wrapup]");
    [self flushCache];
    [cache release];
    cache = nil;
    DLog(@"[self isFault] = %@", [self isFault] ? @"YES" : @"NO"); // [self isFault] is always NO.
}

- (void)flushCache
{
    DLog(@"flushing cache to store");
    [self setDataSet:cache];
    DLog(@"-[Data flushCache:] [[self dataSet] length] = %d", [[self dataSet] length]);
}

- (double *)bytes
{
    return (double *)[[self dataSet] bytes];
}

- (NSInteger)count
{
    return [[self dataSet] length] / sizeof(double);
}

- (void)dump
{
    ALog(@"Dump Data");
    NSInteger numDataPoints = [self count];
    double *data = (double *)[self bytes];
    ALog(@"numDataPoints = %d", numDataPoints);
    for (int i = 0; i < numDataPoints; i++)
    {
        ALog(@"data[%d] = %f", i, data[i]);
    }
}
I was trying to get behavior as if my Core Data entity could have an NSMutableData attribute. To do this, my NSManagedObject (called Data) had an NSData attribute and an NSMutableData ivar. My app takes sample data from a sensor and appends each data point to the data set; this is why I needed this design.
Each new data point was appended to the NSMutableData, and then the NSData attribute was set to the NSMutableData.
I suspect that because the NSData pointer wasn't changing (though its content was), Core Data did not appreciate the amount of change. Calling -hasChanges on the NSManagedObjectContext showed that there had been changes, and -updatedObjects even listed the Data object as having changed. But the actual data being written was (sometimes) truncated.
To work around this I changed things slightly. New data points are still appended to the NSMutableData, but the NSData attribute is only set when sampling is completed. This means there is a chance that a crash could result in truncated data, but for the most part this workaround seems to have solved the problem.
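A sketch of that workaround against the code above (pre-ARC, same ivars; the copy hands Core Data a distinct immutable NSData instance):
- (void)addDatum:(double_t)datum
{
    if (cache == nil)
    {
        cache = [[NSMutableData alloc] initWithData:[self dataSet]];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    // No flush here: the modeled attribute is untouched while sampling runs.
}

- (void)wrapup
{
    // Set the attribute exactly once, with an immutable snapshot, so the
    // change registers unambiguously with Core Data.
    [self setDataSet:[[cache copy] autorelease]];
    [cache release];
    cache = nil;
}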
Caveat emptor: the bug was always intermittent, so it is possible that it is still there, just harder to trigger.