writeImageToSavedPhotosAlbum saves only a few images - iPhone

I've been following Apple's example, QA1702, on how to capture images using AVFoundation. I won't cite the code here because of space concerns. A brief description of what I'm trying to achieve:
Use the iPhone camera to pass a "video" (actually a sequence of images) to a web server; I know this is possible. However, in order to pass an image using HTTP POST as in this example, I have to save the image, not necessarily to the photos album, but I want to be able to view the pictures there as well for debugging purposes.
Apple's QA1702 contains 3 methods:
- (void)setupCaptureSession
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
//this is modified to be void as you might see, will get back to this
- (void) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
In setupCaptureSession I start the session as in the example. captureOutput only runs imageFromSampleBuffer, and that's where I've added some changes:
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
// Unlock the pixel buffer
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
// Free up the context and color space
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
//library is declared in .h and is a ALAssetsLibrary
[library writeImageToSavedPhotosAlbum:quartzImage orientation:ALAssetOrientationDown completionBlock:nil];
// Release the Quartz image
CGImageRelease(quartzImage);
I've removed the creation of the UIImage and changed the return type to void, since I call writeImageToSavedPhotosAlbum: with the CGImageRef here instead.
The problem as I see it is that during the 10 seconds that I capture images, ~150 calls to captureOutput are made, and therefore the same number of calls to writeImageToSavedPhotosAlbum, but only ~5-10 pictures are saved. I'm aware of the memory abuse this causes, but since I'm not getting any warnings I can't figure out why more images aren't created, and what I can do about it. Is it because (and I'm only guessing now) writeImageToSavedPhotosAlbum starts new threads and the iPhone can't handle more than a certain number of threads? I've read something about NSOperationQueue; should I look into it?
On a side note, I use an NSTimer in setupCaptureSession:
[NSTimer scheduledTimerWithTimeInterval:10.0 target:self selector:@selector(timerFireMethod:) userInfo:nil repeats:NO];
However, I want to start it in the first call to captureOutput in order to avoid time elapsing during the startup of the video camera, but if I move this line to captureOutput, then timerFireMethod: is never called. Any ideas?

This is solvable with NSOperationQueue, but it's no longer interesting to me since writing to a file is far too inefficient for most applications.
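For anyone who lands here anyway, this is roughly what the NSOperationQueue route could look like. It is only a sketch: saveQueue is an assumed ivar (not part of Apple's QA1702 code), and the semaphore is just one way to make each queued operation wait for its asynchronous write to finish before the next one starts.
// Created once, e.g. in setupCaptureSession; a width of 1 serializes the saves.
saveQueue = [[NSOperationQueue alloc] init];
[saveQueue setMaxConcurrentOperationCount:1];

// In imageFromSampleBuffer:, instead of writing and releasing immediately:
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
[saveQueue addOperationWithBlock:^{
    dispatch_semaphore_t done = dispatch_semaphore_create(0);
    [library writeImageToSavedPhotosAlbum:quartzImage
                              orientation:ALAssetOrientationDown
                          completionBlock:^(NSURL *assetURL, NSError *error) {
                              CGImageRelease(quartzImage); // released here, not right after the call
                              dispatch_semaphore_signal(done);
                          }];
    dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
    dispatch_release(done); // pre-ARC, matching the question's code
}];
With the saves funnelled through one slot, frames arriving faster than the library can write simply pile up in the queue instead of being silently lost; in practice you would also want to drop frames, which is exactly why writing every frame to disk ends up being too inefficient.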

Related

Can't display a texture loaded asynchronously

In my game I have to change the background of the scene during gameplay. When I set a new texture for the background, the game slows down for a moment. To avoid this, I'm trying to preload a texture asynchronously and then show it on the main thread. This is how I do that:
NSString *filename = [NSString stringWithFormat:@"res/src/level_%i/background.png", [GameLevel sharedGameLevel].currentLevelIndex + 1];
__block CCTexture2D *texture;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSLog(@"FILENAME %@", filename);
    [[CCTextureCache sharedTextureCache] addImage:filename];
    NSLog(@"%@", [CCTextureCache sharedTextureCache]);
    dispatch_async(dispatch_get_main_queue(), ^{
        texture = [[CCTextureCache sharedTextureCache] textureForKey:filename];
        [spareBackground setTexture:texture];
        [dayBackground runAction:[CCSequence actions:fadeOut, [CCCallBlockN actionWithBlock:^(CCNode *node)
        {
            NSLog(@" TEXTURE %@", texture);
            [dayBackground setTexture:texture];
            CCFadeIn *fadeIn = [[[CCFadeIn alloc] initWithDuration:5] autorelease];
            [dayBackground runAction:fadeIn];
        }], nil]];
    });
});
But instead of the background I always get a blank screen, even though the texture has been successfully loaded (it's not nil). This code works just fine if the texture is loaded on the main thread without using GCD. What am I missing?
My suspicion is that CCTextureCache is not thread-safe (and being a shared object it would need to be thread-safe in order to be safely called from another thread).
Cocos2d, on the other hand, already provides a mechanism for loading a texture asynchronously, so you might use that instead. This should be the signature:
[[CCTextureCache sharedTextureCache] addImageAsync:filename target:self selector:@selector(textureLoaded:)];
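For completeness, the selector receives the loaded texture once cocos2d has finished caching it, so the callback could look roughly like this (textureLoaded: is just the hypothetical name used above, and spareBackground comes from the question's code):
- (void)textureLoaded:(CCTexture2D *)texture
{
    // cocos2d delivers this callback back where it is safe to touch the scene
    // (the main/GL thread in the stock implementation), so no extra dispatch is needed.
    [spareBackground setTexture:texture];
}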
If you can afford to trade 'space' for 'time', i.e. your game's memory footprint is reasonable, preload both textures at the start of the scene, and during gameplay just flip the textures' visibility.
Also, I found that loading PVR textures instead of PNGs runs considerably faster (i.e. consumes less CPU, I guess). You could try that as a first attempt, hoping that the 'pause' would then be acceptable from a user-experience standpoint.
P.S. Any comments I make on performance are based on actual measurements and tests on real devices; the simulator is useless. Make certain this 'pause' is present on devices before spending any effort on optimizing.

How Do I speed Up Image Loading Using The Assets Library?

I am writing an app that is a clone of UIImagePicker but uses the Assets Library. When the user selects a photo, it takes a little too long for the image to load. I notice that when I use the Photos app, which has functionality identical to what I'm developing, the image loading is a bit faster. I've heard another responder on this site mention the following in order to mimic the functionality of the Photos app:
"Load the thumbnail image first (best with dispatch_async) - that should be really quick. When this has completed, load the fullscreen image like you did above. This is what apple does in the Photo App to provide a smooth user experience."
Does anyone have any code samples of how this can be accomplished? I'm not quite sure that I understand what he means.
Also, here is the code I'm using to load an image (I'm passing the image as a parameter to another view controller):
myImage = [UIImage imageWithCGImage:[[myAsset defaultRepresentation] fullScreenImage]];
The class ALAsset has two methods to obtain thumbnails:
- (CGImageRef)thumbnail
- (CGImageRef)aspectRatioThumbnail
I bet they are faster than obtaining the full-screen-sized version of the asset.
Also, you can wrap them in an async operation. Be sure to update the UI on the main thread. Roughly like this:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    /* obtain the image here */
    dispatch_async(dispatch_get_main_queue(), ^{
        /* update screen here */
    });
    [pool drain];
});
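Tying that back to the question's code, a concrete version of the skeleton might look like this; myImageView is a hypothetical stand-in for wherever the picked image ends up, and the pattern is thumbnail first, full-screen image afterwards:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Fast: show the small thumbnail as a placeholder right away.
    UIImage *thumb = [UIImage imageWithCGImage:[myAsset aspectRatioThumbnail]];
    dispatch_async(dispatch_get_main_queue(), ^{
        myImageView.image = thumb;
    });

    // Slow: decode the full-screen representation and swap it in when ready.
    UIImage *fullScreen = [UIImage imageWithCGImage:[[myAsset defaultRepresentation] fullScreenImage]];
    dispatch_async(dispatch_get_main_queue(), ^{
        myImageView.image = fullScreen;
    });

    [pool drain];
});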
If you need to obtain thumbnails for videos you should use AVAssetImageGenerator. It has a method to obtain them asynchronously.
Look for Apple sample code (AVEditDemo and probably others working with assets library).
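The asynchronous call on AVAssetImageGenerator looks roughly like this; videoAsset is a hypothetical AVAsset for the video, and release/autorelease details are left out of the sketch:
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:videoAsset];
generator.appliesPreferredTrackTransform = YES; // respect the video's orientation
NSValue *time = [NSValue valueWithCMTime:CMTimeMakeWithSeconds(1.0, 600)];
[generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:time]
                                completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                                                    AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumbnail = [UIImage imageWithCGImage:image];
        dispatch_async(dispatch_get_main_queue(), ^{
            /* update screen here, as above */
        });
    }
}];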

AVCaptureVideoPreviewLayer: taking a snapshot

I'm trying to emulate the animation seen in the default camera app, where a snapshot of the camera's viewfinder is animated into the corner of the app's display.
The AVCaptureVideoPreviewLayer object that holds the key to solving this problem isn't very open to these requirements: trying to create a copy of it in a new layer with ..
- (id)initWithLayer:(id)layer
.. returns an empty layer, without the image snapshot, so clearly there is some deeper magic going on here.
Your clues/boos are most welcome.
M.
Facing the same woes, from a slightly different angle.
Here are possible solutions, none of which are too great IMO:
You can add both an AVCaptureStillImageOutput and an AVCaptureVideoDataOutput to an AVCaptureSession. When you set the sessionPreset to AVCaptureSessionPresetHigh you'll start getting frames from the API, and when you switch to AVCaptureSessionPresetPhoto you can take real images. So right before taking the picture, you can switch to video, get a frame, and then return to camera. The major caveat is that it takes a "long" time (a couple of seconds) for the camera to switch between the video camera and the still camera.
Another option would be to use only the camera output (AVCaptureStillImageOutput) and use UIGetScreenImage to get a screen capture of the phone. You could then crop out the controls and leave only the image. This gets complicated if you're showing UI controls over the image. Also, according to this post, Apple has started rejecting apps that use this function (it was always iffy).
Aside from these, I also tried playing with AVCaptureVideoPreviewLayer. There's this post on saving a UIView or CALayer to a UIImage, but it always produces clear or white images. I tried accessing the layer, the view's layer, the superlayer, the presentationLayer and the modelLayer, but to no avail. I guess the data in AVCaptureVideoPreviewLayer is very internal and not really part of the regular layer infrastructure.
Hope this helps,
Oded.
I think you should add an AVCaptureVideoDataOutput to the current session with:
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[session addOutput:videoOutput];
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
Then, implement the delegate method below to get your image snapshot:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // Add your code here that uses the image.
    dispatch_async(dispatch_get_main_queue(), ^{
        _imageView.image = image;
    });
}
This will consume memory and reduce the performance of the app. To improve things, you can also throttle the frame rate of your AVCaptureVideoDataOutput with:
videoOutput.minFrameDuration = CMTimeMake(1, 15);
You can also use alwaysDiscardsLateVideoFrames.
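That last property is just a flag on the output, e.g.:
videoOutput.alwaysDiscardsLateVideoFrames = YES; // drop frames that arrive while the delegate is still busy with the previous one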
There are 2 ways to grab frames of the preview: AVCaptureVideoDataOutput and AVCaptureStillImageOutput :)
If your capture session is set up to grab video frames, make your layer with the CGImage from a chosen frame. If it's set up for stills, wait until you get your still image and make your layer from a scaled-down version of that CGImage. If you don't have an output on your session yet, you'll have to add one, I think.
Starting in iOS 7, you can use UIView's snapshotViewAfterScreenUpdates: to snapshot the UIView wrapping your AVCaptureVideoPreviewLayer. This is not the same as UIGetScreenImage, which will get your app rejected.
UIView *snapshot = [self.containerView snapshotViewAfterScreenUpdates:YES];
Recall the old-school way of turning a view into an image. For some reason it worked on everything except for camera previews:
UIGraphicsBeginImageContextWithOptions(self.containerView.bounds.size, NO, [UIScreen mainScreen].scale);
[self.containerView drawViewHierarchyInRect:self.containerView.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

OpenGLES fails to generate Framebuffers in iPhone thread

I've got a lovely OpenGLES code slice that renders up images for me. When I want to, I can call a function on it:
-(UIImage *)renderToImage;
That does a lot of rendering work and returns me an image. This includes the generation of FBOs, textures, etc.
Lately, I've found myself needing to enhance this. The image generation takes four seconds, so I want to pass off the work to another thread and let the app continue. This seemed simple enough. I made a method with this code:
- (void)generateRandomNewImage:(MyViewController *)evc {
    UIImage *renderedImage = [self renderToImage];
    NSString *fileLoc = [self writeToTempFile:renderedImage];
    NSLog(@"File location:%@", fileLoc);
    [evc performSelectorOnMainThread:@selector(imageGenerationComplete:) withObject:fileLoc waitUntilDone:NO];
}
Hopefully you can see the logic going on here. This method renders the image, saves it to the filesystem, and calls a method on the main thread's view controller to let it know the file is ready. This code is inside my OpenGL renderer. It's called here, in the main thread's view controller:
thread = [[NSThread alloc] initWithTarget:renderer
                                 selector:@selector(generateRandomNewImage:)
                                   object:self];
[thread start];
To me, that seems fine too. When I run this code, my console tells me that my framebuffer objects' status errored out, with a status of zero. I have no idea why. As a result, I get a blank image (saving to the temp files works, by the way; I've tested it).
To test, I put all of this code on the main thread, without creating any new threads. It all worked fine. As soon as I try to pass off the image generation to another thread, I hit problems.
Using OpenGL from another thread is not as simple as that: only one thread can use an OpenGL context at a time, and your second thread doesn't have an OpenGL context, so all OpenGL calls fail.
Solution: Create another OpenGL context for the second thread, and read this.
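On iOS that boils down to giving the worker thread its own EAGLContext, in the same sharegroup as the main one, before any GL work happens. A minimal sketch, assuming the renderer keeps its main context in a mainContext ivar (that name is made up for the example):
- (void)generateRandomNewImage:(MyViewController *)evc {
    // Each thread needs its own current context; sharing the sharegroup keeps
    // textures and buffers visible across both contexts.
    EAGLContext *workerContext = [[EAGLContext alloc] initWithAPI:[mainContext API]
                                                       sharegroup:[mainContext sharegroup]];
    [EAGLContext setCurrentContext:workerContext];

    UIImage *renderedImage = [self renderToImage]; // FBO creation now has a live context on this thread
    NSString *fileLoc = [self writeToTempFile:renderedImage];
    [evc performSelectorOnMainThread:@selector(imageGenerationComplete:) withObject:fileLoc waitUntilDone:NO];

    [EAGLContext setCurrentContext:nil];
    [workerContext release]; // matching the non-ARC style of the question
}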

AssetsLibrary and ImageView -setImage Slowness

So this one is pretty odd, and I'm not sure if the trouble is with the AssetsLibrary API, but I can't figure out what else might be happening.
I am loading an array with ALAssets using the -enumerateAssetsUsingBlock method on ALAssetsGroup. When it completes, I load a custom image scroller. As the scroller finishes scrolling, I use NSInvocationOperations to load the images for the currently visible views (pages) from the photo library on disk. Once the image is loaded and cached, it notifies the delegate, which then grabs the image from the cache and displays it in an image view in the scroller.
Everything works fine, but the time from when -setImage: actually gets called to when the image visibly shows up on the screen is unbearable, sometimes 10 seconds or more.
I have tried it both with and without image resizing, and the resizing adds almost nothing to the processing time. As I said, the slowdown is somewhere after I call -setImage on the image view. Is anyone aware of some aspect of the AssetsLibrary API that might cause this?
Here's some relevant code:
- (void)setImagesForVisiblePages;
{
    for (MomentImageView *page in visiblePages)
    {
        int index = [page index];
        ALAsset *asset = [photos objectAtIndex:index];
        UIImage *image = [assetImagesDictionary objectForKey:[self idForAsset:asset]];

        // If the image has already been cached, load it into the
        // image view. Otherwise, request the image be loaded from disk.
        if (image)
        {
            [[page imageView] setImage:image];
        }
        else
        {
            [self requestLoadImageForAsset:asset];
            [[page imageView] setImage:nil];
        }
    }
}
This will probably mess up any web searches looking to solve problems with the AssetsLibrary, so for that I apologize. It turns out that the problem wasn't the AssetsLibrary at all, but rather my use of multi-threading. Once the image finished loading, I was posting a notification using the default NSNotificationCenter. It was being posted on the background thread, which then updated (or tried to update, at least) the UIImageView with -setImage. Once I changed it to use -performSelectorOnMainThread and had that selector set the image instead, all was well.
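For illustration, the fix amounts to something like this in the loading operation; loadImageForAsset: and cacheImage:forAsset: are hypothetical names standing in for the actual operation code, while setImagesForVisiblePages is the method shown above:
// Runs on a background thread via NSInvocationOperation.
- (void)loadImageForAsset:(ALAsset *)asset
{
    UIImage *image = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage]];
    [self cacheImage:image forAsset:asset]; // hypothetical caching helper

    // Hop back to the main thread before touching any UIImageView;
    // setImagesForVisiblePages then picks the image up from the cache.
    [self performSelectorOnMainThread:@selector(setImagesForVisiblePages)
                           withObject:nil
                        waitUntilDone:NO];
}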
It seems that no matter how familiar I get with multi-threading, I still forget the little gotchas from time to time.