Can't display a texture loaded asynchronously - iPhone

In my game I have to change the background of the scene during gameplay. When I set a new texture for the background, the game slows down for a moment. To avoid this I'm trying to preload the texture asynchronously and then show it on the main thread. This is how I do that:
NSString *filename = [NSString stringWithFormat:@"res/src/level_%i/background.png", [GameLevel sharedGameLevel].currentLevelIndex + 1];
__block CCTexture2D *texture;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSLog(@"FILENAME %@", filename);
    [[CCTextureCache sharedTextureCache] addImage:filename];
    NSLog(@"%@", [CCTextureCache sharedTextureCache]);
    dispatch_async(dispatch_get_main_queue(), ^{
        texture = [[CCTextureCache sharedTextureCache] textureForKey:filename];
        [spareBackground setTexture:texture];
        [dayBackground runAction:[CCSequence actions:fadeOut, [CCCallBlockN actionWithBlock:^(CCNode *node)
        {
            NSLog(@"TEXTURE %@", texture);
            [dayBackground setTexture:texture];
            CCFadeIn *fadeIn = [[[CCFadeIn alloc] initWithDuration:5] autorelease];
            [dayBackground runAction:fadeIn];
        }], nil]];
    });
});
But instead of the background I always get a blank screen, even though the texture has been loaded successfully and is not nil. This code works just fine if the texture is loaded on the main thread without using GCD. What am I missing?

My suspicion is that CCTextureCache is not thread-safe (and, being a shared object, it would need to be thread-safe in order to be called safely from another thread).
Cocos2d, on the other hand, already provides a mechanism to load a texture asynchronously, so you might use that instead. This should be the call:
[[CCTextureCache sharedTextureCache] addImageAsync:filename target:self selector:@selector(textureLoaded:)];
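For reference, a hedged sketch of the matching callback: the method name comes from the selector above, the sprite names are borrowed from the question, and cocos2d passes the loaded CCTexture2D to the selector (on the main thread in recent versions):

// Invoked by CCTextureCache when the asynchronous load finishes.
- (void)textureLoaded:(CCTexture2D *)texture {
    [spareBackground setTexture:texture];
    [dayBackground setTexture:texture];
    [dayBackground runAction:[CCFadeIn actionWithDuration:5]];
}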

If you can afford to trade space for time, i.e. your game's memory footprint is reasonable, preload both textures at the start of the scene, and during gameplay just flip the backgrounds' visibility, as sketched below.
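A sketch of that space-for-time idea (the file names are illustrative): both sprites are created once at scene setup, so switching during gameplay is only a visibility flip and no texture upload happens mid-game.

// At scene setup: load both backgrounds up front.
CCSprite *dayBackground   = [CCSprite spriteWithFile:@"background_day.png"];
CCSprite *nightBackground = [CCSprite spriteWithFile:@"background_night.png"];
nightBackground.visible = NO;
[self addChild:dayBackground];
[self addChild:nightBackground];

// Later, during gameplay: flipping visibility is cheap.
dayBackground.visible   = NO;
nightBackground.visible = YES;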
Also, I found that loading PVR textures instead of PNGs runs considerably faster (i.e. consumes less CPU, I guess). You could try that as a first attempt, hoping the pause becomes acceptable from a user-experience standpoint.
P.S. Any comments I make on performance are based on actual measurements and tests on real devices; the simulator is useless for this. Make certain the pause is present on devices before spending any effort on optimizing.

How do I properly preload SKScenes?

I have the exact same problem as described here; however, the proposed solution doesn't work. In short, when transitioning between scenes, the transition animation is skipped entirely because the presented scene takes too long to load. Now I'm looking for a way to properly preload all contents of the scene.
What I do is:
- alloc/init the scene upon application launch
- preload all textures the scene consists of
- thereafter, a few seconds later, present the scene
The first time a transition is triggered it is always skipped; all subsequent attempts work as expected.
Here is an example of how I alloc/init and preload the scenes (upon launch):
self.menuScene = [[LEMenuScene alloc] initWithSize:self.bounds.size];
[self.menuScene preload];
where preload is a category method on SKScene:
- (void)preload {
    NSMutableArray *allTextures = [[NSMutableArray alloc] init];
    [self enumerateChildNodesWithName:@"//SKSpriteNode" usingBlock:^(SKNode *node, BOOL *stop) {
        SKTexture *texture = ((SKSpriteNode *)node).texture;
        if (texture) [allTextures addObject:texture];
    }];
    [SKTexture preloadTextures:allTextures withCompletionHandler:^{
        // nop
    }];
}
I am aware that the loading is carried out asynchronously and that preload returns instantly (which is intended), so I could trigger a transition before loading has finished, but no matter how long I wait, it always lags.
Any tips/help is greatly appreciated!
Turns out the issue wasn't the textures after all. I misspelled the name of a font, which made the system look for a suitable substitution, causing the delay.
Here's the same issue:
How to cache or preload SKLabelNode font?
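For completeness, the workaround usually reported for that font delay is to create a throwaway SKLabelNode once at launch so the font is resolved before the first transition. A minimal sketch (the font name is only an example):

// Forces the font to load now instead of during the first transition.
// The node is never added to a scene; it exists only to warm the font cache.
SKLabelNode *warmup = [SKLabelNode labelNodeWithFontNamed:@"HelveticaNeue-Light"];
warmup.text = @"warmup";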

NSOperation mainQueue issue

I have a slideshow in which I want to show big images. I added an operation with low priority to [NSOperationQueue mainQueue]; this operation displays the image.
If the image is small, everything is OK, but when the image is about 5 MB the view freezes for about a second and I can't scroll my slideshow. I think displaying big images is just so expensive on the iPhone that the main queue gets overloaded.
But I don't understand it, because all my display code is executed in a low-priority operation.
Here is the displaying code.
[imageView removeFromSuperview];
imageView = nil;
// reset our zoomScale to 1.0 before doing any further calculations
self.zoomScale = 1.0;
// make a new UIImageView for the new image
self.imageView = [[[UIImageView alloc] initWithImage:image] autorelease];
[self addSubview:imageView];
self.contentSize = [image size];
[self setMaxMinZoomScalesForCurrentBounds];
self.zoomScale = self.minimumZoomScale;
Maybe I can set the priority for the gesture recognizers (the regular gesture recognizers of UIScrollView)?
Update
Please look at my new topic, where I described the issue more thoroughly: my topic
Priority only affects scheduling: if you queue up a bunch of operations during a run-loop iteration, they will be executed in priority order on that queue.
One way to speed this up is to include resources that are already scaled to the exact size at which you display them. If you are trying to show a 2000x2000 px image in a 200x200 area, the system has to scale all of that in memory. You can also create smaller, fitted images programmatically; this can be done on a background queue so your UI stays responsive.
How to resize the image programmatically in objective-c in iphone
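A rough sketch of that scale-down-off-the-main-thread idea (bigImage, imageView and the 200x200 target size are stand-ins, not names from the question):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Render the large image into a small bitmap context off the main thread.
    CGSize targetSize = CGSizeMake(200, 200);
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0);
    [bigImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = scaled;   // UIKit work stays on the main thread
    });
});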
If I understand you correctly and you have done something like:
NSOperationQueue* queue = [NSOperationQueue mainQueue];
then any NSOperation you add to it will execute on the main queue, which runs its operations on the main thread. That would explain the freeze. You can create your own queue, which will run NSOperations off the main thread and leave the main thread free to render the UI normally:
NSOperationQueue* queue = [[NSOperationQueue alloc] init];
[queue addOperation:operation];
This, however, introduces another issue: when the image has finished loading and you pass it to a UIImageView from a background thread, the update may not be rendered promptly (or correctly), because UIKit must be driven from the main thread. The solution is to add a performSelectorOnMainThread: call at the end of the NSOperation's main method, like this:
- (void)main {
    NSData *bgImageData = [NSData dataWithContentsOfURL:self.targetUrl];
    UIImage *img = [UIImage imageWithData:bgImageData];
    [self performSelectorOnMainThread:@selector(insertImageLoaded:)
                           withObject:img
                        waitUntilDone:YES];
}
insertImageLoaded: lives in the NSOperation and calls setImage:(UIImage *)img on whichever view you want.
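Putting the pieces together, a hedged sketch of such an operation (the class name and the targetUrl/imageView properties are illustrative; written MRC-style like the rest of the answer):

@interface ImageLoadOperation : NSOperation
@property (nonatomic, retain) NSURL *targetUrl;        // where the image comes from
@property (nonatomic, assign) UIImageView *imageView;  // the on-screen view to update
@end

@implementation ImageLoadOperation

- (void)main {
    NSData *bgImageData = [NSData dataWithContentsOfURL:self.targetUrl];
    UIImage *img = [UIImage imageWithData:bgImageData];
    // UIKit must only be touched from the main thread.
    [self performSelectorOnMainThread:@selector(insertImageLoaded:)
                           withObject:img
                        waitUntilDone:YES];
}

- (void)insertImageLoaded:(UIImage *)img {
    [self.imageView setImage:img];
}

- (void)dealloc {
    [_targetUrl release];
    [super dealloc];
}

@end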

writeImageToSavedPhotosAlbum saves only a few images

I've been following Apple's example QA1702 on how to capture images using AVFoundation. I won't cite the code here because of space concerns. A brief description of what I'm trying to achieve:
Use the iPhone camera to pass a "video" (actually a sequence of images) to a web server; I know this is possible. However, in order to pass an image using HTTP POST as in this example, I have to save the image first, not necessarily to the photos album, but I want to be able to view the pictures there as well for debugging purposes.
Apple's QA1702 contains three methods:
- (void)setupCaptureSession
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
// this is modified to return void, as you might see; I'll get back to this
- (void) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
In setupCaptureSession I start the session as in the example. captureOutput: only runs imageFromSampleBuffer:, and that's where I've added some changes:
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
// Unlock the pixel buffer
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
// Free up the context and color space
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
//library is declared in .h and is a ALAssetsLibrary
[library writeImageToSavedPhotosAlbum:quartzImage orientation:ALAssetOrientationDown completionBlock:nil];
// Release the Quartz image
CGImageRelease(quartzImage);
I've removed the creation of the UIImage and changed the return type to void, since I call writeImageToSavedPhotosAlbum: with the CGImageRef here instead.
The problem as I see it: during the 10 seconds that I capture images, ~150 calls to captureOutput: are made, and therefore the same number of calls to writeImageToSavedPhotosAlbum:, but only ~5-10 pictures are saved. I'm aware of how memory-hungry this is, but since I'm not getting any warnings I can't figure out why more images aren't created, or what I can do about it. Is it because (and I'm only guessing now) writeImageToSavedPhotosAlbum: starts new threads and the iPhone can't handle more than a certain number of threads? I've read something about NSOperationQueue; should I look into it?
On a side note, I use an NSTimer in setupCaptureSession:
[NSTimer scheduledTimerWithTimeInterval:10.0 target:self selector:@selector(timerFireMethod:) userInfo:nil repeats:NO];
However, I want to start it on the first call to captureOutput: in order to avoid counting the time that elapses while the camera starts up. But if I move this line into captureOutput:, then timerFireMethod: is never called. Any ideas?
This is solvable with NSOperationQueue, but it's no longer interesting to me, since writing to file is far too inefficient for most applications.
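As for the NSTimer side note: the sample-buffer delegate runs on a dispatch queue with no running run loop, so a timer scheduled from inside captureOutput: never fires. One workaround is to schedule it on the main thread from the first callback; a sketch (the dispatch_once guard is just to make sure it happens only once):

// Inside captureOutput:didOutputSampleBuffer:fromConnection:
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    dispatch_async(dispatch_get_main_queue(), ^{
        // Scheduled on the main run loop, so it actually fires.
        [NSTimer scheduledTimerWithTimeInterval:10.0
                                         target:self
                                       selector:@selector(timerFireMethod:)
                                       userInfo:nil
                                        repeats:NO];
    });
});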

AVCaptureVideoPreviewLayer: taking a snapshot

I'm trying to emulate the animation seen in the default camera app, where a snapshot of the camera's viewfinder is animated into the corner of the app's display.
The AVCaptureVideoPreviewLayer object that holds the key to solving this problem isn't very accommodating: trying to create a copy of it in a new layer with
- (id)initWithLayer:(id)layer
returns an empty layer without the image snapshot, so clearly there is some deeper magic going on here.
Your clues/boos are most welcome.
M.
I'm facing the same woes, from a slightly different angle.
Here are some possible solutions, none of which are too great IMO:
You can add both an AVCaptureStillImageOutput and an AVCaptureVideoDataOutput to an AVCaptureSession. When you set the sessionPreset to AVCaptureSessionPresetHigh you'll start getting frames through the API, and when you switch to AVCaptureSessionPresetPhoto you can take real photos. So right before taking the picture, you can switch to video, grab a frame, and then return to photo mode. The major caveat is that it takes a "long" time (a couple of seconds) for the camera to switch between the video and photo presets.
Another option would be to use only the still image output (AVCaptureStillImageOutput) and use UIGetScreenImage to get a screen capture of the phone. You could then crop out the controls and leave only the image. This gets complicated if you're showing UI controls over the image. Also, according to this post, Apple has started rejecting apps that use this function (it was always iffy).
Aside from these I also tried playing with AVCaptureVideoPreviewLayer. There's this post on saving a UIView or CALayer to a UIImage, but it always produces clear or white images. I tried accessing the layer, the view's layer, the superlayer, the presentationLayer, the modelLayer, but to no avail. I guess the data in AVCaptureVideoPreviewLayer is very internal, and not really part of the regular layer infrastructure.
Hope this helps,
Oded.
I think you should add an AVCaptureVideoDataOutput to the current session with:
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[session addOutput:videoOutput];

dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
Then, implement the delegate method below to get your image snapshot:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // Add your code here that uses the image.
    dispatch_async(dispatch_get_main_queue(), ^{
        _imageView.image = image;
    });
}
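imageFromSampleBuffer: above is not an AVFoundation method; it's the conversion helper from Apple's QA1702 that the earlier question also uses. In case it's missing, a minimal version for 32BGRA buffers might look like this:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Draw the BGRA pixel data into a bitmap context and wrap it in a UIImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}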
This will consume memory and reduce the performance of the app. To mitigate that, you can throttle your AVCaptureVideoDataOutput with:
videoOutput.minFrameDuration = CMTimeMake(1, 15);
You can also set alwaysDiscardsLateVideoFrames to YES.
There are two ways to grab frames of the preview: AVCaptureVideoDataOutput and AVCaptureStillImageOutput. :)
If your capture session is set up to grab video frames, make your layer with the CGImage from a chosen frame. If it's set up for stills, wait until you get your still image and make your layer from a scaled-down version of that CGImage. If you don't have an output on your session yet, you'll have to add one, I think.
Starting in iOS 7, you can use UIView's snapshotViewAfterScreenUpdates: to snapshot the UIView wrapping your AVCaptureVideoPreviewLayer. This is not the same as UIGetScreenImage, which will get your app rejected.
UIView *snapshot = [self.containerView snapshotViewAfterScreenUpdates:YES];
Recall the old-school way of turning a view into an image. For some reason it worked on everything except for camera previews:
UIGraphicsBeginImageContextWithOptions(self.containerView.bounds.size, NO, [UIScreen mainScreen].scale);
[self.containerView drawViewHierarchyInRect:self.containerView.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

OpenGL ES fails to generate framebuffers on an iPhone thread

I've got a lovely OpenGL ES code slice that renders images for me. When I want to, I can call a method on it:
-(UIImage *)renderToImage;
That does a lot of rendering work and returns me an image. This includes the generation of FBOs, textures, etc.
Lately, I've found myself needing to enhance this. The image generation takes four seconds, so I want to pass off the work to another thread and let the app continue. This seemed simple enough. I made a method with this code:
- (void)generateRandomNewImage:(MyViewController *)evc {
    UIImage *renderedImage = [self renderToImage];
    NSString *fileLoc = [self writeToTempFile:renderedImage];
    NSLog(@"File location: %@", fileLoc);
    [evc performSelectorOnMainThread:@selector(imageGenerationComplete:) withObject:fileLoc waitUntilDone:NO];
}
Hopefully you can see the logic going on here. This method renders the image, saves it to the filesystem, and calls a method on the main thread's view controller to let it know the file is ready. This code lives inside my OpenGL renderer. It's invoked here, in the main thread's view controller:
thread = [[NSThread alloc] initWithTarget:renderer
selector:#selector(generateRandomNewImage:)
object:self];
[thread start];
To me, that seems fine too. But when I run this code, the console tells me that my framebuffer object status check fails, with a status of zero. I have no idea why. As a result, I get a blank image (saving to the temp files works, by the way; I've tested that).
To test, I put all of this code on the main thread and didn't create any new threads at all. It all worked fine. As soon as I try to pass the image generation off to another thread, I hit problems.
Using OpenGL from another thread is not as simple as that: only one thread can use an OpenGL context at a time, and your second thread doesn't have a current OpenGL context, so all of its OpenGL calls fail.
Solution: Create another OpenGL context for the second thread, and read this.
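A minimal sketch of that fix, assuming the renderer keeps a reference to the main thread's context (mainContext here is an assumption), placed at the start of generateRandomNewImage: before any GL work:

// Create a context that shares resources with the main context,
// and make it current on this background thread.
EAGLContext *workerContext =
    [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2
                          sharegroup:mainContext.sharegroup];
[EAGLContext setCurrentContext:workerContext];

UIImage *renderedImage = [self renderToImage];   // FBO generation now has a current context

// Clean up the thread's context when done (MRC, matching the question's code).
[EAGLContext setCurrentContext:nil];
[workerContext release];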