Thread Safe (CG) Version of UIGraphicsGetImageFromCurrentImageContext? - iphone

I'm trying to render PDF pages to images on a background thread. Is there a thread-safe way of generating an image from a pdfContext/context that doesn't use:
UIGraphicsGetImageFromCurrentImageContext

AFAIK, (almost) all Core Graphics calls should be safe to use on a background thread as long as you don't draw to the screen and don't share resources (such as graphics contexts) among multiple threads.
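For example, a minimal sketch (untested; imageForPDFPage: is just an illustrative name) that renders a CGPDFPageRef into a private bitmap context and wraps the result in a UIImage, with no call to UIGraphicsGetImageFromCurrentImageContext:

// Illustrative sketch: renders one CGPDFPageRef into a UIImage using only
// Core Graphics, so it can run on a background thread.
- (UIImage *)imageForPDFPage:(CGPDFPageRef)page
{
    CGRect box = CGPDFPageGetBoxRect(page, kCGPDFMediaBox);
    size_t width = (size_t)ceil(box.size.width);
    size_t height = (size_t)ceil(box.size.height);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    if (ctx == NULL) return nil;

    // White background, then draw the page into this private bitmap context.
    CGContextSetRGBFillColor(ctx, 1.0, 1.0, 1.0, 1.0);
    CGContextFillRect(ctx, CGRectMake(0, 0, width, height));
    CGContextConcatCTM(ctx, CGPDFPageGetDrawingTransform(page, kCGPDFMediaBox,
        CGRectMake(0, 0, width, height), 0, true));
    CGContextDrawPDFPage(ctx, page);

    CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(ctx);
    return image;
}

Creating the UIImage wrapper here is only for convenience; the actual rendering never touches a shared context, which is what makes it safe off the main thread.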

Related

OpenGL ES 2.0 iPhone - Rendering on background thread blocks main thread

I'm rendering OpenGL ES content on a background thread with a different EAGLContext than the main thread's.
I use something like this:
- (void)renderInBackground {
    EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:context];
    // ... rendering ...
}
However, even though this is performed in a background thread, when using a heavy shader, the main thread gets blocked and the UI gets stuck.
Why is the background thread blocking the main thread?
The methods are not synchronized.
You do have finite CPU and GPU resources to perform all of the rendering and UI interactions, so if you max out the GPU you will slow down everything else in your application.
That said, you should be able to render on a background thread without completely halting all UI elements. I do some fairly intense rendering in my open source Molecules application, all of it using a background GCD queue, yet you can still scroll in popovers and otherwise interact with the interface.
I describe the process I use in this answer, but the basic setup is a single-wide GCD queue that relies on a dispatch semaphore to prevent the enqueueing of additional rendering frames while one is still being processed. Threads and blocks have some overhead to them, so if they are being fired off faster than they can be processed, this can lead to resource exhaustion. The semaphore prevents this.
Wrapping all interactions with my OpenGL ES context in this queue provides lock-free usage of this shared resource, and I see a significant performance boost over simply running this all on the main thread on the multicore devices. As I said, I'm still able to interact with the UI during even heavy rendering here.
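A rough sketch of that setup (illustrative names only, not the actual Molecules code; renderQueue, frameRenderingSemaphore, and backgroundContext are assumed to be instance variables):

// Serial GCD queue guards the EAGLContext; the semaphore keeps more than one
// frame from being enqueued at a time.
- (void)setupRendering
{
    renderQueue = dispatch_queue_create("com.example.openGLRenderQueue", NULL); // serial queue
    frameRenderingSemaphore = dispatch_semaphore_create(1);
}

- (void)drawFrame
{
    // If the previous frame hasn't finished yet, drop this one instead of
    // letting render blocks pile up on the queue.
    if (dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0) {
        return;
    }

    dispatch_async(renderQueue, ^{
        [EAGLContext setCurrentContext:backgroundContext];
        // ... issue the OpenGL ES drawing commands and present the renderbuffer ...
        dispatch_semaphore_signal(frameRenderingSemaphore);
    });
}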

Crash when using CATiledLayer and Core Data on iOS 5

I am using CATiledLayer to render NSManagedObjects.
But as you know, CATiledLayer renders its content on background threads, and this makes my app crash on iOS 5.
I know that I should use a separate NSManagedObjectContext for each thread, but that hurts performance, because I have to save the context more often to transfer data to the other threads.
Does anyone know a better way to work around this problem?
NSManagedObjectContext is not thread safe, nor are NSManagedObjects. You should create a MOC on the background thread, pass in any object IDs (which ARE thread safe), and load the objects in the background thread's context.
UPDATE:
One alternative is to create plain old Objective-C objects, or even just a regular NSDictionary, containing the necessary data and pass those to the background thread. So after your managed object is populated, create a POOCO (plain old Objective-C object), copy in the necessary data, and pass that to your background thread for processing. This will avoid disk access.
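A hedged sketch of the object-ID approach (the array, coordinator, and queue names are made up for illustration):

// NSManagedObjectID instances are safe to pass between threads; the objects
// are then refetched in a context owned by the background thread.
NSArray *objectIDs = [objectsFetchedOnMainThread valueForKey:@"objectID"];

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSManagedObjectContext *backgroundContext = [[NSManagedObjectContext alloc] init];
    [backgroundContext setPersistentStoreCoordinator:persistentStoreCoordinator];

    for (NSManagedObjectID *objectID in objectIDs) {
        NSManagedObject *object = [backgroundContext objectWithID:objectID];
        // ... read whatever attributes the tile drawing needs ...
    }
    // [backgroundContext release]; // if not using ARC
});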

How to buffer PDF pages on CALayers on iPhone

I am working on a PDF reader application. I am making use of CALayer to render the PDF contents. Currently one PDF page is rendered at a time and displayed in the visible view. I want to buffer a few pages in advance (say, one previous page and one next page) while the user is reading the current page. Can anyone suggest a good way of achieving this buffering mechanism? Thanks in advance.
You can take a look at this open source PDF viewer for iOS, it implements the features you asked about: http://www.vfr.org/2011/09/pdf-reader-viewer-v2-2/
If you want to draw some content in the background, you can look into using the Grand Central Dispatch API and drawing with Core Graphics commands. You will need to be careful about thread safety, such as checking/waiting for the background drawing to finish before trying to push the results to the display.
I found quite a useful post, Image manipulation and drawing using Quartz in the background threads, on ensuring that you only use thread-safe commands to create your drawing context (the example creates a bitmap context, but obviously you will be looking to create a PDF context using CGPDFContextCreate or similar).
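Putting the two together, a rough sketch of the buffering side might look like this (pdfDocument, pageCache, renderQueue, and imageForPDFPage: are all hypothetical; the renderer would be a thread-safe bitmap-context routine like the one sketched near the top of this page):

// While the current page is on screen, pre-render its neighbours off the
// main thread and hand the results back to the main thread.
- (void)prefetchPagesAroundIndex:(NSInteger)pageIndex
{
    for (NSInteger i = pageIndex - 1; i <= pageIndex + 1; i++) {
        if (i < 1 || i > (NSInteger)CGPDFDocumentGetNumberOfPages(pdfDocument)) continue;

        NSNumber *key = [NSNumber numberWithInteger:i];
        if ([pageCache objectForKey:key] != nil) continue; // already buffered

        dispatch_async(renderQueue, ^{
            CGPDFPageRef page = CGPDFDocumentGetPage(pdfDocument, (size_t)i);
            UIImage *image = [self imageForPDFPage:page]; // thread-safe bitmap-context renderer

            dispatch_async(dispatch_get_main_queue(), ^{
                if (image != nil) {
                    [pageCache setObject:image forKey:key]; // only touch the cache on the main thread
                }
            });
        });
    }
}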

Preloading assets on iOS

Suppose we have several animations that we want to achieve with a sequence of PNGs in a UIImageView.
If the animation contains many images, there is a delay from the moment we send the message [myImgView startAnimating], because all of the images have to be loaded into memory.
I noticed that the loading is lazy: as long as the startAnimating message is not sent, the images are not loaded into memory.
To avoid the delay, I load the whole animation in the app delegate, attach it as a subview, and animate it once. I want to understand: what is the best solution? And does my current solution have any drawbacks?
You're right about lazy loading. I've never been able to determine if it's actually lazy spooling from disk or lazy decompression, but in either case a UIImage (and, underneath, a CGImage) is not necessarily fully processed and ready to draw until it is actually used. I assume that it may conceivably become unready again in the future, depending on exactly how Apple handles memory warnings internally.
If you wanted to be really keen, I guess the best solution would be to use Core Graphics to load and decompress the images in the background, pausing to finish loading upon startAnimating only if loading hasn't already finished. You can't achieve that directly with UIKit objects since they're not callable anywhere but on the main thread. You'd need to get a CGImage, draw it into a bitmap context, then create an image from the bitmap context and post that onto the main thread to be wrapped into a UIImage. And it'd probably be smart to use an NSOperationQueue to marshal the complete list of operations.
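Something along these lines, for example (all names here are invented; this is a sketch of the idea rather than a recipe):

// Each operation decodes one PNG into a private bitmap context off the main
// thread, then the decompressed image is wrapped in a UIImage back on the
// main thread.
- (void)preloadImageAtPath:(NSString *)path usingQueue:(NSOperationQueue *)queue
{
    [queue addOperationWithBlock:^{
        CGDataProviderRef provider = CGDataProviderCreateWithFilename([path fileSystemRepresentation]);
        CGImageRef source = CGImageCreateWithPNGDataProvider(provider, NULL, false, kCGRenderingIntentDefault);
        CGDataProviderRelease(provider);
        if (source == NULL) return;

        size_t width = CGImageGetWidth(source);
        size_t height = CGImageGetHeight(source);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colorSpace);
        if (ctx == NULL) { CGImageRelease(source); return; }

        // Drawing here forces the PNG to be decoded now, on the background operation.
        CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), source);
        CGImageRef decompressed = CGBitmapContextCreateImage(ctx);
        CGContextRelease(ctx);
        CGImageRelease(source);

        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            UIImage *image = [UIImage imageWithCGImage:decompressed]; // wrap on the main thread
            CGImageRelease(decompressed);
            // ... hand `image` to whatever will run the animation ...
        }];
    }];
}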
Supposing you don't mind the additional startup cost of blocking while you load all the PNGs, and are dealing with memory warnings correctly, there shouldn't be any problems with your current approach and I can't think of a better solution while remaining in high level Objective-C stuff.

Is drawRect: called on multiple threads when using a CATiledLayer?

I know that drawInContext: and drawLayer:inContext: are called on multiple threads when using a CATiledLayer, but what about drawRect:?
Apple's PhotoScroller example code uses drawRect: to get its images from disk, and it has no special code for handling threads.
I am trying to determine whether my model for a CATiledLayer must be thread-safe.
I have found CATiledLayer is using multiple background threads in the iOS Simulator, but a single background thread on my iPhone.
My Mac has a dual core processor, while my iPhone has a single core (A4).
I suspect an iOS device with an A5 CPU will also use multiple threads.
Yes, drawRect: can and will be called on multiple threads (tested on iOS 4.2).
This behaviour is less obvious if your drawing is fast enough to outpace the arrival of new zoom gestures, so your app may work fine until tested with rapid zoom gestures.
One alternative is to make your model thread-safe.
If thread safety is achieved by synchronizing most of the access to the data model to one drawing thread at a time, then you might do just as well to mutex the body of drawRect: with something like @synchronized(self), which seems to work.
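For illustration, that could be as simple as (assuming your model reads happen inside drawRect:):

// Serialize the body so CATiledLayer's concurrent background threads never
// read the model at the same time.
- (void)drawRect:(CGRect)rect
{
    @synchronized(self) {
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        // ... read the model and draw the tile for `rect` into ctx ...
    }
}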
I haven't found a way to request that CATiledLayer only uses one background thread.
Have you seen this technical Q&A from Apple?
It doesn't answer your question directly, but it could help you decide how to implement your model.