Which part of OpenGL ES makes my iPhone app a bit slow? - iphone

I am currently developing my iPhone app with OpenGL ES. It is a mirror app with brightness and contrast controls. The problem I am having is that there is a noticeable delay (about 0.2 s) when you use it, even though the frame rate is about 60 fps. So my question is: which part of OpenGL takes time to process?

What you have is lag (not slowness). And it's not caused by OpenGL (at least not entirely). The latency comes from the camera and from reading and decoding the camera frames.
Some latency is unavoidable:
It takes a whole video frame for the camera to capture the image and encode it into digital data.
It takes a whole display frame to draw the frame to the display.
So the shortest lag you can get is about 1s/30 + 1s/60 = 0.05s.
Any latency above this comes from processing overhead. Most likely yours comes from decoding the image, and perhaps from buffer allocations in that process. However, I'd need to see your source code to tell for sure.
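As a rough illustration of where that overhead can be trimmed, here is a minimal sketch of a low-latency capture setup. It is an assumption about how such a pipeline is built, not the asker's actual code: the helper name, the "session" parameter, and the queue label are made up, and the class containing it is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate.

#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: configure the video output so frames arrive in a
// GPU-friendly format and stale frames are dropped instead of queued.
// "session" is assumed to be an already-configured AVCaptureSession.
- (void)configureLowLatencyOutputForSession:(AVCaptureSession *)session
{
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];

    // Never queue frames that arrive while the previous one is still being
    // processed -- queued frames reach the screen late and read as lag.
    output.alwaysDiscardsLateVideoFrames = YES;

    // Ask for BGRA so no extra pixel-format conversion is needed before the
    // data is uploaded as an OpenGL ES texture. (Under ARC, cast the key with __bridge.)
    output.videoSettings = [NSDictionary dictionaryWithObject:
                                [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                         forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // Deliver frames on a dedicated serial queue rather than the main thread.
    dispatch_queue_t frameQueue = dispatch_queue_create("com.example.cameraframes", NULL);
    [output setSampleBufferDelegate:self queue:frameQueue];

    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
}

If the per-frame work is then kept to a straight texture upload (no intermediate UIImage or CGImage), the remaining lag should approach the two-frame minimum described above.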

Related

Lower the Video Quality for Higher FPS When Live Filtering Using CIImage

I'm trying to do live camera filtering like Instagram and Path. Since I'm not skilled enough to handle OpenGL ES, I use iOS 5's CoreImage instead.
I use this callback method to intercept and filter each frame from the camera:
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
The session preset I use is AVCaptureSessionPresetPhoto, since I need to take high-quality photos in the end.
If I just present the buffer to screen without any CIImage filtering, the average FPS would reach 26 or even more, which is great.
If I start to apply CIFilters to the image data, the FPS drops to 10 or even lower, and the live video starts to look bad.
I understand that Instagram and Path would use OpenGL ES directly rather than wrapped frameworks such as CoreImage, so that they can build more efficient code for GPU rendering. At the same time, I also notice that Instagram actually lowers the sample video quality to further reduce the GPU burden. Below is the screenshot I took when my app (left) and Instagram (right) are both capturing live video. Pay attention to the letters Z and S in both pictures. You can see that Instagram's video quality is slightly lower than mine.
So right now I'm considering various ways to reduce the live video frame quality, but I really have no idea which way is better or how I should implement it:
Try to reduce this (CMSampleBufferRef)sampleBuffer's quality before converting it into a CIImage object.
Try to find some APIs from CIImage or CIFilter or CIContext to lower the video frame quality.
Try to find some APIs from OpenGL ES to lower the video frame quality.
Again, I don't have any clues now. So any suggestions would be greatly appreciated!
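For context, a CoreImage-based callback of the kind described above usually looks roughly like the sketch below. This is not the asker's code: the filter choice (CISepiaTone), the filter, ciContext, and previewImageView properties, and the CGImage rendering path are all assumptions made for illustration.

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // self.filter is created once, e.g. [CIFilter filterWithName:@"CISepiaTone"];
    // building a new CIFilter per frame is a common source of dropped frames.
    [self.filter setValue:inputImage forKey:kCIInputImageKey];
    CIImage *outputImage = [self.filter outputImage];

    // self.ciContext should also be created once; a context backed by an
    // EAGLContext ([CIContext contextWithEAGLContext:...]) keeps the rendering
    // on the GPU instead of round-tripping through a CGImage like this does.
    CGImageRef cgImage = [self.ciContext createCGImage:outputImage
                                              fromRect:[outputImage extent]];
    UIImage *filtered = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewImageView.image = filtered;
    });
}

Where exactly the time goes in a pipeline like this (filter setup, CPU readback, frame resolution) is what the answers below address.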
AVCaptureSession has a sessionPreset property which allows the video quality to be set to low, medium, or high.
The code below sets the quality to medium.
[self.mSession setSessionPreset:AVCaptureSessionPresetMedium];
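A slightly safer variant (a sketch, assuming self.mSession is the running AVCaptureSession from the line above) checks that the device actually supports the preset before switching to it:

if ([self.mSession canSetSessionPreset:AVCaptureSessionPresetMedium]) {
    [self.mSession beginConfiguration];
    [self.mSession setSessionPreset:AVCaptureSessionPresetMedium];
    [self.mSession commitConfiguration];
}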
If you're unwilling to lower the video quality (and I think AVCaptureSessionPresetPhoto is pretty low anyway), your best bet is either optimizing your filters, or lowering the frame-rate.
You may think, well, I'm already getting a lower frame rate. However, setting this in advance will optimize the way the frames are dropped. So, if you're getting 10 fps now, try setting the max frame rate to, say, 15, and you might just get that 15. 15 fps is plenty good for a preview.
(I've worked on Flare, one of the most demanding apps out there: itun.es/iLQ3bR.)
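A sketch of capping the frame rate on the capture connection, assuming "videoOutput" is your AVCaptureVideoDataOutput (on iOS 7 and later the equivalent settings moved to AVCaptureDevice's activeVideoMinFrameDuration/activeVideoMaxFrameDuration):

// Requires <AVFoundation/AVFoundation.h> and <CoreMedia/CoreMedia.h>.
// Cap capture at roughly 15 fps so frames are dropped predictably at the
// source instead of piling up behind the CoreImage filtering.
AVCaptureConnection *connection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.supportsVideoMinFrameDuration) {
    // A minimum frame duration of 1/15 s means at most 15 frames per second.
    connection.videoMinFrameDuration = CMTimeMake(1, 15);
}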

Reduced quality OpenGL ES screenshots (iPhone)

I'm currently using this method from Apple to take screenshots of my OpenGL ES iPhone game. The screenshots look great. However taking a screenshot causes a small stutter in the game play (which otherwise runs smoothly at 60 fps). How can I modify the method from Apple to take lower quality screenshots (hence eliminating the stutter caused by taking the screenshot)?
Edit #1: the end goal is to create a video of the game play using AVAssetWriter. Perhaps there's a more efficient way to generate the CVPixelBuffers referenced in this SO post.
What is the purpose of the recording?
If you want to replay a sequence on the device, you can look into saving the object positions etc. instead and redrawing the sequence in 3D. This also makes it possible to replay sequences from other view positions.
If you want to show the gameplay on e.g. YouTube or elsewhere, you can look into recording it with another device/camera, or recording some gameplay running in the simulator using screen-capture software such as ScreenFlow.
The Apple method uses glReadPixels() which just pulls all the data across from the display buffer, and probably triggers sync barriers, etc, between GPU and CPU. You can't make that part faster or lower resolution.
Are you doing this to create a one-off video? Or do you want the user to be able to trigger this behavior in the production code? If the former, you could do all sorts of trickery to speed it up: render everything at a smaller size, don't present at all and just capture frames while replaying a recording of the input data fed into the game, or, going even further, run the whole simulation at half speed to get all the frames.
I'm less helpful if you need an actual in-game function for this. Perhaps someone else will be.
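If the goal really is feeding AVAssetWriter (per the question's edit), one common approach is to read the framebuffer straight into a CVPixelBuffer taken from the writer adaptor's pool, so there is no extra CPU-side copy. A minimal sketch, with the method name and the adaptor setup assumed:

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES1/gl.h>

// Hypothetical helper: read the current framebuffer into a pooled pixel buffer.
// "adaptor" is an AVAssetWriterInputPixelBufferAdaptor created elsewhere; its
// pixelBufferPool is only non-NULL after the writer session has started, and its
// width/height attributes must match the values passed in here.
- (CVPixelBufferRef)copyFrameOfWidth:(GLint)width
                              height:(GLint)height
                         fromAdaptor:(AVAssetWriterInputPixelBufferAdaptor *)adaptor
{
    CVPixelBufferRef buffer = NULL;
    CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                         adaptor.pixelBufferPool,
                                                         &buffer);
    if (status != kCVReturnSuccess || buffer == NULL) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(buffer, 0);
    void *pixels = CVPixelBufferGetBaseAddress(buffer);

    // glReadPixels blocks until the GPU has finished the frame -- this is the
    // synchronization stall described above and cannot be avoided with this API.
    // Caveats: this assumes the buffer has no row padding (request that via
    // kCVPixelBufferBytesPerRowAlignmentKey), the rows come back bottom-up, and
    // the channel order is RGBA while asset-writer pools are typically BGRA, so
    // a swizzle (or the GL_EXT_read_format_bgra extension) is still needed.
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return buffer; // caller appends it to the adaptor and releases it
}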
If all else fails, get one of these:
http://store.apple.com/us/product/MC748ZM/A
Then convert that composite video to digital through some sort of external capture device.
I did this when I converted VHS movies to DVD a long time ago.

Best idea to show a movie in an iPhone app

I am creating an application with a movie animation: a group of insects flying around a lamp.
I rendered my animation as PNG images (about 400 frames), but when I try to play the insect movie in my app, the iPhone runs out of memory. A short animation works fine, but the long one does not.
What is your suggestion for playing this animation with the iPhone SDK? I'm not sure MPMoviePlayerController is a good idea, because my animation doesn't have any background.
The best way is to create a normal movie file and play it with the standard controller.
If that's unacceptable, you can do something like this: two threads, where one loads images from disk and the other shows them every 1/15 (or 1/30) of a second; a minimal sketch follows the list below. There are several difficulties:
iPhone flash storage speed is very limited, so you can do something like a 128x128 animation with ease and fail with a full-screen one.
Using the flash storage this heavily drains the battery very fast.
You can also store part of your animation in memory (every second frame, for example) so you have to load less from flash.
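A minimal sketch of the load-on-one-thread, display-on-another idea using GCD. The frame naming (frame0.png ... frame399.png), the frameIndex and imageView properties, and the timer wiring are assumptions:

#import <UIKit/UIKit.h>

// Called from e.g. an NSTimer firing 15 times per second:
// [NSTimer scheduledTimerWithTimeInterval:1.0 / 15.0 target:self
//                                selector:@selector(showNextFrame)
//                                userInfo:nil repeats:YES];
- (void)showNextFrame
{
    NSInteger index = self.frameIndex;
    self.frameIndex = (index + 1) % 400;

    // Load the PNG off the main thread...
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSString *name = [NSString stringWithFormat:@"frame%d", (int)index];
        NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];

        // imageWithContentsOfFile: does not cache, so memory stays bounded to the
        // one or two frames in flight (unlike imageNamed:, which caches everything).
        UIImage *frame = [UIImage imageWithContentsOfFile:path];

        // ...then hand it to the main thread for display.
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = frame;
        });
    });
}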

iPhone OpenGLES: Textures are consuming too much memory and the program crashes with signal "0"

I am not sure what the problem is. My app runs fine on the simulator, but when I try to run it on the iPhone it crashes, with or without debugging, with signal "0". I am using Texture2D.m and OpenGLES2DView.m from the examples provided by Apple. I profiled the app on the iPhone with Instruments using the memory tracer from the library, and when the app died the final memory consumed was about 60 MB real and 90+ MB virtual. Is there some other problem, or is the iPhone just killing the application because it has consumed too much memory? If you need any information please state it and I will try to provide it. I am creating thousands of textures at load time, which is why the memory consumption is so high. I really can't do anything about reducing the number of pics being loaded. I was running on plain UIImage before, but it was giving me really low frame rates; I read on this site that I should use OpenGL ES for higher frame rates.
Also, as a sub-question: is there any way to avoid using UIImage to load the PNG file and then the provided Texture2D class to create the texture for OpenGL ES to draw with? Is there some function in OpenGL ES that will create a texture straight from a PNG file?
Thousands of textures? Really? How many of them are on the screen at one time? Perhaps you can load only some of them at a time, or, if they're small, combine them into fewer, larger textures.
The general guideline I've heard is that you are limited to 24 MB of texture memory.
There's nothing built into OpenGL ES that loads from disk, but you can use an image loader like stb_image to do it yourself.
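A minimal sketch of the stb_image route (stb_image is a single-file public-domain loader; drop it into the project and define STB_IMAGE_IMPLEMENTATION in exactly one source file). The function name here is made up, and note that OpenGL ES 1.1 needs power-of-two texture dimensions unless the non-power-of-two extension is available:

#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"
#include <OpenGLES/ES1/gl.h>

GLuint LoadTextureFromFile(const char *path)
{
    int width = 0, height = 0, channels = 0;

    // Force 4 channels (RGBA) so the glTexImage2D format below is always valid.
    unsigned char *pixels = stbi_load(path, &width, &height, &channels, 4);
    if (pixels == NULL) {
        return 0; // file missing or not a supported image format
    }

    GLuint texture = 0;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // The driver now owns a copy of the data; free the CPU-side pixels right
    // away to keep memory use down.
    stbi_image_free(pixels);
    return texture;
}

Loading this way skips the UIImage/CGContext round trip, which also answers the sub-question above.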
I tried loading ten texture pieces of 2048x2048 pixels each.
The texture memory exceeded 24 MB, but the iPhone 3GS was able to load and render them.
I also recommend stb_image or the SOIL texture loader.
(The SOIL library uses stb_image internally.)

iPhone short animation: video or image sequence?

I have read several posts on both topics, but I haven't seen anyone compare them so far.
Suppose I just want a full-screen animation without any transparency etc., just a couple of seconds of animation (1-2 s) when the app starts. Does anyone know how "video" compares to a "sequence of images" (320x480 at 30 fps) on the iPhone, regarding performance etc.?
I think there are a few points to think about here.
Size of the animation, as pointed out above. You could try a frame rate of 15 images per second, so that could be 45 images for 3 s. That is quite a lot of data.
The video would, as mentioned before, be compressed in H.264 (Baseline Profile Level 3.0) format or MPEG-4 Part 2 video (Simple Profile) format, which means it's going to be reasonably small.
I think you will need to go for video because:
1. 45 full-screen PNG images are going to require a lot of RAM. I don't think this is going to work that well.
Lastly, you will need to add the Media Player framework, which will have to be loaded into memory, and this is going to increase your load times.
MY ADVICE: It sounds like the animation is a bit superfluous to the app. I hate apps that take ages to load, and this is only going to increase your app's startup time. If you can avoid doing this, then don't. Make your app fast. If you can do this at some other time after load, then that is cool.
The video will be a lot more compressed than a sequence of images, because video compression takes previous frame data into account to reduce the bitrate. It will take more power to decode; however, the iPhone has hardware for that, and the OS has APIs that use this hardware, so I wouldn't feel bad about making use of them.
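A sketch of playing such a clip with one of those hardware-backed APIs, MPMoviePlayerController. The file name intro.mp4, the moviePlayer property, and the introFinished: selector are assumptions:

#import <MediaPlayer/MediaPlayer.h>

// Play a short bundled intro movie full screen using the hardware decoder.
// self.moviePlayer must be a retained/strong property so the controller is
// not deallocated while the clip is still playing.
- (void)playIntroMovie
{
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"intro" withExtension:@"mp4"];
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:url];
    player.controlStyle = MPMovieControlStyleNone; // no transport controls for a splash animation
    player.view.frame = self.view.bounds;
    [self.view addSubview:player.view];
    self.moviePlayer = player;

    // Tear the player down when the clip ends so the app UI underneath appears.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(introFinished:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:player];
    [player play];
}

- (void)introFinished:(NSNotification *)note
{
    [self.moviePlayer.view removeFromSuperview];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:self.moviePlayer];
    self.moviePlayer = nil;
}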
Do not overlook the possibility of rendering the sequence in real time.