How do I take, for example, 25 photos in one second at low resolution?
I want to make a burst mode, but with a lot of images at low resolution (640x480).
Later I will need to increase that to 40 fps, i.e. take 40 photos in one second.
25 frames per second is very close to the iPhone's movie frame rate of 24-30 fps. Why not record a movie, then pull the frames out of the movie to get your shots?
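If you take that approach, a rough sketch of pulling the frames back out with AVAssetImageGenerator could look like the following (moviePath is a placeholder for wherever the recording was saved, and the exact-frame tolerance properties need iOS 5 or later):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];

AVAssetImageGenerator *generator =
    [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.requestedTimeToleranceBefore = kCMTimeZero; // exact frames rather
generator.requestedTimeToleranceAfter  = kCMTimeZero; // than nearest keyframes

// Ask for 25 evenly spaced frames from the first second of the movie.
NSMutableArray *times = [NSMutableArray array];
for (int i = 0; i < 25; i++) {
    [times addObject:[NSValue valueWithCMTime:CMTimeMake(i, 25)]];
}

[generator generateCGImagesAsynchronouslyForTimes:times
                                completionHandler:^(CMTime requestedTime,
                                                    CGImageRef image,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *photo = [UIImage imageWithCGImage:image];
        // Save or process the extracted "photo" here.
    }
}];
```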
The existing camera access in iOS spits out frames at about 30 fps. You can't increase that, period. (I assume this is due to the underlying hardware, but I don't know for sure.)
You'll also have a hard time with memory: 640 x 480 pixels x 3 bytes (RGB) x 25 frames ≈ 23 MB of RAM. Getting that much memory -- much less that much every second -- on iOS is going to be challenging.
To get started with this, take a look around for AVFoundation sample code -- there's plenty of it to be had -- and experiment a bit to see what you can realistically and reliably get out of a device.
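To give a flavour of what that looks like, here is a minimal sketch rather than a drop-in implementation; it assumes a controller class with a strong session property that adopts AVCaptureVideoDataOutputSampleBufferDelegate (error handling omitted):

```objc
#import <AVFoundation/AVFoundation.h>

// Keep the session in a strong ivar/property so it isn't deallocated.
- (void)startCapture
{
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPreset640x480;

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Drop late frames instead of queueing them and exhausting memory.
    output.alwaysDiscardsLateVideoFrames = YES;
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("frames", NULL)];
    [self.session addOutput:output];

    [self.session startRunning];
}

// Called once per frame, at roughly the sensor's rate (~30 fps at best).
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert or store the pixel buffer here; keep this fast,
    // or frames will be discarded.
}
```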
I am currently developing my iPhone app with OpenGL ES. It is a mirror app with brightness and contrast controls. The problem I am having is that it feels a bit laggy (about a 0.2 s delay) in use, even though the frame rate is about 60 fps. So my question is: which part of OpenGL takes time to process?
What you have is lag (not slowness), and it's not caused by OpenGL (at least not entirely). The latency comes from the camera and from the process of reading and decoding the camera pictures.
Some latency is unavoidable:
It takes a whole video frame for the camera to capture the image and encode it into digital data.
It takes a whole display frame to draw the frame to the display.
So the shortest lag you can get is about 1/30 s + 1/60 s = 0.05 s.
Any latency above that is processing overhead. Most likely yours comes from decoding the image, and perhaps from buffer allocations in that process. However, I'd need to see your source code to say for sure.
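If it does turn out to be buffer allocation on the texture-upload path, one common pattern is to allocate the GL texture once and only update its contents each frame. A sketch, assuming 640x480 32BGRA camera buffers:

```objc
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#import <CoreVideo/CoreVideo.h>

// One-time setup: create the texture and reserve its storage once.
GLuint cameraTexture;
glGenTextures(1, &cameraTexture);
glBindTexture(GL_TEXTURE_2D, cameraTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// Non-power-of-two textures on ES 2 need clamp-to-edge wrapping.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// NULL data: allocate storage without uploading anything yet.
// GL_BGRA_EXT matches 32BGRA camera buffers (an Apple extension on iOS).
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 640, 480, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, NULL);

// Per frame: overwrite the existing storage instead of reallocating it.
// "pixelBuffer" would come from CMSampleBufferGetImageBuffer().
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 640, 480,
                GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                CVPixelBufferGetBaseAddress(pixelBuffer));
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
```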
I tested my app with Instruments:
When scrolling table views it's around 20-50 fps (closer to 25 on average). Is that good enough? I've reused the table view cells and done quite a lot of optimisation.
Slow frame rates are like pornography - hard to define but you know it when you see it.
In other words, worry about the feel, not the numbers. If it feels laggy, it is.
20-50 is great. Television has a frame rate of around 30 fps, and film is typically 24. For non-gaming situations you can do just fine going as low as 15; rates lower than that start to feel laggy.
I made a board game which includes just a few small animations. I reduced the fps from 60 to 30 to cut the processor load, but the device still gets very warm.
Another application, made without cocos2d, doesn't heat it up nearly as much.
Are there any methods to calm the iPhone down?
The device state is as follows:
Wifi is always enabled
The app uses gamecenter
GPS is inactive
fps is capped at 30
I use cocos2d-iphone as the engine
It might be worth experimenting with different director types, e.g. kCCDirectorTypeNSTimer, and seeing if that helps at all. Those will have the biggest effect on the main loop of the game.
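For example, a quick sketch against cocos2d-iphone 1.x (the call has to happen before the director is first used):

```objc
// In the app delegate, before [CCDirector sharedDirector] is first used:
[CCDirector setDirectorType:kCCDirectorTypeNSTimer];

// Or prefer the CADisplayLink-based director, falling back if unavailable:
if (![CCDirector setDirectorType:kCCDirectorTypeDisplayLink])
    [CCDirector setDirectorType:kCCDirectorTypeDefault];
```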
You should also spend some time with Instruments if you've not already, as that will show you where the CPU is spending its time and give you some hints on where you could ease things up.
I've noticed that a sequence of short animations in cocos2d takes a lot of processor time. I tried making hints on the level pulse in size: 0.1 s scaling up, 0.15 s scaling down, and 0.2 s holding, all wrapped in a repeat-forever sequence. Everything was terribly slow. When I drove the animation manually instead, the device calmed down and the fps went back up to 60.
When showing menus or dialogs that do not require animation, you can actually lower your framerate even further.
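In cocos2d-iphone that is a one-line change on the director; for instance (15 fps for menus is an arbitrary choice):

```objc
// Entering a static menu: drop to 15 fps (an arbitrary low rate).
[[CCDirector sharedDirector] setAnimationInterval:1.0/15.0];

// Returning to gameplay: restore the usual 30 fps.
[[CCDirector sharedDirector] setAnimationInterval:1.0/30.0];
```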
I'm developing a game for the iPhone. I've decided that 30 FPS is plenty, so I've written some code that only allows the app to present the render buffer every 1/30 of a second. When I tried to verify this with Instruments I got varying information.
On an iPod Touch (2009 model, 32 GB) it reports 30 FPS for Core Animation Frames Per Second.
On an iPhone 3G I get wildly varying results, and not just below 30 FPS: I see more than 30 FPS on a regular basis, and it actually seems to hover closer to 36-39.
To investigate this anomaly I added my own FPS counter to the app, updated once per second. It stays right at 29 FPS on both devices.
So, does anyone have any suggestions as to what might be going on? I expect Instruments to be accurate so it really concerns me that it appears inaccurate. It makes me think I have a bug somewhere, but I sure can't find it.
Are you using CADisplayLink? This might give you a little bit more precision on your main loop.
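For reference, a minimal CADisplayLink loop pinned to 30 fps might look like this (a sketch: render: is a placeholder method, and a frameInterval of 2 assumes the usual 60 Hz display):

```objc
#import <QuartzCore/QuartzCore.h>

// Call once, e.g. when the GL view appears.
- (void)startRenderLoop
{
    CADisplayLink *link =
        [CADisplayLink displayLinkWithTarget:self selector:@selector(render:)];
    link.frameInterval = 2; // fire on every 2nd vsync: 60 Hz / 2 = 30 fps
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
}

- (void)render:(CADisplayLink *)sender
{
    // Draw and present the frame here. Because the callback is tied to
    // the display's vsync, the interval should be a steady 1/30 s.
}
```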
I have read several posts on both matters, but I haven't seen anyone compare them so far.
Suppose I just want a full-screen animation without any transparency etc., just a short animation (1-2 s) when the app starts. Does anyone know how video compares to a sequence of images (320x480 @ 30 fps) on the iPhone, regarding performance etc.?
I think there are a few points to think about here.
Size of the animation, as pointed out above. You could try a frame rate of 15 images per second, so that would be 45 images for 3 s. That is quite a lot of data.
The video would be compressed, as mentioned before, in H.264 (Baseline Profile Level 3.0) format or MPEG-4 Part 2 (Simple Profile) format, which means it is going to be reasonably small.
I think you will need to go for video because:
1. 45 full-screen PNG images are going to require a lot of RAM: decoded at 320 x 480 x 4 bytes (RGBA), each frame is about 0.6 MB, or roughly 27 MB for all 45. I don't think this is going to work that well.
2. You will need to add the Media Player framework, which will have to be loaded into memory, and this is going to increase your load times.
MY ADVICE: It sounds like the animation is a bit superfluous to the app. I hate apps that take ages to load, and this is only going to increase your app's startup time. If you can avoid doing this, then don't do it; make your app fast. If you can do this at some other time after load, then that is cool.
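If you do ship the video, a rough sketch with MPMoviePlayerController (intro.m4v, introPlayer and introFinished: are all placeholders):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Keep the player in a strong ivar/property so it is not
// deallocated while the clip is playing.
NSURL *url = [[NSBundle mainBundle] URLForResource:@"intro"
                                     withExtension:@"m4v"];
self.introPlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
self.introPlayer.controlStyle = MPMovieControlStyleNone; // no scrubber
self.introPlayer.view.frame = self.window.bounds;
[self.window addSubview:self.introPlayer.view];

// Tear the player down as soon as the clip finishes.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(introFinished:)
                                             name:MPMoviePlayerPlaybackDidFinishNotification
                                           object:self.introPlayer];
[self.introPlayer play];
```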
The video will be a lot more compressed than a sequence of images, because video compression takes previous frame data into account to reduce the bitrate. It will take more power to decode; however, the iPhone has dedicated hardware for that, and the OS has APIs that use this hardware, so I wouldn't feel bad about making use of them.
Do not overlook the possibility of rendering the sequence in real time.
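For instance, if the intro is simple enough, say a fade and scale on a logo, a few UIKit animation lines can replace the whole asset (a sketch; logoView is a placeholder):

```objc
#import <UIKit/UIKit.h>

// A 1.5 s fade-and-scale on a single logo view: no video, no frame sequence.
logoView.alpha = 0.0f;
logoView.transform = CGAffineTransformMakeScale(0.8f, 0.8f);
[UIView animateWithDuration:1.5 animations:^{
    logoView.alpha = 1.0f;
    logoView.transform = CGAffineTransformIdentity;
}];
```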