Realtime Video Processing on iPhone

We have a requirement to 'read' an LED pulse lamp using the iPhone video camera. The LED lamp emits light based on some load conditions.
Is there any related iPhone API to help achieve this goal?
Thanks much.

You can use the AVFoundation framework to read and process the live video stream from the camera. WWDC 2010 Session 405 gives you a good overview of AVFoundation.

There are iOS AV APIs to get raw pixel bitmaps from the video camera(s). Detecting a specific image or brightness level within those raw bitmaps has to be done in your own code.
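As an example, here is a minimal Swift sketch of that approach: an AVCaptureVideoDataOutput delegate receives each frame as a BGRA pixel buffer and estimates its average brightness. The class name and sampling step are illustrative; deciding which brightness level counts as the LED being 'on' is your own thresholding logic.

```swift
import AVFoundation

final class BrightnessReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        // BGRA keeps the pixel layout simple to walk by hand.
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let bytes = base.assumingMemoryBound(to: UInt8.self)

        // Sample every 4th pixel in each dimension to keep per-frame work cheap.
        var total = 0, count = 0
        for y in stride(from: 0, to: height, by: 4) {
            for x in stride(from: 0, to: width, by: 4) {
                let p = y * bytesPerRow + x * 4
                // Rough luma from the B, G, R bytes of a BGRA pixel.
                total += (Int(bytes[p]) + Int(bytes[p + 1]) + Int(bytes[p + 2])) / 3
                count += 1
            }
        }
        let averageBrightness = total / max(count, 1)
        // Compare against a threshold frame-to-frame to recover the LED's pulses.
        print("average brightness:", averageBrightness)
    }
}
```

Tracking how the per-frame average crosses a threshold over time gives you the pulse pattern of the lamp.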

Related

Live Streaming Video on iPhone

I am new to iPhone development. I need to capture video, and while I'm capturing, the video should be displayed on a server too, something like live streaming.
Does anyone have an idea of where I should start for this functionality?
Thanks in advance.
Your question seems similar to this one:
Xcode ios: Streaming of video file while recording
First Half Solution
Using AVFoundation, you can get the video buffer/frames while recording.
Second Half
But for uploading, I didn't find any solution.
There is an input-stream option in the iOS APIs, but it needs a file path; since the video isn't recorded to a file, we don't have a path. One workaround is to write the incoming frames to a file yourself, as sketched below.
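Here is a minimal Swift sketch of that workaround, assuming sample buffers already arrive from an AVCaptureVideoDataOutput delegate: append them to an AVAssetWriter so that a growing movie file (and therefore a path to upload from) exists while recording is still in progress. The class name and output settings are illustrative.

```swift
import AVFoundation

final class SegmentWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private var started = false

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        // Example output settings; tune codec and dimensions for your stream.
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
    }

    // Call this from captureOutput(_:didOutput:from:) with each video buffer.
    func append(_ sampleBuffer: CMSampleBuffer) {
        if !started {
            writer.startWriting()
            writer.startSession(atSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            started = true
        }
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

An uploader can then read the file from disk, or you can close and ship short segments in turn.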
Edit 1
Here is the best example for AVFoundation provided by Apple; you can start with that.
I recommend you use Wowza (https://www.wowza.com); it has all the features, from live streaming to video on demand and more.

iPhone Video API: modifying a live video stream

I have a question about video stream processing. Is it possible to access and modify the real-time video stream during recording (e.g. to add some text to the video)? I can do this in the preview by getting separate frames, but I'm looking for a tool that will let me store the video with my text in the video frames.
Probably there are already some libraries/tools available, but I haven't found any yet.
Try the GPUImage library. It can help you.
You should check the AVCam sample code by Apple. That might be a starting point.
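If recording first and burning the text in afterwards is acceptable, stock AVFoundation can also do it with AVVideoCompositionCoreAnimationTool, which composites a Core Animation layer over the video on export. A minimal Swift sketch; the function name, font size, and layout are illustrative:

```swift
import AVFoundation
import UIKit

// Export a recorded asset with a text layer burned into the frames.
func exportWithOverlay(asset: AVAsset, to outputURL: URL,
                       text: String,
                       completion: @escaping () -> Void) {
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let size = track.naturalSize

    // Layer tree: the video layer and a text layer inside one parent layer.
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: size)
    let textLayer = CATextLayer()
    textLayer.string = text
    textLayer.fontSize = 48
    textLayer.foregroundColor = UIColor.white.cgColor
    textLayer.frame = CGRect(x: 0, y: 40, width: size.width, height: 60)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(textLayer)

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let export = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality) else { return }
    export.videoComposition = composition
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously(completionHandler: completion)
}
```

For true live modification during recording, GPUImage (as suggested above) is the more direct route.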

Capturing video while processing it through a shader on iPhone

I am trying to develop an iPhone app that processes/filters and records video.
I have two sample apps that have aspects of what I need and am trying to combine them.
1. AVCamDemo from the WWDC10 sample code package (Apple Developer ID required). This deals with capturing/recording video.
2. Brad Larson's ColorTracking sample app, referenced here. This deals with live processing of video using OpenGL ES.
I get stuck when trying to combine the two.
What I have been trying to do is use AVCaptureVideoDataOutput and the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video frames through OpenGL ES (as in 2), and at the same time somehow use AVCaptureMovieFileOutput to record the processed video (as in 1).
Is this approach possible? If so, how would I need to set up the connections within the AVCaptureSession?
Or do I need to use the AVCaptureVideoDataOutputSampleBufferDelegate to process/filter the video AND then recombine the individual frames back into a movie, without using AVCaptureMovieFileOutput to save the movie file?
Any suggestions for the best approach to accomplish this are much appreciated!
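As far as I know, AVCaptureMovieFileOutput only writes the frames as the camera captured them, so the second approach is the workable one: process each frame yourself and append the processed pixel buffers to an AVAssetWriter through a pixel-buffer adaptor. A minimal Swift sketch, where filter(_:) is a hypothetical stand-in for the OpenGL ES shader pass:

```swift
import AVFoundation

final class ProcessedRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var started = false

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height,
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input, sourcePixelBufferAttributes: nil)
        writer.add(input)
    }

    // Called from the AVCaptureVideoDataOutputSampleBufferDelegate callback.
    func process(_ sampleBuffer: CMSampleBuffer) {
        guard let raw = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !started {
            writer.startWriting()
            writer.startSession(atSourceTime: time)
            started = true
        }
        let processed = filter(raw) // hypothetical shader pass
        if input.isReadyForMoreMediaData {
            adaptor.append(processed, withPresentationTime: time)
        }
    }

    // Placeholder: a real implementation would render the frame through
    // the OpenGL ES filter and return the filtered pixel buffer.
    private func filter(_ buffer: CVPixelBuffer) -> CVPixelBuffer {
        return buffer
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```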

How to record videos from an iPhone app

Does anyone know how to record videos from an iPhone app without making the app close?
It would be greatly helpful if anyone could provide sample code.
Maybe you can look at this Apple note:
How to capture video frames from the camera as images using AV Foundation
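That note covers grabbing frames as images; if the goal is just to record a movie inside your own UI, AVCaptureMovieFileOutput in an AVCaptureSession does it without presenting a separate camera screen. A minimal Swift sketch (error handling and the preview layer omitted; names are illustrative):

```swift
import AVFoundation

final class InAppRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let session = AVCaptureSession()
    private let movieOutput = AVCaptureMovieFileOutput()

    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video),
              let mic = AVCaptureDevice.default(for: .audio) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addInput(try AVCaptureDeviceInput(device: mic))
        session.addOutput(movieOutput)
        session.startRunning()
    }

    func startRecording(to url: URL) {
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    func stopRecording() {
        movieOutput.stopRecording()
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // The finished movie is at outputFileURL; save or upload it here.
    }
}
```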

Video API for iPhone

I want to play H.264 video streamed from the network. To play video, the iPhone provides the Media Player API. Is anyone aware of any documented or undocumented API for decoding and playing a single video frame?
Is it live video or on demand?
If live, the only way is to use iPhone OS 3 with MediaPlayer.
Take a look at the discussion groups from Apple; there are some interesting threads.
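For playing a network stream today, AVPlayer is the successor to that MediaPlayer route. A minimal Swift sketch; the stream URL is a placeholder:

```swift
import AVKit
import AVFoundation
import UIKit

// Present a player for an H.264 network stream (e.g. an HLS playlist).
func playStream(from viewController: UIViewController) {
    guard let url = URL(string: "https://example.com/stream/index.m3u8") else { return }
    let player = AVPlayer(url: url)
    let playerVC = AVPlayerViewController()
    playerVC.player = player
    viewController.present(playerVC, animated: true) {
        player.play()
    }
}
```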