Applying Effect to iPhone Camera Preview "Video" Using OpenGL

My goal is to write a custom camera view controller that:
Can take photos in all four interface orientations with both the back and, when available, front camera.
Properly rotates and scales the preview "video" as well as the full resolution photo.
Allows a (simple) effect to be applied to BOTH the preview "video" and full resolution photo.
My previous effort is documented in this question. My latest attempt was to modify Apple's sample GLVideoFrame (from WWDC 2010). However, I have not been able to get the iPhone 4 to display the preview "video" properly when the session preset is AVCaptureSessionPresetPhoto.
Has anyone tried this or know why the example doesn't work with this preset?
Apple's example uses a preset with 640x480 video dimensions and a default texture size of 1280x720. The iPhone 4 back camera delivers only 852x640 when the preset is AVCaptureSessionPresetPhoto.
iOS device camera video/photo dimensions when preset is AVCaptureSessionPresetPhoto:
iPhone 4 back: video is 852x640 & photos are 2592x1936
iPhone 4 front: video & photos are 640x480
iPod Touch 4G back: video & photos are 960x720
iPod Touch 4G front: video & photos are 640x480
iPhone 3GS: video is 512x384 & photos are 2048x1536
Update
I got the same garbled video result when I switched Brad Larson's ColorTracking example (blog post) over to AVCaptureSessionPresetPhoto.

The issue is that AVCaptureSessionPresetPhoto is context-aware: it runs at different resolutions depending on whether you are previewing video or capturing a still image.
The live preview is different for this mode because it pads each row of the buffer with extra bytes. I'm guessing this is some sort of hardware optimization.
In any case, you can see how I solved the problem here:
iOS CVImageBuffer distorted from AVCaptureSessionDataOutput with AVCaptureSessionPresetPhoto
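For reference, here is a minimal sketch of the kind of fix that answer describes (it is not that answer's exact code; it assumes a kCVPixelFormatType_32BGRA video data output feeding an OpenGL ES 1.1 texture, as in the ColorTracking example): use the buffer's bytes-per-row instead of assuming width * 4, and repack the rows when they are padded.
#import <AVFoundation/AVFoundation.h>
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>
// AVCaptureVideoDataOutputSampleBufferDelegate callback.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    size_t width       = CVPixelBufferGetWidth(pixelBuffer);        // e.g. 852 on the iPhone 4 back camera
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);       // e.g. 640
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);  // may exceed width * 4 under PresetPhoto
    uint8_t *src       = CVPixelBufferGetBaseAddress(pixelBuffer);
    if (bytesPerRow == width * 4) {
        // Rows are tightly packed: upload the buffer directly.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                     0, GL_BGRA, GL_UNSIGNED_BYTE, src);
    } else {
        // Rows are padded: repack them before uploading, otherwise the texture comes out skewed/garbled.
        uint8_t *packed = malloc(width * height * 4);
        for (size_t row = 0; row < height; row++) {
            memcpy(packed + row * width * 4, src + row * bytesPerRow, width * 4);
        }
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                     0, GL_BGRA, GL_UNSIGNED_BYTE, packed);
        free(packed);
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}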

AVCaptureSessionPresetPhoto is for taking still pictures, not for capturing a live feed. You can read about it here: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html
(My belief is that these are actually two different cameras or sensor modes, as they behave very differently, and there's a couple of seconds' delay just for switching between the Photo preset and, say, 640x480.)
You can't even use both presets at the same time, and switching between them is a headache as well - How to get both the video output and full photo resolution image in AVFoundation Framework
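For what it's worth, switching a running session between the two presets is normally wrapped in beginConfiguration / commitConfiguration. A minimal sketch, assuming captureSession is your configured AVCaptureSession; each switch can take a noticeable amount of time, as noted above:
// Drop to a video-friendly preset for the live preview...
[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPreset640x480;
[captureSession commitConfiguration];
// ...and switch back just before grabbing the full-resolution still.
[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
[captureSession commitConfiguration];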
HTH, although not what you wanted to hear...
Oded.

Related

OpenCV detect iPhone orientation

I have a site where users can upload video. When testing some uploads that are processed with OpenCV and Python, I found that if the video was recorded on an iPhone, it is always assumed to have been taken in landscape mode with the phone rotated 90 degrees to the left, so videos shot in portrait come out sideways and videos shot in the other landscape direction (90 degrees to the right) come out upside down.
I know I can use OpenCV to rotate videos, but is there a way to detect:
a) whether the video was even taken with an iPhone or not, and
b) if so, what the orientation should be, i.e. how much to rotate the video by.
OpenCV is a computer vision library; for this problem you can't use OpenCV (AFAIK). What you need is the video's metadata, which contains all the information you need about the video. Here you can see what metadata contains. You should look into how to extract metadata from a video. Take a look at this.
Good luck!

Is it possible to programmatically capture iPhone 5S slow motion video?

I couldn't find an answer to this question, and looking at Apple's own apps like iMovie on iOS, the video picker does not offer a slow-motion option on the iPhone 5S.
The image picker offers very little control over the video. If you are willing to dive deeper into the APIs you can use AVFoundation to capture your video and manipulate the camera properties as you see fit.
iOS 7 introduces a new AVCaptureDeviceFormat class that will give you the maximum and minimum supported frame rates for the capture device, and you can use these to set a custom frame rate on the camera itself. I don't have an iPhone 5S to hand, so I can't actually verify whether this API goes all the way up to 120 fps.
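If it helps, here is a hedged sketch of that iOS 7 API (essentially the pattern from Apple's AVCaptureDevice documentation; whether a 120 fps format actually shows up on the 5S is exactly the part I can't confirm):
#import <AVFoundation/AVFoundation.h>
// Pick the device format with the highest supported frame rate and lock it in (iOS 7+).
- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestRange = nil;
    for (AVCaptureDeviceFormat *format in device.formats) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (bestRange == nil || range.maxFrameRate > bestRange.maxFrameRate) {
                bestFormat = format;
                bestRange = range;
            }
        }
    }
    NSError *error = nil;
    if (bestFormat && [device lockForConfiguration:&error]) {
        device.activeFormat = bestFormat;
        // Clamp the frame duration so the camera actually runs at that maximum rate.
        device.activeVideoMinFrameDuration = bestRange.minFrameDuration;
        device.activeVideoMaxFrameDuration = bestRange.minFrameDuration;
        [device unlockForConfiguration];
    }
}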

How to record screen to video on iPhone with openGL (view preview layer) and UIkit elements?

I have searched everywhere and tried mixing and matching different bits of code but I haven't found anything that works or anyone with the same question.
Basically I want to be able to create video demos of iPhone apps that include standard UIKit elements and also the image coming from the camera (the video preview layer). I don't want to use AirPlay or the iOS Simulator to project onto the desktop and then capture, because I want to be able to make videos outdoors in public. I have successfully been able to capture the screen to video with this code, but the video preview layer comes out blank. I read that this is because it's rendered with OpenGL and what I'm capturing comes from the CPU, not the GPU. I have successfully used Brad Larson's GPUImage to capture the video preview layer, but it doesn't capture the rest of the UIView. I have seen code that combines both and converts to an image, but I'm not sure whether that would be too slow for real-time video capture. Can someone point me in the right direction?
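For context, the CPU-side capture described above typically boils down to something like this sketch (snapshotOfView: is a made-up helper name), which is exactly why the GPU-rendered preview layer comes out blank:
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>
// CPU-side snapshot of a view hierarchy. GPU-backed layers (OpenGL views,
// AVCaptureVideoPreviewLayer) are NOT drawn by -renderInContext:, so the
// camera preview appears blank in the resulting image.
- (UIImage *)snapshotOfView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}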
It might not be the cleanest solution, but it will work nonetheless: have you considered jailbreaking? I hope Apple doesn't sue me for this one, but if you really want to record your screen then simply install a screen recorder. Enough options can be found: http://www.google.be/search?q=iphone+jailbreak+record+screen
And if you don't like it: restore your phone from a previous backup.
(For the record: I'm against jailbreaking; I'm only posting this from a productivity point of view.)

AVFoundation Camera Preview Screen gives wrong zoom

I'm currently developing an app with camera functionality: a custom camera screen featuring a live preview and an overlay.
I'm using the AVFoundation classes and methods, since UIScreenCapture is no longer an option.
The problem I have is that the preview data I get from AVCaptureSession is too zoomed in. If I take a picture with that screen, and another with the iPhone's default Camera app without moving the iPhone, the difference in zoom is far too great.
I need the zoom in my app to match that of the default iPhone Camera app.
I've tried changing AVCaptureVideoPreviewLayer.videoGravity to each of its three possible values, to no avail.
Please, any leads on this problem are truly appreciated.
Arcantos' solution was mostly correct. That will work assuming you're on an iPhone 3G (or any device with a camera that supports 640x480). An iPhone 4 may run into some issues there.
A more correct way would be to test for the availability of and apply this preset:
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
This will use the raw camera data, regardless of the native resolution.
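That availability test is usually done with -canSetSessionPreset:. A minimal sketch, assuming captureSession is the AVCaptureSession from the question:
if ([captureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto;   // raw sensor aspect ratio
} else {
    captureSession.sessionPreset = AVCaptureSessionPreset640x480; // fallback for older devices
}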
Turned out to be a resolution issue.
It was fixed by using
myCaptureSession.sessionPreset = AVCaptureSessionPreset640x480;
Note that the iPhone 3G does not support that, so you have to check whether the device supports it first:
[[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] supportsAVCaptureSessionPreset:AVCaptureSessionPreset640x480];
Is the aspect ratio of your preview pane identical to that of the camera capture data? If not, the OS may be changing the zoom to fit the data rect into your requested aspect ratio.

How to programmatically produce a video on the iPad platform?

I would like to programmatically produce a video using the microphone on the iPad (for sound) and the screen display (for visuals). Is this possible? How should I proceed?
Possible, but not easy. A rough outline:
Start capturing the audio: How do I record audio on iPhone with AVAudioRecorder?
Capture individual images of the screen as fast as you can: How to capture current view screenshot and reuse in code? (iPhone SDK)
Later, combine the two into a video. I have no clue how to do this part.
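For that last step, here is a rough, hedged sketch of one way to stitch the screenshots into a movie with AVAssetWriter. The output path, dimensions, frame rate, and the pixelBufferFromImage: helper are all placeholders, and the recorded audio would still need to be muxed in afterwards (e.g. with AVMutableComposition):
#import <AVFoundation/AVFoundation.h>
// Append a series of UIImage screenshots to a QuickTime movie at roughly 10 fps.
- (void)writeMovieFromScreenshots:(NSArray *)screenshots
{
    NSError *error = nil;
    NSURL *outputURL = [NSURL fileURLWithPath:@"/tmp/screen.mov"]; // placeholder path
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @768,
                                AVVideoHeightKey : @1024 };
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                         sourcePixelBufferAttributes:nil];
    [writer addInput:input];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
    for (NSUInteger frame = 0; frame < screenshots.count; frame++) {
        // pixelBufferFromImage: is a hypothetical helper that draws a UIImage into a CVPixelBuffer.
        CVPixelBufferRef buffer = [self pixelBufferFromImage:screenshots[frame]];
        while (!input.readyForMoreMediaData) { /* in real code, use requestMediaDataWhenReadyOnQueue: */ }
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake((int64_t)frame, 10)];
        CVPixelBufferRelease(buffer);
    }
    [input markAsFinished];
    [writer finishWriting];
}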