iOS 7 - Outputting frames with AVCaptureMetadataOutput?

I am trying to use the iOS 7 QR-code reading features of the AVFoundation framework with the following code:
-(void)setupCaptureSession_iOS7 {
    self.session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        NSLog(@"Error: %@", error);
        return;
    }
    [self.session addInput:input];

    // Turn on point autofocus for the middle of the view
    if ([device lockForConfiguration:&error]) {
        [device setFocusPointOfInterest:CGPointMake(0.5, 0.5)];
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    // Add the metadata output
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:output];

    NSLog(@"%lu", (unsigned long)output.availableMetadataObjectTypes.count);
    for (NSString *s in output.availableMetadataObjectTypes)
        NSLog(@"%@", s);

    // You should check availableMetadataObjectTypes here; requesting a type
    // the session doesn't support raises an exception.
    output.metadataObjectTypes = @[AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeUPCECode];

    // Note: rectOfInterest is expressed in normalized coordinates
    // (the default is {0, 0, 1, 1}), not in pixels.
    output.rectOfInterest = CGRectMake(0, 0, 320, 480);

    [self.session startRunning];
}
This code obviously doesn't render the frames to the screen (yet). That's because, instead of using the AVCaptureVideoPreviewLayer class to display the preview, I need to display the frames as a UIImage (I want to display the frames in several places on the view).
If I use AVCaptureVideoDataOutput as the output, I'm able to export the frames by grabbing them from the captureOutput:didOutputSampleBuffer:fromConnection: callback. But I can't find an equivalent way to get the frame buffer when using AVCaptureMetadataOutput as the output.
Does anyone have any idea how to do this?
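One approach worth trying (a sketch, not confirmed by this thread): an AVCaptureSession can carry several outputs at once, so an AVCaptureVideoDataOutput could be added alongside the AVCaptureMetadataOutput, with the metadata output doing the scanning while the video data output keeps delivering frames. The queue label below is illustrative.

// Sketch: attach a video data output next to the metadata output so
// frames keep arriving via the sample-buffer delegate. Assumes the
// session configured above; the queue label is illustrative.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.alwaysDiscardsLateVideoFrames = YES;
videoOutput.videoSettings = @{ (__bridge id)kCVPixelBufferPixelFormatTypeKey :
                                   @(kCVPixelFormatType_32BGRA) };
dispatch_queue_t frameQueue = dispatch_queue_create("frameQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:frameQueue];
if ([self.session canAddOutput:videoOutput]) {
    [self.session addOutput:videoOutput];
}
// Frames then arrive in captureOutput:didOutputSampleBuffer:fromConnection:,
// where they can be converted to UIImages as before.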

Related

Strange flash behavior on AVCaptureDeviceInput

I use this code to get the flash working on the video that I record:
NSError *error = nil;

// Create the session
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Configure the session to produce lower resolution video frames, if your
// processing algorithm can cope. We'll specify medium quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPresetMedium;

// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                    error:&error];
[session addInput:input];

// Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];

// Specify the pixel format
output.videoSettings =
    [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setAlwaysDiscardsLateVideoFrames:YES];

// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.

// Turn the flash on
[session beginConfiguration];
BOOL lockAcquired = [device lockForConfiguration:&error];
if (!lockAcquired) {
    // log the error and handle it...
} else {
    if ([device hasTorch] && [device hasFlash]) {
        [device setTorchMode:AVCaptureTorchModeOn];
        [device setFlashMode:AVCaptureFlashModeOn];
    }
    [device unlockForConfiguration];
}
[session commitConfiguration];

// Start the session running to start the flow of data
[session startRunning];
But there is a strange behavior when recording starts: the flash turns on, then turns off for less than a second, then stays on permanently. Does anyone know why this happens and how I can solve it?
Try making AVCaptureSession *session and AVCaptureDeviceInput *input member variables of your class. They need to survive beyond the function's scope.
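A minimal sketch of that suggestion (the class name is illustrative, and it assumes ARC):

// Sketch: hold the capture objects as properties so they outlive the
// method that creates them; otherwise ARC releases them on return.
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@interface CameraViewController : UIViewController
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDeviceInput *input;
@end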

UIImage orientation incorrect from AVCaptureStillImageOutput

I'm trying to capture pixel data from an AVCaptureStillImageOutput and noticed that, upon cropping the image to a CGImage, it becomes re-oriented. To test this, I output a temporary image to the photo library. I've noticed that even before the image is cropped, its thumbnail is rotated while the full image is not. (This later becomes a problem when I pass the UIImage to my custom pixelDataManager, which requires the proper dimensions of the image.)
Setting captureVideoPreviewLayer.orientation = AVCaptureVideoOrientationPortrait; and [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait]; did not seem to change anything.
Any thoughts???
I'll break it up into sections...
1) Setting up the session, input, and output:
// Create a capture session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

// Add capture layer to visible view
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResize;
captureVideoPreviewLayer.frame = screenBounds;
captureVideoPreviewLayer.bounds = screenBounds;
captureVideoPreviewLayer.orientation = AVCaptureVideoOrientationPortrait;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Setup error handling
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];

// Start session
[session startRunning];

// Output image data
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
2) Setting up the video connection:
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}

if ([videoConnection isVideoOrientationSupported]) {
    [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}
3) Output the captured image:
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
//......
}
When you do this, although the image is provided to you as a UIImage, it's actually backed by the underlying Quartz CGImage data, which has a different origin point (lower left, I think), which means that when you use the image it's rotated to the side.
I found a C function that you can call with the UIImage as a parameter; it fixes the orientation and returns the corrected UIImage.
http://blog.logichigh.com/2008/06/05/uiimage-fix/
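If you'd rather avoid the external helper, a common alternative (a sketch, not taken from the linked post) is simply to redraw the UIImage; UIKit applies imageOrientation while drawing, so the returned bitmap is upright and its pixel dimensions match what the rest of your pipeline expects:

// Sketch: normalize a UIImage's orientation by redrawing it.
// UIKit honors image.imageOrientation during drawing, so the result
// is an "up"-oriented bitmap with the expected dimensions.
static UIImage *NormalizedImage(UIImage *image) {
    if (image.imageOrientation == UIImageOrientationUp) return image;
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}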

Error while recording video on iphone using AVFoundation

I'm trying to record video using AVFoundation. I can save images, but not video. When trying to save a video, I get an error saying:
[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] - no active/enabled connections.'
And here is my code:
session = [[AVCaptureSession alloc] init]; // session is a global object
session.sessionPreset = AVCaptureSessionPresetMedium;

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.imgV.bounds;
[self.imgV.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
NSLog(@"start 3");
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    NSLog(@"Error");
}
[session addInput:input];

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];

aMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:aMovieFileOutput];

[session startRunning];

[self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
// [aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
// Previously I called it directly (above), but I saw people doing it after a
// delay, thinking the session might need time to initialize, so I tried this way as well.
}

- (void)startRecording
{
    NSLog(@"startRecording");
    NSString *rootPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *plistPath = [rootPath stringByAppendingPathComponent:@"test.mov"];
    NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:plistPath];

    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:plistPath]) {
        NSLog(@"file exists %s\nurl %@", [rootPath UTF8String], fileURL);
    }
    [aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
}
Also, I am testing this on an iPhone 3G with iOS 4.1.
Also worth noting: this can happen if you've set the session preset to photo:
[session setSessionPreset:AVCaptureSessionPresetPhoto];
which should be a video-capable preset instead, for example:
[session setSessionPreset:AVCaptureSessionPresetHigh];
You should add your outputs between a [AVCaptureSession beginConfiguration] and [AVCaptureSession commitConfiguration] pair.
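For example, a minimal sketch of that pairing:

// Sketch: batch the output changes into one configuration transaction.
[session beginConfiguration];
if ([session canAddOutput:aMovieFileOutput]) {
    [session addOutput:aMovieFileOutput];
}
[session commitConfiguration];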
This error can also occur when you pass an object as the delegate to the startRecordingToOutputFileURL:recordingDelegate: method, but that object does not implement the AVCaptureFileOutputRecordingDelegate method captureOutput:didStartRecordingToOutputFileAtURL:fromConnections:.
Implement the AVCaptureFileOutputRecordingDelegate protocol. Also make sure the path of the video file is correct and that the file does not already exist, as the movie file output does not overwrite existing resources.
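For reference, a minimal sketch of that conformance (the class name is illustrative; captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: is the protocol's required method):

// Sketch: adopt AVCaptureFileOutputRecordingDelegate and implement
// its required callback. The class name is illustrative.
@interface RecorderViewController : UIViewController <AVCaptureFileOutputRecordingDelegate>
@end

@implementation RecorderViewController

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (error) {
        NSLog(@"Recording failed: %@", error);
    } else {
        NSLog(@"Saved movie to %@", outputFileURL);
    }
}

@end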

UILabel is not getting updated inside AVCaptureSession delegate

I am learning Objective-C and writing a sample app that fetches the video feed from the iPhone camera. I was able to get the feed from the camera and display it on screen. I was also trying to update a UILabel on screen for each frame of video, inside the delegate method, but the label's value is not always updated. Here is the code I am using.
This section initializes the capture:
- (void)initCapture
{
    NSError *error = nil;
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    //AVCaptureStillImageOutput *imageCaptureOutput = [[AVCaptureStillImageOutput alloc] init];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    //captureOutput.minFrameDuration = CMTimeMake(1, 1);

    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Set the video output to store frames in BGRA (it is supposed to be faster)
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    self.prevLayer.frame = CGRectMake(0, 0, 320, 320);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.videoPreview.layer addSublayer:self.prevLayer];

    [self.captureSession startRunning];
}
This method is called for each video frame.
#pragma mark AVCaptureSession delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    i++;
    self.lblStatus.text = [NSString stringWithFormat:@"%d", i];
}
I am trying to update the UILabel inside this method, but it is not always updated; there is a long delay before the label text changes.
Could someone help please?
Thanks.
Your sampleBufferDelegate's captureOutput is being called from a non-main thread - updating GUI objects from there can do no good. Try using performSelectorOnMainThread instead.
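A sketch of that fix applied to the delegate method above (dispatch_async onto the main queue is an equally valid way to get back to the main thread):

// Sketch: compute the text on the capture queue, but touch UIKit
// only on the main thread.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    i++;
    NSString *status = [NSString stringWithFormat:@"%d", i];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.lblStatus.text = status;
    });
}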

iPhone - processing frames that are being recorded by the camera

I have an app that records video, and I need to fire a method every time a frame is grabbed. After banging my head against the wall, I decided to try the following: create a dispatch queue, as I would when grabbing video from an output, just to have a method called whenever a frame is recorded by the camera.
I am trying to understand a section of code created by Apple for recording video, to figure out how I should add the dispatch queue. This is the Apple code; the section marked between asterisks is what I have added in order to create the queue. It compiles without errors, but captureOutput:didOutputSampleBuffer:fromConnection: is never called.
- (BOOL)setupSessionWithPreset:(NSString *)sessionPreset error:(NSError **)error
{
    BOOL success = NO;

    // Init the device inputs
    AVCaptureDeviceInput *videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:error] autorelease];
    [self setVideoInput:videoInput]; // stash this for later use if we need to switch cameras

    AVCaptureDeviceInput *audioInput = [[[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:error] autorelease];
    [self setAudioInput:audioInput];

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [self setMovieFileOutput:movieFileOutput];
    [movieFileOutput release];

    // Setup and start the capture session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canAddInput:videoInput]) {
        [session addInput:videoInput];
    }
    if ([session canAddInput:audioInput]) {
        [session addInput:audioInput];
    }
    if ([session canAddOutput:movieFileOutput]) {
        [session addOutput:movieFileOutput];
    }
    [session setSessionPreset:sessionPreset];

    // I added this *****************
    dispatch_queue_t queue = dispatch_queue_create("myqueue", NULL);
    [[self videoDataOutput] setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    // ******************** end of my code

    [session startRunning];
    [self setSession:session];
    [session release];

    success = YES;
    return success;
}
What I need is just a method where I can process every frame that is being recorded.
Thanks.
Having set yourself as the delegate, you'll receive a call to:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
every time a new frame is captured. You can put whatever code you want in there; just be careful, because you won't be on the main thread. It's probably safest to do a quick [target performSelectorOnMainThread:@selector(methodYouActuallyWant) withObject:nil waitUntilDone:NO] in -captureOutput:didOutputSampleBuffer:fromConnection:.
Addition: I use the following as setup in my code, and it successfully leads to the delegate method being called. I'm unable to see any substantial difference between it and what you're using.
- (id)initWithSessionPreset:(NSString *)sessionPreset delegate:(id <AAVideoSourceDelegate>)aDelegate
{
#ifndef TARGET_OS_EMBEDDED
    return nil;
#else
    if (self = [super init])
    {
        delegate = aDelegate;
        NSError *error = nil;

        // create a low-quality capture session
        session = [[AVCaptureSession alloc] init];
        session.sessionPreset = sessionPreset;

        // grab a suitable device...
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        // ...and a device input
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!input || error)
        {
            [self release];
            return nil;
        }
        [session addInput:input];

        // create a VideoDataOutput to route output to us
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [session addOutput:[output autorelease]];

        // create a suitable dispatch queue, GCD style, and hook self up as the delegate
        dispatch_queue_t queue = dispatch_queue_create("aQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        // set 32bpp BGRA pixel format, since I'll want to make sense of the frame
        output.videoSettings =
            [NSDictionary
                dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                              forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    }
    return self;
#endif
}

- (void)start
{
    [session startRunning];
}

- (void)stop
{
    [session stopRunning];
}
// create a suitable dispatch queue, GCD style, and hook self up as the delegate
dispatch_queue_t queue = dispatch_queue_create("aQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
Also, very important: inside
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
be sure to put an NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init]; at the beginning and a [pool drain] at the end, otherwise it will crash after processing too many frames.
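Put together, a sketch of that advice (pre-ARC, matching the retain/release style of the code above):

// Sketch: wrap the per-frame work in an autorelease pool so temporary
// objects created on this background queue are released promptly.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // ... per-frame processing goes here (not on the main thread) ...

    [pool drain];
}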