Strange flash behavior on AVCaptureDeviceInput - iPhone

I use this code to get the flash working on the video that I record:
NSError *error = nil;
// Create the session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Configure the session to produce lower resolution video frames, if your
// processing algorithm can cope. We'll specify medium quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPresetMedium;
// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice
defaultDeviceWithMediaType:AVMediaTypeVideo];
// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
[session addInput:input];
// Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
// Specify the pixel format
output.videoSettings =
[NSDictionary dictionaryWithObject:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setAlwaysDiscardsLateVideoFrames:YES];
// If you wish to cap the frame rate to a known value, such as 15 fps, set minFrameDuration.
// turn flash on
[session beginConfiguration];
BOOL lockAcquired = [device lockForConfiguration:&error];
if (!lockAcquired) {
// log error and handle...
} else {
if ([device hasTorch] && [device hasFlash]) {
[device setTorchMode:AVCaptureTorchModeOn];
[device setFlashMode:AVCaptureFlashModeOn];
}
[device unlockForConfiguration];
}
[session commitConfiguration];
// Start the session running to start the flow of data
[session startRunning];
But there is a strange behavior when recording starts: the flash turns on, then turns off for less than a second, then stays on permanently. Does anyone know why, and how I can solve it?

Try making AVCaptureSession *session and AVCaptureDeviceInput *input member variables of your class. They need to survive beyond the scope of the function that creates them.
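For example, a minimal sketch (the class and property names here are illustrative, not from the question):

@interface CameraViewController : UIViewController
// Keep the session and input alive for the lifetime of the controller,
// not just for the duration of the setup method.
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDeviceInput *input;
@end

// ...then, inside the setup method:
self.session = [[AVCaptureSession alloc] init];
self.input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (self.input) {
    [self.session addInput:self.input];
}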

Related

iOS7 - Outputting frames with AVCaptureMetadataOutput?

I am trying to use the iOS 7 QR-reading features of the AVFoundation framework with the following code:
-(void)setupCaptureSession_iOS7 {
self.session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (!input)
{
NSLog(#"Error: %#", error);
return;
}
[self.session addInput:input];
//Turn on point autofocus for middle of view
[device lockForConfiguration:&error];
CGPoint point = CGPointMake(0.5,0.5);
[device setFocusPointOfInterest:point];
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[device unlockForConfiguration];
//Add the metadata output device
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[self.session addOutput:output];
NSLog(#"%lu",(unsigned long)output.availableMetadataObjectTypes.count);
for (NSString *s in output.availableMetadataObjectTypes)
NSLog(#"%#",s);
//You should check here to see if the session supports these types, if they aren't support you'll get an exception
output.metadataObjectTypes = #[AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeUPCECode];
output.rectOfInterest = CGRectMake(0, 0, 320, 480);
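// NB: rectOfInterest is expressed in normalized coordinates from 0.0 to 1.0 (default {0, 0, 1, 1}), not in pixels, so the 320x480 rect above will not do what it appears to.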
[self.session startRunning];
}
This code obviously doesn't render the frames to the screen (yet). That's because, instead of using the AVCaptureVideoPreviewLayer class to display the preview, I need to display the frames as a UIImage (I want to display the frames multiple times on the view).
If I use AVCaptureVideoDataOutput as the output, I'm able to export the frames by grabbing them from the captureOutput:didOutputSampleBuffer:fromConnection: callback. But I can't find an equivalent way to get the frame buffer when using AVCaptureMetadataOutput as the output.
Does anyone have any idea how to do this?
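One approach, sketched below rather than taken from an accepted answer: AVCaptureMetadataOutput only ever delivers AVMetadataObject instances, not pixel data, so you can attach a separate AVCaptureVideoDataOutput to the same session and grab the frames from its delegate while the metadata output keeps reporting the codes.

// In setupCaptureSession_iOS7, after adding the metadata output:
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
videoOutput.alwaysDiscardsLateVideoFrames = YES;
[videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("videoQueue", NULL)];
if ([self.session canAddOutput:videoOutput])
    [self.session addOutput:videoOutput];

// Elsewhere in the class; frames arrive here and can be turned into a UIImage:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address, wrap it in a CGImage/UIImage, then hand it to the main thread.
}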

How to get FNumber and ISOSpeedRatings from the iPhone camera in real time

I need to get camera parameters (EXIF data) such as FNumber and ISOSpeedRatings in real time, without taking actual photos. Is there any way to do that?
Here is the complete solution. Don't forget to import the appropriate frameworks and headers.
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>
AVCaptureStillImageOutput *stillImageOutput;
AVCaptureSession *session;
- (void)viewDidLoad
{
[super viewDidLoad];
[self setupCaptureSession];
// Do any additional setup after loading the view, typically from a nib.
}
-(void)captureNow{
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
CFDictionaryRef exifAttachments = CMGetAttachment( imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments)
{
// Do something with the attachments.
NSLog(#"attachements: %#", exifAttachments);
}
else
NSLog(#"no attachments");
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
}];
}
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
NSError *error = nil;
// Create the session
session = [[AVCaptureSession alloc] init];
// Configure the session to produce lower resolution video frames, if your
// processing algorithm can cope. We'll specify medium quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPreset352x288;
// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice
defaultDeviceWithMediaType:AVMediaTypeVideo];
[device lockForConfiguration:nil];
device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
device.focusMode = AVCaptureFocusModeLocked;
[device unlockForConfiguration];
// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (!input) {
// Handling the error appropriately.
}
[session addInput:input];
stillImageOutput = [AVCaptureStillImageOutput new];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
if ([session canAddOutput:stillImageOutput])
[session addOutput:stillImageOutput];
// Start the session running to start the flow of data
[session startRunning];
[self captureNow];
}
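If you only need specific values, the keys in that attachment dictionary are ordinary ImageIO constants. A minimal sketch of pulling them out inside the completionHandler (the local variable names are mine):

NSDictionary *exif = (__bridge NSDictionary *)exifAttachments; // plain (NSDictionary *) cast if not compiling under ARC
NSNumber *fNumber = exif[(NSString *)kCGImagePropertyExifFNumber];
// ISOSpeedRatings is an array of numbers, usually with a single entry.
NSArray *isoRatings = exif[(NSString *)kCGImagePropertyExifISOSpeedRatings];
NSLog(@"FNumber: %@ ISO: %@", fNumber, [isoRatings firstObject]);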

How to use the torch light in an iPhone app?

I need to use the iPhone flash light in my app, but the camera should not take a picture when the user switches on the flash. How can I do this? I have attached my code below; the problem is that when I switch on the flash light, the camera takes a picture.
AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error: nil];
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
[device lockForConfiguration:nil];
[device setTorchMode:AVCaptureTorchModeOn];
[device setFlashMode:AVCaptureFlashModeOn];
[session addInput:flashInput];
[session addOutput:output];
[device unlockForConfiguration];
[output release];
[session commitConfiguration];
[session startRunning];
[self setTorchSession:session];
Where am I going wrong in the code? Please help me. Thanks in advance.
I have a torch button in my app which uses the following 3 methods.
- (void)initialiseTorch {
if (!session) {
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
session = [[AVCaptureSession alloc] init];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error: nil];
[session addInput:input];
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
[session startRunning];
[output release];
}
}
- (void)releaseTorch {
if (session) {
[session stopRunning];
[session release];
session = nil;
}
}
- (void) lightButtonPressed {
if (!session) {
[self initialiseTorch];
}
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[session beginConfiguration];
[device lockForConfiguration:nil];
if ([device torchMode] == AVCaptureTorchModeOn) {
[device setTorchMode:AVCaptureTorchModeOff];
} else {
[device setTorchMode:AVCaptureTorchModeOn];
}
[device unlockForConfiguration];
[session commitConfiguration];
}
The only difference I can see between our code samples is that you are also setting the flash mode. Also, I configure my session first, and then turn the torch on/off in a separate beginConfiguration/commitConfiguration pass.
check this...
- (void)torchOnOff: (BOOL) onOff
{
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device hasTorch]) {
[device lockForConfiguration:nil];
[device setTorchMode: onOff ? AVCaptureTorchModeOn : AVCaptureTorchModeOff];
[device unlockForConfiguration];
}
}

Error while recording video on iphone using AVFoundation

I'm trying to record video using AVFoundation. I can save images but not video. When trying to save a video, I get an error saying:
[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] - no active/enabled connections.'
And here is my code:
session = [[AVCaptureSession alloc] init];
//session is global object.
session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.imgV.bounds;
[self.imgV.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device =[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
NSLog(#"start 3");
AVCaptureDeviceInput *input =
[AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
NSLog(#"Error");
}
[session addInput:input];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
aMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:aMovieFileOutput];
[session startRunning];
[self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
//[aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
//Previously I called it directly (the commented line above), but I saw people doing it after a delay, thinking the session might need some time to initialize, so I tried it this way as well.
}
- (void) startRecording
{
NSLog(#"startRecording");
NSString *plistPath;
NSString *rootPath;
rootPath= [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
plistPath = [rootPath stringByAppendingPathComponent:@"test.mov"];
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:plistPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:plistPath]) {
NSLog(#"file exist %s n url %# ",[rootPath UTF8String],fileURL);
}
[aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
}
Also, I am testing this on an iPhone 3G with iOS 4.1.
Also worth noting: this can happen if you've set the session preset to the photo preset,
[session setSessionPreset:AVCaptureSessionPresetPhoto];
which does not support movie file output. Use a video-capable preset instead, for example:
[session setSessionPreset:AVCaptureSessionPresetHigh];
You should be adding your outputs between a [AVCaptureSession beginConfiguration] and [AVCaptureSession commitConfiguration] pair.
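In code, that pattern looks something like this (using the variable names from the question):

[session beginConfiguration];
if ([session canAddOutput:aMovieFileOutput]) {
    [session addOutput:aMovieFileOutput];
}
[session commitConfiguration];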
This error can also occur when you pass an object as the delegate to startRecordingToOutputFileURL:recordingDelegate: but that object does not implement the captureOutput:didStartRecordingToOutputFileAtURL:fromConnections: method of AVCaptureFileOutputRecordingDelegate.
Implement the AVCaptureFileOutputRecordingDelegate protocol. Also make sure the path of the video file is correct and that no file already exists there, as the movie file output does not overwrite existing resources.
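A sketch putting those suggestions together, reusing the plistPath/fileURL variables from the question's startRecording method:

// The controller must declare <AVCaptureFileOutputRecordingDelegate>.
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:plistPath]) {
    // The movie file output will not overwrite an existing file, so remove it first.
    [fileManager removeItemAtPath:plistPath error:nil];
}
[aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];

// Required delegate method; called when recording finishes (or fails):
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    NSLog(@"Finished recording to %@ (error: %@)", outputFileURL, error);
}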

UILabel is not getting updated inside AVCaptureSession delegate

I am learning Objective-C and writing a sample app to fetch the video feed from the iPhone camera. I was able to get the feed from the camera and display it on screen. I was also trying to update a UILabel on screen for each frame of the video, inside the delegate method, but the label value does not always get updated. Here is the code I am using.
This section will initialize the capture
- (void)initCapture
{
NSError *error = nil;
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[device unlockForConfiguration];
}
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
//AVCaptureStillImageOutput *imageCaptureOutput = [[AVCaptureStillImageOutput alloc] init];
AVCaptureVideoDataOutput *captureOutput =[[AVCaptureVideoDataOutput alloc] init];
captureOutput.alwaysDiscardsLateVideoFrames = YES;
//captureOutput.minFrameDuration = CMTimeMake(1, 1);
dispatch_queue_t queue;
queue = dispatch_queue_create("cameraQueue", NULL);
[captureOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
// Set the video output to store frame in BGRA (It is supposed to be faster)
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[captureOutput setVideoSettings:videoSettings];
self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession addInput:captureInput];
[self.captureSession addOutput:captureOutput];
self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: self.captureSession];
self.prevLayer.frame = CGRectMake(0, 0, 320, 320);
self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.videoPreview.layer addSublayer: self.prevLayer];
[self.captureSession startRunning];
}
This method is called for each video frame.
#pragma mark AVCaptureSession delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
i++;
self.lblStatus.text = [NSString stringWithFormat:@"%d", i];
}
I am trying to update the UILabel inside this method, but it does not always update; there is a noticeable delay before the label text changes.
Could someone help please?
Thanks.
Your sampleBufferDelegate's captureOutput is being called from a non-main thread - updating GUI objects from there can do no good. Try using performSelectorOnMainThread instead.
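For example, a sketch of the delegate method from the question with the UI update moved to the main queue (dispatch_async works just as well as performSelectorOnMainThread here; i is assumed to be an int ivar as in the question):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    i++;
    int frameCount = i; // capture the current value for the block
    // UIKit objects must only be touched on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.lblStatus.text = [NSString stringWithFormat:@"%d", frameCount];
    });
}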