AVCaptureConnection freezes my app on the second call - iPhone

This is the code that freezes my app:
AVCaptureConnection *videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if ([videoConnection isVideoOrientationSupported] )
{
[videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}
This is my videoOutput init:
// Setup video capture
videoInput = [[AVCaptureDeviceInput deviceInputWithDevice: front? frontVideoDevice: rearVideoDevice error: &error] retain];
videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setAlwaysDiscardsLateVideoFrames: YES];
// Set the video output to store frame in BGRA (It is supposed to be faster)
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[videoOutput setVideoSettings:videoSettings];
When I initialize the session a second time, my app stops responding, but if I remove the AVCaptureConnection code everything is fine. Why?

Solved it! I had called
[session startRunning]
before the
AVCaptureConnection *videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if ([videoConnection isVideoOrientationSupported] )
{
[videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}
My bad :)
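For anyone who hits the same freeze: the ordering that worked here is to configure the connection before starting the session. A minimal sketch, assuming session, videoInput, and videoOutput are set up as in the question:
// Build the capture graph first.
[session addInput:videoInput];
[session addOutput:videoOutput];
// Fetch the connection and set its orientation while the session is stopped.
AVCaptureConnection *videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if ([videoConnection isVideoOrientationSupported]) {
    [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}
// Only then start the session.
[session startRunning];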

Related

Integrate iPhone camera into ViewController's subview UIView

I am using the following code to embed the camera into my application's view. Here is my code:
- (void)viewDidLoad
{
[super viewDidLoad];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.cameraView.bounds;
[self.cameraView.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input)
{
[Utilities alertDisplay:@"Error" message:@"Camera not found. Please use Photo Gallery instead."];
}
[session addInput:input];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
}
-(AVCaptureDevice *)backFacingCameraIfAvailable{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionBack){
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if (!captureDevice){
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
- (IBAction)scanButtonPressed:(id)sender
{
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments)
{
// Do something with the attachments.
NSLog(#"attachements: %#", exifAttachments);
}
else
NSLog(#"no attachments");
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
//self.vImage.image = image;
}];
}
The problem I am facing is that the camera never appears in my cameraView, and in scanButtonPressed I get
stillImageOutput.connections = 0 objects.
What is wrong?
Well, I just copy-pasted your code into a blank project and it worked fine after changing self.cameraView.layer to self.view.layer. However, I did try creating self.cameraView without ever initializing it, and that had consequences similar to those you described.
Overall, I would check to make sure that self.cameraView isn't nil. If it's done programmatically, make sure you're calling alloc/init and setting a frame, and if it's an IBOutlet make sure it's properly linked.
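If cameraView is built in code, a minimal sketch of what that initialization might look like (property and variable names are taken from the question; this is an assumption about the poster's setup, not their actual code):
// Hypothetical programmatic setup: while self.cameraView is nil,
// self.cameraView.layer is also nil and addSublayer: is a silent no-op.
self.cameraView = [[UIView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:self.cameraView];
// Now the preview layer has a real layer to attach to.
captureVideoPreviewLayer.frame = self.cameraView.bounds;
[self.cameraView.layer addSublayer:captureVideoPreviewLayer];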

Cannot change AVCaptureConnection orientation

I want to change the video orientation of the front camera, so I set previewLayer.orientation, and it does work.
However, that property has been deprecated since iOS 6, and I get a warning telling me to use previewLayer.connection.videoOrientation instead,
but I cannot access the connection property from previewLayer.
- (void)addDeviceInput {
[session beginConfiguration];
[session removeInput:videoInput];
[self setVideoInput:nil];
AVCaptureDeviceInput *newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontFacingCamera] error:nil];
if ([session canAddInput:newDeviceInput]) {
[session addInput:newDeviceInput];
[self setVideoInput:newDeviceInput];
}
newDeviceInput = nil;
if (previewLayer) {
[previewLayer removeFromSuperlayer];
[previewLayer release];
previewLayer = nil;
}
[session commitConfiguration];
}
- (id)setupCameraSession {
if (!(self = [super init])) {
return nil;
}
self.frameRate = 0;
session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
[session setSessionPreset:AVCaptureSessionPreset352x288];
[self addDeviceInput]; //invoke previous method
AVCaptureVideoDataOutput * newVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
newVideoDataOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
kCVPixelBufferPixelFormatTypeKey,nil];
newVideoDataOutput.alwaysDiscardsLateVideoFrames = YES;
if ([session canAddOutput:newVideoDataOutput]) {
[session addOutput:newVideoDataOutput];
[self setVideoOutput:newVideoDataOutput];
}
[newVideoDataOutput release];
self.frameRate = VIDEO_FRAME_RATE;
[session commitConfiguration];
AVCaptureVideoPreviewLayer * newPreviewLayer =
[[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
//[newPreviewLayer setSession:session]; // doesn't work; I get an error
[self setPreviewLayer:newPreviewLayer];
[newPreviewLayer release];
[session startRunning];
return self;
}
I get the error
error: Execution was interrupted, reason: Attempted to dereference an invalid ObjC Object or send it an unrecognized selector.
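For reference, since iOS 6 AVCaptureVideoPreviewLayer exposes its connection property directly, so the non-deprecated pattern should look roughly like this (a sketch, assuming previewLayer is already attached to a session with a video input):
AVCaptureConnection *previewConnection = previewLayer.connection;
if ([previewConnection isVideoOrientationSupported]) {
    previewConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}
If connection comes back nil here, the layer's session probably has no video input yet, or the layer itself has already been released, which would also fit the crash above.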

UIImage orientation incorrect from AVCaptureStillImageOutput

I'm trying to capture pixel data from an AVCaptureStillImageOutput and noticed that upon cropping the image to a CGImage, it becomes re-oriented. To test this, I output a temporary image to the photo library. Now I've noticed that even before the image is cropped, its thumbnail is rotated while the full image is not. (This later becomes a problem when I pass the UIImage to my custom pixelDataManager, which requires the proper dimensions of the image.)
Setting captureVideoPreviewLayer.orientation = AVCaptureVideoOrientationPortrait; and [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait]; did not seem to change anything.
Any thoughts?
I'll break it up into sections...
1) Setting up the session, input, and output:
// Create a capture session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
// Add capture layer to visible view
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResize;
captureVideoPreviewLayer.frame = screenBounds;
captureVideoPreviewLayer.bounds = screenBounds;
captureVideoPreviewLayer.orientation = AVCaptureVideoOrientationPortrait;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
// Setup error handling
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
// Start session
[session startRunning];
// Output image data
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
2) Setting up the video connection:
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
if ([videoConnection isVideoOrientationSupported])
{
[videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}
3) Output the captured image:
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
//......
}];
When you do this, though the image is handed to you as a UIImage, it's actually backed by the underlying Quartz CGImage data, which has a different origin point (lower left, I think), so when you use the image it comes out rotated on its side.
I found a C function that you can call with the UIImage as a parameter; it fixes the orientation and returns the corrected UIImage.
http://blog.logichigh.com/2008/06/05/uiimage-fix/
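If you'd rather not pull in an external function, a common alternative is to redraw the UIImage once: drawing respects imageOrientation, so the copy comes back upright with matching pixel dimensions. A minimal sketch (the helper name is mine):
// Returns an "up"-oriented copy whose CGImage matches the visible size.
UIImage *normalizedImage(UIImage *image) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}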

UILabel is not getting updated inside AVCaptureSession delegate

I am learning Objective-C and building a sample app to fetch the video feed from the iPhone camera. I was able to get the feed from the camera and display it on screen. I was also trying to update a UILabel on screen for each frame of the video inside the delegate method, but the label value is not always updated. Here is the code I am using.
This section will initialize the capture
- (void)initCapture
{
NSError *error = nil;
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[device unlockForConfiguration];
}
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
//AVCaptureStillImageOutput *imageCaptureOutput = [[AVCaptureStillImageOutput alloc] init];
AVCaptureVideoDataOutput *captureOutput =[[AVCaptureVideoDataOutput alloc] init];
captureOutput.alwaysDiscardsLateVideoFrames = YES;
//captureOutput.minFrameDuration = CMTimeMake(1, 1);
dispatch_queue_t queue;
queue = dispatch_queue_create("cameraQueue", NULL);
[captureOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
// Set the video output to store frame in BGRA (It is supposed to be faster)
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[captureOutput setVideoSettings:videoSettings];
self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession addInput:captureInput];
[self.captureSession addOutput:captureOutput];
self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: self.captureSession];
self.prevLayer.frame = CGRectMake(0, 0, 320, 320);
self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.videoPreview.layer addSublayer: self.prevLayer];
[self.captureSession startRunning];
}
This method is called for each video frame.
#pragma mark AVCaptureSession delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
i++;
self.lblStatus.text = [NSString stringWithFormat:@"%d", i];
}
I am trying to update the UILabel inside this method, but it is not always updated. There is a long delay before the label text changes.
Could someone help please?
Thanks.
Your sampleBufferDelegate's captureOutput is being called from a non-main thread, and updating GUI objects from there can do no good. Try using performSelectorOnMainThread instead.
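A minimal sketch of both approaches inside the delegate method (assuming the same i counter and lblStatus outlet as in the question):
// Option 1: hop to the main queue with GCD before touching UIKit.
dispatch_async(dispatch_get_main_queue(), ^{
    self.lblStatus.text = [NSString stringWithFormat:@"%d", i];
});
// Option 2: performSelectorOnMainThread, as suggested above.
[self.lblStatus performSelectorOnMainThread:@selector(setText:)
                                 withObject:[NSString stringWithFormat:@"%d", i]
                              waitUntilDone:NO];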

AVAssetWriter does not work with audio

I'm trying to get audio to work with the video in an iOS application. The video is fine, but no audio is recorded to the file (my iPhone speaker works).
Here's the init setup:
session = [[AVCaptureSession alloc] init];
menu->session = session;
menu_open = NO;
session.sessionPreset = AVCaptureSessionPresetMedium;
camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
microphone = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
menu->camera = camera;
[session beginConfiguration];
[camera lockForConfiguration:nil];
if([camera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]){
camera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
}
if([camera isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]){
camera.focusMode = AVCaptureFocusModeContinuousAutoFocus;
}
if([camera isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]){
camera.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
}
if ([camera hasTorch]) {
if([camera isTorchModeSupported:AVCaptureTorchModeOn]){
[camera setTorchMode:AVCaptureTorchModeOn];
}
}
[camera unlockForConfiguration];
[session commitConfiguration];
AVCaptureDeviceInput * camera_input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
[session addInput:camera_input];
microphone_input = [[AVCaptureDeviceInput deviceInputWithDevice:microphone error:nil] retain];
AVCaptureVideoDataOutput * output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
output.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[session addOutput:output];
output.minFrameDuration = CMTimeMake(1,30);
dispatch_queue_t queue = dispatch_queue_create("MY QUEUE", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
audio_output = [[[AVCaptureAudioDataOutput alloc] init] retain];
queue = dispatch_queue_create("MY QUEUE", NULL);
AudioOutputBufferDelegate * special_delegate = [[[AudioOutputBufferDelegate alloc] init] autorelease];
special_delegate->normal_delegate = self;
[special_delegate retain];
[audio_output setSampleBufferDelegate:special_delegate queue:queue];
dispatch_release(queue);
[session startRunning];
Here is the beginning and end of recording:
if (recording) { //Hence stop recording
[video_button setTitle:#"Video" forState: UIControlStateNormal];
recording = NO;
[writer_input markAsFinished];
[audio_writer_input markAsFinished];
[video_writer endSessionAtSourceTime: CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate: start_time],30)];
[video_writer finishWriting];
UISaveVideoAtPathToSavedPhotosAlbum(temp_url,self,@selector(video:didFinishSavingWithError:contextInfo:),nil);
[start_time release];
[temp_url release];
[av_adaptor release];
[microphone lockForConfiguration:nil];
[session beginConfiguration];
[session removeInput:microphone_input];
[session removeOutput:audio_output];
[session commitConfiguration];
[microphone unlockForConfiguration];
[menu restateConfigiration];
[vid_off play];
}else{ //Start recording
[vid_on play];
[microphone lockForConfiguration:nil];
[session beginConfiguration];
[session addInput:microphone_input];
[session addOutput:audio_output];
[session commitConfiguration];
[microphone unlockForConfiguration];
[menu restateConfigiration];
[video_button setTitle:#"Stop" forState: UIControlStateNormal];
recording = YES;
NSError *error = nil;
NSFileManager * file_manager = [[NSFileManager alloc] init];
temp_url = [[NSString alloc] initWithFormat:@"%@/%@", NSTemporaryDirectory(), @"temp.mp4"];
[file_manager removeItemAtPath: temp_url error:NULL];
[file_manager release];
video_writer = [[AVAssetWriter alloc] initWithURL: [NSURL fileURLWithPath:temp_url] fileType: AVFileTypeMPEG4 error: &error];
NSDictionary *video_settings = [NSDictionary dictionaryWithObjectsAndKeys: AVVideoCodecH264, AVVideoCodecKey,[NSNumber numberWithInt:360], AVVideoWidthKey,[NSNumber numberWithInt:480], AVVideoHeightKey,nil];
writer_input = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:video_settings] retain];
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
audio_writer_input = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings: [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt: kAudioFormatMPEG4AAC], AVFormatIDKey,[NSNumber numberWithInt: 1], AVNumberOfChannelsKey,[NSNumber numberWithFloat: 44100.0], AVSampleRateKey,[NSNumber numberWithInt: 64000], AVEncoderBitRateKey,[NSData dataWithBytes: &acl length: sizeof(acl) ], AVChannelLayoutKey,nil]] retain];
audio_writer_input.expectsMediaDataInRealTime = YES;
av_adaptor = [[AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput: writer_input sourcePixelBufferAttributes:NULL] retain];
[video_writer addInput:writer_input];
[video_writer addInput: audio_writer_input];
[video_writer startWriting];
[video_writer startSessionAtSourceTime: CMTimeMake(0,1)];
start_time = [[NSDate alloc] init];
}
Here is the delegate for the audio:
@implementation AudioOutputBufferDelegate
-(void)captureOutput: (AVCaptureOutput *) captureOutput didOutputSampleBuffer: (CMSampleBufferRef) sampleBuffer fromConnection: (AVCaptureConnection *) connection{
if (normal_delegate->recording) {
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer,CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate: normal_delegate->start_time],30));
[normal_delegate->audio_writer_input appendSampleBuffer: sampleBuffer];
}
}
@end
The video method doesn't matter because it works. "restateConfigiration" just sorts out the session configuration; otherwise the torch goes off, etc.:
[session beginConfiguration];
switch (quality) {
case Low:
session.sessionPreset = AVCaptureSessionPresetLow;
break;
case Medium:
session.sessionPreset = AVCaptureSessionPreset640x480;
break;
}
[session commitConfiguration];
[camera lockForConfiguration:nil];
if([camera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]){
camera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
}
if([camera isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]){
camera.focusMode = AVCaptureFocusModeContinuousAutoFocus;
}
if([camera isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]){
camera.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
}
if ([camera hasTorch]) {
if (torch) {
if([camera isTorchModeSupported:AVCaptureTorchModeOn]){
[camera setTorchMode:AVCaptureTorchModeOn];
}
}else{
if([camera isTorchModeSupported:AVCaptureTorchModeOff]){
[camera setTorchMode:AVCaptureTorchModeOff];
}
}
}
[camera unlockForConfiguration];
Thank you for any help.
AVAssetWriter and Audio
This may be the same issue as mentioned in the linked post. Try commenting out these lines
[writer_input markAsFinished];
[audio_writer_input markAsFinished];
[video_writer endSessionAtSourceTime: CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate: start_time],30)];
Edit
I don't know if the way you are setting the presentation time stamp is necessarily wrong. The way I handle this is with a local variable that is set to 0 on start. Then when my delegate receives the first packet I do:
if (_startTime.value == 0) {
_startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
}
and then
[bufferWriter->writer startWriting];
[bufferWriter->writer startSessionAtSourceTime:_startTime];
Your code looks valid, as you are calculating the time difference for each received packet. However, AVFoundation calculates this for you, and also optimizes the timestamps for placement in the interleaved container. Another thing I am unsure of: each CMSampleBufferRef for audio contains more than one data buffer, where each data buffer has its own PTS. I am not sure whether setting the PTS automatically adjusts all the other data buffers.
Where my code differs from yours is I use a single dispatch queue for both audio and video. In the callback I use (some code removed).
switch (bufferWriter->writer.status) {
case AVAssetWriterStatusUnknown:
if (_startTime.value == 0) {
_startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
}
[bufferWriter->writer startWriting];
[bufferWriter->writer startSessionAtSourceTime:_startTime];
//Break if not ready, otherwise fall through.
if (bufferWriter->writer.status != AVAssetWriterStatusWriting) {
break ;
}
case AVAssetWriterStatusWriting:
if( captureOutput == self.captureManager.audioOutput) {
if( !bufferWriter->audioIn.readyForMoreMediaData) {
break;
}
@try {
if( ![bufferWriter->audioIn appendSampleBuffer:sampleBuffer] ) {
[self delegateMessage:@"Audio Writing Error" withType:ERROR];
}
}
@catch (NSException *e) {
NSLog(@"Audio Exception: %@", [e reason]);
}
}
else if( captureOutput == self.captureManager.videoOutput ) {
if( !bufferWriter->videoIn.readyForMoreMediaData) {
break;
}
@try {
if (!frontCamera) {
if( ![bufferWriter->videoIn appendSampleBuffer:sampleBuffer] ) {
[self delegateMessage:@"Video Writing Error" withType:ERROR];
}
}
else {
CMTime pt = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
flipBuffer(sampleBuffer, pixelBuffer);
if( ![bufferWriter->adaptor appendPixelBuffer:pixelBuffer withPresentationTime:pt] ) {
[self delegateMessage:@"Video Writing Error" withType:ERROR];
}
}
}
@catch (NSException *e) {
NSLog(@"Video Exception: %@", [e reason]);
}
}
break;
case AVAssetWriterStatusCompleted:
return;
case AVAssetWriterStatusFailed:
[self delegateMessage:#"Critical Error Writing Queues" withType:ERROR];
bufferWriter->writer_failed = YES ;
_broadcastError = YES;
[self stopCapture] ;
return;
case AVAssetWriterStatusCancelled:
break;
default:
break;
}