I'm trying to record video using AVFoundation.
I can save still images, but not video. When I try to save a video, I get an error saying:
[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] - no active/enabled connections.
And here is my code:
session = [[AVCaptureSession alloc] init];
//session is global object.
session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.imgV.bounds;
[self.imgV.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device =[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
NSLog(#"start 3");
AVCaptureDeviceInput *input =
[AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
NSLog(#"Error");
}
[session addInput:input];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
aMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:aMovieFileOutput];
[session startRunning];
[self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
//[aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
//Previously I started recording here directly, but I saw people doing it after a delay (thinking the session might need some time to initialize), so I tried it this way as well.
}
- (void) startRecording
{
NSLog(#"startRecording");
NSString *plistPath;
NSString *rootPath;
rootPath= [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
plistPath = [rootPath stringByAppendingPathComponent:@"test.mov"];
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:plistPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:plistPath]) {
NSLog(#"file exist %s n url %# ",[rootPath UTF8String],fileURL);
}
[aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
}
I am testing this on an iPhone 3G with iOS 4.1.
Also worth noting this can happen if you've set the session preset to photo:
[session setSessionPreset:AVCaptureSessionPresetPhoto];
which does not support movie file output. Use a video-capable preset instead, for example:
[session setSessionPreset:AVCaptureSessionPresetHigh];
You should be adding your outputs between a [AVCaptureSession beginConfiguration] and [AVCaptureSession commitConfiguration] pair.
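For example (a minimal sketch, reusing the session and outputs from the code above):
[session beginConfiguration];
if ([session canAddOutput:stillImageOutput])
    [session addOutput:stillImageOutput];
if ([session canAddOutput:aMovieFileOutput])
    [session addOutput:aMovieFileOutput];
[session commitConfiguration];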
This error can also occur when the object you pass as the delegate to startRecordingToOutputFileURL:recordingDelegate: does not implement the
captureOutput:didStartRecordingToOutputFileAtURL:fromConnections: method of AVCaptureFileOutputRecordingDelegate.
Implement the AVCaptureFileOutputRecordingDelegate protocol. Also make sure the path of the video file is correct and that the file does not already exist, as the movie file output does not overwrite existing files.
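A minimal delegate sketch (assuming your class declares <AVCaptureFileOutputRecordingDelegate>):
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"Started recording to %@", fileURL);
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error) NSLog(@"Recording failed: %@", error);
}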
Related
I am currently trying to make an app that streams raw data from the mic over Multipeer Connectivity.
I have been using this tutorial as a base: https://robots.thoughtbot.com/streaming-audio-to-multiple-listeners-via-ios-multipeer-connectivity
However, I am now struggling with changing the URL from the iTunes library to my local file.
I am not an advanced programmer and this is a summer project.
When the program gets music from the iTunes library it uses this code:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
[self dismissViewControllerAnimated:YES completion:nil];
if (self.outputStreamer) return;
self.song = mediaItemCollection.items[0];
NSMutableDictionary *info = [NSMutableDictionary dictionary];
info[#"title"] = [self.song valueForProperty:MPMediaItemPropertyTitle] ? [self.song valueForProperty:MPMediaItemPropertyTitle] : #"";
info[#"artist"] = [self.song valueForProperty:MPMediaItemPropertyArtist] ? [self.song valueForProperty:MPMediaItemPropertyArtist] : #"";
MPMediaItemArtwork *artwork = [self.song valueForProperty:MPMediaItemPropertyArtwork];
UIImage *image = [artwork imageWithSize:self.albumImage.frame.size];
if (image)
info[#"artwork"] = image;
if (info[#"artwork"])
self.albumImage.image = info[#"artwork"];
else
self.albumImage.image = nil;
self.songTitle.text = info[@"title"];
self.songArtist.text = info[@"artist"];
[self.session sendData:[NSKeyedArchiver archivedDataWithRootObject:[info copy]]];
NSArray *peers = [self.session connectedPeers];
if (peers.count) {
self.outputStreamer = [[TDAudioOutputStreamer alloc] initWithOutputStream:[self.session outputStreamForPeer:peers[0]]];
[self.outputStreamer streamAudioFromURL:[self.song valueForProperty:MPMediaItemPropertyAssetURL]];
[self.outputStreamer start];
}
}
But I want it to get music from the recorder:
NSArray *pathComponents = [NSArray arrayWithObjects:
[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
#"MyAudioMemo.m4a",
nil];
NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:nil];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:recorder.url error:nil];
I have been struggling with this for a while now and would appreciate any kind of help!
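A hedged sketch, assuming TDAudioOutputStreamer can read a plain local file URL the same way it reads an iTunes asset URL (that is an assumption, not something the tutorial confirms): hand it the recorder's file URL instead of the MPMediaItem URL, reusing the names from the code above.
NSArray *peers = [self.session connectedPeers];
if (peers.count) {
    self.outputStreamer = [[TDAudioOutputStreamer alloc] initWithOutputStream:[self.session outputStreamForPeer:peers[0]]];
    //recorder.url points at MyAudioMemo.m4a in the Documents directory
    [self.outputStreamer streamAudioFromURL:recorder.url];
    [self.outputStreamer start];
}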
I am trying to use the iOS 7 QR reading functions in the AVFoundation framework, using the following code:
-(void)setupCaptureSession_iOS7 {
self.session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (!input)
{
NSLog(#"Error: %#", error);
return;
}
[session addInput:input];
//Turn on point autofocus for middle of view
[device lockForConfiguration:&error];
CGPoint point = CGPointMake(0.5,0.5);
[device setFocusPointOfInterest:point];
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[device unlockForConfiguration];
//Add the metadata output device
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
NSLog(#"%lu",(unsigned long)output.availableMetadataObjectTypes.count);
for (NSString *s in output.availableMetadataObjectTypes)
NSLog(#"%#",s);
//You should check here to see if the session supports these types, if they aren't support you'll get an exception
output.metadataObjectTypes = #[AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeUPCECode];
output.rectOfInterest = CGRectMake(0, 0, 320, 480);
[session startRunning];
// Assign session to an ivar.
[self setSession:self.session];
}
This code obviously doesn't render the frames to the screen (yet). This is because, instead of using the AVCaptureVideoPreviewLayer class to display the preview, I need to display the frames as a UIImage (I want to display the frames multiple times in the view).
If I use AVCaptureVideoDataOutput as the output, I'm able to export the frames by grabbing them from the captureOutput:didOutputSampleBuffer:fromConnection: callback. But I can't find an equivalent way to get the frame buffer when using AVCaptureMetadataOutput as the output.
Does anyone have any idea how to do this?
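One approach (a hedged sketch, not tested against the code above): add an AVCaptureVideoDataOutput to the same session alongside the AVCaptureMetadataOutput, and build a UIImage from each sample buffer in the data-output callback.
//In setup, next to the metadata output (assumes self.session from above):
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
if ([self.session canAddOutput:videoOutput])
    [self.session addOutput:videoOutput];
//Delegate callback (AVCaptureVideoDataOutputSampleBufferDelegate): convert each pixel buffer to a UIImage via Core Image.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    UIImage *frame = [UIImage imageWithCIImage:ciImage];
    //Hand `frame` to however many image views need it (on the main thread).
}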
I am developing an app that connects to an IP camera and gets its stream. I don't know how to convert the stream into video. After some analysis I tried the FFmpeg library, but I could not get the correct URL working on iOS with FFmpeg. Can you suggest a guide for the implementation?
Now I am integrating FFmpeg, and I get an error when calling avformat_find_stream_info(). I tried increasing and decreasing the value of pFormatCtx->max_analyze_duration, but that did not solve it.
[mjpeg @ 0x8b85000] max_analyze_duration 1000 reached at 40000
[mjpeg @ 0x8b85000] Estimating duration from bitrate, this may be inaccurate
Please help me solve this.
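For what it's worth, a hedged sketch of raising the probing limits through the options dictionary before avformat_find_stream_info() runs (the stream URL and the values are placeholders, not from the original post):
#include <libavformat/avformat.h>
av_register_all(); //needed on the older FFmpeg versions of that era
AVFormatContext *pFormatCtx = avformat_alloc_context();
AVDictionary *opts = NULL;
av_dict_set(&opts, "analyzeduration", "5000000", 0); //microseconds; value is a guess
av_dict_set(&opts, "probesize", "5000000", 0);       //bytes; value is a guess
if (avformat_open_input(&pFormatCtx, "http://192.168.1.10/video.mjpg", NULL, &opts) != 0) {
    NSLog(@"Could not open stream");
}
av_dict_free(&opts);
if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
    NSLog(@"Could not find stream info");
}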
Look at the "AVCam" example from Apple.
It does exactly that, and with lots of inline comments.
https://developer.apple.com/library/ios/#samplecode/AVCam/Introduction/Intro.html
Here is a small code snippet that will point you in the right direction:
Create an AVCaptureSession first...
// Init the device inputs
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
AVCaptureDeviceInput *newAudioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];
// Setup the still image file output
AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
AVVideoCodecJPEG, AVVideoCodecKey,
nil];
[newStillImageOutput setOutputSettings:outputSettings];
[outputSettings release];
// Create session (use default AVCaptureSessionPresetHigh)
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
Create the video file output
AVCaptureMovieFileOutput *aMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([aSession canAddOutput:aMovieFileOutput])
[aSession addOutput:aMovieFileOutput];
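Then, once the session is running, start writing to a file (a short sketch; the output URL and the recording delegate are assumptions):
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSURL *outputURL = [NSURL fileURLWithPath:[documentsDirectory stringByAppendingPathComponent:@"capture.mov"]];
[aMovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];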
Save the file:
-(void)recorder:(AVCamRecorder *)recorder recordingDidFinishToOutputFileURL:(NSURL *)outputFileURL error:(NSError *)error
{
if ([[self recorder] recordsAudio] && ![[self recorder] recordsVideo]) {
// If the file was created on a device that doesn't support video recording, it can't be saved to the assets
// library. Instead, save it in the app's Documents directory, whence it can be copied from the device via
// iTunes file sharing.
[self copyFileToDocuments:outputFileURL];
if ([[UIDevice currentDevice] isMultitaskingSupported]) {
[[UIApplication sharedApplication] endBackgroundTask:[self backgroundRecordingID]];
}
if ([[self delegate] respondsToSelector:@selector(captureManagerRecordingFinished:)]) {
[[self delegate] captureManagerRecordingFinished:self];
}
}
else {
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error) {
if (error) {
if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
[[self delegate] captureManager:self didFailWithError:error];
}
}
if ([[UIDevice currentDevice] isMultitaskingSupported]) {
[[UIApplication sharedApplication] endBackgroundTask:[self backgroundRecordingID]];
}
if ([[self delegate] respondsToSelector:@selector(captureManagerRecordingFinished:)]) {
[[self delegate] captureManagerRecordingFinished:self];
}
}];
[library release];
}
}
If I just have the video added as an input, the preview layer works fine and the video records fine, but if I try to add audio, the preview layer freezes and the video file is corrupted. What could be causing this?
-(void) record {
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetMedium;
CALayer *viewLayer = self.vImagePreview.layer;
NSLog(#"viewLayer = %#", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:nil];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [paths objectAtIndex:0];
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
NSString *archives = [documentsDirectoryPath stringByAppendingPathComponent:@"archives"];
NSString *outputpathofmovie = [[archives stringByAppendingPathComponent:@"Test"] stringByAppendingString:@".mp4"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];
[session addInput:input];
[session addInput:audioInput];
[session addOutput:movieFileOutput];
[session commitConfiguration];
[session startRunning];
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
}
-(AVCaptureDevice *)frontFacingCameraIfAvailable
{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices)
{
if (device.position == AVCaptureDevicePositionFront)
{
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if ( ! captureDevice)
{
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
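One thing worth checking (just a sketch; that a silently failing add or a missing output directory is the cause is only an assumption): guard each addInput:/addOutput: with canAddInput:/canAddOutput:, and create the archives directory before recording into it.
if ([session canAddInput:audioInput])
    [session addInput:audioInput];
else
    NSLog(@"Could not add audio input");
[[NSFileManager defaultManager] createDirectoryAtPath:archives withIntermediateDirectories:YES attributes:nil error:nil];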
I am using the AVFoundation framework in my iPad app to record videos. When recording of a video completes, I create a thumbnail image for that video, also using AVFoundation. But if I record two videos without much delay in between, the thumbnail is not created properly; clearly the generation takes some time. How can I avoid that, or at least know when the thumbnail of the previous video has been fully created, so that I can show a 'Wait' indicator to the user for that period?
Please help, as I have no clue how to resolve this.
//delegate method which gets called when recording ends.
//Here I create the thumb and store it in self.thumb
-(void)recorder:(AVCamRecorder *)recorder recordingDidFinishToOutputFileURL:(NSURL *)outputFileURL error:(NSError *)error
{
self.thumb=nil;
AVURLAsset *asset=[[AVURLAsset alloc] initWithURL:outputFileURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform=TRUE;
[asset release];
CMTime thumbTime = CMTimeMakeWithSeconds(0,1);
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error){
if (result != AVAssetImageGeneratorSucceeded) {
NSLog(#"couldn't generate thumbnail, error:%#", error);
}
self.thumb=[UIImage imageWithCGImage:im];
[generator release];
};
CGSize maxSize = CGSizeMake(320, 180);
generator.maximumSize = maxSize;
[generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:handler];
if ([[UIDevice currentDevice] isMultitaskingSupported]) {
[[UIApplication sharedApplication] endBackgroundTask:[self backgroundRecordingID]];
}
if ([[self delegate] respondsToSelector:@selector(captureManagerRecordingFinished:)]) {
[[self delegate] captureManagerRecordingFinished:self];
}
[self copyFileToDocuments:outputFileURL];
}
//Save video and thumbnail to documents
- (void) copyFileToDocuments:(NSURL *)fileURL
{
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:#"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:#"/output_%#.mov", [dateFormatter stringFromDate:[NSDate date]]];
[dateFormatter release];
NSError *error;
if (![[NSFileManager defaultManager] copyItemAtURL:fileURL toURL:[NSURL fileURLWithPath:destinationPath] error:&error]) {
if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
[[self delegate] captureManager:self didFailWithError:error];
}
}
else
{
destinationPath=[[destinationPath lastPathComponent]stringByDeletingPathExtension];
[self saveImage:self.thumb withName:destinationPath];
}
}
//Method Where I save self.thumb in app documents
-(void) saveImage:(UIImage*)image withName:(NSString*)fileName
{
NSFileManager *fileManager = [NSFileManager defaultManager];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
//Save Image
NSData *imageData = UIImagePNGRepresentation(image); //convert image into .png format.
NSString *fullPath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.png", fileName]];
[fileManager createFileAtPath:fullPath contents:imageData attributes:nil];
}
Above is my code.
But self.thumb is not saved properly if I don't add a delay between the two recordings.
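One way to know when the thumbnail is actually ready is to move the file copy and the delegate callback into the generator's completion handler (a sketch built on the code above; dispatching those calls back to the main queue is an assumption about where they are safe to run):
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error){
    if (result == AVAssetImageGeneratorSucceeded) {
        self.thumb = [UIImage imageWithCGImage:im];
    } else {
        NSLog(@"couldn't generate thumbnail, error:%@", error);
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        //The thumbnail now exists (or generation failed), so copy the file,
        //notify the delegate, and hide the 'Wait' indicator here.
        [self copyFileToDocuments:outputFileURL];
        if ([[self delegate] respondsToSelector:@selector(captureManagerRecordingFinished:)]) {
            [[self delegate] captureManagerRecordingFinished:self];
        }
    });
    [generator release];
};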
I used a different method. When the user taps 'Record', I first take a photo with the camera and then start recording the video. Later I use that photo as the thumbnail.
I know this is not the best approach, but it works ;)
Hope this helps someone with the same problem.