Front Camera Video Recording iPhone 4?

I am trying to record video from the front camera of an iPhone 4 using the AVFoundation framework, with the help of the WWDC samples I got from the iPhone Developer Program. But I still can't get it to work: the video never gets recorded, or at least never shows up in my iPhone library. Here is the code I am trying to use; it would be really helpful if someone could help me with the problem I am having.
-(void)recordVideo
{
AVCaptureDeviceInput *videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:videoInput];
[session addOutput:movieFileOutput];
[movieFileOutput release];
if (![session isRunning])
{
[self performSelector:@selector(startRecording) withObject:nil afterDelay:1.0];
[session startRunning];
}
}
- (void) startRecording
{
NSLog(#"start recording");
AVCaptureConnection *videoConnection = [playVideo connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
if ([videoConnection isVideoOrientationSupported]) {
[videoConnection setVideoOrientation:[self orientation]];
}
[[self movieFileOutput] startRecordingToOutputFileURL:[self tempFileURL]
recordingDelegate:self];
}
- (void) stopRecording
{
NSLog(#"stop recording");
[[self movieFileOutput] stopRecording];
}
- (NSURL *) tempFileURL
{
NSLog(#"temp file url");
NSString *outputPath = [[NSString alloc] initWithFormat:#"%#%#", NSTemporaryDirectory(), #"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath]) {
NSLog(#"exists");
}
[outputPath release];
return [outputURL autorelease];
}
- (void) setConnectionWithMediaType:(NSString *)mediaType enabled:(BOOL)enabled;
{
[[playVideo connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]] setEnabled:enabled];
}
+ (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections;
{
NSLog(#"connection with media type");
for ( AVCaptureConnection *connection in connections ) {
for ( AVCaptureInputPort *port in [connection inputPorts] ) {
if ( [[port mediaType] isEqual:mediaType] ) {
return [[connection retain] autorelease];
}
}
}
return nil;
}
@implementation recordVideo (Internal)
- (AVCaptureDevice *) cameraWithPosition:(AVCaptureDevicePosition) position
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
if ([device position] == position) {
return device;
}
}
return nil;
}
- (AVCaptureDevice *) backFacingCamera
{
NSLog(#"back");
return [self cameraWithPosition:AVCaptureDevicePositionBack];
}
- (AVCaptureDevice *) frontFacingCamera
{
NSLog(#"front ");
return [self cameraWithPosition:AVCaptureDevicePositionFront];
}
#end
#implementation recordVideo (AVCaptureFileOutputRecordingDelegate)
- (void) captureOutput:(AVCaptureFileOutput *)captureOutput
didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
fromConnections:(NSArray *)connections
{
NSLog(#"did start recording");
}
- (void) captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
NSLog(#"did finish recording output file");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error){
if (error && [delegate respondsToSelector:@selector(assetLibraryError:forURL:)]) {
[delegate assetLibraryError:error forURL:assetURL];
}
}];
}
else {
}
[library release];
}
@end

You are throwing away the errors - how can you debug if you do that?
You need to be more specific about what is and is not happening. Which methods are called? What errors (if any) are reported by the SDK? etc...
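For illustration, a minimal sketch of capturing and logging the error instead of passing nil (the variable name is an assumption; frontFacingCamera is the helper from the code above):

NSError *error = nil;
AVCaptureDeviceInput *videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontFacingCamera] error:&error];
if (!videoInput) {
    // Log the actual failure instead of silently continuing.
    NSLog(@"Could not create video input: %@", error);
    return;
}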

You have released movieFileOutput in -(void)recordVideo, yet you are still using the released object in -(void)startRecording and in other methods. You should not release an object you are still using.
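A minimal sketch of how -recordVideo might look with retained properties under manual reference counting (the property names are assumptions, not from the original code):

// Assumed declarations, e.g. in the class extension:
// @property (nonatomic, retain) AVCaptureSession *session;
// @property (nonatomic, retain) AVCaptureMovieFileOutput *movieFileOutput;
- (void)recordVideo
{
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:[self frontFacingCamera] error:&error] autorelease];
    if (!videoInput) {
        NSLog(@"Could not create video input: %@", error);
        return;
    }
    // Keep the output and session alive for the whole recording instead of releasing them here.
    self.movieFileOutput = [[[AVCaptureMovieFileOutput alloc] init] autorelease];
    self.session = [[[AVCaptureSession alloc] init] autorelease];
    [self.session addInput:videoInput];
    [self.session addOutput:self.movieFileOutput];
    if (![self.session isRunning]) {
        [self.session startRunning];
        [self performSelector:@selector(startRecording) withObject:nil afterDelay:1.0];
    }
}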

Related

Capture a still image on iOS7 through a console app

I am attempting to capture a still image on my jailbroken iPhone from a console app.
Disclaimer: I am an Objective-C newbie; most of this code is from: AVCaptureStillImageOutput never calls completition handler
I just want to know whether it can work from a terminal.
I have tried various examples on SO and none seem to work. My question is: is it even possible, or does Apple lock this down for some reason (accessing the camera from the terminal might not be allowed, etc.)?
Here is the my current camera.m:
#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>
#import <stdio.h>
#import <UIKit/UIKit.h>
@interface Camera : NSObject {
}
@property (readwrite, retain) AVCaptureStillImageOutput *stillImageOutput;
- (AVCaptureDevice *)frontFacingCameraIfAvailable;
- (void)setupCaptureSession;
- (void)captureNow;
@end
@implementation Camera
@synthesize stillImageOutput;
-(AVCaptureDevice *) frontFacingCameraIfAvailable{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionFront){
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if (!captureDevice){
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
-(void) setupCaptureSession {
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
//[self.view.layer addSublayer:captureVideoPreviewLayer];
NSError *error = nil;
AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[self.stillImageOutput setOutputSettings:outputSettings];
[session addOutput:self.stillImageOutput];
[session startRunning];
}
-(void) captureNow {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
NSLog(#"about to request a capture from: %#", self.stillImageOutput);
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSLog(#"image captured: %#", [imageData bytes]);
}];
}
@end
int main(int argc, const char *argv[]) {
Camera *camera = [[Camera alloc] init];
[camera setupCaptureSession];
[camera captureNow];
}
Here is my Makefile:
XCODE_BASE=/Applications/Xcode.app/Contents
IPHONEOS=$(XCODE_BASE)/Developer/Platforms/iPhoneOS.platform
SDK=$(IPHONEOS)/Developer/SDKs/iPhoneOS7.1.sdk
FRAMEWORKS=$(SDK)/System/Library/Frameworks/
INCLUDES=$(SDK)/usr/include
camera:
clang -mios-version-min=7.0 \
-isysroot $(SDK) \
-arch armv7 \
camera.m \
-lobjc \
-framework Foundation -framework AVFoundation -framework CoreVideo -framework CoreMedia -framework CoreGraphics -framework CoreImage -framework UIKit -framework ImageIO \
-o camera
Copy camera binary to iPhone:
scp camera root@10.1.1.32:/var/root/camera
root@10.1.1.32's password:
camera 100% 51KB 51.0KB/s 00:00
iPhone result of running the app:
iPhone:~ root# ./camera
2014-03-21 15:17:55.550 camera[9483:507] about to request a capture from: <AVCaptureStillImageOutput: 0x14e2b940>
As you can see, the AVCaptureSession gets set up correctly, the front camera is found and selected, and the AVCaptureConnection is set up correctly.
However, the completion handler of captureStillImageAsynchronouslyFromConnection:completionHandler: never gets called.
Is this even possible to do? Am I doing something wrong?
EDIT:
I have modified captureNow to sleep on the thread until completionHandler completes and the app just waits forever after starting.
-(void) captureNow {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
__block BOOL done = NO;
NSLog(#"about to request a capture from: %#", self.stillImageOutput);
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSLog(#"image captured");
done = YES;
}];
while (!done) {
[NSThread sleepForTimeInterval:1.0];
}
}
Your app exits before the photo is captured. You need to add a completion-block argument to -captureNow and, in main, implement something like this:
__block BOOL done = NO;
[camera captureWithCompletionBlock:^(UIImage* image)
{
done = YES;
}];
while (!done)
{
[[NSRunLoop mainRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]];
}
I mean something like this:
- (void)captureWithCompletionBlock:(void(^)(UIImage* image))block {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
NSLog(#"about to request a capture from: %#", self.stillImageOutput);
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSLog(#"image captured: %#", [imageData bytes]);
block(image);
}];
}
You also need to retain the AVCaptureSession; just add it as a property:
@property (nonatomic, strong) AVCaptureSession *session;

playing a movie stops avcapturesession recording

I have an iOS app that records video from the front-facing camera in the background, and it was working fine. But now I am trying to play a short mp4 at the same time, and playback using MPMoviePlayerController stops the capture session.
I tried AVPlayer instead, with the same result.
I also set [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
Still no luck. Has anyone faced and solved the same problem?
Thanks for any suggestion.
Using the iOS 5 SDK.
#import "ViewController.h"
@interface ViewController ()
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
UIButton *recButton=[[UIButton alloc] initWithFrame:CGRectMake(10,200, 200,40)] ;
recButton.backgroundColor = [UIColor blackColor];
[recButton addTarget:self action:@selector(startRecording) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:recButton];
isRecording=NO;
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
}
-(void) viewWillAppear:(BOOL)animated
{
self.navigationController.navigationBarHidden = YES;
}
-(void) viewWillDisappear:(BOOL)animated
{
self.navigationController.navigationBarHidden = NO;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
return (interfaceOrientation == UIInterfaceOrientationPortrait);
} else {
return YES;
}
}
#pragma mark video playing
-(void) startRecording
{
if (isRecording) {
[self stopVideoRecording];
isRecording=NO;
}
else
{
[self initCaptureSession];
[self startVideoRecording];
isRecording=YES;
}
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:#"new"
ofType:#"mov"]];
[self playMovieAtURL:url];
}
-(void) playMovieAtURL: (NSURL*) theURL {
player =
[[MPMoviePlayerController alloc] initWithContentURL: theURL ];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
player.scalingMode = MPMovieScalingModeAspectFill;
player.controlStyle = MPMovieControlStyleNone;
[player prepareToPlay];
// Register for the playback finished notification
[[NSNotificationCenter defaultCenter]
addObserver: self
selector: @selector(myMovieFinishedCallback:)
name: MPMoviePlayerPlaybackDidFinishNotification
object: player];
[player.view setFrame: self.view.bounds];
[self.view addSubview:player.view];
// Movie playback is asynchronous, so this method returns immediately.
[player play];
}
// When the movie is done, release the controller.
-(void) myMovieFinishedCallback: (NSNotification*) aNotification
{
MPMoviePlayerController* theMovie = [aNotification object];
[[NSNotificationCenter defaultCenter]
removeObserver: self
name: MPMoviePlayerPlaybackDidFinishNotification
object: theMovie];
[player.view removeFromSuperview];
[self stopVideoRecording];
}
#pragma mark -
#pragma mark recording
-(void) initCaptureSession
{
NSLog(#"Setting up capture session");
captureSession = [[AVCaptureSession alloc] init];
//----- ADD INPUTS -----
NSLog(#"Adding video input");
//ADD VIDEO INPUT
AVCaptureDevice *VideoDevice = [self frontFacingCameraIfAvailable ];
//[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (VideoDevice)
{
NSError *error;
videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error];
if (!error)
{
if ([captureSession canAddInput:videoInputDevice])
[captureSession addInput:videoInputDevice];
else
NSLog(#"Couldn't add video input");
}
else
{
NSLog(#"Couldn't create video input");
}
}
else
{
NSLog(#"Couldn't create video capture device");
}
//ADD AUDIO INPUT
NSLog(#"Adding audio input");
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput)
{
[captureSession addInput:audioInput];
}
//----- ADD OUTPUTS ---
//ADD MOVIE FILE OUTPUT
NSLog(#"Adding movie file output");
movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
// Float64 TotalSeconds = 60; //Total seconds
// int32_t preferredTimeScale = 30; //Frames per second
// CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale); //<<SET MAX DURATION
// movieFileOutput.maxRecordedDuration = maxDuration;
movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME
if ([captureSession canAddOutput:movieFileOutput])
[captureSession addOutput:movieFileOutput];
//SET THE CONNECTION PROPERTIES (output properties)
[self CameraSetOutputProperties]; //(We call a method as it also has to be done after changing camera)
//----- SET THE IMAGE QUALITY / RESOLUTION -----
//Options:
// AVCaptureSessionPresetHigh - Highest recording quality (varies per device)
// AVCaptureSessionPresetMedium - Suitable for WiFi sharing (actual values may change)
// AVCaptureSessionPresetLow - Suitable for 3G sharing (actual values may change)
// AVCaptureSessionPreset640x480 - 640x480 VGA (check its supported before setting it)
// AVCaptureSessionPreset1280x720 - 1280x720 720p HD (check its supported before setting it)
// AVCaptureSessionPresetPhoto - Full photo resolution (not supported for video output)
NSLog(#"Setting image quality");
[captureSession setSessionPreset:AVCaptureSessionPresetMedium];
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) //Check size based configs are supported before setting them
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
//----- START THE CAPTURE SESSION RUNNING -----
[captureSession startRunning];
}
//********** CAMERA SET OUTPUT PROPERTIES **********
- (void) CameraSetOutputProperties
{
AVCaptureConnection *CaptureConnection=nil;
//SET THE CONNECTION PROPERTIES (output properties)
NSComparisonResult order = [[UIDevice currentDevice].systemVersion compare: @"5.0.0" options: NSNumericSearch];
if (order == NSOrderedSame || order == NSOrderedDescending) {
// OS version >= 5.0.0
CaptureConnection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
// if (CaptureConnection.supportsVideoMinFrameDuration)
// CaptureConnection.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
// if (CaptureConnection.supportsVideoMaxFrameDuration)
// CaptureConnection.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
// if (CaptureConnection.supportsVideoMinFrameDuration)
// {
// // CMTimeShow(CaptureConnection.videoMinFrameDuration);
// // CMTimeShow(CaptureConnection.videoMaxFrameDuration);
// }
} else {
// OS version < 5.0.0
CaptureConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[movieFileOutput connections]];
}
//Set landscape (if required)
if ([CaptureConnection isVideoOrientationSupported])
{
AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationPortrait;// AVCaptureVideoOrientationLandscapeRight; //<<<<<SET VIDEO ORIENTATION IF LANDSCAPE
[CaptureConnection setVideoOrientation:orientation];
}
//Set frame rate (if requried)
//CMTimeShow(CaptureConnection.videoMinFrameDuration);
//CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
- (void) startVideoRecording
{
//Create temporary URL to record to
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
NSError *error;
if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
{
//Error - handle if requried
NSLog(#"file remove error");
}
}
//Start recording
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
}
-(void) stopVideoRecording
{
[movieFileOutput stopRecording];
}
//********** DID FINISH RECORDING TO OUTPUT FILE AT URL **********/
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
NSLog(#"didFinishRecordingToOutputFileAtURL - enter");
BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
// A problem occurred: Find out if the recording was successful.
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if (value)
{
RecordedSuccessfully = [value boolValue];
}
}
if (RecordedSuccessfully)
{
//----- RECORDED SUCESSFULLY -----
NSLog(#"didFinishRecordingToOutputFileAtURL - success");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error)
{
if (error)
{
NSLog(#"File save error");
}
else
{
recordedVideoURL=assetURL;
}
}];
}
else {
NSString *assetURL=[self copyFileToDocuments:outputFileURL];
if(assetURL!=nil)
{
recordedVideoURL=[NSURL URLWithString:assetURL];
}
}
}
}
- (NSString*) copyFileToDocuments:(NSURL *)fileURL
{
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
NSError *error;
if (![[NSFileManager defaultManager] copyItemAtURL:fileURL toURL:[NSURL fileURLWithPath:destinationPath] error:&error]) {
NSLog(#"File save error %#", [error localizedDescription]);
return nil;
}
return destinationPath;
}
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
for ( AVCaptureConnection *connection in connections ) {
for ( AVCaptureInputPort *port in [connection inputPorts] ) {
if ( [[port mediaType] isEqual:mediaType] ) {
return connection;
}
}
}
return nil;
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
// look at all the video devices and get the first one that's on the front
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices)
{
if (device.position == AVCaptureDevicePositionFront)
{
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if ( ! captureDevice)
{
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
#pragma mark -
@end
I have this problem too. I don't know exactly how to fix it, but I know the problem is here:
[captureSession addInput:audioInput];
If you delete this line of code, it works fine, so I think it is an audio-mixing problem or some other audio issue.
I was still looking for the answer.
I have found the answer here: answer. It works!
But remember to add AudioToolbox.framework; maybe that is helpful for you.
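For reference, a common fix for this symptom on iOS 5 (and the reason AudioToolbox.framework is needed) is to keep the play-and-record category but override it to mix with other audio. Whether this matches the linked answer is an assumption; this is only a sketch:

#import <AudioToolbox/AudioToolbox.h>

// Let MPMoviePlayerController playback coexist with the AVCaptureSession audio input.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
UInt32 mixWithOthers = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers,
                        sizeof(mixWithOthers), &mixWithOthers);
[[AVAudioSession sharedInstance] setActive:YES error:nil];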

How to capture image without displaying preview in iOS

I want to capture images at specific instances, for example when a button is pushed; but I don't want to show any video preview screen. I guess captureStillImageAsynchronouslyFromConnection is what I need to use for this scenario. Currently, I can capture image if I show a video preview. However, if I remove the code to show the preview, the app crashes with the following output:
2012-04-07 11:25:54.898 imCapWOPreview[748:707] *** Terminating app
due to uncaught exception 'NSInvalidArgumentException', reason: '***
-[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:completionHandler:] -
inactive/invalid connection passed.'
*** First throw call stack: (0x336ee8bf 0x301e21e5 0x3697c35d 0x34187 0x33648435 0x310949eb 0x310949a7 0x31094985 0x310946f5 0x3109502d
0x3109350f 0x31092f01 0x310794ed 0x31078d2d 0x37db7df3 0x336c2553
0x336c24f5 0x336c1343 0x336444dd 0x336443a5 0x37db6fcd 0x310a7743
0x33887 0x3382c) terminate called throwing an exception(lldb)
So here is my implementation:
BIDViewController.h:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface BIDViewController : UIViewController
{
AVCaptureStillImageOutput *stillImageOutput;
}
@property (strong, nonatomic) IBOutlet UIView *videoPreview;
- (IBAction)doCap:(id)sender;
@end
Relevant stuff inside BIDViewController.m:
#import "BIDViewController.h"
@interface BIDViewController ()
@end
@implementation BIDViewController
@synthesize capturedIm;
@synthesize videoPreview;
- (void)viewDidLoad
{
[super viewDidLoad];
[self setupAVCapture];
}
- (BOOL)setupAVCapture
{
NSError *error = nil;
AVCaptureSession *session = [AVCaptureSession new];
[session setSessionPreset:AVCaptureSessionPresetHigh];
/*
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.videoPreview.bounds;
[self.videoPreview.layer addSublayer:captureVideoPreviewLayer];
*/
// Select a video device, make an input
AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
if (error)
return NO;
if ([session canAddInput:input])
[session addInput:input];
// Make a still image output
stillImageOutput = [AVCaptureStillImageOutput new];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
if ([session canAddOutput:stillImageOutput])
[session addOutput:stillImageOutput];
[session startRunning];
return YES;
}
- (IBAction)doCap:(id)sender {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
// Do something with the captured image
}];
}
With the above code, the crash occurs when doCap is called. On the other hand, if I uncomment the following lines in the setupAVCapture method
/*
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.videoPreview.bounds;
[self.videoPreview.layer addSublayer:captureVideoPreviewLayer];
*/
then it works without any problem.
In summary, my questions is, How can I capture images at controlled instances without showing preview ?
I use the following code for capturing from front facing camera (if available) or using the back camera. Works well on my iPhone 4S.
-(void)viewDidLoad{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
//stillImageOutput is a global variable in .h file: "AVCaptureStillImageOutput *stillImageOutput;"
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
}
-(AVCaptureDevice *)frontFacingCameraIfAvailable{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionFront){
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if (!captureDevice){
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
-(IBAction)captureNow{
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections){
for (AVCaptureInputPort *port in [connection inputPorts]){
if ([[port mediaType] isEqual:AVMediaTypeVideo]){
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error){
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments){
// Do something with the attachments if you want to.
NSLog(#"attachements: %#", exifAttachments);
}
else
NSLog(#"no attachments");
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
self.vImage.image = image;
}];
}
Well, I was facing a similar issue whereby captureStillImageAsynchronouslyFromConnection:stillImageConnection was raising an exception that the passed connection is invalid. Later on, I figured out that the issue was resolved once I made properties for the session and stillImageOutput so their values are retained.
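In other words, something along these lines (property names are assumptions), so the session and its connections are not deallocated when setupAVCapture returns:

@property (strong, nonatomic) AVCaptureSession *session;
@property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;

// ...and in setupAVCapture, configure self.session instead of a local variable:
self.session = [AVCaptureSession new];
[self.session setSessionPreset:AVCaptureSessionPresetHigh];
// add inputs/outputs to self.session as before, then:
[self.session startRunning];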
I am a JavaScript developer and wanted to create an iOS native framework for my cross-platform JavaScript project.
When I started to do so, I ran into many issues with deprecated methods and other runtime errors.
After fixing all the issues, below is the answer, which is compliant with iOS 13.5.
The code lets you take a picture on a button click without showing a preview.
Your .h file
@interface NoPreviewCameraViewController : UIViewController <AVCapturePhotoCaptureDelegate> {
AVCaptureSession *captureSession;
AVCapturePhotoOutput *photoOutput;
AVCapturePhotoSettings *photoSetting;
AVCaptureConnection *captureConnection;
UIImageView *imageView;
}
@end
Your .m file
- (void)viewDidLoad {
[super viewDidLoad];
imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height - 140)];
[self.view addSubview:imageView];
UIButton *takePicture = [UIButton buttonWithType:UIButtonTypeCustom];
[takePicture addTarget:self action:@selector(takePicture:) forControlEvents:UIControlEventTouchUpInside];
[takePicture setTitle:@"Take Picture" forState:UIControlStateNormal];
takePicture.frame = CGRectMake(40.0, self.view.frame.size.height - 140, self.view.frame.size.width - 40, 40);
[self.view addSubview:takePicture];
[self initCaptureSession];
}
- (void) initCaptureSession {
captureSession = [[AVCaptureSession alloc] init];
if([captureSession canSetSessionPreset: AVCaptureSessionPresetPhoto] ) {
[captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
}
AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
if ([captureSession canAddInput:deviceInput]) {
[captureSession addInput:deviceInput];
}
photoOutput = [[AVCapturePhotoOutput alloc] init];
if ([captureSession canAddOutput:photoOutput]) {
[captureSession addOutput:photoOutput];
}
[captureSession startRunning];
}
-(void) setNewPhotoSetting {
photoSetting = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
[photoOutput setPhotoSettingsForSceneMonitoring:photoSetting];
}
- (IBAction)takePicture:(id)sender {
captureConnection = nil;
[self setNewPhotoSetting];
for (AVCaptureConnection *connection in photoOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual: AVMediaTypeVideo]) {
captureConnection = connection;
NSLog(#"Value of connection = %#", connection);
NSLog(#"Value of captureConnection = %#", captureConnection);
break;
}
}
if (captureConnection) {
break;
}
}
[photoOutput capturePhotoWithSettings:photoSetting delegate:self];
}
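Note that the photo itself arrives through the AVCapturePhotoCaptureDelegate callback, which is not shown above. A minimal sketch of that method (displaying the result in the imageView declared earlier) might look like this:

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        NSLog(@"Capture failed: %@", error);
        return;
    }
    // fileDataRepresentation returns the encoded image data (iOS 11 and later).
    NSData *imageData = [photo fileDataRepresentation];
    imageView.image = [[UIImage alloc] initWithData:imageData];
}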

How to detect faces within a video session with iOS 5

Within my app I'm using a video session that I can take pictures with. However, I'd like to have face detection within this video session. I looked at Apple's example "SquareCam", which is exactly what I'm looking for, but implementing their code in my project is driving me bonkers.
#import "CaptureSessionManager.h"
#import <ImageIO/ImageIO.h>
@implementation CaptureSessionManager
@synthesize captureSession;
@synthesize previewLayer;
@synthesize stillImageOutput;
@synthesize stillImage;
#pragma mark Capture Session Configuration
- (id)init {
if ((self = [super init])) {
[self setCaptureSession:[[AVCaptureSession alloc] init]];
}
return self;
}
- (void)didReceiveMemoryWarning
{
// Releases the view if it doesn't have a superview.
NSLog(#"memorywarning");
// Release any cached data, images, etc that aren't in use.
}
- (void)addVideoPreviewLayer {
[self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}
- (void)addVideoInput {
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([videoDevice isFocusModeSupported:AVCaptureFocusModeLocked]) {
NSError *error = nil;
if ([videoDevice lockForConfiguration:&error]) {
NSLog(#"focus");
videoDevice.focusMode = AVCaptureFocusModeLocked;
videoDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
//videoDevice.focusMode = AVCaptureSessionPresetPhoto
videoDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
[captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
[videoDevice unlockForConfiguration];
}
else {
}
}
// Respond to the failure as appropriate.
if (videoDevice) {
NSError *error;
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (!error) {
if ([[self captureSession] canAddInput:videoIn])
[[self captureSession] addInput:videoIn];
else
NSLog(#"Couldn't add video input");
}
else
NSLog(#"Couldn't create video input");
}
else
NSLog(#"Couldn't create video capture device");
}
- (void)addStillImageOutput
{
[self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil];
[[self stillImageOutput] setOutputSettings:outputSettings];
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
[[self captureSession] addOutput:[self stillImageOutput]];
[outputSettings release];
}
- (void)captureStillImage
{
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
NSLog(#"about to request a capture from: %#", [self stillImageOutput]);
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments) {
NSLog(#"attachements: %#", exifAttachments);
} else {
NSLog(#"no attachments");
}
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[self setStillImage:image];
[image release];
[[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
}];
}
- (void)dealloc {
[[self captureSession] stopRunning];
[previewLayer release], previewLayer = nil;
[captureSession release], captureSession = nil;
[stillImageOutput release], stillImageOutput = nil;
[stillImage release], stillImage = nil;
[super dealloc];
}
@end
Besides my video session, I did succeed in recognizing faces within a UIImage that I imported into my project. I did this with the example from @Abhinav Jha (How to properly instantiate CIDetector class object in iOS 5 face detection API).
CIImage *ciImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"Photo.JPG"]];
if (ciImage == nil)
[imageView setImage:[UIImage imageNamed:@"Photo.JPG"]];
NSDictionary *options = [[NSDictionary alloc] initWithObjectsAndKeys:
CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];
CIDetector *ciDetector = [CIDetector detectorOfType:CIDetectorTypeFace
context:nil
options:options];
NSArray *features = [ciDetector featuresInImage:ciImage];
NSLog(@"no of face detected: %d", [features count]);
Hope someone can point me in the right direction with combining the two examples!
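One way to combine the two (roughly what SquareCam does) is to add an AVCaptureVideoDataOutput alongside the still image output and run the CIDetector on each sample buffer. This is only a sketch; the queue name and delegate wiring are assumptions, not taken from the question:

// 1) When configuring the session, add a video data output:
AVCaptureVideoDataOutput *videoDataOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
dispatch_queue_t videoQueue = dispatch_queue_create("videoDataOutputQueue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
if ([[self captureSession] canAddOutput:videoDataOutput])
    [[self captureSession] addOutput:videoDataOutput];

// 2) Detect faces in the AVCaptureVideoDataOutputSampleBufferDelegate callback.
//    (In practice, create the CIDetector once and reuse it; building it is expensive.)
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    NSDictionary *options = [NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                         forKey:CIDetectorAccuracy];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];
    NSArray *features = [detector featuresInImage:frame];
    NSLog(@"faces in frame: %lu", (unsigned long)[features count]);
}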

Video Recording using AVFoundation Framework iPhone?

I'm developing an application with the help of sample code from the WWDC 2010 AVCamDemo example. In the app I need to record video from the front camera of the iPhone, but since the new iPhone 4 is not available where I am, I am not able to test the code properly.
I would be really thankful if someone could give me a heads-up on whether I'm going in the right direction or not. The limited code I could test on my iPhone 3G (upgraded to iOS 4.1) crashes when I set the AVCaptureSession, as shown in the code below:
- (void)recordVideo
{
NSLog(#"video recording on");
AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:videoCaptureDevice error:nil];
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
[movieFileOutput release];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:videoInput];
[session addOutput:movieFileOutput];
[self setSession:session]; // crashes
if (![session isRunning])
{
[self performSelector:@selector(startRecording) withObject:nil afterDelay:1.0];
[session startRunning];
}
}
- (void)startRecording
{
AVCaptureConnection *videoConnection = [playVideo connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
if ([videoConnection isVideoOrientationSupported]) {
[videoConnection setVideoOrientation:[self orientation]];
}
[[self movieFileOutput] startRecordingToOutputFileURL:[self tempFileURL]
recordingDelegate:self];
}
- (void) stopRecording
{
NSLog(#"stop recording");
[[self movieFileOutput] stopRecording];
}
- (NSURL *) tempFileURL
{
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath]) {
NSLog(@"file saved");
}
[outputPath release];
return [outputURL autorelease];
}
+ (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections;
{
for ( AVCaptureConnection *connection in connections ) {
for ( AVCaptureInputPort *port in [connection inputPorts] ) {
if ( [[port mediaType] isEqual:mediaType] ) {
return [[connection retain] autorelease];
}
}
}
return nil;
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
fromConnections:(NSArray *)connections
{
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error) {}];
}
[library release];
}
movieFileOutput is released immediately after it has been allocated (line 9), so by the time it is added to the session and used for recording it has already been deallocated.
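A minimal fix is to keep the output in a retained property and only release it when you are finished with it, for example in dealloc. The property name below is an assumption:

// movieFileOutput must stay alive while the session records to it, so keep it in
// a retained property (assumed: @property (nonatomic, retain) AVCaptureMovieFileOutput *movieFileOutput;)
// instead of releasing it right after alloc:
self.movieFileOutput = [[[AVCaptureMovieFileOutput alloc] init] autorelease];
// ...later, when configuring the session:
[session addOutput:self.movieFileOutput];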