How to use AVCaptureConnections? - iPhone

I am new to AVFoundation and I am trying to implement a video camera with it. Here is my basic setup. When you tap a button, it calls the showCamera method. In there I create the session, then add an audio input and a video input, then add the video output.
Where in here do I add the AVCaptureConnection, and how do I do it? Is there a tutorial that shows how to use connections? Any help is appreciated.
- (IBAction) showCamera
{
    // Add the cam view to the current view, on top of the controller
    [[[[UIApplication sharedApplication] delegate] window] addSubview:camView];

    session = [[AVCaptureSession alloc] init];

    // Set the preset on the session to make the recording quality high
    if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }

    // Add inputs and outputs.
    NSArray *devices = [AVCaptureDevice devices];

    // Iterate over all capture devices on the phone
    for (AVCaptureDevice *device in devices)
    {
        if ([device hasMediaType:AVMediaTypeVideo])
        {
            if ([device position] == AVCaptureDevicePositionBack)
            {
                // Add rear video input to session
                [self addRearCameraInputToSession:session withDevice:device];
            }
        }
        else if ([device hasMediaType:AVMediaTypeAudio])
        {
            // Add microphone input to session
            [self addMicrophoneInputToSession:session withDevice:device];
        }
        else
        {
            // Show an error that your phone does not have a camera
        }
    }

    // Add movie output
    [self addMovieOutputToSession:session];

    // Construct preview layer
    [self constructPreviewLayerWithSession:session onView:camView];
}

You don't add AVCaptureConnections manually. When you have both an input and an output added to the AVCaptureSession object, the connections are automatically created for you. Quoth the documentation:
When an input or an output is added to a session, the session greedily forms connections between all the compatible capture inputs’ ports and capture outputs.
Unless you need to disable one of the automatically-created connections, or change the videoMirrored or videoOrientation properties, you shouldn't have to worry about them at all.
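If you do need to adjust one, fetch it from the output after both the input and the output are in the session. A minimal sketch, assuming a movieOutput that has already been added to the session (the names here are illustrative):
AVCaptureConnection *connection = [movieOutput connectionWithMediaType:AVMediaTypeVideo];
// Adjust the automatically-created connection rather than building one yourself.
if ([connection isVideoOrientationSupported]) {
    [connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
}
if ([connection isVideoMirroringSupported]) {
    [connection setVideoMirrored:NO];
}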

Take a look at the following URLs...
Documentation from Apple:
An Article:
Video Recording using AVFoundation Framework iPhone?
I think it will help you.

Related

How to record video without taking up the whole screen

I'm trying to implement a video recorder view controller that will be embedded inside another view controller.
I implemented an MPMoviePlayerViewController, but that takes up the whole screen and is not what I want. I followed this tutorial but did not get the results I was looking for - http://www.appcoda.com/video-recording-playback-ios-programming/
tl;dr - Is it possible to record a video without taking up the whole screen? How would one go about implementing that?
Try using a container view controller. It allows one UIViewController and its views to be displayed in a container of your desired size, similar to a UITabBarController, which displays its child view controllers in a specified part of the screen rather than the whole screen.
For examples and documentation, check out Apple's (hopefully secure) Developer site.
http://developer.apple.com/library/ios/#featuredarticles/ViewControllerPGforiPhoneOS/CreatingCustomContainerViewControllers/CreatingCustomContainerViewControllers.html
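A minimal sketch of that containment approach, assuming a hypothetical MyRecorderViewController that hosts the camera UI:
// Embed the recorder in a fixed-size region instead of full screen.
MyRecorderViewController *recorderVC = [[MyRecorderViewController alloc] init];
[self addChildViewController:recorderVC];
recorderVC.view.frame = CGRectMake(20, 80, 280, 200); // whatever size you want
[self.view addSubview:recorderVC.view];
[recorderVC didMoveToParentViewController:self];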
You can use AVFoundation if you need more control; see this link to the Apple documentation.
First, set up your AVCaptureSession with an AVCaptureDevice, an AVCaptureDeviceInput, and an AVCaptureVideoPreviewLayer.
Second, create the view you want to display the output in and add it to your view controller:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
[session addInput:videoInput];

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];

// A small 100x100 view instead of the full screen
UIView *videoDisplay = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
previewLayer.frame = videoDisplay.bounds;
[videoDisplay.layer addSublayer:previewLayer];
[self.view addSubview:videoDisplay]; // add the container view to the controller

[session startRunning];
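That only gives you a live preview; to actually record from this setup you would also attach an output. A rough sketch with AVCaptureMovieFileOutput (the temporary output path is just an example, and self is assumed to adopt AVCaptureFileOutputRecordingDelegate):
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}
// Write to a temp file; the recording delegate is told when the file is finished.
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.mov"];
[movieOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:path]
                         recordingDelegate:self];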

Is it possible to play video using AVPlayer in the background?

I am using AVPlayer to show video clips, and when I go to the background the video stops. How can I keep the video playing?
I have searched about background tasks and background threads; iOS only supports music in the background (not video):
http://developer.apple.com/library/ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/ManagingYourApplicationsFlow/ManagingYourApplicationsFlow.html
Here is some discussion about playing video in the background:
1) https://discussions.apple.com/thread/2799090?start=0&tstart=0
2) http://www.cocoawithlove.com/2011/04/background-audio-through-ios-movie.html
But there are many apps in the App Store that play video in the background, like:
Swift Player: https://itunes.apple.com/us/app/swift-player-speed-up-video/id545216639?mt=8&ign-mpt=uo%3D2
SpeedUpTV: https://itunes.apple.com/ua/app/speeduptv/id386986953?mt=8
This method supports all the possibilities:
Screen locked by the user;
Home button pressed.
As long as you have an instance of AVPlayer running, iOS prevents auto-lock of the device.
First you need to configure the application to support background audio: in the Info.plist file, add the audio element to the UIBackgroundModes array.
Then put these lines into
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
in your AppDelegate.m:
[[AVAudioSession sharedInstance] setDelegate:self];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
and add #import <AVFoundation/AVFoundation.h>.
Then, in the view controller that controls the AVPlayer:
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}
and
- (void)viewWillDisappear:(BOOL)animated
{
    [mPlayer pause];
    [super viewWillDisappear:animated];
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
}
then respond to the remote-control events:
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            if ([mPlayer rate] == 0) {
                [mPlayer play];
            } else {
                [mPlayer pause];
            }
            break;
        case UIEventSubtypeRemoteControlPlay:
            [mPlayer play];
            break;
        case UIEventSubtypeRemoteControlPause:
            [mPlayer pause];
            break;
        default:
            break;
    }
}
Another trick is needed to resume playback if the user presses the home button (in which case playback is suspended with a fade out).
Wherever you control playback of the video (I do it in my play methods), register:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(applicationDidEnterBackground:) name:UIApplicationDidEnterBackgroundNotification object:nil];
and implement the corresponding method, which resumes playback after a short delay:
- (void)applicationDidEnterBackground:(NSNotification *)notification
{
    [mPlayer performSelector:@selector(play) withObject:nil afterDelay:0.01];
}
This works for me to play video in the background.
Thanks to all.
If you try to change the background mode: sorry, the App Store won't approve it. See: MPMoviePlayerViewController playback video after going to background for youtube
In my research, one approach is to take the soundtrack out and play it in the background (the video itself pauses when the app is backgrounded), record the playback time, and use it to resume the video when the app returns to the foreground.
It is not possible to play background music/video using AVPlayer, but it is possible using MPMoviePlayerViewController. I have done this in one of my apps using this player, and the app made it successfully into the App Store.
Try the snippet below; I've already integrated it into my app and it has been useful for me. Hope it works for you too!
Follow the steps given below:
Add UIBackgroundModes in the APPNAME-Info.plist, with the selection
App plays audio
Then add the AudioToolbox framework to the project's Frameworks folder.
In APPNAMEAppDelegate.h add:
-- #import <AVFoundation/AVFoundation.h>
-- #import <AudioToolbox/AudioToolbox.h>
In APPNAMEAppDelegate.m add the following:
// Set up the audio session
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setDelegate:self];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];

/* Pick any one of them */
// 1. Overriding the output audio route
//UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
//AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

// 2. Changing the default output audio route
UInt32 doChangeDefaultRoute = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker, sizeof(doChangeDefaultRoute), &doChangeDefaultRoute);
into the
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
but before the two lines:
[self.window addSubview:viewController.view];
[self.window makeKeyAndVisible];
Enjoy Programming!!
Swift version for the accepted answer.
In the delegate:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: nil)
AVAudioSession.sharedInstance().setActive(true, error: nil)
In the view controller that controls AVPlayer
override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
}
override func viewWillDisappear(animated: Bool) {
    mPlayer.pause()
    super.viewWillDisappear(animated)
    UIApplication.sharedApplication().endReceivingRemoteControlEvents()
    self.resignFirstResponder()
}
Don't forget to "import AVFoundation"
In addition to fattomhk's response, here's what you can do to forward the video to the time it should be at when your application comes back to the foreground (a sketch follows the list):
1) Get the currentPlaybackTime of the playing video when going to the background and store it in lastPlayBackTime;
2) Store the time when the application goes to the background (in userDefaults, probably);
3) Get the time again when the application comes to the foreground;
4) Calculate the duration between the background and foreground times;
5) Set the current playback time of the video to lastPlayBackTime + duration.
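A rough sketch of those steps, assuming mPlayer is your MPMoviePlayerController (e.g. the moviePlayer of an MPMoviePlayerViewController) and lastPlayBackTime / backgroundDate are instance variables you add yourself:
// Steps 1 and 2: remember the playhead and the moment we left.
- (void)applicationDidEnterBackground:(NSNotification *)notification
{
    lastPlayBackTime = mPlayer.currentPlaybackTime;
    backgroundDate = [NSDate date]; // assumed strong NSDate ivar
}

// Steps 3 to 5: advance the playhead by the time spent in the background.
- (void)applicationWillEnterForeground:(NSNotification *)notification
{
    NSTimeInterval elapsed = [[NSDate date] timeIntervalSinceDate:backgroundDate];
    mPlayer.currentPlaybackTime = lastPlayBackTime + elapsed;
    [mPlayer play];
}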
If you are playing video using a web view, you can use JavaScript to keep the video playing in the background:
strVideoHTML = @"<html><head><style>body.........</style></head> <body><div id=\"overlay\"><div id=\"youtubelogo1\"></div></div><div id=\"player\"></div> <script> var tag = document.createElement('script'); tag.src = \"http://www.youtube.com/player_api\"; var firstScriptTag = document.getElementsByTagName('script')[0]; firstScriptTag.parentNode.insertBefore(tag, firstScriptTag); var player; events: { 'onReady': onPlayerReady, } }); } function onPlayerReady(event) { event.target.playVideo(); } function changeBG(url){ document.getElementById('overlay').style.backgroundImage=url;} function manualPlay() { player.playVideo(); } function manualPause() { player.pauseVideo(); } </script></body> </html>";
NSString *html = [NSString stringWithFormat:strVideoHTML, url, width, height, videoid];
webvideoView.delegate = self;
[webvideoView loadHTMLString:html baseURL:[NSURL URLWithString:@"http://www.your-url.com"]];
On view disappear, call:
strscript = @"manualPlay();";
[webvideoView stringByEvaluatingJavaScriptFromString:strscript];
I'd like to add something that for some reason ended up being the culprit for me. I had used AVPlayer and background play for a long time without problems, but this one time I just couldn't get it to work.
I found out that when you go to the background, the rate property of the AVPlayer sometimes dips to 0.0 (i.e. paused), and for that reason we need to KVO-observe the rate property at all times, or at least when we go to the background. If the rate drops to 0.0 and we can assume that the user wants playback to continue (i.e. the user did not deliberately tap pause in the remote controls, the movie did not end, etc.), we need to call play on the AVPlayer again.
AFAIK there is no other toggle on the AVPlayer to keep it from pausing itself when the app goes to the background.
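A sketch of that rate check, assuming player is your AVPlayer ivar and userWantsPlayback is a flag you maintain from your own play/pause UI:
// Somewhere during setup, e.g. viewDidLoad:
[player addObserver:self forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"rate"]) {
        float rate = [change[NSKeyValueChangeNewKey] floatValue];
        // Paused by the system rather than by the user? Resume.
        if (rate == 0.0f && self.userWantsPlayback) {
            [player play];
        }
    }
}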

Switch the flash on/off while recording video in an iPhone app

I am recording video from my iPhone app.
I am using an overlay over the camera and placed a button in the overlay. I want to know which function lets me turn the camera's flash on/off while the video is being recorded.
How can I set up a flash button in the camera overlay?
If you are using AVFoundation for video recording, you should first check whether the device has a torch/flash, because the torch is only available when recording from the back camera; you cannot use the torch/flash with the front camera. Use something like this:
- (BOOL)hasTorch
{
    return [[[self avCaptureDeviceInput] device] hasTorch];
}
and then set the torch accordingly using AVCaptureTorchMode:
- (void)setTorchMode:(AVCaptureTorchMode)torchMode
{
    AVCaptureDevice *device = [[self videoInput] device];
    if ([device isTorchModeSupported:torchMode] && [device torchMode] != torchMode) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            [device setTorchMode:torchMode];
            [device unlockForConfiguration];
        } else {
            id deleg = [self delegate];
            if ([deleg respondsToSelector:@selector(acquiringDeviceLockFailedWithError:)]) {
                [deleg acquiringDeviceLockFailedWithError:error];
            }
        }
    }
}
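Wiring that up to the overlay button could then look something like this (hypothetical action name, reusing the videoInput accessor from above):
- (IBAction)torchButtonTapped:(id)sender
{
    // Flip between on and off each time the button is tapped.
    AVCaptureTorchMode current = [[[self videoInput] device] torchMode];
    [self setTorchMode:(current == AVCaptureTorchModeOn) ? AVCaptureTorchModeOff
                                                         : AVCaptureTorchModeOn];
}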
If you follow the AVCam demo from Apple, you will basically get your answers.
Assuming you are using UIImagePickerController (from your tag), use the cameraFlashMode property provided by UIImagePickerController to control the flash.
You can set its value to UIImagePickerControllerCameraFlashModeOff, UIImagePickerControllerCameraFlashModeAuto, or UIImagePickerControllerCameraFlashModeOn. The default is auto.
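For example, before presenting the picker (picker being your UIImagePickerController instance):
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
picker.cameraFlashMode = UIImagePickerControllerCameraFlashModeOn; // or Off / Auto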

AVCaptureDeviceOutput not calling delegate method captureOutput

I am building an iOS application (my first) that processes video still frames on the fly. To dive into this, I followed an example from the AV* documentation from Apple.
The process involves setting up an input (the camera) and an output. The output works with a delegate, which in this case is the controller itself (it conforms and implements the method needed).
The problem I am having is that the delegate method never gets called. The code below is the implementation of the controller and it has a couple of NSLogs. I can see the "started" message, but the "delegate method called" never shows.
This code is all within a controller that implements the "AVCaptureVideoDataOutputSampleBufferDelegate" protocol.
- (void)viewDidLoad {
    [super viewDidLoad];

    // Initialize AV session
    AVCaptureSession *session = [AVCaptureSession new];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        [session setSessionPreset:AVCaptureSessionPresetPhoto];

    // Initialize back camera input
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if ([session canAddInput:input]) {
        [session addInput:input];
    }

    // Initialize image output
    AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [output setVideoSettings:rgbOutputSettings];
    [output setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)
    //[output addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:@"AVCaptureStillImageIsCapturingStillImageContext"];
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
    [[output connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];

    [session startRunning];
    NSLog(@"started");
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"delegate method called");
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    self.theImage.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}
Note: I'm building with iOS 5.0 as a target.
Edit:
I've found a question that, although asking for a solution to a different problem, is doing exactly what my code is supposed to do. I copied the code from that question verbatim into a blank Xcode app and added NSLogs to the captureOutput function, and it doesn't get called. Is this a configuration issue? Is there something I'm missing?
Your session is a local variable. Its scope is limited to viewDidLoad. Since this is a new project, I assume it's safe to say that you're using ARC. In that case that object won't leak and therefore continue to live as it would have done in the linked question, rather the compiler will ensure the object is deallocated before viewDidLoad exits.
Hence your session isn't running because it no longer exists.
(Aside: the self.theImage.image = ... is unsafe since it performs a UIKit action off the main queue; you probably want to dispatch_async that over to dispatch_get_main_queue().)
So, sample corrections:
@implementation YourViewController
{
    AVCaptureSession *session;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    // Initialize AV session
    session = [AVCaptureSession new];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        /* ... etc ... */
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"delegate method called");
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_sync(dispatch_get_main_queue(),
    ^{
        self.theImage.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
    });
}
Most people advocate using an underscore at the beginning of instance variable names nowadays, but I omitted it for simplicity. You can use Xcode's built-in refactor tool to fix that up after you've verified that the diagnosis is correct.
I moved the CGImageRelease inside the block sent to the main queue to ensure its lifetime extends beyond its capture into a UIImage. I'm not immediately able to find any documentation to confirm that CoreFoundation objects have their lifetime automatically extended when captured in a block.
I've found one more reason why the didOutputSampleBuffer delegate method may not be called — file-saving and sample-buffer output connections are mutually exclusive. In other words, if your session already has an AVCaptureMovieFileOutput and you then add an AVCaptureVideoDataOutput, only the AVCaptureFileOutputRecordingDelegate methods are called.
Just for reference, I couldn't find an explicit description of this limitation anywhere in the AV Foundation framework documentation, but Apple support confirmed it a few years ago, as noted in this SO answer.
One way to solve the problem is to remove AVCaptureMovieFileOutput entirely and manually write recorded frames to the file in the didOutputSampleBuffer delegate method, alongside your custom buffer data processing. You may find these two SO answers useful.
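A minimal sketch of that manual-writing path, assuming you have already created and started an AVAssetWriter (assetWriter) with an attached AVAssetWriterInput (writerInput) configured for video:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // ... your custom per-frame processing goes here ...

    // Append the same buffer to the writer to build the movie file.
    if (assetWriter.status == AVAssetWriterStatusWriting &&
        writerInput.isReadyForMoreMediaData) {
        [writerInput appendSampleBuffer:sampleBuffer];
    }
}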
In my case, the problem was that I called
if ([_session canAddOutput:_videoDataOutput])
    [_session addOutput:_videoDataOutput];
before I called
[_session startRunning];
I now call addOutput: only after startRunning.
Hope this helps somebody.
My captureOutput function was not called either, and the accepted answer did not exactly point at my problem, since my session was already an instance variable.
BUT, the dispatch queue for my video frames was local. The dispatch queue must ALSO be an instance variable. I don't quite understand why this should be necessary; perhaps the underlying AVCapture code only keeps a weak pointer to it?
The documentation is very confusing on this.
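In other words, something like this instead of local variables in viewDidLoad (a sketch, not the only way to structure it):
@implementation MyViewController
{
    AVCaptureSession *session;              // keeps the session alive
    dispatch_queue_t videoDataOutputQueue;  // keep the delegate queue alive too
}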

Loading Multiple Movies in iPhone MPMoviePlayerController

- (void)initAndPlayMovie:(NSURL *)movieURL
{
    // Initialize a movie player object with the specified URL
    MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    if (mp)
    {
        self.moviePlayer = mp;
        [mp release];
        [self.moviePlayer play];
    }
}
Here, in the above code, we can pass just one movie URL. Isn't it possible to pass multiple URLs to it, so the movie player will load the second URL after playing the first one?
Is it possible? How can we do that?
Right now, I try to pass another URL after the first one finishes:
- (void)moviePlayBackDidFinish:(NSNotification *)notification
{
    [self initAndPlayMovie:secondURL];
}
The device first changes its orientation while loading, and after loading it comes back to landscape mode again.
How can I resolve this problem?
You might want to change the orientation by changing the status bar orientation before you start playing videos, and change it back after you are done with all of them:
[[UIApplication sharedApplication] setStatusBarOrientation:UIInterfaceOrientationLandscapeRight animated:YES];
You should be able to call setContentURL just as the first movie is about to close, to change to another movie. Check endPlaybackTime and fire off your method to invoke setContentURL one second before the movie ends.
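A sketch of that timing trick, assuming secondURL is already known and self.moviePlayer is the running player:
// Schedule the swap roughly one second before the first movie ends.
NSTimeInterval fireAfter = self.moviePlayer.duration - 1.0;
[self performSelector:@selector(advanceToNextMovie) withObject:nil afterDelay:fireAfter];

- (void)advanceToNextMovie
{
    // Reusing the same player avoids tearing it down and re-creating it,
    // which is what triggers the orientation flip during reload.
    [self.moviePlayer setContentURL:secondURL];
    [self.moviePlayer play];
}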