I am trying to capture video using AVCaptureMovieFileOutput, based on Apple's sample code. I don't have a great understanding of it yet, but to start video capture I am using the following code:
- (void)startRecordingWithOrientation:(AVCaptureVideoOrientation)videoOrientation
{
    AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
    if ([videoConnection isVideoOrientationSupported])
        [videoConnection setVideoOrientation:videoOrientation];
    [[self movieFileOutput] startRecordingToOutputFileURL:[self outputFileURL] recordingDelegate:self];
    // After this call the session reports that it is recording.
}
Here:
[self movieFileOutput] returns an instance of AVCaptureMovieFileOutput.
I have implemented the delegate methods to handle everything further.
The problematic part is that the delegate methods of AVCaptureMovieFileOutput are sometimes called and sometimes not.
Most often, when I pop this screen after the first recording and then come back to it, the delegate methods are not called.
I have to kill the application before recording works again.
Please tell me the solution.
EDIT: The delegate method is called only once, right after I delete and reinstall the app. After that it never gets called, even if I don't capture video and simply open the screen and go back. I am using Apple's AVCam demo and have added a screen before the recorder screen.
Are you sure that the file you are trying to save doesn't exist yet?
If it does, movie capture will fail to start, hence no delegate methods will be called.
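For example, a minimal sketch (the helper name is illustrative; it reuses the outputFileURL accessor from the code above) that deletes a stale file before each recording:
// Remove any leftover file at the output URL so the next recording can start.
- (void)removeStaleOutputFile
{
    NSURL *url = [self outputFileURL];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:[url path]]) {
        NSError *error = nil;
        if (![fileManager removeItemAtURL:url error:&error]) {
            NSLog(@"Could not remove old recording: %@", error);
        }
    }
}
Calling something like this right before startRecordingToOutputFileURL:recordingDelegate: ensures every recording starts with a fresh destination.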
I'm working with Xcode 4.6 and am trying to build a local notification feature on iOS that will execute a function upon reentering the app. Basically, I would like to change the text in some of the labels and play some sound upon reentering the app. I thought I was on the right track, but only some parts of my code work when reentering the app via a local notification.
First I added this function to my AppDelegate:
-(void)application:(UIApplication *)application didReceiveLocalNotification:(UILocalNotification *)notification{
    NSLog(@"test1"); //this traces successfully
    myAppViewController * controller = [myAppViewController alloc];
    [controller doSomething]; //calling a function in my myAppViewController.m
}
I thought I had figured it out, but now only the NSLog works in my function in myAppViewController.m:
-(void)doSomething{
    NSLog(@"do something"); //traces successfully
    self.notificationTime.text = @"something else"; //nothing happens here, it works in other functions but not here
    [self doSomethingElse]; //calling another function from this function for further testing
}
The next function is called....
-(void)doSomethingElse{
    NSLog(@"do something else"); //this works
    //this whole thing doesn't work -- no sound --
    NSURL* url = [[NSBundle mainBundle] URLForResource:@"cash" withExtension:@"mp3"];
    NSAssert(url, @"URL is valid.");
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    [self.player prepareToPlay];
    [self.player play];
    //this doesn't work again
    self.notificationTime.text = @"something else";
}
I was hoping to get some general advice here, and it would be much appreciated. If anyone knows a completely different way of solving the problem, that would be great as well!
The didReceiveLocalNotification method is only called when your application is running in the foreground. If you see a badge and tap the app icon to start it, you need to process the local notification in application:willFinishLaunchingWithOptions: (or application:didFinishLaunchingWithOptions:). To get at your local notification in either of these two methods, use UIApplicationLaunchOptionsLocalNotificationKey as a key into the launch options dictionary.
Note that once you extract the local notification from the launch options, it is a viable approach to call your didReceiveLocalNotification method with it.
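A minimal sketch of that, reusing the existing didReceiveLocalNotification handler:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    UILocalNotification *notification = [launchOptions objectForKey:UIApplicationLaunchOptionsLocalNotificationKey];
    if (notification) {
        // The app was launched from the notification; run the same handling code.
        [self application:application didReceiveLocalNotification:notification];
    }
    return YES;
}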
You shouldn't need to allocate a second instance of the app controller. You can just use self. If you do that, does the code work as expected?
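If the handler stays in the app delegate, one way to reach the existing instance is a sketch like the following, assuming myAppViewController is the window's rootViewController (adjust for your actual view hierarchy):
// Reuse the controller that is already on screen instead of allocating a new one.
myAppViewController *controller = (myAppViewController *)self.window.rootViewController;
[controller doSomething];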
I have an application which takes some pictures.
My whole application is based on the AVCam sample code from WWDC 2010.
I've experimented with it a lot, and yet I still can't figure out how to release the camera view properly so that the camera session is released as well...
All i'm trying to do is the following:
Open camera view controller
Take some photos
Close camera view Controller
Open it again
The second time I push the view controller, the session is lost: the preview is not available and capturing is not available either. I've published the full example code on GitHub.
My workaround for the issue was not to release the camera at all, so the camera view controller acts as a singleton, which I don't think is the right approach. Moreover, with this behavior I couldn't figure out how to support the camera when the application went to the background (during a phone call, for example).
Please advise: how do I tear down the camera session, and is it important to do so?
I've added the following method to AVCamCaptureManager:
- (void)destroySession {
    if ([delegate respondsToSelector:@selector(captureManagerSessionWillEnd:)]) {
        [delegate captureManagerSessionWillEnd:self];
    }

    // remove the device inputs
    [session removeInput:[self videoInput]];
    [session removeInput:[self audioInput]];

    // release the session
    [session release];

    // release the AVCamRecorder
    [recorder release];

    if ([delegate respondsToSelector:@selector(captureManagerSessionEnded:)]) {
        [delegate captureManagerSessionEnded:self];
    }
}
I'm calling destroySession when the view controller holding the camera gets closed (in my example it's -closeCamera: of AVCamViewController).
For the full working example, you're welcome to download AVCam-CameraReleaseTest on github.com.
I think this may help you, have a look:
http://red-glasses.com/index.php/tutorials/ios4-take-photos-with-live-video-preview-using-avfoundation/
I'm writing my first iPad app that plays a video on a portion of the screen. My problem is that if the user changes to another view while the video is playing, the audio keeps playing in the background. I assume I have to add something to the "viewDidUnload" method but I'm not sure what to do. Any ideas? Thanks for any info.
You can try adding
[myMoviePlayer pause]; // assume myMoviePlayer is an instance variable
[myMoviePlayer stop];
myMoviePlayer = nil;
to your viewWillDisappear method.
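Put together, a minimal sketch of that method (assuming myMoviePlayer is an MPMoviePlayerController instance variable):
- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [myMoviePlayer pause];
    [myMoviePlayer stop];
    myMoviePlayer = nil;
}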
I'm currently attempting to set up background audio for an app I'm developing for iOS 4. Unlike other background audio apps such as Pandora, however, the app doesn't have a dedicated music player view controller, which makes the task a bit more confusing.
I've set the appropriate Info.plist settings correctly and have an AVAudioPlayer object in my app delegate which is accessible from everywhere. When the user plays a song, I replace the AVAudioPlayer with a new one initialized with the song and play it. This all works great, except now I have no idea how to go about supporting remote control events.
Based on Apple's documentation, I have this:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            if ([iPhoneAppDelegate backgroundAudioPlayer].playing)
                [iPhoneAppDelegate pauseBackgroundAudioPlayer];
            else
                [iPhoneAppDelegate playBackgroundAudioPlayer];
            break;
    }
}
The thing is, where do I put this? Apple's documentation seems to suggest it should go in a view controller, but my app has lots of view controllers and navigation controllers. Wherever I try to put it, tapping the Toggle Play/Pause button in the multitasking tray's remote controls either causes the song to pause for a moment and then unpause, or somehow causes the song to play twice.
The documentation examples are a bit misleading, but there is no need to subclass anything anywhere. The correct place to put remoteControlReceivedWithEvent: is in the application delegate, as it remains in the responder chain regardless of whether the app is in the foreground or not. Also the begin/end receiving remote control events should be based on whether you actually need the events, not on the visibility of some random view.
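A sketch of that approach in the app delegate (the backgroundAudioPlayer property and play/pause helpers are the ones from the question, and the delegate is assumed to be the usual UIResponder subclass from the Xcode template):
- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type != UIEventTypeRemoteControl)
        return;
    if (event.subtype == UIEventSubtypeRemoteControlTogglePlayPause) {
        // Toggle the shared player owned by the app delegate itself.
        if (self.backgroundAudioPlayer.playing)
            [self pauseBackgroundAudioPlayer];
        else
            [self playBackgroundAudioPlayer];
    }
}
Then call beginReceivingRemoteControlEvents when playback actually starts and endReceivingRemoteControlEvents when it stops, rather than tying them to a view's lifecycle.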
I found a couple of solutions to receiving global remote control events on the Apple Developer Forums after a bit of searching.
One way is to subclass UIWindow and override its remoteControlReceivedWithEvent:.
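A sketch of this first approach (the class name and notification name are illustrative):
@interface RemoteControlWindow : UIWindow
@end

@implementation RemoteControlWindow

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeRemoteControl) {
        // Hand the event to whatever object owns the audio player, e.g. via a notification.
        [[NSNotificationCenter defaultCenter] postNotificationName:@"RemoteControlEventReceived" object:event];
    } else {
        [super remoteControlReceivedWithEvent:event];
    }
}

@end
Use this class for the app's main window so it sits at the end of the responder chain.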
The second, perhaps nicer way is to subclass UIApplication and override sendEvent:. That way, you can intercept all the remote control events and handle them there globally, and not have any other responders handle them later in the responder chain.
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Handle event
    } else {
        [super sendEvent:event];
    }
}
The second method didn't work for me; sendEvent: was never called. However, the first method (subclassing UIWindow) worked just fine.
I struggled with this one for a while, and none of the answers above worked. The bug in my code (and I hope this helps someone reading this) was that I had the audio session set to mix with others. You want to be the foreground audio player to get remote control events. Check whether you have INCORRECT code like this:
[[AVAudioSession sharedInstance] setDelegate: self];
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: nil];
UInt32 doSetProperty = 1;
AudioSessionSetProperty (
kAudioSessionProperty_OverrideCategoryMixWithOthers,
sizeof (doSetProperty),
&doSetProperty
);
NSError *activationError = nil;
[[AVAudioSession sharedInstance] setActive: YES error: &activationError];
And remove the AudioSessionSetProperty call, or change doSetProperty to 0.
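In other words, the session setup can stay as simple as this (a sketch):
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];
// No kAudioSessionProperty_OverrideCategoryMixWithOthers override: staying the
// non-mixing foreground audio app is what lets the remote control events arrive.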
No need to subclass Window or forward events. Simply handle it from your main view controller. See the Audio Mixer (MixerHost) example for details.
http://developer.apple.com/LIBRARY/IOS/#samplecode/MixerHost/Listings/Classes_MixerHostViewController_m.html
Documentation explains it very well.
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
One thing that seems to influence this behavior is any category options you set for your AVAudioSession using setCategory:withOptions:error: instead of just setCategory:error:. In particular, from trial and error, it appears that if you set AVAudioSessionCategoryOptionMixWithOthers you will not get remote control events; the now playing controls will still control the iPod app. If you set AVAudioSessionCategoryOptionDuckOthers you will get remote control events, but it seems like there may be some ambiguity regarding which app is controlled. Setting the categoryOptions to 0 or just calling setCategory:error: works best.
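For reference, a sketch of the variant that worked best in that trial and error:
NSError *error = nil;
// Passing 0 for the options (or using plain setCategory:error:) avoids
// MixWithOthers/DuckOthers and keeps remote control events pointed at this app.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                 withOptions:0
                                       error:&error];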
I have an application that requires the iPhone screen to remain active (or not, depending on user choice). I've done this by disabling the application idle timer, which works fine and dandy until I start playing media via the MPMusicPlayerController. Due to a bug in the SDK, this then reenables the idle timer with no apparent way to disable it again.
My app flow is:
App starts
Screen stays on
<...time passes...>
Play audio file
Idle timer kicks in
Screen turns off
I have an empty audio file playing in the background to stop the phone going into deep sleep, but I'd really like to keep the screen unlocked too.
Has anyone managed to figure out a workaround for this?
I had a similar problem and found a fix for it. The fix might work for you too:
I call a method periodically (every 10 seconds), which sets idleTimerDisabled first to NO, then to YES.
- (void)calledEveryTenSeconds
{
    [UIApplication sharedApplication].idleTimerDisabled = NO;
    [UIApplication sharedApplication].idleTimerDisabled = YES;
}
Setting it to YES alone does not fix the problem; it seems the property value has to actually change in order to be recognized by UIApplication.
My problem was that the screen kept turning dark as soon as I switched music tracks on the iPod player via the headphone remote. My guess is that this is the same issue you are experiencing.
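For completeness, one way to drive that periodic call (a sketch using NSTimer; the ivar name is illustrative):
// The run loop retains a scheduled timer; keep a reference only if you need
// to invalidate it later.
idleTimerKickTimer = [NSTimer scheduledTimerWithTimeInterval:10.0
                                                      target:self
                                                    selector:@selector(calledEveryTenSeconds)
                                                    userInfo:nil
                                                     repeats:YES];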
You should simply turn off the idle timer. What I usually do in a view controller that needs to stay 'awake' is this:
- (void)viewWillAppear:(BOOL)animated
{
    [[UIApplication sharedApplication] setIdleTimerDisabled:YES];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [[UIApplication sharedApplication] setIdleTimerDisabled:NO];
}
This will make sure the screen will not get locked due to user inactivity.
I found a solution to this problem: invoke a method that disables the idle timer about 5 seconds after you start playing the music. It's a bit of a hack, but it is a workaround.
[[SoundEngine mainEngine] playMusic];
[self performSelector:@selector(setIdleTimeDisabled) withObject:nil afterDelay:5.0];

- (void)setIdleTimeDisabled {
    [UIApplication sharedApplication].idleTimerDisabled = YES;
    NSLog(@"Setting idleTimer to TRUE");
}
let player = MPMusicPlayerController.applicationMusicPlayer()
player.setQueueWithStoreIDs(["some id"])
player.play()
player.pause()