I'm in the process of developing an iPhone app, and I'm running into a very minor issue. Essentially, when I load a certain view, I want the video to play. When the video is done playing it'll return to the view with a menu and some options. One of the options is to replay the video. Currently I have the replay video option working, but I'm unable to play the video when the view is loaded.
I've implemented a playMovie and moviePlayBackDidFinish method. I then placed [self playMovie] in the viewDidLoad method thinking it would call the playMovie method initially and thus play the movie when the view got loaded, but it doesn't seem to work.
If anyone could explain why this method of thinking doesn't work and also a proper way of doing this, it'd be greatly appreciated.
I would try viewDidAppear: instead of viewDidLoad, depending on your view controller. viewDidLoad typically runs only once, when the view is first loaded into memory, and that can happen before the view is actually on screen, so you may be wasting the playback while the view is still loading in the background.
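For example, a minimal sketch (assuming your existing playMovie method creates and starts the player):
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // Start playback once the view is actually on screen,
    // not merely loaded into memory.
    [self playMovie];
}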
Also, here's a sample movie-did-finish method; I'm wondering if you're releasing something you shouldn't be, or never telling the movie to stop properly:
- (void) moviePlayBackDidFinish:(NSNotification*)notification
{
NSNumber *reason = [[notification userInfo] objectForKey:MPMoviePlayerPlaybackDidFinishReasonUserInfoKey];
switch ([reason integerValue])
{
/* The end of the movie was reached. */
case MPMovieFinishReasonPlaybackEnded:
/*
Add your code here to handle MPMovieFinishReasonPlaybackEnded.
*/
break;
/* An error was encountered during playback. */
case MPMovieFinishReasonPlaybackError:
NSLog(#"An error was encountered during playback");
[self performSelectorOnMainThread:#selector(displayError:) withObject:[[notification userInfo] objectForKey:#"error"]
waitUntilDone:NO];
[self removeMovieViewFromViewHierarchy];
[self removeOverlayView];
[self.backgroundView removeFromSuperview];
break;
/* The user stopped playback. */
case MPMovieFinishReasonUserExited:
[self removeMovieViewFromViewHierarchy];
[self removeOverlayView];
[self.backgroundView removeFromSuperview];
break;
default:
break;
}
}
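For completeness, here's a rough sketch of how the player and the finish notification might be wired up inside playMovie (the moviePlayer property and the bundled file name are placeholders; adapt them to however you create your player):
- (void)playMovie
{
    // Hypothetical bundled movie; replace with your own URL.
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"intro" withExtension:@"mp4"];
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];

    // Without this registration, moviePlayBackDidFinish: is never called.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(moviePlayBackDidFinish:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:self.moviePlayer];

    self.moviePlayer.view.frame = self.view.bounds;
    [self.view addSubview:self.moviePlayer.view];
    [self.moviePlayer play];
}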
I am using AVPlayer to show video clips, and when I go back (app in background) the video stops. How can I keep the video playing?
I have searched about background tasks and background threads; iOS only supports audio in the background (not video).
http://developer.apple.com/library/ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/ManagingYourApplicationsFlow/ManagingYourApplicationsFlow.html
Here is some discussion about playing video in the background:
1) https://discussions.apple.com/thread/2799090?start=0&tstart=0
2) http://www.cocoawithlove.com/2011/04/background-audio-through-ios-movie.html
But there are many apps in the App Store that play video in the background, like:
Swift Player : https://itunes.apple.com/us/app/swift-player-speed-up-video/id545216639?mt=8&ign-mpt=uo%3D2
SpeedUpTV : https://itunes.apple.com/ua/app/speeduptv/id386986953?mt=8
This method supports all the possibilities:
Screen locked by the user;
Home button pressed;
As long as you have an instance of AVPlayer running, iOS prevents auto-lock of the device.
First you need to configure the application to support background audio in the Info.plist file, by adding the audio element to the UIBackgroundModes array.
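In the raw plist XML that entry looks like this (only the UIBackgroundModes key is shown; the rest of your Info.plist stays as it is):
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>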
Then, in your AppDelegate.m, add these calls inside
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
[[AVAudioSession sharedInstance] setDelegate: self];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
and add #import <AVFoundation/AVFoundation.h>
Then in your view controller that controls AVPlayer
-(void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
}
and
- (void)viewWillDisappear:(BOOL)animated
{
[mPlayer pause];
[super viewWillDisappear:animated];
[[UIApplication sharedApplication] endReceivingRemoteControlEvents];
[self resignFirstResponder];
}
then respond to the remote-control events:
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
switch (event.subtype) {
case UIEventSubtypeRemoteControlTogglePlayPause:
if([mPlayer rate] == 0){
[mPlayer play];
} else {
[mPlayer pause];
}
break;
case UIEventSubtypeRemoteControlPlay:
[mPlayer play];
break;
case UIEventSubtypeRemoteControlPause:
[mPlayer pause];
break;
default:
break;
}
}
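Note that becomeFirstResponder only takes effect if the view controller agrees to become first responder, so the remote-control events are only delivered when you also override:
- (BOOL)canBecomeFirstResponder
{
    // Required for remoteControlReceivedWithEvent: to be called.
    return YES;
}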
Another trick is needed to resume playback if the user presses the home button (in which case playback is suspended with a fade out).
Where you control playback of the video (I have play methods), set
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(applicationDidEnterBackground:) name:UIApplicationDidEnterBackgroundNotification object:nil];
and add the corresponding method to be invoked, which schedules a short delay and then resumes playback.
- (void)applicationDidEnterBackground:(NSNotification *)notification
{
[mPlayer performSelector:@selector(play) withObject:nil afterDelay:0.01];
}
This works for me to play video in the background.
Thanks to all.
If you try to change the background mode instead: sorry, the App Store won't approve it. See: MPMoviePlayerViewController playback video after going to background for youtube
In my research, one approach is to take the soundtrack out and play it in the background when the app goes to the background (the video itself is paused), and to save the playback time so the video can resume from there when the app returns to the foreground.
It is not possible to play background music/video using AVPlayer, but it is possible using MPMoviePlayerViewController. I have done this in one of my apps using this player, and the app was accepted to the App Store.
Try this snippet; I've already integrated it into my app and it has been useful for me. Hope it works for you!
Follow the steps given below:
Add UIBackgroundModes in the APPNAME-Info.plist, with the selection
App plays audio
Then add the AudioToolbox framework to the project's Frameworks folder.
In the APPNAMEAppDelegate.h add:
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
In the APPNAMEAppDelegate.m add the following:
// Set AudioSession
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setDelegate:self];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
/* Pick any one of them */
// 1. Overriding the output audio route
//UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
//AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

// 2. Changing the default output audio route
UInt32 doChangeDefaultRoute = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker, sizeof(doChangeDefaultRoute), &doChangeDefaultRoute);
into the
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
but before the two lines:
[self.window addSubview:viewController.view];
[self.window makeKeyAndVisible];
Enjoy Programming!!
Swift version for the accepted answer.
In the delegate:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: nil)
AVAudioSession.sharedInstance().setActive(true, error: nil)
In the view controller that controls AVPlayer
override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
}
override func viewWillDisappear(animated: Bool) {
    mPlayer.pause()
    super.viewWillDisappear(animated)
    UIApplication.sharedApplication().endReceivingRemoteControlEvents()
    self.resignFirstResponder()
}
Don't forget to "import AVFoundation"
In addition to fattomhk's response, here's what you can do to move the video forward to the time it should be at after your application comes back to the foreground (a sketch follows the list):
Get the currentPlaybackTime of the playing video when going to the background and store it in lastPlayBackTime.
Store the time at which the application goes to the background (in NSUserDefaults, probably).
Get the time again when the application comes back to the foreground.
Calculate the duration between the background and foreground times.
Set the current playback time of the video to lastPlayBackTime + duration.
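A rough sketch of those steps (assuming an MPMoviePlayerController named player; the NSUserDefaults keys are only illustrative):
// When the app goes to the background
- (void)applicationDidEnterBackground:(NSNotification *)note
{
    [[NSUserDefaults standardUserDefaults] setDouble:player.currentPlaybackTime forKey:@"lastPlayBackTime"];
    [[NSUserDefaults standardUserDefaults] setDouble:[[NSDate date] timeIntervalSince1970] forKey:@"backgroundedAt"];
}

// When the app comes back to the foreground
- (void)applicationWillEnterForeground:(NSNotification *)note
{
    NSTimeInterval lastPlayBackTime = [[NSUserDefaults standardUserDefaults] doubleForKey:@"lastPlayBackTime"];
    NSTimeInterval backgroundedAt = [[NSUserDefaults standardUserDefaults] doubleForKey:@"backgroundedAt"];
    NSTimeInterval duration = [[NSDate date] timeIntervalSince1970] - backgroundedAt;

    // Jump the video forward to where it should be by now.
    player.currentPlaybackTime = lastPlayBackTime + duration;
}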
If you are playing video using a WebView, you can use JavaScript to keep the video playing in the background.
strVideoHTML = #"<html><head><style>body.........</style></head> <body><div id=\"overlay\"><div id=\"youtubelogo1\"></div></div><div id=\"player\"></div> <script> var tag = document.createElement('script'); tag.src = \"http://www.youtube.com/player_api\"; var firstScriptTag = document.getElementsByTagName('script')[0]; firstScriptTag.parentNode.insertBefore(tag, firstScriptTag); var player; events: { 'onReady': onPlayerReady, } }); } function onPlayerReady(event) { event.target.playVideo(); } function changeBG(url){ document.getElementById('overlay').style.backgroundImage=url;} function manualPlay() { player.playVideo(); } function manualPause() { player.pauseVideo(); } </script></body> </html>";
NSString *html = [NSString stringWithFormat:strVideoHTML, url, width, height, videoid];
webvideoView.delegate = self;
[webvideoView loadHTMLString:html baseURL:[NSURL URLWithString:@"http://www.your-url.com"]];
On view disappear, call
strscript = @"manualPlay();";
[webvideoView stringByEvaluatingJavaScriptFromString:strscript];
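To trigger the same hook when the app itself goes to the background, one option (a sketch; keepPlayingInBackground: is a made-up selector, and webvideoView is the UIWebView from above) is to observe the background notification and call into the page:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(keepPlayingInBackground:)
                                             name:UIApplicationDidEnterBackgroundNotification
                                           object:nil];

- (void)keepPlayingInBackground:(NSNotification *)note
{
    // Re-issue the play command defined in the embedded page.
    [webvideoView stringByEvaluatingJavaScriptFromString:@"manualPlay();"];
}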
I'd like to add something that for some reason ended up being the culprit for me. I had used AVPlayer and background play for a long time without problems, but this one time I just couldn't get it to work.
I found out that when you go to the background, the rate property of the AVPlayer sometimes drops to 0.0 (i.e. paused), and for that reason we simply need to observe (KVO) the rate property at all times, or at least when we go to the background. If the rate drops to 0.0 and we can assume the user still wants playback (i.e. the user did not deliberately tap pause in the remote controls, the movie did not end, etc.), we need to call play() on the AVPlayer again.
AFAIK there is no other toggle on the AVPlayer to keep it from pausing itself when the app goes to the background.
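A minimal sketch of that observation (assuming an AVPlayer property named player and a userWantsPlayback flag that you maintain yourself):
// Somewhere during setup
[self.player addObserver:self forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"rate"]) {
        float rate = [[change objectForKey:NSKeyValueChangeNewKey] floatValue];
        // The player paused itself (e.g. on backgrounding) but the user still expects playback.
        if (rate == 0.0 && self.userWantsPlayback) {
            [self.player play];
        }
    }
}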
I have an algorithm that takes a few seconds to load some stuff, and I want to first set the string on a label to say "loading" before the actual loading begins. This is all within the same layer, this is not switching between scenes.
I thought I could simply do this:
-(void)startLoading{
[self unscheduleAllSelectors];//just in case the update is already scheduled
[self.loadingLabel setString:@"Loading...."];
[self scheduleUpdate];
}
Then, I have this:
-(void)update:(ccTime)delta{
[self unscheduleUpdate];
[self beginLoading];//another method that loads all the stuff
}
My understanding is that my method beginLoading should not run until the next frame. Thus my label should get updated correctly. However this is not happening. There is a slight freeze while all my assets get loaded and my label never gets updated before the loading begins.
Am I missing a step?
Nope, you're not missing anything. I stopped fighting this and now use this kind of 'delayed' task catapult. It should make certain that you get a draw in the transition from the first to the second tick:
-(void) startLoading{
_loadTicker=0; // an NSUInteger iVar declared in the .h
[self schedule:@selector(tickOffLoading:)];
}
-(void) tickOffLoading:(ccTime) dt{
_loadTicker++;
if(_loadTicker==1) {
[self.loadingLabel setString:@"Loading...."];
} else {
[self unschedule:@selector(tickOffLoading:)];
[self beginLoading];
}
}
I have an iPhone game in which players can tweet their score.
I use TWTweetComposeViewController for this.
Since the Twitter sheet can take some time to load, I would like a "loading..." layer to show up after the player has clicked the tweet button, while waiting for the Twitter sheet to show up.
My problem is that the layer (named "colcol" in the code below) only shows up once the tweet sheet is ready! It's as if the layer waited for the tweet sheet to be ready to show up. Which is definitely not what I expect.
Any idea why ?
Thank you!
Here is the tweetScore function, called when the user touches a CCMenuItemImage:
- (void) tweetScore: (CCMenuItem *) menuItem {
colcol=[CCLayerColor layerWithColor:ccc4(0,0, 0, 200) width:50 height:50];
colcol.position=ccp(winSize.width/2-25,winSize.height/2-25);
[self addChild:colcol z:15];
if ([TWTweetComposeViewController canSendTweet])
{
TWTweetComposeViewController *tweetSheet = [[TWTweetComposeViewController alloc] init];
[tweetSheet setInitialText:[NSString stringWithFormat:@"I scored %d!", playerScore]];
tweetSheet.completionHandler = ^(TWTweetComposeViewControllerResult result) {
[appDelegate.viewController dismissModalViewControllerAnimated:YES];
switch (result) {
case TWTweetComposeViewControllerResultCancelled:
[self removeChild:colcol cleanup:YES];
break;
case TWTweetComposeViewControllerResultDone:
[[GCHelper sharedInstance] reportAchievementIdentifier:@"tweet_score" percentComplete:100.0];
[self removeChild:colcol cleanup:YES];
break;
default:
[self removeChild:colcol cleanup:YES];
break;
}
};
[appDelegate.viewController presentModalViewController:tweetSheet animated:YES];
}
else
{
// handle this case
}
}
In general, screen updates in CoreAnimation, UIKit, and OS X are deferred until the end of the current pass through the run loop. You are adding the CALayer, then doing some time-consuming work (setting up the TWTweetComposeViewController), then returning so the run loop can finish -- so there's no time for a screen update to happen in between.
Try setting up the TWTweetComposeViewController in a separate pass through the run loop, using dispatch_async(dispatch_get_main_queue(), ^{ /* your code here */ }).
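Applied to the tweetScore: method above, that might look like the following sketch (presentTweetSheet is a hypothetical helper holding the existing TWTweetComposeViewController code):
- (void)tweetScore:(CCMenuItem *)menuItem
{
    // Add the "loading" layer first...
    colcol = [CCLayerColor layerWithColor:ccc4(0, 0, 0, 200) width:50 height:50];
    colcol.position = ccp(winSize.width/2 - 25, winSize.height/2 - 25);
    [self addChild:colcol z:15];

    // ...then let this pass through the run loop finish (so the layer gets drawn)
    // before building and presenting the tweet sheet.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self presentTweetSheet];
    });
}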
I'd like to display an activity indicator BEFORE the work undertaken by willAnimateRotationToInterfaceOrientation:duration: begins. Most of the time in my app, this work is quickly completed and there would be no need for an activity indicator, but occasionally (first rotation, i.e. before I have cached data, when working with a large file) there can be a noticeable delay. Rather than re-architect my app to cope with this uncommon case, I'd rather just show the UIActivityIndicatorView while the app generates a cache and updates the display.
The problem is (or seems to be) that the display is not updated between the willRotateToInterfaceOrientation:duration: and willAnimateRotationToInterfaceOrientation:duration: methods. So asking iOS to show the UIActivityIndicatorView in the willRotate method doesn't actually affect the display until after the willAnimateRotation method has run.
The following code illustrates the issue. When run, the activity indicator appears only very briefly and AFTER the simulateHardWorkNeededToGetDisplayInShapeBeforeRotation method has completed.
Am I missing something obvious? And if not, any smart ideas as to how I could work around this issue?
Update: While suggestions about farming the heavy lifting off to another thread etc. are generally helpful, in my particular case I kind of do want to block the main thread to do my lifting. In the app, I have a tableView all of whose heights need to be recalculated. When - which is not a very common use case or I wouldn't even be considering this approach - there are very many rows, all the new heights are calculated (and then cached) during a [tableView reloadData]. If I farm the lifting off and let the rotate proceed, then after the rotate and before the lifting, my tableView hasn't been re-loaded. In the portrait to landscape case, for example, it doesn't occupy the full width. Of course, there are other workarounds, e.g. building a tableView with just a few rows prior to the rotate and then reloading the real one over that etc.
Example code to illustrate the issue:
@implementation ActivityIndicatorViewController
@synthesize activityIndicatorView = _pgActivityIndicatorView;
@synthesize label = _pgLabel;
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
}
- (void) willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration;
{
NSLog(#"willRotate");
[self showActivityIndicatorView];
}
- (void) willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration;
{
NSLog(#"willAnimateRotation");
[self simulateHardWorkNeededToGetDisplayInShapeBeforeRotation];
}
- (void) didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation;
{
NSLog(#"didRotate");
[self hideActivityIndicatorView];
}
- (void) simulateHardWorkNeededToGetDisplayInShapeBeforeRotation;
{
NSLog(#"Starting simulated work");
NSDate* date = [NSDate date];
while (fabs([date timeIntervalSinceNow]) < 2.0)
{
//
}
NSLog(#"Finished simulated work");
}
- (void) showActivityIndicatorView;
{
NSLog(#"showActivity");
if (![self activityIndicatorView])
{
UIActivityIndicatorView* activityIndicatorView = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray];
[self setActivityIndicatorView:activityIndicatorView];
[[self activityIndicatorView] setCenter:[[self view] center]];
[[self activityIndicatorView] startAnimating];
[[self view] addSubview: [self activityIndicatorView]];
}
// in shipping code, an animation with delay would be used to ensure no indicator would show in the good cases
[[self activityIndicatorView] setHidden:NO];
}
- (void) hideActivityIndicatorView;
{
NSLog(#"hideActivity");
[[self activityIndicatorView] setHidden:YES];
}
- (void) dealloc;
{
[_pgActivityIndicatorView release];
[super dealloc];
}
- (void) viewDidLoad;
{
UILabel* label = [[UILabel alloc] initWithFrame:CGRectMake(50.0, 50.0, 0.0, 0.0)];
[label setText:#"Activity Indicator and Rotate"];
[label setTextAlignment: UITextAlignmentCenter];
[label sizeToFit];
[[self view] addSubview:label];
[self setLabel:label];
[label release];
}
@end
The app doesn't update the screen to show the UIActivityIndicatorView until the main run loop regains control. When a rotation event happens, the willRotate... and willAnimateRotation... methods are called in one pass through the main run loop. So you block on the hard work method before displaying the activity indicator.
To make this work, you need to push the hard work over to another thread. I would put the call to the hard-work method in the willRotate... method; that work would call back to this view controller when it is completed so the view can be updated. I would show the activity indicator in the willAnimateRotation... method. I wouldn't bother with a didRotateFrom... method. I recommend reading the Threaded Programming Guide.
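A sketch of that arrangement (using GCD rather than a manual thread, and keeping the method names from the question):
- (void) willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    // Kick the heavy lifting off the main thread and call back when it is done.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self simulateHardWorkNeededToGetDisplayInShapeBeforeRotation];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Back on the main thread: refresh the table / cached heights, then hide the spinner.
            [self hideActivityIndicatorView];
        });
    });
}

- (void) willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    [self showActivityIndicatorView];
}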
Edit in response to a comment: You can effectively block user interaction by having the willAnimateRotation... method put a non-functioning interface on screen, such as a view displaying a dark overlay and the UIActivityIndicatorView. Then, when the heavy lifting is done, the overlay is removed and the interface becomes active again. That way the drawing code has the opportunity to properly add and animate the activity indicator.
More digging (first in Matt Neuberg's Programming iPhone 4, and then in this helpful Stack Overflow question on forcing Core Animation to run its thread) and I have a solution that seems to be working well. Both Neuberg and Apple issue strong cautions about this approach because of the potential for unwelcome side effects. In testing so far, it seems to be OK for my particular case.
Changing the code above as follows implements the change. The key addition is [CATransaction flush], which forces the UIActivityIndicatorView to start displaying even though the run loop won't end until after the willAnimateRotationToInterfaceOrientation:duration: method completes.
- (void) willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration;
{
NSLog(#"willRotate");
[self showActivityIndicatorView];
[CATransaction flush]; // this starts the animation right away, w/o waiting for end of the run loop
}
- (void) willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration;
{
NSLog(#"willAnimateRotation");
[self simulateHardWorkNeededToGetDisplayInShapeBeforeRotation];
[self hideActivityIndicatorView];
}
- (void) didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation;
{
NSLog(#"didRotate");
}
Try performing your work on a second thread after showing the activity view.
[self showActivityIndicatorView];
[self performSelector:@selector(simulateHardWorkNeededToGetDisplayInShapeBeforeRotation) withObject:nil afterDelay:0.01];
Either execute the heavy lifting in a background thread and post the results back to the main thread to update the UI (drawing to a graphics context is only thread-safe since iOS 4.0):
[self performSelectorInBackground:@selector(simulateHardWorkNeededToGetDisplayInShapeBeforeRotation) withObject:nil];
Or you can schedule the heavy-lifting method to be executed after the rotation has taken place:
[self performSelector:@selector(simulateHardWorkNeededToGetDisplayInShapeBeforeRotation) withObject:nil afterDelay:0.4];
But these are only hacks; the real solution is proper background processing if your UI needs heavy processing to get updated, whether in portrait or landscape. NSOperation and NSOperationQueue are a good place to start.
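A minimal NSOperationQueue version of the same idea (a sketch; the queue and the completion hop are illustrative):
NSOperationQueue *workQueue = [[NSOperationQueue alloc] init];

[workQueue addOperationWithBlock:^{
    // Heavy lifting off the main thread.
    [self simulateHardWorkNeededToGetDisplayInShapeBeforeRotation];

    // Hop back to the main queue for any UI updates.
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        [self hideActivityIndicatorView];
    }];
}];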
I have an app that streams music using AudioStreamer class by Matt Gallagher. This works fine as a background process except I want to be able to skip to the next song once the stream is finished. Unfortunately this part doesn't work.
Initially I had a timer that was monitoring the stream but realized that when the app backgrounds this timer no longer runs. So I tried adding a delegate callback in the packet read function:
void ASReadStreamCallBack(CFReadStreamRef aStream, CFStreamEventType eventType, void* inClientInfo)
{
AudioStreamer* streamer = (AudioStreamer *)inClientInfo;
double percent = [streamer progress]/[streamer duration];
if(percent>=0.98 || (percent>=0.95 && [streamer isIdle])){
if([streamer.delegate respondsToSelector:@selector(didFinishPlayingStream:)]){
[streamer.delegate didFinishPlayingStream:streamer];
streamer.delegate = nil;
}
}
[streamer handleReadFromStream:aStream eventType:eventType];
}
This works fine when the app is in the foreground but no longer works when the app is in the background. The delegate method basically sends a request to get the stream URL for the next song; once it has it, it creates a new AudioStreamer instance.
While the app is in the background you can implement the delegate to handle the different remote-control states.
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
switch (receivedEvent.subtype) {
case UIEventSubtypeRemoteControlTogglePlayPause:
if (player.isPlaying) {
[player pause];
} else {
[player start];
}
break;
case UIEventSubtypeRemoteControlPreviousTrack:
break;
case UIEventSubtypeRemoteControlNextTrack:
[self skipSong:nil];
break;
default:
break;
}
}
Something like this works for me.
I've uploaded my AudioPlayer/streamer class inspired in part by Matt Gallagher's AudioStreamer to
https://code.google.com/p/audjustable.
One of the cooler features is its support for gapless playback. This means the AudioQueue is never closed between tracks, which keeps iOS from suspending your app.
You can implement AudioPlayerDelegate:didFinishBufferingSourceWithQueueItemId and AudioPlayerDelegate:didFinishPlayingQueueItemId to queue up the next track by calling AudioPlayer:queueDataSource.
Let me know if you need help using it.