playableDuration returns 0 in iOS 5 - iPhone

Has anyone else noticed that the playableDuration property of the MPMoviePlayerController class always returns 0 in iOS 5? This used to work fine in previous versions of iOS. I use it to set the value of a progress bar.
Here is a piece of code that worked fine under the 4.x SDK (i.e., the playableDuration attribute returned the correct non-zero value while buffering the stream), but under the 5.x SDK it always returns zero.
- (void) updateMeter {
    NSLog(@"playableDuration = %f", streamPlayer.playableDuration);
}
- (void)viewDidLoad
{
    [super viewDidLoad];
    streamPlayer = [[MPMoviePlayerController alloc]
        initWithContentURL:[NSURL URLWithString:@"http://99.198.118.250:8158/"]];
    NSTimer *updateBarTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
        target:self selector:@selector(updateMeter)
        userInfo:nil repeats:YES];
    streamPlayer.controlStyle = MPMovieControlStyleEmbedded;
    [streamPlayer play];
}

Use your exact code but replace the URL with: http://devimages.apple.com/iphone/samples/bipbop/gear1/prog_index.m3u8
For me your code failed unless I pointed the player to a segmented .m3u8 file.
I did some tests with a .mp4 movie and an .mp3 audio file locally on my computer and both worked fine as well.
I'm speculating here, but I believe that while streaming media, MPMoviePlayerController uses the .m3u8 file to deduce player item data on the fly. That's my guess anyway. What's curious is that, if this is the case, why does it work at all for your URL? Which leads me to my next comment...
You would probably have better results using AVFoundation rather than the MediaPlayer framework. I switched to it in my own work as well. It's less "prepackaged" but simply provides much more control.
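If it helps, here is a minimal sketch (untested against your stream; the KVO plumbing around it is assumed) of reading the buffered duration through AVFoundation's loadedTimeRanges instead of playableDuration:
// Requires AVFoundation.framework and #import <AVFoundation/AVFoundation.h>.
AVPlayer *player = [AVPlayer playerWithURL:
    [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/gear1/prog_index.m3u8"]];
[player.currentItem addObserver:self forKeyPath:@"loadedTimeRanges"
                        options:NSKeyValueObservingOptionNew context:nil];

// Then, in observeValueForKeyPath:ofObject:change:context:
NSArray *ranges = player.currentItem.loadedTimeRanges;
if (ranges.count > 0) {
    CMTimeRange range = [[ranges objectAtIndex:0] CMTimeRangeValue];
    NSLog(@"buffered up to %f seconds", CMTimeGetSeconds(CMTimeRangeGetEnd(range)));
}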

Related

Proper way to quickly switch between videos

I am creating an application which displays a certain video based on an external event, which may require the playing video to change quickly - once per second or more. However, there must not be a gap or lag between the videos.
What is the best way to do this? There are only four videos, each about two megabytes.
I was considering creating four MPMoviePlayerControllers, and have their views added to the main view but hidden, and switching by pausing and hiding the current video, then unhiding and playing the next video. Is there a more elegant solution?
Edit
Here's some more information for my exact situation:
The different video frames share mostly common pixels, so it's OK for a frame to stick during a switch, but NOT okay for black frames to appear.
Each video is only about ten seconds long, and there are only four videos. The general state transitions are 1<->2<->3<->4->1.
The video playback should be compatible with simultaneous AVAudioRecorder recording. As far as I can tell, MPMoviePlayerController is not.
You'll need to set up and prebuffer all the video streams to avoid hiccups, so I don't think your multiple MPMoviePlayerController solution is too far off the mark.
However, that specific solution is potentially problematic because each movie player has its own UI. The UIs do not synchronize with each other, so one might be showing the control bar, another not; one might be in full screen mode, etc. Switching among them will cause visual discontinuities.
Since it sounds like your video switching is not necessarily user-initiated, I'm guessing you don't care too much about the standard video player UI.
I would suggest dropping down to the underlying layer, AV Foundation. In theory, you can just create an AVPlayerItem for each video. This is a stream-management object with no UI overhead, so it's perfect for what you're doing. You could then -- again, in theory -- create one AVPlayer and one AVPlayerLayer to handle the display. When you wanted to switch from one AVPlayerItem stream to another, you could call the AVPlayer's replaceCurrentItemWithPlayerItem: message to swap out the data stream.
I made a little test project (GitHub) to verify this, and unfortunately the straightforward solution isn't quite perfect. There is no video flow glitching, but in the transition from AVPlayer to AVPlayer, the presentation layer seems to briefly flash the previous movie's last-seen frame at the size appropriate to the next movie. It seems to help to allocate a separate AVPlayer object for each movie and switch among them on a single, constant player layer. There still seems to be an instantaneous flash of background, but at least it's a subtle defect. Here's the gist of the code:
@interface ViewController : UIViewController
{
    NSMutableArray *players;
    AVPlayerLayer *playerLayer;
}
@property (nonatomic) IBOutlet UIView *videoView;
- (IBAction) selectVideo:(id)sender;
@end

@implementation ViewController
@synthesize videoView;

- (void)viewDidLoad
{
    [super viewDidLoad];
    NSArray *videoTitles = [NSArray arrayWithObjects:@"Ultimate Dog Tease",
                            @"Backin Up", @"Herman Cain", nil];
    players = [NSMutableArray array];
    for (NSString *title in videoTitles) {
        AVPlayer *player = [AVPlayer playerWithURL:[[NSBundle mainBundle]
                            URLForResource:title withExtension:@"mp4"]];
        [player addObserver:self forKeyPath:@"status" options:0 context:nil];
        [players addObject:player];
    }
    playerLayer = [AVPlayerLayer playerLayerWithPlayer:[players objectAtIndex:0]];
    playerLayer.frame = self.videoView.layer.bounds;
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.videoView.layer addSublayer:playerLayer];
}

- (void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                         change:(NSDictionary *)change context:(void *)context
{
    [object removeObserver:self forKeyPath:@"status"];
    for (AVPlayer *player in players) {
        if (player.status != AVPlayerStatusReadyToPlay) {
            return;
        }
    }
    // All videos are ready to go
    [self playItemAtIndex:0];
}

- (void) playItemAtIndex:(NSUInteger)idx
{
    AVPlayer *newPlayer = [players objectAtIndex:idx];
    if (newPlayer != playerLayer.player) {
        [playerLayer.player pause];
        playerLayer.player = newPlayer;
    }
    [newPlayer play];
}

- (IBAction) selectVideo:(id)sender
{
    [self playItemAtIndex:((UILabel *)sender).tag];
}
@end
Half the code is there just to observe the state of the players and make sure that playback doesn't start until all videos have been buffered.
Allocating three separate AVPlayerLayers (in addition to three AVPlayers) prevents any sort of flash. Unfortunately, an AVPlayer connected to an undisplayed AVPlayerLayer seems to assume that it doesn't need to maintain a video buffer. Every switch among layers then produces a transient video stutter. So that's no good.
A couple of things to note when using AV Foundation are:
1) The AVPlayer object doesn't have built-in support for looped playback. You'll have to observe for the end of the current video and manually seek back to time zero (see the sketch after this list).
2) There's no UI at all other than the video frame, but again, I'm guessing that this might actually be an advantage for you.
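For the looping point, here is a minimal sketch (the handler name is mine, not from the project above) of observing the end of playback and rewinding:
// In your setup code, register for the end-of-playback notification.
[[NSNotificationCenter defaultCenter] addObserver:self
    selector:@selector(playerItemDidReachEnd:)
        name:AVPlayerItemDidPlayToEndTimeNotification
      object:playerLayer.player.currentItem];

- (void) playerItemDidReachEnd:(NSNotification *)note
{
    // Rewind to the start and resume to simulate looping.
    [playerLayer.player seekToTime:kCMTimeZero];
    [playerLayer.player play];
}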
The MPMoviePlayerController is a singleton. Four instances will share the same pointer, the same view, etc. With the native player, I think you have only two alternatives: one is to change the contentURL property when you want a transition. If the latency this way is unacceptable, the other alternative is to produce a longer video with the shorter clips concatenated. You can create very quick jumps within the single, longer clip by setting currentPlaybackTime.
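For instance, if the four roughly ten-second clips were concatenated in order, a "switch" becomes a seek (the variable names and the offset arithmetic here are assumptions for illustration):
// Jump to the start of clip n (0-based) within the concatenated movie,
// assuming each clip is exactly 10 seconds long.
moviePlayer.currentPlaybackTime = 10.0 * n;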

AVPlayerItem fails with AVStatusFailed and error code "Cannot Decode"

I'm running into a strange issue, I hope someone can help.
In my iOS app I create a video with a custom soundtrack using MutableComposition by combining a video from the user's photo library and an audio file from the app bundle. I then use an AVPlayer and AVPlayerItem to play the video back to the user using a custom video player I made.
Each time a new composition is created, the assets, the player and the composition are cleared, released and it basically starts from a clean, init state.
All works fine until, after exactly 4 successful videos created this way, every subsequent attempt to create the player fails with the error Cannot Decode. It does not matter if it's the same video I'm recreating, and it has no relation to the size/length of the video or the audio file; it simply always fails exactly on the fifth attempt, like clockwork. Once it fails, it will then always fail!
This is weird, because it just decoded the same video four times with no problem, so all of a sudden it fails? So, if anyone has a clue, please let me know.
Ok everyone, I have the answer to this straight from Apple. I used one of my developer TSI lifelines to ask the question, and I'll summarize the response.
There is a limit on the number of concurrent video players that AVFoundation will allow. It is due to the limitations of iOS hardware. The limit for current devices is 4 players. If you create a 5th player, you will get the "cannot decode" error. It is not a limit on the number of instances of AVPlayer or AVPlayerItem. Rather, it is the association of an AVPlayerItem with an AVPlayer which creates a "render pipeline", and you are limited to 4 of these. For example, this causes a new render pipeline:
AVPlayer *player = [AVPlayer playerWithPlayerItem:somePlayerItem];
// assuming the AVPlayerItem is ready to go with an AVAsset that has been loaded
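Conversely (this is my inference, not part of Apple's answer), breaking that association again should hand the pipeline back when you are done with a player:
// Detach the item so this player no longer holds a render pipeline
// (assumption: this frees one of the 4 decoder slots).
[player replaceCurrentItemWithPlayerItem:nil];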
I was also warned that you cannot assume that you will have 4 pipelines available to you. Another App may be using one or more. Indeed, I have seen this happen on an iPad, but it was not clear which app was using a pipeline.
So, there you go, it was totally undocumented, but that is the story.
I ran into the same error message after creating 4 AVPlayer instances; the fix in my case wasn't exactly the same, though. Perhaps this will help anyone else who comes across this problem.
What I eventually found is that the AVPlayers were not being released when I had thought they were. In my case, I was pushing my AVPlayer view controller onto a navigation controller. Even though I was only creating one AVPlayer instance at a time, when the view controllers were popped off the nav controller they were not released immediately. It was then very easy for me to reach 4 AVPlayer instances before the old view controllers were cleaned up.
It wasn't until I made sure that the previous players were released that this problem went away. To be complete I released the AVPlayerItem, AVPlayer and set the player on the AVPlayerLayer to nil before releasing.
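Roughly, the teardown looked like this (a sketch under manual reference counting; the ivar names are assumed):
// Detach the layer from the player before releasing anything.
playerLayer.player = nil;
[playerItem release];
playerItem = nil;
[player release];
player = nil;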
I have to wonder if there is some limit on AVPlayer instances, unintentional or not. A related bit of info from the docs:
https://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
"Multiple player layers: You can create arbitrarily many AVPlayerLayer objects from a single AVPlayer instance, but only the most-recently-created such layer will display any video content on-screen."
This one was absolutely killing me until I figured it out, picking up clues from this thread and a few others. The biggest single problem in my code was that I was instantiating my video player controller every time I wanted to play a video. Now, it gets instantiated once in the primary controller (in this case, my DetailViewContoller):
@interface DetailViewController () {
    VideoPlayerViewController *videoPlayerViewController;
}
@end

@implementation DetailViewController

- (void) viewDidLoad
{
    [super viewDidLoad];
    videoPlayerViewController = [[VideoPlayerViewController alloc] initWithNibName: nil bundle: nil];
}
When I want to show a video, I call my DetailViewController's startVideoPlayback method:
- (void) startVideoPlayback: (NSString *)videoUID
{
    videoPlayerViewController.videoUID = videoUID;
    [self presentModalViewController: videoPlayerViewController animated: YES];
}
(NOTE: I'm passing it 'videoUID' -- a unique identifier that was used to create the video in another part of the app.)
In the VideoPlayerViewController (which is largely cribbed from Apple's AVPlayerDemo sample), the one-time screen setup (initializing the AVPlayer, setting up the toolbar, etc.) is done in viewDidLoad -- which now only gets called once, and all per-video setup gets done within viewWillAppear, which then calls prepareToPlay:
- (void) prepareToPlay
{
    [self initScrubberTimer];
    [self syncPlayPauseButtons];
    [self syncScrubber];
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];

    //*** Retrieve and play the video associated with this videoUID
    NSString *destinationPath = [documentsDirectory stringByAppendingFormat: @"/%@.mov", videoUID];
    if ([self fileExists: destinationPath]) {
        //*** Show the activity indicator spinny thing
        [pleaseWait startAnimating];
        [self setURL: [NSURL fileURLWithPath: destinationPath]];
        //*** Get things going with the first video in this session
        if (isFirst) {
            isFirst = NO;
        //*** Subsequent videos replace the first one
        } else {
            [self.mPlayer replaceCurrentItemWithPlayerItem: [AVPlayerItem playerItemWithURL: [NSURL fileURLWithPath: destinationPath]]];
        }
    }
}
OK, I figured out a solution. I hope this is helpful to anyone who stumbles on something similar to this problem.
The solution in my case was to initialize the asset for the AVPlayer and the AVPlayerItem on the main thread and make sure I don't create the actual AVPlayerLayer before the playerItem and the player objects return with status "ReadyToPlay".
This proved to be tricky to isolate and I still don't know why it worked the first 4 times and then failed consistently on the 5th time.
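As an illustration only (this is not the actual project code, and the property names are assumed), the gating looked conceptually like this:
// Observe readiness of both the player and its item before building the layer.
[self.player addObserver:self forKeyPath:@"status" options:0 context:nil];
[self.playerItem addObserver:self forKeyPath:@"status" options:0 context:nil];

// Then, in observeValueForKeyPath:ofObject:change:context:
if (self.player.status == AVPlayerStatusReadyToPlay &&
    self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.videoView.layer addSublayer:self.playerLayer];
}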
Till, I couldn't really include the code; it wasn't a matter of one line or even a few functions. It was a complex problem that I couldn't isolate to begin with. Thanks for the comments, though.
It seems like this issue can be caused by any decoding task, not only actual players.
I randomly had this problem when I implemented a background task to extract frames from currently playing videos with generateCGImagesAsynchronously.
I need to display 4 videos on screen, and a race condition would sometimes cause the frame extraction to start before the video started playing, so I would wait for isReadyForDisplay forever.
I'm not sure what a good recovery strategy is if you can't avoid the condition in the first place; I would probably try replaceCurrentItem.

AVAudioPlayer's playAtTime doesn't work in iOS 3.0 / App Crash

I developed a universal application which reads an audio file from local storage and plays it using AVAudioPlayer. I am using the playAtTime: method of AVAudioPlayer to seek forward and backward in the audio.
My problem is that playAtTime: works fine with iOS 4.0 and later, but does not work in iOS 3.0.
I referred to the framework header, which declares - (BOOL)playAtTime:(NSTimeInterval)time NS_AVAILABLE(10_7, 4_0); meaning I can't use it in iOS 3.0, right? So what's the fix for iOS 3.0?
Is there any solution to overcome this? Any suggestions?
A quick look at the documentation will clearly show that it is available only from iOS 4.0 onwards:
Availability
Available in iOS 4.0 and later.
Try
currentTime
The playback point, in seconds, within the timeline of the sound
associated with the audio player.
@property NSTimeInterval currentTime
Discussion
If the sound is playing, currentTime is the offset of the current
playback position, measured in seconds from the start of the sound. If
the sound is not playing, currentTime is the offset of where playing
starts upon calling the play method, measured in seconds from the
start of the sound.
By setting this property you can seek to a specific point in a sound
file or implement audio fast-forward and rewind functions.
playAtTime is available from iOS 4.0, see the doc for more info ;)
http://developer.apple.com/library/ios/#DOCUMENTATION/AVFoundation/Reference/AVAudioPlayerClassReference/Reference/Reference.html
If you are not fussed about super-accurate syncing, you can always try something like this instead (it will work in iOS 4+ as well, just not as accurately). Note: to cancel it you need to save the return value and send it an [mySavedTimer invalidate] message.
@interface AVAudioPlayer (playAtTimePreIOS4)
- (NSTimeInterval) deviceCurrentTime_preIOS4;
- (NSTimer *) playAtTime_preIOS4:(NSTimeInterval)atTime;
@end

@implementation AVAudioPlayer (playAtTimePreIOS4)
- (NSTimeInterval) deviceCurrentTime_preIOS4 {
    static NSDate *fakeIt = nil;
    if (fakeIt == nil) {
        fakeIt = [[NSDate alloc] init];
    }
    return 0.0 - [fakeIt timeIntervalSinceNow];
}

- (NSTimer *) playAtTime_preIOS4:(NSTimeInterval)atTime {
    NSTimeInterval fromNow = atTime - self.deviceCurrentTime_preIOS4;
    return [NSTimer scheduledTimerWithTimeInterval:fromNow target:self selector:@selector(play) userInfo:nil repeats:NO];
}
@end

Using Finch to play sounds on iPhone

I have an application which has two kinds of sounds:
1. Background sounds, looping forever till the next view.
2. Sound clips that can be played separately or all in one go.
Previously I tried using AVAudioPlayer, but there was always a very small silent interval between two clips (to which I didn't find any solution; if you have one, please post it in a reply), so I shifted to Finch, e.g. http://github.com/zoul/Finch/tree/master
Now here is the deal:
I am using AVAudioPlayer to play the looping-forever sounds, and Finch to play the back-to-back clips. I am doing this:
- (void)viewDidLoad {
    [super viewDidLoad];
    soundEngine = [[Finch alloc] init];
    /***
     CLIPPED
     ***/
}

- (void) playclip:(id) sender {
    NSString *fullPath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"sitar.wav"];
    NSLog(@"Song is %@", fullPath);
    clip = [[Sound alloc] initWithFile:fullPath];
    NSLog(@"Clip Length: %@", clip.length);
    NSLog(@"Clip Playing ? : %d", clip.playing);
    [clip play];
    NSLog(@"Played Clip");
}
This is almost the same code as in the demo project provided in the git repository. But for some reason I get this on the console:
Song is /Users/shoaibi/Library/Application Support/iPhone Simulator/User/Applications/17FBA26F-6047-4D56-9E45-ADAFE07B7234/Test.app/sitar.wav
Failed to create OpenAL source, error code a004.
Clip Length: (null)
Clip Playing ? : 0
Played Clip
An ls on the directory printed to the console reveals that sitar.wav is there with the correct permissions. What could be the issue? Maybe the background-sound AVAudioPlayer, which I stop using:
- (void)viewWillAppear:(BOOL)animated {
    //Stop the background music before my ears bleed :P
    d = [[UIApplication sharedApplication] delegate];
    if (d.bg.playing)
    {
        [d.bg stop];
        NSLog(@"Background music stopped");
    }
}
is not freeing up the sound resource?
The latest Finch now checks whether OpenAL is initialized when you initialize a new Sound. If the current OpenAL context is not set, because you forgot to initialize Finch or its initialization failed, you get a better error message. Give it a try; it might be the case that your initialization code is buggy.
I got the same error when I called the sound clip allocs in my viewWillAppear - it seems it was called before awakeFromNib had finished doing its stuff. Anyway, shifting the allocs so they happened later fixed it.

Playing sounds in iPhone games - slowdowns

I'm using the AVAudioPlayer to play sound FX on a separate thread in my app. Right now my app is fairly simple, I have a UIScrollView for a scrolling background, and a couple of CALayers that represent a player and an enemy in a simple side-scroller type game (think Mario or Kung Fu or whatnot)...I can move the player left and right, with the view scrolling appropriately with no problems. I have an "attack" button that's playing a sound effect...and many times when I play the sound effect, I get a little bit of a hitch, graphically. Is this just par for the course on the iPhone using Quartz? Is there some way to have this perform a little more smoothly without starting to investigate OpenGL?
Use NSData dataWithContentsOfMappedFile instead of dataWithContentsOfFile.
Also, AVAudioPlayer handles its own threading; you don't need to create one. [AVAudioPlayer play] returns after it has started playing, not after it is done playing.
And don't forget to release your AVAudioPlayer when you no longer need it. For example, if you only want to play a sound once, implement the audioPlayerDidFinishPlaying:successfully: delegate method and release the player in there.
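A small sketch of the mapped-file suggestion, reusing the sound file from the question further down this thread:
// Memory-mapped load: the OS pages the file in lazily instead of
// reading the whole file up front on every play.
NSData *soundData = [NSData dataWithContentsOfMappedFile:
    [[NSBundle mainBundle] pathForResource:@"voicehiya" ofType:@"caf"]];
AVAudioPlayer *fxPlayer = [[AVAudioPlayer alloc] initWithData:soundData error:NULL];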
How are you playing the sound? If I was to make a guess it would be that each time the sound is played the OS is actually loading it from disk as well as playing it.
Update:
Ok, you really don't need to be creating a new thread to play a sound. This alone could cause a stutter in a game and is also redundant: the AVAudioPlayer play method is asynchronous, so it returns immediately, not when the sound ends.
I had the same problem with slowdown. To solve it, I first play and pause the sound when it first loads at volume 0 - this works better than prepareToPlay which still seems to result in slowdown:
player.volume = 0.0;
[player play];
[player pause];
player.volume = 1.0;
When it comes time to play a sound I just [player play];. Then every update cycle I loop through all my active sounds and catch ones that are about to end and rewind them instead of letting them expire - this avoids the overhead and slowdown of starting up the sound from scratch. The example here stops the sounds 1/15th of a second from the end of the sound:
// activeSounds: your collection of currently playing AVAudioPlayers
for (AVAudioPlayer *player in activeSounds)
{
    if ((player.duration - player.currentTime) <= (1.0/15.0))
    {
        [player pause];
        player.currentTime = 0.0;
    }
}
Now it's ready to go for the next play. Seems to work fine!
Update: Turns out I was still getting some pauses the very first time the sound was played. I think playing it at zero volume wasn't actually doing anything, but I changed it to play at 0.0001 volume and then pause - it's inaudible but seems to do the job of prepping the sound.
Had the same problem at first. Per the documentation, you need to call prepareToPlay every time a sound has finished playing. Adopting the AVAudioPlayerDelegate protocol and adding the following method will do this:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    [player prepareToPlay];
}
(Using an answer rather than a comment for length and so I can show code):
My sound playing method is as follows:
- (void)playSoundThreaded:(id)soundPlayer
{
    if (soundPlayer == nil)
        return;
    NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];
    [(AVAudioPlayer*)soundPlayer play];
    [pool release];
}

- (void)playSound:(AVAudioPlayer*)player
{
    if (player == nil)
        return;
    [NSThread detachNewThreadSelector:@selector(playSoundThreaded:)
                             toTarget:self withObject:(id)player];
}
And the sound is loaded up as follows:
fxPlayer = [[AVAudioPlayer alloc] initWithData:
    [NSData dataWithContentsOfFile:
        [[NSBundle mainBundle] pathForResource:@"voicehiya" ofType:@"caf"]] error:NULL];
We've had some inkling that the sound is being reloaded from disk every time...but have no idea as to why that is or how to prevent it.
I think the thread is over-complicating your player. Try this instead:
/* setup the sound player */
AVAudioPlayer *player1=[[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:audioFilePath] error:NULL];
player1.numberOfLoops = 0;
player1.volume=1.0f;
[player1 prepareToPlay];
and then playSound becomes
[player play];
And don't forget to have the AudioSession category set correctly too.
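For example, a minimal sketch of setting a category (which category fits depends on whether you want your effects mixed with iPod music):
// Run once at startup; requires AVFoundation.
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];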
Why not just use AudioServicesPlaySystemSound? This is well suited to sound effects.
I use it in my game (vConqr) which is also based on a UIScrollView, with numerous other views (so all having their own CALayer), and it doesn't significantly impact performance.
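For reference, a minimal sketch of that route (the file name is borrowed from the question's code; error handling omitted):
// Requires AudioToolbox.framework and #import <AudioToolbox/AudioToolbox.h>.
SystemSoundID fxSound;
NSURL *fxURL = [NSURL fileURLWithPath:
    [[NSBundle mainBundle] pathForResource:@"voicehiya" ofType:@"caf"]];
AudioServicesCreateSystemSoundID((CFURLRef)fxURL, &fxSound);
// Fire-and-forget playback whenever the effect is needed:
AudioServicesPlaySystemSound(fxSound);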
I've been investigating this for a few days now thanks to a project I'm on at work. I added some home-rolled profiling to my code and found that the calls to play sounds were causing frame rate issues in my OpenGL game. I've got a few recommendations for you based on what I've found:
Use an AVAudioPlayer playing an mp3 for your background music.
Use OpenAL for your sound effects.
For sound effects, use the audio converter command line tool from Apple to get your sounds into .caf format. This is an uncompressed format that doesn't require the CPU to decompress your sound into memory before sending it to the sound card; it can go directly, saving you time. Instructions on using the tool can be found HERE (scroll to the very bottom of the page).
Use the lowest bit rates that still sound good. You don't really need 128 kbps; you can probably get away with 20 for a lot of things. Lower bit rate = less processing = faster sound.
I'll let you know if I find anything else that's useful! Good luck!