We have an SKVideoNode that we're adding to an SKScene in a SpriteKit game. The mp4 video plays just fine about 90% of the time; the other 10%, the node renders transparent video while the audio plays normally.
By "transparent" I mean this: the video sits on top of our game board, and when it glitches out, the game is plainly visible underneath, yet nothing is responsive, because the video node is positioned over everything and blocks user interaction. The audio from the video still plays fine, so I know it's trying to play.
It's totally inconsistent. The video plays fine for the most part, but roughly 10% of the time it simply renders no video content to the node, only the audio.
We are seeing this on all versions of iOS.
Our node code:
NSURL *fileURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"intro" ofType:@"mp4"]];
AVPlayer *player = [AVPlayer playerWithURL:fileURL];
SKVideoNode *introVideoNode = [[SKVideoNode alloc] initWithAVPlayer:player];
introVideoNode.size = CGSizeMake(self.frame.size.width, self.frame.size.height);
introVideoNode.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
introVideoNode.name = @"introVideo";
// this video plays over top of many other SKSpriteNodes
introVideoNode.zPosition = 8000;
[self addChild:introVideoNode];
[introVideoNode play];
Thoughts?
I need to record video from the iPhone camera, and play an MP3 file at the same time.
I started with AVCam sample code, which I'm sure you all have. It works great for recording video.
However, I then added the following code to play an MP3. This MP3-playing code works in a different app of mine, but when I insert it into the sample code, not only does the MP3 fail to play, but AVCamCaptureManager's recordingDidFinishToOutputFileURL callback never gets called, so the video never gets saved out.
It's as if the audio-playback code conflicts with the video-capture code. Any ideas?
Here's the audio playing code I put into AVCam:
NSError *error = nil;
NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/soundeffect.mp3", [[NSBundle mainBundle] resourcePath]]];
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
[audioPlayer setVolume:1.0];
[audioPlayer prepareToPlay];
[audioPlayer play];
Pretty simple code here. Not sure why this audio playing code causes recordingDidFinishToOutputFileURL to never get called...
Thanks for any ideas.
I figured it out: I just had to remove some code within AVCam that allocated AVCaptureDeviceInput - audioInput. That was unnecessary and conflicted with my audio playback code.
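For context, the block I removed looked roughly like the sketch below. The names follow the AVCam sample but are approximate, so treat this as a sketch rather than the exact shipping code:
// AVCam-style audio-input setup. Removing the addInput: call stops the
// capture session from claiming the audio hardware, which appears to be
// what conflicted with AVAudioPlayer playback.
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *audioError = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&audioError];
if (audioInput && [session canAddInput:audioInput]) {
    [session addInput:audioInput]; // removed: video-only capture still records fine
}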
I have edited my videos in Final Cut Pro and used its export to HTTP Live Streaming, which produces audio, cellular low and high video, Wi-Fi low and high video, .m3u8, and index files. I have put all the files onto my web server and am using this to call the videos:
-(IBAction)introVideo:(id)sender
{
    NSLog(@"intro button pressed");
    NSString *url = @"http://www.andalee.com/iPhoneVideos/intro/Intro.m3u8";
    MPMoviePlayerViewController *moviePlayer = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL URLWithString:url]];
    [self presentMoviePlayerViewControllerAnimated:moviePlayer];
}
(Side note: how should this be released?)
Here is the Index.m3u8:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=225416,CODECS="mp4a.40.2, avc1.42e015"
Intro%20-%20Cellular%20Low.segments/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=480386,CODECS="mp4a.40.2, avc1.42e015"
Intro%20-%20Cellular%20High.segments/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=751434,CODECS="mp4a.40.2, avc1.42e01e"
Intro%20-%20Wi-Fi%20Low.segments/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1250210,CODECS="mp4a.40.2, avc1.4d401e"
Intro%20-%20Wi-Fi%20High.segments/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2545049,CODECS="mp4a.40.2, avc1.4d401e"
Intro%20-%20Broadband%20Low.segments/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=5056100,CODECS="mp4a.40.2, avc1.4d401f"
Intro%20-%20Broadband%20High.segments/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=33290,CODECS="mp4a.40.2"
Intro%20-%20Audio%20for%20HTTP%20Live%20Streaming.segments/prog_index.m3u8
When I test my app I initially get video and sound, but after 30 seconds I lose video while the audio continues to play. Any ideas what would be causing this?
This could simply be caused by a low-bandwidth condition, which triggers a bitrate switch (in this case, down to the audio-only variant). If you try it in the Simulator with a local server, it may work correctly.
Most likely the variant file used next after the intro has the wrong codec, or its path is not right. Make sure all of the paths in Intro.m3u8 are correct and reachable from outside.
I'm using the AVFoundation framework to play sound files. The problem I'm having is that it stops the iPod music when my audio file plays. I'm not asking to play both at once; I want to play the sound file and then have the iPod music pick up right where it left off. Is there any way to use AVFoundation in this kind of way, or is there a better framework for it?
Here is what my code looks like:
click = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/Click.WAV", [[NSBundle mainBundle] resourcePath]]];
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:click error:nil];
audioPlayer.numberOfLoops = 1;
// click is autoreleased, so no release is needed here
[audioPlayer play];
This code works completely fine. I had to type it out, so ignore any problems there might be with it.
Thanks,
Jacob
You can use the AVAudioSession class to change your app's audio "category," which lets your sounds play on top of the iPod music. Use the -setCategory:error: method; you will probably want AVAudioSessionCategoryAmbient. More info can be found in the Audio Session Programming Guide.
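A minimal sketch of that category change, assuming it runs once before the AVAudioPlayer is created:
// Let our sounds mix with iPod audio instead of interrupting it.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryAmbient error:&sessionError]) {
    NSLog(@"Could not set audio session category: %@", sessionError);
}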
Is there a way to playback a video file frame by frame using either MPMoviePlayer or AVPlayer?
Or even another movie player that I do not know about?
Here is what I want to do. I want to load a video into a fullscreen player and move the content one frame at a time based on user interaction. This will need to be pretty solid as I would need to accurately control what frame the movie player was displaying at any one time.
Ideally I would love to know if it were possible to load a video and control the frame displayed using code.
I know that I could do this using a UIImageView animation, but tests show that this uses far too much memory.
When using AVPlayer, you can call -stepByCount: (declared as - (void)stepByCount:(NSInteger)stepCount) on its currentItem to step forward or backward.
AVPlayer *mPlayer = [AVPlayer playerWithURL:url];
[mPlayer.currentItem stepByCount:1]; // step the current item forward one frame
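As a hedged sketch of driving this from user interaction (the handler names here are hypothetical, and the player is assumed to be paused while stepping):
// Step exactly one frame per gesture. canStepForward/canStepBackward
// report whether the loaded item supports frame stepping.
- (void)handleStepForward {
    AVPlayerItem *item = self.player.currentItem;
    if (item.canStepForward) {
        [item stepByCount:1];  // forward one frame
    }
}
- (void)handleStepBackward {
    AVPlayerItem *item = self.player.currentItem;
    if (item.canStepBackward) {
        [item stepByCount:-1]; // back one frame
    }
}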
In my app I have a timer that counts down, and when it stops an image is displayed and a sound is played.
The sound is loaded like this:
NSString *path = [[NSBundle mainBundle] pathForResource:@"scream" ofType:@"wav"];
AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:path], &soundID);
And played like this when the timer ends:
AudioServicesPlaySystemSound(soundID);
My problem is that the sound doesn't play on time. Even though the sound is triggered before the image, the image appears before the sound plays. This only occurs the first time the sound is played. The sound is loaded in the viewDidLoad method.
Is there a better way to do it, or what am I doing wrong?
Thanks...
Maybe the sound is delayed while the speaker is powered up (for the first sound). If so, maybe you can work around it by playing a different, unnoticeable sound earlier so that the speaker is ready to go when you request the "real" sound.
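A sketch of that warm-up idea, assuming a short silent file is bundled with the app ("silence.wav" is a hypothetical name):
// Prime the audio hardware at load time with an inaudible sound so the
// first real sound isn't delayed by speaker power-up.
NSString *silentPath = [[NSBundle mainBundle] pathForResource:@"silence" ofType:@"wav"];
SystemSoundID warmUpSound;
AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:silentPath], &warmUpSound);
AudioServicesPlaySystemSound(warmUpSound); // inaudible, but powers up the audio route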