I'm trying to play more than one video at the same time in Cocoa Touch.
I can play one video with MPMoviePlayerController, but it does not allow playing more than one: "Note: Although you can create multiple MPMoviePlayerController objects and present their views in your interface, only one movie player at a time can play its movie."
Basically I'm trying to display a complex, high-resolution animation in a loop with autostart, but using PNGs makes the app size way too big (over 500 MB).
I converted one object of the animation to a movie, which is now about 50x smaller.
I think it's not possible to play two videos at the same time. But you can play videos in a queue; refer to this article....
Is there any way to add more than one video player in a single view? I am getting a list of .m4v files from the server and have to display all of those videos on a single page.
It does not matter if I play one video at a time, but whichever video I select must play in place.
So the videos are placed one below the other.
I have tried using MPMoviePlayerController, but its drawback is that it can have only one working instance in the whole application. So if I try to alloc two players in a view, the first one does not work and only one player works.
Is there any legal alternative way to do this?
MPMoviePlayerController will allow multiple instances, but only one of them can be playing its movie at any given time.
Please check the link: http://developer.apple.com/library/ios/#documentation/mediaplayer/reference/MPMoviePlayerController_Class/MPMoviePlayerController/MPMoviePlayerController.html
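For example, a minimal sketch (movieURL1 and movieURL2 are placeholder NSURLs) that keeps both players in one view but only lets one of them play at a time:

    // Sketch: two MPMoviePlayerController instances in one view.
    // movieURL1 / movieURL2 are hypothetical NSURLs for your .m4v files.
    MPMoviePlayerController *player1 = [[MPMoviePlayerController alloc] initWithContentURL:movieURL1];
    MPMoviePlayerController *player2 = [[MPMoviePlayerController alloc] initWithContentURL:movieURL2];

    player1.view.frame = CGRectMake(0, 0, 320, 200);
    player2.view.frame = CGRectMake(0, 220, 320, 200);
    [self.view addSubview:player1.view];
    [self.view addSubview:player2.view];

    // Only one of them may be playing at any given time,
    // so pause the other before starting playback.
    [player1 pause];
    [player2 play];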
I used the AVPlayer class (which MPMoviePlayerController also uses under the hood) to successfully play multiple videos from multiple sources.
The process of using it is more involved than MPMoviePlayerController, but I would say it is definitely worth a look if you want some customized behavior in your player. The official programming guide is a good start.
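A rough sketch of that approach, assuming url1 and url2 are NSURLs you already have and this runs inside a view controller:

    #import <AVFoundation/AVFoundation.h>

    // Sketch: play two videos at once with AVPlayer + AVPlayerLayer.
    AVPlayer *playerA = [AVPlayer playerWithURL:url1];
    AVPlayer *playerB = [AVPlayer playerWithURL:url2];

    AVPlayerLayer *layerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
    AVPlayerLayer *layerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
    layerA.frame = CGRectMake(0, 0, 320, 180);
    layerB.frame = CGRectMake(0, 200, 320, 180);
    [self.view.layer addSublayer:layerA];
    [self.view.layer addSublayer:layerB];

    // Unlike MPMoviePlayerController, both players can run at the same time.
    [playerA play];
    [playerB play];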
I am trying to see if it is possible to record a video from the iPhone's camera and write this to a file. I then want the video to start playing on the screen a set time after. This all needs to happen continuously. For example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like for his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad always 20 seconds behind what is recorded. This needs to continually run until practice is over. (I would connect the iPad to the TV either with a cable or AirPlay to an Apple TV). The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from a file at the same time to accomplish this? Any other better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
The way I would start to do this would be to look at what's provided in AVCaptureVideoDataOutput and its delegate methods (where you can get the frame data from).
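A minimal sketch of that starting point; MyFrameRingBuffer and its enqueue method are hypothetical, just to show where the captured frames would go:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreMedia/CoreMedia.h>

    @class MyFrameRingBuffer; // hypothetical buffer holding ~20 seconds of frames

    @interface DelayedCapture : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) MyFrameRingBuffer *ringBuffer;
    @end

    @implementation DelayedCapture

    // Called by AVCaptureVideoDataOutput for every captured frame.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Copy the pixel data out before storing it; the sample buffer
        // passed to this callback is reused by the capture system.
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        [self.ringBuffer enqueuePixelBuffer:pixelBuffer
                                  timestamp:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }

    @end

Playback then just reads frames from the other end of the buffer once it has filled up to your 20-second delay.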
I want to play a video (with sound) and record video from the front-facing camera at the same time. The view finder for the camera should appear as a small "picture-in-picture" in the bottom right hand corner of the screen while the movie plays full screen behind it. Is this possible? Is layering the appropriate classes on top of each other possible?
Check out the AVFoundation framework, which is used for much of the audio and video programming in iOS.
In your case you could use an AVPlayer and AVPlayerLayer to play your movie, and an AVCaptureSession, an AVCaptureVideoPreviewLayer, and an AVCaptureMovieFileOutput to record.
If you are familiar with Core Animation, you can set the bounds and add sublayers to AVPlayerLayer and AVCaptureVideoPreviewLayer to achieve your desired interface layout.
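A rough sketch of that setup, assuming it runs in a view controller, movieURL and outputURL are NSURLs you supply, and self conforms to AVCaptureFileOutputRecordingDelegate:

    #import <AVFoundation/AVFoundation.h>

    // Full-screen movie playback behind a small camera preview.
    AVPlayer *player = [AVPlayer playerWithURL:movieURL];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];

    // Capture session using the front-facing camera.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *frontCamera = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionFront) { frontCamera = device; }
    }
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:nil];
    [session addInput:input];

    AVCaptureMovieFileOutput *fileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [session addOutput:fileOutput];

    // Small preview layer in the bottom-right corner ("picture in picture").
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.frame = CGRectMake(self.view.bounds.size.width - 130,
                                    self.view.bounds.size.height - 100, 120, 90);
    [self.view.layer addSublayer:previewLayer];

    [session startRunning];
    [player play];
    [fileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];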
These classes are very well documented, and the AVFoundation Programming Guide clearly explains their interaction.
Feel free to comment with any questions.
I have been facing this problem for many days now, but I have not reached any conclusion.
My problem is: I want to play an mp3 file, but not simply by tapping a play button.
This is how I want to play it:
There is a slider that I can drag with my finger. I want the mp3 to play at the speed with which I am dragging my finger, so that dragging fast gives a fast-forwarding effect (a funny, sped-up voice) and dragging the slider slowly makes the output slow.
The problem is that the sound output is not smooth; the voice is very distorted and disturbed.
I want the output to be smoother.
Please help. Any suggestions? Presently I am using AVAudioPlayer and passing a time value based on the slider input to play the file. (It does not seem to be feasible, though.)
I feel that it is possible using OpenAL only and no other way, because with OpenAL we can modify the frequency (pitch) of the sound file.
Can someone please refer me to a link to an OpenAL implementation for iPhone? I have never played a sound file using OpenAL.
Help!!
You won't be able to do it with AVAudioPlayer, as it does not support pitch operations.
You can load and decode the entire track into memory for playback with OpenAL (which supports pitch), or you can do realtime loading/decoding and pitch changing using Audio Units (MUCH lower level, and more complicated, though).
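For illustration only, a minimal OpenAL sketch (device/context setup and buffer decoding are omitted; bufferID is assumed to already hold decoded PCM data via alBufferData, and sliderValue comes from your slider):

    #import <OpenAL/al.h>
    #import <OpenAL/alc.h>

    // Attach a pre-decoded PCM buffer to an OpenAL source and play it.
    ALuint sourceID;
    alGenSources(1, &sourceID);
    alSourcei(sourceID, AL_BUFFER, bufferID);
    alSourcePlay(sourceID);

    // Later, driven by the slider (e.g. 0.5 = slow, 2.0 = fast forward):
    alSourcef(sourceID, AL_PITCH, sliderValue);

Changing AL_PITCH also changes playback speed, which is exactly the "funny voice" fast-forward effect described in the question.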
I have implemented an application in which there is a list of videos. When the user taps a video, it plays in MPMoviePlayerController. When that video is finished, I want to play another video automatically. Is that possible?
Yes. Do you want it to play another one randomly? You could do that. Or do you have an NSArray that contains all the different movie URLs to play in order? If you do, then loop through the array and play each one. You will have to observe MPMoviePlayerPlaybackDidFinishNotification to know when each video finishes so you can start the next one.
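Something along these lines (a sketch only; videoURLs, currentIndex, and moviePlayer are hypothetical properties on your view controller):

    // Chain videos by observing MPMoviePlayerPlaybackDidFinishNotification.
    // self.videoURLs is an NSArray of NSURLs; self.currentIndex starts at 0.
    - (void)startPlaylist
    {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playbackDidFinish:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:self.moviePlayer];
        self.moviePlayer.contentURL = self.videoURLs[0];
        [self.moviePlayer play];
    }

    - (void)playbackDidFinish:(NSNotification *)notification
    {
        // Advance to the next URL and keep playing until the array runs out.
        self.currentIndex++;
        if (self.currentIndex < self.videoURLs.count) {
            self.moviePlayer.contentURL = self.videoURLs[self.currentIndex];
            [self.moviePlayer play];
        }
    }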