Is there a way I can take a snapshot of a movie or clip being played in iPhone's MPMoviePlayerController programmatically?
You are probably looking for this, available on iOS 4.x+:

To facilitate the creation of video bookmarks or chapter links for a long movie, the MPMoviePlayerController class defines methods for generating thumbnail images at specific times within a movie. You can request a single thumbnail image using the thumbnailImageAtTime:timeOption: method or request multiple thumbnail images using the requestThumbnailImagesAtTimes:timeOption: method.

More at: http://developer.apple.com/iphone/library/documentation/MediaPlayer/Reference/MPMoviePlayerController_Class/MPMoviePlayerController/MPMoviePlayerController.html
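In practice that can be as simple as the following sketch (assuming `moviePlayer` is your existing MPMoviePlayerController and you want a frame at the current playback position):

    UIImage *snapshot = [moviePlayer thumbnailImageAtTime:moviePlayer.currentPlaybackTime
                                                timeOption:MPMovieTimeOptionNearestKeyFrame];
    if (snapshot) {
        // Do whatever you need with the frame, e.g. save it as a JPEG.
        NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"snapshot.jpg"];
        [UIImageJPEGRepresentation(snapshot, 0.9) writeToFile:path atomically:YES];
    }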
I'm using the CCVideoPlayer class in one of my iPhone/iPad apps to play video in response to various types of gestures.
As of now, when the video ends, I replace it with a JPEG image of the last frame of the video.
Is there any way I can retain the video's last frame, so that I don't have to swap in a JPEG every time?
I'm building an app that allows the user to record a video in-app by pressing a button on the main screen. I don't want the user to be taken to the Photos app, because the video will only be viewable inside the app (max of 15 seconds), and I can't quite get it working. Does anyone have code to do this? A good example of what I want the camera to do is the camera in the app Cinemagram. Thanks for any help.
If you plan on saving the movie to the user's photo library, then you can use UIImagePickerController. In particular, you should read the guide that accompanies the class.
However, if you only want the video to be temporary, then you will probably want to use AVFoundation. You would need to configure an AVCaptureSession with an AVCaptureMovieFileOutput to write the video to disk. Then, when you are ready to play the video, create an AVURLAsset with the file URL you just wrote, use that to create an AVPlayer, and add an AVPlayerLayer backed by that player to your view to display the video.
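A rough sketch of that route might look like the following (fragments inside a view controller, error handling omitted; the 15-second cap and the temporary file name are assumptions for this example, and `self` is assumed to conform to AVCaptureFileOutputRecordingDelegate):

    #import <AVFoundation/AVFoundation.h>

    // Capture side: record a short clip to a temporary file.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [session canAddInput:input]) [session addInput:input];

    AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    movieOutput.maxRecordedDuration = CMTimeMakeWithSeconds(15, 600); // assumed 15 s limit
    if ([session canAddOutput:movieOutput]) [session addOutput:movieOutput];
    [session startRunning];

    NSURL *fileURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"clip.mov"]];
    [movieOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];

    // Playback side (e.g. in the recording delegate's 'did finish' callback):
    AVPlayer *player = [AVPlayer playerWithURL:fileURL];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];
    [player play];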
Either way, I would recommend studying the examples that Apple provides.
AVCam and
AVPlayerDemo should be more than enough to get you started (especially the AVCam example project).
I am a newbie trying to capture camera video images using AVFoundation and
want to render the captured frames without using AVCaptureVideoPreviewLayer. I
want a slider control to be able to slow down or speed up the rate of display of
camera images.
Using other people's code as examples, I can capture images, and with an NSTimer and my slider control I can define on the fly how often to display them, but I can't convert the image to something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
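Roughly, the timer-driven display I have in mind looks like this (a sketch; `displayTimer`, `imageView`, and `latestImage` are just names for my own properties, with `latestImage` being set by the capture callback):

    - (IBAction)sliderChanged:(UISlider *)sender
    {
        // Restart the display timer whenever the slider moves;
        // the slider's value is used directly as the display interval.
        [self.displayTimer invalidate];
        self.displayTimer = [NSTimer scheduledTimerWithTimeInterval:sender.value
                                                              target:self
                                                            selector:@selector(showLatestFrame:)
                                                            userInfo:nil
                                                             repeats:YES];
    }

    - (void)showLatestFrame:(NSTimer *)timer
    {
        self.imageView.image = self.latestImage;   // most recent frame from the capture callback
    }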
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation Programming Guide, which talks about AVAsset, AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. The guide and the other demos that show how to define an AVAsset only give me the choice of creating the asset from an HTTP stream or from a URL pointing to an existing file. I can't figure out how to make my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems, and AVAssetTracks to show the image, with an observeValueForKeyPath: method checking status and calling [myPlayer play]. (I also studied WWDC session 405, "Exploring AV Foundation", to see how that is done.)
I have tried code similar to that in WWDC session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, the output, and a callback that hands me a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). That wasn't explained, and I guess it just assumes a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
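For reference, the conversion I am attempting in the capture callback is roughly the following (a sketch based on Apple's published CMSampleBuffer-to-UIImage recipe, assuming the video data output is configured for kCVPixelFormatType_32BGRA; `latestImage` and `imageView` are my own property names):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Pull the pixel data out of the sample buffer.
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Wrap the BGRA pixels in a CGImage, then a UIImage.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
            colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // UIKit must only be touched on the main thread.
        dispatch_async(dispatch_get_main_queue(), ^{
            self.latestImage = image;          // read later by the NSTimer fire method
            // or, to display immediately: self.imageView.image = image;
        });
    }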
I have looked at UIImagePickerController as an alternate method of controlling how often I display captured camera images, and I don't see a way to change the display interval on the fly with that controller either.
So, as you can see, I am learning this stuff from the Apple developer forums, the documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of going camera-to-screen without using AVCaptureVideoPreviewLayer, UIImagePickerController, or an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
I am developing an iPhone application in which I play videos using MPMoviePlayerController.
Sometimes, some of the videos don't start playing immediately after I call play on the MPMoviePlayerController.
I have called prepareToPlay, and in the handler for MPMediaPlaybackIsPreparedToPlayDidChangeNotification I call play on the MPMoviePlayerController.
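Roughly, what I am doing is the following (a sketch; `self.player` is my MPMoviePlayerController):

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(preparedToPlayDidChange:)
                                                 name:MPMediaPlaybackIsPreparedToPlayDidChangeNotification
                                               object:self.player];
    [self.player prepareToPlay];

    // ...

    - (void)preparedToPlayDidChange:(NSNotification *)note
    {
        if (self.player.isPreparedToPlay) {
            [self.player play];
        }
    }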
Could someone help in identifying the problem here?
Thanks,
Laxmilal
From my answer in a similar thread (reducing-the-initial-delay-when-playing-remote-video-content). Note that this part of the solution is valid for both remote and local video content.

Use the MPMoviePlayerController.movieSourceType property when initializing your player to cut down the media recognition delay.
From the MPMoviePlayerController Class Reference:

The default value of this property is MPMovieSourceTypeUnknown. This property provides a clue to the playback system as to how it should download and buffer the movie content. If you know the source type of the movie, setting the value of this property before playback begins can improve the load times for the movie content. If you do not set the source type explicitly before playback, the movie player controller must gather this information, which might delay playback.
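For a local file, that could look roughly like this (a sketch; the bundled file name is just an example, and for streamed content you would set MPMovieSourceTypeStreaming instead):

    #import <MediaPlayer/MediaPlayer.h>

    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"intro" withExtension:@"mp4"]; // example asset
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    player.movieSourceType = MPMovieSourceTypeFile;   // tell the player up front what it is loading
    [player prepareToPlay];
    // ...add player.view to your view hierarchy, then:
    [player play];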
I'm trying to play more than one video at the same time in Cocoa Touch.
I can play one video with MPMoviePlayerController, but it does not allow to play more than one: "Note: Although you can create multiple MPMoviePlayerController objects and present their views in your interface, only one movie player at a time can play its movie."
Basically I'm trying to display a complex, high-resolution animation in a loop with autostart, but using PNGs makes the app size way too big (more than 500 MB).
I converted one object of the animation to a movie, which is now about 50x smaller.
I think it's not possible to play two videos at the same time, but you can play videos in a queue; refer to this article....
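The queued approach might look roughly like this (a sketch; `queue`, `index`, and `player` are assumed properties, with `queue` holding local file URLs, and the wrap-around keeps the animation looping):

    #import <MediaPlayer/MediaPlayer.h>

    - (void)startQueue
    {
        self.index = 0;
        self.player = [[MPMoviePlayerController alloc] initWithContentURL:self.queue[0]];
        self.player.movieSourceType = MPMovieSourceTypeFile;
        self.player.view.frame = self.view.bounds;
        [self.view addSubview:self.player.view];

        // Get notified when each clip finishes so the next one can start.
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(movieFinished:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:self.player];
        [self.player play];
    }

    - (void)movieFinished:(NSNotification *)note
    {
        // Advance to the next clip, wrapping around to keep the loop going.
        self.index = (self.index + 1) % self.queue.count;
        self.player.contentURL = self.queue[self.index];
        [self.player play];
    }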