I am selecting a video clip from the iPhone camera roll using UIImagePickerController, and I have set it up so the user can adjust the length of the video by trimming it. Is there a way to set the maximum and minimum length the user can trim the video to? For example, I want the clip to be exactly 15 seconds: a maximum length of 15 seconds and a minimum length of 15 seconds as well.
What's the best way of going about doing this?
`imagePickerController.videoMaximumDuration = 15.0f;` // limits video length to 15 seconds.
where imagePickerController is an instance of UIImagePickerController.
The videoMaximumDuration property restricts the length of the video in both cases: recording and picking from the library. If you are recording video, an alert will pop up saying you cannot record video longer than 15 seconds. If you are selecting a video file from your library, it will first check the length of the video; if it is longer than 15 seconds, an alert will pop up saying the video is longer than 15 seconds, with two options: Use or Cancel. If you choose Use, the video will be trimmed to 15 seconds from the beginning.
UIImagePickerController has a property, videoMaximumDuration, that you can set.
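If it helps, here is a minimal sketch of the setup, assuming `self` is a view controller that adopts `UIImagePickerControllerDelegate` and `UINavigationControllerDelegate`. Note there is no corresponding minimum-duration property, so a 15-second minimum would have to be checked manually in the delegate callback.

```objc
#import <UIKit/UIKit.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Present a picker whose trim UI caps clips at 15 seconds.
- (void)presentVideoPicker
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.mediaTypes = @[(NSString *)kUTTypeMovie]; // show only videos
    picker.allowsEditing = YES;                      // enables the trim UI
    picker.videoMaximumDuration = 15.0;              // maximum length: 15 s
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}
```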
I am working on an application that needs tight control of time flow during movie recording.
Apple says the iPhone 5 can capture HD video at up to 30 fps. If I shoot a video and play it in QuickTime, I see a variable FPS that reaches 30 fps at some moments, yet QuickTime reports the video as being 29.75 fps.
As far as I understand, for each second of video an integer number of frames should be displayed, not a fractional number. I first thought this could be related to dropped frames, so I designed a method to measure them and realized that for every second of video, the iPhone drops from 1 to 4 frames. I also discovered that every time a frame is dropped, the iPhone simply copies the last frame again to fill the gap. So in theory, dropping a frame should make no difference to the total number of frames a movie has.
So, this is my problem: what is this 29.75 fps telling me? How is this number obtained?
It's not so much that x frames are shown per second; rather, each frame is shown for 1/x seconds. NTSC (the TV standard in the US, Japan, and elsewhere) runs at 29.97 fps, so each frame is shown for a bit more than 3/100ths of a second before the next frame is drawn. In your case, each frame is displayed for roughly 0.0336 seconds (1/29.75) before the next one is shown.
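If you want to check the number yourself, you can read the track's frame rate with AVFoundation. A minimal sketch, assuming `movieURL` points at your recorded clip; for variable-rate recordings `nominalFrameRate` effectively reports an average over the track, which is how it can come out fractional like 29.75:

```objc
#import <AVFoundation/AVFoundation.h>

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if (videoTracks.count > 0) {
    AVAssetTrack *track = videoTracks[0];
    // Dropped/duplicated frames can make this value fractional.
    NSLog(@"nominal frame rate: %f fps", track.nominalFrameRate);
}
```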
I am playing a video with MPMoviePlayerController. When I pause the video and then tap a button, I want to skip the video forward from the current time. Does anyone know how to do this? If so, what is the minimum amount of time I can skip forward: milliseconds, or whole seconds?
Seeking very much depends on your content. The factors influencing the seekable positions are: the content format (local/progressive-download MP4 vs. HTTP stream/M3U8), the I-frame frequency, and the TS chunk size (for M3U8), to name the major points. See Wikipedia's explanation of I-frames.
MPMoviePlayerController itself does not impose additional limitations.
To get very exact seeking, use MP4 with a high I-frame frequency. Note that this will dramatically increase the encoded video size.
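For what it's worth, skipping forward is just arithmetic on `currentPlaybackTime`, which is an `NSTimeInterval` (seconds as a double), so you can request sub-second offsets; where you actually land depends on the factors above. A sketch, assuming `self.moviePlayer` is your MPMoviePlayerController and a hypothetical 10-second jump:

```objc
#import <MediaPlayer/MediaPlayer.h>

- (void)skipForward
{
    // Request a point 10 s ahead, clamped to the movie's duration. The
    // player resumes from the nearest position it can actually decode.
    NSTimeInterval target = self.moviePlayer.currentPlaybackTime + 10.0;
    self.moviePlayer.currentPlaybackTime = MIN(target, self.moviePlayer.duration);
}
```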
I need to show some text over a playing video in my iPhone application. I am using the AVFoundation framework to stream my videos. What exactly I need is this: if my video is about 5 minutes long, I want to show text over the video from minute 3 to minute 4, and then again for a few seconds at the end of the video.
I have placed a UIView with a UILabel on top of the video view, which shows the text properly, but it is there constantly. I want to control the text so that it only appears when I want it shown.
Is it possible?
I think you should set an NSTimer with a delay and fire it when you want to show the label, then use another timer to hide it again, and so on; i.e., you will have to set the timers manually to match your schedule.
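A rough sketch of that idea, assuming `self.overlayLabel` is the UILabel on top of your video view and the timers are scheduled at the moment playback starts (note they won't track pauses or seeks; for that you would need to poll the player's current time instead):

```objc
// Show the label from `start` to `end`, measured from when this is called.
- (void)scheduleOverlayFrom:(NSTimeInterval)start to:(NSTimeInterval)end
{
    self.overlayLabel.hidden = YES;
    [NSTimer scheduledTimerWithTimeInterval:start
                                     target:self
                                   selector:@selector(showOverlay)
                                   userInfo:nil
                                    repeats:NO];
    [NSTimer scheduledTimerWithTimeInterval:end
                                     target:self
                                   selector:@selector(hideOverlay)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)showOverlay { self.overlayLabel.hidden = NO; }
- (void)hideOverlay { self.overlayLabel.hidden = YES; }
```

For your example you would call `[self scheduleOverlayFrom:180 to:240];` when the 5-minute video starts playing.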
I am trying to see if it is possible to record video from the iPhone's camera, write it to a file, and then start playing that video on the screen a set amount of time later. This all needs to happen continuously; for example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad, always 20 seconds behind what is recorded. This needs to run continually until practice is over. (I would connect the iPad to the TV either with a cable or via AirPlay to an Apple TV.) The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from a file at the same time to accomplish this? Any other better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
The way I would start is by looking at what's provided by AVCaptureVideoDataOutput and its delegate methods (which are where you get the frame data).
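A very rough sketch of that delegate, assuming `self.frameQueue` is an NSMutableArray used as a FIFO, `self.ciContext` is a shared CIContext, and `self.maxFrames` covers roughly 20 seconds at your capture rate (e.g. 20 × 30 = 600). Frames are copied out as JPEG data here because holding on to the capture pipeline's pooled sample buffers for long will stall the camera:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Copy the frame's pixels out of the pooled buffer as compressed JPEG.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGImageRef cgImage = [self.ciContext createCGImage:ciImage
                                              fromRect:ciImage.extent];
    NSData *jpeg = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.7);
    CGImageRelease(cgImage);

    [self.frameQueue addObject:jpeg];
    if (self.frameQueue.count > self.maxFrames) {
        [self.frameQueue removeObjectAtIndex:0]; // oldest frame leaves the window
    }
    // Elsewhere, a timer or display link dequeues and displays the head of
    // the queue, which is ~20 seconds old once the queue has filled.
}
```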
I have a UISlider that controls the playback position of the video. However, when I drag the slider to a position and set currentPlaybackTime to the corresponding float value, the movie starts playing from approximately 2 seconds earlier, causing the slider to jump back and flicker.
If this behavior is Apple's default, is there a way I can stop the video from rewinding?
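For illustration, the handler in question looks roughly like this (a simplified sketch; `seekSlider`'s maximumValue is assumed to be set to the movie's duration, with the action firing when the drag ends):

```objc
- (IBAction)sliderDidEndSeeking:(UISlider *)slider
{
    // This assignment is where the ~2 s jump-back appears: the player
    // seems to resume from an earlier decodable frame rather than the
    // exact requested time.
    self.moviePlayer.currentPlaybackTime = slider.value;
}
```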