I'm using the AVFoundation framework to record video.
I've set a maximum duration of 10 seconds for the video output.
Is it possible to loop the recording over the 10-second interval instead of stopping the recording?
My goal is to store only the 10 most recent seconds of the video output.
Thanks!
In your case you can't simply stop the recording at 10 seconds, because you can never know when the user will eventually stop it. Your only option is to save the whole thing and trim it after the user taps the stop button.
To trim a video, try the videoMaximumDuration property of UIVideoEditorController.
See an example of UIVideoEditorController here:
http://iphone-book-sample.googlecode.com/svn/trunk/Chapter12/UIVideoEditorControllerSample/
I need to quickly seek through an H.264-encoded video stream in an MP4 container. I am using libav to decode frames, so I stumbled upon the avformat_seek_file() function.
My problem is: assuming the H.264 stream begins with a keyframe, when I seek to timestamp 0 (regardless of time_base), I should end up at the beginning of the stream. But I'm not. I usually land a few seconds into the video. Also, if I seek to 10 seconds, for example, I usually end up around 12 or so. Is it possible for keyframes to be that rare? AVSEEK_FLAG_ANY seems to have no impact on the seek result. Tested on multiple Full HD H.264 MP4 videos.
Code:
int64_t seekTo = 0;
// Doesn't actually matter for 0, since it rescales to 0 anyway
seekTo = av_rescale_q(seekTo, AVRational{1, AV_TIME_BASE}, pFormatCtx->streams[videoStream]->time_base);
int result = avformat_seek_file(pFormatCtx, videoStream, INT64_MIN, seekTo, seekTo, AVSEEK_FLAG_ANY);
avcodec_flush_buffers(pCodecCtx);
Try using av_seek_frame instead. Read here for some gotchas about using that and seeking around.
My problem is, assuming the H.264 stream begins with a keyframe, when I seek to timestamp 0 (regardless of time_base), I should be at the beginning of the stream
Note that some files have their first keyframe at a negative DTS, so you may need to seek to timestamp -1 or similar.
You can set the AVFMT_SEEK_TO_PTS flag on AVInputFormat::flags before opening the AVFormatContext to seek by PTS, which will be 0-based.
I need to play 2 sounds with 2 AVAudioPlayer objects at the same exact time... so I found this example on Apple AVAudioPlayer Class Reference (https://developer.apple.com/library/mac/#documentation/AVFoundation/Reference/AVAudioPlayerClassReference/Reference/Reference.html):
- (void)startSynchronizedPlayback {
    NSTimeInterval shortStartDelay = 0.01; // seconds
    NSTimeInterval now = player.deviceCurrentTime;
    [player playAtTime:now + shortStartDelay];
    [secondPlayer playAtTime:now + shortStartDelay];
    // Here, update state and user interface for each player, as appropriate
}
What I don't understand is: why does secondPlayer also get the shortStartDelay?
Shouldn't it be without one? I thought only the first player needed the delay, since it is called before the second player... but in this code both players have the delay.
Can anyone explain whether that is right, and why?
Thanks a lot
Massy
If you only use the play method ([firstPlayer play];), firstPlayer will start before the second one as it will receive the call before.
If you set no delay ([firstPlayer playAtTime:now];), firstPlayer will still start before the second one, because firstPlayer checks the time at which it is supposed to start, sees that it has already passed, and starts immediately. So it behaves the same as with the plain play method.
The delay is there to ensure that the two players start at the same time. It must be long enough that both players receive the call before the 'now + delay' time has passed.
I don't know if I'm being clear (English is not my native language). I can try to clarify further if you have questions.
Yeah, what he said ^ playAtTime: schedules both players to start at that time (sometime in the future).
To make it obvious, you can set "shortStartDelay" to 2 seconds and you will see there will be a two second pause before both items start playing.
Another tip to keep in mind: when you play/pause multiple AVAudioPlayers, they don't actually stop at exactly the same time. So when you want to resume, you should also re-sync the audio tracks.
Swift example:
let currentDeviceTime = firstPlayer.deviceCurrentTime
let trackTime = firstPlayer.currentTime
players.forEach {
    $0.currentTime = trackTime
    $0.play(atTime: currentDeviceTime + 0.1)
}
Where players is a list of AVAudioPlayers and firstPlayer is the first item in the array.
Notice how I am also resetting the "currentTime" which is how many seconds into the audio track you want to keep playing. Otherwise every time the user plays/pauses the track they drift out of sync!
I have used seekToTime in my application and it's working properly. But I'd like some more information about it: if I have a streaming video file of 1 minute, I want to play it from second 15 to second 45 (the first 15 seconds and the last 15 seconds will not play).
How can I do that?
I know that with seekToTime I can start a video from the 15th second, but how do I stop it at the 45th second, and also get notified that the video has played for the specified time period?
CMTime timer = CMTimeMake(15, 1);
[player seekToTime:timer];
The above code takes me to the 15th second of the streaming file, but how do I stop it at the 45th second and get notified too?
I have searched a lot but couldn't get any info.
Thanks
EDIT:
As @codeghost suggested, simply use forwardPlaybackEndTime.
You can simply use:
yourAVPlayerItem.forwardPlaybackEndTime = CMTimeMake(10, 1);
Here 10 is the time (in seconds) up to which the AVPlayerItem will play.
Set the forwardPlaybackEndTime property on your AVPlayerItem.
I don't know if there is something built-in for this in AVPlayer, but what I would do is write a method and call it with:
[self performSelector:@selector(stopPlaying) withObject:nil afterDelay:30];
- (void)stopPlaying {
    [player pause];
}
This will pause playback after 30 seconds, which lands on the 45th second when you start at the 15th; of course, you can use any delay you want.
You can use [AVPlayer addPeriodicTimeObserverForInterval:queue:usingBlock:] for that purpose.
Get AVPlayer.currentTime periodically, and call pause or stop at the exact time.
I need to extract first Ten(10) secs from an audio file.
We have the following code.
var file = recording.stop();
sound = Titanium.Media.createSound({sound:file});
I need to extract first 10 seconds from sound object.
How can we do this?
Also, how can I merge two sound objects?
ex:
sound1 has 10 secs and sound2 has 15secs.
I want sound3 which is sound1 + sound2,
which should play both sound1 and sound2 continuously.
Thanks in Advance.
var sound = Ti.Media.createSound({
    sound: file,
    duration: 10
});
That will get you the first 10 seconds of the sound; however, even with looping set to true, you will need to create your own custom timers to alternate between the sounds playing.
I'm using the MPMoviePlayer to play a movie. I want to be able to play the movie and allow the user to skip to a certain time in the film at the press of a button.
I'm setting the currentPlaybackTime property of the player, but it doesn't seem to work. Instead it simply starts the movie from the beginning, no matter what value I use.
Also, when I log the currentPlaybackTime property through a button click, it always returns a large number, and sometimes even a negative value! Is this expected? (e.g. -227361412)
Sample code below:
- (IBAction)playShake
{
    NSLog(@"playback time = %d", playerIdle.currentPlaybackTime);
    [self.playerIdle setCurrentPlaybackTime:2.0];
    return;
}
I have successfully used this method of skipping to a point in a movie in the past. I suspect that your issue is actually with the video itself. When you set the currentPlaybackTime MPMoviePlayer will skip to the nearest keyframe in the video. If the video has few keyframes or you're only skipping forward a few seconds this could cause the video to start over from the beginning when you change the currentPlaybackTime.
-currentPlaybackTime returns an NSTimeInterval (a double), which you are printing as a signed int. This will produce unexpected values. Try either casting to int ((int)playerIdle.currentPlaybackTime) or printing the double with %1.3f.