Editing certain movie frames using GPUImage - iPhone

I want to apply effects on certain frames of movie using GPUImage. I have successfully added effect on entire movie file, so is there a way to add different effects on different frames?
For example, I want to apply a sepia effect to the video from 5 seconds to 10 seconds. So I need 0-5 seconds to be the original video, 5-10 seconds with the sepia effect, and 10 seconds to the end to be the original video again.
Also, I want to draw text/image on certain frames using GPUImage, is it possible?
Any response will be greatly appreciated.

You could ask MPMoviePlayerController or AVAssetImageGenerator to generate a thumbnail at the time you specify.
iPhone Read UIimage (frames) from video with AVFoundation
AVAssetImageGenerator provides images rotated
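As a rough sketch of the AVAssetImageGenerator approach (assuming a local `videoURL`; error handling kept minimal), grabbing the frame at the 5-second mark might look like:

```objc
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.appliesPreferredTrackTransform = YES; // respect the track's rotation

NSError *error = nil;
CMTime time = CMTimeMakeWithSeconds(5, 600); // frame at the 5-second mark
CGImageRef cgImage = [generator copyCGImageAtTime:time actualTime:NULL error:&error];
if (cgImage) {
    UIImage *thumbnail = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    // use thumbnail...
}
```

Setting `appliesPreferredTrackTransform` matters because, as the linked question notes, the generator otherwise returns images in the track's raw (possibly rotated) orientation.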
If you'd like videos instead of just frames, you could trim a section out of the video and apply an effect to that. This takes the URL of your video and trims it to the specified time.
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
CMTimeRange timeRange = CMTimeRangeMake(CMTimeMake(startMilliseconds, 1000), CMTimeMake(endMilliseconds - startMilliseconds, 1000));
exportSession.timeRange = timeRange;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusCompleted:
            // Call something to apply the effect
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Failed: %@", exportSession.error);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Canceled: %@", exportSession.error);
            break;
        default:
            break;
    }
}];
Upon completion, you'd apply your effect to the exported clip; if you went the video-clip route, you'd then combine the segments and re-encode them.
How to combine video clips with different orientation using AVFoundation
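A minimal sketch of stitching the trimmed clips back together with AVMutableComposition, assuming `clip1`, `clip2`, and `clip3` are the exported AVAssets (audio tracks and error handling omitted):

```objc
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

// Append each clip's video track end-to-end on the composition's timeline.
CMTime cursor = kCMTimeZero;
for (AVAsset *clip in @[clip1, clip2, clip3]) {
    AVAssetTrack *sourceTrack = [[clip tracksWithMediaType:AVMediaTypeVideo] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, clip.duration)
                        ofTrack:sourceTrack
                         atTime:cursor
                          error:nil];
    cursor = CMTimeAdd(cursor, clip.duration);
}
// Export the composition with another AVAssetExportSession, as above.
```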


Audio player duration changes while playing

When I take the duration of an audio file before playing it with:
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
...
[audioPlayer prepareToPlay];
duration = audioPlayer.duration;
I get a duration of for example 16.71s. Then when taking samples every 0.2s while playing, the duration changes. It changes every 1.2 to 1.6 seconds to: 16.71s, 17.02s, 17.23s, 17.33s, 17.38s, 17.43s, 17.38s, 17.29s, 17.31s, then stays at 17.51s for the last 2.5 seconds. So it goes up and down.
I sample in a method that updates a slider position, and also shows the elapsed time and total duration. You can imagine it looks really weird to see the duration (which is int-truncated) go from 16 to 17 while playing. Additionally, the slider position will be off with all this drifting.
Does anyone have an idea why this is?
Now that we're talking about duration: does anyone know why audioPlayer.duration can return values that are about twice the actual duration when [audioPlayer prepareToPlay] is omitted?
The duration returned by AVAudioPlayer's duration property is a best estimate of the total duration of the file based on how much of the file has been decoded so far. That's why the duration continues to get refined as you check it over time.
In order to get a better time, I use AVAsset's duration method. It explains what is going on a bit better here:
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVAsset_Class/Reference/Reference.html#//apple_ref/occ/instp/AVAsset/duration
If you specify providesPreciseDurationAndTiming = YES, then AVAsset will decode the file if needed to determine its duration with accuracy. If the decode time is too long for your use, you can disable it.
In my situation, I use the AVURLAsset subclass:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:localURL options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
float t = CMTimeGetSeconds(asset.duration);
@AndyKorth's answer is the best! Here it is in Swift:
guard let audioPlayer = audioPlayer, let url = audioPlayer.url else { return }
let assetOpts = [AVURLAssetPreferPreciseDurationAndTimingKey: true]
let asset = AVURLAsset(url: url, options: assetOpts)
let assetDuration: Float64 = CMTimeGetSeconds(asset.duration)
// compare both
print("assetDuration: ", assetDuration)
print("audioPlayerDuration: ", Float(audioPlayer.duration))

How to add a video player as a subview?

Hello, I am facing a problem related to adding a subview.
I am following this code:
NSString *urlStr = [NSString stringWithFormat:@"http:x/iphone0.m3u8"];
NSURL *videoURL = [NSURL URLWithString:urlStr];
MPMoviePlayerController *iVideoPlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
[self.view addSubview:iVideoPlayer.view];
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
{
    // The device is an iPad running iPhone 3.2 or later.
    iVideoPlayer.view.frame = CGRectMake(353, 258, 320, 240);
}
else
{
    iVideoPlayer.view.frame = CGRectMake(156, 96, 168, 148);
}
[iVideoPlayer play];
In this code I'd like to add a video player as a subview. I have successfully added the video player, but the problem is that during playback, if I tap the specified area (CGRectMake(353, 258, 320, 240)), the video stops. I'd also like the player functions (next, pause, volume up/down) to work, which they currently don't.
How can I resolve this?
You have some options for the control style of your MPMoviePlayerController instance. By setting the controlStyle property to one of the following options, you can enable certain player controls (like pause, play, etc).
MPMovieControlStyleNone,
MPMovieControlStyleEmbedded,
MPMovieControlStyleFullscreen,
MPMovieControlStyleDefault
The descriptions for the above styles can be found here: MPMoviePlayerController Class Reference
Here is an example of how to set this property.
[iVideoPlayer setControlStyle:MPMovieControlStyleEmbedded];

How to play a segment of a movie in iOS

I have a video, and I want to start playing it at a specific time and stop it at a specific time. I am using MPMoviePlayerController.
You should take a look at the MPMoviePlayerController Class Reference, and the initialPlaybackTime and endPlaybackTime properties.
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:url];
player.initialPlaybackTime = 5; // beginning time in seconds
player.endPlaybackTime = 15; // end time for playback in seconds

Is it possible to programmatically create video frame-by-frame in iOS?

I want to make an app where users can create funny stick figure animations.
It would be cool if it is possible to export them as video. Can I "draw" video frames frame by frame and render them into H.264 or another video format?
The length will be between 2 seconds and 5 minutes. I heard a while back that there is a framework to edit video, but in my case I really need to create a video from scratch. What are my options?
You might need to use a multimedia framework that provides lower-level control, like GStreamer or FFmpeg.
Alternately, you can create an MJPEG and figure out a way to transcode it.
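On iOS itself, AVFoundation's AVAssetWriter can also build a movie frame by frame via an AVAssetWriterInputPixelBufferAdaptor. A compressed sketch, assuming you can render each stick-figure frame into a CVPixelBufferRef (error handling omitted):

```objc
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:nil];
NSDictionary *settings = @{AVVideoCodecKey: AVVideoCodecTypeH264,
                           AVVideoWidthKey: @640,
                           AVVideoHeightKey: @480};
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:settings];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                     sourcePixelBufferAttributes:nil];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

// For each frame: render your drawing into a CVPixelBufferRef,
// then append it at the right timestamp (here, 30 fps):
[adaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(frameIndex, 30)];

// When all frames are appended:
[input markAsFinished];
[writer finishWritingWithCompletionHandler:^{ /* movie is at outputURL */ }];
```

This is essentially what the CEMovieMaker library mentioned below wraps up for you.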
Yes, you can examine:
CEMovieMaker
Usage:
UIImage *frameImg = <Some Image>;
NSDictionary *settings = [CEMovieMaker videoSettingsWithCodec:AVVideoCodecTypeH264
                                                    withWidth:frameImg.size.width
                                                    andHeight:frameImg.size.height];
CEMovieMaker *movieMaker = [[CEMovieMaker alloc] initWithSettings:settings];
// Build the complete video, then play it back
[movieMaker createMovieFromImages:[self.movieImages copy] withCompletion:^(NSURL *fileURL) {
    // AVPlayerViewController would also work here
    MPMoviePlayerViewController *playerController = [[MPMoviePlayerViewController alloc] initWithContentURL:fileURL];
    [self presentMoviePlayerViewControllerAnimated:playerController];
    [playerController.moviePlayer prepareToPlay];
    [playerController.moviePlayer play];
}];

AVPlayer Questions, while Live Streaming (iOS)

I have some AVPlayer questions:
1. How do I control its volume?
2. How do I know if the AVPlayer is reloading music because of a bad connection? Is there some indication of it?
AVPlayer uses the system volume, so if you need to provide controls for this you can use MPVolumeView which gives you the slider for volume control.
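A minimal sketch of dropping that slider into a view (requires linking the MediaPlayer framework; the frame values are arbitrary):

```objc
#import <MediaPlayer/MediaPlayer.h>

MPVolumeView *volumeView =
    [[MPVolumeView alloc] initWithFrame:CGRectMake(20, 400, 280, 40)];
volumeView.showsRouteButton = NO; // hide the AirPlay picker, keep just the slider
[self.view addSubview:volumeView];
```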
For audio fading, you can use an AVAudioMix. Here's some code:
// given an AVAsset called asset...
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *volumeMixInput = [AVMutableAudioMixInputParameters audioMixInputParameters];
// fade volume from muted to full over a period of 3 seconds
[volumeMixInput setVolumeRampFromStartVolume:0 toEndVolume:1 timeRange:
    CMTimeRangeMake(CMTimeMakeWithSeconds(0, 1), CMTimeMakeWithSeconds(3, 1))];
[volumeMixInput setTrackID:[[[asset tracks] objectAtIndex:0] trackID]];
[audioMix setInputParameters:[NSArray arrayWithObject:volumeMixInput]];
[playerItem setAudioMix:audioMix];
You can also abruptly set the volume for a mix at a given time with:
[volumeMixInput setVolume:.5 atTime:CMTimeMakeWithSeconds(15, 1)];
Hope this helps. This API is definitely not obvious. I'd highly recommend watching the WWDC 10 video entitled Discovering AV Foundation. It's excellent.