Fading in and out individual AVMutableCompositionTracks - iPhone

I have multiple AVAssets, and I create an individual AVMutableCompositionTrack for each. I then create an AVMutableComposition, add each AVMutableCompositionTrack to it, create an AVAssetExportSession initialized with the AVMutableComposition, and run the exporter. This lets me create a single audio file made up of many overlapping audio sources.
I can trim and delay each source audio file by setting the parameters when I call insertTimeRange on each AVMutableCompositionTrack. What I can't figure out is how to fade each individual track in and out. I can do a master fade on the export session with setVolumeRampFromStartVolume via AVMutableAudioMixInputParameters, and I know how to do fades on AVPlayers using the same method, but I don't think AVMutableAudioMixInputParameters can be used on an AVMutableCompositionTrack, right?
So how can I add a fade to an AVMutableCompositionTrack?
Thanks!

AVMutableAudioMixInputParameters actually can be used with AVMutableCompositionTracks; I use them that way. The mix just isn't stored within the composition itself. Instead, you'll need to set the audioMix property of any AVPlayerItem or AVAssetExportSession you use.
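For example, here is a minimal Swift sketch, assuming composition is your AVMutableComposition and exporter your AVAssetExportSession (the one-second fade length is illustrative):

import AVFoundation

var inputParameters: [AVMutableAudioMixInputParameters] = []
let fade = CMTime(seconds: 1, preferredTimescale: 600)

for track in composition.tracks(withMediaType: .audio) {
    let params = AVMutableAudioMixInputParameters(track: track)
    let range = track.timeRange
    // Fade this track in over its first second...
    params.setVolumeRamp(fromStartVolume: 0, toEndVolume: 1,
                         timeRange: CMTimeRange(start: range.start, duration: fade))
    // ...and out over its last second.
    params.setVolumeRamp(fromStartVolume: 1, toEndVolume: 0,
                         timeRange: CMTimeRange(start: range.end - fade, duration: fade))
    inputParameters.append(params)
}

let audioMix = AVMutableAudioMix()
audioMix.inputParameters = inputParameters
exporter.audioMix = audioMix  // for playback, set this on the AVPlayerItem instead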

Related

Merging audio tracks in a single track using AVMutableComposition

I'm dealing with an app that should be able to mix multiple video and audio clips into a single movie file (.mp4).
At the moment the resulting movie has one video track, which is the concatenation of all imported video clips, and two audio tracks: the one originating from the imported video clips and the other originating from the imported audio clips.
What I'm trying to do is merge those two audio tracks, because I would like to have only one audio track in the resulting movie file. While I can use the instruction layer technique to merge multiple video tracks together, I was not able to find something similar for audio.
I read about the audio mix object, but I think it only mixes the audio from both tracks, whereas I would like to end up with only one.
Obviously I tried adding the videos' audio and the plain audio clips to the same track, but then the resulting video stays black, which tells me that something has gone wrong in the asset-building process. Naturally, inserting different audio at the same time range is not a good thing :-)
Any suggestions?
OK, I finally found the solution. Using AVMutableAudioMix, the resulting movie file really does have only one audio track instead of two.
EDIT
Answering Justin's comment, here is the trick:
let audioMix = AVMutableAudioMix()

// Mix parameters for the audio track that came with the video clip.
let vip = AVMutableAudioMixInputParameters(track: self.videoAudioTrack!)
vip.trackID = self.videoAudioTrack!.trackID
vip.setVolume(self.videoAudioMixerVolume, at: .zero)

// Mix parameters for the standalone audio track.
let aip = AVMutableAudioMixInputParameters(track: self.audioTrack!)
aip.trackID = self.audioTrack!.trackID
aip.setVolume(self.audioMixerVolume, at: .zero)

// Hand both sets of parameters to the mix and attach it to the export session.
audioMix.inputParameters = [vip, aip]
easset.audioMix = audioMix
Here videoAudioTrack is the audio track from the video clip, whereas audioTrack is a separate plain audio track, and easset is the AVAssetExportSession object.

Adding images to AVMutableCompositionTrack

I can use AVMutableCompositionTrack to merge two video tracks together by simply calling insertTimeRange:ofTrack:atTime:error: and giving it an AVAssetTrack.
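For reference, the merge itself looks roughly like this in Swift (a sketch; assetA and assetB are illustrative names, and error handling is omitted):

let composition = AVMutableComposition()
let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)!

// Append the second clip's video track right after the first.
let trackA = assetA.tracks(withMediaType: .video)[0]
try videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: assetA.duration),
                               of: trackA, at: .zero)
let trackB = assetB.tracks(withMediaType: .video)[0]
try videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: assetB.duration),
                               of: trackB, at: assetA.duration)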
However, I have some video tracks and a few still images (jpg, png, etc) that I'd like to insert as still frames (with each image having a duration of a few seconds). I'm completely lost on how to insert the still images.
Is there no way to convert still images into an AVAssetTrack and insert them into an AVMutableCompositionTrack? Or am I going to be forced to use the lower-level, more cumbersome (albeit more powerful) AVAssetReader/AVAssetWriter? If there is some way to add an image to an AVMutableCompositionTrack (and specify its duration), I'd really like to know how.

Reloading AVMutableComposition

I have an AVMutableComposition that I've associated with an AVPlayer.
It seems that if I add tracks to this composition, the AVPlayer doesn't play the added tracks unless I set the composition on the player again. Right now I do this by replacing the current item with the same composition that is already set; after that the player picks up the new tracks.
I haven't been able to figure out whether this is the correct way of doing it; it somehow seems a bit strange to have to do it like that.
Any ideas?
Thanks
Basically, although it's a mutable composition, Apple recommends creating an immutable composition from it and using that to create an AVPlayerItem for playback purposes. It's a bit annoying, but you can work around it: when you want to refresh the video, just sample the current time, create another AVPlayerItem, and seek it to the right time.
Here's a snippet from Apple's AVMutableComposition documentation:
AVMutableComposition is a mutable subclass of AVComposition you use when you want to create a new composition from existing assets. You can add and remove tracks, and you can add, remove, and scale time ranges.
You can make an immutable snapshot of a mutable composition for playback or inspection as follows:
AVMutableComposition *myMutableComposition =
    <#a mutable composition you want to inspect or play in its current state#>;
AVComposition *immutableSnapshotOfMyComposition = [myMutableComposition copy];

// Create a player to inspect and play the composition.
AVPlayerItem *playerItemForSnapshottedComposition =
    [[AVPlayerItem alloc] initWithAsset:immutableSnapshotOfMyComposition];
Further information: the full AVMutableComposition documentation.
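In Swift, that refresh looks roughly like this (a sketch assuming player and mutableComposition already exist):

let currentTime = player.currentTime()

// Snapshot the mutable composition and swap in a fresh player item.
let snapshot = mutableComposition.copy() as! AVComposition
player.replaceCurrentItem(with: AVPlayerItem(asset: snapshot))

// Jump back to where playback was.
player.seek(to: currentTime)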
If you just removed a track from the mutable composition associated with the playerItem, you can call the seek method on the AVPlayer object with the current time and a completion handler to force the player to clear its loaded buffer.
// refresh player
player.seek(to: playerItem.currentTime()) { _ in }

Want to create an audio playlist dynamically by using a ".plist" file and then play it in the iPhone's built-in audio player

I want to create an audio playlist dynamically by using a ".plist" file and then play it in the iPhone's built-in audio player. But while creating the audio list in the plist, there is no option under "Type" to select a data type for an audio file.
Please can anyone guide me on what approach I should follow?
I'm not sure if I understood your question correctly. Can you specify what APIs you are using and what you are trying to do with more detail?
As far as Property Lists go, they are very generic so there wouldn't be anything for audio files in particular. You could specify the data type as a string.
How to approach this: first, read the list into an NSArray and use that to populate a table view with your track listing (use tableView:didSelectRowAtIndexPath: to start playing a track when the user selects it). To play a track, use AVAudioPlayer.
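A minimal sketch of that flow, assuming a bundled Playlist.plist whose root object is an array of file-name strings (all names here are illustrative):

import AVFoundation

// Load the track list from the bundled plist.
let url = Bundle.main.url(forResource: "Playlist", withExtension: "plist")!
let trackNames = NSArray(contentsOf: url) as? [String] ?? []

var player: AVAudioPlayer?

// Call this from the table view's selection callback with the selected row.
func play(trackAt index: Int) {
    guard let audioURL = Bundle.main.url(forResource: trackNames[index],
                                         withExtension: "mp3") else { return }
    player = try? AVAudioPlayer(contentsOf: audioURL)
    player?.play()
}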

Cross-fade within AVMutableVideoComposition

I have successfully composed an AVMutableComposition with multiple video clips and can view and export it. Now I would like to transition between the clips using a cross-fade, so I want to use AVMutableVideoComposition. However, I can't find any examples of how to even arrange and play two AVAsset videos in succession. Does anyone have an example of how to add tracks to an AVMutableVideoComposition with the equivalent of AVMutableComposition's insertTimeRange, or of how to set up a cross-fade?
[self.composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.avAsset.duration)
                          ofAsset:asset.avAsset
                           atTime:self.composition.frameDuration
                            error:nil];
I found an example called AVEditDemo from Apple's WWDC 2010 Sample Code.
https://developer.apple.com/library/ios/samplecode/AVCustomEdit/Introduction/Intro.html
There is a lot of detail in the sample, but I'll summarize: you need to use both AVMutableComposition and AVMutableVideoComposition. Add the tracks to the AVMutableComposition individually instead of with the simpler insertTimeRange, since that lets you give the tracks overlapping time ranges. The tracks also need to be described to the AVMutableVideoComposition through AVMutableVideoCompositionLayerInstructions with an opacity ramp. Finally, to play it back in an AVPlayer, you need to create an AVPlayerItem using both the AVMutableComposition and the AVMutableVideoComposition.
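A rough Swift sketch of that approach, assuming two clips assetA and assetB and a one-second cross-fade (names and timings are illustrative; error handling is omitted):

let composition = AVMutableComposition()
let trackA = composition.addMutableTrack(withMediaType: .video,
                                         preferredTrackID: kCMPersistentTrackID_Invalid)!
let trackB = composition.addMutableTrack(withMediaType: .video,
                                         preferredTrackID: kCMPersistentTrackID_Invalid)!

let fade = CMTime(seconds: 1, preferredTimescale: 600)
let srcA = assetA.tracks(withMediaType: .video)[0]
let srcB = assetB.tracks(withMediaType: .video)[0]

// Overlap the clips: clip B starts one second before clip A ends.
try trackA.insertTimeRange(CMTimeRange(start: .zero, duration: assetA.duration),
                           of: srcA, at: .zero)
let bStart = assetA.duration - fade
try trackB.insertTimeRange(CMTimeRange(start: .zero, duration: assetB.duration),
                           of: srcB, at: bStart)

// During the overlap, ramp A's opacity down so B shows through.
// (A production version would add separate pass-through instructions
// outside the overlap, as Apple's sample does.)
let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: .zero, duration: bStart + assetB.duration)
let layerA = AVMutableVideoCompositionLayerInstruction(assetTrack: trackA)
layerA.setOpacityRamp(fromStartOpacity: 1, toEndOpacity: 0,
                      timeRange: CMTimeRange(start: bStart, duration: fade))
let layerB = AVMutableVideoCompositionLayerInstruction(assetTrack: trackB)
instruction.layerInstructions = [layerA, layerB]

let videoComposition = AVMutableVideoComposition()
videoComposition.instructions = [instruction]
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
videoComposition.renderSize = srcA.naturalSize

// For playback, both objects go on the AVPlayerItem.
let item = AVPlayerItem(asset: composition)
item.videoComposition = videoComposition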
It seems like going each level deeper in the API – in this case from MPMoviePlayer with an asset, to AVPlayer with an AVComposition, and finally to an AVVideoComposition – increases the amount of code required exponentially.