I have an array of local files that I downloaded from a .m3u8 playlist, since I have to save them for later local playback.
All of the files are in .ts format, and I want to merge them into a single video file. I already tried to merge the files using AVMutableComposition: I converted each local file to an AVAsset, but the .tracks property always returns 0, so I presume the AVAsset could not read the file. I then tried renaming the files to an MPEG extension, but the problem stayed the same.
Does anyone have any idea how to read those files correctly? Here is my code so far:
func mergeAllVideos(filesPath: URL) {
    let allVideos = extractAllFile(atPath: filesPath.absoluteString)
    var arrayVideos = [AVURLAsset]()
    var atTimeM: CMTime = CMTimeMake(0, 1)
    var layerInstructionsArray = [AVVideoCompositionLayerInstruction]()
    var completeTrackDuration: CMTime = CMTimeMake(0, 1)
    var videoSize: CGSize = CGSize(width: 0.0, height: 0.0)
    var totalTime: CMTime = CMTimeMake(0, 1)

    for asset in allVideos.enumerated() {
        if let url = URL(string: asset.element) {
            arrayVideos.append(AVURLAsset(url: url))
        }
    }

    let mixComposition = AVMutableComposition()
    for videoAsset in arrayVideos {
        let videoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                        preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        do {
            if videoAsset == arrayVideos.first {
                atTimeM = kCMTimeZero
            } else {
                atTimeM = totalTime
            }
            print(videoAsset.tracks)
            try videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                           of: videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0],
                                           at: atTimeM)
            videoSize = videoTrack.naturalSize
        } catch let error as NSError {
            print("error: \(error)")
        }
        totalTime = CMTimeAdd(totalTime, videoAsset.duration)
        completeTrackDuration = CMTimeAdd(completeTrackDuration, videoAsset.duration)

        let videoInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        if let asset = arrayVideos.last, videoAsset != asset {
            videoInstruction.setOpacity(0.0, at: completeTrackDuration)
        }
        layerInstructionsArray.append(videoInstruction)
    }

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, completeTrackDuration)
    mainInstruction.layerInstructions = layerInstructionsArray

    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, 30)
    mainComposition.renderSize = CGSize(width: videoSize.width, height: videoSize.height)

    let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let dateFormatter = DateFormatter()
    dateFormatter.dateStyle = .long
    dateFormatter.timeStyle = .short
    let date = dateFormatter.string(from: Date())
    let savePath = (documentDirectory as NSString).appendingPathComponent("mergeVideo-\(date).mov")
    let url = URL(fileURLWithPath: savePath)

    let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter!.outputURL = url
    exporter!.outputFileType = AVFileTypeQuickTimeMovie
    exporter!.shouldOptimizeForNetworkUse = true
    exporter!.videoComposition = mainComposition
    exporter!.exportAsynchronously {
        print("exported")
    }
}
You can't work with .ts files directly, and renaming the .ts extension to .mp4 will not be enough.
You need to transcode each .ts file to .mp4. You can use this lib https://github.com/Keemotion/TS2MP4 for that.
Then you will be able to merge the videos.
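Before merging, it is worth confirming that AVFoundation can read each segment at all. A minimal sketch (the helper name is my own, not part of the library above):

import AVFoundation

// Loads the "tracks" key asynchronously and reports whether the asset exposes
// at least one video track. For raw MPEG-TS segments this check fails, which
// matches the symptom of .tracks returning an empty array.
func assetIsMergeable(at url: URL, completion: @escaping (Bool) -> Void) {
    let asset = AVURLAsset(url: url)
    asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
        var error: NSError?
        let status = asset.statusOfValue(forKey: "tracks", error: &error)
        let readable = (status == .loaded) && !asset.tracks(withMediaType: .video).isEmpty
        completion(readable)
    }
}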
I have an array of video recordings stored as AVURLAssets and I want to merge them into one clip so that they play back to back with no gap between videos. Below is what I'm hoping to do, but it only shows the first video. Note: I want the videos to play back to back without blank space between each; for example, if real time passed between recording1 and recording2 while the phone was not recording, the output should NOT show this gap.
Thanks for your help!
func mergeVideos(handler: @escaping (_ asset: AVAssetExportSession) -> ()) {
    let mixComposition = AVMutableComposition()
    var videoTime = CMTime.zero

    for video in self.recordings {
        let tracks = video.tracks(withMediaType: .video)
        let videoTrack = tracks[0]
        let videoTimeRange = CMTimeRangeMake(start: .zero, duration: video.duration)
        guard let compositionVideoTrack: AVMutableCompositionTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }
        do {
            try compositionVideoTrack.insertTimeRange(videoTimeRange, of: videoTrack, at: videoTime)
            videoTime = CMTimeAdd(videoTime, video.duration)
        } catch {
            print("An error occurred")
            return
        }
    }

    guard let documentDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first else { return }
    let dateFormatter = DateFormatter()
    dateFormatter.dateStyle = .long
    dateFormatter.timeStyle = .short
    let date = dateFormatter.string(from: Date())
    let url = documentDirectory.appendingPathComponent("mergeVideo-\(date).mov")

    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }
    exporter.outputURL = url
    exporter.outputFileType = AVFileType.mov
    exporter.shouldOptimizeForNetworkUse = true
    handler(exporter)
}
The solution is to create the AVMutableCompositionTrack once, outside of the for loop. My updated function to merge videos (with audio) is below. Hope this helps someone!
var recordings = [AVURLAsset]()

func mergeVideos(handler: @escaping (_ asset: AVAssetExportSession) -> ()) {
    let videoComposition = AVMutableComposition()
    var lastTime: CMTime = .zero

    guard let videoCompositionTrack = videoComposition.addMutableTrack(withMediaType: .video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }
    guard let audioCompositionTrack = videoComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }

    for video in self.recordings {
        // add audio/video
        do {
            try videoCompositionTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: video.duration), of: video.tracks(withMediaType: .video)[0], at: lastTime)
            try audioCompositionTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: video.duration), of: video.tracks(withMediaType: .audio)[0], at: lastTime)
        } catch {
            print("Failed to insert audio or video track")
            return
        }
        // update time
        lastTime = CMTimeAdd(lastTime, video.duration)
    }

    // create url (extension matches the .mp4 output file type)
    guard let documentDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first else { return }
    let dateFormatter = DateFormatter()
    dateFormatter.dateStyle = .long
    dateFormatter.timeStyle = .short
    let date = dateFormatter.string(from: Date())
    let url = documentDirectory.appendingPathComponent("mergeVideo-\(date).mp4")

    // export
    guard let exporter = AVAssetExportSession(asset: videoComposition, presetName: AVAssetExportPresetHighestQuality) else { return }
    exporter.outputURL = url
    exporter.outputFileType = AVFileType.mp4
    exporter.shouldOptimizeForNetworkUse = true
    handler(exporter)
}
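For completeness, a sketch of how the handler might be used (the export handling here is an assumption, not part of the original answer):

// The handler receives a configured but not-yet-started AVAssetExportSession,
// so the caller decides when to run the export.
mergeVideos { exporter in
    exporter.exportAsynchronously {
        switch exporter.status {
        case .completed:
            print("Merged video written to \(exporter.outputURL!)")
        case .failed, .cancelled:
            print("Export failed: \(String(describing: exporter.error))")
        default:
            break
        }
    }
}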
I've been trying to search for this all day, but all answers point to older versions of Swift or Obj-C.
I tried layer instructions, but AVMutableComposition has no member instructions. I remember this being really easy with just an affineTransform, but I no longer know where I found that.
var mainVideoURL: URL!
let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
let tempPath = paths[0] + "/mainVideo.mp4"

if FileManager.default.fileExists(atPath: tempPath) {
    guard (try? FileManager.default.removeItem(atPath: tempPath)) != nil else {
        print("remove path failed")
        self.enableButtons(enabled: true)
        return
    }
}

mainVideoURL = URL(fileURLWithPath: tempPath)
let firstAsset = AVURLAsset(url: fileURL)
let mixComposition = AVMutableComposition()

// repeat video number of times
let videoRepeat = photoVideoRepeats
for i in 0 ... videoRepeat - 1 {
    do {
        try mixComposition.insertTimeRange(CMTimeRangeMake(kCMTimeZero, firstAsset.duration),
                                           of: firstAsset,
                                           at: kCMTimeZero + CMTimeMultiply(firstAsset.duration, Int32(i)))
    } catch _ {
        print("Failed to load first track")
    }
}

guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }
After my video was recorded, I was able to apply a CGAffineTransform to an AVMutableCompositionTrack.
In my case I needed to merge an audio track with the video, but you can see where the transforms take place:
func mergeVideoAndAudio(videoUrl: URL, audioUrl: URL) -> AVAsset {
    let mixComposition = AVMutableComposition()
    var mutableCompositionVideoTrack = [AVMutableCompositionTrack]()
    var mutableCompositionAudioTrack = [AVMutableCompositionTrack]()
    var mutableCompositionAudioOfVideoTrack = [AVMutableCompositionTrack]()

    // start merge
    let aVideoAsset = AVAsset(url: videoUrl)
    let aAudioAsset = AVAsset(url: audioUrl)

    let compositionAddVideo = mixComposition.addMutableTrack(withMediaType: .video,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)
    let compositionAddAudio = mixComposition.addMutableTrack(withMediaType: .audio,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)
    let compositionAddAudioOfVideo = mixComposition.addMutableTrack(withMediaType: .audio,
                                                                    preferredTrackID: kCMPersistentTrackID_Invalid)

    let aVideoAssetTrack: AVAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaType.video)[0]
    let aAudioOfVideoAssetTrack: AVAssetTrack? = aVideoAsset.tracks(withMediaType: AVMediaType.audio).first
    let aAudioAssetTrack: AVAssetTrack = aAudioAsset.tracks(withMediaType: AVMediaType.audio)[0]

    // Default: must have a transformation
    compositionAddVideo?.preferredTransform = aVideoAssetTrack.preferredTransform

    var transforms = aVideoAssetTrack.preferredTransform
    if UIDevice.current.orientation == UIDeviceOrientation.landscapeLeft {
        transforms = transforms.concatenating(CGAffineTransform(rotationAngle: CGFloat(-90.0 * .pi / 180)))
        transforms = transforms.concatenating(CGAffineTransform(translationX: 1280, y: 0))
    } else if UIDevice.current.orientation == UIDeviceOrientation.landscapeRight {
        transforms = transforms.concatenating(CGAffineTransform(rotationAngle: CGFloat(90.0 * .pi / 180)))
        transforms = transforms.concatenating(CGAffineTransform(translationX: 1280, y: 0))
    } else if UIDevice.current.orientation == UIDeviceOrientation.portraitUpsideDown {
        transforms = transforms.concatenating(CGAffineTransform(rotationAngle: CGFloat(180.0 * .pi / 180)))
        transforms = transforms.concatenating(CGAffineTransform(translationX: 0, y: 720))
    }
    compositionAddVideo?.preferredTransform = transforms

    mutableCompositionVideoTrack.append(compositionAddVideo!)
    mutableCompositionAudioTrack.append(compositionAddAudio!)
    mutableCompositionAudioOfVideoTrack.append(compositionAddAudioOfVideo!)

    do {
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: CMTime.zero,
                                                                            duration: aVideoAssetTrack.timeRange.duration),
                                                            of: aVideoAssetTrack,
                                                            at: CMTime.zero)
        // In my case the audio file is longer than the video file, so I took the
        // videoAsset duration instead of the audioAsset duration
        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(start: CMTime.zero,
                                                                            duration: aVideoAssetTrack.timeRange.duration),
                                                            of: aAudioAssetTrack,
                                                            at: CMTime.zero)
        // adding the audio of the video (if it exists) to the final composition
        if let aAudioOfVideoAssetTrack = aAudioOfVideoAssetTrack {
            try mutableCompositionAudioOfVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: CMTime.zero,
                                                                                       duration: aVideoAssetTrack.timeRange.duration),
                                                                       of: aAudioOfVideoAssetTrack,
                                                                       at: CMTime.zero)
        }
    } catch {
        print(error.localizedDescription)
    }

    return mixComposition
}
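A possible call site (the URLs are placeholders); the returned composition can be played directly or handed to an AVAssetExportSession:

let merged = mergeVideoAndAudio(videoUrl: videoURL, audioUrl: audioURL)
let player = AVPlayer(playerItem: AVPlayerItem(asset: merged))
player.play()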
I am trying to merge videos from an array of AVAssets, but I only get the first and last videos; the videos in between show a black area.
Check the code I am using:
func mergeVideoArray() {
    let mixComposition = AVMutableComposition()
    for videoAsset in videoURLArray {
        let videoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video,
                                                        preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        do {
            try videoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                            of: videoAsset.tracks(withMediaType: AVMediaType.video).first!,
                                            at: totalTime)
            videoSize = (videoTrack?.naturalSize)!
        } catch let error as NSError {
            print("error: \(error)")
        }

        let trackArray = videoAsset.tracks(withMediaType: .audio)
        if trackArray.count > 0 {
            let audioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio,
                                                            preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
            do {
                try audioTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: AVMediaType.audio).first!, at: audioTime)
                audioTime = audioTime + videoAsset.duration
            } catch {
            }
        }

        totalTime = totalTime + videoAsset.duration

        let videoInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack!)
        if videoAsset != videoURLArray.last {
            videoInstruction.setOpacity(0.0, at: videoAsset.duration)
        }
        layerInstructionsArray.append(videoInstruction)
    }

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime)
    mainInstruction.layerInstructions = layerInstructionsArray

    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, 30)
    mainComposition.renderSize = CGSize(width: videoSize.width, height: videoSize.height)

    let url = "merge_video".outputURL
    let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter!.outputURL = url
    exporter!.outputFileType = AVFileType.mov
    exporter!.shouldOptimizeForNetworkUse = false
    exporter!.videoComposition = mainComposition
    exporter!.exportAsynchronously {
        let video = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: video)
        let player = AVPlayer(playerItem: playerItem)
        self.playerViewController.player = player
        self.present(self.playerViewController, animated: true) {
            self.playerViewController.player!.play()
        }
    }
}
Please help me resolve this issue. Thanks in advance.
Note: I am able to create a video from the array, but only the first and last index values show; for the rest of the values only a blank screen shows.
I just solved my question; I only needed to update one line in the code. Please have a look at the code:
if videoAsset != videoURLArray.last {
    videoInstruction.setOpacity(0.0, at: totalTime)
}
Note: you just need to change the at: position to the start of the next video for every value of the array.
The right way to do it is to change
if videoAsset != videoURLArray.last {
    videoInstruction.setOpacity(0.0, at: videoAsset.duration)
}
to:
videoInstruction.setOpacity(0.0, at: totalTime)
I want to emphasize here that adding
totalTime = totalTime + videoAsset.duration
before setting the opacity on the layer instruction makes all the difference. Otherwise the videos are a black screen.
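Put together, a minimal sketch of the corrected loop (names borrowed from the question; mixComposition and videoURLArray are assumed to exist):

var totalTime: CMTime = kCMTimeZero
var layerInstructionsArray = [AVMutableVideoCompositionLayerInstruction]()

for videoAsset in videoURLArray {
    guard let videoTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                          preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { continue }
    try? videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                    of: videoAsset.tracks(withMediaType: .video)[0],
                                    at: totalTime)

    // Advance the timeline position *before* setting opacity, so each layer
    // is hidden exactly where the next clip begins.
    totalTime = CMTimeAdd(totalTime, videoAsset.duration)

    let videoInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    if videoAsset != videoURLArray.last {
        videoInstruction.setOpacity(0.0, at: totalTime)
    }
    layerInstructionsArray.append(videoInstruction)
}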
My question is: I am using the function below to compose a video and an audio file. I want to keep the video's original sound, but it goes away somehow and I do not have any clue why.
I got this function from this answer.
I tried to change the volumes right after appending the AVMutableCompositionTracks, but it did not work. For instance:
mutableVideoCompositionTrack.preferredVolume = 1.0
mutableAudioCompositionTrack.preferredVolume = 0.05
But still, all you can hear is the audio file.
The function:
private func mergeAudioAndVideo(audioUrl: URL, videoUrl: URL, completion: @escaping (Bool) -> Void) {
    let mixComposition = AVMutableComposition()
    var mutableCompositionVideoTrack: [AVMutableCompositionTrack] = []
    var mutableCompositionAudioTrack: [AVMutableCompositionTrack] = []
    let totalVideoCompositionInstruction = AVMutableVideoCompositionInstruction()

    let videoAsset = AVAsset(url: videoUrl)
    let audioAsset = AVAsset(url: audioUrl)

    mutableCompositionVideoTrack.append(mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid))
    mutableCompositionAudioTrack.append(mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))

    mutableCompositionAudioTrack[0].preferredVolume = 0.05
    mutableCompositionVideoTrack[0].preferredVolume = 1.0

    let videoAssetTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
    let audioAssetTrack = audioAsset.tracks(withMediaType: AVMediaTypeAudio)[0]

    do {
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration), of: videoAssetTrack, at: kCMTimeZero)
        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration), of: audioAssetTrack, at: kCMTimeZero)
    } catch {
        print("ERROR#1")
    }

    totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration)

    let mutableVideoComposition = AVMutableVideoComposition()
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30)
    mutableVideoComposition.renderSize = CGSize(width: 1280, height: 720)

    // exporting
    savePathUrl = try! FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: true).appendingPathComponent("merged").appendingPathExtension("mov")

    let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true

    do {
        try FileManager.default.removeItem(at: savePathUrl)
    } catch {
        print(error)
    }

    assetExport.exportAsynchronously {
        switch assetExport.status {
        case .completed:
            print("completed")
            completion(true)
        default:
            print("failed \(assetExport.error!)")
            completion(false)
        }
    }
}
You can adjust the volume for video and audio separately, @Faruk. Here is a little bit of code for that:
// Extract audio from the video and the music
let audioMix: AVMutableAudioMix = AVMutableAudioMix()
var audioMixParam: [AVMutableAudioMixInputParameters] = []

let assetVideoTrack: AVAssetTrack = assetVideo.tracksWithMediaType(AVMediaTypeAudio)[0]
let assetMusicTrack: AVAssetTrack = assetMusic.tracksWithMediaType(AVMediaTypeAudio)[0]

let videoParam: AVMutableAudioMixInputParameters = AVMutableAudioMixInputParameters(track: assetVideoTrack)
videoParam.trackID = compositionAudioVideo.trackID

let musicParam: AVMutableAudioMixInputParameters = AVMutableAudioMixInputParameters(track: assetMusicTrack)
musicParam.trackID = compositionAudioMusic.trackID

// Set the final volume of the audio record and the music
videoParam.setVolume(volumeVideo, atTime: kCMTimeZero)
musicParam.setVolume(volumeAudio, atTime: kCMTimeZero)

// Add settings
audioMixParam.append(musicParam)
audioMixParam.append(videoParam)

// Add audio to the final record
// First: the audio of the record, and second: the music
do {
    try compositionAudioVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero, assetVideo.duration), ofTrack: assetVideoTrack, atTime: kCMTimeZero)
} catch _ {
    assertionFailure()
}
do {
    try compositionAudioMusic.insertTimeRange(CMTimeRangeMake(CMTimeMake(Int64(startAudioTime * 10000), 10000), assetVideo.duration), ofTrack: assetMusicTrack, atTime: kCMTimeZero)
} catch _ {
    assertionFailure()
}

// Add parameters
audioMix.inputParameters = audioMixParam

let completeMovie = "\(docsDir)/\(randomString(5)).mp4"
let completeMovieUrl = NSURL(fileURLWithPath: completeMovie)

let exporter: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)!
exporter.outputURL = completeMovieUrl
exporter.outputFileType = AVFileTypeMPEG4
exporter.audioMix = audioMix
exporter.exportAsynchronouslyWithCompletionHandler({
    switch exporter.status {
    case AVAssetExportSessionStatus.Completed:
        print("success with output url \(completeMovieUrl)")
    case AVAssetExportSessionStatus.Failed:
        print("failed \(String(exporter.error))")
    case AVAssetExportSessionStatus.Cancelled:
        print("cancelled \(String(exporter.error))")
    default:
        print("complete")
    }
})
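Note that the snippet assumes compositionAudioVideo and compositionAudioMusic were created on the composition earlier; a minimal sketch of that setup (in modern Swift syntax):

let composition = AVMutableComposition()
// One mutable audio track for the recording's own sound, one for the music.
let compositionAudioVideo = composition.addMutableTrack(withMediaType: .audio,
                                                        preferredTrackID: kCMPersistentTrackID_Invalid)!
let compositionAudioMusic = composition.addMutableTrack(withMediaType: .audio,
                                                        preferredTrackID: kCMPersistentTrackID_Invalid)!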
Here's the combined code of all the answers. It took me a while to decode those answers so I decided to add it here for any future users:
Swift 5:
enum MixError: Error {
    case TimeRangeFailure
    case ExportFailure
}

var selectedVideoLevel: Float = 1.0
var selectedMusicLevel: Float = 1.0

func mix(videoUrl: URL, musicUrl: URL, completion: ((Result<URL, Error>) -> Void)?) {
    let videoAsset = AVAsset(url: videoUrl)
    let musicAsset = AVAsset(url: musicUrl)

    let audioVideoComposition = AVMutableComposition()
    let audioMix = AVMutableAudioMix()
    var mixParameters = [AVMutableAudioMixInputParameters]()

    let videoCompositionTrack = audioVideoComposition
        .addMutableTrack(withMediaType: .video, preferredTrackID: .init())!
    let audioCompositionTrack = audioVideoComposition
        .addMutableTrack(withMediaType: .audio, preferredTrackID: .init())!
    let musicCompositionTrack = audioVideoComposition
        .addMutableTrack(withMediaType: .audio, preferredTrackID: .init())!

    let videoAssetTrack = videoAsset.tracks(withMediaType: .video)[0]
    let audioAssetTrack = videoAsset.tracks(withMediaType: .audio).first
    let musicAssetTrack = musicAsset.tracks(withMediaType: .audio)[0]

    let audioParameters = AVMutableAudioMixInputParameters(track: audioAssetTrack)
    audioParameters.trackID = audioCompositionTrack.trackID

    let musicParameters = AVMutableAudioMixInputParameters(track: musicAssetTrack)
    musicParameters.trackID = musicCompositionTrack.trackID

    audioParameters.setVolume(selectedVideoLevel, at: .zero)
    musicParameters.setVolume(selectedMusicLevel, at: .zero)

    mixParameters.append(audioParameters)
    mixParameters.append(musicParameters)
    audioMix.inputParameters = mixParameters

    /// prevents the video from unnecessary rotations
    videoCompositionTrack.preferredTransform = videoAssetTrack.preferredTransform

    do {
        let timeRange = CMTimeRange(start: .zero, duration: videoAsset.duration)
        try videoCompositionTrack.insertTimeRange(timeRange, of: videoAssetTrack, at: .zero)
        if let audioAssetTrack = audioAssetTrack {
            try audioCompositionTrack.insertTimeRange(timeRange, of: audioAssetTrack, at: .zero)
        }
        try musicCompositionTrack.insertTimeRange(timeRange, of: musicAssetTrack, at: .zero)
    } catch {
        completion?(.failure(MixError.TimeRangeFailure))
        return
    }

    let exportUrl = FileManager.default
        .urls(for: .applicationSupportDirectory, in: .userDomainMask).first?
        .appendingPathComponent("\(Date().timeIntervalSince1970)-video.mp4")

    let exportSession = AVAssetExportSession(
        asset: audioVideoComposition,
        presetName: AVAssetExportPresetHighestQuality
    )
    exportSession?.audioMix = audioMix
    exportSession?.outputFileType = .mp4
    exportSession?.outputURL = exportUrl
    exportSession?.exportAsynchronously(completionHandler: {
        guard let status = exportSession?.status else { return }
        switch status {
        case .completed:
            completion?(.success(exportUrl!))
        case .failed:
            completion?(.failure(MixError.ExportFailure))
        default:
            print(status)
        }
    })
}
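A possible call site (the URLs are placeholders):

mix(videoUrl: recordedVideoURL, musicUrl: backgroundMusicURL) { result in
    switch result {
    case .success(let url):
        print("Mixed file written to \(url)")
    case .failure(let error):
        print("Mixing failed: \(error)")
    }
}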
I figured it out. It seems an AVAsset that loads a video holds the audio and the video separately, so you can reach them by writing:
videoAsset.tracks(withMediaType: AVMediaTypeAudio)[0] // audio of a video
videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0] // video of a video (without sound)
So I added these lines to the code and it worked!
var mutableCompositionBackTrack : [AVMutableCompositionTrack] = []
mutableCompositionBackTrack.append(mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))
try mutableCompositionBackTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration), of: backAssetTrack, at: kCMTimeZero)
There is still a missing point that I do not know how to handle: setting the volumes of these audio assets. I will update this answer as soon as I figure it out.
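A likely route, based on the AVMutableAudioMix approach shown in the answers above (a sketch only; the track names are the ones from this code):

let audioMix = AVMutableAudioMix()

// Quiet background music...
let musicParams = AVMutableAudioMixInputParameters(track: mutableCompositionAudioTrack[0])
musicParams.setVolume(0.05, at: kCMTimeZero)

// ...and the video's own sound at full volume.
let originalSoundParams = AVMutableAudioMixInputParameters(track: mutableCompositionBackTrack[0])
originalSoundParams.setVolume(1.0, at: kCMTimeZero)

audioMix.inputParameters = [musicParams, originalSoundParams]
assetExport.audioMix = audioMix // assign before calling exportAsynchronously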
I'll start by saying that I spent a lot of time searching through documentation, posts here, and elsewhere, but I can't figure out the solution to this problem.
I'm using AVAssetExportSession to export an .mp4 file stored in an AVAsset instance.
What I do is:
1. I check the isExportable property of the AVAsset.
2. I then get an array of exportPresets compatible with the AVAsset instance.
3. I take AVAssetExportPreset1920x1080 or, if it is not present, I try to export the media with AVAssetExportPresetPassthrough (FYI, 100% of the time the preset I need is in the list, but I also tried the passthrough option and it doesn't work either).
4. The outputFileType is AVFileTypeMPEG4; I also tried assigning the .mp4 extension to the file, but nothing makes it work.
I always receive this error
Error Domain=AVFoundationErrorDomain Code=-11838 "Operation Stopped"
UserInfo={NSUnderlyingError=0x600000658c30 {Error
Domain=NSOSStatusErrorDomain Code=-12109 "(null)"},
NSLocalizedFailureReason=The operation is not supported for this
media., NSLocalizedDescription=Operation Stopped}
Below is the code I'm using
func _getDataFor(_ item: AVPlayerItem, completion: @escaping (Data?) -> ()) {
    guard item.asset.isExportable else {
        completion(nil)
        return
    }

    let compatiblePresets = AVAssetExportSession.exportPresets(compatibleWith: item.asset)
    var preset: String = AVAssetExportPresetPassthrough
    if compatiblePresets.contains(AVAssetExportPreset1920x1080) { preset = AVAssetExportPreset1920x1080 }

    guard
        let exportSession = AVAssetExportSession(asset: item.asset, presetName: preset),
        exportSession.supportedFileTypes.contains(AVFileTypeMPEG4) else {
            completion(nil)
            return
    }

    var tempFileUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("temp_video_data.mp4", isDirectory: false)
    tempFileUrl = URL(fileURLWithPath: tempFileUrl.path)

    exportSession.outputURL = tempFileUrl
    exportSession.outputFileType = AVFileTypeMPEG4
    let startTime = CMTimeMake(0, 1)
    let timeRange = CMTimeRangeMake(startTime, item.duration)
    exportSession.timeRange = timeRange
    exportSession.exportAsynchronously {
        print("\(exportSession.error)")
        let data = try? Data(contentsOf: tempFileUrl)
        _ = try? FileManager.default.removeItem(at: tempFileUrl)
        completion(data)
    }
}
It seems that converting the AVAsset instance into an AVMutableComposition did the trick. If anyone knows the reason why this works, please let me know.
This is the new _getDataFor(_:completion:) method implementation:
func _getDataFor(_ item: AVPlayerItem, completion: @escaping (Data?) -> ()) {
    guard item.asset.isExportable else {
        completion(nil)
        return
    }

    let composition = AVMutableComposition()
    let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
    let compositionAudioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))

    let sourceVideoTrack = item.asset.tracks(withMediaType: AVMediaTypeVideo).first!
    let sourceAudioTrack = item.asset.tracks(withMediaType: AVMediaTypeAudio).first!
    do {
        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, item.duration), of: sourceVideoTrack, at: kCMTimeZero)
        try compositionAudioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, item.duration), of: sourceAudioTrack, at: kCMTimeZero)
    } catch {
        completion(nil)
        return
    }

    let compatiblePresets = AVAssetExportSession.exportPresets(compatibleWith: composition)
    var preset: String = AVAssetExportPresetPassthrough
    if compatiblePresets.contains(AVAssetExportPreset1920x1080) { preset = AVAssetExportPreset1920x1080 }

    guard
        let exportSession = AVAssetExportSession(asset: composition, presetName: preset),
        exportSession.supportedFileTypes.contains(AVFileTypeMPEG4) else {
            completion(nil)
            return
    }

    var tempFileUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("temp_video_data.mp4", isDirectory: false)
    tempFileUrl = URL(fileURLWithPath: tempFileUrl.path)

    exportSession.outputURL = tempFileUrl
    exportSession.outputFileType = AVFileTypeMPEG4
    let startTime = CMTimeMake(0, 1)
    let timeRange = CMTimeRangeMake(startTime, item.duration)
    exportSession.timeRange = timeRange
    exportSession.exportAsynchronously {
        print("\(tempFileUrl)")
        print("\(exportSession.error)")
        let data = try? Data(contentsOf: tempFileUrl)
        _ = try? FileManager.default.removeItem(at: tempFileUrl)
        completion(data)
    }
}
Swift 5:
import Foundation
import AVKit

func getDataFor(_ asset: AVAsset, completion: @escaping (Data?) -> ()) {
    guard asset.isExportable,
          let sourceVideoTrack = asset.tracks(withMediaType: .video).first,
          let sourceAudioTrack = asset.tracks(withMediaType: .audio).first else {
        completion(nil)
        return
    }

    let composition = AVMutableComposition()
    let compositionVideoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
    let compositionAudioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))

    do {
        try compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: asset.duration), of: sourceVideoTrack, at: .zero)
        try compositionAudioTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: asset.duration), of: sourceAudioTrack, at: .zero)
    } catch {
        completion(nil)
        return
    }

    let compatiblePresets = AVAssetExportSession.exportPresets(compatibleWith: composition)
    var preset = AVAssetExportPresetPassthrough
    let preferredPreset = AVAssetExportPreset1920x1080
    if compatiblePresets.contains(preferredPreset) {
        preset = preferredPreset
    }

    let fileType: AVFileType = .mp4
    guard let exportSession = AVAssetExportSession(asset: composition, presetName: preset),
          exportSession.supportedFileTypes.contains(fileType) else {
        completion(nil)
        return
    }

    let tempFileUrl = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("temp_video_data")
    exportSession.outputURL = tempFileUrl
    exportSession.outputFileType = fileType
    let startTime = CMTimeMake(value: 0, timescale: 1)
    let timeRange = CMTimeRangeMake(start: startTime, duration: asset.duration)
    exportSession.timeRange = timeRange
    exportSession.exportAsynchronously {
        print(tempFileUrl)
        print(String(describing: exportSession.error))
        let data = try? Data(contentsOf: tempFileUrl)
        try? FileManager.default.removeItem(at: tempFileUrl)
        completion(data)
    }
}
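A possible call site (the source URL is a placeholder):

let asset = AVAsset(url: localVideoURL)
getDataFor(asset) { data in
    guard let data = data else {
        print("Export failed")
        return
    }
    print("Exported \(data.count) bytes of MP4 data")
}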
Check that you set the delegate property for your AVURLAsset's resourceLoader correctly:
[self.playerAsset.resourceLoader setDelegate:self queue:dispatch_get_main_queue()];
And conform to the AVAssetResourceLoaderDelegate protocol. That is all you need to do.
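A minimal Swift sketch of the same setup (the class and the scheme handling are assumptions):

final class AssetLoader: NSObject, AVAssetResourceLoaderDelegate {
    let asset: AVURLAsset

    init(url: URL) {
        asset = AVURLAsset(url: url)
        super.init()
        // Set the delegate before handing the asset to a player or exporter.
        asset.resourceLoader.setDelegate(self, queue: .main)
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Fulfil or defer the request here for custom URL schemes.
        return true
    }
}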
I had this same issue because I was adding an audio track to a video without audio. Removing the audio track fixed it.
I ran into this problem because the Microphone permission was off/denied. Once I turned it on, this error went away.
I solved this problem by removing the composition track with media type .audio and empty segments from the AVMutableComposition:
if let audioTrack = exportComposition.tracks(withMediaType: .audio).first,
   audioTrack.segments.isEmpty {
    exportComposition.removeTrack(audioTrack)
}
I had the same error: exporting with AVAssetExportPresetPassthrough always failed, and in my case I couldn't use another preset because I had to keep the same resolution as the original video.
I fixed it like this:
let asset = AVAsset(url: originUrl)
let videoTrack = asset.tracks(withMediaType: .video).first! as AVAssetTrack

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = CGSize(
    width: videoTrack.naturalSize.width,
    height: videoTrack.naturalSize.height
)
videoComposition.frameDuration = CMTime(
    value: 1,
    timescale: videoTrack.naturalTimeScale
)

let transformer = AVMutableVideoCompositionLayerInstruction(
    assetTrack: videoTrack
)
transformer.setOpacity(1.0, at: CMTime.zero)

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = timeRange
instruction.layerInstructions = [transformer]
videoComposition.instructions = [instruction]

guard let exportSession = AVAssetExportSession(
    asset: asset,
    presetName: AVAssetExportPresetMediumQuality
) else {
    return handleFailure(error: .mediaSavingError, completion: completion)
}

exportSession.videoComposition = videoComposition
exportSession.outputURL = outputUrl
exportSession.outputFileType = .mp4
exportSession.timeRange = timeRange
exportSession.exportAsynchronously { [weak self] in
    // your code
}
Works great for me, and it saves the same resolution as the video had before.