Why do I get this output when using AVPlayer in Swift?

I am trying to play an audio file using AVPlayer in Swift. When I play a file I generated by combining two files, I get this output:
playing file:"file location".m4a -- file:///
However, when I play another premade sound file, it plays fine and I don't get the -- file:/// in the output after playing it.
This is how I am playing the audio:
func play(url: URL) {
    // AVPlayer's initializer does not throw, so no do/catch is needed here.
    soundPlayer = AVPlayer(url: url)
    soundPlayer.volume = 1.0
    soundPlayer.play()
}
And this is what I am using to concatenate the two audio files:
func makeSounds(sounds: [URL], preName: String) {
    let composition = AVMutableComposition()
    print(sounds)

    for sound in sounds {
        guard let compositionAudioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                                      preferredTrackID: kCMPersistentTrackID_Invalid) else { continue }
        let avAsset = AVURLAsset(url: sound)
        let track = avAsset.tracks(withMediaType: .audio)[0]
        let timeRange = CMTimeRange(start: .zero, duration: track.timeRange.duration)
        // Insert each file at the composition's current end so the sounds play back to back.
        try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration)
    }

    let documentDirectoryURL = NSURL(fileURLWithPath: Urls.user)
    let fileDestinationUrl = documentDirectoryURL.appendingPathComponent("\(SoundData.Name)\(preName).m4a")

    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
    assetExport?.outputFileType = .m4a
    assetExport?.outputURL = fileDestinationUrl
    assetExport?.exportAsynchronously {
        switch assetExport!.status {
        case .failed:
            print("failed \(assetExport?.error)")
        case .cancelled:
            print("cancelled \(assetExport?.error)")
        case .unknown:
            print("unknown \(assetExport?.error)")
        case .waiting:
            print("waiting \(assetExport?.error)")
        case .exporting:
            print("exporting \(assetExport?.error)")
        default:
            soundsToPlay.soundLocation = String(describing: fileDestinationUrl!)
            print("Audio Concatenation Complete")
        }
    }
}
The audio file's location matches the URL I set it to be exported to, but it doesn't play the sound file; I just get that output.

AVAssetExportSession is happy with a file URL, and the export itself worked: the -- file:/// you see is just how an NSURL that has a base URL prints itself (relativeString -- baseURL). The problem is that String(describing:) stores that description format, which is not a usable path. Use the URL's absoluteString instead, which contains the file:/// scheme exactly once and can be turned back into a playable URL.
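For illustration, a minimal sketch of that fix (assuming soundsToPlay.soundLocation is later turned back into a URL for the player; the names come from the question's code):

// Store a plain "file:///...m4a" string instead of NSURL's
// "relativeString -- baseURL" description.
soundsToPlay.soundLocation = fileDestinationUrl!.absoluteString

// Later, when playing, rebuild the URL from that string:
if let url = URL(string: soundsToPlay.soundLocation) {
    play(url: url)
}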

Related

Swift/Cocoa: How do I export an AVMutableComposition as wav or aiff audio file

I have an AVMutableComposition containing only audio that I want to export to a .wav audio file.
The simplest solution for exporting audio I found was using AVAssetExportSession, as in this simplified example:
let composition = AVMutableComposition()
// add tracks...

let exportSession = AVAssetExportSession(asset: composition,
                                         presetName: AVAssetExportPresetAppleM4A)!
exportSession.outputFileType = .m4a
exportSession.outputURL = someOutUrl
exportSession.exportAsynchronously {
    // done
}
But it only works for .m4a.
This post mentions that in order to export to other formats, one would have to use AVAssetReader and AVAssetWriter; unfortunately, it does not go into further detail.
I have tried to implement it but got stuck in the process.
This is what I have so far (again simplified):
let composition = AVMutableComposition()

let outputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsNonInterleaved: false,
    AVSampleRateKey: 44100.0,
    AVChannelLayoutKey: NSData(),
]

let assetWriter = try! AVAssetWriter(outputURL: someOutUrl, fileType: .wav)
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
assetWriter.add(input)
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: CMTime.zero)

input.requestMediaDataWhenReady(on: .main) {
    // as I understand, I need to bring in data from my
    // AVMutableComposition here...
    let sampleBuffer: CMSampleBuffer = ???
    input.append(sampleBuffer)
}

assetWriter.finishWriting {
    // done
}
It boils down to my question:
Can you provide a working example for exporting audio from an AVMutableComposition to a wav file?
After some more research I came up with the following solution.
The missing piece was the usage of AVAssetReader.
(simplified code)
// composition
let composition = AVMutableComposition()
// add stuff to composition

// reader
guard let assetReader = try? AVAssetReader(asset: composition) else { return }
assetReader.timeRange = CMTimeRange(start: .zero, duration: composition.duration)
let assetReaderAudioMixOutput = AVAssetReaderAudioMixOutput(audioTracks: composition.tracks(withMediaType: .audio),
                                                            audioSettings: nil)
assetReader.add(assetReaderAudioMixOutput)
guard assetReader.startReading() else { return }

// writer
let outputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsNonInterleaved: false,
    AVSampleRateKey: 44100.0,
    AVChannelLayoutKey: NSData(),
]
guard let assetWriter = try? AVAssetWriter(outputURL: someOutUrl, fileType: .wav) else { return }
let writerInput = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
assetWriter.add(writerInput)
guard assetWriter.startWriting() else { return }
assetWriter.startSession(atSourceTime: CMTime.zero)

let queue = DispatchQueue(label: "my.queue.id")
writerInput.requestMediaDataWhenReady(on: queue) {
    // capture assetReader in the block to prevent it from being released early
    let readerOutput = assetReader.outputs.first!
    while writerInput.isReadyForMoreMediaData {
        if let nextSampleBuffer = readerOutput.copyNextSampleBuffer() {
            writerInput.append(nextSampleBuffer)
        } else {
            writerInput.markAsFinished()
            assetWriter.endSession(atSourceTime: composition.duration)
            assetWriter.finishWriting {
                DispatchQueue.main.async {
                    // done, call my completion
                }
            }
            break
        }
    }
}
Since you mention you are creating the .m4a file successfully, why not convert the .m4a file to a .wav file with just a few lines of code?
I simply changed the extension of the file to .wav and removed the .m4a file, and it worked.
func getDirectory() -> URL {
    let path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let documentDirectory = path[0]
    return documentDirectory
}

let date = Date().timeIntervalSince1970
fileName = getDirectory().appendingPathComponent("\(date).m4a")
wavFileName = getDirectory().appendingPathComponent("\(date).wav")
try! FileManager.default.copyItem(at: fileName, to: wavFileName)
try! FileManager.default.removeItem(at: fileName)

I even played the .wav file, and it works fine:

audioPlayer = try! AVAudioPlayer(contentsOf: wavFileName)
audioPlayer.play()
Check out the example above, and see whether this extension replacement adds any noticeable load or delay in your app.

How to save AVMutableComposition (directly) without AVMutableVideoComposition to camera roll/gallery/photos - Swift 5

I am one step away from finishing my first original app, so I'm quite desperate for an answer.
I have created an AVMutableComposition (not an AVMutableVideoComposition) that combines multiple clips. All I want to do is save it to the camera roll or gallery. The only way I know how to do so is using AVMutableVideoComposition, but in my current situation that would take a really long time because of the way I structured my app. I have also used PHPhotoLibrary, but that only applies to video URLs. (Is there a way to convert an AVMutableComposition to a URL?)
Here's what I've tried that doesn't work:
var mixComposition = AVMutableComposition() // trying to save this

func saveToGallery() {
    let savePathUrl = URL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")
    do {
        try FileManager.default.removeItem(at: savePathUrl)
    } catch { print(error.localizedDescription) }

    let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileType.mp4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true
    assetExport.exportAsynchronously {
        switch assetExport.status {
        case .completed:
            print("successfully saved") // but it still prints this
        case .failed:
            print("failed")
        case .cancelled:
            print("cancelled")
        default:
            print("complete")
        }
    }
}

@IBAction func SaveButtonDidTouch(_ sender: Any) {
    saveToGallery() //??
}
It still prints "successfully saved" even though nothing is added to my gallery/camera roll.
So what is the most efficient way to save an AVMutableComposition? Where did I go wrong here?
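One thing worth noting: the export above only writes newVideo.mp4 into the app's Documents directory; nothing in this code touches the photo library, which is why the export can succeed while the gallery stays empty. A minimal sketch of handing the exported file to PHPhotoLibrary in the .completed case (assuming the Photos framework is imported, NSPhotoLibraryAddUsageDescription is set in Info.plist, and authorization has been granted), using the same PHAssetChangeRequest call as the slow-motion answer further down:

// Inside the .completed case of the switch above:
PHPhotoLibrary.shared().performChanges({
    // Register the exported file with the photo library.
    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: savePathUrl)
}) { saved, error in
    if saved {
        print("video added to the photo library")
    } else {
        print("saving failed: \(String(describing: error))")
    }
}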

Can't get video tracks from AVURLAsset for HLS videos (.m3u8 format) for AVPlayer

I am developing a custom video player to stream HLS videos from a server. I can successfully play HLS videos using AVPlayerItem and AVPlayer.
After that, I want to add a subtitle track and audio tracks to my video player, so I used AVMutableComposition. The issue is that when I create an AVURLAsset for HLS videos, I can't get the video tracks from the AVURLAsset; it always gives me 0 tracks. I tried loadValuesAsynchronously on the AVURLAsset and adding KVO for "tracks" on the AVPlayerItem, but neither produced any positive result.
I am using the following code.
func playVideo() {
    let videoAsset = AVURLAsset(url: videoURL!)
    let composition = AVMutableComposition()

    // Video
    let videoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        let tracks = videoAsset.tracks(withMediaType: .video)
        guard let track = tracks.first else {
            print("Can't get first video track")
            return
        }
        try videoTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration), of: track, at: .zero)
    } catch {
        print(error)
        return
    }

    guard let subtitlesUrl = Bundle.main.url(forResource: "en", withExtension: "vtt") else {
        print("Can't load en.vtt from bundle")
        return
    }

    // Subtitles
    let subtitleAsset = AVURLAsset(url: subtitlesUrl)
    let subtitleTrack = composition.addMutableTrack(withMediaType: .text, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        let subTracks = subtitleAsset.tracks(withMediaType: .text)
        guard let subTrack = subTracks.first else {
            print("Can't get first subtitles track")
            return
        }
        try subtitleTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration), of: subTrack, at: .zero)
    } catch {
        print(error)
        return
    }

    // Prepare item and play it
    let item = AVPlayerItem(asset: composition)
    self.player = AVPlayer(playerItem: item)
    self.playerLayer = AVPlayerLayer()
    self.playerLayer.frame = self.bounds
    self.playerLayer.contentsGravity = .resizeAspect
    self.playerLayer.player = player
    self.layer.addSublayer(self.playerLayer)
    self.player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
    self.player.play()
}
This procedure works well for .mp4 videos, but not for HLS videos (.m3u8). Does anyone have a working solution for this?
Or: how can we get tracks from HLS videos using AVURLAsset? If this is not possible, how can I achieve a similar result?
Please let me know your feedback. Many thanks in advance.
For HLS video, tracks(withMediaType: .video) will return an empty array.
Use player.currentItem.presentationSize.width and player.currentItem.presentationSize.height instead.
Please let me know if it works.
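For example, a quick sketch of reading those values (player is assumed to be your configured AVPlayer):

// presentationSize becomes non-zero once the HLS item is ready to play.
if let item = player.currentItem {
    print("video dimensions: \(item.presentationSize.width) x \(item.presentationSize.height)")
}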
I didn't have the exact same problem as you, but I got around a similar problem (querying for HDR) by querying the tracks on the AVPlayerItem instead of on the AVURLAsset.
Set up an observer on the item status:
// Keep a reference to the returned NSKeyValueObservation for as long as you need the observer.
player?.observe(\AVPlayer.currentItem?.status,
                options: [.new, .initial],
                changeHandler: { [weak self] player, _ in
    DispatchQueue.main.async {
        self?.observedItemStatus(from: player)
    }
})
Then query the AVMediaType of your choice (in your case, text).

func observedItemStatus(from avPlayer: AVPlayer) {
    guard let currentItem = avPlayer.currentItem else { return }
    // Ideally, execute code based on currentItem.status... for the brevity of this example I won't.
    let hasLegibleMedia = currentItem.tracks.first(where: {
        $0.assetTrack?.mediaType == AVMediaType.text
    })?.assetTrack?.hasMediaCharacteristic(.legible)
}
Alternatively, if you need more than just a Bool, you could loop over the tracks to access the assetTrack you really want, as in the sketch below.
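(A sketch of that loop; the textTracks array is my own illustration, not part of the original answer.)

// Collect every text asset track on the current item.
var textTracks: [AVAssetTrack] = []
for playerItemTrack in currentItem.tracks {
    if let assetTrack = playerItemTrack.assetTrack, assetTrack.mediaType == .text {
        textTracks.append(assetTrack)
    }
}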

AVMutableComposition with AKAudioFiles just exports the first track

I have recently been struggling to get AVMutableComposition working in my synthesizer app. I implemented a function like the one below without AudioKit and it worked perfectly fine. However, when I run the function below, I can only hear the first file of the files array in the output AKAudioFile. Is this a problem with how AKAudioFiles are copied, or with how AVMutableComposition works?
private func createCombinedRecordingFrom(_ files: [AKAudioFile], completion: @escaping (AKAudioFile?) -> ()) {
    let composition = AVMutableComposition()

    for file in files {
        let asset = file.avAsset
        let timeRange = CMTimeRange(start: .zero, duration: asset.duration)
        let track = composition.addMutableTrack(withMediaType: .audio,
                                                preferredTrackID: kCMPersistentTrackID_Invalid)
        do {
            try track?.insertTimeRange(timeRange, of: asset.tracks[0], at: .zero)
        } catch {
            print(error.localizedDescription)
            completion(nil)
        }
    }

    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)!
    assetExport.audioTimePitchAlgorithm = .spectral
    assetExport.outputFileType = .m4a
    let temporaryURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("m4a")
    assetExport.outputURL = temporaryURL
    assetExport.exportAsynchronously {
        do {
            let audioFile = try AKAudioFile(forReading: temporaryURL)
            completion(audioFile)
        } catch {
            print(error.localizedDescription)
            completion(nil)
        }
    }
}
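Worth noting (an observation, since no answer is recorded here): each file is inserted on its own track at .zero, so the files are laid on top of each other to play simultaneously rather than one after another. A sketch of appending them sequentially instead, inserting each file at the composition's running duration the way the concatenation code in the first question does:

// Append each file at the current end of the composition so they play in sequence.
for file in files {
    let asset = file.avAsset
    let timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    let track = composition.addMutableTrack(withMediaType: .audio,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)
    try? track?.insertTimeRange(timeRange, of: asset.tracks[0], at: composition.duration)
}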

How to record a video and make it slow motion

I am working on an iPhone app for school and need some help. The app should record video, make it slow motion (about 2x), then save it to the photo library. So far I have everything except how to make the video slow motion. I know it can be done as there is already an app in the App Store that does it.
How can I take a video I've saved to a temp url and adjust the speed before saving it to the photo library?
If you need to export your video, then you need to use the AVMutableComposition class.
Add your video as an AVAsset to an AVMutableComposition and scale it with:
- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration
Finally, export it using the AVAssetExportSession class.
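In Swift, that scaling step looks roughly like this (a sketch; compositionTrack and asset stand in for the track and asset set up as described above, and stretching the range to twice its duration halves the playback speed):

// Stretch the whole inserted range to 2x its duration -> 0.5x playback speed.
let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
let doubled = CMTimeMultiply(asset.duration, multiplier: 2)
compositionTrack.scaleTimeRange(fullRange, toDuration: doubled)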
I have written code that makes your video slow motion and saves it to the Photos library. Importantly, this code works in Swift 5. Creating a slow-motion video in iOS with Swift is not easy; many of the slow-motion examples I came across either did not work or used deprecated code. So I finally figured out a way to make slow motion in Swift.
This code can be used for 120fps and greater as well. Just pass in the URL of your video and it slows it down.
Here is the code snippet I created for achieving slow motion:
func slowMotion(pathUrl: URL) {
    let videoAsset = AVURLAsset(url: pathUrl, options: nil)
    let vdoTrack = videoAsset.tracks(withMediaType: .video)[0]
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {
        try compositionVideoTrack?.insertTimeRange(
            CMTimeRangeMake(start: .zero, duration: videoAsset.duration),
            of: vdoTrack,
            at: .zero)
    } catch {
        // handle error
        return
    }

    //MARK: This constant (videoScaleFactor) is what achieves the slow motion you wanted. It scales up the time range of the video, which slows it down.
    // Just increase the videoScaleFactor value to play the video more slowly (at higher frame rates).
    let videoScaleFactor = 2.0
    let videoDuration = videoAsset.duration

    compositionVideoTrack?.scaleTimeRange(
        CMTimeRangeMake(start: .zero, duration: videoDuration),
        toDuration: CMTimeMake(value: videoDuration.value * Int64(videoScaleFactor), timescale: videoDuration.timescale))
    compositionVideoTrack?.preferredTransform = vdoTrack.preferredTransform

    let dirPaths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).map(\.path)
    let docsDir = dirPaths[0]
    let outputFilePath = URL(fileURLWithPath: docsDir).appendingPathComponent("slowMotion\(UUID().uuidString).mp4").path

    if FileManager.default.fileExists(atPath: outputFilePath) {
        do {
            try FileManager.default.removeItem(atPath: outputFilePath)
        } catch {
        }
    }
    let filePath = URL(fileURLWithPath: outputFilePath)

    let assetExport = AVAssetExportSession(
        asset: mixComposition,
        presetName: AVAssetExportPresetHighestQuality)
    assetExport?.outputURL = filePath
    assetExport?.outputFileType = .mp4

    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport?.status {
        case .failed:
            print("asset output media url = \(String(describing: assetExport?.outputURL))")
            print("Export session failed with error: \(String(describing: assetExport?.error))")
            DispatchQueue.main.async {
                // completion(nil)
            }
        case .completed:
            print("Successful")
            let outputURL = assetExport!.outputURL
            print("url path = \(String(describing: outputURL))")
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
            }) { saved, error in
                if saved {
                    print("video successfully saved; view it in the Photos gallery")
                }
                if error != nil {
                    print("error in saving video \(String(describing: error?.localizedDescription))")
                }
            }
            DispatchQueue.main.async {
                // completion(filePath)
            }
        default:
            break
        }
    })
}
slowmoVideo is an OSS project which appears to do this very nicely, though I don't know whether it would work on an iPhone.

It does not simply make your videos play at 0.01x speed. You can smoothly slow down and speed up your footage, optionally with motion blur. How does slow motion work? slowmoVideo tries to find out where pixels move in the video (this information is called Optical Flow), and then uses this information to calculate the additional frames.