swift: how to delete part of audio?

I'm creating a simple audio editing tool to trim and delete parts of an audio file.
I implemented the trim function and it is working fine. However, I searched around and tried to implement the delete function; here is my code:
func deleteExportAsset(_ asset: AVAsset, fileName: String, completeAudioTime: CGFloat) -> URL {
    print("\(#function)")
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let trimmedSoundFileURL = documentsDirectory.appendingPathComponent(fileName)
    print("saving to \(trimmedSoundFileURL.absoluteString)")
    // fileExists(atPath:) wants URL.path, not URL.absoluteString (which carries a "file://" prefix)
    if FileManager.default.fileExists(atPath: trimmedSoundFileURL.path) {
        print("sound exists, removing \(trimmedSoundFileURL.path)")
        do {
            if try trimmedSoundFileURL.checkResourceIsReachable() {
                print("is reachable")
            }
            try FileManager.default.removeItem(at: trimmedSoundFileURL)
        } catch {
            print("could not remove \(trimmedSoundFileURL)")
            print(error.localizedDescription)
        }
    }
    print("creating export session for \(asset)")
    if let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) {
        exporter.outputFileType = .m4a
        exporter.outputURL = trimmedSoundFileURL
        // endTimeOfRange1 (time1) and startTimeOfRange2 (time2) are defined elsewhere
        let timeRange1 = CMTimeRangeFromTimeToTime(start: CMTime(seconds: 0, preferredTimescale: 100),
                                                   end: CMTime(seconds: endTimeOfRange1, preferredTimescale: 100))
        let timeRange2 = CMTimeRangeFromTimeToTime(start: CMTime(seconds: startTimeOfRange2, preferredTimescale: 100),
                                                   end: CMTime(seconds: Double(completeAudioTime), preferredTimescale: 100))
        exporter.timeRange = CMTimeRangeGetUnion(timeRange1, otherRange: timeRange2)
        // do it
        exporter.exportAsynchronously {
            print("export complete \(exporter.status)")
            switch exporter.status {
            case .failed:
                if let e = exporter.error {
                    print("export failed \(e)")
                }
            case .cancelled:
                print("export cancelled \(String(describing: exporter.error))")
            default:
                print("export complete")
            }
        }
    } else {
        print("cannot create AVAssetExportSession for asset \(asset)")
    }
    return trimmedSoundFileURL
}
What I'm doing here is creating two ranges: Range1 from 0 to time1, and Range2 from time2 to the end of the audio. (I want to delete the part from time1 to time2.)
Then I take the union of the two ranges.
However, nothing happens to the audio. It is saved exactly as it was before this function.

CMTimeRangeGetUnion returns another CMTimeRange, which is just a (start) time and a duration. So there is nothing that can hold the two separate time ranges required to do what you are expecting. By extension, AVAssetExportSession has no API that takes a list of time ranges to export.
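To see why, here is a quick sketch (the times are arbitrary, chosen just for illustration): the union of two disjoint ranges also covers the gap between them, so the section you wanted to delete comes right back:
import CoreMedia

let range1 = CMTimeRange(start: CMTime(seconds: 0, preferredTimescale: 100),
                         end: CMTime(seconds: 2, preferredTimescale: 100)) // keep 0-2 s
let range2 = CMTimeRange(start: CMTime(seconds: 5, preferredTimescale: 100),
                         end: CMTime(seconds: 8, preferredTimescale: 100)) // keep 5-8 s
let union = CMTimeRangeGetUnion(range1, otherRange: range2)
print(union.start.seconds, union.duration.seconds) // 0.0 8.0 -- a single range covering 0-8 s,
// including the 2-5 s section that was supposed to be deleted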
But there is a way to accomplish it. The idea is to create an editable copy of the asset, delete the time range, and then export the editable copy. AVMutableComposition does this:
// assuming 'asset', 'endTimeOfRange1' and 'startTimeOfRange2' from the question:

// create an empty mutable composition
let composition = AVMutableComposition()

// copy all of the original asset into the mutable composition, effectively creating an editable copy
try composition.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration), of: asset, at: .zero)

// now edit as required, e.g. delete a time range
let startTime = CMTime(seconds: endTimeOfRange1, preferredTimescale: 100)
let endTime = CMTime(seconds: startTimeOfRange2, preferredTimescale: 100)
composition.removeTimeRange(CMTimeRange(start: startTime, end: endTime))

// since AVMutableComposition is an AVAsset subclass, it can be exported with
// AVAssetExportSession (or played with an AVPlayer(Item))
if let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A) {
    // configure the session and call exportAsynchronously as above.
    // You don't have to set the timeRange of the exportSession.
}
Note that copying from the asset into the composition only modifies some in-memory structures defining which samples go where on the timeline; it doesn't actually move any media samples around. That only happens during export. As a result, editing is (relatively) fast, but you have to keep the source file around at least until the export is finished.

Related

AVMutableComposition goes silent every once in a while

I'm trying to create an audio loop in Swift and it works very well 9 times out of 10. But then suddenly, the 10th time (or so), the insertTimeRange function seems to fail in some way.
I can see that the player has the correct length, but instead of taking the full 60 seconds of audio from every loop, it just seems to take a very, very short part of it and loop every minute. Illustration of the problem with five loops:
.----------.----------.----------.----------.----------
(short but hearable audio = ., complete silence = -)
Unfortunately, it doesn't throw any error. Here's how I create the composition:
private func createComposition(audioURL: URL, minutesToLoop: UInt) -> AVMutableComposition? {
    // Initiate a new composition
    let composition = AVMutableComposition()
    // Add one audio track (channel) to the composition
    let compositionAudioTrack: AVMutableCompositionTrack? = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: 0)
    // Create an AVAsset from the audio file (mp3 -> React Native bundle string -> URL -> AVAsset)
    let asset = AVURLAsset(url: audioURL)
    // Extract the (now compatible) audio track from the AVAsset
    let track = asset.tracks(withMediaType: AVMediaType.audio)[0]
    // Create a 0-60 second time range that we can use to cut that part out of the track
    let timeRange = CMTimeRangeMake(start: .zero, duration: CMTimeMakeWithSeconds(60, preferredTimescale: 600))
    if compositionAudioTrack != nil {
        // Repeat for as many minutes as the user specified in the app
        // (0..<minutesToLoop rather than 0...(minutesToLoop - 1), which would trap when minutesToLoop is 0)
        for _ in 0..<minutesToLoop {
            do {
                // Take the first 60 seconds of the audio file (timeRange, of: track) and paste it
                // over and over again, each time exactly at the end of the track (at: composition.duration)
                try compositionAudioTrack!.insertTimeRange(timeRange, of: track, at: composition.duration)
            } catch {
                print("FAILED TO MERGE AUDIO")
                return nil
            }
        }
    }
    return composition
}
For future AVMutableComposition lovers, I found the solution. You need to add an option when creating your asset, like so:
let asset = AVURLAsset(url: audioURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
Without that option, duration and timing for compressed formats like mp3 are only estimated, so the inserted time ranges can occasionally come out wrong.
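As a quick sanity check (a sketch; audioURL is assumed to point at a local mp3), you can compare the two modes:
let rough = AVURLAsset(url: audioURL)
let precise = AVURLAsset(url: audioURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
print(rough.providesPreciseDurationAndTiming)   // often false for mp3
print(precise.providesPreciseDurationAndTiming) // true, if the format allows it
print(rough.duration.seconds, precise.duration.seconds) // may differ slightly
Precise timing can make loading long files a little slower, but for editing with AVMutableComposition it is usually worth it.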

Swift - AVAssetExportSession exportSession.exportAsynchronously completion handler not called

I used this link and the following code in my project, but the AVAssetExportSession exportAsynchronously completion handler isn't called:
StackLink
func encodeVideo(at videoURL: URL, completionHandler: ((URL?, Error?) -> ())?) {
    let avAsset = AVURLAsset(url: videoURL, options: nil)
    let startDate = Date()
    // Create export session
    guard let exportSession = AVAssetExportSession(asset: avAsset, presetName: AVAssetExportPresetPassthrough) else {
        completionHandler?(nil, nil)
        return
    }
    // Creating a temp path to save the converted video
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let filePath = documentsDirectory.appendingPathComponent("rendered-Video.mp4")
    // Check if the file already exists; if so, remove the previous file
    if FileManager.default.fileExists(atPath: filePath.path) {
        do {
            try FileManager.default.removeItem(at: filePath)
        } catch {
            completionHandler?(nil, error)
            return // don't continue exporting if the old file couldn't be removed
        }
    }
    exportSession.outputURL = filePath
    exportSession.outputFileType = .mp4
    exportSession.shouldOptimizeForNetworkUse = true
    // start at .zero (CMTimeMakeWithSeconds with a 0 timescale yields an invalid CMTime)
    let range = CMTimeRangeMake(start: .zero, duration: avAsset.duration)
    exportSession.timeRange = range
    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .failed:
            print(exportSession.error ?? "NO ERROR")
            completionHandler?(nil, exportSession.error)
        case .cancelled:
            print("Export canceled")
            completionHandler?(nil, nil)
        case .completed:
            // Video conversion finished
            let endDate = Date()
            let time = endDate.timeIntervalSince(startDate)
            print(time)
            print("Successful!")
            print(exportSession.outputURL ?? "NO OUTPUT URL")
            completionHandler?(exportSession.outputURL, nil)
        case .unknown:
            print("Export Unknown Error")
        default:
            break
        }
    }
}
I've also shared my project on GitHub so you can check it out. Thanks.
GitRepo
I use Xcode 12.3 and iOS 14.3.
It turned out to be an iOS bug: even screen recording on my device didn't work. Once I knew about this bug, I restarted my phone and everything worked fine, but it took me some time to work out the solution.

AVURLAsset returning empty array - Trying to concatenate two files

I'm trying to concatenate two (multiple) audio files. I found a relevant post and solution at Concatenate Two Audio Files Swift
Here's the solution:
func mergeAudioFiles(audioFileUrls: NSArray) {
    let composition = AVMutableComposition()
    for i in 0 ..< audioFileUrls.count {
        let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
        let asset = AVURLAsset(url: (audioFileUrls[i] as! NSURL) as URL)
        let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0]
        let timeRange = CMTimeRange(start: CMTimeMake(0, 600), duration: track.timeRange.duration)
        try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration)
    }
    let documentDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first! as NSURL
    self.mergeAudioURL = documentDirectoryURL.appendingPathComponent("FinalAudio.m4a")! as URL as NSURL
    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
    assetExport?.outputFileType = AVFileTypeAppleM4A
    assetExport?.outputURL = mergeAudioURL as URL
    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport!.status {
        case AVAssetExportSessionStatus.failed:
            print("failed \(assetExport?.error)")
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport?.error)")
        case AVAssetExportSessionStatus.unknown:
            print("unknown \(assetExport?.error)")
        case AVAssetExportSessionStatus.waiting:
            print("waiting \(assetExport?.error)")
        case AVAssetExportSessionStatus.exporting:
            print("exporting \(assetExport?.error)")
        default:
            print("Audio Concatenation Complete")
        }
    })
}
Some parameters are out of date, and I applied the suggested fixes from Xcode, resulting in:
func mergeAudioFiles(audioFileUrls: NSArray) {
    let composition = AVMutableComposition()
    for i in 0 ..< audioFileUrls.count {
        let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: CMPersistentTrackID())!
        let asset: AVURLAsset = AVURLAsset(url: (audioFileUrls[i] as! NSURL) as URL)
        let track: AVAssetTrack = asset.tracks(withMediaType: AVMediaType.audio)[0]
        let timeRange = CMTimeRange(start: CMTimeMake(value: 0, timescale: 600), duration: track.timeRange.duration)
        try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration)
    }
    let documentDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first! as NSURL
    self.mergeAudioURL = documentDirectoryURL.appendingPathComponent("FinalAudio.m4a")! as URL as NSURL
    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
    assetExport?.outputFileType = AVFileType.m4a
    assetExport?.outputURL = mergeAudioURL as URL
    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport!.status {
        case AVAssetExportSessionStatus.failed:
            print("failed \(assetExport?.error)")
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport?.error)")
        case AVAssetExportSessionStatus.unknown:
            print("unknown \(assetExport?.error)")
        case AVAssetExportSessionStatus.waiting:
            print("waiting \(assetExport?.error)")
        case AVAssetExportSessionStatus.exporting:
            print("exporting \(assetExport?.error)")
        default:
            print("Audio Concatenation Complete")
        }
    })
    print("asset url \(mergeAudioURL)")
}
When I used the solution, I got:
2020-05-11 13:09:14.771381-0600 TimeCapsule[88538:12022916] *** Terminating app due to uncaught exception 'NSRangeException', reason: '*** -[__NSArray0 objectAtIndex:]: index 0 beyond bounds for empty NSArray'
The error occurs on the let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0] line. Before that line, printouts show asset.tracks.count == 0 and asset.trackGroups.count == 0. When I remove the [0] index at the end of the line, the following line throws an error that track has no member 'timeRange'. I can't figure out how to add a track to my newly created asset so that I can use the asset's audio duration. Any help would be greatly appreciated; I'm assuming it's just outdated syntax for the new Swift.
The error is clear:
let tracks = asset.tracks(withMediaType: AVMediaTypeAudio)
guard !tracks.isEmpty else { return }
let track = tracks.first!
The file you are trying to concatenate has no audio tracks, hence the array is empty.
Please note that the sandbox container path changes between app launches, so any absolute 'url' you saved earlier may no longer be valid; I have encountered this problem before!
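If stale absolute URLs are the cause, a common fix (a sketch; persisting just the file name is an assumption about your setup) is to store only the file name and rebuild the URL against the current container on every launch:
func documentsURL(forFileName fileName: String) -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    return documents.appendingPathComponent(fileName)
}

// store "FinalAudio.m4a" rather than the full URL, then:
let url = documentsURL(forFileName: "FinalAudio.m4a")
let asset = AVURLAsset(url: url)
print(asset.tracks(withMediaType: .audio).count) // should be > 0 for a valid audio file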

Why do I get this output when using AVPlayer in Swift?

I am trying to play an audio file using AVPlayer in Swift. When I play a file I generated by combining two files, I get this output:
playing file:"file location".m4a -- file:///
However, when I play another pre-made sound file, it plays fine, and I don't get the -- file:/// in the output after playing it.
This is how I am playing the audio:
func play(url: NSURL) {
    // AVPlayer(url:) doesn't throw, so the original do/catch blocks here were unreachable
    soundPlayer = AVPlayer(url: url as URL)
    soundPlayer.volume = 1.0
    soundPlayer.play()
}
And this is what I am using to concatenate the two audio files:
func makeSounds(sounds: [NSURL], preName: String) {
    let composition = AVMutableComposition()
    print(sounds)
    for sound in sounds {
        let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
        let avAsset = AVURLAsset(url: sound as URL)
        let track = avAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
        let timeRange = CMTimeRange(start: CMTimeMake(0, 600), duration: track.timeRange.duration)
        try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration)
    }
    let documentDirectoryURL = NSURL(fileURLWithPath: Urls.user)
    var fileDestinationUrl = documentDirectoryURL.appendingPathComponent("\(SoundData.Name)\(preName).m4a")
    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
    assetExport?.outputFileType = AVFileTypeAppleM4A
    assetExport?.outputURL = fileDestinationUrl
    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport!.status {
        case AVAssetExportSessionStatus.failed:
            print("failed \(assetExport?.error)")
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport?.error)")
        case AVAssetExportSessionStatus.unknown:
            print("unknown \(assetExport?.error)")
        case AVAssetExportSessionStatus.waiting:
            print("waiting \(assetExport?.error)")
        case AVAssetExportSessionStatus.exporting:
            print("exporting \(assetExport?.error)")
        default:
            soundsToPlay.soundLocation = String(describing: fileDestinationUrl!)
            print("Audio Concatenation Complete")
        }
    })
}
The audio file location matches the URL I am exporting to, but it doesn't play the sound file; I just get that output.
AVAssetExportSession's output location is a file URL. Calling String(describing:) on such a URL gives you the URL string form, which includes the 'file:///' scheme prefix, and that string is not interchangeable with a plain filesystem path.
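A sketch of the distinction (reusing fileDestinationUrl from the code above):
let urlString = String(describing: fileDestinationUrl!) // "file:///var/.../sound.m4a"
let path = fileDestinationUrl!.path                     // "/var/.../sound.m4a"

// Each form round-trips only with the matching initializer:
let fromUrlString = URL(string: urlString)   // OK: parses the file:// scheme
let fromPath = URL(fileURLWithPath: path)    // OK: wraps a plain filesystem path
// URL(fileURLWithPath: urlString) would instead produce a bogus path starting with "/file:..."
So if you store String(describing: url), rebuild the URL with URL(string:), not URL(fileURLWithPath:).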

How to record a video and make it slow motion

I am working on an iPhone app for school and need some help. The app should record video, make it slow motion (about 2x), then save it to the photo library. So far I have everything except how to make the video slow motion. I know it can be done as there is already an app in the App Store that does it.
How can I take a video I've saved to a temp url and adjust the speed before saving it to the photo library?
If you need to export your video, then you need to use the AVMutableComposition class.
Add your video as an AVAsset to an AVMutableComposition and scale it with:
- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration
Finally, export it using the AVAssetExportSession class.
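In Swift, the core of that approach looks roughly like this (a minimal sketch; videoURL is assumed, error handling is omitted, and the 2x factor matches the question):
let asset = AVURLAsset(url: videoURL)
let composition = AVMutableComposition()
let videoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
try? videoTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                 of: asset.tracks(withMediaType: .video)[0], at: .zero)
// stretch the whole time range to 2x its duration -> half speed
let slowFactor: Int64 = 2
videoTrack?.scaleTimeRange(CMTimeRange(start: .zero, duration: composition.duration),
                           toDuration: CMTimeMake(value: composition.duration.value * slowFactor,
                                                  timescale: composition.duration.timescale))
// then export `composition` with AVAssetExportSession, as shown in the other answers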
I've written code that makes your video play in slow motion and saves it to the Photos library. The main thing: this code works in Swift 5. Creating a slow-motion video in iOS Swift is not easy; I came across many "slow motion" examples that turned out not to work, or whose code was deprecated. So I finally figured out a way to make slow motion in Swift.
This code can also be used for 120 fps and greater. Just add the URL of your video and it makes it slow.
Here is the code snippet I created for achieving slow motion:
import AVFoundation
import Photos

func slowMotion(pathUrl: URL) {
    let videoAsset = AVURLAsset(url: pathUrl, options: nil)
    let vdoTrack = videoAsset.tracks(withMediaType: .video)[0]
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        // copy the source video track into the editable composition
        try compositionVideoTrack?.insertTimeRange(
            CMTimeRangeMake(start: .zero, duration: videoAsset.duration),
            of: vdoTrack,
            at: .zero)
    } catch {
        // handle error
        return
    }
    // MARK: This constant (videoScaleFactor) is what achieves the slow motion you wanted.
    // It stretches the time range of the video, which is what produces the slow motion.
    // Just increase the videoScaleFactor value to play the video more slowly.
    let videoScaleFactor = 2.0
    let videoDuration = videoAsset.duration
    compositionVideoTrack?.scaleTimeRange(
        CMTimeRangeMake(start: .zero, duration: videoDuration),
        toDuration: CMTimeMake(value: videoDuration.value * Int64(videoScaleFactor), timescale: videoDuration.timescale))
    compositionVideoTrack?.preferredTransform = vdoTrack.preferredTransform
    let dirPaths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).map(\.path)
    let docsDir = dirPaths[0]
    let outputFilePath = URL(fileURLWithPath: docsDir).appendingPathComponent("slowMotion\(UUID().uuidString).mp4").path
    if FileManager.default.fileExists(atPath: outputFilePath) {
        do {
            try FileManager.default.removeItem(atPath: outputFilePath)
        } catch {
        }
    }
    let filePath = URL(fileURLWithPath: outputFilePath)
    let assetExport = AVAssetExportSession(
        asset: mixComposition,
        presetName: AVAssetExportPresetHighestQuality)
    assetExport?.outputURL = filePath
    assetExport?.outputFileType = .mp4
    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport?.status {
        case .failed:
            print("asset output media url = \(String(describing: assetExport?.outputURL))")
            print("Export session failed with error: \(String(describing: assetExport?.error))")
            DispatchQueue.main.async(execute: {
                // completion(nil);
            })
        case .completed:
            print("Successful")
            let outputURL = assetExport!.outputURL
            print("url path = \(String(describing: outputURL))")
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
            }) { saved, error in
                if saved {
                    print("video successfully saved in Photos gallery")
                }
                if error != nil {
                    print("error in saving video \(String(describing: error?.localizedDescription))")
                }
            }
            DispatchQueue.main.async(execute: {
                // completion(_filePath);
            })
        default:
            break
        }
    })
}
slowmoVideo is an OSS project which appears to do this very nicely, though I don't know that it would work on an iPhone.
It does not simply make your videos play at 0.01× speed. You can smoothly slow down and speed up your footage, optionally with motion blur. How does slow motion work? slowmoVideo tries to find out where pixels move in the video (this information is called Optical Flow), and then uses this information to calculate the additional frames.