How to play two audio files at once with Swift using AudioKit

I have two short wav audio files that I'm trying to play at the same time. Using AudioKit, I have an AudioEngine(), and I'm assuming I should use a MultiSegmentAudioPlayer() as the output along with the scheduleSegments().
Here is what I have:
import AudioKit
import AVFoundation
import Combine

class AudioPlayClass: ObservableObject {
    var player = MultiSegmentAudioPlayer()
    let engine = AudioEngine()

    init() {}

    func playFiles() {
        self.engine.output = player
        do {
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: .defaultToSpeaker)
            try AVAudioSession.sharedInstance().setActive(true)
            try engine.start()

            guard let url = Bundle.main.url(forResource: note1, withExtension: "wav", subdirectory: instrumentDirectory) else { return }
            guard let url2 = Bundle.main.url(forResource: note2, withExtension: "wav", subdirectory: instrumentDirectory) else { return }

            let audioFile = try AVAudioFile(forReading: url)
            let audioFile2 = try AVAudioFile(forReading: url2)

            let fileSampleRate = audioFile.processingFormat.sampleRate
            let file2SampleRate = audioFile2.processingFormat.sampleRate
            let fileNumberOfSamples = audioFile.length
            let file2NumberOfSamples = audioFile2.length
            let audioFileEndTime = Double(fileNumberOfSamples) / fileSampleRate
            let audioFile2EndTime = Double(file2NumberOfSamples) / file2SampleRate

            let segment1 = segment(audioFile: audioFile,
                                   playbackStartTime: 0.0, fileStartTime: 0, fileEndTime: audioFileEndTime)
            let segment2 = segment(audioFile: audioFile2,
                                   playbackStartTime: 0.0, fileStartTime: 0, fileEndTime: audioFile2EndTime)

            player.scheduleSegments(audioSegments: [segment1, segment2])
            player.play()
        } catch {
            print(error.localizedDescription)
        }
    }
}

public struct segment: StreamableAudioSegment {
    public var audioFile: AVAudioFile
    public var playbackStartTime: TimeInterval
    public var fileStartTime: TimeInterval
    public var fileEndTime: TimeInterval
    public var completionHandler: AVAudioNodeCompletionHandler?
}
I just have a basic understanding of playing audio in Swift and using AudioKit so any feedback would be greatly appreciated. Thanks!

I think MultiSegmentAudioPlayer is mostly for playing sounds sequentially. You probably just want two AudioPlayer() instances and to play them both at the same time.
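A minimal sketch of that suggestion, assuming AudioKit 5; the file-based AudioPlayer initializer and the Mixer node are assumptions here, so check the exact signatures for your AudioKit version:

import AudioKit
import AVFoundation

let engine = AudioEngine()

// url1/url2 would be the same bundle URLs as in the question
func playBoth(url1: URL, url2: URL) throws {
    let file1 = try AVAudioFile(forReading: url1)
    let file2 = try AVAudioFile(forReading: url2)

    // AudioPlayer(file:) is a failable initializer in AudioKit 5, hence the guard
    guard let player1 = AudioPlayer(file: file1),
          let player2 = AudioPlayer(file: file2) else { return }

    // Feeding both players into one Mixer lets a single engine output them together
    engine.output = Mixer(player1, player2)
    try engine.start()

    player1.play()
    player2.play()
}

With both players on one Mixer, a single engine drives both files at once.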

Solved this by creating two separate AudioEngine() instances, each with its own AudioPlayer(), and loading and playing them immediately one after the other.

Related

How do I load a Data object into a SCNScene?

I want to load a 3D usdz blob into a view, but since I only have the Data object, I'm trying to initialize the scene from that, with no luck.
To do that, I initialize an SCNSceneSource() and then open it using .scene().
Now what I don't understand:
If I use a URL and load the scene directly, it works.
If I use a Data object read from the same URL, it doesn't.
The Apple docs say the data should be of type NSData, but that seems wrong.
import SceneKit

let url = URL(string: "file:///Users/thilo/Desktop/Input/UU2.usdz")!

// working
let src_ok = SCNSceneSource(url: url)
let scn_ok = src_ok?.scene(options: nil, statusHandler: { a, b, c, d in
    print("OK: \(a) \(b) \(String(describing: c)) \(d)")
})
print("Ok: \(scn_ok)")

// Not working?
let data = try! Data(contentsOf: url)
let src_bad = SCNSceneSource(data: data)
let scn_bad = src_bad?.scene(options: nil, statusHandler: { a, b, c, d in
    print("BAD: \(a) \(b) \(String(describing: c)) \(d)")
})
print("Failed: \(scn_bad)")
Running it in a Playground prints:
Ok: Optional(<SCNScene: 0x6000038e1200>)
BAD: 0.0 SCNSceneSourceStatus(rawValue: 4) nil 0x000000016fa948bf
BAD: 0.0 SCNSceneSourceStatus(rawValue: 4) nil 0x000000016fa942af
BAD: 0.0 SCNSceneSourceStatus(rawValue: -1) Optional(Error Domain=NSCocoaErrorDomain Code=260 "Could not load the scene" UserInfo={NSLocalizedDescription=Could not load the scene, NSLocalizedRecoverySuggestion=An error occurred while parsing the COLLADA file. Please check that it has not been corrupted.}) 0x000000016fa942af
Failed: nil
What am I missing?
SCNSceneSource doesn't support .usdz in Data context
The official documentation says that an SCNSceneSource object supports only the .scn, .dae and .abc file formats. SceneKit can still load .usdz through a URL; it's only when initializing from Data that .usdz fails. So when working with Data, use files in the .scn format.
import SceneKit
import Cocoa
class GameViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        if let url = URL(string: "file:///Users/swift/Desktop/ship.scn") {
            let data = try! Data(contentsOf: url)
            let source = SCNSceneSource(data: data)
            let sceneView = self.view as! SCNView
            sceneView.scene = source?.scene()
        }
    }
}
To load .usdz using URL, try SCNSceneSource.init?(url: URL)
class GameViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        if let url = URL(string: "file:///Users/swift/Desktop/ship.usdz") {
            let source = SCNSceneSource(url: url)
            let sceneView = self.view as! SCNView
            sceneView.scene = source?.scene()
        }
    }
}
Or use an SCNScene object to load the .usdz model:
class GameViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let url = URL(fileURLWithPath: "/Users/swift/Desktop/ship.usdz")
        do {
            let scene = try SCNScene(url: url)
            let sceneView = self.view as! SCNView
            sceneView.scene = scene
            sceneView.autoenablesDefaultLighting = true
        } catch {
            print(error.localizedDescription)
        }
    }
}
Gathering from the comment that SCNSceneSource "does not support usdz" in a Data context, my solution is to write the data to a temporary .usdz file (a file URL seems to be required by the API) and then manually remove the temporary file after loading.
First extend FileManager with the below code:
public extension FileManager {
    func temporaryFileURL(fileName: String = UUID().uuidString, ext: String) -> URL? {
        return URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true)
            .appendingPathComponent(fileName + ext)
    }
}
For a limited hard-coded use case:
let fm = FileManager.default
let tempusdz = fm.temporaryFileURL(ext: ".usdz")!
fm.createFile(atPath: tempusdz.path(), contents: sceneData)
let src = SCNSceneSource(url: tempusdz)
if let scene = src?.scene(options: nil) {
    // ...
}
try? fm.removeItem(at: tempusdz)
Of course this is a hack, because it will only work if the data is in usdz format.
Since usdz is a ZIP archive, maybe testing for the ZIP signature and only then doing the below is a better option:
let sceneData: Data? = data
var sceneSrc: SCNSceneSource? = nil
var tempURL: URL? = nil

if let dataStart = sceneData?.subdata(in: 0..<4),
   let dataMagic = String(data: dataStart, encoding: String.Encoding.utf8) as String?,
   dataMagic == "PK\u{3}\u{4}" {
    let fm = FileManager.default
    tempURL = fm.temporaryFileURL(ext: ".usdz")
    if let tempURL {
        fm.createFile(atPath: tempURL.path(), contents: sceneData)
        sceneSrc = SCNSceneSource(url: tempURL)
    }
} else {
    sceneSrc = SCNSceneSource(data: sceneData!)
}

let scene = sceneSrc?.scene()

if let tempURL {
    try? FileManager.default.removeItem(at: tempURL)
}
Does anyone know a better solution?
Is there an easy way to check the type of the Data?
A potential solution could be to verify the format of the data object and ensure that it is a valid COLLADA file.
import Foundation
let url = URL(string: "file:///Users/thilo/Desktop/Input/UU2.usdz")!
let data = try! Data(contentsOf: url)
print("Data size: \(data.count)")
print("Data format: \(data.description)")
You usually get these types of errors when the data isn't properly formatted.
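On the "easy way to check the type of the Data" question: a small sketch that reuses the ZIP signature check from the answer above; the helper name is just illustrative.

import Foundation

// usdz files are ZIP archives, so the first four bytes are "PK\u{3}\u{4}" (0x50 0x4B 0x03 0x04)
func looksLikeZipArchive(_ data: Data) -> Bool {
    guard data.count >= 4 else { return false }
    return data.prefix(4) == Data([0x50, 0x4B, 0x03, 0x04])
}

let isUSDZCandidate = looksLikeZipArchive(data)   // `data` as in the question above
print("ZIP/usdz signature present: \(isUSDZCandidate)")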

[Swift] I want to instantly save the sound with AVAudioEngine's effect as a file

I'm creating a process to read an existing audio file, add an effect using AVAudioEngine, and then save it as another audio file.
However, with the following method using an AVAudioPlayerNode, the save process must wait until the end of playback.
import UIKit
import AVFoundation

class ViewController: UIViewController {
    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()
    let reverbNode = AVAudioUnitReverb()

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            let url = URL(fileURLWithPath: Bundle.main.path(forResource: "original", ofType: "mp3")!)
            let file = try AVAudioFile(forReading: url)

            // playerNode
            engine.attach(playerNode)

            // reverbNode
            reverbNode.loadFactoryPreset(.largeChamber)
            reverbNode.wetDryMix = 5.0
            engine.attach(reverbNode)

            engine.connect(playerNode, to: reverbNode, format: file.processingFormat)
            engine.connect(reverbNode, to: engine.mainMixerNode, format: file.processingFormat)

            playerNode.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { [self] _ in
                reverbNode.removeTap(onBus: 0)
            }

            // start
            try engine.start()
            playerNode.play()

            let url2 = URL(fileURLWithPath: fileInDocumentsDirectory(filename: "changed.wav"))
            let outputFile = try! AVAudioFile(forWriting: url2, settings: playerNode.outputFormat(forBus: 0).settings)

            reverbNode.installTap(onBus: 0, bufferSize: AVAudioFrameCount(reverbNode.outputFormat(forBus: 0).sampleRate), format: reverbNode.outputFormat(forBus: 0)) { (buffer, when) in
                do {
                    try outputFile.write(from: buffer)
                } catch let error {
                    print(error)
                }
            }
        } catch {
            print(error.localizedDescription)
        }
    }

    func getDocumentsURL() -> NSURL {
        let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as NSURL
        return documentsURL
    }

    func fileInDocumentsDirectory(filename: String) -> String {
        let fileURL = getDocumentsURL().appendingPathComponent(filename)
        return fileURL!.path
    }
}
Is there a way to complete the writing without waiting for playback to finish? Ideally the write would only take as long as the CPU and storage allow.
It seems that
reverbNode.installTap(...) { (buffer, when) in ... }
in the code is processed in step with the current playback position, but I would like to dramatically improve the processing speed.
Best regards.
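A possible direction is AVAudioEngine's offline manual rendering mode, which pulls audio through the graph as fast as the CPU allows instead of in real time. Below is a minimal sketch built around the same player/reverb graph; sourceURL and destinationURL are placeholders, and it assumes a throwing context (wrap it in do/catch).

import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let reverbNode = AVAudioUnitReverb()

let sourceFile = try AVAudioFile(forReading: sourceURL)   // sourceURL: your input file

engine.attach(playerNode)
engine.attach(reverbNode)
reverbNode.loadFactoryPreset(.largeChamber)
reverbNode.wetDryMix = 5.0
engine.connect(playerNode, to: reverbNode, format: sourceFile.processingFormat)
engine.connect(reverbNode, to: engine.mainMixerNode, format: sourceFile.processingFormat)
playerNode.scheduleFile(sourceFile, at: nil, completionHandler: nil)

// Put the engine into offline manual rendering mode *before* starting it.
let maxFrames: AVAudioFrameCount = 4096
try engine.enableManualRenderingMode(.offline,
                                     format: sourceFile.processingFormat,
                                     maximumFrameCount: maxFrames)
try engine.start()
playerNode.play()

// destinationURL: e.g. a .wav in the Documents directory; PCM settings so AVAudioFile can write it
let outputFile = try AVAudioFile(forWriting: destinationURL,
                                 settings: engine.manualRenderingFormat.settings)
let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!

// Pull audio through the graph in a loop; each iteration renders up to maxFrames frames.
renderLoop: while engine.manualRenderingSampleTime < sourceFile.length {
    let framesLeft = sourceFile.length - engine.manualRenderingSampleTime
    let framesToRender = min(AVAudioFrameCount(framesLeft), buffer.frameCapacity)
    let status = try engine.renderOffline(framesToRender, to: buffer)
    switch status {
    case .success:
        try outputFile.write(from: buffer)        // buffer.frameLength is set by the render call
    case .insufficientDataFromInputNode, .cannotDoInCurrentContext:
        continue                                  // try again on the next iteration
    case .error:
        print("manual rendering error")
        break renderLoop
    @unknown default:
        break renderLoop
    }
}

playerNode.stop()
engine.stop()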

How to Play Audio Using AVAudioEngine and AVAudioPlayerNode in Swift?

I'm trying to play audio using AVAudioEngine and AVAudioPlayerNode. However, in "engine.connect...", I get this:
"PlaySound[21867:535102] [AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData: AudioObjectGetPropertyData: no object with given ID 0"
The code after "engine.connect..." seems to run because it prints "buffer?.format", but there's no sound.
Could someone let me know what I'm missing? Here's my testing code.
let audioNode = AVAudioPlayerNode()
let path = Bundle.main.path(forResource: "Sounds/Test.mp3", ofType: nil)
let url = URL(fileURLWithPath: path!)
let file = try! AVAudioFile(forReading: url)
let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: AVAudioFrameCount(file.length))
try! file.read(into:buffer!)
let engine = AVAudioEngine()
engine.attach(audioNode)
engine.connect(audioNode, to: engine.mainMixerNode, format: buffer?.format)
// PlaySound[21867:535102] [AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData: AudioObjectGetPropertyData: no object with given ID 0
engine.prepare()
try! engine.start()
audioNode.play()
audioNode.scheduleBuffer(buffer!, at: nil, options: .loops, completionHandler: nil)
debugPrint(buffer?.format)
// Optional(<AVAudioFormat 0x60000212d1d0: 1 ch, 44100 Hz, Float32>)
In engine.connect(), I also tried engine.outputNode.outputFormat(forBus: 0) instead of buffer?.format, with no luck.
It works if I put the engine outside the function, as a property, so it isn't deallocated as soon as the setup code returns:
class ViewController: NSViewController {
    let engine = AVAudioEngine()
    override func viewDidLoad() {
        // ... same setup as above, using the stored engine
    }
}

Play an audio file using Swift for MacOS

I'm trying to simply play a file (in the main bundle or on the disk) using AVAudioFile, AVAudioEngine and AVAudioPlayerNode.
Here is what I'm doing:
import Foundation
import AppKit
import AudioToolbox
import AVFoundation
struct readFile {
    static var arrayFloatValues: [Float] = []
    static var points: [CGFloat] = []
}

class AudioAnalisys: NSObject {
    class func open_audiofile() {
        let audioEngine: AVAudioEngine = AVAudioEngine()
        let audioPlayer: AVAudioPlayerNode = AVAudioPlayerNode()

        //get where the file is
        let url = Bundle.main.url(forResource: "TeamPlaylist", withExtension: "mp3")
        //put it in an AVAudioFile
        let audioFile = try! AVAudioFile(forReading: url!)
        //Get the audio file format
        //let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: file.fileFormat.sampleRate, channels: file.fileFormat.channelCount, interleaved: false)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        //how many channels?
        print(audioFile.fileFormat.channelCount)
        print(audioFrameCount)
        //Setup the buffer for audio data
        let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(audioFile.length))
        //put audio data in the buffer
        try! audioFile.read(into: audioFileBuffer!)
        //readFile.arrayFloatValues = Array(UnsafeBufferPointer(start: audioFileBuffer!.floatChannelData?[0], count: Int(audioFileBuffer!.frameLength)))
        //Init engine and player
        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attach(audioPlayer)
        audioEngine.connect(audioPlayer, to: mainMixer, format: audioFileBuffer!.format)
        audioPlayer.scheduleBuffer(audioFileBuffer!, completionHandler: nil)
        audioEngine.prepare()
        do {
            try audioEngine.start()
            print("engine started")
        } catch let error {
            print(error.localizedDescription)
        }
        audioPlayer.play()
    }
}
I can see the channel count and the frame count, but I can't hear anything. What am I doing wrong?
Here is what I get in the console:
2
17414784
Optional(0x00006080000006c0)
2018-10-09 21:21:02.161593+0200 spectrum[1668:327525] [AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData: AudioObjectGetPropertyData: no object with given ID 0
engine started
2018-10-09 21:21:02.594136+0200 spectrum[1668:327593] MessageTracer: Falling back to default whitelist
Here is the answer: Can't play file from documents in AVAudioPlayer. The engine and the player node were local to open_audiofile(), so they were deallocated as soon as it returned; keeping them alive (here as file-scope constants) fixes it, which leads to:
import Foundation
import AppKit
import AudioToolbox
import AVFoundation

struct readFile {
    static var arrayFloatValues: [Float] = []
    static var points: [CGFloat] = []
}

let audioEngine: AVAudioEngine = AVAudioEngine()
let audioPlayer: AVAudioPlayerNode = AVAudioPlayerNode()

class AudioAnalisys: NSObject {
    class func open_audiofile() {
        //get where the file is
        let url = Bundle.main.url(forResource: "TeamPlaylist", withExtension: "mp3")
        //put it in an AVAudioFile
        let audioFile = try! AVAudioFile(forReading: url!)
        //Get the audio file format
        //let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: file.fileFormat.sampleRate, channels: file.fileFormat.channelCount, interleaved: false)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        //how many channels?
        print(audioFile.fileFormat.channelCount)
        print(audioFrameCount)
        //Setup the buffer for audio data
        let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(audioFile.length))
        //put audio data in the buffer
        try! audioFile.read(into: audioFileBuffer!)
        //readFile.arrayFloatValues = Array(UnsafeBufferPointer(start: audioFileBuffer!.floatChannelData?[0], count: Int(audioFileBuffer!.frameLength)))
        //Init engine and player
        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attach(audioPlayer)
        audioEngine.connect(audioPlayer, to: mainMixer, format: audioFileBuffer!.format)
        audioPlayer.scheduleBuffer(audioFileBuffer!, completionHandler: nil)
        audioEngine.prepare()
        do {
            try audioEngine.start()
            print("engine started")
        } catch let error {
            print(error.localizedDescription)
        }
        audioPlayer.play()
    }
}

Mac - Swift 3 - queuing audio files and playing

I would like to write an app in Swift 3 that plays queued audio files without any gap, crackle or noise when passing from one to the next.
My first try was using AVAudioPlayer and AVAudioPlayerDelegate (AVAudioPlayer using array to queue audio files - Swift), but I don't know how to preload the next song to avoid the gap. Even if I knew how to do it, I am not certain it is the best way to achieve my goal.
AVQueuePlayer seems to be a better candidate for the job; it is made for that purpose, but I can't find any example to help me out.
Maybe it is only a problem of preloading or buffering? I am a bit lost in this ocean of possibilities.
Any suggestion is welcomed.
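For the AVQueuePlayer route, a minimal sketch; the file names are placeholders, and you should keep a strong reference to the player (e.g. as a property) so it isn't deallocated while playing.

import AVFoundation

// Build one AVPlayerItem per file and hand them to AVQueuePlayer, which preloads the next item.
let urls = ["audiofile1", "audiofile2"].compactMap {
    Bundle.main.url(forResource: $0, withExtension: "aif")
}
let items = urls.map { AVPlayerItem(url: $0) }
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()

AVQueuePlayer preloading usually keeps transitions tight, though truly gapless playback still depends on how the assets were encoded.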
It is far from perfect, especially if you want to do it twice or more (you get a "file exists" error), but it can serve as a base.
What it does is take two files (mine are .aif samples of about 4 seconds), encode them into one file, and play the resulting file. If you have hundreds of them, assembled randomly or not, it can make for great fun.
All credit for the mergeAudioFiles function goes to @Peyman and @Pigeon_39: Concatenate two audio files in Swift and play them.
Swift 3
import Cocoa
import AVFoundation
var action = AVAudioPlayer()
let path = Bundle.main.path(forResource: "audiofile1.aif", ofType: nil)!
let url = URL(fileURLWithPath: path)
let path2 = Bundle.main.path(forResource: "audiofile2.aif", ofType: nil)!
let url2 = URL(fileURLWithPath: path2)
let array1 = NSMutableArray(array: [url, url2])

class ViewController: NSViewController, AVAudioPlayerDelegate {
    @IBOutlet weak var LanceStop: NSButton!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override var representedObject: Any? {
        didSet {
            // Update the view, if already loaded.
        }
    }

    @IBAction func Lancer(_ sender: NSButton) {
        mergeAudioFiles(audioFileUrls: array1)
        let url3 = NSURL(string: "/Users/ADDUSERNAMEHERE/Documents/FinalAudio.m4a")
        do {
            action = try AVAudioPlayer(contentsOf: url3 as! URL)
            action.delegate = self
            action.numberOfLoops = 0
            action.prepareToPlay()
            action.volume = 1
            action.play()
        } catch {
            print("error")
        }
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        if flag == true {
        }
    }

    var mergeAudioURL = NSURL()

    func mergeAudioFiles(audioFileUrls: NSArray) {
        //audioFileUrls.adding(url)
        //audioFileUrls.adding(url2)
        let composition = AVMutableComposition()

        for i in 0 ..< audioFileUrls.count {
            let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
            let asset = AVURLAsset(url: (audioFileUrls[i] as! NSURL) as URL)
            let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0]
            let timeRange = CMTimeRange(start: CMTimeMake(0, 600), duration: track.timeRange.duration)
            try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration)
        }

        let documentDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first! as NSURL
        self.mergeAudioURL = documentDirectoryURL.appendingPathComponent("FinalAudio.m4a")! as URL as NSURL

        let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
        assetExport?.outputFileType = AVFileTypeAppleM4A
        assetExport?.outputURL = mergeAudioURL as URL
        assetExport?.exportAsynchronously(completionHandler: {
            switch assetExport!.status {
            case AVAssetExportSessionStatus.failed:
                print("failed \(assetExport?.error)")
            case AVAssetExportSessionStatus.cancelled:
                print("cancelled \(assetExport?.error)")
            case AVAssetExportSessionStatus.unknown:
                print("unknown \(assetExport?.error)")
            case AVAssetExportSessionStatus.waiting:
                print("waiting \(assetExport?.error)")
            case AVAssetExportSessionStatus.exporting:
                print("exporting \(assetExport?.error)")
            default:
                print("Audio Concatenation Complete")
            }
        })
    }
}
}