Change BPM in real time with AVAudioEngine using Swift

Hello, I am trying to implement a simple audio app using AVAudioEngine that plays a short WAV file in a loop at a given BPM, which can be changed in real time (by a slider or something similar).
Current solution logic:
set bpm=60
create audioFile from sample.wav
calculate bufferSize: AVAudioFrameCount(audioFile.processingFormat.sampleRate * 60 / Double(bpm))
set bufferSize on audioBuffer
read audioFile into audioBuffer
schedule audioBuffer to play
This works, but the problem is that changing the bpm requires recreating the buffer with a different bufferSize, so the change is not in real time: I have to stop the player and reschedule a buffer of the new length.
Any thoughts on how this can be done?
Thanks in advance!
Code (main part):
var bpm: Float = 30
let engine = AVAudioEngine()
var player = AVAudioPlayerNode()
var audioBuffer: AVAudioPCMBuffer?
var audioFile: AVAudioFile?

override func viewDidLoad() {
    super.viewDidLoad()
    audioFile = loadfile(from: "sound.wav")
    audioBuffer = tickBuffer(audioFile: audioFile!)
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: audioFile?.processingFormat)
    do {
        engine.prepare()
        try engine.start()
    } catch {
        print(error)
    }
}

private func loadfile(from fileName: String) -> AVAudioFile? {
    let path = Bundle.main.path(forResource: fileName, ofType: nil)!
    let url = URL(fileURLWithPath: path)
    do {
        let audioFile = try AVAudioFile(forReading: url)
        return audioFile
    } catch {
        print("Error loading buffer1 \(error)")
    }
    return nil
}

func tickBuffer(audioFile: AVAudioFile) -> AVAudioPCMBuffer {
    let periodLength = AVAudioFrameCount(audioFile.processingFormat.sampleRate * 60 / Double(bpm))
    let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat, frameCapacity: periodLength)!
    try! audioFile.read(into: buffer)
    buffer.frameLength = periodLength
    return buffer
}
func play() {
    player.scheduleBuffer(audioBuffer!, at: nil, options: .loops, completionHandler: nil)
    player.play()
}

func stop() {
    player.stop()
}
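
One way this could be done (a sketch only, not from the original post): instead of looping one buffer whose length encodes the tempo, schedule each tick as its own buffer at a sample-accurate player time and recompute the gap to the next tick from the current bpm each time. The tick buffer then only needs to contain the sound itself (no silence padding), and a slider change takes effect on the next beat without stopping the player. nextBeatSampleTime, scheduleBeats and startMetronome below are assumed new members, not part of the posted code.

// Sketch: schedule each beat separately and read `bpm` when computing the next beat time.
// `audioBuffer` is assumed to hold just the tick sound, not a full beat-length buffer.
var nextBeatSampleTime: AVAudioFramePosition = 0

func scheduleBeats(_ count: Int = 2) {
    guard let buffer = audioBuffer, let file = audioFile else { return }
    let sampleRate = file.processingFormat.sampleRate
    for _ in 0..<count {
        let when = AVAudioTime(sampleTime: nextBeatSampleTime, atRate: sampleRate)
        player.scheduleBuffer(buffer, at: when, options: []) { [weak self] in
            // Each finished beat schedules one more, so the metronome keeps running.
            DispatchQueue.main.async { self?.scheduleBeats(1) }
        }
        // Reading bpm here means a slider change applies to the next scheduled beat.
        nextBeatSampleTime += AVAudioFramePosition(sampleRate * 60.0 / Double(bpm))
    }
}

func startMetronome() {
    nextBeatSampleTime = 0
    scheduleBeats()
    player.play()
}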

Related

Recording speech synthesis to a saved file

Below is the code I've put together to attempt to take a phrase, save it to a file, and then play that saved file. I'm not sure which part isn't working (incorrect file name, not saving the file, or not finding the file). Any help would be appreciated. (speakPhrase is just a helper function to confirm that the speech synthesizer actually works, which it does.)
import AVFoundation
import Foundation

class Coordinator {
    let synthesizer: AVSpeechSynthesizer
    var player: AVAudioPlayer?

    init() {
        let synthesizer = AVSpeechSynthesizer()
        self.synthesizer = synthesizer
    }

    var recordingPath: URL {
        let soundName = "Finally.caf"
        // I've tried numerous file extensions. .caf was in an answer somewhere else. I would think it would be
        // .pcm, but that doesn't work either.
        // Local Directory
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0].appendingPathComponent(soundName)
    }

    func speakPhrase(phrase: String) {
        let utterance = AVSpeechUtterance(string: phrase)
        utterance.voice = AVSpeechSynthesisVoice(language: "en")
        synthesizer.speak(utterance)
    }

    func playFile() {
        print("Trying to play the file")
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
            try AVAudioSession.sharedInstance().setActive(true)
            player = try AVAudioPlayer(contentsOf: recordingPath, fileTypeHint: AVFileType.caf.rawValue)
            guard let player = player else { return }
            player.play()
        } catch {
            print("Error playing file.")
        }
    }

    func saveAVSpeechUtteranceToFile() {
        let utterance = AVSpeechUtterance(string: "This is speech to record")
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = 0.50
        synthesizer.write(utterance) { [self] (buffer: AVAudioBuffer) in
            guard let pcmBuffer = buffer as? AVAudioPCMBuffer else {
                fatalError("unknown buffer type: \(buffer)")
            }
            if pcmBuffer.frameLength == 0 {
                // Done
            } else {
                // append buffer to file
                do {
                    let audioFile = try AVAudioFile(forWriting: recordingPath, settings: pcmBuffer.format.settings, commonFormat: .pcmFormatInt16, interleaved: false)
                    try audioFile.write(from: pcmBuffer)
                } catch {
                    print(error.localizedDescription)
                }
            }
        }
    }
}
Did you notice that the bufferCallback in the function below is called multiple times?
func write(_ utterance: AVSpeechUtterance, toBufferCallback bufferCallback: @escaping AVSpeechSynthesizer.BufferCallback)
So the root cause is pretty simple: the AVSpeechUtterance's audio is divided into multiple parts. On my iPhone, the callback is called about 20 times.
So if you create a new audio file in the closure every time, you end up with a very tiny audio file (on my iPhone it was about 6 KB), which is barely audible if you play it.
So replace the function with:
func saveAVSpeechUtteranceToFile() {
    let utterance = AVSpeechUtterance(string: "This is speech to record")
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = 0.50

    // Only create a new file handle if `output` is nil.
    var output: AVAudioFile?

    synthesizer.write(utterance) { [self] (buffer: AVAudioBuffer) in
        guard let pcmBuffer = buffer as? AVAudioPCMBuffer else {
            fatalError("unknown buffer type: \(buffer)")
        }
        if pcmBuffer.frameLength == 0 {
            // Done
        } else {
            do {
                // This closure is called multiple times, so to save the complete audio, create the file only once.
                if output == nil {
                    output = try AVAudioFile(
                        forWriting: recordingPath,
                        settings: pcmBuffer.format.settings,
                        commonFormat: .pcmFormatInt16,
                        interleaved: false)
                }
                try output?.write(from: pcmBuffer)
            } catch {
                print(error.localizedDescription)
            }
        }
    }
}
BTW, I uploaded a GitHub demo here.
Finally, here is how to inspect the file contents on an iOS device:
Xcode Window menu -> Devices and Simulators, then proceed as below to copy out your app's container.

[Swift] I want to instantly save the sound with AVAudioEngine's effect as a file

I'm creating a process to read an existing audio file, add an effect using AVAudioEngine, and then save it as another audio file.
However, with the following method using an AVAudioPlayerNode, the save process must wait until the end of playback.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()
    let reverbNode = AVAudioUnitReverb()

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            let url = URL(fileURLWithPath: Bundle.main.path(forResource: "original", ofType: "mp3")!)
            let file = try AVAudioFile(forReading: url)

            // playerNode
            engine.attach(playerNode)

            // reverbNode
            reverbNode.loadFactoryPreset(.largeChamber)
            reverbNode.wetDryMix = 5.0
            engine.attach(reverbNode)

            engine.connect(playerNode, to: reverbNode, format: file.processingFormat)
            engine.connect(reverbNode, to: engine.mainMixerNode, format: file.processingFormat)

            playerNode.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { [self] _ in
                reverbNode.removeTap(onBus: 0)
            }

            // start
            try engine.start()
            playerNode.play()

            let url2 = URL(fileURLWithPath: fileInDocumentsDirectory(filename: "changed.wav"))
            let outputFile = try! AVAudioFile(forWriting: url2, settings: playerNode.outputFormat(forBus: 0).settings)

            reverbNode.installTap(onBus: 0, bufferSize: AVAudioFrameCount(reverbNode.outputFormat(forBus: 0).sampleRate), format: reverbNode.outputFormat(forBus: 0)) { (buffer, when) in
                do {
                    try outputFile.write(from: buffer)
                } catch let error {
                    print(error)
                }
            }
        } catch {
            print(error.localizedDescription)
        }
    }

    func getDocumentsURL() -> NSURL {
        let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as NSURL
        return documentsURL
    }

    func fileInDocumentsDirectory(filename: String) -> String {
        let fileURL = getDocumentsURL().appendingPathComponent(filename)
        return fileURL!.path
    }
}
Is there a way to finish writing without waiting for playback to complete? Ideally, the write would take only as long as CPU and storage performance require.
It seems that
reverbNode.installTap(...) { (buffer, when) in ... }
in the code is processed in parallel with the current playback position, but I would like to dramatically improve the processing speed.
Best regards.
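
One way to write faster than real time (a sketch, not from the original post) is AVAudioEngine's offline manual rendering mode: rather than tapping reverbNode during live playback, the engine is switched to .offline rendering and audio is pulled through the graph in a loop as fast as CPU and storage allow. The function below assumes the same playerNode/reverbNode graph is already attached and connected, and that the caller supplies an outputURL; names and buffer sizes are illustrative only.

// Sketch only: render the playerNode -> reverbNode graph offline, faster than real time.
func renderOffline(file: AVAudioFile, to outputURL: URL) throws {
    // Manual rendering must be enabled while the engine is stopped, before starting it again.
    engine.stop()
    let format = engine.outputNode.outputFormat(forBus: 0)
    try engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)
    try engine.start()

    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    playerNode.play()

    let renderBuffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                        frameCapacity: engine.manualRenderingMaximumFrameCount)!
    let outputFile = try AVAudioFile(forWriting: outputURL, settings: file.fileFormat.settings)

    // Pull audio through the graph until the whole source file has been rendered.
    while engine.manualRenderingSampleTime < file.length {
        let framesLeft = file.length - engine.manualRenderingSampleTime
        let framesToRender = min(AVAudioFrameCount(framesLeft), renderBuffer.frameCapacity)
        switch try engine.renderOffline(framesToRender, to: renderBuffer) {
        case .success:
            try outputFile.write(from: renderBuffer)
        case .insufficientDataFromInputNode, .cannotDoInCurrentContext:
            continue  // not enough data yet; try again on the next iteration
        case .error:
            print("Manual rendering error")
            return
        @unknown default:
            break
        }
    }

    playerNode.stop()
    engine.stop()
    engine.disableManualRenderingMode()
}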

AVAudioEngine doesn't playback a sound

I am trying to use AVAudioEngine to play back a WAV file. I tried it a few different ways, but nothing works.
Try 1
...
private var audioEngine: AVAudioEngine = AVAudioEngine()
private var mixer: AVAudioMixerNode = AVAudioMixerNode()
private var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

func Play1() {
    guard let filePath = Bundle.main.url(forResource: "testwav", withExtension: "wav", subdirectory: "res") else {
        print("file not found")
        return
    }
    print("\(filePath)")

    guard let audioFile = try? AVAudioFile(forReading: filePath) else { return }
    let audioFormat = audioFile.processingFormat
    let audioFrameCount = UInt32(audioFile.length)
    guard let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount) else { return }
    do {
        try audioFile.read(into: audioFileBuffer)
    } catch {
        print("over")
    }

    let mainMixer = audioEngine.mainMixerNode
    audioEngine.attach(audioFilePlayer)
    audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
    audioEngine.connect(mainMixer, to: audioEngine.outputNode, format: audioFileBuffer.format)

    try? audioEngine.start()

    audioFilePlayer.play()
    audioFilePlayer.scheduleBuffer(audioFileBuffer, at: nil, options: AVAudioPlayerNodeBufferOptions.loops)
}
...
Try 2
...
private var audioEngine: AVAudioEngine = AVAudioEngine()
private var mixer: AVAudioMixerNode = AVAudioMixerNode()
private var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

func Play2() {
    DispatchQueue.global(qos: .background).async {
        self.audioEngine.attach(self.mixer)
        self.audioEngine.connect(self.mixer, to: self.audioEngine.outputNode, format: nil)
        // !important - start the engine *before* setting up the player nodes
        try! self.audioEngine.start()

        let audioPlayer = AVAudioPlayerNode()
        self.audioEngine.attach(audioPlayer)
        // Notice the output is the mixer in this case
        self.audioEngine.connect(audioPlayer, to: self.mixer, format: nil)

        guard let fileUrl = Bundle.main.url(forResource: "testwav", withExtension: "wav", subdirectory: "res") else {
            // guard let url = Bundle.main.url(forResource: "audiotest", withExtension: "mp3", subdirectory: "res") else {
            print("mp3 not found")
            return
        }
        do {
            let file = try AVAudioFile(forReading: fileUrl)
            audioPlayer.scheduleFile(file, at: nil, completionHandler: nil)
            audioPlayer.play(at: nil)
        } catch let error {
            print(error.localizedDescription)
        }
    }
}
...
Try 3
...
private var audioEngine: AVAudioEngine = AVAudioEngine()
private var mixer: AVAudioMixerNode = AVAudioMixerNode()
private var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

func Play3() {
    DispatchQueue.global(qos: .background).async {
        self.audioEngine = AVAudioEngine()
        _ = self.audioEngine.mainMixerNode

        self.audioEngine.prepare()
        do {
            try self.audioEngine.start()
        } catch {
            print(error)
        }

        guard let url = Bundle.main.url(forResource: "testwav", withExtension: "wav", subdirectory: "res") else {
            // guard let url = Bundle.main.url(forResource: "audiotest", withExtension: "mp3", subdirectory: "res") else {
            print("mp3 not found")
            return
        }

        let player = AVAudioPlayerNode()
        player.volume = 1.0

        do {
            let audioFile = try AVAudioFile(forReading: url)
            let format = audioFile.processingFormat
            print(format)
            self.audioEngine.attach(player)
            self.audioEngine.connect(player, to: self.audioEngine.mainMixerNode, format: format)
            player.scheduleFile(audioFile, at: nil, completionHandler: nil)
        } catch let error {
            print(error.localizedDescription)
        }
        player.play()
    }
}
...
It should also be mentioned that there are no errors; while debugging I can see that all the methods are executed and everything looks fine, but I don't hear any sound...
What am I doing wrong?
Try activating your audio session with the following method:
func setActive(_ active: Bool, options: AVAudioSession.SetActiveOptions = []) throws
Please note that if another active audio session has higher priority than yours (for example, a phone call), and neither audio session allows mixing, attempting to activate your audio session fails. Deactivating an audio session that still has running audio objects stops them, deactivates the session, and returns an AVAudioSession.ErrorCode.isBusy error.
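As a minimal sketch (not from the original answer), activating the session for playback before starting the engine could look like this on iOS; activatePlaybackSession is a hypothetical helper you would call at the top of Play1/Play2/Play3:

import AVFoundation

// Configure and activate the shared audio session for playback.
func activatePlaybackSession() {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Failed to activate audio session: \(error)")
    }
}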

Play an audio file using Swift for macOS

I'm trying to simply play a file (in the main bundle or on the disk) using AVAudioFile, AVAudioEngine and AVAudioPlayerNode.
Here is what I'm doing:
import Foundation
import AppKit
import AudioToolbox
import AVFoundation

struct readFile {
    static var arrayFloatValues: [Float] = []
    static var points: [CGFloat] = []
}

class AudioAnalisys: NSObject {

    class func open_audiofile() {
        let audioEngine: AVAudioEngine = AVAudioEngine()
        let audioPlayer: AVAudioPlayerNode = AVAudioPlayerNode()

        // get where the file is
        let url = Bundle.main.url(forResource: "TeamPlaylist", withExtension: "mp3")
        // put it in an AVAudioFile
        let audioFile = try! AVAudioFile(forReading: url!)
        // Get the audio file format
        //let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: file.fileFormat.sampleRate, channels: file.fileFormat.channelCount, interleaved: false)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        // how many channels?
        print(audioFile.fileFormat.channelCount)
        print(audioFrameCount)

        // Setup the buffer for audio data
        let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(audioFile.length))
        // put audio data in the buffer
        try! audioFile.read(into: audioFileBuffer!)
        //readFile.arrayFloatValues = Array(UnsafeBufferPointer(start: audioFileBuffer!.floatChannelData?[0], count: Int(audioFileBuffer!.frameLength)))

        // Init engine and player
        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attach(audioPlayer)
        audioEngine.connect(audioPlayer, to: mainMixer, format: audioFileBuffer!.format)
        audioPlayer.scheduleBuffer(audioFileBuffer!, completionHandler: nil)
        audioEngine.prepare()
        do {
            try audioEngine.start()
            print("engine started")
        } catch let error {
            print(error.localizedDescription)
        }
        audioPlayer.play()
    }
}
I can see the channel count and the frame count, but I can't hear anything. What am I doing wrong?
Here is what I get in the console:
2
17414784
Optional(0x00006080000006c0)
2018-10-09 21:21:02.161593+0200 spectrum[1668:327525] [AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData: AudioObjectGetPropertyData: no object with given ID 0
engine started
2018-10-09 21:21:02.594136+0200 spectrum[1668:327593] MessageTracer: Falling back to default whitelist
Here is the answer: Can't play file from documents in AVAudioPlayer.
The engine and player node are declared inside open_audiofile(), so they are deallocated as soon as the function returns and playback is never heard. Moving audioEngine and audioPlayer out to a longer-lived scope fixes it, which leads to:
import Foundation
import AppKit
import AudioToolbox
import AVFoundation

struct readFile {
    static var arrayFloatValues: [Float] = []
    static var points: [CGFloat] = []
}

let audioEngine: AVAudioEngine = AVAudioEngine()
let audioPlayer: AVAudioPlayerNode = AVAudioPlayerNode()

class AudioAnalisys: NSObject {

    class func open_audiofile() {
        // get where the file is
        let url = Bundle.main.url(forResource: "TeamPlaylist", withExtension: "mp3")
        // put it in an AVAudioFile
        let audioFile = try! AVAudioFile(forReading: url!)
        // Get the audio file format
        //let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: file.fileFormat.sampleRate, channels: file.fileFormat.channelCount, interleaved: false)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        // how many channels?
        print(audioFile.fileFormat.channelCount)
        print(audioFrameCount)

        // Setup the buffer for audio data
        let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(audioFile.length))
        // put audio data in the buffer
        try! audioFile.read(into: audioFileBuffer!)
        //readFile.arrayFloatValues = Array(UnsafeBufferPointer(start: audioFileBuffer!.floatChannelData?[0], count: Int(audioFileBuffer!.frameLength)))

        // Init engine and player
        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attach(audioPlayer)
        audioEngine.connect(audioPlayer, to: mainMixer, format: audioFileBuffer!.format)
        audioPlayer.scheduleBuffer(audioFileBuffer!, completionHandler: nil)
        audioEngine.prepare()
        do {
            try audioEngine.start()
            print("engine started")
        } catch let error {
            print(error.localizedDescription)
        }
        audioPlayer.play()
    }
}

AVAssetWriter queue guidance Swift 3

Can anyone give me some guidance on using queues in AVFoundation, please?
Later on, in my app, I want to do some processing on individual frames so I need to use AVCaptureVideoDataOutput.
To get started I thought I'd capture images and then write them (unprocessed) using AVAssetWriter.
I am successfully streaming frames from the camera to image preview by setting up an AVCaptureSession as follows:
func initializeCameraAndMicrophone() {
    // set up the captureSession
    captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720 // set resolution to Medium

    // set up the camera
    let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    do {
        let cameraInput = try AVCaptureDeviceInput(device: camera)
        if captureSession.canAddInput(cameraInput) {
            captureSession.addInput(cameraInput)
        }
    } catch {
        print("Error setting device camera input: \(error)")
        return
    }

    videoOutputStream.setSampleBufferDelegate(self, queue: DispatchQueue(label: "sampleBuffer", attributes: []))
    if captureSession.canAddOutput(videoOutputStream) {
        captureSession.addOutput(videoOutputStream)
    }

    captureSession.startRunning()
}
Each new frame then triggers the captureOutput delegate:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let cameraImage = CIImage(cvPixelBuffer: pixelBuffer!)
    let bufferImage = UIImage(ciImage: cameraImage)

    DispatchQueue.main.async {
        // send captured frame to the videoPreview
        self.videoPreview.image = bufferImage

        // if recording is active append bufferImage to video frame
        while (recordingNow == true) {
            print("OK we're recording!")

            // append images to video
            while (writerInput.isReadyForMoreMediaData) {
                let lastFrameTime = CMTimeMake(Int64(frameCount), videoFPS)
                let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)
                pixelBufferAdaptor.append(pixelBuffer!, withPresentationTime: presentationTime)
                frameCount += 1
            }
        }
    }
}
So this streams frames to the image preview perfectly until I press the record button, which calls the startVideoRecording function (which sets up AVAssetWriter). From that point on the delegate never gets called again!
AVAssetWriter is being set up like this:
func startVideoRecording() {
    guard let assetWriter = createAssetWriter(path: filePath!, size: videoSize) else {
        print("Error converting images to video: AVAssetWriter not created")
        return
    }

    // AVAssetWriter exists so create AVAssetWriterInputPixelBufferAdaptor
    let writerInput = assetWriter.inputs.filter { $0.mediaType == AVMediaTypeVideo }.first!
    let sourceBufferAttributes: [String: AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32ARGB) as AnyObject,
        kCVPixelBufferWidthKey as String: videoSize.width as AnyObject,
        kCVPixelBufferHeightKey as String: videoSize.height as AnyObject,
    ]
    let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput, sourcePixelBufferAttributes: sourceBufferAttributes)

    // Start writing session
    assetWriter.startWriting()
    assetWriter.startSession(atSourceTime: kCMTimeZero)
    if (pixelBufferAdaptor.pixelBufferPool == nil) {
        print("Error converting images to video: pixelBufferPool nil after starting session")
        assetWriter.finishWriting {
            print("assetWriter stopped!")
        }
        recordingNow = false
        return
    }

    frameCount = 0
    print("Recording started!")
}
I'm new to AVFoundation but I suspect I'm screwing up my queues somewhere.
You have to use a separate serial queue for capturing video/audio.
Add this queue property to your class:
let captureSessionQueue: DispatchQueue = DispatchQueue(label: "sampleBuffer", attributes: [])
Start the session on captureSessionQueue, according to the Apple docs:
The startRunning() method is a blocking call which can take some time, therefore you should
perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive).
captureSessionQueue.async {
captureSession.startRunning()
}
Set this queue as your capture output's sample buffer delegate queue:
videoOutputStream.setSampleBufferDelegate(self, queue: captureSessionQueue)
Call startVideoRecording inside captureSessionQueue:
captureSessionQueue.async {
startVideoRecording()
}
In the captureOutput delegate method, put all AVFoundation method calls inside captureSessionQueue.async:
DispatchQueue.main.async {
    // send captured frame to the videoPreview
    self.videoPreview.image = bufferImage

    captureSessionQueue.async {
        // if recording is active append bufferImage to video frame
        while (recordingNow == true) {
            print("OK we're recording!")

            // Append images to video
            while (writerInput.isReadyForMoreMediaData) {
                let lastFrameTime = CMTimeMake(Int64(frameCount), videoFPS)
                let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)
                pixelBufferAdaptor.append(pixelBuffer!, withPresentationTime: presentationTime)
                frameCount += 1
            }
        }
    }
}