I'm writing an iOS application in Swift that records the user's voice and can then play it back with some voice effects. The problem is that playback is very quiet when using the built-in iPhone microphone; with headphones there is no problem.
Recording code:
let recordingName = "my_audio.m4a"
let pathArray = [dirPath, recordingName]
let filePath = NSURL.fileURLWithPathComponents(pathArray)
print(filePath)
let recordSettings: [String: AnyObject] = [
    AVFormatIDKey: Int(kAudioFormatAppleLossless),
    AVEncoderAudioQualityKey: AVAudioQuality.Max.rawValue,
    AVEncoderBitRateKey: 320000,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100.0
]
let session = AVAudioSession.sharedInstance()
try! session.setCategory(AVAudioSessionCategoryPlayAndRecord)
try! audioRecorder = AVAudioRecorder(URL: filePath!, settings: recordSettings)
audioRecorder.meteringEnabled = true
audioRecorder.updateMeters()
audioRecorder.delegate = self
audioRecorder.prepareToRecord()
audioRecorder.record()
Playback code:
playerNode.stop()
playerNode.volume = 1.0
audioEngine.stop()
audioEngine.reset()
audioEngine.attachNode(playerNode)
audioEngine.attachNode(audioUnitTime)
audioEngine.connect(playerNode, to: audioUnitTime, format: receivedAudio.processingFormat)
audioEngine.connect(audioUnitTime, to: audioEngine.outputNode, format: receivedAudio.processingFormat)
playerNode.scheduleFile(receivedAudio, atTime: nil, completionHandler: nil)
try! audioEngine.start()
playerNode.play()
The only clue I have is that Apple's built-in voice recording app behaves the same way, but only when the speaker icon in the upper-right corner is disabled. I wasn't able to find out what that speaker icon actually does.
Alright, I finally found it. The problem was with the AVAudioSession:
let session = AVAudioSession.sharedInstance()
try! session.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions:AVAudioSessionCategoryOptions.DefaultToSpeaker)
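For reference, the same fix in the modern Swift API (available since iOS 10; an equivalent sketch rather than the original answer's code) looks like this:

import AVFoundation

let session = AVAudioSession.sharedInstance()
// Without .defaultToSpeaker, the .playAndRecord category routes output to the
// quiet earpiece receiver instead of the main speaker, which explains the low
// playback volume on the built-in route.
try session.setCategory(.playAndRecord, mode: .default, options: .defaultToSpeaker)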
I am working on an app with a recording function and followed a tutorial on this. I would like to edit the function where the AudioRecorder saves the file so that the file goes into the same folder it would if I dragged an audio file directly into Xcode.
I am having trouble fetching a single recording from the directory the AudioRecorder currently saves to, and I found another tutorial that shows how to find recordings by name in the app folder.
This is the recording function today:
func startRecording() {
    let recordingSession = AVAudioSession.sharedInstance()
    do {
        try recordingSession.setCategory(.playAndRecord, mode: .default)
        try recordingSession.setActive(true)
    } catch {
        print("Failed to set up recording session")
    }
    let documentPath = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    print("AUDIORECORDER document path: \(documentPath)")
    let audioFilename = documentPath.appendingPathComponent("\(Date().toString(dateFormat: "dd-MM-YY_'at'_HH:mm:ss")).m4a")
    print("AUDIORECORDER audioFilename: \(audioFilename)")
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 12000,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    do {
        audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
        audioRecorder.record()
        recording = true
    } catch {
        print("Could not start recording")
    }
}
Any idea how to do this? The tutorial only shows how to fetch all recordings into a list, and I am trying to fetch a single recording. This is the fetch function, if it helps:
func fetchRecordings() {
    recordings.removeAll()
    let fileManager = FileManager.default
    let documentDirectory = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let directoryContents = try! fileManager.contentsOfDirectory(at: documentDirectory, includingPropertiesForKeys: nil)
    for audio in directoryContents {
        let recording = Recording(fileURL: audio, createdAt: getCreationDate(for: audio))
        recordings.append(recording)
    }
    objectWillChange.send(self)
}
It does not really matter where it saves, but I need to be able to save the name to a string (so that I can match the right audio to the right object) and then find the audio based on that string.
Thank you.
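One way to approach this (a minimal sketch, assuming the recordings stay in the Documents directory as in the code above; note that the app bundle itself is read-only on device, so saving next to files dragged into Xcode is not possible): keep the generated file name as the identifier and rebuild the URL from it when you need that one recording. The helper below is hypothetical, not from the original post.

import Foundation

// Hypothetical helper: resolves a stored file name back to a playable URL
// in the Documents directory, or nil if the file no longer exists.
func recordingURL(named fileName: String) -> URL? {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let url = documents.appendingPathComponent(fileName)
    return FileManager.default.fileExists(atPath: url.path) ? url : nil
}

// Usage: store audioFilename.lastPathComponent on your object when recording,
// then later: if let url = recordingURL(named: myObject.audioFileName) { ... }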
I have an AVMutableComposition containing only audio that I want to export to a .wav audio file.
The simplest solution I found for exporting audio was using AVAssetExportSession, as in this simplified example:
let composition = AVMutableComposition()
// add tracks...
let exportSession = AVAssetExportSession(asset: composition,
                                         presetName: AVAssetExportPresetAppleM4A)!
exportSession.outputFileType = .m4a
exportSession.outputURL = someOutUrl
exportSession.exportAsynchronously {
    // done
}
But it only works for .m4a.
This post mentions that in order to export to other formats, one has to use AVAssetReader and AVAssetWriter; unfortunately, it does not go into further detail.
I have tried to implement it but got stuck in the process.
This is what I have so far (again simplified):
let composition = AVMutableComposition()
let outputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsNonInterleaved: false,
    AVSampleRateKey: 44100.0,
    AVChannelLayoutKey: NSData(),
]
let assetWriter = try! AVAssetWriter(outputURL: someOutUrl, fileType: .wav)
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
assetWriter.add(input)
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: CMTime.zero)
input.requestMediaDataWhenReady(on: .main) {
    // as I understand, I need to bring in data from my
    // AVMutableComposition here...
    let sampleBuffer: CMSampleBuffer = ???
    input.append(sampleBuffer)
}
assetWriter.finishWriting {
    // done
}
It boils down to my question:
Can you provide a working example of exporting audio from an AVMutableComposition to a .wav file?
After some more research I came up with the following solution.
The missing piece was the use of AVAssetReader.
(simplified code)
// composition
let composition = AVMutableComposition()
// add stuff to composition

// reader
guard let assetReader = try? AVAssetReader(asset: composition) else { return }
assetReader.timeRange = CMTimeRange(start: .zero, duration: CMTime(value: composition.duration.value, timescale: composition.duration.timescale))
let assetReaderAudioMixOutput = AVAssetReaderAudioMixOutput(audioTracks: composition.tracks(withMediaType: .audio), audioSettings: nil)
assetReader.add(assetReaderAudioMixOutput)
guard assetReader.startReading() else { return }

// writer
let outputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsNonInterleaved: false,
    AVSampleRateKey: 44100.0,
    AVChannelLayoutKey: NSData(),
]
guard let assetWriter = try? AVAssetWriter(outputURL: someOutUrl, fileType: .wav) else { return }
let writerInput = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
assetWriter.add(writerInput)
guard assetWriter.startWriting() else { return }
assetWriter.startSession(atSourceTime: CMTime.zero)

let queue = DispatchQueue(label: "my.queue.id")
writerInput.requestMediaDataWhenReady(on: queue) {
    // capture assetReader in the block to prevent it from being released
    let readerOutput = assetReader.outputs.first!
    while writerInput.isReadyForMoreMediaData {
        if let nextSampleBuffer = readerOutput.copyNextSampleBuffer() {
            writerInput.append(nextSampleBuffer)
        } else {
            writerInput.markAsFinished()
            assetWriter.endSession(atSourceTime: composition.duration)
            assetWriter.finishWriting {
                DispatchQueue.main.async {
                    // done, call my completion
                }
            }
            break
        }
    }
}
As you mention, you are creating the .m4a file successfully, so why not convert the .m4a file to a .wav file with just a few lines of code?
I simply changed the file's extension to .wav, removed the .m4a file, and it worked.
func getDirectory() -> URL {
    let path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let documentDirectory = path[0]
    return documentDirectory
}
let date = Date().timeIntervalSince1970
fileName = getDirectory().appendingPathComponent("\(date).m4a")
wavFileName = getDirectory().appendingPathComponent("\(date).wav")
try! FileManager.default.copyItem(at: fileName, to: wavFileName)
try! FileManager.default.removeItem(at: fileName)
I even played the .wav file and it works fine.
audioPlayer = try! AVAudioPlayer(contentsOf: wavFileName)
audioPlayer.play()
Check with the example above whether this extension swap causes any load or delay in your app. (Strictly speaking, renaming only changes the extension; the data inside is still AAC in an MPEG-4 container. AVAudioPlayer plays it anyway because it inspects the file's contents rather than its extension.)
I've gone through the documentation and all the posts here, and I've gotten this far.
The function below should take an AVAsset and write out a .wav file. However, the file written out is zero bytes. I'm not sure how to even inspect what the writer is writing at each step.
What am I missing?
static func writeAudioTrackToUrl(asset: AVAsset, _ url: URL) throws {
    // initialize asset reader, writer
    let assetReader = try AVAssetReader(asset: asset)
    let assetWriter = try AVAssetWriter(outputURL: URL(fileURLWithPath: "/tmp/audiowav.wav"), fileType: .wav)

    // get audio track
    let audioTrack = asset.tracks(withMediaType: AVMediaType.audio).first!

    // configure output audio settings
    let audioSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: 22050.0,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    let assetReaderAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: audioSettings)
    if assetReader.canAdd(assetReaderAudioOutput) {
        assetReader.add(assetReaderAudioOutput)
    } else {
        fatalError("could not add audio output reader")
    }

    let inputAudioSettings: [String: Any] = [AVFormatIDKey: kAudioFormatLinearPCM]
    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: inputAudioSettings, sourceFormatHint: (audioTrack.formatDescriptions[0] as! CMFormatDescription))
    let audioInputQueue = DispatchQueue(label: "audioQueue")
    assetWriter.add(audioInput)
    assetWriter.startWriting()
    assetReader.startReading()
    assetWriter.startSession(atSourceTime: CMTime.zero)

    audioInput.requestMediaDataWhenReady(on: audioInputQueue) {
        while audioInput.isReadyForMoreMediaData {
            if let sample = assetReaderAudioOutput.copyNextSampleBuffer() {
                audioInput.append(sample)
            } else {
                audioInput.markAsFinished()
                DispatchQueue.main.async {
                    assetWriter.finishWriting {
                        assetReader.cancelReading()
                    }
                }
                break
            }
        }
    }
}
The problem here is that you convert the input audio to the LPCM format described by audioSettings, but then you give a sourceFormatHint of audioTrack.formatDescriptions[0] to the AVAssetWriterInput.
This is a problem because the audio track format descriptions are not going to be LPCM but a compressed format, like kAudioFormatMPEG4AAC.
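If you want to verify this, something like the following (an illustrative snippet, not part of the original answer, reusing audioTrack from the question's code) checks the track's actual format:

import AVFoundation

let desc = audioTrack.formatDescriptions[0] as! CMFormatDescription
if let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc) {
    // mFormatID is a FourCC: kAudioFormatMPEG4AAC for an AAC track,
    // kAudioFormatLinearPCM only for uncompressed audio.
    print(asbd.pointee.mFormatID == kAudioFormatLinearPCM) // false for an AAC track
}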
Just drop the hint; I think it's meant for passing compressed formats through anyway.
Further, the LPCM in inputAudioSettings is underspecified. Why not pass audioSettings directly?
In summary, try this:
let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: audioSettings)
P.S. Don't forget to delete the output file before running; AVAssetWriter doesn't seem to overwrite existing files.
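A small sketch of that pre-flight cleanup (assuming url is the intended output location):

import Foundation

// Remove any leftover file so AVAssetWriter can create a fresh one.
if FileManager.default.fileExists(atPath: url.path) {
    try? FileManager.default.removeItem(at: url)
}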
I'm trying to play audio using AVAudioEngine and AVAudioPlayerNode. However, at the engine.connect(...) call, I get this:
"PlaySound[21867:535102] [AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData: AudioObjectGetPropertyData: no object with given ID 0"
The code after engine.connect(...) seems to run, because it prints buffer?.format, but there's no sound.
Could someone let me know what I'm missing? Here's my test code.
let audioNode = AVAudioPlayerNode()
let path = Bundle.main.path(forResource: "Sounds/Test.mp3", ofType: nil)
let url = URL(fileURLWithPath: path!)
let file = try! AVAudioFile(forReading: url)
let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: AVAudioFrameCount(file.length))
try! file.read(into:buffer!)
let engine = AVAudioEngine()
engine.attach(audioNode)
engine.connect(audioNode, to: engine.mainMixerNode, format: buffer?.format)
// PlaySound[21867:535102] [AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData: AudioObjectGetPropertyData: no object with given ID 0
engine.prepare()
try! engine.start()
audioNode.play()
audioNode.scheduleBuffer(buffer!, at: nil, options: .loops, completionHandler: nil)
debugPrint(buffer?.format)
// Optional(<AVAudioFormat 0x60000212d1d0: 1 ch, 44100 Hz, Float32>)
In engine.connect(), I also tried engine.outputNode.outputFormat(forBus: 0) instead of buffer?.format, but had no luck.
It works if I put the engine outside the function, like this (as a local variable, the engine is deallocated as soon as the function returns, so nothing is left alive to produce sound):
class ViewController: NSViewController {
    let engine = AVAudioEngine()
    override func viewDidLoad() {
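For completeness, a minimal sketch of the working shape (my own filling-in around the truncated snippet above, assuming macOS/AppKit as NSViewController suggests): the engine and player node live as stored properties, so ARC keeps them alive while audio plays.

import AVFoundation
import AppKit

class ViewController: NSViewController {
    // Stored properties outlive any single method call.
    let engine = AVAudioEngine()
    let audioNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        engine.attach(audioNode)
        engine.connect(audioNode, to: engine.mainMixerNode, format: nil)
        try? engine.start()
        // schedule a buffer on audioNode and call audioNode.play(),
        // as in the question's code
    }
}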
Is there any sample code or a tutorial on recording audio on watchOS? I've found that AVAudioRecorder has been supported since watchOS 4.0 (https://developer.apple.com/documentation/avfoundation/avaudiorecorder), but when I try to use it, it records one second with no actual sound (just noise).
Here is my code:
let audioURL = self.getRecordedFileURL()
print(audioURL.absoluteString)
let settings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
do {
    recorder = try AVAudioRecorder(url: audioURL, settings: settings)
    recorder?.delegate = self
    recorder?.record()
} catch {
    finishRecording(success: false)
}
Also, should I use an AVAudioSession here? If so, is requestRecordPermission required, and how do I deal with it? Thank you for your help!
This one works:
let recordingName = "audio.m4a"
let dirPath = getDirectory()
let pathArray = [dirPath, recordingName]
guard let filePath = URL(string: pathArray.joined(separator: "/")) else { return }
let settings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 12000,
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
// start recording
do {
    audioRecorder = try AVAudioRecorder(url: filePath, settings: settings)
    audioRecorder.delegate = self
    audioRecorder.record()
} catch {
    print("Recording Failed")
}
func getDirectory() -> String {
    let dirPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    return dirPath
}
Don't forget to add NSMicrophoneUsageDescription to your iPhone companion app's Info.plist.
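And since the question asks about requestRecordPermission: a minimal sketch of requesting microphone access with the standard iOS call (the exact permission flow on watchOS may differ; this snippet shows the AVAudioSession API I know):

import AVFoundation

AVAudioSession.sharedInstance().requestRecordPermission { granted in
    DispatchQueue.main.async {
        if granted {
            // safe to start recording
        } else {
            // recording not allowed; update the UI accordingly
        }
    }
}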