I am trying to implement voice commands that will be built into my app. I was able to get it working with this code:
let node = audioEngine.inputNode
let recordingFormat = node.outputFormat(forBus: 0)
node.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer, _) in
    self.recognitionRequest!.append(buffer)
}
audioEngine.prepare()
try! audioEngine.start()
Please note that audioEngine is set to AVAudioEngine() in the class.
Although the code works fine, a problem occurs when I have Bluetooth headphones connected. The line let node = audioEngine.inputNode prevents my iPhone from connecting to and seeing my Bluetooth headphones.
If I remove that line, my headphones connect as expected; however, that means I cannot use speech recognition, since that first line is required.
How can I use voice commands and still use my bluetooth headphones?
If it helps, I want the voice commands to listen from the iPhone microphone but I want sound to be played through the headphones. If no headphones are connected, then the sound should play from the iPhone too.
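One possible approach (a sketch, not a verified solution for every headset): use the .playAndRecord category with the .allowBluetoothA2DP option so playback can go to the headphones over A2DP, and pin the input to the built-in microphone with setPreferredInput. Whether recording then stays on the built-in mic is an assumption that depends on the headset and iOS version.

```swift
import AVFoundation

// Sketch: play audio through Bluetooth headphones (A2DP) while recording
// from the iPhone's built-in microphone. Assumption: with .allowBluetoothA2DP
// (and no .allowBluetooth/HFP), input falls back to the built-in mic.
// .defaultToSpeaker makes playback use the speaker when no headphones
// are connected, instead of the earpiece receiver.
func configureSessionForVoiceCommands() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetoothA2DP, .defaultToSpeaker])

    // Explicitly prefer the built-in microphone as the input port.
    if let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
        try session.setPreferredInput(builtInMic)
    }
    try session.setActive(true)
}
```

Call this before installing the tap and starting the engine, so the engine picks up the configured route.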
Related
I am working on audio recording. When the application is running in the foreground, I start audio recording, and if the app then goes to the background, the recording keeps working fine.
But my question is how to start audio recording when the app is already in the background. My audio recording function is fired like this:
I have a Bluetooth LE device with buttons and an iOS app. Those two are paired (Bluetooth LE device and the iPhone which runs the iOS app) and the iOS app is listening for events on the Bluetooth LE device, events like a hit of a button.
Now, when the user hits a button on the Bluetooth LE device, the iOS app captures the event and I am able to run code even if the app is in background, but I am not able to start a voice recording.
I have already enabled Background Modes:
Here is my Code for Audio Recording:
func startRecording() {
    DispatchQueue.global(qos: .background).asyncAfter(deadline: DispatchTime.now(), qos: .background) {
        let audioFilename = self.getDocumentsDirectory().appendingPathComponent("recording.m4a")
        print("record Audio \(audioFilename)")

        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 12000,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]

        do {
            self.audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
            self.audioRecorder.delegate = self
            self.audioRecorder.record()
        } catch {
            self.finishRecording(success: false)
        }
    }
}
I can't find a proper solution for this. Please suggest the right way to do it. Thanks in advance.
This is not possible without an explicit UI interaction, for security reasons; otherwise it would be possible to "spy" on a person (once paired, someone outside a room could start recording and listen to what happens inside).
Workarounds could be:
send a local notification, e.g. "The device is ready to record, do you want to start?". Once the user taps it, the app opens in the foreground and should be able to start recording.
use CallKit (suitable for phone-like use cases). When the user presses "Accept" in the CallKit system UI, it is possible to start recording audio even in background.
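A minimal sketch of the first workaround, assuming standard UserNotifications usage (the identifier and wording below are placeholders, and notification permission must already have been granted):

```swift
import UserNotifications

// Sketch: post a local notification when the BLE button event arrives
// while the app is in the background. Tapping it brings the app to the
// foreground, where the recording can legitimately be started.
func notifyReadyToRecord() {
    let content = UNMutableNotificationContent()
    content.title = "Device ready to record"
    content.body = "Tap to start recording."

    let request = UNNotificationRequest(identifier: "start-recording", // placeholder id
                                        content: content,
                                        trigger: nil) // nil trigger = deliver immediately
    UNUserNotificationCenter.current().add(request) { error in
        if let error = error {
            print("Could not schedule notification: \(error)")
        }
    }
}
```

Start the recorder from the notification-response delegate callback, once the app is actually in the foreground.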
Here are my solutions:
Trigger a CallKit call from your app and have your headphones answer the call.
Use your headphones to record and transfer the voice data over a Bluetooth channel.
I want to be able to record audio and play back positional audio at the same time.
To do this I need to use the .playAndRecord audio session category, and simultaneous recording and playback works. However, using this category the audio file is played without being positional (i.e. it's not spatial) when using bluetooth headphones. This works as expected when using wired headphones.
If I set the audio session category to .playback, the audio played is correctly positional for both wired and bluetooth headphones, however I'm not able to simultaneously record.
I've tried various audio session categories/options but have had no luck.
import AVFoundation

class PlayerRecorder: ObservableObject {
    let engine = AVAudioEngine()
    let mixer = AVAudioEnvironmentNode()

    init() {
        let audioSession = AVAudioSession.sharedInstance()

        /*
         Using .playAndRecord, both recording and playback work, however
         the audio that is played is NOT positional. .allowBluetooth is needed
         so that Bluetooth headphones can be used.
         */
        try! audioSession.setCategory(.playAndRecord, mode: .default, options: .allowBluetooth)

        /*
         Using .playback, positional audio DOES work, however we are not able to record.
         */
        // try! audioSession.setCategory(.playback)

        self.engine.attach(self.mixer)

        let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: self.engine.outputNode.outputFormat(forBus: 0).sampleRate, channels: 2)
        self.engine.connect(self.mixer, to: self.engine.outputNode, format: stereoFormat)

        self.engine.prepare()
        try! self.engine.start()
    }

    func play() {
        let audioPlayer = AVAudioPlayerNode()
        self.engine.attach(audioPlayer)

        let monoFormat = AVAudioFormat(standardFormatWithSampleRate: self.engine.outputNode.outputFormat(forBus: 0).sampleRate, channels: 1)
        self.engine.connect(audioPlayer, to: self.mixer, format: monoFormat)

        // The file has to be in mono
        let url = Bundle.main.url(forResource: "your-mono-audio-file.mp3", withExtension: nil)
        let f = try! AVAudioFile(forReading: url!)
        audioPlayer.scheduleFile(f, at: nil, completionHandler: nil)

        audioPlayer.renderingAlgorithm = .HRTFHQ
        audioPlayer.position = AVAudio3DPoint(x: 20.0, y: 5.0, z: 0.0)
        audioPlayer.play()
    }
}
I want to be able to record audio and play back positional audio at the same time.
This is not possible over standard Bluetooth if you're both playing to and recording from the Bluetooth headset. The allowBluetooth option does not mean "allow Bluetooth." It means "prefer HFP if available." (It's the worst-named constant I know in AVFoundation.) HFP is a low-bandwidth, bidirectional audio protocol designed for phone calls.
If you switch to the high-bandwidth audio protocol, A2DP, you'll find it's unidirectional, and so you cannot record from the microphone.
There is no widely-deployed Bluetooth protocol that gives you both high quality audio and access to a microphone. If you control the firmware, you can develop your own proprietary microphone audio stream over BLE (or iAP2 if it's a MFi device). But otherwise, there isn't a current solution.
I keep hoping that LEA will fix this, but I can't find any hint that it will. I also had hoped aptX might fix it (even though iPhones don't support it), but no luck there, either. I'm not certain why this use case isn't being worked on by the Bluetooth committee and vendors, but as best I know, nothing is on the horizon.
I'm using Xcode 11.4, watchOS 6.2, and Swift/SwiftUI to build a watch app that plays background audio via the speaker or Bluetooth, depending on user choice. Here are the scenarios:
Pass:
From Xcode, run on the target watch (Series 5)
Start audio
Screen on: audio plays via speaker or Bluetooth
Screen off (drop wrist): background audio plays via speaker or Bluetooth
Fail:
Launch the app from the watch as a user would (not run as an Xcode target)
Start audio
Screen on: app plays via speaker or Bluetooth
Screen off (drop wrist): background audio dies for both speaker and Bluetooth output scenarios
As far as I can tell, I've:
Set the background capabilities correctly
Set the session category and activated it prior to playing audio
do {
    try session.setCategory(AVAudioSession.Category.playback,
                            mode: .default,
                            policy: useBlueTooth ? .longFormAudio : .default,
                            options: [])
} catch {
    fatalError("AudioSession Failed.")
}

session.activate(options: []) { (success, error) in
    guard error == nil else {
        print("Audio Session Activation Error: \(error!.localizedDescription)")
        // Handle the error here.
        return
    }
}
What might cause this failing behavior when the app is not launched via Xcode? Advice on how to debug or fix this would be appreciated. Thanks!
I am trying to implement a simple macOS app with screen recording capabilities.
I don't want to record microphone input, but rather the sound that comes out of my Mac's speakers. For example, I want to be able to record a YouTube video to a file this way.
Is this possible with AVCaptureSession? Googling turns up examples that capture video and the microphone, but not the internal audio.
Here is the working code I have to capture video and the microphone. What do I have to modify to disable the microphone and capture the internal audio that goes to the speakers instead?
session = AVCaptureSession()
session.sessionPreset = AVCaptureSession.Preset.high
movieFileOutput = AVCaptureMovieFileOutput()
let displayId: CGDirectDisplayID = CGDirectDisplayID(CGMainDisplayID())
let audioDevice = AVCaptureDevice.default(for: .audio)!
let audioInput = try! AVCaptureDeviceInput(device: audioDevice)
let videoInput: AVCaptureScreenInput = AVCaptureScreenInput(displayID: displayId)!
session.addInput(videoInput)
session.addInput(audioInput)
session.addOutput(movieFileOutput)
session.startRunning()
movieFileOutput.startRecording(to: self.destinationUrl, recordingDelegate: self)
I have not found an easy way to do it, but it turns out the original code can record the internal audio, provided additional software is installed on the machine: Soundflower.
Soundflower is an open source kernel extension for macOS, designed to create a virtual audio output device that can also act as an input.
With Soundflower installed, you can configure macOS using the Applications / Utilities / Audio MIDI Setup app to send the audio to both the virtual and the real audio device. This way the code above captures the audio from Soundflower, but you can still hear it on your normal audio output device.
The setup of the Applications / Utilities / Audio MIDI Setup application is described here: How can I send my computer's audio to multiple outputs?.
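Building on the code above, here is a sketch of how the capture input might be pointed at the Soundflower virtual device instead of the default microphone. The device name "Soundflower" is an assumption; check Audio MIDI Setup for the exact name (it is often listed as "Soundflower (2ch)"):

```swift
import AVFoundation

// Sketch: select the Soundflower virtual device as the audio input
// instead of the default microphone, so the session captures what is
// routed to it (i.e. the system audio) rather than the room.
func addSoundflowerInput(to session: AVCaptureSession) throws {
    let audioDevices = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInMicrophone, .externalUnknown],
        mediaType: .audio,
        position: .unspecified
    ).devices

    // Assumption: the virtual device's name contains "Soundflower".
    guard let soundflower = audioDevices.first(where: { $0.localizedName.contains("Soundflower") }) else {
        print("Soundflower device not found")
        return
    }

    let audioInput = try AVCaptureDeviceInput(device: soundflower)
    if session.canAddInput(audioInput) {
        session.addInput(audioInput)
    }
}
```

In the original code, this would replace the AVCaptureDevice.default(for: .audio) line and the addInput call that follows it.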
With iOS 10 there are more possibilities for managing AVAudioSession, but I couldn't manage to keep the headphone microphone as the input while audio goes out through the iPhone speaker.
The overrideOutputAudioPort method below also overrides the input audio port to the iPhone microphone:
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.overrideOutputAudioPort(.speaker)
} catch {
    // Handle the error here.
}
Is there any solution to keep the headphone as input?
As I understand this Apple documentation, this is not possible using AVAudioSession:
If a headset is plugged in at the time you set this property’s value to kAudioSessionOverrideAudioRoute_Speaker, the system changes the audio routing for input as well as for output: input comes from the built-in microphone; output goes to the built-in speaker.