Use built-in mic if headset is plugged in - iPhone

I am playing around with audio sessions in iOS, and I want to use the built-in mic of the iPhone as the audio input route even if an external headset (including a mic) is plugged in. I'm able to detect whether a headset is plugged in using the following code:
CFStringRef route;
UInt32 propertySize = sizeof(CFStringRef);
AudioSessionInitialize(NULL, NULL, NULL, NULL);
AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
if ((route == NULL) || (CFStringGetLength(route) == 0)) {
    // Silent Mode
    NSLog(@"AudioRoute: SILENT");
} else {
    NSString *routeStr = (NSString *)route;
    NSLog(@"AudioRoute: %@", routeStr);
    NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
    if (headsetRange.location != NSNotFound) {
        NSLog(@"Headset");
        // route audio in to the built-in mic.
    }
.... more code
So, any ideas how to do this?
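(A note for readers on newer SDKs: since iOS 7, AVAudioSession can select a specific input port directly, which avoids string-matching the route. A minimal sketch, assuming a category that allows input; AVAudioSessionPortBuiltInMic is the port type to match:)
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
// Prefer the built-in mic even when a wired headset mic is available.
for (AVAudioSessionPortDescription *input in session.availableInputs) {
    if ([input.portType isEqualToString:AVAudioSessionPortBuiltInMic]) {
        NSError *error = nil;
        if (![session setPreferredInput:input error:&error]) {
            NSLog(@"Could not set preferred input: %@", error);
        }
        break;
    }
}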

Related

Is it possible to 'duck' an AudioSession whilst using OpenAL?

Does anybody know if this is possible?
I have my audio session and OpenAL set up like so:
// Allow their music to play in the background
AudioSessionInitialize(NULL, NULL, openALInterruptionListener, (__bridge void *)(self));
UInt32 sessionCategory = kAudioSessionCategory_AmbientSound;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
UInt32 allowMixing = false;
AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck, sizeof(allowMixing), &allowMixing);
// use the device to make a context
_mContext = alcCreateContext(_mDevice, NULL);
// set my context to the currently active one
alcMakeContextCurrent(_mContext);
And I have ducking set up like so:
- (void)setSoundDucked:(BOOL)soundDucked
{
    if (soundDucked)
    {
        UInt32 allowMixing = true;
        AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck, sizeof(allowMixing), &allowMixing);
        AudioSessionSetActive(false);
        AudioSessionSetActive(true);
    }
    else
    {
        UInt32 allowMixing = false;
        AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck, sizeof(allowMixing), &allowMixing);
        AudioSessionSetActive(false);
        AudioSessionSetActive(true);
    }
}
However, sound doesn't duck. It will only duck if I comment out the following lines:
// use the device to make a context
_mContext = alcCreateContext(_mDevice, NULL);
// set my context to the currently active one
alcMakeContextCurrent(_mContext);
Is there any way of getting OpenAL to play nice with the audio ducking property?
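(Not a definitive fix, but one workaround that has been reported for OpenAL/session interactions is to detach the OpenAL context while the session is toggled, then re-attach it. A sketch, untested, assuming the _mContext from above:)
- (void)setSoundDucked:(BOOL)soundDucked
{
    // Detach the OpenAL context before reconfiguring the session...
    alcMakeContextCurrent(NULL);
    alcSuspendContext(_mContext);

    UInt32 allowMixing = soundDucked ? true : false;
    AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck, sizeof(allowMixing), &allowMixing);
    AudioSessionSetActive(false);
    AudioSessionSetActive(true);

    // ...then restore it once the session is active again.
    alcMakeContextCurrent(_mContext);
    alcProcessContext(_mContext);
}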

How to determine sound recording source

While recording a sound using my iPad app, how can I know if the source of the sound is from the built-in microphone or headphone microphone?
Additional information: iOS ver 4.2 and above.
The way to determine this is to poll the hardware and query the current audio route.
Use the AudioSessionGetProperty function to get back the current audio route.
This example by @TPoschel should set you on the right track.
- (void)playSound:(id)sender
{
    if (player) {
        CFStringRef route;
        UInt32 propertySize = sizeof(CFStringRef);
        AudioSessionInitialize(NULL, NULL, NULL, NULL);
        AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
        if ((route == NULL) || (CFStringGetLength(route) == 0)) {
            // Silent Mode
            NSLog(@"AudioRoute: SILENT");
        } else {
            NSString *routeStr = (NSString *)route;
            NSLog(@"AudioRoute: %@", routeStr);
            /* Known values of route:
             * "Headset"
             * "Headphone"
             * "Speaker"
             * "SpeakerAndMicrophone"
             * "HeadphonesAndMicrophone"
             * "HeadsetInOut"
             * "ReceiverAndMicrophone"
             * "Lineout"
             */
            NSRange headphoneRange = [routeStr rangeOfString:@"Headphone"];
            NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
            NSRange receiverRange = [routeStr rangeOfString:@"Receiver"];
            NSRange speakerRange = [routeStr rangeOfString:@"Speaker"];
            NSRange lineoutRange = [routeStr rangeOfString:@"Lineout"];
            if (headphoneRange.location != NSNotFound) {
                // Don't change the route if the headphone is plugged in.
            } else if (headsetRange.location != NSNotFound) {
                // Don't change the route if the headset is plugged in.
            } else if (receiverRange.location != NSNotFound) {
                // Change to play on the speaker
                UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
                AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
            } else if (speakerRange.location != NSNotFound) {
                // Don't change the route if the speaker is currently playing.
            } else if (lineoutRange.location != NSNotFound) {
                // Don't change the route if the lineout is plugged in.
            } else {
                NSLog(@"Unknown audio route.");
            }
        }
        [player play];
    }
}
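(If you can require iOS 6 or later, the input side of the current route can be read directly via AVAudioSession instead of parsing the route string; a short sketch:)
// Sketch: check whether the active input is the built-in or the headset mic.
AVAudioSessionRouteDescription *currentRoute = [[AVAudioSession sharedInstance] currentRoute];
for (AVAudioSessionPortDescription *input in currentRoute.inputs) {
    if ([input.portType isEqualToString:AVAudioSessionPortBuiltInMic]) {
        NSLog(@"Recording from the built-in microphone");
    } else if ([input.portType isEqualToString:AVAudioSessionPortHeadsetMic]) {
        NSLog(@"Recording from the headset microphone");
    }
}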

Detect if headphones (not microphone) are plugged in to an iOS device [duplicate]

This question already has answers here: Are headphones plugged in? iOS7 (8 answers). Closed 6 years ago.
I need to change my audio depending on whether or not headphones are plugged in. I'm aware of kAudioSessionProperty_AudioInputAvailable, which will tell me if there's a microphone, but I'd like to test for any headphones, not just headphones with a built-in microphone. Is this possible?
Here is a method of my own, a slightly modified version of one found on this site: http://www.iphonedevsdk.com/forum/iphone-sdk-development/9982-play-record-same-time.html
- (BOOL)isHeadsetPluggedIn {
    UInt32 routeSize = sizeof(CFStringRef);
    CFStringRef route;
    OSStatus error = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute,
                                             &routeSize,
                                             &route);
    /* Known values of route:
     * "Headset"
     * "Headphone"
     * "Speaker"
     * "SpeakerAndMicrophone"
     * "HeadphonesAndMicrophone"
     * "HeadsetInOut"
     * "ReceiverAndMicrophone"
     * "Lineout"
     */
    if (!error && (route != NULL)) {
        NSString *routeStr = (NSString *)route;
        NSRange headphoneRange = [routeStr rangeOfString:@"Head"];
        if (headphoneRange.location != NSNotFound) return YES;
    }
    return NO;
}
Here's a solution based on rob mayoff's comment:
- (BOOL)isHeadsetPluggedIn
{
    AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
    BOOL headphonesLocated = NO;
    for (AVAudioSessionPortDescription *portDescription in route.outputs)
    {
        headphonesLocated |= [portDescription.portType isEqualToString:AVAudioSessionPortHeadphones];
    }
    return headphonesLocated;
}
Simply link to the AVFoundation framework.
Just a heads up for any future readers of this post:
Most of the AudioToolbox session functions used above were deprecated with the release of iOS 7 without a direct replacement, so the old audio session property listeners are now largely redundant.
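(The AVFoundation replacement is the route change notification; a minimal sketch, iOS 6+:)
// Sketch: observe route changes instead of the deprecated property listeners.
[[NSNotificationCenter defaultCenter] addObserverForName:AVAudioSessionRouteChangeNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSUInteger reason = [note.userInfo[AVAudioSessionRouteChangeReasonKey] unsignedIntegerValue];
    if (reason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        NSLog(@"Headphones (or another output) were unplugged");
    }
}];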
I started with the code given above by jpsetung, but there were a few issues with it for my use case:
No evidence of something called kAudioSessionProperty_AudioRoute in the docs
Leaks route
No audio session check
String check for headphones instead of logical awareness of categories
I was more interested in whether the iPhone was using its speakers, with "headphones" meaning "anything other than speakers". I feel that leaving out options like "bluetooth", "airplay", or "lineout" was dangerous.
This implementation broadens the check to allow for any type of specified output:
BOOL isAudioRouteAvailable(CFStringRef routeType)
{
    /*
     As of iOS 5:
     kAudioSessionOutputRoute_LineOut;
     kAudioSessionOutputRoute_Headphones;
     kAudioSessionOutputRoute_BluetoothHFP;
     kAudioSessionOutputRoute_BluetoothA2DP;
     kAudioSessionOutputRoute_BuiltInReceiver;
     kAudioSessionOutputRoute_BuiltInSpeaker;
     kAudioSessionOutputRoute_USBAudio;
     kAudioSessionOutputRoute_HDMI;
     kAudioSessionOutputRoute_AirPlay;
     */
    //Prep
    BOOL foundRoute = NO;
    CFDictionaryRef description = NULL;
    //Session
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        AudioSessionInitialize(NULL, NULL, NULL, NULL);
    });
    //Property
    UInt32 propertySize;
    AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &propertySize);
    OSStatus error = AudioSessionGetProperty(kAudioSessionProperty_AudioRouteDescription, &propertySize, &description);
    if (!error && description) {
        CFArrayRef outputs = CFDictionaryGetValue(description, kAudioSession_AudioRouteKey_Outputs);
        //Check the array exists before counting it
        if (outputs && CFArrayGetCount(outputs)) {
            for (CFIndex i = 0; i < CFArrayGetCount(outputs); i++) {
                CFDictionaryRef route = CFArrayGetValueAtIndex(outputs, i);
                CFStringRef type = CFDictionaryGetValue(route, kAudioSession_AudioRouteKey_Type);
                NSLog(@"Got audio route %@", type);
                //Audio route type
                if (CFStringCompare(type, routeType, 0) == kCFCompareEqualTo) {
                    foundRoute = YES;
                    break;
                }
            }
        }
    } else if (error) {
        NSLog(@"Audio route error %ld", (long)error);
    }
    //Cleanup
    if (description) {
        CFRelease(description);
    }
    //Done
    return foundRoute;
}
Used like so:
if (isAudioRouteAvailable(kAudioSessionOutputRoute_BuiltInSpeaker)) {
    //Do great things...
}

Why does this Audio Unit RemoteIO initialisation work on iPhone but not in simulator?

I am using the Audio Unit services to set up an output rendering callback so I can mix together synthesized audio. The code I have seems to work perfectly on the devices I have (iPod Touch, iPhone 3G, and iPad) but fails to work on the simulator.
On the simulator, the AudioUnitInitialise function fails and returns a value of -10851 (kAudioUnitErr_InvalidPropertyValue according to Apple documentation).
Here is my initialisation code... does anyone with more experience with this API than me see anything I'm doing incorrectly here?
#define kOutputBus 0
#define kInputBus 1
...
static OSStatus playbackCallback(void *inRefCon,
AudioUnitRenderActionFlags* ioActionFlags,
const AudioTimeStamp* inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList* ioData)
{
// Mix audio here - but it never gets here on the simulator
return noErr;
}
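(An aside for readers adapting this: the render callback must fill ioData on every call. A sketch of a safe placeholder body that writes silence; the name silentPlaybackCallback is hypothetical:)
static OSStatus silentPlaybackCallback(void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData)
{
    // Zero every output buffer: renders silence, useful while testing the graph.
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    }
    return noErr;
}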
...
{
OSStatus status;
// Describe audio component
AudioComponentDescription desc;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_RemoteIO;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
// Get component
AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);
// Get audio units
status = AudioComponentInstanceNew(inputComponent, &m_audio_unit);
if(status != noErr) {
NSLog(#"Failed to get audio component instance: %d", status);
}
// Enable IO for playback
UInt32 flag = 1;
status = AudioUnitSetProperty(m_audio_unit,
kAudioOutputUnitProperty_EnableIO,
kAudioUnitScope_Output,
kOutputBus,
&flag,
sizeof(flag));
if(status != noErr) {
NSLog(#"Failed to enable audio i/o for playback: %d", status);
}
// Describe format
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate = 44100.00;
audioFormat.mFormatID = kAudioFormatLinearPCM;
audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket = 1;
audioFormat.mChannelsPerFrame = 2;
audioFormat.mBitsPerChannel = 16;
audioFormat.mBytesPerPacket = 4;
audioFormat.mBytesPerFrame = 4;
// Apply format
status = AudioUnitSetProperty(m_audio_unit,
kAudioUnitProperty_StreamFormat,
kAudioUnitScope_Input,
kOutputBus,
&audioFormat,
sizeof(audioFormat));
if(status != noErr) {
NSLog(#"Failed to set format descriptor: %d", status);
}
// Set output callback
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = playbackCallback;
callbackStruct.inputProcRefCon = self;
status = AudioUnitSetProperty(m_audio_unit,
kAudioUnitProperty_SetRenderCallback,
kAudioUnitScope_Global,
kOutputBus,
&callbackStruct,
sizeof(callbackStruct));
if(status != noErr) {
NSLog(#"Failed to set output callback: %d", status);
}
// Initialize (This is where it fails on the simulator)
status = AudioUnitInitialize(m_audio_unit);
if(status != noErr) {
NSLog(#"Failed to initialise audio unit: %d", status);
}
}
My Xcode version is 3.2.2 (64-bit)
My Simulator version is 3.2 (Though the same issue occurs in 3.1.3 Debug or Release)
Thanks, I appreciate it!
Compiling for a device and for the simulator are totally different things. Most common operations have the same expected result: loading a view, switching between views, playing sounds, and so on. However, when it comes to other things, like playing sound with OpenAL, loading 10 buffers, and then switching between them, the simulator cannot handle that while the devices can.
The way I see it, as long as it works on the device, that's all I care about. Try not to pull your hair out just to make an application work on the simulator when it works fine on the device.
Hope that helps,
Pk
Did you configure and enable an Audio Session prior to calling your RemoteIO initialization code?
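(For reference, a minimal sketch of the kind of session setup typically done before the RemoteIO code above; the category here is an assumption, so pick kAudioSessionCategory_PlayAndRecord if you also capture input.)
// Sketch: configure and activate an audio session before AudioUnitInitialize.
AudioSessionInitialize(NULL, NULL, NULL, NULL);
UInt32 category = kAudioSessionCategory_MediaPlayback; // assumption: output only
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(category), &category);
AudioSessionSetActive(true);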
When you are setting the stream properties to the input bus, you are using kOutputBus for your input scope. That's probably not good. Also, you probably don't need to apply the render callback to the global scope, as you only need it for output. Furthermore, I think that your definitions of kOutputBus and kInputBus are wrong... when I look at working iPhone Audio code, it uses 0 for the input bus and 1 for the output bus.
I can also think of a few minor things in regards to the AudioStreamBasicDescription, though I don't think these will make much of a difference:
Add the kAudioFormatFlagsNativeEndian property to your format flags
Explicitly set the mReserved field to 0.
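(A sketch of the format block with those two tweaks applied; the rest of the fields are unchanged from the question:)
AudioStreamBasicDescription audioFormat = {0}; // zero-init also clears mReserved
audioFormat.mSampleRate       = 44100.0;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                              | kAudioFormatFlagIsPacked
                              | kAudioFormatFlagsNativeEndian;
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 2;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerPacket   = 4;
audioFormat.mBytesPerFrame    = 4;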

Detecting if headphones are plugged into iPhone

Does anyone know if you can detect whether headphones are plugged into the iPhone, and if they aren't, disable sound from your application?
I think I could manage disabling the sound, but I have yet to find anything on the detection part.
Thanks
With this code you can detect the changes between:
MicrophoneWired
Headphone
LineOut
Speaker
Detecting when an iOS Device connector was plugged/unplugged
Note: Since iOS 5, part of the "audioRouteChangeListenerCallback(...)" behavior is deprecated, but you can update it with:
// kAudioSession_AudioRouteChangeKey_PreviousRouteDescription -> Previous route
// kAudioSession_AudioRouteChangeKey_CurrentRouteDescription  -> Current route
CFDictionaryRef newRouteRef = CFDictionaryGetValue(routeChangeDictionary, kAudioSession_AudioRouteChangeKey_CurrentRouteDescription);
NSDictionary *newRouteDict = (NSDictionary *)newRouteRef;
// RouteDetailedDescription_Outputs -> Output
// RouteDetailedDescription_Inputs  -> Input
NSArray *paths = [[newRouteDict objectForKey:@"RouteDetailedDescription_Outputs"] count] ? [newRouteDict objectForKey:@"RouteDetailedDescription_Outputs"] : [newRouteDict objectForKey:@"RouteDetailedDescription_Inputs"];
NSString *newRouteString = [[paths objectAtIndex:0] objectForKey:@"RouteDetailedDescription_PortType"];
// newRouteString -> MicrophoneWired, Speaker, LineOut, Headphone
Greetings
http://developer.apple.com/iphone/library/samplecode/SpeakHere/Introduction/Intro.html
In this project there is a code snippet where it pauses recording if the headphones are unplugged. Maybe you can use it to achieve your result.
Good luck!
(edit)
You will have to study the SpeakHereController.mm file.
I found this code in the awakeFromNib method
// we do not want to allow recording if input is not available
error = AudioSessionGetProperty(kAudioSessionProperty_AudioInputAvailable, &size, &inputAvailable);
if (error) printf("ERROR GETTING INPUT AVAILABILITY! %d\n", error);
btn_record.enabled = (inputAvailable) ? YES : NO;
// we also need to listen to see if input availability changes
error = AudioSessionAddPropertyListener(kAudioSessionProperty_AudioInputAvailable, propListener, self);
if (error) printf("ERROR ADDING AUDIO SESSION PROP LISTENER! %d\n", error);
Here is a solution; you may find it helpful.
Before using the method below, also add these two lines:
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
- (void)isHeadsetPluggedIn {
    UInt32 routeSize = sizeof(CFStringRef);
    CFStringRef route;
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &routeSize, &route);
    //NSLog(@"Error >>>>>>>>>> :%@", error);
    /* Known values of route:
     "Headset"
     "Headphone"
     "Speaker"
     "SpeakerAndMicrophone"
     "HeadphonesAndMicrophone"
     "HeadsetInOut"
     "ReceiverAndMicrophone"
     "Lineout" */
    NSString *routeStr = (NSString *)route;
    NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
    NSRange receiverRange = [routeStr rangeOfString:@"Receiver"];
    if (headsetRange.location != NSNotFound) {
        // Don't change the route if the headset is plugged in.
        NSLog(@"headphone is plugged in");
    } else if (receiverRange.location != NSNotFound) {
        // Change to play on the speaker
        NSLog(@"play on the speaker");
    } else {
        NSLog(@"Unknown audio route.");
    }
}
To perform a one-off check to determine if headphones are plugged in (rather than setting a callback when they're unplugged) I found the following works in iOS5 and above:
- (BOOL)isAudioJackPlugged
{
    // initialise the audio session - this should only be done once - so move this line to your AppDelegate
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    UInt32 routeSize;
    // oddly, omitting this call caused an error
    AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &routeSize);
    CFDictionaryRef desc; // this is the dictionary to contain descriptions
    // make the call to get the audio description and populate the desc dictionary
    AudioSessionGetProperty(kAudioSessionProperty_AudioRouteDescription, &routeSize, &desc);
    // the dictionary contains 2 keys, for input and output. Get the output array
    CFArrayRef outputs = CFDictionaryGetValue(desc, kAudioSession_AudioRouteKey_Outputs);
    // the output array contains 1 element - a dictionary
    CFDictionaryRef dict = CFArrayGetValueAtIndex(outputs, 0);
    // get the output description from the dictionary
    CFStringRef output = CFDictionaryGetValue(dict, kAudioSession_AudioRouteKey_Type);
    /**
     These are the possible output types:
     kAudioSessionOutputRoute_LineOut
     kAudioSessionOutputRoute_Headphones
     kAudioSessionOutputRoute_BluetoothHFP
     kAudioSessionOutputRoute_BluetoothA2DP
     kAudioSessionOutputRoute_BuiltInReceiver
     kAudioSessionOutputRoute_BuiltInSpeaker
     kAudioSessionOutputRoute_USBAudio
     kAudioSessionOutputRoute_HDMI
     kAudioSessionOutputRoute_AirPlay
     */
    BOOL isHeadphones = (CFStringCompare(output, kAudioSessionOutputRoute_Headphones, 0) == kCFCompareEqualTo);
    CFRelease(desc); // release the copied route description to avoid a leak
    return isHeadphones;
}
For those keeping score at home, that's a string in a dictionary in an array in a dictionary.