I have an iOS app that plays some audio feedback in certain places, but I want any music the user already has playing in the background to be able to play over it. In addition, I want the audio in my app to respect the mute switch. According to the developer documentation, the AVAudioSession ambient category should enable all of this. This is the code I'm using:
if (!hasInitialisedAudioSession) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryAmbient error:NULL];
    [session setActive:YES error:NULL];
    hasInitialisedAudioSession = YES;
}
The code executes just fine, and it does indeed let my app's sounds play over iPod music. What it doesn't do, however, is respect the mute switch. I've tried swapping this code out for the equivalent C audio session calls (AudioSessionSetProperty and friends) instead of the Objective-C ones, but I get the same result: the ambient category simply won't respect the mute switch, despite what the documentation says it should do.
Any ideas? Thanks for the help :)
I think I managed to work it out - it turns out it has nothing to do with my app at all, but rather with the iPod app. My app obeys the mute switch as it should when the iPod isn't playing, and it lets the iPod play over it - all the behaviour I wanted. When the iPod is playing, however, the app stops responding to the mute switch, so I think it's just something the iPod does to the device's audio settings. I could probably work around it if I really wanted to spend the time on it, but as long as the app obeys the mute switch when the iPod isn't playing, that's good enough for me.
EDIT: to work around this, use the function below to determine manually whether the mute switch is on, and don't play your sounds if the result is YES. This could be a bit of a pain if you don't have a central audio manager class, though. It would be nice if Apple documented this behaviour.
- (BOOL)deviceIsSilenced
{
#if TARGET_IPHONE_SIMULATOR
    // Return NO in the simulator. This code causes crashes there for some reason.
    return NO;
#else
    // An empty audio route string means the device is silenced by the mute switch.
    CFStringRef route;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
    return (CFStringGetLength(route) <= 0);
#endif
}
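For example, in a hypothetical central audio manager the check would be used like this (playFeedbackSound is just a stand-in for whatever method actually plays your sound):
// Only play the app's sounds when the device is not silenced.
if (![self deviceIsSilenced]) {
    [self playFeedbackSound]; // hypothetical method that plays the audio feedback
}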
Related
I have used the following code to stream audio input/output from the microphone. What I want to do is select the rear microphone for recording. I have read that setting kAudioSessionProperty_Mode to kAudioSessionMode_VideoRecording can do this, but I am not sure how to use it with my code. Can anyone help me set this parameter successfully?
I have these lines for setting the property:
status = AudioUnitSetProperty(audioUnit,
                              kAudioSessionProperty_Mode,
                              kAudioSessionMode_VideoRecording,
                              kOutputBus,
                              &audioFormat,
                              sizeof(audioFormat));
checkStatus(status);
but it's not working.
In the Apple developer library documentation for AudioChannelLayout you can see this struct:
struct AudioChannelLayout {
    AudioChannelLayoutTag mChannelLayoutTag;
    UInt32 mChannelBitmap;
    UInt32 mNumberChannelDescriptions;
    AudioChannelDescription mChannelDescriptions[1];
};
typedef struct AudioChannelLayout AudioChannelLayout;
You can change the number of channel descriptions to 2 to use the secondary microphone.
I did some searching and reading, and finally ended up in the AVCaptureDevice Class Reference. The key command here for you is NSLog(@"%@", [AVCaptureDevice devices]);. I ran this with my iPhone attached and got this:
"<AVCaptureFigVideoDevice: 0x1fd43a50 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
"<AVCaptureFigVideoDevice: 0x1fd47230 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
"<AVCaptureFigAudioDevice: 0x1fd46730 [Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"
Only one microphone ever shows up in the list. So to answer your question, it cannot be done (yet).
Your code:
status = AudioUnitSetProperty(audioUnit,
                              kAudioSessionProperty_Mode,
                              kAudioSessionMode_VideoRecording,
                              kOutputBus,
                              &audioFormat,
                              sizeof(audioFormat));
checkStatus(status);
is not working because the code is not correct: audio SESSIONS are not properties of audio UNITS. The audio session describes the general behaviour of your app with hardware resources, and how it cooperates with other demands on those same resources from other apps and other parts of the system. It is your best chance of taking control of input and output hardware, but it does not give you total control, because the iOS frameworks treat the overall user experience as the top priority.
Your app has a single audio session, which you can initialise, activate and deactivate, and whose properties you can get and set. Since iOS 6 most of these properties can be addressed using the AVFoundation singleton AVAudioSession object, but to get full access you will still want to use the Core Audio function syntax.
To set the audio session mode to "VideoRecording" using AVFoundation you would do something like this:
- (void)configureAVAudioSession
{
    // Get your app's audio session singleton object
    AVAudioSession *session = [AVAudioSession sharedInstance];

    // Error handling
    BOOL success;
    NSError *error;

    // Set the audio session category.
    // It needs to be Record or PlayAndRecord to use VideoRecording mode:
    success = [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    if (!success) NSLog(@"AVAudioSession error setting category: %@", error);

    // Set the audio session mode
    success = [session setMode:AVAudioSessionModeVideoRecording error:&error];
    if (!success) NSLog(@"AVAudioSession error setting mode: %@", error);

    // Activate the audio session
    success = [session setActive:YES error:&error];
    if (!success) NSLog(@"AVAudioSession error activating: %@", error);
    else NSLog(@"audioSession active");
}
The same functionality using Core Audio functions (iOS 5 and below). checkStatus is the error-handling function from your code sample.
- (void)configureAudioSession
{
    OSStatus status;

    // Initialise the audio session
    status = AudioSessionInitialize(NULL,                      // run loop
                                    kCFRunLoopDefaultMode,     // run loop mode
                                    NULL,                      // MyInterruptionListener
                                    (__bridge void *)(self));  // user info
    checkStatus(status);

    // Set the audio session category
    UInt32 category = kAudioSessionCategory_PlayAndRecord;
    status = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                                     sizeof(category),
                                     &category);
    checkStatus(status);

    // Set the audio session mode
    UInt32 mode = kAudioSessionMode_VideoRecording;
    status = AudioSessionSetProperty(kAudioSessionProperty_Mode,
                                     sizeof(mode),
                                     &mode);
    checkStatus(status);

    // Activate the audio session
    status = AudioSessionSetActive(true);
    checkStatus(status);
}
The reason you have been told to use VideoRecording mode is that it is the only mode that gives you any hope of directly selecting the rear mic: what it does is select the mic nearest to the video camera.
"On devices with more than one built-in microphone, the microphone closest to the video camera is used." (From Apple's AVSession Class Reference)
This suggests that the video camera will need to be active when using the mic, and that the choice of camera (front or back) is the parameter the system uses to select the appropriate microphone. It may be that video-free apps using the rear mic (such as your example) are in fact getting a video input stream from the rear camera and not doing anything with it. I am unable to test this as I do not have access to an iPhone 5. I do see that the "Babyscope" app you mentioned ships an entirely different app for iOS 5 than for iOS 4.
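Purely as a thought experiment, a rough, untested sketch of feeding a capture session a back-camera video input alongside the audio input might look like the following (the AVFoundation classes and calls are real; the method name is made up, and whether this actually re-routes capture to the rear mic is exactly the open question above):
#import <AVFoundation/AVFoundation.h>

// Untested sketch: add a back-camera video input as well as the audio input,
// on the theory that the camera choice is what drives microphone selection.
- (void)startCaptureWithRearMicAttempt
{
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];

    AVCaptureDevice *backCamera = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionBack) { backCamera = device; break; }
    }
    AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:&error];

    if (videoInput && [captureSession canAddInput:videoInput]) [captureSession addInput:videoInput];
    if (audioInput && [captureSession canAddInput:audioInput]) [captureSession addInput:audioInput];

    [captureSession startRunning];
}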
The answer from Kuriakose is misleading: AudioChannelLayout is a description of an audio track; it has no effect on the audio hardware used for capture. The answer from Sangony just shows us that Apple do not really want us to have full control over the hardware. Much of its audio management on iOS is an attempt to keep us away from direct control, in order to accommodate both user expectations (of audio I/O behaviour between apps) and hardware limitations when dealing with live signals.
I'm trying to record and play audio in a universal app. The AVAudioSession configuration is:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error: nil];
UInt32 category = kAudioSessionCategory_PlayAndRecord;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(category), &category);
This works fine on devices with a built-in microphone, but on a third-generation iPod touch I am seeing random, undesired behaviour. Sometimes everything works fine, and sometimes it only works when earphones with an integrated mic are plugged in; in that case, it is not possible to hear any sound from the app without the earphones.
The 3rd-generation iPod touch doesn't have a built-in microphone; the only way to get one is to plug in headphones with a mic.
The only iPod touch generation with a built-in microphone is the 4th.
You can check whether audio input is available:
UInt32 micConnected = 0;
UInt32 propertySize = sizeof(micConnected);
AudioSessionGetProperty(kAudioSessionProperty_AudioInputAvailable, &propertySize, &micConnected);
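If you are already working with the AVAudioSession object, the equivalent check is a property on the shared session (called inputIsAvailable before iOS 6 and inputAvailable from iOS 6 on):
// Pre-iOS 6 property name; on iOS 6+ use session.inputAvailable instead.
BOOL micAvailable = [[AVAudioSession sharedInstance] inputIsAvailable];
if (!micAvailable) {
    NSLog(@"No audio input available - recording will not work");
}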
I know that Apple does not provide a way to detect the state of the iPhone mute/silent switch in iOS 5. However, I have tried a technique mentioned elsewhere to detect it by playing an audio file and measuring its playback time. Even though my iPhone was muted, the audio file still played for its entire duration. I'm using [AVAudioPlayer play] and computing the time before audioPlayerDidFinishPlaying: is called. Do you know of a trick to play an audio file that completes early/immediately if the phone is muted?
UPDATE: Sorry, the code posted below works fine for me, but I am using iOS 4. For iOS 5, this answer is what will solve your problem - Detecting the iPhone's Ring / Silent / Mute switch using AVAudioPlayer not working?
This piece of code is what you need. Define a global gAudioSessionInited flag so the session is only initialised once; the Ambient category then makes your sounds respect the mute switch.
// "Ambient" makes it respect the mute switch. Must call this once to init session
if (!gAudioSessionInited)
{
AudioSessionInterruptionListener inInterruptionListener = NULL;
OSStatus error;
if ((error = AudioSessionInitialize (NULL, NULL, inInterruptionListener, NULL)))
NSLog(#"*** Error *** error in AudioSessionInitialize: %d.", error);
else
gAudioSessionInited = YES;
}
SInt32 ambient = kAudioSessionCategory_AmbientSound;
if (AudioSessionSetProperty (kAudioSessionProperty_AudioCategory, sizeof (ambient), &ambient))
{
NSLog(#"*** Error *** could not set Session property to ambient.");
}
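For reference, the iOS 5 approach in the linked answer above boils down to timing a short system sound: System Sound Services respects the ring/silent switch, so when the phone is muted the sound's completion callback fires almost immediately. A rough, untested sketch of that idea, assuming a roughly half-second silent file named mute-test.caf in your bundle (the file name and the 0.1-second threshold are made up):
#import <AudioToolbox/AudioToolbox.h>

static CFAbsoluteTime sMuteTestStart;

// Completion callback: if the ~0.5 s sound "finished" in well under that time,
// the ring/silent switch is almost certainly set to silent.
static void MuteTestCompleted(SystemSoundID soundID, void *clientData)
{
    CFAbsoluteTime elapsed = CFAbsoluteTimeGetCurrent() - sMuteTestStart;
    BOOL muted = (elapsed < 0.1);
    NSLog(@"Mute switch appears to be %@", muted ? @"ON" : @"OFF");
    AudioServicesRemoveSystemSoundCompletion(soundID);
    AudioServicesDisposeSystemSoundID(soundID);
}

- (void)checkMuteSwitch
{
    // "mute-test.caf" is a hypothetical ~0.5 second silent sound in the app bundle.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"mute-test" withExtension:@"caf"];
    SystemSoundID soundID = 0;
    if (AudioServicesCreateSystemSoundID((__bridge CFURLRef)url, &soundID) != kAudioServicesNoError)
        return;
    AudioServicesAddSystemSoundCompletion(soundID, CFRunLoopGetMain(), kCFRunLoopDefaultMode,
                                          MuteTestCompleted, NULL);
    sMuteTestStart = CFAbsoluteTimeGetCurrent();
    AudioServicesPlaySystemSound(soundID);
}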
I've been going through the audio session categories and overrides, and it appears that you can either play audio regardless of whether the silent switch is set or the screen is locked (AVAudioSessionCategoryPlayback), OR you can respect the silent switch and the screen lock settings (AVAudioSessionCategorySoloAmbient).
Any idea how I can play audio that gets muted by the silent switch, but keeps playing when the screen is locked? All I want is to have my cake and eat it too.
To answer the inevitable "why would you ever want to circumvent the system" questions, this is for an app that shows words and plays music which is likely to be used in both a church and home setting. At church it would be bad if the app started playing music, hence I want to respect the mute switch. At home you might set the phone down to listen to music while doing something else, and you wouldn't want it to stop playing when the phone auto-locks.
You can use AVAudioSessionCategoryPlayback and use this code to check whether the silent switch is on or off:
- (BOOL)deviceIsSilenced
{
#if TARGET_IPHONE_SIMULATOR
    // Return NO in the simulator. This code causes crashes there for some reason.
    return NO;
#else
    // An empty audio route string means the device is silenced by the mute switch.
    CFStringRef route;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
    return (CFStringGetLength(route) <= 0);
#endif
}
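Putting the two together - a minimal sketch, where startMusic stands in for whatever method actually starts your AVAudioPlayer:
// Playback category keeps the audio running when the screen locks;
// the mute switch is honoured manually via deviceIsSilenced above.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL];
[[AVAudioSession sharedInstance] setActive:YES error:NULL];

if (![self deviceIsSilenced]) {
    [self startMusic]; // hypothetical method that starts the AVAudioPlayer
}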
I have a question regarding sounds in my app. I need the user to be able to mute all sounds coming from my game, for example if they just want to listen to the iPod while playing. There is a similar question here (Disable all program sounds) but there doesn't seem to be an answer. At the moment I have an AVAudioSession set to AVAudioSessionCategoryAmbient, which allows the iPod to play but also allows my app to play game sounds. Is the best way to achieve this just to set a boolean when a mute button is clicked and check it each time a sound should be played? That seems kind of awkward, although it would work... any ideas, please?
Many thanks
Jules
If the iPod is playing when the game begins, you can mute the sounds automatically.
e.g.,
UInt32 userPlayback = 0;
UInt32 propertySize = sizeof(userPlayback);
AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying,
                        &propertySize, &userPlayback);
BOOL userMusicPlaying = (userPlayback != 0);
Then if userMusicPlaying is YES, don't let the sounds play. (Consider also adding a switch that lets the sounds play even with the iPod playing.) Otherwise, I don't think there's a way to disable the sounds without you or the user disabling them.
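On iOS 6 and later the same check is also exposed on AVAudioSession itself, which saves the C call (the property name is the iOS 6 one):
// iOS 6+: ask the shared audio session directly whether other audio is playing.
BOOL userMusicPlaying = [[AVAudioSession sharedInstance] isOtherAudioPlaying];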
Put isMuted in the user defaults
if (![[NSUserDefaults standardUserDefaults] boolForKey:@"isMuted"]) {
    playSound(); // only play when not muted
}
to mute:
[[NSUserDefaults standardUserDefaults] setBool:YES forKey:@"isMuted"];
If you call the following it will write the settings to disk and remember what the user did last time:
[[NSUserDefaults standardUserDefaults] synchronize];
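Putting the pieces together, a hypothetical mute-button action (the method name is made up) might look like this:
// Toggle the persisted mute flag whenever the user taps the mute button.
- (IBAction)muteButtonTapped:(id)sender
{
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    BOOL muted = [defaults boolForKey:@"isMuted"];
    [defaults setBool:!muted forKey:@"isMuted"];
    [defaults synchronize]; // write to disk so the setting survives relaunch
}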