Use rear microphone of iPhone 5 - iPhone

I have used the following code to stream audio i/o from the microphone. What I want to do is select the rear microphone for recording. I have read that setting kAudioSessionProperty_Mode to kAudioSessionMode_VideoRecording can do the job, but I am not sure how to use it with my code. Can anyone help me set this parameter successfully?
I have these lines for setting the property
status = AudioUnitSetProperty(audioUnit,
                              kAudioSessionProperty_Mode,
                              kAudioSessionMode_VideoRecording,
                              kOutputBus,
                              &audioFormat,
                              sizeof(audioFormat));
checkStatus(status);
but it's not working.

In the Apple developer library you can see this structure:
struct AudioChannelLayout {
AudioChannelLayoutTag mChannelLayoutTag;
UInt32 mChannelBitmap;
UInt32 mNumberChannelDescriptions;
AudioChannelDescription mChannelDescriptions[1];
};
typedef struct AudioChannelLayout AudioChannelLayout;
You can change the AudioChannelDescription count to 2 to use the secondary microphone.

I did some searching and reading, and finally ended up in the AVCaptureDevice Class Reference. The key command here for you is NSLog(@"%@", [AVCaptureDevice devices]);. I ran this with my iPhone attached and got this:
"<AVCaptureFigVideoDevice: 0x1fd43a50 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
"<AVCaptureFigVideoDevice: 0x1fd47230 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
"<AVCaptureFigAudioDevice: 0x1fd46730 [Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"
Only one microphone ever shows up in the list. So to answer your question, it cannot be done (yet).

Your code:
status = AudioUnitSetProperty(audioUnit,
                              kAudioSessionProperty_Mode,
                              kAudioSessionMode_VideoRecording,
                              kOutputBus,
                              &audioFormat,
                              sizeof(audioFormat));
checkStatus(status);
is not working because the code is not correct. Audio SESSIONS are not properties of audio UNITS. The audio session describes the general behaviour of your app with hardware resources, and how it cooperates with other demands on those same resources by other apps and other parts of the system. It is your best chance of taking control of input and output hardware, but it does not give you total control, because the iOS frameworks put the overall user experience first.
Your app has a single audio session, which you can initialise, activate and deactivate, and whose properties you can get and set. Since iOS 6 most of these properties can be addressed through the AVFoundation singleton AVAudioSession object, but for full access you will still want to use the Core Audio function syntax.
To set the audio session mode to "VideoRecording" using AVFoundation you would do something like this:
- (void)configureAVAudioSession
{
    // get your app's audioSession singleton object
    AVAudioSession *session = [AVAudioSession sharedInstance];

    // error handling
    BOOL success;
    NSError *error = nil;

    // set the audioSession category.
    // Needs to be Record or PlayAndRecord to use VideoRecording mode:
    success = [session setCategory:AVAudioSessionCategoryPlayAndRecord
                             error:&error];
    if (!success) NSLog(@"AVAudioSession error setting category: %@", error);

    // set the audioSession mode
    success = [session setMode:AVAudioSessionModeVideoRecording error:&error];
    if (!success) NSLog(@"AVAudioSession error setting mode: %@", error);

    // activate the audio session
    success = [session setActive:YES error:&error];
    if (!success) NSLog(@"AVAudioSession error activating: %@", error);
    else NSLog(@"audioSession active");
}
The same functionality using Core Audio functions (iOS 5 and below); checkStatus is the error-handling function from your code sample:
- (void)configureAudioSession
{
    OSStatus status;

    // initialise the audio session
    status = AudioSessionInitialize(NULL,                     // runloop
                                    kCFRunLoopDefaultMode,    // runloop mode
                                    NULL,                     // MyInterruptionListener
                                    (__bridge void *)(self)); // user info
    checkStatus(status);

    // set the audio session category
    UInt32 category = kAudioSessionCategory_PlayAndRecord;
    status = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                                     sizeof(category),
                                     &category);
    checkStatus(status);

    // set the audio session mode
    UInt32 mode = kAudioSessionMode_VideoRecording;
    status = AudioSessionSetProperty(kAudioSessionProperty_Mode,
                                     sizeof(mode),
                                     &mode);
    checkStatus(status);

    // activate the audio session
    status = AudioSessionSetActive(true);
    checkStatus(status);
}
The reason you have been told to use VideoRecording mode is because it is the only mode that will give you any hope of directly selecting the rear mic. What it does is select the mic nearest to the video camera.
"On devices with more than one built-in microphone, the microphone closest to the video camera is used." (From Apple's AVAudioSession Class Reference)
This suggests that the video camera will need to be active when using the mic, and the choice of camera from front to back is the parameter that the system uses to select the appropriate microphone. It may be that video-free apps using the rear mic (such as your example) are in fact getting a video input stream from the rear camera and not doing anything with it. I am unable to test this as I do not have access to an iPhone 5. I do see that the "Babyscope" app you mentioned has an entirely different app for running on ios5 vs. ios4.
The answer from Kuriakose is misleading: AudioChannelLayout is a description of an audio track; it has no effect on the audio hardware used in capture. The answer from Sangony just shows us that Apple does not really want us to have full control over the hardware. Much of its audio management on iOS is an attempt to keep us away from direct control, in order to accommodate both user expectations (of audio i/o behaviour between apps) and hardware limitations when dealing with live signals.
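For what it's worth, iOS 7 later added a supported route to the rear mic: AVAudioSession exposes the built-in microphone port and its data sources, each tagged with an orientation. A minimal sketch (assuming the session category is already Record or PlayAndRecord and the session is active; the function name is mine):

AVAudioSession *session = [AVAudioSession sharedInstance];
for (AVAudioSessionPortDescription *port in [session availableInputs]) {
    if (![port.portType isEqualToString:AVAudioSessionPortBuiltInMic]) continue;
    for (AVAudioSessionDataSourceDescription *source in port.dataSources) {
        // pick the data source whose orientation is "Back"
        if ([source.orientation isEqualToString:AVAudioSessionOrientationBack]) {
            NSError *error = nil;
            if (![port setPreferredDataSource:source error:&error])
                NSLog(@"Could not set back data source: %@", error);
            if (![session setPreferredInput:port error:&error])
                NSLog(@"Could not set preferred input: %@", error);
        }
    }
}

This does not apply to iOS 5/6, where the mode-based approach described in this answer is the only option.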

Related

Using different resolution presets with AVFoundation

I'm trying to use AVFoundation to have three recording modes: Audio, Video and Photo. Audio and Video work just fine, but the problem is, if I set the session preset to AVCaptureSessionPreset352x288, the still pictures are also saved at that resolution. If I change my session preset to AVCaptureSessionPresetPhoto, then the photos look great but the video stops working because that isn't a supported preset for video. I've tried creating multiple sessions, reassigning the session preset, etc. but nothing seems to work. Anyone have a way to make this work with the video at a low resolution and still images at full resolution?
Before taking the picture, set a new session preset:
// captureSession is your capture session object
[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPresetHigh;
[captureSession commitConfiguration];
Then call your capture image handler:
captureStillImageAsynchronouslyFromConnection: completionHandler:
Then change back to low res (= prevPreset):
[captureSession beginConfiguration];
captureSession.sessionPreset = prevPreset;
[captureSession commitConfiguration];
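The three steps above can be tied together in the completion handler, so the low-res preset is restored only after the capture finishes. A sketch, assuming captureSession, stillImageOutput and videoConnection already exist in your capture pipeline and prevPreset holds your low-res video preset:

[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPresetHigh;
[captureSession commitConfiguration];

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *jpeg = [AVCaptureStillImageOutput
                jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            // ...save jpeg...
        }
        // restore the low-res preset once the capture has completed
        [captureSession beginConfiguration];
        captureSession.sessionPreset = prevPreset;
        [captureSession commitConfiguration];
    }];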

Detecting iPhone mute switch in iOS 5

I know that Apple does not provide a way to detect the state of the iPhone mute/silence switch in iOS 5. However, I have tried a technique mentioned elsewhere: detect it by playing an audio file and measuring its runtime. Even though my iPhone was muted, the audio file still played for its entire duration. I'm using [AVAudioPlayer play] and computing the time before audioPlayerDidFinishPlaying is called. Do you know of a trick to play an audio file which will complete early/immediately if the phone is muted?
UPDATE: Sorry, the code posted below works fine for me, but I am using iOS 4. For iOS 5, this answer is what will solve your problem: Detecting the iPhone's Ring / Silent / Mute switch using AVAudioPlayer not working?
This piece of code is what you need. Define a global gAudioSessionInited var so the session is initialised only once; with the Ambient category set, playback respects the mute switch.
// "Ambient" makes it respect the mute switch. Must call this once to init session
if (!gAudioSessionInited)
{
    AudioSessionInterruptionListener inInterruptionListener = NULL;
    OSStatus error;
    if ((error = AudioSessionInitialize(NULL, NULL, inInterruptionListener, NULL)))
        NSLog(@"*** Error *** error in AudioSessionInitialize: %d.", error);
    else
        gAudioSessionInited = YES;
}

SInt32 ambient = kAudioSessionCategory_AmbientSound;
if (AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(ambient), &ambient))
{
    NSLog(@"*** Error *** could not set Session property to ambient.");
}

Has anyone been able to play a video file and show live camera feed at the same time in separate views on iOS?

I have been trying to do this for a few days now using AVFoundation, as well as trying to use MPMoviePlayerViewController. The closest I can get is allowing one to play at a time. I would like to think that this is possible because of FaceTime. However, I know this is a little different because there is no separate video file.
Any ideas would help, and thanks.
I'm not sure where this is documented, but to get AVCaptureVideoPreviewLayer and MPMoviePlayerViewController to play together at the same time you need to set a mixable audio session category first.
Here's one way to do that:
AVAudioSession* session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:nil];
UInt32 mixable = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(mixable), &mixable);
[session setActive:YES error:nil];
See the Audio Session Programming Guide and Audio Session Cookbook for more info.
Have you tried playing the video on one thread and recording video on another? That would allow both of them to run while maintaining their separation.

AVAudioSession category not working as documentation dictates

I have an iOS app that has some audio feedback in certain places, but I want any other music the user has playing in the background to be allowed to play over this. In addition, I want the audio in my app to respect the mute switch. According to the developer documentation, this functionality should all be enabled by the AVAudioSession ambient category. This is the code I'm using:
if (!hasInitialisedAudioSession) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryAmbient error:NULL];
    [session setActive:YES error:NULL];
    hasInitialisedAudioSession = YES;
}
The code is executing just fine, and it does indeed let the app sounds play over iPod music. What it doesn't do, however, is respect the mute switch. I've tried swapping this code out for similar C audio calls (stuff like AudioSessionSetProperty) instead of the Objective-C calls, but I get the same result - the ambient session category simply doesn't want to respect the mute switch, despite what the documentation says it should be doing.
Any ideas? Thanks for the help :)
I think I managed to work it out - turns out that it has nothing to do with my app at all, but rather the iPod app. My app obeys the mute switch as it should when the iPod isn't playing, and then allows the iPod to play over it - all behaviour I wanted. However, when the iPod is playing, the app stops responding to the mute switch, so I think it's just something the iPod does to the device audio settings. I could probably work a way around it if I really wanted to spend the time on it, but as long as it obeys the mute switch when the iPod isn't playing that's good enough for me.
EDIT: to work around this, just use this function to determine manually whether or not the mute switch is on, and don't play your sounds if the result is YES. Could be a bit of a pain if you don't have a central audio manager class, though. It would be nice if Apple could document this behaviour.
- (BOOL)deviceIsSilenced
{
#if TARGET_IPHONE_SIMULATOR
    // return NO in simulator. Code causes crashes for some reason.
    return NO;
#endif
    CFStringRef state;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &state);
    return (CFStringGetLength(state) <= 0);
}

iPhone SDK: Set audio route and also mute audio when mute switch is turned on

I have an application where user can record sound and also play the sound either in the speaker or the ear piece. To route the audio I set kAudioSessionProperty_OverrideAudioRoute in the following way:
if (loudSpeakerOn) {
    audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
}
else {
    audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
}
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
I should also make sure that when user switches on/off the mute/vibrate switch the sound should mute/unmute. For this I set the following property
SInt32 ambient = kAudioSessionCategory_AmbientSound;
if (AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(ambient), &ambient)) {
    NSLog(@"*** Error *** could not set Session property to ambient.");
}
Now, the problem is that when I set the Ambient category, the audio route override and audio recording stop working.
How can I get all of these working together?
Thanks!
Did you get this working? I believe setting the audio session's category to Ambient disallows recording. If you need to record and play audio simultaneously, use kAudioSessionCategory_PlayAndRecord instead; otherwise switch back and forth between the two categories as needed. Also, setting the category might reinitialise the routing, so you'll probably want to set the route override again after setting the category.
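One way to sketch that switch-back-and-forth approach, re-applying the route override after every category change (the function names are mine, and this assumes the session has already been initialised with AudioSessionInitialize):

static void enterRecordingMode(BOOL loudSpeakerOn)
{
    // PlayAndRecord allows simultaneous recording and playback
    UInt32 category = kAudioSessionCategory_PlayAndRecord;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    // changing the category can reset routing, so set the override again
    UInt32 route = loudSpeakerOn ? kAudioSessionOverrideAudioRoute_Speaker
                                 : kAudioSessionOverrideAudioRoute_None;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                            sizeof(route), &route);
}

static void enterPlaybackOnlyMode(void)
{
    // Ambient respects the mute switch, but disallows recording
    UInt32 category = kAudioSessionCategory_AmbientSound;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);
}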