AVAudioRecorder meter delay on iPad / iPhone

I am having trouble getting an accurate meter reading from the AVAudioRecorder (testing on iPad).
It seems to work fine while the volume is rising; however, there is a delay when the volume drops. For example: I speak into the mic and slowly raise my voice. The readings climb from -35.567 to -34.678 and up to -10.579 as I would hope, but when I stop talking there is a delay of 1-2 seconds before the value drops back down to -35.567 (or whatever it happens to be).
The NSLog output continues to update from the loop, but the meter value stays the same during the delay even though the sound has long since ended.
I have added the gist of the code below and would be happy to supply full code if need be.
I initialize the recorder like so:
AVAudioSession * audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error: &error];
[audioSession setActive:YES error: &error];
NSMutableDictionary* recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];
recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithString:@"Recording.caf"]]];
recorder = [[ AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
[recorder setDelegate:self];
[recorder prepareToRecord];
[recorder setMeteringEnabled:YES];
[recorder record];
and update the meter in a loop:
- (void)loop:(ccTime)dt
{
    if (isRecording == YES)
    {
        // get volume levels
        [recorder updateMeters];
        float level = [recorder peakPowerForChannel:0];
        NSLog(@"Vol: %f", level);
    }
}
Edit: I should also mention that I am using the cocos2d scheduler for the loop:
[self schedule:@selector(loop:)];
Any ideas why there would be such a long delay?
Edit: I have tried using the average power instead, and it has no delay, so I could possibly use that as a workaround. However, I would rather not rely on the averaged reading, and it would be nice to understand what is going on.
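For reference, the averaging workaround is just the same loop body reading the averaged level (a minimal sketch):
// Same loop body as above, but averagePowerForChannel: does not show the
// 1-2 second hang that peakPowerForChannel: does.
[recorder updateMeters];
float level = [recorder averagePowerForChannel:0];
NSLog(@"Avg vol: %f", level);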

I'm sure most have figured this out by now, but if you want less lag in your metering you need to use Audio Queue or RemoteIO. See the better explanation here:
Confusion with meters in AVAudioRecorder

You can fix this by resetting the meteringEnabled property to YES.
yourRecorderName.meteringEnabled = YES;
Call this every time you want the levels to reset to ambient levels. This takes about 0.02 seconds, and in that time the levels will briefly drop down to 0, or -120 dB before resetting to ambient.
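In code form, the reset is just this (a minimal sketch, assuming your recorder instance is named recorder):
// Re-assigning meteringEnabled resets the meters; expect a brief dip to
// -120 dB (roughly 0.02 s) before the reading settles back to the ambient level.
recorder.meteringEnabled = YES;
[recorder updateMeters];
NSLog(@"Peak after reset: %f", [recorder peakPowerForChannel:0]);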
Alternatively, you can use:
[yourRecorderName stop];
[yourRecorderName record];
This takes about 0.05 seconds, but the levels won't drop down to 0 in the wait time. In fact, nothing will happen because in this case, it actually takes the recorder object 0.05 seconds to stop and start recording again.

How about using a timer? It would be much quicker. After creating the recorder with an NSError:
NSError *error;
// ... set up the recorder with initWithURL:settings:error: as above ...
if (recorder) {
    recorder.meteringEnabled = YES;
    [recorder record];
    levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.2 target:self selector:@selector(levelTimerCallback:) userInfo:nil repeats:YES];
} else {
    NSLog(@"Error: %@", [error description]);
}
Here levelTimer is the NSTimer that calls the method that does what you want (levelTimerCallback:), updates the meters, etc.
- (void)levelTimerCallback:(NSTimer *)timer
{
    [recorder updateMeters];
    float level = [recorder peakPowerForChannel:0];
    NSLog(@"Vol: %f", level);
}

Related

Edit averagePowerForChannel of AVAudioPlayer in iPhone

I am playing an .mp3 file using AVAudioPlayer:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Speech" ofType:@"mp3"]];
NSError *error;
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
if (error)
{
    NSLog(@"Error in audioPlayer: %@",
          [error localizedDescription]);
} else {
    audioPlayer.delegate = self;
    audioPlayer.numberOfLoops = -1;
    [audioPlayer prepareToPlay];
    [audioPlayer play];
}
audioPlayer.meteringEnabled = YES;
audioPlayer.volume = 0;
NSTimer *playerTimer = nil;
if (!playerTimer)
{
    playerTimer = [NSTimer scheduledTimerWithTimeInterval:0.001
                                                   target:self
                                                 selector:@selector(monitorAudioPlayer)
                                                 userInfo:nil
                                                  repeats:YES];
}
[audioPlayer updateMeters];
- (void)monitorAudioPlayer {
    [audioPlayer updateMeters];
    for (int i = 0; i < audioPlayer.numberOfChannels; i++)
    {
        float power = [audioPlayer averagePowerForChannel:i];
        float peak = [audioPlayer peakPowerForChannel:i];
    }
}
With this I get float power values for the audio player.
Is there any way to change the power or peak value of this audio player so that the sound changes into some funny sound or something similar?
Is this possible using AVAudioSession or something else?
I have done this using DIRAC, but with that approach I am not able to get the power value, as it always returns a static power value.
Thanks in advance.
The average and peak power levels are not writable; they are calculated characteristics of the sample data.
player.volume = x where x is greater than 1.0 will amplify the sound level before it comes out of the speakers. If you set it to a stupidly high number like 20 or 100, it will end up sounding pretty funny!
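As a quick illustration of that suggestion (just a sketch; audioPlayer is the player from the question above):
// Deliberately over-driving the volume as suggested; expect clipping and
// distortion, which is the "funny" effect being described.
audioPlayer.volume = 20.0;
[audioPlayer play];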
If you want to do any more funny stuff to the samples, I think you'll have to use a low level API like RemoteIO or AudioQueue.

iPhone: keep audio recording app running in the background

I have searched for suitable answers to my question but have not found anything helpful so far.
I want to record the decibel level in the environment. If a specific threshold is exceeded, the app should play a sound or song file. Everything works fine so far, but I am having trouble keeping the app running in the background.
I have already added the attribute "Application does not run in the background" and set its value to "NO". I've read that one should add the "external-accessory" element to the "Required background modes". I added that too, but it still does not work.
I am using the AVAudioRecorder to record the sound and the AVPlayer to play the sound/music file. At first I used the MPMediaController iPodMusicPlayer, but it throws an exception when combined with the "Required background modes" attribute.
EDIT:
I am using Xcode 4.5 with iOS 6.
EDIT 2:
When I add the string voip to the "Required background modes", it seems to continue recording while in the background. But it still does not play the music file while in the background. I also tried adding the "audio" value, but it did not help.
EDIT 3:
I've consulted Apple's developer reference. It seems you have to configure your AVAudioSession; with that it seems to work (link to reference). But now I have trouble playing more than one file because, as soon as the first track has finished playing, the app goes into suspended mode again. As far as I know there is no way to initialize the AVPlayer or AVAudioPlayer with more than one file. I used the delegate method audioPlayerDidFinishPlaying:successfully: to set the next track, but it did not work.
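For reference, a minimal AVAudioSession setup of the kind described there looks roughly like this (a sketch; it assumes the "audio" value is also listed under "Required background modes" in Info.plist):
// Activate a play-and-record session so audio work can continue in the background.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
[session setActive:YES error:&sessionError];
if (sessionError) {
    NSLog(@"Audio session error: %@", [sessionError localizedDescription]);
}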
EDIT 4: OK, one possibility is to avoid stopping the recorder, i.e. removing the [recorder stop] call so that it keeps recording even while music is played. It is a workaround that works, but I would still appreciate any other (better) solution, one that doesn't need to keep the recorder running all the time.
the relevant code:
I initialize everything in the viewDidLoad method:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    //[MPMusicPlayerController applicationMusicPlayer];
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
    lowPassResults = -120.0;
    thresholdExceeded = NO;
    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.03 target:self selector:@selector(levelTimerCallback:) userInfo:nil repeats:YES];
    } else {
        NSLog(@"%@", [error description]);
    }
}
The levelTimer callback that is called every 0.03 seconds:
- (void)levelTimerCallback:(NSTimer *)timer {
    // refreshes the average and peak power meters (the meter uses a logarithmic scale,
    // with -160 being complete quiet and zero being maximum input)
    [recorder updateMeters];
    const double ALPHA = 0.05;
    float averagePowerForChannel = [recorder averagePowerForChannel:0];
    // adjust the referential
    averagePowerForChannel = averagePowerForChannel / 0.6;
    // convert the values
    lowPassResults = ALPHA * averagePowerForChannel + (1.0 - ALPHA) * lowPassResults;
    float db = lowPassResults + 120;
    db = db < 0 ? 0 : db;
    if (db >= THRESHOLD)
    {
        [self playFile];
    }
}
Finally the playFile method which plays the music file:
- (void)playFile {
    NSString *title = @"(You came down) For a day";
    NSString *artist = @"Forge";
    NSMutableArray *songItemsArray = [[NSMutableArray alloc] init];
    MPMediaQuery *loadSongsQuery = [[MPMediaQuery alloc] init];
    MPMediaPropertyPredicate *artistPredicate = [MPMediaPropertyPredicate predicateWithValue:artist forProperty:MPMediaItemPropertyArtist];
    MPMediaPropertyPredicate *titlePredicate = [MPMediaPropertyPredicate predicateWithValue:title forProperty:MPMediaItemPropertyTitle];
    [loadSongsQuery addFilterPredicate:artistPredicate];
    [loadSongsQuery addFilterPredicate:titlePredicate];
    NSArray *itemsFromGenericQuery = [loadSongsQuery items];
    if ([itemsFromGenericQuery count])
        [songItemsArray addObject:[itemsFromGenericQuery objectAtIndex:0]];
    if ([songItemsArray count])
    {
        MPMediaItemCollection *collection = [[MPMediaItemCollection alloc] initWithItems:songItemsArray];
        if ([collection count]) {
            MPMediaItem *mpItem = [[collection items] objectAtIndex:0];
            NSURL *mediaUrl = [mpItem valueForProperty:MPMediaItemPropertyAssetURL];
            AVPlayerItem *item = [AVPlayerItem playerItemWithURL:mediaUrl];
            musicPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
            [musicPlayer play];
        }
    }
}
Can anybody help me with my problem? Did I miss anything else?
Try this, in AppDelegate.m:
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    __block UIBackgroundTaskIdentifier task = 0;
    task = [application beginBackgroundTaskWithExpirationHandler:^{
        NSLog(@"Expiration handler called %f", [application backgroundTimeRemaining]);
        [application endBackgroundTask:task];
        task = UIBackgroundTaskInvalid;
    }];
}

How to recognise voice using AVAudioRecorder in iPhone

I am using AVAudioRecorder. If I tap on the record button, the recording should start/save only after recognising the voice.
- (void)viewDidLoad
{
    recording = NO;
    NSString *filePath = [NSHomeDirectory()
                          stringByAppendingPathComponent:@"Documents/recording.caf"];
    NSDictionary *recordSettings =
        [[NSDictionary alloc] initWithObjectsAndKeys:
         [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
         [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
         [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
         [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
         nil];
    AVAudioRecorder *newRecorder = [[AVAudioRecorder alloc]
                                    initWithURL:[NSURL fileURLWithPath:filePath]
                                    settings:recordSettings
                                    error:nil];
    [recordSettings release];
    self.soundRecorder = newRecorder;
    [newRecorder release];
    self.soundRecorder.delegate = self;
    NSLog(@"path is %@", filePath);
    [super viewDidLoad];
}
- (IBAction)record:(id)sender {
    if (recording) {
        [self.soundRecorder stop];
        [recordBtn setTitle:@"Record" forState:UIControlStateNormal];
        recording = NO;
    } else {
        [self.soundRecorder record];
        [recordBtn setTitle:@"Stop" forState:UIControlStateNormal];
        recording = YES;
    }
}
- (IBAction)play {
    NSString *filePath = [NSHomeDirectory()
                          stringByAppendingPathComponent:@"Documents/recording.caf"];
    AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:filePath] error:nil];
    newPlayer.delegate = self;
    NSLog(@"playing file at url %@ %d", [[newPlayer url] description], [newPlayer play]);
}
Please help me out.
That's a challenging goal you have. iOS doesn't include the smarts to recognize voice specifically; you will have to provide a filter of your own to do that. If you just want VOX-type support (i.e. start recording when a given level of audio is detected), that is easily done by monitoring audio levels using the Audio Toolbox framework.
If you need to recognize voice specifically you will need a specialized recognition filter to run your audio data through.
If you had such a filter you could take one of two approaches: a) Just record everything then post-process the resulting audio data to locate the time index at which voice is recognized and just ignore the data up to that point (copy the remaining data to another buffer perhaps) or b) use the Audio Toolbox Framework to monitor the recorded data in real time. Pass the data through your voice finding filter and only start buffering the data when your filter triggers.
Actual implementation is quite involved and too long to address here, but I have seen sample code in books and online that you could start from. I'm sorry I don't have any links to share at this time but will post any I come across in the near future.
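A minimal sketch of the VOX-style trigger from approach (b), using the simpler AVAudioRecorder metering rather than Audio Toolbox; monitorRecorder, isCapturing, and startActualRecording are placeholder names:
// Called from a repeating NSTimer while a throwaway recorder runs into /dev/null.
// Once the level crosses the threshold, hand off to the real recording.
- (void)monitorLevels:(NSTimer *)timer {
    [self.monitorRecorder updateMeters];
    float level = [self.monitorRecorder averagePowerForChannel:0]; // dBFS, -160..0
    if (level > -20.0f && !self.isCapturing) {   // threshold chosen arbitrarily; tune it
        self.isCapturing = YES;
        [self startActualRecording];             // placeholder for your own method
    }
}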
I think this might help you. Detecting a volume spike is all you need for your purposes, right?
http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/
Pier.

Implement a noise filter algorithm for recorded audio in iPhone

I am developing an application like TomCat. I have recorded audio with a funny voice and am playing it back with Audio Queue Services. I have changed the settings of AVAudioRecorder, but while playing it back there is some noise or distortion.
NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
[settings setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
[settings setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[settings setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
[settings setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
[settings setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
[settings setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];
self.recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:self.audioPath] settings:settings error:nil];
self.recorder = [self.recorder retain];
[self.recorder prepareToRecord];
[self.recorder record];
I know how to convert decibels to amplitude, and that a low-pass filter blocks frequencies that are too high while a high-pass filter blocks frequencies that are too low. How do I implement this in Objective-C?
// convert decibels to amplitude
const double ALPHA = 0.05;
double peakPowerForChannel = pow(10, (0.05 * [audioMonitor peakPowerForChannel:0]));
// audioMonitorResults must persist between calls (e.g. an ivar) and start at 0.0,
// otherwise the low-pass filter below reads an uninitialized value
audioMonitorResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * audioMonitorResults;
Your question is not clear. If you are trying to ensure you adhere to the Nyquist sampling criterion, your filter needs to be implemented BEFORE the ADC, i.e. in hardware, if your sampling frequency is close to the spectrum you intend to record.
If you do have an appropriate LPF and still hear funny noises, I suggest that your input audio levels are too high. This is again a hardware issue; bring your input volume down.
Another source of noise might be that your recording does not really consist of one continuous block of samples without interruptions. Alternatively, this could be a playback issue.
Lots of things can go wrong that lead to "noise or distortions"...
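If you do end up filtering in software, a single-pole low-pass filter over raw 16-bit PCM samples is about the simplest starting point. A sketch in plain C (where the samples come from, for example an Audio Queue buffer, is up to you):
// y[n] = y[n-1] + alpha * (x[n] - y[n-1]); smaller alpha means a lower cutoff.
static void LowPassFilterSamples(const SInt16 *input, SInt16 *output,
                                 NSUInteger count, float alpha) {
    float previous = 0.0f;
    for (NSUInteger i = 0; i < count; i++) {
        previous = previous + alpha * ((float)input[i] - previous);
        output[i] = (SInt16)previous;
    }
}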

Detect a blow into the microphone

For a project of mine I need to detect when the user blows into the mic. I've been following this tutorial: http://www.mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/ and this question: Detect blow in Mic and do something
But I still do not get the results I want. The blow is detected way too late, or sometimes not at all. When I tweak some values the blow is detected correctly, but then it is triggered too easily, i.e. talking or making a clicking sound is detected as a blow too.
Has anyone found a good way of detecting a blow? Thanks.
The AVAudioRecorder sound level API is not designed to give you reliable results in separating blowing sounds from other types of sounds received by the mic.
I suggest using the Audio Queue or the Audio Unit RemoteIO API, measuring RMS signal energy, envelope duration, and then using the Accelerate FFT library to check the spectrum for broadband noise vs. peaks that would suggest voiced talking instead of blowing.
That is, a more reliable result will require a lot more work than one OS call.
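As a starting point, the RMS part of that suggestion is a single Accelerate call over a buffer of float samples (a sketch; obtaining the buffer from an Audio Queue or RemoteIO callback is the part that takes the real work):
#import <Accelerate/Accelerate.h>

// Root-mean-square energy of a block of float samples.
static float RMSOfSamples(const float *samples, vDSP_Length count) {
    float rms = 0.0f;
    vDSP_rmsqv(samples, 1, &rms, count);  // RMS over `count` samples, stride 1
    return rms;                           // linear amplitude; 20*log10f(rms) gives dBFS
}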
Return as soon as you get the first low-pass result above 0.55. I have solved the issue; have a look:
- (void)readyToBlow1 {
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.01 target:self selector:@selector(levelTimerCallback1:) userInfo:nil repeats:YES];
    } else {
        NSLog(@"%@", [error description]);
    }
}
- (void)levelTimerCallback1:(NSTimer *)timer {
    [recorder updateMeters];
    const double ALPHA = 0.05;
    double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
    lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;
    //NSLog(@"lowPassResults = %f", lowPassResults);
    if (lowPassResults > 0.55) {
        lowPassResults = 0.0;
        [self invalidateTimers];
        NextPhase *objNextView = [[NextPhase alloc] init];
        [UIView transitionFromView:self.view
                            toView:objNextView.view
                          duration:2.0
                           options:UIViewAnimationOptionTransitionCurlUp
                        completion:^(BOOL finished) {
                        }];
        [self.navigationController pushViewController:objNextView animated:NO];
        return;
    }
}
I've had good success using AudioQueueGetProperty() for the kAudioQueueProperty_CurrentLevelMeter.
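A sketch of that approach, assuming you already have a running input AudioQueueRef named queue (metering has to be enabled on the queue first):
// Turn level metering on once, right after creating the input queue.
UInt32 meteringOn = 1;
AudioQueueSetProperty(queue, kAudioQueueProperty_EnableLevelMetering,
                      &meteringOn, sizeof(meteringOn));

// Poll the current level: one AudioQueueLevelMeterState per channel.
AudioQueueLevelMeterState levels[1];
UInt32 levelsSize = sizeof(levels);
OSStatus status = AudioQueueGetProperty(queue, kAudioQueueProperty_CurrentLevelMeter,
                                        levels, &levelsSize);
if (status == noErr) {
    NSLog(@"avg %f peak %f", levels[0].mAveragePower, levels[0].mPeakPower);
}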