I am using AVAudioRecorder. When I tap the record button, the recording should start/save only after a voice is recognized.
- (void)viewDidLoad
{
    recording = NO;
    NSString *filePath = [NSHomeDirectory()
        stringByAppendingPathComponent:@"Documents/recording.caf"];
    NSDictionary *recordSettings =
        [[NSDictionary alloc] initWithObjectsAndKeys:
            [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
            [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
            [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
            [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
            nil];
    AVAudioRecorder *newRecorder = [[AVAudioRecorder alloc]
        initWithURL: [NSURL fileURLWithPath:filePath]
        settings: recordSettings
        error: nil];
    [recordSettings release];
    self.soundRecorder = newRecorder;
    [newRecorder release];
    self.soundRecorder.delegate = self;
    NSLog(@"path is %@", filePath);
    [super viewDidLoad];
}
- (IBAction) record:(id) sender {
    if (recording) {
        [self.soundRecorder stop];
        [recordBtn setTitle:@"Record" forState:UIControlStateNormal];
        recording = NO;
    } else {
        [self.soundRecorder record];
        [recordBtn setTitle:@"Stop" forState:UIControlStateNormal];
        recording = YES;
    }
}
- (IBAction) play {
    NSString *filePath = [NSHomeDirectory()
        stringByAppendingPathComponent:@"Documents/recording.caf"];
    AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc]
        initWithContentsOfURL: [NSURL fileURLWithPath:filePath] error: nil];
    newPlayer.delegate = self;
    NSLog(@"playing file at url %@ %d", [[newPlayer url] description], [newPlayer play]);
}
Please help me out.
That's a challenging goal you have. iOS doesn't include the smarts to recognize voice specifically; you will have to provide a filter of your own to do that. If you just want VOX-type support (i.e., start recording once a given level of audio is detected), that is easily done by monitoring audio levels using the Audio Toolbox framework.
If you need to recognize voice specifically, you will need a specialized recognition filter to run your audio data through.
If you had such a filter you could take one of two approaches: (a) record everything, then post-process the resulting audio data to locate the time index at which voice is recognized, and simply ignore the data up to that point (perhaps copying the remaining data to another buffer); or (b) use the Audio Toolbox framework to monitor the recorded data in real time, pass the data through your voice-finding filter, and only start buffering the data once your filter triggers.
An actual implementation is quite involved and too long to address here, but I have seen sample code in books and online that you could start from. I'm sorry I don't have any links to share at this time, but I will post any I come across in the near future.
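In the meantime, here is a minimal VOX-style sketch using AVAudioRecorder's metering (level-triggered, not true voice recognition). It assumes a monitoring recorder ivar named monitor pointed at /dev/null (the same trick used further down this page), a voxTimer ivar, and a hypothetical -startRealRecording method that would start self.soundRecorder; the -30 dB threshold is just a starting point to tune:

// Poll the input level on a timer and start the real recording only once
// the level crosses a threshold. This is simple VOX, not voice recognition.
- (void)startListening {
    monitor.meteringEnabled = YES;     // 'monitor' records to /dev/null
    [monitor record];
    voxTimer = [NSTimer scheduledTimerWithTimeInterval:0.05
                                                target:self
                                              selector:@selector(checkLevel:)
                                              userInfo:nil
                                               repeats:YES];
}

- (void)checkLevel:(NSTimer *)timer {
    [monitor updateMeters];
    // averagePowerForChannel: is in dBFS: 0 is full scale, -160 is silence.
    if ([monitor averagePowerForChannel:0] > -30.0f) {   // threshold to tune
        [timer invalidate];
        [monitor stop];
        [self startRealRecording];   // hypothetical: starts self.soundRecorder
    }
}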
I think this might help you. Detecting a volume spike is probably all you need for your purposes, right?
http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/
Pier.
Related
I have searched for suitable answers to my question but have not found anything helpful so far.
I want to record the decibel level in the environment. If a specific threshold is exceeded, the app shall play a sound or song file. Everything works fine so far, but I have trouble keeping the app running in the background.
I have already added the attribute "Application does not run in the background" and set its value to "NO". I've read that one should add the "external-accessory" element to the "Required background modes". I added that too, but it still does not work.
I am using AVAudioRecorder to record the sound and AVPlayer to play the sound/music file. At first I used the MPMediaController iPodMusicPlayer, but it throws an exception in combination with the "Required background modes" attribute.
EDIT:
I am using Xcode 4.5 with iOS 6.
EDIT 2:
When I add the string "voip" to the "Required background modes", it seems to continue recording while in the background. But it still does not play the music file while in the background. I also tried adding the "audio" value, but it did not help.
EDIT 3:
I've consulted Apple's developer reference. It seems you have to configure your AVAudioSession. With that, it seems to work (link to reference). But now I have trouble playing more than one file, because as soon as the first track has finished playing, the app goes into suspended mode again. As far as I know there is no way to initialize the AVPlayer or AVAudioPlayer with more than one file. I used the delegate method audioPlayerDidFinishPlaying:successfully: to set the next track, but it did not work.
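For reference, the session setup I mean looks roughly like this (a sketch of the standard pattern, not the exact code from the linked reference):

// Configure the audio session so recording/playback can continue in the
// background; this also requires "audio" in UIBackgroundModes (Info.plist).
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
[session setActive:YES error:&sessionError];
if (sessionError) {
    NSLog(@"Audio session error: %@", sessionError);
}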
EDIT 4: OK, one possibility is to avoid stopping the recorder, that is, removing the [recorder stop] so that it keeps recording even while music is played. It is a workaround that works, but I would still appreciate any other (better) solution: one that doesn't need to keep the recorder running all the time.
The relevant code:
I initialize everything in the viewDidLoad method:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    //[MPMusicPlayerController applicationMusicPlayer];
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
    lowPassResults = -120.0;
    thresholdExceeded = NO;
    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
    } else {
        NSString *errorDescription = [error description];
        // Never pass a variable as the NSLog format string; use %@.
        NSLog(@"%@", errorDescription);
    }
}
The levelTimerCallback: method that is called every 0.03 seconds:
- (void)levelTimerCallback:(NSTimer *)timer {
    // Refresh the average and peak power meters (the meter uses a logarithmic
    // scale, with -160 being complete silence and zero being maximum input).
    [recorder updateMeters];
    const double ALPHA = 0.05;
    float averagePowerForChannel = [recorder averagePowerForChannel:0];
    // Adjust the referential.
    averagePowerForChannel = averagePowerForChannel / 0.6;
    // Low-pass filter the reading to smooth out momentary spikes.
    lowPassResults = ALPHA * averagePowerForChannel + (1.0 - ALPHA) * lowPassResults;
    // Shift into a 0-based dB scale.
    float db = lowPassResults + 120;
    db = db < 0 ? 0 : db;
    if (db >= THRESHOLD)
    {
        [self playFile];
    }
}
Finally, the playFile method, which plays the music file:
- (void) playFile {
    NSString *title = @"(You came down) For a day";
    NSString *artist = @"Forge";
    NSMutableArray *songItemsArray = [[NSMutableArray alloc] init];
    MPMediaQuery *loadSongsQuery = [[MPMediaQuery alloc] init];
    MPMediaPropertyPredicate *artistPredicate = [MPMediaPropertyPredicate predicateWithValue:artist forProperty:MPMediaItemPropertyArtist];
    MPMediaPropertyPredicate *titlePredicate = [MPMediaPropertyPredicate predicateWithValue:title forProperty:MPMediaItemPropertyTitle];
    [loadSongsQuery addFilterPredicate:artistPredicate];
    [loadSongsQuery addFilterPredicate:titlePredicate];
    NSArray *itemsFromGenericQuery = [loadSongsQuery items];
    if ([itemsFromGenericQuery count])
        [songItemsArray addObject:[itemsFromGenericQuery objectAtIndex:0]];
    if ([songItemsArray count])
    {
        MPMediaItemCollection *collection = [[MPMediaItemCollection alloc] initWithItems:songItemsArray];
        if ([collection count]) {
            MPMediaItem *mpItem = [[collection items] objectAtIndex:0];
            NSURL *mediaUrl = [mpItem valueForProperty:MPMediaItemPropertyAssetURL];
            AVPlayerItem *item = [AVPlayerItem playerItemWithURL:mediaUrl];
            musicPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
            [musicPlayer play];
        }
    }
}
Can anybody help me with my problem? Did I miss anything else?
Try this,
AppDelegate.m
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    __block UIBackgroundTaskIdentifier task = 0;
    task = [application beginBackgroundTaskWithExpirationHandler:^{
        NSLog(@"Expiration handler called %f", [application backgroundTimeRemaining]);
        [application endBackgroundTask:task];
        task = UIBackgroundTaskInvalid;
    }];
}
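Note that a background task like this only buys a limited slice of execution time (roughly ten minutes on iOS 6-era systems) before the expiration handler fires. For open-ended recording/playback you still need the audio background mode from EDIT 3 declared in Info.plist, roughly:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>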
I'm working to update the MPNowPlayingInfoCenter and having a bit of trouble. I've tried quite a bit, to the point where I'm at a loss. The following is my code:
self.audioPlayer.allowsAirPlay = NO;
Class playingInfoCenter = NSClassFromString(@"MPNowPlayingInfoCenter");
if (playingInfoCenter) {
    NSMutableDictionary *songInfo = [[NSMutableDictionary alloc] init];
    MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"series_placeholder"]];
    [songInfo setObject:thePodcast.title forKey:MPMediaItemPropertyTitle];
    [songInfo setObject:thePodcast.author forKey:MPMediaItemPropertyArtist];
    [songInfo setObject:@"NCC" forKey:MPMediaItemPropertyAlbumTitle];
    [songInfo setObject:albumArt forKey:MPMediaItemPropertyArtwork];
    [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:songInfo];
}
This isn't working; I've also tried:
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:nil];
In an attempt to get it to remove the existing information from the iPod app (or whatever may have info there). In addition, just to see if I could find out the problem, I've tried retrieving the current information on app launch:
NSDictionary *info = [[MPNowPlayingInfoCenter defaultCenter] nowPlayingInfo];
NSString *title = [info valueForKey:MPMediaItemPropertyTitle];
NSString *author = [info valueForKey:MPMediaItemPropertyArtist];
NSLog(@"Currently playing: %@ // %@", title, author);
and I get Currently playing: (null) // (null)
I've researched this quite a bit, and the following articles explain it pretty thoroughly; however, I am still unable to get this working properly. Am I missing something? Could anything be interfering with this? Is this a service my app needs to register for (I didn't see this in any docs)?
Apple's Docs
Change lock screen background audio controls
Now playing info ignored
I finally figured out the problem: I was not prompting my app to receive remote control events. Simply adding this line fixed the problem:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
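For anyone hitting the same wall: the usual pattern (a sketch, assuming a view controller that should own the events) also makes that controller the first responder, which is what is needed to actually receive the button events; beginReceivingRemoteControlEvents alone was enough to unblock the now-playing info here:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder]; // required for remoteControlReceivedWithEvent:
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}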
I use the code below and it always works. I'm also using MPMoviePlayer like you. Have you checked whether NSClassFromString(@"MPNowPlayingInfoCenter") ever actually returns a class? Have you set your app's play-audio-in-background key in your plist?
- (void)loadMPInformation
{
    NSDictionary *mpInfo;
    if ([savedTrack.belongingAlbum.hasAlbumArt boolValue] == NO) {
        mpInfo = [NSDictionary dictionaryWithObjectsAndKeys:savedTrack.belongingAlbum.album, MPMediaItemPropertyAlbumTitle,
                  savedTrack.belongingArtist.artist, MPMediaItemPropertyArtist, savedTrack.name, MPMediaItemPropertyTitle, nil];
    } else {
        UIImage *artImage = [UIImage imageWithData:savedTrack.belongingAlbum.art];
        MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:artImage];
        mpInfo = [NSDictionary dictionaryWithObjectsAndKeys:savedTrack.belongingAlbum.album, MPMediaItemPropertyAlbumTitle,
                  savedTrack.belongingArtist.artist, MPMediaItemPropertyArtist, savedTrack.name, MPMediaItemPropertyTitle,
                  artwork, MPMediaItemPropertyArtwork, nil];
    }
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = mpInfo;
}
So I set up an audio session:
AudioSessionInitialize(NULL, NULL, NULL, NULL);
AudioSessionSetActive(true);
UInt32 audioCategory = kAudioSessionCategory_MediaPlayback; //for output audio
OSStatus tErr = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,sizeof(audioCategory),&audioCategory);
Then I set up either an AudioQueue or a RemoteIO unit to play back some audio straight from a file.
AudioQueueStart(mQueue, NULL);
Once my audio is playing I can see the 'Play Icon' in the status bar of my app. Next I set up an AVAssetReader:
AVAssetTrack *songTrack = [songURL.tracks objectAtIndex:0];
NSDictionary *outputSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    // [NSNumber numberWithInt:AUDIO_SAMPLE_RATE], AVSampleRateKey, /*Not Supported*/
    // [NSNumber numberWithInt:2], AVNumberOfChannelsKey, /*Not Supported*/
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
    nil];
NSError *error = nil;
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:songURL error:&error];
// {
//     AVAssetReaderTrackOutput *output = [[AVAssetReaderTrackOutput alloc] initWithTrack:songTrack outputSettings:outputSettingsDict];
//     [reader addOutput:output];
//     [output release];
// }
{
    AVAssetReaderAudioMixOutput *readaudiofile = [AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:(songURL.tracks) audioSettings:outputSettingsDict];
    [reader addOutput:readaudiofile];
    // No release here: the mix output comes from an autoreleasing class
    // factory method (the original [readaudiofile release] over-released it).
}
return reader;
When I call [reader startReading], the audio stops playing. In both the RemoteIO and the AudioQueue case, the callback stops getting called.
If I add the mixing option:
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof (UInt32), &(UInt32) {0});
Then the 'Play Icon' no longer appears after AudioQueueStart is called. I am also locked out of other features, since the phone no longer treats me as the primary audio source.
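(Note: as written, the compound literal above passes 0, which turns the mix-with-others override off; enabling mixing takes a nonzero value, along these lines, so the snippet may not be doing what it appears to do:)

UInt32 mixWithOthers = 1; // nonzero = allow mixing with other audio
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers,
                        sizeof(mixWithOthers), &mixWithOthers);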
Does anyone know a way I can use the AVAssetReader and still remain the Primary audio source?
As of iOS 5 this has been fixed. It still does not work on iOS 4 or below. MixWithOthers is not needed (it can be set to false), and the AudioQueue/RemoteIO will continue to receive callbacks even while an AVAssetReader is reading.
I am having trouble getting an accurate meter reading from the AVAudioRecorder (testing on iPad).
It seems to work fine while the volume is rising, but a delay happens when the volume drops. For example: I speak into the mic and slowly raise my voice. The readings climb from -35.567 to -34.678 and up to -10.579 as I would hope, but when I stop talking there is a delay of 1-2 seconds before the reading drops back down to -35.567 (or whatever it happens to be).
The NSLog continues to update from the loop but the meter number stays the same during the delay even though the sound has long ended.
I have added the gist of the code below and would be happy to supply full code if need be.
I initialize the recorder like so:
NSError *error = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[audioSession setActive:YES error:&error];
NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"Recording.caf"]];
recorder = [[AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
[recorder setDelegate:self];
[recorder prepareToRecord];
[recorder setMeteringEnabled:YES];
[recorder record];
and update the meter in a loop:
- (void)loop:(ccTime)dt
{
    if (isRecording == YES)
    {
        // Get volume levels.
        [recorder updateMeters];
        float level = [recorder peakPowerForChannel:0];
        NSLog(@"Vol: %f", level);
    }
}
edited: I should also mention that I am using the Cocos2d schedule for the loop:
[self schedule:@selector(loop:)];
Any ideas why there would be such a long delay?
edited: I have tried using the average power and this has no delay, so I could possibly use that as a workaround. However, I would rather not use an averaged power, and it would be nice to understand what is going on.
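(Presumably that means swapping peakPowerForChannel: for averagePowerForChannel: in the loop above:)

float level = [recorder averagePowerForChannel:0]; // instead of peakPowerForChannel: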
I'm sure that most have figured this out but if you want less lag on your metering you need to use AudioQueue or RemoteIO. See the better explanation here:
Confusion with meters in AVAudioRecorder
You can fix this by resetting the meteringEnabled property to YES.
yourRecorderName.meteringEnabled = YES;
Call this every time you want the levels to reset to ambient levels. This takes about 0.02 seconds, and in that time the levels will briefly drop down to 0, or -120 dB before resetting to ambient.
Alternatively, you can use:
[yourRecorderName stop];
[yourRecorderName record];
This takes about 0.05 seconds, but the levels won't drop down to 0 during the wait. In fact, nothing will happen in that window, because it actually takes the recorder object 0.05 seconds to stop and start recording again.
How about using a timer? It would be much quicker. After creating the recorder:
NSError *error = nil;
// ... recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error]; ...
if (recorder) {
    recorder.meteringEnabled = YES;
    [recorder record];
    levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.2 target:self selector:@selector(levelTimerCallback:) userInfo:nil repeats:YES];
} else {
    NSLog(@"error %@", [error description]);
}
Here, levelTimer is the NSTimer that calls the method that does what you want (levelTimerCallback:), updates the meters, etc.
- (void)levelTimerCallback:(NSTimer *)timer
{
    [recorder updateMeters];
    float level = [recorder peakPowerForChannel:0];
    NSLog(@"Vol: %f", level);
}
For a project of mine I need to detect when the user blows into the mic. I've followed this tutorial: http://www.mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/ and this question: Detect blow in Mic and do something
But I still do not get the results I want. The blow is detected way too late, or sometimes not at all. When I tweak some values the blow is detected correctly, but then it triggers too easily, i.e. talking or making a clicking sound is detected as a blow too.
Has anyone found a good way of detecting a blow? Thanks.
The AVAudioRecorder sound level API is not designed to give you reliable results in separating blowing sounds from other types of sounds received by the mic.
I suggest using the Audio Queue or the Audio Unit RemoteIO API, measuring RMS signal energy, envelope duration, and then using the Accelerate FFT library to check the spectrum for broadband noise vs. peaks that would suggest voiced talking instead of blowing.
In other words, a more reliable result will require a lot more work than one OS call.
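As a starting point for the RMS part only (the envelope and FFT stages are more work), here is a sketch using Accelerate's vDSP, assuming you already have a buffer of mono float samples from an Audio Queue or RemoteIO callback:

#import <Accelerate/Accelerate.h>

// Root-mean-square energy of one buffer of mono float samples.
// A sustained, high RMS is a necessary (but not sufficient) sign of blowing.
static float BufferRMS(const float *samples, vDSP_Length count) {
    float rms = 0.0f;
    vDSP_rmsqv(samples, 1, &rms, count); // vector RMS
    return rms;
}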
Return as soon as you get the first low-pass result > 0.55.
I have solved the issue; have a look.
- (void)readyToBlow1 {
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.01 target: self selector: @selector(levelTimerCallback1:) userInfo: nil repeats: YES];
    } else
        NSLog(@"%@", [error description]);
}

- (void)levelTimerCallback1:(NSTimer *)timer {
    [recorder updateMeters];
    const double ALPHA = 0.05;
    // Convert the dB peak power to a 0..1 linear value, then low-pass filter it.
    double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
    lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;
    //NSLog(@"lowPassResults = %f", lowPassResults);
    if (lowPassResults > 0.55) {
        lowPassResults = 0.0;
        [self invalidateTimers];
        NextPhase *objNextView = [[NextPhase alloc] init];
        [UIView transitionFromView:self.view
                            toView:objNextView.view
                          duration:2.0
                           options:UIViewAnimationOptionTransitionCurlUp
                        completion:^(BOOL finished) {
                        }];
        [self.navigationController pushViewController:objNextView animated:NO];
        return;
    }
}
I've had good success using AudioQueueGetProperty() for the kAudioQueueProperty_CurrentLevelMeter.
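For reference, a minimal sketch of that approach, assuming an already-running AudioQueueRef named queue (AudioToolbox framework); metering has to be enabled on the queue before the levels are readable:

// Enable level metering once, after creating the queue.
UInt32 meteringOn = 1;
AudioQueueSetProperty(queue, kAudioQueueProperty_EnableLevelMetering,
                      &meteringOn, sizeof(meteringOn));

// Then poll (e.g. from a timer); one AudioQueueLevelMeterState per channel.
AudioQueueLevelMeterState level[1];
UInt32 size = sizeof(level);
AudioQueueGetProperty(queue, kAudioQueueProperty_CurrentLevelMeter, level, &size);
float avg  = level[0].mAveragePower; // linear scale, 0.0 to 1.0
float peak = level[0].mPeakPower;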