How can I play background audio while my application is running?
Thanks.
Okay, this is a solution for background sound on iOS 4 and iOS 5 (it definitely works up to iOS 5.0.1). I have only tested it with AVPlayer, but it should probably work for MPMusicPlayerController too.
Required frameworks:
AVFoundation.framework
AudioToolbox.framework
In your Info.plist, for the key UIBackgroundModes, add audio.
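If you edit Info.plist as source, the entry looks like this:
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>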
In MyAppDelegate.h:
import <AVFoundation/AVFoundation.h> and <AudioToolbox/AudioToolbox.h>
declare that the class adopts the AVAudioSessionDelegate protocol:
@interface MyAppDelegate : NSObject <UIApplicationDelegate, AVAudioSessionDelegate>
define a method ensureAudio:
// Ensures the audio routes are set up correctly
- (BOOL) ensureAudio;
In MyAppDelegate.m:
implement the ensureAudio method:
- (BOOL) ensureAudio
{
    // Registers this class as the delegate of the audio session (to get background sound)
    [[AVAudioSession sharedInstance] setDelegate:self];

    // Set category
    NSError *categoryError = nil;
    if (![[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&categoryError]) {
        NSLog(@"Audio session category could not be set");
        return NO;
    }

    // Activate session
    NSError *activationError = nil;
    if (![[AVAudioSession sharedInstance] setActive:YES error:&activationError]) {
        NSLog(@"Audio session could not be activated");
        return NO;
    }

    // Allow the audio to mix with other apps (necessary for background sound)
    UInt32 doChangeDefaultRoute = 1;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doChangeDefaultRoute), &doChangeDefaultRoute);

    return YES;
}
in application:didFinishLaunchingWithOptions:, before you assign the root view controller, call [self ensureAudio]:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Configure audio session
    [self ensureAudio];

    // Add the navigation controller's view to the window and display.
    self.window.rootViewController = self.navigationController;
    [self.window makeKeyAndVisible];
    return YES;
}
implement the AVAudioSessionDelegate methods like this:
#pragma mark - AVAudioSessionDelegate

- (void) beginInterruption
{
}

- (void) endInterruption
{
    // Sometimes the audio session will be reset/stopped by an interruption
    [self ensureAudio];
}

- (void) inputIsAvailableChanged:(BOOL)isInputAvailable
{
}
ensure that your app continues to run in the background. You can use the ol' [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:] if you want, but I think there are better ways.
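For example, a minimal sketch of that approach (it assumes you keep a UIBackgroundTaskIdentifier property named backgroundTask; the name is just a placeholder):

UIApplication *app = [UIApplication sharedApplication];
self.backgroundTask = [app beginBackgroundTaskWithExpirationHandler:^{
    // The system is about to reclaim the background time; end the task.
    [app endBackgroundTask:self.backgroundTask];
    self.backgroundTask = UIBackgroundTaskInvalid;
}];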
play the actual audio (note I'm using ARC, that's why there are no release calls):
NSURL *file = [[NSBundle mainBundle] URLForResource:@"beep" withExtension:@"aif"];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:file options:nil];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
__block AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:item];
__block id finishObserver = [[NSNotificationCenter defaultCenter] addObserverForName:AVPlayerItemDidPlayToEndTimeNotification
                                                                               object:player.currentItem
                                                                                queue:[NSOperationQueue mainQueue]
                                                                           usingBlock:^(NSNotification *note) {
    [[NSNotificationCenter defaultCenter] removeObserver:finishObserver];
    // Reference the 'player' variable so ARC doesn't release it until it's
    // finished playing.
    player = nil;
}];

// Trigger asynchronous load
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    // Start playing the beep (watch out - we're not on the main thread here)!
    [player play];
}];
And it should work!
If you are also using your app for recording, don't forget to change setCategory to AVAudioSessionCategoryPlayAndRecord. Otherwise you won't be able to record:
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord error:&setCategoryErr];
Related
In iOS 7, after my audio interruption listener gets called, any attempt to restore the audio session seems to fail silently.
My interruption listener calls
NSError *activationError = nil;
[[AVAudioSession sharedInstance] setActive:YES error:&activationError];
But the app's audio session is dead as soon as the alarm clock rings. The listener gets called with appropriate begin and end states.
It worked just fine on iOS 6.
I have heard that this is a bug in iOS 7 and that there is a workaround, but I can't find it.
Does anyone know a link to a workaround or Technical Note from Apple?
EDIT: I found that I HAVE to use AVAudioSessionCategoryPlayback instead of kAudioSessionCategory_AmbientSound. Now it works. But it is not the category I wanted.
Based on Apple's Audio Session Programming Guide, you should listen and react to changes in your interruption handler. This means your code can/should handle the end of the interruption too, based on the received interruptionState parameter.
Check out "Audio Interruption Handling Techniques" on this link, I think it will help you a lot: https://developer.apple.com/library/ios/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/HandlingAudioInterruptions/HandlingAudioInterruptions.html
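For example, a minimal sketch of reacting to both states in an old C-style listener (this assumes the function was registered with AudioSessionInitialize; the notification-based AVAudioSessionInterruptionNotification API used in the next answer is the modern equivalent):

static void MyInterruptionListener(void *inClientData, UInt32 interruptionState) {
    if (interruptionState == kAudioSessionBeginInterruption) {
        // Pause playback and remember that you were playing.
    } else if (interruptionState == kAudioSessionEndInterruption) {
        // Reactivate the session, then resume playback if appropriate.
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
    }
}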
Good luck,
Z.
You can see my code below. But first, you need to enable Background Modes for your target and check "Audio, AirPlay and Picture in Picture" under Capabilities.
// ViewController.m
// AVAudioPlayer

#import "ViewController.h"
@import AVFoundation;
@import MediaPlayer;

@interface ViewController ()
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;
@property (weak, nonatomic) IBOutlet UISlider *volumeSlider;
@property (weak, nonatomic) IBOutlet UISlider *rateSlider;
@end

@implementation ViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self setupAudio];
}
- (void) setupAudio {
    NSError *error;
    [[AVAudioSession sharedInstance] setActive:YES error:&error];
    if (error != nil) {
        NSAssert(error == nil, @"");
    }
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error != nil) {
        NSAssert(error == nil, @"");
    }

    NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"SoManyTimes" withExtension:@"mp3"];
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
    if (error != nil) {
        NSAssert(error == nil, @"");
    }
    //[self.audioPlayer setVolume:0.8];
    self.audioPlayer.enableRate = YES;
    [self.audioPlayer prepareToPlay];
    [self.audioPlayer setVolume:self.volumeSlider.value];

    // Observe interruptions (calls, alarms) and route changes (e.g. headphones unplugged).
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioInterrupt:) name:AVAudioSessionInterruptionNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioRouteChanged:) name:AVAudioSessionRouteChangeNotification object:nil];

    // Respond to the remote control / lock screen transport controls.
    MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
    [commandCenter.playCommand addTarget:self action:@selector(playButtonPressed:)];
    [commandCenter.stopCommand addTarget:self action:@selector(stopButtonPressed:)];
    [commandCenter.pauseCommand addTarget:self action:@selector(stopButtonPressed:)];
}
- (void) audioRouteChanged:(NSNotification*)notification {
    NSNumber *reason = (NSNumber*)[notification.userInfo valueForKey:AVAudioSessionRouteChangeReasonKey];
    switch ([reason integerValue]) {
        case AVAudioSessionRouteChangeReasonOldDeviceUnavailable:
        case AVAudioSessionRouteChangeReasonNoSuitableRouteForCategory:
            // Stop playback when the route goes away (e.g. headphones unplugged).
            [self stopButtonPressed:nil];
            break;
        default:
            break;
    }
}
- (void) audioInterrupt:(NSNotification*)notification {
    NSNumber *interruptionType = (NSNumber*)[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey];
    switch ([interruptionType integerValue]) {
        case AVAudioSessionInterruptionTypeBegan:
            [self stopButtonPressed:nil];
            break;
        case AVAudioSessionInterruptionTypeEnded:
        {
            // Resume only if the system says we should.
            if ([(NSNumber*)[notification.userInfo valueForKey:AVAudioSessionInterruptionOptionKey] intValue] == AVAudioSessionInterruptionOptionShouldResume) {
                [self playButtonPressed:nil];
            }
            break;
        }
        default:
            break;
    }
}
- (IBAction)playButtonPressed:(id)sender {
    BOOL played = [self.audioPlayer play];
    if (!played) {
        NSLog(@"Error");
    }

    // Publish the now-playing info shown on the lock screen / control center.
    MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"CoverArt"]];
    NSDictionary *songInfo = @{
        MPMediaItemPropertyTitle: @"So Many Times",
        MPMediaItemPropertyArtist: @"The Jellybricks",
        MPMediaItemPropertyAlbumTitle: @"Soap Opera",
        MPMediaItemPropertyArtwork: albumArt,
        MPMediaItemPropertyPlaybackDuration: @(self.audioPlayer.duration),
        MPNowPlayingInfoPropertyElapsedPlaybackTime: @(self.audioPlayer.currentTime),
    };
    [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:songInfo];
}
- (IBAction)stopButtonPressed:(id)sender {
    [self.audioPlayer stop];
}

- (IBAction)volumeSliderChanged:(UISlider*)sender {
    [self.audioPlayer setVolume:sender.value];
}

- (IBAction)rateSliderChanged:(UISlider*)sender {
    [self.audioPlayer setRate:sender.value];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (void) dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

@end
I'm also using the AVAudioSessionCategoryPlayback category but interruptions do not automatically resume playback. Check out Detecting active AVAudioSessions on iOS device. The chosen answer has detailed instructions on how to handle session interrupts.
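For example, one way to check whether some other app's audio is currently active before deciding how to resume (a minimal sketch; the linked answer has the full interruption-handling details):

if ([[AVAudioSession sharedInstance] isOtherAudioPlaying]) {
    // Another app is playing audio; decide whether to mix, duck, or wait before resuming.
}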
I am working with a streaming URL and I need to create a UISlider (volume control) and an indicator (buffering/loading) like the one in the image above.
The code I am using is:
_theAudio=[[AVPlayer alloc] initWithURL:streamURL];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
[[AVAudioSession sharedInstance] setActive: YES error: nil];
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[_theAudio play];
Refer to this link,
or:
- (void)updateSlider {
    // Updates the slider with the current playback time
    slider.value = player.currentTime;
}

- (IBAction)sliderChanged:(UISlider *)sender {
    // Skips through the music when the user scrubs the UISlider
    [player stop];
    [player setCurrentTime:slider.value];
    [player prepareToPlay];
    [player play];
}

// Stops the timer when the music is finished
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    // Music completed
    if (flag) {
        [sliderTimer invalidate];
    }
}
For volume control you can add the line below in your slider's value-changed action:
[audioPlayer setVolume:slider.value];
But first set the slider's minimum value to 0 and its maximum to 1, as in the sketch below.
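A minimal sketch (the outlet and action names are placeholders):

// In viewDidLoad:
self.volumeSlider.minimumValue = 0.0;   // AVAudioPlayer volume ranges from 0.0 ...
self.volumeSlider.maximumValue = 1.0;   // ... to 1.0
[self.volumeSlider addTarget:self
                      action:@selector(volumeSliderChanged:)
            forControlEvents:UIControlEventValueChanged];

// The slider's value-changed action:
- (IBAction)volumeSliderChanged:(UISlider *)sender {
    [audioPlayer setVolume:sender.value];
}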
I have an app for iPhone and iPad that plays an audio stream using AVPlayer. I am using the same player as Apple's StitchedStreamPlayer sample, but I made some changes to play music instead of video.
When I run the app, I can listen for a few seconds, and then the device restarts and the following error is displayed:
Terminating in response to SpringBoard's termination.
(When I run it from Xcode on the device it plays for a few minutes, but when I unplug the device and run the app again, it crashes.)
I am using an iPhone 4 and an iPad mini for testing; neither is jailbroken and both are on iOS 6.
The code is quite big, but here are some parts:
header:
@interface NewPlayer : NSObject <AVAudioSessionDelegate>
@property (strong) AVPlayer *player;
@property (strong) AVPlayerItem *playerItem;
@end
Some important methods of the implementation:
-(void)play:(NSString *)audio
{
    /* Has the user entered an audio URL? */
    NSURL *audioUrl = [NSURL URLWithString:audio];
    if ([audioUrl scheme]) /* Sanity check on the URL. */
    {
        /*
         Create an asset for inspection of a resource referenced by a given URL.
         Load the values for the asset keys "tracks", "playable".
         */
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:audioUrl options:nil];
        NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];

        /* Tells the asset to load the values of any of the specified keys that are not already loaded. */
        [asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:
         ^{
             dispatch_async(dispatch_get_main_queue(),
                            ^{
                                /* IMPORTANT: Must dispatch to main queue in order to operate on the AVPlayer and AVPlayerItem. */
                                [self prepareToPlayAsset:asset withKeys:requestedKeys];
                            });
         }];
    }
}
- (void)prepareToPlayAsset:(AVURLAsset *)asset withKeys:(NSArray *)requestedKeys
{
    /* Make sure that the value of each key has loaded successfully. */
    for (NSString *thisKey in requestedKeys)
    {
        NSError *error = nil;
        AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
        if (keyStatus == AVKeyValueStatusFailed)
        {
            [self assetFailedToPrepareForPlayback:error];
            return;
        }
        /* If you are also implementing the use of -[AVAsset cancelLoading], add your code here to bail
           out properly in the case of cancellation. */
    }
    /* Use the AVAsset playable property to detect whether the asset can be played. */
    if (!asset.playable)
    {
        /* Generate an error describing the failure. */
        NSString *localizedDescription = NSLocalizedString(@"Item cannot be played", @"Item cannot be played description");
        NSString *localizedFailureReason = NSLocalizedString(@"The assets tracks were loaded, but could not be made playable.", @"Item cannot be played failure reason");
        NSDictionary *errorDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   localizedDescription, NSLocalizedDescriptionKey,
                                   localizedFailureReason, NSLocalizedFailureReasonErrorKey,
                                   nil];
        NSError *assetCannotBePlayedError = [NSError errorWithDomain:@"StitchedStreamPlayer" code:0 userInfo:errorDict];

        /* Display the error to the user. */
        [self assetFailedToPrepareForPlayback:assetCannotBePlayedError];
        return;
    }
    /* At this point we're ready to set up for playback of the asset. */

    /* Stop observing our prior AVPlayerItem, if we have one. */
    if (self.playerItem)
    {
        /* Remove existing player item key value observers and notifications. */
        [self.playerItem removeObserver:self forKeyPath:kStatusKey];
        [[NSNotificationCenter defaultCenter] removeObserver:self
                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                      object:self.playerItem];
    }

    /* Create a new instance of AVPlayerItem from the now successfully loaded AVAsset. */
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];

    /* Observe the player item "status" key to determine when it is ready to play. */
    [self.playerItem addObserver:self
                      forKeyPath:kStatusKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:MyStreamingAudioViewControllerPlayerItemStatusObserverContext];

    /* When the player item has played to its end time we'll toggle
       the movie controller Pause button to be the Play button */
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:self.playerItem];
    /* Create a new player, if we don't already have one. */
    if (![self player])
    {
        /* Get a new AVPlayer initialized to play the specified player item. */
        [self setPlayer:[AVPlayer playerWithPlayerItem:self.playerItem]];

        /* Observe the AVPlayer "currentItem" property to find out when any
           AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did
           occur. */
        [self.player addObserver:self
                      forKeyPath:kCurrentItemKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:MyStreamingAudioViewControllerCurrentItemObservationContext];
    }

    /* Make our new AVPlayerItem the AVPlayer's current item. */
    if (self.player.currentItem != self.playerItem)
    {
        /* Replace the player item with a new player item. The item replacement occurs
           asynchronously; observe the currentItem property to find out when the
           replacement will/did occur. */
        [[self player] replaceCurrentItemWithPlayerItem:self.playerItem];
        [self syncPlayPauseButtons];
    }
}
- (void)observeValueForKeyPath:(NSString*)path
                      ofObject:(id)object
                        change:(NSDictionary*)change
                       context:(void*)context
{
    /* AVPlayerItem "status" property value observer. */
    if (context == MyStreamingAudioViewControllerPlayerItemStatusObserverContext)
    {
        [self syncPlayPauseButtons];

        AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        switch (status)
        {
            /* Indicates that the status of the player is not yet known because
               it has not tried to load new media resources for playback */
            case AVPlayerStatusUnknown:
            {
                NSLog(@"unknown");
            }
            break;
            case AVPlayerStatusReadyToPlay:
            {
                /* Once the AVPlayerItem becomes ready to play, i.e.
                   [playerItem status] == AVPlayerItemStatusReadyToPlay,
                   its duration can be fetched from the item. */
                NSLog(@"ready to play");
                [player play];
                [self.delegate tocandoMusica];
            }
            break;

            case AVPlayerStatusFailed:
            {
                AVPlayerItem *thePlayerItem = (AVPlayerItem *)object;
                [self assetFailedToPrepareForPlayback:thePlayerItem.error];
                NSLog(@"failed");
                [self.delegate acabouMusica];
            }
            break;
        }
    }
    /* AVPlayer "rate" property value observer. */
    else if (context == MyStreamingAudioViewControllerRateObservationContext)
    {
        //[self syncPlayPauseButtons];
    }
    /* AVPlayer "currentItem" property observer.
       Called when the AVPlayer replaceCurrentItemWithPlayerItem:
       replacement will/did occur. */
    else if (context == MyStreamingAudioViewControllerCurrentItemObservationContext)
    {
        AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];

        /* New player item null? */
        if (newPlayerItem == (id)[NSNull null])
        {
            //[self disablePlayerButtons];
            //[self disableScrubber];
        }
        else /* Replacement of player currentItem has occurred */
        {
            /* Specifies that the player should preserve the video's aspect ratio and
               fit the video within the layer's bounds. */
            [self syncPlayPauseButtons];
        }
    }
    /* Observe the AVPlayer "currentItem.timedMetadata" property to parse the media stream
       timed metadata. */
    else if (context == MyStreamingAudioViewControllerTimedMetadataObserverContext)
    {
        //NSArray* array = [[player currentItem] timedMetadata];
        //for (AVMetadataItem *metadataItem in array)
        //{
        //}
    }
    else
    {
        [super observeValueForKeyPath:path ofObject:object change:change context:context];
    }

    return;
}
If you want to take a deeper look, just check the StitchedStreamPlayer sample. I have no idea what is wrong. I have looked at:
Failed to play audio file using AVPlayer in iPhone
memory leak in AudioToolbox library AVAudioPlayer
AudioToolBox leak in iOS6?
and many others.
I have tried dropping all of this implementation and using just:
player = [AVPlayer playerWithURL:[NSURL URLWithString:url]];
[player play];
but it crashes!
Any ideas?
EDITED
I have tried MPMoviePlayerController, but the same thing happened: the music started and then the device restarted.
This is the code I have used:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL];
player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:[[arrRadios objectAtIndex:indexPath.row] objectForKey:@"url"]]];
[player play];
I am playing music using AVAudioPlayer in the background. The problem is: if an incoming call interrupts the player, it will never resume unless I switch the app to the foreground and restart playback manually.
The code is simple. To play in the background:
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord error: nil];
[[AVAudioSession sharedInstance] setActive: YES error: nil];
url = [[NSURL alloc] initFileURLWithPath:...];
audio_player = [[AVAudioPlayer alloc] initWithContentsOfURL: url error:NULL];
audio_player.delegate = self;
bool ret = [audio_player play];
The delegate methods to handle interruptions:
-(void)audioPlayerBeginInterruption:(AVAudioPlayer *)player
{
    //tried this, not working: [[AVAudioSession sharedInstance] setActive:NO error:nil];
    NSLog(@"-- interrupted --");
}

//----------- THIS PART NOT WORKING WHEN RUNNING IN BACKGROUND ----------
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player
{
    NSLog(@"resume!");
    //--- tried, not working: [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    //--- tried, not working: [[AVAudioSession sharedInstance] setActive:YES error:nil];
    //--- tried, not working: [audio_player prepareToPlay];
    [audio_player play];
}
Can anyone help me?
Found the solution!
I had the same problem: my app resumed the audio nicely after an interruption, but only while it was open. When it was in the background, it failed to resume playing the audio after an interruption.
I fixed this by adding the following lines of code:
Add this line whenever your app starts playing audio: [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
And in the endInterruption method, wait 1 or 2 seconds before resuming the audio. This gives the OS time to stop using the audio channel.
- (void)endInterruptionWithFlags:(NSUInteger)flags {
    // Validate that there are flags available.
    if (flags) {
        // Validate that the session should be resumed.
        if (flags & AVAudioSessionInterruptionFlags_ShouldResume) {
            // Wait roughly a second before resuming, so the OS has released the audio channel.
            dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
                // Resume playing the audio.
            });
        }
    }
}
You can also add this line when your app stops (not pauses) playing audio, but it is not required:
[[UIApplication sharedApplication] endReceivingRemoteControlEvents];
Try this:
-(void)audioPlayerEndInterruption:(AVAudioPlayer *)audioPlayer withFlags:(NSUInteger)flags {
    if (flags & AVAudioSessionInterruptionFlags_ShouldResume) {
        [audioPlayer play];
    }
}
Hope it helps.
OK, so my RecordViewController lets the user record their voice, and it works properly. But when I press the 'back' button and go to a different view, the recording stops. How do I make it keep recording, so the user can record their voice even while on different views?
@implementation RecordViewController
@synthesize actSpinner, btnStart, btnPlay;

-(void)countUp {
    mainInt += 1;
    seconds.text = [NSString stringWithFormat:@"%02d", mainInt];
}

-(IBAction)goBack:(id)sender {
    [self dismissModalViewControllerAnimated:YES];
}
/*
// The designated initializer. Override to perform setup that is required before the view is loaded.
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {
if (self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil]) {
// Custom initialization
}
return self;
}
*/
/*
// Implement loadView to create a view hierarchy programmatically, without using a nib.
- (void)loadView {
}
*/
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
    [super viewDidLoad];
    //Start the toggle in true mode.
    toggle = YES;
    btnPlay.hidden = YES;

    //Instantiate an instance of the AVAudioSession object.
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    //Set up the audioSession for playback and record.
    //We could just use record and then switch it to playback later, but
    //since we are going to do both, let's set it up once.
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    //Activate the session
    [audioSession setActive:YES error:&error];
}
/*
// Override to allow orientations other than the default portrait orientation.
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
// Return YES for supported orientations
return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
*/
- (IBAction) start_button_pressed {
    if (toggle)
    {
        toggle = NO;
        [actSpinner startAnimating];
        [btnStart setImage:[UIImage imageNamed:@"stoprecordingbutton.png"] forState:UIControlStateNormal];
        mainInt = 0;
        theTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(countUp) userInfo:nil repeats:YES];

        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;

        //Begin the recording session.
        //Error handling removed. Please add to your own code.

        //Set up the dictionary object with all the recording settings that this
        //recording session will use.
        //It's not clear to me which of these are required and which are the bare minimum.
        //This is a good resource: http://www.totodotnet.net/tag/avaudiorecorder/
        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];

        //Now that we have our settings, we are going to instantiate an instance of our recorder.
        //Generate a temp file for use by the recording.
        //This sample was one I found online and seems to be a good choice for making a tmp file that
        //will not overwrite an existing one.
        //I know this is a mess of collapsed things into one call. I can break it out if need be.
        recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
        NSLog(@"Using File called: %@", recordedTmpFile);

        //Set up the recorder to use this file and record to it.
        recorder = [[AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
        //Use the recorder to start the recording.
        //I'm not sure why we set the delegate to self yet.
        //Found this in another example, but I'm fuzzy on this still.
        [recorder setDelegate:self];
        //We call this to start the recording process and initialize
        //the subsystems so that when we actually say "record" it starts right away.
        [recorder prepareToRecord];
        //Start the actual recording
        [recorder record];
        //There is an optional method for doing the recording for a limited time, see
        //[recorder recordForDuration:(NSTimeInterval)10]
    }
    else
    {
        toggle = YES;
        [actSpinner stopAnimating];
        [btnStart setImage:[UIImage imageNamed:@"recordbutton.png"] forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;
        [theTimer invalidate];

        NSLog(@"Using File called: %@", recordedTmpFile);
        //Stop the recorder.
        [recorder stop];
    }
}
- (void)didReceiveMemoryWarning {
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];
    // Release any cached data, images, etc that aren't in use.
}

-(IBAction) play_button_pressed {
    //The play button was pressed...
    //Setup the AVAudioPlayer to play the file that we just recorded.
    AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
    [avPlayer prepareToPlay];
    [avPlayer play];
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    // Return YES for supported orientations
    return (interfaceOrientation == UIInterfaceOrientationLandscapeLeft || interfaceOrientation == UIInterfaceOrientationLandscapeRight);
}

- (void)viewDidUnload {
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;

    //Clean up the temp file.
    NSFileManager *fm = [NSFileManager defaultManager];
    [fm removeItemAtPath:[recordedTmpFile path] error:&error];
    //Release the remaining objects (never call -dealloc directly).
    [recorder release];
    recorder = nil;
    recordedTmpFile = nil;
}

- (void)dealloc {
    [super dealloc];
}

@end
Thanks
Keep a reference to it somewhere in the app (like in the delegate) so it won't get de-allocated.
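For example, a minimal sketch of keeping it alive in the app delegate (MyAppDelegate and the activeRecorder property are just placeholder names):

// MyAppDelegate.h
@property (nonatomic, retain) AVAudioRecorder *activeRecorder;

// In RecordViewController, right after [recorder record]:
MyAppDelegate *appDelegate = (MyAppDelegate *)[[UIApplication sharedApplication] delegate];
appDelegate.activeRecorder = recorder;  // the recorder now outlives this view controller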
I would set up the view controller using a singleton approach. I like it better than using the appDelegate, and Apple recommends it over editing the delegate.
Have a look here for some examples.
I use this approach when I want uninterrupted music playing in my apps and it works like a charm... be aware of memory usage though, as those instances are never released while your app is running. A sketch of the idea follows.
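A minimal sketch of that singleton approach (RecorderManager is a hypothetical class name; because the recorder lives in the shared instance, dismissing the recording view controller does not stop it):

// RecorderManager.h
#import <AVFoundation/AVFoundation.h>

@interface RecorderManager : NSObject
@property (nonatomic, retain) AVAudioRecorder *recorder;
+ (RecorderManager *)sharedManager;
@end

// RecorderManager.m
@implementation RecorderManager
+ (RecorderManager *)sharedManager {
    static RecorderManager *sharedInstance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[RecorderManager alloc] init];
    });
    return sharedInstance;
}
@end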