Program received signal: “EXC_BAD_ACCESS” - iPhone

Right, I have five views. One of them is RecordViewController, and it is the one causing the error.
I can switch through the views fine, including from RecordViewController to another view. But when I try to switch back to RecordViewController, the app crashes with this error:
Program received signal: “EXC_BAD_ACCESS”.
Data Formatters temporarily unavailable, will re-try after a 'continue'. (The program being debugged was signaled while in a function called from GDB.
GDB has restored the context to what it was before the call.
To change this behavior use "set unwindonsignal off"
Evaluation of the expression containing the function (gdb_objc_startDebuggerMode) will be abandoned.)
Here is the code for RecordViewController. I've removed the view-switching code, as it's not relevant.
@implementation RecordViewController
@synthesize actSpinner, btnStart, btnPlay;
-(void)countUp {
mainInt += 1;
seconds.text = [NSString stringWithFormat:@"%02d", mainInt];
}
static RecordViewController *sharedInstance = nil;
+ (RecordViewController*)sharedInstance {
if (sharedInstance == nil) {
sharedInstance = [[super allocWithZone:NULL] init];
}
return sharedInstance;
}
+ (id)allocWithZone:(NSZone*)zone {
return [[self sharedInstance] retain];
}
- (id)copyWithZone:(NSZone *)zone {
return self;
}
- (id)retain {
return self;
}
- (NSUInteger)retainCount {
return NSUIntegerMax;
}
- (void)release {
}
- (id)autorelease {
return self;
}
- (void)viewDidLoad {
[super viewDidLoad];
toggle = YES;
btnPlay.hidden = YES;
AVAudioSession * audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error: &error];
[audioSession setActive:YES error: &error];
}
- (IBAction) start_button_pressed{
if(toggle)
{
toggle = NO;
[actSpinner startAnimating];
[btnStart setImage:[UIImage imageNamed:#"recordstop.png"] forState:UIControlStateNormal];
mainInt = 0;
theTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(countUp) userInfo:nil repeats:YES];
btnPlay.enabled = toggle;
btnPlay.hidden = !toggle;
NSMutableDictionary* recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];
recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent: [NSString stringWithFormat: @"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
NSLog(@"Using File called: %@", recordedTmpFile);
recorder = [[ AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
[recorder setDelegate:self];
[recorder prepareToRecord];
[recorder record];
}
else
{
toggle = YES;
[actSpinner stopAnimating];
[btnStart setImage:[UIImage imageNamed:#"recordrecord.png"] forState:UIControlStateNormal];
btnPlay.enabled = toggle;
btnPlay.hidden = !toggle;
[theTimer invalidate];
NSLog(#"Using File called: %#",recordedTmpFile);
[recorder stop];
}
}
- (void)didReceiveMemoryWarning {
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
-(IBAction) play_button_pressed{
AVAudioPlayer * avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
[avPlayer prepareToPlay];
[avPlayer play];
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
// Return YES for supported orientations
return (interfaceOrientation == UIInterfaceOrientationLandscapeLeft || interfaceOrientation == UIInterfaceOrientationLandscapeRight);
}
- (void)viewDidUnload {
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
//Clean up the temp file.
NSFileManager * fm = [NSFileManager defaultManager];
[fm removeItemAtPath:[recordedTmpFile path] error:&error];
//Call the dealloc on the remaining objects.
[recorder dealloc];
recorder = nil;
recordedTmpFile = nil;
}
- (void)dealloc {
[super dealloc];
}
@end

Sounds like you are not retaining your RecordViewController, so when you push another view it gets released, and when you try to go back it is no longer there.
If you are not using ARC, retain it when you create it.
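For example, a minimal sketch under manual reference counting (the recordViewController ivar and the switching context are hypothetical, standing in for however you swap views):
// MRC sketch: keep an owning reference so the controller survives view switches.
// recordViewController is a hypothetical retained ivar in the object doing the switching.
if (recordViewController == nil) {
    recordViewController = [[RecordViewController alloc] initWithNibName:@"RecordViewController" bundle:nil]; // owned via alloc/init
}
[self.view addSubview:recordViewController.view];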

Remove the implementations of the following methods altogether:
+ (id)allocWithZone:(NSZone*)zone
- (id)copyWithZone:(NSZone *)zone
- (id)retain
- (NSUInteger)retainCount
- (void)release
Then change the implementation of sharedInstance to the following, and add the alloc override shown below it.
+(RecordViewController *) sharedInstance {
@synchronized([RecordViewController class])
{
if (!sharedInstance)
{
[[self alloc]init];
}
return sharedInstance;
}
return nil;
}
+(id) alloc
{
@synchronized([RecordViewController class])
{
NSAssert(sharedInstance == nil, @"Attempted to allocate a second singleton RecordViewController");
sharedInstance = [super alloc];
return sharedInstance;
}
return nil;
}

I have had this issue several times, and it was always some sort of infinite recursion, most likely from calling the setter from within the setter.
Edit:
Googling for it, I found this: ERROR: Memory Leak, data formatters temporarily unavailable
Since your code is also leaking, it could be the same problem.
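To illustrate the recursion trap with a hypothetical seconds property (a sketch, not the asker's code):
// Wrong: dot syntax inside the setter calls the setter again, recursing until the stack blows (EXC_BAD_ACCESS).
- (void)setSeconds:(UILabel *)newSeconds {
    self.seconds = newSeconds; // invokes setSeconds: again
}
// Right (MRC): assign to the ivar directly.
- (void)setSeconds:(UILabel *)newSeconds {
    if (seconds != newSeconds) {
        [seconds release];
        seconds = [newSeconds retain];
    }
}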

Related

Playing a movie stops AVCaptureSession recording

I have an iOS app that records video from the front-facing camera in the background, and it was working fine. But now I am trying to play a short mp4 at the same time, and playback using MPMoviePlayerController stops the capture session.
I tried AVPlayer instead, with the same result.
I also set [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
Still no luck. Has anyone faced and solved the same problem?
Thanks for any suggestions.
I am using the iOS 5 SDK.
#import "ViewController.h"
@interface ViewController ()
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
UIButton *recButton=[[UIButton alloc] initWithFrame:CGRectMake(10,200, 200,40)] ;
recButton.backgroundColor = [UIColor blackColor];
[recButton addTarget:self action:@selector(startRecording) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:recButton];
isRecording=NO;
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
}
-(void) viewWillAppear:(BOOL)animated
{
self.navigationController.navigationBarHidden = YES;
}
-(void) viewWillDisappear:(BOOL)animated
{
self.navigationController.navigationBarHidden = NO;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
return (interfaceOrientation == UIInterfaceOrientationPortrait);
} else {
return YES;
}
}
#pragma mark video playing
-(void) startRecording
{
if (isRecording) {
[self stopVideoRecording];
isRecording=NO;
}
else
{
[self initCaptureSession];
[self startVideoRecording];
isRecording=YES;
}
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:#"new"
ofType:#"mov"]];
[self playMovieAtURL:url];
}
-(void) playMovieAtURL: (NSURL*) theURL {
player =
[[MPMoviePlayerController alloc] initWithContentURL: theURL ];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
player.scalingMode = MPMovieScalingModeAspectFill;
player.controlStyle = MPMovieControlStyleNone;
[player prepareToPlay];
// Register for the playback finished notification
[[NSNotificationCenter defaultCenter]
addObserver: self
selector: @selector(myMovieFinishedCallback:)
name: MPMoviePlayerPlaybackDidFinishNotification
object: player];
[player.view setFrame: self.view.bounds];
[self.view addSubview:player.view];
// Movie playback is asynchronous, so this method returns immediately.
[player play];
}
// When the movie is done, release the controller.
-(void) myMovieFinishedCallback: (NSNotification*) aNotification
{
MPMoviePlayerController* theMovie = [aNotification object];
[[NSNotificationCenter defaultCenter]
removeObserver: self
name: MPMoviePlayerPlaybackDidFinishNotification
object: theMovie];
[player.view removeFromSuperview];
[self stopVideoRecording];
}
#pragma mark -
#pragma mark recording
-(void) initCaptureSession
{
NSLog(#"Setting up capture session");
captureSession = [[AVCaptureSession alloc] init];
//----- ADD INPUTS -----
NSLog(#"Adding video input");
//ADD VIDEO INPUT
AVCaptureDevice *VideoDevice = [self frontFacingCameraIfAvailable ];
//[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (VideoDevice)
{
NSError *error;
videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error];
if (!error)
{
if ([captureSession canAddInput:videoInputDevice])
[captureSession addInput:videoInputDevice];
else
NSLog(#"Couldn't add video input");
}
else
{
NSLog(#"Couldn't create video input");
}
}
else
{
NSLog(#"Couldn't create video capture device");
}
//ADD AUDIO INPUT
NSLog(#"Adding audio input");
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput)
{
[captureSession addInput:audioInput];
}
//----- ADD OUTPUTS ---
//ADD MOVIE FILE OUTPUT
NSLog(#"Adding movie file output");
movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
// Float64 TotalSeconds = 60; //Total seconds
// int32_t preferredTimeScale = 30; //Frames per second
// CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale); //<<SET MAX DURATION
// movieFileOutput.maxRecordedDuration = maxDuration;
movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME
if ([captureSession canAddOutput:movieFileOutput])
[captureSession addOutput:movieFileOutput];
//SET THE CONNECTION PROPERTIES (output properties)
[self CameraSetOutputProperties]; //(We call a method as it also has to be done after changing camera)
//----- SET THE IMAGE QUALITY / RESOLUTION -----
//Options:
// AVCaptureSessionPresetHigh - Highest recording quality (varies per device)
// AVCaptureSessionPresetMedium - Suitable for WiFi sharing (actual values may change)
// AVCaptureSessionPresetLow - Suitable for 3G sharing (actual values may change)
// AVCaptureSessionPreset640x480 - 640x480 VGA (check its supported before setting it)
// AVCaptureSessionPreset1280x720 - 1280x720 720p HD (check its supported before setting it)
// AVCaptureSessionPresetPhoto - Full photo resolution (not supported for video output)
NSLog(#"Setting image quality");
[captureSession setSessionPreset:AVCaptureSessionPresetMedium];
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) //Check size based configs are supported before setting them
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
//----- START THE CAPTURE SESSION RUNNING -----
[captureSession startRunning];
}
//********** CAMERA SET OUTPUT PROPERTIES **********
- (void) CameraSetOutputProperties
{
AVCaptureConnection *CaptureConnection=nil;
//SET THE CONNECTION PROPERTIES (output properties)
NSComparisonResult order = [[UIDevice currentDevice].systemVersion compare: @"5.0.0" options: NSNumericSearch];
if (order == NSOrderedSame || order == NSOrderedDescending) {
// OS version >= 5.0.0
CaptureConnection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
// if (CaptureConnection.supportsVideoMinFrameDuration)
// CaptureConnection.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
// if (CaptureConnection.supportsVideoMaxFrameDuration)
// CaptureConnection.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
// if (CaptureConnection.supportsVideoMinFrameDuration)
// {
// // CMTimeShow(CaptureConnection.videoMinFrameDuration);
// // CMTimeShow(CaptureConnection.videoMaxFrameDuration);
// }
} else {
// OS version < 5.0.0
CaptureConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[movieFileOutput connections]];
}
//Set landscape (if required)
if ([CaptureConnection isVideoOrientationSupported])
{
AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationPortrait;// AVCaptureVideoOrientationLandscapeRight; //<<<<<SET VIDEO ORIENTATION IF LANDSCAPE
[CaptureConnection setVideoOrientation:orientation];
}
//Set frame rate (if requried)
//CMTimeShow(CaptureConnection.videoMinFrameDuration);
//CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
- (void) startVideoRecording
{
//Create temporary URL to record to
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
NSError *error;
if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
{
//Error - handle if requried
NSLog(#"file remove error");
}
}
//Start recording
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
}
-(void) stopVideoRecording
{
[movieFileOutput stopRecording];
}
//********** DID FINISH RECORDING TO OUTPUT FILE AT URL **********/
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
NSLog(#"didFinishRecordingToOutputFileAtURL - enter");
BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
// A problem occurred: Find out if the recording was successful.
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if (value)
{
RecordedSuccessfully = [value boolValue];
}
}
if (RecordedSuccessfully)
{
//----- RECORDED SUCESSFULLY -----
NSLog(#"didFinishRecordingToOutputFileAtURL - success");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error)
{
if (error)
{
NSLog(#"File save error");
}
else
{
recordedVideoURL=assetURL;
}
}];
}
else {
NSString *assetURL=[self copyFileToDocuments:outputFileURL];
if(assetURL!=nil)
{
recordedVideoURL=[NSURL URLWithString:assetURL];
}
}
}
}
- (NSString*) copyFileToDocuments:(NSURL *)fileURL
{
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:#"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:#"/output_%#.mov", [dateFormatter stringFromDate:[NSDate date]]];
NSError *error;
if (![[NSFileManager defaultManager] copyItemAtURL:fileURL toURL:[NSURL fileURLWithPath:destinationPath] error:&error]) {
NSLog(#"File save error %#", [error localizedDescription]);
return nil;
}
return destinationPath;
}
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
for ( AVCaptureConnection *connection in connections ) {
for ( AVCaptureInputPort *port in [connection inputPorts] ) {
if ( [[port mediaType] isEqual:mediaType] ) {
return connection;
}
}
}
return nil;
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
// look at all the video devices and get the first one that's on the front
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices)
{
if (device.position == AVCaptureDevicePositionFront)
{
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if ( ! captureDevice)
{
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
#pragma mark -
@end
I have this problem too. I don't know how to fix it completely, but I know the problem is here:
[captureSession addInput:audioInput];
If you delete this line of code, it works fine, so I think it's an audio-mixing problem or some other audio problem.
I am still looking for the full answer.
I have found the answer here: answer. It works!
But remember to add AudioToolbox.framework; maybe that will be helpful for you.
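For reference, the fix usually cited for MPMoviePlayerController killing an AVCaptureSession on iOS 5 era SDKs is to keep the player off the application's audio session. A sketch, reusing the playMovieAtURL: variables from the question:
player = [[MPMoviePlayerController alloc] initWithContentURL:theURL];
player.useApplicationAudioSession = NO; // don't let the movie player take over the (recording) audio session
[player prepareToPlay];
[player play];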

AVAudioRecorder records only the audio after interruption

In my application, which records and plays audio using AVAudioRecorder and AVAudioPlayer, I came across a problem with incoming phone calls. If a phone call comes in while recording is in progress, only the audio recorded after the call is kept. I want the recording made after the phone call to be a continuation of the audio recorded before it.
I track the interruption in the audio recorder using the AVAudioRecorderDelegate methods
- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
and
- (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder.
In my end-interruption method I activate the audio session.
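Roughly, that delegate pair looks like this (a sketch; the record call on resume is the intent, and as described it does not yield a continuous file):
- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
{
    // The system has already stopped the recording at this point.
}
- (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder
{
    [[AVAudioSession sharedInstance] setActive:YES error:nil];
    [avRecorder record]; // resumes, but only this part ends up in the file
}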
Here is the recording code that I use
- (void)startRecordingProcess
{
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *err = nil;
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
if(err)
{
DEBUG_LOG(#"audioSession: %# %d %#", [err domain], [err code], [[err userInfo] description]);
return;
}
err = nil;
[audioSession setActive:YES error:&err];
if(err)
{
DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
return;
}
// Record settings for recording the audio
recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatMPEG4AAC],AVFormatIDKey,
[NSNumber numberWithInt:44100],AVSampleRateKey,
[NSNumber numberWithInt: 2],AVNumberOfChannelsKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
nil];
BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:recorderFilePath];
if (fileExists)
{
BOOL appendingFileExists =
[[NSFileManager defaultManager] fileExistsAtPath:appendingFilePath];
if (appendingFileExists)
{
[[NSFileManager defaultManager]removeItemAtPath:appendingFilePath error:nil];
}
if (appendingFilePath)
{
[appendingFilePath release];
appendingFilePath = nil;
}
appendingFilePath = [[NSString alloc]initWithFormat:@"%@/AppendedAudio.m4a", DOCUMENTS_FOLDER];
fileUrl = [NSURL fileURLWithPath:appendingFilePath];
}
else
{
isFirstTime = YES;
if (recorderFilePath)
{
DEBUG_LOG(#"Testing 2");
[recorderFilePath release];
recorderFilePath = nil;
}
DEBUG_LOG(#"Testing 3");
recorderFilePath = [[NSString alloc]initWithFormat:#"%#/RecordedAudio.m4a", DOCUMENTS_FOLDER];
fileUrl = [NSURL fileURLWithPath:recorderFilePath];
}
err = nil;
recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSetting error:&err];
if(!recorder)
{
DEBUG_LOG(#"recorder: %# %d %#", [err domain], [err code], [[err userInfo] description]);
[[AlertFunctions sharedInstance] showMessageWithTitle:kAppName
message:[err localizedDescription]
delegate:nil
cancelButtonTitle:#"Ok"];
return;
}
//prepare to record
[recorder setDelegate:self];
[recorder prepareToRecord];
recorder.meteringEnabled = YES;
[recorder record];
}
While searching for a solution to this issue I came across two other links:
how to resume recording after interruption occured in iphone? and http://www.iphonedevsdk.com/forum/iphone-sdk-development/31268-avaudiorecorderdelegate-interruption.html, which discuss the same issue.
I tried the suggestions given in those links, but without success.
I hope to make it work with AVAudioRecorder itself.
Is there any way I could find a solution to this issue?
All valuable suggestions are appreciated.
After some research, I was notified by Apple that this is an issue with the current API. I managed to work around it by saving the audio file recorded before the interruption and joining it with the file recorded after resuming. I hope this helps someone who faces the same issue.
I was also facing a similar issue, where AVAudioRecorder kept only what was recorded after the interruption.
So I fixed the issue by maintaining an array of recordings, keeping them in NSTemporaryDirectory, and merging them at the end.
Below are the key steps:
Make your class listen to the AVAudioSessionInterruptionNotification.
On interruption begin (AVAudioSessionInterruptionTypeBegan), save the current recording.
On interruption end (AVAudioSessionInterruptionTypeEnded), start a new recording if the interruption options include AVAudioSessionInterruptionOptionShouldResume.
Append all recordings when the Save button is hit.
The code snippets for the steps above:
// 1. Make this class listen to the AVAudioSessionInterruptionNotification in viewDidLoad
- (void)viewDidLoad
{
[super viewDidLoad];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(handleAudioSessionInterruption:)
name:AVAudioSessionInterruptionNotification
object:[AVAudioSession sharedInstance]];
// other coding stuff
}
// observe the interruption begin / end
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
switch (interruptionType) {
// 2. save recording on interruption begin
case AVAudioSessionInterruptionTypeBegan:{
// stop recording
// Update the UI accordingly
break;
}
case AVAudioSessionInterruptionTypeEnded:{
if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
// create a new recording
// Update the UI accordingly
}
break;
}
default:
break;
}
}
// 4. append all recordings
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
// append all recordings one after other
}
Here is a working example:
//
// XDRecordViewController.m
//
// Created by S1LENT WARRIOR
//
#import "XDRecordViewController.h"
@interface XDRecordViewController ()
{
AVAudioRecorder *recorder;
__weak IBOutlet UIButton* btnRecord;
__weak IBOutlet UIButton* btnSave;
__weak IBOutlet UIButton* btnDiscard;
__weak IBOutlet UILabel* lblTimer; // a UILabel to display the recording time
// some variables to display the timer on a lblTimer
NSTimer* timer;
NSTimeInterval intervalTimeElapsed;
NSDate* pauseStart;
NSDate* previousFireDate;
NSDate* recordingStartDate;
// interruption handling variables
BOOL isInterrupted;
NSInteger preInterruptionDuration;
NSMutableArray* recordings; // an array of recordings to be merged in the end
}
@end
@implementation XDRecordViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Make this class listen to the AVAudioSessionInterruptionNotification
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(handleAudioSessionInterruption:)
name:AVAudioSessionInterruptionNotification
object:[AVAudioSession sharedInstance]];
[self clearContentsOfDirectory:NSTemporaryDirectory()]; // clear contents of NSTemporaryDirectory()
recordings = [NSMutableArray new]; // initialize recordings
[self setupAudioSession]; // setup the audio session. you may customize it according to your requirements
}
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[self initRecording]; // start recording as soon as the view appears
}
- (void)dealloc
{
[self clearContentsOfDirectory:NSTemporaryDirectory()]; // remove all files from NSTemporaryDirectory
[[NSNotificationCenter defaultCenter] removeObserver:self]; // remove this class from NSNotificationCenter
}
#pragma mark - Event Listeners
// called when recording button is tapped
- (IBAction) btnRecordingTapped:(UIButton*)sender
{
sender.selected = !sender.selected; // toggle the button
if (sender.selected) { // resume recording
[recorder record];
[self resumeTimer];
} else { // pause recording
[recorder pause];
[self pauseTimer];
}
}
// called when save button is tapped
- (IBAction) btnSaveTapped:(UIButton*)sender
{
[self pauseTimer]; // pause the timer
// disable the UI while the recording is saving so that user may not press the save, record or discard button again
btnSave.enabled = NO;
btnRecord.enabled = NO;
btnDiscard.enabled = NO;
[recorder stop]; // stop the AVAudioRecorder so that the audioRecorderDidFinishRecording delegate function may get called
// Deactivate the AVAudioSession
NSError* error;
[[AVAudioSession sharedInstance] setActive:NO error:&error];
if (error) {
NSLog(#"%#", error);
}
}
// called when discard button is tapped
- (IBAction) btnDiscardTapped:(id)sender
{
[self stopTimer]; // stop the timer
recorder.delegate = nil; // set delegate to nil so that the audioRecorderDidFinishRecording delegate function does not get called
[recorder stop]; // stop the recorder
// Deactivate the AVAudioSession
NSError* error;
[[AVAudioSession sharedInstance] setActive:NO error:&error];
if (error) {
NSLog(#"%#", error);
}
[self.navigationController popViewControllerAnimated:YES];
}
#pragma mark - Notification Listeners
// called when an AVAudioSessionInterruption occurs
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
switch (interruptionType) {
case AVAudioSessionInterruptionTypeBegan:{
// • Recording has stopped, already inactive
// • Change state of UI, etc., to reflect non-recording state
preInterruptionDuration += recorder.currentTime; // time elapsed
if(btnRecord.selected) { // timer is already running
[self btnRecordingTapped:btnRecord]; // pause the recording and pause the timer
}
recorder.delegate = nil; // set delegate to nil so that audioRecorderDidFinishRecording does not get called
[recorder stop]; // stop recording
isInterrupted = YES;
break;
}
case AVAudioSessionInterruptionTypeEnded:{
// • Make session active
// • Update user interface
// • AVAudioSessionInterruptionOptionShouldResume option
if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
// Here you should create a new recording
[self initRecording]; // create a new recording
[self btnRecordingTapped:btnRecord];
}
break;
}
default:
break;
}
}
#pragma mark - AVAudioRecorderDelegate
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
[self appendAudiosAtURLs:recordings completion:^(BOOL success, NSURL *outputUrl) {
// do whatever you want with the new audio file :)
}];
}
#pragma mark - Timer
- (void)timerFired:(NSTimer*)timer
{
intervalTimeElapsed++;
[self updateDisplay];
}
// helper to format a time interval as a string
- (NSString*) timerStringSinceTimeInterval:(NSTimeInterval)timeInterval
{
NSDate *timerDate = [NSDate dateWithTimeIntervalSince1970:timeInterval];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:#"mm:ss"];
[dateFormatter setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:0.0]];
return [dateFormatter stringFromDate:timerDate];
}
// called when recording pauses
- (void) pauseTimer
{
pauseStart = [NSDate dateWithTimeIntervalSinceNow:0];
previousFireDate = [timer fireDate];
[timer setFireDate:[NSDate distantFuture]];
}
- (void) resumeTimer
{
if (!timer) {
timer = [NSTimer scheduledTimerWithTimeInterval:1.0
target:self
selector:#selector(timerFired:)
userInfo:nil
repeats:YES];
return;
}
float pauseTime = - 1 * [pauseStart timeIntervalSinceNow];
[timer setFireDate:[previousFireDate dateByAddingTimeInterval:pauseTime]];
}
- (void)stopTimer
{
[self updateDisplay];
[timer invalidate];
timer = nil;
}
- (void)updateDisplay
{
lblTimer.text = [self timerStringSinceTimeInterval:intervalTimeElapsed];
}
#pragma mark - Helper Functions
- (void) initRecording
{
// Set the audio file
NSString* name = [NSString stringWithFormat:@"recording_%@.m4a", @(recordings.count)]; // creating a unique name for each audio file
NSURL *outputFileURL = [NSURL fileURLWithPathComponents:@[NSTemporaryDirectory(), name]];
[recordings addObject:outputFileURL];
// Define the recorder settings
NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:@(kAudioFormatMPEG4AAC) forKey:AVFormatIDKey];
[recordSetting setValue:@(44100.0) forKey:AVSampleRateKey];
[recordSetting setValue:@(1) forKey:AVNumberOfChannelsKey];
NSError* error;
// Initiate and prepare the recorder
recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:&error];
recorder.delegate = self;
recorder.meteringEnabled = YES;
[recorder prepareToRecord];
if (![AVAudioSession sharedInstance].inputAvailable) { // can not record audio if mic is unavailable
NSLog(#"Error: Audio input device not available!");
return;
}
intervalTimeElapsed = 0;
recordingStartDate = [NSDate date];
if (isInterrupted) {
intervalTimeElapsed = preInterruptionDuration;
isInterrupted = NO;
}
// Activate the AVAudioSession
[[AVAudioSession sharedInstance] setActive:YES error:&error];
if (error) {
NSLog(#"%#", error);
}
recordingStartDate = [NSDate date]; // Set the recording start date
[self btnRecordingTapped:btnRecord];
}
- (void)setupAudioSession
{
static BOOL audioSessionSetup = NO;
if (audioSessionSetup) {
return;
}
AVAudioSession* session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord
withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
error:nil];
[session setMode:AVAudioSessionModeSpokenAudio error:nil];
audioSessionSetup = YES;
}
// takes an array of audio file URLs and appends them to one another
// the basic logic was derived from here: http://stackoverflow.com/a/16040992/634958
// I modified it to append multiple files
- (void) appendAudiosAtURLs:(NSMutableArray*)urls completion:(void(^)(BOOL success, NSURL* outputUrl))handler
{
// Create a new audio track we can append to
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack* appendedAudioTrack =
[composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
// Grab the first audio asset that needs to be appended
AVURLAsset* originalAsset = [[AVURLAsset alloc]
initWithURL:urls.firstObject options:nil];
[urls removeObjectAtIndex:0];
NSError* error = nil;
// Grab the first audio track and insert it into our appendedAudioTrack
AVAssetTrack *originalTrack = [[originalAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, originalAsset.duration);
[appendedAudioTrack insertTimeRange:timeRange
ofTrack:originalTrack
atTime:kCMTimeZero
error:&error];
CMTime duration = originalAsset.duration;
if (error) {
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(NO, nil);
});
}
return; // don't continue appending after a failure
}
for (NSURL* audioUrl in urls) {
AVURLAsset* newAsset = [[AVURLAsset alloc]
initWithURL:audioUrl options:nil];
// Grab the rest of the audio tracks and insert them at the end of each other
AVAssetTrack *newTrack = [[newAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
timeRange = CMTimeRangeMake(kCMTimeZero, newAsset.duration);
[appendedAudioTrack insertTimeRange:timeRange
ofTrack:newTrack
atTime:duration
error:&error];
duration = appendedAudioTrack.timeRange.duration;
if (error) {
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(NO, nil);
});
}
return; // abort on failure
}
}
// Create a new audio file using the appendedAudioTrack
AVAssetExportSession* exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (!exportSession) {
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(NO, nil);
});
}
return; // no export session, nothing more to do
}
NSArray* appendedAudioPath = @[NSTemporaryDirectory(), @"temp.m4a"]; // name of the final audio file
exportSession.outputURL = [NSURL fileURLWithPathComponents:appendedAudioPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
BOOL success = NO;
// exported successfully?
switch (exportSession.status) {
case AVAssetExportSessionStatusFailed:
break;
case AVAssetExportSessionStatusCompleted: {
success = YES;
break;
}
case AVAssetExportSessionStatusWaiting:
break;
default:
break;
}
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(success, exportSession.outputURL);
});
}
}];
}
- (void) clearContentsOfDirectory:(NSString*)directory
{
NSFileManager *fm = [NSFileManager defaultManager];
NSError *error = nil;
for (NSString *file in [fm contentsOfDirectoryAtPath:directory error:&error]) {
[fm removeItemAtURL:[NSURL fileURLWithPathComponents:#[directory, file]] error:&error];
}
}
@end
I know it's too late to answer the question, but I hope this helps someone else!

How to create an audio player application like the Pandora app?

I have created an application which plays multiple audio files stored locally. The audio files are long. The audio player offers the user the following options:
Forward,
Rewind,
Next Track,
Previous Track,
I am planning to use AVAudioPlayer so that I can play long audio files. When I change the audio file, i.e. press the next-track button, the audio player instance is not getting released. This problem appears only some of the time. Please help me!
Next Track Button IBAction Method
- (IBAction) nextTrackPressed
{
[audioPlay stopAudio];
if (audioPlay) {
audioPlay = nil;
[audioPlay release];
}
appDelegate.trackSelected += 1;
[self intiNewAudioFile];
[self play];
}
I initialize the audio file in the method below:
-(void) intiNewAudioFile
{
NSAutoreleasePool *subPool = [[NSAutoreleasePool alloc] init];
NSString *filePath = [[NSString alloc] init];
trackObject = [appDelegate.trackDetailArray objectAtIndex:appDelegate.trackSelected];
NSLog(#"%#",trackObject.trackName);
// Get the file path to the song to play.
filePath = [[NSBundle mainBundle] pathForResource:trackObject.trackName ofType:@"mp3"];
// Convert the file path to a URL.
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:filePath];
if (audioPlay) {
audioPlay = nil;
[audioPlay release];
}
audioPlay = [[AudioPlayerClass alloc] init];
[audioPlay initAudioWithUrl:fileURL];
[filePath release];
[fileURL release];
[subPool release];
}
AudioPlayerClass Implementation
#import "AudioPlayerClass.h"
@implementation AudioPlayerClass
- (void) initAudioWithUrl: (NSURL *) url
{
curAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[curAudioPlayer setDelegate:self];
[curAudioPlayer prepareToPlay];
}
- (void) playAudio
{
[curAudioPlayer play];
}
- (void) pauseAudio
{
[curAudioPlayer pause];
}
- (void) stopAudio
{
[curAudioPlayer stop];
}
- (BOOL) isAudioPlaying
{
return curAudioPlayer.playing;
}
- (void) setAudiowithCurrentTime:(NSInteger) time
{
curAudioPlayer.currentTime = time;
}
- (NSInteger) getAudioFileDuration
{
return curAudioPlayer.duration;
}
- (NSInteger) getAudioCurrentTime
{
return curAudioPlayer.currentTime;
}
- (void) releasePlayer
{
[curAudioPlayer release];
}
- (void)dealloc {
[curAudioPlayer release];
[super dealloc];
}
@end
Your problem is here:
if (audioPlay) {
audioPlay = nil;
[audioPlay release];
}
You are setting audioPlay to nil before calling release, which means the release message is sent to nil and does nothing. You need to reverse the order of these two lines:
if (audioPlay) {
[audioPlay release];
audioPlay = nil;
}
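Releasing first and then nil-ing the pointer is the standard MRC idiom: the release frees the object, while the nil assignment protects later code from a dangling pointer.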

Front Camera Video Recording iPhone 4?

I am trying to record a video from the front camera of an iPhone 4 using the AVFoundation framework, with the help of the WWDC samples I got from the iPhone Developer Program. But I still can't get it to work: the video does not get recorded, or maybe it is not saved to my iPhone library. Here's the code I am trying to use. It would be really helpful if someone could help me with the problem I am having.
-(void)recordVideo
{
AVCaptureDeviceInput *videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:videoInput];
[session addOutput:movieFileOutput];
[movieFileOutput release];
if (![session isRunning])
{
[self performSelector:@selector(startRecording) withObject:nil afterDelay:1.0];
[session startRunning];
}
}
- (void) startRecording
{
NSLog(#"start recording");
AVCaptureConnection *videoConnection = [playVideo connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
if ([videoConnection isVideoOrientationSupported]) {
[videoConnection setVideoOrientation:[self orientation]];
}
[[self movieFileOutput] startRecordingToOutputFileURL:[self tempFileURL]
recordingDelegate:self];
}
- (void) stopRecording
{
NSLog(#"stop recording");
[[self movieFileOutput] stopRecording];
}
- (NSURL *) tempFileURL
{
NSLog(#"temp file url");
NSString *outputPath = [[NSString alloc] initWithFormat:#"%#%#", NSTemporaryDirectory(), #"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath]) {
NSLog(#"exists");
}
[outputPath release];
return [outputURL autorelease];
}
- (void) setConnectionWithMediaType:(NSString *)mediaType enabled:(BOOL)enabled;
{
[[playVideo connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]] setEnabled:enabled];
}
+ (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections;
{
NSLog(#"connection with media type");
for ( AVCaptureConnection *connection in connections ) {
for ( AVCaptureInputPort *port in [connection inputPorts] ) {
if ( [[port mediaType] isEqual:mediaType] ) {
return [[connection retain] autorelease];
}
}
}
return nil;
}
@implementation recordVideo (Internal)
- (AVCaptureDevice *) cameraWithPosition:(AVCaptureDevicePosition) position
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
if ([device position] == position) {
return device;
}
}
return nil;
}
- (AVCaptureDevice *) backFacingCamera
{
NSLog(#"back");
return [self cameraWithPosition:AVCaptureDevicePositionBack];
}
- (AVCaptureDevice *) frontFacingCamera
{
NSLog(#"front ");
return [self cameraWithPosition:AVCaptureDevicePositionFront];
}
@end
@implementation recordVideo (AVCaptureFileOutputRecordingDelegate)
- (void) captureOutput:(AVCaptureFileOutput *)captureOutput
didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
fromConnections:(NSArray *)connections
{
NSLog(#"did start recording");
}
- (void) captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
NSLog(#"did finish recording output file");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error){
if (error && [delegate respondsToSelector:@selector(assetLibraryError:forURL:)]) {
[delegate assetLibraryError:error forURL:assetURL];
}
}];
}
else {
}
[library release];
}
@end
You are throwing away the errors; how can you debug if you do that?
You need to be more specific about what is and is not happening. Which methods are called? What errors (if any) does the SDK report? etc...
You have released movieFileOutput in -(void)recordVideo, yet you are still using the released movieFileOutput in - (void) startRecording as well as in other methods. You had better not release an object you are still using.
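On the first point, a minimal sketch of capturing the error instead of discarding it (reusing the backFacingCamera helper from the question):
NSError *error = nil;
AVCaptureDeviceInput *videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:&error];
if (!videoInput) {
    NSLog(@"Could not create video input: %@", error);
    return;
}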

Adding a delegate to AVAudioPlayer causes "_NSAutoreleaseNoPool(): Object 0x55e060 of class NSCFString autoreleased with no pool in place - just leaking"

I have been using a class to play sounds using AVAudioPlayer. Since I want to release these sounds right after they are played, I added a delegate. That causes a "_NSAutoreleaseNoPool(): Object 0x55e060 of class NSCFString autoreleased with no pool in place - just leaking" error right after the sound completes playing, but before my -audioPlayerDidFinishPlaying is called.
Here are some sources:
@interface MyAVAudioPlayer : NSObject <AVAudioPlayerDelegate> {
AVAudioPlayer *player;
float savedVolume;
BOOL releaseWhenDone;
}
The main class .m:
- (MyAVAudioPlayer *) initPlayerWithName: (NSString *) name;
{
NSString *soundFilePath = [[NSBundle mainBundle] pathForResource: name ofType: @"caf"];
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath: soundFilePath];
player = [[AVAudioPlayer alloc] initWithContentsOfURL: fileURL error: nil];
[fileURL release];
[player prepareToPlay];
return (self);
}
- (MyAVAudioPlayer *)getAndPlayAndRelease:(NSString *)name withVolume:(float) vol;
{
MyAVAudioPlayer *newMyAVPlayer = [self initPlayerWithName:name];
player.volume = vol;
[player play];
releaseWhenDone = YES;
[player setDelegate: self];
return newMyAVPlayer;
}
+ (void) getAndPlayAndReleaseAuto:(NSString *)name withVolume:(float) vol;
{
MyAVAudioPlayer *newMyAVPlayer = [[MyAVAudioPlayer alloc] getAndPlayAndRelease:name withVolume:vol];
// [newMyAVPlayer autorelease];
}
#pragma mark -
#pragma mark AVAudioPlayer Delegate Methods
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)playedPlayer successfully:(BOOL)flag {
if (releaseWhenDone) {
NSLog(#"releasing");
[playedPlayer release];
// [self release];
NSLog(#"released");
}
}
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error {
NSLog(#"Error while decoding: %#", [error localizedDescription] );
}
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player {
NSLog(#"Interrupted!");
}
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player {
NSLog(#"EndInterruption!");
}
- (BOOL) play;
{
player.currentTime = 0.0;
return [player play];
}
Commenting out the [player setDelegate: self]; makes the error go away, but then my audioPlayerDidFinishPlaying doesn't get called.
Any thoughts? Am I suddenly running in another thread?
I found the problem. My bug, of course.
In a lot of my class files, I was adding:
-(BOOL) respondsToSelector:(SEL) aSelector
{
NSLog(#"Class: %# subclass of %#, Selector: %#", [self class], [super class], NSStringFromSelector(aSelector));
return [super respondsToSelector:aSelector];
}
Mainly out of curiosity.
Well, when I added a delegate to my sound, this method got called before the delegate, from whatever run loop the AVAudioPlayer happened to be in, and likely one with no autorelease pool in place.
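If you want to keep an override like that, one way to make it safe is to give the method its own pool, so autoreleased objects created on a poolless background thread have somewhere to go. A sketch:
- (BOOL)respondsToSelector:(SEL)aSelector
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init]; // covers the autoreleased string from NSStringFromSelector
    NSLog(@"Class: %@, Selector: %@", [self class], NSStringFromSelector(aSelector));
    [pool drain];
    return [super respondsToSelector:aSelector];
}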