I have created an application that plays multiple audio files stored locally. The audio files are long. The audio player offers the following options to the user:
Forward
Rewind
Next Track
Previous Track
I am using AVAudioPlayer so that I can play long audio files. When I change the audio file (i.e., press Next Track), the AVAudioPlayer instance is sometimes not released. This problem appears only intermittently. Please help!
Next Track button IBAction method:
- (IBAction) nextTrackPressed
{
[audioPlay stopAudio];
if (audioPlay) {
audioPlay = nil;
[audioPlay release];
}
appDelegate.trackSelected += 1;
[self intiNewAudioFile];
[self play];
}
The audio file is initialized through the method below:
-(void) intiNewAudioFile
{
NSAutoreleasePool *subPool = [[NSAutoreleasePool alloc] init];
NSString *filePath = [[NSString alloc] init];
trackObject = [appDelegate.trackDetailArray objectAtIndex:appDelegate.trackSelected];
NSLog(#"%#",trackObject.trackName);
// Get the file path to the song to play.
filePath = [[NSBundle mainBundle] pathForResource:trackObject.trackName ofType:@"mp3"];
// Convert the file path to a URL.
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:filePath];
if (audioPlay) {
audioPlay = nil;
[audioPlay release];
}
audioPlay = [[AudioPlayerClass alloc] init];
[audioPlay initAudioWithUrl:fileURL];
[filePath release];
[fileURL release];
[subPool release];
}
AudioPlayerClass Implementation
#import "AudioPlayerClass.h"
@implementation AudioPlayerClass
- (void) initAudioWithUrl: (NSURL *) url
{
curAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[curAudioPlayer setDelegate:self];
[curAudioPlayer prepareToPlay];
}
- (void) playAudio
{
[curAudioPlayer play];
}
- (void) pauseAudio
{
[curAudioPlayer pause];
}
- (void) stopAudio
{
[curAudioPlayer stop];
}
- (BOOL) isAudioPlaying
{
return curAudioPlayer.playing;
}
- (void) setAudiowithCurrentTime:(NSInteger) time
{
curAudioPlayer.currentTime = time;
}
- (NSInteger) getAudioFileDuration
{
return curAudioPlayer.duration;
}
- (NSInteger) getAudioCurrentTime
{
return curAudioPlayer.currentTime;
}
- (void) releasePlayer
{
[curAudioPlayer release];
}
- (void)dealloc {
[curAudioPlayer release];
[super dealloc];
}
@end
Your problem is here:
if (audioPlay) {
audioPlay = nil;
[audioPlay release];
}
You are setting audioPlay to nil before calling release, which means the release message is sent to nil and does nothing, so the old player leaks. You need to reverse the order of these two lines:
if (audioPlay) {
[audioPlay release];
audioPlay = nil;
}
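With that swap applied, the button handler from the question would look like this (note that the same reversed pair also appears inside intiNewAudioFile and needs the identical fix):
- (IBAction) nextTrackPressed
{
    [audioPlay stopAudio];
    if (audioPlay) {
        [audioPlay release]; // release the old player first...
        audioPlay = nil;     // ...then clear the pointer
    }
    appDelegate.trackSelected += 1;
    [self intiNewAudioFile];
    [self play];
}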
Related
I have some code, found online, which captures video from the iPhone camera and stores it to a video file, and it is working fine. But my purpose is not to save it to memory; it is to send it to a server. I have found that there is a free media server named WOWZA which allows streaming, and that Apple has its HTTP Live Streaming (HLS) feature, and that these servers expect the video to be in h.264 format and the audio in mp3. From reading some of the documents about Apple HLS, I also learned that it gives a different URL in the playlist file for each segment of the media file, which is then played in the correct order on the device through the browser. I am not sure how to get small segments of the file recorded by the phone's camera, nor how to convert it into the required format.
Following is the code for capturing video:
Implementation File
#import "THCaptureViewController.h"
#import <AVFoundation/AVFoundation.h>
#import "THPlayerViewController.h"
#define VIDEO_FILE @"test.mov"
@interface THCaptureViewController ()
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureOutput;
@property (nonatomic, weak) AVCaptureDeviceInput *activeVideoInput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end
@implementation THCaptureViewController
- (void)viewDidLoad
{
[super viewDidLoad];
#if TARGET_IPHONE_SIMULATOR
self.simulatorView.hidden = NO;
[self.view bringSubviewToFront:self.simulatorView];
#else
self.simulatorView.hidden = YES;
[self.view sendSubviewToBack:self.simulatorView];
#endif
// Hide the toggle button if device has less than 2 cameras. Does 3GS support iOS 6?
self.toggleCameraButton.hidden = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] < 2;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0),
^{
[self setUpCaptureSession];
});
}
#pragma mark - Configure Capture Session
- (void)setUpCaptureSession
{
self.captureSession = [[AVCaptureSession alloc] init];
NSError *error;
// Set up hardware devices
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (videoDevice) {
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (input) {
[self.captureSession addInput:input];
self.activeVideoInput = input;
}
}
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
if (audioDevice) {
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (audioInput) {
[self.captureSession addInput:audioInput];
}
}
//Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[self.captureSession addOutput:output];
// Setup the still image file output
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
if ([self.captureSession canAddOutput:stillImageOutput]) {
[self.captureSession addOutput:stillImageOutput];
}
// Start running session so preview is available
[self.captureSession startRunning];
// Set up preview layer
dispatch_async(dispatch_get_main_queue(), ^{
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
self.previewLayer.frame = self.previewView.bounds;
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[[self.previewLayer connection] setVideoOrientation:[self currentVideoOrientation]];
[self.previewView.layer addSublayer:self.previewLayer];
});
}
#pragma mark - Start Recording
- (IBAction)startRecording:(id)sender {
if ([sender isSelected]) {
[sender setSelected:NO];
[self.captureOutput stopRecording];
} else {
[sender setSelected:YES];
if (!self.captureOutput) {
self.captureOutput = [[AVCaptureMovieFileOutput alloc] init];
[self.captureSession addOutput:self.captureOutput];
}
// Delete the old movie file if it exists
//[[NSFileManager defaultManager] removeItemAtURL:[self outputURL] error:nil];
[self.captureSession startRunning];
AVCaptureConnection *videoConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:self.captureOutput.connections];
if ([videoConnection isVideoOrientationSupported]) {
videoConnection.videoOrientation = [self currentVideoOrientation];
}
if ([videoConnection isVideoStabilizationSupported]) {
videoConnection.enablesVideoStabilizationWhenAvailable = YES;
}
[self.captureOutput startRecordingToOutputFileURL:[self outputURL] recordingDelegate:self];
}
// Disable the toggle button if recording
self.toggleCameraButton.enabled = ![sender isSelected];
}
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections {
for (AVCaptureConnection *connection in connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:mediaType]) {
return connection;
}
}
}
return nil;
}
#pragma mark - AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
if (!error) {
[self presentRecording];
} else {
NSLog(#"Error: %#", [error localizedDescription]);
}
}
#pragma mark - Show Last Recording
- (void)presentRecording
{
NSString *tracksKey = #"tracks";
AVAsset *asset = [AVURLAsset assetWithURL:[self outputURL]];
[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
NSError *error;
AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
if (status == AVKeyValueStatusLoaded) {
dispatch_async(dispatch_get_main_queue(), ^{
UIStoryboard *mainStoryboard = [UIStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
THPlayerViewController *controller = [mainStoryboard instantiateViewControllerWithIdentifier:@"THPlayerViewController"];
controller.title = @"Capture Recording";
controller.asset = asset;
[self presentViewController:controller animated:YES completion:nil];
});
}
}];
}
#pragma mark - Recoding Destination URL
- (NSURL *)outputURL
{
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSLog(#"documents Directory: %#", documentsDirectory);
NSString *filePath = [documentsDirectory stringByAppendingPathComponent:VIDEO_FILE];
NSLog(#"output url: %#", filePath);
return [NSURL fileURLWithPath:filePath];
}
@end
I found this link, which shows how to capture the video frame by frame. But I am not sure whether capturing the video in frames will help me send it to the server in h.264 format. Can this be done, and if so, how?
There, the person who asked the question says (in the comments below the question) that he was able to do it successfully, but he hasn't mentioned how he captured the video.
Please tell me which data type should be used to get small segments of the captured video, and how to convert the captured data into the required format and send it to the server.
You can use the Live SDK. You have to set up an nginx-powered streaming server. Please follow this link; I have used it and it is a very efficient solution:
https://github.com/ltebean/Live
In my application for recording and playing audio using AVAudioRecorder and AVAudioPlayer, I came across a scenario involving incoming phone calls. If a call comes in while recording is in progress, only the audio recorded after the phone call is kept. I want the recording made after the phone call to be a continuation of the audio recorded before it.
I track interruptions in the audio recorder using the AVAudioRecorderDelegate methods
- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
and
- (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder.
In my end-interruption method I activate the audio session.
Here is the recording code that I use
- (void)startRecordingProcess
{
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *err = nil;
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
if(err)
{
DEBUG_LOG(#"audioSession: %# %d %#", [err domain], [err code], [[err userInfo] description]);
return;
}
[audioSession setActive:YES error:&err];
err = nil;
if(err)
{
DEBUG_LOG(#"audioSession: %# %d %#", [err domain], [err code], [[err userInfo] description]);
return;
}
// Record settings for recording the audio
recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatMPEG4AAC],AVFormatIDKey,
[NSNumber numberWithInt:44100],AVSampleRateKey,
[NSNumber numberWithInt: 2],AVNumberOfChannelsKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
nil];
BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:recorderFilePath];
if (fileExists)
{
BOOL appendingFileExists =
[[NSFileManager defaultManager] fileExistsAtPath:appendingFilePath];
if (appendingFileExists)
{
[[NSFileManager defaultManager]removeItemAtPath:appendingFilePath error:nil];
}
if (appendingFilePath)
{
[appendingFilePath release];
appendingFilePath = nil;
}
appendingFilePath = [[NSString alloc]initWithFormat:@"%@/AppendedAudio.m4a", DOCUMENTS_FOLDER];
fileUrl = [NSURL fileURLWithPath:appendingFilePath];
}
else
{
isFirstTime = YES;
if (recorderFilePath)
{
DEBUG_LOG(#"Testing 2");
[recorderFilePath release];
recorderFilePath = nil;
}
DEBUG_LOG(#"Testing 3");
recorderFilePath = [[NSString alloc]initWithFormat:#"%#/RecordedAudio.m4a", DOCUMENTS_FOLDER];
fileUrl = [NSURL fileURLWithPath:recorderFilePath];
}
err = nil;
recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSetting error:&err];
if(!recorder)
{
DEBUG_LOG(#"recorder: %# %d %#", [err domain], [err code], [[err userInfo] description]);
[[AlertFunctions sharedInstance] showMessageWithTitle:kAppName
message:[err localizedDescription]
delegate:nil
cancelButtonTitle:#"Ok"];
return;
}
//prepare to record
[recorder setDelegate:self];
[recorder prepareToRecord];
recorder.meteringEnabled = YES;
[recorder record];
}
While searching for a solution to this issue I came across other links:
how to resume recording after interruption occured in iphone? and http://www.iphonedevsdk.com/forum/iphone-sdk-development/31268-avaudiorecorderdelegate-interruption.html, which speak of the same issue.
I tried the suggestions given in those links, but was not successful.
I hope to make it work with AVAudioRecorder itself.
Is there any way I could find a solution to this issue?
All valuable suggestions are appreciated.
After some research I was notified by Apple that this is an issue with the current API. So I managed to find a workaround: save the file recorded before the interruption just after the interruption occurs, and join it with the file recorded after resuming. Hope it helps someone out there who faces the same issue.
I was also facing a similar issue where AVAudioRecorder was recording only after the interruption.
I fixed this issue by maintaining an array of recordings, keeping them in NSTemporaryDirectory, and finally merging them at the end.
Below are the key steps:
Make your class listen to the AVAudioSessionInterruptionNotification.
On interruption begin (AVAudioSessionInterruptionTypeBegan), save your recording.
On interruption end (AVAudioSessionInterruptionTypeEnded), start a new recording if the interruption option is AVAudioSessionInterruptionOptionShouldResume.
Append all recordings on hitting the Save button.
The code snippets for the above mentioned steps are:
// 1. Make this class listen to the AVAudioSessionInterruptionNotification in viewDidLoad
- (void)viewDidLoad
{
[super viewDidLoad];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(handleAudioSessionInterruption:)
name:AVAudioSessionInterruptionNotification
object:[AVAudioSession sharedInstance]];
// other coding stuff
}
// observe the interruption begin / end
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
switch (interruptionType) {
// 2. save recording on interruption begin
case AVAudioSessionInterruptionTypeBegan:{
// stop recording
// Update the UI accordingly
break;
}
case AVAudioSessionInterruptionTypeEnded:{
if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
// create a new recording
// Update the UI accordingly
}
break;
}
default:
break;
}
}
// 4. append all recordings
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
// append all recordings one after other
}
Here is a working example:
//
// XDRecordViewController.m
//
// Created by S1LENT WARRIOR
//
#import "XDRecordViewController.h"
@interface XDRecordViewController ()
{
AVAudioRecorder *recorder;
__weak IBOutlet UIButton* btnRecord;
__weak IBOutlet UIButton* btnSave;
__weak IBOutlet UIButton* btnDiscard;
__weak IBOutlet UILabel* lblTimer; // a UILabel to display the recording time
// some variables to display the timer on a lblTimer
NSTimer* timer;
NSTimeInterval intervalTimeElapsed;
NSDate* pauseStart;
NSDate* previousFireDate;
NSDate* recordingStartDate;
// interruption handling variables
BOOL isInterrupted;
NSInteger preInterruptionDuration;
NSMutableArray* recordings; // an array of recordings to be merged in the end
}
@end
@implementation XDRecordViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Make this class listen to the AVAudioSessionInterruptionNotification
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(handleAudioSessionInterruption:)
name:AVAudioSessionInterruptionNotification
object:[AVAudioSession sharedInstance]];
[self clearContentsOfDirectory:NSTemporaryDirectory()]; // clear contents of NSTemporaryDirectory()
recordings = [NSMutableArray new]; // initialize recordings
[self setupAudioSession]; // setup the audio session. you may customize it according to your requirements
}
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[self initRecording]; // start recording as soon as the view appears
}
- (void)dealloc
{
[self clearContentsOfDirectory:NSTemporaryDirectory()]; // remove all files from NSTemporaryDirectory
[[NSNotificationCenter defaultCenter] removeObserver:self]; // remove this class from NSNotificationCenter
}
#pragma mark - Event Listeners
// called when recording button is tapped
- (IBAction) btnRecordingTapped:(UIButton*)sender
{
sender.selected = !sender.selected; // toggle the button
if (sender.selected) { // resume recording
[recorder record];
[self resumeTimer];
} else { // pause recording
[recorder pause];
[self pauseTimer];
}
}
// called when save button is tapped
- (IBAction) btnSaveTapped:(UIButton*)sender
{
[self pauseTimer]; // pause the timer
// disable the UI while the recording is saving so that user may not press the save, record or discard button again
btnSave.enabled = NO;
btnRecord.enabled = NO;
btnDiscard.enabled = NO;
[recorder stop]; // stop the AVAudioRecorder so that the audioRecorderDidFinishRecording delegate function may get called
// Deactivate the AVAudioSession
NSError* error;
[[AVAudioSession sharedInstance] setActive:NO error:&error];
if (error) {
NSLog(#"%#", error);
}
}
// called when discard button is tapped
- (IBAction) btnDiscardTapped:(id)sender
{
[self stopTimer]; // stop the timer
recorder.delegate = nil; // set delegate to nil so that the audioRecorderDidFinishRecording delegate function may not get called
[recorder stop]; // stop the recorder
// Deactivate the AVAudioSession
NSError* error;
[[AVAudioSession sharedInstance] setActive:NO error:&error];
if (error) {
NSLog(#"%#", error);
}
[self.navigationController popViewControllerAnimated:YES];
}
#pragma mark - Notification Listeners
// called when an AVAudioSessionInterruption occurs
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
switch (interruptionType) {
case AVAudioSessionInterruptionTypeBegan:{
// • Recording has stopped, already inactive
// • Change state of UI, etc., to reflect non-recording state
preInterruptionDuration += recorder.currentTime; // time elapsed
if(btnRecord.selected) { // timer is already running
[self btnRecordingTapped:btnRecord]; // pause the recording and pause the timer
}
recorder.delegate = nil; // Set delegate to nil so that audioRecorderDidFinishRecording may not get called
[recorder stop]; // stop recording
isInterrupted = YES;
break;
}
case AVAudioSessionInterruptionTypeEnded:{
// • Make session active
// • Update user interface
// • AVAudioSessionInterruptionOptionShouldResume option
if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
// Here you should create a new recording
[self initRecording]; // create a new recording
[self btnRecordingTapped:btnRecord];
}
break;
}
default:
break;
}
}
#pragma mark - AVAudioRecorderDelegate
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
[self appendAudiosAtURLs:recordings completion:^(BOOL success, NSURL *outputUrl) {
// do whatever you want with the new audio file :)
}];
}
#pragma mark - Timer
- (void)timerFired:(NSTimer*)timer
{
intervalTimeElapsed++;
[self updateDisplay];
}
// converts a time interval into a mm:ss display string
- (NSString*) timerStringSinceTimeInterval:(NSTimeInterval)timeInterval
{
NSDate *timerDate = [NSDate dateWithTimeIntervalSince1970:timeInterval];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:#"mm:ss"];
[dateFormatter setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:0.0]];
return [dateFormatter stringFromDate:timerDate];
}
// called when recording pauses
- (void) pauseTimer
{
pauseStart = [NSDate dateWithTimeIntervalSinceNow:0];
previousFireDate = [timer fireDate];
[timer setFireDate:[NSDate distantFuture]];
}
- (void) resumeTimer
{
if (!timer) {
timer = [NSTimer scheduledTimerWithTimeInterval:1.0
target:self
selector:#selector(timerFired:)
userInfo:nil
repeats:YES];
return;
}
float pauseTime = -1 * [pauseStart timeIntervalSinceNow];
[timer setFireDate:[previousFireDate dateByAddingTimeInterval:pauseTime]];
}
- (void)stopTimer
{
[self updateDisplay];
[timer invalidate];
timer = nil;
}
- (void)updateDisplay
{
lblTimer.text = [self timerStringSinceTimeInterval:intervalTimeElapsed];
}
#pragma mark - Helper Functions
- (void) initRecording
{
// Set the audio file
NSString* name = [NSString stringWithFormat:@"recording_%@.m4a", @(recordings.count)]; // creating a unique name for each audio file
NSURL *outputFileURL = [NSURL fileURLWithPathComponents:@[NSTemporaryDirectory(), name]];
[recordings addObject:outputFileURL];
// Define the recorder settings
NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:@(kAudioFormatMPEG4AAC) forKey:AVFormatIDKey];
[recordSetting setValue:@(44100.0) forKey:AVSampleRateKey];
[recordSetting setValue:@(1) forKey:AVNumberOfChannelsKey];
NSError* error;
// Initiate and prepare the recorder
recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:&error];
recorder.delegate = self;
recorder.meteringEnabled = YES;
[recorder prepareToRecord];
if (![AVAudioSession sharedInstance].inputAvailable) { // can not record audio if mic is unavailable
NSLog(#"Error: Audio input device not available!");
return;
}
intervalTimeElapsed = 0;
recordingStartDate = [NSDate date];
if (isInterrupted) {
intervalTimeElapsed = preInterruptionDuration;
isInterrupted = NO;
}
// Activate the AVAudioSession
[[AVAudioSession sharedInstance] setActive:YES error:&error];
if (error) {
NSLog(#"%#", error);
}
recordingStartDate = [NSDate date]; // Set the recording start date
[self btnRecordingTapped:btnRecord];
}
- (void)setupAudioSession
{
static BOOL audioSessionSetup = NO;
if (audioSessionSetup) {
return;
}
AVAudioSession* session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord
withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
error:nil];
[session setMode:AVAudioSessionModeSpokenAudio error:nil];
audioSessionSetup = YES;
}
// takes an array of audio file URLs and appends them to one another
// the basic logic was derived from here: http://stackoverflow.com/a/16040992/634958
// I modified this logic to append multiple files
- (void) appendAudiosAtURLs:(NSMutableArray*)urls completion:(void(^)(BOOL success, NSURL* outputUrl))handler
{
// Create a new audio track we can append to
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack* appendedAudioTrack =
[composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
// Grab the first audio asset that needs to be appended
AVURLAsset* originalAsset = [[AVURLAsset alloc]
initWithURL:urls.firstObject options:nil];
[urls removeObjectAtIndex:0];
NSError* error = nil;
// Grab the first audio track and insert it into our appendedAudioTrack
AVAssetTrack *originalTrack = [[originalAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, originalAsset.duration);
[appendedAudioTrack insertTimeRange:timeRange
ofTrack:originalTrack
atTime:kCMTimeZero
error:&error];
CMTime duration = originalAsset.duration;
if (error) {
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(NO, nil);
});
}
}
for (NSURL* audioUrl in urls) {
AVURLAsset* newAsset = [[AVURLAsset alloc]
initWithURL:audioUrl options:nil];
// Grab the rest of the audio tracks and insert them at the end of each other
AVAssetTrack *newTrack = [[newAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
timeRange = CMTimeRangeMake(kCMTimeZero, newAsset.duration);
[appendedAudioTrack insertTimeRange:timeRange
ofTrack:newTrack
atTime:duration
error:&error];
duration = appendedAudioTrack.timeRange.duration;
if (error) {
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(NO, nil);
});
}
}
}
// Create a new audio file using the appendedAudioTrack
AVAssetExportSession* exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (!exportSession) {
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(NO, nil);
});
}
}
NSArray* appendedAudioPath = @[NSTemporaryDirectory(), @"temp.m4a"]; // name of the final audio file
exportSession.outputURL = [NSURL fileURLWithPathComponents:appendedAudioPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
BOOL success = NO;
// exported successfully?
switch (exportSession.status) {
case AVAssetExportSessionStatusFailed:
break;
case AVAssetExportSessionStatusCompleted: {
success = YES;
break;
}
case AVAssetExportSessionStatusWaiting:
break;
default:
break;
}
if (handler) {
dispatch_async(dispatch_get_main_queue(), ^{
handler(success, exportSession.outputURL);
});
}
}];
}
- (void) clearContentsOfDirectory:(NSString*)directory
{
NSFileManager *fm = [NSFileManager defaultManager];
NSError *error = nil;
for (NSString *file in [fm contentsOfDirectoryAtPath:directory error:&error]) {
[fm removeItemAtURL:[NSURL fileURLWithPathComponents:#[directory, file]] error:&error];
}
}
@end
I know it's too late to answer the question, but I hope this helps someone else!
Right, I have 5 views. One of the views is called RecordViewController, and it is the one causing the error.
I can switch between views fine, and from the RecordViewController I can switch to another view. But when I try to switch back to the RecordViewController, it throws me out of my app and gives me this error:
Program received signal: “EXC_BAD_ACCESS”.
Data Formatters temporarily unavailable, will re-try after a 'continue'. (The program being debugged was signaled while in a function called from GDB.
GDB has restored the context to what it was before the call.
To change this behavior use "set unwindonsignal off"
Evaluation of the expression containing the function (gdb_objc_startDebuggerMode) will be abandoned.)
Here is the code for RecordViewController (I've removed the view-switching code, as it's not needed):
@implementation RecordViewController
@synthesize actSpinner, btnStart, btnPlay;
-(void)countUp {
mainInt += 1;
seconds.text = [NSString stringWithFormat:#"%02d", mainInt];
}
static RecordViewController *sharedInstance = nil;
+ (RecordViewController*)sharedInstance {
if (sharedInstance == nil) {
sharedInstance = [[super allocWithZone:NULL] init];
}
return sharedInstance;
}
+ (id)allocWithZone:(NSZone*)zone {
return [[self sharedInstance] retain];
}
- (id)copyWithZone:(NSZone *)zone {
return self;
}
- (id)retain {
return self;
}
- (NSUInteger)retainCount {
return NSUIntegerMax;
}
- (void)release {
}
- (id)autorelease {
return self;
}
- (void)viewDidLoad {
[super viewDidLoad];
toggle = YES;
btnPlay.hidden = YES;
AVAudioSession * audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error: &error];
[audioSession setActive:YES error: &error];
}
- (IBAction) start_button_pressed{
if(toggle)
{
toggle = NO;
[actSpinner startAnimating];
[btnStart setImage:[UIImage imageNamed:#"recordstop.png"] forState:UIControlStateNormal];
mainInt = 0;
theTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(countUp) userInfo:nil repeats:YES];
btnPlay.enabled = toggle;
btnPlay.hidden = !toggle;
NSMutableDictionary* recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];
recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent: [NSString stringWithFormat: @"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
NSLog(@"Using File called: %@", recordedTmpFile);
recorder = [[ AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
[recorder setDelegate:self];
[recorder prepareToRecord];
[recorder record];
}
else
{
toggle = YES;
[actSpinner stopAnimating];
[btnStart setImage:[UIImage imageNamed:#"recordrecord.png"] forState:UIControlStateNormal];
btnPlay.enabled = toggle;
btnPlay.hidden = !toggle;
[theTimer invalidate];
NSLog(#"Using File called: %#",recordedTmpFile);
[recorder stop];
}
}
- (void)didReceiveMemoryWarning {
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
-(IBAction) play_button_pressed{
AVAudioPlayer * avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
[avPlayer prepareToPlay];
[avPlayer play];
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
// Return YES for supported orientations
return (interfaceOrientation == UIInterfaceOrientationLandscapeLeft || interfaceOrientation == UIInterfaceOrientationLandscapeRight);
}
- (void)viewDidUnload {
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
//Clean up the temp file.
NSFileManager * fm = [NSFileManager defaultManager];
[fm removeItemAtPath:[recordedTmpFile path] error:&error];
//Call the dealloc on the remaining objects.
[recorder dealloc];
recorder = nil;
recordedTmpFile = nil;
}
- (void)dealloc {
[super dealloc];
}
@end
It sounds like you are not retaining your RecordViewController, so when you push another view it is released, and when you try to go back it is not there anymore.
If you are not using ARC, retain it when you create it.
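For example, under manual reference counting the presenting controller could keep a retained reference (a minimal sketch; the owning method, ivar, and nib name here are hypothetical):
// In the owner, with a retained ivar such as: RecordViewController *recordViewController;
- (void)showRecorder
{
    if (!recordViewController) {
        recordViewController = [[RecordViewController alloc] initWithNibName:@"RecordViewController"
                                                                      bundle:nil]; // +1 from alloc, kept alive
    }
    [self.view addSubview:recordViewController.view];
}

- (void)dealloc
{
    [recordViewController release]; // balance the alloc when the owner goes away
    [super dealloc];
}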
Remove the implementations of the following methods altogether:
+ (id)allocWithZone:(NSZone*)zone
- (id)copyWithZone:(NSZone *)zone
- (id)retain
- (NSUInteger)retainCount
- (void)release
And change the implementation of sharedInstance to the following and also add the implementation of alloc shown below.
+(RecordViewController *) sharedInstance {
@synchronized([RecordViewController class])
{
if (!sharedInstance)
{
[[self alloc]init];
}
return sharedInstance;
}
return nil;
}
+(id) alloc
{
@synchronized([RecordViewController class])
{
NSAssert(sharedInstance == nil, #"Attempted to allocated second singleton RecordViewController");
sharedInstance = [super alloc];
return sharedInstance;
}
return nil;
}
I had this issue several times, and it was always some sort of infinite recursion, most likely from calling the setter from within the setter.
edit
Googling for it I found this: ERROR: Memory Leak, data formatters temporarily unavailable
As your code is also leaking, it could be the same problem.
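For reference, the classic recursive-setter mistake looks like this (a generic sketch, not taken from the code above):
// Wrong: dot syntax invokes the setter again, recursing until the stack overflows (EXC_BAD_ACCESS)
- (void)setName:(NSString *)name
{
    self.name = name; // equivalent to [self setName:name]
}

// Right (manual reference counting): assign the ivar directly
- (void)setName:(NSString *)name
{
    if (name != _name) {
        [_name release];
        _name = [name copy];
    }
}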
I have a class that I use to wrap AVAudioPlayer, and everything works fine and dandy when it comes to playing the audio. But when -audioPlayerDidFinishPlaying: is called, my NSLog() command says that the player is released; the problem is that the app crashes moments later. I should mention that audioPlayer is an ivar in this class. Here is the code:
-(id) initWithFileName:(NSString *)sndFileName
{
[super init];
sndFileToPlay = [[NSString alloc] initWithString:sndFileName];
return self;
}
-(void)dealloc {
[audioPlayer release];
self.audioPlayer.delegate = nil;
self.audioPlayer = nil;
[super dealloc];
}
-(void)play
{
[self playSound:sndFileToPlay];
}
-(void)playSound:(NSString *)fileName
{
NSString *fname, *ext;
NSRange range = [fileName rangeOfString:#"."];
int location = range.location;
if( location > 0 )
{
fname = [fileName substringWithRange:NSMakeRange(0, location)];
ext = [fileName substringFromIndex:location+1];
[self playSound:fname :ext];
}
}
—
-(void)playSound:(NSString *)fileName :(NSString *)fileExt
{
NSBundle *mainBundle = [NSBundle mainBundle];
NSURL *fileURL = [NSURL fileURLWithPath:
[mainBundle pathForResource:fileName ofType:fileExt] isDirectory:NO];
if (fileURL != nil)
{
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL: fileURL
error: nil];
[fileURL release];
[audioPlayer setDelegate:self];
[audioPlayer play];
}
}
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
successfully:(BOOL)flag
{
NSLog(#"Releasing");
[audioPlayer release];
}
There are several things wrong with your code.
For one, in your dealloc:
[audioPlayer release];
self.audioPlayer.delegate = nil;
self.audioPlayer = nil;
You are releasing the audioPlayer, then, on the released (and maybe deallocated) player you set the delegate to nil and then the property, which releases it again. Remove the [audioPlayer release];.
In your audioPlayerDidFinishPlaying:successfully: you're releasing the player as well, but you haven't set the variable to nil. That might cause a crash since by the time you access this variable again a different object might be at that memory address. Use the property instead and do it like in your dealloc:
self.audioPlayer.delegate = nil;
self.audioPlayer = nil;
Then, in playSound:: (argh, an unnamed second argument!) you over-release fileURL. -[NSURL fileURLWithPath:isDirectory:] returns an autoreleased object; you must not release it.
Last but maybe not least, you leak sndFileToPlay: you need to release it in your dealloc method. And instead of sndFileToPlay = [[NSString alloc] initWithString:sndFileName]; simply do sndFileToPlay = [sndFileName copy];.
You might want to read up on Objective-C memory management. It's not hard once you know the three or four rules-of-thumb.
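Putting those points together, the affected methods might end up like this (a sketch assuming audioPlayer is declared as a retain property, which the dot-syntax access above implies):
- (void)dealloc
{
    self.audioPlayer.delegate = nil;
    self.audioPlayer = nil;   // the property setter releases the player
    [sndFileToPlay release];  // fixes the leak mentioned above
    [super dealloc];
}

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    NSLog(@"Releasing");
    self.audioPlayer.delegate = nil;
    self.audioPlayer = nil;   // release and clear in one step
}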
You should clean up your code. If playSound is called several times, you are leaking an AVAudioPlayer each time.
In your dealloc, you should put [audioPlayer release] after the two lines beneath it.
Turn on NSZombieEnabled to debug, and make sure that the audioPlayer has not already been released by the time didFinish is called.
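A sketch of playSound:: with that leak plugged (releasing any player left over from a previous call before overwriting the ivar, and with the fileURL over-release from the other answer removed):
-(void)playSound:(NSString *)fileName :(NSString *)fileExt
{
    NSURL *fileURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:fileName ofType:fileExt]
                                isDirectory:NO]; // autoreleased, so no release later
    if (fileURL != nil) {
        [audioPlayer release]; // no-op on the first call, frees the old player afterwards
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
        [audioPlayer setDelegate:self];
        [audioPlayer play];
    }
}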
I have been using a class to play sounds using AVAudioPlayer. Since I want to release these sounds right after they are played, I added a delegate. That causes a "_NSAutoreleaseNoPool(): Object 0x55e060 of class NSCFString autoreleased with no pool in place - just leaking" error right after the sound completes playing, but before my -audioPlayerDidFinishPlaying is called.
Here are some sources:
@interface MyAVAudioPlayer : NSObject <AVAudioPlayerDelegate> {
AVAudioPlayer *player;
float savedVolume;
BOOL releaseWhenDone;
}
The main class .m:
- (MyAVAudioPlayer *) initPlayerWithName: (NSString *) name;
{
NSString *soundFilePath = [[NSBundle mainBundle] pathForResource: name ofType: @"caf"];
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath: soundFilePath];
player = [[AVAudioPlayer alloc] initWithContentsOfURL: fileURL error: nil];
[fileURL release];
[player prepareToPlay];
return (self);
}
- (MyAVAudioPlayer *)getAndPlayAndRelease:(NSString *)name withVolume:(float) vol;
{
MyAVAudioPlayer *newMyAVPlayer = [self initPlayerWithName:name];
player.volume = vol;
[player play];
releaseWhenDone = YES;
[player setDelegate: self];
return newMyAVPlayer;
}
+ (void) getAndPlayAndReleaseAuto:(NSString *)name withVolume:(float) vol;
{
MyAVAudioPlayer *newMyAVPlayer = [[MyAVAudioPlayer alloc] getAndPlayAndRelease:name withVolume:vol];
// [newMyAVPlayer autorelease];
}
#pragma mark -
#pragma mark AVAudioPlayer Delegate Methods
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)playedPlayer successfully:(BOOL)flag {
if (releaseWhenDone) {
NSLog(#"releasing");
[playedPlayer release];
// [self release];
NSLog(#"released");
}
}
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error {
NSLog(#"Error while decoding: %#", [error localizedDescription] );
}
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player {
NSLog(#"Interrupted!");
}
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player {
NSLog(#"EndInterruption!");
}
- (BOOL) play;
{
player.currentTime = 0.0;
return [player play];
}
Commenting out the [player setDelegate: self]; makes the error go away, but then my audioPlayerDidFinishPlaying doesn't get called.
Any thoughts? Am I suddenly running in another thread?
I found the problem. My bug, of course.
In a lot of my class files, I was adding:
-(BOOL) respondsToSelector:(SEL) aSelector
{
NSLog(#"Class: %# subclass of %#, Selector: %#", [self class], [super class], NSStringFromSelector(aSelector));
return [super respondsToSelector:aSelector];
}
Mainly out of curiosity.
Well, when I added a delegate to my sound, this method got called before the delegate method, and it got called from whatever run loop the AVAudioPlayer happened to be in: likely a run loop with no autorelease pool.
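One way to make such an override safe is to give it its own pool (a sketch of the method above under manual reference counting):
- (BOOL)respondsToSelector:(SEL)aSelector
{
    // create a local pool in case this runs on a thread that has none
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSLog(@"Class: %@ subclass of %@, Selector: %@",
          [self class], [super class], NSStringFromSelector(aSelector));
    BOOL responds = [super respondsToSelector:aSelector];
    [pool release]; // drain the pool before returning
    return responds;
}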