How to record the main view? - iPhone

My app does recording.
I have created a custom ScreenCapture view to handle the recording.
Now I want to record a video of the main view (i.e. of self.view), but it is not working.
I have used the following code to record my custom view:
- (IBAction)btnRecording_Pressed:(id)sender {
    if (Isrecording == YES)
    {
        // imgDustbin.hidden = YES;
        // [[NSUserDefaults standardUserDefaults] setValue:@"NO" forKey:@"DUSTBIN"];
        [voiceRecorder stop];
        [captureview stopRecording];

        AVAudioSession *audioSession = [AVAudioSession sharedInstance];
        NSError *err = nil;
        [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
        if (err)
        {
            NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
            return;
        }
        [audioSession setActive:YES error:&err];
        err = nil;
        if (err) {
            NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
            return;
        }

        recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
        [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
        [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
        [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

        // NSString *recorderFilePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSString *recorderFilePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        recorderFilePath = [recorderFilePath stringByAppendingPathComponent:@"tempRecording.caf"];
        NSURL *urls = [NSURL fileURLWithPath:recorderFilePath];
        err = nil;
        voiceRecorder = [[AVAudioRecorder alloc] initWithURL:urls settings:recordSetting error:&err];
        // [recorder setMeteringEnabled:YES];
        if (!voiceRecorder) {
            NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
            UIAlertView *alert =
                [[UIAlertView alloc] initWithTitle:@"Warning"
                                           message:[err localizedDescription]
                                          delegate:nil
                                 cancelButtonTitle:@"OK"
                                 otherButtonTitles:nil];
            [alert show];
            [alert release];
            return;
        }

        // prepare to record
        [voiceRecorder setDelegate:self];
        [voiceRecorder prepareToRecord];

        // take a screenshot of the screen
        if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
            UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
        else
            UIGraphicsBeginImageContext(self.view.bounds.size);
        [captureview.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        NSData *imageData = UIImageJPEGRepresentation(viewImage, 1.0);
        // NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        documentsDirectory = [documentsDirectory stringByAppendingPathComponent:@"VideoScreen.jpg"];
        [imageData writeToFile:documentsDirectory atomically:YES];

        [voiceRecorder record];
        [captureview performSelector:@selector(startRecording) withObject:nil afterDelay:0];
        Isrecording = NO;
        [btnRecord setTitle:@"Stop" forState:UIControlStateNormal];
    }
    else if (!Isrecording)
    {
        imgDustbin.hidden = NO;
        [[NSUserDefaults standardUserDefaults] setValue:@"YES" forKey:@"DUSTBIN"];

        [voiceRecorder stop];
        [captureview stopRecording];
        [self createVideo];
        Isrecording = YES;
        [btnRecord setTitle:@"Record" forState:UIControlStateNormal];
    }
}
How can I do this?
Thanks.

I've removed some project-specific code from my setup, but if you have your inputs and outputs configured, the preview-layer code is what you are looking for. You add a sublayer that shows the video that will be recorded.
- (void)setupSession{
    // create a capture session, set the session preset
    // get a camera, front facing if possible
    // check to see if the camera is available
    // create input
    // create output
    // add output
    [session beginConfiguration];
    [session addInput:input];
    [session addOutput:output];
    [session commitConfiguration];

    // configure orientation
    connection = [output connectionWithMediaType:AVMediaTypeVideo];
    // check to make sure you can record

    // Important for you: the preview layer
    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    if ([captureVideoPreviewLayer isOrientationSupported])
    {
        [captureVideoPreviewLayer setOrientation:[[UIDevice currentDevice] orientation]];
    } else {
        NSLog(@"Cannot set preview orientation");
    }
    [captureVideoPreviewLayer setFrame:[_previewView bounds]];

    // add the preview layer as a sublayer of the main view you are recording from
    [[self.view layer] addSublayer:captureVideoPreviewLayer];

    // start the session
    [session startRunning];
}
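For completeness, the elided "create input / create output" steps could look roughly like the sketch below. This is only an illustration of one possibility (AVCaptureDeviceInput plus AVCaptureMovieFileOutput); the poster's removed code may have been different, and movieURL is an assumed variable.
// Sketch only: one possible shape for the elided input/output setup.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *inputError = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&inputError];
AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
// ...then add them inside beginConfiguration/commitConfiguration as shown above.
// To actually write a movie file later (movieURL is a file URL you choose):
// [output startRecordingToOutputFileURL:movieURL recordingDelegate:self];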

Related

Received memory warning when recording video

I made an app for iPad that has image dragging and video recording.
It takes screenshots while recording and afterwards builds a movie by appending them.
When I record a video of 5 to 10 seconds, it works fine. But when I try to record a video of 1 minute or more, it crashes and logs "Received memory warning".
I have used the following code:
- (IBAction)btnRecording_Pressed:(id)sender
{
    if ([recordButton.titleLabel.text isEqualToString:@"Start Recording"]) {
        backButton.enabled = NO;
        [recordButton setTitle:@"Stop Recording" forState:UIControlStateNormal];
        fileIndex = 0;
        recTimer = [NSTimer scheduledTimerWithTimeInterval:1.0/5.0 target:self selector:@selector(startFrameCapture) userInfo:nil repeats:YES];
        [recTimer retain];
        [self startRecording];
    }else{
        [recordButton setTitle:@"Start Recording" forState:UIControlStateNormal];
        [recTimer invalidate];
        [recTimer release];
        [self stopRecording];
        [self getFileName];
    }
}

-(void)startFrameCapture
{
    fileIndex++;
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    documentsDirectory = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"Screenshot%d.jpg", fileIndex]];
    [self performSelectorInBackground:@selector(newThread:) withObject:documentsDirectory];
}

-(void)newThread:(NSString *)frameName{
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
    else
        UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    imageData = UIImageJPEGRepresentation(viewImage, 1.0);
    [imageData writeToFile:frameName atomically:YES];
}

- (void) startRecording{
    if([recorder isRecording]){
        NSLog(@"Stopped Recording");
        [self stopRecording];
    }else{
        NSLog(@"Started Recording");
        [self prepareRecorderNow];
        [recorder record];
    }
}

- (void) stopRecording{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    if([recorder isRecording]){
        [recorder stop];
        [recorder release];
        recorder = nil;
    }
    [pool drain];
}

-(void)prepareRecorderNow{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    [audioSession setActive:YES error:&err];
    err = nil;
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }

    recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithFloat:44100] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    // Create a new dated file
    [recorderFilePath release];
    recorderFilePath = [[NSString stringWithFormat:@"%@/deformed.caf", DOCUMENTS_FOLDER] retain];
    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
    [recordSetting release];
    if(!recorder){
        NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        UIAlertView *alert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:[err localizedDescription]
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [alert show];
        [alert release];
        return;
    }

    //prepare to record
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;

    BOOL audioHWAvailable = audioSession.inputIsAvailable;
    if (!audioHWAvailable) {
        UIAlertView *cantRecordAlert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:@"Audio input hardware not available"
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [cantRecordAlert show];
        [cantRecordAlert release];
        return;
    }
    NSLog(@"Not Over Man");
    [pool drain];
}
What could be the issue?
Thanks!
You are creating an image context 5 times per second. That could be the problem.
Try reusing a single graphics context by saving it as an ivar or property, along the lines of the sketch below.
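A minimal sketch of that idea, keeping a Core Graphics bitmap context around instead of calling UIGraphicsBeginImageContext on every tick (the frameContext ivar and method names are made up for illustration):
// Hypothetical ivar: CGContextRef frameContext;
- (void)setUpFrameContextIfNeeded
{
    if (frameContext) return;
    CGSize size = self.view.bounds.size;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    frameContext = CGBitmapContextCreate(NULL,
                                         (size_t)size.width,
                                         (size_t)size.height,
                                         8,            // bits per component
                                         0,            // let CG choose bytes per row
                                         colorSpace,
                                         kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host);
    CGColorSpaceRelease(colorSpace);
}

- (UIImage *)captureFrame
{
    [self setUpFrameContextIfNeeded];
    // Reuse the same context for every frame; you may need to flip its
    // coordinate system to match UIKit before rendering.
    [self.view.layer renderInContext:frameContext];
    CGImageRef cgImage = CGBitmapContextCreateImage(frameContext);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image; // autoreleased
}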
I had a similar problem when capturing pictures. In practice, the problem I saw was UIImage objects that were never released, which occupied most of the memory. Here is a fix you can try:
-(void)newThread:(NSString *)frameName
{
    // A thread spawned with performSelectorInBackground: needs its own autorelease
    // pool; without one the autoreleased UIImage and NSData objects are never released.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
    else
        UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *imageData = UIImageJPEGRepresentation(viewImage, 1.0);
    [imageData writeToFile:frameName atomically:YES];
    [pool drain];
}

Error in sqlite3_exe()

Sir, I am working on database connectivity. I have written all the code and also imported the header file, but it shows an error on the sqlite3_exe line: symbol not found for architecture.
Please help me resolve it.
NSFileManager *filemgr = [NSFileManager defaultManager];
if ([filemgr fileExistsAtPath:databasePath] == NO)
{
    const char *dbpath = [databasePath UTF8String];
    if (sqlite3_open(dbpath, &contactDB) == SQLITE_OK)
    {
        char *errorMsg;
        const char *sql_stmt = "CREATE TABLE IF NOT EXISTS CONTACTS (ID INTEGER PRIMARY KEY AUTOINCREMENT, NAME TEXT, ADDRESS TEXT, PHONE NUMERIC)";
        if (sqlite3_exe(contactDB, sql_stmt, NULL, NULL, &errorMsg) != SQLITE_OK)
        {
            status.text = @"Failed to create table";
        }
        sqlite3_close(contactDB);
    } else {
        status.text = @"Failed to open/create database";
    }
}
The error line is:
if (sqlite3_exe(contactDB, sql_stmt, NULL, NULL, &errorMsg) != SQLITE_OK)
The warning is: implicit declaration of function "sqlite3_exe" is invalid in C99.
The error is:
Undefined symbols for architecture i386:
"_sqlite3_exe", referenced from:
-[DatabaseViewController viewDidLoad] in DatabaseViewController.o
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)
It is sqlite3_exec, not sqlite3_exe.
Please check: the "c" is missing from "exe".
Here is sample code for the same; you can record audio with it.
#define DOCUMENTS_FOLDER [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"]

- (void) startRecording{
    UIBarButtonItem *stopButton = [[UIBarButtonItem alloc] initWithTitle:@"Stop" style:UIBarButtonItemStyleBordered target:self action:@selector(stopRecording)];
    self.navigationItem.rightBarButtonItem = stopButton;
    [stopButton release];

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    [audioSession setActive:YES error:&err];
    err = nil;
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }

    recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    // Create a new dated file
    NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0];
    NSString *caldate = [now description];
    recorderFilePath = [[NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate] retain];

    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
    if(!recorder){
        NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        UIAlertView *alert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:[err localizedDescription]
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [alert show];
        [alert release];
        return;
    }

    //prepare to record
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;

    BOOL audioHWAvailable = audioSession.inputIsAvailable;
    if (!audioHWAvailable) {
        UIAlertView *cantRecordAlert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:@"Audio input hardware not available"
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [cantRecordAlert show];
        [cantRecordAlert release];
        return;
    }

    // start recording
    [recorder recordForDuration:(NSTimeInterval) 10];
}

- (void) stopRecording{
    [recorder stop];

    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    NSError *err = nil;
    NSData *audioData = [NSData dataWithContentsOfFile:[url path] options:0 error:&err];
    if(!audioData)
        NSLog(@"audio data: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    [editedObject setValue:[NSData dataWithContentsOfURL:url] forKey:editedFieldKey];

    //[recorder deleteRecording];
    NSFileManager *fm = [NSFileManager defaultManager];
    err = nil;
    [fm removeItemAtPath:[url path] error:&err];
    if(err)
        NSLog(@"File Manager: %@ %d %@", [err domain], [err code], [[err userInfo] description]);

    UIBarButtonItem *startButton = [[UIBarButtonItem alloc] initWithTitle:@"Record" style:UIBarButtonItemStyleBordered target:self action:@selector(startRecording)];
    self.navigationItem.rightBarButtonItem = startButton;
    [startButton release];
}

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag
{
    NSLog(@"audioRecorderDidFinishRecording:successfully:");
    // your actions here
}
For playing, you can use these lines in the stop-recording method:
NSURL *musicFile = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                           pathForResource:@"click"
                                                    ofType:@"caf"]];
AVAudioPlayer *click = [[AVAudioPlayer alloc] initWithContentsOfURL:musicFile error:nil];
[click play];
[click release];
Make sure you pass the correct URL of the recorded sound into the player; these lines will then play it back for you.
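If the goal is to hear the recording that was just made (rather than a bundled click sound), a sketch along these lines should work. It assumes recorderFilePath from the code above, that you play before stopRecording deletes the file, and that you hold the player in an ivar so it is not released while playing:
NSURL *recordedURL = [NSURL fileURLWithPath:recorderFilePath];
NSError *playError = nil;
// Keep a strong reference (e.g. an ivar) until playback finishes.
AVAudioPlayer *playback = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedURL error:&playError];
if (playback) {
    [playback play];
} else {
    NSLog(@"player error: %@", [playError localizedDescription]);
}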

Objective-C: auto-play after recording at a specific time

Good day. Can you help me with my recording project? After recording with AVAudioRecorder, I want the recording to play automatically after a certain time. Can you give me or cite a link about my question? I badly need your help because I'm new to iOS development. Thanks in advance, and have a good day.
Here's my code for startRecording:
-(void)startRecording:(UIButton *)sender
{
    // for recording
    recStopBtn.enabled = NO;
    recStopBtn.hidden = NO;
    recStopBtn.enabled = YES;
    playRecBtn.enabled = NO;
    loading.hidden = NO;
    [loading startAnimating];

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err)
    {
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    [audioSession setActive:YES error:&err];
    err = nil;
    if(err)
    {
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }

    recordSetting = [[NSMutableDictionary alloc] init];
    // We can use kAudioFormatAppleIMA4 (4:1 compression) or kAudioFormatLinearPCM for no compression
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
    // We can use 44100, 32000, 24000, 16000 or 12000 depending on sound quality
    [recordSetting setValue:[NSNumber numberWithFloat:16000.0] forKey:AVSampleRateKey];
    // We can use 2 (if using additional h/w) or 1 (the iPhone only has one microphone)
    [recordSetting setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
    // These settings are used if we are using the kAudioFormatLinearPCM format
    //[recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    //[recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    //[recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    recorderFilePath = [NSString stringWithFormat:@"%@/MySound.caf", DOCUMENTS_FOLDER];
    NSLog(@"recorderFilePath: %@", recorderFilePath);
    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    err = nil;
    NSData *audioData = [NSData dataWithContentsOfFile:[url path] options:0 error:&err];
    if(audioData)
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        [fm removeItemAtPath:[url path] error:&err];
    }
    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
    if(!recorder){
        NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        UIAlertView *alert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:[err localizedDescription]
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [alert show];
        return;
    }

    //prepare to record
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;

    BOOL audioHWAvailable = audioSession.inputIsAvailable;
    if (!audioHWAvailable) {
        UIAlertView *cantRecordAlert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:@"Audio input hardware not available"
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [cantRecordAlert show];
        return;
    }

    // start recording
    [recorder record];
    lblStatusMsg.text = @"Recording...";
    NSLog(@"RECORDING");
    //recIcon.image = [UIImage imageNamed:@"rec_icon.png"];
    //progressView.progress = 0.0;
    //timer = [NSTimer scheduledTimerWithTimeInterval:6.0 target:self selector:@selector(handleTimer) userInfo:nil repeats:YES];
}
If all you need is to make something happen after a certain amount of time, use an NSTimer. I see you already have one commented out in your code.
To record for a certain amount of time, use recordForDuration:. To stop recording manually, use stop. To play a recording, pass the AVAudioRecorder's url property to AVAudioPlayer's initWithContentsOfURL:error: method.
So basically, uncomment your NSTimer and then do:
AVAudioPlayer *player;

- (void)handleTimer
{
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:nameOfAudioRecorder.url error:nil];
    [player play];
}

iOS - Recording audio with AVAudioRecorder fails with no error

I have a problem with AVAudioRecorder not working for me, at least from what I can see (or can't hear).
I am targeting iOS 5 with ARC.
I did set up error objects, but none of them fire, so I guess I am doing something wrong here.
Here is the part where I set up the AVAudioRecorder:
NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *soundFilePath = [docsDir stringByAppendingPathComponent:currentTrack.trackFileName];
currentTrack.trackFileURL = [NSURL fileURLWithPath:soundFilePath];
NSLog(@"Chemin : %@", currentTrack.trackFileURL.path);

NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
[recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
[recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
[recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];
currentTrack.trackRecordSettings = recordSetting;

NSError *err = nil;
//[audioSession setDelegate:self];
if(err){
    NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    return;
}
[[AVAudioSession sharedInstance] setActive:YES error:&err];
err = nil;
if(err){
    NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    return;
}

err = nil;
recorder = [[AVAudioRecorder alloc]
            initWithURL:currentTrack.trackFileURL
               settings:currentTrack.trackRecordSettings
                  error:&error];
if (err)
{
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Warning"
                                                    message:[err localizedDescription]
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
    NSLog(@"error: %@", [error localizedDescription]);
} else {
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    BOOL audioHWAvailable = [[AVAudioSession sharedInstance] inputIsAvailable];
    if (!audioHWAvailable) {
        UIAlertView *cantRecordAlert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:@"Audio input hardware not available"
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [cantRecordAlert show];
        return;
    }
}
No error is thrown here.
Then, when it is time to start recording:
-(void)startRecordingAudio{
    NSLog(@"Start recording");
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
    [recorder record];
}
Then, when it is time to stop recording:
-(void)stopRecordingAudio{
    NSLog(@"Stop recording");
    [recorder stop];
    NSError *err;
    NSData *audioData = [NSData dataWithContentsOfFile:currentTrack.trackFileURL.path options:0 error:&err];
    if(!audioData)
        NSLog(@"audio data error: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    NSLog(@"%d", audioData.length);
}
Is it normal that audioData's length is always 4096? If I understand correctly, that is about 93 ms of sound...
Finally, when it is time to play the recorded sound:
-(void)startPlayingAudio{
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    NSLog(@"Start playing");
    NSError *error;
    if(player == nil){
        player = [[AVAudioPlayer alloc] initWithContentsOfURL:currentTrack.trackFileURL
                                                        error:&error];
    }
    player.delegate = self;
    if (error)
        NSLog(@"Error: %@", [error localizedDescription]);
    else
        [player play];
}
I also set up delegate methods, but they never fire either:
-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    NSLog(@"audioPlayerDidFinishPlaying:successfully:");
}
-(void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error
{
    NSLog(@"Decode Error occurred");
}
-(void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag
{
    NSLog(@"audioRecorderDidFinishRecording:successfully: %d", flag);
}
-(void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)recorder error:(NSError *)error
{
    NSLog(@"Encode Error occurred");
}
-(void)beginInterruption{
    NSLog(@"INTERRUPT");
}
Also, I did see that the .caf files are created correctly in my app's Documents folder.
Thanks for any help you can bring. For now, I cannot hear the recorded sound, and I am testing on an iPhone 4 device with no headphones.
Found the problem:
record was called before prepareToRecord. Since I was using viewDidAppear to set up the recording, the call to start recording was made before the view was visible...
I used viewDidAppear because viewDidLoad is not called, since I don't use initWithNib. I tried to override loadView, but all the examples I found are not very clear about the subject, and loadView is never called. Maybe somebody can point me in the right direction? Nonetheless, I can hear my voice on the iPhone!
Thanks for your time.
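For reference, the ordering that fixed it is simply: run the setup (ending with prepareToRecord) before record is called. A minimal sketch, with assumed method and ivar names:
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self setupRecorder]; // allocates the AVAudioRecorder and calls prepareToRecord
}

-(void)startRecordingAudio{
    // Only called after viewDidAppear, so the recorder exists and is prepared.
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
    [recorder record];
}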

How can I record a song playing from the iPod library using MPMediaPickerController?

I am making an app in which I have to play music from the iPod music library using MPMediaPickerController. While a song is playing, I want to record it together with some external voice (for example, the user's voice). I am trying to do this, but I am getting a problem: when the app launches, I first choose a song from the iPod music library and then tap the Start Recording button. When I tap Start Recording, the song I was playing stops, but the recording itself works properly. Recording the user's voice works fine, but the song is not recorded because, as I said, it stops when the Start Recording button is tapped. I am using AVAudioRecorder for the recording. This is the code I am using:
- (void)viewDidLoad
{
    self.musicPlayer = [MPMusicPlayerController iPodMusicPlayer];
}

- (void)playOrPauseMusic:(id)sender {
    MPMusicPlaybackState playbackState = self.musicPlayer.playbackState;
    if (playbackState == MPMusicPlaybackStateStopped || playbackState == MPMusicPlaybackStatePaused) {
        [self.musicPlayer play];
    } else if (playbackState == MPMusicPlaybackStatePlaying) {
        [self.musicPlayer pause];
    }
}

- (void)openMediaPicker:(id)sender {
    MPMediaPickerController *mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
    mediaPicker.delegate = self;
    mediaPicker.allowsPickingMultipleItems = NO; // this is the default
    [mediaPicker shouldAutorotateToInterfaceOrientation:UIInterfaceOrientationLandscapeRight];
    [self presentModalViewController:mediaPicker animated:YES];
}

- (void) startRecording
{
    [self.musicPlayer play];

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    [audioSession setActive:YES error:&err];
    err = nil;
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }

    recordSetting = [[NSMutableDictionary alloc] init];
    // We can use kAudioFormatAppleIMA4 (4:1 compression) or kAudioFormatLinearPCM for no compression
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
    // We can use 44100, 32000, 24000, 16000 or 12000 depending on sound quality
    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    // We can use 2 (if using additional h/w) or 1 (the iPhone only has one microphone)
    [recordSetting setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
    [recordSetting setObject:[NSNumber numberWithInt:12800] forKey:AVEncoderBitRateKey];
    [recordSetting setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [recordSetting setObject:[NSNumber numberWithInt:AVAudioQualityMax] forKey:AVEncoderAudioQualityKey];

    recorderFilePath = [NSString stringWithFormat:@"%@/MySound.caf", DOCUMENTS_FOLDER];
    NSLog(@"recorderFilePath: %@", recorderFilePath);
    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    err = nil;
    NSData *audioData = [NSData dataWithContentsOfFile:[url path] options:0 error:&err];
    if(audioData)
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        [fm removeItemAtPath:[url path] error:&err];
    }
    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
    if(!recorder){
        NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        UIAlertView *alert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:[err localizedDescription]
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [alert show];
        return;
    }

    //prepare to record
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;

    BOOL audioHWAvailable = audioSession.inputIsAvailable;
    if (!audioHWAvailable) {
        UIAlertView *cantRecordAlert =
            [[UIAlertView alloc] initWithTitle:@"Warning"
                                       message:@"Audio input hardware not available"
                                      delegate:nil
                             cancelButtonTitle:@"OK"
                             otherButtonTitles:nil];
        [cantRecordAlert show];
        return;
    }

    // start recording
    [recorder recordForDuration:(NSTimeInterval) 2];
    lblStatusMsg.text = @"Recording...";
    selector:@selector(handleTimer) userInfo:nil repeats:YES];
}
This is my code for the MPMusicPlayerController and AVAudioRecorder. Please help. Thanks in advance.
You can't record songs imported from the iPod library; you can only record an mp3 file, so you would need to convert the MPMediaItem into mp3 format first.
You can get a reference for that from here; I hope this helps you.
Maybe look into playing two tracks simultaneously. You could record a separate voice-over track and trigger it and your mp3 to start playing at the same time. Obviously, this would only work inside your app.
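A rough sketch of that idea, starting a local backing track and the microphone recorder together (the file name and the backingTrack/voiceRecorder ivars are made up; this does not capture the iPod audio itself, only what plays inside your app alongside the mic):
// Assumed ivars: AVAudioPlayer *backingTrack; AVAudioRecorder *voiceRecorder;
NSError *err = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
[[AVAudioSession sharedInstance] setActive:YES error:&err];

NSURL *songURL = [[NSBundle mainBundle] URLForResource:@"backing" withExtension:@"mp3"]; // hypothetical file
backingTrack = [[AVAudioPlayer alloc] initWithContentsOfURL:songURL error:&err];
[backingTrack prepareToPlay];

// voiceRecorder is assumed to be set up elsewhere with initWithURL:settings:error: and prepareToRecord.
NSTimeInterval startAt = backingTrack.deviceCurrentTime + 0.2; // small lead time so both start together
[backingTrack playAtTime:startAt];
[voiceRecorder record];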