Android can't read iPhone-recorded audio files

I record an audio file with my iOS application and send it to an Android application through my web server.
The Android application successfully gets the file, but when I use the MediaPlayer class to try to play it, it reports the error "Unable to create media player" at the line mediaplayer.setDataSource(androidFilePath);
Just to mention, iOS devices can read the files sent with the app.
After some research I found that it may be an encoding issue, but I have tried many recording settings provided here on SO and none has worked. Here is the code I use to record the audio file:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:nil];

NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] initWithCapacity:10];
[recordSettings setObject:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
[recordSettings setObject:[NSNumber numberWithFloat:16000.0] forKey:AVSampleRateKey];
[recordSettings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
[recordSettings setObject:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *basePath = paths[0];
self.audioURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/recordTest.caf", basePath]];

NSError *error = nil;
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:self.audioURL settings:recordSettings error:&error];
if ([self.audioRecorder recordForDuration:60] == YES) {
    [self.audioRecorder record];
}
Could you tell me what change I have to make so that Android devices can read those audio files?

I'd probably try .aac or .mp4 instead of .caf.
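For reference, here is a minimal sketch of the asker's recording code writing AAC into an MPEG-4 container instead of a CAF file; AAC in an .m4a/.mp4 container is a format Android's MediaPlayer supports. The .m4a file name and the ARC-style memory management are assumptions on my part, not from the question:
// Sketch: record AAC into an .m4a (MPEG-4) container instead of .caf.
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
[audioSession setActive:YES error:nil];

NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                                [NSNumber numberWithFloat:16000.0], AVSampleRateKey,
                                [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                [NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey,
                                nil];

NSString *basePath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
NSURL *audioURL = [NSURL fileURLWithPath:[basePath stringByAppendingPathComponent:@"recordTest.m4a"]]; // was recordTest.caf

NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:audioURL
                                                        settings:recordSettings
                                                           error:&error];
[recorder recordForDuration:60];
The only functional change from the question's code is the container: AVAudioRecorder picks the file format from the URL's extension, so the same AAC settings end up in a file Android can parse.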

Try removing the AVEncoderAudioQualityKey from the code and check whether this solves the problem. Also check whether the Android device you are testing with is able to play other AAC files properly.

No, you can't. The supported formats are all listed (check here).
You would have to convert the file to AAC or WAV before you can use it.

Related

Different AVAudioSession needed for Recording and Playing in iOS

I am continuing this question from my old post. I have read a lot about audio recording and playback using audio queues and audio buffers; unfortunately, due to time constraints, I compromised and am planning to use the AVFoundation framework for recording and playing back the recorded audio.
My idea: create one instance of AVAudioRecorder and record through the microphone, then play the same audio through the iPhone speaker using AVAudioPlayer (I'll use multiple instances).
Please bear with my many lines of code for recording and playing:
-(id)startAudioRecorder:(NSUInteger)viewTag {
    NSError *setCategoryErr = nil;
    NSError *activationErr = nil;
    // Set the general audio session category
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&setCategoryErr];
    // Make the default sound route for the session be to use the speaker
    UInt32 doChangeDefaultRoute = 1;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker, sizeof(doChangeDefaultRoute), &doChangeDefaultRoute);
    // Activate the customized audio session
    [[AVAudioSession sharedInstance] setActive:YES error:&activationErr];

    self.audioRecorder = nil;
    //AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    //[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];

    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey]; //kAudioFormatAppleIMA4,kAudioFormatAAC
    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];

    NSString *absolutePath = [self getAbsoluteAudioFilePath:viewTag];
    NSError *audioRecorderError = nil;
    NSURL *absoluteUrl = [NSURL fileURLWithPath:absolutePath];
    self.audioRecorder = [[[AVAudioRecorder alloc] initWithURL:absoluteUrl settings:recordSetting error:&audioRecorderError] autorelease];
    NSLog(@"%s audioRecorder created=%@", __func__, self.audioRecorder);

    BOOL isDurationAccepted = [self.audioRecorder recordForDuration:1800.0];
    BOOL record = [self.audioRecorder record];
    [recordSetting release];
    recordSetting = nil;
    //if(error != NULL) *error = audioRecorderError;
    return nil;
}
-(id)playRecordedAudio:(NSUInteger)viewTag
{
    NSURL *absoluteUrl = [NSURL fileURLWithPath:[[self getAbsoluteAudioFilePath:viewTag] stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];
    NSError *audioPlayerError = nil;
    recordingView.audioPlayer = [[[AVAudioPlayer alloc] initWithContentsOfURL:absoluteUrl error:&audioPlayerError] autorelease];
    recordingView.audioPlayer.delegate = self;
    [recordingView.audioPlayer prepareToPlay];
    recordingView.audioPlayer.numberOfLoops = -1;

    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    [recordingView.audioPlayer play];
    return nil;
}
The above code works as I expected. But when I record, then play, and then record again (before the currently playing audio finishes), the audio that is playing switches to the receiver (earpiece) instead of playing through the speaker.
The problem is that I am unable to record and play simultaneously. I expect recording (even while audio is playing) to always go through the earphones, and multiple audio files to play through the phone speaker.
Please suggest changes to my code above to get out of this situation. All comments are helpful.
Set AVAudioSessionCategoryPlayAndRecord for playing those recordings.
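One detail that may matter here: the kAudioSessionProperty_OverrideAudioRoute override used in playRecordedAudio is reset whenever the route or category changes (for example, when recording restarts), whereas the kAudioSessionProperty_OverrideCategoryDefaultToSpeaker override already installed in startAudioRecorder persists while the category stays active. A minimal sketch of playRecordedAudio relying only on the persistent override; whether this alone resolves the asker's exact rerouting is an assumption:
// Sketch: play back relying on the category-level speaker override that
// startAudioRecorder already installs, instead of re-overriding the route here.
- (id)playRecordedAudio:(NSUInteger)viewTag {
    NSURL *absoluteUrl = [NSURL fileURLWithPath:[self getAbsoluteAudioFilePath:viewTag]];

    NSError *audioPlayerError = nil;
    recordingView.audioPlayer = [[[AVAudioPlayer alloc] initWithContentsOfURL:absoluteUrl
                                                                        error:&audioPlayerError] autorelease];
    recordingView.audioPlayer.delegate = self;
    recordingView.audioPlayer.numberOfLoops = -1;
    [recordingView.audioPlayer prepareToPlay];

    // No kAudioSessionProperty_OverrideAudioRoute call here: that per-route override
    // is reset when the category or route changes, while the category-level
    // DefaultToSpeaker override keeps playback on the loudspeaker as long as
    // PlayAndRecord stays active.
    [recordingView.audioPlayer play];
    return nil;
}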

Reading .m4a raw data from iPod library

I'm using AVAssetReaderTrackOutput to read raw music data from the iPod library; reading WAV or MP3 files from the iPod library works fine. However, if I read an M4A file I get data that is all zeros or garbage... The M4A file I'm reading is playable and I'm sure it is not encrypted... It seems that iOS either encrypts the data or outputs zeros when you read an M4A file. Does anyone have a suggestion on how to resolve this issue?
NOTE: If I use AVAssetExportSession to export from the m4a file to another m4a file, then I get correct raw data, but this means I have to first save it to disk (in the app's Documents folder, for example) and then read it. I'd like to be able to stream it, so I need to find a way to read it directly.
Here is my code; I get correct MP3 or WAV raw data, but not M4A:
NSURL *assetURL = [MediaItem valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
AVAssetTrack *songTrack = [songAsset.tracks objectAtIndex:0];
AVAssetReaderOutput *assetReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[songAsset.tracks objectAtIndex:0]
                                                                                    outputSettings:nil];
NSError *assetError = nil;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:songAsset error:&assetError];
[assetReader addOutput:assetReaderOutput];

// Retain
self.mStreamAssetReader = assetReader;
self.mStreamAssetReaderOutput = assetReaderOutput;
Assuming that by "raw music data" you mean raw PCM audio, you should pass an initialised NSDictionary into your AVAssetReaderTrackOutput with the appropriate parameters, so that iOS decodes the AAC bitstream inside the .m4a file for you.
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                          [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                          [NSNumber numberWithFloat:48000.0], AVSampleRateKey,
                          [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                          [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                          [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                          [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                          nil];

AVAssetReaderOutput *assetReaderOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[songAsset.tracks objectAtIndex:0]
                                               outputSettings:settings];
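For completeness, a minimal sketch of draining the decoded PCM once the reader is configured this way, reusing the assetReader and assetReaderOutput from the snippets above; where the bytes go and how errors are handled are left to the caller:
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: pull decoded, interleaved 16-bit PCM out of the configured reader.
[assetReader startReading];

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [assetReaderOutput copyNextSampleBuffer]) != NULL) {
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    size_t length = CMBlockBufferGetDataLength(blockBuffer);

    // Copy the PCM bytes out of the block buffer.
    NSMutableData *pcmData = [NSMutableData dataWithLength:length];
    CMBlockBufferCopyDataBytes(blockBuffer, 0, length, pcmData.mutableBytes);

    // ... hand pcmData to the streaming code here ...

    CFRelease(sampleBuffer);
}

if (assetReader.status == AVAssetReaderStatusFailed) {
    NSLog(@"Asset reading failed: %@", assetReader.error);
}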

GData youtube video upload missing audio

I am using the GData API to upload videos to YouTube from an iOS application.
It successfully uploads the video, but the audio is missing.
I am using the .mp4 video format.
Does anyone have a clue?
Thanks
-(BOOL) setupWriter{
NSError *error = nil;
// NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
// NSString *documentsDirectory = [paths objectAtIndex:0];
// NSURL * url = [NSURL URLWithString:documentsDirectory];
// url = [url URLByAppendingPathComponent:@"om.mp4"];
// NSString *path = [documentsDirectory stringByAppendingPathComponent:@"om.mp4"];
// [data writeToFile:path atomically:YES];
NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie.mp4"];
_videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(_videoWriter);
// Add video input
NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithDouble:128.0*1024.0], AVVideoAverageBitRateKey,
nil ];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:192], AVVideoWidthKey,
[NSNumber numberWithInt:144], AVVideoHeightKey,
videoCompressionProps, AVVideoCompressionPropertiesKey,
nil];
_videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
float angle = M_PI/2; // rotate 90°, i.e. π/2 radians
_videoWriterInput.transform = CGAffineTransformMakeRotation(angle);
NSParameterAssert(_videoWriterInput);
_videoWriterInput.expectsMediaDataInRealTime = YES;
// Add the audio input
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary* audioOutputSettings = nil;
// Both types of audio settings cause the output video file to be corrupted.
if( NO ) {
// should work from the iPhone 3GS onwards and from the 3rd-generation iPod touch
audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
[ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
nil];
} else {
// should work on any device but requires more space
audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
[ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
nil ];
}
_audioWriterInput = [[AVAssetWriterInput
assetWriterInputWithMediaType: AVMediaTypeAudio
outputSettings: audioOutputSettings ] retain];
_audioWriterInput.expectsMediaDataInRealTime = YES;
// add input
[_videoWriter addInput:_videoWriterInput];
[_videoWriter addInput:_audioWriterInput];
return YES;
}
This is the writer setup I am using to capture the audio. Is something wrong in this?
From the CamStudio Support Forum:
.MOV / .MP4 / .3GPP files
MOV /MP4 /3GPP files are index based. Simply put, this means that
there is an index in the file that tells us the specific location of
where in the file the video and audio frames are present. Without the
index, it is almost impossible to know where the data for a specific
video or audio frame is. This index is contained in what is called a
'moov' atom in the file.
Now, if the index is at the beginning of the file, it will enable
processing of the video as and when successive bytes of the file are
uploaded. On the other hand, if the index is at the end, processing
the video cannot begin until the entire upload is complete - since the
index is needed to interpret the file.
Hence, for MOV / MP4 / 3gpp files, we prefer the "moov" atom in the
beginning of the file - also known as a "fast start" MP4 / MOV file.
There are tools available on the web to flatten your MOV file. Usually
the video editing/export software will have options to create your
files with the moov atom in the beginning rather than the end of your
file. If you are using Apple editing tools, then see this article on
how to produce a "fast start" MP4/MOV file.
Here's a list of some well-known formats that YouTube supports:
WebM files - VP8 video codec and Vorbis audio codec
.MPEG4, 3GPP and MOV files - typically supporting the H.264 or MPEG-4 video codecs and the AAC audio codec
.AVI - many cameras output this format; typically the video codec is MJPEG and the audio is PCM
.MPEGPS - typically supporting the MPEG-2 video codec and MP2 audio
.WMV
.FLV - Adobe FLV1 video codec, MP3 audio
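To produce the "fast start" layout described above from an iOS app, one option is to re-export the movie with AVAssetExportSession and shouldOptimizeForNetworkUse enabled. A minimal sketch; the paths are placeholders, and it is not established that the moov-atom position is actually what breaks the asker's audio:
// Sketch: re-export a recorded movie with the 'moov' atom at the front ("fast start").
// inputPath / outputPath are placeholder values.
NSString *inputPath  = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie.mp4"];
NSString *outputPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie-faststart.mp4"];

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:inputPath] options:nil];
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                presetName:AVAssetExportPresetPassthrough];
export.outputFileType = AVFileTypeMPEG4;
export.outputURL = [NSURL fileURLWithPath:outputPath];
export.shouldOptimizeForNetworkUse = YES;   // writes the index at the beginning of the file

[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Fast-start file written to %@", export.outputURL);
    } else {
        NSLog(@"Export failed: %@", export.error);
    }
}];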
I have been working with the GData API for the last few days and had no similar problems. I would suggest you check out the file types supported by YouTube: http://www.google.com/support/youtube/bin/answer.py?answer=55744. Can you verify that the file you are uploading actually has audio? What does QuickTime say about the format of your audio track? YouTube recommends using AAC (as far as I can tell).
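If the recorded file itself turns out to have no audio track, the part to check is usually the capture callback that feeds the writer inputs, which the question does not show. A heavily hedged sketch of what that delegate might look like inside the class that owns _videoWriter, assuming an AVCaptureSession with separate _videoOutput and _audioOutput data outputs (those names are invented here for illustration):
// Hypothetical capture delegate: append each sample buffer to the matching writer input.
// _videoOutput / _audioOutput are assumed AVCaptureVideoDataOutput / AVCaptureAudioDataOutput
// instances; they are not part of the original question.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (_videoWriter.status == AVAssetWriterStatusUnknown) {
        [_videoWriter startWriting];
        [_videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }

    if (captureOutput == _videoOutput && _videoWriterInput.isReadyForMoreMediaData) {
        [_videoWriterInput appendSampleBuffer:sampleBuffer];
    } else if (captureOutput == _audioOutput && _audioWriterInput.isReadyForMoreMediaData) {
        // If this branch never runs, the resulting file will have no audio track.
        [_audioWriterInput appendSampleBuffer:sampleBuffer];
    }
}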

AVAudioSession reroutes iPhone audio automatically?

I'm having problems with AVAudioSession using the AVAudioRecorder in a cocos2d game that I'm working on.
I'm trying to capture mic input using a simple AVAudioRecorder example to detect when the user makes a sound in the mic (the sound itself doesn't matter, as I'm recording into /dev/null).
Here is my setup code for the microphone:
NSURL *newURL = [[NSURL alloc] initFileURLWithPath:@"/dev/null"];
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord error: nil];
NSDictionary *recordSettings =
    [[NSDictionary alloc] initWithObjectsAndKeys:
     [NSNumber numberWithFloat:22050.0], AVSampleRateKey,
     [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
     [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
     [NSNumber numberWithInt:AVAudioQualityLow], AVEncoderAudioQualityKey,
     nil];
micInput = [[AVAudioRecorder alloc] initWithURL:newURL settings:recordSettings error:nil];
[newURL release];
[recordSettings release];
[micInput setMeteringEnabled:YES];
On the iPhone with the above code, the scene starts with all of the audio (sound effects, background music, etc.) playing at a really low level, because it is only playing through the phone speaker instead of the external speaker. When I test this on iPad or iPod Touch, the background audio plays through the external speaker as expected. This is a problem, since the volume of the game lowers drastically when playing on the iPhone version during this particular part of the game.
When I comment out the AVAudioSession setup line, the sounds play through the external speaker, but of course I can't get microphone input anymore. Is there any workaround or solution to this problem? I need to be able to record with AVAudioRecorder but still have audio output from the iPhone's external speaker.
Thanks!
Try something like the following after you set up your audio session:
UInt32 ASRoute = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (
kAudioSessionProperty_OverrideAudioRoute,
sizeof (ASRoute),
&ASRoute
);
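For what it's worth, on iOS 6 and later the same route override can be expressed through AVAudioSession itself rather than the deprecated C API. A minimal sketch, assuming the rest of the asker's setup is unchanged:
// Sketch: route output to the loudspeaker while using PlayAndRecord (iOS 6+ API).
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
[session setActive:YES error:&error];
Alternatively, passing AVAudioSessionCategoryOptionDefaultToSpeaker via setCategory:withOptions:error: makes the speaker the default for the whole time the category is active, rather than a one-off route override.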

recorder works on iPhone 3GS but not on iPhone 3G

I have an AVAudioRecorder, which gets initialized like this:
NSMutableDictionary *recordSettings = [[[NSMutableDictionary alloc] initWithCapacity:2] autorelease];
[recordSettings setObject:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
[recordSettings setObject:[NSNumber numberWithInt:AVAudioQualityLow] forKey:AVEncoderAudioQualityKey];

NSURL *url = [NSURL fileURLWithPath:[self getActualSoundPath]];
_recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSettings error:nil];
[_recorder setDelegate:self];
This code works perfectly in the Simulator and on my iPhone 3GS, but not on the older iPhone 3G...
What is the problem there?
Thanks
Markus
The iPhone 3G doesn't have support for hardware AAC encoding like the 3GS/4/4S do. You could use linear PCM, but that's not compressed.
See this thread for supported formats: How can I record AMR audio format on the iPhone?
Apple Lossless (kAudioFormatAppleLossless) and iLBC (kAudioFormatiLBC) seem to work fine on the iPhone 3G.
kAudioFormatAppleIMA4 is recommended, but it's not recognized by mobile Safari.
/Marcus
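As an illustration of the suggestion above, a minimal sketch of the same recorder setup with IMA4, which the iPhone 3G can encode without hardware support; whether IMA4 is acceptable for the asker's playback targets, and whether getActualSoundPath returns a suitable container extension, are assumptions:
// Sketch: use IMA4 instead of AAC so the iPhone 3G can encode it.
// Note: IMA4 typically goes in a .caf or .aif container, so the extension
// returned by getActualSoundPath may need to change accordingly.
NSMutableDictionary *recordSettings = [[[NSMutableDictionary alloc] initWithCapacity:3] autorelease];
[recordSettings setObject:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSettings setObject:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSettings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];

NSURL *url = [NSURL fileURLWithPath:[self getActualSoundPath]]; // same helper as in the question
_recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSettings error:nil];
[_recorder setDelegate:self];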