Reading .m4a raw data from the iPod library - iPhone

I'm using AVAssetReaderTrackOutput to read raw music data from the iPod library. Reading WAV or MP3 files from the iPod library works fine. However, if I read an M4A file I get data that is all zeros, or garbage... The M4A file I'm reading is playable and I'm sure it is not encrypted... It seems iOS either encrypts the data or outputs zeros when you read an M4A file. Does anyone have a suggestion on how to resolve this issue?
NOTE: If I use AVAssetExportSession to export from an m4a file to an m4a file, I get the correct raw data, but that means I first have to save it to disk (in the app's Documents folder, for example) and then read it back. I'd like to be able to stream it, so I need a way to read it directly.
Here is my code; I get correct MP3 or WAV raw data with it, but not M4A:
NSURL *assetURL = [MediaItem valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
AVAssetTrack *songTrack = [songAsset.tracks objectAtIndex:0];
AVAssetReaderOutput *assetReaderOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:songTrack
                                               outputSettings:nil];
NSError *assetError = nil;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:songAsset
                                                           error:&assetError];
[assetReader addOutput:assetReaderOutput];
// Retain
self.mStreamAssetReader = assetReader;
self.mStreamAssetReaderOutput = assetReaderOutput;

Assuming that by "raw music data" you mean raw PCM audio: you need to pass an output-settings NSDictionary into your AVAssetReaderTrackOutput so that iOS decodes the AAC bitstream inside the .m4a file for you. With outputSettings:nil the reader vends the track's samples in their stored, compressed AAC format, which is why the bytes look like garbage.
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    [NSNumber numberWithFloat:48000.0], AVSampleRateKey,
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
    nil];
AVAssetReaderOutput *assetReaderOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[songAsset.tracks objectAtIndex:0]
                                               outputSettings:settings];
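With those settings in place, a minimal read loop might look like the sketch below. This assumes the assetReader and assetReaderOutput retained in the question, uses the CoreMedia framework, and omits error handling for brevity:
// Minimal sketch: drain the decoded 16-bit PCM from the reader.
[assetReader startReading];
while (assetReader.status == AVAssetReaderStatusReading) {
    CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer];
    if (sampleBuffer == NULL) break; // reader finished or failed
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    NSMutableData *pcm = [NSMutableData dataWithLength:length];
    CMBlockBufferCopyDataBytes(blockBuffer, 0, length, pcm.mutableBytes);
    // ... hand pcm off to the streaming pipeline here ...
    CFRelease(sampleBuffer);
}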

Related

Android can't read iPhone recorded audio files

I record an audio file with my iOS application and send it to an Android application through my web server.
The Android application successfully gets the file, but when I use the MediaPlayer class to try and play it, it reports the error "Unable to to create media player" at the line mediaplayer.setDataSource(androidFilePath);
Just to mention, iOS devices can play the files sent by the app.
After some research I found that it may be an encoding issue, but I tried many recording settings suggested here on SO and none worked. Here is the code I use to record the audio file:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] initWithCapacity:10];
[recordSettings setObject:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
[recordSettings setObject:[NSNumber numberWithFloat:16000.0] forKey:AVSampleRateKey];
[recordSettings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
[recordSettings setObject:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *basePath = paths[0];
self.audioURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/recordTest.caf", basePath]];
NSError *error = nil;
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:self.audioURL
                                                 settings:recordSettings
                                                    error:&error];
if ([self.audioRecorder prepareToRecord]) {
    [self.audioRecorder recordForDuration:60];
}
Could you tell me what changes I have to make so that Android devices can play these audio files?
I'd probably try .aac or .mp4 instead of .caf.
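For example, here is a minimal sketch of that change, reusing the recordSettings dictionary from the question (the file name is illustrative); AVAudioRecorder infers the container format from the file extension:
// Sketch: record the same AAC stream into an .m4a (MPEG-4) container,
// which Android's MediaPlayer generally handles, instead of .caf.
NSString *basePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSURL *m4aURL = [NSURL fileURLWithPath:[basePath stringByAppendingPathComponent:@"recordTest.m4a"]];
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:m4aURL
                                                        settings:recordSettings
                                                           error:&error];
if ([recorder prepareToRecord]) {
    [recorder recordForDuration:60];
}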
Try removing the AVEncoderAudioQualityKey from the code and check whether that solves the problem. Also check whether the Android device you are testing with is able to play other AAC files properly.
No, you can't; Android's supported media formats are all listed, check here.
You would have to convert to AAC or WAV before you can use it.

WAV file is not converted to FLAC using libflac.a

I am trying to convert my recorded WAV file to a FLAC file, following this tutorial. I got the LibFlac project from here, and after building it I get flacios.framework and libFLACiOS.a.
I added both to my iOS project, but I get an error saying flacios.framework was not found. I don't know why, because it is there the whole time. So I just copied flacios.framework's header files and added them to my project.
Then I used the wav_to_flac files to convert the WAV file to FLAC, with this code:
NSArray *searchPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentPath = [NSString stringWithFormat:@"%@/%@", [searchPaths objectAtIndex:0], @"tmp"];
NSString *flacFile = documentPath;
NSString *waveFile = recorderFilePath;
NSLog(@"flacFile : %@", flacFile);
NSLog(@"recorderFilePath : %@", recorderFilePath);
const char *wave_file = [waveFile UTF8String];
const char *flac_file = [flacFile UTF8String];
int interval_seconds = 30;
char **flac_files = (char **)malloc(sizeof(char *) * 1024);
int conversionResult = convertWavToFlac(wave_file, flac_file, interval_seconds, flac_files);
NSLog(@"conversionResult : %i", conversionResult);
The log for my WAV file says:
recorderFilePath : /var/mobile/Applications/C5A86F04-A6A2-44EA-81A9-7AD1F36AAE5D/Documents/MyAudioMemo.wav
And the log for my FLAC file:
flacFile : /var/mobile/Applications/C5A86F04-A6A2-44EA-81A9-7AD1F36AAE5D/Documents/tmp
But I get this error at the end:
writing to new flac file /var/mobile/Applications/C5A86F04-A6A2-44EA-81A9-7AD1F36AAE5D/Documents/tmp.flac
Assertion failed: (encoder->protected_->state == FLAC__STREAM_ENCODER_OK), function FLAC__stream_encoder_process_interleaved, file /Users/dilipmanek/Desktop/FLACiOS-no-ogg/libFLAC/src/libFLAC/stream_encoder.c, line 2040.
Has anybody worked on this? Please help.
Yesterday I faced the same error, and after a few hours of searching I found a solution.
When you record the WAV file, use the following settings and then try the conversion again:
recorderSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    [NSNumber numberWithFloat:16000.0], AVSampleRateKey,
    [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
    nil];
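A likely explanation (my assumption, not verified against this libFLAC port): FLAC encodes integer PCM only, so if the recorder had been producing floating-point or non-16-bit samples, FLAC__stream_encoder_process_interleaved would reject the input and the encoder would leave the FLAC__STREAM_ENCODER_OK state, which is exactly the condition the failing assertion checks. The settings above force 16-bit interleaved integer PCM, matching what the encoder expects.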

H.264 hardware encoding/decoding for iOS (iPhone/iPad)?

I have done real-time video processing on iOS using the AVFoundation framework, with the help of this link, and I have tested that it works fine. Now I want to use H.264 encoding and decoding (before drawing). I tried to get H.264-encoded data from AVCaptureSession, so I set AVVideoCodecH264 in the capture output's video settings before starting the capture:
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    nil];
[captureOutput setVideoSettings:videoSettings];
The code above doesn't produce any change in the output buffer; I get the same buffer format as before. How can I accomplish this? Is it even possible? If so, please help me get started with H.264 on iOS.
There's no direct access to encoded keyframes from the capture pipeline; AVCaptureVideoDataOutput's videoSettings only accepts pixel-format keys, so the codec key is ignored. You may find some insightful comments about that here: https://stackoverflow.com/questions/4399162/how-to-stream-live-video-from-iphone-to-a-media-server-like-wowza
The pair AVVideoCodecH264, AVVideoCodecKey can, however, be used in the outputSettings of an AVAssetWriterInput:
NSError *error = nil;
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    // Width and height are required alongside a codec key;
    // the values here are just examples.
    [NSNumber numberWithInt:640], AVVideoWidthKey,
    [NSNumber numberWithInt:480], AVVideoHeightKey,
    nil];
AVAssetWriterInput *assetWriterInput = [[AVAssetWriterInput alloc]
    initWithMediaType:AVMediaTypeVideo
       outputSettings:outputSettings];
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:fileURL
                                                       fileType:AVFileTypeMPEG4
                                                          error:&error];
if (!error && [assetWriter canAddInput:assetWriterInput])
    [assetWriter addInput:assetWriterInput];
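A rough sketch of wiring a capture output into that writer input might then look like this (assuming the assetWriter and assetWriterInput above are stored in ivars; session teardown and error handling are elided):
// Sketch: push captured frames through the H.264-encoding writer input.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (assetWriter.status == AVAssetWriterStatusUnknown) {
        // Start writing at the timestamp of the first frame.
        [assetWriter startWriting];
        [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (assetWriter.status == AVAssetWriterStatusWriting &&
        assetWriterInput.readyForMoreMediaData) {
        [assetWriterInput appendSampleBuffer:sampleBuffer];
    }
}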

GData YouTube video upload missing audio

I am using the GData API to upload videos to YouTube from an iOS application.
It successfully uploads the video, but the audio is missing.
I am using the .mp4 video format.
Does anyone have a clue?
Thanks
- (BOOL)setupWriter {
    NSError *error = nil;
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie.mp4"];
    _videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                             fileType:AVFileTypeQuickTimeMovie
                                                error:&error];
    NSParameterAssert(_videoWriter);
    // Add the video input
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithDouble:128.0*1024.0], AVVideoAverageBitRateKey,
        nil];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInt:192], AVVideoWidthKey,
        [NSNumber numberWithInt:144], AVVideoHeightKey,
        videoCompressionProps, AVVideoCompressionPropertiesKey,
        nil];
    _videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoSettings] retain];
    float angle = M_PI/2; // rotate 90°, i.e. π/2 radians
    _videoWriterInput.transform = CGAffineTransformMakeRotation(angle);
    NSParameterAssert(_videoWriterInput);
    _videoWriterInput.expectsMediaDataInRealTime = YES;
    // Add the audio input
    AudioChannelLayout acl;
    bzero(&acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
    NSDictionary *audioOutputSettings = nil;
    // Both types of audio input cause the output video file to be corrupted.
    if (NO) {
        // should work from the iPhone 3GS on, and from the iPod touch 3rd generation
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
            [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
            [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
            [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
            [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
            nil];
    } else {
        // should work on any device, but requires more space
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
            [NSNumber numberWithInt:16], AVEncoderBitDepthHintKey,
            [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
            [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
            [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
            nil];
    }
    _audioWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                            outputSettings:audioOutputSettings] retain];
    _audioWriterInput.expectsMediaDataInRealTime = YES;
    // add the inputs
    [_videoWriter addInput:_videoWriterInput];
    [_videoWriter addInput:_audioWriterInput];
    return YES;
}
This is the writer setup I am using to capture audio. Is something wrong with it?
From the CamStudio Support Forum:
.MOV / .MP4 / .3GPP files
MOV/MP4/3GPP files are index-based. Simply put, this means that there is an index in the file that tells us the specific location of where in the file the video and audio frames are present. Without the index, it is almost impossible to know where the data for a specific video or audio frame is. This index is contained in what is called a 'moov' atom in the file.
Now, if the index is at the beginning of the file, it will enable processing of the video as and when successive bytes of the file are uploaded. On the other hand, if the index is at the end, processing the video cannot begin until the entire upload is complete, since the index is needed to interpret the file.
Hence, for MOV/MP4/3GPP files, we prefer the 'moov' atom at the beginning of the file, also known as a "fast start" MP4/MOV file. There are tools available on the web to flatten your MOV file. Usually the video editing/export software will have options to create your files with the moov atom at the beginning rather than the end of your file. If you are using Apple editing tools, then see this article on how to produce a "fast start" MP4/MOV file.
Here's a list of some well-known formats that YouTube supports:
WebM files - VP8 video codec and Vorbis audio codec
.MP4, .3GPP and .MOV files - typically H.264 or MPEG-4 video codecs and the AAC audio codec
.AVI - many cameras output this format; typically the video codec is MJPEG and the audio is PCM
.MPEGPS - typically the MPEG-2 video codec and MP2 audio
.WMV
.FLV - Adobe FLV1 video codec, MP3 audio
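On iOS, one way to get such a fast-start file before uploading is AVAssetExportSession's shouldOptimizeForNetworkUse flag, which moves the moov atom to the front. A minimal sketch (sourceURL and fastStartURL are illustrative names):
// Sketch: re-export the movie with the moov atom at the front
// ("fast start") before uploading.
AVAsset *asset = [AVAsset assetWithURL:sourceURL];
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetPassthrough];
export.outputURL = fastStartURL;
export.outputFileType = AVFileTypeMPEG4;
export.shouldOptimizeForNetworkUse = YES; // put the index first
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        // upload fastStartURL instead of the original file
    }
}];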
I have been working with the GData API for the last few days and have had no similar problems. I would suggest you check the file types supported by YouTube: http://www.google.com/support/youtube/bin/answer.py?answer=55744. Can you verify that the file you are uploading actually has audio? What does QuickTime say about the format of your audio track? YouTube recommends using AAC (as far as I can tell).

AVAudioSession reroutes iPhone audio automatically?

I'm having problems with AVAudioSession and AVAudioRecorder in a cocos2d game that I'm working on.
I'm trying to capture mic input using a simple AVAudioRecorder example to detect when the user makes a sound into the mic (the sound itself doesn't matter, as I'm recording to /dev/null).
Here is my setup code for the microphone:
NSURL *newURL = [[NSURL alloc] initFileURLWithPath:@"/dev/null"];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
NSDictionary *recordSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
    [NSNumber numberWithFloat:22050.0], AVSampleRateKey,
    [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
    [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
    [NSNumber numberWithInt:AVAudioQualityLow], AVEncoderAudioQualityKey,
    nil];
micInput = [[AVAudioRecorder alloc] initWithURL:newURL settings:recordSettings error:nil];
[newURL release];
[recordSettings release];
[micInput setMeteringEnabled:YES];
With the above code on the iPhone, the scene starts with all of the audio (sound effects, background music, etc.) playing at a really low level, because it plays through the receiver (the earpiece) instead of the external speaker. When I test this on an iPad or iPod touch, the background audio plays through the external speaker as expected. This is a problem, since the game's volume drops drastically on the iPhone during this particular part of the game.
When I comment out the AVAudioSession setup line, the sounds play through the external speaker, but of course I can't get microphone input anymore. Is there any workaround or solution to this? I need to be able to record with AVAudioRecorder but still have audio output through the iPhone's external speaker.
Thanks!
Try something like the following after you set up your audio session:
UInt32 ASRoute = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                        sizeof(ASRoute),
                        &ASRoute);
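On iOS 6 and later, the same override is also exposed through AVAudioSession itself; a minimal sketch:
// Sketch (iOS 6+): keep the PlayAndRecord category but route output
// to the loudspeaker, replacing the deprecated C-API call above.
NSError *routeError = nil;
[[AVAudioSession sharedInstance] overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker
                                                   error:&routeError];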