I am creating and storing an AAC-encoded .m4a file using AVAudioRecorder. This produces a playable .m4a file just fine. I then want to use AVAssetExportSession to process the file and add metadata to it. The code below produces a .m4a file of similar size (1 KB smaller than the source), but when it plays back, there is just silence.
NSURL* url = [NSURL fileURLWithPath:self.m4aPath];
AVURLAsset* asset = [AVAsset assetWithURL:url];
AVMutableMetadataItem* t = [AVMutableMetadataItem metadataItem];
t.key = AVMetadataCommonKeyTitle;
t.keySpace = AVMetadataKeySpaceCommon;
t.value = #"Unit Test";
NSArray* metadata = [NSArray arrayWithObject:t];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
exportSession.outputURL = [NSURL fileURLWithPath:[[NSFileManager rawRecordingsDirectory] stringByAppendingPathComponent:@"test.m4a"]];
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.metadata = metadata;
[exportSession exportAsynchronouslyWithCompletionHandler:^{....}];
One more piece of info: when I look at the source and exported files in the Finder, the source file has the black iTunes icon, while the exported file has the white iTunes icon. I'm not sure what this means in practice, but I'm hoping it might be helpful. Also, double-clicking the source file adds it to iTunes and starts playback, while double-clicking the exported file opens iTunes but does nothing.
I had a similar issue where my output m4a file had the white icon (instead of black) and wouldn't play. Though that was when I was creating the original source file from raw sample data, not when adding metadata to it.
My issue was that I wasn't closing the exported file in my code (I was just terminating the app before calling the close function). Once I called the close function, it started working. You might want to check that.
Also, I found "Open With > QuickTime" useful: it shows an error when the file is corrupt and plays it fine when it isn't, which is more useful than iTunes silently ignoring the problem.
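Beyond that, it's worth logging the export session's status and error in the completion handler rather than leaving it empty; a file that exports "successfully" but plays back silent may still carry an error worth inspecting. A minimal sketch (the handler body in the question is elided, so this is only an illustration):
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // Check why the export finished instead of ignoring the result.
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export completed: %@", exportSession.outputURL);
    } else {
        NSLog(@"Export status %ld, error: %@",
              (long)exportSession.status, exportSession.error);
    }
}];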
Related
I have two mono WAV files that I would like to merge.
I want to merge them into a stereo WAV file where the first file uses the left channel and the second file uses the right channel (if possible, I would also like to control the volume and lower the second file a bit).
I've tried to use AVAssetReaderAudioMixOutput, but got the following error:
[AVAssetReaderAudioMixOutput initWithAudioTracks:audioSettings:] tracks must all be part of the same AVAsset
I'm not sure how to merge 2 different files.
AVAssetReaderOutput* reader=[AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:[NSArray arrayWithObjects:
[[AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[documentDirectory stringByAppendingPathComponent:@"left.wav"]] options:nil].tracks lastObject],
[[AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[documentDirectory stringByAppendingPathComponent:@"right.wav"]] options:nil].tracks lastObject],
nil] audioSettings:nil];
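That error is raised because AVAssetReaderAudioMixOutput requires all of its tracks to belong to a single asset. One way around it (a hedged sketch, not a tested solution) is to copy both files into one AVMutableComposition and read from that. Note this only removes the error: the mix output sums the tracks together, so routing one file to the left channel and the other to the right would still need you to build the stereo buffers yourself (e.g. with Core Audio or AVAssetWriter), though an AVAudioMix with AVMutableAudioMixInputParameters can at least lower the second track's volume.
NSError *error = nil;
AVMutableComposition *composition = [AVMutableComposition composition];
// Copy each mono file into its own track of the composition so that all
// tracks end up belonging to the same asset. File names match the question.
NSArray *fileNames = [NSArray arrayWithObjects:@"left.wav", @"right.wav", nil];
for (NSString *name in fileNames) {
    NSURL *fileURL = [NSURL fileURLWithPath:
        [documentDirectory stringByAppendingPathComponent:name]];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] lastObject];
    AVMutableCompositionTrack *compTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                       ofTrack:sourceTrack
                        atTime:kCMTimeZero
                         error:&error];
}
// The composition is itself an AVAsset, so its tracks satisfy the
// "same AVAsset" requirement of AVAssetReaderAudioMixOutput.
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:&error];
AVAssetReaderAudioMixOutput *mixOutput =
    [AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:
        [composition tracksWithMediaType:AVMediaTypeAudio] audioSettings:nil];
[reader addOutput:mixOutput];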
I have two audio files that I want to mix and play programmatically. While the first audio file is playing, after some time (the time is dynamic) I need to mix in a second, small audio file somewhere in the middle of the first one, and then finally I need to save the result as one audio file on the device. The saved file should play back with the second audio mixed in.
I have gone through many forums, but couldn't figure out exactly how to achieve this.
Could someone please clarify my below doubts?
In this case, what audio file format should I use? Can I use .avi files?
How do I add the second audio at a dynamically chosen time in the first audio file programmatically? For example, if the first audio is 2 minutes long, I might need to mix the second audio file (3 seconds of audio) in at 1 minute, 1.5 minutes, or 55 seconds of the first file. It's dynamic.
How do I save the final output audio file on the device? If I save the audio file programmatically somewhere, can I play it back again?
I don't know how to achieve this. Please suggest your thoughts!
Open each audio file
Read the header info
Get raw uncompressed audio into memory as an array of ints for each file
Starting at the point in file 1's array where you want to mix in file2, loop through, adding file2's int value to file1's, being sure to 'clip' any values above or below the max (this is how you mix audio ... yes, it's that simple). If file2 is longer, you'll have to make the first array long enough to hold the remainder of file2 completely.
Write new header info and then the audio from the array to which you added file2.
If there is compression involved or the files won't fit in memory, you may have to implement a more complex buffering scheme.
In this case, what audio file format should I use? Can I use .avi files?
You can choose a compressed or non-compressed format. Common non-compressed formats include WAV and AIFF. CAF can hold both compressed and non-compressed data. .avi is not an option offered by the OS.
If the files are large and storage space (on disk) is a concern, consider AAC saved in a CAF (or simply .m4a). For most applications, 16-bit samples will be enough, and you can also save space, memory, and CPU by saving these files at an appropriate sample rate (for reference, CDs are 44.1 kHz).
Since the ExtAudioFile interface abstracts the conversion process, you should not have to change your program to compare the size and speed differences of compressed and non-compressed formats for your distribution (AAC in a CAF would be fine for normal applications).
Non-compressed CD-quality audio consumes about 5.3 MB per minute, per channel. So if you have two stereo audio files, each 3 minutes long, plus a 3-minute stereo destination buffer, your memory requirement would be around 95 MB (5.3 MB × 3 minutes × 2 channels × 3 buffers).
Since you have 'minutes' of audio, you may need to consider avoiding loading all the audio data into memory at once. In order to read, manipulate, and combine audio, you will need a non-compressed representation to work with in memory, so compression formats would not help here. As well, converting a compressed representation to PCM takes a good amount of resources; reading a compressed file, although it involves fewer bytes, can take more (or less) time.
How do I add the second audio at a dynamically chosen time in the first audio file programmatically? For example, if the first audio is 2 minutes long, I might need to mix the second audio file (3 seconds of audio) in at 1 minute, 1.5 minutes, or 55 seconds of the first file. It's dynamic.
To read the files and convert them to the format you want to work with, use the ExtAudioFile APIs; they will convert to your destination sample format for you. Common PCM sample representations in memory include SInt32, SInt16, and float, but that can vary wildly based on the application and the hardware (beyond iOS). The ExtAudioFile APIs will also convert compressed formats to PCM, if needed.
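As a rough illustration (not from the answer itself; the "first.m4a" file name, the documentsDirectory variable, and the 16-bit stereo client format are assumptions), opening a source file with ExtAudioFile and asking it to hand back interleaved SInt16 PCM looks roughly like this:
#import <AudioToolbox/AudioToolbox.h>
// Open the first source file. Substitute your own path.
NSURL *urlA = [NSURL fileURLWithPath:
    [documentsDirectory stringByAppendingPathComponent:@"first.m4a"]];
ExtAudioFileRef fileA = NULL;
OSStatus err = ExtAudioFileOpenURL((CFURLRef)urlA, &fileA); // use __bridge under ARC
// Ask ExtAudioFile to convert everything it reads to interleaved 16-bit PCM.
AudioStreamBasicDescription clientFormat = {0};
clientFormat.mSampleRate       = 44100.0;
clientFormat.mFormatID         = kAudioFormatLinearPCM;
clientFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
clientFormat.mChannelsPerFrame = 2;
clientFormat.mBitsPerChannel   = 16;
clientFormat.mBytesPerFrame    = clientFormat.mChannelsPerFrame * sizeof(SInt16);
clientFormat.mFramesPerPacket  = 1;
clientFormat.mBytesPerPacket   = clientFormat.mBytesPerFrame;
err = ExtAudioFileSetProperty(fileA, kExtAudioFileProperty_ClientDataFormat,
                              sizeof(clientFormat), &clientFormat);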
Your input audio files should have the same sample rate. If not, you will have to resample the audio, a complex process which also takes a lot of resources (if done correctly/accurately). If you need to support resampling, double the time you've allocated to completing this task (not detailing the process here).
To add the sounds, you would request PCM samples from the files, process, and write to the output file (or buffer in memory).
To determine when to add the other sounds, you will need to get the sample rates for the input files (via ExtAudioFileGetProperty). If you want to write the second sound to the destination buffer at 55s, then you would start adding the sounds at sample number SampleRate * 55, where SampleRate is the sample rate of the files you are reading.
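Continuing the sketch above, reading the file's native format (and hence its sample rate) and turning the 55-second mark into a frame index could look like this:
// Read fileA's native data format to find its sample rate.
AudioStreamBasicDescription fileFormat = {0};
UInt32 propSize = sizeof(fileFormat);
ExtAudioFileGetProperty(fileA, kExtAudioFileProperty_FileDataFormat,
                        &propSize, &fileFormat);
// Frame index at which the second sound should start (55 seconds here).
SInt64 insertFrame = (SInt64)(fileFormat.mSampleRate * 55.0);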
To mix audio, you will just use this form (pseudocode):
mixed[i] = fileA[i] + fileB[i];
but you have to be sure you avoid over/underflow and other arithmetic errors. Typically, you will perform this process using some integer type, because floating-point calculations can take a long time (when there are so many of them). For some applications, you could just shift and add with no worry of overflow; this effectively reduces each input by one half before adding them, so the amplitude of the result would be one half. If you have control over the files' content (e.g. they are all bundled as resources), then you could simply ensure no peak sample in the files exceeds one half of the full-scale value (about -6 dBFS). Of course, saving as float would solve this issue at the expense of higher CPU, memory, and file I/O demands.
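A minimal clamped-add helper for interleaved SInt16 samples could look like this (a sketch; it simply clips at the 16-bit limits rather than shifting the inputs down):
// Mix inB into inA sample-by-sample, clamping to the 16-bit range.
static void MixSInt16(SInt16 *inA, const SInt16 *inB, size_t sampleCount)
{
    for (size_t i = 0; i < sampleCount; i++) {
        SInt32 sum = (SInt32)inA[i] + (SInt32)inB[i];
        if (sum >  32767) sum =  32767;   // clip positive overflow
        if (sum < -32768) sum = -32768;   // clip negative overflow
        inA[i] = (SInt16)sum;
    }
}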
At this point, you'd have 2 files open for reading, and one open for writing, then a few small temporary buffers for processing and mixing the inputs before writing to the output file. You should perform these requests in blocks for efficiency (e.g. read 1024 samples from each file, process the samples, write 1024 samples). The APIs don't guarantee much regarding caching and buffering for efficiency.
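Putting the pieces together, a block-based loop under the same assumptions (fileB opened like fileA, outFile created with ExtAudioFileCreateWithURL, error handling omitted, and the insertion point block-aligned for simplicity) might look like this sketch:
enum { kBlockFrames = 1024 };
SInt16 bufferA[kBlockFrames * 2];   // interleaved stereo blocks
SInt16 bufferB[kBlockFrames * 2];
SInt64 framesWritten = 0;
while (1) {
    UInt32 framesA = kBlockFrames;
    AudioBufferList ablA;
    ablA.mNumberBuffers = 1;
    ablA.mBuffers[0].mNumberChannels = 2;
    ablA.mBuffers[0].mDataByteSize   = sizeof(bufferA);
    ablA.mBuffers[0].mData           = bufferA;
    ExtAudioFileRead(fileA, &framesA, &ablA);     // decode/convert one block
    if (framesA == 0) break;                      // end of the main file
    if (framesWritten + framesA > insertFrame) {
        // Past the insertion point: pull a block from the second file and
        // add it in (framesB comes back 0 once file B is exhausted).
        UInt32 framesB = framesA;
        AudioBufferList ablB = ablA;
        ablB.mBuffers[0].mDataByteSize = sizeof(bufferB);
        ablB.mBuffers[0].mData         = bufferB;
        ExtAudioFileRead(fileB, &framesB, &ablB);
        MixSInt16(bufferA, bufferB, framesB * 2); // 2 samples per frame
    }
    ExtAudioFileWrite(outFile, framesA, &ablA);   // append the block to the output
    framesWritten += framesA;
}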
How do I save the final output audio file on the device? If I save the audio file programmatically somewhere, can I play it back again?
The ExtAudioFile APIs will handle both your reading and writing needs. Yes, you can read and play the file back later.
You can do this using AVFoundation:
- (BOOL) combineVoices1
{
NSError *error = nil;
BOOL ok = NO;
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
CMTime nextClipStartTime = kCMTimeZero;
//Create AVMutableComposition Object.This object will hold our multiple AVMutableCompositionTrack.
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack setPreferredVolume:0.8];
NSString *soundOne = [[NSBundle mainBundle] pathForResource:@"test1" ofType:@"caf"];
NSURL *url = [NSURL fileURLWithPath:soundOne];
AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *clipAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *compositionAudioTrack1 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack1 setPreferredVolume:0.3];
NSString *soundOne1 = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"caf"];
NSURL *url1 = [NSURL fileURLWithPath:soundOne1];
AVAsset *avAsset1 = [AVURLAsset URLAssetWithURL:url1 options:nil];
NSArray *tracks1 = [avAsset1 tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *clipAudioTrack1 = [[avAsset1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[compositionAudioTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset1.duration) ofTrack:clipAudioTrack1 atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *compositionAudioTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack2 setPreferredVolume:1.0];
NSString *soundOne2 = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"caf"];
NSURL *url2 = [NSURL fileURLWithPath:soundOne2];
AVAsset *avAsset2 = [AVURLAsset URLAssetWithURL:url2 options:nil];
NSArray *tracks2 = [avAsset2 tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *clipAudioTrack2 = [[avAsset2 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[compositionAudioTrack2 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset2.duration) ofTrack:clipAudioTrack2 atTime:kCMTimeZero error:nil];
AVAssetExportSession *exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession) return NO;
NSString *soundOneNew = [documentsDirectory stringByAppendingPathComponent:@"combined10.m4a"];
//NSLog(#"Output file path - %#",soundOneNew);
// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:soundOneNew]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
// perform the export
[exportSession exportAsynchronouslyWithCompletionHandler:^{
if (AVAssetExportSessionStatusCompleted == exportSession.status) {
NSLog(#"AVAssetExportSessionStatusCompleted");
} else if (AVAssetExportSessionStatusFailed == exportSession.status) {
// a failure may happen because of an event out of your control
// for example, an interruption like a phone call coming in
// make sure and handle this case appropriately
NSLog(#"AVAssetExportSessionStatusFailed");
} else {
NSLog(#"Export Session Status: %d", exportSession.status);
}
}];
return YES;
}
If you are going to play multiple sounds at once, definitely use the *.caf format; Apple recommends it for playing multiple sounds at once. As for mixing them programmatically, I am assuming you just want them to play at the same time. While one sound is playing, just tell the other sound to play at whatever time you like. To set a specific time, use NSTimer (see the NSTimer Class Reference) and create a method that plays the sound when the timer fires.
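A minimal sketch of that idea (it assumes two preloaded AVAudioPlayer properties named firstPlayer and secondPlayer, and a 55-second delay just as an example; note this only plays the sounds together, it does not produce a saved mixed file):
- (void)startPlayback
{
    [self.firstPlayer play];
    // Fire once after 55 seconds (example value) to start the second sound.
    [NSTimer scheduledTimerWithTimeInterval:55.0
                                     target:self
                                   selector:@selector(playSecondSound:)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)playSecondSound:(NSTimer *)timer
{
    [self.secondPlayer play];   // overlaps with the first sound
}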
I want to write my own logs to a text file on my iPhone. I wrote a quick method that writes a string to a file. Right now it saves into the Documents directory which, on the device, is going to be a pain to get at, since I can't just browse to it. Is there a better way to quickly get this file off the device after I have written to it?
/**
* Logs a string to file
*
* @version $Revision: 0.1
*/
+ (void)logWithString:(NSString *)string {
// Create the file
NSError *error;
// Directory
NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
NSString *filePath = [documentsDirectory stringByAppendingPathComponent:@"log.txt"];
// Get the file contents
NSData *localData = [NSData dataWithContentsOfFile:filePath];
if (localData) {
NSString *logString = [[NSString alloc] initWithData:localData encoding:NSUTF8StringEncoding];
string = [logString stringByAppendingFormat:@"%@\n", string];
[logString release];
}
// Write to the file
[string writeToFile:filePath atomically:YES encoding:NSUTF8StringEncoding error:&error];
}//end
Add "Application supports iTunes file sharing" (the UIFileSharingEnabled key in Info.plist) to your application target's build info in Xcode.
Then you can easily browse, retrieve, and delete any files created by the app from iTunes, under Devices > Your device > Apps > File Sharing.
You may have to track how many logs you have created so far and create a new name for each log based on that.
For example, you could save the last log's name as a string in NSUserDefaults, read the number off the end of it, and add one to get the next name.
So if you have @"Log4", you can extract the 4, increment it to 5, and name the next log "Log5".
Just my 2 cents :P
With regard to the 'How to get the file' part of the question
iExplorer, previously iPhone Explorer, lets you view your apps, including their Documents folder, without jailbreaking your device.
In my experience (albeit with an older version), getting files off the phone can be a little temperamental (i.e. dragging a file onto my desktop sometimes created the file but didn't write any of the data), but you can get the files from your device.
Our iPad app can show Documents and save them offline when needed.
I've got a QLPreviewController subclass named DocumentViewController (named DVC from now on) for showing them.
Workflow of the app:
- The user clicks a name of a document and the DVC is pushed on to show the document.
- The DVC downloads the file offline and shows it when done.
(So the HTTP URL is downloaded, stored offline, and an offline URL is returned)
The weird thing is that only PDF files work with the offline URL; the rest crash (it works with online links though).
I did some tests, and when I put file:// before the offline link the app does not crash, but the DVC just gives me some information about the file (such as that it is an Excel 97-2004 document).
So some info is transferred, but I can't figure out what the problem is.
Here are some screenshots and after that some code.
code:
Note that Document is a model class with document properties like id, name, file type and url.
//DVC QLPreviewController dataSource method for returning url
- (id <QLPreviewItem>) previewController: (QLPreviewController *) controller previewItemAtIndex: (NSInteger)index
{
[SaveHelper saveDocumentFileAndPropertyWithDocument:document];
//[SaveHelper getDocumentFileWithDocument:document]; without file://
//if I return document.documentURL it works with all files except iWork files
return [SaveHelper getDocumentFileAsPathWithDocument:document]; //with file://
}
//SaveHelper methods
+ (NSString *)documentFilePathWithDocument:(Document *)document
{
return [[self documentFilePath] stringByAppendingPathComponent:[NSString stringWithFormat:@"%@%d.%@", DOCUMENT_FILE_PREFIX, document.documentId, document.documentType]];
}
+ (NSURL *)saveDocumentFileAndPropertyWithDocument:(Document *)document
{
if([self saveDocumentPropertyWithDocument:document])
{
return [self saveDocumentFileWithDocument:document];
}
return nil;
}
+ (NSURL *)saveDocumentFileWithDocument:(Document *)document
{
NSData *data = [NSData dataWithContentsOfURL:document.documentURL];
NSString *fullPath = [self documentFilePathWithDocument:document];
if([[NSKeyedArchiver archivedDataWithRootObject:data] writeToFile:fullPath atomically:YES])
{
return [NSURL fileURLWithPath:fullPath];
}
return nil;
}
+ (NSURL *)getDocumentFileWithDocument:(Document *)document
{
return [NSURL fileURLWithPath:[self documentFilePathWithDocument:document]];
}
+ (NSURL *)getDocumentFileAsPathWithDocument:(Document *)document
{
return [NSURL fileURLWithPath:[#"file://" stringByAppendingPathComponent:[[self getDocumentFileWithDocument:document] absoluteString]]];
}
If more code is needed, just say.
EDIT:
When logging the URL passed through the 'getDocumentFileAsPathWithDocument' method:
url: file:/var/mobile/Applications/xx-xx/Documents/documentFiles/file_20.pdf
url: file:/var/mobile/Applications/xx-xx/Documents/documentFiles/file_80.docx
The PDF file works and the docx does not.
When I try to load an image (jpg) from local storage, I get a black screen with this error message:
warning: Unable to read symbols for /Developer/Platforms/iPhoneOS.platform/DeviceSupport/4.3.5 (8L1)/Symbols/System/Library/Frameworks/QuickLook.framework/DisplayBundles/Image.qldisplay/Image (file not found).
warning: No copy of Image.qldisplay/Image found locally, reading from memory on remote device. This may slow down the debug session.
EDIT:
The web view does not work with the local URLs either. PDF is fine, but the Office files give the message "Unable to read Document, the file format is invalid". The iWork documents give the same error as the QuickLook view. I think the problem is somewhere in how I save and load the files: I save them as NSData, but after that there is no hint for the iPad to tell whether it is, for example, a Word document (only the extension).
You haven't posted your download code, but I believe the problem is there. Files from Pages (.pages extension) aren't actual files, they are bundles, i.e. directories that contain files and show as a single item in Finder (find a .pages file in Finder, right-click it and select 'Show Package Contents'). Downloading a .pages file is actually like downloading a directory: it depends on the web server what kind of result you get, but it's most likely an error page.
You could detect that it's a .pages file and try to download all of its contents manually, but you'd have to study the structure of the files to see if that's possible because it's unlikely that you can request the contents of the directory from a web server.
The results for the .ppt and .xls files look normal to me; I think it's unlikely that the iPad can preview MS Office documents at all.
Edit: apologies, I just read that iOS can preview MS Office documents. Perhaps the documents somehow get corrupted during download? Have you tried setting your download location to the app's Documents folder and enabling iTunes file sharing? That way you can download some documents, pull them off your device, and then try to open them on your PC to see if that works.
We finally found the solution!
I was right that the problem was with saving the document.
I needed to change the save method to:
+ (NSURL *)saveDocumentFileWithDocument:(Document *)document
{
NSData *data = [NSData dataWithContentsOfURL:document.documentURL options:NSDataReadingUncached error:nil];
NSString *fullPath = [self documentFilePathWithDocument:document];
if([[NSFileManager defaultManager] createFileAtPath:fullPath contents:data attributes:nil])
{
return [NSURL fileURLWithPath:fullPath];
}
//OLD CODE
// if([[NSKeyedArchiver archivedDataWithRootObject:data] writeToFile:fullPath atomically:YES])
// {
// return [NSURL fileURLWithPath:fullPath];
// }
return nil;
}
So: save the data with NSFileManager, not with NSKeyedArchiver. (NSKeyedArchiver wraps the raw NSData in a keyed-archive plist, so the saved file was no longer a valid document of its original type.)
Did you check whether the size of the files is the same both online and offline? It is possible that the file download wasn't complete.
Try using the URL of the MS Office documents with a normal NSURL object and opening it in a UIWebView. Does it work then (so we know whether it's the document or your class)?
Try using NSURL's fileURLWithPath: method in getDocumentFileAsPathWithDocument:. It is possible that the URL being returned is incorrect (it doesn't look like it from the logs, but it doesn't hurt to try).
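For that last point, the existing method prefixes "file://" by hand and then wraps the result in fileURLWithPath: again; a simpler version (a sketch of what that suggestion might look like) just builds the file URL once:
+ (NSURL *)getDocumentFileAsPathWithDocument:(Document *)document
{
    // fileURLWithPath: already produces a file:// URL, so there is no need
    // to prepend the scheme manually.
    return [NSURL fileURLWithPath:[self documentFilePathWithDocument:document]];
}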
First of all, use this code to make sure your documents are actually there, because I think the error is caused by your documents path.
NSFileManager *fileManager=[NSFileManager defaultManager];
if([fileManager fileExistsAtPath:fullPath]){
NSLog(#"%# exsit! ",fullPath);
}else{
NSLog(#"%# not exsit! ",fullPath);
}
If anyone has the same problem even after trying all the suggestions above:
(I had the same problem when I downloaded some files from Google Drive.)
Try this!
Append an 'x' to the end of your file extension so it is recognized as the newer format.
(This works only for 'doc' and 'ppt' files, not for 'xls' files.)
Yes, I know this is not an appropriate way to solve the problem, but
it's worth trying.
Believe me I tried everything!
Hope this helps.
I have a lot of WAV files stored in sqlite3, but when I retrieve one of them, I can't play it. The retrieval code is:
NSData *soundData = (NSData *)sqlite3_column_blob(statement, 0);
mPlayer = [[AVAudioPlayer alloc] initWithData:soundData error:&error];
The data is stored as binary and it's there when I search for it using sqlite3.
Sorry. Never mind. I just compressed the data more and it works fine now. Seems the number of files is not as important as their size after all.
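For reference, sqlite3_column_blob returns a raw pointer rather than an NSData object, so the usual pattern for wrapping a blob column is to copy the bytes out with dataWithBytes:length: (a minimal sketch, using the same statement and player names as the question):
// Copy the blob bytes out of the statement into an NSData object.
const void *bytes = sqlite3_column_blob(statement, 0);
int length = sqlite3_column_bytes(statement, 0);
NSData *soundData = [NSData dataWithBytes:bytes length:length];

NSError *error = nil;
mPlayer = [[AVAudioPlayer alloc] initWithData:soundData error:&error];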