MPMediaItem raw song data - iPhone

I was wondering how to access an MPMediaItem's raw data.
Any ideas?

You can obtain the media item's data this way:
-(void)mediaItemToData
{
// Implement in your project the media item picker
MPMediaItem *curItem = musicPlayer.nowPlayingItem;
NSURL *url = [curItem valueForProperty: MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL: url options:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset: songAsset
presetName: AVAssetExportPresetPassthrough];
exporter.outputFileType = @"public.mpeg-4";
NSString *exportFile = [[self myDocumentsDirectory] stringByAppendingPathComponent:
@"exported.mp4"];
NSURL *exportURL = [[NSURL fileURLWithPath:exportFile] retain];
exporter.outputURL = exportURL;
// do the export
// (completion handler block omitted)
[exporter exportAsynchronouslyWithCompletionHandler:
^{
NSData *data = [NSData dataWithContentsOfFile: [[self myDocumentsDirectory]
stringByAppendingPathComponent: @"exported.mp4"]];
// Do something with the data
}];
}
This code will only work on iOS 4.0 and later.
Good luck!

Of course you can access the data of a MPMediaItem. It's not crystal clear at once but it works. Here's how:
Get the media item's URL from its MPMediaItemPropertyAssetURL property
Initialize an AVURLAsset with this URL
Initialize an AVAssetReader with this asset
Fetch the AVAssetTrack you want to read from the AVURLAsset
Create an AVAssetReaderTrackOutput with this track
Add this output to the AVAssetReader created before and -startReading
Fetch all data with AVAssetReaderTrackOutput's -copyNextSampleBuffer
PROFIT!
Here is some sample code from a project of mine (this is not a code jewel of mine; I wrote it some time back in my coding dark ages):
typedef enum {
kEDSupportedMediaTypeAAC = 'aac ',
kEDSupportedMediaTypeMP3 = '.mp3'
} EDSupportedMediaType;
- (EDLibraryAssetReaderStatus)prepareAsset {
// Get the AVURLAsset
AVURLAsset *uasset = [m_asset URLAsset];
// Check for DRM protected content
if (uasset.hasProtectedContent) {
return kEDLibraryAssetReader_TrackIsDRMProtected;
}
if ([[uasset tracks] count] == 0) {
DDLogError(@"no asset tracks found");
return AVAssetReaderStatusFailed;
}
// Initialize a reader with a track output
NSError *err = nil;
m_reader = [[AVAssetReader alloc] initWithAsset:uasset error:&err];
if (!m_reader || err) {
DDLogError(@"could not create asset reader (%i)\n", (int)[err code]);
return AVAssetReaderStatusFailed;
}
// Check tracks for a valid format. Currently we only support MP3 and AAC types; WAV and AIFF are too large to handle
for (AVAssetTrack *track in uasset.tracks) {
NSArray *formats = track.formatDescriptions;
for (int i=0; i<[formats count]; i++) {
CMFormatDescriptionRef format = (CMFormatDescriptionRef)[formats objectAtIndex:i];
// Check the format types
CMMediaType mediaType = CMFormatDescriptionGetMediaType(format);
FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format);
DDLogVerbose(@"mediaType: %s, mediaSubType: %s", COFcc(mediaType), COFcc(mediaSubType));
if (mediaType == kCMMediaType_Audio) {
if (mediaSubType == kEDSupportedMediaTypeAAC ||
mediaSubType == kEDSupportedMediaTypeMP3) {
m_track = [track retain];
m_format = CFRetain(format);
break;
}
}
}
if (m_track != nil && m_format != NULL) {
break;
}
}
if (m_track == nil || m_format == NULL) {
return kEDLibraryAssetReader_UnsupportedFormat;
}
// Create an output for the found track
m_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:m_track outputSettings:nil];
[m_reader addOutput:m_output];
// Start reading
if (![m_reader startReading]) {
DDLogError(#"could not start reading asset");
return kEDLibraryAssetReader_CouldNotStartReading;
}
return 0;
}
- (OSStatus)copyNextSampleBufferRepresentation:(CMSampleBufferRepresentationRef *)repOut {
pthread_mutex_lock(&m_mtx);
OSStatus err = noErr;
AVAssetReaderStatus status = m_reader.status;
if (m_invalid) {
pthread_mutex_unlock(&m_mtx);
return kEDLibraryAssetReader_Invalidated;
}
else if (status != AVAssetReaderStatusReading) {
pthread_mutex_unlock(&m_mtx);
return kEDLibraryAssetReader_NoMoreSampleBuffers;
}
// Read the next sample buffer
CMSampleBufferRef sbuf = [m_output copyNextSampleBuffer];
if (sbuf == NULL) {
pthread_mutex_unlock(&m_mtx);
return kEDLibraryAssetReader_NoMoreSampleBuffers;
}
CMSampleBufferRepresentationRef srep = CMSampleBufferRepresentationCreateWithSampleBuffer(sbuf);
if (srep && repOut != NULL) {
*repOut = srep;
}
else {
DDLogError(#"CMSampleBufferRef corrupted");
EDCFShow(sbuf);
err = kEDLibraryAssetReader_BufferCorrupted;
}
CFRelease(sbuf);
pthread_mutex_unlock(&m_mtx);
return err;
}

You can't, and there is no workaround. An MPMediaItem is not the actual piece of media; it is just metadata about the media item, communicated to the application via RPC from another process. The data for the item itself is not accessible in your address space.
I should note that even if you have the MPMediaItem, its data probably is not loaded into the device's memory. The flash on the iPhone is slow and memory is scarce. While Apple may not want you to have access to the raw data backing an MPMediaItem, it is just as likely that they simply didn't want to invest the time necessary to build such an API. If they did provide access to such a thing, it almost certainly would not be as an NSData, but more likely as an NSURL given to your application that would allow it to open the file and stream through the data.
In any event, if you want the functionality, you should file a bug report asking for it.
Also, as a side note, don't mention your age in a bug report you send to Apple. I think it is very cool you are writing apps for the phone; when I was your age I loved experimenting with computers (back then I was working on things written in Lisp). The thing is, you cannot legally agree to a contract in the United States, which is why the developer agreement specifically prohibits you from joining. From the first paragraph of the agreement:
You also certify that you are of the legal age of majority in the jurisdiction in which you reside (at least 18 years of age in many countries) and you represent that you are legally permitted to become a Registered iPhone Developer.
If you mention to a WWDR representative that you are not of age of majority they may realize you are in violation of the agreement and be obligated to terminate your developer account. Just a friendly warning.

Related

Opening iTunes Store to specific song

Ok, I have seen similar questions here but none are actually answering the problem for me.
I have a streaming audio app and the stream source returns to me the song title and artist name. I have an iTunes button in the app, and want to open the iTunes STORE (search) to that exact song, or at least close. I have tried the following:
NSString *baseString = @"itms://phobos.apple.com/WebObjects/MZSearch.woa/wa/advancedSearchResults?songTerm=";
NSString *str1 = [self.songTitle2 stringByReplacingOccurrencesOfString:@" " withString:@"+"];
NSString *str2 = [self.artist2 stringByReplacingOccurrencesOfString:@" " withString:@"+"];
NSString *str = [NSString stringWithFormat:@"%@%@&artistTerm=%@", baseString, str1, str2];
[[UIApplication sharedApplication] openURL: [NSURL URLWithString:str]];
This call does indeed switch me to the iTunes Store as expected, but then it pops up an error "Cannot connect to iTunes Store". I am obviously online, as the song is actively streaming, and I am in the store. The search box in the iTunes app only shows the song name and nothing else.
Here is an example of a generated string:
itms://phobos.apple.com/WebObjects/MZSearch.woa/wa/advancedSearchResults?artistTerm=Veruca+Salt&artistTerm=Volcano+Girls
I have tried taking the string it generates and pasting it into Safari, and it works OK on my Mac, opening to albums from the artist in the store. Why not on the phone?
Also, it seems to ignore both items, as it does not take me to the song by that artist. Does this also require knowing the album name (which I do not have at this time)?
Help would be appreciated. Thanks.
Yes, I am answering my own question.
After much digging and a talk with one of the best programmers I know, we have a solution, so I thought I would share it here. This solution takes the song name and artist, actually makes a call to the Link Maker API, gets back a JSON document, and extracts the necessary info to create a link to the iTunes Store, opening the store to the song in an album by that artist that contains the song.
In the interface of the view controller, add:
@property (strong, readonly, nonatomic) NSOperationQueue* operationQueue;
@property (nonatomic) BOOL searching;
In the implementation:
@synthesize operationQueue = _operationQueue;
@synthesize searching = _searching;
Here are the methods and code that will do this for you:
// start an operation Queue if not started
-(NSOperationQueue*)operationQueue
{
if(_operationQueue == nil) {
_operationQueue = [NSOperationQueue new];
}
return _operationQueue;
}
// change searching state, and modify button and wait indicator (if you wish)
- (void)setSearching:(BOOL)searching
{
// this changes the view of the search button to a wait indicator while the search is performed
// In this case
_searching = searching;
dispatch_async(dispatch_get_main_queue(), ^{
if(searching) {
self.searchButton.enabled = NO;
[self.searchButton setTitle:#"" forState:UIControlStateNormal];
[self.activityIndicator startAnimating];
} else {
self.searchButton.enabled = YES;
[self.searchButton setTitle:#"Search" forState:UIControlStateNormal];
[self.activityIndicator stopAnimating];
}
});
}
// based on info from the iTunes affiliates docs
// http://www.apple.com/itunes/affiliates/resources/documentation/itunes-store-web-service-search-api.html
// this assumes a search button to start the search.
- (IBAction)searchButtonTapped:(id)sender {
NSString* artistTerm = self.artistField.text; //the artist text.
NSString* songTerm = self.songField.text; //the song text
// they both need to be non-zero for this to work right.
if(artistTerm.length > 0 && songTerm.length > 0) {
// this creates the base of the Link Maker url call.
NSString* baseURLString = @"https://itunes.apple.com/search";
NSString* searchTerm = [NSString stringWithFormat:@"%@ %@", artistTerm, songTerm];
NSString* searchUrlString = [NSString stringWithFormat:@"%@?media=music&entity=song&term=%@&artistTerm=%@&songTerm=%@", baseURLString, searchTerm, artistTerm, songTerm];
// must change spaces to +
searchUrlString = [searchUrlString stringByReplacingOccurrencesOfString:@" " withString:@"+"];
//make it a URL
searchUrlString = [searchUrlString stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSURL* searchUrl = [NSURL URLWithString:searchUrlString];
NSLog(#"searchUrl: %#", searchUrl);
// start the Link Maker search
NSURLRequest* request = [NSURLRequest requestWithURL:searchUrl];
self.searching = YES;
[NSURLConnection sendAsynchronousRequest:request queue:self.operationQueue completionHandler:^(NSURLResponse* response, NSData* data, NSError* error) {
// we got an answer, now find the data.
self.searching = NO;
if(error != nil) {
NSLog(#"Error: %#", error);
} else {
NSError* jsonError = nil;
NSDictionary* dict = [NSJSONSerialization JSONObjectWithData:data options:0 error:&jsonError];
if(jsonError != nil) {
// do something with the error here
NSLog(#"JSON Error: %#", jsonError);
} else {
NSArray* resultsArray = dict[@"results"];
// it is possible to get no results. Handle that here
if(resultsArray.count == 0) {
NSLog(#"No results returned.");
} else {
// extract the needed info to pass to the iTunes store search
NSDictionary* trackDict = resultsArray[0];
NSString* trackViewUrlString = trackDict[@"trackViewUrl"];
if(trackViewUrlString.length == 0) {
NSLog(#"No trackViewUrl");
} else {
NSURL* trackViewUrl = [NSURL URLWithString:trackViewUrlString];
NSLog(#"trackViewURL:%#", trackViewUrl);
// dispatch the call to switch to the iTunes store with the proper search url
dispatch_async(dispatch_get_main_queue(), ^{
[[UIApplication sharedApplication] openURL:trackViewUrl];
});
}
}
}
}
}];
}
}
The JSON that comes back has a LOT of other good info you could extract here as well, including three sizes of album art, album name, cost, etc.
I hope this helps someone else out. This stumped me for quite some time, and I thank a good friend of mine for making this work.
You are in fact using a search URL; that's why iTunes opens on search. My iTunes on Mac OS X also opens on search.
Use the iTunes Search API to search for the content you want and get the artist, album, or song IDs so you can generate a direct URL for that content.
Look at the iTunes Link Maker to see how to create a URL for an artist or for a specific album, and compose that URL in your app.
It appears that iOS now opens the iTunes app directly when you try to open an iTunes HTML URL.
For example, calling openURL on https://itunes.apple.com/br/album/falando-de-amor/id985523754 opens the iTunes app instead of the website.

How to create a folder on Google Drive using the Google Drive SDK for iPhone?

I am using the Google Drive SDK for iPhone and trying to upload an audio file into a "TestAudio" folder. If the "TestAudio" folder has not been created on Google Drive, it should be created first, and the audio should then be stored in that folder only. Everything is working great except folder creation. Can anybody please help?
I am using the code below to upload the audio file.
GTLUploadParameters *uploadParameters = nil;
NSString *soundFilePath = [[NSBundle mainBundle]
pathForResource:@"honey_bunny_new"
ofType:@"mp3"];
if (soundFilePath) {
NSData *fileContent = [[NSData alloc] initWithContentsOfFile:soundFilePath];
uploadParameters = [GTLUploadParameters uploadParametersWithData:fileContent MIMEType:@"audio/mpeg"];
}
self.driveFile.title = self.updatedTitle;
GTLQueryDrive *query = nil;
if (self.driveFile.identifier == nil || self.driveFile.identifier.length == 0) {
// This is a new file, instantiate an insert query.
query = [GTLQueryDrive queryForFilesInsertWithObject:self.driveFile
uploadParameters:uploadParameters];
} else {
// This file already exists, instantiate an update query.
query = [GTLQueryDrive queryForFilesUpdateWithObject:self.driveFile
fileId:self.driveFile.identifier
uploadParameters:uploadParameters];
}
UIAlertView *alert = [DrEditUtilities showLoadingMessageWithTitle:@"Saving file"
delegate:self];
[self.driveService executeQuery:query completionHandler:^(GTLServiceTicket *ticket,
GTLDriveFile *updatedFile,
NSError *error) {
[alert dismissWithClickedButtonIndex:0 animated:YES];
if (error == nil) {
self.driveFile = updatedFile;
self.originalContent = [self.textView.text copy];
self.updatedTitle = [updatedFile.title copy];
[self toggleSaveButton];
[self.delegate didUpdateFileWithIndex:self.fileIndex
driveFile:self.driveFile];
[self doneEditing:nil];
} else {
NSLog(#"An error occurred: %#", error);
[DrEditUtilities showErrorMessageWithTitle:#"Unable to save file"
message:error.description
delegate:self];
}
}];
I don't see your code to create a folder, but I was having the same problem with folder creation myself. As you know, the mimeType must be "application/vnd.google-apps.folder". I ran into assert failures if the NSData parameter to uploadParametersWithData was nil. So I tried a zero length NSData object and that failed. Using a 1 byte NSData object also failed. The trick is to call queryForFilesUpdateWithObject with uploadParameters:nil. Then the folder creation works fine. I also discovered that the Objective-C code shown at the end of:
https://developers.google.com/drive/v2/reference/files/insert
is incorrect. The file.parents should be as follows:
GTLDriveParentReference *parentRef = [GTLDriveParentReference object];
parentRef.identifier = parentID;
if (parentID.length > 0) file.parents = [NSArray arrayWithObjects:parentRef, nil];
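For completeness, here is a minimal sketch (not from the original answer) of the folder-creation step itself. It uses the insert query from the question's own code with nil upload parameters, as described above; self.driveService and the completion-handler details are assumptions:
// Sketch only: create a "TestAudio" folder (Drive API v2 / GTLDrive).
GTLDriveFile *folder = [GTLDriveFile object];
folder.title = @"TestAudio";
folder.mimeType = @"application/vnd.google-apps.folder";

// A folder has no content to upload, so the upload parameters are nil.
GTLQueryDrive *folderQuery = [GTLQueryDrive queryForFilesInsertWithObject:folder
                                                         uploadParameters:nil];

[self.driveService executeQuery:folderQuery
              completionHandler:^(GTLServiceTicket *ticket,
                                  GTLDriveFile *createdFolder,
                                  NSError *error) {
    if (error == nil) {
        // Use createdFolder.identifier as the parent reference when
        // uploading the audio file into this folder.
    } else {
        NSLog(@"Folder creation failed: %@", error);
    }
}];
Once the folder exists, its identifier can go into the parents array shown above when uploading the audio file.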

loadValuesAsynchronouslyForKeys and multiple values loading

I would like to asynchronously load the duration, time (timestamp the video was created) and locale of an Asset.
All of the sample code shown by Apple for the usage of loadValuesAsynchronouslyForKeys: always shows only one key, i.e.:
NSURL *url = aUrl;
AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = [NSArray arrayWithObject:@"duration"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
NSError *error = nil;
AVKeyValueStatus durationStatus = [asset statusOfValueForKey:@"duration" error:&error];
switch (durationStatus) {
case AVKeyValueStatusLoaded: {
// Read duration from asset
CMTime assetDurationInCMTime = [asset duration];
break;
}
case AVKeyValueStatusFailed:
// Report error
break;
case AVKeyValueStatusCancelled:
// Do whatever is appropriate for cancelation
}
}];
Can I assume that if one item's status is 'AVKeyValueStatusLoaded', the other values can be read at the same time in the completion block? i.e.:
[asset tracks]
[asset commonMetadata];
[asset duration]
No, you can't assume that. One of my methods looks at two keys, playable and duration, and I have found that playable is often available while duration isn't yet. I have therefore moved the loadValuesAsynchronouslyForKeys: code into a separate method, shouldSave:. The shouldSave: method I call from a timer in a method called saveWithDuration:. Once saveWithDuration: receives a non-zero duration, it goes ahead and saves stuff. To avoid waiting too long, I use an attempt counter for now; in the future I'll refine this (you'll notice that the error instance isn't really used at the moment).
- (void)shouldSave:(NSTimer*)theTimer {
NSString * const TBDuration = @"duration";
NSString * const TBPlayable = @"playable";
__block float theDuration = TBZeroDuration;
__block NSError *error = nil;
NSArray *assetKeys = [NSArray arrayWithObjects:TBDuration, TBPlayable, nil];
[_audioAsset loadValuesAsynchronouslyForKeys:assetKeys completionHandler:^() {
AVKeyValueStatus playableStatus = [_audioAsset statusOfValueForKey:TBPlayable error:&error];
switch (playableStatus) {
case AVKeyValueStatusLoaded:
//go on
break;
default:
return;
}
AVKeyValueStatus durationStatus = [_audioAsset statusOfValueForKey:TBDuration error:&error];
switch (durationStatus) {
case AVKeyValueStatusLoaded:
theDuration = CMTimeGetSeconds(_audioAsset.duration);
break;
default:
return;
}
}];
NSUInteger attempt = [[[theTimer userInfo] objectForKey:TBAttemptKey] integerValue];
attempt++;
[self saveWithDuration:theDuration attempt:attempt error:&error];
}
Technically you can't. The docs for loadValuesAsynchronouslyForKeys:completionHandler: say that
The completion states of the keys you specify in keys are not necessarily the same—some may be loaded, and others may have failed. You must check the status of each key individually.
In practice, I think this is often a safe assumption -- as you've noted, Apple's StitchedStreamPlayer sample project just looks at the status of the first key.
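For illustration, here is a minimal sketch (not from the StitchedStreamPlayer sample) of checking each key individually in one completion handler, assuming an AVURLAsset named asset as in the question:
NSArray *keys = [NSArray arrayWithObjects:@"duration", @"tracks", @"commonMetadata", nil];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    // Each key can end up in a different state; check them one by one.
    for (NSString *key in keys) {
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:key error:&error];
        switch (status) {
            case AVKeyValueStatusLoaded:
                // Only this particular value is safe to read now,
                // e.g. [asset duration] once @"duration" is loaded.
                break;
            case AVKeyValueStatusFailed:
                NSLog(@"Loading %@ failed: %@", key, error);
                break;
            default:
                // Cancelled, loading, or unknown; don't read the value yet.
                break;
        }
    }
}];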
No, you cannot assume so. I usually rely on the @"duration" key to create an AVPlayerItem and start playback, since loading of @"playable" generally doesn't guarantee that the asset is ready yet. Then I spawn a timer to check periodically whether other keys such as @"tracks" are loaded or not, similar to what Elise van Looij has mentioned above.
Also, as a side note, do remember that the completionHandler in loadValuesAsynchronouslyForKeys: is called on an arbitrary background thread. You will have to dispatch to the main thread if you are dealing with UI or AVPlayerLayer.
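A minimal sketch of that last point, assuming the asset and a playerLayer property are already set up (names are illustrative):
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    // This block runs on an arbitrary background queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        // Touch UIKit / AVPlayerLayer only on the main thread.
        AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
        self.playerLayer.player = [AVPlayer playerWithPlayerItem:item];
    });
}];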

Writing metadata to ALAsset

I am developing a video app for iPhone. I am recording a video and saving it to iPhone Camera Roll using AssetsLibrary framework. The API that I have used is:
- (void)writeVideoAtPathToSavedPhotosAlbum:(NSURL *)videoPathURL
completionBlock:(ALAssetsLibraryWriteVideoCompletionBlock)completionBlock
Is there any way to save custom metadata for the video to the Camera Roll using ALAsset? If this is not possible using the AssetsLibrary framework, can this be done using some other method? Basically I am interested in writing details about my app as part of the video metadata.
Since iOS 4 there is the AVFoundation framework, which also lets you read/write metadata from/to video files. There are only specific keys that you can use to add metadata with this option, but I don't believe that will be a problem.
Here's a small example that you can use to add a title to your videos (however, in this example all older metadata is removed):
// prepare metadata (add title "title")
NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *mi = [AVMutableMetadataItem metadataItem];
mi.key = AVMetadataCommonKeyTitle;
mi.keySpace = AVMetadataKeySpaceCommon;
mi.value = @"title";
[metadata addObject:mi];
// prepare video asset (SOME_URL can be an ALAsset url)
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:SOME_URL options:nil];
// prepare to export, without transcoding if possible
AVAssetExportSession *_videoExportSession = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPresetPassthrough];
[videoAsset release];
_videoExportSession.outputURL = [NSURL fileURLWithPath:_outputPath];
_videoExportSession.outputFileType = AVFileTypeQuickTimeMovie;
_videoExportSession.metadata = metadata;
[_videoExportSession exportAsynchronouslyWithCompletionHandler:^{
switch ([_videoExportSession status]) {
case AVAssetExportSessionStatusFailed:
NSLog(@"Export failed: %@", [[_videoExportSession error] localizedDescription]);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(@"Export canceled");
break;
default:
break;
}
[_videoExportSession release]; _videoExportSession = nil;
[self finishExport]; //in finishExport you can for example call writeVideoAtPathToSavedPhotosAlbum:completionBlock: to save the video from _videoExportSession.outputURL
}];
This also shows some examples: avmetadataeditor
There is no officially supported way to do this.
What you can do is store the info you want to save in a separate database. The downside, however, is that such information is then only available in your app.
What exactly are you trying to accomplish?
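As a minimal sketch of that "separate database" idea, you could key your own info by the asset's URL. NSUserDefaults is used here only for brevity (a real app might prefer Core Data or SQLite), and all names are illustrative; the stored info must contain property-list types:
- (void)saveCustomInfo:(NSDictionary *)info forAssetURL:(NSURL *)assetURL {
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    NSMutableDictionary *store = [[defaults dictionaryForKey:@"CustomAssetMetadata"] mutableCopy];
    if (store == nil) {
        store = [NSMutableDictionary dictionary];
    }
    [store setObject:info forKey:[assetURL absoluteString]];
    [defaults setObject:store forKey:@"CustomAssetMetadata"];
    [defaults synchronize];
}

- (NSDictionary *)customInfoForAssetURL:(NSURL *)assetURL {
    NSDictionary *store = [[NSUserDefaults standardUserDefaults] dictionaryForKey:@"CustomAssetMetadata"];
    return [store objectForKey:[assetURL absoluteString]];
}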
You can also set the metadata on the videoWriter, with something like:
NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *mi = [AVMutableMetadataItem metadataItem];
mi.key = AVMetadataCommonKeyTitle;
mi.keySpace = AVMetadataKeySpaceCommon;
mi.value = @"title";
[metadata addObject:mi];
videoWriter.metadata = metadata;
where videoWriter is of type AVAssetWriter, and then when you stop recording you call:
[videoWriter endSessionAtSourceTime:CMTimeMake(durationInMs, 1000)];
[videoWriter finishWritingWithCompletionHandler:^() {
ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
[assetsLib writeVideoAtPathToSavedPhotosAlbum:videoUrl
completionBlock:^(NSURL* assetURL, NSError* error) {
if (error != nil) {
NSLog( #"Video not saved");
}
}];
}];

AVAudioRecorder / AVAudioPlayer - append recording to file

Is there any way to record onto the end of an audio file? We can't just pause the recording instead of stopping it, because the user needs to be able to come back to the app later and add more audio to their recording. Currently, the audio is stored in Core Data as NSData. NSData's appendData: does not work, because the resulting audio file still reports that it is only as long as the original data.
Another possibility would be taking the original audio file, along with the new one, and concatenate them into one audio file, if there's any way to do that.
This can be done fairly easily using AVMutableCompositionTrack's insertTimeRange:ofTrack:atTime:error:. The code is somewhat lengthy, but it's really just 4 steps:
// Create a new audio track we can append to
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack* appendedAudioTrack =
[composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
// Grab the two audio tracks that need to be appended
AVURLAsset* originalAsset = [[AVURLAsset alloc]
initWithURL:[NSURL fileURLWithPath:originalAudioPath] options:nil];
AVURLAsset* newAsset = [[AVURLAsset alloc]
initWithURL:[NSURL fileURLWithPath:newAudioPath] options:nil];
NSError* error = nil;
// Grab the first audio track and insert it into our appendedAudioTrack
NSArray *originalTracks = [originalAsset tracksWithMediaType:AVMediaTypeAudio];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, originalAsset.duration);
[appendedAudioTrack insertTimeRange:timeRange
ofTrack:[originalTracks objectAtIndex:0]
atTime:kCMTimeZero
error:&error];
if (error)
{
// do something
return;
}
// Grab the second audio track and insert it at the end of the first one
NSArray *newTracks = [newAsset tracksWithMediaType:AVMediaTypeAudio];
timeRange = CMTimeRangeMake(kCMTimeZero, newAsset.duration);
[appendedAudioTrack insertTimeRange:timeRange
ofTrack:[newTracks objectAtIndex:0]
atTime:originalAsset.duration
error:&error];
if (error)
{
// do something
return;
}
// Create a new audio file using the appendedAudioTrack
AVAssetExportSession* exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (!exportSession)
{
// do something
return;
}
NSString* appendedAudioPath = @""; // make sure to fill this value in
exportSession.outputURL = [NSURL fileURLWithPath:appendedAudioPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
// exported successfully?
switch (exportSession.status)
{
case AVAssetExportSessionStatusFailed:
break;
case AVAssetExportSessionStatusCompleted:
// you should now have the appended audio file
break;
case AVAssetExportSessionStatusWaiting:
break;
default:
break;
}
NSError* error = nil;
}];
You can append two audio files by creating an AVMutableCompositionTrack, adding the two files, and exporting the composition using the exportAsynchronouslyWithCompletionHandler: method of AVAssetExportSession.
Please refer below links for more details.
AVAssetExportSession Class Reference
Creating New Assets
Hope this helps to solve your issue.
I don't have a complete code example but the Extended Audio File Services can help you concatenate two audio files. Search for Extended Audio File Services in Xcode or visit the link below.
Apple documentation
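In that spirit, here is a rough, untested sketch of concatenating two audio files with Extended Audio File Services. The helper name, the PCM client format, and the CAF output container are all assumptions, and ARC is assumed for the __bridge casts:
#import <AudioToolbox/AudioToolbox.h>

// Hypothetical helper: write the audio of firstURL followed by secondURL into outputURL.
static OSStatus ConcatenateAudioFiles(NSURL *firstURL, NSURL *secondURL, NSURL *outputURL) {
    // Common client format: 16-bit native-endian PCM, 44.1 kHz, mono.
    AudioStreamBasicDescription fmt = {0};
    fmt.mSampleRate       = 44100.0;
    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mFormatFlags      = kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    fmt.mChannelsPerFrame = 1;
    fmt.mBitsPerChannel   = 16;
    fmt.mBytesPerFrame    = 2;
    fmt.mBytesPerPacket   = 2;
    fmt.mFramesPerPacket  = 1;

    // Create the output file (CAF accepts plain PCM without fuss).
    ExtAudioFileRef output = NULL;
    OSStatus err = ExtAudioFileCreateWithURL((__bridge CFURLRef)outputURL, kAudioFileCAFType,
                                             &fmt, NULL, kAudioFileFlags_EraseFile, &output);
    if (err != noErr) return err;
    ExtAudioFileSetProperty(output, kExtAudioFileProperty_ClientDataFormat, sizeof(fmt), &fmt);

    for (NSURL *inputURL in @[ firstURL, secondURL ]) {
        ExtAudioFileRef input = NULL;
        err = ExtAudioFileOpenURL((__bridge CFURLRef)inputURL, &input);
        if (err != noErr) break;
        // Have the input decoded/converted to our client format while reading.
        ExtAudioFileSetProperty(input, kExtAudioFileProperty_ClientDataFormat, sizeof(fmt), &fmt);

        SInt16 samples[4096];
        while (1) {
            AudioBufferList bufferList;
            bufferList.mNumberBuffers = 1;
            bufferList.mBuffers[0].mNumberChannels = 1;
            bufferList.mBuffers[0].mDataByteSize = sizeof(samples);
            bufferList.mBuffers[0].mData = samples;
            UInt32 frames = sizeof(samples) / sizeof(SInt16);
            err = ExtAudioFileRead(input, &frames, &bufferList);
            if (err != noErr || frames == 0) break;   // error or end of file
            err = ExtAudioFileWrite(output, frames, &bufferList);
            if (err != noErr) break;
        }
        ExtAudioFileDispose(input);
        if (err != noErr) break;
    }
    ExtAudioFileDispose(output);
    return err;
}
Because both inputs are read through the same client format, the built-in converter handles decoding, so the two source files don't have to share a format.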
We had the same requirements for our app as the OP described, and ran into the same issues (i.e., the recording has to be stopped, instead of paused, if the user wants to listen to what she has recorded up to that point). Our app (project's Github repo) uses AVQueuePlayer for playback and a method similar to kermitology's answer to concatenate the partial recordings, with some notable differences:
implemented in Swift
concatenates multiple recordings into one
no messing with tracks
The rationale behind the last item is that simple recordings with AVAudioRecorder will have one track, and the main reason for this whole workaround is to concatenate those single tracks in the assets (see Addendum 3). So why not use AVMutableComposition's insertTimeRange method instead, which takes an AVAsset instead of an AVAssetTrack?
Relevant parts: (full code)
import UIKit
import AVFoundation
class RecordViewController: UIViewController {
/* App allows volunteers to record newspaper articles for the
blind and print-impaired, hence the name.
*/
var articleChunks = [AVURLAsset]()
func concatChunks() {
let composition = AVMutableComposition()
/* `CMTimeRange` to store total duration and know when to
insert subsequent assets.
*/
var insertAt = CMTimeRange(start: kCMTimeZero, end: kCMTimeZero)
repeat {
let asset = self.articleChunks.removeFirst()
let assetTimeRange =
CMTimeRange(start: kCMTimeZero, end: asset.duration)
do {
try composition.insertTimeRange(assetTimeRange,
of: asset,
at: insertAt.end)
} catch {
NSLog("Unable to compose asset track.")
}
let nextDuration = insertAt.duration + assetTimeRange.duration
insertAt = CMTimeRange(start: kCMTimeZero, duration: nextDuration)
} while self.articleChunks.count != 0
let exportSession =
AVAssetExportSession(
asset: composition,
presetName: AVAssetExportPresetAppleM4A)
exportSession?.outputFileType = AVFileType.m4a
exportSession?.outputURL = /* create URL for output */
// exportSession?.metadata = ...
exportSession?.exportAsynchronously {
switch exportSession?.status {
case .unknown?: break
case .waiting?: break
case .exporting?: break
case .completed?: break
case .failed?: break
case .cancelled?: break
case .none: break
}
}
/* Clean up (delete partial recordings, etc.) */
}
This diagram helped me get my head around what expects what and what inherits from where. (NSObject is implicitly implied as the superclass where there is no inheritance arrow.)
Addendum 1: I had my reservations regarding the switch part instead of using KVO on AVAssetExportSessionStatus, but the docs are clear that exportAsynchronously's callback block "is invoked when writing is complete or in the event of writing failure".
Addendum 2: Just in case if someone has issues with AVQueuePlayer: 'An AVPlayerItem cannot be associated with more than one instance of AVPlayer'
Addendum 3: Unless you are recording in stereo, but mobile devices have one input as far as I know. Also, using fancy audio mixing would also require the use of AVCompositionTrack. A good SO thread: Proper AVAudioRecorder Settings for Recording Voice?