Is it possible to pick media items using MPMediaPickerController and then load them into an AVAudioPlayer object?
If MPMusicPlayerController doesn't meet your needs, you can copy the audio into your app's own directory (the bundle itself is read-only) so you can use AVAudioPlayer.
EDIT
You basically have three options for playing audio from the user's iPod library: MPMusicPlayerController, AVPlayer, and AVAudioPlayer.
Here are examples for MPMusicPlayerController and AVPlayer:
- (void) mediaPicker: (MPMediaPickerController *) mediaPicker
   didPickMediaItems: (MPMediaItemCollection *) collection
{
    MPMediaItem *item = [[collection items] objectAtIndex:0];
    NSURL *url = [item valueForProperty:MPMediaItemPropertyAssetURL];
    [self dismissModalViewControllerAnimated:YES];

    // Play the item using MPMusicPlayerController
    MPMusicPlayerController *appMusicPlayer = [MPMusicPlayerController applicationMusicPlayer];
    [appMusicPlayer setQueueWithItemCollection:collection];
    [appMusicPlayer play];

    // Or play the item using AVPlayer
    AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:url];
    AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
    [player play];
}
If you need to use AVAudioPlayer for some reason, or you need access to the audio file's actual audio data, you have to first copy the audio file to your app's directory and then work with it there. The AVAsset + AVPlayer stuff is the closest analogy to ALAsset if you're used to working with photos and videos.
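For illustration, here is a minimal sketch of that copy step using AVAssetExportSession, assuming ARC, a DRM-free item, a non-nil assetURL (url) from the picker, and a strong avAudioPlayer property; the output name exported.m4a is just a placeholder:

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                presetName:AVAssetExportPresetAppleM4A];
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
export.outputURL = [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:@"exported.m4a"]];
export.outputFileType = AVFileTypeAppleM4A;
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        // The audio is now a local file, so AVAudioPlayer can open it.
        // (Keep the player in a strong property; a local variable would be
        // deallocated under ARC before playback finishes.)
        NSError *playError = nil;
        self.avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:export.outputURL
                                                                    error:&playError];
        [self.avAudioPlayer play];
    }
}];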
Just wanted to say that it appears that in iOS 6 and 7, AVAudioPlayer can play file URLs directly from the iPod without having to copy the audio data into your app directory as Art suggested.
- (void) mediaPicker: (MPMediaPickerController *) mediaPicker didPickMediaItems: (MPMediaItemCollection *) collection
{
    MPMediaItem *item = [[collection items] objectAtIndex:0];
    NSURL *url = [item valueForProperty:MPMediaItemPropertyAssetURL];

    // Play the item using AVAudioPlayer
    self.avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    [self.avAudioPlayer play];
}
I wanted to add (and I can't comment on SO yet) that it seems like in iOS 8, and perhaps before, songs stored in iCloud have no value for the assetURL property: [npItem valueForProperty:MPMediaItemPropertyAssetURL] returns nil for them.
Here is a sample output:
MPMediaItem *npItem = self.musicPlayerController.nowPlayingItem;
NSLog(@"Song Title: %@\n assetURL: %@\n Cloud Item: %d", npItem.title, [npItem valueForProperty:MPMediaItemPropertyAssetURL], npItem.cloudItem);
// Log Output
Song Title: Time We Had
assetURL: (null)
Cloud Item: 1
Song Title: The Quiet
assetURL: ipod-library://item/item.m4a?id=1529654720874100371
Cloud Item: 0
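So before handing a now-playing item to a player, it's worth guarding against a missing URL. A minimal check:

NSURL *assetURL = [npItem valueForProperty:MPMediaItemPropertyAssetURL];
if (assetURL == nil) {
    // iCloud (or otherwise unavailable) item: there is nothing local to play.
    NSLog(@"No assetURL for %@; skipping playback", npItem.title);
    return;
}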
I think you'll find that [NSBundle mainBundle] is READ-ONLY. It's not possible to write files into the app bundle, so copying into the bundle won't make AVAudioPlayer work for iPod-library MPMediaItems!
Related
How can I play more than one .m4a file consecutively?
Presently, my function starts one playing using AudioServicesPlaySystemSound etc., registers a callback with AudioServicesAddSystemSoundCompletion, and then returns. When the sound is done, the callback lets my software continue.
This is a LOT of coding whenever there is back-to-back sound.
Can my software start an .m4a playing and SOMEHOW wait for it to complete, waiting for its callback to set a flag? Can I use pthreads to do a wait loop with a multitasking pause?
Here's my code so far...
// Restore game_state to continue the game after playing a sound file.
int game_state_after_sound;

void play_sound_file_done (SystemSoundID ssID, void *calling_class)
{
    printf("\n play_sound_file_done.");
    // Sound is now done, so restore game_state and continue manage_game from where we left off:
    game_state = game_state_after_sound;
    [(GeController *)calling_class manage_game];
}

// Plays sound_file_m4a and returns after playing has completed.
- (void) play_sound_file: (NSString *) sound_file_m4a
{
    printf("\n play_sound_file '%s' ", [sound_file_m4a UTF8String]);

    NSString *soundPath = [[NSBundle mainBundle] pathForResource:sound_file_m4a ofType:@"m4a"];
    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:soundPath], &soundID);
    AudioServicesPlaySystemSound(soundID);
    AudioServicesAddSystemSoundCompletion(soundID, NULL, NULL, play_sound_file_done, (void *)self);

    game_state_after_sound = game_state;
    game_state = DELAY_WHILE_PLAYING_SOUND;
}
********************************** AVPlayer work-in-progress...
But this produces no sound.
#import <AVFoundation/AVFoundation.h>
-(void) play_queued_sounds: (NSString *) sound_file_m4a
{
    printf("\n queue_sound_file: '%s' ", [sound_file_m4a UTF8String]);

    AVPlayerItem *sound_file = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:[[NSBundle mainBundle] pathForResource:sound_file_m4a ofType:@"m4a"]]];
    AVQueuePlayer *player = [AVQueuePlayer queuePlayerWithItems:[NSArray arrayWithObjects:sound_file, nil]];
    [player play];
    return;
}
This can be achieved quite easily with AVQueuePlayer. You don't have to handle completion or anything; you just give it the AVPlayerItems you want to play in sequence and call play. Here's an example:
#import <AVFoundation/AVFoundation.h>

//AVPlayerItem *item1 = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:[[NSBundle mainBundle] pathForResource:@"myFileName" ofType:@"m4a"]]];

EDIT: Use this instead (URLWithString: does not produce a valid URL for a local file path; fileURLWithPath: does):

AVPlayerItem *item1 = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"2" ofType:@"mp3"]]];
AVQueuePlayer *player = [AVQueuePlayer queuePlayerWithItems:[NSArray arrayWithObjects:item1, nil]];
[player play];
Just as an addition, this can also be achieved with items from the user's iPod library by queueing an MPMusicPlayerController from the didPickMediaItems delegate method of MPMediaPickerController:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
    [player setQueueWithItemCollection:mediaItemCollection];
    [player play];
}
I could not get the AVQueuePlayer solution to work.
GREAT SOLUTION:
For each sound to play, spin up a thread that waits to lock a mutex and then calls AudioServicesPlaySystemSound. Spin up one thread per successive sound, and use the mutex to delay each successive thread until the AudioServicesAddSystemSoundCompletion callback unlocks it. Works like a champ.
This is 21st-century embedded programming: throw threads at the problem!
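The answer doesn't include code. Strictly speaking, a pthread mutex must be unlocked by the thread that locked it, so the same hand-off is sketched here with a counting semaphore and GCD instead of raw pthreads; all names are illustrative, and note that waiters are not strictly guaranteed to wake in FIFO order:

#import <AudioToolbox/AudioToolbox.h>
#import <dispatch/dispatch.h>

// One "play token": the first sound takes it; each completion hands it
// to the next waiting sound.
static dispatch_semaphore_t sound_token;

static void sound_done(SystemSoundID ssID, void *clientData) {
    AudioServicesRemoveSystemSoundCompletion(ssID);
    dispatch_semaphore_signal(sound_token);   // let the next queued sound start
}

// Call once at startup.
static void init_sound_queue(void) {
    sound_token = dispatch_semaphore_create(1);
}

// Queue one background task per sound; each waits its turn, then plays.
static void queue_sound(SystemSoundID soundID) {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        dispatch_semaphore_wait(sound_token, DISPATCH_TIME_FOREVER);
        AudioServicesAddSystemSoundCompletion(soundID, NULL, NULL, sound_done, NULL);
        AudioServicesPlaySystemSound(soundID);
    });
}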
How can I play a YouTube video in an MPMoviePlayerController on the iPhone while avoiding going into fullscreen mode?
This question has been raised here: MPMoviePlayerController is playing YouTube video? and here: Play Youtube video in MPMoviePlayerController or play RTSP - 3GP link with answers claiming such functionality was impossible.
Yet this app, Deja, has exactly the functionality I would like: a seamless MPMoviePlayerController whose frame I have explicit control over. http://itunes.apple.com/app/deja/id417625158
How is this done!?
Add this sample into your project, instantiate YoutubeStreamPathExtractorTest, and invoke the test method on the instance. Follow the logs and be happy.
#import "AFHTTPRequestOperationManager.h"
#import <MediaPlayer/MediaPlayer.h>
typedef void (^CallbackBlock)(NSArray* result, NSError* error);
static NSString* const kYouTubeStreamPathPattern = #"\\\"url_encoded_fmt_stream_map\\\\\":.*?url=(.*?)\\\\u0026";
#interface YoutubeStreamPathExtractorTest : NSObject
- (void)test;
- (void)youtubeURLPath:(NSString*)youtubeURLPath extractStreamURLPathsWithCallback:(CallbackBlock)callback;
#end
#implementation YoutubeStreamPathExtractorTest
- (void) test {
NSString* path = #"http://www.youtube.com/watch?v=TEV5DZpAXSw";
[self youtubeURLPath:path extractStreamURLPathsWithCallback:^(NSArray *result, NSError *error) {
if (error){
NSLog(#"extracting error:%#",[error localizedDescription]);
}
for(NSString* streamURLPath in result) {
NSLog(#"streamURLPath:%#",streamURLPath);
/*
NSURL* url = [NSURL URLWithString:streamURLPath];
MPMoviePlayerController* mpMoviePlayerController_ = [[MPMoviePlayerController alloc] initWithContentURL:url];
mpMoviePlayerController_.controlStyle = MPMovieControlStyleDefault;
[mpMoviePlayerController_ play];
*/
}
}];
}
- (void)youtubeURLPath:(NSString*)youtubeURLPath extractStreamURLPathsWithCallback:(CallbackBlock)callback {
__block NSMutableArray* resultArray = [NSMutableArray new];
AFHTTPRequestOperationManager* manager = [[AFHTTPRequestOperationManager alloc] initWithBaseURL:nil];
manager.responseSerializer = [AFHTTPResponseSerializer serializer];
manager.responseSerializer.acceptableContentTypes = [NSSet setWithObjects:#"text/html", nil];
[manager GET:youtubeURLPath
parameters:nil
success:^(AFHTTPRequestOperation* operation, id responseObject) {
NSData* data = (NSData*)responseObject;
NSString* string = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
NSError* error = nil;
NSRegularExpression* expression = [NSRegularExpression regularExpressionWithPattern:kYouTubeStreamPathPattern
options:NSRegularExpressionCaseInsensitive
error:&error];
NSRange range = NSMakeRange(0,[string length]);
NSArray* matches = [expression matchesInString:string options:0 range:range];
for(NSTextCheckingResult* checkingResult in matches) {
if ([checkingResult numberOfRanges]>1){
NSString* resultStr = [string substringWithRange:[checkingResult rangeAtIndex:1]];
//remove extra slashes
[resultArray addObject:[resultStr stringByReplacingOccurrencesOfString:#"\\" withString:#""]];
}
}
if (callback) {
callback(resultArray,error);
}
} failure:^(AFHTTPRequestOperation* operation, NSError* error) {
if (callback) {
callback(resultArray, error);
}
}];
}
#end
MPMoviePlayerController does not support the playback of YouTube SWF (Flash) video, period.
That app you are mentioning actually plays progressively downloaded files in MP4 format, which YouTube also offers for some of its content. This actually violates Apple's guidelines, as it will (and does) exceed the maximum amount of progressive download allowed per app per timeframe. I am surprised it got through the iTunes approval.
Warning: iOS apps submitted for distribution in the App Store must
conform to these requirements. If your app delivers video over
cellular networks, and the video exceeds either 10 minutes duration or
5 MB of data in a five minute period, you are required to use HTTP
Live Streaming. (Progressive download may be used for smaller clips.)
If your app uses HTTP Live Streaming over cellular networks, you are
required to provide at least one stream at 64 Kbps or lower bandwidth
(the low-bandwidth stream may be audio-only or audio with a still
image).
These requirements apply to iOS apps submitted for distribution in the
App Store for use on Apple products. Non-compliant apps may be
rejected or removed, at the discretion of Apple.
So your task boils down to the question of how to get the MP4 URL of a video offered through YouTube. That part is really tricky and nicely solved by Deja. Just use a packet sniffer and you will see that it actually creates a local server that feeds MPMoviePlayerController.
Try this code:

NSString *urlStr = <#your URL here#>;
NSURL *url = [NSURL fileURLWithPath:urlStr]; // for a remote URL use [NSURL URLWithString:urlStr] instead
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
[self.view addSubview:moviePlayer.view];
moviePlayer.view.frame = CGRectMake(<#x#>, <#y#>, <#width#>, <#height#>);
[moviePlayer play];
[moviePlayer setFullscreen:NO animated:YES];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(moviePlayBackDidFinish:)
                                             name:MPMoviePlayerPlaybackDidFinishNotification
                                           object:nil];
I guess it is against YouTube's ToS, but you can use this code here:
https://github.com/larcus94/LBYouTubeView
It is simple to use and works like a charm!
Use a UIWebView and load the video's YouTube embed HTML into it:

UIWebView *movie = [[UIWebView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)];
NSString *htmlString = @"paste the YouTube embed HTML here";
[movie loadHTMLString:htmlString baseURL:nil];
[self.view addSubview:movie];
I get a song from the device's iTunes library and shove it into an AVAsset:
- (void)mediaPicker: (MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    NSArray *arr = mediaItemCollection.items;
    MPMediaItem *song = [arr objectAtIndex:0];
    AVAsset *songAsset = [AVAsset assetWithURL:[song valueForProperty:MPMediaItemPropertyAssetURL]];
}
Then I have this Game Center method for receiving data:
- (void)match:(GKMatch *)match didReceiveData:(NSData *)data fromPlayer:(NSString *)playerID
I'm having a LOT of trouble figuring out how to send this AVAsset via GameCenter and then have it play on the receiving device.
I've read through:
http://developer.apple.com/library/ios/#documentation/MusicAudio/Reference/AudioStreamReference/Reference/reference.html#//apple_ref/doc/uid/TP40006162
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/MultimediaPG/UsingAudio/UsingAudio.html#//apple_ref/doc/uid/TP40009767-CH2-SW5
http://developer.apple.com/library/mac/#documentation/AVFoundation/Reference/AVAudioPlayerClassReference/Reference/Reference.html
http://developer.apple.com/library/mac/#documentation/MusicAudio/Conceptual/AudioQueueProgrammingGuide/Introduction/Introduction.html
I am just lost. Information overload.
I've implemented Cocoa With Love's Audio Stream code, but I can't figure out how to take the NSData I receive through GameCenter and shove it into his code.
http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
Can someone please help me figure this out? Thanks!
As far as I know, the AVAsset is not the actual song data. So if you want to send the actual data of the picked song, you need to try something like this:
- (void)mediaPicker: (MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    NSArray *arr = mediaItemCollection.items;
    MPMediaItem *song = [arr objectAtIndex:0];
    NSData *songData = [NSData dataWithContentsOfURL:[song valueForProperty:MPMediaItemPropertyAssetURL]];
    // Send the songData variable through GameCenter
}
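The send step itself isn't shown above; a sketch of it might look like this, assuming self.match is your connected GKMatch. Note that GKMatch payloads are limited to a few kilobytes per message, so a whole song would have to be split into chunks in practice:

NSError *sendError = nil;
BOOL sent = [self.match sendDataToAllPlayers:songData
                                withDataMode:GKMatchSendDataReliable
                                       error:&sendError];
if (!sent) {
    NSLog(@"Game Center send failed: %@", sendError);
}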
On the other device you now need to write the NSData you receive to disk somewhere and then create an AVAsset with its newly written file URL. Like this:
- (void)match:(GKMatch *)match didReceiveData:(NSData *)data fromPlayer:(NSString *)playerID
{
    NSString *url = [NSTemporaryDirectory() stringByAppendingPathComponent:<#audio_file_name#>];
    // Make sure there is no other file with the same name first
    if ([[NSFileManager defaultManager] fileExistsAtPath:url]) {
        [[NSFileManager defaultManager] removeItemAtPath:url error:nil];
    }
    [data writeToFile:url atomically:NO];
    AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:url] options:nil];
    // Do whatever you want with your new asset
}
Let me know if that works!
I am trying to play in-app audio alongside the iPod. The code I am using to play the audio file is:
Code:
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];

UInt32 doSetProperty = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);

[[AVAudioSession sharedInstance] setActive:YES error:&error];
if (error)
{
    NSLog(@"Some error happened");
}

NSString *path = [[NSBundle mainBundle] pathForResource:effect ofType:type];
myPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
myPlayer.delegate = self;
myPlayer.numberOfLoops = -2;
myPlayer.volume = 1.0f;
[myPlayer play];
and to play the iPod music I am using
Code:
player = [MPMusicPlayerController iPodMusicPlayer];
My iPod plays fine until the audio starts playing, but once the audio stops I am not able to get back to the iPod to play. I am using
Code:
NSString *value = [[NSUserDefaults standardUserDefaults] stringForKey:@"sound"];
if ([value compare:@"ON"] == NSOrderedSame && myPlayer != nil)
{
    [myPlayer stop];
    [myPlayer release];
    myPlayer = nil;
}

NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&error];
if (error)
{
    NSLog(@"Some error happened");
}
to stop the audio and then just call
Code:
[player play];
to play the iPod music, but it does not play the music; it says the title is null.
I would really appreciate if some one could help me with this.
Regards,
Ankur
I suggest moving the audio session setup (including setting the Ambient category) into your application delegate's didFinishLaunchingWithOptions: and not using the Playback category at all. You can still play AVAudioPlayer sounds just fine with the Ambient setting.
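For example, a minimal sketch of that setup (error handling trimmed; a real app should check the NSError):

// In application:didFinishLaunchingWithOptions:
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];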
If you want to kill iPod music whenever you play AV sounds, then you should only be using the Playback category. (I'm not exactly sure what you're trying to do.)
The error that you're getting when you try to resume the iPod music suggests that your MPMusicPlayerController object resets its song queue when it is stopped by the AVAudio sounds, and that you'll have to set up the MP player's queue again. "Title is null" definitely sounds like a dangling pointer or the like in the MP player.
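If so, re-queueing before resuming should help. A sketch, assuming you kept the original MPMediaItemCollection in a (hypothetical) savedCollection ivar:

[player setQueueWithItemCollection:savedCollection];
[player play];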
It's not possible to play iPod library audio directly in AVAudioPlayer. You need to convert the library asset link from the iPod, using AVURLAsset, to an mp3 or wav and save it to the Documents directory. iOS 5 currently has a bug in the conversion (AVAssetExportSession) from iPod to mp3 file, so it's NOT POSSIBLE. Sorry.
You need to resume the iPod player after myPlayer finishes playing:

-(void) audioPlayerDidFinishPlaying:(AVAudioPlayer *)avPlayer successfully:(BOOL)flag {
    // Note: the delegate parameter is renamed here so it does not shadow
    // the `player` ivar holding the MPMusicPlayerController.
    [player play];
}
The new features list of iOS 4.0 says that the AV Foundation framework provides media asset management, track management, media editing, and metadata management for media items. What do they mean by this?
Using track management and media asset management, can I access media files from the Photos app?
Can I make my own custom compositions using AVComposition and export and send them to a server?
Can I rename, move, or edit the metadata information of an asset?
I tried to get some help/documentation on this and couldn't find anything.
Thanks,
Tony
Yes, you can do most of the stuff you mentioned.
I don't think it's that simple to access the media files on your phone, but you can read data from the network and export to your camera roll if you like.
First you have to import your videos or audio files.
What you need to get started is your own video player, which you create in your own view.
If you don't want to play your videos but simply compose your stuff, you can go without a view.
This is very easy:
1. Create a mutable composition:
AVMutableComposition *composition = [AVMutableComposition composition];
This will hold your videos. Now you have an empty composition asset.
Add some files from your directory or from the web:
NSURL* sourceMovieURL = [NSURL fileURLWithPath:moviePath];
AVURLAsset* sourceAsset = [AVURLAsset URLAssetWithURL:sourceMovieURL options:nil];
Then add your video to your composition
// calculate the time range of the source asset
CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), CMTimeMake(sourceAsset.duration.value, sourceAsset.duration.timescale));

// and insert it into your composition
NSError *editError = nil;
BOOL result = [composition insertTimeRange:editRange ofAsset:sourceAsset atTime:composition.duration error:&editError];
If you would like to add more videos to your composition, you can create further assets and insert them the same way, each at the current end of the composition; a sketch of this appears below.
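A minimal sketch of appending several clips back to back, assuming clipURLs is an NSArray of local file URLs (the name is illustrative):

for (NSURL *clipURL in clipURLs) {
    AVURLAsset *clip = [AVURLAsset URLAssetWithURL:clipURL options:nil];
    CMTimeRange clipRange = CMTimeRangeMake(kCMTimeZero, clip.duration);
    NSError *clipError = nil;
    // each clip lands at the current end of the composition
    [composition insertTimeRange:clipRange ofAsset:clip atTime:composition.duration error:&clipError];
}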
Now you can EXPORT your new composition using code like this:
NSError *exportError = nil;
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
NSURL *exportURL = [NSURL fileURLWithPath:exportVideoPath];
exportSession.outputURL = exportURL;
exportSession.outputFileType = @"com.apple.quicktime-movie";
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusFailed: {
            NSLog(@"FAIL");
            [self performSelectorOnMainThread:@selector(doPostExportFailed:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            NSLog(@"SUCCESS");
            [self performSelectorOnMainThread:@selector(doPostExportSuccess:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
        default:
            break;
    }
}];
If you want to PLAY your videos, use code like this (I assume you have access to your view):
// create an AVPlayer with your composition
AVPlayer* mp = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:composition]];
// Add the player to your UserInterface
// Create a PlayerLayer:
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
// integrate it to your view. Here you can customize your player (Fullscreen, or a small preview)
[[self view].layer insertSublayer:playerLayer atIndex:0];
playerLayer.frame = [self view].layer.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
And finally play your video:
[mp play];
Export to camera roll:
NSString *exportVideoPath = <#your local path to your exported file#>;
UISaveVideoAtPathToSavedPhotosAlbum(exportVideoPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
And get the notification when it's finished (your callback method):

- (void) video: (NSString *) videoPath didFinishSavingWithError: (NSError *) error contextInfo: (void *) contextInfo {
    // error is nil if saving succeeded
    NSLog(@"Finished saving video with error: %@", error);
    // do postprocessing here, e.g. notifications or UI stuff
}
Unfortunately I haven't found any "legal" solution to read from the cameraroll.
A very good source on getting started is:
http://www.subfurther.com/blog/?cat=51
download VTM_Player.zip, VTM_AVRecPlay.zip, or VTM_AVEditor.zip for a very nice introduction to this