iPhone: Why does applicationMusicPlayer quit playing when app enters background?

I made a little test app to try to isolate this issue, but it exhibits the same behavior: the applicationMusicPlayer stops playing immediately when the app enters the background. I don't know what I'm doing wrong, or if it's an Apple bug. It seems so simple that if it were an Apple bug others would have encountered it and posted on it.
- I've set the UIBackgroundModes key in Info.plist to audio
- I've verified that the app is not terminating
- I've tested on 4.1 beta 3 with the same results
- I've searched the web for similar complaints. People report other MPMusicPlayerController issues/bugs, but more complex ones, e.g. involving interaction with AVAudio.
Any/all suggestions appreciated.
Here's the core of my test app:
MPTest.h
#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import <MediaPlayer/MediaPlayer.h>

@interface MPTestViewController : UIViewController <MPMediaPickerControllerDelegate> {
    MPMusicPlayerController *MPPlayer;
}

@property (nonatomic, retain) MPMusicPlayerController *MPPlayer;

@end
MPTest.m
#import "MPTestViewController.h"

@implementation MPTestViewController

@synthesize MPPlayer;

- (void)viewDidLoad {
    [super viewDidLoad];
    // get the application music player
    self.MPPlayer = [MPMusicPlayerController applicationMusicPlayer];
    // break to allow application didFinishLaunching to complete
    [self performSelector:@selector(presentMPPicker) withObject:nil afterDelay:0.01];
}

- (void)presentMPPicker {
    // present the picker in a modal view controller
    MPMediaPickerController *picker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeAnyAudio];
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker release];
}

// delegate called after user picks a media item
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    [self dismissModalViewControllerAnimated:YES];
    // tell the player what to play
    [MPPlayer setQueueWithItemCollection:mediaItemCollection];
    // start playing
    [MPPlayer play];
}

@end

MPMusicPlayerController does not support background audio. You will need to use something like AVAudioPlayer or AVPlayer (AVPlayer lets you play iPod library items via each item's asset URL, MPMediaItemPropertyAssetURL).
The reason is that MPMusicPlayerController hands playback off to the iPod application, so your application is not actually playing anything itself and the audio background mode has no effect.
Please see this thread on the Apple Developer Forums for more information: https://devforums.apple.com/thread/47905?start=0&tstart=120
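As a rough illustration of the AVPlayer route (a minimal sketch, not from the forum thread; it assumes an MPMediaItem named item obtained from a picker or an MPMediaQuery, and iOS 4's AVFoundation):

// Sketch: play an iPod library item with AVPlayer so playback can continue in the background.
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL]; // nil for DRM-protected tracks
if (assetURL) {
    // the audio session category must also allow background playback (see the next answer)
    AVPlayer *player = [[AVPlayer alloc] initWithURL:assetURL];
    [player play];
    // keep a strong reference to the player (e.g. in a retained property), or it will be deallocated
}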

Did you set the appropriate audio session category for a media player? The OS uses the session category to arbitrate among competing uses of the audio channels, and only certain categories are allowed to keep playing in the background.
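For reference, a minimal sketch of what that looks like with AVAudioSession (the C-based AudioSession API can do the same thing):

// Sketch: configure the audio session for media playback so audio can keep playing
// when the app moves to the background (together with the UIBackgroundModes audio entry).
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryPlayback error:&error]) {
    NSLog(@"Could not set audio session category: %@", error);
}
if (![session setActive:YES error:&error]) {
    NSLog(@"Could not activate audio session: %@", error);
}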

Related

iOS enabling AirPrint for UIWebView contents

I am super new to Xcode and app development. I am currently loading a web-based application in UIWebViews on the iPad. When one particular page is loaded, it displays a PDF file. I would like to be able to print this PDF using AirPrint, and I am looking for a simple solution. Currently the app I am working on uses 6 files:
-ViewController.m
-ViewController.h
-MainStoryboard_iPad.storyboard
-MainStoryboard_iPhone.storyboard
-AppDelegate.h
-AppDelegate.m
In the MainStoryboard files there are many windows (graphical) which are linked to a central navigation system. If possible, please spend some time really explaining what I need to do, rather than just "take a look at this link." I have programming experience, but never with Xcode or any Apple-related product.
I figured out how to do this. First I found a piece of code here: iOS Air print for UIwebview. I had no idea how to implement it at the time, but then I did the following.
My application was a single-view Xcode project.
In my storyboard I inserted a button (on my navigation bar) and changed its Identifier to 'Action'. Then, with the Assistant ('tuxedo') editor open and displaying my ViewController.m file, I control-dragged from the button into ViewController.m. This inserted my IBAction method after asking for the button's id, which I named:
myActionButton
Then I copied in the code from response 3 of that question. My ViewController.m looked something like this.
#import "ViewController.h"
#interface ViewController()
#end
#implementation ViewController()
//Some stuff here
#end
#synthesize webView
-(IBAction)myActionButton:(id)sender{
UIPrintInfo *pi = [UIPrintInfo printInfo];
pi.outputType = UIPrintInfoOutputGeneral;
pi.jobName = webView.request.URL.absoluteString;
pi.orientation = UIPrintInfoOrientationPortrait;
pi.duplex = UIPrintInfoDuplexLongEdge;
UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
pic.printInfo = pi;
pic.showsPageRange = YES;
pic.printFormatter = webView.viewPrintFormatter;
[pic presentAnimated:YES completionHandler:^(UIPrintInteractionController *pic2, BOOL completed, NSError *error) {
// indicate done or error
}];
}
Also in my ViewController.h file:
#import ...

@interface ViewController : ...
{
    IBOutlet UIWebView *webView;
}

@property (nonatomic, retain) IBOutlet UIWebView *webView;

@end
I didn't set up the web views so I am not 100% sure how they are created, but there is a good series for beginners on YouTube by HackLife87 which shows how to make a single-view app. I think Xcode Tutorial Video #7 covers setting up views.
Since I am extremely green at Xcode and iPad app development, I solved my problem by combining what I learned from the aforementioned Xcode tutorial videos with the solution provided on Stack Overflow by Hafthor.
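One small addition worth making (my suggestion, not part of the answer above): guard the action with UIPrintInteractionController's availability check, since not every device or iOS version supports AirPrint.

- (IBAction)myActionButton:(id)sender {
    // only proceed if printing is available on this device/OS (iOS 4.2+)
    if (![UIPrintInteractionController isPrintingAvailable]) {
        NSLog(@"Printing is not available on this device");
        return;
    }
    // ... build the UIPrintInfo and present the UIPrintInteractionController as shown above ...
}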

Use MPMediaPicker selections as AVAudioPlayer inputs

I'm programming an application for the hearing-impaired. I'm hoping to take tracks from the iTunes library, and in my app have a slider for panning. I don't want to use OpenAL (this isn't a game; it's a media player). Since AVAudioPlayer has an easy pan property, can I take selections from the MPMediaPicker and feed them into an AVAudioPlayer so I can pan them?
I don't do a lot of iOS development, but I believe there are two ways.
Method #1
You need to add /System/Library/Frameworks/AVFoundation.framework to your target in Xcode and #import <AVFoundation/AVAudioPlayer.h>, and you also need to add MediaPlayer.framework to your target and #import <MediaPlayer/MediaPlayer.h>.
For this approach, you need MPMediaPicker to pass the selected song to AVAudioPlayer. That can be accomplished like this:
@interface MusicPlayerDemoViewController : UIViewController <MPMediaPickerControllerDelegate> {
    ...
}
...
// This action should open the media picker
- (IBAction)openMediaPicker:(id)sender;
@end

// MusicPlayerDemoViewController.m
- (IBAction)openMediaPicker:(id)sender {
    MPMediaPickerController *mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
    mediaPicker.delegate = self;
    mediaPicker.allowsPickingMultipleItems = NO; // this is the default
    [self presentModalViewController:mediaPicker animated:YES];
    [mediaPicker release];
}

// Media picker delegate methods
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    // We need to dismiss the picker
    [self dismissModalViewControllerAnimated:YES];
    // At this point, hand mediaItemCollection to AVAudioPlayer and tell it to play
    // (a filled-in sketch follows below). Remember to stop any audio that is already
    // playing, or multiple songs will play at once.
}
- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker {
    // User did not select anything
    // We need to dismiss the picker
    [self dismissModalViewControllerAnimated:YES];
}
Once a song finishes, the user needs to select a new one. You could either loop until the player's current time reaches its duration (both available from AVAudioPlayer), or you could simply provide a button that reopens the picker.
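To make the fill-in-the-blank part concrete, here is a rough sketch of what the filled-in delegate method might look like (my illustration, not part of the original answer; note that AVAudioPlayer may refuse the ipod-library:// asset URL for some tracks, in which case you would have to export the audio first, e.g. with AVAssetExportSession):

// Sketch: create an AVAudioPlayer from the first picked item and pan it hard left.
// Assumes an audioPlayer ivar of type AVAudioPlayer * declared in the view controller.
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    [self dismissModalViewControllerAnimated:YES];

    MPMediaItem *item = [[mediaItemCollection items] objectAtIndex:0];
    NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL]; // nil for DRM-protected tracks

    // stop whatever is currently playing so two songs don't overlap
    [audioPlayer stop];
    [audioPlayer release];
    audioPlayer = nil;

    if (assetURL) {
        NSError *error = nil;
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:assetURL error:&error];
        if (audioPlayer) {
            audioPlayer.pan = -1.0; // -1.0 = hard left, 0.0 = center, 1.0 = hard right
            [audioPlayer play];
        } else {
            NSLog(@"Could not create player: %@", error);
        }
    }
}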
For more questions, check out:
http://oleb.net/blog/2009/07/the-music-player-framework-in-the-iphone-sdk/ (I used much of their code)
http://developer.apple.com/library/ios/#DOCUMENTATION/AVFoundation/Reference/AVAudioPlayerClassReference/Reference/Reference.html
Good Luck!
Derek
Try having AVAudioPlayer play the variable mediaItemCollection. That is what the picker hands back for the chosen song(s) in the code above. If this does not work, make sure that AVAudioPlayer is given the same kind of input (format, such as an ID or a file location) that MPMediaPicker provides.
That error message sounds like a technical issue. The only thing I can think of is that AVAudioPlayer or the MPMedia player is looking for a volume variable (is it required?) and can't find one. I can't really answer this one as I don't do iPhone development; try their forums or website for some help.
Sounds like you are doing a good job! If you are interested (I don't know if you are staying at DA), Mr. Cochran (the dean of students at the Upper School) is teaching an iPhone development class and an AP Computer Science class (which I am in). If you want to take it further, or you just want to ask questions, I know he is more than happy to help!
Good luck! Tell me when it is finished so I can test the results!

MPMoviePlayerController stopped working in 3.1.2

I am using MPMoviePlayerController to play a live streaming m3u8 video for older devices (3.1.2). This worked fine until this morning. I tried changing the scalingMode to resolve another issue, and now the player does not work at all. I went back to older backups that worked, and they don't work, either.
While debugging, control goes into [mMPPlayer play] and never returns. This also locks up my app.
Has something changed with MPMoviePlayerController, or did I break something in Xcode?
My app was scheduled to start moving to production today, so I'm really in a bind, here. :(
Here's the warning that I get:
Warning: MPMoviePlayerController may not support file of type m3u8
And here's my code:
MyViewController.h:
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>

@interface WatchNowViewController : UIViewController {
    MPMoviePlayerController *mMPPlayer;
}

@property (nonatomic, retain) MPMoviePlayerController *mMPPlayer;

@end
MyViewController.m:
mMPPlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://www.mysite.com/myVideo.m3u8"]];
mMPPlayer.scalingMode = MPMovieScalingModeFill;
mMPPlayer.backgroundColor = [UIColor blackColor];
[mMPPlayer play];
NSLog(@"Control never returns to here");
This happened to us as well. Not sure what went wrong; the encrypted streams just stopped playing in 3.2. Try the m3u8 URL in Safari on the iPad and check whether it plays there. If it doesn't play in iPad Safari either, try an unencrypted stream. In my experience, an unencrypted stream played in 3.2 without issues.

Detect iPhone device capabilities for Core Location

I have a Core Location app that leverages the startMonitoringSignificantLocationChanges method to generate updates when appropriate, but this does not work on older devices such as the iPhone 3G.
I would like the Core Location functionality to still work while the app is open, so I thought I could use a respondsToSelector: test to see whether the device supports the method, and if it doesn't, fall back to the standard Core Location updating method. However, the selector test passes on my iPhone 3G, so the app still calls startMonitoringSignificantLocationChanges even though it doesn't actually work on that phone.
Any ideas? I would rather not test the device identifier, because then the check would have to be updated for every future phone model.
@interface RootViewController : UITableViewController <CLLocationManagerDelegate> {
    CLLocationManager *locationManager;
}

@property (nonatomic, retain) CLLocationManager *locationManager;

@end

@implementation RootViewController

@synthesize locationManager;

if ([locationManager respondsToSelector:@selector(startMonitoringSignificantLocationChanges)]) {
    [locationManager startMonitoringSignificantLocationChanges];
    NSLog(@"Using bg updates");
}
else {
    [locationManager startUpdatingLocation];
    NSLog(@"Using reg updates");
}
if ([CLLocationManager significantLocationChangeMonitoringAvailable]) {
…
}
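The bare snippet above is the key: respondsToSelector: only tells you the API exists (it does on any iOS 4 device, including the 3G), while the significantLocationChangeMonitoringAvailable class method reports whether the hardware actually supports the feature. A minimal sketch of the combined fallback, assuming the locationManager property from the question:

// Prefer significant-change monitoring only when the OS exposes the check
// AND the hardware supports it; otherwise fall back to standard updates.
BOOL canUseSignificantChanges =
    [CLLocationManager respondsToSelector:@selector(significantLocationChangeMonitoringAvailable)] &&
    [CLLocationManager significantLocationChangeMonitoringAvailable];

if (canUseSignificantChanges) {
    [locationManager startMonitoringSignificantLocationChanges];
    NSLog(@"Using significant-change updates");
} else {
    [locationManager startUpdatingLocation];
    NSLog(@"Using regular updates");
}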

Sending a message to an object

I'm pretty new to Objective-C, but things are progressing well. However, I think I'm missing a key concept relating to how objects are created and messaged. I hope someone can help me.
I'm creating an iPhone app that simply creates an MPMusicPlayerController and starts playing a song from the queue.
I create the music player (myPlayer) in the AppDelegate like so...
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
// Override point for customization after application launch.
// Add the main view controller's view to the window and display.
[window addSubview:mainViewController.view];
[window makeKeyAndVisible];
// instantiate a music player
MPMusicPlayerController *myPlayer =
[MPMusicPlayerController applicationMusicPlayer];
// assign a playback queue containing all media items on the device
[myPlayer setQueueWithQuery: [MPMediaQuery songsQuery]];
// start playing from the beginning of the queue
[myPlayer play];
return YES;
}
This works fine. Now in the MainViewController I try to send myPlayer the command to skip to the next track:
[myPlayer skipToNextItem];
However, when I try to compile I get the message that myPlayer is undeclared.
What am I doing wrong? I can see how I could fix this in a procedural way (by creating the player in the MainViewController), but I'd like to understand what I have to do to get it working in an OOP way.
Cheers,
Richard
Most probably, the myPlayer object is unknown to your view controller because it is declared as a local variable inside application:didFinishLaunchingWithOptions:. There are two options for you:
- Make the player a property of your app delegate.
- Make the player a property of your view controller subclass and assign it when the view controller is created.
For the first option, in your app delegate's declaration, do:
@property (nonatomic, retain) MPMusicPlayerController *mPlayer;
In your app delegate's implementation, do:
@synthesize mPlayer;
In your view controller, do:
// MyAppDelegate stands for your app delegate's class; cast so the compiler knows about the mPlayer property
MPMusicPlayerController *mPlayer = [(MyAppDelegate *)[[UIApplication sharedApplication] delegate] mPlayer];
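Putting the first option together, a minimal sketch (MyAppDelegate stands in for whatever your app delegate class is actually called, and nextTrack: is a hypothetical action wired to your skip button):

// MyAppDelegate.m -- create the player once and keep it in the property
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [window addSubview:mainViewController.view];
    [window makeKeyAndVisible];

    // store the shared application music player in the property declared above
    self.mPlayer = [MPMusicPlayerController applicationMusicPlayer];
    [self.mPlayer setQueueWithQuery:[MPMediaQuery songsQuery]];
    [self.mPlayer play];
    return YES;
}

// MainViewController.m -- reach the same player through the app delegate
- (IBAction)nextTrack:(id)sender {
    MyAppDelegate *appDelegate = (MyAppDelegate *)[[UIApplication sharedApplication] delegate];
    [[appDelegate mPlayer] skipToNextItem];
}

Note that applicationMusicPlayer returns a shared instance anyway, so the view controller could also just call [MPMusicPlayerController applicationMusicPlayer] directly; going through the app delegate's property simply makes the dependency explicit.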