There is a way to add an image to the lock screen for background audio, along with setting the track and artist names. It was mentioned in a WWDC 2011 video, but with nothing specific to go off of. I have looked everywhere in the docs and cannot find it. I know it is an iOS 5-only feature, and Spotify's newest version has it. Can anyone point me in the right direction?
Thank You,
Matthew
Here's an answer I found for you:
(1) You must handle remote control events. You can't be the Now
Playing app unless you do. (See the AudioMixer (MixerHost) sample code.)
(2) Set the Now Playing info:
MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
infoCenter.nowPlayingInfo =
    [NSDictionary dictionaryWithObjectsAndKeys:@"my title", MPMediaItemPropertyTitle,
                                               @"my artist", MPMediaItemPropertyArtist,
                                               nil];
This is independent of whichever API you are using to play audio or
video.
As per Michael's answer above, simply add
@{MPMediaItemPropertyArtwork: [[MPMediaItemArtwork alloc] initWithImage:[UIImage ...]]}
to the nowPlayingInfo dictionary.
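For example, a minimal sketch of merging the artwork into an existing nowPlayingInfo dictionary (the "cover" image name is just a placeholder, not something from the question):

// A minimal sketch: merge the artwork into whatever Now Playing info is already set.
// "cover" is a hypothetical image name in the app bundle; substitute your own artwork.
UIImage *coverImage = [UIImage imageNamed:@"cover"];
MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:coverImage];

MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
NSMutableDictionary *info = [infoCenter.nowPlayingInfo mutableCopy];
if (info == nil) {
    info = [NSMutableDictionary dictionary];
}
[info setObject:artwork forKey:MPMediaItemPropertyArtwork];
infoCenter.nowPlayingInfo = info;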
The full set of available keys is:
// MPMediaItemPropertyAlbumTitle
// MPMediaItemPropertyAlbumTrackCount
// MPMediaItemPropertyAlbumTrackNumber
// MPMediaItemPropertyArtist
// MPMediaItemPropertyArtwork
// MPMediaItemPropertyComposer
// MPMediaItemPropertyDiscCount
// MPMediaItemPropertyDiscNumber
// MPMediaItemPropertyGenre
// MPMediaItemPropertyPersistentID
// MPMediaItemPropertyPlaybackDuration
// MPMediaItemPropertyTitle
To make the controls work:
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated {
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
    [super viewWillDisappear:animated];
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlPlay:
                [player play];
                break;
            case UIEventSubtypeRemoteControlPause:
                [player pause];
                break;
            case UIEventSubtypeRemoteControlTogglePlayPause:
                if (player.playbackState == MPMoviePlaybackStatePlaying) {
                    [player pause];
                } else {
                    [player play];
                }
                break;
            default:
                break;
        }
    }
}
Note that this only works on a real iOS device, not on the simulator.
Related
I wrote a SpriteKit app last year targeting 10.10 (Yosemite). Everything ran fine, but when I upgraded to El Capitan this year it freezes in one particular spot. It's a tough problem to diagnose because there is a lot of code, so I'll try to be as descriptive as possible. I've also created a YouTube screen recording of the issue.
App's Purpose
The app is basically a leaderboard that I created for a tournament at the school where I teach. When the app launches, it goes to the LeaderboardScene scene and displays the leaderboard.
The app stays in this scene for the rest of the time. The sword that says "battle" is a button. When it is pressed it creates an overlay and shows the two students that will be facing each other in video form (SKVideoNode).
The videos play continuously and the user of the app eventually clicks on whichever student wins that match and then the overlay is removed from the scene and the app shows the leaderboard once again.
Potential Reasons For High CPU
Playing video: Normally the overlay shows video, but I also created an option where still images are loaded instead of video just in case I had a problem. Whether I load images or video, the CPU usage is super high.
Here's some of the code that is most likely causing this issue:
LeaderboardScene.m
//when the sword button is pressed it switches to the LB_SHOW_VERSUS_SCREEN state
-(void) update:(NSTimeInterval)currentTime {
    switch (_leaderboardState) {
        ...
        case LB_SHOW_VERSUS_SCREEN: { //Case for "Versus Screen" overlay
            [self showVersusScreen];
            break;
        }
        case LB_CHOOSE_WINNER: {
            break;
        }
        default:
            break;
    }
}
...
//sets up the video overlay
-(void) showVersusScreen {
    //doesn't allow the matchup screen to pop up until the producer FLASHING actions are complete
    if ([_right hasActions] == NO) {
        [self addChild:_matchup]; //_matchup is an object from the Matchup.m class
        NSArray *producers = @[_left, _right];
        [_matchup createRound:_round WithProducers:producers VideoType:YES]; //creates the matchup with VIDEO
        //[_matchup createRound:_round WithProducers:producers VideoType:NO]; //creates the matchup without VIDEO
        _leaderboardState = LB_CHOOSE_WINNER;
    }
}
Matchup.m
//more setting up of the overlay
-(void) createRound:(NSString*)round WithProducers:(NSArray*)producers VideoType:(bool)isVideoType {
    SKAction *wait = [SKAction waitForDuration:1.25];
    [self loadSoundsWithProducers:producers];
    [self runAction:wait completion:^{ //resets the overlay
        _isVideoType = isVideoType;
        [self removeAllChildren];
        [self initBackground];
        [self initHighlightNode];
        [self initOutline];
        [self initText:round];
        if (_isVideoType)
            [self initVersusVideoWithProducers:producers]; //this is selected
        else
            [self initVersusImagesWithProducers:producers];
        [self animationSequence];
        _currentSoundIndex = 0;
        [self playAudio];
    }];
}
...
//creates a VersusSprite object which represents each of the students
-(void) initVersusVideoWithProducers:(NSArray*)producers {
    Producer *left = (Producer*)[producers objectAtIndex:0];
    Producer *right = (Producer*)[producers objectAtIndex:1];

    _leftProducer = [[VersusSprite alloc] initWithProducerVideo:left.name LeftSide:YES];
    _leftProducer.name = left.name;
    _leftProducer.zPosition = 5;
    _leftProducer.position = CGPointMake(-_SCREEN_WIDTH/2, _SCREEN_HEIGHT/3);
    [self addChild:_leftProducer];

    _rightProducer = [[VersusSprite alloc] initWithProducerVideo:right.name LeftSide:NO];
    _rightProducer.name = right.name;
    _rightProducer.zPosition = 5;
    _rightProducer.xScale = -1;
    _rightProducer.position = CGPointMake(_SCREEN_WIDTH + _SCREEN_WIDTH/2, _SCREEN_HEIGHT/3);
    [self addChild:_rightProducer];
}
VersusSprite.m
-(instancetype) initWithProducerVideo:(NSString*)fileName LeftSide:(bool)isLeftSide {
    if (self = [super init]) {
        _isVideo = YES;
        _isLeftSide = isLeftSide;
        self.name = fileName;
        [self initVideoWithFileName:fileName]; //creates videos
        [self addProducerLabel];
    }
    return self;
}
...
//creates the videos for the VersusSprite
-(void) initVideoWithFileName:(NSString*)fileName {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDesktopDirectory, NSUserDomainMask, YES);
    NSString *desktopPath = [paths objectAtIndex:0];
    NSString *resourcePath = [NSString stringWithFormat:@"%@/vs", desktopPath];
    NSString *videoPath = [NSString stringWithFormat:@"%@/%@.mp4", resourcePath, fileName];
    NSURL *fileURL = [NSURL fileURLWithPath:videoPath];

    AVPlayer *avPlayer = [[AVPlayer alloc] initWithURL:fileURL];
    _vid = [SKVideoNode videoNodeWithAVPlayer:avPlayer];
    //[_vid setScale:1];
    [self addChild:_vid];
    [_vid play];

    avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:[avPlayer currentItem]];
}

//used to get the videos to loop
- (void)playerItemDidReachEnd:(NSNotification *)notification {
    AVPlayerItem *p = [notification object];
    [p seekToTime:kCMTimeZero];
}
UPDATE
The issue has been identified and is very specific to my project, so unfortunately it probably won't help anyone else. When clicking the "sword" icon that says "Battle", the scene gets blurred and then the overlay is put on top of it. The blurring occurs on a background thread, as you'll see below:
[self runAction:[SKAction waitForDuration:1.5] completion:^{
    [self blurSceneProgressivelyToValue:15 WithDuration:1.25];
}];
I'll have to handle the blur in another way or just remove it altogether.
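If the blur is worth keeping, one possible alternative (a sketch only, using hypothetical node names, not the project's actual code) is to parent the leaderboard content under an SKEffectNode and animate a CIGaussianBlur radius with a custom SKAction on the main thread:

// Sketch: blur the leaderboard content by animating a CIGaussianBlur on an SKEffectNode.
// "contentNode" is hypothetical; the leaderboard nodes would need to be its children.
SKEffectNode *contentNode = [SKEffectNode node];
contentNode.filter = [CIFilter filterWithName:@"CIGaussianBlur"
                          withInputParameters:@{kCIInputRadiusKey: @0}];
contentNode.shouldEnableEffects = YES;

CGFloat targetRadius = 15.0;
NSTimeInterval duration = 1.25;
SKAction *blurIn = [SKAction customActionWithDuration:duration
                                          actionBlock:^(SKNode *node, CGFloat elapsedTime) {
    CGFloat radius = targetRadius * (elapsedTime / duration);
    [((SKEffectNode *)node).filter setValue:@(radius) forKey:kCIInputRadiusKey];
}];
[contentNode runAction:blurIn];

Re-rendering the effect every frame is not free either, so it may still be cheaper to apply the blur once rather than animate it.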
The Apple docs seem to indicate that, while recording video to a file, the app can change the URL on the fly with no problem. But I'm seeing a problem: when I try this, the recording delegate gets called with an error...
The operation couldn’t be completed. (OSStatus error -12780.) Info
dictionary is: {
AVErrorRecordingSuccessfullyFinishedKey = 0; }
(funky single quote in "couldn't" comes from logging [error localizedDescription])
Here's the code, which is basically a set of tweaks to the WWDC10 AVCam sample:
1) Start recording, and start a timer to change the output URL every few seconds:
- (void) startRecording
{
    // start the chunk timer
    self.chunkTimer = [NSTimer scheduledTimerWithTimeInterval:5
                                                       target:self
                                                     selector:@selector(chunkTimerFired:)
                                                     userInfo:nil
                                                      repeats:YES];

    AVCaptureConnection *videoConnection = [AVCamCaptureManager connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
    if ([videoConnection isVideoOrientationSupported]) {
        [videoConnection setVideoOrientation:[self orientation]];
    }

    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
    }

    NSURL *fileUrl = [[ChunkManager sharedInstance] nextURL];
    NSLog(@"now recording to %@", [fileUrl absoluteString]);
    [[self movieFileOutput] startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
}
2) When the timer fires, change the output file name without stopping recording
- (void)chunkTimerFired:(NSTimer *)aTimer {
    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
    }

    NSURL *nextUrl = [self nextURL];
    NSLog(@"changing capture output to %@", [[nextUrl absoluteString] lastPathComponent]);
    [[self movieFileOutput] startRecordingToOutputFileURL:nextUrl recordingDelegate:self];
}
Note: [self nextURL] generates file URLs like file-0.mov, file-5.mov, file-10.mov, and so on.
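For reference, a hypothetical version of that helper (the name, step size, and temp-directory location are assumptions, not the actual project code):

// Hypothetical nextURL helper: produces file-0.mov, file-5.mov, file-10.mov, ... in the temp directory.
- (NSURL *)nextURL {
    static NSInteger chunkIndex = 0;
    NSString *fileName = [NSString stringWithFormat:@"file-%ld.mov", (long)chunkIndex];
    chunkIndex += 5; // mirrors the 5-second chunk timer
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
    return [NSURL fileURLWithPath:path];
}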
3) This gets called each time the file changes, and every other invocation is an error...
- (void) captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
       fromConnections:(NSArray *)connections
                 error:(NSError *)error
{
    id delegate = [self delegate];
    if (error && [delegate respondsToSelector:@selector(someOtherError:)]) {
        NSLog(@"got an error, tell delegate");
        [delegate someOtherError:error];
    }

    if ([self backgroundRecordingID]) {
        if ([[UIDevice currentDevice] isMultitaskingSupported]) {
            [[UIApplication sharedApplication] endBackgroundTask:[self backgroundRecordingID]];
        }
        [self setBackgroundRecordingID:0];
    }

    if ([delegate respondsToSelector:@selector(recordingFinished)]) {
        [delegate recordingFinished];
    }
}
When this runs, file-0 gets written, then we see error -12780 right after changing the URL to file-5, then file-10 gets written, then an error, then okay, and so on.
It appears that changing the URL on the fly doesn't work, but it does stop the writing, which allows the next URL change to succeed.
Thanks, all, for the review and the good thoughts on this. Here's the word from Apple DTS...
I spoke with our AV Foundation engineers, and it is definitely a bug
in that this method is not doing what the documentation says it should
("You do not need to call stopRecording before calling this method
while another recording is in progress."). Please file a bug report
using the Apple Bug Reporter (http://developer.apple.com/bugreporter/)
so the team can investigate. Make sure and include your minimal
project in the report.
I've filed this with Apple as bug 11632087
The docs state this:
If a file at the given URL already exists when capturing starts, recording to the new file will fail.
Are you sure you're checking that nextUrl points to a file that doesn't already exist?
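A quick way to guard against that before each startRecordingToOutputFileURL call (a sketch; adjust the error handling to taste):

// Sketch: remove any stale file at the target URL before starting the next recording.
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:[nextUrl path]]) {
    NSError *removeError = nil;
    if (![fileManager removeItemAtURL:nextUrl error:&removeError]) {
        NSLog(@"could not remove existing file: %@", removeError);
    }
}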
According to the documentation, two consecutive calls to startRecordingToOutputFileURL are not supported on iOS.
You can read about it here:
In iOS, this frame accurate file switching is not supported. You must call stopRecording before calling this method again to avoid any errors.
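In practice, on iOS that means stopping the current chunk and starting the next one from the delegate callback once the previous file has finished writing. A rough sketch of that pattern (pendingNextURL is a hypothetical property, and this will likely leave a small gap between chunks):

// Sketch: stop the current chunk; start the next one when the delegate reports the file is finished.
- (void)chunkTimerFired:(NSTimer *)aTimer {
    self.pendingNextURL = [self nextURL]; // hypothetical property holding the next chunk's URL
    [[self movieFileOutput] stopRecording];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
       fromConnections:(NSArray *)connections
                 error:(NSError *)error
{
    if (self.pendingNextURL != nil) {
        [[self movieFileOutput] startRecordingToOutputFileURL:self.pendingNextURL
                                            recordingDelegate:self];
        self.pendingNextURL = nil;
    }
}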
I call the code below to show the Game Center user banner popup at the top of the screen, but it says the game is not recognized by Game Center.
I added this to my applicationDidFinishLaunching:
- (void)applicationDidFinishLaunching:(UIApplication *)application {
    [[GKLocalPlayer localPlayer] authenticateWithCompletionHandler:^(NSError *error) {
        if (error == nil) {
            NSLog(@"Success");
        } else {
            NSLog(@"Fail");
        }
    }];
}
For viewing leaderboards, what code do I need to add to call it properly?
-(void)viewscores:(SPEvent*)event{
CODE HERE
}
You could try it like this:
GKLeaderboardViewController *leaderboardVC = [[[GKLeaderboardViewController alloc] init] autorelease];
if (leaderboardVC != nil) {
    leaderboardVC.leaderboardDelegate = self;
    [self presentViewController:leaderboardVC animated:YES completion:nil];
}
I hope it helps :-)
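Note that the delegate is expected to dismiss the leaderboard itself, so you'd also want something like this (a sketch of the required GKLeaderboardViewControllerDelegate method):

// Dismiss the leaderboard when the player taps Done.
- (void)leaderboardViewControllerDidFinish:(GKLeaderboardViewController *)viewController {
    [self dismissViewControllerAnimated:YES completion:nil];
}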
I'm trying to detect user input (a click) on the headphones connected to an iPhone. So far I've only found how to detect interruptions using AVAudioSession. Is AVAudioSession the right approach, or is there another way? If so, how?
You want this:
beginReceivingRemoteControlEvents
You implement something like this in one of your view controller classes:
// If using a nonmixable audio session category, as this app does, you must activate reception of
// remote-control events to allow reactivation of the audio session when running in the background.
// Also, to receive remote-control events, the app must be eligible to become the first responder.
- (void) viewDidAppear: (BOOL) animated {
    [super viewDidAppear: animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL) canBecomeFirstResponder {
    return YES;
}

// Respond to remote control events
- (void) remoteControlReceivedWithEvent: (UIEvent *) receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [self playOrStop: nil];
                break;
            default:
                break;
        }
    }
}
See the sample code here.
It is even easier now, as of iOS 7. To execute a block when the headphone play/pause button is pressed:
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.togglePlayPauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent * _Nonnull event) {
    NSLog(@"toggle button pressed");
    return MPRemoteCommandHandlerStatusSuccess;
}];
or, if you prefer to use a method instead of a block:
[commandCenter.togglePlayPauseCommand addTarget:self action:@selector(toggleButtonAction)];
To stop:
[commandCenter.togglePlayPauseCommand removeTarget:self];
or:
[commandCenter.togglePlayPauseCommand removeTarget:self action:@selector(toggleButtonAction)];
You'll need to add this to the includes area of your file:
@import MediaPlayer;
I've used the code to avoid interruption from sleep mode. It's working, but now I have a problem with interruptions when a call or SMS comes in to the iPhone. The song in my music application stops at that point. I want to automatically resume the song where it stopped once the call is over.
- (void) audioPlayerBeginInterruption: (AVAudioPlayer *) player {
    if (playing) {
        playing = NO;
        interruptedOnPlayback = YES;
        [self updateViewForPlayerState];
    }
}

- (void) audioPlayerEndInterruption: (AVAudioPlayer *) player {
    if (interruptedOnPlayback) {
        [player prepareToPlay];
        [player play];
        playing = YES;
        interruptedOnPlayback = NO;
    }
}
But it doesn't work. Please help me here; thanks in advance.
Try Apple's guide to Audio Interruptions.
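If the AVAudioPlayer delegate callbacks aren't firing reliably, a hedged alternative (a sketch assuming your player is reachable as self.player) is to observe AVAudioSessionInterruptionNotification and resume when the interruption ends with the should-resume option set:

// Sketch: resume playback when a call/SMS interruption ends and the system says resuming is appropriate.
[[NSNotificationCenter defaultCenter] addObserverForName:AVAudioSessionInterruptionNotification
                                                  object:[AVAudioSession sharedInstance]
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSUInteger type = [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    NSUInteger options = [note.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeEnded &&
        (options & AVAudioSessionInterruptionOptionShouldResume)) {
        [self.player play]; // self.player is an assumption about how the app holds its AVAudioPlayer
    }
}];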