I have an iOS application that can play music and video. You can cast music or video from the app using the Chromecast SDK. I've been reading the documentation about playing in the background. With this code:
let criteria = GCKDiscoveryCriteria(applicationID: chromeCastReceiverAppID)
let options = GCKCastOptions(discoveryCriteria: criteria)
options.physicalVolumeButtonsWillControlDeviceVolume = true
// Keep Cast sessions alive while the app is in the background
options.suspendSessionsWhenBackgrounded = false
GCKCastContext.setSharedInstanceWith(options)
I can keep controlling my music while casting with the app in the background, which is great. When casting a video, however, the session is lost while in the background, and when the app resumes the session doesn't resume automatically as stated in the documentation. It does work if I set suspendSessionsWhenBackgrounded = true, but then the session is suspended while casting music and I can't switch to the next song.
Is there a way to switch that option based on what type of content I'm casting?
I would like to display info about, and skip, the current song being played on the iPhone. I understand MPMusicPlayerController can do this for Apple Music. However, I would like to display and skip the current song if it is being played by a third-party app like Spotify or Audible.
let audioPlayer = MPMusicPlayerController.systemMusicPlayer

// next song
audioPlayer.skipToNextItem()

// previous song
audioPlayer.skipToPreviousItem()
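For the display part, the same controller exposes the current item's metadata. Be aware, though, that systemMusicPlayer talks to the built-in Music app; as far as I know it won't report or skip what a third-party app like Spotify is playing. A minimal sketch for the Music-app case:

// Title/artist of the Music app's current item (nil for third-party playback)
let title = audioPlayer.nowPlayingItem?.title
let artist = audioPlayer.nowPlayingItem?.artist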
When using AVCaptureSession to connect an AVCaptureDevice, the device gets locked by the application and cannot be released. The only way the device is truly released is by restarting the app.
We are using AVCaptureSession for previewing the camera and for audio level meters. But once we start the actual capture we might need to switch to an alternative capture SDK (in this case the DeckLink SDK). However, the device remains locked by AVFoundation and we can't seem to free it in any way. It all goes sideways as soon as we call AVCaptureSession.addInput with the input created from the AVCaptureDevice, and simply iterating the inputs and calling AVCaptureSession.removeInput does not seem to work.
We set up the session like this:
do {
    // Lock the devices while we (re)configure the session
    try self.selectedVideoDevice.lockForConfiguration()
    try self.selectedAudioDevice?.lockForConfiguration()

    self.cameraSession = AVCaptureSession()
    self.cameraSession?.beginConfiguration()
    self.cameraSession?.sessionPreset = AVCaptureSession.Preset.high

    // Add some outputs... not relevant for the issue at hand?!

    // Add audio input
    if let audioDevice = self.selectedAudioDevice {
        let deviceInputAudio = try AVCaptureDeviceInput(device: audioDevice)
        if self.cameraSession?.canAddInput(deviceInputAudio) ?? false {
            self.cameraSession?.addInput(deviceInputAudio)
        }
    }

    // Add video input
    let deviceInputVideo = try AVCaptureDeviceInput(device: self.selectedVideoDevice)
    if self.cameraSession?.canAddInput(deviceInputVideo) ?? false {
        self.cameraSession?.addInput(deviceInputVideo)
    }

    self.cameraSession?.commitConfiguration()
    self.cameraSession?.startRunning()

    self.selectedVideoDevice.unlockForConfiguration()
    self.selectedAudioDevice?.unlockForConfiguration()
} catch {
    print("Capture session setup failed: \(error)")
}
And we try to release it using something like this (one of our many attempts):
self.cameraSession?.stopRunning()

for output in self.cameraSession?.outputs ?? [] {
    self.cameraSession?.removeOutput(output)
}
for input in self.cameraSession?.inputs ?? [] {
    self.cameraSession?.removeInput(input)
}

self.cameraSession = nil
However, we can't get the device to be recognized by the DeckLink SDK after it has been used in AVFoundation.
Any ideas would be great, as cleaning up and setting the variables to nil doesn't seem to do anything...
We chose to implement the Desktop Video SDK from Blackmagic and to do all capturing from Blackmagic devices with it. This also solved other capture issues we had with the Blackmagic Mini Recorder (for example, audio sync): AVFoundation does somehow work with Blackmagic hardware, but not really well, and Blackmagic never officially answered the question "Do you support AVFoundation?". So to make this work I would recommend the Desktop Video SDK, which can be downloaded from their site under Support.
https://www.blackmagicdesign.com/support/
Also make sure you never load the video device into your AVFoundation workflow: AVFoundation will grab the device and hold onto it. So first check whether a device is a Blackmagic device, and only pass it on to AVFoundation if it is not.
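A minimal sketch of that check (assuming the device's manufacturer or name contains "Blackmagic"; verify the exact strings against your hardware):

import AVFoundation

// Hypothetical guard: keep Blackmagic devices out of AVFoundation entirely
func isBlackmagicDevice(_ device: AVCaptureDevice) -> Bool {
    // `manufacturer` is available on macOS; the name check is a fallback
    return device.manufacturer.contains("Blackmagic")
        || device.localizedName.contains("Blackmagic")
}

// Route Blackmagic devices to the Desktop Video / DeckLink path instead
if isBlackmagicDevice(selectedVideoDevice) {
    // capture via the Desktop Video SDK
} else {
    // safe to hand to AVCaptureSession
}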
I am trying to detect which music from another app is playing in the background.
I know there is a way to detect whether music is playing in the background, but I would like to detect which song it is, and the playback time if possible. How do I do that, in Swift or Objective-C?
You can use the MediaPlayer framework to get the music currently playing in the Music app:
import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer

// Title of the track currently playing in the Music app
let currentSongTitle = player.nowPlayingItem?.title
// Current playback position, in seconds
let currentPlaybackTime = player.currentPlaybackTime
Note that you need to set NSAppleMusicUsageDescription (aka Privacy - Media Library Usage Description) in your Info.plist to the string you want to present to the user when asking for permission.
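It is also safest to request that permission explicitly before reading the item; a minimal sketch:

import MediaPlayer

// Ask for Media Library access, then read what the Music app is playing
MPMediaLibrary.requestAuthorization { status in
    guard status == .authorized else { return }
    let player = MPMusicPlayerController.systemMusicPlayer
    if let item = player.nowPlayingItem {
        print("Now playing: \(item.title ?? "unknown") at \(player.currentPlaybackTime)s")
    }
}

Keep in mind this only sees the Music app's player; songs playing in other apps are not exposed this way.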
Can the Apple Watch use AVFoundation? More specifically, can AVAudioPlayer and AVAudioRecorder work?
I am trying to make an app that lets you record a sound on the Apple Watch and play it back using the audio player. Thanks
UPDATE 11/28/15
This answer applies to WatchKit and watchOS 1.x. Though I have not tested it myself, watchOS 2.x offers different support, including hardware access. I can confirm that as of Xcode 7.x, the compiler behaves correctly and WatchKit 1.x extensions with calls to AVFoundation won't build.
For watchOS 1.x and Xcode up to 6.x
Despite the fact that AVAudioRecorder and AVAudioPlayer both work in the Apple Watch simulator, they don't actually work on a real device! I opened a bug with Apple back on 5/5/15 (once I got to test an app I had written on my actual watch). My app (a wrist audio recorder) indeed worked beautifully in the simulator. Yet on the actual watch it would install and run, but the AVAudioRecorder "record" message would simply never toggle into "recording". Interestingly, the code does not throw any exceptions anywhere. It simply does not record!
I received a reply to my Apple bug today, 5/28/15, saying "there are no plans to address this" because "AVFoundation isn't supported on the watch." No word on whether the simulator will be updated so that AVFoundation fails to work there as well.
So, for now, the Apple Watch is limited to controlling recording "on the phone" via watch-extension-to-phone messaging, which is supported in Apple Watch extensions.
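In WatchKit 1.x that phone-side control goes through openParentApplication; a minimal sketch (the "startRecording" action name is made up):

// Ask the parent iPhone app to start recording on the watch's behalf
WKInterfaceController.openParentApplication(["action": "startRecording"]) { reply, error in
    // The phone answers from application(_:handleWatchKitExtensionRequest:reply:)
}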
Try using WKAudioFilePlayerItem?
To use this class to play audio, you need to declare 3 variables:
var audioFile: WKAudioFileAsset!
var audioItem: WKAudioFilePlayerItem!
var audioPlayer: WKAudioFilePlayer!
They each play a different role in the program. audioFile wraps the file at a given NSURL. For example, if you have a file called "bicycle.mp3", you can get its path like this:
let soundPath = NSBundle.mainBundle().pathForResource("bicycle", ofType: "mp3")
print("load file")
When you have a soundPath, turn it into an NSURL and wrap that in the audioFile:
let soundPathURL = NSURL.fileURLWithPath(soundPath!)
audioFile = WKAudioFileAsset.init(URL: soundPathURL)
When you have an audioFile, you put it inside the audioItem:
audioItem = WKAudioFilePlayerItem.init(asset: audioFile)
When you have an audioItem, you put it inside the audioPlayer:
audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
Finally, you can call audioPlayer.play() wherever you want to play the sound.
Enjoy!
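Put together, it looks roughly like this (same Swift 2-era API as the snippets above; note the item may need a moment to become ready before play() is audible):

if let soundPath = NSBundle.mainBundle().pathForResource("bicycle", ofType: "mp3") {
    let soundPathURL = NSURL.fileURLWithPath(soundPath)
    let audioFile = WKAudioFileAsset(URL: soundPathURL)
    let audioItem = WKAudioFilePlayerItem(asset: audioFile)
    let audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
    audioPlayer.play()
}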
Short answer? No. WatchKit doesn't currently offer access to any of the hardware in the Watch.
I finally ended up with this:
@IBAction func playButtonTapped() {
    let options = [WKMediaPlayerControllerOptionsAutoplayKey: "true"]
    let filePath = NSBundle.mainBundle().pathForResource("1button", ofType: "wav")!
    let fileUrl = NSURL.fileURLWithPath(filePath)
    presentMediaPlayerControllerWithURL(fileUrl, options: options,
        completion: { didPlayToEnd, endTime, error in
            if let err = error {
                print(err.description)
            }
        })
}