I'm making a simple music player app that plays audio using AVAudioEngine. When I pause the AVAudioPlayerNode, the play/pause control in MPNowPlayingInfoCenter/MPRemoteCommandCenter does not update. How do I update it?
I don't want to pause the entire AVAudioEngine and then restart it just to update the MPNowPlayingInfoCenter/MPRemoteCommandCenter play/pause control, since that stalls my UI and adds a delay before audio plays back again.
Does anyone have an idea or a solution for this? The AVAudioEngine documentation is sparse and good information on it is hard to find.
func pause() {
    playerNode.pause()
    audioEngine.pause()
    displayLink?.isPaused = true
    DispatchQueue.main.async {
        self.musicPlayerControlsManager.isPlaying = false
    }
}
func playPlayerNode() {
    if !audioEngine.attachedNodes.isEmpty && audioFile != nil {
        DispatchQueue.main.async {
            self.musicPlayerControlsManager.isPlaying = true
            self.displayLink?.isPaused = false
        }
        do {
            try audioEngine.start()
            playerNode.play()
        } catch {
            print(error.localizedDescription)
        }
    }
}
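For reference, the lock-screen play/pause state is generally driven by MPNowPlayingInfoPropertyPlaybackRate in the now-playing dictionary rather than by the engine itself, so it can be refreshed without stopping the engine. A minimal sketch (the MediaPlayer keys are standard API; the function name and where it gets called from are assumptions):

import MediaPlayer

func updateNowPlayingState(isPlaying: Bool, elapsedSeconds: Double) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    // A rate of 0 renders the lock-screen control as paused; 1 as playing.
    info[MPNowPlayingInfoPropertyPlaybackRate] = isPlaying ? 1.0 : 0.0
    // Re-anchor the elapsed time so the system doesn't keep advancing the scrubber.
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsedSeconds
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}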
Related
I have an AVQueuePlayer that gets songs from Firebase Storage via their URLs and plays them in sequence.
static func playQueue() {
    for song in songs {
        guard let url = song.url else { return }
        lofiSongs.append(AVPlayerItem(url: url))
    }
    if queuePlayer == nil {
        queuePlayer = AVQueuePlayer(items: lofiSongs)
    } else {
        queuePlayer?.removeAllItems()
        lofiSongs.forEach { queuePlayer?.insert($0, after: nil) }
    }
    queuePlayer?.seek(to: .zero) // In case we added items back in
    queuePlayer?.play()
}
And this works great.
I can also make the lock screen controls appear and use the play/pause button like this:
private static func setRemoteControlActions() {
    let commandCenter = MPRemoteCommandCenter.shared()
    // Add handler for Play Command
    commandCenter.playCommand.addTarget { [self] event in
        queuePlayer?.play()
        return .success
    }
    // Add handler for Pause Command
    commandCenter.pauseCommand.addTarget { [self] event in
        if queuePlayer?.rate == 1.0 {
            queuePlayer?.pause()
            return .success
        }
        return .commandFailed
    }
}
The problem comes with setting the metadata of the player (name, image, etc).
I know it can be done once by setting MPMediaItemPropertyTitle and MPMediaItemArtwork, but how would I change it when the next track loads?
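One possible approach (a sketch, not verified against this exact setup): AVQueuePlayer's currentItem is key-value observable, so the now-playing info can be refreshed whenever the queue advances. The title lookup through the songs array is an assumption here:

import AVFoundation
import MediaPlayer

static var currentItemObservation: NSKeyValueObservation?

static func observeCurrentItem() {
    currentItemObservation = queuePlayer?.observe(\.currentItem, options: [.new]) { player, _ in
        guard let asset = player.currentItem?.asset as? AVURLAsset else { return }
        var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
        // Hypothetical lookup: match the item back to our own model by URL.
        if let song = songs.first(where: { $0.url == asset.url }) {
            info[MPMediaItemPropertyTitle] = song.title
        }
        MPNowPlayingInfoCenter.default().nowPlayingInfo = info
    }
}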
I'm not sure if my approach works for AVQueuePlayer, but for playing live streams with AVPlayer you can "listen" for metadata as it is received.
extension ViewController: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) {
        // look for metadata in groups
    }
}
I added the AVPlayerItemMetadataOutputPushDelegate via an extension to my ViewController.
I also found this post.
I hope this gives you a lead to a solution. As I said, I'm not sure how this works with AVQueuePlayer.
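For completeness, a sketch of how the metadata output gets attached; the calls are standard AVFoundation, while playerItem stands in for whichever AVPlayerItem you are adding to the queue and the main-queue delegate choice is an assumption:

import AVFoundation

// Attach a metadata output to a player item before (or while) it plays.
let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
metadataOutput.setDelegate(self, queue: DispatchQueue.main)
playerItem.add(metadataOutput)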
I have an app that allows users to playback audio while recording a video. They can only record in landscape.
This is how I've set up the playback of audio during a video session:
guard allowBackgroundAudio == true else {
    return
}
guard audioEnabled == true else {
    return
}

do {
    if #available(iOS 10.0, *) {
        try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.mixWithOthers, .defaultToSpeaker])
    } else {
        let options: [AVAudioSession.CategoryOptions] = [.mixWithOthers, .allowBluetooth]
        let category = AVAudioSession.Category.playAndRecord
        let selector = NSSelectorFromString("setCategory:withOptions:error:")
        AVAudioSession.sharedInstance().perform(selector, with: category, with: options)
    }
    try AVAudioSession.sharedInstance().setActive(true)
    session.automaticallyConfiguresApplicationAudioSession = false
} catch {
    print("[SwiftyCam]: Failed to set background audio preference")
}
The problem is that audio still plays slightly out of the receiver, which means the top mic picks up the playback and drowns out the user's audio.
After reading on here that the receiver still playing might be a bug (or a feature) from Apple, I decided to use the back mic for the selfie camera, thus splitting the audio away from the mic. However, I can't seem to get the selfie camera to use the back mic.
public class func setMicrophone(_ uiorient: String) {
    guard let inputs = AVAudioSession.sharedInstance().availableInputs else {
        return
    }
    for input in inputs {
        print(input.dataSources ?? "??")
    }
    // set preferred:
    let preferredPort = inputs[0]
    if let dataSources = preferredPort.dataSources {
        for source in dataSources {
            if source.dataSourceName == uiorient {
                do {
                    try preferredPort.setPreferredDataSource(source)
                } catch {
                    print("Cannot set \(uiorient) microphone.")
                }
            }
        }
    }
}
and then call this when switching to the selfie camera:
AudioRecorderViewController.setMicrophone("Back")
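A hedged sketch of a more targeted selection, assuming the goal is the built-in mic's back-facing data source rather than whatever happens to be inputs[0]; the function name is made up:

import AVFoundation

func selectBackBuiltInMic() {
    let session = AVAudioSession.sharedInstance()
    // Find the built-in microphone port explicitly instead of taking inputs[0].
    guard let builtInMic = session.availableInputs?
        .first(where: { $0.portType == .builtInMic }) else { return }
    do {
        try session.setPreferredInput(builtInMic)
        // Prefer the data source that faces the back of the device, if there is one.
        if let back = builtInMic.dataSources?
            .first(where: { $0.orientation == .back }) {
            try builtInMic.setPreferredDataSource(back)
        }
    } catch {
        print("Could not select back microphone: \(error)")
    }
}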
I'm scheduling a buffer periodically using the scheduleBuffer function. The completion handler schedules the next sound when the current one is done, like so:
func scheduleSounds() {
    if !isPlaying { return }
    while beatsScheduled < beatsToScheduleAhead {
        // Returns AVAudioTime of the next sound, based on sample time.
        // The first sound is always sample time 0, then increments.
        let nextTime = getNextTime()
        player.scheduleBuffer(
            soundBuffer,
            at: nextTime,
            options: AVAudioPlayerNodeBufferOptions(rawValue: 0),
            completionHandler: {
                self.queue!.sync {
                    print("Finished playing a sound!")
                    self.scheduleSounds()
                }
            }
        )
        beatsScheduled += 1
        if !playerStarted {
            player.play(at: nil)
            playerStarted = true
        }
    }
}
When I stop and then start the player/engine, there is a delay before the sound starts playing, and after logging which sounds play when, I noticed that there are a couple of sounds that the engine does not play. My suspicion is that the player node thinks the sample time has already passed relative to the player node's time. I think I might not be resetting or stopping something properly; here's how I start and stop the sounds:
func start() {
    do {
        try engine.start()
        isPlaying = true
        queue!.sync {
            self.scheduleSounds()
        }
    } catch {
        print("\(error)")
    }
}

func stop() {
    isPlaying = false
    player.stop()
    engine.stop()
    playerStarted = false
}
So the problem is: how do I make sure AVAudioPlayerNode doesn't skip any of the scheduled sounds?
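One thing worth checking (my reading of the behaviour, not verified against the code above): AVAudioPlayerNode's sample timeline restarts after stop(), so if getNextTime() keeps counting from where it left off, the first few buffers are scheduled at times that have already passed and get skipped. A sketch of a restart that resets the bookkeeping, with nextBeatSampleTime standing in for whatever counter getNextTime() uses:

func restart() {
    // stop() clears any scheduled buffers and resets the node's sample timeline,
    // so the scheduling bookkeeping has to be reset along with it.
    player.stop()
    playerStarted = false
    beatsScheduled = 0
    nextBeatSampleTime = 0    // hypothetical counter behind getNextTime()
    do {
        try engine.start()
        isPlaying = true
        queue!.sync {
            self.scheduleSounds()
        }
    } catch {
        print("\(error)")
    }
}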
I'm building an app where I want to play music from the local library AND use an AVQueuePlayer to play a list of tracks, with a break between tracks once in a while. The music works totally fine. Since I want everything to work in the background, my only option for a break in the AVQueuePlayer is to play a silent AVPlayerItem (an empty audio file). I want the music to play normally while the silent AVPlayerItems are playing, which I achieved by setting the AVAudioSession category to .playback with the .mixWithOthers option, and when a regular (non-silent) track is played by the AVQueuePlayer, I want the music to be dimmed a little.
I've tried changing the audio session like this: AVAudioSession.sharedInstance().setCategory(.playback, options: [.mixWithOthers, .duckOthers])
but it doesn't change anything. When I check whether it has changed like this:
if AVAudioSession.sharedInstance().categoryOptions == .mixWithOthers {
    print("Music mixing with silence")
}
the session category options seem to have changed, but it doesn't affect the audio in the end.
@objc func playerDidFinishPlaying() {
    print("Player finished!")
    guard let queuePlayer = queuePlayer else { return }
    if let index = queuePlayer.items().lastIndex(where: { (playerItem) -> Bool in
        return playerItem == queuePlayer.currentItem
    }) {
        if let nextItem = queuePlayer.items()[index + 1].asset as? AVURLAsset {
            if nextItem.url.absoluteString != oneSecondSilenceUrl?.absoluteString {
                // When the narrator is speaking
                try? AVAudioSession.sharedInstance().setCategory(.playback, options: [.mixWithOthers, .duckOthers])
            } else {
                // When there's silence and only music should be playing
                try? AVAudioSession.sharedInstance().setCategory(.playback, options: .mixWithOthers)
            }
        }
    }
    if AVAudioSession.sharedInstance().categoryOptions == .mixWithOthers {
        print("Music mixing with silence")
    }
    try? AVAudioSession.sharedInstance().setActive(true)
}
This method runs whenever the AVQueuePlayer has finished playing one of its items. I use it to check whether the NEXT item is a silent track or not. If it's a silent track, I want .mixWithOthers; if it's a track that is NOT silent and actually plays audio, I want the .duckOthers option. :)
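One thing to watch with the check above: categoryOptions is an OptionSet, so == .mixWithOthers only matches when mixWithOthers is the sole option set. A containment check tells you whether it is present at all:

if AVAudioSession.sharedInstance().categoryOptions.contains(.mixWithOthers) {
    print("Music mixing with silence")
}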
Any help, response or answer is very much appreciated! :)
Switching to main thread seemed to fix the problem. Weird.
DispatchQueue.main.async {
    if AVAudioSession.sharedInstance().categoryOptions != .duckOthers {
        try? AVAudioSession.sharedInstance().setActive(false)
        try? AVAudioSession.sharedInstance().setCategory(.playback, options: [.duckOthers])
    }
}
I'm trying to use Apple's AVMIDIPlayer object for playing a MIDI file. It seems easy enough in Swift, using the following code:
let midiFile: NSURL = NSURL(fileURLWithPath: "/path/to/midifile.mid")
var midiPlayer: AVMIDIPlayer?

do {
    try midiPlayer = AVMIDIPlayer(contentsOf: midiFile as URL, soundBankURL: nil)
    midiPlayer?.prepareToPlay()
} catch {
    print("could not create MIDI player")
}

midiPlayer?.play {
    print("finished playing")
}
And it plays for about 0.05 seconds. I presume I need to frame it in some kind of loop. I've tried a simple solution:
while stillGoing {
    midiPlayer?.play {
        let stillGoing = false
    }
}
which works, but ramps up the CPU massively. Is there a better way?
Further to the first comment, I've tried making a class, and while it doesn't flag any errors, it doesn't work either.
class midiPlayer {
    var player: AVMIDIPlayer?

    func play(file: String) {
        let myURL = URL(string: file)
        do {
            try self.player = AVMIDIPlayer.init(contentsOf: myURL!, soundBankURL: nil)
            self.player?.prepareToPlay()
        } catch {
            print("could not create MIDI player")
        }
        self.player?.play()
    }

    func stop() {
        self.player?.stop()
    }
}
// main
let myPlayer = midiPlayer()
let midiFile = "/path/to/midifile.mid"
myPlayer.play(file: midiFile)
You were close with your loop. You just need to give the CPU time to go off and do other things instead of constantly checking to see if midiPlayer is finished yet. Add a call to usleep() in your loop. This one checks every tenth of a second:
let midiFile: NSURL = NSURL(fileURLWithPath: "/Users/steve/Desktop/Untitled.mid")
var midiPlayer: AVMIDIPlayer?

do {
    try midiPlayer = AVMIDIPlayer(contentsOf: midiFile as URL, soundBankURL: nil)
    midiPlayer?.prepareToPlay()
} catch {
    print("could not create MIDI player")
}

var stillGoing = true
while stillGoing {
    midiPlayer?.play {
        print("finished playing")
        stillGoing = false
    }
    usleep(100000)
}
You need to ensure that the midiPlayer object exists until it's done playing. If the above code is just in a single function, midiPlayer will be destroyed when the function returns because there are no remaining references to it. Typically you would declare midiPlayer as a property of an object, like a subclassed controller.
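A sketch of that property-based approach, relying on the completion handler instead of polling; the class and method names here are made up:

import AVFoundation

final class MIDIController {
    private var midiPlayer: AVMIDIPlayer?   // strong reference keeps the player alive while it plays

    func play(fileAt url: URL) {
        do {
            midiPlayer = try AVMIDIPlayer(contentsOf: url, soundBankURL: nil)
            midiPlayer?.prepareToPlay()
            midiPlayer?.play { [weak self] in
                print("finished playing")
                self?.midiPlayer = nil   // release once playback is done
            }
        } catch {
            print("could not create MIDI player: \(error)")
        }
    }
}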
Combining Brendan and Steve's answers, the key is sleep or usleep and sticking the play method outside the loop to avoid revving the CPU.
player?.play({ return })
while player!.isPlaying {
    sleep(1) // or usleep(10000)
}
The original stillGoing value works, but there is also an isPlaying property.
.play needs something between its parentheses (even an empty closure) to avoid hanging forever after completion.
Many thanks.