Creating a Music Instance in every View Controller - Swift

I'm creating an app that uses a lot of SFX and background music, but I can't find the best way to share this kind of data across view controllers. Do I have to initialize my audio in every view controller? And what if I want to stop music that was started in a previous VC?
This is the code that i'm using:
do {
    // Music BG
    let resourcePath = NSBundle.mainBundle().pathForResource("MusicaBg", ofType: "wav")!
    let url = NSURL(fileURLWithPath: resourcePath)
    try musicPlayer = AVAudioPlayer(contentsOfURL: url)

    // SFX for button
    let resourcePath2 = NSBundle.mainBundle().pathForResource("botaoApertado", ofType: "wav")!
    let url2 = NSURL(fileURLWithPath: resourcePath2)
    try botaoApertado = AVAudioPlayer(contentsOfURL: url2)
} catch let err as NSError {
    print(err.debugDescription)
}
What's the best way to do that?

You're probably looking for the Singleton pattern since you need a single canonical instance of background music, that any ViewController can interact with.
Then any time you need to change the music you simply call the corresponding method on e.g. AudioManager.sharedInstance from anywhere, and as you move through the app the music keeps playing.
You would probably want to start the music in your AppDelegate or FirstViewController.
For example, with the code you've given, you might want something like
class AudioManager {

    static let sharedInstance = AudioManager()

    var musicPlayer: AVAudioPlayer?
    var botaoApertado: AVAudioPlayer?

    private init() { }

    func startMusic() {
        do {
            // Music BG
            let resourcePath = NSBundle.mainBundle().pathForResource("MusicaBg", ofType: "wav")!
            let url = NSURL(fileURLWithPath: resourcePath)
            try musicPlayer = AVAudioPlayer(contentsOfURL: url)

            // SFX for button
            let resourcePath2 = NSBundle.mainBundle().pathForResource("botaoApertado", ofType: "wav")!
            let url2 = NSURL(fileURLWithPath: resourcePath2)
            try botaoApertado = AVAudioPlayer(contentsOfURL: url2)
        } catch let err as NSError {
            print(err.debugDescription)
        }
    }

    func stopMusic() {
        // implementation, e.g. musicPlayer?.stop()
    }
}
As soon as you write AudioManager.sharedInstance.startMusic(), the sharedInstance property will be initialized (only once, since it's a static property) and then startMusic() will be called on it.
If you later call AudioManager.sharedInstance.stopMusic(), it uses the same sharedInstance you initialized previously and stops the music.
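For instance, here is a minimal sketch of how it might be wired up (Swift 2 signatures to match the code above; the muteTapped action name is just illustrative):
// AppDelegate.swift
func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
    // Kick off the background music once, for the whole app.
    AudioManager.sharedInstance.startMusic()
    return true
}

// Any view controller, e.g. a mute button action
@IBAction func muteTapped(sender: AnyObject) {
    AudioManager.sharedInstance.stopMusic()
}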
Post any questions you have in the comments.

Related

How to play two audio files at once with Swift using AudioKit

I have two short wav audio files that I'm trying to play at the same time. Using AudioKit, I have an AudioEngine(), and I'm assuming I should use a MultiSegmentAudioPlayer() as the output along with scheduleSegments().
Here is what I have:
class AudioPlayClass: ObservableObject {

    var player = MultiSegmentAudioPlayer()
    let engine = AudioEngine()

    init() {}

    func playFiles() {
        self.engine.output = player
        do {
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: .defaultToSpeaker)
            try AVAudioSession.sharedInstance().setActive(true)
            try engine.start()

            guard let url = Bundle.main.url(forResource: note1, withExtension: "wav", subdirectory: instrumentDirectory) else { return }
            guard let url2 = Bundle.main.url(forResource: note2, withExtension: "wav", subdirectory: instrumentDirectory) else { return }

            let audioFile = try AVAudioFile(forReading: url)
            let audioFile2 = try AVAudioFile(forReading: url2)

            let fileSampleRate = audioFile.processingFormat.sampleRate
            let file2SampleRate = audioFile2.processingFormat.sampleRate
            let fileNumberOfSamples = audioFile.length
            let file2NumberOfSamples = audioFile2.length
            let audioFileEndTime = Double(fileNumberOfSamples) / fileSampleRate
            let audioFile2EndTime = Double(file2NumberOfSamples) / file2SampleRate

            let segment1 = segment(audioFile: audioFile,
                                   playbackStartTime: 0.0, fileStartTime: 0, fileEndTime: audioFileEndTime)
            let segment2 = segment(audioFile: audioFile2,
                                   playbackStartTime: 0.0, fileStartTime: 0, fileEndTime: audioFile2EndTime)

            player.scheduleSegments(audioSegments: [segment1, segment2])
            player.play()
        } catch {
            print(error.localizedDescription.debugDescription)
        }
    }
}

public struct segment: StreamableAudioSegment {
    public var audioFile: AVAudioFile
    public var playbackStartTime: TimeInterval
    public var fileStartTime: TimeInterval
    public var fileEndTime: TimeInterval
    public var completionHandler: AVAudioNodeCompletionHandler?
}
I just have a basic understanding of playing audio in Swift and using AudioKit so any feedback would be greatly appreciated. Thanks!
I think MultiSegmentAudioPlayer is mostly for playing sounds sequentially. You probably just want two AudioPlayer() instances and to play them both at the same time.
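For reference, here is a minimal sketch of that single-engine, two-AudioPlayer approach, assuming AudioKit 5's AudioPlayer, Mixer and AudioEngine (note1, note2 and instrumentDirectory are the names from the question; double-check the exact AudioPlayer load/init signatures against the AudioKit version you're on):
import AudioKit
import AVFoundation

class TwoFilePlayer: ObservableObject {
    let engine = AudioEngine()
    let player1 = AudioPlayer()
    let player2 = AudioPlayer()

    init() {
        // Mix both players into a single output node so one engine drives both.
        engine.output = Mixer(player1, player2)
    }

    func playFiles() {
        do {
            guard let url1 = Bundle.main.url(forResource: note1, withExtension: "wav", subdirectory: instrumentDirectory),
                  let url2 = Bundle.main.url(forResource: note2, withExtension: "wav", subdirectory: instrumentDirectory) else { return }
            try player1.load(url: url1)
            try player2.load(url: url2)
            try engine.start()
            // Starting them back to back plays them effectively simultaneously.
            player1.play()
            player2.play()
        } catch {
            print(error)
        }
    }
}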
Solved this by creating two separate AudioEngine() instances, each with its own AudioPlayer(), then loading and playing them immediately one after the other.

How to fix 'appendingPathComponent' is unavailable: Use appendingPathComponent on URL error

I'm working on an old Swift tutorial (Swift 2.0) posted on Ray Wenderlich's web site (https://www.raywenderlich.com/2185-how-to-make-a-letter-word-game-with-uikit-and-swift-part-3-3) and I'm running into an error when I try to re-implement a function called "preloadAudioEffects" in Swift 4.2. The error: 'appendingPathComponent' is unavailable: Use appendingPathComponent on URL instead.
I've tried to rename the old Swift code (e.g. NSBundle to Bundle, stringByAppendingPathComponent to appendingPathComponent()), but I'm still running into some syntax issues due to my inexperience with Swift.
This is the original code:
func preloadAudioEffects(effectFileNames: [String]) {
    for effect in AudioEffectFiles {
        //1 get the file path URL
        let soundPath = NSBundle.mainBundle().resourcePath!.stringByAppendingPathComponent(effect)
        let soundURL = NSURL.fileURLWithPath(soundPath)

        //2 load the file contents
        var loadError: NSError?
        let player = AVAudioPlayer(contentsOfURL: soundURL, error: &loadError)
        assert(loadError == nil, "Load sound failed")

        //3 prepare the play
        player.numberOfLoops = 0
        player.prepareToPlay()

        //4 add to the audio dictionary
        audio[effect] = player
    }
}
And this is what I've tried, following the suggestions in Xcode:
func preloadAudioEffects(effectFileNames: [String]) {
    for effect in AudioEffectFiles {
        //1 get the file path URL
        let soundPath = Bundle.main.resourcePath!.appendingPathComponent(effect)
        let soundURL = NSURL.fileURL(withPath: soundPath)

        //2 load the file contents
        var loadError: NSError?
        let player = AVAudioPlayer(contentsOfURL: soundURL, error: &loadError)
        assert(loadError == nil, "Load sound failed")

        //3 prepare the play
        player.numberOfLoops = 0
        player.prepareToPlay()

        //4 add to the audio dictionary
        audio[effect] = player
    }
}
Get the full path to the sound file and convert it to a URL by using NSURL.fileURLWithPath().
Call AVAudioPlayer(contentsOfURL:error:) to load a sound file in an audio player.
Set the numberOfLoops to zero so that the sound won’t loop at all. Call prepareToPlay() to preload the audio buffer for that sound.
Finally, save the player object in the audio dictionary, using the name of the file as the dictionary key.
Just replace resourcePath with resourceURL
let soundURL = Bundle.main.resourceURL!.appendingPathComponent(effect)
and you have to wrap the AVAudioPlayer initializer in a do-catch block with try:
func preloadAudioEffects(effectFileNames: [String]) {
    for effect in AudioEffectFiles {
        let soundURL = Bundle.main.resourceURL!.appendingPathComponent(effect)

        //2 load the file contents
        do {
            let player = try AVAudioPlayer(contentsOf: soundURL)

            //3 prepare the play
            player.numberOfLoops = 0
            player.prepareToPlay()

            //4 add to the audio dictionary
            audio[effect] = player
        } catch { print(error) }
    }
}
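If you'd rather avoid force-unwrapping resourceURL, a small variant (assuming the names in AudioEffectFiles include their extension, e.g. "sound.wav") lets Bundle look each file up directly and simply skips anything it can't find:
func preloadAudioEffects(effectFileNames: [String]) {
    for effect in AudioEffectFiles {
        // Passing nil for the extension means the name may already contain it.
        guard let soundURL = Bundle.main.url(forResource: effect, withExtension: nil) else { continue }
        do {
            let player = try AVAudioPlayer(contentsOf: soundURL)
            player.numberOfLoops = 0
            player.prepareToPlay()
            audio[effect] = player
        } catch { print(error) }
    }
}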

Two simultaneous AVAudioSessions

I have an app that uses WebRTC for voice chat.
After a lot of troubleshooting I came to the conclusion that whenever I receive a stream, WebRTC sets the AVAudioSession shared instance to the following:
try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .voiceChat, options: .allowBluetooth)
I want to play an audio file (about a second long), and if I simply load it and call player.play(), it won't play. If I try to set the category to .playback or anything else that would make sense for an audio file, the stream's audio stops playing.
I also tried to create a new AVAudioSession instance, but once the audio file plays, the stream's audio stops.
Is there a way for me to get the audio file to play without affecting the stream's audio and its configuration?
Here's the full code
private func playTapSound() {
    guard let url = Bundle.main.url(forResource: App.AudioFile.emojiTapSound, withExtension: "mp3") else { return }
    do {
        let audioSession = AVAudioSession()
        try audioSession.setCategory(.playback, mode: .default)
        try audioSession.setActive(true)
        audioPlayer = try AVAudioPlayer(contentsOf: url, fileTypeHint: AVFileType.mp3.rawValue)
        guard let player = audioPlayer else { return }
        player.play()
    } catch let error {
        LogManager.shared.print("Error - playing emoji tap audio -", error.localizedDescription, type: [.Error, .Audio])
    }

    // var chime1URL: NSURL?
    // var chime1ID: SystemSoundID = 0
    // let filePath = Bundle.main.path(forResource: "emojiTapSound", ofType: "wav")
    // chime1URL = NSURL(fileURLWithPath: filePath!)
    // AudioServicesCreateSystemSoundID(chime1URL!, &chime1ID)
    // AudioServicesPlaySystemSound(0)
}
I also tried using AudioToolbox, but for some reason it only played on the simulator. I verified that it's not my phone's settings by trying one of Apple's preloaded sounds, and that worked.

Play a sound upon collecting coin in Swift SKAction

I have a game in Swift where a bird collects eggs. I want to play a coin sound when the bird touches an egg. I tried the code below, but it doesn't play on contact. How can I accomplish this?
do {
    // Preparation
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
} catch _ {
}

do {
    try AVAudioSession.sharedInstance().setActive(true)
} catch _ {
}

// Play the sound
do {
    audioPlayer = try AVAudioPlayer(contentsOfURL: alertSound)
} catch _ {
}

audioPlayer.prepareToPlay()
audioPlayer.play()
Create a path to the sound file:
let path = NSBundle.mainBundle().pathForResource("(sound name)", ofType: (type))!
URL for the path:
var url = NSURL()
Variable to hold the sound:
var soundEffect: AVAudioPlayer!
Then, when you want to play the sound:
url = NSURL(fileURLWithPath: path)
do {
    let sound = try AVAudioPlayer(contentsOfURL: url)
    soundEffect = sound
    sound.play()
} catch {
    // handle the error
}
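Since this is a SpriteKit game, another option worth knowing is to let the scene play the sound with an SKAction from the contact handler, so nothing has to be kept alive between plays. A minimal sketch (Swift 2 syntax to match the question; "coin.wav" is a placeholder file name):
func didBeginContact(contact: SKPhysicsContact) {
    // ... check that the bird and egg bodies are the ones touching ...
    runAction(SKAction.playSoundFileNamed("coin.wav", waitForCompletion: false))
}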

Check if AVAudioPlayer is playing

I've been trying to check whether the AVAudioPlayer is currently playing, but every way I try it crashes.
I'm using a stop button to stop the audio, but when there is no audio playing it crashes.
How can I check this in my stopaudio function?
First I defined:
var audioPlayer = AVAudioPlayer()
I have a stop button:
func stopaudio(sender: AnyObject) {
    audioPlayer.stop()
}
I tried to check like this:
if audioPlayer.playing {
    audioPlayer.stop()
}
I am using the following code to play the audio:
var soundURL: NSURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("9", ofType: "m4a")!)!
var error: NSError?
audioPlayer = AVAudioPlayer(contentsOfURL: soundURL, error: &error)
audioPlayer.prepareToPlay()
audioPlayer.play()
The problem was that the AVAudioPlayer hadn't been initialized yet, so to fix it I initialized it and then stopped it.
var soundURL: NSURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("013", ofType: "m4a")!)!
var error:NSError?
audioPlayer = AVAudioPlayer(contentsOfURL: soundURL, error: &error)
audioPlayer.stop()
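An alternative sketch of the same idea: declare the player as an optional, so there is simply nothing to stop until a sound has actually been loaded (Swift 2 syntax to match the question):
var audioPlayer: AVAudioPlayer?

func stopaudio(sender: AnyObject) {
    // Only stop if a player exists and is actually playing.
    if let player = audioPlayer where player.playing {
        player.stop()
    }
}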
Declare a function in the ViewController class:
func tryAudio() {
    do {
        try player = AVAudioPlayer(contentsOf: URL(fileURLWithPath: path!))
    } catch {
        // handle errors
    }
}
Call tryAudio() to initialize the player before calling player.stop().