Swift - AVAudioPlayer, sound doesn't play correctly

Because UILocalNotification is not displayed while the application is in the active state, I'm trying to configure a UIAlertController and play a short sound when it appears.
In the AppDelegate, I have no problem handling the notification and creating the alert. My problem concerns the sound: it doesn't play correctly.
Here is what I have so far:
//...
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
        // Override point for customization after application launch.
        // Notifications permissions
        let types: UIUserNotificationType = UIUserNotificationType.Sound | UIUserNotificationType.Alert
        let settings: UIUserNotificationSettings = UIUserNotificationSettings(forTypes: types, categories: nil)
        application.registerUserNotificationSettings(settings)
        return true
    }

    func application(application: UIApplication!, didReceiveLocalNotification notification: UILocalNotification!) {
        let state: UIApplicationState = application.applicationState
        var audioPlayer = AVAudioPlayer()
        if (state == UIApplicationState.Active) {
            // Create sound
            var error: NSError?
            var audioPlayer = AVAudioPlayer()
            AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient, error: nil)
            AVAudioSession.sharedInstance().setActive(true, error: nil)
            let soundURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("sound", ofType: "wav")!)
            audioPlayer = AVAudioPlayer(contentsOfURL: soundURL, error: &error)
            if (error != nil) {
                println("There was an error: \(error)")
            } else {
                audioPlayer.prepareToPlay()
                audioPlayer.play()
            }
            // Create alert
            let alertController = UIAlertController(title: "Alert title", message: "Alert message.", preferredStyle: .Alert)
            let noAction = UIAlertAction(title: "No", style: .Cancel) { (action) in
                // ...
            }
            let yesAction = UIAlertAction(title: "Yes", style: .Default) { (action) in
                // ...
            }
            alertController.addAction(noAction)
            alertController.addAction(yesAction)
            self.window?.rootViewController?.presentViewController(alertController, animated: true, completion: nil)
        }
    }
}
With that, when execution goes through this line: audioPlayer.play()
it only plays for less than a second, as if the player were suddenly deallocated (?).
I tried the two things below:
Switching the AVAudioSession back to inactive with AVAudioSession.sharedInstance().setActive(false, error: nil) just before the alert creation (or just after showing it). If I do that, the sound plays correctly. But this method is a synchronous (blocking) operation, so it delays other things (the alert is shown after the sound). Apparently not a good solution.
Moving the audioPlayer property (var audioPlayer = AVAudioPlayer()) to the class level, just under the window (var window: UIWindow?). If I do that, the sound plays correctly and the alert is shown correctly too.
I don't understand why it works like that. Am I missing something? Is this the proper way to solve my problem?
Thanks in advance to everyone who can help me understand/fix this.

You already provided the correct solution to your problem, and that is #2: having a class-level audioPlayer property. Why is this so?
Well, that's because in your code you set up the player, start the playback, show the pop-up, and after everything is done, the method exits its scope. Automatic memory management (ARC) works so that local variables are released when you exit their scope. Since sound playback is asynchronous (i.e. it plays without blocking the main thread), you exit the scope before the audio player has finished playing the sound. ARC sees that audioPlayer is a local variable and releases it at that point, but the sound is not done playing yet.
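The same scope behaviour can be demonstrated without any audio at all. Here is a minimal, self-contained sketch (the Resource class, the released flag, and all names are mine, purely for illustration): an object held only by a local variable is deallocated as soon as the function returns, exactly like the local audioPlayer above.

```swift
// Stand-in for the audio player: reports when ARC deallocates it.
final class Resource {
    let onDeinit: () -> Void
    init(onDeinit: @escaping () -> Void) { self.onDeinit = onDeinit }
    deinit { onDeinit() }
}

var released = false

func playLocally() {
    // Held only by a local variable, like the local audioPlayer.
    let resource = Resource(onDeinit: { released = true })
    _ = resource
    // "Playback" would still be in flight here...
}

playLocally()
// ...but the object is already gone once the scope is exited.
print(released)  // true
```

Promoting the variable to a property ties its lifetime to the owning object (the AppDelegate), which outlives the method call, so the sound can finish.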
I hope I was clear enough, if not feel free to ask more!

Related

swift AVFoundation pitched audio memory issue

Hi, I'm still new to Swift and programming in general.
I have this function that plays a piece of audio at a specified pitch. It gets called by an NSTimer, so it plays once a second. (The function is contained in a SoundPlayer class; the NSTimer is set up and used in a view controller.)
func playPitchedAudio(audioFile: AVAudioFile, pitch: Float) {
    audioEngine.stop()
    audioEngine.reset()
    let audioPlayerNode = AVAudioPlayerNode()
    let changePitchEffect = AVAudioUnitTimePitch()
    changePitchEffect.pitch = pitch
    audioEngine.attachNode(audioPlayerNode)
    audioEngine.attachNode(changePitchEffect)
    audioEngine.connect(audioPlayerNode, to: changePitchEffect, format: nil)
    audioEngine.connect(changePitchEffect, to: audioEngine.outputNode, format: nil)
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
    do {
        try audioEngine.start()
    } catch {
        print("error")
    }
    audioPlayerNode.play()
}
It runs fine and works, but it adds a few MB to memory every time it's called and never regains the space. I did some research on memory leaks but couldn't find anything that helps my specific scenario, so I'm hoping someone can point me in the right direction.
I assumed it was something to do with creating a new node and TimePitch every time it's called, so I moved them into the class that contains the function, but got a "libc++abi.dylib: terminating with uncaught exception of type NSException" error when the sound attempted to play for the second time.
Any help much appreciated, thanks!
Extra stuff:
// defined in the class, to be used by the function
var pitchedAudioPlayer = AVAudioPlayerNode()
var audioEngine = AVAudioEngine()

// Timer start
self.timer.invalidate()
self.timer = NSTimer.scheduledTimerWithTimeInterval(tempo, target: self, selector: #selector(ViewController.timeTriggerPointer), userInfo: nil, repeats: true)

// Timer calls... (along with some other unrelated stuff)
func timeTriggerPointer() {
    soundPlayer.playPitchedAudio(pitchFilePath, pitch: -1000.0)
}
Solution:
import AVFoundation

class SoundPlayer {
    var pitchedAudioPlayer = AVAudioPlayerNode()
    var audioEngine = AVAudioEngine()
    let audioPlayerNode = AVAudioPlayerNode()
    let changePitchEffect = AVAudioUnitTimePitch()

    init() {
        audioEngine.attachNode(audioPlayerNode)
        audioEngine.attachNode(changePitchEffect)
        audioEngine.connect(audioPlayerNode, to: changePitchEffect, format: nil)
        audioEngine.connect(changePitchEffect, to: audioEngine.outputNode, format: nil)
    }

    func playPitchedAudio(audioFile: AVAudioFile, pitch: Float) {
        audioPlayerNode.stop()
        changePitchEffect.pitch = pitch
        audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
        do {
            try audioEngine.start()
        } catch {
            print("error")
        }
        audioPlayerNode.play()
    }
}
A "leak" is a highly technical thing: a piece of unreferenced memory which can never be released (because it is unreferenced). I doubt that you have a leak. You just have more and more objects, that's all.
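The distinction can be made concrete in pure Swift. In this hypothetical sketch (the Node type and the nodeFreed flag are mine, for illustration only), two objects hold strong references to each other; nothing else references them, yet ARC can never free them because each keeps the other's reference count above zero. That is an actual leak, as opposed to simply keeping more and more reachable objects alive.

```swift
// Two objects forming a strong reference cycle: referenced (by each other)
// yet unreachable from the rest of the program, so ARC can never free them.
final class Node {
    var other: Node?
    deinit { nodeFreed = true }
}

var nodeFreed = false

func makeCycle() {
    let a = Node()
    let b = Node()
    a.other = b
    b.other = a   // each keeps the other alive
}

makeCycle()
print(nodeFreed)  // false: neither deinit ever ran, the pair is leaked
```

Declaring one side of such a cycle `weak` would break it; but again, your situation is different, because your nodes are still reachable through the engine.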
You are repeating this code over and over (once each time the timer fires):
let audioPlayerNode = AVAudioPlayerNode()
let changePitchEffect = AVAudioUnitTimePitch()
audioEngine.attachNode(audioPlayerNode)
audioEngine.attachNode(changePitchEffect)
The audio engine itself, however, remains in place (it's declared outside the function, as a property of your view controller). So you're adding two more nodes to this same audio engine every time the timer fires. Nodes take up memory, so your memory keeps rising. No surprise here. If you don't want that to happen, don't do that.
I assumed it was something to do with creating a new node and TimePitch everytime its called
There you go. So you already solved your own problem.
Think for a moment about what you're trying to do. You have an audio engine with nodes. The only things that might change on each run are the pitch of the AVAudioUnitTimePitch node and the file to be played. So create the whole audio engine and its nodes beforehand, and leave them in place. Keep a reference to the AVAudioUnitTimePitch as a property. In the timer function, just change that pitch value!
let theTimePitchNode = AVAudioUnitTimePitch()
// ...and you have also attached that node to the engine in your setup

func playPitchedAudio(audioFile: AVAudioFile, pitch: Float) {
    audioPlayerNode.stop()
    theTimePitchNode.pitch = pitch
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
    do {
        try audioEngine.start()
    } catch {
        print("error")
    }
    audioPlayerNode.play()
}

When using AVFoundation to play sound, there is sound in the simulator but no sound on the device

I'm trying to play a sound in my timer app using AVFoundation. I can hear the sound in the simulator, but not on the device. Do I need to set something for the app to play sound on the device?
Please see the code below for more information.
import AVFoundation

class timer {
    var squishPlayer = AVAudioPlayer()

    // in viewDidLoad
    let squishPath = NSBundle.mainBundle().pathForResource("ding", ofType: "wav")
    let squishURL = NSURL(fileURLWithPath: squishPath!)
    do {
        squishPlayer = try AVAudioPlayer(contentsOfURL: squishURL)
        squishPlayer.prepareToPlay()
    } catch let err as NSError {
        print(err.debugDescription)
    }
    squishPlayer.numberOfLoops = 0
}
I checked that the file name is exact and that the file is in Copy Bundle Resources.
I found the solution is to add the following to the app delegate:
func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayback)
        try session.setActive(true)
    } catch let error as NSError {
        print("AVAudioSession configuration error: \(error.localizedDescription)")
    }
    return true
}
And this solves the problem of no sound on the device. The strange thing is that even without these lines in the app delegate, there is still sound in the simulator.

How to play the same sound overlapping with AVAudioPlayer?

This code plays the sound when the button is tapped, but cancels the previous sound if the button is pressed again. I do not want this to happen; I want the same sound to overlap when the button is pressed repeatedly. I believe it might be due to using the same AVAudioPlayer. I have looked on the internet, but I am new to Swift and want to know how to create a new AVAudioPlayer every time the method runs so the sounds overlap.
func playSound(sound: String) {
    // Set the sound file name & extension
    let soundPath = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(sound, ofType: "mp3")!)
    do {
        // Preparation
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
    } catch _ {
    }
    do {
        try AVAudioSession.sharedInstance().setActive(true)
    } catch _ {
    }
    // Play the sound
    var error: NSError?
    do {
        audioPlayer = try AVAudioPlayer(contentsOfURL: soundPath)
    } catch let error1 as NSError {
        error = error1
    }
    audioPlayer.prepareToPlay()
    audioPlayer.play()
}
To play two sounds simultaneously with AVAudioPlayer, you just have to use a different player for each sound.
In my example I've declared two players, playerBoom and playerCrash, in the view controller. I populate each with a sound to play via a function, then trigger both plays at once:
import AVFoundation

class ViewController: UIViewController {
    var playerBoom: AVAudioPlayer?
    var playerCrash: AVAudioPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        playerBoom = preparePlayerForSound(named: "sound1")
        playerCrash = preparePlayerForSound(named: "sound2")
        playerBoom?.prepareToPlay()
        playerCrash?.prepareToPlay()
        playerBoom?.play()
        playerCrash?.play()
    }

    func preparePlayerForSound(named sound: String) -> AVAudioPlayer? {
        do {
            if let soundPath = NSBundle.mainBundle().pathForResource(sound, ofType: "mp3") {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
                try AVAudioSession.sharedInstance().setActive(true)
                return try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: soundPath))
            } else {
                print("The file '\(sound).mp3' is not available")
            }
        } catch let error as NSError {
            print(error)
        }
        return nil
    }
}
It works very well, but in my opinion it is not suitable if you have many sounds to play. It's a perfectly valid solution for just a few, though.
This example uses two different sounds, but of course the idea is exactly the same for two identical sounds.
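If you do have many distinct sounds, one way to avoid a dedicated property per sound is to cache the players in a dictionary keyed by sound name. This is only a sketch under my own naming (SoundBox and play(sound:) are hypothetical, not from the answer above), using the same pre-iOS-10 APIs as the rest of this thread; note that it still uses one player per sound name, so it mixes different sounds but does not overlap a sound with itself.

```swift
import AVFoundation

// Hypothetical helper (my own naming): one cached AVAudioPlayer per sound
// name, so any number of distinct sounds can play without a property each.
class SoundBox {
    private var players = [String: AVAudioPlayer]()

    func play(sound: String) {
        if players[sound] == nil {
            if let path = NSBundle.mainBundle().pathForResource(sound, ofType: "mp3") {
                if let player = try? AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: path)) {
                    player.prepareToPlay()
                    players[sound] = player
                }
            }
        }
        // Restarts the sound if it is already playing; to overlap a sound
        // with itself, you would need a fresh player per play() call instead.
        players[sound]?.play()
    }
}
```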
I could not find a solution using just AVAudioPlayer.
Instead, I have found a solution to this problem with a library that is built on top of AVAudioPlayer.
The library allows same sounds to be played overlapped with each other.
https://github.com/adamcichy/SwiftySound

Best / Proper way to play a short sound

I'm trying to play a short sound in my Swift app.
The sound I want to play:
is very short (less than five seconds);
is one of the system sounds (e.g. SMS received);
will most likely be played multiple times during the life cycle of my application;
can be fired twice in less than the sound's length (i.e. the first one is fired, and a second one is fired before the first finishes playing).
In addition to this, I would like to:
hear the sound only if the application is active and "silent mode" is off;
make the phone vibrate (regardless of whether silent mode is on or off).
So far, here is what I have:
import AVFoundation

class SoundPlayer {
    // Singleton
    class var sharedInstance: SoundPlayer {
        struct Static {
            static let instance: SoundPlayer = SoundPlayer()
        }
        return Static.instance
    }

    // Properties
    var error: NSError?
    var audioPlayer = AVAudioPlayer()
    let soundURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("sound", ofType: "wav")!) // Working
    //let soundURL = NSURL(string: "/System/Library/Audio/UISounds/sms-received1.caf") // Working
    //let soundURL = NSURL(fileURLWithPath: NSBundle(identifier: "com.apple.UIKit")!.pathForResource("sms-received1", ofType: "caf")!) // Not working (crash at runtime)

    init() {
        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient, error: nil)
        AVAudioSession.sharedInstance().setActive(true, error: nil)
        audioPlayer = AVAudioPlayer(contentsOfURL: soundURL, error: &error)
        if (error != nil) {
            println("[SoundPlayer] There was an error: \(error)")
        }
    }

    func playSound() {
        if (audioPlayer.playing) {
            audioPlayer.stop()
        }
        audioPlayer.play()
    }
}
From what I understand, there are several ways to play a sound.
The two most common seem to be AVAudioPlayer (as I did) and AudioToolbox.
From there, I'm asking myself three questions:
Which of those two is best suited (best / proper) for my situation? (AVAudioPlayer is working correctly at the moment.)
Are we allowed to play a system sound? (Very small gain in size if I don't add my own sound to the project.)
If so, is NSURL(string:"/System/Library/Audio/UISounds/sms-received1.caf") a good way to retrieve the sound? Any better / proper solution?
Is there a way to vibrate the phone at the same time the sound is fired?

didEnterRegion - audio not playing - Swift

I am updating an existing app to Swift. The main purpose of the app is to play a sound file when it reaches a specific location. Everything works in the simulator, but the sound does not play on the device. If I add the background mode for audio and AirPlay, it works flawlessly, but Apple has rejected this because the app doesn't use audio all the time, only when it enters a region. I did not have to enable the audio and AirPlay background mode for iOS 7.
I think it has something to do with the audio player being deallocated before it has a chance to be used, and I cannot figure out how to set it up properly. There is the same question on Stack Overflow for Objective-C, but I cannot find anything similar for Swift (to me, this is what the answer to that question suggests, and the way I had it set up for iOS 7).
class MasterViewController: UITableViewController, NSFetchedResultsControllerDelegate, CLLocationManagerDelegate, AVSpeechSynthesizerDelegate, AVAudioPlayerDelegate {
    var myManager: CLLocationManager!
    var audioPlayer = AVAudioPlayer()
    var mySpeechSynthesizer = AVSpeechSynthesizer()

    func playSound() {
        audioPlayer.prepareToPlay()
        audioPlayer.play()
    }

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        speak()
    }

    func speak() {
        var myString = "This is the phrase I want to speak"
        var mySpeechUtterance: AVSpeechUtterance = AVSpeechUtterance(string: myString)
        mySpeechUtterance.rate = 0.12
        mySpeechUtterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        mySpeechSynthesizer.speakUtterance(mySpeechUtterance)
    }

    func locationManager(manager: CLLocationManager!, didEnterRegion region: CLRegion!) {
        playSound()
        scheduleAllLocations()
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        myManager = CLLocationManager()
        myManager.delegate = self
        myManager.desiredAccuracy = kCLLocationAccuracyBest
        myManager.requestAlwaysAuthorization()
        var alertSound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("SpeakMinder", ofType: "mp3")!)
        var error: NSError?
        audioPlayer = AVAudioPlayer(contentsOfURL: alertSound, error: &error)
        audioPlayer.delegate = self
        mySpeechSynthesizer.delegate = self
    }
}
This is the complete code. It works flawlessly if I have audio/AirPlay enabled in the background; it doesn't play a sound or speak if I have it disabled. (Apple has rejected the app for having audio/AirPlay enabled.)
From the technical team:
If you are attempting to use AVSpeechSynthesizer from the background, you should have the background audio key enabled. AVSpeech simply uses the same audio plumbing as playing any other sound.
Regarding it working in iOS 7 without background audio enabled:
If this key was not required, you were most likely getting lucky due to a bug that we have since corrected.
And an app was approved today with background audio enabled.
If you are rejected for having audio enabled in the background for the speech synthesizer:
Our own location-based sample "Breadcrumbs" has the following enabled in the plist. You may point to this sample in your appeal.