Swift function playAtTime() Does Not Add Delay

I'm trying to play two audio players, the second starting after the first has finished. I'm using the Swift function playAtTime() to create a delay for the second, as follows:
var audioPlayer1 = AVAudioPlayer()
var audioPlayer2 = AVAudioPlayer()
let soundPathA = NSBundle.mainBundle().pathForResource("A", ofType: "m4a")
let soundURLA = NSURL.fileURLWithPath(soundPathA!)
let soundPathB = NSBundle.mainBundle().pathForResource("B", ofType: "m4a")
let soundURLB = NSURL.fileURLWithPath(soundPathB!)
var noteA = Sound()
noteA.URL = soundURLA
var noteB = Sound()
noteB.URL = soundURLB
self.audioPlayer1 = try! AVAudioPlayer(contentsOfURL: soundURLA)
self.audioPlayer2 = try! AVAudioPlayer(contentsOfURL: soundURLB)
let duration: NSTimeInterval = audioPlayer1.duration
self.audioPlayer1.play()
self.audioPlayer2.playAtTime(duration)
However, no delay occurs. What is my issue here?

The playAtTime function does not take a delay in seconds. The time it expects is a point on the audio output device's timeline, measured the same way as the player's deviceCurrentTime property. Passing a small value like the first sound's duration gives a time that is already in the past on that timeline, so playback starts straight away. See the documentation for playAtTime.
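To schedule a delayed start, add the delay to the device clock (a sketch using the question's own variables):
self.audioPlayer1.play()
self.audioPlayer2.playAtTime(self.audioPlayer2.deviceCurrentTime + duration)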
If you want to wait before playing, you could use dispatch_after:
let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(0.6 * Double(NSEC_PER_SEC)))
dispatch_after(delayTime, dispatch_get_main_queue()) {
    self.audioPlayer2.play()
}
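In current Swift the same delay would read (a direct translation of the snippet above, not part of the original answer):
DispatchQueue.main.asyncAfter(deadline: .now() + 0.6) {
    self.audioPlayer2.play()
}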

I haven't tried Michael's answer yet, but for those of you wondering how to do it with playAtTime, check out this: http://sandmemory.blogspot.com/2014/12/how-would-you-add-reverbecho-to-audio.html and look at the playEcho function.
The code, for me, looks like this:
var playTime1: NSTimeInterval = audioPlayer2.deviceCurrentTime + (duration1 * Double(value))
self.audioPlayer1.stop()
self.audioPlayer1.currentTime = 0
self.audioPlayer1.play()
self.audioPlayer2.playAtTime(playTime1) // schedule the second sound at the computed device time


How to fix the 66671 AudioQueueInternalNotifyRunning error when playing a sound? [duplicate]

Why doesn't the following code play a sound? It returns "true" for play(), but I cannot hear anything.
let path = "/Users/account/Music/sound.mp3";
let fileURL = NSURL(fileURLWithPath: path)
var Player = AVAudioPlayer(contentsOfURL:fileURL, error:nil);
Player.delegate = self;
Player.prepareToPlay();
Player.volume = 1.0;
var res = Player.play();
println(res);
If I use the following code instead, I can hear the sound.
var inFileURL:CFURL = fileURL!;
var mySound = UnsafeMutablePointer<SystemSoundID>.alloc(sizeof(SystemSoundID));
AudioServicesCreateSystemSoundID(inFileURL, mySound);
AudioServicesPlaySystemSound(mySound.memory)
OS X Yosemite 10.10.3
Xcode 6.2
The problem is that Player, your AVAudioPlayer, is a local variable. So it goes out of existence immediately - before it can even start playing, let alone finish playing.
Solution: make it a property instead, so that it will persist.
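For example (a minimal sketch in current Swift; SoundController and playSound(at:) are placeholder names):

import AVFoundation

final class SoundController: NSObject, AVAudioPlayerDelegate {
    // A stored property holds a strong reference, so the player
    // outlives the method that starts it.
    private var player: AVAudioPlayer?

    func playSound(at url: URL) {
        player = try? AVAudioPlayer(contentsOf: url)
        player?.delegate = self
        player?.prepareToPlay()
        player?.volume = 1.0
        player?.play()
    }
}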

AVPlayer Seek not accurate even with toleranceBefore: kCMTimeZero

We have an app which plays a long mp3 file (1 hour long). We want to be able to play from set points within the file. But when we do, playback is inaccurate by up to 10 seconds.
Here's the code:
let trackStart = arrTracks![MediaPlayer.shared.currentSongNo].samples
let frameRate : Int32 = (MediaPlayer.shared.player?.currentItem?.asset.duration.timescale)!
MediaPlayer.shared.player?.seek(to: CMTimeMakeWithSeconds(Double(trackStart), frameRate),
                                toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
We didn't have the problem with AVAudioPlayer, but (AFAIK) we have to use AVPlayer because we need the better quality "spectral" AVAudioTimePitchAlgorithm.
[Edit:] - The error is consistent - it always plays from the same (wrong) place for a given requested position. This is also true after restarting.
Any help very much appreciated! Thanks
[Edit:]
We have already tried preferredTimescale: playerTimescale
Also tried kCMTimeIndefinite instead of kCMTimeZero
I have done something similar, using a slider to change the playback position, and it worked perfectly:
@objc func handleSliderChange(sender: UISlider?) {
    if let duration = player?.currentItem?.duration {
        let totalSeconds = CMTimeGetSeconds(duration)
        let value = Float64(videoSlider.value) * totalSeconds
        let seekTime = CMTime(value: CMTimeValue(value), timescale: 1)
        player?.seek(to: seekTime, completionHandler: { (completedSeek) in
            // do something later
        })
    }
}
In your case it would look like this:
let trackStart = arrTracks![MediaPlayer.shared.currentSongNo].samples
let value = Float64(trackStart)
let seekTime = CMTime(value: CMTimeValue(value), timescale: 1)
MediaPlayer.shared.player?.seek(to: seekTime, completionHandler: { (completedSeek) in
    // do something later
})
This is what AVURLAsset's AVURLAssetPreferPreciseDurationAndTimingKey is for; see Apple's documentation.
Be aware that this may increase loading time.
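For example (a sketch; audioURL is a placeholder for the mp3's file URL, and MediaPlayer.shared comes from the question):

import AVFoundation

let asset = AVURLAsset(url: audioURL,
                       options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
let item = AVPlayerItem(asset: asset)
item.audioTimePitchAlgorithm = .spectral // the algorithm the question requires
MediaPlayer.shared.player = AVPlayer(playerItem: item)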
Try this; it's working perfectly for me:
@IBAction func playbackSliderValueChanged(_ playbackSlider: UISlider) {
    let seconds: Int64 = Int64(playbackSlider.value)
    let targetTime: CMTime = CMTimeMake(value: seconds, timescale: 1)
    DispatchQueue.main.async {
        self.player!.seek(to: targetTime)
        if self.player!.rate == 0 { // the player has not yet started playing
            self.player?.play()
        }
    }
}

Detecting when person wearing an Apple Watch falls

If I drop my Apple Watch and catch it before it hits the ground, the app I'm making should detect that the watch has fallen. But that's not happening. What's wrong with the code below? Thanks!
let motion = CMMotionManager()
if motion.isDeviceMotionAvailable {
    motion.deviceMotionUpdateInterval = 0.1
    motion.startDeviceMotionUpdates()
    if let deviceMotion = motion.deviceMotion {
        let accelerationX = deviceMotion.gravity.x + deviceMotion.userAcceleration.x
        let accelerationY = deviceMotion.gravity.y + deviceMotion.userAcceleration.y
        let accelerationZ = deviceMotion.gravity.z + deviceMotion.userAcceleration.z
        let totalAcceleration = sqrt((accelerationX * accelerationX) + (accelerationY * accelerationY) + (accelerationZ * accelerationZ))
        if totalAcceleration > 9.0 {
            print("Watch has fallen")
        }
    }
    motion.stopDeviceMotionUpdates()
}
motion.deviceMotion just returns the latest sample of device-motion data.
So your code probably fetches the data only once, right when it runs, before any useful samples have arrived. You will need something like a timer to check the acceleration repeatedly.
Something like this (taken from https://developer.apple.com/documentation/coremotion/getting_raw_accelerometer_events):
let motion = CMMotionManager()
func startAccelerometers() {
    // Make sure the accelerometer hardware is available.
    if self.motion.isAccelerometerAvailable {
        self.motion.accelerometerUpdateInterval = 1.0 / 60.0 // 60 Hz
        self.motion.startAccelerometerUpdates()
        // Configure a timer to fetch the data.
        self.timer = Timer(fire: Date(), interval: (1.0 / 60.0),
                           repeats: true, block: { (timer) in
            // Get the accelerometer data.
            if let data = self.motion.accelerometerData {
                let x = data.acceleration.x
                let y = data.acceleration.y
                let z = data.acceleration.z
                // Use the accelerometer data in your app.
            }
        })
        // Add the timer to the current run loop.
        RunLoop.current.add(self.timer!, forMode: .defaultRunLoopMode)
    }
}
Alternatively, you can also pass a handler to startDeviceMotionUpdates, which will be called based on deviceMotionUpdateInterval.
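For the fall-detection case, a handler-based version might look like the sketch below. Note that in free fall the combined gravity-plus-userAcceleration magnitude drops toward 0 g rather than spiking, so the threshold here (0.25 g, an assumption) triggers on a drop and should be tuned empirically:

import CoreMotion
import Foundation

let motion = CMMotionManager()

func startFallDetection() {
    guard motion.isDeviceMotionAvailable else { return }
    motion.deviceMotionUpdateInterval = 1.0 / 50.0
    motion.startDeviceMotionUpdates(to: .main) { deviceMotion, _ in
        guard let m = deviceMotion else { return }
        // gravity + userAcceleration reconstructs the raw accelerometer
        // reading in g units: about 1 g at rest, near 0 g in free fall.
        let x = m.gravity.x + m.userAcceleration.x
        let y = m.gravity.y + m.userAcceleration.y
        let z = m.gravity.z + m.userAcceleration.z
        if sqrt(x * x + y * y + z * z) < 0.25 {
            print("Watch appears to be in free fall")
        }
    }
}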

How to make a smooth transition between drum patterns using AVAudioPlayer?

I am writing a music app which will let the user switch between different drum patterns "smoothly" by pressing desired buttons on screen. (By "smoothly" I mean the pattern will switch in the measure immediately following the time when the user presses a button).
My problem is that the time delay I am calculating for a delayed start of the next pattern is slightly larger than desired. I can fix this by reducing the delay by 0.1 sec, but that ad hoc correction may not hold at a tempo other than the one I am currently using for testing.
The code which calculates the delay is:
func startClock() {
    let aSelector: Selector = "updateClock"
    clock = NSTimer.scheduledTimerWithTimeInterval(0.001, target: self, selector: aSelector, userInfo: nil, repeats: true)
    startTime = CFAbsoluteTimeGetCurrent()
}
func stopClock() {
    clock.invalidate()
}
func updateClock() {
    currentTime = CFAbsoluteTimeGetCurrent()
    elapsedTime = currentTime - startTime
    elapsedBeats = UInt(elapsedTime / audioMeterUpdateInterval)
    elapsedMeasures = UInt(Double(elapsedBeats) / Double(beatUnit[patternSelectIdx]))
    requiredMeasures = elapsedMeasures + 1
    requiredBeats = requiredMeasures * UInt(beatUnit[patternSelectIdx])
    requiredTime = Double(requiredBeats) * audioMeterUpdateInterval
    delayTime = requiredTime - elapsedTime - 0.1 // 0.1 sec is chosen ad hoc
}
The code for one of the buttons for ending the drum patterns is:
@IBAction func endShort(sender: UIButton) {
    fileName = patternSelect + "End1"
    if startPlay == true {
        play(fileName, numberOfLoops: 0, delaySec: delayTime)
    } else {
        play(fileName, numberOfLoops: 0, delaySec: 0)
    }
    playPauseButton.setImage(playImage, forState: UIControlState.Normal)
    startPlay = false
}
Finally, the function play called by the above code is:
func play(fileName: String, numberOfLoops: Int, delaySec: Double) {
    if delaySec > 0 {
        let delayNSec = Double(NSEC_PER_SEC) * delaySec
        let dispatchTime = dispatch_time(DISPATCH_TIME_NOW, Int64(delayNSec))
        self.ButtonAudioURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(fileName, ofType: "wav")!)
        dispatch_after(dispatchTime, dispatch_get_main_queue(), {
            self.ButtonAudioPlayer = try! AVAudioPlayer(contentsOfURL: self.ButtonAudioURL)
            // Configure the player only after it is created; settings applied
            // to the previous instance are lost when it is replaced.
            self.ButtonAudioPlayer.numberOfLoops = numberOfLoops
            self.ButtonAudioPlayer.volume = 1.0
            self.ButtonAudioPlayer.play()
        })
    } else {
        ButtonAudioURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(fileName, ofType: "wav")!)
        ButtonAudioPlayer = try! AVAudioPlayer(contentsOfURL: ButtonAudioURL)
        ButtonAudioPlayer.numberOfLoops = numberOfLoops
        ButtonAudioPlayer.volume = 1.0
        ButtonAudioPlayer.play()
    }
}
Is there an alternate approach to this? Also, could the 0.1 sec be a fixed offset related to the button animation time, and therefore constant and correctable? Thanks!
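One alternate approach, sketched against the code above (untested): schedule the pattern switch on the audio device clock with playAtTime instead of dispatch_after, so the start time is not skewed by main-queue latency or button animation:

// Assumes ButtonAudioPlayer has already been created with the next pattern's file.
ButtonAudioPlayer.prepareToPlay()
ButtonAudioPlayer.playAtTime(ButtonAudioPlayer.deviceCurrentTime + delayTime)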

How to display song current time and duration when using AVAudioPlayerNode

I'm making an audio player using AVAudioPlayerNode. Playing a song with some effects (delay, pitch change, rate change, etc.) works, but I'm stuck on displaying the song's current time and duration.
AVAudioPlayerNode doesn't seem to have anything like AVAudioPlayer's .currentTime or .duration.
I'd really appreciate help displaying current time and duration when using AVAudioPlayerNode.
For the current time, this is what I use:
- (NSTimeInterval)currentTimeInSeconds
{
    AVAudioTime *nodeTime = self.playerNode.lastRenderTime;
    AVAudioTime *playerTime = [self.playerNode playerTimeForNodeTime:nodeTime];
    NSTimeInterval seconds = (double)playerTime.sampleTime / playerTime.sampleRate;
    return seconds;
}
In addition to the render time in an abstract time base, the player node offers conversion of this time into its own playback timeline via playerTimeForNodeTime, which also respects pausing, etc.
Here it is in Swift:
private func currentTime() -> NSTimeInterval {
    if let nodeTime: AVAudioTime = myAvAudioPlayer.lastRenderTime, playerTime: AVAudioTime = myAvAudioPlayer.playerTimeForNodeTime(nodeTime) {
        return Double(playerTime.sampleTime) / playerTime.sampleRate
    }
    return 0
}
Just replace myAvAudioPlayer with your AVAudioPlayerNode!
Happy coding!
For the duration of the AVAudioPlayerNode (Swift 4):
func durationOfNodePlayer(_ fileUrl: URL) -> TimeInterval {
    do {
        let file = try AVAudioFile(forReading: fileUrl)
        let audioNodeFileLength = AVAudioFrameCount(file.length)
        return Double(Double(audioNodeFileLength) / 44100) // divide by the AVSampleRateKey in the recorder settings
    } catch {
        return 0
    }
}
Swift 5: duration, using the AVAudioPlayerNode's AVAudioFile:
extension AVAudioFile {
    var duration: TimeInterval {
        let sampleRateSong = Double(processingFormat.sampleRate)
        let lengthSongSeconds = Double(length) / sampleRateSong
        return lengthSongSeconds
    }
}
according to WWDC 2014 "What's New in Core Audio", around minute 54.
Then currentTime:
extension AVAudioPlayerNode {
    var current: TimeInterval {
        if let nodeTime = lastRenderTime, let playerTime = playerTime(forNodeTime: nodeTime) {
            return Double(playerTime.sampleTime) / playerTime.sampleRate
        }
        return 0
    }
}
Thanks @Brian Becker.
For the duration, the .length property of AVAudioFile worked:
I divided the number of sample frames in the file by the sample rate (44100 Hz).
For the current time, tentatively I use UISlider.value and synthesize the value with the sample frames.
Something like this should give you that ability...
extension AVAudioPlayerNode {
    var currentTime: TimeInterval {
        get {
            if let nodeTime: AVAudioTime = self.lastRenderTime, let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodeTime) {
                return Double(playerTime.sampleTime) / playerTime.sampleRate
            }
            return 0
        }
    }
}
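Putting the two extensions together, a display update could look like this (a sketch; playerNode, audioFile, and timeLabel are placeholders for your own player node, audio file, and label):

let elapsed = playerNode.currentTime
let total = audioFile.duration
timeLabel.text = String(format: "%.1f / %.1f s", elapsed, total)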