We have an app that plays a long mp3 file (1 hour long). We want to be able to play from set points within the file, but when we do, playback is inaccurate by up to 10 seconds.
Here's the code:
let trackStart = arrTracks![MediaPlayer.shared.currentSongNo].samples
let frameRate: Int32 = (MediaPlayer.shared.player?.currentItem?.asset.duration.timescale)!
MediaPlayer.shared.player?.seek(to: CMTimeMakeWithSeconds(Double(trackStart), frameRate),
                                toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
We didn't have the problem with AVAudioPlayer, but (AFAIK) we have to use AVPlayer because we need the better-quality "spectral" AVAudioTimePitchAlgorithm.
[Edit:] The error is consistent: it always plays from the same (wrong) place for a given requested position. This is also true after restarting.
Any help very much appreciated! Thanks
[Edit:]
We have already tried preferredTimescale: playerTimescale
Also tried kCMTimeIndefinite instead of kCMTimeZero
I have done something similar, but with a slider to change the playback position, and it worked perfectly.
@objc func handleSliderChange(sender: UISlider?) {
    if let duration = player?.currentItem?.duration {
        let totalSeconds = CMTimeGetSeconds(duration)
        let value = Float64(videoSlider.value) * totalSeconds
        let seekTime = CMTime(value: CMTimeValue(value), timescale: 1)
        player?.seek(to: seekTime, completionHandler: { (completedSeek) in
            // do something later
        })
    }
}
In your case it would look like this:
let trackStart = arrTracks![MediaPlayer.shared.currentSongNo].samples
let value = Float64(trackStart)
let seekTime = CMTime(value: CMTimeValue(value), timescale: 1)
MediaPlayer.shared.player?.seek(to: seekTime, completionHandler: { (completedSeek) in
    // do something later
})
This is what AVURLAsset's AVURLAssetPreferPreciseDurationAndTimingKey is for.
See Apple's documentation.
Be aware that this may increase loading time.
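A minimal sketch of opting in (fileURL is an assumed constant pointing at your mp3; MediaPlayer.shared is from the question):

import AVFoundation

// Ask the asset to compute precise (sample-accurate) duration and timing,
// which is what zero-tolerance seeks on VBR audio need.
let asset = AVURLAsset(url: fileURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
let item = AVPlayerItem(asset: asset)
MediaPlayer.shared.player?.replaceCurrentItem(with: item)

With precise timing enabled, the zero-tolerance seek from the question should land on the requested position.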
Try this; it's working perfectly for me:
@IBAction func playbackSliderValueChanged(_ playbackSlider: UISlider) {
    let seconds: Int64 = Int64(playbackSlider.value)
    let targetTime: CMTime = CMTimeMake(value: seconds, timescale: 1)
    DispatchQueue.main.async {
        self.player!.seek(to: targetTime)
        if self.player!.rate == 0 { // if the player hasn't started playing yet
            self.player?.play()
        }
    }
}
func seekFullRead(seconds: Float64, completion: (() -> ())? = nil) {
    let targetTime: CMTime = CMTimeMakeWithSeconds(seconds, preferredTimescale: 60000)
    fullReadPlayer?.currentItem?.seek(to: targetTime, toleranceBefore: .zero, toleranceAfter: .zero, completionHandler: { (finish) in
        if finish {
            completion?()
        }
    })
}
fullReadTimeObserver = fullReadPlayer?.addPeriodicTimeObserver(forInterval: CMTimeMake(value: 1, timescale: 10), queue: DispatchQueue.main, using: { [weak self] (time) in
    guard let self = self else { return }
    if self.fullReadPlayer?.status == .readyToPlay {
        self.delegate?.audioPlayer(self, didUpdateCurrentTime: time.seconds, teach: nil)
    }
})
When I seek to 4.58 seconds, the correct current time is reported first, but then the reported time jumps back by about 0.26 seconds (see the logs below) and playback continues from there.
Logs:
current time: 1.30104062
current time: 1.401042787
seek to : 4.579999923706055
current time: 1.498295786
current time: 4.579983333333334
current time: 4.319330793
current time: 4.319642834
current time: 4.401050459
current time: 4.501045084
current time: 4.601038959
I think NSGangster had the right idea: the observer timer and the seek run on different processes. What this means is that I will have to handle the discrepancies myself.
I did not find an answer on the internet, and had no luck finding a way to make this exact, but I did manage to find a workaround: use a variable like 'isSeekInProgress' to mark whether to update the progress bar UI. Illustrated below:
var isSeekInProgress: Bool = false // Marker

let targetTime: CMTime = CMTimeMakeWithSeconds(seconds, preferredTimescale: 60000)
isSeekInProgress = true // Mark
fullReadPlayer?.currentItem?.seek(to: targetTime, toleranceBefore: .zero, toleranceAfter: .zero, completionHandler: { (finish) in
    if finish {
        isSeekInProgress = false // Unmark
        completion?()
    }
})
Some people pause the player while seeking and then resume playback in the completion block, but that won't work if you want playback to keep going while scrubbing. The above example is fairly painless, and all you have to do is check:
if !playbackManager.isSeekInProgress {
    progressSlider.value = playbackManager.progress
}
And during the next periodic observer notification, your progress will be updated.
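Put together with the periodic observer from earlier, the UI update might look roughly like this (a sketch; progressSlider and a non-zero item duration are assumptions):

fullReadTimeObserver = fullReadPlayer?.addPeriodicTimeObserver(
    forInterval: CMTime(value: 1, timescale: 10),
    queue: .main, using: { [weak self] (time) in
        guard let self = self, !self.isSeekInProgress else { return }
        // Only drive the slider from the observer while no seek is pending.
        if let duration = self.fullReadPlayer?.currentItem?.duration.seconds, duration > 0 {
            self.progressSlider.value = Float(time.seconds / duration)
        }
})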
I'm making a video game with Swift & SpriteKit, and I'm trying to build my game's level system. Each level has its own specifications (but that's not in the code right now).
However, when my SKAction.repeat is done, I would like to move to another scene (such as a "Level completed" scene).
Do you know how I can do it?
Here's my code:
func parametersLevel() {
    let spawn = SKAction.run(asteroids)
    let waitSpawn = SKAction.wait(forDuration: 0.8)
    let sequence = SKAction.sequence([waitSpawn, spawn])
    let spawnCount = SKAction.repeat(sequence, count: 750)
    self.run(spawnCount)
}
Thanks for your help.
Use run(_:completion:): instead of self.run(spawnCount), try:
self.run(spawnCount, completion: { () -> Void in
    print("completed")
})
If you need a key with your action, you can also do:
func parametersLevel() {
    let spawn = SKAction.run(asteroids)
    let waitSpawn = SKAction.wait(forDuration: 0.8)
    let sequence1 = SKAction.sequence([waitSpawn, spawn])
    let spawnCount = SKAction.repeat(sequence1, count: 750)
    let endAction = SKAction.run {} // whatever you need your ending to be
    let sequence2 = SKAction.sequence([spawnCount, endAction])
    self.run(sequence2, withKey: "spawn")
}
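For the "Level completed" case specifically, endAction can present the next scene; a sketch, where LevelCompletedScene is a hypothetical scene class of yours:

let endAction = SKAction.run { [weak self] in
    guard let self = self, let view = self.view else { return }
    // All 750 spawns have finished; move to the hypothetical LevelCompletedScene.
    let completedScene = LevelCompletedScene(size: self.size)
    view.presentScene(completedScene, transition: .fade(withDuration: 1.0))
}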
I'm trying to play two audio players, one after the other has finished playing. I'm using AVAudioPlayer's playAtTime() to create a delay for the second, as follows:
var audioPlayer1 = AVAudioPlayer()
var audioPlayer2 = AVAudioPlayer()
let soundPathA = NSBundle.mainBundle().pathForResource("A", ofType: "m4a")
let soundURLA = NSURL.fileURLWithPath(soundPathA!)
let soundPathB = NSBundle.mainBundle().pathForResource("B", ofType: "m4a")
let soundURLB = NSURL.fileURLWithPath(soundPathB!)
var noteA = Sound()
noteA.URL = soundURLA
var noteB = Sound()
noteB.URL = soundURLB
self.audioPlayer1 = try! AVAudioPlayer(contentsOfURL: soundURLA)
self.audioPlayer2 = try! AVAudioPlayer(contentsOfURL: soundURLB)
let duration: NSTimeInterval = audioPlayer1.duration
self.audioPlayer1.play()
self.audioPlayer2.playAtTime(duration)
However, no delay occurs. What is my issue here?
The playAtTime function does not delay playback by a given number of seconds. The time you pass is a point on the audio output device's timeline, measured against deviceCurrentTime, so a small value such as the first player's duration is already in the past and the sound plays immediately. See the documentation.
If you want to wait before playing, you could use dispatch_after:
let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(0.6 * Double(NSEC_PER_SEC)))
dispatch_after(delayTime, dispatch_get_main_queue()) {
    self.audioPlayer2.play()
}
I haven't tried Michael's answer yet, but for those of you wondering how to do it with playAtTime, check out this: http://sandmemory.blogspot.com/2014/12/how-would-you-add-reverbecho-to-audio.html and look at the playEcho function.
The code, for me, looks like this:
var playTime1: NSTimeInterval = audioPlayer2.deviceCurrentTime + (duration1 * Double(value))
self.audioPlayer1.stop()
self.audioPlayer1.currentTime = 0
self.audioPlayer1.playAtTime(playTime1)
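Applied to the original two-player question, the same idea schedules both players against the shared device clock, so the second starts right as the first ends (a sketch in the question's Swift 2-era API):

let startTime = audioPlayer1.deviceCurrentTime + 0.1 // small lead so both times are in the future
audioPlayer1.playAtTime(startTime)
audioPlayer2.playAtTime(startTime + audioPlayer1.duration)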
I am writing a music app which will let the user switch between different drum patterns "smoothly" by pressing desired buttons on screen. (By "smoothly" I mean the pattern will switch in the measure immediately following the time when the user presses a button).
My problem is that the time delay I am calculating for the delayed start of the next pattern is slightly larger than desired. I can fix it by reducing the delay by 0.1 sec, but this may not work at a tempo other than the one I am currently using for testing.
The code which calculates the delay is:
func startClock() {
    let aSelector: Selector = "updateClock"
    clock = NSTimer.scheduledTimerWithTimeInterval(0.001, target: self, selector: aSelector, userInfo: nil, repeats: true)
    startTime = CFAbsoluteTimeGetCurrent()
}

func stopClock() {
    clock.invalidate()
}

func updateClock() {
    currentTime = CFAbsoluteTimeGetCurrent()
    elapsedTime = currentTime - startTime
    // Whole beats and measures elapsed since the clock started.
    elapsedBeats = UInt(elapsedTime / audioMeterUpdateInterval)
    elapsedMeasures = UInt(Double(elapsedBeats) / Double(beatUnit[patternSelectIdx]))
    // Target the start of the next measure.
    requiredMeasures = elapsedMeasures + 1
    requiredBeats = requiredMeasures * UInt(beatUnit[patternSelectIdx])
    requiredTime = Double(requiredBeats) * audioMeterUpdateInterval
    delayTime = requiredTime - elapsedTime - 0.1 // 0.1 sec is chosen ad hoc
}
The code for one of the buttons for ending the drum patterns is:
@IBAction func endShort(sender: UIButton) {
    fileName = patternSelect + "End1"
    if startPlay == true {
        play(fileName, numberOfLoops: 0, delaySec: delayTime)
    } else {
        play(fileName, numberOfLoops: 0, delaySec: 0)
    }
    playPauseButton.setImage(playImage, forState: UIControlState.Normal)
    startPlay = false
}
Finally, the play function called by the above code is:
func play(fileName: String, numberOfLoops: Int, delaySec: Double) {
    if delaySec > 0 {
        let delayNSec = Double(NSEC_PER_SEC) * delaySec
        let dispatchTime = dispatch_time(DISPATCH_TIME_NOW, Int64(delayNSec))
        self.ButtonAudioPlayer.numberOfLoops = numberOfLoops
        self.ButtonAudioPlayer.volume = 1.0
        self.ButtonAudioURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(fileName, ofType: "wav")!)
        dispatch_after(dispatchTime, dispatch_get_main_queue(), {
            self.ButtonAudioPlayer = try! AVAudioPlayer(contentsOfURL: self.ButtonAudioURL)
            self.ButtonAudioPlayer.play()
        })
    } else {
        ButtonAudioPlayer.numberOfLoops = numberOfLoops
        ButtonAudioPlayer.volume = 1.0
        ButtonAudioURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(fileName, ofType: "wav")!)
        ButtonAudioPlayer = try! AVAudioPlayer(contentsOfURL: ButtonAudioURL)
        ButtonAudioPlayer.play()
    }
}
Is there an alternative approach to this? Also, could the 0.1 sec be a fixed delay related to the button animation time, and hence remain constant regardless of tempo? Thanks!
I'm making an audio player using AVAudioPlayerNode, and playing a song with some effects (delay, pitch change, rate change, etc.) works, but I'm stuck on displaying the song's current time and duration.
AVAudioPlayerNode doesn't seem to have anything like AVAudioPlayer's .currentTime or .duration.
I'd really appreciate it if you could help me display the current time and duration when using AVAudioPlayerNode.
For the current time, this is what I use:
- (NSTimeInterval)currentTimeInSeconds
{
    AVAudioTime *nodeTime = self.playerNode.lastRenderTime;
    AVAudioTime *playerTime = [self.playerNode playerTimeForNodeTime:nodeTime];
    NSTimeInterval seconds = (double)playerTime.sampleTime / playerTime.sampleRate;
    return seconds;
}
In addition to the render time in an abstract time base, the player node offers a conversion of that time into its own playback timeline via playerTimeForNodeTime. It also respects pauses, etc.
Here it is in Swift:
private func currentTime() -> NSTimeInterval {
    if let nodeTime: AVAudioTime = myAvAudioPlayer.lastRenderTime, playerTime: AVAudioTime = myAvAudioPlayer.playerTimeForNodeTime(nodeTime) {
        return Double(playerTime.sampleTime) / playerTime.sampleRate
    }
    return 0
}
Just replace myAvAudioPlayer with your AVAudioPlayerNode!
Happy coding!
For the duration of the AVAudioPlayerNode (Swift 4):
func durationOfNodePlayer(_ fileUrl: URL) -> TimeInterval {
    do {
        let file = try AVAudioFile(forReading: fileUrl)
        let audioNodeFileLength = AVAudioFrameCount(file.length)
        return Double(audioNodeFileLength) / 44100 // divide by the AVSampleRateKey from the recorder settings
    } catch {
        return 0
    }
}
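Usage might look like this (audioFileURL and durationLabel are assumed names):

let total = durationOfNodePlayer(audioFileURL)
// Format seconds as m:ss for display.
durationLabel.text = String(format: "%d:%02d", Int(total) / 60, Int(total) % 60)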
Swift 5:
Duration, using the AVAudioPlayerNode's AVAudioFile:
extension AVAudioFile {
    var duration: TimeInterval {
        let sampleRateSong = Double(processingFormat.sampleRate)
        let lengthSongSeconds = Double(length) / sampleRateSong
        return lengthSongSeconds
    }
}
according to WWDC 2014 "What's New in Core Audio", around minute 54
Then currentTime:
extension AVAudioPlayerNode {
    var current: TimeInterval {
        if let nodeTime = lastRenderTime, let playerTime = playerTime(forNodeTime: nodeTime) {
            return Double(playerTime.sampleTime) / playerTime.sampleRate
        }
        return 0
    }
}
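Usage, assuming playerNode is your AVAudioPlayerNode and audioFile is the AVAudioFile it is playing:

print("duration:", audioFile.duration)     // from the AVAudioFile extension above
print("current time:", playerNode.current) // from the AVAudioPlayerNode extension above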
Thanks @Brian Becker.
For the duration, the .length property of AVAudioFile worked.
I divided the number of sample frames in the file by the sample rate (44100 Hz).
For the current time, tentatively I use UISlider.value and derive the time from the sample frames.
Something like this should give you that ability...
extension AVAudioPlayerNode {
    var currentTime: TimeInterval {
        get {
            if let nodeTime: AVAudioTime = self.lastRenderTime, let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodeTime) {
                return Double(playerTime.sampleTime) / playerTime.sampleRate
            }
            return 0
        }
    }
}