I'm making an audio player using AVAudioPlayerNode, and playing a song with effects (delay, pitch change, rate change, etc.) works, but I'm stuck on displaying the song's current time and duration.
AVAudioPlayerNode doesn't seem to have anything like AVAudioPlayer's .currentTime or .duration.
I'd really appreciate it if you could help me display the current time and duration using AVAudioPlayerNode.
For the current time, this is what I use:
- (NSTimeInterval)currentTimeInSeconds
{
    AVAudioTime *nodeTime = self.playerNode.lastRenderTime;
    AVAudioTime *playerTime = [self.playerNode playerTimeForNodeTime:nodeTime];
    // Both times can be nil before the node has rendered anything.
    if (playerTime == nil) {
        return 0;
    }
    return (double)playerTime.sampleTime / playerTime.sampleRate;
}
In addition to the render time in an abstract time base, the player node offers conversion of this time into its own playback timeline via playerTimeForNodeTime:. It also respects pauses and the like.
Here it is in Swift:
private func currentTime() -> NSTimeInterval {
    if let nodeTime = myAvAudioPlayer.lastRenderTime,
       playerTime = myAvAudioPlayer.playerTimeForNodeTime(nodeTime) {
        return Double(playerTime.sampleTime) / playerTime.sampleRate
    }
    return 0
}
Just replace myAvAudioPlayer with your AVAudioPlayerNode!
Happy coding!
For the duration of the AVAudioPlayerNode, Swift 4:
func durationOfNodePlayer(_ fileUrl: URL) -> TimeInterval {
    do {
        let file = try AVAudioFile(forReading: fileUrl)
        let audioNodeFileLength = AVAudioFrameCount(file.length)
        // Divide by the file's own sample rate instead of hard-coding 44100.
        return Double(audioNodeFileLength) / file.processingFormat.sampleRate
    } catch {
        return 0
    }
}
Swift 5:
Duration, using the AVAudioPlayerNode's AVAudioFile:
extension AVAudioFile {
    var duration: TimeInterval {
        let sampleRateSong = Double(processingFormat.sampleRate)
        let lengthSongSeconds = Double(length) / sampleRateSong
        return lengthSongSeconds
    }
}
According to WWDC 2014 "What's New in Core Audio", around minute 54.
Then currentTime:
extension AVAudioPlayerNode {
    var current: TimeInterval {
        if let nodeTime = lastRenderTime, let playerTime = playerTime(forNodeTime: nodeTime) {
            return Double(playerTime.sampleTime) / playerTime.sampleRate
        }
        return 0
    }
}
Thanks @Brian Becker
For the duration, the .length property of AVAudioFile worked.
I divided the number of sample frames in the file by the sample rate (44,100 Hz).
For the current time, tentatively I use UISlider.value and sync the value with the sample frames.
Something like this should give you that ability...
extension AVAudioPlayerNode {
    var currentTime: TimeInterval {
        if let nodeTime = self.lastRenderTime, let playerTime = self.playerTime(forNodeTime: nodeTime) {
            return Double(playerTime.sampleTime) / playerTime.sampleRate
        }
        return 0
    }
}
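With that extension (plus the AVAudioFile.duration extension above) you could drive a progress slider from a timer. A rough sketch, where progressSlider, playerNode, and audioFile are placeholders for your own objects:

let updateTimer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
    // Map the current playback time to 0...1 for the slider.
    progressSlider.value = Float(playerNode.currentTime / audioFile.duration)
}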
We have an app which plays a long mp3 file (1 hour long). We want to be able to play from set points within the file. But, when we do it, it is inaccurate by up to 10 seconds.
Here's the code:
let trackStart = arrTracks![MediaPlayer.shared.currentSongNo].samples
let frameRate: Int32 = (MediaPlayer.shared.player?.currentItem?.asset.duration.timescale)!
MediaPlayer.shared.player?.seek(to: CMTimeMakeWithSeconds(Double(trackStart), frameRate),
                                toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
We didn't have the problem with AVAudioPlayer, but (AFAIK) we have to use AVPlayer because we need the better-quality "spectral" AVAudioTimePitchAlgorithm.
[Edit:] - The error is consistent - it always plays from the same (wrong) place for a given requested position. This is also true after restarting.
Any help very much appreciated! Thanks
[Edit:]
We have already tried preferredTimescale: playerTimescale
Also tried kCMTimeIndefinite instead of kCMTimeZero
I have done something similar, but with a slider to change the playback position, and it worked perfectly.
@objc func handleSliderChange(sender: UISlider?) {
    if let duration = player?.currentItem?.duration {
        let totalSeconds = CMTimeGetSeconds(duration)
        let value = Float64(videoSlider.value) * totalSeconds
        let seekTime = CMTime(value: CMTimeValue(value), timescale: 1)
        player?.seek(to: seekTime, completionHandler: { (completedSeek) in
            // do something afterwards
        })
    }
}
In your case it would look like this:
let trackStart = arrTracks![MediaPlayer.shared.currentSongNo].samples
let value = Float64(trackStart)
let seekTime = CMTime(value: CMTimeValue(value), timescale: 1)
MediaPlayer.shared.player?.seek(to: seekTime, completionHandler: { (completedSeek) in
    // do something afterwards
})
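Note that timescale: 1 rounds the seek target to whole seconds. If trackStart carries sub-second precision (it is assumed to be in seconds here, as in your code), a larger preferredTimescale preserves it. A sketch:

let seekTime = CMTime(seconds: Double(trackStart), preferredTimescale: 44100)
MediaPlayer.shared.player?.seek(to: seekTime,
                                toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)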
This is what AVURLAsset's AVURLAssetPreferPreciseDurationAndTimingKey is for. See Apple's documentation.
Beware that this may increase loading time.
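For example (a sketch; fileUrl is a placeholder for your mp3's URL):

let asset = AVURLAsset(url: fileUrl,
                       options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
let item = AVPlayerItem(asset: asset)
MediaPlayer.shared.player = AVPlayer(playerItem: item)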
Try this; it's working perfectly for me:
@IBAction func playbackSliderValueChanged(_ playbackSlider: UISlider) {
    let seconds: Int64 = Int64(playbackSlider.value)
    let targetTime: CMTime = CMTimeMake(value: seconds, timescale: 1)
    DispatchQueue.main.async {
        self.player!.seek(to: targetTime)
        if self.player!.rate == 0 { // if the player has not yet started playing
            self.player?.play()
        }
    }
}
If I drop my Apple Watch and catch it before it hits the ground, the app I'm making should detect that the watch has fallen. But that's not happening. What's wrong with the code below? Thanks!
let motion = CMMotionManager()
if motion.isDeviceMotionAvailable {
    motion.deviceMotionUpdateInterval = 0.1
    motion.startDeviceMotionUpdates()
    if let deviceMotion = motion.deviceMotion {
        let accelerationX = deviceMotion.gravity.x + deviceMotion.userAcceleration.x
        let accelerationY = deviceMotion.gravity.y + deviceMotion.userAcceleration.y
        let accelerationZ = deviceMotion.gravity.z + deviceMotion.userAcceleration.z
        let totalAcceleration = sqrt((accelerationX * accelerationX) + (accelerationY * accelerationY) + (accelerationZ * accelerationZ))
        if totalAcceleration > 9.0 {
            print("Watch has fallen")
        }
    }
    motion.stopDeviceMotionUpdates()
}
motion.deviceMotion just returns the latest sample of device-motion data.
So this will only fetch the data once, at the moment you run it. You will probably need something like a timer to check the acceleration repeatedly.
Something like this (taken from https://developer.apple.com/documentation/coremotion/getting_raw_accelerometer_events):
let motion = CMMotionManager()
var timer: Timer?

func startAccelerometers() {
    // Make sure the accelerometer hardware is available.
    if self.motion.isAccelerometerAvailable {
        self.motion.accelerometerUpdateInterval = 1.0 / 60.0 // 60 Hz
        self.motion.startAccelerometerUpdates()
        // Configure a timer to fetch the data.
        self.timer = Timer(fire: Date(), interval: (1.0 / 60.0),
                           repeats: true, block: { (timer) in
            // Get the accelerometer data.
            if let data = self.motion.accelerometerData {
                let x = data.acceleration.x
                let y = data.acceleration.y
                let z = data.acceleration.z
                // Use the accelerometer data in your app.
            }
        })
        // Add the timer to the current run loop.
        RunLoop.current.add(self.timer!, forMode: .defaultRunLoopMode)
    }
}
Alternatively, you can also pass a handler to startDeviceMotionUpdates, which will be called based on deviceMotionUpdateInterval.
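For example (a minimal sketch; the 0.1 s interval and the 9.0 threshold are copied from the question's code, not tuned values):

motion.deviceMotionUpdateInterval = 0.1
motion.startDeviceMotionUpdates(to: OperationQueue.main) { deviceMotion, error in
    guard let m = deviceMotion else { return }
    let ax = m.gravity.x + m.userAcceleration.x
    let ay = m.gravity.y + m.userAcceleration.y
    let az = m.gravity.z + m.userAcceleration.z
    // Total acceleration in g units.
    let total = sqrt(ax * ax + ay * ay + az * az)
    if total > 9.0 {
        print("Watch has fallen")
    }
}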
I have a ViewController with a star rating that looks like this (except that there are 10 stars).
When a user opens the ViewController for an object that has no rating, I want to draw the user's attention to the stars in a very simple way: by animating the stars' highlighting (you can see this behaviour in some real-world ads, where each letter lights up one after another).
One star highlighted
Two stars highlighted
Three stars highlighted
......
Turn off all of them
So this is how I am doing it:
func delayWithSeconds(_ seconds: Double, completion: @escaping () -> ()) {
    DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
        completion()
    }
}
func ratingStarsAnimation() {
    for i in 1...11 {
        let delay: Double = 0.6 + Double(i) * 0.12
        delayWithSeconds(delay) {
            ratingStars.rating = (i < 10) ? Double(i) : 0
        }
    }
}
What is going on here? I have a function called delayWithSeconds that delays an action, and I use it to delay each star's highlighting; 0.6 is the initial delay before the animation begins. After all stars are highlighted, the last step is to turn off the highlighting of all the stars.
This code works, but I can't say that it is smooth.
My questions are:
How can I change 0.6 + Double(i)*0.12 to get a smooth animation feel?
I think that my solution with delays is not good - how can I solve the smooth star-highlighting task better?
Have a look at the CADisplayLink class. It's a specialized timer that is linked to the refresh rate of the screen; on iOS this is 60 fps.
It's the backbone of many third-party animation libraries.
Usage example:
var displayLink: CADisplayLink?
let start: Double = 0
let end: Double = 10
let duration: CFTimeInterval = 5 // seconds
var startTime: CFTimeInterval = 0
let ratingStars = RatingView()

func create() {
    displayLink = CADisplayLink(target: self, selector: #selector(tick))
    displayLink?.add(to: .main, forMode: .defaultRunLoopMode)
}
@objc func tick() {
    guard let link = displayLink else {
        cleanup()
        return
    }
    if startTime == 0 { // first tick
        startTime = link.timestamp
        return
    }
    let maxTime = startTime + duration
    let currentTime = link.timestamp
    guard currentTime < maxTime else {
        finish()
        return
    }
    // Add math here to ease the animation
    let progress = (currentTime - startTime) / duration
    let progressInterval = (end - start) * Double(progress)
    // get a value in 0...10
    let normalizedProgress = start + progressInterval
    ratingStars.rating = normalizedProgress
}
func finish() {
    ratingStars.rating = 0
    cleanup()
}

func cleanup() {
    displayLink?.remove(from: .main, forMode: .defaultRunLoopMode)
    displayLink = nil
    startTime = 0
}
As a start this will allow your animation to be smoother. You will still need to add some easing math if you want the motion eased in and out, but that shouldn't be too difficult.
CADisplayLink:
https://developer.apple.com/reference/quartzcore/cadisplaylink
Easing curves: http://gizma.com/easing/
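For the "add math here" comment in tick(), one option is a quadratic ease-in-out applied to the normalized progress (a sketch, not the only possible curve):

// Quadratic ease-in-out: slow start, fast middle, slow end.
func easeInOut(_ t: Double) -> Double {
    return t < 0.5 ? 2 * t * t : -1 + (4 - 2 * t) * t
}

// In tick(), replace the linear progress with:
// let progress = easeInOut((currentTime - startTime) / duration)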
I'm trying to play two audio players, one after the other has finished playing. I'm using AVAudioPlayer's playAtTime() to create a delay for the second, as follows:
var audioPlayer1 = AVAudioPlayer()
var audioPlayer2 = AVAudioPlayer()

let soundPathA = NSBundle.mainBundle().pathForResource("A", ofType: "m4a")
let soundURLA = NSURL.fileURLWithPath(soundPathA!)
let soundPathB = NSBundle.mainBundle().pathForResource("B", ofType: "m4a")
let soundURLB = NSURL.fileURLWithPath(soundPathB!)

var noteA = Sound()
noteA.URL = soundURLA
var noteB = Sound()
noteB.URL = soundURLB

self.audioPlayer1 = try! AVAudioPlayer(contentsOfURL: soundURLA)
self.audioPlayer2 = try! AVAudioPlayer(contentsOfURL: soundURLB)

let duration: NSTimeInterval = audioPlayer1.duration
self.audioPlayer1.play()
self.audioPlayer2.playAtTime(duration)
However, no delay occurs. What is my issue here?
playAtTime does not mean "start playing after this delay". Its parameter is a point on the audio output device's timeline, so to delay playback you must pass deviceCurrentTime plus the desired delay; a small value like audioPlayer1.duration lies in the past on that timeline, so the sound plays immediately. See the documentation here.
If you want to wait before playing, you could use dispatch_after:
let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(0.6 * Double(NSEC_PER_SEC)))
dispatch_after(delayTime, dispatch_get_main_queue()) {
    self.audioPlayer2.play()
}
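Or, given the corrected playAtTime semantics above (a sketch in the question's Swift 2 style, assuming both players are already initialized):

audioPlayer1.play()
audioPlayer2.playAtTime(audioPlayer2.deviceCurrentTime + audioPlayer1.duration)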
I haven't tried Michael's answer yet, but for those of you wondering how to do it with playAtTime, check out this: http://sandmemory.blogspot.com/2014/12/how-would-you-add-reverbecho-to-audio.html and look at the playEcho function.
The code, for me, looks like this:
let playTime1: NSTimeInterval = audioPlayer2.deviceCurrentTime + (duration1 * Double(value))
self.audioPlayer1.stop()
self.audioPlayer1.currentTime = 0
self.audioPlayer1.playAtTime(playTime1)
Is there a method to get the value of the fps in Swift?
No, I don't mean changing the frameInterval,
and no, I don't mean
skView.showsFPS = true
I want to get the fps so that I can have my app wait until the fps reaches 60 before starting the game. For some reason my app occasionally gets 'stuck' at 40 fps, lingers there for 5 seconds, then slowly climbs back to 60. This usually happens when I summon Control Center.
You can calculate it yourself by doing something like this in your scene:
var lastUpdateTime: TimeInterval = 0

override func update(_ currentTime: TimeInterval) {
    // Skip the first frame, where lastUpdateTime is still 0.
    if lastUpdateTime > 0 {
        let deltaTime = currentTime - lastUpdateTime
        let currentFPS = 1 / deltaTime
        print(currentFPS)
    }
    lastUpdateTime = currentTime
}
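If the goal is to wait until the frame rate has stabilized before starting the game, you could smooth that instantaneous value, since single frames jitter a lot. A sketch building on the code above (gameStarted and startGame() are hypothetical names):

var smoothedFPS: Double = 0
var gameStarted = false

override func update(_ currentTime: TimeInterval) {
    if lastUpdateTime > 0 {
        let currentFPS = 1 / (currentTime - lastUpdateTime)
        // Exponential moving average damps frame-to-frame jitter.
        smoothedFPS = 0.9 * smoothedFPS + 0.1 * currentFPS
        if !gameStarted && smoothedFPS > 58 {
            gameStarted = true
            startGame() // hypothetical game-start hook
        }
    }
    lastUpdateTime = currentTime
}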
There's another way to get the FPS of the view itself, independent of which scene is presented, using SKViewDelegate (taken from the Developer Documentation, just with an added FPS property):
class ViewController: NSViewController, SKViewDelegate {
    let spriteView: SKView = SKView()

    override func viewDidLoad() {
        super.viewDidLoad()
        self.spriteView.delegate = self // Set the view's delegate to the ViewController.
    }

    let targetFps: TimeInterval = 60 // Set to your preferred fps
    var lastRenderTime: TimeInterval = 0
    var fps: TimeInterval = 0.0 // Access this property to get the view's fps

    // Delegate method:
    public func view(_ view: SKView, shouldRenderAtTime time: TimeInterval) -> Bool {
        if time - lastRenderTime >= 1 / targetFps { // Check if the scene should render.
            // Determine the fps from the time since the last render:
            self.fps = 1 / (time - lastRenderTime)
            // Set the last render time to the current time:
            self.lastRenderTime = time
            // Let the scene know that it should render the next frame.
            return true
        } else {
            return false
        }
    }
}
gameView.showsStatistics = true