I have a sound in my app that starts automatically when the view appears; but, as the title says, I'd like this sound to start with a little delay, about half a second after the view appears. I tried to use playAtTime, but either it does not work or I have set something up wrong...
This is my code:
var player: AVAudioPlayer?

override func viewDidLoad()
{
    super.viewDidLoad()
    playAudioWithDelay()
}

func playAudioWithDelay()
{
    let file = NSBundle.mainBundle().URLForResource("PR1", withExtension: "wav")
    player = AVAudioPlayer(contentsOfURL: file, error: nil)
    player!.volume = 0.5
    player!.numberOfLoops = -1
    player!.playAtTime(0.5) // I tried with 0.5 but it doesn't work
    player!.prepareToPlay()
    player!.play()
}
You can try using this:
let seconds = 1.0 // time to delay
let delay = seconds * Double(NSEC_PER_SEC) // nanoseconds per second
let dispatchTime = dispatch_time(DISPATCH_TIME_NOW, Int64(delay))
dispatch_after(dispatchTime, dispatch_get_main_queue(), {
    // play sound here
})
Full code:
func playAudioWithDelay()
{
    let file = NSBundle.mainBundle().URLForResource("PR1", withExtension: "wav")
    player = AVAudioPlayer(contentsOfURL: file, error: nil)
    player!.volume = 0.5
    player!.numberOfLoops = -1
    player!.prepareToPlay()

    let seconds = 1.0 // time to delay
    let delay = seconds * Double(NSEC_PER_SEC) // nanoseconds per second
    let dispatchTime = dispatch_time(DISPATCH_TIME_NOW, Int64(delay))
    dispatch_after(dispatchTime, dispatch_get_main_queue(), {
        self.player!.play()
    })
}
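As an aside, the likely reason playAtTime appeared to do nothing is that it takes a deadline on the audio device's clock, not a delay in seconds. A minimal sketch in the same Swift 2 syntax as above, assuming the player is already configured:

player!.prepareToPlay()
// playAtTime expects an absolute time on the device clock,
// so add the desired delay to deviceCurrentTime:
player!.playAtTime(player!.deviceCurrentTime + 0.5)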
Try the following function, implemented in Swift 3.0:
var player: AVAudioPlayer?

func playAudioWithDelay()
{
    let file = Bundle.main.url(forResource: "PR1", withExtension: "wav")
    do {
        player = try AVAudioPlayer(contentsOf: file!)
        player?.volume = 0.5
        player?.numberOfLoops = -1
        player?.prepareToPlay()
    } catch let error as NSError {
        print("error: \(error.localizedDescription)")
    }

    let seconds = 1.0 // time to delay
    let when = DispatchTime.now() + seconds
    DispatchQueue.main.asyncAfter(deadline: when) {
        self.play()
    }
}

func play() {
    if player?.isPlaying == false {
        player?.play()
    }
}
Swift 5 Audio Delay Settings:
var player: AVAudioPlayer?

func playAudio(soundName: String, `extension`: String, delay: Double)
{
    let file = Bundle.main.url(forResource: soundName, withExtension: `extension`)
    do {
        player = try AVAudioPlayer(contentsOf: file!)
        player?.volume = 0.5
        player?.numberOfLoops = -1
        player?.prepareToPlay()
    } catch let error as NSError {
        print("error: \(error.localizedDescription)")
    }

    let seconds = delay // time to delay
    let when = DispatchTime.now() + seconds
    DispatchQueue.main.asyncAfter(deadline: when) {
        self.player?.play()
    }
}
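Usage might then look like this (assuming the PR1.wav from the question is bundled with the app):

playAudio(soundName: "PR1", extension: "wav", delay: 0.5)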
Related question:
The goal is to play several sounds one after another (getReady -> nextExercise -> burpees).
The problem is that only the first one is played.
How it should work:
1. I call playGetReady() from WorkoutTabataViewController.
2. It plays the first sound.
3. After the first sound finishes, audioPlayerDidFinishPlaying() is called automatically.
4. That triggers the playNextSound() func, which plays the next sound.
But audioPlayerDidFinishPlaying() is never called. Or am I missing something and it should work differently?
class AudioPlayerManager: AVAudioPlayerDelegate {
    var description: String
    static let shared = AudioPlayerManager()
    var audioPlayer: AVAudioPlayer?
    var workoutVC: WorkoutTabataViewController?
    var mainVC: MainTabataViewController?
    var currentSound = 0
    let urls: [URL]

    init() {
        self.description = ""
        // First sound
        let getReady = Bundle.main.path(forResource: "Get ready", ofType: "mp3")!
        let urlGetReady = URL(fileURLWithPath: getReady)
        // Second sound
        let nextExercise = Bundle.main.path(forResource: "Next Exercise", ofType: "mp3")!
        let urlNextExercise = URL(fileURLWithPath: nextExercise)
        // Third sound
        let burpees = Bundle.main.path(forResource: "Burpees", ofType: "mp3")!
        let urlBurpees = URL(fileURLWithPath: burpees)
        urls = [urlGetReady, urlNextExercise, urlBurpees]
    }

    func playGetReady() {
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: urls[currentSound])
            audioPlayer?.delegate = self
            audioPlayer?.play()
        } catch {
            print(error)
        }
    }

    func playNextSound() {
        currentSound += 1
        if currentSound < urls.count {
            do {
                audioPlayer = try AVAudioPlayer(contentsOf: urls[currentSound])
                audioPlayer?.delegate = self
                audioPlayer?.play()
            } catch {
                print(error)
            }
        }
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        if flag {
            playNextSound()
        }
    }
}
Your audio manager class is not introspectable. Say @objc func audioPlayerDidFinishPlaying or, better, make it an NSObject subclass.
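A minimal sketch of the NSObject route, assuming the same bundled mp3 files as the question (note the stored description property has to go, since NSObject already provides one):

import AVFoundation

class AudioPlayerManager: NSObject, AVAudioPlayerDelegate {
    static let shared = AudioPlayerManager()
    var audioPlayer: AVAudioPlayer?
    var currentSound = 0
    let urls: [URL]

    override init() {
        // Same bundle lookups as the original init(); NSObject requires
        // calling super.init() once stored properties are set.
        urls = ["Get ready", "Next Exercise", "Burpees"].compactMap {
            Bundle.main.url(forResource: $0, withExtension: "mp3")
        }
        super.init()
    }

    func playGetReady() {
        playSound(at: currentSound)
    }

    func playNextSound() {
        currentSound += 1
        if currentSound < urls.count {
            playSound(at: currentSound)
        }
    }

    private func playSound(at index: Int) {
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: urls[index])
            audioPlayer?.delegate = self
            audioPlayer?.play()
        } catch {
            print(error)
        }
    }

    // With the class visible to the Objective-C runtime, AVFoundation can
    // find this delegate callback and actually call it:
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        if flag {
            playNextSound()
        }
    }
}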
We're trying to implement AVPlayer seek in our SwiftUI app; it worked prior to iOS 15.4 but stopped working after the update.
let playerCurrentTime = CMTimeGetSeconds(player.currentTime())
let newTime = playerCurrentTime + 45
let time2: CMTime = CMTimeMake(value: Int64(newTime * 1000 as Float64), timescale: 1000)
player.seek(to: time2, toleranceBefore: CMTime.zero, toleranceAfter: CMTime.zero) { success in
    print(success)
}
The completionHandler is called immediately with success = false.
No other seek operations are running, and the AVPlayer status is readyToPlay.
We're streaming an MP3 file from a URL, using this initialisation code:
playerItem = AVPlayerItem(url: url)
player = AVPlayer(playerItem: playerItem)

player.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 2), queue: DispatchQueue.main) { _ in
    if self.player.currentItem?.status == .readyToPlay {
        self.currentTimeInSeconds = CMTimeGetSeconds(self.player.currentTime())
        self.progressInPct = Double(self.currentTimeInSeconds) / Double(self.totalTimeInSeconds)
        self.nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = 1
        MPNowPlayingInfoCenter.default().nowPlayingInfo = self.nowPlayingInfo
        self.setupNowPlaying()
    } else {
        self.nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = 0
        MPNowPlayingInfoCenter.default().nowPlayingInfo = self.nowPlayingInfo
    }
}
We tried seeking on currentItem as well, but that didn't work either.
player.currentItem?.seek(to: time, toleranceBefore: CMTime.zero, toleranceAfter: CMTime.zero)
Anyone else experienced something like this, and have any pointers?
UPDATE:
Tried a completely bare-bones attempt, but still the same result:
struct testView: View {
    var player = AVPlayer()

    var body: some View {
        Button {
            self.startPlayer(url: episode.streamUrl!)
        } label: {
            Text("Test")
        }
    }

    func startPlayer(url: String) {
        let playerItem = AVPlayerItem(url: URL(string: url)!)
        self.player.replaceCurrentItem(with: playerItem)
        player.play()
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
            let time2: CMTime = CMTimeMake(value: Int64(45 * 1000 as Float64), timescale: 1000)
            player.seek(to: time2, toleranceBefore: CMTime.zero, toleranceAfter: CMTime.zero) { success in
                print(success)
            }
        }
    }
}
Prints "false".
For comparison, here is a piece of code that I use for time observation, plus three types of seeking functions. It has been working fine so far on iOS 15.4 (although with UIKit and an .m3u8 playlist). Hopefully it helps you in some way.
Seeking functions:
func seek1() {
    let videoDuration = avPlayer.currentItem?.duration.seconds ?? 0
    let elapsedTime: Float64 = videoDuration * Float64(archiveControlsView.seekSlider.value)
    let toTime = CMTime(seconds: elapsedTime, preferredTimescale: 100)
    avPlayer.seek(to: toTime, completionHandler: { (completed: Bool) -> Void in
        // do whatever you need
    })
}

func seek2() {
    let diff: TimeInterval = 60
    self.avPlayer.seek(to: CMTime(seconds: diff, preferredTimescale: 100))
}

func seekToZero() {
    self.avPlayer.seek(to: .zero)
}
Time observing function:

var timeObserver: Any?

func createTimer() {
    let timeInterval = CMTime(seconds: 1.0, preferredTimescale: 10)
    timeObserver = avPlayer.addPeriodicTimeObserver(forInterval: timeInterval,
                                                    queue: DispatchQueue.main) { (elapsedTime: CMTime) in
        guard self.status != .Seeking else { return }
        self.observeTime(elapsedTime)
        let duration = self.avPlayer.currentItem?.duration.seconds ?? 0
        let elapsedSeconds = elapsedTime.seconds
        // do whatever you need with elapsed time and duration
    }
}
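One more thing that may be worth checking (an assumption on my part, not verified against 15.4): a zero-tolerance seek on a stream can report false when the target time isn't inside the item's seekable range yet. You could guard the seek on seekableTimeRanges, e.g.:

// Hypothetical guard: only issue the zero-tolerance seek once the
// target time falls inside the item's currently seekable range.
if let range = player.currentItem?.seekableTimeRanges.first?.timeRangeValue,
   range.containsTime(time2) {
    player.seek(to: time2, toleranceBefore: .zero, toleranceAfter: .zero) { success in
        print(success)
    }
} else {
    print("target time is not within the seekable range yet")
}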
I am trying to play multiple audio files using 2 AVPlayer instances, but one of the players stops for a fraction of a second rather than playing all audio files simultaneously.
The logic of the program is as follows:
var player: AVPlayer? streams an audio file from my database. On its own it plays perfectly.
fileprivate var countPlayer: AVPlayer? plays the count number of the current item being played by var player. The count is a sequence of 1 to 8, and for each digit I am storing/sandboxing a .wav file locally, such as 1.wav, 2.wav, ..., 8.wav.
When the current time of var player reaches a certain time, countPlayer is triggered and plays one of the local files 1.wav, 2.wav, etc.
The problem is that when countPlayer starts playing, it causes the background AVPlayer, namely var player, to stop for a fraction of a second, similar to what's described in this comment:
Play multiple Audio Files with AVPlayer
var player: AVPlayer? // plays the song
fileprivate var countPlayer: AVPlayer? // plays the count number of the song

private func addBoundaryTimeObserver(tableIndexPath: IndexPath) {
    let mediaItem = mediaArray[tableIndexPath.row]
    guard let url = URL(string: mediaItem.mediaAudioUrlStringRepresentation ?? "") else { return }
    let playerItem = AVPlayerItem(url: url)
    player = AVPlayer(playerItem: playerItem)

    var timesToTransverse = [NSValue]()

    // convert the string representation of times to an array
    let timesRecorded: [String] = mediaItem.timesRecorded.components(separatedBy: ",")

    // build boundary times from the recorded time strings
    let timeDoubles: [Double] = timesRecorded.compactMap { timeString in
        return Double(timeString)
    }

    guard timeDoubles.count > 0 else { return } // unexpected

    timesToTransverse = timeDoubles.map { second in
        let cmtime = CMTime(seconds: second, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
        return NSValue(time: cmtime)
    }

    guard timesToTransverse.count != 0 else { return }
    guard let playerCell = tableView.cellForRow(at: IndexPath(row: 0, section: 0)) as? PlayerCell else { return }

    startTime = Date().timeIntervalSinceReferenceDate
    timeIndex = 0

    player?.play()
    player?.rate = Float(initialPlaybackRate)

    // Queue on which to invoke the callback
    let mainQueue = DispatchQueue.main

    // Add time observer
    timeObserverToken = player?.addBoundaryTimeObserver(forTimes: timesToTransverse, queue: mainQueue) { [weak self] in
        // Because there are no time signature changes, we can simply increment
        // timeIndex by 1 every time the addBoundaryTimeObserver completion handler
        // is called, and subscript timesToTransverse with timeIndex to get the
        // subsequent time in seconds.
        guard let strongSelf = self, strongSelf.timeIndex < timesToTransverse.count else { return }

        let timeElement = timesToTransverse[strongSelf.timeIndex]
        strongSelf.timeInSeconds = CMTimeGetSeconds(timeElement.timeValue)

        // show progress in progressView
        let duration = CMTimeGetSeconds(playerItem.duration)
        let cmtimeSeconds = CMTime(seconds: strongSelf.timeInSeconds, preferredTimescale: CMTimeScale(NSEC_PER_SEC))

        // total time since the timer started, in seconds
        strongSelf.timeInSeconds = Date().timeIntervalSinceReferenceDate - strongSelf.startTime
        let timeString = String(format: "%.2f", strongSelf.timeInSeconds)
        strongSelf.timeString = timeString

        // use the remainder operator to determine the beat count
        let beat = (strongSelf.timeIndex + 1) % 8 == 0 ? 8 : ((strongSelf.timeIndex + 1) % 8)

        // play the beat count: 1, 2, ..., 8
        strongSelf.prepareToPlayAudio(beatCount: beat)
        /*
         0: (0 + 1) % 8 = 1
         1: (1 + 1) % 8 = 2
         6: (6 + 1) % 8 = 7
         7: (7 + 1) % 8 = 0
         */
        strongSelf.timeIndex += 1
    }
} // end addBoundaryTimeObserver

// determine which wav file to play
private func prepareToPlayAudio(beatCount: Int) {
    switch beatCount {
    case 1:
        guard let url = Bundle.main.url(forResource: "1", withExtension: "wav") else { return }
        playWith(beatCountURL: url)
    // 7 more cases go here .....
    default: print("unexpected case here")
    }
} // end prepareToPlayAudio(beatCount:)

private func playWith(beatCountURL: URL) {
    let playerItem = AVPlayerItem(url: beatCountURL)
    countPlayer = AVPlayer(playerItem: playerItem)
    countPlayer?.play()
}
You would be better off using AVAudioPlayerNode, AVAudioMixerNode, and AVAudioEngine. With these classes you won't have the problem you have right now, and they're not that difficult to set up.
You can check out my gist; in order to play the sounds in your Playground you need to put the audio files into the Resources folder in the Project Navigator:
https://gist.github.com/standinga/24342d23acfe70dc08cbcc994895f32b
The code works without stopping background audio when top sounds are triggered.
Here's also the same code:
import AVFoundation
import PlaygroundSupport

PlaygroundPage.current.needsIndefiniteExecution = true

class AudioPlayer {
    var backgroundAudioFile: AVAudioFile
    var topAudioFiles: [AVAudioFile] = []
    var engine: AVAudioEngine
    var backgroundAudioNode: AVAudioPlayerNode
    var topAudioAudioNodes = [AVAudioPlayerNode]()
    var mixer: AVAudioMixerNode
    var timer: Timer!
    var urls: [URL] = []

    init(_ url: URL, urls: [URL] = []) {
        backgroundAudioFile = try! AVAudioFile(forReading: url)
        topAudioFiles = urls.map { try! AVAudioFile(forReading: $0) }
        engine = AVAudioEngine()
        mixer = AVAudioMixerNode()
        engine.attach(mixer)
        engine.connect(mixer, to: engine.outputNode, format: nil)
        self.urls = urls
        backgroundAudioNode = AVAudioPlayerNode()
        for _ in topAudioFiles {
            topAudioAudioNodes += [AVAudioPlayerNode()]
        }
    }

    func start() {
        engine.attach(backgroundAudioNode)
        engine.connect(backgroundAudioNode, to: mixer, format: nil)
        backgroundAudioNode.scheduleFile(backgroundAudioFile, at: nil, completionHandler: nil)
        try! engine.start()
        backgroundAudioNode.play()

        for node in topAudioAudioNodes {
            engine.attach(node)
            engine.connect(node, to: mixer, format: nil)
            try! engine.start()
        }

        // simulate rescheduling files played on top of the background audio
        DispatchQueue.global().async { [unowned self] in
            for i in 0..<1000 {
                sleep(2)
                let index = i % self.topAudioAudioNodes.count
                let node = self.topAudioAudioNodes[index]
                node.scheduleFile(self.topAudioFiles[index], at: nil, completionHandler: nil)
                node.play()
            }
        }
    }
}

let bundle = Bundle.main
let beepLow = bundle.url(forResource: "beeplow", withExtension: "wav")!
let beepMid = bundle.url(forResource: "beepmid", withExtension: "wav")!
let backgroundAudio = bundle.url(forResource: "backgroundAudio", withExtension: "wav")!
let audioPlayer = AudioPlayer(backgroundAudio, urls: [beepLow, beepMid])
audioPlayer.start()
I am using AVPlayer to play an mp3 audio file only. I am using a URL that I tested and that works fine. I needed to use AVPlayer because I needed to set up a UISlider programmatically, and AVPlayer is convenient for that. The UISlider works and updates as the audio plays. The audio might be playing, but I cannot hear the sound; I say this because the UISlider is working.
Update: You can hear the audio when building the app on a simulator. The issue occurs when building it on a device (mine is an XS Max).
Link to screen recording: https://streamable.com/nkbn8
I have tried using the same URL with AVAudioPlayer, and audio plays and you can hear it.
private func setupAudioContent() {
    let urlString = "https://s3.amazonaws.com/kargopolov/kukushka.mp3"
    if let url = URL(string: urlString) {
        audioPlayer = AVPlayer(url: url)
        let playerLayer = AVPlayerLayer(player: audioPlayer)
        self.layer.addSublayer(playerLayer)
        playerLayer.frame = self.frame
        audioPlayer?.play()
        audioPlayer?.volume = 1.0
        audioPlayer?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)

        let interval = CMTime(value: 1, timescale: 2)
        audioPlayer?.addPeriodicTimeObserver(forInterval: interval, queue: DispatchQueue.main, using: { (progressTime) in
            let currentTime = CMTimeGetSeconds(progressTime)
            let currentTimeSecondsString = String(format: "%02d", Int(currentTime.truncatingRemainder(dividingBy: 60)))
            let currentTimeMinutesString = String(format: "%02d", Int(currentTime / 60))
            self.currentTimeLabel.text = "\(currentTimeMinutesString):\(currentTimeSecondsString)"
            if let duration = self.audioPlayer?.currentItem?.duration {
                let durationSeconds = CMTimeGetSeconds(duration)
                self.audioSlider.value = Float(currentTime / durationSeconds)
            }
        })
    }
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "currentItem.loadedTimeRanges" {
        isAudioPlaying = true
        if let duration = audioPlayer?.currentItem?.duration {
            let seconds = CMTimeGetSeconds(duration)
            let secondsText = String(format: "%02d", Int(seconds) % 60)
            let minutesText = String(format: "%02d", Int(seconds) / 60)
            audioLengthLabel.text = "\(minutesText):\(secondsText)"
        }
    }
}

@objc func handleSliderChange() {
    if let duration = audioPlayer?.currentItem?.duration {
        let totalSeconds = CMTimeGetSeconds(duration)
        let value = Float64(audioSlider.value) * totalSeconds
        let seekTime = CMTime(value: Int64(value), timescale: 1)
        audioPlayer?.seek(to: seekTime, completionHandler: { (completedSeek) in
        })
    }
}
Expected result: hear audio playing.
Actual result: cannot hear audio playing; it seems like the audio is playing, just with no sound.
When using AVPlayer you should make sure your device is not in silent mode, as that will cause it to output no audio even though your volume is at max.
If you would like to keep your device in silent mode and still play the audio, you can use the following code before your .play():
do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
} catch {
    // report the error
    print(error)
}
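Depending on your setup you may also need to activate the session. This is an addition on my part; setting the category alone is often enough once playback starts, but being explicit can't hurt:

do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    // Make the category take effect right away:
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error)
}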
I'm creating a game where the user controls a character with a jetpack. When the jetpack intersects a diamond, I add the diamond to their total and then play a sound. However, the sound makes the game pause for a tenth of a second or so and disrupts the flow. This is the code I'm using:
var diamondSound = NSBundle.mainBundle().URLForResource("diamondCollect", withExtension: "wav")!
var diamondPlayer: AVAudioPlayer?
class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        do {
            diamondPlayer = try AVAudioPlayer(contentsOfURL: diamondSound)
            guard let player = diamondPlayer else { return }
            player.prepareToPlay()
        } catch let error as NSError {
            print(error.description)
        }
    }
And then later:
    override func update(currentTime: CFTimeInterval) {
        if character.intersectsNode(diamond) {
            diamondPlayer?.play()
            addDiamond()
            diamond.removeFromParent()
        }
    }
Also I am using Sprite Kit if that matters. Any help is greatly appreciated!
Usually, I prefer to use SKAction.playSoundWithFile in my games, but this one has a limitation: there is no volume setting.
So, with this extension you can address that lack:
public extension SKAction {
    public class func playSoundFileNamed(fileName: String, atVolume: Float, waitForCompletion: Bool) -> SKAction {
        let nameOnly = (fileName as NSString).stringByDeletingPathExtension
        let fileExt = (fileName as NSString).pathExtension
        let soundPath = NSBundle.mainBundle().URLForResource(nameOnly, withExtension: fileExt)

        var player: AVAudioPlayer!
        do { player = try AVAudioPlayer(contentsOfURL: soundPath!, fileTypeHint: nil) }
        catch let error as NSError { print(error.description) }
        player.volume = atVolume

        let playAction: SKAction = SKAction.runBlock { () -> Void in
            player.prepareToPlay()
            player.play()
        }

        if waitForCompletion {
            let waitAction = SKAction.waitForDuration(player.duration)
            let groupAction: SKAction = SKAction.group([playAction, waitAction])
            return groupAction
        }

        return playAction
    }
}
Usage:
self.runAction(SKAction.playSoundFileNamed("diamondCollect.wav", atVolume:0.5, waitForCompletion: true))
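As an alternative (not part of the extension above, and assuming iOS 9+): an SKAudioNode lives in the scene graph and keeps its sound loaded, which also helps avoid the first-play hitch. A minimal sketch in the same Swift 2 syntax, using the question's diamondCollect.wav:

// Create once, e.g. in didMoveToView:
let diamondAudio = SKAudioNode(fileNamed: "diamondCollect.wav")
diamondAudio.autoplayLooped = false // don't start playing when added
addChild(diamondAudio)

// Later, when the diamond is collected:
diamondAudio.runAction(SKAction.play())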