I am working on loading a (local) movie into AVPlayer and applying processing to the audio track with an audioTapProcessor. So far I've found great GitHub examples here, here, and here. I'm using the "tap cookie" approach used in the last link and in an answer to this previous question.
Audio & video playback are working fine. However, my tapPrepare and tapProcess callbacks are not being called, but Init and Finalize are. So I'm doing something both right and wrong. The relevant code is attached; any help appreciated!
import Foundation
import AVFoundation
import AudioToolbox
import MediaToolbox
import CoreAudioTypes
class PlayerViewController: UIViewController {
class TapCookie {
weak var content: PlayerViewController?
deinit {
print("TapCookie deinit") // appears after tapFinalize
}
}
// MARK: Properties
var playerAsset: AVURLAsset?
var playerItem: AVPlayerItem! = nil
var audioProcessingFormat: AudioStreamBasicDescription?
private var tracksObserver: NSKeyValueObservation?
// MARK: Button to trigger actions
@IBAction func selectVideo(_ sender: Any) {
// starts doing stuff:
// - select a video file from device, extract movieURL string ...
playerAsset = AVURLAsset(url: movieURL)
playerItem = AVPlayerItem(url: movieURL)
//... then send asset to AVPlayer (not shown)
// set up audioProcessingTap
tracksObserver = playerItem.observe(\AVPlayerItem.tracks, options: [.initial, .new]) {
[unowned self] item, change in
installTap(playerItem: playerItem)
}
}
func installTap(playerItem: AVPlayerItem) {
let cookie = TapCookie()
cookie.content = self
var callbacks = MTAudioProcessingTapCallbacks(
version: kMTAudioProcessingTapCallbacksVersion_0,
clientInfo: UnsafeMutableRawPointer(Unmanaged.passRetained(cookie).toOpaque()),
init: tapInit,
finalize: tapFinalize,
prepare: tapPrepare,
unprepare: tapUnprepare,
process: tapProcess)
var tap: Unmanaged<MTAudioProcessingTap>?
let err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
assert(noErr == err)
// tapInit successfully called after MTAudioProcessingTapCreate
let audioMix = AVMutableAudioMix()
let audioTrack = playerItem.asset.tracks(withMediaType: AVMediaType.audio).first! //use first audio track
let inputParams = AVMutableAudioMixInputParameters(track: audioTrack)
inputParams.audioTapProcessor = tap?.takeRetainedValue()
audioMix.inputParameters = [inputParams]
playerItem.audioMix = audioMix
}
// MARK: install tap callbacks
let tapInit: MTAudioProcessingTapInitCallback = {
(tap, clientInfo, tapStorageOut) in
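// stash the retained cookie pointer in the tap's storage so the other
// callbacks can retrieve it later via MTAudioProcessingTapGetStorage(tap)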
tapStorageOut.pointee = clientInfo
print("tapInit tap: \(tap)\n clientInfo: \(String(describing: clientInfo))\n tapStorageOut: \(tapStorageOut)\n")
}
// tapPrepare not called !!
let tapPrepare: MTAudioProcessingTapPrepareCallback = {
(tap, maxFrames, processingFormat) in
print("tapPrepare tap: \(tap), maxFrames: \(maxFrames)\n processingFormat: \(processingFormat)")
let cookie = Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
cookie.content!.audioProcessingFormat = AudioStreamBasicDescription(mSampleRate: processingFormat.pointee.mSampleRate,
mFormatID: processingFormat.pointee.mFormatID,
mFormatFlags: processingFormat.pointee.mFormatFlags,
mBytesPerPacket: processingFormat.pointee.mBytesPerPacket,
mFramesPerPacket: processingFormat.pointee.mFramesPerPacket,
mBytesPerFrame: processingFormat.pointee.mBytesPerFrame,
mChannelsPerFrame: processingFormat.pointee.mChannelsPerFrame,
mBitsPerChannel: processingFormat.pointee.mBitsPerChannel,
mReserved: processingFormat.pointee.mReserved)
}
let tapUnprepare: MTAudioProcessingTapUnprepareCallback = {
(tap) in
print("tapUnprepare \(tap)")
}
// tapProcess not called !!
let tapProcess: MTAudioProcessingTapProcessCallback = {
(tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) in
print("tapProcess \(tap)\n \(numberFrames)\n \(flags)\n \(bufferListInOut)\n \(numberFramesOut)\n \(flagsOut)\n")
let status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
if noErr != status {
print("get audio: \(status)")
}
let cookie = Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
guard let cookieContent = cookie.content else {
print("Tap callback: cookie content was deallocated!")
return
}
// process audio here...
}
let tapFinalize: MTAudioProcessingTapFinalizeCallback = {
(tap) in
print("tapFinalize \(tap)")
// release cookie
Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
}
}
You need to create an AVPlayer
player = AVPlayer(playerItem: playerItem)
and then at some point start it playing:
player.play()
Then the prepare and process callbacks will be called.
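Putting those pieces together, a minimal sketch (the player property and the startPlayback name are assumptions; the question elides this part of the setup):
var player: AVPlayer?

func startPlayback(with playerItem: AVPlayerItem) {
    // keep a strong reference so the player isn't deallocated
    player = AVPlayer(playerItem: playerItem)
    // the tap's prepare/process callbacks only fire once playback actually runs
    player?.play()
}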
I've consulted many write-ups on background app refresh for Apple Watch so that I can update my app's complications. However, the process seems very hit or miss, and most of the time it doesn't run at all after a while.
Here is the code I currently have:
BackgroundService.swift
Responsibility: schedule background refresh, process downloads, and update complications.
import Foundation
import WatchKit
final class BackgroundService: NSObject, URLSessionDownloadDelegate {
var isStarted = false
private let requestFactory: RequestFactory
private let logManager: LogManager
private let complicationService: ComplicationService
private let notificationService: NotificationService
private var pendingBackgroundTask: WKURLSessionRefreshBackgroundTask?
private var backgroundSession: URLSession?
init(requestFactory: RequestFactory,
logManager: LogManager,
complicationService: ComplicationService,
notificationService: NotificationService
) {
self.requestFactory = requestFactory
self.logManager = logManager
self.complicationService = complicationService
self.notificationService = notificationService
super.init()
NotificationCenter.default.addObserver(self,
selector: #selector(handleInitialSchedule(_:)),
name: Notification.Name("ScheduleBackgroundTasks"),
object: nil
)
}
func updateContent() {
self.logManager.debugMessage("In BackgroundService updateContent")
let complicationsUpdateRequest = self.requestFactory.makeComplicationsUpdateRequest()
let config = URLSessionConfiguration.background(withIdentifier: "app.wakawatch.background-refresh")
config.isDiscretionary = false
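// sessionSendsLaunchEvents lets watchOS relaunch the app with a
// WKURLSessionRefreshBackgroundTask when the transfer completes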
config.sessionSendsLaunchEvents = true
self.backgroundSession = URLSession(configuration: config,
delegate: self,
delegateQueue: nil)
let backgroundTask = self.backgroundSession?.downloadTask(with: complicationsUpdateRequest)
backgroundTask?.resume()
self.isStarted = true
self.logManager.debugMessage("backgroundTask started")
}
func handleDownload(_ backgroundTask: WKURLSessionRefreshBackgroundTask) {
self.logManager.debugMessage("Handling finished download")
self.pendingBackgroundTask = backgroundTask
}
func urlSession(_ session: URLSession,
downloadTask: URLSessionDownloadTask,
didFinishDownloadingTo location: URL) {
processFile(file: location)
self.logManager.debugMessage("Marking pending background tasks as completed.")
if self.pendingBackgroundTask != nil {
self.pendingBackgroundTask?.setTaskCompletedWithSnapshot(false)
self.backgroundSession?.invalidateAndCancel()
self.pendingBackgroundTask = nil
self.backgroundSession = nil
self.logManager.debugMessage("Pending background task cleared")
}
self.schedule()
}
func processFile(file: URL) {
guard let data = try? Data(contentsOf: file) else {
self.logManager.errorMessage("file could not be read as data")
return
}
guard let backgroundUpdateResponse = try? JSONDecoder().decode(BackgroundUpdateResponse.self, from: data) else {
self.logManager.errorMessage("Unable to decode response to Swift object")
return
}
let defaults = UserDefaults.standard
defaults.set(backgroundUpdateResponse.totalTimeCodedInSeconds,
forKey: DefaultsKeys.complicationCurrentTimeCoded)
self.complicationService.updateTimelines()
self.notificationService.isPermissionGranted(onGrantedHandler: {
self.notificationService.notifyGoalsAchieved(newGoals: backgroundUpdateResponse.goals)
})
self.logManager.debugMessage("Complication updated")
}
func schedule() {
let time = self.isStarted ? 15 * 60 : 60
let nextInterval = TimeInterval(time)
let preferredDate = Date.now.addingTimeInterval(nextInterval)
WKExtension.shared().scheduleBackgroundRefresh(withPreferredDate: preferredDate,
userInfo: nil) { error in
if error != nil {
self.logManager.reportError(error!)
return
}
self.logManager.debugMessage("Scheduled for \(preferredDate)")
}
}
@objc func handleInitialSchedule(_ notification: NSNotification) {
if !self.isStarted {
self.schedule()
}
}
deinit {
NotificationCenter.default.removeObserver(self)
}
}
The above file is used by the ExtensionDelegate to schedule background refresh. The first time, it schedules a refresh for 1 minute out, and then every 15 minutes after that.
Here is the ExtensionDelegate:
import Foundation
import WatchKit
class ExtensionDelegate: NSObject, WKExtensionDelegate {
private var backgroundService: BackgroundService?
private var logManager: LogManager?
override init() {
super.init()
self.backgroundService = DependencyInjection.shared.container.resolve(BackgroundService.self)!
self.logManager = DependencyInjection.shared.container.resolve(LogManager.self)!
}
func isAuthorized() -> Bool {
let defaults = UserDefaults.standard
return defaults.bool(forKey: DefaultsKeys.authorized)
}
func applicationDidFinishLaunching() {
self.logManager?.debugMessage("In applicationDidFinishLaunching")
if isAuthorized() && !(self.backgroundService?.isStarted ?? false) {
self.backgroundService?.schedule()
}
}
func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
self.logManager?.debugMessage("In handle backgroundTasks")
if !isAuthorized() {
return
}
for task in backgroundTasks {
self.logManager?.debugMessage("Processing task: \(task.debugDescription)")
switch task {
case let backgroundTask as WKApplicationRefreshBackgroundTask:
self.backgroundService?.updateContent()
backgroundTask.setTaskCompletedWithSnapshot(false)
case let urlSessionTask as WKURLSessionRefreshBackgroundTask:
self.backgroundService?.handleDownload(urlSessionTask)
default:
task.setTaskCompletedWithSnapshot(false)
}
}
}
}
When the app is launched, it tries to schedule the first refresh in applicationDidFinishLaunching.
From my understanding, WKExtension.shared().scheduleBackgroundRefresh is called in schedule; after the preferred time, watchOS calls handle with a WKApplicationRefreshBackgroundTask. I use that to schedule a background URL session task and immediately start it, as seen in the updateContent method of BackgroundService. Some time later, watchOS calls the ExtensionDelegate's handle method with a WKURLSessionRefreshBackgroundTask, which I handle in handleDownload. In there, I process the response, update the complications, complete the task, and finally schedule a new background refresh.
I've found it works great while I'm actively working on the app or interacting with it in general. But if, say, I go to sleep, the next day the complication will not have updated at all.
Ideally, I'd like it to work as reliably as the built-in Weather complication on watchOS: I don't interact with that complication, but it updates reliably.
Is the above process correct or are there any samples of correct implementations?
Some of the posts I've consulted:
https://wjwickham.com/posts/refreshing-data-in-the-background-on-watchOS/
https://developer.apple.com/documentation/watchkit/background_execution/using_background_tasks
https://spin.atomicobject.com/2021/01/26/complications-basic-functionality/
I want to build a simple metronome app using AVAudioEngine with these features:
Solid timing (I know, I know, I should be using Audio Units, but I'm still struggling with Core Audio stuff / Obj-C wrappers etc.)
Two different sounds on the "1" and on beats "2"/"3"/"4" of the bar.
Some kind of visual feedback (at least a display of the current beat) which needs to be in sync with audio.
So I have created two short click sounds (26 ms / 1150 samples @ 16 bit / 44.1 kHz, stereo WAV files) and load them into two buffers. Their lengths will be set to represent one period.
My UI setup is simple: A button to toggle start / pause and a label to display the current beat (my "counter" variable).
When using scheduleBuffer's loop option the timing is okay, but since I need two different sounds and a way to sync/update my UI while looping the clicks, I cannot use it. I resorted to using the completionHandler instead, which restarts my playClickLoop() function; see my code attached below.
Unfortunately, while implementing this I didn't really measure the accuracy of the timing. As it now turns out, with the bpm set to 120 the loop plays at only about 117.5 bpm, quite steadily but still way too slow. With the bpm set to 180, my app plays at about 172.3 bpm.
What's going on here? Is this delay introduced by using the completionHandler? Is there any way to improve the timing? Or is my whole approach wrong?
Thanks in advance!
Alex
import UIKit
import AVFoundation
class ViewController: UIViewController {
private let engine = AVAudioEngine()
private let player = AVAudioPlayerNode()
private let fileName1 = "sound1.wav"
private let fileName2 = "sound2.wav"
private var file1: AVAudioFile! = nil
private var file2: AVAudioFile! = nil
private var buffer1: AVAudioPCMBuffer! = nil
private var buffer2: AVAudioPCMBuffer! = nil
private let sampleRate: Double = 44100
private var bpm: Double = 180.0
private var periodLengthInSamples: Double { 60.0 / bpm * sampleRate }
private var counter: Int = 0
private enum MetronomeState { case run, stop }
private var state: MetronomeState = .stop
@IBOutlet weak var label: UILabel!
override func viewDidLoad() {
super.viewDidLoad()
//
// MARK: Loading buffer1
//
let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
let url1 = URL(fileURLWithPath: path1)
do {file1 = try AVAudioFile(forReading: url1)
buffer1 = AVAudioPCMBuffer(
pcmFormat: file1.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file1.read(into: buffer1!)
buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer1 \(error)") }
//
// MARK: Loading buffer2
//
let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
let url2 = URL(fileURLWithPath: path2)
do {file2 = try AVAudioFile(forReading: url2)
buffer2 = AVAudioPCMBuffer(
pcmFormat: file2.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file2.read(into: buffer2!)
buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer2 \(error)") }
//
// MARK: Configure + start engine
//
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
engine.prepare()
do { try engine.start() } catch { print(error) }
}
//
// MARK: Play / Pause toggle action
//
@IBAction func buttonPressed(_ sender: UIButton) {
sender.isSelected = !sender.isSelected
if player.isPlaying {
state = .stop
} else {
state = .run
try! engine.start()
player.play()
playClickLoop()
}
}
private func playClickLoop() {
//
// MARK: Completion handler
//
let scheduleBufferCompletionHandler = { [unowned self] /*(_: AVAudioPlayerNodeCompletionCallbackType)*/ in
DispatchQueue.main.async {
switch state {
case .run:
self.playClickLoop()
case .stop:
engine.stop()
player.stop()
counter = 0
}
}
}
//
// MARK: Schedule buffer + play
//
if engine.isRunning {
counter += 1; if counter > 4 {counter = 1} // Counting from 1 to 4 only
if counter == 1 {
//
// MARK: Playing sound1 on beat 1
//
player.scheduleBuffer(buffer1,
at: nil,
options: [.interruptsAtLoop],
//completionCallbackType: .dataPlayedBack,
completionHandler: scheduleBufferCompletionHandler)
} else {
//
// MARK: Playing sound2 on beats 2, 3 & 4
//
player.scheduleBuffer(buffer2,
at: nil,
options: [.interruptsAtLoop],
//completionCallbackType: .dataRendered,
completionHandler: scheduleBufferCompletionHandler)
}
//
// MARK: Display current beat on UILabel + to console
//
DispatchQueue.main.async {
self.label.text = String(self.counter)
print(self.counter)
}
}
}
}
As Phil Freihofner suggested above, here's the solution to my own problem:
The most important lesson I learned: the completionHandler callback provided by the scheduleBuffer command is not called early enough to trigger the re-scheduling of another buffer while the current one is still playing. This results in (inaudible) gaps between the sounds and messes up the timing. There must always be another buffer "in reserve", i.e. one that was scheduled before the current one finishes playing.
Using the completionCallbackType parameter of scheduleBuffer didn't change much with regard to when the completion callback fires: with .dataRendered or .dataConsumed the callback was already too late to re-schedule another buffer, and .dataPlayedBack only made things worse :-)
So, to achieve seamless playback (with correct timing!), I simply activated a timer that fires twice per period. All odd-numbered timer events re-schedule another buffer.
Sometimes the solution is so easy it's embarrassing... But sometimes you have to try almost every wrong approach first to find it ;-)
My complete working solution (including the two sound files and the UI) can be found here on GitHub:
https://github.com/Alexander-Nagel/Metronome-using-AVAudioEngine
import UIKit
import AVFoundation
private let DEBUGGING_OUTPUT = true
class ViewController: UIViewController{
private var engine = AVAudioEngine()
private var player = AVAudioPlayerNode()
private var mixer = AVAudioMixerNode()
private let fileName1 = "sound1.wav"
private let fileName2 = "sound2.wav"
private var file1: AVAudioFile! = nil
private var file2: AVAudioFile! = nil
private var buffer1: AVAudioPCMBuffer! = nil
private var buffer2: AVAudioPCMBuffer! = nil
private let sampleRate: Double = 44100
private var bpm: Double = 133.33
private var periodLengthInSamples: Double {
60.0 / bpm * sampleRate
}
private var timerEventCounter: Int = 1
private var currentBeat: Int = 1
private var timer: Timer! = nil
private enum MetronomeState { case running, stopped }
private var state: MetronomeState = .stopped
@IBOutlet weak var beatLabel: UILabel!
@IBOutlet weak var bpmLabel: UILabel!
@IBOutlet weak var playPauseButton: UIButton!
override func viewDidLoad() {
super.viewDidLoad()
bpmLabel.text = "\(bpm) BPM"
setupAudio()
}
private func setupAudio() {
//
// MARK: Loading buffer1
//
let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
let url1 = URL(fileURLWithPath: path1)
do {file1 = try AVAudioFile(forReading: url1)
buffer1 = AVAudioPCMBuffer(
pcmFormat: file1.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file1.read(into: buffer1!)
buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer1 \(error)") }
//
// MARK: Loading buffer2
//
let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
let url2 = URL(fileURLWithPath: path2)
do {file2 = try AVAudioFile(forReading: url2)
buffer2 = AVAudioPCMBuffer(
pcmFormat: file2.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file2.read(into: buffer2!)
buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer2 \(error)") }
//
// MARK: Configure + start engine
//
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
engine.prepare()
do { try engine.start() } catch { print(error) }
}
//
// MARK: Play / Pause toggle action
//
@IBAction func buttonPressed(_ sender: UIButton) {
sender.isSelected = !sender.isSelected
if state == .running {
//
// PAUSE: Stop timer and reset counters
//
state = .stopped
timer.invalidate()
timerEventCounter = 1
currentBeat = 1
} else {
//
// START: Pre-load first sound and start timer
//
state = .running
scheduleFirstBuffer()
startTimer()
}
}
private func startTimer() {
if DEBUGGING_OUTPUT {
print("# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # ")
print()
}
//
// Compute interval for 2 events per period and set up timer
//
let timerIntervalInSeconds = 0.5 * self.periodLengthInSamples / sampleRate
timer = Timer.scheduledTimer(withTimeInterval: timerIntervalInSeconds, repeats: true) { timer in
//
// Only for debugging: Print counter values at start of timer event
//
// Values at begin of timer event
if DEBUGGING_OUTPUT {
print("timerEvent #\(self.timerEventCounter) at \(self.bpm) BPM")
print("Entering \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) ")
}
//
// Schedule next buffer at 1st, 3rd, 5th & 7th timerEvent
//
var bufferScheduled: String = "" // only needed for debugging / console output
switch self.timerEventCounter {
case 7:
//
// Schedule main sound
//
self.player.scheduleBuffer(self.buffer1, at:nil, options: [], completionHandler: nil)
bufferScheduled = "buffer1"
case 1, 3, 5:
//
// Schedule subdivision sound
//
self.player.scheduleBuffer(self.buffer2, at:nil, options: [], completionHandler: nil)
bufferScheduled = "buffer2"
default:
bufferScheduled = ""
}
//
// Display current beat & increase currentBeat (1...4) at 2nd, 4th, 6th & 8th timerEvent
//
if self.timerEventCounter % 2 == 0 {
DispatchQueue.main.async {
self.beatLabel.text = String(self.currentBeat)
}
self.currentBeat += 1; if self.currentBeat > 4 {self.currentBeat = 1}
}
//
// Increase timerEventCounter, two events per beat.
//
self.timerEventCounter += 1; if self.timerEventCounter > 8 {self.timerEventCounter = 1}
//
// Only for debugging: Print counter values at end of timer event
//
if DEBUGGING_OUTPUT {
print("Exiting \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) \tscheduling: \(bufferScheduled)")
print()
}
}
}
private func scheduleFirstBuffer() {
player.stop()
//
// pre-load accented main sound (for beat "1") before trigger starts
//
player.scheduleBuffer(buffer1, at: nil, options: [], completionHandler: nil)
player.play()
beatLabel.text = String(currentBeat)
}
}
Thanks so much for your help everyone! This is a wonderful community.
Alex
How accurate is the tool or process you are using to take your measurements?
I can't tell for sure whether your files have the correct number of PCM frames, as I am not a C programmer. It looks like data from the WAV header is included when you load the files. This makes me wonder whether some latency is incurred on playback while the header information is processed repeatedly at the start of each play or loop.
I had good luck building a metronome in Java by continuously outputting an endless stream derived from reading PCM frames. Timing is achieved by counting PCM frames and routing in either silence (PCM data point = 0) or the click's PCM data, based on the period of the chosen metronome setting and the length of the click in PCM frames.
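For what it's worth, here is a rough Swift sketch of that frame-counting idea using AVAudioSourceNode (iOS 13+). All names and parameters here are illustrative, not taken from the question's code:
import AVFoundation

// Sketch: render click or silence sample-by-sample, counting PCM frames.
// clickSamples: the click's mono Float32 PCM data (illustrative input)
// periodInFrames: beat length in frames, e.g. Int(sampleRate * 60.0 / bpm)
func makeMetronomeEngine(clickSamples: [Float], periodInFrames: Int) -> AVAudioEngine {
    let engine = AVAudioEngine()
    let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)!
    var framePosition = 0 // running count of frames rendered so far

    let source = AVAudioSourceNode(format: format) { _, _, frameCount, audioBufferList -> OSStatus in
        let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
        guard let out = buffers[0].mData?.assumingMemoryBound(to: Float.self) else { return noErr }
        for frame in 0..<Int(frameCount) {
            // position inside the current beat decides click vs. silence
            let posInPeriod = (framePosition + frame) % periodInFrames
            out[frame] = posInPeriod < clickSamples.count ? clickSamples[posInPeriod] : 0
        }
        framePosition += Int(frameCount)
        return noErr
    }
    engine.attach(source)
    engine.connect(source, to: engine.mainMixerNode, format: format)
    return engine
}
Because the timing comes from counting rendered frames rather than from timers or completion handlers, the beat cannot drift.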
I created a simple audio player/recorder using AVAudioPlayer and AVAudioRecorder. The audio player works as expected, but the audio recorder doesn't seem to record or play back the recording, even though the project builds successfully without any errors. I'm wondering if it needs permission to use the mic? I'm using version 11.X. The code is below; any help is much appreciated.
import UIKit
import AVFoundation // needed for AVAudioSession, AVAudioPlayer and AVAudioRecorder
class ViewController: UIViewController {
// create property called audioPlayer equal to player
var audioPlayer : AVAudioPlayer?
var audioRecorder : AVAudioRecorder?
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
//create the audio session
let session = AVAudioSession.sharedInstance()
//set the category to playAndRecord; try? discards any error thrown
try? session.setCategory(AVAudioSession.Category.playAndRecord)
//play any audio through device speaker
try? session.overrideOutputAudioPort(.speaker)
// set the session to active; try? discards any error thrown
try? session.setActive(true)
//set up the recorder: where the audio will be saved (Documents folder on device) plus the file name and type (iloveaudioios.mp3)
if let basePath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first {
let paths = [basePath, "iloveaudioios.mp3"]
if let audioURL = NSURL.fileURL(withPathComponents: paths) {
//create a settings dictionary; initially it's empty ([:])
var settings : [String : Any] = [:]
//set the file type to mp3
settings[AVFormatIDKey] = Int(kAudioFileMP3Type)
//set the sample rate
settings[AVSampleRateKey] = 44100.0
//set number of channels
settings[AVNumberOfChannelsKey] = 2
// create the recorder with the URL and settings; if it fails, audioRecorder will be nil
audioRecorder = try? AVAudioRecorder(url: audioURL, settings: settings)
//prepare the recorder (creates the file and gets ready to record)
audioRecorder?.prepareToRecord()
}
}
playerSetup(audioURL: nil)
}
//which file to play if something is recorded
func playerSetup(audioURL:URL?) {
if audioURL == nil {
if let audioPath = Bundle.main.path(forResource: "testaudio", ofType: "mp3") {
let tempAudioURL = URL(fileURLWithPath: audioPath)
audioPlayer = try? AVAudioPlayer(contentsOf: tempAudioURL)
}
} else {
audioPlayer = try? AVAudioPlayer(contentsOf: audioURL!)
}
//prepare the player so playback can start promptly
audioPlayer?.prepareToPlay()
}
@IBAction func PlayPressed(_ sender: Any) {
audioPlayer?.play()
}
@IBAction func Pausedpressed(_ sender: Any) {
audioPlayer?.pause()
}
@IBAction func Stop(_ sender: Any) {
audioPlayer?.stop()
audioPlayer?.currentTime = 0
}
@IBAction func Record(_ sender: Any) {
if let recorder = audioRecorder {
if !recorder.isRecording {
recorder.record()
}
}
}
@IBAction func StopRecord(_ sender: Any) {
if let recorder = audioRecorder {
if recorder.isRecording {
recorder.stop()
playerSetup(audioURL: recorder.url)
}
}
}
}
Yes, you need to add a permission entry in Info.plist:
Go to the Info.plist of your project.
Add Privacy - Microphone Usage Description and set its value to a short string explaining why your app needs microphone access (the value is a description string shown to the user, not a boolean).
After setting this, when you start recording in your app for the first time it will ask for permission. Allow that and you're all set!
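The raw key name behind that entry is NSMicrophoneUsageDescription. If you want to trigger or check the prompt yourself before recording, you can also request access explicitly; a small sketch (the handler bodies are placeholders):
AVAudioSession.sharedInstance().requestRecordPermission { granted in
    DispatchQueue.main.async {
        if granted {
            // safe to call audioRecorder?.record() here
        } else {
            // no mic access: explain to the user or disable the Record button
        }
    }
}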
I am trying to make a game with looping background music. I made the song file in Adobe Audition (which is similar to Audacity), and when I play it in a loop in Adobe Audition it loops the way I want it to.
When I play it from Xcode, however, there is a lag between the loops. I am using AVFoundation for the sound playing.
I have searched everywhere but I can't find the solution for the problem.
Is there any way to loop audio files without any lag in between? (I believe it's called "seamless looping".)
here is the code:
class GameScene: SKScene {
...// Other Code
var ButtonAudio = URL(fileURLWithPath: Bundle.main.path(forResource: "Gamescene(new)", ofType: "mp3")!)
var ButtonAudioPlayer = AVAudioPlayer()
... //Other Code
}
And When I Call it:
override func didMove(to view: SKView) {
...//Code
ButtonAudioPlayer = try! AVAudioPlayer(contentsOf: ButtonAudio, fileTypeHint: nil)
ButtonAudioPlayer.numberOfLoops = -1
ButtonAudioPlayer.prepareToPlay()
ButtonAudioPlayer.play()
...//More Code
}
Can someone help me with this issue ?
Thank you in advance!
You can use AVPlayerLooper and AVQueuePlayer to do this.
import UIKit
import AVFoundation
class ViewController: UIViewController {
var queuePlayer = AVQueuePlayer()
var playerLooper: AVPlayerLooper?
override func viewDidLoad() {
super.viewDidLoad()
guard let url = Bundle.main.url(forResource: "Gamescene(new)", withExtension: "mp3") else { return }
let playerItem = AVPlayerItem(asset: AVAsset(url: url))
playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)
queuePlayer.play()
}
}
The solution proposed by @dave234 only works on iOS 10+. Since I needed seamless playback on iOS 9 as well, I did something different:
Instead of AVPlayer, I created an AVQueuePlayer and immediately added two identical items to the queue.
Next, I added a listener to the penultimate item.
When the listener fires, I append another identical item to the queue after the last one.
In effect, the currently playing item is always the penultimate one, so there is always another item queued and no delay occurs.
My code:
var player: AVQueuePlayer?
override func viewDidLoad() {
super.viewDidLoad()
if let path = Bundle.main.path(forResource: "music_file", ofType: "mp3") {
player = createPlayer(url: URL(fileURLWithPath: path))
}
}
func createPlayer(url: URL) -> AVQueuePlayer {
let player = AVQueuePlayer(items: [AVPlayerItem(url: url), AVPlayerItem(url: url)])
loopPlayer(playerItem: player.items()[player.items().count - 2])
return player
}
func loopPlayer(playerItem: AVPlayerItem) {
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: playerItem, queue: .main) { _ in
if let player = self.player, let url = (playerItem.asset as? AVURLAsset)?.url {
player.insert(AVPlayerItem(url: url), after: player.items()[player.items().count - 1])
self.loopPlayer(playerItem: player.items()[player.items().count - 2])
}
}
}
I have implemented a player on tvOS using AVPlayerViewController backed by an AVQueuePlayer. Video playback works as expected, but on every video after the first, the scrubber / time slider shows thumbnails from the first video.
The player is set up using the following code:
class TFLPlayerController: AVPlayerViewController, AVPlayerViewControllerDelegate {
var queuePlayer = AVQueuePlayer()
...
func addVideoToQueue(video: Video, after: AVPlayerItem?){
let mainVideo = AVPlayerItem(URL: NSURL(string: videoURL)!)
self.queuePlayer.insertItem(mainVideo, afterItem: nil)
}
func readyToPlay() {
self.addVideoToQueue(videoRecord, after: nil)
self.player = self.queuePlayer
if let currentItem = queuePlayer.currentItem {
// Set time to get make a call to the API for the next video
let callUpNextAPITime = CMTimeMakeWithSeconds(0.5, currentItem.asset.duration.timescale)
let callUpNextAPITimeValue = NSValue(CMTime: callUpNextAPITime)
let getUpNextObserver = queuePlayer.addBoundaryTimeObserverForTimes([callUpNextAPITimeValue], queue: dispatch_get_main_queue(), usingBlock: { () -> Void in
self.getUpNext()
})
}
self.player?.play()
}
func getUpNext(){
// Get the next video to play
let playlistId = presentingPlaylist?.identifier ?? nil
if let currentItem = self.queuePlayer.currentItem {
// Get the current items id
let currentVideoId = itemVideo[currentItem]?.identifier
if let currentVideoId = currentVideoId {
// Call API to get the next video to be played from the server
dataLayer.getNextVideo(currentVideoId, playlistIdentifier: playlistId) {
video, error in
...
if let nextVideo = video {
let currentVideo: AVPlayerItem = self.queuePlayer.currentItem!
self.addVideoToQueue(nextVideo, after: currentVideo)
}
}
}
}
}
I have also tried instantiating the AVQueuePlayer using the code below, as I thought there might be an issue with the initial queue setup, but the issue remains:
AVQueuePlayer(playerItem: mainVideo)