I'm trying to make the pitch more and more angled as the user zooms in, but the app crashes on the initial map load. My suspicion is that it's because the .cameraChanged event fires rapidly all the time. Any suggestions on how to make this work?
override public func viewDidLoad() {
super.viewDidLoad()
let myResourceOptions = ResourceOptions(accessToken: "pk.ey###################")
let myCameraOptions = CameraOptions(center: appState.myLocation.coordinate, zoom: 12)
let myMapInitOptions = MapInitOptions(resourceOptions: myResourceOptions, cameraOptions: myCameraOptions, styleURI: StyleURI(url: URL(string: "mapbox://styles/myaccount/myMap")!)!)
let myCameraBoundsOptions = CameraBoundsOptions(maxZoom: 18.5, minZoom: 11, maxPitch: 60, minPitch: 0)
do {
try mapView.mapboxMap.setCameraBounds(with: myCameraBoundsOptions)
mapView.mapboxMap.onNext(event: .mapLoaded) { _ in
// add sources, layers, etc
}
mapView.mapboxMap.onEvery(event: .cameraChanged, handler: { [weak self] _ in
guard let self = self else { return }
let pitch = ((self.mapView.mapboxMap.cameraState.zoom - self.mapView.mapboxMap.cameraBounds.minZoom)/(self.mapView.mapboxMap.cameraBounds.maxZoom - self.mapView.mapboxMap.cameraBounds.minZoom)) * 60
let options = CameraOptions(pitch: pitch)
self.mapView.mapboxMap.setCamera(to: options)
})
} catch {
print("Error setting camera bounds: \(error)")
}
}
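One thing I'm considering (just a sketch against the same v10 API as above, not verified): only push a camera update when the computed pitch actually differs from the current pitch, so the .cameraChanged handler doesn't keep re-triggering itself:

mapView.mapboxMap.onEvery(event: .cameraChanged) { [weak self] _ in
    guard let self = self else { return }
    let map = self.mapView.mapboxMap
    let zoomRange = map.cameraBounds.maxZoom - map.cameraBounds.minZoom
    let targetPitch = (map.cameraState.zoom - map.cameraBounds.minZoom) / zoomRange * 60
    // Skip the update if the pitch is already (almost) right; this is what stops the feedback loop.
    guard abs(map.cameraState.pitch - targetPitch) > 0.5 else { return }
    map.setCamera(to: CameraOptions(pitch: targetPitch))
}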
So I am trying to add a multipeer element to this Sticky Note app from Apple's own Sample Code (link to the Sample Code page). There are several examples of multipeer ARKit apps, but the problem here is that, in the app I am working from, the Sticky Note is NOT a 3D element:
For the purposes of this sample app, the sticky note entity has no geometry and thus, no appearance. Its anchor provides a 3D location only, and it's the sticky note's screen-space annotation that has an appearance. To display it, you define the sticky note's annotation. Following RealityKit's entity-component model, design a component that houses the annotation, which in this case is a view. See ScreenSpaceComponent.
I have been trying to use the examples of multipeer AR apps that use ARKit with 3D elements stored either as assets (the "Collaborative Session" example) or as ModelEntity geometry (the "Creating a Multiuser AR Experience" example), but I haven't been successful in translating this app, which uses screen space only.
I am able to get the message on the screen that it's connected to a peer, but that is as far as it goes. It will not render the notes on the second phone. I am burned out from all the attempts at making it work :(
One alternative is to forget about the notes being tethered to screen space and recreate this as a regular 3D-space-and-2D-geometry thing using SpriteKit.
The system will not render the app's sticky notes on the other phone. I know there is a way around this, but I have been trying for days and haven't been able to do it.
I have been testing this using 2 phones.
I have
Added the required entries to the Info.plist (see the sketch below)
Added the Multipeer Session file
Added the code on the ViewController file related to multipeer
Added code to the arViewGestureSetup() extension file, which has the rendering info for the sticky notes.
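For reference, the local-networking additions to the Info.plist look roughly like this (the Bonjour service strings are placeholders and have to match the serviceType used in the Multipeer Session file):

<key>NSLocalNetworkUsageDescription</key>
<string>Needed to discover and connect to nearby peers sharing sticky notes.</string>
<key>NSBonjourServices</key>
<array>
    <string>_sticky-notes._tcp</string>
    <string>_sticky-notes._udp</string>
</array>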
What works: I can see the notes on both phones, and I get the messages saying that a peer has joined. What I can't do is view the other user's notes like I would in a regular 3D ARKit app. They will not render.
This is what I have added to the insertNewSticky function (func insertNewSticky(_ sender: UITapGestureRecognizer)) from one of the other examples:
let anchor = ARAnchor(name: "Anchor for object placement", transform: raycastResult.worldTransform)
arView.session.add(anchor: anchor)
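What I believe still has to happen on the receiving device (a rough sketch of my understanding, not the sample's code) is to watch for that named anchor in the ARSessionDelegate callback and build a local sticky note for it, since the screen-space view itself never travels over the network:

// Rough sketch: rebuild a local note when a peer's shared anchor arrives.
// Only the anchor's transform is shared; the screen-space view and its text are not.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor.name == "Anchor for object placement" {
        let frame = CGRect(x: 0, y: 0, width: 200, height: 200)
        let note = StickyNoteEntity(frame: frame, worldTransform: anchor.transform)
        arView.scene.addAnchor(note)
        if let stickyView = note.view {
            arView.insertSubview(stickyView, belowSubview: trashZone)
        }
        stickyNotes.append(note)
    }
}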
Below is the full code for the Gesture Recognizer Setup
import UIKit
import ARKit
extension ViewController {
// MARK: - Gesture recognizer setup
// - Tag: AddViewTapGesture
func arViewGestureSetup() {
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tappedOnARView))
arView.addGestureRecognizer(tapGesture)
let swipeGesture = UISwipeGestureRecognizer(target: self, action: #selector(swipedDownOnARView))
swipeGesture.direction = .down
arView.addGestureRecognizer(swipeGesture)
}
func stickyNoteGestureSetup(_ note: StickyNoteEntity) {
let panGesture = UIPanGestureRecognizer(target: self, action: #selector(panOnStickyView))
note.view?.addGestureRecognizer(panGesture)
let tapOnStickyView = UITapGestureRecognizer(target: self, action: #selector(tappedOnStickyView(_:)))
note.view?.addGestureRecognizer(tapOnStickyView)
}
// MARK: - Gesture recognizer callbacks
/// Tap gesture input handler.
/// - Tag: TapHandler
@objc
func tappedOnARView(_ sender: UITapGestureRecognizer) {
// Ignore the tap if the user is editing a sticky note.
for note in stickyNotes where note.isEditing { return }
// Create a new sticky note at the tap location.
insertNewSticky(sender)
}
/**
Hit test the feature point cloud and use any hit as the position of a new StickyNote. Otherwise, display a tip.
- Tag: ScreenSpaceViewInsertionTag
*/
func insertNewSticky(_ sender: UITapGestureRecognizer) {
// Get the user's tap screen location.
let touchLocation = sender.location(in: arView)
// Cast a ray to check for its intersection with any planes.
guard let raycastResult = arView.raycast(from: touchLocation, allowing: .estimatedPlane, alignment: .any).first
else {
messageLabel.displayMessage("No surface detected, try getting closer.", duration: 2.0)
return
}
// Create a new sticky note positioned at the hit test result's world position.
let frame = CGRect(origin: touchLocation, size: CGSize(width: 200, height: 200))
let note = StickyNoteEntity(frame: frame, worldTransform: raycastResult.worldTransform)
// Center the sticky note's view on the tap's screen location.
note.setPositionCenter(touchLocation)
// Add the sticky note to the scene's entity hierarchy.
arView.scene.addAnchor(note)
// Add the sticky note's view to the view hierarchy.
guard let stickyView = note.view else { return }
arView.insertSubview(stickyView, belowSubview: trashZone)
// Enable gestures on the sticky note.
stickyNoteGestureSetup(note)
// Save a reference to the sticky note.
stickyNotes.append(note)
// Volunteer to handle text view callbacks.
stickyView.textView.delegate = self
let anchor = ARAnchor(name: "Anchor for object placement", transform: raycastResult.worldTransform)
arView.session.add(anchor: anchor)
}
/// Dismisses the keyboard.
@objc
func swipedDownOnARView(_ sender: UISwipeGestureRecognizer) {
dismissKeyboard()
}
fileprivate func dismissKeyboard() {
for note in stickyNotes {
guard let textView = note.view?.textView else { continue }
if textView.isFirstResponder {
textView.resignFirstResponder()
return
}
}
}
@objc
func tappedOnStickyView(_ sender: UITapGestureRecognizer) {
guard let stickyView = sender.view as? StickyNoteView else { return }
stickyView.textView.becomeFirstResponder()
}
//- Tag: PanOnStickyView
fileprivate func panStickyNote(_ sender: UIPanGestureRecognizer, _ stickyView: StickyNoteView, _ panLocation: CGPoint) {
messageLabel.isHidden = true
let feedbackGenerator = UIImpactFeedbackGenerator()
switch sender.state {
case .began:
// Prepare the taptic engine to reduce latency in delivering feedback.
feedbackGenerator.prepare()
// Drag if the gesture is beginning.
stickyView.stickyNote.isDragging = true
// Save offsets to implement smooth panning.
guard let frame = sender.view?.frame else { return }
stickyView.xOffset = panLocation.x - frame.origin.x
stickyView.yOffset = panLocation.y - frame.origin.y
// Fade in the widget that's used to delete sticky notes.
trashZone.fadeIn(duration: 0.4)
case .ended:
// Stop dragging if the gesture is ending.
stickyView.stickyNote.isDragging = false
// Delete the sticky note if the gesture ended on the trash widget.
if stickyView.isInTrashZone {
deleteStickyNote(stickyView.stickyNote)
// ...
} else {
attemptRepositioning(stickyView)
}
// Fades out the widget that's used to delete sticky notes when there are no sticky notes currently being dragged.
if !stickyNotes.contains(where: { $0.isDragging }) {
trashZone.fadeOut(duration: 0.2)
}
default:
// Update the sticky note's screen position based on the pan location, and initial offset.
stickyView.frame.origin.x = panLocation.x - stickyView.xOffset
stickyView.frame.origin.y = panLocation.y - stickyView.yOffset
// Give feedback whenever the pan location is near the widget used to delete sticky notes.
trashZoneThresholdFeedback(sender, feedbackGenerator)
}
}
/// Sticky note pan-gesture handler.
/// - Tag: PanHandler
@objc
func panOnStickyView(_ sender: UIPanGestureRecognizer) {
guard let stickyView = sender.view as? StickyNoteView else { return }
let panLocation = sender.location(in: arView)
// Ignore the pan if any StickyViews are being edited.
for note in stickyNotes where note.isEditing { return }
panStickyNote(sender, stickyView, panLocation)
}
func deleteStickyNote(_ note: StickyNoteEntity) {
guard let index = stickyNotes.firstIndex(of: note) else { return }
note.removeFromParent()
stickyNotes.remove(at: index)
note.view?.removeFromSuperview()
note.view?.isInTrashZone = false
}
/// - Tag: AttemptRepositioning
fileprivate func attemptRepositioning(_ stickyView: StickyNoteView) {
// Conducts a ray-cast for feature points using the panned position of the StickyNoteView
let point = CGPoint(x: stickyView.frame.midX, y: stickyView.frame.midY)
if let result = arView.raycast(from: point, allowing: .estimatedPlane, alignment: .any).first {
stickyView.stickyNote.transform.matrix = result.worldTransform
} else {
messageLabel.displayMessage("No surface detected, unable to reposition note.", duration: 2.0)
stickyView.stickyNote.shouldAnimate = true
}
}
fileprivate func trashZoneThresholdFeedback(_ sender: UIPanGestureRecognizer, _ feedbackGenerator: UIImpactFeedbackGenerator) {
guard let stickyView = sender.view as? StickyNoteView else { return }
let panLocation = sender.location(in: trashZone)
if trashZone.frame.contains(panLocation), !stickyView.isInTrashZone {
stickyView.isInTrashZone = true
feedbackGenerator.impactOccurred()
} else if !trashZone.frame.contains(panLocation), stickyView.isInTrashZone {
stickyView.isInTrashZone = false
feedbackGenerator.impactOccurred()
}
}
@objc
func tappedReset(_ sender: UIButton) {
reset()
}
}
and this is the full code for the ViewController file
/*
See LICENSE folder for this sample’s licensing information.
Abstract:
Main view controller for the AR experience.
*/
import UIKit
import RealityKit
import Combine
import ARKit
import MultipeerConnectivity
class ViewController: UIViewController, ARSessionDelegate {
// MARK: - Class variable declarations
@IBOutlet var arView: ARView!
@IBOutlet weak var messageLabel: MessageLabel!
var trashZone: GradientView!
var shadeView: UIView!
var resetButton: UIButton!
var keyboardHeight: CGFloat!
var stickyNotes = [StickyNoteEntity]()
var subscription: Cancellable!
//added Sat May 28 5:12pm
var multipeerSession: MultipeerSession?
// end of added Sat May 28 5:12pm
//added Sat May 28 5:12pm
// A dictionary to map MultiPeer IDs to ARSession ID's.
// This is useful for keeping track of which peer created which ARAnchors.
var peerSessionIDs = [MCPeerID: String]()
var sessionIDObservation: NSKeyValueObservation?
var configuration: ARWorldTrackingConfiguration?
// end of added Sat May 28 5:12pm
// MARK: - View Controller Life Cycle
override func viewDidLoad() {
super.viewDidLoad()
subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { [unowned self] in
self.updateScene(on: $0)
}
arViewGestureSetup()
overlayUISetup()
arView.session.delegate = self
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
// Add observer to the keyboardWillShowNotification to get the height of the keyboard every time it is shown
let notificationName = UIResponder.keyboardWillShowNotification
let selector = #selector(keyboardIsPoppingUp(notification:))
NotificationCenter.default.addObserver(self, selector: selector, name: notificationName, object: nil)
}
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
arView.session.delegate = self
// Prevent the screen from being dimmed to avoid interuppting the AR experience.
UIApplication.shared.isIdleTimerDisabled = true
// Turn off ARView's automatically-configured session
// to create and set up your own configuration.
arView.automaticallyConfigureSession = false
configuration = ARWorldTrackingConfiguration()
// Enable a collaborative session.
configuration?.isCollaborationEnabled = true
// Enable realistic reflections.
configuration?.environmentTexturing = .automatic
// Begin the session.
arView.session.run(configuration!)
// Use key-value observation to monitor your ARSession's identifier.
sessionIDObservation = observe(\.arView.session.identifier, options: [.new]) { object, change in
print("SessionID changed to: \(change.newValue!)")
// Tell all other peers about your ARSession's changed ID, so
// that they can keep track of which ARAnchors are yours.
guard let multipeerSession = self.multipeerSession else { return }
self.sendARSessionIDTo(peers: multipeerSession.connectedPeers)
}
// Start looking for other players via MultiPeerConnectivity.
multipeerSession = MultipeerSession(receivedDataHandler: receivedData, peerJoinedHandler:
peerJoined, peerLeftHandler: peerLeft, peerDiscoveredHandler: peerDiscovered)
//arView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap(recognizer:))))
messageLabel.displayMessage("Tap the screen to place cubes.\nInvite others to launch this app to join you.", duration: 60.0)
}
//peerDiscovered
func peerDiscovered(_ peer: MCPeerID) -> Bool {
guard let multipeerSession = multipeerSession else { return false }
if multipeerSession.connectedPeers.count > 3 {
// Do not accept more than four users in the experience.
messageLabel.displayMessage("A fifth peer wants to join the experience.\nThis app is limited to four users.", duration: 6.0)
return false
} else {
return true
}
}
// end of added Sat May 28 5:12pm
/// - Tag: PeerJoined
// added Sat May 28 5:12pm
func peerJoined(_ peer: MCPeerID) {
messageLabel.displayMessage("""
A peer has joined the experience.
Hold the phones next to each other.
""", duration: 6.0)
// Provide your session ID to the new user so they can keep track of your anchors.
sendARSessionIDTo(peers: [peer])
}
// end of added Sat May 28 5:12pm
// added Sat May 28 5:12pm
func peerLeft(_ peer: MCPeerID) {
messageLabel.displayMessage("A peer has left the shared experience.")
// Remove all ARAnchors associated with the peer that just left the experience.
if let sessionID = peerSessionIDs[peer] {
removeAllAnchorsOriginatingFromARSessionWithID(sessionID)
peerSessionIDs.removeValue(forKey: peer)
}
}
// end of added Sat May 28 5:12pm
//added Sat May 28 5:12pm
func receivedData(_ data: Data, from peer: MCPeerID) {
if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARSession.CollaborationData.self, from: data) {
arView.session.update(with: collaborationData)
return
}
// ...
let sessionIDCommandString = "SessionID:"
if let commandString = String(data: data, encoding: .utf8), commandString.starts(with: sessionIDCommandString) {
let newSessionID = String(commandString[commandString.index(commandString.startIndex,
offsetBy: sessionIDCommandString.count)...])
// If this peer was using a different session ID before, remove all its associated anchors.
// This will remove the old participant anchor and its geometry from the scene.
if let oldSessionID = peerSessionIDs[peer] {
removeAllAnchorsOriginatingFromARSessionWithID(oldSessionID)
}
peerSessionIDs[peer] = newSessionID
}
}
// end of added Sat May 28 5:12pm
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
messageLabel.displayMessage("Established joint experience with a peer.")
// ...
}
func updateScene(on event: SceneEvents.Update) {
let notesToUpdate = stickyNotes.compactMap { !$0.isEditing && !$0.isDragging ? $0 : nil }
for note in notesToUpdate {
// Gets the 2D screen point of the 3D world point.
guard let projectedPoint = arView.project(note.position) else { return }
// Calculates whether the note can be currently visible by the camera.
let cameraForward = arView.cameraTransform.matrix.columns.2.xyz
let cameraToWorldPointDirection = normalize(note.transform.translation - arView.cameraTransform.translation)
let dotProduct = dot(cameraForward, cameraToWorldPointDirection)
let isVisible = dotProduct < 0
// Updates the screen position of the note based on its visibility
note.projection = Projection(projectedPoint: projectedPoint, isVisible: isVisible)
note.updateScreenPosition()
}
}
func reset() {
guard let configuration = arView.session.configuration else { return }
arView.session.run(configuration, options: .removeExistingAnchors)
for note in stickyNotes {
deleteStickyNote(note)
}
}
func session(_ session: ARSession, didFailWithError error: Error) {
guard error is ARError else { return }
let errorWithInfo = error as NSError
let messages = [
errorWithInfo.localizedDescription,
errorWithInfo.localizedFailureReason,
errorWithInfo.localizedRecoverySuggestion
]
let errorMessage = messages.compactMap({ $0 }).joined(separator: "\n")
DispatchQueue.main.async {
// Present an alert informing about the error that has occurred.
let alertController = UIAlertController(title: "The AR session failed.", message: errorMessage, preferredStyle: .alert)
let restartAction = UIAlertAction(title: "Restart Session", style: .default) { _ in
alertController.dismiss(animated: true, completion: nil)
self.reset()
}
alertController.addAction(restartAction)
self.present(alertController, animated: true, completion: nil)
}
}
override var prefersStatusBarHidden: Bool {
return true
}
override var prefersHomeIndicatorAutoHidden: Bool {
return true
}
private func sendARSessionIDTo(peers: [MCPeerID]) {
guard let multipeerSession = multipeerSession else { return }
let idString = arView.session.identifier.uuidString
let command = "SessionID:" + idString
if let commandData = command.data(using: .utf8) {
multipeerSession.sendToPeers(commandData, reliably: true, peers: peers)
}
}
private func removeAllAnchorsOriginatingFromARSessionWithID(_ identifier: String) {
guard let frame = arView.session.currentFrame else { return }
for anchor in frame.anchors {
guard let anchorSessionID = anchor.sessionIdentifier else { continue }
if anchorSessionID.uuidString == identifier {
arView.session.remove(anchor: anchor)
}
}
}
}
Update: I spoke to a Staff Engineer on Apple's RealityKit team, who explained that what I was trying to accomplish is not feasible as-is, because the note has an embedded subclass that is not codable as per Swift's Codable protocol.
I will have to rebuild the note differently than in the example I had been working with, so that it fits within the Codable protocol, which will then ensure the data can travel across the network via the Multipeer Connectivity framework.
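The direction I'm planning to take (just a sketch of the idea; all names here are mine): a small Codable payload that carries only the note's text and world transform, which the sample's MultipeerSession helper can then send as Data:

import Foundation
import simd

// Sketch: a Codable payload carrying only the note's data so it can be sent over the network.
struct StickyNotePayload: Codable {
    var text: String
    var transform: [Float]    // 4x4 world transform, flattened column by column

    init(text: String, worldTransform: simd_float4x4) {
        self.text = text
        self.transform = [worldTransform.columns.0, worldTransform.columns.1,
                          worldTransform.columns.2, worldTransform.columns.3]
            .flatMap { [$0.x, $0.y, $0.z, $0.w] }
    }

    var worldTransform: simd_float4x4 {
        simd_float4x4(columns: (
            SIMD4<Float>(transform[0], transform[1], transform[2], transform[3]),
            SIMD4<Float>(transform[4], transform[5], transform[6], transform[7]),
            SIMD4<Float>(transform[8], transform[9], transform[10], transform[11]),
            SIMD4<Float>(transform[12], transform[13], transform[14], transform[15])))
    }
}

// Sending, using the sample's MultipeerSession helper:
// let payload = StickyNotePayload(text: note.view?.textView.text ?? "", worldTransform: note.transform.matrix)
// if let session = multipeerSession, let data = try? JSONEncoder().encode(payload) {
//     session.sendToPeers(data, reliably: true, peers: session.connectedPeers)
// }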
I want to build a simple metronome app using AVAudioEngine with these features:
Solid timing (I know, I know, I should be using Audio Units, but I'm still struggling with Core Audio stuff / Obj-C wrappers etc.)
Two different sounds on the "1" and on beats "2"/"3"/"4" of the bar.
Some kind of visual feedback (at least a display of the current beat) which needs to be in sync with audio.
So I have created two short click sounds (26 ms / 1150 samples @ 16 bit / 44.1 kHz / stereo wav files) and load them into 2 buffers. Their lengths will be set to represent one period.
My UI setup is simple: A button to toggle start / pause and a label to display the current beat (my "counter" variable).
When using scheduleBuffer's loop property the timing is okay, but as I need to have 2 different sounds and a way to sync/update my UI while looping the clicks, I cannot use this. I figured I would use the completionHandler instead, which then restarts my playClickLoop() function - see my code attached below.
Unfortunately, while implementing this I didn't really measure the accuracy of the timing. As it now turns out, when setting bpm to 120 it plays the loop at only about 117.5 bpm - quite steadily, but still way too slow. When bpm is set to 180, my app plays at about 172.3 bpm.
What's going on here? Is this delay introduced by using the completionHandler? Is there any way to improve the timing? Or is my whole approach wrong?
Thanks in advance!
Alex
import UIKit
import AVFoundation
class ViewController: UIViewController {
private let engine = AVAudioEngine()
private let player = AVAudioPlayerNode()
private let fileName1 = "sound1.wav"
private let fileName2 = "sound2.wav"
private var file1: AVAudioFile! = nil
private var file2: AVAudioFile! = nil
private var buffer1: AVAudioPCMBuffer! = nil
private var buffer2: AVAudioPCMBuffer! = nil
private let sampleRate: Double = 44100
private var bpm: Double = 180.0
private var periodLengthInSamples: Double { 60.0 / bpm * sampleRate }
private var counter: Int = 0
private enum MetronomeState {case run; case stop}
private var state: MetronomeState = .stop
@IBOutlet weak var label: UILabel!
override func viewDidLoad() {
super.viewDidLoad()
//
// MARK: Loading buffer1
//
let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
let url1 = URL(fileURLWithPath: path1)
do {file1 = try AVAudioFile(forReading: url1)
buffer1 = AVAudioPCMBuffer(
pcmFormat: file1.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file1.read(into: buffer1!)
buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer1 \(error)") }
//
// MARK: Loading buffer2
//
let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
let url2 = URL(fileURLWithPath: path2)
do {file2 = try AVAudioFile(forReading: url2)
buffer2 = AVAudioPCMBuffer(
pcmFormat: file2.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file2.read(into: buffer2!)
buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer2 \(error)") }
//
// MARK: Configure + start engine
//
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
engine.prepare()
do { try engine.start() } catch { print(error) }
}
//
// MARK: Play / Pause toggle action
//
@IBAction func buttonPresed(_ sender: UIButton) {
sender.isSelected = !sender.isSelected
if player.isPlaying {
state = .stop
} else {
state = .run
try! engine.start()
player.play()
playClickLoop()
}
}
private func playClickLoop() {
//
// MARK: Completion handler
//
let scheduleBufferCompletionHandler = { [unowned self] /*(_: AVAudioPlayerNodeCompletionCallbackType)*/ in
DispatchQueue.main.async {
switch state {
case .run:
self.playClickLoop()
case .stop:
engine.stop()
player.stop()
counter = 0
}
}
}
//
// MARK: Schedule buffer + play
//
if engine.isRunning {
counter += 1; if counter > 4 {counter = 1} // Counting from 1 to 4 only
if counter == 1 {
//
// MARK: Playing sound1 on beat 1
//
player.scheduleBuffer(buffer1,
at: nil,
options: [.interruptsAtLoop],
//completionCallbackType: .dataPlayedBack,
completionHandler: scheduleBufferCompletionHandler)
} else {
//
// MARK: Playing sound2 on beats 2, 3 & 4
//
player.scheduleBuffer(buffer2,
at: nil,
options: [.interruptsAtLoop],
//completionCallbackType: .dataRendered,
completionHandler: scheduleBufferCompletionHandler)
}
//
// MARK: Display current beat on UILabel + to console
//
DispatchQueue.main.async {
self.label.text = String(self.counter)
print(self.counter)
}
}
}
}
As Phil Freihofner suggested above, here's the solution to my own problem:
The most important lesson I learned: the completionHandler callback provided by the scheduleBuffer command is not called early enough to trigger the re-scheduling of another buffer while the current one is still playing. This results in (inaudible) gaps between the sounds and messes up the timing. There must already be another buffer "in reserve", i.e. one that was scheduled before the current one finishes playing.
Using the completionCallbackType parameter of scheduleBuffer didn't change much regarding the timing of the completion callback: when setting it to .dataRendered or .dataConsumed the callback was already too late to re-schedule another buffer. Using .dataPlayedBack made things only worse :-)
So, to achieve seamless playback (with correct timing!) I simply activated a timer that triggers twice per period. All odd numbered timer events will re-schedule another buffer.
Sometimes the solution is so easy it's embarrassing... But sometimes you have to try almost every wrong approach first to find it ;-)
My complete working solution (including the two sound files and the UI) can be found here on GitHub:
https://github.com/Alexander-Nagel/Metronome-using-AVAudioEngine
import UIKit
import AVFoundation
private let DEBUGGING_OUTPUT = true
class ViewController: UIViewController{
private var engine = AVAudioEngine()
private var player = AVAudioPlayerNode()
private var mixer = AVAudioMixerNode()
private let fileName1 = "sound1.wav"
private let fileName2 = "sound2.wav"
private var file1: AVAudioFile! = nil
private var file2: AVAudioFile! = nil
private var buffer1: AVAudioPCMBuffer! = nil
private var buffer2: AVAudioPCMBuffer! = nil
private let sampleRate: Double = 44100
private var bpm: Double = 133.33
private var periodLengthInSamples: Double {
60.0 / bpm * sampleRate
}
private var timerEventCounter: Int = 1
private var currentBeat: Int = 1
private var timer: Timer! = nil
private enum MetronomeState {case running; case stopped}
private var state: MetronomeState = .stopped
@IBOutlet weak var beatLabel: UILabel!
@IBOutlet weak var bpmLabel: UILabel!
@IBOutlet weak var playPauseButton: UIButton!
override func viewDidLoad() {
super.viewDidLoad()
bpmLabel.text = "\(bpm) BPM"
setupAudio()
}
private func setupAudio() {
//
// MARK: Loading buffer1
//
let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
let url1 = URL(fileURLWithPath: path1)
do {file1 = try AVAudioFile(forReading: url1)
buffer1 = AVAudioPCMBuffer(
pcmFormat: file1.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file1.read(into: buffer1!)
buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer1 \(error)") }
//
// MARK: Loading buffer2
//
let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
let url2 = URL(fileURLWithPath: path2)
do {file2 = try AVAudioFile(forReading: url2)
buffer2 = AVAudioPCMBuffer(
pcmFormat: file2.processingFormat,
frameCapacity: AVAudioFrameCount(periodLengthInSamples))
try file2.read(into: buffer2!)
buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
} catch { print("Error loading buffer2 \(error)") }
//
// MARK: Configure + start engine
//
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
engine.prepare()
do { try engine.start() } catch { print(error) }
}
//
// MARK: Play / Pause toggle action
//
@IBAction func buttonPresed(_ sender: UIButton) {
sender.isSelected = !sender.isSelected
if state == .running {
//
// PAUSE: Stop timer and reset counters
//
state = .stopped
timer.invalidate()
timerEventCounter = 1
currentBeat = 1
} else {
//
// START: Pre-load first sound and start timer
//
state = .running
scheduleFirstBuffer()
startTimer()
}
}
private func startTimer() {
if DEBUGGING_OUTPUT {
print("# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # ")
print()
}
//
// Compute interval for 2 events per period and set up timer
//
let timerIntervallInSamples = 0.5 * self.periodLengthInSamples / sampleRate
timer = Timer.scheduledTimer(withTimeInterval: timerIntervallInSamples, repeats: true) { timer in
//
// Only for debugging: Print counter values at start of timer event
//
// Values at begin of timer event
if DEBUGGING_OUTPUT {
print("timerEvent #\(self.timerEventCounter) at \(self.bpm) BPM")
print("Entering \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) ")
}
//
// Schedule next buffer at 1st, 3rd, 5th & 7th timerEvent
//
var bufferScheduled: String = "" // only needed for debugging / console output
switch self.timerEventCounter {
case 7:
//
// Schedule main sound
//
self.player.scheduleBuffer(self.buffer1, at:nil, options: [], completionHandler: nil)
bufferScheduled = "buffer1"
case 1, 3, 5:
//
// Schedule subdivision sound
//
self.player.scheduleBuffer(self.buffer2, at:nil, options: [], completionHandler: nil)
bufferScheduled = "buffer2"
default:
bufferScheduled = ""
}
//
// Display current beat & increase currentBeat (1...4) at 2nd, 4th, 6th & 8th timerEvent
//
if self.timerEventCounter % 2 == 0 {
DispatchQueue.main.async {
self.beatLabel.text = String(self.currentBeat)
}
self.currentBeat += 1; if self.currentBeat > 4 {self.currentBeat = 1}
}
//
// Increase timerEventCounter, two events per beat.
//
self.timerEventCounter += 1; if self.timerEventCounter > 8 {self.timerEventCounter = 1}
//
// Only for debugging: Print counter values at end of timer event
//
if DEBUGGING_OUTPUT {
print("Exiting \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) \tscheduling: \(bufferScheduled)")
print()
}
}
}
private func scheduleFirstBuffer() {
player.stop()
//
// pre-load accented main sound (for beat "1") before trigger starts
//
player.scheduleBuffer(buffer1, at: nil, options: [], completionHandler: nil)
player.play()
beatLabel.text = String(currentBeat)
}
}
Thanks so much for your help everyone! This is a wonderful community.
Alex
How accurate is the tool or process you are using to take that measurement?
I can't tell for sure that your files have the correct number of PCM frames as I am not a C programmer. It looks like data from the wav header is included when you load the files. This makes me wonder if maybe there is some latency incurred with the playbacks while the header information is processed repeatedly at the start of each play or loop.
I had good luck building a metronome in Java by using a plan of continuously outputting an endless stream derived from reading PCM frames. Timing is achieved by counting PCM frames and routing in either silence (PCM datapoint = 0) or the click's PCM data, based on the period of the chosen metronome setting and the length of the click in PCM frames.
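In Swift terms, that frame-counting idea could look roughly like this (a sketch only, assuming iOS 13's AVAudioSourceNode and an already-decoded mono click; all names are illustrative):

import AVFoundation

// Sketch of a frame-counting metronome: an endless stream where each output frame is either
// a click sample (at the start of every period) or silence, based purely on PCM frame position.
final class FrameCountingMetronome {
    private let engine = AVAudioEngine()
    private let sampleRate = 44_100.0
    private var sampleTime: AVAudioFramePosition = 0
    var bpm = 120.0
    var clickFrames = [Float]()   // the click's decoded mono samples

    func start() throws {
        let periodFrames = AVAudioFramePosition(60.0 / bpm * sampleRate)
        let source = AVAudioSourceNode { [self] _, _, frameCount, audioBufferList in
            let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
            for frame in 0..<Int(frameCount) {
                // Position within the current beat period, counted in PCM frames.
                let posInPeriod = Int((sampleTime + AVAudioFramePosition(frame)) % periodFrames)
                // Click samples at the start of the period, silence (0) for the rest.
                let sample: Float = posInPeriod < clickFrames.count ? clickFrames[posInPeriod] : 0
                for buffer in buffers {
                    buffer.mData?.assumingMemoryBound(to: Float.self)[frame] = sample
                }
            }
            sampleTime += AVAudioFramePosition(frameCount)
            return noErr
        }
        engine.attach(source)
        engine.connect(source, to: engine.mainMixerNode,
                       format: AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1))
        try engine.start()
    }
}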
Is it possible to continue playing sound after a route change (such as switching speakers or plugging in/unplugging headphones) without reloading the sample again in Sampler?
This is the only approach I have found that works:
@objc dynamic private func audioRouteChangeListener(notification: NSNotification) {
let audioRouteChangeReason = notification.userInfo![AVAudioSessionRouteChangeReasonKey] as! UInt
switch audioRouteChangeReason {
case AVAudioSession.RouteChangeReason.newDeviceAvailable.rawValue:
print("headphone plugged in")
DispatchQueue.global().async { [weak self] in
// Note the reload here
self?.conductor?.loadSamples(byIndex: 2)
}
case AVAudioSession.RouteChangeReason.oldDeviceUnavailable.rawValue:
print("headphone pulled out")
DispatchQueue.global().async { [weak self] in
// Note the reload here
self?.conductor?.loadSamples(byIndex: 2)
}
default:
break
}
}
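For completeness, the listener above is registered for the route-change notification roughly like this (selector name matching the method above):

NotificationCenter.default.addObserver(self,
                                       selector: #selector(audioRouteChangeListener(notification:)),
                                       name: AVAudioSession.routeChangeNotification,
                                       object: nil)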
And in the Conductor class, where I load the .wv samples referenced by the .sfz files:
func loadSamples(byIndex: Int) {
sampler1.unloadAllSamples()
if byIndex < 0 || byIndex > 3 { return }
let info = ProcessInfo.processInfo
let begin = info.systemUptime
let sfzFiles = [ "City Piano.sfz", "ShortPiano.sfz", "LongPiano.sfz" ]
sampler1.loadSfzWithEmbeddedSpacesInSampleNames(folderPath: Bundle.main.bundleURL.appendingPathComponent("Sounds/sfz").path,
sfzFileName: sfzFiles[byIndex])
let elapsedTime = info.systemUptime - begin
AKLog("Time to load samples \(elapsedTime) seconds")
}
I am working on an application where we need to open a certain screen based on a voice command, e.g. if the user says "Open Setting" it should open the settings screen. So far I have used the Speech framework (SFSpeechRecognizer), but I am not able to detect the end-of-speech silence the way Siri does. I want to detect when the user has finished his sentence/phrase.
Please find below the code for this, where I have integrated the Speech framework in two ways.
A) Via closure (recognitionTask(with request: SFSpeechRecognitionRequest, resultHandler: @escaping (SFSpeechRecognitionResult?, Error?) -> Swift.Void) -> SFSpeechRecognitionTask)
let audioEngine = AVAudioEngine()
let speechRecognizer = SFSpeechRecognizer()
let request = SFSpeechAudioBufferRecognitionRequest()
var recognitionTask: SFSpeechRecognitionTask?
func startRecording() throws {
let node = audioEngine.inputNode
let recordingFormat = node.outputFormat(forBus: 0)
node.installTap(onBus: 0, bufferSize: 1024,
format: recordingFormat) { [unowned self]
(buffer, _) in
self.request.append(buffer)
}
audioEngine.prepare()
try audioEngine.start()
weak var weakSelf = self
recognitionTask = speechRecognizer?.recognitionTask(with: request) {
(result, error) in
if result != nil {
if let transcription = result?.bestTranscription {
weakSelf?.idenifyVoiceCommand(transcription)
}
}
}
}
But when I say a word/sentence like "Open Setting", the closure passed to recognitionTask(with:) is called multiple times, and since I call the method (idenifyVoiceCommand) inside that closure, it also gets called multiple times. How can I restrict it to being called only once?
I also reviewed the Timer logic I found while googling (SFSpeechRecognizer - detect end of utterance), but in my scenario it does not work, because I never stop the audio engine; it listens to the user's voice continuously, like Siri does.
B) Via delegate(SFSpeechRecognitionTaskDelegate)
speechRecognizer.recognitionTask(with: self.request, delegate: self)
func speechRecognitionTaskWasCancelled(_ task: SFSpeechRecognitionTask) {
}
func speechRecognitionTask(_ task: SFSpeechRecognitionTask, didFinishSuccessfully successfully: Bool) {
}
And I found that the delegate method that should be called when the end of speech occurs is not called reliably; it only fires sporadically, some time later.
I had the same issue until now.
I checked your question and I suppose the code below helps you achieve the same thing I did:
recognitionTask = speechRecognizer?.recognitionTask(with: recognitionRequest,
resultHandler: { (result, error) in
var isFinal = false
if result != nil {
self.inputTextView.text = result?.bestTranscription.formattedString
isFinal = (result?.isFinal)!
}
if let timer = self.detectionTimer, timer.isValid {
if isFinal {
self.inputTextView.text = ""
self.textViewDidChange(self.inputTextView)
self.detectionTimer?.invalidate()
}
} else {
self.detectionTimer = Timer.scheduledTimer(withTimeInterval: 1.5, repeats: false, block: { (timer) in
self.handleSend()
isFinal = true
timer.invalidate()
})
}
})
This checks if input wasn't received for 1.5 seconds
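This assumes a timer property declared on the class, something like:

private var detectionTimer: Timer?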
To your speech recogniser class add:
private var timer : Timer?
And modify code here:
recognitionTask = speechRecognizer.recognitionTask(with: request) { (result, error) in
self.timer?.invalidate()
self.timer = Timer.scheduledTimer(withTimeInterval: 1.5, repeats:false) { _ in
self.timer = nil
//do here what do you want to do, when detect pause more than 1.5 sec
}
if result != nil {
I have the following code, which works pretty much as expected except for a little freeze just before the first airplane animation begins. My code is:
enum TurnDirection: String {
case left
case right
case none
}
extension GameScene {
fileprivate func turnPlayerPlane(direction: TurnDirection) {
let forwardAction = SKAction.animate(with: textureArray.reversed(), timePerFrame: 0.05, resize: true, restore: false)
player.run(forwardAction) { [unowned self] in
self.stillTurning = false
}
}
}
class GameScene: SKScene {
let textureArray = [SKTexture(imageNamed: "01"),
SKTexture(imageNamed: "02")]
let motionManager = CMMotionManager()
var xAcceleration: CGFloat = 0
var player: SKSpriteNode!
var stillTurning = false
override func didMove(to view: SKView) {
performPlaneFly()
}
fileprivate func performPlaneFly() {
let planeWaitAction = SKAction.wait(forDuration: 1)
let planeDirectionAction = SKAction.run {
self.playerDirectionCheck()
}
let planeSequence = SKAction.sequence([planeWaitAction, planeDirectionAction])
let planeSequenceForever = SKAction.repeatForever(planeSequence)
player.run(planeSequenceForever)
}
fileprivate func playerDirectionCheck() {
turnPlayerPlane(direction: .right)
}
I commented out different parts of the code, and it seems to be the turnPlayerPlane function, but I really do not understand what is going on there that is so heavy to perform before the first animation begins.
Can someone explain what I did wrong there?
Thank you!
Having sat with these freezes at the beginning for something like 10 hours, I've found a solution. And as usual, the solution is not as difficult as it seemed to be. The freezes came up because of a performance drop caused by the first render of all the textures. Once the textures have been rendered, as I said, the freezes are gone.
So after rewriting my code a bit, I made a separate fileprivate method, which I call first in didMove(to:), to render all textures before they are used:
fileprivate func planeAnimationFillArrays() {
SKTextureAtlas.preloadTextureAtlases([SKTextureAtlas(named: "PlayerPlane")]) { [unowned self] in
self.leftTextureArrayAnimation = {
var array = [SKTexture]()
for i in stride(from: 10, through: 1, by: -1) {
let number = String(format: "%02d", i)
let texture = SKTexture(imageNamed: "\(number)")
array.append(texture)
}
SKTexture.preload(array, withCompletionHandler: {
print("preload is done")
})
return array
}()
self.rightTextureArrayAnimation = {
var array = [SKTexture]()
for i in stride(from: 10, through: 20, by: 1) {
let number = String(format: "%02d", i)
let texture = SKTexture(imageNamed: "\(number)")
array.append(texture)
}
SKTexture.preload(array, withCompletionHandler: {
print("preload is done")
})
return array
}()
}
}
First of all, I used the method that preloads my sprite atlas, SKTextureAtlas.preloadTextureAtlases([SKTextureAtlas(named: "XXX")]), and in the completion handler of this method I used SKTexture.preload([SKTexture], withCompletionHandler:) before each animation array is returned. After all this, the little freezes are completely gone.
Now this code has some duplication, but it seems to me it is easier to understand what is being done here. I hope someone finds this solution helpful.
If you know how to reduce this code, please feel free to edit or post your own solution.
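For instance, one possible way to cut the duplication (a sketch assuming the same "%02d" naming scheme) would be a single helper that builds and preloads an array for any stride:

fileprivate func preloadedTextures(from start: Int, through end: Int, by step: Int) -> [SKTexture] {
    var array = [SKTexture]()
    for i in stride(from: start, through: end, by: step) {
        array.append(SKTexture(imageNamed: String(format: "%02d", i)))
    }
    SKTexture.preload(array) { print("preload is done") }
    return array
}

// Inside the preloadTextureAtlases completion handler:
// self.leftTextureArrayAnimation = self.preloadedTextures(from: 10, through: 1, by: -1)
// self.rightTextureArrayAnimation = self.preloadedTextures(from: 10, through: 20, by: 1)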
Thank you!