CIDetector face detection in real time: memory consumption increases linearly, how to avoid this issue? - Swift

I have a question about how to correctly call CIDetector. I'm trying to run face detection in real time, and it works very well. However, the memory consumption of the app increases linearly with time, as you can see in the image below. I'm thinking this is due to objects being created but never released. Can anyone advise how to do it correctly?
I have pinpointed the issue to the function below, as memory increases linearly every time it's invoked; when it is terminated, memory quickly drops back to almost 80 MB instead of rising towards 11 GB. I also checked for memory leaks; however, none were found.
My target development platform is macOS. I'm trying to extract the mouth position from the CIDetector and then use it to compute a delta in the mouse function for a game.
I also looked at this post and tried its approach, but it did not work for me:
CIDetector isn't releasing memory
fileprivate func faceDetection() {
    // Run the detection on a background queue
    dispatchQueue.async {
        // Unwrap the latest sample buffer, if there is one
        if let sample = self.sampleBuffers {
            // Extract all features from the sample, or bail out
            guard let features = self.allFeatures(sample: sample) else { return }
            // Cycle through all detected features
            for feature in features {
                // Only face features are of interest here
                if let faceFeature = feature as? CIFaceFeature {
                    if !self.hasEnded {
                        if self.calX.count > 30 {
                            // Calibration finished: store the observed ranges as sensitivity
                            self.sens.append(self.calX.max()! - self.calX.min()!)
                            self.sens.append(self.calY.max()! - self.calY.min()!)
                            print(self.calX.max()! - self.calX.min()!)
                            self.hasEnded = true
                        } else {
                            // Still calibrating: record mouth positions
                            self.calX.append(faceFeature.mouthPosition.x)
                            self.calY.append(faceFeature.mouthPosition.y)
                        }
                    } else {
                        self.mouse(position: CGPoint(x: (faceFeature.mouthPosition.x - 300) * 2,
                                                     y: (faceFeature.mouthPosition.y + 20) * 2),
                                   faceFeature: faceFeature)
                    }
                }
            }
        }
        if !self.faceTrackingEnds {
            self.faceDetection()
        }
    }
}

This problem was caused by repeatedly calling the function without waiting for its completion. The fix was implementing a dispatch group and only recursing once the group completes, like this. The CIDetector now runs comfortably at around 200 MB of memory:
fileprivate func faceDetection() {
    let group = DispatchGroup()
    group.enter()
    // Run the detection on a background queue
    dispatchQueue.async {
        // Leave the group on every path, including the guard's early return
        defer { group.leave() }
        // Unwrap the latest sample buffer, if there is one
        if let sample = self.sampleBuffers {
            // Extract all features from the sample, or bail out
            guard let features = self.allFeatures(sample: sample) else { return }
            // Cycle through all detected features
            for feature in features {
                // Only face features are of interest here
                if let faceFeature = feature as? CIFaceFeature {
                    self.mouse(position: faceFeature.mouthPosition, faceFeature: faceFeature)
                }
            }
        }
    }
    // Schedule the next detection pass only after the current one has finished
    group.notify(queue: .main) {
        if !self.faceTrackingEnds {
            self.faceDetection()
        }
    }
}
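Another mitigation often suggested for Core Image in tight loops: per-frame CIImage and feature objects can pile up in an autorelease pool that never drains on a long-lived background queue, so wrapping each detection pass in an explicit autoreleasepool releases them promptly. A minimal sketch, assuming the same class and its allFeatures(sample:) helper:

fileprivate func detectOnce(sample: CMSampleBuffer) {
    // Drain per-pass Core Image allocations as soon as the pass ends
    autoreleasepool {
        guard let features = self.allFeatures(sample: sample) else { return }
        for feature in features {
            if let faceFeature = feature as? CIFaceFeature {
                self.mouse(position: faceFeature.mouthPosition, faceFeature: faceFeature)
            }
        }
    }
}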

Related

Moving a window using the Accessibility API is excessively slow when using multiple displays (macOS, Swift)

Problem summary:
I'm working on my own window manager for macOS, and one of the features involves moving the windows of other applications using mouse events and the Accessibility API. I create event taps for key-pressed and mouse-moved events with CGEvent.tapCreate, and after my "move mode" is entered by pressing cmd+w, I feed mouse-moved deltas (while freezing the cursor sprite itself) to a function that takes the location of the currently focused window, adds the deltas, and sets the position of the window (an AXUIElement) using AXUIElementSetAttributeValue(windowElement, kAXPositionAttribute as CFString, position). This gets called a lot, once for every mouse-moved event, and works fine when I have one display (the built-in MacBook display). Very snappy, no discernible lag; the window moves around as if it were the cursor.
However, when I test with more than one display (I've tested using either or both of a wired Dell monitor and a Sidecar iPad), I get very inconsistent behavior and generally a lot of lag during window movement. The movement seems to be mostly correct, but delayed (the window might keep moving for as much as a few seconds after I lift my finger from the trackpad, or keep moving in the opposite direction of my finger for a bit).
Sometimes the movement is snappy and correct on one or more displays (otherwise identical behavior to using a single display) but slow on others. Which displays are affected seems random between launches. On one occasion, moving my "Terminal" window was consistently snappy while "Notes" was slow. The most common case is just generally sluggish behavior across all displays. Usually at some point the mouse event tap will stop (I guess taking too long causes this to happen). I could restart it, but that doesn't really solve the problem.
What I've tried:
Using Instruments (which I'm a novice at, so I may have missed something) and timing code, I'm 95% sure the problem is the call to AXUIElementSetAttributeValue that sets the window position. If I take out all the logic deciding where to set the window (not getting where the window currently is, etc.) and just set it to the same location repeatedly in response to mouse events, I can still observe the mouse event tap getting stopped (it fails pretty silently). Doing all the logic except the position setting doesn't take long enough to stop the tap.
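For reference, a minimal timing harness of the kind described (hypothetical; the original timing code isn't shown), using the WindowTransformer from the code below:

func timedSetPosition(_ transformer: WindowTransformer, to point: CGPoint) {
    let start = CFAbsoluteTimeGetCurrent()
    transformer.setPosition(to: point)
    let elapsed = CFAbsoluteTimeGetCurrent() - start
    // Anything repeatedly above a few milliseconds will starve the event tap
    print(String(format: "AX set position took %.3f ms", elapsed * 1000))
}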
In researching approaches to accomplish what I want, I found the following links:
Move other windows on Mac OS X using Accessibility API
How can I move/resize windows programmatically from another application?
Window move and resize APIs in OS X
These seem to be Obj-C versions of the same approach I'm using.
set the size and position of all windows on the screen in swift
A Swift approach to what I'm doing, very similar to my window movement code.
I want to animate the movement of a foreign OS X app's window
This approach is about custom-animating window movement rather than just moving it, but the asker indicates that they were able to move foreign windows by "getting an AXUIElementRef that holds the AXUIElement associated with the focused window, then setting the NSAccessibilityPositionAttribute." I have not been able to find more details about this approach, and a lot of the NSAccessibility API appears to have since been deprecated.
I also looked at various Apple docs, including Quartz and of course a ton of Accessibility material, as well as other less useful sources. Nowhere did I find any mention of a performance discrepancy when using the Accessibility API with multiple displays.
Code:
// CODE FOR MOVING A WINDOW
class WindowTransformer {
    var windowElement: AXUIElement?

    init?(forWindowWithPid: pid_t) {
        // Make an Accessibility object for the given PID
        let accessApp: AXUIElement = AXUIElementCreateApplication(forWindowWithPid)
        var windowData: AnyObject?
        AXUIElementCopyAttributeValue(accessApp, kAXWindowsAttribute as CFString, &windowData)
        windowElement = (windowData as? [AXUIElement])?.first
        guard let _ = windowElement else { return nil }
    }

    func transformWindowWithDeltas(x: CGFloat, y: CGFloat) {
        guard let current = getCurrentWindowPosition() else { return }
        setPosition(to: CGPoint(x: current.x + x, y: current.y + y))
    }

    func getCurrentWindowPosition() -> CGPoint? {
        guard let windowElement = windowElement else { return nil }
        var positionData: CFTypeRef?
        AXUIElementCopyAttributeValue(windowElement,
                                      kAXPositionAttribute as CFString,
                                      &positionData)
        // Bail out rather than force-unwrap if the attribute copy failed
        guard let positionData = positionData else { return nil }
        return axValueAsCGPoint(positionData as! AXValue)
    }

    func axValueAsCGPoint(_ value: AXValue) -> CGPoint {
        var point = CGPoint.zero
        AXValueGetValue(value, AXValueType.cgPoint, &point)
        return point
    }

    func setPosition(to: CGPoint) {
        var newPoint = to
        let position: CFTypeRef? = AXValueCreate(AXValueType(rawValue: kAXValueCGPointType)!, &newPoint)
        // What I think is causing issues
        let err = AXUIElementSetAttributeValue(windowElement!, kAXPositionAttribute as CFString, position!)
        if err != .success {
            // I've never seen this happen
            print("AXError moving window \(err)")
        }
    }
}
// CODE THAT GETS EXECUTED ON RECEIVING MOUSE EVENTS
// Gets set to true when in "move mode"
var listeningEscapeAndMouseFlag = false
// I have a notification center observer that updates this when applications are activated; will probably change how this is set eventually
var activePid: pid_t?
var transformer: WindowTransformer?

func mouseEventAction(event: CGEvent) -> Unmanaged<CGEvent>? {
    let unmodifiedEvent = Unmanaged.passRetained(event)
    if !listeningEscapeAndMouseFlag { return unmodifiedEvent }
    guard let activePid = activePid else { return unmodifiedEvent }
    if transformer == nil {
        transformer = WindowTransformer(forWindowWithPid: activePid)
    }
    let eventLocation = event.location
    let deltaEvent = NSEvent(cgEvent: event)
    guard let deltaX = deltaEvent?.deltaX, let deltaY = deltaEvent?.deltaY else { return nil }
    // Attempt to move the window based on mouse deltas
    transformer?.transformWindowWithDeltas(x: deltaX, y: deltaY)
    // Keeps the cursor visibly frozen
    CGWarpMouseCursorPosition(eventLocation)
    return nil
}
// Gets passed as callback argument to CGEvent.tapCreate
func mouse_interceptor_callback(tapProxy: CGEventTapProxy,
                                eventType: CGEventType,
                                event: CGEvent,
                                data: UnsafeMutableRawPointer?) -> Unmanaged<CGEvent>? {
    // I'm pretty sure modifying mouseMoved events directly doesn't actually do anything
    let unmodifiedEvent = Unmanaged.passRetained(event)
    if eventType != .mouseMoved {
        return unmodifiedEvent
    }
    return mouseEventAction(event: event)
}
// HOW I'M CREATING A TAP
var port: CFMachPort?

func createMouseTap() {
    let mask: CGEventMask = CGEventMask(1 << CGEventType.mouseMoved.rawValue)
    port = CGEvent.tapCreate(tap: CGEventTapLocation.cghidEventTap, // Tap where system events enter the window server
                             place: CGEventTapPlacement.headInsertEventTap, // Insert before other taps
                             options: CGEventTapOptions.defaultTap, // Can modify events
                             eventsOfInterest: mask,
                             callback: mouse_interceptor_callback, // fn to run on event
                             userInfo: nil)
}

func activateTap() {
    guard let port = port else { return }
    CGEvent.tapEnable(tap: port, enable: true)
    let runLoopSrc = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, port, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSrc, .commonModes)
}
Notes:
There are some modifications and omissions from my actual code for brevity, but this is the code directly involved in what I'm attempting to do.
My computer performs well, and there are no delays with other applications (or when dragging windows manually) while using multiple displays.
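A mitigation sometimes worth trying, sketched under the assumption that per-event AX calls are the bottleneck: accumulate the deltas and apply at most one position update per frame, so the tap callback itself stays fast even when AXUIElementSetAttributeValue is slow. The names below are hypothetical apart from transformer:

var pendingDelta = CGPoint.zero
var flushScheduled = false

func coalescedMove(deltaX: CGFloat, deltaY: CGFloat) {
    pendingDelta.x += deltaX
    pendingDelta.y += deltaY
    guard !flushScheduled else { return }
    flushScheduled = true
    // Flush roughly once per frame instead of once per mouse-moved event
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.0 / 60.0) {
        transformer?.transformWindowWithDeltas(x: pendingDelta.x, y: pendingDelta.y)
        pendingDelta = .zero
        flushScheduled = false
    }
}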

AKAmplitudeTracker amplitude getting 0.0 using AudioKit

I want to get the volume from the AKAmplitudeTracker but I'm getting -inf. What am I doing wrong? Please help out.
AKAudioFile.cleanTempDirectory()
AKSettings.audioInputEnabled = true
AKSettings.bufferLength = .medium
AKSettings.defaultToSpeaker = true
AKSettings.playbackWhileMuted = true
AKSettings.enableRouteChangeHandling = true
AKSettings.enableCategoryChangeHandling = true
AKSettings.enableLogging = true
do {
    try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
} catch {
    print("error \(error.localizedDescription)")
}
microphone = AKMicrophone()!
tracker = AKAmplitudeTracker(microphone)
booster = AKBooster(tracker, gain: 0)
AudioKit.output = booster
try AudioKit.start()
=================
extension AKAmplitudeTracker {
    var volume: Decibel {
        return 20.0 * log10(amplitude)
    }
}
=================
Output of print(tracker.amplitude):
0.0
I had a quick look; it seems you followed the basic setup, but you're not tracking the generated data over time correctly. Amplitude data is provided continuously as it is computed from the microphone input, so to see how it evolves over the timeline you can use a timer, like this:
func reset() {
    // Stop and discard the timed sampling
    self.timer?.invalidate()
    self.timer = nil
}

func microphoneTracker() {
    guard self.timer == nil else { return }
    self.watcher()
    // Sample the tracker's amplitude on a fixed interval
    let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        log.info(self.akMicrophoneAmplitudeTracker.amplitude)
    }
    self.timer = timer
}
Change withTimeInterval to however frequently you want to check the amplitude.
I think what I put there for you is quite readable, but I'll break it down in a few words:
Keep a reference to the AKAmplitudeTracker in a property; here I've named it akMicrophoneAmplitudeTracker
Keep a reference to your timed event, which will check the amplitude value periodically
Compute the data in the closure body; the property holding the value is .amplitude
The computation in the example is a logger that prints .amplitude
When required, use the .invalidate method to stop the timer
A few other things you may want to double-check in your code: make sure the tracker is part of the signal chain, as that's an AVAudioEngine requirement. I've also noticed in some other people's code a call to the .start method on the AKAmplitudeTracker, as follows:
akMicrophoneAmplitudeTracker.start()
Finally, keep in mind that if you are testing through the Simulator, check the microphone settings of your host machine and expect amplitudes that may differ from actual hardware.
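As an aside, the -inf in the question falls straight out of the volume extension: log10(0) is -inf, so an amplitude of 0.0 maps to -inf dB. A guarded variation (my own sketch, not part of the answer) clamps silence to a floor instead:

extension AKAmplitudeTracker {
    var volume: Decibel {
        // Avoid log10(0) = -inf by returning a fixed silence floor
        guard amplitude > 0 else { return -160.0 }
        return 20.0 * log10(amplitude)
    }
}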

Apparently random execution time with the GKGraph.findPath() method in Swift

I have a pathfinder class in a SpriteKit game that I want to use to process every path request in the game. The class is stored in my SKScene, and I access it from different parts of the game, always from the main thread. The pathfinder uses a GKGridGraph of a pretty good size (288 x 224). The class holds an array of requests, processed one after another at each update() call from the main scene. Here is the code:
class PathFinder {
    var isLookingForPath = false
    var groundGraph: GKGridGraph<MyNode>?
    var queued: [PathFinderRequest] = []
    var thread: DispatchQoS.QoSClass = .userInitiated

    func generate(minPoint: CGPoint) {
        // generate the groundGraph grid
    }

    func update() {
        // called every frame
        if !self.isLookingForPath {
            findPath()
        }
    }

    func findPath(from start: TuplePosition, to end: TuplePosition, on layer: PathFinderLayer, callBack: PathFinderCallback) {
        // Generate a request...
        let id = String.randomString(length: 5)
        let request = PathFinderRequest(id: id, start: start, end: end, layer: layer, callback: callBack)
        // ...and append it to the end of the queue
        queued.append(request)
    }

    func findPath() {
        self.isLookingForPath = true
        guard let request = queued.first else {
            isLookingForPath = false
            return
        }
        let layer = request.layer
        let callback = request.callback
        let start = request.start
        let end = request.end
        let id = request.id
        let graph = self.groundGraph
        queued.removeFirst()
        let findItem = DispatchWorkItem {
            if let g = graph, let sn = g.node(atGridPosition: start.toVec()), let en = g.node(atGridPosition: end.toVec()) {
                if let path = g.findPath(from: sn, to: en) as? [GKGridGraphNode], path.count > 0 {
                    // Here we have the found path
                    // it worked!
                }
            }
            // Once the findPath() work is over, reset the "flag"
            // so the next request can be started from update()
            self.isLookingForPath = false
        }
        // Execute the pathfinding work item asynchronously
        // on a global queue with the chosen QoS
        DispatchQueue.global(qos: thread).async(execute: findItem)
    }

    func drawPath(_ path: [GKGridGraphNode]) {
        // draw the path on the scene
    }
}
The code works quite well as it is. If I send random path requests within (x±10, y±10), they are returned to each object holding the callback pretty quickly, but suddenly one request randomly takes a huge amount of time (approximately 20 s compared to 0.001 s), and despite everything I tried I wasn't able to find out what happens. It's never the same path, never the same caller, never after the same amount of time... here is a video of the issue: https://www.youtube.com/watch?v=-IYlLOQgJrQ
It happens sooner when there are many entities requesting paths, but I can't figure out why. I'm sure it has to do with the DispatchQueue async calls that I use to prevent the game from freezing.
With a delay on every call, the error appears later but is still there:
DispatchQueue.global(qos: thread).asyncAfter(deadline: .now() + 0.1, execute: findItem)
When I look at what is taking so long to process, it is a sub-method of the GKGridGraph class.
So I really don't know how to figure this out. I tried everything I could think of, but it always happens, whatever the delay, the number of entities, the different threads, etc.
Thank you for your precious help!
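One variable that might be worth isolating (a suggestion, not from the post): isLookingForPath and queued are touched from both the main thread and a global queue, and GameplayKit classes aren't documented as thread-safe. A sketch that serializes all pathfinding on one serial queue and times each request would rule both out; the names mirror the PathFinder code above:

let pathQueue = DispatchQueue(label: "pathfinder.serial", qos: .userInitiated)

func findPathSerialized(request: PathFinderRequest, on graph: GKGridGraph<MyNode>) {
    pathQueue.async {
        let started = CFAbsoluteTimeGetCurrent()
        if let sn = graph.node(atGridPosition: request.start.toVec()),
           let en = graph.node(atGridPosition: request.end.toVec()) {
            let path = graph.findPath(from: sn, to: en)
            let elapsed = CFAbsoluteTimeGetCurrent() - started
            // Log any pathological request together with its id and result size
            print("findPath took \(elapsed)s for \(request.id), \(path.count) nodes")
        }
    }
}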

AudioKit AKMicrophone not outputting any data

I am trying to capture FFT data from a microphone. I managed to get this working before with a similar codebase, but since macOS Mojave it's broken: the FFT data constantly stays at 0.
Relevant Code:
var fft: AKFFTTap?

var inputDevice: AKDevice? {
    didSet {
        inputNode = nil
        updateAudioNode()
    }
}

var inputNode: AKNode? {
    didSet {
        if fft != nil {
            // According to the AKFFTTap class reference, it will always be on tap 0
            oldValue?.avAudioNode.removeTap(onBus: 0)
        }
        fft = inputNode.map { AKFFTTap($0) }
    }
}

[...]

guard let device = inputDevice else {
    inputNode = ViewController.shared.player.mixer
    return
}
do {
    try AudioKit.setInputDevice(device)
} catch {
    print("Error setting input device: \(error)")
    return
}
let microphoneNode = AKMicrophone()
do {
    try microphoneNode.setDevice(device)
} catch {
    print("Failed setting node input device: \(error)")
    return
}
microphoneNode.start()
microphoneNode.volume = 3
print("Switched Node: \(microphoneNode), started: \(microphoneNode.isStarted)")
inputNode = microphoneNode
try! AudioKit.start()
All the code is called and no errors are output, but the FFT simply stays blank. With some code reordering I get varying errors.
A full version of the class, for completeness, is here.
Finally, I also tried implementing the examples from the playground one to one. Since Xcode playgrounds seem to crash with AudioKit, I tried it in my own codebase, but there's no difference there either. AKFrequencyTracker, for example, gets 0s for both amplitude and frequency.
I am not 100% positive about this, but I'd like you to try out AudioKit v4.5.1. We definitely fixed a bug in AKMicrophone, and that could have downstream consequences. I'll withdraw this answer and keep looking if it is not fixed. Let me know.
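Once you've updated, a quick way to verify whether data is flowing is to poll the tap's buffer on a timer; a sketch, using the question's own fft property (AKFFTTap exposes its latest magnitudes via fftData):

// Poll the FFT buffer a few times a second; all-zero bins mean the tap is still silent
Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { _ in
    if let fft = self.fft {
        print("first bins:", fft.fftData.prefix(8))
    }
}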

Swift - Update labels in realtime

I have a complex set of maths that takes several seconds to run (and that's on the faster iPhones). To keep the user interested, and to persuade them that the program isn't taking a nap, I need to update the labels/numbers in real time.
Historically I would have used:
DispatchQueue.main.sync {...
But that gives Thread 1 errors when run nowadays.
So I am using:
DispatchQueue.global().async(execute: {
    DispatchQueue.main.sync {
        self.dateLabel.text = date1Formatter.string(from: newDate!)
        // etc
    }
})
Surprise, surprise, this isn't updating the numbers in real time, just at the end of the cycle. How do I "sync" it?
You shouldn't call main.sync at all; calling sync on the main queue from the main thread deadlocks, which is likely the Thread 1 error you saw. Just call DispatchQueue.main.async, as Ashley Mills says:
DispatchQueue.global().async(execute: {
    DispatchQueue.main.async {
        self.dateLabel.text = date1Formatter.string(from: newDate!)
        // etc
    }
})
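A fuller sketch of the pattern (the label is from the question; expensiveStep is a hypothetical unit of work): run the long computation on a background queue and hop to the main queue whenever there is a new intermediate value to show.

func runLongCalculation() {
    DispatchQueue.global(qos: .userInitiated).async {
        for step in 1...1_000 {
            let intermediate = expensiveStep(step) // hypothetical unit of work
            // Publish each intermediate result without blocking the computation
            DispatchQueue.main.async {
                self.dateLabel.text = "\(intermediate)"
            }
        }
    }
}

Because the UI updates are async, the background loop never waits on the main thread, and the labels repaint as the main run loop drains each block.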