AVCaptureDevice configuration taking unpredictable time to propagate to connected AVCapturePhotoOutput - swift

I am trying to modify the exposure duration of photos taken.
Within the loop:
for (customDuration, customISO) in zip(exposureDurations, exposureISOs) { // exposureDurations and exposureISOs are just arrays of exposures
    // code here
}
I set the AVCapture device's exposure to my custom value:
guard let captureDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: .back) else { return }
captureSession = AVCaptureSession()
capturePhotoOutput = AVCapturePhotoOutput()
do { try captureDevice.lockForConfiguration() }
catch { print("capturebracket: cannot lock camera for configuration."); return }
// set the correct exposure
var exposureModeSet = false
captureDevice.setExposureModeCustom(duration: customDuration, iso: customISO) { _ in
    exposureModeSet = true
}
// unlock the capture device (commits the changes)
captureDevice.unlockForConfiguration()
// wait for changes to propagate
while !exposureModeSet {}
// get the correct photo settings
let photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg,
                                                    AVVideoCompressionPropertiesKey: [AVVideoQualityKey: jpegQuality]])
Then I print the value to make sure it's been set correctly:
print("duration: \(captureDevice.exposureDuration)") // prints customExposure
print("device duration: \(( captureSession.inputs[0] as? AVCaptureDeviceInput)!.device.exposureDuration)") // prints customExposure
print("device duration: \((capturePhotoOutput.connections[0].inputPorts[0].input as? AVCaptureDeviceInput)!.device.exposureDuration)") // prints customExposure
photoSettings.isAutoStillImageStabilizationEnabled = false
Then I take the photo:
capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)
Somehow, when I loop through the above code with different exposures, the photos end up getting the wrong exposures. Specifically, the first run through works fine. The second will sometimes take a photo with the first exposure, sometimes with the second. The third will sometimes take a photo with the second exposure, sometimes with the third, and so on.
The interesting thing is that when I insert a long pause, say 1-3 seconds, just before taking the photo, all of the exposures are set correctly... so it seems that the changes to the AVCaptureDevice take time to propagate, and the photo output rushes ahead and takes the photo before the exposure has been set correctly.
Does anyone have any ideas? Can't seem to find anything in the docs about it.
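If the completion handler is the reliable "applied" signal, one way to avoid the race is to capture from inside it rather than spin-waiting on a flag and hoping the output has caught up. Below is a minimal sketch; BracketCapturer and sessionQueue are illustrative names, not from the code above, and whether this fully removes the propagation delay would need testing on a device.
import AVFoundation

// Sketch: order the capture after the device change by capturing from inside
// the setExposureModeCustom completion handler, which fires once the new
// exposure has actually been applied to the device.
final class BracketCapturer {
    let device: AVCaptureDevice
    let photoOutput: AVCapturePhotoOutput
    let sessionQueue = DispatchQueue(label: "capture.session") // serial queue

    init(device: AVCaptureDevice, photoOutput: AVCapturePhotoOutput) {
        self.device = device
        self.photoOutput = photoOutput
    }

    func capture(duration: CMTime, iso: Float, delegate: AVCapturePhotoCaptureDelegate) {
        sessionQueue.async {
            do { try self.device.lockForConfiguration() } catch { return }
            self.device.setExposureModeCustom(duration: duration, iso: iso) { _ in
                // The handler's CMTime argument is the timestamp of the first
                // frame captured with the new settings; capturing here removes
                // the spin-wait and the race described above.
                let settings = AVCapturePhotoSettings(
                    format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
                settings.isAutoStillImageStabilizationEnabled = false
                self.photoOutput.capturePhoto(with: settings, delegate: delegate)
            }
            self.device.unlockForConfiguration()
        }
    }
}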

Syncing AVPlayer play (start) time with Timecode

I need my app to play a video (AVPlayer) at a specific time. Time in my app is represented as a timecode ("23:59:59:23").
I have videos that need to play at a specific timecode, for example:
video.mp4 starts at 00:01:03:22 (minute 1, second 3, frame 22)
What I have tried so far:
I have successfully set up my app to observe states within my AVPlayer, based on this: media playback state
From what I've read, I can achieve what I want by using AVPlayer.setRate(_:time:atHostTime:), although I really can't get my head around whether this is what I need or how to use it.
My code is below. Where I'm now stuck is what to do with everything you see from mulFactor to player.setRate, all taken from here. The code below plays all three of the videos in my app immediately. Assuming this is in fact the right way to achieve what I need, I'm guessing my next step will be replacing hostTime: with my timecode source and time: with the time of playback?
if keyPath == #keyPath(AVPlayerItem.status) {
    let status: AVPlayerItem.Status
    if let statusNumber = change?[.newKey] as? NSNumber {
        status = AVPlayerItem.Status(rawValue: statusNumber.intValue)!
    } else {
        status = .unknown
    }
    // Switch over status value
    switch status {
    case .readyToPlay:
        // Player item is ready to play.
        print("item is ready to play")
        player.preroll(atRate: 1.0, completionHandler: { _ in print("prerolling avplayer") })
        player.automaticallyWaitsToMinimizeStalling = false
        let mulFactor = 24.0
        let timeScale = CMTimeScale(mulFactor)
        let seekCMTime = CMTime(value: CMTimeValue(1000), timescale: timeScale)
        let syncTime = CMClockGetHostTimeClock()
        let hostTime = CMClockGetTime(syncTime)
        player.setRate(1.0, time: seekCMTime, atHostTime: hostTime)
    case .failed:
        // Player item failed. See error.
        break
    case .unknown:
        // Player item is not yet ready.
        break
    @unknown default:
        print("FATAL ERROR IN AVPLAYER")
    }
}
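For reference, here is a minimal sketch of pointing setRate(_:time:atHostTime:) at a timecode. It assumes a fixed 24 fps, non-drop-frame timecode and treats the timecode as a position within the media; parseTimecode is a hypothetical helper, not an AVFoundation API.
import AVFoundation

// Hypothetical helper: convert "HH:MM:SS:FF" into a CMTime, assuming 24 fps
// and non-drop-frame timecode.
func parseTimecode(_ tc: String, fps: Int32 = 24) -> CMTime? {
    let parts = tc.split(separator: ":").compactMap { Int64($0) }
    guard parts.count == 4 else { return nil }
    let (h, m, s, f) = (parts[0], parts[1], parts[2], parts[3])
    let totalFrames = (h * 3600 + m * 60 + s) * Int64(fps) + f
    return CMTime(value: totalFrames, timescale: fps)
}

// Seek to the timecode and start playback at a host time slightly in the
// future, so several players can be lined up against the same clock.
func start(_ player: AVPlayer, atTimecode tc: String) {
    guard let mediaTime = parseTimecode(tc) else { return }
    player.automaticallyWaitsToMinimizeStalling = false // required before setRate(_:time:atHostTime:)
    let halfSecond = CMTime(value: 1, timescale: 2)
    let startHostTime = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()), halfSecond)
    player.setRate(1.0, time: mediaTime, atHostTime: startHostTime)
}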

Moving a window using the Accessibility API is excessively slow when using multiple displays - macOS Swift

Problem summary:
I'm working on my own window manager for macOS, and one of its features involves moving other applications' windows using mouse events and the Accessibility API. I create event taps for key-pressed and mouse-moved events with CGEvent.tapCreate. After my "move mode" is entered by pressing cmd+w, I feed mouse-moved deltas (while freezing the cursor sprite itself) to a function that takes the location of the currently focused window, adds the deltas, and sets the position of the window (an AXUIElement) using AXUIElementSetAttributeValue(windowElement, kAXPositionAttribute as CFString, position). This gets called a lot, once for every mouse-moved event, and it works fine when I have one display (the built-in MacBook display): very snappy, no discernible lag, and the window moves around as if it were the cursor.
However, when I test with more than one display (I've tested using either or both of a wired Dell monitor and a sidecar iPad), I get very inconsistent behavior and generally a lot of lag during window movement. The movement seems to be mostly correct, but delayed (the window might keep moving for as much as a few seconds after I lift my finger from the trackpad, or keep moving in the opposite direction of where I'm moving my finger for a bit).
Sometimes the movement is snappy and correct on one or more displays (otherwise identical behavior to using a single display), but slow on others. Which displays are affected seems random between launches. On one occasion moving my "Terminal" window was consistently snappy while "Notes" was slow. The most common case is just generally sluggish behavior across all displays. Usually at some point the mouse event tap will stop (I guess taking too long causes this to happen). I could restart it but that doesn't really solve the problem.
What I've tried:
Using Instruments (which I'm a novice at, so I may have missed something) and timing code, I'm 95% sure the problem is the call to AXUIElementSetAttributeValue that sets the window position. If I take out all the logic around where to set the window (not getting where the window currently is, etc.) and just set it to the same location repeatedly in response to mouse events, I can still observe the mouse event tap getting stopped (it fails pretty silently). Doing all the logic except the position setting doesn't take long enough to stop the tap.
In researching approaches to accomplish what I want I found the following links:
Move other windows on Mac OS X using Accessibility API
How can I move/resize windows programmatically from another application?
Window move and resize APIs in OS X
Seem to be Obj-C versions of the same approach I'm using
set the size and position of all windows on the screen in swift
Swift approach to what I'm doing, very similar to my window movement code
I want to animate the movement of a foreign OS X app's window
This approach is about custom animating window movement rather than just moving but the asker indicates that they were able to move foreign windows by "getting an AXUIElementRef that holds the AXUIElement associated with the focused window, then setting the NSAccessibilityPositionAttribute." I have not been able to find more details about this approach and a lot of NSAccessibility stuff appears to have since been deprecated.
I also looked at various Apple docs including Quartz and of course a ton of Accessibility as well as other less useful sources. Nowhere did I find any mentions of performance discrepancy when using the Accessibility API with multiple displays.
Code:
// CODE FOR MOVING A WINDOW
class WindowTransformer {
    var windowElement: AXUIElement?

    init?(forWindowWithPid: pid_t) {
        // Make Accessibility object for given PID
        let accessApp: AXUIElement = AXUIElementCreateApplication(forWindowWithPid)
        var windowData: AnyObject?
        AXUIElementCopyAttributeValue(accessApp, kAXWindowsAttribute as CFString, &windowData)
        windowElement = (windowData as? [AXUIElement])?.first
        guard let _ = windowElement else { return nil }
    }

    func transformWindowWithDeltas(x: CGFloat, y: CGFloat) {
        guard let current = getCurrentWindowPosition() else { return }
        let newX = current.x + x
        let newY = current.y + y
        setPosition(to: CGPoint(x: newX, y: newY))
    }

    func getCurrentWindowPosition() -> CGPoint? {
        if windowElement == nil { return nil }
        var positionData: CFTypeRef?
        AXUIElementCopyAttributeValue(windowElement!,
                                      kAXPositionAttribute as CFString,
                                      &positionData)
        return axValueAsCGPoint(positionData! as! AXValue)
    }

    func axValueAsCGPoint(_ value: AXValue) -> CGPoint {
        var point = CGPoint.zero
        AXValueGetValue(value, AXValueType.cgPoint, &point)
        return point
    }

    func setPosition(to: CGPoint) {
        var newPoint = to
        let position: CFTypeRef? = AXValueCreate(AXValueType(rawValue: kAXValueCGPointType)!, &newPoint)
        // What I think is causing issues
        let err = AXUIElementSetAttributeValue(windowElement!, kAXPositionAttribute as CFString, position!)
        if err != .success {
            // I've never seen this happen
            print("AXError moving window \(err)")
        }
    }
}
// CODE THAT GETS EXECUTED ON RECEIVING MOUSE EVENTS
// Gets set to true when in "move mode"
var listeningEscapeAndMouseFlag = false
// I have a notification center observer that updates this when applications
// are activated; will probably change how this is set eventually
var activePid: pid_t?
var transformer: WindowTransformer?

func mouseEventAction(event: CGEvent) -> Unmanaged<CGEvent>? {
    let unmodifiedEvent = Unmanaged.passRetained(event)
    if !listeningEscapeAndMouseFlag { return unmodifiedEvent }
    guard let activePid = activePid else { return unmodifiedEvent }
    if transformer == nil {
        transformer = WindowTransformer(forWindowWithPid: activePid)
    }
    let eventLocation = event.location
    let deltaEvent = NSEvent(cgEvent: event)
    let deltaX = deltaEvent?.deltaX
    let deltaY = deltaEvent?.deltaY
    guard let deltaX = deltaX, let deltaY = deltaY else { return nil }
    // Attempt to move window based on mouse events
    transformer?.transformWindowWithDeltas(x: deltaX, y: deltaY)
    // Keeps cursor visibly frozen
    CGWarpMouseCursorPosition(eventLocation)
    return nil
}
// Gets passed as callback argument to CGEvent.tapCreate
func mouse_interceptor_callback(tapProxy: CGEventTapProxy,
                                eventType: CGEventType,
                                event: CGEvent,
                                data: UnsafeMutableRawPointer?) -> Unmanaged<CGEvent>? {
    // I'm pretty sure modifying mouseMoved events directly doesn't actually do anything
    let unmodifiedEvent = Unmanaged.passRetained(event)
    if eventType != .mouseMoved {
        return unmodifiedEvent
    }
    return mouseEventAction(event: event)
}

// HOW I'M CREATING A TAP
var port: CFMachPort?

func createMouseTap() {
    let mask: CGEventMask = CGEventMask(1 << CGEventType.mouseMoved.rawValue)
    port = CGEvent.tapCreate(tap: .cghidEventTap,           // Tap where system events enter the window server
                             place: .headInsertEventTap,    // Insert before other taps
                             options: .defaultTap,          // Can modify events
                             eventsOfInterest: mask,
                             callback: mouse_interceptor_callback, // fn to run on event
                             userInfo: nil)
}

func activateTap() {
    guard let port = port else { return }
    CGEvent.tapEnable(tap: port, enable: true)
    let runLoopSrc = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, port, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSrc, .commonModes)
}
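One thing worth adding (my addition, not in the code above): when a tap takes too long, the window server delivers .tapDisabledByTimeout to the callback itself, so the callback can detect this and re-enable the tap. A minimal sketch reusing port and mouseEventAction from above:
// Sketch: a callback that re-enables the tap instead of silently dying.
func resilient_mouse_callback(tapProxy: CGEventTapProxy,
                              eventType: CGEventType,
                              event: CGEvent,
                              data: UnsafeMutableRawPointer?) -> Unmanaged<CGEvent>? {
    // The window server posts these pseudo-events when it disables a tap,
    // e.g. because the callback took too long to return.
    if eventType == .tapDisabledByTimeout || eventType == .tapDisabledByUserInput {
        if let port = port {
            CGEvent.tapEnable(tap: port, enable: true)
        }
        return nil
    }
    if eventType != .mouseMoved {
        return Unmanaged.passRetained(event)
    }
    return mouseEventAction(event: event)
}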
Notes:
There are some modifications and omissions from my actual code for brevity, but this is the code that is directly involved in what I'm attempting to do.
My computer performs well and there are no delays with other applications (or dragging windows manually) when using multiple displays.
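Not something from the post, but one mitigation worth sketching, under the assumption that the AX call itself is what's slow on multi-display setups: decouple it from the event tap by accumulating deltas in the callback and applying only the net movement on a timer, so a slow AXUIElementSetAttributeValue can never back the tap up. The 60 Hz tick is an arbitrary choice.
import AppKit

// Sketch: coalesce mouse deltas and apply the window position on a fixed
// tick instead of once per mouseMoved event. Create this on the main thread
// so the timer lands on the main run loop.
final class ThrottledMover {
    private var pendingDelta = CGPoint.zero
    private var timer: Timer?
    private let transformer: WindowTransformer

    init(transformer: WindowTransformer) {
        self.transformer = transformer
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 60.0, repeats: true) { [weak self] _ in
            guard let self = self, self.pendingDelta != .zero else { return }
            let delta = self.pendingDelta
            self.pendingDelta = .zero
            // One AX call per tick, no matter how many events arrived.
            self.transformer.transformWindowWithDeltas(x: delta.x, y: delta.y)
        }
    }

    // Called from the event tap callback: cheap, never touches the AX API.
    func accumulate(dx: CGFloat, dy: CGFloat) {
        pendingDelta.x += dx
        pendingDelta.y += dy
    }

    deinit { timer?.invalidate() }
}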

AudioKit Creating Sinewave Tone When Returning from Background

I'm using AudioKit to run an AKSequencer() that plays both mp3 and wav files using AKMIDISampler(). Everything works great, except in cases where the app has been in the background state for 30+ minutes and is then brought back up for use. It seems to lose all of its audio connections and plays the "missing file" sine-wave tone mentioned in other threads. The app can happily enter the background momentarily, the user can quit, etc. without the tone. It seems to only happen when the app is left in the background for a long period of time and then brought up again.
I've tried changing the order of AudioKit.start() and file loading, but nothing seems to completely eliminate the issue.
My current workaround is simply to prevent the user's display from timing out; however, that does not address many of the use cases where the issue occurs.
Is there a way to handle whatever error I'm causing that creates this tone? Here is a representative example of what I'm doing with ~40 audio files.
// viewController
let sequencer = Sequencer()

override func viewDidLoad() {
    super.viewDidLoad()
    sequencer.setupSequencer()
}

class SamplerWav {
    let audioWav = AKMIDISampler()

    func loadWavFile() {
        try? audioWav.loadWav("some_wav_audio_file")
    }
}

class SamplerMp3 {
    let audioMp3 = AKMIDISampler()
    let audioMp3_akAudioFile = try! AKAudioFile(readFileName: "some_other_audio_file.mp3")

    func loadMp3File() {
        try? audioMp3.loadAudioFile(audioMp3_akAudioFile)
    }
}

class Sequencer {
    let sequencer = AKSequencer()
    let mixer = AKMixer()
    let subMix = AKMixer()
    let samplerWav = SamplerWav()
    let samplerMp3 = SamplerMp3()
    var callbackTrack: AKMusicTrack!
    let callbackInstr = AKMIDICallbackInstrument()

    func setupSequencer() {
        AudioKit.output = mixer
        try! AudioKit.start()
        callbackTrack = sequencer.newTrack()
        callbackTrack?.setMIDIOutput(callbackInstr.midiIn)
        samplerWav.loadWavFile()
        samplerMp3.loadMp3File()
        samplerWav.audioWav >>> subMix
        samplerMp3.audioMp3 >>> subMix
        subMix >>> mixer
    }

    // Typically run from a callback track
    func playbackSomeSound() {
        try? samplerWav.audioWav.play(noteNumber: 60, velocity: 100, channel: 1)
    }
}
Thanks! I'm a big fan of AudioKit.
After some trial and error, here's a workflow that seems to address the issue for my circumstance (a sketch of the lifecycle wiring follows below):
- create my callback track(s) once, from viewDidLoad
- stop AudioKit, and call .detach() on all my AKMIDISampler tracks and any routing, in willResignActive
- start AudioKit (again), and reload and re-route all of the audio files/tracks, from didBecomeActive
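A minimal sketch of that lifecycle, assuming the Sequencer class from the question; detachAll() is a hypothetical stand-in for whatever code actually calls .detach() on each node.
import UIKit

// Sketch: tear down AudioKit routing when resigning active, rebuild on return.
final class AudioLifecycleObserver {
    let sequencer = Sequencer()
    private var tokens: [NSObjectProtocol] = []

    init() {
        let nc = NotificationCenter.default
        tokens.append(nc.addObserver(forName: UIApplication.willResignActiveNotification,
                                     object: nil, queue: .main) { [weak self] _ in
            // Stop the engine and detach samplers/mixers before backgrounding.
            try? AudioKit.stop()
            self?.sequencer.detachAll() // hypothetical: calls .detach() on each node
        })
        tokens.append(nc.addObserver(forName: UIApplication.didBecomeActiveNotification,
                                     object: nil, queue: .main) { [weak self] _ in
            // setupSequencer() (above) starts AudioKit again and reloads and
            // re-routes every file and track.
            self?.sequencer.setupSequencer()
        })
    }

    deinit { tokens.forEach { NotificationCenter.default.removeObserver($0) } }
}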

CIDetector face detection in real time: memory consumption increases linearly, how can I avoid this issue?

I have a question about how to call CIDetector correctly. I'm trying to run face detection in real time, and this works very well. However, the app's memory consumption increases linearly with time, as you can see in the image below. I'm thinking this is due to objects being created but never released; can anyone advise how to do this correctly?
I have pinpointed the issue to the function below: every time it's invoked, memory increases linearly, and when it's terminated memory quickly drops back down to almost 80 MB instead of rising toward 11 GB. I also checked for memory leaks, but none were found.
My target development platform is macOS. I'm trying to extract the mouth position from the CIDetector and then use it to compute a delta in the mouse function for a game.
I also looked at this post and tried their approach, but it did not work for me:
CIDetector isn't releasing memory
fileprivate func faceDetection() {
    // setting up dispatchQueue
    dispatchQueue.async {
        // checking if sample buffer is nil; if not, assign its value to sample
        if let sample = self.sampleBuffers {
            // if allFeatures is not nil, assign it to features; otherwise return
            guard let features = self.allFeatures(sample: sample) else { return }
            // loop to cycle through all features
            for feature in features {
                // checks if the feature is a CIFaceFeature; if yes, assign it to faceFeature and go on
                if let faceFeature = feature as? CIFaceFeature {
                    if !self.hasEnded {
                        if self.calX.count > 30 {
                            self.sens.append(self.calX.max()! - self.calX.min()!)
                            self.sens.append(self.calY.max()! - self.calY.min()!)
                            print(self.calX.max()! - self.calX.min()!)
                            self.hasEnded = true
                        } else {
                            self.calX.append(faceFeature.mouthPosition.x)
                            self.calY.append(faceFeature.mouthPosition.y)
                        }
                    } else {
                        self.mouse(position: CGPoint(x: (faceFeature.mouthPosition.x - 300) * 2,
                                                     y: (faceFeature.mouthPosition.y + 20) * 2),
                                   faceFeature: faceFeature)
                    }
                }
            }
        }
        if !self.faceTrackingEnds {
            self.faceDetection()
        }
    }
}
This problem was caused by repeatedly calling the function without waiting for its completion. The fix was implementing a dispatch group and then calling the function again from its completion handler, like this. The CIDetector now runs comfortably at around 200 MB of memory:
fileprivate func faceDetection() {
    let group = DispatchGroup()
    group.enter()
    // setting up dispatchQueue
    dispatchQueue.async {
        // balances group.enter() on every path, including the early guard return
        defer { group.leave() }
        // checking if sample buffer is nil; if not, assign its value to sample
        if let sample = self.sampleBuffers {
            // if allFeatures is not nil, assign it to features; otherwise return
            guard let features = self.allFeatures(sample: sample) else { return }
            // loop to cycle through all features
            for feature in features {
                // checks if the feature is a CIFaceFeature; if yes, assign it to faceFeature and go on
                if let faceFeature = feature as? CIFaceFeature {
                    self.mouse(position: faceFeature.mouthPosition, faceFeature: faceFeature)
                }
            }
        }
    }
    group.notify(queue: .main) {
        if !self.faceTrackingEnds {
            self.faceDetection()
        }
    }
}
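Beyond the dispatch group, one more hedged safeguard (an assumption on my part, not something the fix above required): Core Image work on a background queue can pin autoreleased objects until the thread drains, so wrapping the per-buffer work in an explicit autoreleasepool is a common extra measure. A sketch, assuming the same properties as the code above:
fileprivate func faceDetectionPooled() {
    let group = DispatchGroup()
    group.enter()
    dispatchQueue.async {
        autoreleasepool {
            // Drain autoreleased CIImage/feature objects per buffer rather
            // than letting them accumulate on the background thread.
            if let sample = self.sampleBuffers,
               let features = self.allFeatures(sample: sample) {
                for case let faceFeature as CIFaceFeature in features {
                    self.mouse(position: faceFeature.mouthPosition, faceFeature: faceFeature)
                }
            }
        }
        group.leave()
    }
    group.notify(queue: .main) {
        if !self.faceTrackingEnds {
            self.faceDetectionPooled()
        }
    }
}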

iOS: Keep application running in background

How do I keep my application running in the background?
Would I have to jailbreak my iPhone to do this? I just need this app to check something from the internet every set interval and notify when needed, for my own use.
Yes, and there's no need to jailbreak. Check out the "Implementing long-running background tasks" section of this doc from Apple.
From Apple's doc:
Declaring Your App’s Supported Background Tasks
Support for some types of background execution must be declared in advance by the app that uses them. An app declares support for a service using its Info.plist file. Add the UIBackgroundModes key to your Info.plist file and set its value to an array containing one or more of the following strings: (see Apple's doc from link mentioned above.)
I guess this is what you require:
When an iOS application goes to the background, are lengthy tasks paused?
iOS Application Background Downloading
This might help you ...
Enjoy Coding :)
Use local notifications to do that. But this will not check every time; you will have to set a time at which you will check your specific event, and you may shorten this by decreasing your time slot. Read more about local notifications to learn how to achieve this at:
http://developer.apple.com/library/mac/#documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Introduction/Introduction.html
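For illustration, a minimal sketch of scheduling a repeating local notification with the UserNotifications framework (which postdates the linked doc; the identifier and the 60-second interval are just examples):
import UserNotifications

// Sketch: ask for permission, then schedule a repeating local notification.
func scheduleCheckReminder() {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
        guard granted else { return }
        let content = UNMutableNotificationContent()
        content.title = "Check status"
        content.body = "Time to check the thing you're monitoring."
        // Repeating time-interval triggers must be at least 60 seconds apart.
        let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 60, repeats: true)
        let request = UNNotificationRequest(identifier: "periodic-check",
                                            content: content,
                                            trigger: trigger)
        center.add(request)
    }
}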
I found a way to keep the app running in the background by playing silence.
Make sure that you selected audio playback in background modes.
Also, don't use this method for a long time, since it consumes CPU resources and battery juice, but I think it's a suitable way to keep the app alive for a few minutes.
Just create an instance of SilencePlayer, call play(), and then stop() when you're done.
import AVFoundation
import AudioToolbox

public class SilencePlayer {
    private var audioQueue: AudioQueueRef? = nil

    public private(set) var isStarted = false

    public func play() {
        if isStarted { return }
        print("Playing silence")
        let avs = AVAudioSession.sharedInstance()
        try! avs.setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)
        try! avs.setActive(true)
        isStarted = true
        var streamFormat = AudioStreamBasicDescription(
            mSampleRate: 16000,
            mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked,
            mBytesPerPacket: 2,
            mFramesPerPacket: 1,
            mBytesPerFrame: 2,
            mChannelsPerFrame: 1,
            mBitsPerChannel: 16,
            mReserved: 0
        )
        let status = AudioQueueNewOutput(
            &streamFormat,
            SilenceQueueOutputCallback,
            nil, nil, nil, 0,
            &audioQueue
        )
        print("OSStatus for silence \(status)")
        var buffers = Array<AudioQueueBufferRef?>(repeating: nil, count: 3)
        for i in 0..<3 {
            // Allocate first, then set the byte size on the allocated buffer
            AudioQueueAllocateBuffer(audioQueue!, 320, &buffers[i])
            buffers[i]?.pointee.mAudioDataByteSize = 320
            SilenceQueueOutputCallback(nil, audioQueue!, buffers[i]!)
        }
        let startStatus = AudioQueueStart(audioQueue!, nil)
        print("Start status for silence \(startStatus)")
    }

    public func stop() {
        guard isStarted else { return }
        print("Called stop silence")
        if let aq = audioQueue {
            AudioQueueStop(aq, true)
            audioQueue = nil
        }
        try! AVAudioSession.sharedInstance().setActive(false)
        isStarted = false
    }
}

fileprivate func SilenceQueueOutputCallback(_ userData: UnsafeMutableRawPointer?,
                                            _ audioQueueRef: AudioQueueRef,
                                            _ bufferRef: AudioQueueBufferRef) -> Void {
    let pointer = bufferRef.pointee.mAudioData
    let length = bufferRef.pointee.mAudioDataByteSize
    memset(pointer, 0, Int(length))
    if AudioQueueEnqueueBuffer(audioQueueRef, bufferRef, 0, nil) != 0 {
        AudioQueueFreeBuffer(audioQueueRef, bufferRef)
    }
}
Tested on iOS 10 and Swift 4
I know this is not the answer to your question, but I think it is a solution.
This assumes that you're trying to check something or get data from the internet on a regular basis.
Create a service that checks the internet every set interval for whatever it is you want to know, and create a push notification to alert you of it, if the server is down, or whatever it is your trying to monitor has changed state. Just an idea.
Yes, you can do something like this. For that you need to add an entry to Info.plist to tell the OS that your app will run in the background. I did this when I wanted to send the user's location to a server after a particular time interval; for that I set "Required background modes" to "App registers for location updates".
You can write a handler of type UIBackgroundTaskIdentifier.
You can already do this in the applicationDidEnterBackground method.
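A minimal sketch of that pattern inside a UIApplicationDelegate; checkSomething() is a placeholder for the actual work, not a real API:
import UIKit

func applicationDidEnterBackground(_ application: UIApplication) {
    // Ask for extra execution time; the task must be ended or the system
    // will kill the app when the grace period expires.
    var taskID: UIBackgroundTaskIdentifier = .invalid
    taskID = application.beginBackgroundTask {
        // Expiration handler: clean up and end the task.
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }
    DispatchQueue.global().async {
        checkSomething() // placeholder for the periodic check
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }
}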