CGDisplayStream only capturing a single frame - Swift

I'm finally starting to play with Swift (and macOS development) for the first time. I'm trying to write a simple screen capture app to get started. I've already taken a look at and succeeded in using the AVFoundation APIs for doing this (AVCaptureSession, AVCaptureScreenInput, etc). But now I'd like to attempt to go a little lower-level and play with the closer-to-the-metal CGDisplayStream API.
Unfortunately I've only been able to get it to capture a single frame. I suspect I may be missing something regarding potential interaction between the main run loop and the DispatchQueue I'm passing in? I'm not really clear on whether those things interact in the first place.
Here's a small reproduction of my issue:
import Foundation
import AVFoundation
import CoreGraphics

let mainDisplay = CGMainDisplayID()
let displayBounds = CGDisplayBounds(mainDisplay)
let recordingQueue = DispatchQueue.global(qos: .background)

let displayStreamProps: [CFString: Any] = [
    CGDisplayStream.preserveAspectRatio: kCFBooleanTrue,
    CGDisplayStream.showCursor: kCFBooleanTrue,
    CGDisplayStream.minimumFrameTime: 60,
]

let displayStream = CGDisplayStream(
    dispatchQueueDisplay: mainDisplay,
    outputWidth: Int(displayBounds.width),
    outputHeight: Int(displayBounds.height),
    pixelFormat: Int32(kCVPixelFormatType_32BGRA),
    properties: displayStreamProps as CFDictionary,
    queue: recordingQueue,
    handler: { status, displayTime, frameSurface, updateRef in
        print("is only called once")
    }
)

func quit(_: Int32) {
    displayStream?.stop()
}
signal(SIGINT, quit)

displayStream?.start()
RunLoop.current.run()
Any help would be massively appreciated!!

Removing this line seems to fix the issue:
CGDisplayStream.minimumFrameTime: 60,
The docs don't say what unit this "time" field uses, but it appears to be seconds. With a value of 60, you were asking for at most one frame every 60 seconds, which is why the handler only seemed to fire once. Change it to 1.0/60.0 for 60fps capture.
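For example, a corrected properties dictionary might look like this (same keys as in the question; the only change is the frame time):
let displayStreamProps: [CFString: Any] = [
    CGDisplayStream.preserveAspectRatio: kCFBooleanTrue,
    CGDisplayStream.showCursor: kCFBooleanTrue,
    // Interpreted as seconds per frame: 1/60 s caps capture at ~60fps
    CGDisplayStream.minimumFrameTime: 1.0 / 60.0,
]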


CGEventPost doesn't work in compiled mode on macOS 12.6 but works in interpreted mode

I'm trying to programmatically send a custom key event (function keys, media keys), but it only works in interpreted mode, not in compiled code.
I tried using the following answer:
emulate media key press on Mac
The Python example works perfectly; the Swift example works when called as a script, i.e. this code:
#!/usr/bin/swift
import Quartz

let NX_KEYTYPE_SOUND_UP: UInt32 = 0
let NX_KEYTYPE_SOUND_DOWN: UInt32 = 1
let NX_KEYTYPE_PLAY: UInt32 = 16
let NX_KEYTYPE_NEXT: UInt32 = 17
let NX_KEYTYPE_PREVIOUS: UInt32 = 18
let NX_KEYTYPE_FAST: UInt32 = 19
let NX_KEYTYPE_REWIND: UInt32 = 20

let supportedKeys: [String: UInt32] = ["playpause": NX_KEYTYPE_PLAY, "next": NX_KEYTYPE_NEXT, "prev": NX_KEYTYPE_PREVIOUS, "volup": NX_KEYTYPE_SOUND_UP, "voldown": NX_KEYTYPE_SOUND_DOWN]

func HIDPostAuxKey(key: UInt32) {
    func keyDown(_ down: Bool) {
        let flags = NSEvent.ModifierFlags(rawValue: (down ? 0xa00 : 0xb00))
        let data1 = Int((key << 16) | (down ? 0xa00 : 0xb00))
        let ev = NSEvent.otherEvent(with: NSEvent.EventType.systemDefined,
                                    location: NSPoint(x: 0, y: 0),
                                    modifierFlags: flags,
                                    timestamp: 0,
                                    windowNumber: 0,
                                    context: nil,
                                    subtype: 8,
                                    data1: data1,
                                    data2: -1)
        let cev = ev?.cgEvent
        cev?.post(tap: CGEventTapLocation.cghidEventTap)
    }
    keyDown(true)
    keyDown(false)
}

HIDPostAuxKey(key: supportedKeys[CommandLine.arguments[1]]!)
Called via the terminal (after doing chmod a+x keypress.swift), it works perfectly:
./keypress.swift volup
increases the volume; the volume HUD even appears on screen.
If I try to compile the exact same code with
swiftc -o keypress keypress.swift
And adding the keypress binary to Security & Privacy -> Accessibility, it does nothing. No error message, nothing.
First I tried to write a CLI app in Xcode using this code; it doesn't work (no error, but no key pressed). I've tried in Obj-C and in Swift, and I checked that sandboxing wasn't enabled. No luck.
I'm stumped that it works in interpreted mode but not in compiled mode.
Is there some flag to add when compiling to enable posting events? I'm out of ideas.
I'm running macOS 12.6.2, Xcode 14.2, Swift 5.
The difference between interpreted Swift and compiled Swift is ... speed.
So adding
usleep(useconds_t(1 * 1_000)) //will sleep for 1 millisecond (.001 seconds)
after the cgEvent.post solves the problem.
(Note that I tested this on my iMac i5 3.2 GHz, you may need to change this delay)
This is not a perfect solution: the program should manage a queue of the posted events, or better yet, macOS should handle this.
EDIT:
I was wrong. Sending a dummy event at start solves the problem.
func initEventQueue() {
    // Post a dummy key-up event for the Command key (virtual key 55)
    if let ev = CGEvent(keyboardEventSource: nil, virtualKey: 55, keyDown: false) {
        ev.post(tap: .cghidEventTap)
    }
}
Calling this at start solves the problem.
I've read all the documentation I could find, but there's no mention of how or when the event queue is created or why this solves my problem.
I'm still curious why this happens and what's the correct way to solve this.
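For reference, a minimal sketch of how the pieces fit together in a compiled binary (initEventQueue and HIDPostAuxKey are the functions above):
// Prime the event system with a dummy event first, then post the media key
initEventQueue()
HIDPostAuxKey(key: supportedKeys[CommandLine.arguments[1]]!)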

Why is a process suspended by another process behind it?

The code is simple: it just reads and parses an XML file into an array. I did not notice the problem until one day I tried to open a big XML file.
I added a blur view with an NSProgressIndicator while the data is parsing, but the blur view did not show up until the parsing was completed.
self.addBlurView()
let file = HandleFile.shared.openFile(filePath)
self.removeBlurView()
guard let name = file.name, let path = file.path, let data = file.data else {
    return
}
So I tried delaying the parsing. The blur view then showed up, and was removed when parsing completed.
self.addBlurView()
DispatchQueue.main.asyncAfter(deadline: .now() + 0.3, execute: {
    let file = HandleFile.shared.openFile(filePath)
    self.removeBlurView()
    guard let name = file.name, let path = file.path, let data = file.data else {
        return
    }
})
I thought it might be a threading problem, so I tried the following in func addBlurView(), but it failed. I also added a counter in addBlurView(); it counted to a certain number, paused, and continued counting after the data was parsed.
DispatchQueue.main.async {
    self.blurView.isHidden = false
    self.spinner.startAnimation(self)
}
I have no idea why this happens. Can anyone help solve this problem?
Thanks.
As I mentioned in the comments above, the main queue is a serial queue, and all the tasks assigned to it are executed serially by the main thread. In general, you should not perform any heavy-lifting task (like loading a file into memory) on the main thread, as it would block the main thread and render the UI unresponsive.
Typically, all heavy-lifting tasks like loading a file into memory (anything which does not deal with UI rendering directly) should be delegated to one of the dispatch queues. Try wrapping your openFile(filePath) call inside a DispatchQueue:
self.addBlurView()
DispatchQueue.global(qos: .default).async {
    let file = HandleFile.shared.openFile(filePath)
}
Personally, I would expect the openFile function to have a completion block triggered on the main queue when it finishes loading the file, so that you can remove your blurView; but in your case it seems like it's a synchronous call, so you can try:
self.addBlurView()
DispatchQueue.global(qos: .default).async {
    let file = HandleFile.shared.openFile(filePath)
    DispatchQueue.main.async {
        self.removeBlurView()
    }
}
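Putting the question's guard back in, the fuller flow would look something like this (a sketch; it assumes name/path/data are then handed to UI code, hence the hop back to the main queue):
self.addBlurView()
DispatchQueue.global(qos: .default).async {
    let file = HandleFile.shared.openFile(filePath)
    DispatchQueue.main.async {
        self.removeBlurView()
        guard let name = file.name, let path = file.path, let data = file.data else {
            return
        }
        // ... update the UI with name/path/data here ...
    }
}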

How to remove sandbox without Xcode

Earlier I asked a question regarding generateCGImagesAsynchronously. Thankfully it got answered and works great.
The issue is that it only works as a Cocoa app in Xcode. I am now trying to move this logic to an executable Swift package, but AVFoundation code, such as generateCGImagesAsynchronously, won't work. That is, no error is raised, but those functions seem to be mocked. I presume this might have to do with my package being sandboxed? I was able to remove the sandbox from the Cocoa app I previously wrote the code for, but I can't figure out how to do that for this executable.
I am new to Swift and trying to understand it, and it's kind of frustrating to think that what I want my code to do depends on the IDE I am using.
If anyone can point me in the direction of where to read in the docs, or some other sources, on how to make programs without using Xcode, that would be great. Thanks!
Here is my code:
import Darwin
import Foundation
import AppKit
import AVFoundation
import Cocoa

@discardableResult func writeCGImage(
    _ image: CGImage,
    to destinationURL: URL
) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(
        destinationURL as CFURL,
        kUTTypePNG,
        1,
        nil
    ) else { return false }
    CGImageDestinationAddImage(destination, image, nil)
    return CGImageDestinationFinalize(destination)
}

func imageGenCompletionHandler(
    requestedTime: CMTime,
    image: CGImage?,
    actualTime: CMTime,
    result: AVAssetImageGenerator.Result,
    error: Error?
) {
    guard let image = image else { return }
    let path = saveToPath.appendingPathComponent(
        "img\(actualTime).png"
    )
    writeCGImage(image, to: path)
}

let arguments: [String] = Array(CommandLine.arguments.dropFirst())
// For now, we assume the second arg, which is the
// path that the user wants us to save to, always exists.
let saveToPath = URL(fileURLWithPath: arguments[1], isDirectory: true)
let vidURL = URL(fileURLWithPath: arguments[0])
let vidAsset = AVAsset(url: vidURL)
let vidDuration = vidAsset.duration
let imageGen = AVAssetImageGenerator(asset: vidAsset)
var frameForTimes = [NSValue]()
let sampleCounts = 20
let totalTimeLength = Int(truncatingIfNeeded: vidDuration.value as Int64)
let steps = totalTimeLength / sampleCounts

for sampleCount in 0 ..< sampleCounts {
    let cmTime = CMTimeMake(
        value: Int64(sampleCount * steps),
        timescale: Int32(vidDuration.timescale)
    )
    frameForTimes.append(NSValue(time: cmTime))
}

imageGen.generateCGImagesAsynchronously(
    forTimes: frameForTimes,
    completionHandler: imageGenCompletionHandler
)
As I said in a comment on your previous question, this has nothing to do with Xcode per se. Xcode just generates a lot of code and build commands for you.
macOS is a complex operating system, and programs that want to use its more advanced features must follow certain patterns. One of these patterns is called the run loop. If you create a Cocoa app, you get most of these things for free.
Since you are trying to perform some asynchronous actions, you need a run loop. Appending this should work:
RunLoop.current.run()
Otherwise, your program will simply terminate when the main thread (your code) finishes. The run loop, however, causes the program to run a loop and wait for asynchronous events (this also includes UI interactions, for example) to occur.
Note that inserting this same line also fixes your issues from the other question.
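One caveat: RunLoop.current.run() never returns on its own, so the tool will keep running after all frames have been written. A sketch of one way to exit cleanly (the completion counting is my addition, not part of the question's code):
var remaining = frameForTimes.count
imageGen.generateCGImagesAsynchronously(forTimes: frameForTimes) { requestedTime, image, actualTime, result, error in
    imageGenCompletionHandler(requestedTime: requestedTime, image: image,
                              actualTime: actualTime, result: result, error: error)
    // The handler runs on a background thread; hop to main before mutating shared state
    DispatchQueue.main.async {
        remaining -= 1
        if remaining == 0 { exit(0) }  // every requested frame has been handled
    }
}
RunLoop.current.run()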

AudioKit Creating Sinewave Tone When Returning from Background

I'm using AudioKit to run an AKSequencer() that plays both mp3 and wav files using AKMIDISampler(). Everything works great, except in cases when the app has entered the background state for 30+ minutes and is then brought back up for use. It seems to then lose all of its audio connections and plays the "missing file" sinewave tone mentioned in other threads. The app can happily enter the background momentarily, the user can quit, etc. without the tone. It seems to only happen when the app is left in the background for long periods of time and then brought up again.
I've tried changing the order of AudioKit.start() and file loading, but nothing seems to completely eliminate the issue.
My current workaround is simply to prevent the user's display from timing out; however, that does not address many of the use cases where the issue occurs.
Is there a way to handle whatever error I'm setting up that creates this tone? Here is a representative example of what I'm doing with ~40 audio files.
//viewController
let sequencer = Sequencer()

override func viewDidLoad() {
    super.viewDidLoad()
    sequencer.setupSequencer()
}

class SamplerWav {
    let audioWav = AKMIDISampler()

    func loadWavFile() {
        try? audioWav.loadWav("some_wav_audio_file")
    }
}

class SamplerMp3 {
    let audioMp3 = AKMIDISampler()
    let audioMp3_akAudioFile = try! AKAudioFile(readFileName: "some_other_audio_file.mp3")

    func loadMp3File() {
        try? audioMp3.loadAudioFile(audioMp3_akAudioFile)
    }
}

class Sequencer {
    let sequencer = AKSequencer()
    let mixer = AKMixer()
    let subMix = AKMixer()
    let samplerWav = SamplerWav()
    let samplerMp3 = SamplerMp3()
    var callbackTrack: AKMusicTrack!
    let callbackInstr = AKMIDICallbackInstrument()

    func setupSequencer() {
        AudioKit.output = mixer
        try! AudioKit.start()
        callbackTrack = sequencer.newTrack()
        callbackTrack?.setMIDIOutput(callbackInstr.midiIn)
        samplerWav.loadWavFile()
        samplerMp3.loadMp3File()
        samplerWav.audioWav >>> subMix
        samplerMp3.audioMp3 >>> subMix
        subMix >>> mixer
    }

    //Typically run from a callback track
    func playbackSomeSound() {
        try? samplerWav.audioWav.play(noteNumber: 60, velocity: 100, channel: 1)
    }
}
Thanks! I'm a big fan of AudioKit.
After some trial and error, here's a workflow that seems to address the issue for my circumstance:
- create my callback track(s) -once- from viewDidLoad
- stop AudioKit, and call .detach() on all my AKMIDISampler tracks and any routing, in willResignActive
- start AudioKit (again), and reload and re-route all of the audio files/tracks, from didBecomeActive
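A rough sketch of that workflow (my arrangement, assuming AudioKit 4-era APIs and the Sequencer instance from the question; detach() is the call mentioned above):
// e.g. register these once, from viewDidLoad
NotificationCenter.default.addObserver(forName: UIApplication.willResignActiveNotification,
                                       object: nil, queue: .main) { _ in
    // Tear down: stop the engine and detach samplers
    try? AudioKit.stop()
    sequencer.samplerWav.audioWav.detach()
    sequencer.samplerMp3.audioMp3.detach()
}
NotificationCenter.default.addObserver(forName: UIApplication.didBecomeActiveNotification,
                                       object: nil, queue: .main) { _ in
    // Rebuild: restart the engine, reload files, and re-route
    try? AudioKit.start()
    sequencer.samplerWav.loadWavFile()
    sequencer.samplerMp3.loadMp3File()
    sequencer.samplerWav.audioWav >>> sequencer.subMix
    sequencer.samplerMp3.audioMp3 >>> sequencer.subMix
    sequencer.subMix >>> sequencer.mixer
}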

iOS: Keep application running in background

How do I keep my application running in the background?
Would I have to jailbreak my iPhone to do this? I just need this app to check something from the internet every set interval and notify when needed, for my own use.
You can, and there's no need to jailbreak. Check out the "Implementing long-running background tasks" section of this doc from Apple.
From Apple's doc:
Declaring Your App’s Supported Background Tasks
Support for some types of background execution must be declared in advance by the app that uses them. An app declares support for a service using its Info.plist file. Add the UIBackgroundModes key to your Info.plist file and set its value to an array containing one or more of the following strings: (see Apple's doc from link mentioned above.)
I guess this is what you need:
When an iOS application goes to the background, are lengthy tasks paused?
iOS Application Background Downloading
This might help you ...
Enjoy Coding :)
Use local notifications to do that. But this will not check continuously; you will have to set a time at which to check your specific event, and you may shorten the gap by decreasing your time slot. Read more about local notifications to learn how to achieve this at:
http://developer.apple.com/library/mac/#documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Introduction/Introduction.html
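For what it's worth, with the newer UserNotifications framework (which postdates the linked guide), scheduling a repeating check reminder looks roughly like this (a sketch; identifier and interval are placeholders):
import UserNotifications

// Authorization must already have been granted via requestAuthorization(options:)
let content = UNMutableNotificationContent()
content.title = "Status check"
content.body = "Time to check the thing you are monitoring."

// Fire roughly every hour; repeating intervals must be at least 60 seconds
let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 3600, repeats: true)
let request = UNNotificationRequest(identifier: "periodic-check",
                                    content: content, trigger: trigger)
UNUserNotificationCenter.current().add(request)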
I found a way to keep the app running in the background by playing silence.
Make sure that you have selected audio playback in background modes.
Also, don't use this method for a long time, since it consumes CPU resources and battery juice, but I think it's a suitable way to keep the app alive for a few minutes.
Just create an instance of SilencePlayer, call play(), and then stop() when you're done.
import AVFoundation
import AudioToolbox

public class SilencePlayer {
    private var audioQueue: AudioQueueRef? = nil

    public private(set) var isStarted = false

    public func play() {
        if isStarted { return }
        print("Playing silence")
        let avs = AVAudioSession.sharedInstance()
        try! avs.setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)
        try! avs.setActive(true)
        isStarted = true
        // 16 kHz, 16-bit, mono PCM
        var streamFormat = AudioStreamBasicDescription(
            mSampleRate: 16000,
            mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked,
            mBytesPerPacket: 2,
            mFramesPerPacket: 1,
            mBytesPerFrame: 2,
            mChannelsPerFrame: 1,
            mBitsPerChannel: 16,
            mReserved: 0
        )
        let status = AudioQueueNewOutput(
            &streamFormat,
            SilenceQueueOutputCallback,
            nil, nil, nil, 0,
            &audioQueue
        )
        print("OSStatus for silence \(status)")
        var buffers = Array<AudioQueueBufferRef?>.init(repeating: nil, count: 3)
        for i in 0..<3 {
            // Allocate first, then set the byte size and prime the queue
            AudioQueueAllocateBuffer(audioQueue!, 320, &(buffers[i]))
            buffers[i]?.pointee.mAudioDataByteSize = 320
            SilenceQueueOutputCallback(nil, audioQueue!, buffers[i]!)
        }
        let startStatus = AudioQueueStart(audioQueue!, nil)
        print("Start status for silence \(startStatus)")
    }

    public func stop() {
        guard isStarted else { return }
        print("Called stop silence")
        if let aq = audioQueue {
            AudioQueueStop(aq, true)
            audioQueue = nil
        }
        try! AVAudioSession.sharedInstance().setActive(false)
        isStarted = false
    }
}

// Fills each buffer with zeros (silence) and re-enqueues it
fileprivate func SilenceQueueOutputCallback(_ userData: UnsafeMutableRawPointer?, _ audioQueueRef: AudioQueueRef, _ bufferRef: AudioQueueBufferRef) -> Void {
    let pointer = bufferRef.pointee.mAudioData
    let length = bufferRef.pointee.mAudioDataByteSize
    memset(pointer, 0, Int(length))
    if AudioQueueEnqueueBuffer(audioQueueRef, bufferRef, 0, nil) != 0 {
        AudioQueueFreeBuffer(audioQueueRef, bufferRef)
    }
}
Tested on iOS 10 and Swift 4
I know this is not the answer to your question, but I think it is a solution.
This assumes that you're trying to check something or get data from the internet on a regular basis?
Create a service that checks the internet every set interval for whatever it is you want to know, and have it send a push notification to alert you if the server is down, or whatever it is you're trying to monitor has changed state. Just an idea.
Yes, you can do something like this. For that, you need to add an entry to Info.plist to tell the OS that your app will run in the background. I did this when I wanted to pass the user's location to a server after a particular timestamp. For that, I set "Required background modes" to "App registers for location updates".
You can write a handler of type UIBackgroundTaskIdentifier.
You can already do this in the applicationDidEnterBackground method, for example:
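A minimal sketch, assuming UIKit and an app delegate (the work closure is a placeholder for your own check):
func applicationDidEnterBackground(_ application: UIApplication) {
    var taskID: UIBackgroundTaskIdentifier = .invalid
    // Ask the system for extra execution time; end the task when done or on expiry
    taskID = application.beginBackgroundTask {
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }
    DispatchQueue.global().async {
        // ... perform the periodic check / network call here ...
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }
}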