How to get size and position of a sceneWindow under Mac Catalyst - Swift

I have been able to obtain the size of a sceneWindow when I resize it using:
func windowScene(_ windowScene: UIWindowScene, didUpdate previousCoordinateSpace: UICoordinateSpace, interfaceOrientation previousInterfaceOrientation: UIInterfaceOrientation, traitCollection previousTraitCollection: UITraitCollection) {
    print("movement trapped \(windowScene.coordinateSpace.bounds)")
}
within the scene delegate. But the x,y coordinates are always 0,0 regardless of where I drag the window to. I'm looking to be able to dictate where the new sceneWindow is located on the Mac's screen relative to the "default" sceneWindow.

You can try converting the window frame from the window scene's coordinate space to the UIScreen's coordinate space:
windowFrame = [window convertRect:window.frame toCoordinateSpace:window.windowScene.screen.coordinateSpace];
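In Swift, a rough equivalent of that conversion (a sketch, assuming you have a reference to the window, e.g. via view.window) would be:
if let window = view.window, let screen = window.windowScene?.screen {
    // Convert the window's frame into the screen's coordinate space.
    let windowFrame = window.convert(window.frame, to: screen.coordinateSpace)
    print("window frame in screen space: \(windowFrame)")
}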

Well, I didn't find a way to get this done with UIWindow alone, but remember: Catalyst does support AppKit, and you can call into it at any time using the Objective-C runtime.
So here's the idea:
Get NSWindows from NSApplication
Get UIWindow from view.window
Compare some magic and look up our target NSWindow
Get the frame of that NSWindow
var targetNSWindow: AnyObject? = nil
let nsWindows = (NSClassFromString("NSApplication")?.value(forKeyPath: "sharedApplication.windows") as? [AnyObject])!
for nsWindow in nsWindows {
    let uiWindows = nsWindow.value(forKeyPath: "uiWindows") as? [UIWindow] ?? []
    if uiWindows.contains(view.window!) {
        targetNSWindow = nsWindow
    }
}
if let found = targetNSWindow {
    print(found.value(forKeyPath: "_frame")!)
}
And here is a sample output.
NSRect: {{818, 296}, {964, 614}}
A little bit more: you can get your window information from the scene delegate and compare against it in a similar way. But be careful: sometimes you don't have any window yet when the app has just loaded. Do the work in a DispatchQueue.main.async block if that happens.
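For example, a sketch of deferring that lookup from the scene delegate until a window actually exists (findNSWindowFrame(for:) is a hypothetical helper wrapping the key-value lookup shown above):
func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
    DispatchQueue.main.async {
        // By now the window hierarchy should exist; findNSWindowFrame(for:)
        // is a hypothetical helper wrapping the NSApplication lookup above.
        if let frame = self.findNSWindowFrame(for: scene) {
            print("NSWindow frame: \(frame)")
        }
    }
}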

Related

How to center my window in another application's window?

My application is distributed via the App Store (just to let you know about some limitations). I would like to center my application's window in the frontmost app's window. Is it possible?
The only thing I need to know is the frame of the frontmost application's window. I checked NSWorkspace.shared.frontmostApplication but there is nothing about the window's position.
I found a solution for that. It looks like it doesn't even require Accessibility permission.
import Cocoa

if let activeApp = NSWorkspace.shared.frontmostApplication,
   let windowList = CGWindowListCopyWindowInfo([.excludeDesktopElements, .optionOnScreenOnly], kCGNullWindowID) as? [[String: Any]],
   let frontWindow = windowList.first(where: { ($0[kCGWindowOwnerPID as String] as? pid_t) == activeApp.processIdentifier }),
   let boundsDict = frontWindow[kCGWindowBounds as String] as? [String: Any],
   let bounds = CGRect(dictionaryRepresentation: boundsDict as CFDictionary) {
    print("Active window bounds: \(bounds)")
}
Also, keep in mind:
The coordinates of the rectangle are specified in screen space, where the origin is in the upper-left corner of the main display.
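If you then want to use those bounds with AppKit (for example, to center your own NSWindow over the frontmost window), keep in mind that CGWindow bounds use a top-left origin while AppKit uses a bottom-left origin. A minimal conversion sketch, assuming the window lives on the main display:
import AppKit

// Convert CGWindow bounds (top-left origin) into AppKit screen coordinates
// (bottom-left origin). Assumes the window is on the primary display.
func appKitRect(fromCGWindowBounds bounds: CGRect) -> CGRect {
    let screenHeight = NSScreen.screens.first?.frame.height ?? 0
    return CGRect(x: bounds.minX,
                  y: screenHeight - bounds.minY - bounds.height,
                  width: bounds.width,
                  height: bounds.height)
}

// Example: center a 400x300 window over the frontmost app's window.
// let target = appKitRect(fromCGWindowBounds: bounds)
// myWindow.setFrame(CGRect(x: target.midX - 200, y: target.midY - 150,
//                          width: 400, height: 300), display: true)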

How, exactly, do I render Metal on a background thread?

This problem is caused by user interface interactions such as showing the titlebar while in fullscreen. That question's answer provides a solution, but not how to implement that solution.
The solution is to render on a background thread. The issue is that Apple's sample code is written to cover many cases, so most of it is extraneous for my purposes, and I can't understand it well enough to strip it down, so it just plain isn't an option. How would I make a simple Swift Metal game use a background thread, keeping the code as concise as possible?
Take this, for example:
class ViewController: NSViewController {
    var MetalView: MTKView {
        return view as! MTKView
    }
    var Device: MTLDevice = MTLCreateSystemDefaultDevice()!
    override func viewDidLoad() {
        super.viewDidLoad()
        MetalView.delegate = self
        MetalView.device = Device
        MetalView.colorPixelFormat = .bgra8Unorm_srgb
        Device = MetalView.device!
        // setup code
    }
}
extension ViewController: MTKViewDelegate {
    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
    }
    func draw(in view: MTKView) {
        // drawing code
    }
}
That is the start of a basic Metal game. What would that code look like, if it were rendering on a background thread?
To fix that bug when showing the titlebar in Metal, I need to render it on a background thread. Well, how do I render it on a background thread?
I've noticed this answer suggests manually redrawing it 60 times a second, presumably using a loop on a background thread? But that seems... not a clean way to fix it. Is there a cleaner way?
The main trick in getting this to work seems to be setting up the CVDisplayLink. This is awkward in Swift, but doable. After some work I was able to modify the "Game" template in Xcode to use a custom view backed by CAMetalLayer instead of MTKView, and a CVDisplayLink to render in the background, as suggested in the sample code you linked — see below.
Edit Oct 22:
The approach mentioned in this thread seems to work just fine: still using an MTKView, but drawing it manually from the display link callback. Specifically I was able to follow these steps:
Create a new macOS Game project in Xcode.
Modify GameViewController to add a CVDisplayLink, similar to below (see this question for more on using CVDisplayLink from Swift). Start the display link in viewWillAppear and stop it in viewWillDisappear.
Set mtkView.isPaused = true in viewDidLoad to disable automatic rendering, and instead explicitly call mtkView.draw() from the display link callback.
The full content of my modified GameViewController.swift is available here.
I didn't review the Renderer class for thread safety, so I can't be sure no more changes are required, but this should get you up and running.
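For reference, here is a condensed sketch of those steps. This is not the linked file; the Renderer type and its initializer are assumed from Xcode's Game template:
import Cocoa
import MetalKit

class GameViewController: NSViewController {
    var mtkView: MTKView!
    var renderer: Renderer?          // Renderer is the Game template's MTKViewDelegate (assumed)
    var displayLink: CVDisplayLink?

    override func viewDidLoad() {
        super.viewDidLoad()
        mtkView = (view as! MTKView)                // the storyboard view is an MTKView in the template
        mtkView.device = MTLCreateSystemDefaultDevice()
        renderer = Renderer(metalKitView: mtkView)  // assumed template initializer
        mtkView.delegate = renderer
        mtkView.isPaused = true                     // disable MTKView's internal timer
        mtkView.enableSetNeedsDisplay = false       // we drive drawing ourselves
    }

    override func viewWillAppear() {
        super.viewWillAppear()
        guard CVDisplayLinkCreateWithActiveCGDisplays(&displayLink) == kCVReturnSuccess,
              let displayLink = displayLink else { return }
        // A non-capturing closure bridges to the C callback; the view controller
        // is passed through the opaque context pointer.
        CVDisplayLinkSetOutputCallback(displayLink, { (_, _, _, _, _, context) -> CVReturn in
            let controller = Unmanaged<GameViewController>.fromOpaque(context!).takeUnretainedValue()
            controller.mtkView.draw()               // render on the display link's background thread
            return kCVReturnSuccess
        }, Unmanaged.passUnretained(self).toOpaque())
        CVDisplayLinkStart(displayLink)
    }

    override func viewWillDisappear() {
        super.viewWillDisappear()
        if let displayLink = displayLink {
            CVDisplayLinkStop(displayLink)
        }
    }
}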
Older implementation with CAMetalLayer instead of MTKView:
This is just a proof of concept and I can't guarantee it's the best way to do everything. You might find these helpful too: Is drawing to an MTKView or CAMetalLayer required to take place on the main thread? and https://developer.apple.com/documentation/quartzcore/cametallayer/1478157-presentswithtransaction
I didn't try this idea, but given how much convenience MTKView generally provides over CAMetalLayer, it might be worth giving it a shot: https://developer.apple.com/forums/thread/89241?answerId=268384022#268384022
class MyMetalView: NSView {
    var displayLink: CVDisplayLink?
    var metalLayer: CAMetalLayer!

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        setupMetalLayer()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setupMetalLayer()
    }

    override func makeBackingLayer() -> CALayer {
        return CAMetalLayer()
    }

    func setupMetalLayer() {
        wantsLayer = true
        metalLayer = layer as! CAMetalLayer?
        metalLayer.device = MTLCreateSystemDefaultDevice()!
        // ...other configuration of the metalLayer...
    }

    // handle display link callback at 60fps
    static let _outputCallback: CVDisplayLinkOutputCallback = { (displayLink, inNow, inOutputTime, flagsIn, flagsOut, context) -> CVReturn in
        // convert opaque context pointer back into a reference to our view
        let view = Unmanaged<MyMetalView>.fromOpaque(context!).takeUnretainedValue()
        /*** render something into view.metalLayer here! ***/
        return kCVReturnSuccess
    }

    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()

        guard CVDisplayLinkCreateWithActiveCGDisplays(&displayLink) == kCVReturnSuccess,
              let displayLink = displayLink
        else {
            fatalError("unable to create display link")
        }

        // pass a reference to this view as an opaque pointer
        guard CVDisplayLinkSetOutputCallback(displayLink, MyMetalView._outputCallback, Unmanaged<MyMetalView>.passUnretained(self).toOpaque()) == kCVReturnSuccess else {
            fatalError("unable to configure output callback")
        }

        guard CVDisplayLinkStart(displayLink) == kCVReturnSuccess else {
            fatalError("unable to start display link")
        }
    }

    deinit {
        if let displayLink = displayLink {
            CVDisplayLinkStop(displayLink)
        }
    }
}
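For completeness, a hedged sketch of what the render step inside that callback might do: grab the layer's next drawable, clear it, and present it (the command queue is assumed to be created once, e.g. alongside the device in setupMetalLayer):
// Sketch: clear the layer's next drawable to a solid color and present it.
func render(into layer: CAMetalLayer, using commandQueue: MTLCommandQueue) {
    guard let drawable = layer.nextDrawable(),
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = drawable.texture
    pass.colorAttachments[0].loadAction = .clear
    pass.colorAttachments[0].storeAction = .store
    pass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)

    if let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass) {
        // ...encode draw calls here...
        encoder.endEncoding()
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}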

Drawing directly in an NSView without using the draw(_ updateRect: NSRect) function

I would like to draw CGImage pictures directly into a view. With the normal method using the draw func I only get 7 pictures per second on a new MacBook Pro, so I decided to use the updateLayer func instead. I have defined wantsUpdateLayer = true and my new updateLayer func is called as expected. But then my problem starts.

When using the draw func, I get the current CGContext with "NSGraphicsContext.current?.cgContext", but in my updateLayer func "NSGraphicsContext.current?.cgContext" is nil, so I do not know where to put my CGImage so that it will be displayed on screen. "self.view?.window?.graphicsContext?.cgContext" and "self.window?.graphicsContext?.cgContext" are nil, too.

There are no buttons or other elements in this view or in its window, only one picture filling the complete window, and this picture must change 30 times per second. Generating the pictures is done on a separate thread and takes about 1 millisecond per picture. I think that from "outside" the NSView class it is not possible to write the picture, but my updateLayer func is inside the class.
Here is what the func actually looks like:
override func updateLayer ()
{
    let updateRect: NSRect = NSRect(x: 0.0, y: 0.0, width: 1120.0, height: 768.0)
    let context1 = self.view?.window?.graphicsContext?.cgContext
    let context2 = self.window?.graphicsContext?.cgContext
    let context3 = NSGraphicsContext.current?.cgContext
}
And all three contexts are nil at the time the function is called, which happens automatically after I set the needsDisplay flag.
Any ideas where to draw my CGImages?
The updateLayer func is called automatically by the user interface. I do not call it manually; it is called by the view. My problem is where inside this method to put my picture so it is shown on the screen. Perhaps I have to add a layer, or use the view's default layer, but I do not know how to do this.
Meanwhile I have found the solution with some tips from a good friend:
override var wantsUpdateLayer: Bool
{
    return (true)
}

override func updateLayer ()
{
    let cgimage: CGImage? = picture // Here comes the picture
    if cgimage != nil
    {
        let nsimage: NSImage? = NSImage(cgImage: cgimage!, size: NSZeroSize)
        if nsimage != nil
        {
            let desiredScaleFactor: CGFloat? = self.window?.backingScaleFactor
            if desiredScaleFactor != nil
            {
                let actualScaleFactor: CGFloat? = nsimage!.recommendedLayerContentsScale(desiredScaleFactor!)
                if actualScaleFactor != nil
                {
                    self.layer!.contents = nsimage!.layerContents(forContentsScale: actualScaleFactor!)
                    self.layer!.contentsScale = actualScaleFactor!
                }
            }
        }
    }
}
This is the way to write directly into the layer. Depending on the format (CGImage or NSImage) you first must convert it. As soon as the func wantsUpdateLayer returns true, the func updateLayer() is used instead of the func draw(). That's all.
For all who want to see my "Normal" draw function:
override func draw (_ updateRect: NSRect)
{
    let cgimage: CGImage? = picture // Here comes the picture
    if cgimage != nil
    {
        if #available(macOS 10.10, *)
        {
            NSGraphicsContext.current?.cgContext.draw(cgimage!, in: updateRect)
        }
    }
    else
    {
        super.draw(updateRect)
    }
}
The speedup is 2 times or more, depending on your hardware. On a modern Mac Pro there is only a little more speed, but on a modern MacBook Pro you will get 10 times or more. This works with Mojave 10.14.6 and Catalina 10.15.6; I did not test it with older macOS versions. The "normal" draw function works from 10.10.6 to 10.15.6.

Can't hide share button in USDZ + QLPreviewController

I have a project that involves a few USDZ files for the augmented reality features embedded in the app. While this works great, and we're really happy with how it performs, the built-in share button of the QLPreviewController is something that we'd like to remove. Subclassing the object doesn't have any effect, and trying to hide the rightBarButtonItem on the controller returned in the delegate method still shows the button when a file is selected. The implementation of USDZ + QLPreviewController we're using is pretty basic. Is there a way around this issue?
func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
    return 1
}

func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    let url = Bundle.main.url(forResource: models[selectedObject], withExtension: "usdz")!
    controller.navigationItem.rightBarButtonItems = nil // <- no effect
    return url as QLPreviewItem
}

@IBAction func userDidSelectARExperience(_ sender: Any) {
    let previewController = QLPreviewController()
    previewController.dataSource = self
    previewController.delegate = self
    present(previewController, animated: true)
}
This is the official answer from Apple.
Use ARQuickLookPreviewItem instead of QLPreviewItem. And set its canonicalWebPageURL to a URL (can be any URL).
func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    guard let path = Bundle.main.path(forResource: "Experience", ofType: "usdz") else {
        fatalError("Couldn't find the supported input file.")
    }
    let url = URL(fileURLWithPath: path)
    if #available(iOS 13.0, *) {
        let item = ARQuickLookPreviewItem(fileAt: url)
        item.canonicalWebPageURL = URL(string: "http://www.google.com")
        return item
    }
    return url as QLPreviewItem
}
The version check is optional.
My approach is to add the QLPreviewController's view as a subview.
container is a UIView in the storyboard.
let preview = QLPreviewController()
preview.dataSource = self
addChild(preview)
preview.view.frame = CGRect(origin: CGPoint(x: 0, y: -45), size: CGSize(width: container.frame.size.width, height: container.frame.size.height + 45))
container.addSubview(preview.view)
preview.didMove(toParent: self)
The y offset of the frame's origin and size may vary. This ensures the AR QuickLook view is the same size as the UIView and hides the buttons (unfortunately, all of them) at the same time.
Instead of returning QLPreviewItem, use ARQuickLookPreviewItem which conforms to this protocol.
https://developer.apple.com/documentation/arkit/arquicklookpreviewitem
Then, assign a URL that you want to share (it will appear in the share sheet) to the canonicalWebPageURL property. By default, this property shares the file URL (in this case, the USDZ file URL); setting it means you don't expose your file URL(s).
TLDR: I don't think you can.
I haven't seen any of the WWDC sessions even mention this, and I can't seem to find any supporting developer documentation. I'm pretty sure the point of the ARKit QLPreviewController is so you don't have to do any actual coding on the AR side. I can see the appeal of this and of customisation in general; however, I'd suggest looking at some of the other ARKit projects that Apple has released and attempting to re-create those from the ground up, as opposed to stripping this apart.
Please advise if this changes as I'd like to do something similar, especially within Safari.
I couldn't get to the share button at all to hide or disable it, and spent days trying to overcome this. I ended up with a rather unprofessional way of working around it: add the QLPreviewController as a child of a view controller, then add a button or image view on top of the share button with my company logo as its image. It stays there all the time, even when the top bar hides in full-screen AR mode. Not a clean solution, but it works. A rough sketch of this overlay approach is shown below.
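A minimal sketch of that workaround (the frame values and asset name are assumptions and would need tuning for the actual share-button position):
// Embed the preview controller and cover the share button with a logo.
let preview = QLPreviewController()
preview.dataSource = self
addChild(preview)
preview.view.frame = view.bounds
view.addSubview(preview.view)
preview.didMove(toParent: self)

// Overlay an image view roughly where the share button sits (assumed frame).
let logo = UIImageView(image: UIImage(named: "CompanyLogo")) // hypothetical asset name
logo.frame = CGRect(x: view.bounds.width - 60, y: 40, width: 44, height: 44)
logo.autoresizingMask = [.flexibleLeftMargin, .flexibleBottomMargin]
preview.view.addSubview(logo)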

NSWindow transition animation for View Controller Segues

I've been trying to figure out how to get an NSWindow to perform a transition animation with Swift 3. I found a few examples in Objective-C, but I haven't been able to tease out the relevant details, translate them into the target language and newer SDK, and get the animation applied to the right object. This one is pretty flipping cool, but it's about 8 years old: http://www.theregister.co.uk/2009/08/21/cocoa_graphics_framework/ -- I would imagine there's a better way to do the CGSCube effect now in macOS Sierra with Swift.
Here's what I have so far:
class ViewController: NSViewController {
    func doAnimation() {
        if let layer = view.layer {
            layer.backgroundColor = CGColor.black
            let rotateAnimation = CABasicAnimation(keyPath: "transform.rotation")
            rotateAnimation.fromValue = 0.0
            rotateAnimation.toValue = CGFloat(CGFloat.pi * 2.0)
            rotateAnimation.duration = 10.0
            layer.add(rotateAnimation, forKey: nil)
        }
    }

    override func viewWillAppear() {
        if let window = self.view.window {
            window.styleMask = NSWindowStyleMask.borderless
            window.backingType = NSBackingStoreType.buffered
            window.backgroundColor = NSColor.clear
            window.isOpaque = false
            window.hasShadow = false
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        doAnimation()
    }
}
This doesn't really do the trick at all. I get a white background on my window instead of a transparent background, and my view rolls across the frame instead of the window frame itself being animated.
Ideally, I would like to do something more advanced like these 3d transitions -- https://cocoapods.org/pods/MKCubeController & https://www.appcoda.com/custom-view-controller-transitions-tutorial/ & https://github.com/andresbrun/ABCustomUINavigationController#cube but I'm not quite sure how to translate the examples from the iOS SDK over to the macOS SDK without UIKit. (Annoyingly, I remember pulling this off a few years back in ObjC, but the project was lost somewhere between formats / new computers.)
How can I apply a transform to the NSWindow itself while segueing between view controllers? Any tips toward adding some 3D to this effect would be appreciated. I'm hoping there's maybe a CocoaPod that gets me halfway there.