How can you turn a macOS SwiftUI view into an image?

I know that you can turn an iOS SwiftUI view into an image by following these steps, but that approach relies on UIKit APIs in the SwiftUI extension, which aren't available on macOS.
Does anyone know how to snapshot/screenshot a macOS SwiftUI View?

On macOS the same approach can be used: NSHostingController is the analog of UIHostingController. Note that for the view to actually be drawn, it has to be added to some window:
import AppKit
import SwiftUI

extension View {
    func snapshot() -> NSImage? {
        let controller = NSHostingController(rootView: self)
        let targetSize = controller.view.intrinsicContentSize
        let contentRect = NSRect(origin: .zero, size: targetSize)

        // The view must live in a window to be drawn; an offscreen
        // borderless window is enough.
        let window = NSWindow(
            contentRect: contentRect,
            styleMask: [.borderless],
            backing: .buffered,
            defer: false
        )
        window.contentView = controller.view

        guard
            let bitmapRep = controller.view.bitmapImageRepForCachingDisplay(in: contentRect)
        else { return nil }

        controller.view.cacheDisplay(in: contentRect, to: bitmapRep)
        let image = NSImage(size: bitmapRep.size)
        image.addRepresentation(bitmapRep)
        return image
    }
}
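A minimal usage sketch (MyBadge is a hypothetical view used only for illustration):

struct MyBadge: View {
    var body: some View {
        Text("Hello")
            .padding()
            .background(Color.yellow)
    }
}

if let image = MyBadge().snapshot() {
    print("Captured image of size \(image.size)")
}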

Related

White strip on iPad when rendering a view to an image

I am trying to render a black rectangle to an image and save it to the photo library. But every time I render it on my iPad, the picture has a white strip at the top; that doesn't happen when I do this on the iPhone.
I am using Swift Playgrounds 4, so maybe that's the reason. It's a bit strange, since both views, the small one and the bigger one, are "iPads".
Thank you for your help!
That’s my code so far:
import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Snapshot") {
                // Save Screenshot
                let image = snapshotView.snapshot()
                UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
            }
        }
    }

    var snapshotView: some View {
        VStack {
            Rectangle()
                .frame(width: 200, height: 200)
        }
    }
}
extension View {
    func snapshot() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let view = controller.view
        let targetSize = controller.view.intrinsicContentSize
        view?.bounds = CGRect(origin: .zero, size: targetSize)
        view?.backgroundColor = .clear

        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            view?.drawHierarchy(in: controller.view.bounds, afterScreenUpdates: true)
        }
    }
}
[Image of the rectangle]

Saving a SwiftUI View as an image to the photo album

I'm using an extension to View that I found on hackingwithswift.com:
extension View {
    func snapshot() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let view = controller.view
        let targetSize = controller.view.intrinsicContentSize
        view?.bounds = CGRect(origin: .zero, size: targetSize)
        view?.backgroundColor = .clear

        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            view?.drawHierarchy(in: controller.view.bounds, afterScreenUpdates: true)
        }
    }
}
I'm using it in the following way:
I have an object of type Canvas, which contains some drawing, and I also added a border to make it more visible. Then I'm saving it to the photo album, but the final photo is out of position relative to the original. I'm attaching a screenshot of my view and the final photo.
let image = canvas.snapshot()
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
Add .edgesIgnoringSafeArea(.all) to the rootView parameter of the UIHostingController initializer:
let controller = UIHostingController(rootView: self.edgesIgnoringSafeArea(.all))
That will keep the screenshot from being clipped.
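Applied to the hackingwithswift.com extension above, the fix is a one-line change (a sketch; everything except the rootView line is identical to the original):

extension View {
    func snapshot() -> UIImage {
        // Ignoring the safe area prevents the rendered image from being
        // offset or clipped by the device's insets.
        let controller = UIHostingController(rootView: self.edgesIgnoringSafeArea(.all))
        let view = controller.view
        let targetSize = controller.view.intrinsicContentSize
        view?.bounds = CGRect(origin: .zero, size: targetSize)
        view?.backgroundColor = .clear

        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            view?.drawHierarchy(in: controller.view.bounds, afterScreenUpdates: true)
        }
    }
}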

Opening an NSWindow / NSView on a connected display

Environment: macOS Big Sur 11.5.2 / Swift 5.4
Target Platform: macOS
I came across this discussion while searching for how to open a new NSWindow / NSView on a connected display.
I converted the Objective-C code to Swift, but I cannot get it to work. I have the method set up inside an IBAction, linked to a button I placed in the view, titled "Show External Display."
What am I doing wrong?
import Cocoa

class ViewController: NSViewController {
    var externalDisplay: NSScreen?
    var fullScreenWindow: NSWindow?
    var fullScreenView: NSView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }

    override var representedObject: Any? {
        didSet {
            // Update the view, if already loaded.
        }
    }

    @IBAction func showExternalDisplayButtonClicked(sender: Any?) {
        // Open a window on the external display if present
        if NSScreen.screens.count > 1 {
            externalDisplay = NSScreen.screens.last
            let externalDisplayRect = externalDisplay!.frame

            fullScreenWindow = NSWindow(
                contentRect: externalDisplayRect,
                styleMask: .borderless,
                backing: .buffered,
                defer: true,
                screen: externalDisplay
            )
            fullScreenWindow!.level = .normal
            fullScreenWindow!.isOpaque = false
            fullScreenWindow!.hidesOnDeactivate = false
            fullScreenWindow!.backgroundColor = .red

            let viewRect = NSRect(
                x: 0, y: 0,
                width: externalDisplay!.frame.width,
                height: externalDisplay!.frame.height
            )
            fullScreenView = NSView(frame: viewRect)
            fullScreenWindow!.contentView = fullScreenView
            fullScreenWindow!.makeKeyAndOrderFront(self)
        }
    }
}
Why do you use styleMask: .borderless? The docs say a borderless window can't become key or main, so makeKeyAndOrderFront fails.
If you make a normal window like this:
let mask: NSWindow.StyleMask = [.titled, .closable, .miniaturizable, .resizable]
fullScreenWindow = NSWindow(contentRect: externalDisplayRect, styleMask: mask, backing: .buffered, defer: true, screen: externalDisplay)
it opens on the last screen.
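If you do need the window to stay borderless and still become key, a common workaround (my sketch, not part of the original answer) is to override canBecomeKey in an NSWindow subclass:

class BorderlessKeyWindow: NSWindow {
    // NSWindow returns false here for borderless windows, which is why
    // makeKeyAndOrderFront(_:) appears to do nothing with them.
    override var canBecomeKey: Bool { true }
}

Then create fullScreenWindow as a BorderlessKeyWindow instead of an NSWindow. Alternatively, just call orderFront(_:) instead of makeKeyAndOrderFront(_:) if the window doesn't need keyboard focus.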

SwiftUI View to NSImage in AppKit

SwiftUI 2.0 | Swift 5.4 | Xcode 12.4 | macOS Big Sur 11.4
On iOS there is a way to render SwiftUI views to a UIImage, but it relies on UIKit and doesn't work in AppKit:
extension View {
    func snapshot() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let view = controller.view
        let targetSize = controller.view.intrinsicContentSize
        view?.bounds = CGRect(origin: .zero, size: targetSize)
        view?.backgroundColor = .clear

        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            view?.drawHierarchy(in: controller.view.bounds, afterScreenUpdates: true)
        }
    }
}
Is there any workaround to get that working in AppKit? UIGraphicsImageRenderer doesn't exist in AppKit and there is no direct equivalent.
If I'm understanding your question correctly, you should be able to use an NSBitmapImageRep to do the trick!
Here's a simple example of generating an NSImage from a SwiftUI view (note: your host view must be visible on screen!):
let view = MyView() // some SwiftUI view
let host = NSHostingView(rootView: view)
// Note: the host must be visible on screen, which I am guessing you already have it that way. If not, do that here.
let bitmapRep = host.bitmapImageRepForCachingDisplay(in: host.bounds)
host.cacheDisplay(in: host.bounds, to: bitmapRep!)
let image = NSImage(size: host.frame.size)
image.addRepresentation(bitmapRep!)
Here's an extension to NSHostingView that should safely make snapshots:
extension NSHostingView {
    func snapshot() -> NSImage? {
        // Make sure the view is in a window, i.e. it can be drawn:
        guard self.window != nil else { return nil }

        // Get bitmap data:
        guard let bitmapRep = self.bitmapImageRepForCachingDisplay(in: self.bounds) else {
            return nil
        }
        self.cacheDisplay(in: self.bounds, to: bitmapRep)

        // Create an NSImage from the NSBitmapImageRep:
        let image = NSImage(size: self.frame.size)
        image.addRepresentation(bitmapRep)
        return image
    }
}
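A usage sketch, assuming a hypothetical MyView and a hosting view already installed in a window (the output path is just an example):

let host = NSHostingView(rootView: MyView())
window.contentView = host  // the guard above requires window membership

if let image = host.snapshot(),
   let tiff = image.tiffRepresentation,
   let rep = NSBitmapImageRep(data: tiff),
   let png = rep.representation(using: .png, properties: [:]) {
    try? png.write(to: URL(fileURLWithPath: "/tmp/snapshot.png"))
}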
I hope this helps. Let me know if you have any questions!

How to convert a SwiftUI View to an NSImage

I am trying to use a SwiftUI view as an NSCursor on macOS.
Using SwiftUI I am constructing a view that I then convert to an NSView using NSHostingView. Now I am trying to convert that to an NSImage via an NSBitmapImageRep. For some reason I always get an empty image when I inspect the variables at breakpoints.
(For now the setup of the cursor is done in the AppDelegate because I am currently just trying to get this to work.)
class AppDelegate: NSObject, NSApplicationDelegate {
    var window: NSWindow!

    func applicationDidFinishLaunching(_ aNotification: Notification) {
        // Create the SwiftUI view that provides the window contents.
        let contentView = ContentView()
        let contentRect = NSRect(x: 0, y: 0, width: 480, height: 480)

        // Create the window and set the content view.
        window = NSWindow(
            contentRect: contentRect,
            styleMask: [.titled, .closable, .miniaturizable, .resizable, .fullSizeContentView],
            backing: .buffered, defer: false)
        window.center()
        window.setFrameAutosaveName("Main Window")
        window.contentView = NSHostingView(rootView: contentView)
        window.makeKeyAndOrderFront(nil)

        let myNSView = NSHostingView(rootView: ContentView()).bitmapImageRepForCachingDisplay(in: contentRect)!
        NSHostingView(rootView: ContentView()).cacheDisplay(in: contentRect, to: myNSView)
        let myNSImage = NSImage(size: myNSView.size)
        myNSImage.addRepresentation(myNSView)

        self.window.disableCursorRects()

        // var myName = NSImage.Name("ColorRing")
        // var myCursorImg = NSImage(named: myName)!
        //
        // var myCursor = NSCursor(image: myCursorImg, hotSpot: NSPoint(x: 10, y: 10))
        var myCursor = NSCursor(image: myNSImage, hotSpot: NSPoint(x: 0, y: 0))
        myCursor.set()
    }

    func applicationWillTerminate(_ aNotification: Notification) {
        // Insert code here to tear down your application
    }
}
When executing the part that is commented out, I get the mouse cursor from an external PNG file.
When executing the code shown above, I always get an empty mouse cursor with the size 480x480.
In debug mode I can see that myNSView is an empty image.
I am pretty sure I misunderstand the documentation and therefore misuse cacheDisplay(in:to:) and bitmapImageRepForCachingDisplay(in:).
The documentation tells me to pass the NSBitmapImageRep from bitmapImageRepForCachingDisplay(in:) to cacheDisplay(in:to:), but for some reason I simply cannot get this to work.
Does anyone have an idea what I am doing wrong?
To answer my own question:
It seems like the view that one wants to convert to an NSImage needs to be laid out inside a window. Otherwise it apparently doesn't know how to size and render the view inside the rect.
The following code works for me:
func applicationDidFinishLaunching(_ aNotification: Notification) {
    // Create the SwiftUI view that provides the window contents.
    let contentView = ContentView()
    let contentRect = NSRect(x: 0, y: 0, width: 480, height: 480)

    // Create the window and set the content view.
    window = NSWindow(
        contentRect: contentRect,
        styleMask: [.titled, .closable, .miniaturizable, .resizable, .fullSizeContentView],
        backing: .buffered, defer: false)
    window.center()
    window.setFrameAutosaveName("Main Window")
    window.contentView = NSHostingView(rootView: contentView)
    window.makeKeyAndOrderFront(nil)

    // CHANGED ----------
    // Embed a second hosting view in a window so the view gets laid
    // out before its display is cached.
    let newWindow = NSWindow(
        contentRect: contentRect,
        styleMask: [.titled, .closable, .miniaturizable, .resizable, .fullSizeContentView],
        backing: .buffered, defer: false)
    newWindow.contentView = NSHostingView(rootView: contentView)

    let myNSBitMapRep = newWindow.contentView!.bitmapImageRepForCachingDisplay(in: contentRect)!
    newWindow.contentView!.cacheDisplay(in: contentRect, to: myNSBitMapRep)
    // CHANGED ----------

    let myNSImage = NSImage(size: myNSBitMapRep.size)
    myNSImage.addRepresentation(myNSBitMapRep)

    self.window.disableCursorRects()
    let myCursor = NSCursor(image: myNSImage, hotSpot: NSPoint(x: 0, y: 0))
    myCursor.set()
}
Only the lines between the two // CHANGED ---------- comments changed for the working solution. I am simply embedding the view inside a window before applying the same methods as before.
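If you need this in more than one place, the window-embedding trick can be wrapped into a small helper (a sketch, assuming the caller supplies the target rect; the View.snapshot() extension at the top of this page does essentially the same thing):

extension NSView {
    func renderedImage(in rect: NSRect) -> NSImage? {
        // Park the view in a throwaway borderless window so it gets
        // laid out before we cache its display.
        let window = NSWindow(
            contentRect: rect,
            styleMask: [.borderless],
            backing: .buffered,
            defer: false)
        window.contentView = self
        guard let rep = bitmapImageRepForCachingDisplay(in: rect) else { return nil }
        cacheDisplay(in: rect, to: rep)
        let image = NSImage(size: rep.size)
        image.addRepresentation(rep)
        return image
    }
}

// Usage (hypothetical):
// let image = NSHostingView(rootView: ContentView()).renderedImage(in: contentRect)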
Hope this helps some people in the future.