macOS - Programmatically display an image using NSImageView - swift

Using a Swift app targeting the macOS platform, I am trying to programmatically create an NSImageView object, assign an image to it, and then display it, using:
let imageViewObject = NSImageView()
let img = NSImage(cgImage: winCGImage, size: NSZeroSize)
imageViewObject.image = img
self.window.contentView?.addSubview(imageViewObject)
However, no image is displayed when I run this.
On the debugger, I can see that imageViewObject.image (i.e. img) has a valid image, and I can view it using the eye icon.
What am I missing?

You are creating an image view with a zero frame. Create the image first, then create the view, passing it the proper size.
let img = NSImage(cgImage: winCGImage, size: NSZeroSize)
let imageViewObject = NSImageView(frame: NSRect(origin: .zero, size: img.size))
imageViewObject.image = img
Avoid creating views with the plain default initializer NSImageView(); without an explicit frame (or Auto Layout constraints) the view has zero size and nothing is drawn.
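Putting it together, a minimal sketch, assuming winCGImage is your existing CGImage and self.window has a content view:
let img = NSImage(cgImage: winCGImage, size: NSZeroSize) // NSZeroSize lets NSImage take its size from the CGImage
let imageViewObject = NSImageView(frame: NSRect(origin: .zero, size: img.size)) // non-zero frame so the view actually shows up
imageViewObject.image = img
self.window.contentView?.addSubview(imageViewObject)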

Related

Restrict size of NSTextAttachment image

I am trying to insert an image in UITextView. I have used the following code.
extension UITextView {
    func add(image: UIImage) {
        let attachment = NSTextAttachment()
        attachment.image = image
        attachment.bounds = CGRect(x: 0, y: 0, width: 40, height: 40)
        let attString = NSAttributedString(attachment: attachment)
        self.attributedText = attString
    }
}
The parent UIViewController calls add(image: UIImage).
In func textViewDidChange(_ textView: UITextView), I save the attributedText in Core Data as a Transformable NSAttributedString. I use
NSAttributedStringTransformer as the transformer.
The image's size is 40x40 when added. The image also keeps that size when I dismiss the parent UIViewController and present it again. However, if I quit the app and relaunch it, the image is no longer 40x40; it is larger than the screen.
How can I keep the image at 40x40 even after quitting the app?
By setting the attachment bounds you are not resizing the actual image, only its display bounds. It looks like the attributed string transformer doesn't serialize the bounds you set. You will have to either resize the image itself, or extend the transformer to re-apply the bounds after deserializing from data.
Edit: I see that NSAttributedStringTransformer is not an Apple-provided transformer. So take a look at the source code and see why the bounds are not serialized properly.
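For example, a minimal sketch of both options; the 40x40 target comes from the question, and the helper names here are made up for illustration:
// Option 1: resize the UIImage itself before attaching it, so the stored data already has the right size
func resized(_ image: UIImage, to size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}
// attachment.image = resized(image, to: CGSize(width: 40, height: 40))

// Option 2: after loading the attributed string back from Core Data, re-apply the display bounds to every attachment
func applyAttachmentBounds(to attributed: NSAttributedString) -> NSAttributedString {
    let mutable = NSMutableAttributedString(attributedString: attributed)
    mutable.enumerateAttribute(.attachment, in: NSRange(location: 0, length: mutable.length), options: []) { value, _, _ in
        (value as? NSTextAttachment)?.bounds = CGRect(x: 0, y: 0, width: 40, height: 40)
    }
    return mutable
}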

How do I create and print an offscreen SwiftUI view on Mac not iPhone

For over a week I have been trying to solve a printing issue in a macOS app I am writing using Swift and SwiftUI.
I need to print a view, which I have done by making it the main window, but that does not work for the user. The data can come from a file or directly from user input, so there could be anywhere from 1 to 100 data sets.
Because I am printing the main window, the application is unusable during the print process, which is not acceptable.
The ideal solution would be to create the view off-screen and then print it. That way the user never sees it other than what comes out of the printer!
I have tried to find out how to print a view that is not the main window, with no success. I also tried creating a second window; I managed to create the window but not to print it.
There is no point in posting code, as I have no idea which of the several approaches I have tried could work; I am not even sure at this point whether what I am trying to do is possible!
Please note this is on Mac, not iPhone or iPad!
This worked for me, borrowed from the link in the comment from @Willeke. It involves rendering the SwiftUI view into a bitmap for printing.
let printInfo = NSPrintInfo()
let view = ContentView()
let contentRect = NSRect(x: 0, y: 0, width: 1080, height: 720) // these values will vary

// Host the SwiftUI view in an NSHostingView so AppKit can render it off-screen
let viewToPrint = NSHostingView(rootView: view)
viewToPrint.frame = contentRect

// Render the hosting view into a bitmap and wrap it in an NSImage
let bitMap = viewToPrint.bitmapImageRepForCachingDisplay(in: contentRect)!
viewToPrint.cacheDisplay(in: contentRect, to: bitMap)
let image = NSImage(size: bitMap.size)
image.addRepresentation(bitMap)

// Print an NSImageView showing the rendered image
let imageView = NSImageView(frame: contentRect)
imageView.image = image
let printOperation = NSPrintOperation(view: imageView, printInfo: printInfo) // use the local printInfo created above
printOperation.showsPrintPanel = true
printOperation.showsProgressPanel = true
printOperation.run()
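If the size of the SwiftUI content is not known up front, one option (a sketch, not part of the original answer) is to size the hosting view from its own fitting size instead of a hard-coded rect:
let viewToPrint = NSHostingView(rootView: ContentView())
viewToPrint.frame = NSRect(origin: .zero, size: viewToPrint.fittingSize) // let the SwiftUI content choose its own size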

Programmatically place partial image over another in UIView using Swift 3

I just started working with Swift last week, and I need a suggestion on whether the following approach is the right way to lay an image partially on top of another image.
I have a UIView in which I am creating 3 images programmatically: a left arrow image, a middle mobile image, and a right arrow image, as shown below. Can I place the arrow images so that they overlap the mobile image by 50%?
I have tried:
func setupUI() {
    let mobileImage = UIImage(named: "mobile")
    let arrowImage = UIImage(named: "arrow")

    // The middle image is offset by half the arrow width so the arrows can overlap it
    middleView = UIImageView(frame: CGRect(x: arrowImage!.size.width / 2, y: 0,
                                           width: mobileImage!.size.width, height: mobileImage!.size.height))
    middleView.image = mobileImage
    middleView.layer.borderWidth = 1.0
    middleView.layer.cornerRadius = 5.0
    self.addSubview(middleView)

    let yForArrow = mobileImage!.size.height - arrowImage!.size.height
    leftArrow = UIImageView(frame: CGRect(x: 0, y: yForArrow,
                                          width: arrowImage!.size.width, height: arrowImage!.size.height))
    leftArrow.image = arrowImage
    self.addSubview(leftArrow)

    let rightArrowX = mobileImage!.size.width
    rightView = UIImageView(frame: CGRect(x: rightArrowX, y: yForArrow,
                                          width: arrowImage!.size.width, height: arrowImage!.size.height))
    rightView.image = arrowImage
    self.addSubview(rightView)
}
*At the start it was not working, as I forgot to call setupUI() in the init method, as shown in the answer below.
Is setting the frame the correct way of doing it, or should I be using constraints?
To me it looks like a bad approach, as I am hard-coding the numbers in CGRect.
*This image was created in MS Paint to show what it should look like on an iPhone.
I found the problem: I missed calling setupUI() in the init method.
setupUI() programmatically adds the images to the UIView. As the call was missing, no image was appearing in the iPhone simulator.
override init(frame: CGRect) {
    super.init(frame: frame)
    setupUI() // Code to add images in UIView
}
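If this view can also be loaded from a storyboard or XIB, the coder initializer needs the same call; a minimal sketch:
required init?(coder aDecoder: NSCoder) {
    super.init(coder: aDecoder)
    setupUI() // also add the images when the view comes from a storyboard/XIB
}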

NSView dataWithPDF inside rect but with background view

I want to take a screenshot of an NSView, but when I do this, I get an image without the background. All the subviews are shown, but not the background. I use this code:
if let image = NSImage(data: background.dataWithPDF(inside: background.bounds)) {
    let imageView = NSImageView(image: image)
    imageView.imageScaling = .scaleProportionallyUpOrDown
    return imageView
}
I thought, OK, if I only get the subviews, then I will take a screenshot of the superview, so I tried the following:
if let superview = background.superview, let image = NSImage(data: superview.dataWithPDF(inside: background.frame)) {
    let imageView = NSImageView(image: image)
    imageView.imageScaling = .scaleProportionallyUpOrDown
    return imageView
}
But I get the same result. Even if I set the background color of my background view, I still get an image with a transparent background.
How can I resolve this?
Thank you,
Artur
I found an answer:
background.lockFocus()
defer { background.unlockFocus() } // make sure focus is unlocked even when returning early
if let rep = NSBitmapImageRep(focusedViewRect: background.bounds) {
    let img = NSImage(size: background.bounds.size)
    img.addRepresentation(rep)
    return NSImageView(image: img)
}
You could also have converted the view's backing layer.backgroundColor to an image, assigned that image to a backing image view sitting at index 0 of the view's subview hierarchy, and then used dataWithPDF; that will successfully capture a background.
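A minimal sketch of that idea, assuming background is layer-backed with a backgroundColor set; the helper name is made up for illustration:
// Hypothetical helper: renders a solid-color image the size of the view
func solidColorImage(_ cgColor: CGColor, size: NSSize) -> NSImage {
    let image = NSImage(size: size)
    image.lockFocus()
    NSColor(cgColor: cgColor)?.setFill()
    NSRect(origin: .zero, size: size).fill()
    image.unlockFocus()
    return image
}
if let bgColor = background.layer?.backgroundColor {
    let backingImageView = NSImageView(frame: background.bounds)
    backingImageView.image = solidColorImage(bgColor, size: background.bounds.size)
    background.addSubview(backingImageView, positioned: .below, relativeTo: nil) // keep it behind every other subview
}
// dataWithPDF(inside:) will now include the backing image view as the background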

Load app icon from xcassets on OS X

I can't load another app icon into an NSImage. I tried using both the asset name and the name of the particular file, but the image is always nil.
let image = NSImage(named: "Alerted")
//let alertedDog = NSImage(named: "Alerted128x128.png")
image?.size = NSSize(width: 128, height: 128)
NSApp.applicationIconImage = image
NSApp.dockTile.display()
Any thoughts?
In the packaged app bundle I see only the standard AppIcon.icns; there is no Alerted.icns...
There are two types of icon assets. The additional one should be added as a generic image set (not as an app icon set), so it is compiled into the asset catalog and NSImage(named:) can load it.
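A minimal sketch, assuming "Alerted" has been added to Assets.xcassets as a generic image set:
// Loads the image set from the asset catalog and swaps the Dock icon
if let image = NSImage(named: "Alerted") {
    image.size = NSSize(width: 128, height: 128)
    NSApp.applicationIconImage = image
    NSApp.dockTile.display()
}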