How to implement SwiftUI’s .onDrag modifier with NSImage (macOS) - swift

I am building a sandboxed app for macOS with SwiftUI. I have an NSImage that is displayed as Image(nsImage: myNSImage). I want that View to support drag and drop, meaning that it can be dragged to any location that can receive image files.
Here’s my approach:
Image(nsImage: myNSImage)
    .onDrag({
        let itemProvider = NSItemProvider()
        itemProvider.suggestedName = "image.png"
        itemProvider.registerDataRepresentation(for: UTType.png) { loadHandler in
            loadHandler(nsImage.pngRepresentation, nil)
            print("loadHandler completed") // Never prints !!
            return nil
        }
        return itemProvider
    })
This way I can drag the View. But it seems like it isn’t able to provide the image.
The “drop”, i.e. on the Desktop, is simply not working. I would expect that it saves an "image.png" in the destination URL, but it doesn’t.
How can I implement .onDrag so that it provides an image file based on the NSImage?
Edit: I have already tried different UTTypes, for example .tiff in combination with nsImage.tiffRepresentation, without luck.
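One direction that may help (a sketch, not a verified fix): NSImage has no pngRepresentation property, so the PNG data has to be produced via NSBitmapImageRep, and registering a file representation gives the drop target (Finder/Desktop) an actual file URL to copy. The helper function and the temporary file name below are assumptions.

import SwiftUI
import UniformTypeIdentifiers

// Hypothetical helper: convert an NSImage to PNG data via NSBitmapImageRep.
func pngData(from image: NSImage) -> Data? {
    guard let tiff = image.tiffRepresentation,
          let rep = NSBitmapImageRep(data: tiff) else { return nil }
    return rep.representation(using: .png, properties: [:])
}

Image(nsImage: myNSImage)
    .onDrag {
        let provider = NSItemProvider()
        provider.suggestedName = "image.png"
        provider.registerFileRepresentation(forTypeIdentifier: UTType.png.identifier,
                                            fileOptions: [],
                                            visibility: .all) { completion in
            // Write a temporary PNG file and hand its URL to the drop target.
            let url = FileManager.default.temporaryDirectory.appendingPathComponent("image.png")
            do {
                try pngData(from: myNSImage)?.write(to: url)
                completion(url, false, nil) // false: no file coordination needed
            } catch {
                completion(nil, false, error)
            }
            return nil
        }
        return provider
    }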

Related

How to display image from known image path - SwiftUI

I've got an array of image URLs (all from storage, not internet). How can I show the images themselves?
Simplified code:
@Binding var files: [URL] // Array of image URLs taken from an NSOpenPanel instance

Form {
    if files.count > 0 {
        Image(files[0].path) // Problem
    }
}
You say NSOpenPanel, so I'm going to make the assumption here that we don't need to worry about waiting to load the image over a network:
if let nsImage = NSImage(contentsOf: url) {
    Image(nsImage: nsImage)
}
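For example, dropped into the questioner's simplified Form (a sketch; it assumes the first URL in files points at a readable image file on disk):

Form {
    if let url = files.first, let nsImage = NSImage(contentsOf: url) {
        Image(nsImage: nsImage)
            .resizable()
            .scaledToFit()
    }
}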

SwiftUI - Saving Image to Share Sheet causes image to save blurry/low res

I have a bit of code in my app that generates a QR Code and scales it up (code reference I used from this link from Hacking with Swift). Now, I'm using the share sheet to allow the user to save the QR code to their camera roll and it is working, but it saves the image low res, so it appears in the camera roll blurry (and I assume if it is shared via other methods it will also be blurry).
Here is the code of my share sheet function:
struct ActivityView: UIViewControllerRepresentable {
    let activityItems: [Any]
    let applicationActivities: [UIActivity]?

    func makeUIViewController(context: UIViewControllerRepresentableContext<ActivityView>) -> UIActivityViewController {
        return UIActivityViewController(activityItems: activityItems, applicationActivities: applicationActivities)
    }

    func updateUIViewController(_ uiViewController: UIActivityViewController, context: UIViewControllerRepresentableContext<ActivityView>) {
    }
}
and here's the code in my view struct:
.sheet(isPresented: $showShareSheet) {
    ShareSheet(activityItems: [self.qrCodeImage])
}
Is there a trick to remove the interpolation on the image when it saves to the share sheet like the .interpolation(.none) on the image view itself?
Your problem is that the QR code image is actually tiny! Like really tiny:
Printing description of image:
<UIImage:0x60000202cc60 anonymous {23, 23}>
When you share this image, the way it will be displayed is dependent on the program or app that displays it, and is out of your app's control as far as I know.
However, there is a way that you could potentially make it "pretty" in other apps: increase the resolution to a larger amount so that when it's rendered it'll appear to have "sharp" pixels.
How would this be accomplished? I think I have an example buried somewhere in old code, I'll dig into it and see if I can find you an example ;)
Edit
I found the code:
extension UIImage {
    func resized(toWidth width: CGFloat) -> UIImage? {
        let canvasSize = CGSize(width: round(width), height: CGFloat(ceil(width / size.width * size.height)))
        UIGraphicsBeginImageContextWithOptions(canvasSize, false, scale)
        defer { UIGraphicsEndImageContext() }
        let context = UIGraphicsGetCurrentContext()
        context?.interpolationQuality = .none // Set the quality level to use when rescaling
        draw(in: CGRect(origin: .zero, size: canvasSize))
        let r = UIGraphicsGetImageFromCurrentImageContext()
        return r
    }
}
The trick is to provide a way to scale the image, but the real magic is on line 7:
context?.interpolationQuality = .none
If you exclude this line, you'll get blurry images, which is what the OS does by default because you don't generally want to see the pixel edges in images.
You could use this extension like so:
.sheet(isPresented: $showShareSheet) {
    ShareSheet(activityItems: [self.qrCodeImage.resized(toWidth: 512) ?? UIImage()])
}
However, this may be resizing the image way more often than necessary. Optimally you'd resize it in the same function that you generate it.
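For instance, if the QR image comes from the usual Core Image generator (as in the Hacking with Swift article the question references), the upscaling can happen right where the image is created. A sketch, assuming iOS 13+ and the resized(toWidth:) extension above:

import CoreImage.CIFilterBuiltins
import UIKit

func generateQRCode(from string: String, width: CGFloat = 512) -> UIImage {
    let context = CIContext()
    let filter = CIFilter.qrCodeGenerator()
    filter.message = Data(string.utf8)

    if let output = filter.outputImage,
       let cgImage = context.createCGImage(output, from: output.extent) {
        let tiny = UIImage(cgImage: cgImage)        // roughly 23x23 pixels for short strings
        return tiny.resized(toWidth: width) ?? tiny // upscale once, without interpolation
    }
    return UIImage()
}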

Can't hide share button in USDZ + QLPreviewController

I've got a project that involves a few USDZ files for the augmented reality features embedded in the app. While this works great, and we're really happy with how it performs, the built-in share button of the QLPreviewController is something we'd like to remove. Subclassing the object doesn't have any effect, and trying to hide the rightBarButtonItem on the controller returned in the delegate method still shows the button when a file is selected. The implementation of USDZ + QLPreviewController we're using is pretty basic. Is there a way around this issue?
func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
    return 1
}

func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    let url = Bundle.main.url(forResource: models[selectedObject], withExtension: "usdz")!
    controller.navigationItem.rightBarButtonItems = nil // <- no effect
    return url as QLPreviewItem
}

@IBAction func userDidSelectARExperience(_ sender: Any) {
    let previewController = QLPreviewController()
    previewController.dataSource = self
    previewController.delegate = self
    present(previewController, animated: true)
}
This is the official answer from Apple.
Use ARQuickLookPreviewItem instead of QLPreviewItem. And set its canonicalWebPageURL to a URL (can be any URL).
func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    guard let path = Bundle.main.path(forResource: "Experience", ofType: "usdz") else {
        fatalError("Couldn't find the supported input file.")
    }
    let url = URL(fileURLWithPath: path)
    if #available(iOS 13.0, *) {
        let item = ARQuickLookPreviewItem(fileAt: url)
        item.canonicalWebPageURL = URL(string: "http://www.google.com")
        return item
    }
    return url as QLPreviewItem
}
The version check is optional.
My approach is to add the QLPreviewController as a subview.
container is a UIView in the storyboard.
let preview = QLPreviewController()
preview.dataSource = self
addChild(preview) // add as a child view controller so didMove(toParent:) is balanced
preview.view.frame = CGRect(origin: CGPoint(x: 0, y: -45),
                            size: CGSize(width: container.frame.size.width, height: container.frame.size.height + 45))
container.addSubview(preview.view)
preview.didMove(toParent: self)
The y offset of the frame's origin and size may vary. This ensures the AR QuickLook view is the same size as the UIView and hides the buttons (unfortunately, all of them) at the same time.
Instead of returning QLPreviewItem, use ARQuickLookPreviewItem which conforms to this protocol.
https://developer.apple.com/documentation/arkit/arquicklookpreviewitem
Then, assign a URL that you would want to share (it will appear in the share sheet) to the canonicalWebPageURL property. By default (if you don't set it), the file URL is shared (in this case, the USDZ file URL). Setting it means you don't expose your file URL(s).
TLDR: I don't think you can.
I haven't seen any of the WWDC sessions even mention this, and I can't seem to find any supporting developer documentation. I'm pretty sure the point of the ARKit QLPreviewController is that you don't have to do any actual coding on the AR side. I can see the appeal of this and of customisation in general; however, I'd suggest instead looking at some of the other ARKit projects that Apple has released and attempting to re-create those from the ground up, as opposed to stripping this apart.
Please advise if this changes as I'd like to do something similar, especially within Safari.
I couldn't get to the share button at all to hide or disable it. I spent days trying to overcome this and ended up with a rather unprofessional workaround: subview the QLPreviewController into a ViewController, then subview a button or view, with my company logo set as its image, on top of the share button. It stays there all the time, even when the top bar hides in full-screen AR mode. Not a clean solution, but it works.

How to create apple watchOS5 complication?

I've never worked in watchOS 5 and want to develop a horizontal complication (Modular Large) for Apple Watch, like "Heart Rate". The idea is that I would display heart rate data in a different way. Right now I want to deploy the complication on a development watch.
I have created a new project with a checkbox for "complication" added. I see that this added a complications controller with timeline configuration placeholders.
There is also a storyboard with a bunch of empty screens. I'm not sure how much effort I need to put into an Apple Watch app before I can deploy it. I see this Apple doc, but it does not describe how to lay out my complication. Some sections seem to have missing links.
Can I provide one style of complication only (large horizontal - modular large)?
Do I need to provide any iPhone app content beyond managing the complication logic, or can I get away without having a view controller?
Do I control the appearance of my complication by adding something to the assets folder (it has a bunch of graphic slots)?
Sorry for a complete beginner question; I have not seen a project focusing specifically on the horizontal complication for watchOS 5.
You should be able to deploy it immediately, though it won't do anything. Have a look at the WWDC video explaining how to create a complication: video
You can't lay out the complication yourself; you choose from a set of templates that you fill with data. The screens you are seeing are for your watch app, not the complication.
You don't have to support all complication styles.
The complication logic is part of your WatchKit Extension, so technically you don't need anything in the iOS companion app; I'm not sure how much functionality you have to provide to get past app review, though.
Adding your graphics to the asset catalog won't do anything, you have to reference them when configuring the templates.
Here's an example by Apple of how to communicate with the Apple Watch app. You need to painstakingly read the readme about 25 times to get all the app group identifiers changed in that project.
Your main phone app assets are not visible to the watch app.
Your watch storyboard assets go in the WatchKit target.
Your programmatically accessed assets go into the watch extension target.
Original answers:
Can I provide one style of complication only (large horizontal - modular large)? YES.
Do I need to provide any iPhone app content beyond managing the complication logic, or can I get away without having a view controller? YES - watch apps have computation limits imposed on them.
Do I control the appearance of my complication by adding something to the assets folder (it has a bunch of graphic slots)? See below - it's both the assets folder and placeholders.
Modify the example above to create a placeholder image displayed on the watch (when you are selecting a complication while modifying the screen layout)
func getPlaceholderTemplate(for complication: CLKComplication, withHandler handler: @escaping (CLKComplicationTemplate?) -> Void) {
    // Pass the template to ClockKit.
    if complication.family == .graphicRectangular {
        // Display a random number string on the body.
        let template = CLKComplicationTemplateGraphicRectangularLargeImage()
        template.textProvider = CLKSimpleTextProvider(text: "---")
        let image = UIImage(named: "imageFromWatchExtensionAssets") ?? UIImage()
        template.imageProvider = CLKFullColorImageProvider(fullColorImage: image)
        // Pass the entry to ClockKit.
        handler(template)
    } else {
        handler(nil)
        return
    }
}
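To actually show live data (e.g. a heart-rate string) rather than the placeholder, the same template family can be returned from the current timeline entry callback. A sketch only; the "72 BPM" text and the asset name are stand-ins for your real data:

func getCurrentTimelineEntry(for complication: CLKComplication,
                             withHandler handler: @escaping (CLKComplicationTimelineEntry?) -> Void) {
    guard complication.family == .graphicRectangular else {
        handler(nil)
        return
    }
    let template = CLKComplicationTemplateGraphicRectangularLargeImage()
    template.textProvider = CLKSimpleTextProvider(text: "72 BPM") // replace with real heart-rate data
    let image = UIImage(named: "imageFromWatchExtensionAssets") ?? UIImage()
    template.imageProvider = CLKFullColorImageProvider(fullColorImage: image)
    handler(CLKComplicationTimelineEntry(date: Date(), complicationTemplate: template))
}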
Sending small packets to the watch (will not send images!):
func updateHeartRate(with sample: HKQuantitySample) {
    let context: [String: Any] = ["title": "String from phone"]
    do {
        try WCSession.default.updateApplicationContext(context)
    } catch {
        print("Failed to transmit app context")
    }
}
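The receiving side isn't shown above; on the watch, the context arrives through the WCSessionDelegate. A minimal sketch (the "title" key matches the dictionary sent above; the class name and everything else is an assumption):

import WatchConnectivity

class WatchSessionHandler: NSObject, WCSessionDelegate {
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) { }

    func session(_ session: WCSession, didReceiveApplicationContext applicationContext: [String: Any]) {
        // Pull out the value that updateHeartRate(with:) sent from the phone.
        if let title = applicationContext["title"] as? String {
            print("Received from phone: \(title)")
        }
    }
}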
Transferring images and files:
func uploadImage(_ image: UIImage, name: String, title: String = "") {
    let data: Data? = UIImagePNGRepresentation(image)
    do {
        let fileManager = FileManager.default
        let cachesDirectory = try fileManager.url(for: .cachesDirectory,
                                                  in: .userDomainMask,
                                                  appropriateFor: nil,
                                                  create: true)
        let fileURL = cachesDirectory.appendingPathComponent("\(name).png")
        if fileManager.fileExists(atPath: fileURL.path) {
            try fileManager.removeItem(at: fileURL)
        }
        try data?.write(to: fileURL, options: Data.WritingOptions.atomic)
        if WCSession.default.activationState != .activated {
            print("session not activated")
        }
        fileTransfer = WCSession.default.transferFile(fileURL, metadata: ["name": name, "title": title])
    } catch {
        print(error)
    }
    print("Completed transfer \(name)")
}
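On the watch side, the transferred file also arrives via the delegate. The system deletes the incoming file once the delegate method returns, so it has to be copied out first. A sketch (the destination directory is an assumption):

func session(_ session: WCSession, didReceive file: WCSessionFile) {
    let caches = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
    let destination = caches.appendingPathComponent(file.fileURL.lastPathComponent)
    do {
        // The incoming file is removed by the system after this method returns, so copy it now.
        try? FileManager.default.removeItem(at: destination)
        try FileManager.default.copyItem(at: file.fileURL, to: destination)
        print("Received file with metadata: \(file.metadata ?? [:])")
    } catch {
        print(error)
    }
}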

Capturing still image with AVFoundation

I'm currently creating a simple application which uses AVFoundation to stream video into a UIImageView.
To achieve this, I created an instance of AVCaptureSession() and an AVCaptureSessionPreset():
let input = try AVCaptureDeviceInput(device: device)
print(input)
if (captureSession.canAddInput(input)) {
    captureSession.addInput(input)
    if (captureSession.canAddOutput(sessionOutput)) {
        captureSession.addOutput(sessionOutput)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraView.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }
}
cameraView references the UIImageView outlet.
I now want to implement a way of capturing a still image from the AVCaptureSession.
Correct me if there's a more efficient way, but I plan to have an additional UIImageView to hold the still image, placed on top of the UIImageView which holds the video?
I've created a button with action:
@IBAction func takePhoto(_ sender: Any) {
    // functionality to obtain still image
}
My issue is, I'm unsure how to actually obtain a still image from the capture session and populate the new UIImageView with it.
After looking at information/questions posted on Stack, the majority of the solutions is to use:
captureStillImageAsynchronouslyFromConnection
I'm unsure if it's just Swift 3.0, but Xcode isn't recognising this function.
Could someone please advise me on how to actually achieve the result of obtaining and displaying a still image upon button click.
Here is a link to my full code for better understanding of my program.
Thank you all in advance for taking the time to read my question, and please feel free to tell me in case I've missed out some relevant data.
If you are targeting iOS 10 or above, captureStillImageAsynchronously(from:completionHandler:) is deprecated, along with AVCaptureStillImageOutput.
As per the documentation:
The AVCaptureStillImageOutput class is deprecated in iOS 10.0 and does not support newer camera capture features such as RAW image output, Live Photos, or wide-gamut color. In iOS 10.0 and later, use the AVCapturePhotoOutput class instead. (The AVCaptureStillImageOutput class remains supported in macOS 10.12.)
As per your code, you are already using AVCapturePhotoOutput. So just follow the steps below to take a photo from the session. The same can be found here in the Apple documentation.
Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.
You are already doing steps 1 and 2, so add this line to your code:
@IBAction func takePhoto(_ sender: Any) {
    print("Taking Photo")
    sessionOutput.capturePhoto(with: sessionOutputSetting, delegate: self as! AVCapturePhotoCaptureDelegate)
}
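Note that sessionOutputSetting above refers to the asker's own settings object. AVCapturePhotoSettings instances cannot be reused across captures (reuse raises an exception), so creating a fresh one per tap is safer. A sketch, with the JPEG format key as an assumption and assuming the view controller itself conforms to AVCapturePhotoCaptureDelegate:

@IBAction func takePhoto(_ sender: Any) {
    // Build a new settings object for every capture; reusing one is not allowed.
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
    settings.flashMode = .auto
    sessionOutput.capturePhoto(with: settings, delegate: self)
}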
and implement the AVCapturePhotoCaptureDelegate function
optional public func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?)
Note that this delegate will give you lots of control over taking photos. Check out the documentation for more functions. Also, you need to process the image data, which means you have to convert the sample buffer to a UIImage.
if let sampleBuffer = photoSampleBuffer,
   let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer),
   let dataProvider = CGDataProvider(data: imageData as CFData),
   let cgImageRef = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .defaultIntent) {
    let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: .right)
    // ...
    // Add the image to captureImageView here...
}
Note that the image you get is rotated left, so we have to manually rotate it right to get a preview-like image.
More info can be found in my previous SO answer