How to create an Apple watchOS 5 complication? - swift

I've never worked with watchOS 5 and want to develop a horizontal complication (Modular Large) for Apple Watch, like "Heart Rate". The idea is that I would display heart rate data in a different way. Right now I want to deploy the complication on a development watch.
I have created a new project with the complication checkbox ticked. I see that this added a complication controller with timeline configuration placeholders.
There is also a storyboard with a bunch of empty screens. I'm not sure how much effort I need to put into an Apple Watch app before I can deploy it. I see this Apple doc, but it does not describe how to lay out my complication. Some sections seem to have missing links.
Can I provide one style of complication only (large horizontal - modular large)?
Do I need to provide any iPhone app content beyond managing the complication logic, or can I get away without having a view controller?
Do I control the appearance of my complication by adding something to the assets folder (it has a bunch of graphic slots)?
Sorry for the complete beginner question; I have not seen a project focusing specifically on the horizontal complication for watchOS 5.

You should be able to deploy it immediately, though it won't do anything. Have a look at the WWDC video explaining how to create a complication: video
You can't lay out the complication yourself; you can only choose from a set of templates that you fill with data. The screens you are seeing are for your watch app, not the complication.
You don't have to support all complication styles.
The complication logic is part of your WatchKit Extension, so technically you don't need anything in the iOS companion app; I'm not sure how much functionality you have to provide to get past App Review, though.
Adding your graphics to the asset catalog won't do anything on its own; you have to reference them when configuring the templates.
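For example, here is a rough sketch of filling the Modular Large template the question asks about (the asset name "complicationHeader" and the text values are placeholders, not taken from any real project):
// Hedged sketch: a Modular Large template that references an image from the
// WatchKit Extension's asset catalog. "complicationHeader" is a hypothetical asset name.
let template = CLKComplicationTemplateModularLargeStandardBody()
template.headerImageProvider = CLKImageProvider(onePieceImage: UIImage(named: "complicationHeader") ?? UIImage())
template.headerTextProvider = CLKSimpleTextProvider(text: "Heart Rate")
template.body1TextProvider = CLKSimpleTextProvider(text: "72 BPM")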

Here's an example by Apple of how to communicate with the Apple Watch app. You need to painstakingly read the README about 25 times to get all the app group identifiers changed in that project.
Your main phone app's assets are not visible to the watch app.
Your watch storyboard assets go in the WatchKit App target.
Your programmatically accessed assets go in the WatchKit Extension target.
Original answers:
Can I provide one style of complication only (large horizontal - modular large)? YES.
Do I need to provide any iPhone app content beyond managing the complication logic, or can I get away without having a view controller? YES, you can get away without one - watch apps have computation limits imposed on them.
Do I control the appearance of my complication by adding something to the assets folder (it has a bunch of graphic slots)? See below - it's both the assets folder and the placeholder templates.
Modify the example above to create a placeholder image that is displayed on the watch (when you are selecting a complication while customizing the watch face):
func getPlaceholderTemplate(for complication: CLKComplication, withHandler handler: @escaping (CLKComplicationTemplate?) -> Void) {
    // Pass the template to ClockKit.
    if complication.family == .graphicRectangular {
        // Display a placeholder string on the body.
        let template = CLKComplicationTemplateGraphicRectangularLargeImage()
        template.textProvider = CLKSimpleTextProvider(text: "---")
        // The image must live in the WatchKit Extension's asset catalog.
        let image = UIImage(named: "imageFromWatchExtensionAssets") ?? UIImage()
        template.imageProvider = CLKFullColorImageProvider(fullColorImage: image)
        handler(template)
    } else {
        handler(nil)
    }
}
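The placeholder only shows up in the complication picker; the data on the actual watch face comes from the current timeline entry. A minimal sketch, reusing the same (hypothetical) asset name:
func getCurrentTimelineEntry(for complication: CLKComplication, withHandler handler: @escaping (CLKComplicationTimelineEntry?) -> Void) {
    guard complication.family == .graphicRectangular else {
        handler(nil)
        return
    }
    let template = CLKComplicationTemplateGraphicRectangularLargeImage()
    template.textProvider = CLKSimpleTextProvider(text: "72 BPM") // placeholder value
    let image = UIImage(named: "imageFromWatchExtensionAssets") ?? UIImage()
    template.imageProvider = CLKFullColorImageProvider(fullColorImage: image)
    // Wrap the template in a timeline entry dated "now" and hand it to ClockKit.
    handler(CLKComplicationTimelineEntry(date: Date(), complicationTemplate: template))
}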
Sending small packets to the watch (this will not send images!):
func updateHeartRate(with sample: HKQuantitySample) {
    let context: [String: Any] = ["title": "String from phone"]
    do {
        try WCSession.default.updateApplicationContext(context)
    } catch {
        print("Failed to transmit app context")
    }
}
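On the watch side, the extension receives this through its WCSession delegate. A rough sketch, assuming the object below has been set as the session's delegate:
func session(_ session: WCSession, didReceiveApplicationContext applicationContext: [String: Any]) {
    // Called on a background queue when the phone pushes a new application context.
    if let title = applicationContext["title"] as? String {
        print("Received title from phone: \(title)")
        // Persist the value here and ask ClockKit to reload the complication timeline.
    }
}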
Transferring images and files:
func uploadImage(_ image: UIImage, name: String, title: String = "") {
    let data: Data? = image.pngData()
    do {
        let fileManager = FileManager.default
        let cachesDirectory = try fileManager.url(for: .cachesDirectory,
                                                  in: .userDomainMask,
                                                  appropriateFor: nil,
                                                  create: true)
        let fileURL = cachesDirectory.appendingPathComponent("\(name).png")
        if fileManager.fileExists(atPath: fileURL.path) {
            try fileManager.removeItem(at: fileURL)
        }
        try data?.write(to: fileURL, options: .atomic)
        if WCSession.default.activationState != .activated {
            print("session not activated")
        }
        // fileTransfer is a property elsewhere in the class that keeps the transfer alive.
        fileTransfer = WCSession.default.transferFile(fileURL, metadata: ["name": name, "title": title])
    } catch {
        print(error)
    }
    print("Queued transfer \(name)")
}
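On the watch, the transferred file arrives via the WCSession delegate, and it has to be moved out of its temporary location before the delegate method returns. A sketch (the destination name mirrors the metadata sent above):
func session(_ session: WCSession, didReceive file: WCSessionFile) {
    let fileManager = FileManager.default
    do {
        let caches = try fileManager.url(for: .cachesDirectory,
                                         in: .userDomainMask,
                                         appropriateFor: nil,
                                         create: true)
        let name = (file.metadata?["name"] as? String) ?? "received"
        let destination = caches.appendingPathComponent("\(name).png")
        if fileManager.fileExists(atPath: destination.path) {
            try fileManager.removeItem(at: destination)
        }
        // The system deletes file.fileURL once this method returns, so move it now.
        try fileManager.moveItem(at: file.fileURL, to: destination)
    } catch {
        print(error)
    }
}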

Related

How to implement SwiftUI’s .onDrag modifier with NSImage (macOS)

I am building a sandboxed app for macOS with SwiftUI. I have an NSImage that is displayed as Image(nsImage: myNSImage). I want that View to support drag and drop, meaning that it can be dragged to any location that can receive image files.
Here’s my approach:
Image(nsImage: myNSImage)
    .onDrag({
        let itemProvider = NSItemProvider()
        itemProvider.suggestedName = "image.png"
        itemProvider.registerDataRepresentation(for: UTType.png) { loadHandler in
            loadHandler(nsImage.pngRepresentation, nil)
            print("loadHandler completed") // Never prints !!
            return nil
        }
        return itemProvider
    })
This way I can drag the View. But it seems like it isn’t able to provide the image.
The “drop”, i.e. on the Desktop, is simply not working. I would expect that it saves an "image.png" in the destination URL, but it doesn’t.
How can I implement .onDrag so that it provides an image file based on the NSImage?
Edit: I have already tried different UTTypes, for example .tiff in combination with nsImage.tiffRepresentation, without luck.

How do I use background fetch in iOS 13 to update pdfs?

I watched many videos and read many articles about the new background fetch in iOS 13, but I am still in the dark. I am making a dining app. Amongst other things I am presenting the menu for "today", which changes every day. So far I use PDFKit to show an example menu of a restaurant and have the menus for "today" and "tomorrow" downloaded into my project folder, but I want to use background fetch to get the menu updated every day. So far I have only understood that I will have to check the boxes for "Background fetch" and "Background processing", where the refresh task is used to update content and the processing task is used to clean up.
I am also wondering if I need my own website where I upload the PDF so that I can retrieve the URL from there. In the end I have to update an image and some labels as well, but I hope that I'll be able to do that on my own once I have understood the principles.
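Here is my rough understanding of the refresh-task flow so far (the task identifier and the download URL are just made-up placeholders, and I am not sure this is right):
import BackgroundTasks
import UIKit

enum MenuRefresher {
    // Must also be listed under BGTaskSchedulerPermittedIdentifiers in Info.plist.
    static let taskIdentifier = "com.example.dining.refreshMenu"

    // Call once from application(_:didFinishLaunchingWithOptions:).
    static func registerTask() {
        BGTaskScheduler.shared.register(forTaskWithIdentifier: taskIdentifier, using: nil) { task in
            handleMenuRefresh(task: task as! BGAppRefreshTask)
        }
    }

    // Call when the app goes to the background to request the next run.
    static func scheduleMenuRefresh() {
        let request = BGAppRefreshTaskRequest(identifier: taskIdentifier)
        request.earliestBeginDate = Date(timeIntervalSinceNow: 60 * 60) // no earlier than an hour from now
        try? BGTaskScheduler.shared.submit(request)
    }

    static func handleMenuRefresh(task: BGAppRefreshTask) {
        scheduleMenuRefresh() // queue tomorrow's refresh
        let url = URL(string: "https://example.com/menus/today.pdf")! // placeholder URL
        let download = URLSession.shared.downloadTask(with: url) { location, _, error in
            // Move the downloaded PDF into the app's documents folder, then reload the PDFView.
            task.setTaskCompleted(success: error == nil)
        }
        task.expirationHandler = { download.cancel() }
        download.resume()
    }
}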
I show you my code and screenshot of my view controller to give you a better understanding of my app so far.
let today = "AmericanDinerMenu"   // PDF 1 with menu for today
let tomorrow = "ConniesDinerMenu" // PDF 2 with menu for tomorrow

func activePDF(PDF: String) {
    if let path = Bundle.main.path(forResource: PDF, ofType: "pdf") {
        let url = URL(fileURLWithPath: path)
        if let pdfDocument = PDFDocument(url: url) {
            pdfView.displayMode = .singlePageContinuous
            pdfView.autoScales = true
            pdfView.document = pdfDocument
        }
    }
}
I hope you can help me. Thank you!

Access to Photos on iOS (Swift): I have to try twice to get the picture library to show up. It doesn't show up the first time, but shows the second time

Alright, I call the function:
func tabBarController(_ tabBarController: UITabBarController, shouldSelect viewController: UIViewController) -> Bool {
    let index = viewControllers?.index(of: viewController)
    if index == 2 {
        let layout = UICollectionViewFlowLayout()
        let photoSelectorController = PhotoSelectorController(collectionViewLayout: layout)
        let navController = UINavigationController(rootViewController: photoSelectorController)
        present(navController, animated: true, completion: nil)
        return false
    }
    return true
}
The photos are not showing the first time. I have all the right permission prompts in place and everything.
I then fetch the images with the functions below. It works, but only the second time I hit the button, after cancelling a post.
I'm not sure how to get the images from the library on the first call. After that it works like a charm, but most users have been telling me this isn't a good experience if they have to try twice, and I'm trying to reduce friction in the app.
It should show the pictures right after the user "Allows" the app access to the pictures so they can post, but I'm not sure what I'm doing wrong for it to show the pictures as soon as someone grants access.
var selectedImage: UIImage?
var images = [UIImage]()
var assets = [PHAsset]()

fileprivate func assetsFetchOptions() -> PHFetchOptions {
    let fetchOptions = PHFetchOptions()
    fetchOptions.fetchLimit = 100
    let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
    fetchOptions.sortDescriptors = [sortDescriptor]
    return fetchOptions
}

fileprivate func fetchPhotos() {
    let allPhotos = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())
    DispatchQueue.global(qos: .background).async {
        allPhotos.enumerateObjects { (asset, count, stop) in
            print(asset)
            let imageManager = PHImageManager.default()
            let targetSize = CGSize(width: 200, height: 200)
            let options = PHImageRequestOptions()
            options.isSynchronous = true
            imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
                if let image = image {
                    self.images.append(image)
                    self.assets.append(asset)
                    if self.selectedImage == nil {
                        self.selectedImage = image
                    }
                }
                if count == allPhotos.count - 1 {
                    DispatchQueue.main.async {
                        self.collectionView?.reloadData()
                    }
                }
            })
        }
    }
}
If you fetchAssets before the user grants privacy access to your app, you'll get a PHFetchResult that's empty.
However, if before making that fetch you register as a photo library observer, you'll get a photoLibraryDidChange callback as soon as the user approves privacy access for the app... from that callback you can access an updated version of your original fetch result (see changeDetails(for:)) that has all of the assets your fetch should have found. Then you can tell your UI to update and display those assets. (This is how Apple's canonical PhotoKit example code works.)
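A rough sketch of that pattern, condensed into the asker's PhotoSelectorController (names and details are illustrative):
import UIKit
import Photos

class PhotoSelectorController: UICollectionViewController, PHPhotoLibraryChangeObserver {
    var allPhotos: PHFetchResult<PHAsset>!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Register before fetching so we hear about the result changing once access is granted.
        PHPhotoLibrary.shared().register(self)
        allPhotos = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        // Called on a background queue whenever the library (or our fetch result) changes,
        // including immediately after the user grants photo access.
        guard let details = changeInstance.changeDetails(for: allPhotos) else { return }
        allPhotos = details.fetchResultAfterChanges
        DispatchQueue.main.async {
            self.collectionView?.reloadData()
        }
    }
}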
Also, once you have a populated fetch result, please don't request thumbnails for the whole thing the way you're doing.
Users commonly have photo libraries with tens of thousands of assets, many of which are in iCloud and not on the local device. If you synchronously get all thumbnails, you'll take forever, use tons of memory and CPU resources, and generate all kinds of network traffic (slowing things down even more) for resources your user may never see.
PhotoKit is designed to allow easy use in conjunction with UI elements like UICollectionView. A collection view only loads cells that are currently (or soon to be) on screen, even if you've told it you have zillions of items in your collection — similarly, you can request thumbnails only for assets that are visible in your collection view. Wherever you have your per-cell UI setup logic is where you should have your PHImageManager request. (Again, this is what the canonical PhotoKit example code does.)
You can optimize even further by "preheating" the thumbnail fetch/generation process for assets that are soon to be onscreen. And then by managing your "preheating" to cancel such work in progress when further UI updates (e.g. fast scrolling of large collection) make it unnecessary. PHCachingImageManager does this. (And yet again, it's what the canonical Apple sample does. Actually, that sample's a bit out of date, and as such does more work than it needs to on this front — it does its own calculation of what cells are just outside the scroll rect, but since iOS 10 the UICollectionViewDataSourcePrefetching protocol manages that for you.)
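For illustration, a hypothetical, condensed grid controller that puts the last two paragraphs together (the "PhotoCell" identifier, the image-view tag, and the 200x200 thumbnail size are all placeholders):
import UIKit
import Photos

class ThumbnailGridController: UICollectionViewController, UICollectionViewDataSourcePrefetching {
    var allPhotos: PHFetchResult<PHAsset>!
    let imageManager = PHCachingImageManager()
    let thumbnailSize = CGSize(width: 200, height: 200)

    override func viewDidLoad() {
        super.viewDidLoad()
        collectionView.prefetchDataSource = self
    }

    override func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return allPhotos.count
    }

    override func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoCell", for: indexPath)
        let asset = allPhotos.object(at: indexPath.item)
        // Only cells coming on screen ask for a thumbnail; nothing is fetched up front.
        imageManager.requestImage(for: asset, targetSize: thumbnailSize, contentMode: .aspectFill, options: nil) { image, _ in
            (cell.contentView.viewWithTag(1) as? UIImageView)?.image = image
        }
        return cell
    }

    // "Preheat" thumbnails for items the collection view expects to need soon...
    func collectionView(_ collectionView: UICollectionView, prefetchItemsAt indexPaths: [IndexPath]) {
        let assets = indexPaths.map { allPhotos.object(at: $0.item) }
        imageManager.startCachingImages(for: assets, targetSize: thumbnailSize, contentMode: .aspectFill, options: nil)
    }

    // ...and cancel that work when fast scrolling makes it unnecessary.
    func collectionView(_ collectionView: UICollectionView, cancelPrefetchingForItemsAt indexPaths: [IndexPath]) {
        let assets = indexPaths.map { allPhotos.object(at: $0.item) }
        imageManager.stopCachingImages(for: assets, targetSize: thumbnailSize, contentMode: .aspectFill, options: nil)
    }
}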

iOS 11: [ImageManager] Unable to load image data

After updating to iOS 11, photo assets now load slowly and I get this message in the console:
[ImageManager] Unable to load image data,
/var/mobile/Media/DCIM/103APPLE/IMG_3064.JPG
I use a static function to load the image:
class func getAssetImage(asset: PHAsset, size: CGSize = CGSize.zero) -> UIImage? {
    let manager = PHImageManager.default()
    let option = PHImageRequestOptions()
    option.isSynchronous = true
    var assetImage: UIImage!
    var scaleSize = size
    if size == CGSize.zero {
        scaleSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
    }
    manager.requestImage(for: asset, targetSize: scaleSize, contentMode: .aspectFit, options: option) { (image, _) in
        if let image = image {
            assetImage = image
        }
    }
    if assetImage == nil {
        manager.requestImageData(for: asset, options: option, resultHandler: { (data, _, orientation, _) in
            if let data = data {
                if let image = UIImage(data: data) {
                    assetImage = image
                }
            }
        })
    }
    return assetImage
}
requestImage for an asset usually succeeds, but it prints this message. If I use the requestImageData function only, there is no such message, but photos taken with the Apple camera lose their orientation, and I get even more issues when loading a big number of images (I use an image slideshow in my app).
Apple always sucks when it comes to updates; maybe someone has a solution for how to fix this? It even fails to load an asset when there is a big list of them in the user's camera roll. Switching to requestImageData is not an option for me, as it frequently returns nil data now.
I would like to point out that I call this function only once. It is not used in a UITableView, etc. I use other code for thumbnails, with a globally initialised manager and options, so the assets are definitely not nil.
I call this function only when the user taps a certain thumbnail.
When the gallery has around 5,000 photos, maybe the connection to the assets is just overloaded, and later it can't handle the request and crashes?
So many questions.
Hey, I was having the warning as well, and here is what worked for me: replacing
CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
with
PHImageManagerMaximumSize
in the requestImage call removed the warning log 🎉
Hope this helps.
I had the same problem. This did not completely solve it, but it definitely helped:
option.isNetworkAccessAllowed = true
This helps only on devices where the Optimise iPhone Storage option for the Photos app has been turned on.
Your code has some serious issues. You are saying .isSynchronous = true without stepping into a background thread to do the fetch. That is illegal and is what is causing the slowness. Plus, you are asking for a targetSize without also saying .resizeMode = .exact, which means you are getting much bigger images than you are asking for.
However, the warning you're seeing is irrelevant and can be ignored. It in no way signals a failure of image delivery; it seems to be just some internal message that has trickled up to the console by mistake.
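A sketch of what that answer describes (run the synchronous request off the main thread and ask for an exact size); the completion-handler shape is my own assumption, not part of the original function:
func getAssetImage(asset: PHAsset, size: CGSize, completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let options = PHImageRequestOptions()
        options.isSynchronous = true          // allowed here, off the main thread
        options.resizeMode = .exact           // deliver targetSize, not something larger
        options.isNetworkAccessAllowed = true // fetch from iCloud if needed
        var result: UIImage?
        PHImageManager.default().requestImage(for: asset,
                                              targetSize: size,
                                              contentMode: .aspectFit,
                                              options: options) { image, _ in
            result = image
        }
        DispatchQueue.main.async { completion(result) }
    }
}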
This seems to be a bug in iOS 11, but I found I could work around it by setting the synchronous option to false. I reworked my code to deal with the asynchronous delivery. You could probably use sync(execute:) for a quick fix.
Also, I believe the problem only occurred with photos delivered by iCloud sharing.
You can try the requestImageData method with the following options. This worked for me on iOS 11.2 (both on device and in the simulator).
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.resizeMode = .exact
options.isSynchronous = true
PHImageManager.default().requestImageData(for: asset, options: options, resultHandler: { (data, dataUTI, orientation, info) in
    // use the image data here
})

Is it possible to (programmatically) set wallpapers for each separate "space"/desktop in macOS?

I'm making a small app for myself to change the desktop background image periodically.
My program contains this block of code:
let screen = NSScreen.main()!
let newWallpaperURL = URL(/* ... */)
// ...
try! NSWorkspace.shared().setDesktopImageURL(newWallpaperURL, for: screen, options: [:])
This works, but only for the current "space" the keyboard is focused on.
e.g. if I'm in a fullscreen app, only the background of the Space occupied by the fullscreen app will be changed (not the background of my normal desktop).
If I have two Spaces/desktops, it only changes the background image of one of them.
Is it possible to individually set wallpapers for each space programmatically?
You can get all the screens and set the wallpaper for each of them:
let screens = NSScreen.screens
let newWallpaperURL = URL(/* ... */)
for screen in screens {
    try! NSWorkspace.shared.setDesktopImageURL(newWallpaperURL, for: screen, options: [:])
}
Use this in Xcode 8.x:
if let screens = NSScreen.screens() {
    let newWallpaperURL = URL(/* ... */)
    for screen in screens {
        try? NSWorkspace.shared().setDesktopImageURL(newWallpaperURL, for: screen, options: [:])
    }
}
Unlike other solutions posted here, this one will work in the current Xcode 8. NSScreen.screens is a class var in Xcode 9 (currently in beta) but a class func in Xcode 8, which is why you need to write .screens() instead of .screens. Also, screens() returns an optional, so you need to safely unwrap it before passing it to the for loop.
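There is no public API for giving each space its own wallpaper, but one possible workaround (a sketch of my own, not taken from the answers above, and untested) is to re-apply the wallpaper you want whenever the active space changes (Swift 4 / Xcode 9 names):
let workspace = NSWorkspace.shared
// Keep the returned token around if you ever need to remove the observer.
let spaceObserver = workspace.notificationCenter.addObserver(forName: NSWorkspace.activeSpaceDidChangeNotification,
                                                             object: workspace,
                                                             queue: .main) { _ in
    let newWallpaperURL = URL(/* pick the image you want for the now-active space */)
    for screen in NSScreen.screens {
        try? workspace.setDesktopImageURL(newWallpaperURL, for: screen, options: [:])
    }
}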