Is it possible to set default alert style for macOS UNNotification? - swift

My application posts local macOS notifications. It's important that these are not banner style but alert style instead.
So far I have achieved that by setting NSUserNotificationAlertStyle=alert in the Info.plist.
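For reference, the Info.plist entry mentioned above looks like this:

```xml
<key>NSUserNotificationAlertStyle</key>
<string>alert</string>
```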
When the first NSUserNotification is sent, the user is asked to approve the app's alerts.
Once approved, the app's alert style is set to Alerts in the macOS notification preferences.
However, the NSUserNotification API has been deprecated since macOS 11.
In the code below I've tried to use the newer UNNotification API, but the alert style is always set to Banners,
no matter the value of the NSUserNotificationAlertStyle property.
Is there any way to work around that? Since users can change the alert style anyway,
I am not sure why apps cannot decide on the default style themselves.
import UserNotifications
import SwiftUI

@main
struct UNNotificationIssuesApp: App {
    init() {
        let notificationCenter = UNUserNotificationCenter.current()
        notificationCenter.requestAuthorization(options: [.alert, .sound]) { (granted, error) in
            if granted {
                notificationCenter.getNotificationSettings { settings in
                    print("Granted app notification permissions: \(settings)")
                }
            } else {
                print("All the app's notification permissions were revoked")
            }
            if let err = error {
                print("An error occurred while requesting notification permissions:", err.localizedDescription)
            }
        }
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

Related

Badge is not displaying OSX cocoa swift

I am trying to display a badge count in macOS Ventura for my application. Here is my code:
NSApplication.shared.dockTile.showsApplicationBadge = true
NSApplication.shared.dockTile.badgeLabel = "2"
NSApplication.shared.dockTile.display()
I also requested notification authorization:
// Create the notification center
let center = UNUserNotificationCenter.current()
center.delegate = self
// Request sound, alert and badge permissions
center.requestAuthorization(options: [.sound, .alert, .badge]) { (granted, error) in
    if granted {
        print("Notification Enable Successfully")
    } else {
        print(error.debugDescription) // prints "Notifications are not allowed for this application"
    }
}
but it always goes into the error branch, and I don't know what the issue is.
I also checked the notification settings in System Preferences: only Alert is shown there; there is no badge setting.
How can I add a badge setting in the notification settings? Here is the screenshot.
Any help would be appreciated.

How to process a deep link from an aps notification using the same @EnvironmentObject used in onOpenURL, without using a singleton in the AppDelegate

I am trying to coordinate my deep links with push notifications so they both process my custom URL scheme in the same manner and navigate to the appropriate view. The challenge is with push notifications: how to process the link, passed through an APNs payload from Azure Notification Hubs, using the same @EnvironmentObject that onOpenURL uses, without breaking the SwiftUI paradigm by using a singleton.
Here is how I trigger the notification on my simulator, which works fine and navigates me to the appropriate view:
xcrun simctl openurl booted "myapp://user/details/123456"
This triggers the onOpenURL in this code:
var body: some Scene {
    WindowGroup {
        ContentView()
            .environmentObject(sessionInfo)
            .onOpenURL { url in
                print("onOpenURL: \(url)")
                if sessionInfo.processDeepLink(url: url) {
                    print("deep link TRUE")
                } else {
                    print("deep link FALSE")
                }
            }
    }
}
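Since processDeepLink(url:) itself isn't shown, here is a minimal, hypothetical sketch of what parsing a URL like myapp://user/details/123456 might look like (the DeepLink type and its property names are assumptions, not the real SessionInfo implementation):

```swift
import Foundation

// Hypothetical sketch: in a URL of the form scheme://host/path,
// "user" is the host and "/details/123456" is the path.
struct DeepLink {
    let target: String   // e.g. "user"
    let action: String   // e.g. "details"
    let id: String       // e.g. "123456"
}

func parseDeepLink(_ url: URL) -> DeepLink? {
    guard url.scheme == "myapp",
          let target = url.host else { return nil }
    // pathComponents for "/details/123456" is ["/", "details", "123456"]
    let parts = url.pathComponents.filter { $0 != "/" }
    guard parts.count == 2 else { return nil }
    return DeepLink(target: target, action: parts[0], id: parts[1])
}

let link = parseDeepLink(URL(string: "myapp://user/details/123456")!)
print(link?.target ?? "", link?.action ?? "", link?.id ?? "")
```

The real implementation would additionally map the parsed pieces onto whatever navigation state SessionInfo publishes.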
All my deep links work just as desired. I wanted to trigger them from a notification, so I created an .apns file with the same link that worked using xcrun:
{
    "aps": {
        "alert": { /* alert data */ },
        "badge": 1,
        "link_url": "myapp://user/details/123456"
    }
}
and pushed it to the simulator like this:
xcrun simctl push xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx com.myco.myapp test.apn
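As a sketch of what the AppDelegate would eventually need to do with that payload, extracting link_url from the aps dictionary might look like this (the payload shape is assumed to match the .apns file above; deepLinkURL(from:) is a hypothetical helper):

```swift
import Foundation

// Hypothetical sketch: pull link_url out of an aps payload shaped like the
// test.apn file above, so it can be handed to the same URL-based deep-link path
// that onOpenURL uses.
let payload: [String: Any] = [
    "aps": [
        "badge": 1,
        "link_url": "myapp://user/details/123456"
    ]
]

func deepLinkURL(from userInfo: [String: Any]) -> URL? {
    guard let aps = userInfo["aps"] as? [String: Any],
          let link = aps["link_url"] as? String else { return nil }
    return URL(string: link)
}

let url = deepLinkURL(from: payload)
print(url?.absoluteString ?? "no link")
```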
How do I reference my object from the AppDelegate which gets the message:
func notificationHub(_ notificationHub: MSNotificationHub!, didReceivePushNotification message: MSNotificationHubMessage!) {
    print("notificationHub...")
    let userInfo = ["message": message!]
    print("user: ")
    NotificationCenter.default.post(name: NSNotification.Name("MessageReceived"), object: nil, userInfo: userInfo)
    if UIApplication.shared.applicationState == .background {
        print("Notification received in the background")
    } else {
        print("Notification received in the foreground")
    }
    UIApplication.shared.applicationIconBadgeNumber = 4
}
I looked at this post, but couldn't relate the components to my app, possibly due to the NotificationHub part of it. I also saw this post, but again didn't know how to connect it.
I saw this post and it looks like push notification and deep linking are two different things. I could use the same logic if I could access the SessionInfo object from the AppDelegate. I'm concerned about messing around in there given I'm new to iOS development. Here is what I'm using in the App:
@StateObject var sessionInfo: SessionInfo = SessionInfo()
This post seems to cover it but I'm still having trouble. I don't know what this means:
static let shared = AppState()
And my SessionInfo is decorated with @MainActor. When I access it from other places, I use:
@EnvironmentObject var sessionInfo: SessionInfo
I also saw this post, which had no selected answer, but the existing answer seemed to recommend making the AppDelegate an EnvironmentObject and pushing it into the ContentView. I think what I really need is for the AppDelegate, when the notification arrives, to update something shared/published on the SessionInfo object so the URL is parsed and the navigation kicked off. This seems backwards to me.
This post makes the AppDelegate an ObservableObject with a published property and injects the AppDelegate as an EnvironmentObject, so when the value is updated, it's published. If it were the navigation link/object, that would work, but something would still need to process it, and it would not make sense for onOpenURL to use the AppDelegate, so again I think this is backwards.
If I followed the post with a static SessionInfo singleton in the SessionInfo class, I think I would need to remove the @EnvironmentObject var sessionInfo: SessionInfo from the ContentView and the .environmentObject(sessionInfo) on the main View, and instead instantiate the shared object in each view where it is used. Right? It seems like I followed this whole @EnvironmentObject, @StateObject, @MainActor paradigm and would have to abandon it. I'm not sure if that is right or what the tradeoffs are.
Most recently this post seems to be pretty in-depth, but introduces a new element, UNUserNotificationCenter, which I heard referenced in this youtube video.
This article was very helpful for the notification part.
With Azure Notification Hubs, the message info is in message.userInfo["aps"] rather than userInfo["aps"] as in the example and most other places I have seen it. There is not much documentation on MSNotificationHubMessage:
func notificationHub(_ notificationHub: MSNotificationHub, didReceivePushNotification message: MSNotificationHubMessage) {
    print("notificationHub...")
    let title = message.title ?? ""
    let body = message.body ?? ""
    print("title: \(title)")
    print("body: \(body)")
    let userInfo = ["message": message]
    NotificationCenter.default.post(name: NSNotification.Name("MessageReceived"), object: nil, userInfo: userInfo)
    guard let aps = message.userInfo["aps"] as? [String: AnyObject] else {
        return
    }
    ...
}
Second, this post provided the answer which I adapted for my project:
final class AppDelegate: NSObject, UIApplicationDelegate, UNUserNotificationCenterDelegate, MSNotificationHubDelegate {
    var navMgr: NavigationManager = NavigationManager()
    ...
}
and
@UIApplicationDelegateAdaptor private var appDelegate: AppDelegate
var body: some Scene {
    WindowGroup {
        ContentView()
            .environmentObject(sessionInfo)
            .environmentObject(appDelegate.navMgr)
            .onOpenURL { url in
                print("onOpenURL: \(url)")
                if sessionInfo.processDeepLink(url: url) {
                    print("deep link TRUE")
                } else {
                    print("deep link FALSE")
                }
            }
    }
}

Open HealthKit permission settings for the app that uses HealthKit

I am using HealthKit to display step information for the user.
let healthKitTypes: Set = [HKObjectType.quantityType(forIdentifier: HKQuantityTypeIdentifier.stepCount)!]
self.healthStore.requestAuthorization(toShare: healthKitTypes, read: healthKitTypes) { (granted, error) in
    if granted {
        // Auth granted
    } else {
        // Navigate user to HealthKit app permissions
    }
}
This code shows a popup to the user with the requested permissions. What I want to do in the // Navigate user to HealthKit app permissions branch is to open the HealthKit permission settings directly for my app if the user did not approve the permissions in the popup.
I know that settings of the app can be opened by using something like this:
let settingsAction = UIAlertAction(title: "Settings", style: .default) { _ in
    guard let settingsUrl = URL(string: UIApplication.openSettingsURLString) else {
        return
    }
    if UIApplication.shared.canOpenURL(settingsUrl) {
        UIApplication.shared.open(settingsUrl, completionHandler: { success in
            print("Settings opened: \(success)") // Prints true
        })
    }
}
... I want to do something similar, just in this case opening the HealthKit permission settings screen so that users can enable the requested permissions directly in Settings if they decline them in the app's popup.
Thanks

How to make SFSpeechRecognizer available on macOS?

I am trying to use Apple's Speech framework to do speech recognition on macOS 10.15.1. Before macOS 10.15, speech recognition was only available on iOS, but according to the documentation and this talk, should now be available on macOS as well.
However, all my attempts to use it have resulted in the SFSpeechRecognizer's isAvailable property being set to false. Per that talk and the documentation, I've enabled Siri and made sure that my app has the "Privacy - Speech Recognition Usage Description" key set to a string value in Info.plist.
I've also tried enabling code signing (which this question suggests might be necessary) and enabling Dictation under Keyboard > Dictation in System Preferences.
Here's some example code, although the specifics probably aren't important; I've tried it using a Storyboard instead of SwiftUI, putting the instantiation of the SFSpeechRecognizer inside and outside the requestAuthorization callback, leaving the locale unspecified, etc. Nothing seems to have any effect:
import SwiftUI
import Speech

struct ContentView: View {
    func tryAuth() {
        SFSpeechRecognizer.requestAuthorization { authStatus in
            switch authStatus {
            case .authorized:
                print("authorized")
            case .denied:
                print("denied")
            case .restricted:
                print("restricted")
            case .notDetermined:
                print("notDetermined")
            @unknown default:
                print("unanticipated auth status encountered")
            }
        }
    }

    func speechTest() {
        guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) else {
            // Not supported for device's locale
            print("couldn't get recognizer")
            return
        }
        if !recognizer.isAvailable {
            print("not available")
            return
        }
        print("Success")
    }

    var body: some View {
        VStack {
            Button("Try auth") {
                self.tryAuth()
            }
            Button("Test") {
                self.speechTest()
            }
        }
    }
}
What's especially odd is that if I run the app and then click the "Try auth" button, the authStatus returned from the callback is always .authorized. However, I've never been presented with a dialog asking me to authorize the app, and the app doesn't show up in the list of authorized apps under System Preferences > Security & Privacy > Privacy > Speech Recognition.
Nonetheless, clicking the "Test" button afterwards results in printing not available.
It seems like there's some hole in my understanding of the macOS privacy/permissions system, but I'm not sure how to debug further. I also think it should be possible to get this working, because I've seen other questions on StackOverflow suggesting that people have done so, for example here, here.
EDIT: At the suggestion of a comment, I tried simply ignoring the fact that isAvailable is false by replacing my check for it with code to actually try to transcribe a file, e.g.:
let request = SFSpeechURLRecognitionRequest(url: URL(fileURLWithPath: "/Users/james/Downloads/test.wav"))
recognizer.recognitionTask(with: request) { (result, error) in
    guard let result = result else {
        print("There was an error transcribing that file")
        print("Error: \(error!.localizedDescription)")
        return
    }
    if result.isFinal {
        print(result.bestTranscription.formattedString)
    }
}
Then it fails, printing: The operation couldn’t be completed. (kAFAssistantErrorDomain error 1700.). So it seems like it really is necessary to check for isAvailable, and my question remains: how to get it to be true?
Had similar problems with SFSpeechRecognizer... Perhaps you can set the delegate of SFSpeechRecognizer before requesting authorization, as shown here.
For example:
class ViewController: NSViewController {
    var speechRecognizer: SFSpeechRecognizer!

    override func viewDidLoad() {
        super.viewDidLoad()
        speechRecognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
        speechRecognizer.delegate = self
    }

    override func viewWillAppear() {
        SFSpeechRecognizer.requestAuthorization { authStatus in
            ...
        }
        if !speechRecognizer.isAvailable {
            print("Not available!")
        }
        let url = Bundle.main.url(forResource: "sample", withExtension: "mp3")!
        let request = SFSpeechURLRecognitionRequest(url: url)
        // will now ask for authorisation
        speechRecognizer.recognitionTask(with: request) { (result, error) in
            ...
        }
    }
}

extension ViewController: SFSpeechRecognizerDelegate {
}
Then the authorisation dialog will be properly shown.
In addition, it seems that the user is only asked to give permission when there is a call to recognitionTask; calling requestAuthorization alone will not have any effect.

On iOS 13 the biometric authentication alert does not show on older phones

On older iPhones such as the 6 and 6s, the biometric authentication dialogue/alert is hidden. If you press the home button on the iPhone to authenticate via a fingerprint, it still works, but the dialogue/alert stays hidden, which is a source of confusion for users.
Various sources (1) (2) have reported this as an iOS 13 bug.
This worked correctly on iOS 12, the issue started on iOS 13.
My biometric auth code looks like this and is fired in a view controller's viewDidAppear method:
let localAuthContext = LAContext()
var error: NSError?
if localAuthContext.canEvaluatePolicy(LAPolicy.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    localAuthContext.evaluatePolicy(LAPolicy.deviceOwnerAuthenticationWithBiometrics, localizedReason: "SIGNIN.TITLE.Login".localized) { [weak self] (success, error) in
        if success {
            // success
        } else {
            // failure
        }
    }
} else {
    // can't evaluate policy
}
So, do I need to change something in my code for iOS 13, or is this an Apple issue?
It seems to be an issue with processing. I've fixed it by presenting the prompt from the main queue, so it will reliably show (perhaps after a delay) but will not remain hidden.
DispatchQueue.main.async {
    if localAuthContext.canEvaluatePolicy(LAPolicy.deviceOwnerAuthenticationWithBiometrics, error: &error) {
        localAuthContext.evaluatePolicy(LAPolicy.deviceOwnerAuthenticationWithBiometrics, localizedReason: "SIGNIN.TITLE.Login".localized) { [weak self] (success, error) in
            if success {
                // success
            } else {
                // failure
            }
        }
    } else {
        // can't evaluate policy
    }
}
It only happens on iOS 13 and above. The solution is to call the evaluate function twice, like this:
let systemVersion = UIDevice.current.systemVersion
// Trick here: try to do a pre-evaluate
if systemVersion.compare("13.0", options: .numeric) != .orderedAscending {
    context.evaluatePolicy(.deviceOwnerAuthentication, localizedReason: "Authenticate to open the app", reply: { (_, _) in
        // Ignore callback here
    })
}
context.evaluatePolicy(.deviceOwnerAuthentication, localizedReason: "Authenticate to open the app", reply: { (success, error) in
    // Handle callback here
})
Tested and working well for all iOS 13.x versions so far.
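As a side note on the version check used above: String.compare with the .numeric option compares embedded numbers by value, which a plain lexicographic comparison would not. A minimal Foundation-only illustration (version strings here are just examples):

```swift
import Foundation

// The iOS 13 check relies on .numeric comparison, which treats runs of
// digits as numbers instead of comparing character by character.
let systemVersion = "13.4.1"
let isIOS13OrLater = systemVersion.compare("13.0", options: .numeric) != .orderedAscending
print(isIOS13OrLater)

// A plain lexicographic compare gets "9.3" vs "13.0" wrong,
// because "9" sorts after "1" as a character:
let lexicographic = "9.3" < "13.0"                                           // false
let numeric = "9.3".compare("13.0", options: .numeric) == .orderedAscending  // true
print(lexicographic, numeric)
```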
This seems to be an Apple issue on older iOS 13 versions. I am unable to reproduce this issue from iOS 13.1.2 onwards.