How to get highlighted text from apps like Mail.app in macOS - swift

Below is a snippet that prints the highlighted text; however, it does not work with all apps. For instance, one app I know it does not work with is Mail.
What attributes should I be searching for to get the highlighted text in apps like Mail? :)
func getHighlightedText() -> AnyObject? {
    let systemWideElement = AXUIElementCreateSystemWide()
    var focusedElement: AnyObject?
    let focusedCode = AXUIElementCopyAttributeValue(systemWideElement, "AXFocusedUIElement" as CFString, &focusedElement)
    if focusedCode == AXError.success {
        var selectedText: AnyObject?
        let textCode = AXUIElementCopyAttributeValue(focusedElement as! AXUIElement, "AXSelectedText" as CFString, &selectedText)
        if textCode == AXError.success {
            return selectedText
        }
    }
    return nil
}

sleep(3) // enough time to switch to another app and highlight the text
print(getHighlightedText())
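One avenue that may help with Mail (untested here, and the attribute names are WebKit-specific): WebKit-based views such as Mail's compose area often expose the selection as AXSelectedTextMarkerRange rather than AXSelectedText, and the selected string can then be read through the parameterized AXStringForTextMarkerRange attribute. A rough sketch:

func getSelectedTextViaTextMarkers(from element: AXUIElement) -> String? {
    // AXSelectedTextMarkerRange / AXStringForTextMarkerRange are WebKit accessibility
    // attributes; availability varies by app and macOS version.
    var markerRange: AnyObject?
    guard AXUIElementCopyAttributeValue(element, "AXSelectedTextMarkerRange" as CFString, &markerRange) == .success,
          let range = markerRange else { return nil }
    var selectedText: AnyObject?
    guard AXUIElementCopyParameterizedAttributeValue(element, "AXStringForTextMarkerRange" as CFString, range, &selectedText) == .success else { return nil }
    return selectedText as? String
}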

Related

Select Text in Webkit applications via macOS accessibility API

I need to select text in a WebKit view of another application (Apple Mail) using accessibility APIs.
For regular text fields, I do something like this:
func selectText(withRange range: CFRange) throws {
    var range = range
    guard let newValue: AXValue = AXValueCreate(AXValueType.cfRange, &range) else { return }
    AXUIElementSetAttributeValue(self, kAXSelectedTextRangeAttribute as CFString, newValue)
}
However, in the composing window of Apple Mail all of the text seems to be of type Static Text, which doesn't come with the necessary AXSelectedTextRange attribute.
It has AXSelectedTextMarkerRange, though, which requires an AXTextMarker. I just don't get how to create one of these. I have no trouble reading the text from a user-created selection using this here, but I'm unable to select text via the accessibility APIs.
Thanks to the hint from Willeke I was able to figure it out. It is indeed possible to do it using AXTextMarkerForIndex. Knowing that, it's actually pretty straightforward.
Here's my code:
func getTextMarker(forIndex index: CFIndex) throws -> AXTextMarker? {
    var textMarker: AnyObject?
    guard AXUIElementCopyParameterizedAttributeValue(self, "AXTextMarkerForIndex" as CFString, index as AnyObject, &textMarker) == .success else { return nil }
    return textMarker as! AXTextMarker
}

func selectStaticText(withRange range: CFRange) throws {
    guard let textMarkerStart = try? getTextMarker(forIndex: range.location) else { return }
    guard let textMarkerEnd = try? getTextMarker(forIndex: range.location + range.length) else { return }
    let textMarkerRange = AXTextMarkerRangeCreate(kCFAllocatorDefault, textMarkerStart, textMarkerEnd)
    AXUIElementSetAttributeValue(self, "AXSelectedTextMarkerRange" as CFString, textMarkerRange)
}
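Both helpers call the AX C functions with self as the element, so presumably they live in an extension on AXUIElement. Usage would then look roughly like this (composerElement is a placeholder for the static-text element you already located in Mail's compose window):

// composerElement: AXUIElement — the static-text element found via the accessibility hierarchy
try? composerElement.selectStaticText(withRange: CFRange(location: 0, length: 10))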

Is there a way to make MLVision text recognition faster?

I am using MLVision cloud text recognition for my app. I capture/upload a photo and then I start the process. When it recognises the image and extracts the text, I separate it and append every separated block into an array.
The code below is for the whole process.
lazy var vision = Vision.vision()
var textRecognizer: VisionTextRecognizer!
var test = [] as Array<String>

override func viewDidLoad() {
    super.viewDidLoad()
    let options = VisionCloudTextRecognizerOptions()
    options.languageHints = ["en", "hi"]
    textRecognizer = vision.cloudTextRecognizer(options: options)
}

// where pickedImage is the image that the user captures
let visionImage = VisionImage(image: pickedImage)
textRecognizer.process(visionImage, completion: { (features, error) in
    guard error == nil, let features = features else {
        self.resultView.text = "Could not recognize any text"
        self.dismiss(animated: true, completion: nil)
        return
    }
    for block in features.blocks {
        for line in block.lines {
            //for element in line.elements {
            self.resultView.text = self.resultView.text + "\(line.text)"
        }
    }
    self.separate()
})

func separate() {
    let separators = CharacterSet(charactersIn: ":)(,•/·][")
    let ofWordsArray = self.resultView.text.components(separatedBy: separators)
    for word in ofWordsArray {
        let low = word.trimmingCharacters(in: .whitespacesAndNewlines).lowercased()
        if low != "" {
            test.append(low)
        }
    }
    print(test)
}
Everything works fine and I get the result that I want. The problem is that I think it is really slow: it takes about 20 seconds for the entire process. Is there a way to make it faster?
Thanks in advance.
You are using the VisionCloudTextRecognizer. Speed will depend on your connection; in my case it was only a few seconds. Your other option is to use on-device text recognition, or a hybrid approach where you first detect on-device and then correct with the Cloud API later.
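For reference, a minimal sketch of the on-device path, reusing the same Firebase MLVision objects from the question (pickedImage is the captured image):

let onDeviceRecognizer = Vision.vision().onDeviceTextRecognizer()
let visionImage = VisionImage(image: pickedImage)
onDeviceRecognizer.process(visionImage) { result, error in
    guard error == nil, let result = result else {
        // fall back to the cloud recognizer here if on-device quality is not enough
        return
    }
    // result.text is the full recognized string; blocks/lines are available as before
    print(result.text)
}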

Implementing "Open file with" in Swift Cocoa App

I'm working on a macOS Cocoa app in Swift where I import several different file types into the app for the user to interact with.
I'm currently trying to determine if it's possible to implement the "Open file with" feature, so that the user could open those files in a different program if they wanted to:
I've found a few different SO questions that seem tangentially related to what I'm trying to do:
Swift: How to open file with associated application?
Launch OSX Finder window with specific files selected
...but so far nothing to indicate if it's possible to implement right-click Finder/file (?) access in the way I had in mind.
Apologies if this is too vague of a question; any help / guidance appreciated!
Without going into details, it's pretty straightforward:
Get the list of all known applications that can open a specific file type (see LSCopyApplicationURLsForURL, a Core Foundation C function).
Build the menu. You can use NSWorkspace (and probably URL) to get the application icons.
Use NSWorkspace.openFile(_:withApplication:) to tell the application to open the given document.
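A rough sketch of steps 1 and 2 (the function name, target/action wiring, and menu-item handling are illustrative, not a fixed API):

import AppKit

func buildOpenWithMenu(for fileURL: URL, target: AnyObject?, action: Selector?) -> NSMenu {
    let menu = NSMenu(title: "Open With")
    // Step 1: every application registered for this file's type.
    guard let apps = LSCopyApplicationURLsForURL(fileURL as CFURL, .all)?.takeRetainedValue() as? [URL] else {
        return menu
    }
    // Step 2: one menu item per application, with its icon.
    for appUrl in apps {
        let item = NSMenuItem(title: FileManager.default.displayName(atPath: appUrl.path), action: action, keyEquivalent: "")
        item.target = target
        item.representedObject = appUrl // the action handler can read this back and perform step 3
        let icon = NSWorkspace.shared.icon(forFile: appUrl.path)
        icon.size = NSSize(width: 16, height: 16)
        item.image = icon
        menu.addItem(item)
    }
    return menu
}

The action handler would then perform step 3, e.g. NSWorkspace.shared.open([fileURL], withApplicationAt: appUrl, configuration: NSWorkspace.OpenConfiguration()) on macOS 10.15+.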
2022, Swift 5
Get app list associated with local file:
func getAppsAssociatedWith(_ url: URL?) -> [URL] {
    guard let url = url,
          let retainedArr = LSCopyApplicationURLsForURL(url as CFURL, .all)?.takeRetainedValue(),
          let listOfRelatedApps = retainedArr as? [URL]
    else {
        return []
    }
    return listOfRelatedApps
}
Getting thumbnail for app:
let singleAppIcon = NSWorkspace.shared
.icon(forFile: appUrl.path)
.scaledCopy(sizeOfLargerSide: 17)
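Note that scaledCopy(sizeOfLargerSide:) is not an AppKit API; presumably it is a custom NSImage extension roughly along these lines:

extension NSImage {
    // Hypothetical helper: returns a copy scaled so its larger side equals `side` points.
    func scaledCopy(sizeOfLargerSide side: CGFloat) -> NSImage {
        let scale = side / max(size.width, size.height)
        let newSize = NSSize(width: size.width * scale, height: size.height * scale)
        let copy = NSImage(size: newSize)
        copy.lockFocus()
        draw(in: NSRect(origin: .zero, size: newSize),
             from: NSRect(origin: .zero, size: size),
             operation: .copy,
             fraction: 1.0)
        copy.unlockFocus()
        return copy
    }
}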
Open url with app:
@available(macOS 10.15, iOS 9.0, *)
public class func openUrlWithApp(_ urls: [URL], appUrl: URL) {
    NSWorkspace.shared.open(urls, withApplicationAt: appUrl, configuration: NSWorkspace.OpenConfiguration())
}
In my app I'm caching all app icons in a dictionary:
[someFile localURL : app icon]
If I have already fetched an icon earlier, there is no need to get it again:
var relatedAppsThumbnails: [URL: Image] = [:]

func updateRelatedApps() {
    guard let url = currImgUrl, // file url to get icons from related apps
          let retainedArr = LSCopyApplicationURLsForURL(url as CFURL, .all)?.takeRetainedValue(),
          let listOfRelatedApps = retainedArr as? Array<URL>
    else {
        relatedApps = []
        return
    }
    self.relatedApps = listOfRelatedApps

    // add app icon in case it wasn't added yet
    for appUrl in listOfRelatedApps {
        if relatedAppsThumbnails[appUrl] == nil {
            let nsImg = NSWorkspace.shared.icon(forFile: appUrl.path)
                .scaledCopy(sizeOfLargerSide: 17)
            relatedAppsThumbnails[appUrl] = Image(nsImage: nsImg)
        }
    }
}
LSCopyApplicationURLsForURL is deprecated. You can use this alternative:
func getListOfExternalApps(forURL url: URL) -> [(URL, Image)] {
    let listOfExternalApps = NSWorkspace.shared.urlsForApplications(toOpen: url)
    let icons = listOfExternalApps.map {
        let nsimage = NSWorkspace.shared.icon(forFile: $0.path())
        nsimage.size = CGSize(width: 16, height: 16) // the original used a custom .s16 constant
        return Image(nsImage: nsimage)
    }
    return Array(zip(listOfExternalApps, icons))
}

How to Save UILocalNotifications in CoreData

Answer is below. (A screenshot of the Core Data entity setup was attached here.)
I was searching how to do this for a couple of days and was only able to find people who stored UILocalNotifications in NSUserDefaults. Saving these in NSUserDefaults seemed wrong to me because it is supposed to be used for small flags. I just now finally figured out how to store notifications in Core Data. This is using Xcode 7.3.1 and Swift 2.2.
First off, you need to create a new entity in your Core Data model
and then add a single attribute to it. The attribute should be of type Binary Data. I named my table/entity "ManagedFiredNotifications" and my attribute "notification". It should look like this:
Image linked in Question above.
Next, you need to add an extension to UILocalNotification. It should go like this:
extension UILocalNotification {
    func save() -> Bool {
        guard let appDelegate = UIApplication.sharedApplication().delegate as? AppDelegate else {
            return false
        }
        let firedNotificationEntity = NSEntityDescription.insertNewObjectForEntityForName("ManagedFiredNotifications", inManagedObjectContext: appDelegate.managedObjectContext)
        let data = NSKeyedArchiver.archivedDataWithRootObject(self)
        firedNotificationEntity.setValue(data, forKey: "notification")
        do {
            try appDelegate.managedObjectContext.save()
            return true
        } catch {
            return false
        }
    }
}
Now, for saving a notification, all you need to do is call
save()
on the notification you would like to save. My notifications were named 'notification', so I would call notification.save()
To retrieve a notification you need a method like this:
func getLocalFiredNotifications() -> [UILocalNotification]? {
    let managedObjectContext = (UIApplication.sharedApplication().delegate as? AppDelegate)!.managedObjectContext
    let firedNotificationFetchRequest = NSFetchRequest(entityName: "ManagedFiredNotifications")
    firedNotificationFetchRequest.includesPendingChanges = false
    do {
        let fetchedFiredNotifications = try managedObjectContext.executeFetchRequest(firedNotificationFetchRequest)
        guard fetchedFiredNotifications.count > 0 else {
            return nil
        }
        var firedNotificationsToReturn = [UILocalNotification]()
        for managedFiredNotification in fetchedFiredNotifications {
            let notificationData = managedFiredNotification.valueForKey("notification") as! NSData
            let notificationToAdd = NSKeyedUnarchiver.unarchiveObjectWithData(notificationData) as! UILocalNotification
            firedNotificationsToReturn.append(notificationToAdd)
        }
        return firedNotificationsToReturn
    } catch {
        return nil
    }
}
Note that this returns an array of UILocalNotifications.
When retrieving these, if you plan on removing a few of them and then storing the list again, you should remove them when you get them. Something like this works:
func loadFiredNotifications() {
    let notifications = StudyHelper().getLocalFiredNotifications()
    if notifications != nil {
        firedNotifications = notifications!
    } else {
        // throw an error or log it
    }
    classThatRemoveMethodIsIn().removeFiredLocalNotifications()
}
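The removeFiredLocalNotifications() method called above isn't shown; a sketch under the same Xcode 7.3.1 / Swift 2.2 and Core Data assumptions could look like this:

func removeFiredLocalNotifications() {
    let managedObjectContext = (UIApplication.sharedApplication().delegate as? AppDelegate)!.managedObjectContext
    let fetchRequest = NSFetchRequest(entityName: "ManagedFiredNotifications")
    do {
        let fetchedFiredNotifications = try managedObjectContext.executeFetchRequest(fetchRequest)
        for managedFiredNotification in fetchedFiredNotifications {
            // delete each stored notification so the list can be saved again from scratch
            managedObjectContext.deleteObject(managedFiredNotification as! NSManagedObject)
        }
        try managedObjectContext.save()
    } catch {
        // log or rethrow as needed
    }
}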
I hope this helps someone who had the same problems that I did trying to implement this.

Issue receiving outcomes when sending text to wit.ai

I'm using the following to send text to wit.ai through a button press function:
@IBAction func searchButton(sender: AnyObject) {
    searchQueryText = searchTextInput.text!
    if searchQueryText != "" {
        wit.interpretString(searchQueryText, customData: nil)
    }
}

func interpretString(string: String, customData: AnyObject) {
}
This works fine, as the text is sent to wit.ai. However, I get no response from wit.ai back to the app. I can get the response fine if a microphone is used, just not text. I have tried calling the witDidGraspIntent function to force it to run on button press, but I can't work out what I should use for the 'outcomes' parameter. Can anybody help with this? I'm not sure if there is a different way to run the function after the button press. This is the function:
func witDidGraspIntent(outcomes: [AnyObject]!, messageId: String!, customData: AnyObject!, error e: NSError!) {
    if e != nil {
        print("\(e.localizedDescription)")
        return
    }
    let outcomes: NSArray = outcomes!
    let firstOutcome: NSDictionary = outcomes.objectAtIndex(0) as! NSDictionary
    if let intent = firstOutcome.objectForKey("intent") as? String {
        searchResultsIntent = intent
    }
    if searchResultsIntent == "searchIntent" {
        intentLabel.text = "\(searchResultsIntent)"
        print(outcomes[0])
    } else {
        intentLabel.text = "I'm sorry, I did not understand that."
    }
}
here is the documentation for wit.ai: https://wit.ai/docs/ios/4.0.0/api
any assistance is greatly appreciated!
cheers.
The Wit SDK gives a sharedInstance (singleton) for users to work on, so you have to initialize it like this:
Wit.sharedInstance().accessToken = "TOKEN"
Wit.sharedInstance().delegate = self
and invoke the interpretString function using the sharedInstance, i.e.
Wit.sharedInstance().interpretString(text, customData: nil)
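Putting it together, a minimal setup could look like the sketch below (assuming the SDK's delegate protocol is named WitDelegate as in the linked docs, and that searchTextInput is the text field from the question; the callback signature is the one shown above):

class SearchViewController: UIViewController, WitDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        Wit.sharedInstance().accessToken = "TOKEN"
        Wit.sharedInstance().delegate = self // without this, text results never come back
    }

    @IBAction func searchButton(sender: AnyObject) {
        let text = searchTextInput.text ?? ""
        if text != "" {
            Wit.sharedInstance().interpretString(text, customData: nil)
        }
    }

    // called by the SDK once the string has been interpreted
    func witDidGraspIntent(outcomes: [AnyObject]!, messageId: String!, customData: AnyObject!, error e: NSError!) {
        // handle outcomes as in the question above
    }
}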