Launch my app by hotkeys and pass it arguments from Finder (macOS) - swift

I'm just starting out with macOS programming. I've done some tutorials and I'm reading the docs at developer.apple.com. I'm trying to implement a simple(?) thing, but can't seem to see the whole picture yet.
I want to be able to launch my app by pressing a hotkey combination. The app itself is just a window with a text field listing the files currently selected in Finder (if any).
Naturally, I'm not asking for a concrete implementation, but some hints and directions on the general structure, or on which concepts and classes to look into, would be very helpful.
macOS 10.13.4, Xcode 9.3.1, Swift 4

Probably the best approach is to implement a "service". See the Services Implementation Guide.
A service is a tool that appears in the Services submenu of the application menu and in contextual menus. It can be configured to be invoked by a hot key. The active application at the time the service is invoked cooperates by providing the current selection.
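For the hotkey part, a service is declared under the NSServices key in the app's Info.plist. A sketch follows; the menu title, message name, and key equivalent are all example values, not anything mandated by the system:

```xml
<!-- Info.plist fragment (sketch; names are examples) -->
<key>NSServices</key>
<array>
  <dict>
    <!-- Title shown in the Services submenu -->
    <key>NSMenuItem</key>
    <dict>
      <key>default</key>
      <string>Show Selected Files in MyApp</string>
    </dict>
    <!-- Selector prefix: maps to showSelectedFiles(_:userData:error:) -->
    <key>NSMessage</key>
    <string>showSelectedFiles</string>
    <key>NSPortName</key>
    <string>MyApp</string>
    <!-- The service receives the active app's selection as file names -->
    <key>NSSendTypes</key>
    <array>
      <string>NSFilenamesPboardType</string>
    </array>
    <!-- Default service hotkeys are Cmd+Shift+<key> -->
    <key>NSKeyEquivalent</key>
    <dict>
      <key>default</key>
      <string>F</string>
    </dict>
  </dict>
</array>
```

The NSMessage value names the method that the system invokes on your registered service provider object, with the selection delivered on a pasteboard; the Services Implementation Guide covers the provider-side registration.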

Related

On macOS Monterey, cannot create shortcut actions with Catalyst

We are trying to create shortcut actions with Catalyst.
Our app is already available on Mac, and we previously integrated the intents framework on iOS. So according to the WWDC21 "Meet Shortcuts on macOS" presentation, "it's likely that [we] have compiled out [our] Intents integration in the process of coming to Mac". So, it's no surprise that we cannot create shortcut actions for Mac in our app with Catalyst.
The WWDC presentation suggests to "make sure to audit your code to re-enable this functionality when running on macOS Monterey." We do not understand what we need to do based on this suggestion.
What we have tried so far:
we managed to create shortcut actions for Mac with Catalyst in the app available at https://github.com/mralexhay/ShortcutsExample, so the problem does come from our app;
we managed to create shortcut actions for iOS in our app;
we tried to create a fresh Intents extension in our app, but the shortcut actions are still available only on iOS, not on Mac.
Has anyone found a solution in a similar situation?
When creating a shortcut action, Shortcuts can get mixed up between app identifiers. You therefore need to delete all the compiled versions of your app.
I'm having a similar problem with the "Meet Shortcuts on macOS" example. I haven't done anything with Shortcuts before, but I have with AppleScript. I've managed to sort out a couple of problems due to beta changes, but I end up with this method:
let task = createTask(name: title, due: dueDate)
which doesn't exist. Worse still, it's supposed to return a Task to assign to the CreateTaskIntentResponse.task property, but Task is already defined, so I can't really redefine it; besides, it seems like it should be a generated type based on all the intent info I supplied.

Keyboard Filter Driver. Scan Code -> VK_??? (OEM Specific)

Preface (imaginary, so no one asks 'What are you trying to do?'):
I have a Win32 C++ application.
This application wants to know when the user wants to open the Start menu via Ctrl+Esc.
Of course, Ctrl+Esc is handled by the operating system, so the application never sees it.
I have looked at Windows Virtual Keys.
I see that there are plenty of OEM specific VK's
(0x92-0x96,0xE0,0xE9-0xF5,..)
So my thought was:
Keyboard Filter Driver.
When my application has the focus it tells the Keyboard Filter Driver.
When my driver sees that Ctrl is down and an Esc down occurs (and my application has focus):
-- Swallow the Esc and replace it with a scan code that will produce, say, VK_0x92 (OEM specific).
Since I have swallowed the Esc, the operating system will never see Ctrl+Esc.
My application will then see the VK_0x92, know the user wants to open the Start menu, and perform some action.
My question is: how do I 'muck' with the input within my driver (KEYBOARD_INPUT_DATA) in order for, say, a VK_0x92 to appear within my application?
Thanks in advance for any pointers.
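To illustrate the substitution step the question describes, here is a user-mode C sketch. The struct mirrors the field layout of KEYBOARD_INPUT_DATA from the DDK's ntddkbd.h, and the OEM scan code 0x6F is a made-up placeholder: which virtual key a scan code actually produces depends on the active keyboard layout, which is exactly what the answer below turns on. In a real filter driver this rewrite would happen in the read-completion callback, not in a plain function like this.

```c
#include <stdint.h>

/* Mirror of the DDK's KEYBOARD_INPUT_DATA (ntddkbd.h), for illustration. */
typedef struct {
    uint16_t UnitId;
    uint16_t MakeCode;        /* hardware scan code */
    uint16_t Flags;           /* KEY_MAKE / KEY_BREAK etc. */
    uint16_t Reserved;
    uint32_t ExtraInformation;
} KEYBOARD_INPUT_DATA;

#define SCANCODE_ESC 0x01
/* Placeholder: a scan code the layout would map to an OEM-specific VK. */
#define SCANCODE_OEM 0x6F

/* If Ctrl is held, our app has focus, and the packet carries Esc,
   substitute the OEM scan code so the OS never assembles Ctrl+Esc.
   Returns 1 if the packet was rewritten, 0 if passed through. */
static int RewriteEsc(KEYBOARD_INPUT_DATA *pkt, int ctrlDown, int appHasFocus)
{
    if (ctrlDown && appHasFocus && pkt->MakeCode == SCANCODE_ESC) {
        pkt->MakeCode = SCANCODE_OEM;
        return 1;
    }
    return 0;
}
```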
It is all about the keyboard layout.
What I needed to do was not supported by the Microsoft Keyboard Layout Creator (MSKLC).
See: Keyboard Layout Samples.
I found the samples to be very old and hard to read through. Clearly the US and German keyboard samples are not the most recent.
I wrote a program that creates Visual Studio projects for keyboard layouts by pointing it at a specific layout (e.g., KBDUS.dll). I generate the source code, the .vcxproj, etc. I then make my modifications and build it.
Installing the layout is another can of worms entirely. I have asked in several places for Microsoft to release the source code for the CustomAction DLL that is contained within the MSKLC-generated .MSI, to no avail.

Need clarification on vs code debug

Can anyone explain these three debug icons in VS Code that I have found on the internet?
My VS Code has the one with the play icon.
All the online demos on debugging have the one in the middle. How do I get that?
Also, Node.js debugging is installed, but I think it shows as disabled, with no option I can find to enable it.
To answer your question directly [TL;DR]: you already have it if you are using the latest version of VS Code. It will take you to the same view as the one on the right.
If you look at the codicon library (ref), the middle one you pointed out is not present.
Visual Studio Code made changes in February 2020 (ref) that make running and debugging something more harmonious:
User studies revealed that new users have difficulties finding how to run their programs in VS Code. One reason is that the existing "Debugging" functionality is not something that they relate to "Running" a program. For that reason, we are making "Run" more prominent in the UI.
The main menu Debug has become the Run menu.
The Run and Debug view has become the Run view and the corresponding Activity Bar icon now shows a large "Play" icon with a small "bug" decoration.
So in other words, there is no difference. The 'Run' and 'Debug' views are synonymous, and the icon reflects those changes. As they noted, the Debug view is now called the 'Run' view, but it still offers debugging and breakpoints.
However, there are two possibilities you may be running into:
The tutorials and guides you are using are outdated (showing an old version of VS Code).
The tutorial or guide is using an extension that offers debugging capabilities. Extensions have some control over the icon you see.
If the extension is for single-file debugging: according to the June 2020 release notes (ref), VS Code recommends the following:
For debug extensions that want to improve the single file debug experience by adding a "Run" and/or "Debug" button to the editor, we recommend following these guidelines for a consistent look and feel:
Contribute Run and/or Debug commands in the package.json (see Mock Debug):
Use the command titles "Run File"/"Debug File" or "Run Python File"/"Debug Python File".
Use the $(play) icon for Run and $(debug-alt-small) for Debug.
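Following those guidelines, the contribution in an extension's package.json might look like the sketch below. The command IDs and the `when` clauses are hypothetical; see Mock Debug for a complete, real example:

```json
{
  "contributes": {
    "commands": [
      { "command": "myext.runFile",   "title": "Run Python File",   "icon": "$(play)" },
      { "command": "myext.debugFile", "title": "Debug Python File", "icon": "$(debug-alt-small)" }
    ],
    "menus": {
      "editor/title/run": [
        { "command": "myext.runFile",   "when": "resourceLangId == python" },
        { "command": "myext.debugFile", "when": "resourceLangId == python" }
      ]
    }
  }
}
```

The `editor/title/run` menu contribution point is what places those buttons in the editor title area next to the built-in run controls.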
Where their codicon library was updated in June to reflect the following:
As you can see, none of them are prefixed with verbiage like 'run', but they all represent the same functionality.
Additionally, you may see this icon as well:
This represents the panel (view) where the output of your debug will go.

Controlling VS code with a launchpad

I really enjoy using VS Code, but there are so many shortcuts to remember, and every new plugin comes with a new set.
Of course, I can use the command palette to quickly execute a command, but I would like something even faster, such as assigning a shortcut to any of the keys of a device like a Novation Launchpad MIDI controller.
Stack Overflow is maybe not the best place to ask this question, but I didn't know where else to post it. So, has anyone tried something like this? I have seen this video (https://www.youtube.com/watch?v=LOyNUGS4RC8) linking such a device with Visual Studio, so perhaps someone has already created software dedicated to VS Code.
I've set this up with MIDI Loupe and midiStroke.
MIDI Loupe listens to your device and logs the channel/key/value you just hit on your MIDI controller; with this tool you inspect your device's output.
Then, in midiStroke, you map controls to shortcuts.
Note: I've found that in my midiStroke setup for an M-Audio Axiom 49, keys don't require values (only the key number), but controls (e.g. record start) do require them. Also, for me, letter keys didn't work if uppercased (e.g. for GarageBand recording start I need simply an R key hit, and R should be r in this case).
Detailed tutorial here
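Since midiStroke can only emit keystrokes, the VS Code end of such a setup is an ordinary keybinding: bind an otherwise-unused chord in keybindings.json to whatever palette command you want the pad to trigger. A sketch, where the chord and the task name are placeholders:

```json
// keybindings.json (VS Code user keybindings)
[
  {
    "key": "ctrl+alt+f9",
    "command": "workbench.action.tasks.runTask",
    "args": "build"
  }
]
```

Pointing midiStroke at a chord like this, rather than a plain letter, avoids colliding with typing and with the uppercase/lowercase quirk noted above.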

Assign command to the central soft button within javaMe

I have a mobile Java ME application that has been working on Nokia phones. However, I'm now porting it to the Samsung 5611 and have run into a problem: no command is assigned to the central soft button; all of them are contained in the right-button menu. When the same MIDlet was launched on a Nokia 3110c, one command was placed on the central button and the others (if there were two or more) were grouped into the options menu.
I tried Item.setDefaultCommand (no effect) and Display.getInstance().setThirdSoftButton(true) (that method is not supported in SDK 3.4). I also tried changing the type of one command to OK or SCREEN and changing the priorities, all without success.
Thanks in advance. Any idea will be helpful.
Sadly, there's no way for the developer to decide exactly which soft buttons the commands are placed on. The individual device decides. Some devices have two soft buttons, and some have three.
You can fiddle a bit with priorities, but you still can't force commands onto specific soft buttons.
That's the high-level GUI (Form) for you.
If you want control over such things, you need to go with the low-level GUI (Canvas / GameCanvas). Nowadays there are several APIs you can use to create Form-like low-level GUIs. Check out LWUIT, for example, which I imagine makes it easy to port your high-level code to a low-level implementation.
But even when using low-level coding, you have to be aware that different devices have different keycodes for the soft buttons.