Listen to keypresses across all windows in Swift

How can I listen to NSEvents while my app has the focus, but independently of which window is active?
I'm not looking to listen to system-wide events; I only want to listen to single and/or modified keystrokes while my app has the focus, but all the examples I could find listen for NSEvents from an NSWindow or NSView.
My app has multiple windows and I want to catch all keystrokes while any of those windows are focused.

This will work, but it's overkill for this problem. See Alexander's answer for the correct approach.
Override NSApplication.sendEvent. This is the central dispatching method for all events in the application. You can determine which events you care about, and either consume them, or send them along using super. This is a documented override point in NSApplication. It is not a hack.
Note that this requires subclassing NSApplication itself. It's not handled by the application delegate. For instructions on subclassing NSApplication in Swift, see Subclass NSApplication in Swift. (Read all the comments; the answer is a little messy. The key is setting NSPrincipalClass.)
For more technical details, see the Cocoa Event Handling Guide, which covers the whole architecture.
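For illustration, here's a minimal sketch of such a subclass (the Command-K check and the notification name are made up for this example; remember to point NSPrincipalClass in Info.plist at the subclass, module-qualified):

import Cocoa

// Set NSPrincipalClass in Info.plist to "MyApp.Application"
// (module-qualified name of this subclass; "MyApp" is hypothetical).
class Application: NSApplication {
    override func sendEvent(_ event: NSEvent) {
        if event.type == .keyDown {
            // Example: intercept Command-K anywhere in the app and consume it.
            if event.modifierFlags.contains(.command),
               event.charactersIgnoringModifiers == "k" {
                NotificationCenter.default.post(
                    name: Notification.Name("AppDidCatchCommandK"), object: event)
                return // consumed; not forwarded to any window
            }
        }
        // Everything else goes through normal dispatch.
        super.sendEvent(event)
    }
}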

It looks like the NSEvent.addLocalMonitorForEvents(matching:handler:) API is exactly what you're looking for.
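For example (a minimal sketch; return the event to let it continue to the key window, or return nil to consume it):

import Cocoa

// Keep the returned token so the monitor can be removed later.
var keyMonitor: Any?

keyMonitor = NSEvent.addLocalMonitorForEvents(matching: .keyDown) { event in
    print("keyDown:", event.charactersIgnoringModifiers ?? "",
          "modifiers:", event.modifierFlags)
    return event // or nil to swallow the keystroke
}

// Later, e.g. when tearing down:
// if let monitor = keyMonitor { NSEvent.removeMonitor(monitor) }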

Related

How do you develop simple code for Gtk+ 3 touch input?

I've been searching for a little while to investigate whether or not our Gtk+ 3 application (Gtk#, actually, but bound to Gtk+ 3) will be easily convertible to receive touch events rather than mouse clicks. This is pure investigation at the moment since I don't actually have a touch screen yet (that's also the reason why I can't easily test my theories).
As far as I can tell, GNOME 3 can indeed do this, and the documentation that seems relevant is the gesture stuff, found here.
I've not been able to find any sample Gtk+ code so I'm hoping it's a simple matter of instantiating one of the gesture classes and simply setting it up to deliver events to the main event loop.
Is there anyone with experience with Gtk gestures that can guide me in a simple example? Let's say I have a DrawableArea and I want to receive a very simple, single-touch event that gives me an event with the point within the area that was touched.
Once I have that, I should be able to build on it to handle swipes, pinches and so on.
You cannot inject pointer events as touch events: they are fundamentally different, and they interact with the gesture recognition state machine that lives inside GTK.
GTK has the ability (for debugging purposes) to emulate touch events via the pointer, though obviously it cannot emulate multi-touch events, because a pointing device only has one event sequence. If you have a recent version of GTK 3.x, you can set the GTK_TEST_TOUCHSCREEN environment variable to a non-zero value.
If you want to know how to use GtkGesture implementations in your own widgets, then I suggest you look at the GtkGesture API reference, and look at the various types of gestures available, like GtkGestureSwipe or GtkGestureZoom. You're supposed to add GtkGesture instances to your widget when constructing it, and then catch the signals on the specific gesture when they are recognised.

What is the correct location (MVC-wise) for iOS app event handlers?

I am writing an iOS app that registers for call events (not-in-call, dialing, disconnected etc).
I have code that registers for the call event, but I'm not sure where is the correct location to put it (in the Model? In the Controller?).
All samples place the code in the app delegate, but that seems awkward. After all, the app delegate is not part of MVC.
Thanks!
After all, the app delegate is not part of MVC.
I could argue, but I have a feeling you worry too much about the "correct design". If you want to move these event handlers, I'd put them somewhere in the controller layer (certainly not the model, because they are not data providers...).
But then again, why are they "awkward" in the app delegate? That's exactly why the singleton application object has a delegate: system-wide events should notify the app (and its delegate), not some internal part of the application; sending them to some internal part would be mixing things up.
Since these events control parts of your application, the best answer would be: in a controller.
This Apple style of putting lots of stuff in the app delegate is indeed bad coding practice.
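As an illustration of "put it in a controller", here is a hedged sketch using the pre-iOS 10 CTCallCenter API, which matches the call states the question lists (the CallStateController class name is invented for this example; on current iOS you would use CallKit's CXCallObserver instead):

import CoreTelephony

// A dedicated controller object that owns the registration,
// instead of piling it into the app delegate.
final class CallStateController {
    private let callCenter = CTCallCenter()

    init() {
        callCenter.callEventHandler = { [weak self] call in
            self?.handle(call)
        }
    }

    private func handle(_ call: CTCall) {
        switch call.callState {
        case CTCallStateDialing:      print("dialing")
        case CTCallStateConnected:    print("connected")
        case CTCallStateDisconnected: print("disconnected")
        default:                      break
        }
    }
}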

Aspect Oriented Programming in Objective-C for iPhone

Could anyone help me understand, first, when I can use AOP, and what exactly it is in iPhone programming?
I need to access the app project's source code, call some of its functions, and be notified when its views are loaded, from the outside, like a library.
I found these so far, but they look very complicated to follow. Some don't build, or the source code has been removed.
https://github.com/ndcube/AOP-for-Objective-C
https://github.com/moszi/AOP-in-Objective-C
ACAspect on cocoadev
If you have a specific view in a view controller and want to be notified when it is loaded, you can register for a KVO notification when that instance variable (the outlet) changes.
You'll want to read up on Key Value Observing in Cocoa. There are several methods you will need to learn how to use.
Do a search on "Introduction to Key-Value Observing Programming Guide" in the Xcode docs and read that section.
Make sure you balance each call to addObserver:forKeyPath:options:context: with a call to removeObserver:forKeyPath:, or your app may crash after the observing object is deallocated.
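A hedged Swift sketch of that pattern, using the methods named above ("detailView" is a hypothetical outlet name, and the observed property must be exposed to the Objective-C runtime for KVO to see it):

import UIKit

private var outletContext = 0

// The observed controller; the outlet is @objc dynamic so KVO can track it.
class DetailViewController: UIViewController {
    @objc dynamic var detailView: UIView?
}

final class OutletObserver: NSObject {
    private let observed: DetailViewController

    init(observing controller: DetailViewController) {
        observed = controller
        super.init()
        observed.addObserver(self, forKeyPath: "detailView",
                             options: [.new], context: &outletContext)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        guard context == &outletContext else {
            super.observeValue(forKeyPath: keyPath, of: object,
                               change: change, context: context)
            return
        }
        print("detailView was set to:", change?[.newKey] ?? "nil")
    }

    deinit {
        // Balance the addObserver call, as noted above.
        observed.removeObserver(self, forKeyPath: "detailView", context: &outletContext)
    }
}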

Fix UIScrollView to pass events UP the chain rather than DOWN

UIViews that don't handle their events pass them up the responder chain. By default, this passes them to their parent view, and if not handled, (ultimately) to their parent UIViewController.
UIScrollView breaks this (there are lots of questions on SO, variations on the theme of "why does my app stop working once I add a UIScrollView?").
UISV decides whether the event is for itself, and if not, it passes it DOWN (into its subviews); if they don't handle the event, UISV just throws it away. That's the bug.
In that case, it's supposed to throw them back up to its own parent view, and ultimately the parent UIVC. AFAICT, this is why so many people get confused: it's not working as documented (NB: as views are documented; UISV simply is "undocumented" on this matter - it doesn't declare what it aims to do in this situation).
So ... is there an easy fix for this bug? Is there a category I could write that would fix UISV in general and avoid me having to create "fake" UIView subclasses who exist purely to capture events and hand them where they're supposed to go? (which makes for bug-prone code)
In particular, from Apple's docs:
If the timer fires without a significant change in position, the scroll view sends tracking events to the touched subview of the content view. If the user then drags their finger far enough before the timer elapses, the scroll view cancels any tracking in the subview and performs the scrolling itself.
...if I could override that "if the timer fires" method, and implement it correctly, I believe I could fix all my UISV instances.
But:
- would Apple consider this "using a private API"? (their description of "private" is nonsensical in normal programming terms, and I can't understand what they do and don't mean by it)
- does anyone know what this method is, or a good way to go about finding it? (debugging the compiled ObjC classes to find the symbol names, perhaps?)
I've found a partial answer that's correct, but not 100% usable :(.
iPhone OS 4.0 lets you remotely add listeners to a given view, via the UIGestureRecognizer class. That's great, and works neatly.
Only problem is ... it won't work on any 3.x iPhones and iPod Touches.
(but if you're targeting 4.0 and above, it's an easy way forward)
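For example, on 4.0 and later (a Swift-era sketch of the same idea; handleTap(_:) and the outlet name are illustrative):

import UIKit

class ContainerViewController: UIViewController {
    @IBOutlet var scrollView: UIScrollView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach the recognizer to the view that should report touches;
        // the scroll view's own pan handling keeps working alongside it.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        scrollView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        let point = recognizer.location(in: scrollView)
        print("tapped at", point)
    }
}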
EDIT:
On OS 3.x, I created a custom UIView subclass that has extra properties:
NSObject *objectToDelegateToOnTouch;
id touchSourceIdentifier;
Whenever a touch comes in, the view sends the touch message directly to the objectToDelegateToOnTouch, but with the extra parameter of the touchSourceIdentifier.
This way, whenever you get a touch, you know where it came from (you can use an object, or a string, or anything you want as the "identifier").
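A rough Swift sketch of that workaround (the TouchForwardingDelegate protocol and its method name are inventions for this example; the original was plain Objective-C message forwarding):

import UIKit

// Stand-in for "send the touch message directly to the delegate,
// with the extra identifier parameter" described above.
protocol TouchForwardingDelegate: AnyObject {
    func view(_ view: UIView, receivedTouches touches: Set<UITouch>,
              with event: UIEvent?, sourceIdentifier: Any?)
}

class TouchForwardingView: UIView {
    weak var objectToDelegateToOnTouch: TouchForwardingDelegate?
    var touchSourceIdentifier: Any?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        objectToDelegateToOnTouch?.view(self, receivedTouches: touches,
                                        with: event,
                                        sourceIdentifier: touchSourceIdentifier)
        super.touchesBegan(touches, with: event)
    }
}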

Does UIApplication send a "Shake-to-Edit" notification in iPhone OS 3.0?

In iPhone OS 3.0, UIApplication allows you to set an applicationSupportsShakeToEdit flag. The documentation says "The default value is YES. Set the property to NO if you don’t want your application to display the Undo and Redo buttons when users shake the device."
This is all great, and it ties in to the new NSUndoManager class nicely. However - I don't want to use the built-in NSUndoManager in my app! I'm writing a drawing app, and I already have an undo/redo manager that does some fancy stuff (it manages the data required for each undo operation, and will page it to disk if the app is low on memory). I'd much rather just listen for a notification from the UIApplication and trigger undo myself. (I could just make a bogus NSUndoManager, but I also don't want the "Are you sure?" panel to show...)
Does anyone know if such a notification exists? I figure it must - but I can't find it documented anywhere. Is there a way to monitor all notifications going through the app, maybe?
Thanks!
You may well have solved this issue by now, but in case someone comes across this searching for a shake solution as I did: I laid out how you can get the 3.0 shake event messages easily in this thread:
How do I detect when someone shakes an iPhone?
It outlines how you can respond to shakes without using an UndoManager or presenting the Undo API. Even if you set applicationSupportsShakeToEdit to NO, these events will still be received.
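For reference, the pattern from that thread looks roughly like this in Swift (handleShake() is a placeholder for your own undo/redo trigger):

import UIKit

class CanvasViewController: UIViewController {
    // The responder must be first responder to receive motion events.
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        if motion == .motionShake {
            handleShake() // call into your own undo/redo manager
        }
    }

    private func handleShake() {
        print("shake detected")
    }
}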