Catching all inputs from an external (Bluetooth) keyboard - iPhone

Question
I would like to catch/preview all keyboard input that our application receives.
It would be enough if only the input from the external keyboard were caught.
I've come across example solutions such as registering a notification observer for UITextViewTextDidChangeNotification and UITextFieldTextDidChangeNotification, but since this requires a UITextView or UITextField to be the current first responder, it does not cover all our needs.
One example is that we use scanning to filter a list in a view that consists entirely of the list and navigation buttons.
Is there any way to register an observer that can preview all key input the application receives?
Background:
I have an iPhone application that scans barcodes to identify objects in several different views.
Currently we scan the barcodes either with the camera (using the zbar library) or with an MFi-certified barcode scanner.
In order to support more devices (for example an iPad), we are investigating other means of capturing barcodes.
One of the device classes we are looking at is Bluetooth scanners that identify themselves as HID keyboards. This would be a great addition, since it would be usable with different products and manufacturers.
In order to

Another option for iOS 7 and above is to declare a keyCommands property on the current view controller. It can trap specific key combinations. The downside is that you'll need to explicitly declare which keys you're looking for.
I answered my own question over here about getting special characters out of 2D barcodes from scanners connected in HID mode.
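A minimal sketch of the keyCommands approach described above (the selector name, and the assumption that the scanner terminates each barcode with a carriage return, are illustrative; check your scanner's configured suffix):

```objc
// Minimal sketch (iOS 7+), in a UIViewController subclass.
- (BOOL)canBecomeFirstResponder {
    return YES; // required for keyCommands to be consulted
}

- (NSArray *)keyCommands {
    // Many HID scanners send the barcode characters followed by a return.
    return @[[UIKeyCommand keyCommandWithInput:@"\r"
                                 modifierFlags:0
                                        action:@selector(handleScannerReturn:)]];
}

- (void)handleScannerReturn:(UIKeyCommand *)command {
    // The scanner finished sending a barcode; process the buffered input here.
}
```

Note that the view controller (or one of its views) must be first responder for the key commands to fire.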

It seems that usage of IOHID* functions may not be rejected by App Store reviewers, because IOKit is a white-listed framework (Will Apple reject Mac OS App which uses IOKit framework?).
So you can try the callback function from this topic: IOHIDEventSystemClientScheduleWithRunLoop with EXC_BAD_ACCESS. Hope that helps!
I used this code and it works even when your app is in the background (just set the appropriate background mode); it captures all system touch and keyboard events.
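For reference, a hedged sketch of the approach from the linked topic. These are private IOHID calls with no public header; the declarations below are assumptions reproduced from that answer and may break on any iOS update (and will not pass App Store review if detected):

```objc
#include <CoreFoundation/CoreFoundation.h>

// Private IOHID types and functions -- declarations are assumptions
// taken from the linked Stack Overflow topic, not from a public SDK header.
typedef struct __IOHIDEventSystemClient *IOHIDEventSystemClientRef;
typedef struct __IOHIDEvent *IOHIDEventRef;

extern IOHIDEventSystemClientRef IOHIDEventSystemClientCreate(CFAllocatorRef allocator);
extern void IOHIDEventSystemClientScheduleWithRunLoop(IOHIDEventSystemClientRef client,
                                                      CFRunLoopRef runLoop,
                                                      CFStringRef runLoopMode);
extern void IOHIDEventSystemClientRegisterEventCallback(IOHIDEventSystemClientRef client,
                                                        void (*callback)(void *target, void *refcon,
                                                                         void *sender, IOHIDEventRef event),
                                                        void *target, void *refcon);

static void handleHIDEvent(void *target, void *refcon, void *sender, IOHIDEventRef event) {
    // Inspect the event here (e.g. keyboard usage page, key code, down/up).
}

static void startHIDTap(void) {
    IOHIDEventSystemClientRef client = IOHIDEventSystemClientCreate(kCFAllocatorDefault);
    IOHIDEventSystemClientScheduleWithRunLoop(client, CFRunLoopGetMain(), kCFRunLoopDefaultMode);
    IOHIDEventSystemClientRegisterEventCallback(client, handleHIDEvent, NULL, NULL);
}
```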

Related

Capturing game controller (not keyboard) input in Unity when the application is in the background?

In a nutshell, my goal is to create my own program for visually looking at game controller input.
However, it appears that once the Unity application is in the background, that is, once I switch to a different window to play a game, the controller input for my Unity program is no longer read.
I've seen that the user Elringus has a project called UnityRawInput to help with this, but it only works for keyboard input. I've seen a lot of different responses to this question that mention using native libraries to "hook" into background input, but those apparently only work for keyboards as well.
What information is out there that would let me hook into game controllers themselves? I can't seem to find information pertaining to game controllers (and their various axes) rather than keyboards.
Why not a keyboard? Because I am very interested in a game controller's triggers: when pressed slightly, they can perform "weaker" actions than when pressed all the way, which a keyboard cannot discern as far as I can tell.
Though, wherever the solution might be, even outside of Unity, I'd like to know. Thanks! :D

Way to avoid chain of gestures to test iPhone App code often

Suppose an application runs fine, but is now in a phase where functionality is being added. Assume the programmer added functionality to a button that only becomes visible after performing many gestures on the iPhone
(for example: tap one of the tabs, then tap one of the table view cells visible thereafter, then a few more taps, and finally a button appears on the navigation bar to which the programmer added functionality).
So while testing the functionality of that button, the programmer has to tap the iPhone many times just to reach that specific button.
If that added functionality is critical and needs to be tested many times, merely reaching the button becomes a tedious process that can lead to some frustration.
So is there any tool available that helps the user skip this chain of taps on the iPhone?
Or is there any other way to test such an app?
You can use Instruments with UI Automation, which lets you script UI actions, log messages and take screenshots. The test scripts are written in JavaScript (search for UIAElement to find the API reference).
But the best resource to get you started is the WWDC 2010 session "Automating User Interface Testing with Instruments".
You should also read the Accessibility Programming Guide, since UI Automation builds on that.
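A small UI Automation script run under Instruments can replay the whole chain of taps for you. The accessibility labels below ("Orders", "Refresh") are placeholders; substitute the labels from your own app:

```javascript
// Instruments UI Automation script -- drives the same taps a tester
// would perform by hand to reach the button under test.
var target = UIATarget.localTarget();
var app = target.frontMostApp();

app.tabBar().buttons()["Orders"].tap();
app.mainWindow().tableViews()[0].cells()[0].tap();
app.navigationBar().buttons()["Refresh"].tap();

UIALogger.logMessage("Reached the button under test");
target.captureScreenWithName("after-tap");
```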
Don't neglect preprocessor constants either. Something like:
#if TARGET_IPHONE_SIMULATOR
// Some code to automatically skip the view controllers leading to this
#else
// Production code
#endif
Otherwise I would investigate the UIAutomation classes for automating input to an iOS application. Furthermore, you should be unit testing the code behind your buttons: a unit test that merely pushes a button will always pass, and it's rather pointless to unit test the framework's own code.
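A sketch of what testing "the code behind the button" can look like, calling the action method directly instead of tapping through the UI. The class, action, and property names here are hypothetical:

```objc
#import <SenTestingKit/SenTestingKit.h>
#import "OrderViewController.h" // hypothetical controller under test

@interface OrderViewControllerTests : SenTestCase
@end

@implementation OrderViewControllerTests

- (void)testRefreshButtonAction {
    OrderViewController *controller = [[OrderViewController alloc] init];
    [controller view]; // force -viewDidLoad so outlets are wired

    // Invoke the action the button is wired to, bypassing the UI entirely.
    [controller refreshTapped:nil];

    STAssertTrue(controller.ordersLoaded, @"refreshTapped: should load the orders");
}

@end
```

This way the critical logic can be re-run in seconds, and the chain of gestures only needs to be exercised occasionally (for example, by the UI Automation script).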

Simulating System Wide Touch Events on iOS [duplicate]

I'm trying to simulate a touch on a UIWebView; how can I programmatically fire a touch event at a certain location (x and y coordinates)?
Just call touchesBegan?
Ideally I'd like to do it without any JavaScript hack, because in the future it may not be a UIWebView.
It's not easy to synthesize a touch event on the iPhone: you have to use undocumented APIs, so you have a high probability of breaking on every iOS update and of getting rejected by Apple.
Here's another question on Stack Overflow that demonstrates how to synthesize a touch event on the iPhone: How to send a touch event to iPhone OS?
It's worth pointing out the KIF framework here. It's intended to run in the simulator, but part of the code simulates touch events in code. With luck, this will be a good starting point.
https://github.com/square/KIF
Specifically, look at stepToTapViewWithAccessibilityLabel in KIFTestStep.m and the line
[view tapAtPoint:tappablePointInElement];
What you need to do is first create the events you want, and then send them to SpringBoard over the "purple port", i.e. a Mach port. To make them system-wide you must forward them to each application over its port. That means you need to do what the window manager does and track which app is active, whether the screen is locked, etc.
There are a handful of private framework APIs (IOSurface, GraphicsServices, SpringBoardServices, etc.) that get you the pieces you need.
You will have to load these private frameworks at runtime using something like dlopen().
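A hedged sketch of that dlopen() step. The framework path and symbol name below are examples only; private symbols vary between iOS releases, and their use is not permitted on the App Store:

```objc
#include <dlfcn.h>

// Load a private framework at runtime and look up one of its symbols.
// Returns NULL if the framework or symbol is absent on this OS version.
static void *loadGSSendEvent(void) {
    void *handle = dlopen("/System/Library/PrivateFrameworks/GraphicsServices.framework/GraphicsServices",
                          RTLD_NOW);
    if (handle == NULL) {
        return NULL; // framework missing or failed to load
    }
    // dlsym returns NULL if the symbol cannot be found.
    return dlsym(handle, "GSSendEvent");
}
```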
This is 100% possible without jailbreak as of iOS 6.1.4 (current at the time of writing), but you will be loading private frameworks, which is not allowed by Apple for the App Store. ;)
It is possible, exactly as you mentioned: using GSEvents and sending them to the purple-named port of the application you are trying to control/simulate. Of course you need KennyTM's GSEvent.h to accomplish this.
I've done this for iOS 4.3 just by changing some of the values that Kenny had (like kGSHandInfoTypeTouchDown), but now I'm trying to do it for iOS 5 and so far it's not working.
EDIT: It is now working for iOS 5.1.
Without jailbreaking there is no real way to hook a gesture recognizer into all views of the entire system. First of all, your app running in the background doesn't have the ability to execute this code.

Playing iPhone movies through TV out

Is there any way to emulate the Videos app such that we still maintain controls on the device (iPad/iPhone), but send the video out through the cables to the TV? I looked into screen mirroring, but it's way too slow for video, and regardless, the UIGetScreenImage() used by screen mirroring is no longer allowed by Apple.
The Videos app seems to have exactly what I need, but I don't see anything simple to make that happen.
Update (10/15/10): So apparently movies played through UIWebView have TV-Out support, while MPMoviePlayerController movies don't.
http://rebelalfons.posterous.com/iphone-os-support-for-tv-out
However, there is a caveat: this does not work on older devices updated to the most recent iOS. That is, iPod touches and the iPhone 3G & 3GS don't work, while the iPhone 4 and iPads do. I'm hoping there's something more we can use to fill in the gaps in compatibility, since I know it's possible: apps like AirVideo and StreamToMe currently support this functionality.
I am not sure if there is an official way to do this, but as for your examples: AirVideo uses method swizzling (check out: http://www.cocoadev.com/index.pl?MethodSwizzling) to override checks in the movie player and act like the Videos app.
I guess with some reverse engineering of the SDK, you can find where the checks are made and swizzle that method with your custom one.
In 3.2 and later (postdating the site you link to), UIScreen has a class method, screens, that returns an array of one object (the main screen) if no external display is available, or two objects (the main screen and the external screen) if a TV lead is connected. The task should be as simple as positioning the views you want to appear on the external screen within its frame, and the controls you want on the device within the other.
Have you tried that?
Edited to add one additional comment: also as of 3.2, it is explicitly permissible to create an MPMoviePlayerController and then grab the view from it to treat as a normal UIView, rather than doing a full 'present'. So that's how you'd get a movie view that you can position as you wish.
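Putting the two points together, a sketch (assuming a TV lead is connected and `movieURL` is an NSURL pointing at your asset):

```objc
// Movie on the external screen, controls stay on the device's main screen.
if ([[UIScreen screens] count] > 1) {
    UIScreen *externalScreen = [[UIScreen screens] objectAtIndex:1];

    UIWindow *externalWindow = [[UIWindow alloc] initWithFrame:[externalScreen bounds]];
    externalWindow.screen = externalScreen;

    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    player.view.frame = externalWindow.bounds;
    [externalWindow addSubview:player.view]; // the grabbed view, no full 'present'

    externalWindow.hidden = NO;
    [player play];
}
```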