Flash CS5 iOS: detect holding down a button? - iphone

Hey, I was wondering if there was an event or something that can check if my user is holding down a button? (for iphone, iOS)

The event class is TouchEvent. You'll need to toggle the touch state in a variable if you want to remember that someone is still pressing down after the event itself has fired.
You can use MouseEvent instead if you need/desire only a single touch point, though you still need the variable.
You need to set Multitouch.inputMode to MultitouchInputMode.TOUCH_POINT in order to turn on touch events on the iPhone. Unfortunately, you can only test touch/gesture events on the iPhone itself. (MouseEvent and TouchEvent are virtually identical in use: MOUSE_DOWN = TOUCH_BEGIN, MOUSE_MOVE = TOUCH_MOVE, MOUSE_UP = TOUCH_END. The main difference is that you can only have one "mouse", yet multiple touches.)
Personally, I use MouseEvent to test on the development machine, with an if-then that sets up TouchEvent listeners instead when Multitouch.supportsTouchEvents is true (which it is only on devices like the iPhone/Android).
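A minimal sketch of that pattern in ActionScript 3 (the button instance name myButton is an assumption):

    import flash.events.MouseEvent;
    import flash.events.TouchEvent;
    import flash.ui.Multitouch;
    import flash.ui.MultitouchInputMode;

    var isHolding:Boolean = false;

    if (Multitouch.supportsTouchEvents) {
        // on the device: use real touch events
        Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
        myButton.addEventListener(TouchEvent.TOUCH_BEGIN, function(e:TouchEvent):void { isHolding = true; });
        myButton.addEventListener(TouchEvent.TOUCH_END, function(e:TouchEvent):void { isHolding = false; });
    } else {
        // on the development machine: fall back to mouse events
        myButton.addEventListener(MouseEvent.MOUSE_DOWN, function(e:MouseEvent):void { isHolding = true; });
        myButton.addEventListener(MouseEvent.MOUSE_UP, function(e:MouseEvent):void { isHolding = false; });
    }

    // check isHolding (e.g. on ENTER_FRAME) to act while the button is held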

If the button is a Cocoa Touch UIButton, you can easily connect a number of its gesture events from Interface Builder to IBActions in Xcode (where you write the appropriate methods).
How your project would connect to Flash, I am not sure. Give us more details and perhaps someone can help you better. ;)

Related

How do you develop simple code for Gtk+ 3 touch input?

I've been searching for a little while to investigate whether or not our Gtk+ 3 application (Gtk#, actually, but bound to Gtk+ 3) will be easily convertible to receive touch events rather than mouse clicks. This is pure investigation at the moment since I don't actually have a touch screen yet (that's also the reason why I can't easily test my theories).
As far as I can tell, GNOME 3 can indeed do this, and the documentation that seems relevant is the gesture stuff, found here.
I've not been able to find any sample Gtk+ code so I'm hoping it's a simple matter of instantiating one of the gesture classes and simply setting it up to deliver events to the main event loop.
Is there anyone with experience with Gtk gestures who can guide me through a simple example? Let's say I have a DrawingArea and I want to receive a very simple, single-touch event that gives me the point within the area that was touched.
Once I have that, I should be able to build on it to handle swipes, pinches and so on.
You cannot inject pointer events as touch events: they are fundamentally different, and they interact with the gesture recognition state machine that lives inside GTK.
GTK has the ability (for debugging purposes) to emulate touch events via the pointer, though obviously it cannot emulate multi-touch events, because a pointing device only has one event sequence. If you have a recent version of GTK 3.x, you can set the GTK_TEST_TOUCHSCREEN environment variable to a non-zero value.
If you want to know how to use GtkGesture implementations in your own widgets, then I suggest you look at the GtkGesture API reference, and look at the various types of gestures available, like GtkGestureSwipe or GtkGestureZoom. You're supposed to add GtkGesture instances to your widget when constructing it, and then catch the signals on the specific gesture when they are recognised.
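As a minimal sketch in C (GTK+ >= 3.14; the window/area setup is illustrative), a GtkGestureMultiPress attached to a GtkDrawingArea reports the point that was clicked or touched:

    #include <gtk/gtk.h>

    static void
    on_pressed (GtkGestureMultiPress *gesture, gint n_press,
                gdouble x, gdouble y, gpointer user_data)
    {
      g_print ("pressed at (%.0f, %.0f)\n", x, y);
    }

    int
    main (int argc, char *argv[])
    {
      gtk_init (&argc, &argv);

      GtkWidget *window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
      GtkWidget *area = gtk_drawing_area_new ();
      gtk_container_add (GTK_CONTAINER (window), area);

      /* the drawing area receives no input events by default */
      gtk_widget_add_events (area, GDK_BUTTON_PRESS_MASK | GDK_TOUCH_MASK);

      /* the gesture is tied to the widget; keep the reference alive */
      GtkGesture *press = gtk_gesture_multi_press_new (area);
      g_signal_connect (press, "pressed", G_CALLBACK (on_pressed), NULL);

      g_signal_connect (window, "destroy", G_CALLBACK (gtk_main_quit), NULL);
      gtk_widget_show_all (window);
      gtk_main ();
      return 0;
    }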

How to receive single tap and hold Remote-Control Event with iOS?

I'm looking for my app to listen for the single tap and hold of the play/pause button on headphones for iOS devices. Currently, the behavior is to start Voice Control (on my iPhone 4).
The event list for remote control events allows for getting the double tap and hold (BeginSeekingForward) and the release (EndSeekingForward), but I'm looking for the single tap and hold which currently activates Voice Control.
Is there a way for my app to override Voice Control and listen for the single tap and hold?
You can add in a UILongPressGestureRecognizer to your UIButton and work from there.
If you want to prevent your other method from being called, you'll also need a UITapGestureRecognizer that counts taps and sets a flag to true when it receives a second tap, plus another method (check from your UIResponder touches* methods) that detects the touch release and sets the flag back to false.
Using that flag, the UILongPressGestureRecognizer handler can check whether the user double-tapped or not.
Alternatively, you could also just set the flag back to false on any touch of the button whose tap count is not two.
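A minimal sketch of the long-press part in Objective-C (assuming a view controller with a button property; the 0.5 s threshold is an arbitrary choice):

    - (void)viewDidLoad
    {
        [super viewDidLoad];
        UILongPressGestureRecognizer *hold =
            [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                          action:@selector(buttonHeld:)];
        hold.minimumPressDuration = 0.5; // seconds before a press counts as a "hold"
        [self.button addGestureRecognizer:hold];
    }

    - (void)buttonHeld:(UILongPressGestureRecognizer *)gesture
    {
        if (gesture.state == UIGestureRecognizerStateBegan) {
            // the user is holding the button down
        } else if (gesture.state == UIGestureRecognizerStateEnded) {
            // the user let go
        }
    }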
Hope this helps!
EDIT: You cannot override the headphone controls (as far as I know) without jailbreaking the device. Normally you should never require people to interact through the Apple headphones, since that would severely reduce the market and usability of your app: if someone were to forget their headphones, for example, they could not use it. It's just something to think about; you don't want to limit your app's accessibility too much.

Simulating System Wide Touch Events on iOS [duplicate]

I'm trying to simulate a touch on a UIWebView. How can I programmatically fire a touch event at a certain location (x and y coordinates)?
Just call touchesBegan?
Ideally I'd like to do it without any JavaScript hack, because in the future it may not be a UIWebView.
It's not easy to synthesize a touch event on the iPhone: you have to use undisclosed APIs, so you have a high probability of breaking with every iOS update and of being rejected by Apple.
Here's another question on Stack Overflow that demonstrates how to synthesize a touch event on the iPhone: How to send a touch event to iPhone OS?
It's worth pointing out the KIF framework here. It's intended to run in the simulator, but part of its code simulates touch events in code. With luck, this will be a good starting point.
https://github.com/square/KIF
Specifically, look at stepToTapViewWithAccessibilityLabel in KIFTestStep.m and the line
[view tapAtPoint:tappablePointInElement];
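For context, a tap driven through KIF's original scenario/step API might look roughly like this (the accessibility label is a placeholder):

    KIFTestScenario *scenario =
        [KIFTestScenario scenarioWithDescription:@"Tap a button"];
    [scenario addStep:[KIFTestStep stepToTapViewWithAccessibilityLabel:@"Button"]];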
What you need to do is first create the events you want, and then send them to SpringBoard over the "purple port", i.e. a Mach port. To make them system-wide you must forward them to each application over its port. That means you need to do what the window manager does: track which app is active, whether the screen is locked, etc.
There are a handful of private framework APIs (IOSurface, GraphicsServices, SpringBoardServices, etc.) that give you the pieces you need.
You will have to load these private frameworks at runtime using something like dlopen().
This is 100% possible without a jailbreak as of iOS 6.1.4 (current at the time of writing), but you will be loading private frameworks, which is not allowed by Apple for the App Store ;)
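A rough sketch of the dlopen() part (the framework path and symbol are illustrative of the technique, not a recipe):

    #include <dlfcn.h>

    // load a private framework at runtime; this is what gets apps rejected
    void *gs = dlopen("/System/Library/PrivateFrameworks/GraphicsServices.framework/GraphicsServices",
                      RTLD_NOW);
    if (gs) {
        // look up a private symbol by name (its signature comes from reversed headers)
        void *fn = dlsym(gs, "GSSendEvent");
    }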
It is possible, exactly as you mentioned: using GSEvents and sending them to the purple-named port of the application you are trying to control/simulate. Of course you need KennyTM's GSEvent.h to accomplish this.
I've done this for iOS 4.3, just by changing some of the values Kenny had (like kGSHandInfoTypeTouchDown), but now I'm trying to do it for iOS 5 and so far it isn't working.
EDIT: It is now working for iOS 5.1.
Without jailbreaking there is no real way to hook a gesture recognizer into all views of the entire system. For one thing, your app running in the background isn't able to execute this code.

Fix UIScrollView to pass events UP the chain rather than DOWN

UIViews that don't handle their events pass them up the chain. By default, this passes them to their parent view, and if not handled, (ultimately) to their parent UIViewController.
UIScrollView breaks this (there are lots of questions on SO, variations on the theme of "why does my app stop working once I add a UIScrollView?").
UISV decides whether the event is for itself, and if not, it passes it DOWN (into its subviews); if they don't handle the event, UISV just throws it away. That's the bug.
In that case, it's supposed to throw them back up to its own parent view, and ultimately the parent UIVC. AFAICT, this is why so many people get confused: it's not working the way views in general are documented to work (UISV itself is simply "undocumented" on this matter - it doesn't declare what it aims to do in this situation).
So ... is there an easy fix for this bug? Is there a category I could write that would fix UISV in general and save me from creating "fake" UIView subclasses that exist purely to capture events and hand them where they're supposed to go? (which makes for bug-prone code)
In particular, from Apple's docs:
If the timer fires without a significant change in position, the scroll view sends tracking events to the touched subview of the content view. If the user then drags their finger far enough before the timer elapses, the scroll view cancels any tracking in the subview and performs the scrolling itself.
...if I could override that "if the timer fires" method, and implement it correctly, I believe I could fix all my UISV instances (see the sketch after this list).
But:
- would Apple consider this "using a private API"? (their definition of "private" is nonsensical in normal programming terms, and I can't understand what they do and don't mean by it)
- does anyone know what this method is, or a good way to go about finding it? (debugging the compiled ObjC classes to find the symbol names, perhaps?)
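For what it's worth, UIScrollView does expose two public override points tied to that timer behaviour, touchesShouldBegin:withEvent:inContentView: and touchesShouldCancelInContentView:; a minimal subclass sketch (the class name is hypothetical):

    @interface PassthroughScrollView : UIScrollView
    @end

    @implementation PassthroughScrollView

    // called when the tracking timer fires without significant movement
    - (BOOL)touchesShouldBegin:(NSSet *)touches
                     withEvent:(UIEvent *)event
                 inContentView:(UIView *)view
    {
        return YES; // deliver the touches to the content view
    }

    // return NO to stop the scroll view from cancelling touches it has
    // already delivered to the content view
    - (BOOL)touchesShouldCancelInContentView:(UIView *)view
    {
        return NO;
    }

    @end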
I've found a partial answer that's correct, but not 100% usable :(.
iOS 4.0 lets you attach listeners to a given view from outside it, via the UIGestureRecognizer class. That's great, and works neatly.
Only problem is ... it won't work on any 3.x iPhones and iPod Touches.
(but if you're targeting 4.0 and above, it's an easy way forwards)
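A minimal sketch of that approach (assuming a controller with a scrollView property):

    - (void)viewDidLoad
    {
        [super viewDidLoad];
        // iOS 4.0+: attach a recognizer so taps inside the scroll view reach us
        UITapGestureRecognizer *tap =
            [[UITapGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(scrollViewTapped:)];
        [self.scrollView addGestureRecognizer:tap];
    }

    - (void)scrollViewTapped:(UITapGestureRecognizer *)gesture
    {
        // the point the scroll view would otherwise have swallowed
        CGPoint where = [gesture locationInView:gesture.view];
    }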
EDIT:
On OS 3.x, I created a custom UIView subclass that has extra properties:
NSObject *objectToDelegateToOnTouch;
id touchSourceIdentifier;
Whenever a touch comes in, the view sends the touch message directly to the objectToDelegateToOnTouch, but with the extra parameter of the touchSourceIdentifier.
This way, whenever you get a touch, you know where it came from (you can use an object, or a string, or anything you want as the "identifier").
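A rough sketch of that subclass (pre-ARC, OS 3.x era; the delegate selector handleTouches:fromSource: is hypothetical, since the post only names the two properties):

    @interface TouchForwardingView : UIView
    {
        NSObject *objectToDelegateToOnTouch;
        id touchSourceIdentifier;
    }
    @property (nonatomic, assign) NSObject *objectToDelegateToOnTouch;
    @property (nonatomic, retain) id touchSourceIdentifier;
    @end

    @implementation TouchForwardingView
    @synthesize objectToDelegateToOnTouch, touchSourceIdentifier;

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        // pass the touches on, tagged with this view's identifier, so the
        // receiver knows which view they came from
        [objectToDelegateToOnTouch performSelector:@selector(handleTouches:fromSource:)
                                        withObject:touches
                                        withObject:touchSourceIdentifier];
    }

    - (void)dealloc
    {
        [touchSourceIdentifier release];
        [super dealloc];
    }

    @end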