Simulating a multi-touch tap with UIAutomation in Instruments on an iOS device

Can someone point me in the right direction for how to simulate a multi touch tap in Instruments?
There are a few functions in which the number of touches is a parameter, but I don't understand how to actually define the coordinates of each touch.
For example, I need to simulate the user holding down a touch (or tap) at coordinates (x1, y1) and (x2, y2).
The application is not using standard accessible UI objects so I can only use coordinates.

I think the following should represent a held tap at specific coordinates, but it's only a single-finger tap:
target.frontMostApp().mainWindow().scrollViews()[0].touchAndHold(1.7);
If you really need a two-finger touch, try performing a swipe action that doesn't move very far:
target.frontMostApp().mainWindow().tableViews()[1].cells()[2].dragInsideWithOptions({startOffset:{x:0.0, y:0.1}, endOffset:{x:0.5, y:0.1}, duration:0.25});
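For reference, UIATarget also offers a few coordinate-based multi-finger calls, although none of them lets you place each finger at its own independent coordinate. A sketch, with placeholder point values:

var target = UIATarget.localTarget();
// Two-finger held tap around a single point; both touches land near
// {x:100, y:200}, but their individual positions cannot be set.
target.tapWithOptions({x:100, y:200}, {touchCount:2, tapCount:1, duration:1.5});
// Two explicit coordinates, but the fingers move apart (a pinch)
// over the duration rather than holding still.
target.pinchOpenFromToForDuration({x:50, y:300}, {x:200, y:300}, 1.5);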

Related

Windows, c++ detect finger touch on screen?

Among the virtual-key codes in C++, VK_SHIFT means the Shift key, for example. But is there any such code to indicate that a finger is touching the touchscreen? Or how is it handled?
There are two approaches. You can use Touch Input, which gives you very raw results in TOUCHINPUT structures (the x/y points of the touch events); this is mostly useful for debugging, or if your app needs a unique interface. Or you can use Touch Gestures, which will feel immediately familiar to users accustomed to other touch interfaces: you handle GESTUREINFO structures and get tap, zoom, pan, and rotate instead.

Controls in SceneKit/SpriteKit?

I need help implementing controls to move a SceneKit sphere. I want the user to touch the screen, and when they move their finger around, the ball should move relative to the position of the finger. I don't want the y-value to increase or decrease; I just want x and z, almost like a hula hoop.
I think you can use the GameController framework. Apple released the WWDC demo 'Fox', which uses GameController to control the character.

Detect hand swipe gesture in Unity using Kinect with OpenNI

I have a 3D model in my Unity project and I have a JavaScript that rotates the camera based on keyboard arrow keys (left/right).
Now, I need to have a script that detects a horizontal swipe hand gesture and returns a vector that I would use to rotate the camera.
I am using the ZigFu SDK with PrimeSense OpenNI/NITE. The ZigFu SDK comes with sample scripts, one of which is SwipeDetector. I am wondering: how does it work?
My setup:
I have 3 GameObjects: a 3D model, a MainCamera, and a Directional Light.
So, how do I use the SwipeDetector script in my project? The way I do it right now is: 1) create an empty game object "SwipeDetection", 2) drag and drop the SwipeDetector script from ZigFu onto it. I've put logs in the SwipeDetector script, but I don't see them.
The Zigfu bindings (I'm assuming you're using version 1.4?) don't have a SwipeDetector sample, but they do include a SwipeDetector MonoBehaviour. The SwipeDetector detects vertical and horizontal swipes but unfortunately doesn't detect the velocity of the swipe.
You have a few options:
Use the provided Swipe Detector, and rotate the camera by a fixed amount every time you detect a horizontal swipe (SwipeDetector_Left or SwipeDetector_Right events)
Use the provided Swipe Detector, start rotating on Swipe, and stop rotating on the SwipeDetector_Release event. This would be similar to pressing on the arrow keys (assuming you have the same behaviour on keydown/keyup events)
Keep track of the hand velocity, and check its value when the swipe occurs; use this value to rotate the camera. You can keep track of velocity by creating a new MonoBehaviour and implementing Hand_Create, Hand_Update, and Hand_Destroy (look at any of the scripts in the HandpointControls folder). Keep a queue of the hand points from the last n frames; the delta between the newest and oldest points is your velocity over those n frames (I recommend starting with 15 frames, or about half a second). See the sketch below.
(This will be included in a future Zigfu release :))
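A rough sketch of that queue idea in plain JavaScript (the Unity/ZigFu plumbing is omitted, and the point format and frame count are assumptions):

var FRAMES = 15;          // about half a second at 30 fps
var recent = [];          // hand points from the last FRAMES frames

function onHandUpdate(point) {  // call this from your Hand_Update handler
  recent.push(point);
  if (recent.length > FRAMES) recent.shift();  // drop the oldest point
}

function handVelocity() {
  if (recent.length < 2) return {x: 0, y: 0, z: 0};
  var oldest = recent[0];
  var newest = recent[recent.length - 1];
  // The delta over the window is the velocity for those frames.
  return {x: newest.x - oldest.x,
          y: newest.y - oldest.y,
          z: newest.z - oldest.z};
}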
Your game object setup sounds right. If you don't see any logs, you may not be performing the 'focus gesture' correctly. Try waving or performing a tap toward the sensor; this should cause the Hand_Create event to be called. Once you have a valid hand point, you should get the proper events from the SwipeDetector.
It's also worth mentioning that your swipe-detection game object should have a HandPointControl component (added implicitly with RequireComponent) and that 'ActiveOnStart' should be true.

How to track the touch vector?

I need to calculate the direction of dragging a touch, to determine if the user is dragging up the screen, or down the screen.
Actually pretty simple, right? But:
1) Finger goes down, you get -touchesBegan:withEvent: called
2) Must wait until finger moves, and -touchesMoved:withEvent: gets called
3) Problem: At this point it's risky to decide whether the user dragged up or down.
My thought: Check the time and accumulate the calculated vectors until it's safe to tell the direction of the touch.
Easy? No. Think about it: What if the user holds the finger down for 5 minutes on the same spot, but THEN decides to move up or down? BANG! Your code would fail, because it tried to determine the direction of the touch when the finger hadn't really moved.
Problem 2: When the finger goes down and stays at the same spot for a few seconds, because the user is a bit distracted and thinking about what to do next, you'll very likely get a lot of -touchesMoved:withEvent: calls, but with very minor changes in touch location.
So my next thought: Do the accumulation in -touchesMoved:withEvent:, but only once a certain movement threshold has been exceeded - something like the sketch below.
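A minimal sketch of that thresholded-direction idea, in plain JavaScript for illustration (the 10-point threshold is an arbitrary assumption):

var startY = null;

function onTouchBegan(y) { startY = y; }   // from -touchesBegan:withEvent:

function onTouchMoved(y) {                 // from -touchesMoved:withEvent:
  var delta = y - startY;
  if (Math.abs(delta) < 10) return null;   // ignore micro-movements
  startY = y;                              // re-anchor for the next reading
  return delta < 0 ? "up" : "down";        // screen y grows downward
}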
I bet you have some better concepts in place?
If you're just talking about detecting swipes, then in iPhone OS 3.2 Apple introduced gesture recognizers to make this sort of thing easier in iPad applications. You can also write your own recognizers for other kinds of gestures (like long drags), with the OS doing most of the heavy lifting and notifying you about the "interesting moments".
According to some reports, it looks like they'll be in 4.0 and available on the iPhone as well.

When does a touchesBegan become a touchesMoved?

When you drag a finger across the iPhone touchscreen, it generates touchesMoved events at a nice, regular 60Hz.
However, the transition from the initial touchesBegan event to the first touchesMoved is less obvious: sometimes the device waits a while.
What's it waiting for? Larger time/distance deltas? More touches to lump into the event?
Does anybody know?
Importantly, this delay does not happen with subsequent fingers, which puts the first touch at a distinct disadvantage. It's very asymmetric and bad news for apps that demand precise input, like games and musical instruments.
To see this bug/phenomenon in action:
Slowly drag the iPhone screen-unlock slider to the right. Note the sudden jump, and note how it doesn't occur if you have another finger resting anywhere else on the screen.
Try "creeping" across a narrow bridge in any number of 3D games. Frustrating!
Try a dual-virtual-joystick game, and note that the effect is mitigated because you're obliged to never end either of the touches, which amortizes the unpleasantness.
Should've logged this as a bug 8 months ago.
After a touchesBegan event is fired, UIKit looks for positional movement of the finger, which translates into touchesMoved events as the finger's x/y changes, until the finger is lifted and the touchesEnded event is fired.
If the finger is held down in one place it will not fire the touchesMoved event until there is movement.
I am building an app where you have to draw based on touchesMoved, and it does happen at intervals, but it is fast enough to give a smooth drawing appearance. Since it is an event buried in the SDK, you might have to do some testing in your scenario to see how fast it responds; depending on other actions or events, the latency can vary with the situation. In my experience it is within a few ms of movement, and that is with about 2-3k other sprites on the screen.
The drawing does start on the touchesBegan event, though, so the first placement is set; then it chains to touchesMoved and ends with touchesEnded. I use all the events for the drag operation, so maybe the initial move is less laggy perceptually in this case.
To test in your app, you could put a timestamp on each event, if it is crucial to your design, and work out some sort of easing.
http://developer.apple.com/IPhone/library/documentation/UIKit/Reference/UIResponder_Class/Reference/Reference.html#//apple_ref/occ/instm/UIResponder/touchesMoved:withEvent:
I don't think it's a bug, it's more of a missing feature.
Ordinarily, this is intended behavior to filter out accidental micro-movements that would transform a tap or long press into a slide when this was not intended by the user.
This is nothing new; it has always been there. For instance, there are a few pixels of tolerance for double-clicks in pointer-based GUIs, and even this same tolerance before a drag is started, because users sometimes inadvertently drag when they just meant to click. Try slowly moving an item on the desktop (OS X or Windows) to see it.
The missing feature is that it doesn't appear to be configurable.
An idea: Is it possible to enter a timed loop on touchesBegan that periodically checks the touch's locationInView:, roughly as sketched below?
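A minimal sketch of that polling idea, in plain JavaScript for illustration; readTouchLocation is a hypothetical stand-in for storing the UITouch in touchesBegan and querying locationInView: on it:

// Hypothetical stand-in for [touch locationInView:view].
function readTouchLocation() { return {x: 0, y: 0}; }

var last = null;
var poll = setInterval(function () {
  var loc = readTouchLocation();
  if (last !== null && (loc.x !== last.x || loc.y !== last.y)) {
    console.log("finger moved before the first touchesMoved arrived");
  }
  last = loc;
}, 16);  // roughly 60 Hz; stop with clearInterval(poll) on touch end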
I don't represent any kind of official answer, but it makes sense that touchesBegan->touchesMoved has a longer duration than touchesMoved->touchesMoved. It would be frustrating to developers if every touchesBegan came along with a bunch of accidental touchesMoved events. Apple must have determined (experimentally) some distance at which a touch becomes a drag. Once touchesMoved has begun, there is no need to perform this test any more, because every point until the next touchesEnded is guaranteed to be a touchesMoved.
This seems to be what you are saying in your original post, Rythmic Fistman, and I just wanted to elaborate a bit and say that I agree with your reasoning. It means that if you're calculating a "drag velocity" of some sort, you are required to use distance traveled as a factor, rather than depending on the frequency of the update timer (which is better practice anyway); see the sketch below.
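To make that concrete, a small sketch of a distance-based velocity calculation, in plain JavaScript for illustration (the point format, with a millisecond timestamp t, is an assumption):

// prev and curr are {x, y, t} samples from consecutive move events.
function dragVelocity(prev, curr) {
  var dx = curr.x - prev.x;
  var dy = curr.y - prev.y;
  var dist = Math.sqrt(dx * dx + dy * dy);  // distance traveled
  var dt = (curr.t - prev.t) / 1000;        // elapsed time in seconds
  return dt > 0 ? dist / dt : 0;            // points per second
}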
It's waiting for the first move.
That's how the OS distinguishes a drag from a tap. Once you drag, all new notifications are touchesMoved.
This is also the reason why you should write code to execute on touch up event.
Currently, this "delay" between touchesBegan and touchesMoved is present even when other fingers are touching the screen. Unfortunately, it seems an option to disable it doesn't exist yet. I'm also a music-app developer (and player), and I find this behavior very annoying.