Windows, C++: detect finger touch on screen?

Among the virtual-key codes in C++, VK_SHIFT means the Shift key, for example. But is there any such code to indicate that a finger is touching the touchscreen? Or how is it handled?

There are two approaches. Either use Touch Input, which gives you a fairly raw result in a TOUCHINPUT structure (the x/y points of the touch events), mostly useful for debugging or for an app that needs a unique interface; or use Touch Gestures, which will be immediately familiar to users accustomed to other touch interfaces, since you'll be handling GESTUREINFO and receiving tap, zoom, pan and rotate events instead.
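For the raw route, a minimal sketch (assuming Windows 7 or later, i.e. _WIN32_WINNT >= 0x0601, and a standard Win32 window; error handling omitted): you opt in with RegisterTouchWindow, then handle WM_TOUCH in the window procedure.

#include <windows.h>

// Call RegisterTouchWindow(hwnd, 0) once after CreateWindow so this
// window receives WM_TOUCH instead of the default WM_GESTURE messages.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_TOUCH)
    {
        UINT count = LOWORD(wParam);   // number of touch points in this message
        TOUCHINPUT inputs[10] = {};
        if (count <= 10 &&
            GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs, sizeof(TOUCHINPUT)))
        {
            for (UINT i = 0; i < count; ++i)
            {
                // TOUCHINPUT coordinates are hundredths of a pixel, screen-relative.
                LONG x = TOUCH_COORD_TO_PIXEL(inputs[i].x);
                LONG y = TOUCH_COORD_TO_PIXEL(inputs[i].y);
                if (inputs[i].dwFlags & TOUCHEVENTF_DOWN)
                {
                    // A finger just touched the screen at (x, y): this is the
                    // moment the question asks about; there is no VK_* code for it.
                }
            }
            CloseTouchInputHandle((HTOUCHINPUT)lParam);
        }
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

If you skip RegisterTouchWindow, the same window receives WM_GESTURE messages instead, which you decode with GetGestureInfo into a GESTUREINFO (GID_ZOOM, GID_PAN, GID_ROTATE, and so on).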

Related

Handle pan, zoom and click with both Mouse/Touch on Unity3D

I'm working on a 2D turn-based game and I need to handle both mouse and touch input.
The game has a hexagonal map and requires pan, zoom and click actions.
I decided to apply the delegate pattern, so each object that requires an action sends the event to its delegate, which decides whether to act on it.
In this way all inputs converge on a TurnManager that, with a state machine, handles events in accordance with the current game state.
Example:
MapCell.OnMouseUpAsButton() calls
delegate.OnCellClick(MapCell)
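For illustration only, here is the shape of that pattern sketched in C++ (the project itself is Unity/C#; the names below mirror the post but are otherwise made up):

struct MapCell;

struct MapCellDelegate {
    virtual void OnCellClick(MapCell& cell) = 0;
    virtual ~MapCellDelegate() = default;
};

struct MapCell {
    MapCellDelegate* delegate = nullptr;

    // Equivalent of MonoBehaviour's OnMouseUpAsButton: forward the
    // event; the delegate decides whether to act on it.
    void OnMouseUpAsButton() {
        if (delegate) delegate->OnCellClick(*this);
    }
};

// TurnManager plays the delegate role and filters by game state.
struct TurnManager : MapCellDelegate {
    enum class State { Idle, Panning, UIBlocked } state = State::Idle;

    void OnCellClick(MapCell&) override {
        if (state != State::Idle) return;  // ignore clicks mid-pan, etc.
        // ...handle the cell click...
    }
};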
This all works well, and this way I can control when something happens.
The problems arrived when I started to implement zoom and pan on the map.
For these two actions, I had to avoid the classic MonoBehaviour methods (OnMouseDown, OnMouseUpAsButton, ...) and use LateUpdate instead.
So,
I created a CameraHandler that in the LateUpdate uses:
HandleMouse()
HandleTouch()
and, using the delegate pattern, invokes the actions below:
OnMapWillPan()
OnMapPan()
OnMapEnd()
To avoid pans or clicks over UI elements, TurnManager filters received events with EventSystem.current.IsPointerOverGameObject().
Problem
On Mac with a mouse everything works great! :D
On a smartphone with touch I can't click on anything, and only pan works. Debugging on the device is infernal because of the lack of breakpoints or a console.
Questions
Have you ever handled these things? How?
Which approach did you use?
What do you think I'm doing wrong?
Are there best practices to avoid problems like this and handle cross-platform input correctly?
Are there any good lectures/books on this topic?
PS: if needed I can show the code

Simulating Multi Touch UIAutomation in Instruments on iOS device

Can someone point me in the right direction for how to simulate a multi touch tap in Instruments?
There are a few functions in which the number of touches is a parameter, but I don't understand how to actually define the coordinates of each touch.
For example, I need to simulate the user holding down a touch (or tap) at coordinates (x1, y1) and (x2, y2).
The application is not using standard accessible UI objects so I can only use coordinates.
I think the following should represent a held tap at specific coordinates, but it's only a single finger tap.
target.frontMostApp().mainWindow().scrollViews()[0].touchAndHold(1.7);
If you really need a two-finger touch, try performing a swipe action that doesn't move very far:
target.frontMostApp().mainWindow().tableViews()[1].cells()[2].dragInsideWithOptions({startOffset:{x:0.0, y:0.1}, endOffset:{x:0.5, y:0.1}, duration:0.25});

How to track the touch vector?

I need to calculate the direction of a touch drag, to determine whether the user is dragging up or down the screen.
Actually pretty simple, right? But:
1) Finger goes down, you get -touchesBegan:withEvent: called
2) Must wait until finger moves, and -touchesMoved:withEvent: gets called
3) Problem: At this point it's still unreliable to tell whether the user dragged up or down.
My thoughts: Check the time and accumulate the calculated vectors until it's safe to tell the direction of the touch.
Easy? No. Think about it: What if the user holds the finger down for 5 minutes on the same spot, but THEN decides to move up or down? BANG! Your code would fail, because it tried to determine the direction of the touch when the finger didn't really move.
Problem 2: When the finger goes down and stays at the same spot for a few seconds because the user is hesitating and thinking about what to do next, you'll very likely get a lot of -touchesMoved:withEvent: calls, but with very minor changes in touch location.
So my next thought: Do the accumulation in -touchesMoved:withEvent:, but only if a certain threshold of movement has been exceeded.
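For what it's worth, a minimal sketch of that thresholded approach (plain C++ with made-up names; feed it the points from your -touchesBegan/-touchesMoved handlers). It measures total displacement from the touch-down point rather than per-event deltas, which sidesteps the micro-movement problem:

#include <cmath>

// Hypothetical direction tracker: commits to a direction only once the
// finger has traveled past a threshold from where it went down.
struct DragDirectionTracker {
    float startX = 0, startY = 0;
    bool  tracking = false;
    static constexpr float kThreshold = 10.0f;  // pixels before we commit

    void begin(float x, float y) { startX = x; startY = y; tracking = true; }

    // Returns +1 for "up", -1 for "down", 0 for "not decided yet".
    int moved(float x, float y) {
        if (!tracking) return 0;
        float dy = y - startY;
        // Ignore micro-movements until the finger has really traveled.
        if (std::fabs(dy) < kThreshold) return 0;
        tracking = false;              // direction is decided exactly once
        return dy < 0 ? +1 : -1;       // screen y grows downward on iOS
    }
};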
I bet you have some better concepts in place?
If you're just talking about detecting swipes, then in OS 3.2, Apple introduced gesture recognizers to make this sort of thing easier in iPad applications. You can also write your own recognizers for other kinds of gestures (like long drags), with the OS doing most of the heavy lifting, and notifying you about the "interesting moments".
According to some reports, it looks like they'll be in 4.0 and available on the iPhone as well.

When does a touchesBegan become a touchesMoved?

When you drag a finger across the iPhone touchscreen, it generates touchesMoved events at a nice, regular 60Hz.
However, the transition from the initial touchesBegan event to the first touchesMoved is less obvious: sometimes the device waits a while.
What's it waiting for? Larger time/distance deltas? More touches to lump into the event?
Does anybody know?
Importantly, this delay does not happen with subsequent fingers, which puts the first touch at a distinct disadvantage. It's very asymmetric and bad news for apps that demand precise input, like games and musical instruments.
To see this bug/phenomenon in action
Slowly drag the iPhone screen-unlock slider to the right. Note the sudden jump, and note how it doesn't occur if you have another finger resting anywhere else on the screen.
Try "creeping" across a narrow bridge in any number of 3D games. Frustrating!
Try a dual virtual-joystick game and note that the effect is mitigated, because you're obliged never to end either touch, which amortizes the unpleasantness.
Should've logged this as a bug 8 months ago.
After a touchesBegan event fires, UIKit looks for positional movement of the finger, which translates into touchesMoved events as the finger's x/y changes, until the finger is lifted and the touchesEnded event fires.
If the finger is held down in one place it will not fire the touchesMoved event until there is movement.
I am building an app where you draw based on touchesMoved; it does arrive at intervals, but it is fast enough to give a smooth drawing appearance. Since the timing is buried in the SDK, you may have to do some testing in your own scenario to see how fast it responds; depending on other actions or events, it can vary with the situation. In my experience it fires within a few milliseconds of movement, and that is with about 2-3k other sprites on the screen.
The drawing does start on the touchesBegan event, though, so the first placement is set; then it chains to touchesMoved and ends with touchesEnded. I use all the events for the drag operation, so perhaps the initial move feels less laggy in this case.
To test in your app you could put a timestamp on each event if it is crucial to your design and work out some sort of easing.
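As a rough sketch of that measurement (generic C++ with hypothetical hook names; call these from the corresponding UIKit handlers):

#include <chrono>
#include <cstdio>

// Hypothetical instrumentation: measures the delay between touchesBegan
// and the first touchesMoved that follows it.
using Clock = std::chrono::steady_clock;
static Clock::time_point g_beganAt;
static bool g_waitingForFirstMove = false;

void onTouchBegan()
{
    g_beganAt = Clock::now();
    g_waitingForFirstMove = true;
}

void onTouchMoved()
{
    if (g_waitingForFirstMove) {
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      Clock::now() - g_beganAt).count();
        std::printf("first touchesMoved after %lld ms\n", (long long)ms);
        g_waitingForFirstMove = false;
    }
}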
http://developer.apple.com/IPhone/library/documentation/UIKit/Reference/UIResponder_Class/Reference/Reference.html#//apple_ref/occ/instm/UIResponder/touchesMoved:withEvent:
I don't think it's a bug, it's more of a missing feature.
Ordinarily, this is intended behavior to filter out accidental micro-movements that would transform a tap or long press into a slide when this was not intended by the user.
This is nothing new; it has always been there. For instance, there are a few pixels of tolerance for double clicks in pointer-based GUIs, and the same tolerance applies before a drag is started, because users sometimes inadvertently drag when they just meant to click. Try slowly moving an item on the desktop (OS X or Windows) to see it.
The missing feature is that it doesn't appear to be configurable.
An idea: Is it possible to enter a timed loop on touchesBegan that periodically checks the touch's locationInView:?
I don't represent any kind of official answer, but it makes sense that touchesBegan->touchesMoved has a longer duration than touchesMoved->touchesMoved. It would be frustrating to developers if every touchesBegan came along with a bunch of accidental touchesMoved events. Apple must have determined (experimentally) some distance at which a touch becomes a drag. Once touchesMoved has begun, there is no need to perform this test any more, because every point until the next touchesEnded is guaranteed to be a touchesMoved.
This seems to be what you are saying in your original post, Rythmic Fistman, and I just wanted to elaborate a bit more and say that I agree with your reasoning. This means if you're calculating a "drag velocity" of some sort, you are required to use distance traveled as a factor, rather than depending on the frequency of the update timer (which is better practice anyway).
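In other words, something like this hypothetical helper (generic names; position and timestamp are whatever your events give you):

#include <cmath>

// Velocity from distance traveled between events, independent of how
// often the events happen to fire.
struct DragVelocity {
    float  lastX = 0, lastY = 0;
    double lastT = 0;           // seconds
    bool   primed = false;
    float  speed = 0;           // points per second

    void sample(float x, float y, double t) {
        if (primed && t > lastT) {
            float dist = std::hypot(x - lastX, y - lastY);
            speed = dist / float(t - lastT);   // distance/time, not per-tick
        }
        lastX = x; lastY = y; lastT = t; primed = true;
    }
};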
It's waiting for the first move.
That's how the OS distinguishes a drag from a tap. Once you drag, all new notifications are touchesMoved.
This is also the reason why you should write your code to execute on the touch-up event.
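That rule reduces to a few lines; a sketch with made-up names:

// Hypothetical tap-vs-drag discrimination: commit the action on touch
// up, and only if no touchesMoved was seen in between.
struct TapFilter {
    bool dragged = false;

    void began() { dragged = false; }
    void moved() { dragged = true; }      // any move promotes tap -> drag
    void ended() {
        if (!dragged) {
            // ...it was a tap: run the tap action here...
        }
    }
};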
Currently this "delay" between touchesBegan and touchesMoved is also present when other fingers are touching the screen. Unfortunately, it seems that an option to disable it doesn't exist yet. I'm also a music app developer (and player), and I find this behavior very annoying.

Multi-touch appearing as one touch

I am getting inconsistent results using multi-touch.
I am trying to detect a 2 fingered swipe.
Sometimes I successfully detect 2 touches; other times it appears as 1 touch oscillating in position between the two fingers.
I created a test app to try to isolate this behaviour but I found that the test app behaved ok.
The app where the problem occurs has a number of sub views, and my touch detection code is on the root view. Could this be the problem? Do I need to include touch detection on every subview?
You are not guaranteed to always get both touches in the array - for example, if one finger is moving and the other is still, the still one will not appear in touchesMoved. Just assume the finger is still down in the old place until reported otherwise.
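A minimal sketch of that bookkeeping (plain C++, hypothetical names; assumes each touch exposes a stable identifier for its lifetime, as UITouch objects do):

#include <unordered_map>

// Keeps the last known position of every active finger, so a touch that
// didn't move this frame keeps its old spot.
struct Point { float x, y; };

struct TouchTracker {
    std::unordered_map<int, Point> active;   // touch id -> last position

    void began(int id, Point p) { active[id] = p; }
    void moved(int id, Point p) { active[id] = p; }  // only movers report
    void ended(int id)          { active.erase(id); }

    // Two-finger swipe detection should look at active.size(), not at
    // how many touches arrived in the current touchesMoved callback.
    bool twoFingersDown() const { return active.size() == 2; }
};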
After much poking around, I've realised that even when I get two-fingered detection working well, there is still a high likelihood that occasionally it will pick up 1 finger, if only for a short while. In the case of my application that won't work, because I am already using 1-finger movement for something else. So it's back to the drawing board.