I am getting inconsistent results using multi-touch.
I am trying to detect a 2 fingered swipe.
Sometimes I successfully detect two touches; other times it appears as one touch whose position oscillates between the two fingers.
I created a test app to try to isolate this behaviour but I found that the test app behaved ok.
The app where the problem occurs has a number of sub views, and my touch detection code is on the root view. Could this be the problem? Do I need to include touch detection on every subview?
You are not guaranteed to always get both touches in the array - for example if one finger were moving and the other were still, it would not appear in TouchesMoved. Just assume the finger is still down in the old place until reported otherwise.
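A minimal sketch of that bookkeeping (in Swift; the class name is made up for the example): remember each finger's last known location, and only update the entries for touches actually reported in touchesMoved.

```swift
import UIKit

class TwoFingerView: UIView {
    // Last known location of every finger currently down.
    private var lastLocations: [UITouch: CGPoint] = [:]

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // isMultipleTouchEnabled must be true on the view,
        // or only one touch is ever delivered.
        for touch in touches { lastLocations[touch] = touch.location(in: self) }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Only the fingers that actually moved are in `touches`;
        // a still finger keeps the location stored for it earlier.
        for touch in touches { lastLocations[touch] = touch.location(in: self) }
        if lastLocations.count == 2 {
            // Both finger positions are known here, even if only one moved.
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { lastLocations.removeValue(forKey: touch) }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchesEnded(touches, with: event)
    }
}
```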
After much poking around, I've realised that even when I get two-fingered detection working well, there is still a high likelihood that it will occasionally pick up one finger, if only for a short while. In the case of my application that won't work, because I am already using one-finger movement for something else. So back to the drawing board.
Hi everyone!
I'm doing a research project involving the simultaneous detection of palms placed on a multitouch screen.
I've done quite a bit of googling and found that there are a lot of libraries, both for gesture recognition (AS3, https://github.com/fljot/Gestouch for instance) and computer vision. I'm working with JSTouchController (https://github.com/sebleedelisle/JSTouchController), but it tracks only 5 fingers at a time. So if I place one palm on the screen and the library finds all five fingers, it won't track a second palm being placed at all. It works correctly from time to time, though.
So, the question is: are there any libraries to track ten fingers simultaneously with acceptable quality on modern touch screens?
The number of touch points is usually restricted by the device or OS. If you want to prototype something quickly, one possibility is to use the LEAP motion sensor. It can track ten fingers in front of it, although not via touch.
I found that sometimes the number of touchesBegan touches does not correspond to the number of reported touchesEnded/touchesCancelled touches, leaving touches that are no longer active but are never reported as ended.
This happens if the user is cluttering the screen with touches, perhaps by putting their whole palm on it or merging two or more fingers into one touch. I'm not sure why it happens, but a number of users have sent me reports of the OpenGL-based interface in my app freezing because of it. I put in a workaround that eliminated the issue, but I wonder if there is a way to accurately track all active touches?
Basically, I want to maintain an array of UITouch pointers for all fingers currently touching the display at any given time, making sure the array is always empty when the user is not touching the screen, no matter what crazy things the user did.
Anyone else noticed this? Any ideas on how to solve it?
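For reference, the kind of bookkeeping I'm after, as a rough Swift sketch: instead of pairing every touchesBegan with a touchesEnded/touchesCancelled, rebuild the set from event.allTouches on every callback (assuming allTouches reliably reflects the current touch set):

```swift
import UIKit

class TouchTrackingView: UIView {
    // All fingers believed to be on the screen right now.
    private(set) var activeTouches = Set<UITouch>()

    // Rebuild the set from the event instead of trusting that every
    // touchesBegan will be matched by a touchesEnded/touchesCancelled.
    private func reconcile(with event: UIEvent?) {
        guard let all = event?.allTouches else { return }
        activeTouches = all.filter { $0.phase != .ended && $0.phase != .cancelled }
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) { reconcile(with: event) }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) { reconcile(with: event) }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) { reconcile(with: event) }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) { reconcile(with: event) }
}
```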
I need to calculate the direction of dragging a touch, to determine if the user is dragging up the screen, or down the screen.
Actually pretty simple, right? But:
1) Finger goes down, you get -touchesBegan:withEvent: called
2) Must wait until finger moves, and -touchesMoved:withEvent: gets called
3) Problem: at this point it's unreliable to judge whether the user dragged up or down.
My thoughts: check the time and accumulate calculated movement vectors until it's safe to tell the direction of the touch.
Easy? No. Think about it: what if the user holds the finger down on the same spot for 5 minutes, but THEN decides to move up or down? BANG! Your code would fail, because it tried to determine the direction of the touch when the finger hadn't really moved.
Problem 2: when the finger goes down and stays at the same spot for a few seconds because the user hesitates and thinks about what to do next, you'll very likely get a lot of -touchesMoved:withEvent: calls, but with very minor changes in touch location.
So my next thought: Do the accumulation in -touchesMoved:withEvent:, but only if a certain threshold of movement has been exceeded.
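Sketched in Swift, that idea looks roughly like this (the 10-point threshold is a guess):

```swift
import UIKit

class DragDirectionView: UIView {
    private var anchor: CGPoint?          // where accumulation (re)started
    private let threshold: CGFloat = 10   // guessed value; tune experimentally

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        anchor = touches.first?.location(in: self)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let start = anchor,
              let point = touches.first?.location(in: self) else { return }
        let dy = point.y - start.y
        // Ignore jitter until the finger has clearly committed to a direction.
        guard abs(dy) > threshold else { return }
        print(dy < 0 ? "dragging up" : "dragging down") // UIKit's y grows downward
        anchor = point // re-anchor so a later change of direction is noticed
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        anchor = nil
    }
}
```

Because direction is measured against the anchor rather than against elapsed time, the finger can sit still indefinitely before committing to a move.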
I bet you have some better concepts in place?
If you're just talking about detecting swipes, then in OS 3.2, Apple introduced gesture recognizers to make this sort of thing easier in iPad applications. You can also write your own recognizers for other kinds of gestures (like long drags), with the OS doing most of the heavy lifting, and notifying you about the "interesting moments".
According to some reports, it looks like they'll be in 4.0 and available on the iPhone as well.
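For a two-finger swipe, for example, wiring up recognizers is only a few lines (a Swift sketch; handleSwipe(_:) is a made-up name):

```swift
import UIKit

class SwipeViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        for direction: UISwipeGestureRecognizer.Direction in [.up, .down] {
            let swipe = UISwipeGestureRecognizer(target: self,
                                                 action: #selector(handleSwipe(_:)))
            swipe.direction = direction
            swipe.numberOfTouchesRequired = 2 // a two-finger swipe
            view.addGestureRecognizer(swipe)
        }
    }

    @objc private func handleSwipe(_ recognizer: UISwipeGestureRecognizer) {
        print(recognizer.direction == .up ? "swiped up" : "swiped down")
    }
}
```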
I discovered this: when a UIView's touchesBegan: method fires and there is only one finger on the screen, the system waits up to 0.25 seconds to see if the finger moves far enough before it fires touchesMoved:. There is no programmatic way to get around this problem.
But now the interesting part: if one finger is already somewhere on the screen, this ugly behavior is turned off. Every subsequent finger is interpreted as one that wants to move, and touchesMoved: fires immediately, with no delay.
So the BIG question, guys: Is it possible to fake this first initial finger somewhere on the screen, so that the following real finger will be interpreted as a second finger? That would rescue my weekend ;)
Matt Gallagher describes how to synthesize touch events in his post here. He intends this more for user-interface testing, though, because it relies on some private instance variables that you would not want to depend on in a shipping application.
When you drag a finger across the iPhone touchscreen, it generates touchesMoved events at a nice, regular 60Hz.
However, the transition from the initial touchesBegan event to the first touchesMoved is less obvious: sometimes the device waits a while.
What's it waiting for? Larger time/distance deltas? More touches to lump into the event?
Does anybody know?
Importantly, this delay does not happen with subsequent fingers, which puts the first touch at a distinct disadvantage. It's very asymmetric and bad news for apps that demand precise input, like games and musical instruments.
To see this bug/phenomenon in action:
1) Slowly drag the iPhone screen-unlock slider to the right. Note the sudden jump, and note how it doesn't occur if you have another finger resting anywhere else on the screen.
2) Try "creeping" across a narrow bridge in any number of 3D games. Frustrating!
3) Try a dual virtual joystick game, and note that the effect is mitigated because you're obliged never to end either touch, which amortizes the unpleasantness.
Should've logged this as a bug 8 months ago.
After a touchesBegan event fires, UIKit watches for positional movement of the finger, which translates into touchesMoved events as the finger's x/y changes, until the finger is lifted and the touchesEnded event fires.
If the finger is held down in one place it will not fire the touchesMoved event until there is movement.
I am building an app where you have to draw based on touchesMoved, and although the events do arrive at intervals, they come fast enough to give a smooth drawing appearance. Since it is an event buried in the SDK, you might have to do some testing in your scenario to see how fast it responds; depending on other actions or events, the timing can vary with the situation. In my experience it fires within a few ms of movement, and that is with about 2-3k other sprites on the screen.
The drawing does start on the touchesBegan event, though, so the first placement is set; then it chains through touchesMoved and ends with touchesEnded. I use all the events for the drag operation, so perhaps the initial move feels less laggy in this case.
To test this in your app, you could put a timestamp on each event, if the timing is crucial to your design, and work out some sort of easing.
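For instance, a quick Swift sketch that measures the gap between touchesBegan and the first touchesMoved using the touches' own timestamps:

```swift
import UIKit

class LatencyProbeView: UIView {
    private var beganAt: TimeInterval = 0

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // UITouch carries its own timestamp, so no clock call is needed.
        beganAt = touches.first?.timestamp ?? 0
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard beganAt > 0, let t = touches.first?.timestamp else { return }
        print(String(format: "began -> first move: %.0f ms", (t - beganAt) * 1000))
        beganAt = 0 // measure only the first move of each touch sequence
    }
}
```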
http://developer.apple.com/IPhone/library/documentation/UIKit/Reference/UIResponder_Class/Reference/Reference.html#//apple_ref/occ/instm/UIResponder/touchesMoved:withEvent:
I don't think it's a bug, it's more of a missing feature.
Ordinarily, this is intended behavior: it filters out accidental micro-movements that would otherwise turn a tap or long press into a slide the user never meant to make.
This is nothing new; it has always been there. For instance, pointer-based GUIs allow a few pixels of tolerance for double clicks, and the same tolerance before a drag is started, because users sometimes inadvertently drag when they only meant to click. Try slowly moving an item on the desktop (OS X or Windows) to see it.
The missing feature is that it doesn't appear to be configurable.
An idea: Is it possible to enter a timed loop on touchesBegan that periodically checks the touch's locationInView:?
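Roughly like this in Swift. Note that Apple's documentation says not to retain UITouch objects, so this leans on the system updating the same object in place and should be treated as a hack; whether the polled location is fresh before the first touchesMoved: fires is exactly what you'd be testing:

```swift
import UIKit

class PollingView: UIView {
    private weak var trackedTouch: UITouch? // weak: never retain a UITouch
    private var pollTimer: Timer?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        trackedTouch = touches.first
        pollTimer = Timer.scheduledTimer(withTimeInterval: 1.0 / 60.0, repeats: true) { [weak self] _ in
            guard let self = self, let touch = self.trackedTouch else { return }
            // Poll the touch's position without waiting for touchesMoved:.
            print("polled location:", touch.location(in: self))
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        pollTimer?.invalidate()
        pollTimer = nil
        trackedTouch = nil
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchesEnded(touches, with: event)
    }
}
```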
I can't offer any kind of official answer, but it makes sense that touchesBegan -> touchesMoved has a longer duration than touchesMoved -> touchesMoved. It would be frustrating to developers if every touchesBegan came along with a bunch of accidental touchesMoved events. Apple must have determined (experimentally) some distance at which a touch becomes a drag. Once touchesMoved has begun, there is no need to perform this test any more, because every point until the next touchesEnded is guaranteed to be a touchesMoved.
This seems to be what you were saying in your original post, Rythmic Fistman; I just wanted to elaborate a bit and say that I agree with your reasoning. It means that if you're calculating a "drag velocity" of some sort, you have to use distance traveled as a factor rather than depending on the frequency of the update timer (which is better practice anyway).
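For example, a drag-velocity calculation along those lines (a Swift sketch), driven by distance and the touches' own timestamps rather than by callback frequency:

```swift
import UIKit

class VelocityView: UIView {
    private var lastPoint = CGPoint.zero
    private var lastTime: TimeInterval = 0

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        lastPoint = touch.location(in: self)
        lastTime = touch.timestamp
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let point = touch.location(in: self)
        let dt = touch.timestamp - lastTime
        guard dt > 0 else { return }
        // Distance traveled / elapsed time stays correct no matter how
        // irregularly touchesMoved is delivered.
        let speed = hypot(point.x - lastPoint.x, point.y - lastPoint.y) / CGFloat(dt)
        print("drag speed: \(speed) points/s")
        lastPoint = point
        lastTime = touch.timestamp
    }
}
```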
It's waiting for the first move.
That's how the OS distinguishes a drag from a tap. Once you drag, all new notifications are touchesMoved.
This is also the reason why you should write code that executes on the touch-up event.
Currently, this "delay" between touchesBegan and touchesMoved is present even when other fingers are touching the screen. Unfortunately, an option to disable it still doesn't seem to exist. I'm also a music app developer (and player), and I find this behavior very annoying.