When does a touchesBegan become a touchesMoved? - iPhone

When you drag a finger across the iPhone touchscreen, it generates touchesMoved events at a nice, regular 60Hz.
However, the transition from the initial touchesBegan event to the first touchesMoved is less obvious: sometimes the device waits a while.
What's it waiting for? Larger time/distance deltas? More touches to lump into the event?
Does anybody know?
Importantly, this delay does not happen with subsequent fingers, which puts the first touch at a distinct disadvantage. It's very asymmetric and bad news for apps that demand precise input, like games and musical instruments.
To see this bug/phenomenon in action:
Slowly drag the iPhone screen-unlock slider to the right. Note the sudden jump, and note that it doesn't occur if you have another finger resting anywhere else on the screen.
Try "creeping" across a narrow bridge in any number of 3D games. Frustrating!
Try a dual virtual-joystick game, and note that the effect is mitigated because you're obliged never to end either touch, which amortizes the unpleasantness.
Should've logged this as a bug 8 months ago.

After a touchesBegan event fires, UIKit watches for positional movement of the finger, which translates into touchesMoved events as the touch's x/y changes, until the finger is lifted and the touchesEnded event fires.
If the finger is held down in one place, no touchesMoved event fires until there is movement.
I am building an app where you have to draw based on touchesMoved, and the events do arrive at intervals, but fast enough to give a smooth drawing appearance. Since the dispatch logic is buried in the SDK, you might have to do some testing in your scenario to see how fast it responds; depending on other actions or events, the latency can vary with the situation. In my experience the moves arrive within a few ms of movement, and that's with about 2-3k other sprites on the screen.
The drawing does start on the touchesBegan event, though, so the first placement is set; it then chains through touchesMoved and ends with touchesEnded. I use all three events for the drag operation, so maybe the initial move is less laggy perceptually in this case.
To test in your app, you could put a timestamp on each event if timing is crucial to your design, and work out some sort of easing.
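A rough sketch of that timestamping approach (the class name and logging are illustrative, not from the answer):

```objc
// Minimal UIView subclass that timestamps each touch phase so you can
// measure the began -> first-moved gap yourself. Illustrative sketch.
#import <UIKit/UIKit.h>

@interface TouchTimingView : UIView
@end

@implementation TouchTimingView {
    NSTimeInterval _beganTime;
    BOOL _sawFirstMove;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _beganTime = event.timestamp;   // seconds since system startup
    _sawFirstMove = NO;
    NSLog(@"began at %.4f", _beganTime);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!_sawFirstMove) {
        _sawFirstMove = YES;
        NSLog(@"first move after %.4f s", event.timestamp - _beganTime);
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"ended after %.4f s", event.timestamp - _beganTime);
}
@end
```

Using UIEvent's own timestamp rather than reading the clock in the handler avoids counting any delivery jitter added by the run loop.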
http://developer.apple.com/IPhone/library/documentation/UIKit/Reference/UIResponder_Class/Reference/Reference.html#//apple_ref/occ/instm/UIResponder/touchesMoved:withEvent:

I don't think it's a bug; it's more of a missing feature.
This is intended behavior, meant to filter out accidental micro-movements that would turn a tap or long press into a slide when the user didn't intend one.
This is nothing new; it has always been there. For instance, pointer-based GUIs allow a few pixels of tolerance for double-clicks, and the same tolerance applies before a drag starts, because users sometimes inadvertently drag when they just meant to click. Try slowly moving an item on the desktop (OS X or Windows) to see it.
The missing feature is that the tolerance doesn't appear to be configurable.
An idea: Is it possible to enter a timed loop on touchesBegan that periodically checks the touch's locationInView:?
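A minimal sketch of that polling idea, assuming a UIView subclass. Whether the UITouch's locationInView: actually updates before the delayed touchesMoved: is delivered is exactly what would need testing:

```objc
// Sketch: poll the touch's position on a timer instead of waiting for
// touchesMoved:. Apple's docs warn against retaining a UITouch, hence
// the unretained reference. Untested assumption: the reported location
// may not change until the system decides the touch is a drag.
#import <UIKit/UIKit.h>

@interface PollingView : UIView
@end

@implementation PollingView {
    __unsafe_unretained UITouch *_activeTouch; // valid only while the finger is down
    NSTimer *_pollTimer;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _activeTouch = [touches anyObject];
    _pollTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 60.0
                                                  target:self
                                                selector:@selector(poll:)
                                                userInfo:nil
                                                 repeats:YES];
}

- (void)poll:(NSTimer *)timer {
    CGPoint p = [_activeTouch locationInView:self];
    // React to p here without waiting for touchesMoved:.
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [_pollTimer invalidate];
    _pollTimer = nil;
    _activeTouch = nil;
}
@end
```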

I don't represent any kind of official answer, but it makes sense that the touchesBegan→touchesMoved gap is longer than the touchesMoved→touchesMoved gap. It would be frustrating to developers if every touchesBegan came along with a bunch of accidental touchesMoved events. Apple must have determined (experimentally) some distance at which a touch becomes a drag. Once touchesMoved has begun, there is no need to perform this test any more, because every point until the next touchesEnded is guaranteed to be a touchesMoved.
This seems to be what you are saying in your original post, Rythmic Fistman, and I just wanted to elaborate a bit and say that I agree with your reasoning. It means that if you're calculating a "drag velocity" of some sort, you must use distance traveled as a factor, rather than depending on the frequency of the update timer (which is better practice anyway).
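A sketch of that distance-based velocity calculation, using the event timestamps rather than an assumed 60Hz rate (the `_lastTimestamp` ivar is an assumption, declared elsewhere in the class):

```objc
// Compute drag velocity from distance travelled between events, using
// UIEvent timestamps instead of a fixed update frequency. Sketch only;
// assumes an NSTimeInterval ivar: _lastTimestamp.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _lastTimestamp = event.timestamp;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint now  = [touch locationInView:self];
    CGPoint prev = [touch previousLocationInView:self];

    NSTimeInterval dt = event.timestamp - _lastTimestamp;
    if (dt > 0) {
        CGFloat dx = now.x - prev.x;
        CGFloat dy = now.y - prev.y;
        CGFloat speed = sqrt(dx * dx + dy * dy) / dt; // points per second
        // Use `speed` for your drag physics; it is independent of how
        // long the system sat on the first touchesMoved.
    }
    _lastTimestamp = event.timestamp;
}
```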

It's waiting for the first move.
That's how the OS distinguishes a drag from a tap. Once you drag, all subsequent notifications are touchesMoved.
This is also the reason you should write code that executes on the touch-up event.

Currently, this "delay" between touchesBegan and touchesMoved is also present when other fingers are touching the screen. Unfortunately, it seems an option to disable it doesn't exist yet. I'm also a music-app developer (and player), and I find this behavior very annoying.

Related

how to achieve same flinging results on android & iphone

I ported a game from iPhone to Android, all OpenGL-based, with exactly the same calculations for scrolling. I noticed that on the iPhone I can scroll faster through the game, and the starting speed as I lift my finger matches how fast my finger was moving.
On the Android device, unfortunately, this was not the same. As I lift my finger while scrolling, the starting scroll speed feels slower. Scrolling on the iPhone feels more accurate.
Is there anything special about how Android handles touches that differs from the iPhone, and how can I take advantage of it to achieve a similar feel? On Android, across all applications, the fling speed when I lift my finger doesn't match how fast my finger was moving.
I found it. Android delivers an ACTION_MOVE and then an ACTION_UP at the same (or nearly the same) location. The iPhone doesn't do this at all; it just delivers touchesEnded. So if there is motion, on Android I will always have at least 3 points, but on the iPhone it's 2 points (touch down and touch up, at different locations).
Also, the ACTION_MOVE and ACTION_UP don't arrive immediately, and the gap between them is a significant time interval when calculating the average speed of the sampled points.
Solution: On ACTION_UP, if there was a final ACTION_MOVE, shift all the stored timestamps so that the ACTION_MOVE "happens" when the ACTION_UP occurred. Don't include the final touch-up point in the speed calculation. Then calculate speed as usual.
Summary
On Android, if you have moved your finger, you get an ACTION_MOVE before the ACTION_UP, and the ACTION_UP is at roughly the same location as the ACTION_MOVE, making it seem as if the speed at the end is roughly 0. The iPhone does not do this: it does not deliver a touchesMoved just before the final touchesEnded (i.e., touch down, move finger, lift; if you do it fast enough on the iPhone you won't get an intermediate touchesMoved, whereas on Android you do).
Android / iPhone equivalents
ACTION_UP = touchesEnded
ACTION_DOWN = touchesBegan
ACTION_MOVE = touchesMoved
Also, I noticed Android has some point-history functions. I did not use those; I stored the touches and their timestamps in my own array.
Are you taking into account the dpi or resolution of the screen, so that your flings are measured in dots per second rather than pixels per second? That could easily affect things.
In addition, yes, Android's touch processing seems to be a bit slower. You might be able to add a 'fudge factor' of some amount to get a closer response (maybe +10% or so). That said, the 'flinging' speed of Android apps is a combination of the touchscreen and the framework's particular math for determining fling speed and rate of decay, so in apps other than yours you could simply be seeing very different math/approaches between the two platforms.

Accurately tracking all active touches?

I found that sometimes the number of touches reported in touchesBegan does not correspond to the number reported in touchesEnded/touchesCancelled, leaving touches that are no longer active but are never reported as dead.
This happens when the user clutters the screen with touches, perhaps by putting a whole palm on it or letting two or more fingers merge into one touch. I'm not sure why it happens, but a number of users have sent me reports of the OpenGL-based interface in my app freezing because of it. I put in a workaround that eliminated the issue, but I wonder if there is a way to accurately track all active touches.
Basically, I want to maintain an array of UITouch pointers referencing all fingers currently touching the display at any given time, and make sure the array is empty whenever the user is not touching the screen, no matter what crazy things the user did.
Anyone else noticed this? Any ideas on how to solve it?
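One way to sketch that bookkeeping: Apple's docs say never to retain a UITouch, so a non-retaining CFSet can hold the bare pointers. The view needs multipleTouchEnabled set to YES, and handling touchesCancelled: is essential, since cluttered-screen situations often end in a cancellation rather than a normal touch-up.

```objc
// Track every active touch without retaining any of them. Sketch only.
#import <UIKit/UIKit.h>

@interface TouchTrackingView : UIView
@end

@implementation TouchTrackingView {
    CFMutableSetRef _activeTouches; // NULL callbacks: stores unretained pointers
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!_activeTouches) _activeTouches = CFSetCreateMutable(NULL, 0, NULL);
    for (UITouch *t in touches) CFSetAddValue(_activeTouches, (__bridge const void *)t);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *t in touches) CFSetRemoveValue(_activeTouches, (__bridge const void *)t);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // Cancellation is how the system reports palm/clutter situations;
    // skipping this handler is a common cause of "stuck" touches.
    for (UITouch *t in touches) CFSetRemoveValue(_activeTouches, (__bridge const void *)t);
}
@end
```

As a belt-and-braces cross-check, [event allTouches] reports every touch the current event knows about, which can be compared against the set to detect entries that should have been removed.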

How to track the touch vector?

I need to calculate the direction of dragging a touch, to determine if the user is dragging up the screen, or down the screen.
Actually pretty simple, right? But:
1) Finger goes down, you get -touchesBegan:withEvent: called
2) Must wait until finger moves, and -touchesMoved:withEvent: gets called
3) Problem: At this point it's risky to decide whether the user dragged up or down.
My thoughts: Check the time and accumulate the calculated vectors until it's safe to tell the direction of the touch.
Easy? No. Think about it: What if the user holds the finger down on the same spot for 5 minutes, but THEN decides to move up or down? BANG! Your code would fail, because it tried to determine the direction of the touch while the finger wasn't really moving.
Problem 2: When the finger goes down and stays on the same spot for a few seconds because the user is undecided and is thinking about what to do next, you'll very likely get a lot of -touchesMoved:withEvent: calls, but with only minor changes in touch location.
So my next thought: Do the accumulation in -touchesMoved:withEvent:, but only once a certain movement threshold has been exceeded.
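A sketch of that threshold idea. The 10-point dead zone is a guess to tune, and the two ivars are assumed declared in the class. Because only deltas between successive events are accumulated, a finger that rests for five minutes and then moves still classifies correctly, and jittery near-stationary touchesMoved calls stay below the threshold.

```objc
// Accumulate vertical movement and only commit to a direction once it
// exceeds a dead zone. Assumes ivars: CGFloat _accumulatedY; BOOL _directionDecided;
static const CGFloat kDirectionThreshold = 10.0; // points; tune to taste

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _accumulatedY = 0;
    _directionDecided = NO;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint now  = [touch locationInView:self];
    CGPoint prev = [touch previousLocationInView:self];
    _accumulatedY += now.y - prev.y;

    if (!_directionDecided && fabs(_accumulatedY) >= kDirectionThreshold) {
        _directionDecided = YES;
        BOOL draggingDown = _accumulatedY > 0; // y grows downward in UIKit
        // Act on draggingDown...
    }
}
```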
I bet you have some better concepts in place?
If you're just talking about detecting swipes, then in OS 3.2, Apple introduced gesture recognizers to make this sort of thing easier in iPad applications. You can also write your own recognizers for other kinds of gestures (like long drags), with the OS doing most of the heavy lifting, and notifying you about the "interesting moments".
According to some reports, it looks like gesture recognizers will be in 4.0 and available on the iPhone as well.
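For the swipe case specifically, wiring up recognizers looks roughly like this (the selector names are illustrative), e.g. in a view controller's viewDidLoad:

```objc
// Let the OS apply its own movement thresholds and report only the
// "interesting moment". Requires iOS 3.2+. Selector names are made up.
UISwipeGestureRecognizer *swipeUp =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(didSwipeUp:)];
swipeUp.direction = UISwipeGestureRecognizerDirectionUp;
[self.view addGestureRecognizer:swipeUp];

UISwipeGestureRecognizer *swipeDown =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(didSwipeDown:)];
swipeDown.direction = UISwipeGestureRecognizerDirectionDown;
[self.view addGestureRecognizer:swipeDown];
```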

Is there a way to fake a Touch into the system?

I discovered this: When a UIView's touchesBegan: method fires and there is only one finger on the screen, the system waits up to 0.25 seconds to see if the finger moves far enough before it fires touchesMoved:. There is no programmatic way around this.
But now the interesting part: If one finger is already somewhere on the screen, this ugly behavior is turned off; every subsequent finger is interpreted as one that wants to move, and touchesMoved: fires immediately, with no delay.
So the BIG question, guys: Is it possible to fake this first initial finger somewhere on the screen, so that the following real finger is interpreted as a second finger? That would rescue my weekend ;)
Matt Gallagher describes how to synthesize touch events in his post here. He intends this more for user interface testing, though, because he does use some private instance variables which you would not want to rely on for a shipping application.

Multi-touch appearing as one touch

I am getting inconsistent results using multi-touch.
I am trying to detect a 2 fingered swipe.
Sometimes I successfully detect 2 touches; other times it appears as 1 touch whose position oscillates between the two fingers.
I created a test app to try to isolate this behaviour but I found that the test app behaved ok.
The app where the problem occurs has a number of sub views, and my touch detection code is on the root view. Could this be the problem? Do I need to include touch detection on every subview?
You are not guaranteed to always get both touches in the set. For example, if one finger is moving and the other is still, the still one will not appear in touchesMoved. Just assume the finger is still down in the old place until reported otherwise.
After much poking around, I've realised that even when I get two-fingered detection working well, there is still a high likelihood that it will occasionally pick up 1 finger, if only for a short while. In my application's case that won't work, because I'm already using 1-finger movement for something else. So back to the drawing board.