I ported a game from iPhone to Android, all OpenGL based and using exactly the same scrolling calculations. I noticed that on the iPhone I can scroll through the game faster, and the starting fling speed when I lift my finger matches how fast my finger was moving.
On the Android device, unfortunately, this was not the same. When I lifted my finger while scrolling, the initial fling speed felt slower. Scrolling on the iPhone feels more accurate.
Is there anything special about how Android handles touches that differs from the iPhone, and how can I take advantage of it to achieve a similar feel to the iPhone? On Android, in all applications, the fling speed when I lift my finger doesn't match how fast my finger was moving.
I found it. Android delivers an ACTION_MOVE followed by an ACTION_UP at the same (or nearly the same) location. The iPhone doesn't do this at all: it just delivers touchesEnded! So if there is motion, on Android I will ALWAYS have at least 3 points, but on the iPhone it can be just 2 points (touch down and touch up, at different locations).
The next thing is that the ACTION_MOVE and ACTION_UP don't arrive immediately, and the time interval between them is significant when calculating the average speed of the sampled points.
Solution: on ACTION_UP, if there was an ACTION_MOVE, shift all the stored timestamps so that the ACTION_MOVE "happens" at the moment the ACTION_UP occurred. Don't include the final touch-up point in the speed calculation, then calculate speed as usual.
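A platform-agnostic sketch of this in Python (the function name and sample format are my own, not part of either platform's API): drop the final up point, slide the stored timestamps forward so the last move coincides with the up time, then average the per-segment velocities.

```python
def fling_velocity(samples):
    """samples: list of (t_seconds, x, y) tuples for the touch-down and
    move events, with the final entry being the ACTION_UP event."""
    if len(samples) < 3:
        # Only down + up (the iPhone case): use the two points directly.
        (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
        dt = t1 - t0
        return ((x1 - x0) / dt, (y1 - y0) / dt) if dt > 0 else (0.0, 0.0)

    up_time = samples[-1][0]
    moves = samples[:-1]                 # drop the final ACTION_UP point
    shift = up_time - moves[-1][0]       # slide timestamps so the last
    moves = [(t + shift, x, y) for (t, x, y) in moves]  # move lands at up time

    # Average the velocity over consecutive move samples.
    vxs, vys = [], []
    for (t0, x0, y0), (t1, x1, y1) in zip(moves, moves[1:]):
        dt = t1 - t0
        if dt > 0:
            vxs.append((x1 - x0) / dt)
            vys.append((y1 - y0) / dt)
    if not vxs:
        return (0.0, 0.0)
    return (sum(vxs) / len(vxs), sum(vys) / len(vys))
```

Note that the shift doesn't change the time deltas between the remaining samples; it only matters if you additionally discard samples older than some window relative to the up time, which is a common refinement.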
Summary
On Android, if you have moved your finger, you get an ACTION_MOVE before the ACTION_UP, and the ACTION_UP is at roughly the same location as the ACTION_MOVE, making it seem as if the speed at the end is roughly 0. The iPhone does not do this: it does not deliver a touchesMoved just before the final touchesEnded (i.e. touch down, move finger, lift; if you do it fast enough on the iPhone you won't get the intermediate touchesMoved event, whereas on Android you will).
Android / iPhone equivalents
ACTION_UP = touchesEnded
ACTION_DOWN = touchesBegan
ACTION_MOVE = touchesMoved
Also, I noticed that on Android there are some touch-history functions. I did not use those; I stored the touches and their timestamps in my own array.
Are you taking into account the dpi or resolution of the screen, so that your flings are measured in dots-per-second rather than px-per-second? That could easily affect things.
In addition, yes, Android's touch processing seems to be a bit slower. You might be able to add a 'fudge factor' of some amount to get a closer response (maybe +10% or something like that). But that said, the 'flinging' speed of Android apps is a combination of the touchscreen and the framework's particular math for determining fling speed and rate of decay -- so in apps OTHER than yours, you could simply be seeing very different math/approaches between the two platforms.
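The density normalization itself is a one-liner; a sketch in Python (the function name is mine; 160 dpi is Android's baseline density, where 1 dip equals 1 px):

```python
def px_per_sec_to_dips_per_sec(v_px, screen_dpi, reference_dpi=160.0):
    """Convert a fling velocity from pixels/second to density-independent
    units so the same physical finger speed produces the same value on
    low- and high-density screens."""
    return v_px * reference_dpi / screen_dpi
```

A 640 px/s fling on a 320 dpi screen then counts the same as a 320 px/s fling at the 160 dpi baseline.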
I'm doing a research project on detecting palms placed simultaneously on a multitouch screen.
I've done quite a bit of googling and found that there are a lot of libraries both for gesture recognition (AS3, https://github.com/fljot/Gestouch for instance) and for computer vision. I'm working with JSTouchController (https://github.com/sebleedelisle/JSTouchController), but it tracks only 5 fingers at a time. So if I place one palm on the screen and the library finds all five fingers, it won't track a second palm being placed at all. It does work correctly from time to time, though.
So, the question is: are there any libraries to track ten fingers simultaneously with acceptable quality on modern touch screens?
The number of touch points is usually restricted by the device or OS. If you want to prototype something quickly, one possibility is to use the Leap Motion sensor. It can track ten fingers in front of it, although not via touch.
I have an idea for an iPhone game/app that needs to track the height position of the iPhone. I am new to iPhone development, so I don't know how the accelerometer works. The idea is that the user places the iPhone on a flat surface (with the iPhone's back against the surface). The user will then lower and raise the surface periodically, and the iPhone should be able to track this movement. We can assume the surface returns to its original position, so we only care about how much it was lowered or raised from that position during the movement.
The amount raised or lowered will be a few centimeters. Is this possible to track, and how would you go about solving it?
Thank you very much for your help!
Best regards,
Lukas
This is not possible to track directly. However, the accelerometer data can be used to approximate it. Acceleration is the time derivative of velocity, which is the time derivative of position. By integrating the acceleration twice, you can track position.
Caveat though: this will probably not be very accurate, with significant drift errors.
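A minimal sketch of the double integration in Python (the function name is mine; it assumes gravity has already been subtracted from the samples, as Core Motion's userAcceleration does):

```python
def integrate_twice(accel, dt):
    """Trapezoidal double integration: acceleration samples (m/s^2)
    taken every dt seconds -> total displacement (m).
    In practice, sensor noise and bias make the result drift badly,
    so this is only usable over very short intervals."""
    prev_a, prev_v = accel[0], 0.0
    x = 0.0
    for a in accel[1:]:
        v = prev_v + 0.5 * (prev_a + a) * dt   # integrate accel -> velocity
        x += 0.5 * (prev_v + v) * dt           # integrate velocity -> position
        prev_a, prev_v = a, v
    return x
```

With a constant 1 m/s² for 1 s this returns 0.5 m, matching x = ½at²; with real sensor data the drift caveat above dominates after a second or two.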
You can also track orientation with the magnetometer, and you can use the camera to "watch" the environment, which suggests the possibility of fixing the position by triangulation.
I don't expect that to be easy though.
I have a GPS app, and I would like to detect whether the user is standing still and not moving. Core Location works for this, but it is sometimes inaccurate because new updates shift the location and give the illusion of speed and motion.
So, I am wondering if in addition to that, I can also use Core Motion. Is this a good idea to detect motion such as someone walking, running, driving, etc, and know when they are no longer doing that motion? Or, is Core Motion only for small movements such as tilting the device or lifting it to your ear?
I wanted to tell others who visit this question what I've learned and what I think about this approach.
I have been doing some research of my own to find out whether this is possible and, more importantly, what the battery consumption and accuracy of the detected location changes would be. For Android, this question was asked quite some time back. The answer provides links to this Google Tech Talk. At 23:20, the speaker talks about how difficult this is to achieve and the accuracy you can expect in the results.
Even though I have come to realize that battery consumption from sensors on the iPhone is a little lower than on most Android phones, I still think this is a costly affair in terms of both accuracy and battery consumption.
You can use the GPS together with the sensor readings to distinguish between walking, running, etc., by combining the frequency of tilt-angle changes with the GPS speed information (you need to do some work to extract some of this information, of course, but that's the way to do it).
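A rough sketch of such a combination in Python (all the thresholds here are illustrative guesses, not calibrated values; tilt_freq_hz would come from counting tilt-angle oscillations per second in the accelerometer data):

```python
def classify_motion(gps_speed_mps, tilt_freq_hz):
    """Crude heuristic combining GPS speed with the frequency of
    tilt-angle oscillation (steps cause periodic tilt changes).
    Typical walking is ~1.4 m/s, running ~3-6 m/s with a faster
    step cadence; driving is fast but smooth (low tilt frequency)."""
    if gps_speed_mps < 0.3 and tilt_freq_hz < 0.5:
        return "stationary"
    if gps_speed_mps < 2.5:
        return "walking"
    if gps_speed_mps < 7.0 and tilt_freq_hz > 2.0:
        return "running"
    return "driving"
```

In a real app you would smooth both inputs over a window of several seconds before classifying, since single GPS fixes are noisy.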
You are talking about 4 different measurements from 4 different sensors (technically more than 4, but..):
Latitude & Longitude - from CoreLocation. It uses a mix of GPS + cell tower triangulation.
Accelerometer - acceleration of the device along three axes, from which its orientation relative to gravity can be inferred.
Gyroscope - rotation of the device around its own axes.
Magnetometer - tells you which direction the device is pointing with respect to north, south, east and west.
Of all these, I think only latitude & longitude are of use to you. Basically, what you do is relax the sensitivity (i.e. the update rate of the sensor) a bit. With some tweaking you should be able to tell with good accuracy whether a person is standing still or moving.
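One way to sketch that in Python (the 15 m radius and function names are my own; haversine gives the great-circle distance between two fixes): treat the user as standing still while every recent fix stays within a small radius of the first one, which absorbs GPS jitter.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) fixes."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def is_stationary(fixes, radius_m=15.0):
    """fixes: recent (lat, lon) samples at a relaxed update rate.
    Standing still == every fix within radius_m of the first fix,
    so small GPS wander doesn't look like motion."""
    lat0, lon0 = fixes[0]
    return all(haversine_m(lat0, lon0, lat, lon) <= radius_m
               for lat, lon in fixes[1:])
```

The radius should be tuned to the horizontal accuracy Core Location reports for each fix.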
I need to calculate the direction of dragging a touch, to determine if the user is dragging up the screen, or down the screen.
Actually pretty simple, right? But:
1) Finger goes down, you get -touchesBegan:withEvent: called
2) Must wait until finger moves, and -touchesMoved:withEvent: gets called
3) Problem: at this point it's risky to tell whether the user dragged up or down.
My thoughts: check the time and accumulate the calculated vectors until it's safe to tell the direction of the touch.
Easy? No. Think about it: what if the user holds the finger down for 5 minutes on the same spot, but THEN decides to move up or down? BANG! Your code would fail, because it tried to determine the direction of the touch when the finger hadn't really moved.
Problem 2: when the finger goes down and stays at the same spot for a few seconds because the user is undecided and is thinking about what to do next, you'll very likely get a lot of -touchesMoved:withEvent: calls, but with very minor changes in touch location.
So my next thought: Do the accumulation in -touchesMoved:withEvent:, but only if a certain threshold of movement has been exceeded.
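That can be sketched as a small state machine in Python (a stand-in for the UIKit callbacks; the names and the 10-point threshold are illustrative). Because the direction is only decided once the total travel from the touch-down point crosses the threshold, a finger that rests on the same spot for minutes, producing jittery micro-moves, never triggers a wrong answer:

```python
class DragDirectionDetector:
    """Report 'up' or 'down' only after vertical travel from the
    touch-down point exceeds a threshold; until then report nothing."""

    def __init__(self, threshold=10.0):
        self.threshold = threshold  # points of travel needed to commit
        self.start_y = None
        self.direction = None

    def touch_began(self, y):
        self.start_y = y
        self.direction = None

    def touch_moved(self, y):
        # Once committed, the direction is stable for the rest of the drag.
        if self.direction is not None or self.start_y is None:
            return self.direction
        dy = y - self.start_y
        if abs(dy) >= self.threshold:
            # Screen coordinates: +y is down, so a positive delta means
            # the finger moved toward the bottom of the screen.
            self.direction = "down" if dy > 0 else "up"
        return self.direction
```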
I bet you have some better concepts in place?
If you're just talking about detecting swipes, then in OS 3.2, Apple introduced gesture recognizers to make this sort of thing easier in iPad applications. You can also write your own recognizers for other kinds of gestures (like long drags), with the OS doing most of the heavy lifting, and notifying you about the "interesting moments".
According to some reports, it looks like they'll be in 4.0 and available to iPhone, as well.
When you drag a finger across the iPhone touchscreen, it generates touchesMoved events at a nice, regular 60Hz.
However, the transition from the initial touchesBegan event to the first touchesMoved is less obvious: sometimes the device waits a while.
What's it waiting for? Larger time/distance deltas? More touches to lump into the event?
Does anybody know?
Importantly, this delay does not happen with subsequent fingers, which puts the first touch at a distinct disadvantage. It's very asymmetric and bad news for apps that demand precise input, like games and musical instruments.
To see this bug/phenomenon in action:
Slowly drag the iPhone screen-unlock slider to the right. Note the sudden jump, and note how it doesn't occur if you have another finger resting anywhere else on the screen.
Try "creeping" across a narrow bridge in any number of 3D games. Frustrating!
Try a dual virtual-joystick game and note that the effect is mitigated, because you're obliged to never end either of the touches, which amortizes the unpleasantness.
Should've logged this as a bug 8 months ago.
After a touchesBegan event is fired, UIKit looks for positional movement of the finger, which translates into touchesMoved events as the finger's x/y changes, until the finger is lifted and the touchesEnded event is fired.
If the finger is held down in one place it will not fire the touchesMoved event until there is movement.
I am building an app where you draw based on touchesMoved, and while it does arrive at intervals, it is fast enough to give a smooth drawing appearance. Since it is an event buried in the SDK, you might have to do some testing in your scenario to see how fast it responds; depending on other actions or events, it can vary with the situation. In my experience it is within a few ms of movement, and that is with about 2-3k other sprites on the screen.
The drawing does start on the touchesBegan event, though, so the first placement is set; then it chains to touchesMoved and ends with touchesEnded. I use all the events for the drag operation, so perhaps the initial move is perceptually less laggy in this case.
To test in your app you could put a timestamp on each event if it is crucial to your design and work out some sort of easing.
http://developer.apple.com/IPhone/library/documentation/UIKit/Reference/UIResponder_Class/Reference/Reference.html#//apple_ref/occ/instm/UIResponder/touchesMoved:withEvent:
I don't think it's a bug, it's more of a missing feature.
Ordinarily, this is intended behavior, to filter out accidental micro-movements that would turn a tap or long press into a slide when the user did not intend one.
This is nothing new; it has always been there. For instance, there are a few pixels of tolerance for double-clicks in pointer-based GUIs - and the same tolerance before a drag is started, because users sometimes inadvertently drag when they just meant to click. Try slowly moving an item on the desktop (OS X or Windows) to see it.
The missing feature is that it doesn't appear to be configurable.
An idea: Is it possible to enter a timed loop on touchesBegan that periodically checks the touch's locationInView:?
I don't represent any kind of official answer, but it makes sense that touchesBegan->touchesMoved has a longer duration than touchesMoved->touchesMoved. It would be frustrating for developers if every touchesBegan came along with a bunch of accidental touchesMoved events. Apple must have determined (experimentally) some distance at which a touch becomes a drag. Once the touchesMoved events have begun, there is no need to perform this test any more, because every point until the next touchesEnded is guaranteed to be a touchesMoved.
This seems to be what you were saying in your original post, Rythmic Fistman, and I just wanted to elaborate a bit and say that I agree with your reasoning. It means that if you're calculating a "drag velocity" of some sort, you need to use distance traveled as a factor, rather than depending on the frequency of the update timer (which is better practice anyway).
It's waiting for the first move.
That's how the OS distinguishes a drag from a tap. Once you drag, all new notifications are touchesMoved.
This is also the reason why you should write code to execute on the touch-up event.
Currently, this "delay" between touchesBegan and touchesMoved is present even when other fingers are touching the screen. Unfortunately, it seems that an option to disable it doesn't exist yet. I'm also a music app developer (and player), and I find this behavior very annoying.