Detecting palms on a touch screen - multi-touch

Hi everyone!
I'm doing a research project involving the simultaneous detection of palms placed on a multi-touch screen.
I've done quite a bit of googling and found that there are a lot of libraries both for gesture recognition (AS3, https://github.com/fljot/Gestouch for instance) and for computer vision. I'm working with JSTouchController (https://github.com/sebleedelisle/JSTouchController), but it tracks only 5 fingers at a time. So if I place one palm on the screen and the library finds all five fingers, it won't track a second palm being placed at all. It does work correctly from time to time, though.
So, the question is: are there any libraries that track ten fingers simultaneously with acceptable quality on modern touch screens?

The number of touch points is usually restricted by the device or the OS. If you want to prototype something quickly, one possibility is to use the Leap Motion sensor. It can track ten fingers in front of it, although not via touch.

Related

How to hold a gun with a Leap Motion hand

Newbie here. I'm developing a game where I need to pick up a gun and other objects. I'm using Leap Motion hands. How do I pick up or attach a gun to the Leap Motion hand, so that the gun (or another object) moves with the hand's motion?
P.S. I searched but failed to find any material on Stack Overflow.
Due to the limitations physics imposes on the Leap Motion, this is almost certainly going to be impossible.
This comes down to the location of the Leap Motion's IR camera and what it can (and, more importantly, cannot) see. When your hand is in a fist, your fingers block the camera from detecting the position of most of them, making any typical gun-holding pose impossible to detect. Note that this may change based on the location of your sensor bar (which you did not include in your question); I have limited experience with other mountings, but I can't think of any in which the Leap Motion would have the necessary unobstructed view.
I worked on a project where I tried to use that same kind of pose with a "trigger pull" motion to activate an effect inside a Unity application. Due to the location of the sensor bar (on the desk), this was virtually impossible, and we had to reconfigure for a horizontal hand position: the hand's location relative to the sensor controlled movement, a closed fist meant "fire", and it would not reset to allow a second shot until the hand returned to an open-palm gesture.
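For what it's worth, the fire/re-arm logic itself is simple. Here's a minimal sketch of that one-shot gate, written in Swift rather than the C# we used in Unity, with a hypothetical `openness` value (1 = open palm, 0 = fist) standing in for whatever your tracker actually reports:

```swift
// One-shot fire gate: fires once when the hand closes, and will not
// fire again until the hand has fully reopened.
struct FireGate {
    private var armed = true

    // `openness` is a hypothetical 0...1 value (1 = open palm,
    // 0 = closed fist); substitute whatever your tracking SDK reports.
    mutating func update(openness: Double) -> Bool {
        if armed && openness < 0.1 {
            armed = false   // hand closed: fire once
            return true
        }
        if !armed && openness > 0.9 {
            armed = true    // hand reopened: re-arm
        }
        return false
    }
}
```

Call update(openness:) once per tracking frame and trigger the fire effect whenever it returns true.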

How to achieve the same flinging results on Android and iPhone

I ported a game from iPhone to Android. It is all OpenGL based, with exactly the same calculations for scrolling. I noticed that on the iPhone I can scroll through the game faster, and the starting speed as I lifted my finger matched how fast my finger was moving.
On the Android device, unfortunately, this was not the case. As I lifted my finger while scrolling, the starting scroll speed felt slower. Scrolling on the iPhone feels more accurate.
Is there anything special about how Android handles touches that differs from the iPhone, and how can I use it to achieve a similar feel? On Android, in every application, the fling speed after I lift my finger doesn't match how fast my finger was actually moving.
I found it. Android delivers an ACTION_MOVE and then an ACTION_UP at the same (or nearly the same) location. The iPhone doesn't do this at all: it just delivers touchesEnded. So if there is motion, I will always have at least 3 points on Android, but on the iPhone it can be just 2 (touch down and touch up, at different locations).
The next thing is that the ACTION_MOVE and ACTION_UP don't arrive right away, and the time between them is significant when calculating the average speed of the sampled points.
Solution: on ACTION_UP, if there was an ACTION_MOVE, slide all the stored timestamps so that the ACTION_MOVE "happens" when the ACTION_UP occurred. Don't put the final touch-up point into the speed calculation, and then calculate speed as usual.
Summary
On Android, if you have moved your finger, you get an ACTION_MOVE before the ACTION_UP, and the ACTION_UP is at roughly the same location as that ACTION_MOVE, making it seem as if the speed at the end is roughly zero. The iPhone does not do this: it does not deliver a touchesMoved for the final touchesEnded (i.e. touch down, move finger, lift; if you do it fast enough on the iPhone you won't get the intermediate touchesMoved event, whereas on Android you will).
Android / iPhone equivalents:
ACTION_UP = touchesEnded
ACTION_DOWN = touchesBegan
ACTION_MOVE = touchesMoved
Also, I noticed that Android has some point-history functions on its touch events. I did not use those; I stored the touches and their timestamps in my own array.
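In code, the part of the fix that changes the result is leaving the up point out of the average. A minimal Swift sketch of that (TouchSample and flingVelocity are names I made up; the timestamp-sliding part only matters if your estimator weights samples by how close they are to the moment the finger lifted, so it's left to a comment here):

```swift
import Foundation

// One recorded sample: position along the scroll axis plus a timestamp.
struct TouchSample {
    var y: Double
    var time: TimeInterval
}

// Average fling speed over the down/move samples only. The ACTION_UP
// point is deliberately excluded: it sits at (nearly) the same spot as
// the final ACTION_MOVE but noticeably later, so including it drags the
// measured speed toward zero. If you weight samples by recency, also
// slide the stored timestamps forward so the final move lands on the
// up time before weighting.
func flingVelocity(moves: [TouchSample]) -> Double {
    guard let first = moves.first, let last = moves.last,
          last.time > first.time else { return 0 }
    return (last.y - first.y) / (last.time - first.time)
}
```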
Are you taking the screen's dpi or resolution into account, so that your flings are measured in dots-per-second rather than pixels-per-second? That could easily affect things.
In addition, yes, Android's touch processing does seem to be a bit slower. You might be able to add a "fudge factor" to get a closer response (maybe +10% or so). That said, the fling speed of Android apps is a combination of the touchscreen and the framework's particular math for determining fling speed and rate of decay -- so in apps OTHER than yours, you may simply be seeing different math and approaches between the two platforms.
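If you're not already normalizing, the conversion is just a division by the display's scale factor; a trivial sketch (Swift syntax, with `scale` standing in for Android's DisplayMetrics.density; iOS touch coordinates already arrive in points, so its scale is effectively 1):

```swift
// Compare flings in density-independent units per second rather than
// raw pixels per second, so the same physical gesture gives the same
// number on every screen.
func normalizedFling(pixelsPerSecond: Double, scale: Double) -> Double {
    return pixelsPerSecond / scale
}
```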

How would I use iPhone motion detection for an egg-shaker-like application?

I am hoping to build an application similar to those egg-shaker apps, to better understand how to detect motion on the iPhone. I've been looking at accelerometer methods and motion methods, but can't seem to get what I want working.
The specifics of my need are as follows: I want to play one sound when the user shakes the phone away from them, and another sound when they shake it back towards them. The motion would be very similar to shaking an egg shaker, with two different sounds played depending on whether the device moved towards or away from their chest. It would also be good to measure the intensity of the movement in each direction.
Any ideas?
I've searched Apple's sample code for a similar application, but there doesn't seem to be one.
Look at this game, which is open source and makes great use of the accelerometer. It's good at telling which direction you are going, though I haven't messed around much with intensity. I'm sure that's easy enough once you get into the details.
http://github.com/haqu/tweejump
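To give a concrete starting point for the away/toward distinction: a minimal CoreMotion sketch (modern Swift), assuming the screen faces the user's chest so pushing away reads as negative z. The 0.8g threshold and the playSound hook are placeholders to tune and fill in:

```swift
import CoreMotion

let motionManager = CMMotionManager()

func startShakeDetection() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        // userAcceleration has gravity already removed, in units of g.
        guard let z = motion?.userAcceleration.z else { return }
        let threshold = 0.8  // in g; raise or lower to taste
        if z < -threshold {
            playSound("away", intensity: -z)    // pushed away from chest
        } else if z > threshold {
            playSound("toward", intensity: z)   // pulled back toward chest
        }
        // A real version would debounce so one shake plays one sound.
    }
}

func playSound(_ name: String, intensity: Double) {
    // Placeholder hook: trigger the matching sample, scaled by intensity.
}
```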

How to track the touch vector?

I need to calculate the direction of a touch drag, to determine whether the user is dragging up the screen or down the screen.
Actually pretty simple, right? But:
1) The finger goes down, and -touchesBegan:withEvent: is called.
2) You must wait until the finger moves and -touchesMoved:withEvent: gets called.
3) Problem: at this point it's risky to decide whether the user dragged up or down.
My thought: check the time and accumulate the calculated vectors until it's safe to tell the direction of the touch.
Easy? No. Think about it: what if the user holds the finger on the same spot for 5 minutes, but THEN decides to move up or down? BANG! Your code would fail, because it tried to determine the direction of the touch while the finger wasn't really moving.
Problem 2: when the finger goes down and stays on the same spot for a few seconds because the user is hesitating about what to do next, you'll very likely still get a lot of -touchesMoved:withEvent: calls, but with very minor changes in touch location.
So my next thought: do the accumulation in -touchesMoved:withEvent:, but only once a certain movement threshold has been exceeded. Roughly like the sketch below.
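Sketch in modern Swift; the 12-point threshold is an arbitrary value to tune. Because only distance from the anchor matters, not time, this also handles the finger-parked-for-5-minutes case:

```swift
import UIKit

class DragDirectionView: UIView {
    private var anchorY: CGFloat = 0
    private let threshold: CGFloat = 12  // points; tune to taste

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        anchorY = touch.location(in: self).y
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let y = touch.location(in: self).y
        let delta = y - anchorY
        // Only commit to a direction once the finger has moved far
        // enough; tiny jitter while the user hesitates is ignored.
        if abs(delta) > threshold {
            print(delta > 0 ? "dragging down" : "dragging up")
            anchorY = y  // re-anchor so a later reversal is picked up too
        }
    }
}
```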
I bet you have some better concepts in place?
If you're just talking about detecting swipes, then in OS 3.2, Apple introduced gesture recognizers to make this sort of thing easier in iPad applications. You can also write your own recognizers for other kinds of gestures (like long drags), with the OS doing most of the heavy lifting, and notifying you about the "interesting moments".
According to some reports, it looks like they'll be in 4.0 and available on the iPhone as well.
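With recognizers, the up/down case collapses to a few lines. A sketch in modern Swift (the class and selector names are mine):

```swift
import UIKit

class SwipeViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // One recognizer per direction we care about.
        for direction: UISwipeGestureRecognizer.Direction in [.up, .down] {
            let swipe = UISwipeGestureRecognizer(target: self,
                                                 action: #selector(didSwipe(_:)))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }
    }

    @objc private func didSwipe(_ recognizer: UISwipeGestureRecognizer) {
        // The OS does the movement/threshold bookkeeping for us.
        print(recognizer.direction == .up ? "swiped up" : "swiped down")
    }
}
```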

Multi-touch appearing as one touch

I am getting inconsistent results using multi-touch.
I am trying to detect a 2 fingered swipe.
Sometimes I successfully detect 2 touches; other times it appears as 1 touch whose position oscillates between the two fingers.
I created a test app to try to isolate this behaviour, but the test app behaved fine.
The app where the problem occurs has a number of subviews, and my touch-detection code is on the root view. Could this be the problem? Do I need to include touch detection on every subview?
You are not guaranteed to always get both touches in the set -- for example, if one finger is moving and the other is still, the still one will not appear in touchesMoved. Just assume a finger is still down at its old location until you are told otherwise.
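A minimal sketch of that bookkeeping in modern Swift (the class name is mine; note the view also needs isMultipleTouchEnabled set, or UIKit will only ever deliver one touch to it):

```swift
import UIKit

// Track each finger independently, keyed by its UITouch object, and
// keep the last known location for fingers absent from this event.
class TwoFingerSwipeView: UIView {
    private var lastLocations: [UITouch: CGPoint] = [:]

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { lastLocations[touch] = touch.location(in: self) }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Only the fingers that actually moved are in `touches`;
        // anything missing is still down at its stored location.
        for touch in touches { lastLocations[touch] = touch.location(in: self) }
        if lastLocations.count == 2 {
            // Two fingers down: lastLocations.values has both positions.
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { lastLocations[touch] = nil }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { lastLocations[touch] = nil }
    }
}
```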
After much poking around, I've realised that even when I get two-fingered detection working well, there is still a high likelihood that it will occasionally register just 1 finger, if only for a short while. In my application that won't work, because I am already using one-finger movement for something else. So it's back to the drawing board.