iPhone: possible to tell the difference between a fingertip and a whole fingerpad?

Is it possible to detect exactly how much finger is in contact with the screen? Say I wanted to make a fingerprinting app; how would I detect the outline of a person's finger?

No. The UITouch system does a lot of processing to reduce the larger touched area to a single point location for each touch. This is meant to aid the user, since there can be a noticeable difference between where one thinks one is touching and where the screen is actually touched.
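For reference, a minimal sketch (assuming a plain UIView subclass) of all the per-touch data the public API hands you, a single calculated point:

    // Sketch: each UITouch is reduced to one CGPoint; no contact-area data is exposed.
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        for (UITouch *touch in touches) {
            CGPoint point = [touch locationInView:self];
            NSLog(@"Touch reported as a single point: %@", NSStringFromCGPoint(point));
        }
        [super touchesBegan:touches withEvent:event];
    }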

Related

How to calculate the diameter of the touch point on an iPhone, iPad and Android device?

I understand that the iPhone supports 5 touch points by default, and that different SDKs enable different numbers of touch points. I have already managed to register the touch points and to get the distances between them and the actual number of touch points. What I would like to know is whether there is a way to get the diameter of a particular touch point, e.g. to compare a thumb touch with an index-finger touch. Any ideas?
I think Apple makes it fairly clear that they don't intend to give third-party developers access to low-level multi-touch information. From Apple's documentation on Event Handling in iOS:
A finger on the screen affords a much different level of precision than a mouse pointer. When a user touches the screen, the area of contact is actually elliptical and tends to be offset below the point where the user thinks he or she touched. This “contact patch” also varies in size and shape based on which finger is touching the screen, the size of the finger, the pressure of the finger on the screen, the orientation of the finger, and other factors. The underlying Multi-Touch system analyzes all of this information for you and computes a single touch point.
I can’t speak to Android, but the public APIs in the iOS SDK don’t give you any information about a touch other than its position. These guys found a private API (i.e. one that’ll get you rejected from the App Store if you use it) for getting the diameter of a touch on the screen, but they haven’t provided any further information or released the library.
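To illustrate the point, here is a rough sketch (inside a hypothetical UIView subclass) of the kind of per-touch information the public SDK does expose: positions, and therefore distances between simultaneous touches, but nothing about the contact size of an individual touch.

    // Sketch: positions and inter-touch distances are available; touch diameter is not.
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        NSArray *allTouches = [[event allTouches] allObjects];
        if ([allTouches count] >= 2) {
            CGPoint p1 = [[allTouches objectAtIndex:0] locationInView:self];
            CGPoint p2 = [[allTouches objectAtIndex:1] locationInView:self];
            CGFloat dx = p1.x - p2.x;
            CGFloat dy = p1.y - p2.y;
            NSLog(@"Distance between two touch points: %f", sqrt(dx * dx + dy * dy));
        }
    }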
It's likely possible on Android (I have 4.3, but most likely other versions too). In Developer options you can activate an overlay display of touch properties called Pointer location (in the Input category), which shows you the coordinates, their delta (difference), and two properties you might be interested in: Prs (pressure) and Size. It seems you could find thresholds that would distinguish a thumb from an index finger in most cases. That is just what I would infer from the values I have seen in that display.
Proof that this is possible within the allowed API is the Yet Another MultiTouch Test Android app listed on Google Play, which for me shows the pressure of each touch in its own interface (so it must be available in the standard API).

iPhone Screen Press Detection

I am trying to build an app that detects if a user puts their lips on the screen. How could I get an image of their lips when they put them on the screen?
I know how to detect finger touches; would it be similar to that?
Edit: Also, wondering about doing this for Android.
I believe the iPhone hardware supports somewhere between 10 and 20 touch points, but the SDK only gives you access to 5. To get a good outline of the lips you'd need a much higher number of touch points.
Could you use the front facing camera and avoid the obvious hygiene issues?!
You can't get the shape of the touches, so you wouldn't be able to detect lips on the screen.
As others have posted here, it is not possible to detect all of the contact points between the lips and the screen. Yes, it would be similar to detecting finger touches, but more difficult. The iPhone screen is capacitive, so it should register lips, which are similar to fingertips. (When I'm outside in the cold and need to scroll with my gloves on, I've used my chin.)
As far as the shape goes, you may be able to map multiple touches, but touches don't take on any particular shape, so you won't be able to draw the lips from that. You may want to consider preloading generic pictures of lips and then resizing one to match the size of the user's lips, based on where the touch inputs are.
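Here is a speculative sketch of that idea (the lipsImageView property is hypothetical): size a preloaded lips image to the bounding box of whatever touches the hardware reports.

    // Speculative sketch: fit a preloaded lips image to the bounding box of all current touches.
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        CGFloat minX = CGFLOAT_MAX, minY = CGFLOAT_MAX, maxX = 0, maxY = 0;
        for (UITouch *touch in [event allTouches]) {
            CGPoint p = [touch locationInView:self];
            minX = MIN(minX, p.x);
            minY = MIN(minY, p.y);
            maxX = MAX(maxX, p.x);
            maxY = MAX(maxY, p.y);
        }
        if (maxX > minX && maxY > minY) {
            // lipsImageView is a hypothetical UIImageView holding the generic lips picture
            self.lipsImageView.frame = CGRectMake(minX, minY, maxX - minX, maxY - minY);
        }
    }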

How can I capture the amount of surface area that the user is making contact with the iPhone screen?

Is there a way to capture the amount of screen that is making contact with the user's finger? I assume there is, since this finger-painting app shows the iPad responding only to the pixels that the user makes contact with.
Thanks so much in advance for your help!
The size of the touch is abstracted away by the framework, and UITouches only contain calculated (“best estimated”) points instead of the raw, actual areas that were touched. I would guess that the “pressure” was calculated from the duration and the direction of the touch.
In a nutshell, there is no public API to get the contact area.
I don't think Apple provides APIs for the size of the touch, or, as @nickthedude said (I think), any kind of way to measure pressure. Basically, you need to implement your own algorithm/policy for determining line thickness/opacity/other effects. I believe a common way to do this is to measure the amount of time spent on the stroke, and work from there: for instance, if the user moved more quickly, you might want a thinner line segment. Apple really should just provide a canvas view of some kind. Best of luck!
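A hedged sketch of that speed-based approach, estimating stroke speed from the previous point and timestamp in touchesMoved (the ivars lastPoint, lastTimestamp and lineWidth, and the mapping constants, are assumptions for illustration):

    // Sketch: map faster finger movement to a thinner line.
    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        CGPoint point = [touch locationInView:self];
        NSTimeInterval dt = touch.timestamp - lastTimestamp;
        if (dt > 0) {
            CGFloat dx = point.x - lastPoint.x;
            CGFloat dy = point.y - lastPoint.y;
            CGFloat speed = sqrt(dx * dx + dy * dy) / dt;   // points per second
            lineWidth = MAX(1.0, 20.0 - speed * 0.02);      // arbitrary tuning constants
        }
        lastPoint = point;
        lastTimestamp = touch.timestamp;
        // ...append a segment of width lineWidth to the current stroke here...
    }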
To get the exact area you may have to roll your own, but you can get UIEvents pretty easily and then do some magic from there. Basically, implement/override touchesBegan, touchesEnded and touchesMoved on the UIView in question and put your custom code there.
Looking at the video, maybe the number of touches in the UIEvent set corresponds to the "pressure" of the touch; then again, maybe not.
What if you laid down a series of successively smaller square UIViews wherever the user touched? Then, if the touches "spilled" into the larger UIViews behind the smaller front ones, you could conjecture that the touch pressure was harder. Something to try, I guess. Good luck.
Why not just describe what you want to do and focus on asking about that instead? It may not have anything at all to do with the example that has you so enthralled. I could use a camera to monitor your hand from across the table and paint pixels on the screen via Bluetooth, completely ignoring any contact between your fingers and the screen.

Custom touch tracking in iphone

I don't want to use any of the normal touch events in the iPhone SDK.
When a user touches the screen, I want to find where they touched and all of the pixels they are touching. Is there a way to do this on the iPhone, maybe using a low-level SDK?
I want this for something like a finger-drawing app on the iPhone.
There is no low-level API. IIRC, the data in a touch object is actually returned by the hardware. In other words, that is all the data that software can get.
Having done some touch UI experiments in the distant past, I can tell you that processing real-world touches in software is a lot more complicated than you would expect at first glance. It's not like tracking a mouse. A finger is actually a very blunt and imprecise pointing instrument at the scale of a mobile screen. There is a great deal of variation in finger size, pressure of contact, contact area, angle of contact and consistency of contact. It takes a lot of processing to turn that blunt imprecision into a single point or collection of points that an API can easily use.
I wouldn't try to reinvent the wheel even if you find a way to extract more data from the hardware. If nothing else, (puts on interface-nazi hat) if your touch interface behaves differently from other apps, users will be confused when they have to switch back and forth.
A touch is a rather imprecise gesture, so getting all the pixels that one encompasses is not really possible. However, you can get the rough 'center' of a touch, and extrapolate an area around that for a group of 'touched pixels'. No need to use a low level SDK, just override -touchesBegan:withEvent: on UIView.
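A minimal sketch of that suggestion (the 20-point radius is an arbitrary assumption): take the reported center of each touch and extrapolate a fixed area of "touched pixels" around it.

    // Sketch: extrapolate a rough touched area around the single reported touch point.
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        for (UITouch *touch in touches) {
            CGPoint center = [touch locationInView:self];
            CGFloat radius = 20.0;  // rough finger radius in points; tune to taste
            CGRect touchedArea = CGRectMake(center.x - radius, center.y - radius,
                                            radius * 2.0, radius * 2.0);
            NSLog(@"Approximate touched area: %@", NSStringFromCGRect(touchedArea));
        }
    }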

Possible to change the alpha value of certain pixels on iPhone?

Is it possible to change just a portion of a Sprite's alpha in response to user interaction? A good example of what I mean is iFog or iSteam, where the user can wipe "steam" off the iPhone's screen. Swapping images out wouldn't be feasible due to the sheer number of possibilities where the user could touch and move...
For example, say you have a simple app that has a brick wall in the background that has graffiti on it, so there'd be two sprites, one of the brick wall, then one of the graffiti that has a higher z value than the brick wall. Then, based upon where the user touches (assuming their touch controls a sandblaster), some of the graffiti should be removed, but not all of it, which could be accomplished by changing the alpha value on a portion of the graffiti sprite. Is there any way to do this in cocos2d-iphone? Or, do I need to drop down into openGL, and if so, where would be a good place to start my search on how to accomplish this?
Ideally, I'd like to accomplish this on a cocos2d-iphone Sprite, but if it's not possible, where's the best place to start looking?
Thanks in advance,
Ben
I know this question is old, but it needs an answer anyway. The answer is here: http://www.cocos2d-iphone.org/forum/topic/7921#post-46394
The short answer: you have to override the draw method and resort to OpenGL calls.
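A commonly used variant of that approach in older cocos2d versions (1.x/2.x) is to bake the graffiti into a CCRenderTexture and then "erase" it with a brush sprite whose blend function writes zero alpha. A hedged sketch follows; graffitiSprite, brush and touchLocation are assumed to exist already.

    // Hedged sketch (cocos2d 1.x/2.x era): erase parts of a sprite with CCRenderTexture.
    CGSize winSize = [[CCDirector sharedDirector] winSize];
    CCRenderTexture *canvas = [CCRenderTexture renderTextureWithWidth:winSize.width
                                                                height:winSize.height];
    canvas.position = ccp(winSize.width / 2, winSize.height / 2);
    [self addChild:canvas z:1];                 // above the brick-wall background sprite

    [canvas beginWithClear:0 g:0 b:0 a:0];
    [graffitiSprite visit];                     // bake the graffiti into the render texture once
    [canvas end];

    // Later, in the touch handler, punch a transparent hole where the user touched:
    brush.position = touchLocation;
    brush.blendFunc = (ccBlendFunc){GL_ZERO, GL_ONE_MINUS_SRC_ALPHA};
    [canvas begin];
    [brush visit];
    [canvas end];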