I am using MCU RX63N + ADS7843 + TFT LCD (GFT035AB320240Y).
I'm having a problem with touch handling.
Although I calibrate the ADC values after reading up on it, when I use a finger to touch the small buttons I still cannot get the exact touch position.
Is there any solution to this problem?
Looking forward to everyone's help.
The reason is bad calibration. Try calibrating with a touch pen (stylus) and repeat it several times.
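For a resistive controller like the ADS7843, the usual approach is a three-point calibration that maps raw ADC readings to screen coordinates through an affine transform, which also corrects rotation and skew between the panel and the LCD. Below is a rough sketch of that idea; the structure names and the solver are illustrative, not taken from any particular driver.

```c
/* Sketch of a standard 3-point affine calibration for a resistive touch
 * controller such as the ADS7843. The reference points and coefficient
 * names are illustrative, not from any particular driver. */
#include <stdint.h>

typedef struct { int32_t x, y; } point_t;

/* Affine mapping: screen_x = ax*raw_x + bx*raw_y + cx (same form for y). */
typedef struct { float ax, bx, cx, ay, by, cy; } calib_t;

/* Solve the six coefficients from three (raw, screen) reference pairs. */
static int calib_compute(const point_t raw[3], const point_t scr[3], calib_t *c)
{
    float det = (float)(raw[0].x - raw[2].x) * (raw[1].y - raw[2].y)
              - (float)(raw[1].x - raw[2].x) * (raw[0].y - raw[2].y);
    if (det == 0.0f)
        return -1;                       /* reference points are collinear */

    c->ax = ((float)(scr[0].x - scr[2].x) * (raw[1].y - raw[2].y)
           - (float)(scr[1].x - scr[2].x) * (raw[0].y - raw[2].y)) / det;
    c->bx = ((float)(raw[0].x - raw[2].x) * (scr[1].x - scr[2].x)
           - (float)(raw[1].x - raw[2].x) * (scr[0].x - scr[2].x)) / det;
    c->cx = scr[2].x - c->ax * raw[2].x - c->bx * raw[2].y;

    c->ay = ((float)(scr[0].y - scr[2].y) * (raw[1].y - raw[2].y)
           - (float)(scr[1].y - scr[2].y) * (raw[0].y - raw[2].y)) / det;
    c->by = ((float)(raw[0].x - raw[2].x) * (scr[1].y - scr[2].y)
           - (float)(raw[1].x - raw[2].x) * (scr[0].y - scr[2].y)) / det;
    c->cy = scr[2].y - c->ay * raw[2].x - c->by * raw[2].y;
    return 0;
}

/* Convert a raw ADC reading to screen coordinates. */
static point_t calib_apply(const calib_t *c, point_t raw)
{
    point_t p;
    p.x = (int32_t)(c->ax * raw.x + c->bx * raw.y + c->cx);
    p.y = (int32_t)(c->ay * raw.x + c->by * raw.y + c->cy);
    return p;
}
```

When sampling the three reference points, use a stylus, average several ADC readings per point, and spread the points well apart (for example near three corners) so the solution stays well conditioned.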
Hi everyone!
I'm doing a research project that involves detecting palms placed simultaneously on a multitouch screen.
I've done quite a bit of googling and found that there are a lot of libraries both for gesture recognition (AS3, https://github.com/fljot/Gestouch for instance) and for computer vision. I'm working with JSTouchController (https://github.com/sebleedelisle/JSTouchController), but it tracks only 5 fingers at a time. So if I place one palm on the screen and the library finds all five fingers, it won't track a second palm being placed at all. It does work correctly from time to time, though.
So, the question is: are there any libraries to track ten fingers simultaneously with acceptable quality on modern touch screens?
The number of touch points is usually restricted by the device or OS. If you want to prototype something quickly, one possibility is to use the LEAP motion sensor. It can track ten fingers in front of it, although not via touch.
I have been experimenting with the Core Motion framework to detect a user spinning around, say on a merry-go-round, holding an iPhone in his hand.
There are ways to detect the device motion around its own axes, but what is a good way to detect the iPhone spinning in circles?
Thanks
You can use the gyroscope. Take a look here: Gyroscope example
Keep in mind that it is only available on the iPhone 4 and iPhone 4S.
There is one degenerate case where you can run into trouble, only magnetometer (compass) can help in that particular case.
If you put the device (a) on a desk in a stationary position and then (b) on a perfectly horizontal turntable rotating slowly, you will get the same qualitative sensor readings. Both the gyro and the accelerometer readings are constant in the two cases, although the readings differ quantitatively. The sad part is that gyro bias error can make case (a) look like (b) and vice versa. In this particular case you need a compass to cancel the gyro drift. Case (a) is typical for a phone.
Apart from this degenerate case, gyroscopes and accelerometers with sensor fusion are sufficient to track arbitrary rotations of the device.
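For the original question of detecting spinning in circles, one approach is to project the rotation rate onto the gravity direction and integrate that component over time: spinning on a merry-go-round shows up as sustained rotation about the vertical axis. Below is a platform-agnostic sketch under that assumption; on iOS you would feed it the rotationRate and gravity vectors reported by CMDeviceMotion on each update, and the one-revolution threshold is illustrative.

```c
/* Platform-agnostic sketch: detect the device spinning in circles by
 * integrating the rotation rate about the gravity axis. On iOS you would
 * feed this the rotationRate and gravity vectors from CMDeviceMotion on
 * each update. The one-revolution threshold is illustrative. */
#include <math.h>
#include <stdbool.h>

#define TWO_PI 6.283185307179586

typedef struct { double x, y, z; } vec3_t;

static double accumulated_angle = 0.0;   /* radians spun about the vertical axis */

/* Call once per motion update; dt is the update interval in seconds.
 * Returns true each time a full revolution has been completed. */
bool spin_update(vec3_t rotation_rate, vec3_t gravity, double dt)
{
    /* Length of the gravity vector, used to normalize it. */
    double g = sqrt(gravity.x * gravity.x + gravity.y * gravity.y +
                    gravity.z * gravity.z);
    if (g < 1e-6)
        return false;

    /* Component of the rotation rate (rad/s) around the vertical (gravity) axis. */
    double rate_about_vertical = (rotation_rate.x * gravity.x +
                                  rotation_rate.y * gravity.y +
                                  rotation_rate.z * gravity.z) / g;

    accumulated_angle += rate_about_vertical * dt;

    if (fabs(accumulated_angle) >= TWO_PI) {   /* one full circle */
        accumulated_angle = 0.0;
        return true;
    }
    return false;
}
```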
I ported a game from iPhone to Android. It is all OpenGL based, with exactly the same calculations for scrolling. I noticed that on the iPhone I can scroll through the game faster, and the starting speed as I lifted my finger felt the same as the speed my finger was moving.
However, on the Android device this was unfortunately not the same. As I lifted my finger while scrolling, the starting scroll speed felt slower. Scrolling on the iPhone feels more accurate.
Is there anything special about how Android handles touches that is different from the iPhone, and how can I take that into account to achieve a similar feeling to the iPhone? On Android, in all applications, the fling speed at the point I lift my finger doesn't feel the same as how fast my finger was moving.
I found it. Android gives an ACTION_MOVE and then an ACTION_UP for the same (or nearly the same) location. The iPhone doesn't do this at all; it just gives a touchesEnded. So if there is motion I will always have at least 3 points on Android, but on the iPhone it's 2 points (touch down and touch up, at different locations).
The next thing is that the ACTION_MOVE and ACTION_UP don't happen right away, and the time interval between them is significant when calculating the average speed of the sampled points.
Solution: On ACTION_UP, if there is a preceding ACTION_MOVE, slide all the stored timestamps so that the ACTION_MOVE "happens" when the ACTION_UP occurred. Don't put the final touch-up point into the speed calculation, and then calculate the speed as usual.
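A rough sketch of that calculation, with an illustrative sample layout and a simple start-to-end average (the field names and helper are mine, not the exact code used):

```c
/* Sketch of the fix described above: drop the final ACTION_UP point, slide
 * the stored timestamps so the last ACTION_MOVE coincides with the moment
 * the finger lifted, then average the speed over the remaining samples. */
#include <stddef.h>
#include <math.h>

typedef struct {
    float  x, y;      /* position in pixels */
    double t;         /* timestamp in seconds */
} sample_t;

/* samples[0..count-1] are the recorded points for one gesture; the last
 * entry is the ACTION_UP point. Returns average speed in pixels per second. */
float fling_speed(sample_t *samples, size_t count, double up_time)
{
    if (count < 3)                        /* need down, at least one move, up */
        return 0.0f;

    size_t last = count - 2;              /* last ACTION_MOVE; ACTION_UP ignored */

    /* Slide every kept timestamp so the last move "happens" at up_time. */
    double shift = up_time - samples[last].t;
    for (size_t i = 0; i <= last; ++i)
        samples[i].t += shift;

    /* Average speed from the first point to the last kept move. */
    double dx = samples[last].x - samples[0].x;
    double dy = samples[last].y - samples[0].y;
    double dt = samples[last].t - samples[0].t;

    return (dt > 0.0) ? (float)(sqrt(dx * dx + dy * dy) / dt) : 0.0f;
}
```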
Summary
On Android, if you have moved your finger, you get an ACTION_MOVE before the ACTION_UP, and the ACTION_UP is roughly at the same location as the ACTION_MOVE, making it seem as if the speed at the end is roughly 0. The iPhone does not do this: it does not give a touchesMoved just before the final touchesEnded (i.e. touch down, move finger, lift; if you do it fast enough on the iPhone you won't get the intermediate touchesMoved event, whereas on Android you do).
Android / iPhone equivalents
ACTION_UP = touchesEnded
ACTION_DOWN = touchesBegan
ACTION_MOVE = touchesMoved
Also, I noticed that Android has some touch-point history functions. I did not use those; I stored the touches and their timestamps in my own array.
Are you taking into account the dpi of the screen, or resolution of the screen, to either do your flings in dots-per-second or in px-per-second? That could easily affect things.
In addition, yes, Android's touch processing seems to be a bit slower. You might be able to add a 'fudge factor' of some amount to get a closer response (maybe +10% or something like that). That said, the 'fling' speed of Android apps is a combination of the touchscreen and the framework's particular math for determining fling speed and rate of decay, so in apps OTHER than yours you could simply be seeing very different math/approaches between the two platforms.
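To illustrate the density point, here is a minimal sketch of normalizing a fling velocity from pixels per second to density-independent units so that devices with different dpi compare fairly (160 dpi is Android's reference density; the function name is mine):

```c
/* Normalize a fling velocity by screen density so devices with different
 * dpi feel comparable. 160 dpi is Android's reference density. */
float fling_dp_per_second(float velocity_px_per_second, float screen_dpi)
{
    float density = screen_dpi / 160.0f;     /* pixels per dp */
    return velocity_px_per_second / density; /* dp per second */
}
```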
I'm using the accelerometer to move an object on my screen.
It only works when the iPhone is flat.
If I use the iPhone in another position, the object does not move the way I want
(the Y axis is not handled well).
So I have to calibrate the accelerometer's reference position, I guess,
but I have no idea how to do this.
Please help me.
Thanks in advance.
Regards,
ALpesH
Hi, hope all is well.
A simple Google search came up with this beautiful post.
It is exactly what you wanted: a tutorial on Accelerometer Calibration & Optimizations.
This tutorial assumes you know the basics of the accelerometer. If not, there are plenty of tutorials on Google that will help you get accustomed to the accelerometer basics.
This tutorial will focus on 3 things:
Calibrating the accelerometer so the user can play your game from any position.
Changing the "sensitivity" of your object's movement.
Adding the option to "invert" the controls.
First off, why bother adding these features? Simple. Launch your accelerometer based game and try the following tests:
Play sitting up in perfect position.
Play it slouched over.
Play it lying down on your side.
Play it lying on your back with the device parallel to the floor and the screen facing you.
Obtained from this website:
http://www.iphonedevsdk.com/forum/iphone-sdk-tutorials/39833-tutorial-accelerometer-calibration-optimizations.html
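The tutorial's own code isn't reproduced here, but the core idea it describes can be sketched roughly like this: capture the accelerometer reading at the player's chosen resting position as a reference, drive the object from readings relative to that reference, and expose sensitivity and invert options. All names and values below are illustrative.

```c
/* Rough sketch of the calibration idea: remember the resting posture and
 * control the object from readings relative to it. Names and values are
 * illustrative, not the tutorial's code. */
typedef struct { float x, y, z; } accel_t;

static accel_t reference   = { 0.0f, 0.0f, -1.0f }; /* default: device flat */
static float   sensitivity = 15.0f;                 /* movement per unit g */
static int     inverted    = 0;                     /* 1 = invert controls */

/* Call when the player taps "Calibrate": remember the current posture. */
void calibrate(accel_t current)
{
    reference = current;
}

/* Convert a raw reading into a screen-space movement delta. */
void movement_delta(accel_t current, float *dx, float *dy)
{
    float sign = inverted ? -1.0f : 1.0f;
    *dx = sign * (current.x - reference.x) * sensitivity;
    *dy = sign * (current.y - reference.y) * sensitivity;
}
```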
Let me know if this helps; if it does, choose this as the answer.
Other than that, if you need any more help, let me know.
I've implemented this tutorial's code in my own app, and I am now playing a simple ping-pong game upside down, and also playing it on my side while lying in bed. :D
Pk
I am developing a game where I want to move a UIImageView based on the accelerometer. When I rotate the iPhone from left to right or right to left, the UIImageView has to rotate by a particular angle. It does move, but the problem occurs when I play a background sound: because of that sound, the accelerometer reports acceleration even when my iPhone is idle.
So my UIImageView moves as well, which should not happen. When I decrease the iPhone's volume it works very well. What do I have to do about this?
Also, does anyone know how to get acceleration readings only when the iPhone moves from left to right or right to left? It should not detect movement when the iPhone is in the xz or yz plane.
If anybody knows the solution please reply.
Have you got any filtering on the input from the accelerometer? I would expect the noise the accelerometer is picking up from the speaker to be vastly different in amplitude and frequency from the game-control motion.
There is a simple low-pass filter in Apple's accelerometer graph sample code.
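The general shape of that kind of filter is below: blend each new reading with the previous filtered value so high-frequency speaker vibration is attenuated while slower hand motion passes through. The smoothing factor here is an illustrative value, not necessarily the one from the sample.

```c
/* Simple first-order low-pass filter for accelerometer input: each new
 * reading is blended with the previous filtered value, attenuating
 * high-frequency vibration while keeping slower tilting motion. */
#define FILTER_FACTOR 0.1f   /* smaller = smoother but laggier */

typedef struct { float x, y, z; } accel_t;

static accel_t filtered = { 0.0f, 0.0f, 0.0f };

/* Call once per accelerometer update; returns the smoothed reading. */
accel_t low_pass(accel_t raw)
{
    filtered.x = raw.x * FILTER_FACTOR + filtered.x * (1.0f - FILTER_FACTOR);
    filtered.y = raw.y * FILTER_FACTOR + filtered.y * (1.0f - FILTER_FACTOR);
    filtered.z = raw.z * FILTER_FACTOR + filtered.z * (1.0f - FILTER_FACTOR);
    return filtered;
}
```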