iPhone moving direction using accelerometer

I want some help with the UIAccelerometer class. I want to find a way to distinguish when I wave the iPhone from right-to-left versus left-to-right. I am getting x, y, z values as well as the time interval, and I am also computing velocity and distance from the usual physics rules:
Distance = prevDist + sqrt(pow((prevx - acceleration.x), 2) + pow((prevy - acceleration.y), 2) + pow((prevz - acceleration.z), 2))
Velocity = Distance * timeinterval { here distance means newdistance - prevdistance, and timeinterval is the time for that distance }
When I move the iPhone right-to-left and then left-to-right, I get the total distance traveled in both directions and the velocity at regular intervals. Can you help me figure out whether I am moving the device left-to-right or right-to-left?
Help would be appreciated.
Thanks.

First, you have a few physics errors. Your distance is not a 'distance' variable because you are summing acceleration values from the accelerometer. Also, your velocity is actually a 'speed' variable because you are multiplying acceleration by a time interval.
What you really want to do is monitor acceleration.x for values that exceed a certain pre-determined magnitude. This magnitude should be low enough that the user does not have to swing the iPhone violently, but high enough that ordinary movements will not cross your threshold. If acceleration.x is greater than zero by your pre-determined threshold or more, you are accelerating right; if acceleration.x is less than zero by your pre-determined amount or more, you are accelerating left.
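A minimal sketch of that thresholding with the old UIAccelerometer delegate API; kThreshold is an assumed tuning constant:
// Raw accelerometer values are in units of g (1.0 = 9.81 m/s^2).
static const UIAccelerationValue kThreshold = 0.75;

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    if (acceleration.x > kThreshold) {
        // accelerating to the right (left-to-right wave)
    } else if (acceleration.x < -kThreshold) {
        // accelerating to the left (right-to-left wave)
    }
}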
It is far more difficult to determine the velocity of the phone over a period of time, because you would have to integrate the acceleration values continuously. The iPhone accelerometer is not especially precise and can be noisy, so after a short period of time your application will probably be convinced your phone is drifting in random directions. Also, remember that gravity will always show up in your acceleration values.
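One common way to keep gravity out of the threshold test is the simple high-pass filter from Apple's accelerometer sample code; a sketch, with kFilteringFactor as an assumed constant around 0.1:
// Low-pass filter isolates the slowly-changing gravity component...
gravityX = (acceleration.x * kFilteringFactor) + (gravityX * (1.0 - kFilteringFactor));
// ...and subtracting it leaves the user-generated acceleration.
double userAccelX = acceleration.x - gravityX;
// Feed userAccelX into the threshold test instead of the raw value.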

It depends on which way the user will be holding the device.
You don't need the velocity or the distance to figure out which way the device is being swung.
Instead, you can keep a history of the past (say 5) accelerations.
Assuming that the device is held the normal way (portrait, right-side up), you would use the average of the previous x accelerations to determine the direction.
So basically, here is what I did in the accelerometer:didAccelerate: delegate method:
//prevAccels is an NSMutableArray holding recent UIAcceleration objects
float VAL = .75;
int HISTORY_NUMBER = 5;
if (acceleration.x > VAL && acceleration.y > VAL) { //if there has been a sudden jerking motion
    float avgX = 0; //must start at zero before summing
    for (int i = 0; i < [prevAccels count]; i++)
    {
        UIAcceleration *a = [prevAccels objectAtIndex:i];
        avgX += a.x;
    }
    avgX /= HISTORY_NUMBER; //get the average of the previous accelerations
    BOOL flickRight = (avgX < -VAL) ? YES : NO;
    //[self wave:flickRight]; //do stuff here
}
[prevAccels addObject:acceleration]; //add the acceleration to the history
if ([prevAccels count] > HISTORY_NUMBER) [prevAccels removeObjectAtIndex:0]; //trim the list if it is too large
Hope that helps!

One thing to keep in mind is that gravity is always affecting the accelerometers, so that needs to be taken into account. Unfortunately, you can't count on straightforward methods to compute distances and actual velocity, because at any instant you don't know where "down" is.
To note when a "swing left" or "swing right" happens, also consider that the movement is likely to be in an arc, and you'll get opposite acceleration when the swinging stops. To know that a swing is underway, take a look at the "out" acceleration vector, i.e. the one that points away from the center of rotation. Using that, you can distinguish between a "start moving" state and a "stop moving" state, which otherwise have equal-but-opposite acceleration values.
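A rough sketch of that idea; the axis assignments and both thresholds are assumptions that depend on how the phone is held (swing along x, centripetal "out" component on z):
// While the arm is actually rotating, centripetal acceleration keeps the "out"
// axis elevated; before a swing starts it is near zero.
BOOL alreadySwinging = fabs(acceleration.z) > kOutwardThreshold;
if (fabs(acceleration.x) > kSwingThreshold) {
    if (!alreadySwinging) {
        // start of a swing: the sign of acceleration.x gives the direction
    } else {
        // equal-but-opposite spike at the end of the swing: treat as the stop
    }
}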

Related

Calculating Lean Angle with Core Motion

I have a record session feature in my application. When the user starts a record session, I start collecting data from the device's CMMotionManager object and store it in Core Data to process and present later. The data I'm collecting includes GPS data, accelerometer data, and gyro data, at a frequency of 10 Hz.
Currently I'm struggling to calculate the lean angle of the device from the motion data. It is possible to tell which side of the device faces the ground by using the gravity data, but I want to calculate the right or left lean angle between the user and the ground, regardless of travel direction.
This problem requires some linear algebra knowledge to solve. For example, at one point I must calculate the equation of a 3D line on a calculated plane. I have been working on this for a day and it keeps getting more complex, and I'm not good at math at all. Some math examples related to the problem would be appreciated too.
It depends on what you want to do with the collected data and in what ways the user will move with the recording iPhone in her/his pocket. The reason is that Euler angles are not a safe, and especially not a unique, way to express a rotation. Consider a situation where the user puts the phone upright into the back pocket of his jeans and then turns left by 90°. Because CMAttitude is related to a device lying flat on the table, you have two subsequent rotations (with the order pitch=x, roll=y, yaw=z):
pitch +90° for getting the phone upright => (90, 0, 0)
roll +90° for turning left => (90, 90, 0)
But you can get the same position by:
yaw +90° for turning the phone left => (0, 0, 90)
pitch -90° for making the phone upright => (-90, 0, 90)
You see: two different representations, (90, 90, 0) and (-90, 0, 90), for the same rotation, and there are more of them. So you press the Start button, do some fancy rotations to put the phone into the pocket, and you are in trouble, because you can't rely on Euler angles when doing more complex motions (see gimbal lock for more headaches on this ;-)
Now the good news: you are right, linear algebra will do the job. What you can do is force your users to always put the phone in the same position, e.g. fixed upright in the right back pocket, and calculate the angle(s) relative to the ground by building the dot product of the gravity vector from CMDeviceMotion, g = (x, y, z), and the position vector p, which is the -Y axis (0, -1, 0) in the upright position:
g • p = x*0 + y*(-1) + z*0 = -y = ||g|| * ||p|| * cos(alpha) = ||g|| * 1 * cos(alpha)
=> alpha = arccos(-y / 9.81) as the total angle. Note that the gravitational acceleration ||g|| is constantly about 9.81 m/s².
To get the left-right lean angle and the forward-back angle we use the tangent:
alphaLR = arctan(x / y)
alphaFB = arctan(z / y)
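A small sketch of these formulas in code, assuming the gravity vector comes from CMDeviceMotion (which reports gravity in units of g, so its magnitude is about 1 rather than 9.81) and the upright (0, -1, 0) reference position:
CMDeviceMotion *dm = motionManager.deviceMotion;
double x = dm.gravity.x, y = dm.gravity.y, z = dm.gravity.z;
double norm = sqrt(x*x + y*y + z*z);              // ~1.0 in units of g
double alpha   = acos(-y / norm);                 // total angle, radians
double alphaLR = atan2(x, -y) * 180.0 / M_PI;     // left-right lean, degrees
double alphaFB = atan2(z, -y) * 180.0 / M_PI;     // forward-back lean, degrees
atan2 is used here instead of a bare arctan(x/y) so the signs come out right in every quadrant.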
[UPDATE:]
If you can't rely on having the phone at a predefined position like (0, -1, 0) in the equations above, you can only calculate the total angle, but not the specific ones alphaLR and alphaFB. The reason is that you only have one axis of the new coordinate system, whereas you need two of them. The new Y axis y' will then be defined as the average gravity vector, but you don't know your new X axis, because every vector perpendicular to y' would be valid.
So you have to provide further information, like having the users walk a longer distance in one direction without deviating, and using GPS and magnetometer data to get the second axis z'. Sounds pretty error-prone in practice.
The total angle is no problem, as we can replace (0, -1, 0) with the average gravity vector (pX, pY, pZ):
g • p = x*pX + y*pY + z*pZ = ||g|| * ||p|| * cos(alpha) = ||g||^2 * cos(alpha)
alpha = arccos((x*pX + y*pY + z*pZ) / 9.81^2)
Two more things to bear in mind:
Different persons wear different trousers with different pockets. So the gravity vector will differ even for the same person wearing other clothes, and you might need some kind of normalisation.
CMMotionManager does not work in the background, i.e. the user must not push the standby button.
If I understand your question, I think you are interested in getting the attitude of your device. You can do this using the attitude property of the CMDeviceMotion object that you get from the deviceMotion property of the CMMotionManager object.
There are two different angles that you might be interested in the CMAttitude class: roll and pitch. If you imagine your device as an airplane with the propeller at the top (where the headphone jack is), pitch is the angle the plane/device would make with the ground if the plane were in a climb or dive. Meanwhile, roll is the angle that the "wings" would make with the ground if the plane were to be banking or in mid barrel roll.
(BTW, there is a third angle called yaw that I think is not relevant for your question.)
The angles will be given in radians, but it's easy enough to convert them to degrees if that's what you want (by multiplying by 180 and then dividing by pi).
Assuming I understand what you want, the good news is that you may not need to understand any linear algebra to capture and use these angles. (If I'm missing something, please clarify and I'd be happy to help further.)
UPDATE (based on comments):
The attitude values in the CMAttitude object are relative to the ground (i.e., the default reference frame has the Z-axis vertical, pointing in the opposite direction from gravity), so you don't have to worry about cancelling out gravity. So, for example, if you lay your device on a flat table top and then roll it up onto its side, the roll property of the CMAttitude object will change from 0 to plus or minus 90 degrees (+-0.5*pi radians), depending on which side you roll it onto. Meanwhile, if you start it lying flat and then gradually stand it up on its end, the same will happen to the pitch property.
While you can use the pitch, roll, and yaw angles directly if you want, you can also set a different reference frame (e.g., a different direction for "up"). To do this, just capture the attitude in that orientation during a "calibration" step and then use CMAttitude's multiplyByInverseOfAttitude: method to transform your attitude data to the new reference frame.
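For example, a sketch of that calibration step, assuming a motionManager whose device-motion updates are already running:
// Capture the reference attitude once, e.g. when the user taps "Calibrate".
CMAttitude *referenceAttitude = [motionManager.deviceMotion.attitude copy];

// Later, express each new sample relative to that reference frame.
CMAttitude *attitude = [motionManager.deviceMotion.attitude copy];
[attitude multiplyByInverseOfAttitude:referenceAttitude];
double pitchDegrees = attitude.pitch * 180.0 / M_PI;
double rollDegrees  = attitude.roll  * 180.0 / M_PI;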
Even though your question only mentioned capturing the "lean angle" (with the ground), you will probably want to capture at least 2 of the 3 attitude angles (e.g., pitch and either roll or yaw, depending on what they are doing), potentially all three, if the device is going to be in a person's pocket. (The device could rotate in the pocket in various ways if the pocket is baggy, for example.) For the most part, though, I think you will probably be able to rely on just two of the three (unless you see radical shifts in yaw throughout the course of a recording session). So, for example, in my jeans pocket, the phone is usually nearly vertical. Thus, for me, pitch would vary a whole bunch as I, say, walk, sit, or run. Roll would vary whenever I change the direction I'm facing. Meanwhile, yaw would not vary much at all (unless I do cartwheels, which I can't!). So yaw can probably be ignored for me.
To summarize the main point: to use these attitude angles, you don't need to do any linear algebra, nor worry about gravity (although you may want to use this for other purposes, of course).
UPDATE 2 (based on Kay's new post):
Kay just replied and showed how to use gravity and linear algebra to make sure your angles are unique. (And, btw, I think you should give the bounty to that post, fwiw.)
Depending on what you want to do, you may want to use this math. You would want to use the linear algebra and gravity if you need a standardized way of "talking about" and/or comparing attitudes over the course of your recording session. If you just want to visualize them, you can probably still get away with not using the increased complexity. (For example, visualizing (pitch=90, roll=0, yaw=0) should be the same as visualizing (pitch=0, roll=90, yaw=90).) In my approach above, while you could have multiple ways of referring to the "same" attitude, none of them is actually wrong, per se. They will still give you the angles relative to the ground.
But the fact that the gyroscope can switch from one valid description of an attitude to another means that what I wrote above about getting away with only 2 of the 3 components needs to be corrected: because of this, you will need to capture all three components, no matter what. Sorry.

How many pixels is an impulse in box2d

I have a ball that you blow on with air. I want the ball to be blown more if it is close to the blower and less if it is farther away. I am using Box2D and the impulse function: body->ApplyLinearImpulse(force, body->GetPosition()). I can't seem to find a formula or a way to accomplish this. If I want the ball to travel a total distance of 300 pixels to the right, how could I accomplish this? Please help.
If you want to calculate the distance before simulating, you have to take a look at the Box2D sources. During simulation the velocity of the body is modified according to gravity, extra applied forces, linear damping, angular damping, and possibly more. The result also depends on the number of velocity iterations.
But I think if you want a really smooth motion (like from a blow) you'd better use the ApplyForce function instead of an impulse. Just be sure you are applying the force each simulation step.
EDIT:
You can also simulate air resistance as:
Fa = -k * V * V. I've simulated movement in a pipe this way; it worked great.
So each step you can do something like this:
BlowForce = k1 / distance;            // k1 - coefficient
Resistance = -k2 * V * V;             // k2 - another coefficient
TotalForce = BlowForce + Resistance;
body->ApplyForce(TotalForce, body->GetWorldCenter()); // b2Body::ApplyForce needs an application point
I am not a Box2D expert, but what I would do is create a small invisible box and let the ball be hit by it; if the blower is blowing harder, I would give the box more speed in the opposite direction. As far as the 300-pixel distance is concerned, you have to adjust the forces and velocity so that the ball travels
300 / <your_rendering_window_to_physics_world_ratio>
in the physics world.
Force = mass * acceleration, so take the mass you set your body to, calculate the acceleration you want (remembering to divide 300 px by PTM_RATIO), and then multiply the two together.
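One hedged way to turn a target distance into an impulse, assuming the body has linear damping set: with damping k the velocity decays roughly exponentially, so the coasting distance is approximately v0 / k. A sketch (PTM_RATIO, the pixels-per-meter constant, is assumed from a cocos2d-style setup):
float k = body->GetLinearDamping();        // estimate only valid for k > 0
float targetMeters = 300.0f / PTM_RATIO;   // convert the 300 px target to physics units
b2Vec2 v0(k * targetMeters, 0.0f);         // initial velocity needed, to the right
b2Vec2 impulse = body->GetMass() * v0;     // impulse J = m * v
body->ApplyLinearImpulse(impulse, body->GetWorldCenter());
Tune k and the estimate against what you actually observe, since gravity and collisions will change the real distance.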

Distance moved by Accelerometer

I want to move objects on the iPhone screen (rectangle, circle, and so on) by moving the iPhone itself.
For example, I move the iPhone along the X-axis and the object moves along the X-axis; the same for the Y- and Z-axes.
How can I do this?
Can I get an algorithm for it?
Thank you.
P.S.:
I looked around for a while, and it seems this is possible using the accelerometer.
You get position by integrating the linear acceleration twice but the error is horrible. It is useless in practice.
Here is an explanation why (Google Tech Talk) at 23:20. I highly recommend this video.
However the gyro mouse might work for your application, see between 37:00-38:25 in the video.
Similar questions:
track small movements of iphone with no GPS
What is the real world accuracy of phone accelerometers when used for positioning?
how to calculate phone's movement in the vertical direction from rest?
iOS: Movement Precision in 3D Space
How to use Accelerometer to measure distance for Android Application Development
How can I find distance traveled with a gyroscope and accelerometer?
Not easy to do accurately. An accelerometer only reports acceleration, not movement. You can try (numerically) integrating the acceleration over time, but you're likely to end up with cumulative errors adding up and leading to unintended motion.
Pseudocode, obviously untested:
init:
    loc = {0, 0, 0} ; location of object on screen
    vel = {0, 0, 0} ; velocity of object on screen
    t = 0           ; last time measured
step:
    t0 = time       ; get amount of time since last measurement
    dt = t0 - t
    t = t0          ; remember this measurement time for the next step
    accel = readAccelerometer()
    vel += accel * dt   ; update velocity
    loc += vel * dt     ; update position
    displayObjectAt(loc)
Why don't you use the acceleration itself instead of the distance moved? For example:
If your acceleration is 1.4 g east, your object could move east at 14 pixels per second until the acceleration changes.
That way the object's movement is like being under a proportional force, which I think is more realistic: the faster the device moves, the bigger the force on the object.
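A tiny sketch of that mapping with the old UIAccelerometer delegate; objectVelocity is an assumed ivar, and the factor of 10 px/s per g just matches the 1.4 g -> 14 px/s example above:
- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    const CGFloat kPixelsPerSecondPerG = 10.0;   // assumed tuning constant
    objectVelocity.x = acceleration.x * kPixelsPerSecondPerG;
    objectVelocity.y = -acceleration.y * kPixelsPerSecondPerG; // screen y grows downward
    // a display timer then moves the object by objectVelocity * frame duration each tick
}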
Also look:
http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf
and
Using accelerometer, gyroscope and compass to calculate device's movement in 3D world

Compensating compass lag with the gyroscope on iPhone 4

I've been experimenting with the compass and gyroscope on iPhone 4 and would like some help with an issue I'm having. I want to compensate for the slowness of the compass by using data from the gyroscope.
Using CMMotionManager and its CMDeviceMotion object (motionManager.deviceMotion), I get the CMAttitude object. Correct me if I'm wrong (please), but here is what I've deduced from the CMAttitude object's yaw property (I don't need pitch or roll for my purposes):
yaw ranges from 0 to PI when the phone is pointing downwards (as indicated by deviceMotion.gravity.z) and swinging counterclockwise and 0 to -PI when swung clockwise
when the device is pointing upwards, yaw ranges from -PI to 0 and PI to 0, respectively
and from the compass data (I'm using locationManager.heading.magneticHeading), I see that the compass gives values from 0 to 360, with the value increasing when swinging clockwise
All right, so using all of this information together, I'm able to get a value I call horizontal that, regardless of whether the device is pointing up or down, will give values from 0 to 360 and increase when the device is swung clockwise (though I am still having trouble when deviceMotion.gravity.z is around 0 -- the yaw value freaks out at that gravity.z value).
It seems to me that I could "synchronize" the horizontal and magneticHeading values, using a calculated horizontal value that maps to magneticHeading, and "synchronize" the horizontal value to magneticHeading when I feel the compass has "caught up."
So my questions:
Am I on the right track with this?
Am I using the gyro data from CMDeviceMotion properly and the assumptions I listed above correct?
Why might yaw freak out when gravity.z is around 0?
Thank you very much. I look forward to hearing your answers!
Just trying to answer... correct me if I'm wrong:
1. Yes, you are on the right track.
2. Gravity in Core Motion is already isolated from user acceleration; that's why there are two values, "gravity" and "userAcceleration", in Apple's Core Motion documentation. (Note: not entirely isolated, though.)
3. If gravity is 0 on an axis, it means that axis is perpendicular to gravity. gravity.z is perpendicular to the iPhone screen, which is why it reads about -9.82 m/s² when the phone lies on a desk with the screen facing up. In practice it is hard to get exactly 0 or the maximum gravity value because of sensor noise (all sensors are noisy, especially cheap ones). What I do in my apps is switch my reference axis to another axis (in your case maybe x or y) beyond certain limits; the exact strategy depends on your purpose and which side is your reference.
One other thing: the gyro is fast but not stable, so you need to recalibrate the value at regular intervals, in my case every 5 seconds. I experimented with the gyro for calculating the angle between two planes, testing against an exactly 90-degree ruler, and it gave an error of about 0.5 degree per trial that kept increasing. That is just my result; maybe others have a better method for avoiding the drift.
Here are my steps:
1. Init.
2. Read gravity XYZ -> Xg, Yg, Zg.
3. Check if Xg < 0.25; if TRUE, try Yg, then Zg. // Note: 1 = 1 g = 9.82 m/s²
4. Read the compass and gyro.
5. Configure and calibrate the gyro using the compass, based on which axis I chose in step 3.
6. Every 5 seconds, read the compass again to recalibrate.
7. If the difference from the gyro reading is > 5 degrees, skip recalibrating the gyro.
8. If the difference from the gyro reading is < 5 degrees, calibrate the gyro using the compass value.
Note on step 7: this checks whether the phone is affected by a magnetic field, e.g. near a large steel structure, a high-voltage power line, or noisy heavy equipment in a factory.
That's all... I hope this helps!
Here is an example of an iPhone app where the compass gets compensated with the gyroscope. Code and project can be seen here:
http://www.sundh.com/blog/2011/09/stabalize-compass-of-iphone-with-gyroscope/
The direction of the yaw axis vector is undefined when in zero gravity (or free fall, or close enough).
In order to do synchronization while in motion, you need to create a filter for your "horizontal" value that has the same lag/delay response characteristics as the magnetic compass. Either that, or wait until motion stops long enough for both values to settle before recalculating the offset.
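One common shape for such a filter is a complementary filter: integrate the fast gyro yaw rate every sample, then gently pull the result toward the slow-but-absolute compass heading. A sketch per sample, where fusedHeading is an assumed instance variable, dt is the seconds since the previous sample, and the 0.02 gain is purely illustrative:
double gyroRateZ = motionManager.deviceMotion.rotationRate.z; // rad/s, counterclockwise positive
fusedHeading -= gyroRateZ * dt * 180.0 / M_PI;   // compass headings grow clockwise
double error = locationManager.heading.magneticHeading - fusedHeading;
while (error > 180.0)  error -= 360.0;           // wrap so we correct the short way around
while (error < -180.0) error += 360.0;
fusedHeading += 0.02 * error;                    // small, steady pull toward the compass
fusedHeading = fmod(fusedHeading + 360.0, 360.0);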
Answer to question 1: yes. Question 2: you are on the right track, though you could pick a variable name clearer than 'horizontal'. Question 3 is answered by hotpaw2; as an aside, a yaw in a helicopter at near-zero altitude would alert the pilot with an alarm. There is a time lag because part of the software is local, while other factors can slow it down: access to the sensor that detects magnetic fields, the device position and direction, preparing the graphic output for the compass display, and computing and outputting data from the gyro and sensors through a relatively slow interface, all on a general-purpose handheld device not custom-designed for this kind of task.

How to detect iPhone movement in space using accelerometer?

I am trying to make an application that would detect what kind of shape you made with your iPhone, using the accelerometer.
As an example, if you draw a circle with the hand holding the iPhone, the app would be able to redraw it on the screen.
This could also work with squares, or even more complicated shapes.
The only example of an application I've seen doing such a thing is AirPaint (http://vimeo.com/2276713), but it doesn't seem to be able to do it in real time.
My first try was to apply a low-pass filter on the X and Y parameters from the accelerometer, and to make a pointer move toward these values, proportionally to the size of the screen.
But this is clearly not enough: I get very low accuracy, and if I shake the device it also makes the pointer move...
Any ideas about that?
Do you think accelerometer data is enough to do it? Or should I consider using other data, such as the compass?
Thanks in advance!
OK, I have found something that seems to work, but I still have some problems.
Here is how I proceed (assuming the device is held vertically):
1 - I have my default x, y, and z values.
2 - I extract the gravity vector from this data using a low-pass filter.
3 - I subtract the normalized gravity vector from each of x, y, and z, and get the movement acceleration.
4 - Then I integrate this acceleration value with respect to time, so I get the velocity.
5 - I integrate this velocity again with respect to time, and find a position.
All of the code below is in the accelerometer:didAccelerate: delegate of my controller.
I am trying to make a ball move according to the position I found.
Here is my code:
NSTimeInterval interval = 0;
NSDate *now = [NSDate date];
if (previousDate != nil)
{
    interval = [now timeIntervalSinceDate:previousDate];
}
previousDate = now;

//Isolating gravity vector
gravity.x = acceleration.x * kFilteringFactor + gravity.x * (1.0 - kFilteringFactor);
gravity.y = acceleration.y * kFilteringFactor + gravity.y * (1.0 - kFilteringFactor);
gravity.z = acceleration.z * kFilteringFactor + gravity.z * (1.0 - kFilteringFactor);
float gravityNorm = sqrt(gravity.x * gravity.x + gravity.y * gravity.y + gravity.z * gravity.z);

//Removing gravity vector from initial acceleration
filteredAcceleration.x = acceleration.x - gravity.x / gravityNorm;
filteredAcceleration.y = acceleration.y - gravity.y / gravityNorm;
filteredAcceleration.z = acceleration.z - gravity.z / gravityNorm;

//Calculating velocity related to time interval
velocity.x = velocity.x + filteredAcceleration.x * interval;
velocity.y = velocity.y + filteredAcceleration.y * interval;
velocity.z = velocity.z + filteredAcceleration.z * interval;

//Finding position
position.x = position.x + velocity.x * interval * 160;
position.y = position.y + velocity.y * interval * 230;
If I execute this, I get quite good values; I mean, I can see the acceleration going into positive or negative values according to the movements I make.
But when I try to apply that position to my ball view, I can see it moving, but with a propensity to go more in one direction than the other. For example, if I draw circles with my device, I see the ball describing curves towards the top-left corner of the screen.
Something like this: http://img685.imageshack.us/i/capturedcran20100422133.png/
Do you have any ideas about what is happening?
Thanks in advance!
The problem is that you can't integrate acceleration twice to get position, not without knowing the initial position and velocity. Remember the +C term that you added in school when learning about integration? Well, by the time you get to position it is a c*t + k term, and it is significant. That's before you consider that the acceleration data you're getting back is quantised and averaged, so you're not actually integrating the device's true acceleration. Those errors end up being large when integrated twice.
Watch the AirPaint demo closely and you'll see exactly this happening, the shapes rendered are significantly different to the shapes moved.
Even devices that have some position and velocity sensing (a Wiimote, for example) have trouble doing gesture recognition. It is a tricky problem that folks pay good money (to companies like AILive, for example) to solve for them.
Having said that, you can probably quite easily distinguish between certain types of gesture, if their large scale characteristics are different. A circle can be detected if the device has received accelerations in each of six angle ranges (for example). You could detect between swiping the iphone through the air and shaking it.
To tell the difference between a circle and a square is going to be much more difficult.
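As a sketch of the six-angle-range idea above (seenSectors and kCircleThreshold are assumed names): bucket the direction of the in-plane acceleration into six 60° sectors and call the gesture a circle once every sector has been visited:
double magnitude = sqrt(acceleration.x * acceleration.x +
                        acceleration.y * acceleration.y);
if (magnitude > kCircleThreshold) {                              // skip weak, noisy samples
    double angle = atan2(acceleration.y, acceleration.x) + M_PI; // 0 .. 2*pi
    int sector = MIN((int)(angle / (M_PI / 3.0)), 5);            // six 60-degree bins
    seenSectors |= (1 << sector);                                // seenSectors: int ivar, reset per gesture
    if (seenSectors == 0x3F) {
        // accelerations seen in all six ranges: plausibly a circle
    }
}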
You need to look up how acceleration relates to velocity and velocity to position. My mind is having a wee fart at the moment, but I am sure it's the integral... you want to integrate acceleration with respect to time. Wikipedia should help you with the maths, and I am sure there is a good library somewhere that can help you out.
Just remember, though, that the accelerometers are not perfect, nor polled fast enough. Really sudden movements may not be picked up that well. But for gently drawing in the air, it should work fine.
It seems like you are normalizing your gravity vector before subtracting it from the instantaneous acceleration. This keeps the relative orientation but removes any relative scale. The latest device I tested (admittedly not an iDevice) returned gravity at roughly -9.8, which is probably calibrated to m/s². Assuming no other acceleration, if you normalize this and then subtract it from the filtered pass, you end up with a current accel of -8.8 instead of 0.0.
Two options:
- You can just subtract out the (unnormalized) gravity vector after the filter pass, as sketched below.
- Capture the initial accel vector length, normalize the accel and gravity vectors, and scale the accel vector by the dot product of the accel and gravity normals.
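A sketch of the first option applied to the question's code: subtract the filtered gravity estimate directly, without dividing by its norm, so the units stay consistent:
//Removing gravity from the instantaneous reading
filteredAcceleration.x = acceleration.x - gravity.x;
filteredAcceleration.y = acceleration.y - gravity.y;
filteredAcceleration.z = acceleration.z - gravity.z;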
Also worth remembering to take the orientation of the device into account.