Calculating Lean Angle with Core Motion - iPhone

I have a record session in my application. When the user starts a record session I start collecting data from the device's CMMotionManager object and store it in Core Data to process and present later. The data I'm collecting includes GPS data, accelerometer data and gyro data. The data frequency is 10 Hz.
Currently I'm struggling to calculate the lean angle of the device from the motion data. It is possible to tell which side of the device faces the ground by using the gravity data, but I want to calculate the right or left lean angle between the user and the ground regardless of travel direction.
This problem requires some linear algebra knowledge to solve. For example, at some point I must calculate the equation of a 3D line on a calculated plane. I have been working on this for a day and it's getting more and more complex. I'm not good at math at all. Some math examples related to the problem would be appreciated too.

It depends on what you want to do with the collected data and on how the user will carry the recording iPhone in her/his pocket. The reason is that Euler angles are not a safe, and especially not a unique, way to express a rotation. Consider a situation where the user puts the phone upright into his jeans' back pocket and then turns left by 90°. Because CMAttitude is related to a device lying flat on the table, you get two subsequent rotations for (pitch=x, roll=y, yaw=z) according to this picture:
pitch +90° for getting the phone upright => (90, 0, 0)
roll +90° for turning left => (90, 90, 0)
But you can get the same position by:
yaw +90° for turning the phone left => (0, 0, 90)
pitch -90° for making the phone upright => (-90, 0, 90)
You see two different representations, (90, 90, 0) and (-90, 0, 90), for the same rotation, and there are more of them. So you press the Start button, do some fancy rotations to put the phone into the pocket, and you are in trouble because you can't rely on Euler angles when doing more complex motions (see gimbal lock for more headaches on this ;-)
Now the good news: you are right, linear algebra will do the job. What you can do is force your users to always put the phone in the same position, e.g. fixed upright in the right back pocket, and calculate the angle(s) relative to the ground by building the dot product of the gravity vector g = (x, y, z) from CMDeviceMotion and the position vector p, which is the -Y axis (0, -1, 0) in the upright position:
g • p = x*0 + y*(-1) + z*0 = -y = ||g|| * 1 * cos(alpha)
=> alpha = arccos(-y/9.81) as the total angle. Note that the gravitational acceleration is roughly constant at about 9.81 m/s^2.
To get the left-right lean angle and the forward-back angle we use the tangent:
alphaLR = arctan (x/y)
alphaFB = arctan (z/y)
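A minimal sketch of these formulas in Objective-C (my sketch, not code from the answer): CMDeviceMotion reports gravity in units of g rather than m/s^2, so it is safer to normalise by the vector's own magnitude than to hard-code 9.81, and atan2 avoids the division by zero that a plain arctan of x/y would risk:

    #import <CoreMotion/CoreMotion.h>

    // Lean angles for the fixed upright position p = (0, -1, 0) described above.
    static void leanAngles(CMDeviceMotion *motion,
                           double *alphaTotal, double *alphaLR, double *alphaFB) {
        CMAcceleration g = motion.gravity;   // in units of g, not m/s^2
        double norm = sqrt(g.x * g.x + g.y * g.y + g.z * g.z);
        *alphaTotal = acos(-g.y / norm);     // g . p = -y = ||g|| cos(alpha)
        *alphaLR    = atan2(g.x, -g.y);      // left-right lean
        *alphaFB    = atan2(g.z, -g.y);      // forward-back lean
    }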
[UPDATE:]
If you can't rely on having the phone in a predefined position like (0, -1, 0) in the equations above, you can only calculate the total angle, not the specific angles alphaLR and alphaFB. The reason is that you only have one axis of the new coordinate system when you need two of them. The new Y axis y' would then be defined as the average gravity vector, but you don't know your new X axis, because every vector perpendicular to y' would be valid.
So you would have to provide further information, like letting the users walk a longer distance in one direction without deviating, and using GPS and magnetometer data to derive the 2nd axis z'. Sounds pretty error-prone in practice.
The total angle is no problem, as we can replace (0, -1, 0) with the average gravity vector (pX, pY, pZ):
g • p = x*pX + y*pY + z*pZ = ||g|| * ||p|| * cos(alpha) = ||g||^2 * cos(alpha)
alpha = arccos((x*pX + y*pY + z*pZ) / 9.81^2)
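As a sketch (again normalising by the measured magnitudes instead of assuming 9.81; p is the average gravity vector captured during a calibration phase):

    // Total lean angle between a gravity sample g and the calibrated average p.
    static double totalAngle(CMAcceleration g, CMAcceleration p) {
        double dot   = g.x * p.x + g.y * p.y + g.z * p.z;
        double gNorm = sqrt(g.x * g.x + g.y * g.y + g.z * g.z);
        double pNorm = sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
        return acos(dot / (gNorm * pNorm));  // from g . p = ||g|| ||p|| cos(alpha)
    }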
Two more things to bear in mind:
Different persons wear different trousers with different pockets. So the gravity vector will differ even for the same person wearing other clothes, and you might need some kind of normalisation
CMMotionManager does not work in the background, i.e. the user must not press the standby button

If I understand your question, I think you are interested in getting the attitude of your device. You can do this using the attitude property of the CMDeviceMotion object that you get from the deviceMotion property of the CMMotionManager object.
There are two different angles that you might be interested in the CMAttitude class: roll and pitch. If you imagine your device as an airplane with the propeller at the top (where the headphone jack is), pitch is the angle the plane/device would make with the ground if the plane were in a climb or dive. Meanwhile, roll is the angle that the "wings" would make with the ground if the plane were to be banking or in mid barrel roll.
(BTW, there is a third angle called yaw that I think is not relevant for your question.)
The angles will be given in radians, but it's easy enough to convert them to degrees if that's what you want (by multiplying by 180 and then dividing by pi).
Assuming I understand what you want, the good news is that you may not need to understand any linear algebra to capture and use these angles. (If I'm missing something, please clarify and I'd be happy to help further.)
UPDATE (based on comments):
The attitude values in the CMAttitude object are relative to the ground (i.e., the default reference frame has the Z-axis vertical, pointing in the opposite direction from gravity), so you don't have to worry about cancelling out gravity. So, for example, if you lay your device on a flat table top and then roll it up onto its side, the roll property of the CMAttitude object will change from 0 to plus or minus 90 degrees (+/- 0.5*pi radians), depending on which side you roll it onto. Meanwhile, if you start it lying flat and then gradually stand it up on its end, the same will happen to the pitch property.
While you can use the pitch, roll, and yaw angles directly if you want, you can also set a different reference frame (e.g., a different direction for "up"). To do this, just capture the attitude in that orientation during a "calibration" step and then use CMAttitude's multiplyByInverseOfAttitude: method to transform your attitude data to the new reference frame.
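For example, a sketch of that calibration idea (referenceAttitude is an assumed property of your own class, and the method mutates its receiver, which is why it is applied to the fresh sample):

    // During the calibration step: remember the current attitude.
    self.referenceAttitude = self.motionManager.deviceMotion.attitude;

    // Later, inside the device motion handler: re-express the sample
    // relative to the captured reference frame.
    CMAttitude *attitude = motion.attitude;
    [attitude multiplyByInverseOfAttitude:self.referenceAttitude];
    NSLog(@"relative pitch: %f, roll: %f, yaw: %f",
          attitude.pitch, attitude.roll, attitude.yaw);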
Even though your question only mentioned capturing the "lean angle" (with the ground), you will probably want to capture at least 2 of the 3 attitude angles (e.g., pitch and either roll or yaw, depending on what the user is doing), potentially all three, if the device is going to be in a person's pocket. (The device could rotate in the pocket in various ways if the pocket is baggy, for example.) For the most part, though, I think you will probably be able to rely on just two of the three (unless you see radical shifts in yaw throughout the course of a recording session). For example, in my jeans pocket, the phone is usually nearly vertical. Thus, for me, pitch would vary a whole bunch as I, say, walk, sit or run. Roll would vary whenever I change the direction I'm facing. Meanwhile, yaw would not vary much at all (unless I do cartwheels, which I can't!). So yaw can probably be ignored for me.
To summarize the main point: to use these attitude angles, you don't need to do any linear algebra, nor worry about gravity (although you may want to use this for other purposes, of course).
UPDATE 2 (based on Kay's new post):
Kay just replied and showed how to use gravity and linear algebra to make sure your angles are unique. (And, btw, I think you should give the bounty to that post, fwiw.)
Depending on what you want to do, you may want to use this math. You would want to use the linear algebra and gravity if you need a standardized way of "talking about" and/or comparing attitudes over the course of your recording session. If you just want to visualize them, you can probably still get away with not using the increased complexity. (For example, visualizing (pitch=90, roll=0, yaw=0) should be the same as visualizing (pitch=0, roll=90, yaw=90).) In my approach above, while you could have multiple ways of referring to the "same" attitude, none of them is actually wrong, per se. They will still give you the angles relative to the ground.
But the fact that the gyroscope can switch from one valid description of an attitude to another means that what I wrote above about getting away with only 2 of the 3 components needs to be corrected: because of this, you will need to capture all three components, no matter what. Sorry.

Related

Store orientation to an array - and compare

I want to achieve the following:
I want the user to be able to "record" the movement of the iPhone using the gyroscope. And after that, the user should be able to replicate the same movement. I extract the pitch, roll and yaw using:
[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error)
{
    CMAttitude *attitude = motion.attitude;
    NSLog(@"pitch: %f, roll: %f, yaw: %f", attitude.pitch, attitude.roll, attitude.yaw);
}];
I'm thinking that I could store these values into an array if the user is in record mode. And when the user tries to replicate that movement, I could compare the replicated movement array to the recorded one. The thing is, how can I compare the two arrays in a smart way? They will never have exactly the same values, but they can be somewhat the same.
Am I at all on the right track here?
UPDATE: I think that maybe Ali's answer about using DTW could be the right way for me here. But I'm not that smart (apparently), so if anyone could help me out with the first steps of comparing two arrays I would be a happy man!
Thanks!
Try dynamic time warping. Here is an illustrative example with 1D arrays. In the database we already have the following 2 arrays:
Array 1: [5, 3, 1]
Array 2: [1, 3, 5, 8, 8]
We measured [2, 4, 6, 7]. Which array is most similar to the newly measured one? Obviously, the second array is similar to the newly measured one and the first is not.
Let's compute the cost matrices according to this paper, subsection 2.1:
D(i,j)=Dist(i,j)+MIN(D(i-1,j),D(i,j-1),D(i-1,j-1))
Here D(i,j) is the (i,j) element of the cost matrix, see below. Check Figure 3 of that paper to see how this recurrence relation is applied. In short: columns are computed first, starting from D(1,1); D(0,*) and D(*,0) are left out of the MIN. If we are comparing arrays A and B, then Dist(i,j) is the distance between A[i] and B[j]. I simply used ABS(A[i]-B[j]). The cost matrices for this example:
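Worked out from the recurrence above (rows index the stored array, columns index the measured [2, 4, 6, 7]):

Array 1 = [5, 3, 1]:

         2    4    6    7
    5    3    4    5    7
    3    4    4    7    9
    1    5    7    9   13

Array 2 = [1, 3, 5, 8, 8]:

         2    4    6    7
    1    1    4    9   15
    3    2    2    5    9
    5    5    3    3    5
    8   11    7    5    4
    8   17   11    7    5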
For Array 1 we have 13 as the score, for Array 2 we have 5 (the score is the bottom-right element of each cost matrix). The lower score wins, so the most similar array is Array 2. The best warping path is the one traced back from that corner through the MIN choices.
This is only a sketch of DTW. There are a number of issues you have to address in a real-world application, for example using offsets instead of fixed end points, or defining measures of fit: see this paper, page 363 (5. boundary conditions) and page 364. The paper linked above has further details too.
I just noticed you are using yaw, pitch and roll. Simply put: don't (and Kay's answer below gives another reason not to). Can you use the accelerometer data instead? "An accelerometer is a direct measurement of orientation" (from the DCM manuscript), and that is what you need. As for tc's question: does the orientation relative to North matter? I guess not.
It is far easier to compare the acceleration vectors than orientations (Euler angles, rotation matrices, quaternions), as tc pointed out. If you are using acceleration data, you have 3-dimensional vectors at each time point, the (x,y,z) coordinates. I would simply compute
Dist(i,j)=SQRT((A[i][X]-B[j][X])^2+(A[i][Y]-B[j][Y])^2+(A[i][Z]-B[j][Z])^2),
that is, the Euclidean distance between the two points.
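A compact sketch of the whole procedure in Objective-C (my sketch; it uses the 1-D ABS distance from the example above, and for 3-D acceleration vectors you would swap in the Euclidean Dist just shown):

    #import <Foundation/Foundation.h>
    #include <math.h>

    // Dynamic time warping score; lower = more similar. Implements
    // D(i,j) = Dist(i,j) + MIN(D(i-1,j), D(i,j-1), D(i-1,j-1)).
    static double dtwScore(const double *a, int n, const double *b, int m) {
        double D[n][m];                  // cost matrix (variable-length array)
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < m; j++) {
                double dist = fabs(a[i] - b[j]);         // Dist(i,j) = ABS(A[i]-B[j])
                double best;
                if (i == 0 && j == 0) best = 0.0;        // D(1,1) = Dist(1,1)
                else if (i == 0)      best = D[0][j-1];  // first row: only from the left
                else if (j == 0)      best = D[i-1][0];  // first column: only from above
                else best = fmin(D[i-1][j], fmin(D[i][j-1], D[i-1][j-1]));
                D[i][j] = dist + best;
            }
        }
        return D[n-1][m-1];
    }

    int main(void) {
        double array1[]   = {5, 3, 1};
        double array2[]   = {1, 3, 5, 8, 8};
        double measured[] = {2, 4, 6, 7};
        NSLog(@"Array 1 score: %.0f", dtwScore(array1, 3, measured, 4));  // 13
        NSLog(@"Array 2 score: %.0f", dtwScore(array2, 5, measured, 4));  // 5
        return 0;
    }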
I think Ali's approach is in general a good way to go, but there is a general problem called gimbal lock (see also the SO discussions on this topic) when using Euler angles, i.e. pitch, roll and yaw. You will run into it when you record a more complex movement lasting longer than a few ticks, leading to large angle deltas in different angular directions.
In a nutshell this means that you will have more than one mathematical representation for the same position, depending only on the order of the movements you made to get there, and a loss of information on the other side. Consider an airplane flying up in the air from left to right; the X axis runs from left to right and the Y axis points up into the air. The following two movement sequences lead to the same end position although you get there in totally different ways:
Sequence A:
Rotation around yaw +90°
Rotation around pitch +90°
Sequence B:
Rotation around pitch +90°
Rotation around roll +90°
In both cases your airplane points down to the ground and you can see its bottom from your position.
The only solution to this is to avoid Euler angles and thus make things more complicated. Quaternions are the best way to deal with this, although it took a while (for me) to get an idea of this pretty abstract representation. OK, this answer doesn't take you a step further on your original problem, but it might help you avoid wasting time. Maybe you can make some conceptual changes to your setup.
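For what it's worth, here is a sketch of comparing two attitudes without Euler angles, via the angle between their quaternions (CMAttitude exposes a quaternion property; the fabs accounts for q and -q encoding the same rotation):

    #import <CoreMotion/CoreMotion.h>

    // Smallest rotation angle (radians) taking one attitude into the other.
    static double attitudeDistance(CMQuaternion q1, CMQuaternion q2) {
        double dot = q1.x * q2.x + q1.y * q2.y + q1.z * q2.z + q1.w * q2.w;
        return 2.0 * acos(fmin(1.0, fabs(dot)));
    }

Such a distance could also serve as the Dist(i,j) in the DTW comparison above.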
Kay

iPhone - Core Motion (relative rotation)

Is there a way to obtain a relative rotation from core motion?
What I need is: how much it rotated in one axis and which direction (+ sign = anti-clockwise, - = clockwise, according to the right-hand rule).
I have found the property rotationRate, but I am not sure how I would extract the angle out of it, as it gives me radians per second.
I have tried all kinds of things over the last few days but nothing gives me stable values. I tried taking timed samples of Core Motion data using an NSTimer and calculating the difference between two samples, so I would know how much the device rotated since the last sample, but from time to time it gives me crazy numbers like 13600 degrees even when the iPhone is resting on the table.
Any thoughts on how this can be accomplished?
thanks
There is indeed. You can get what you're looking for by drilling down into the properties of CMMotionManager, through CMDeviceMotion and finally to CMAttitude. The attitude of the device is defined as:
the orientation of a body relative to a given frame of reference.
In the case of DeviceMotion's CMAttitude, that frame of reference is established by the framework when starting device motion updates. From that point in time on, the attitude of the device is reported relative to that reference frame (not relative to the previous frame).
The CMAttitude class provides some handy built-in functionality to convert a CMAttitude into a form that is actually useful for something, like Euler angles, a rotation matrix, or a quaternion. It sounds like you're looking for the Euler angle representation (pitch, yaw, roll).
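If you want the rotation since the previous sample rather than since the start, a sketch along these lines should work (previousAttitude is an assumed property; the copies matter because multiplyByInverseOfAttitude: mutates its receiver and CMAttitude conforms to NSCopying):

    // Inside the device motion handler:
    CMAttitude *current = [motion.attitude copy];
    if (self.previousAttitude != nil) {
        CMAttitude *delta = [motion.attitude copy];
        [delta multiplyByInverseOfAttitude:self.previousAttitude];
        // Signed per-axis rotation since the last sample, in radians.
        NSLog(@"delta pitch: %f, roll: %f, yaw: %f",
              delta.pitch, delta.roll, delta.yaw);
    }
    self.previousAttitude = current;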
The answer provided above isn't quite accurate, though it's probably sufficient to answer this question. Core Motion tries to determine the device's absolute attitude at all times, meaning that the definition of the axes can vary depending on the device's orientation. For example, if the device is face-up, then pitch up/down is a rotation about the y-axis, but if the device is in landscape orientation, then pitch is a rotation about the z-axis (perpendicular to the plane of the screen). This is somewhat helpful if your application will only be used in one orientation, or you want a delta like the question asked for, but makes it excessively complicated if you want to know absolute orientation.

Getting level of rotation with UIAcceleration

Games like FroggyJump for iPhone figure out the rotation of the iPhone. I'm getting confused by the acceleration values. How do I calculate the level of rotation? I suppose I need to consider the case where the iPhone isn't perfectly upright.
Thank you.
I'm also wanting to use the new Core Motion framework with "Device Motion" on iPhone 4 for extra precision. I guess I'll have to use a low-pass filter for the other devices.
It's the yaw.
Having given Froggy Jump a quick go, I think it's likely using the accelerometer's x value directly as the left/right acceleration on the frog. When the device is stationary, you can think of the accelerometer as giving you the vector that points upward into space, expressed in the device's local axes. For something like a ball rolling or anything else accelerating due to tilt, you want to use the values directly.
For anything that involves actually knowing angles, you're probably best picking the axis around which you want to detect rotation then using the C function atan2f on the accelerometer values for the other two axes. With just an accelerometer, there are some scenarios in which you can't detect rotation — for example, if the device is flat on a table then an accelerometer can't detect yaw. The general rule is that rotations around the gravity vector can't be detected with an accelerometer alone.
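For example, given a UIAcceleration sample, the tilt about the y-axis could be estimated like this (a sketch; which axis pair to use and the sign convention depend on your setup):

    // Device roughly stationary, so the reading approximates the gravity
    // vector; rotation about y moves gravity between the x and z axes.
    float angle = atan2f(acceleration.x, acceleration.z);  // radians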

Calibrating code to iPhone accelerometer and gyro data

I'm designing an iPhone app that will require precise calibration to the iPhone's accelerometer and gyro data. I will have to detect specific movements that should eventually execute code (think shake-to-shuffle, or undo).
Is there a good way of doing this already, or something you can come up with? Perhaps some way to generate a time/value graph of the movement data as it is being captured?
For capturing movement data, see the AccelerometerGraph sample app, which shows the data in real time: http://developer.apple.com/library/ios/#samplecode/AccelerometerGraph/Introduction/Intro.html
The data is pretty noisy - the gyro and accelerometer aren't good enough right now to be able to track where the phone is in local 3d space, for example. The rotation, however, is very solid, and the orientation of the device can be pretty accurately tracked. You may have the best results making gestures out of rotation data instead of movement along an axis. Or, basic direction like shakes along an axis will work as Jacob Jennings said.
A good starting point for accelerometer gesture recognition is this tutorial by Kevin Bomberry at AblePear:
http://blog.ablepear.com/2010/02/iphone-sdk-shake-rattle-roll.html
He sets a blanket threshold for the absolute value of acceleration on any axis. I would generate an 'event' for the axis that had the highest acceleration when the threshold was broken (Z POSITIVE, X NEGATIVE, etc.) and push these onto an 'event history' queue. At the end of each didAccelerate call, evaluate the queue for patterns that match a gesture; for example, X POSITIVE, X NEGATIVE, X POSITIVE, X NEGATIVE might be considered a 'shake' along that axis. This should provide a couple of different gesture commands.
See the following for a simple queue category addition to NSMutableArray:
How do I make and use a Queue in Objective-C?
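A rough sketch of that event-queue idea (my own, not from the tutorial; kThreshold, eventHistory and checkForGesture are assumed names):

    static const UIAccelerationValue kThreshold = 1.5;  // assumed value, in g

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        UIAccelerationValue v[3] = { acceleration.x, acceleration.y, acceleration.z };
        NSString *names[3] = { @"X", @"Y", @"Z" };
        int axis = 0;                                   // axis with the highest magnitude
        for (int i = 1; i < 3; i++)
            if (fabs(v[i]) > fabs(v[axis])) axis = i;
        if (fabs(v[axis]) > kThreshold) {
            NSString *event = [NSString stringWithFormat:@"%@ %@", names[axis],
                               v[axis] > 0 ? @"POSITIVE" : @"NEGATIVE"];
            [self.eventHistory addObject:event];        // the queue from the link above
            [self checkForGesture];                     // scan the queue for patterns
        }
    }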

Compensating compass lag with the gyroscope on iPhone 4

I've been experimenting with the compass and gyroscope on iPhone 4 and would like some help with an issue I'm having. I want to compensate for the slowness of the compass by using data from the gyroscope.
Using CMMotionManager and its CMDeviceMotion object (motionManager.deviceMotion), I get the CMAttitude object. Correct me if I'm wrong (please), but here is what I've deduced about the CMAttitude object's yaw property (I don't need pitch or roll for my purposes):
yaw ranges from 0 to PI when the phone is pointing downwards (as indicated by deviceMotion.gravity.z) and swinging counterclockwise and 0 to -PI when swung clockwise
when the device is pointing upwards, yaw ranges from -PI to 0 and PI to 0, respectively
and from the compass data (I'm using locationManager.heading.magneticHeading), I see that the compass gives values from 0 to 360, with the value increasing when swinging clockwise
All right, so using all of this information together, I'm able to get a value I call horizontal that, regardless of whether the device is pointing up or down, will give values from 0 to 360 and increase when the device is swung clockwise (though I am still having trouble when deviceMotion.gravity.z is around 0 -- the yaw value freaks out at that gravity.z value).
It seems to me that I could "synchronize" the horizontal and magneticHeading values, using a calculated horizontal value that maps to magneticHeading, and "synchronize" the horizontal value to magneticHeading when I feel the compass has "caught up."
So my questions:
Am I on the right track with this?
Am I using the gyro data from CMDeviceMotion properly and the assumptions I listed above correct?
Why might yaw freak out when gravity.z is around 0?
Thank you very much. I look forward to hearing your answers!
Just trying to answer... correct me if I'm wrong.
1. Yes, you are on the right track.
2. Gravity in Core Motion is already isolated from user-induced acceleration; that is why CMDeviceMotion has the two separate properties gravity and userAcceleration, as described in Apple's Core Motion documentation. (Note: it is not entirely isolated.)
3. If gravity reads 0 on an axis, it means that axis is perpendicular to the gravity vector. gravity.z is the axis through the iPhone's screen; that is why it reads about -9.82 m/s^2 when the phone lies on a desk with the screen facing up. In practice it is hard to get exactly 0 or the maximum value because of sensor noise (all sensors are noisy, especially cheap ones). What I do in my apps is switch the reference axis to another axis (in your case maybe x or y) beyond certain limits; the strategy depends on your purpose and on which side is your reference.
One more thing: the gyro is fast but not stable, so you need to recalibrate its value at intervals, every 5 seconds in my case. I experimented with the gyro for calculating the angle between two planes; against an exactly 90-degree ruler it gave an error of about 0.5 degrees per try, and the error kept increasing. That is my result; maybe others have a better method for avoiding the drift.
Below are my steps:
1. Init.
2. Read gravity XYZ -> Xg, Yg, Zg.
3. Check if Xg < 0.25; if TRUE, try Yg, then Zg. (Note: 1 = 1 g = 9.82 m/s^2.)
4. Read the compass and the gyro.
5. Configure and calibrate the gyro using the compass, calculated from whichever axis was chosen in step 3.
6. Once 5 seconds have passed, recalibrate: read the compass.
7. If the difference from the gyro reading is > 5 degrees, skip recalibrating the gyro.
8. If the difference from the gyro reading is < 5 degrees, calibrate the gyro using the compass value.
Note on step 7: this guards against the phone being affected by a magnetic field, e.g. near a large steel structure, a high-voltage power line, or noisy heavy equipment in a factory plant. A rough sketch follows below.
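The sketch (my interpretation of steps 4-8, not the poster's code; gyroHeading, lastTimestamp and lastCalibration are assumed properties, and the sign of the z rotation rate may need flipping for your heading convention):

    - (void)updateWithGyro:(CMGyroData *)gyro compassHeading:(double)headingDegrees {
        NSTimeInterval dt = gyro.timestamp - self.lastTimestamp;
        self.lastTimestamp = gyro.timestamp;
        // Integrate the yaw rate (rad/s about z) into a heading estimate in degrees.
        self.gyroHeading -= gyro.rotationRate.z * dt * 180.0 / M_PI;
        if (gyro.timestamp - self.lastCalibration >= 5.0) {   // every 5 seconds
            self.lastCalibration = gyro.timestamp;
            // Steps 7/8: trust the compass only when it roughly agrees with the gyro.
            if (fabs(self.gyroHeading - headingDegrees) < 5.0)
                self.gyroHeading = headingDegrees;
        }
    }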
That's all... Hope this could help you.
Here is an example of an iPhone app where the compass is compensated with the gyroscope. Code and project can be seen here:
http://www.sundh.com/blog/2011/09/stabalize-compass-of-iphone-with-gyroscope/
The direction of the yaw axis vector is undefined when in zero gravity (or free fall, or close enough).
In order to do synchronization while in motion, you need to create a filter for your "horizontal" value that has the same lag/delay response characteristics as the magnetic compass. Either that, or wait until motion stops long enough for both values to settle before recalculating the offset.
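One simple first-order low-pass filter sketch for that (alpha is a tuning assumption you would pick to roughly match the compass's lag):

    // Smaller alpha = more smoothing and more lag.
    static double lowPass(double previous, double input, double alpha) {
        return previous + alpha * (input - previous);
    }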
The answer to question 1 is yes. For question 2, you are on the right track, though you might choose a variable name other than 'horizontal'. Question 3 is answered by hotpaw2; incidentally, yawing a helicopter at near-zero altitude would alert the pilot with an alarm. There is a time lag because only part of the software is local, and several other factors can slow things down: access to the sensor that detects magnetic fields, the device's position and direction, preparing the graphic output for the compass display, and computing and outputting data from the gyro and sensors through a relatively slow interface on a general-purpose handheld device that was not custom-designed for this type of task.