Drifting yaw angle after moving fast - iPhone

In my current project I ran into trouble regarding the quaternion provided by Core Motion's CMAttitude. I put the iPhone 5 (iOS 6.0.1) at a well-defined start position. Then I start to move the device around quickly, as in a fast-paced game. When I return to the start position after 10-30 seconds, the reported yaw angle differs from the start position by 10-20 degrees (most of the time ≈11°).
I used the old (and sadly no longer available) Core Motion Teapot sample to validate the effect. The Euler Angles for logging are read directly from CMAttitude:
NSLog(@"pitch: %f, roll: %f, yaw: %f", attitude.pitch * 180 / M_PI, attitude.roll * 180 / M_PI, attitude.yaw * 180 / M_PI);
I found this on two different iPhone 5 devices manufactured at different times in different factories. What is really weird is that my iPhone 4, running iOS 5.1.1, works as expected. It seems to me to be an iOS bug and I have filed a bug report already, but on the other hand I can hardly imagine that nobody has stumbled upon it yet. I suspect it might have to do with the redesigned Core Motion API: starting with version 5, the magnetometer (compass) is considered for sensor fusion as well. The console shows that bias estimations from locationd are provided to Core Motion:
locationd[41] <Notice>: GYTT inserted: bias,-0.196419,1.749323,-1.828088,variance,0.002644,0.004651,0.002527,temperature,31.554688
My question(s): Is there a chance to block magnetometer readings when using Device Motion? I tried deactivating location services, but it doesn't affect Core Motion. If that is not possible, what is the alternative/workaround: accelerometer-based gravity estimation?
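For context, here is a minimal sketch of what I mean by blocking the magnetometer, using the reference-frame variant of the Device Motion API that the documentation describes as not using the compass (whether this actually avoids the drift is untested; motionManager is my CMMotionManager instance):

// Sketch: request attitude in the magnetometer-free reference frame.
if ([CMMotionManager availableAttitudeReferenceFrames] & CMAttitudeReferenceFrameXArbitraryZVertical) {
    [motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
                                                        toQueue:[NSOperationQueue mainQueue]
                                                    withHandler:^(CMDeviceMotion *motion, NSError *error) {
        CMAttitude *attitude = motion.attitude;  // yaw fused without the compass (per the docs)
        // ... use attitude ...
    }];
}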
PS: As we are dealing with quaternion-based models, this is not related to gimbal lock.
EDIT:
After doing some more measurements it seems clear that only yaw is affected. Pitch and roll show deviations within tolerance (<= 1°) while yaw is drifting regardless of the starting position. CMDeviceMotion.gravity appears to be clean too.
EDIT (2):
I could reproduce the problem with the MotionGraphs sample attached to recent Xcode versions. The yaw graph is reproducibly drifting away from the origin.

Not the definitive solution, but at least a workaround for my own question (I leave it unanswered to invite better answers). It turned out that at least DeviceMotion.gravity is not affected by the bug. So I decided to redesign this pretty simple part of motion detection and use asin(gravity.x / ||gravity||) for moving the main player character to the side when tilting the device.
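For illustration, a minimal sketch of that gravity-based tilt detection inside a device motion handler (the player-movement call at the end is a hypothetical placeholder of mine):

// Sketch: derive a left/right tilt angle from CMDeviceMotion.gravity instead of the attitude quaternion.
CMAcceleration g = motion.gravity;
double norm = sqrt(g.x * g.x + g.y * g.y + g.z * g.z);  // should be close to 1, but normalize anyway
double tilt = asin(g.x / norm);                          // radians; the sign gives the tilt direction
[self movePlayerSidewaysByAngle:tilt];                   // hypothetical game callback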
This is definitely only the second-best solution, as it discards the information about the full rotation state contained in a quaternion. I decided to do it that way for strategic reasons:
I think most developers do tilt motion detection with the gravity vector rather than CMAttitude.quaternion, because most people are not that fond of quaternion maths ;-) Thus any future bugs related to the gravity vector will probably be fixed during the beta phase because of the larger number of users.
If it is a software bug and not related to hardware issues, which is what I assume, then even if the bug is fixed ASAP, there will still be a number of devices that might never get updated, for whatever reason. Thus the risk that a potential future customer will run into trouble is small but > 0. So the second-best solution might sometimes be the best.

I've done something similar in my own code and found the same z-axis rotational drift (yaw). I've applied a balanced filter. At each motion manager time interval, I grab the current quaternion's z component and then, after the calculations, save it as oldZ for use in the next set of calculations. My filter simply balances the NEW z value against the z value immediately preceding it, preventing it from moving too much too quickly. Depending on your hardware and the exact tolerance in your program, you can manage drift quite well in this way. You will see the gyro drift slightly, but then begin to be corrected as the filter continues to act. My filter looks something like this and prevents more than 0.5 degrees of "stray":
filtZ = 0.65 * oldZ + 0.35 * z;
The 0.65 and 0.35 values were determined experimentally, and I would advise you to play with them as you have time. The output will still be scaled 0-1 and can then be utilized in the same way you've been doing (or re-introduced into the quaternion if you must retain all 4 dimensions throughout).
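In context, the filter might sit in the device motion handler roughly like this (variable names are mine; oldZ is assumed to be an instance variable initialized from the first reading):

// Sketch: damp the quaternion's z component with the balanced filter above.
CMQuaternion q = motion.attitude.quaternion;
double z = q.z;                          // raw z component for this update
double filtZ = 0.65 * oldZ + 0.35 * z;   // blend the new value with the previous one
oldZ = filtZ;                            // remember for the next update
// use filtZ wherever the raw z component was used before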

Related

What exactly does the iPhone accelerometer measure?

The Apple documentation for the UIAcceleration class says:
"When a device is laying still with its back on a horizontal surface, each acceleration event has approximately the following values:
x: 0
y: 0
z: -1"
Now, I am confused! How can the acceleration be non-zero, when you clearly say the "device is laying still"?
UPDATE
Judging by the responses, I think this should be called something like 'forceometer' or 'gravitometer' and not accelerometer!
You get a -1 on the Z axis because gravity is acting on the device, applying a constant acceleration of 1G. I assume you want user acceleration, which you can get from the DeviceMotion object using a device motion handler as opposed to an acceleration handler. The userAcceleration property filters out the effects of gravity on the device and only gives you how much the user is accelerating it.
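For example, a minimal sketch of reading userAcceleration from a device motion handler (the update interval and queue here are arbitrary choices of mine):

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                   withHandler:^(CMDeviceMotion *motion, NSError *error) {
    // Gravity has already been separated out; this is only what the user imparts, in units of g.
    CMAcceleration a = motion.userAcceleration;
    NSLog(@"user acceleration x: %f y: %f z: %f", a.x, a.y, a.z);
}];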
I found the answer in the Core Motion Reference guide, thanks to bensnider:
The accelerometer measures the sum of two acceleration vectors: gravity and user acceleration. User acceleration is the acceleration that the user imparts to the device.
You'll find the best answers in the datasheet of the accelerometer used (LIS302DL).
It measures gravity. The unit is chosen so that gravity, 9.81 m/s^2, equals 1 unit. The sign tells you how the phone's axis is directed, in other words, what the phone considers downwards.
The phone measures 0 as acceleration in free fall. I don't know how much you want to throw your phone up and down to test it :)
When you're sitting, gravity is pulling you down into your chair. If it weren't for the chair, or the ground for that matter, you'd be falling with an acceleration of about 9.8 m/s^2. In order for the chair to prevent you from falling, it has to act with a force in the opposite direction of at least the same value.
The accelerometer shows the value of that force, and it's a three-dimensional vector. In this case it's directed straight down. The value given is expressed in G, i.e. in units of gravitational acceleration.
Answerers keep missing the right wording that should set it straight for you... The device is "laying still" only relative to you. It is actually not laying still at all. The centripetal force of gravity (http://en.wikipedia.org/wiki/Centripetal_force) gives it (and you) centripetal acceleration. It is real, it is what keeps you from flying off Earth on a tangent, and it is what the accelerometer dutifully shows. (Earth is nothing special: we also rotate about the Sun, etc., whose centripetal accelerations are way smaller, but they would all be shown by a sufficiently sensitive accelerometer.)
I don't yet have sufficient reputation to reply directly to the comment by @gigahari above, but as an addendum, folks should be aware that some apps (such as the physics apps phyphox and PhysicsToolbox Sensor Suite) do not report (a+g) -- both phyphox's "with g" option and PhysicsToolbox report the vector sum (a-g), which is sometimes referred to as the "operational definition of weight." A brief discussion of this version of the operational definition of weight is on Wikipedia, at https://en.wikipedia.org/wiki/Weight#Operational_definition

Is there any way to remove the small bias along the gravity axis in the accelerometer data

Similar to this question:
CMDeviceMotion userAcceleration drift
I'm using CMDeviceMotion.userAcceleration in the iOS 5 SDK to plot its x, y, z components over time. Like the above post, I see that the z acceleration component always shows small positive values (0.005 - 0.015) while the x and y components center around zero (-0.005 - 0.005) when my iPhone 4s is sitting on a flat surface.
This small bias keeps adding up to the estimated velocity (which I compute by integrating the acceleration data) even when my phone is not moving at all. Is there any known way to remove this bias from the accelerometer data? I cannot simply subtract the bias from the z component, because the bias seems to spread over x, y, and z along the gravity axis when the device is in some arbitrary orientation.
I know that the data in CMDeviceMotion.userAcceleration has already factored out gravity using gyro data, but I wonder if there is any effective way to remove this residual bias.
First, you need some external reference that does not drift, such as GPS. Then you have to perform sensor fusion (a Kalman filter comes to mind). Otherwise you cannot remove the bias, and the integration error will grow indefinitely.
UPDATE: You cannot get relative displacement just by integrating the acceleration, see my answer to Android accelerometer accuracy (Inertial navigation). However, I give some examples there what you actually can do.
If you check my answer you will see that it is the gyro white noise that makes the integration hopeless.
Old question, but I wanted to share some insight. Part of the bias in the accelerometers actually does not come from any inaccuracies in the sensors, but from an oversight in the calculations that Apple does. The calculations assume that gravity is always 1 g (which is by definition 9.80665 m/s^2). Any left-over must then be user acceleration.
However, gravity varies slightly all over the world. If the gravity in your area is not exactly 9.80665 m/s^2, then there will be a small bias in the user acceleration, which is detectable with a low-pass filter. Such a bias can be removed with the following calculation:
- (void)handleDeviceMotion:(CMDeviceMotion *)m atTime:(NSDate *)time
{
    // calculate user acceleration in the direction of gravity
    double verticalAcceleration = m.gravity.x * m.userAcceleration.x +
                                  m.gravity.y * m.userAcceleration.y +
                                  m.gravity.z * m.userAcceleration.z;

    // update the bias in the low-pass filter (bias is an object variable)
    double delta = verticalAcceleration - bias;
    if (ABS(delta) < 0.1) bias += 0.01 * delta;

    // remove the bias from the user acceleration
    CMAcceleration acceleration;
    acceleration.x = m.userAcceleration.x - bias * m.gravity.x;
    acceleration.y = m.userAcceleration.y - bias * m.gravity.y;
    acceleration.z = m.userAcceleration.z - bias * m.gravity.z;

    // do something with acceleration
}
Mind you, even with that bias removed, there is still a lot of noise, and there could also be a manufacturing bias different for each accelerometer chip. Therefore, you will still have a hard time deriving velocity and certainly position from this.
Thanks Ali for updating your answer and the other references. They certainly helped my understanding of this issue (and I was surprised to see how many people are interested in it). I may sound a bit stubborn, but I still think I haven't found the answer to my original question anywhere. Let's forget about integration for now.

With more experiments I see some constant biases (though even smaller) on the x and y axes as well when I average the user acceleration data over time. I was just wondering if there's any way to remove these biases from the "user" acceleration data I get from the iOS 5 CMDeviceMotion. If they were caused by the white noise of the gyroscope in the process of filtering out gravity, I would expect to see random noise in the user acceleration data, not those biases. Based on my impression so far, it seems that the biases are caused by the limited accuracy of both the accelerometer and the gyroscope, and there's nothing we can do about that, although I'm not 100% sure.

I was trying to put this impression in a comment (not in the answer section) but SO didn't allow it because it was too long. I was also wondering how many people would back up my impression by voting, so I decided to put it in the answer section... Sorry if I was rambling a bit.

iPhone - What does the gyroscope measure? Can I get an absolute degree measurement in all axes?

I am relatively new to iPhone development and I am playing with the gyroscope, using Core Motion. After a few tests, this is my question.
What information exactly is the gyroscope measuring? Absolute angles? I mean, suppose I hold my phone in portrait, at exactly 90 degrees, and start sampling. These may not be the correct values, but suppose that at this position the gyroscope gives me 0, 0 and 0 degrees for yaw, pitch and roll.
Now I throw my iPhone in the air, and as it goes up it randomly rolls a number of full turns around all axes and returns to my hand in the same position as before. Will the gyroscope read 0,0,0 (meaning that it has the same position as before = absolute angle) or not?
If not, is there a way to measure absolute degrees in all axes? By absolute degrees I mean taking 0,0,0 as the position the device was in when sampling started.
thanks
The gyroscope measures many things for you, and yes, one of these is "absolute angles". Take a look at the docs on CMDeviceMotion. It can give you a rotation rate, which is how fast the device is spinning, and it can give you a CMAttitude. The CMAttitude is what you're calling "absolute angles". It is technically defined as:
the orientation of a body relative to
a given frame of reference
Normal gyroscopes, as noted in the other answer, are prone to drift. The really nice thing is that the Core Motion framework does a lot of processing behind the scenes for you in an effort to compensate for the drift before the measurements are reported. Practically, I've found that the framework does a remarkable (though not perfect) job at this task. Unless you need long-term precision relative to a magnetic pole or something, the attitude reported by the framework can be considered a perfect relative attitude measurement, for all intents and purposes.
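For the "relative to where sampling started" part, a minimal sketch of the usual pattern: save the first attitude and express later ones relative to it (motionManager and referenceAttitude are assumed to be instance variables, and device motion updates have already been started):

// Sketch: report attitude relative to the attitude captured when sampling began.
CMAttitude *attitude = motionManager.deviceMotion.attitude;
if (referenceAttitude == nil) {
    referenceAttitude = attitude;                              // capture the starting orientation once
} else {
    [attitude multiplyByInverseOfAttitude:referenceAttitude];  // attitude is now relative to the start
    NSLog(@"relative yaw: %f deg", attitude.yaw * 180 / M_PI);
}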
The iPhone uses accelerometers for its internal angle measurements, which means they are relative to the Earth's gravity. That's about as absolute as you're going to get, unless you need this program to work in space, too.

Compensating compass lag with the gyroscope on iPhone 4

I've been experimenting with the compass and gyroscope on iPhone 4 and would like some help with an issue I'm having. I want to compensate for the slowness of the compass by using data from the gyroscope.
Using CMMotionManager and its CMDeviceMotion object (motionManager.deviceMotion), I get the CMAttitude object. Correct me if I'm wrong (please), but here is what I've deduced from the CMAttitude object's yaw property (I don't need pitch or roll for my purposes):
yaw ranges from 0 to PI when the phone is pointing downwards (as indicated by deviceMotion.gravity.z) and swinging counterclockwise and 0 to -PI when swung clockwise
when the device is pointing upwards, yaw ranges from -PI to 0 and PI to 0, respectively
and from the compass data (I'm using locationManager.heading.magneticHeading), I see that the compass gives values from 0 to 360, with the value increasing when swinging clockwise
All right, so using all of this information together, I'm able to get a value I call horizontal that, regardless of whether the device is pointing up or down, will give values from 0 to 360 and increase when the device is swung clockwise (though I am still having trouble when deviceMotion.gravity.z is around 0 -- the yaw value freaks out at this gravity.z value).
It seems to me that I could "synchronize" the horizontal and magneticHeading values, using a calculated horizontal value that maps to magneticHeading, and "synchronize" the horizontal value to magneticHeading when I feel the compass has "caught up."
So my questions:
Am I on the right track with this?
Am I using the gyro data from CMDeviceMotion properly and the assumptions I listed above correct?
Why might yaw freak out when gravity.z is around 0?
Thank you very much. I look forward to hearing your answers!
Just trying to answer... correct me if I'm wrong.
1. Yes, you are on the right track.
2. Gravity in CM is already "isolated" from user acceleration (the acceleration caused by the user); that's why there are two properties, "gravity" and "userAcceleration". It's in Apple's Core Motion documentation.
// Note : not entirely isolated //
3.
If you have a gravity value of 0, it means that the corresponding axis is perpendicular to gravity.
gravity.z is the axis through the iPhone screen; that's why it reads -1 (about -9.82 m/s^2) if you put the phone on a desk with the screen facing up. Actually it is hard to get exactly 0 or the maximum gravity value due to sensor noise (that's normal, all sensors have noise, especially cheap sensors).
What I do in my apps is switch my reference axis to another axis (in your case maybe x or y) beyond certain limits; the strategy depends on the purpose and on which side is your reference.
The other thing is that the gyro is fast but not stable; you need to re-calibrate its value at regular intervals, in my case every 5 seconds. I've experimented with the gyro for calculating the angle between two planes; I tried with an exactly 90-degree ruler and it gave an error of about 0.5 degrees that kept increasing with every try. But that's my approach, maybe others have a better method to avoid the error.
Below are my steps (a rough sketch of the recalibration part follows after the list):
1. Init.
2. Read gravity XYZ -> Xg Yg Zg.
3. Check if Xg < 0.25; if TRUE, try Yg, then Zg. // Note: 1 = 1 g = 9.82 m/s^2
4. Read the compass and gyro.
5. Configure and calibrate the gyro using the compass, calculated based on which axis I use in step 3.
6. If 5 seconds have passed, recalibrate: read the compass.
7. If the difference with the gyro reading is > 5 degrees, skip recalibrating the gyro.
8. If the difference with the gyro reading is < 5 degrees, calibrate the gyro using the compass value.
Note for step 7: this is to check whether the phone is affected by a magnetic field, e.g. near large steel structures, high-voltage power lines, or noisy heavy equipment in a factory plant.
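A rough sketch of steps 6-8 (the 5-second and 5-degree thresholds come from the list above; lastCalibration, gyroHeading and locationManager are assumed to be instance variables):

// Sketch: every 5 seconds, re-anchor the gyro-integrated heading to the compass,
// unless the two disagree by more than 5 degrees (possible magnetic disturbance).
- (void)recalibrateIfNeeded
{
    NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
    if (now - lastCalibration < 5.0) return;        // step 6: only every 5 seconds
    lastCalibration = now;

    double compassHeading = locationManager.heading.magneticHeading;  // degrees, 0-360
    double diff = fabs(compassHeading - gyroHeading);
    if (diff > 180.0) diff = 360.0 - diff;          // shortest angular difference

    if (diff < 5.0) {
        gyroHeading = compassHeading;               // step 8: trust the compass, reset the gyro heading
    }
    // step 7: otherwise skip recalibration and keep integrating the gyro
}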
That's all... Hope this helps you...
And sorry for my English.
Here is an example of an iPhone app where the compass gets compensated with the gyroscope. Code and project can be seen here:
http://www.sundh.com/blog/2011/09/stabalize-compass-of-iphone-with-gyroscope/
The direction of the yaw axis vector is undefined when in zero gravity (or free fall, or close enough).
In order to do synchronization while in motion, you need to create a filter for your "horizontal" value that has the same lag/delay response characteristics as the magnetic compass. Either that, or wait until motion stops long enough for both values to settle before recalculating the offset.
Answer to question 1: yes. Question 2: you are on the right track, but you could use a variable name other than 'horizontal'. Question 3 is answered by hotpaw2; also, a yaw in a chopper or helicopter at near zero altitude would alert the pilot with an alarm. There is a time lag because part of the software is local, while other factors can slow it down, including access to a sensor for detecting magnetic fields, the device's position and direction, preparing the graphic output for the compass display, and computing and outputting data from the gyro and sensors through a relatively slow interface, on a general-purpose handheld device not custom-designed for the type of task being asked of it.

Detect the iPhone rotation spin?

I want to create an application that can detect the number of spins when the user rotates the iPhone. Currently, I am using the Compass API to get the angle and have tried many ways to detect spins. Below is the list of solutions that I've tried:
1/ Create 2 angle traps (pieces of the full circle) on the circle to detect whether the angle we get from the compass has passed them or not.
2/ Sum all angle distances between compass updates (in the updateHeading function), then divide the summed angle by 360 => we get the number of spins (a rough sketch of this follows below).
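For illustration, a rough sketch of approach 2/ with the wraparound handling made explicit (lastHeading and accumulatedAngle are assumed to be instance variables; the delegate method is the standard CLLocationManagerDelegate heading callback):

// Sketch of approach 2/: accumulate signed heading changes and divide by 360.
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    double heading = newHeading.magneticHeading;   // 0-360 degrees
    double delta = heading - lastHeading;          // change since the last update
    if (delta > 180.0)  delta -= 360.0;            // unwrap the 0/360 crossing
    if (delta < -180.0) delta += 360.0;
    accumulatedAngle += delta;
    lastHeading = heading;
    NSInteger spins = (NSInteger)(fabs(accumulatedAngle) / 360.0);
    NSLog(@"full spins so far: %ld", (long)spins);
}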
The problem is: when the phone is rotated too fast, the compass cannot keep up with the speed of the phone, and it only returns the latest angle (not a continuous stream matching the real rotation).
We also tried to use the accelerometer to detect spins. However, this does not work when you rotate the phone on a flat plane.
If you have any solution or experience on this issue, please help me.
Thanks so much.
The iPhone 4 contains a MEMS gyroscope, so that's the most direct route.
As you've noticed, the magnetometer has sluggish response. This can be reduced by using an anticipatory algorithm that uses the sluggishness to make an educated guess about what the current direction really is.
First, you need to determine the actual performance of the sensor. To do this, you need to rotate it at a precise rate at each of several rotational speeds, and record the compass behavior. The rotational platform should have a way to read the instantaneous position.
At slower speeds, you will see a varying degree of fixed lag. As the speed increases, the lag will grow until it approaches 180 degrees, at which point the compass will suddenly flip. At higher speeds, all you will see is flipping, though it may appear to not flip when the flips repeat at the same value. At some of these higher speeds, the compass may appear to rotate backwards, opposite to the direction of rotation.
Getting a rotational table can be a hassle, and ensuring it doesn't affect the local magnetic field (making the compass useless) is a challenge. The ideal table will be made of aluminum, and if you need to use a steel table (most common), you will need to mount the phone on a non-magnetic platform to get it as far away from the steel as possible.
A local machine shop will be a good place to start: CNC machines are easily capable of doing what is needed.
Once you get the compass performance data, you will need to build a model of the observed readings vs. the actual orientation and rotational rate. Invert the model and apply it to the readings to obtain a guess of the actual readings.
A simple algorithm implementation will be to keep a history of the readings, and keep a list of the difference between sequential readings. Since we know there is compass lag, when a difference value is non-zero, we will know the current value has some degree of inaccuracy due to lag.
The next step is to create a list of 'corrected' readings, where the known lag of the prior actual values is used to generate an updated value, which is added to the last value in the 'corrected' list and stored as the newest value.
When the cumulative correction (the difference between the latest values in the actual and corrected lists) exceeds 360 degrees, that means we basically don't know where the compass is pointing. Hopefully that point won't be reached, since most rotational motion should generally be of fairly short duration.
However, since your goal is only to count rotations, you will be off by less than a full rotation until the accumulated error reaches a substantially higher value. I'm not sure what this value will be, since it depends on both the actual compass lag and the actual rate of rotation. But if you care only about a small number of rotations (5 or so), you should be able to obtain usable results.
You could use the velocity of the acceleration to determine how fast the phone is spinning and use that to fill in the blanks until the phone has stopped, at which point you could query the compass again.
If you're using an iPhone 4, the problem has been solved and you can use Core Motion to get rotational data.
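As a rough sketch of what that could look like with Core Motion (assuming rotation mostly around the screen's z axis; motionManager and totalAngle are assumed to be instance variables, and the fixed-interval integration is only an approximation):

// Sketch: integrate the gyro's z rotation rate (radians per second) to count full turns.
motionManager.deviceMotionUpdateInterval = 0.01;
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                   withHandler:^(CMDeviceMotion *motion, NSError *error) {
    totalAngle += motion.rotationRate.z * motionManager.deviceMotionUpdateInterval;  // radians
    NSInteger spins = (NSInteger)(fabs(totalAngle) / (2.0 * M_PI));
    NSLog(@"full rotations so far: %ld", (long)spins);
}];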
For earlier devices, I think an interesting approach would be to try to detect wobbling as the device rotates, using UIAccelerometer on a very fine reporting interval. You might be able to get some reasonable patterns detected from the motion at right angles to the plane of rotation.