6DOF using accelerometer and gyroscope - iPhone

Has anyone developed 6DOF pose estimation using only the iPhone sensors, not video? Drift from the accelerometer and gyroscope is understood.
The gyroscope provides fairly reliable relative orientation rates. I've been able to develop with the gyroscope data.
However, I'm having more problems deriving translation from the accelerometer. Double integration of the acceleration leads to useless position data very quickly (less than half a second).
I have attempted to remove the bias with a calibration step, but the position is still poor. What's worse, the bias isn't constant: it changes over time, and the noise drowns out the signal.
I'm interested to hear whether anyone has been able to develop 6DOF tracking with only the accelerometer and gyroscope that works reliably for 5-10 seconds with little drift in both translation and orientation.
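For reference, this is the kind of naive double integration I mean - a minimal Core Motion sketch (Swift) with a bias subtracted from a hypothetical stationary calibration step - not a working solution. Because each integration accumulates the bias and noise errors, the position error grows roughly quadratically with time:

    import CoreMotion

    // Naive dead reckoning from userAcceleration: illustrates the drift problem,
    // it is not a solution. The bias values are assumed to come from a separate
    // stationary calibration step.
    final class DeadReckoner {
        private let motion = CMMotionManager()
        private let g = 9.81                            // userAcceleration is reported in g's
        private var velocity = (x: 0.0, y: 0.0, z: 0.0)
        private var position = (x: 0.0, y: 0.0, z: 0.0) // metres, device frame (not rotated to world)
        var bias = (x: 0.0, y: 0.0, z: 0.0)             // measured while the phone is at rest

        func start() {
            guard motion.isDeviceMotionAvailable else { return }
            motion.deviceMotionUpdateInterval = 1.0 / 100.0
            motion.startDeviceMotionUpdates(to: .main) { [weak self] dm, _ in
                guard let self = self, let dm = dm else { return }
                let dt = self.motion.deviceMotionUpdateInterval
                // Subtract the calibrated bias; in practice the bias wanders,
                // so this only delays the blow-up.
                let ax = (dm.userAcceleration.x - self.bias.x) * self.g
                let ay = (dm.userAcceleration.y - self.bias.y) * self.g
                let az = (dm.userAcceleration.z - self.bias.z) * self.g
                // First integration: acceleration -> velocity (bias error grows linearly).
                self.velocity.x += ax * dt
                self.velocity.y += ay * dt
                self.velocity.z += az * dt
                // Second integration: velocity -> position (bias error grows quadratically).
                self.position.x += self.velocity.x * dt
                self.position.y += self.velocity.y * dt
                self.position.z += self.velocity.z * dt
            }
        }
    }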

The gyro yaw from DeviceMotion drifts when you first start updates; try not to use those initial samples and everyone will be happy.
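A minimal illustration of that suggestion, assuming a hypothetical warm-up count of 30 samples (tune it for your device):

    import CoreMotion

    let motionManager = CMMotionManager()
    var samplesToSkip = 30   // hypothetical warm-up count, tune empirically

    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        if samplesToSkip > 0 {            // discard the first samples while the yaw settles
            samplesToSkip -= 1
            return
        }
        let yaw = motion.attitude.yaw     // radians, more trustworthy after the warm-up
        // ... use yaw ...
    }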

I made a post about this: Get orientation device in the iPhone for OpenGL ES. I'm having the same issue; I'm just trying to make a filter, but it isn't working well. There is a book about this, http://www.amazon.com/iOS-Sensor-Programming-Augmented-Location/dp/1449382657, but I haven't read it.

Related

iPhone4 iOS5 is there a physics engine to convert CMDeviceMotion events into displacement?

I'm running a CMDeviceMotion processing queue on iPhone 4, which gives me user-induced acceleration, along with the rotation rates. I can filter this data myself.
What I'm trying to understand is how to convert these discrete samples of acceleration, device attitude and rotation rate into a 3-dimensional displacement. This is possible with classical mechanics for straight lines, but I'm thinking of more advanced calculations - for example, curves. This can be handled with GPS, but I'm looking for much better resolution - let's say within 10 feet. GPS under clear sky has an average accuracy of about 30 feet.
Is there some sort of a physics engine or physics processor that can take a set of device motion or acceleration/turn rate events and give me a distance of how far the phone is from the original location?
I know that there are various pedometer and bike GPS trackers for iPhone. Are they based on GPS or do they actually do the acceleration integration like I'm describing?
Unfortunately, the acceleration integration you are describing won't work in itself.
However, you may improve the accuracy by fusing with the GPS signal and/or make domain specific assumptions. For details, see the above link.
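For what it's worth, when people attempt the integration anyway, the usual first step is to rotate the device-frame userAcceleration into the attitude's reference frame before integrating. A hedged sketch of just that transform (depending on the convention you need, you may have to use the transpose of the matrix); it does nothing to cure the drift discussed above:

    import CoreMotion

    // Rotate device-frame user acceleration into the attitude's reference frame.
    // This is only the coordinate transform; the double integration that follows
    // still drifts for the reasons discussed above.
    func referenceFrameAcceleration(from dm: CMDeviceMotion) -> (x: Double, y: Double, z: Double) {
        let a = dm.userAcceleration            // device frame, in g's
        let r = dm.attitude.rotationMatrix     // relates device frame and reference frame
        let g = 9.81                           // convert g's to m/s^2
        // Depending on the convention you want, swap this for the transpose.
        return (
            x: (r.m11 * a.x + r.m12 * a.y + r.m13 * a.z) * g,
            y: (r.m21 * a.x + r.m22 * a.y + r.m23 * a.z) * g,
            z: (r.m31 * a.x + r.m32 * a.y + r.m33 * a.z) * g
        )
    }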

Detecting the user's spinning motion

I have been experimenting with the Core Motion framework to detect a user spinning around, say on a merry-go-round, holding an iPhone in his hand.
There are ways to detect the device motion around its own axes, but what is a good way to detect the iPhone spinning in circles?
Thanks
You can use the gyroscope. Take a look here: Gyroscope example
Keep in mind that it is only available on the iPhone 4 and iPhone 4S.
There is one degenerate case where you can run into trouble; only the magnetometer (compass) can help in that particular case.
If you put the device (a) on a desk in a stationary position and then (b) on a perfectly horizontal turntable rotating slowly, you will get the same qualitative sensor readings. Both the gyro and the accelerometer readings are constant in the two cases, although the readings differ quantitatively. The sad part is: gyro bias error can make case (a) look like (b) and vice versa. In this particular case you need a compass to cancel the gyro drift. Case (a) is typical for a phone.
Apart from this degenerate case, gyroscopes and accelerometers with sensor fusion are sufficient to track arbitrary rotations of the device.
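One way to tell "spinning in circles" apart from other motion is to project the rotation rate onto the gravity vector reported by CMDeviceMotion and look for a sustained component about that axis. A minimal sketch; the threshold is a made-up number you would need to tune:

    import CoreMotion
    import Foundation

    let motionManager = CMMotionManager()
    let spinThreshold = 1.0          // rad/s about the vertical axis; hypothetical, tune it

    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        let g = m.gravity                // gravity direction in device coordinates (about unit length, in g's)
        let w = m.rotationRate           // rad/s in device coordinates
        // Component of the rotation rate about the gravity (vertical) axis.
        let gNorm = sqrt(g.x * g.x + g.y * g.y + g.z * g.z)
        let spinRate = abs(g.x * w.x + g.y * w.y + g.z * w.z) / max(gNorm, 1e-6)
        if spinRate > spinThreshold {
            // Sustained rotation about the vertical axis, e.g. a merry-go-round.
            print("spinning at \(spinRate) rad/s")
        }
    }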

iPhone/iPad gyroscope

I never really understood the applications of the gyroscope on the iPhone/iPad. Does it serve a similar function to the accelerometer, but as an improvement on it? What is its practical use?
"An accelerometer is a direct measurement of orientation, while a gyro is a measurement of the time rate of change of orientation." (1) By combining the output of the two sensors, called sensor fusion, one can determine the orientation of the device precisely and quickly.
If you only use the accelerometer with a low-pass filter, you still get a reasonable estimate of the orientation, but it will lag.
Here is an excellent live demo of both (Google Tech Talk), starting at 21:50.
A gyroscope measures rotation, whereas an accelerometer measures movement. Both have useful applications (gyroscope: which direction am I driving towards? accelerometer: did I just shake my device?)
The accelerometer tells you the difference in the force being experienced by the device and the force it would experience if it were in free fall. So if the device is static, the accelerometer tells you which way up is. When it's being shaken around, you get a summation of up plus the direction of the shake. Hence the accelerometer can detect some rotation, but not around the gravity vector and only if the device is otherwise static.
The gyroscope tells you the velocity at which the device is being rotated. So you can integrate values coming from it to keep track of orientation. That works across all axes and irrespective of device movement.
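As a concrete illustration of the fusion idea mentioned above, here is a minimal complementary-filter sketch for a single angle (pitch), using the raw accelerometer and gyro. The 0.98 blend factor and the axis/sign conventions are assumptions made for the sketch, not something prescribed by these answers:

    import CoreMotion
    import Foundation

    // Minimal complementary filter for one angle (pitch about the device x-axis):
    // the gyro gives a fast but drifting estimate, and the accelerometer's
    // gravity direction slowly pulls it back toward the truth.
    final class PitchFilter {
        private let motion = CMMotionManager()
        private let alpha = 0.98              // blend factor; an assumption, tune per application
        private(set) var pitch = 0.0          // radians

        func start() {
            guard motion.isAccelerometerAvailable, motion.isGyroAvailable else { return }
            let dt = 1.0 / 100.0
            motion.accelerometerUpdateInterval = dt
            motion.gyroUpdateInterval = dt
            motion.startAccelerometerUpdates()
            motion.startGyroUpdates(to: .main) { [weak self] gyro, _ in
                guard let self = self,
                      let gyro = gyro,
                      let a = self.motion.accelerometerData?.acceleration else { return }
                // Drift-free but noisy pitch from the gravity direction
                // (a device lying face up reads roughly (0, 0, -1) g).
                let accelPitch = atan2(-a.y, -a.z)
                // Fast but drifting pitch from integrating the rotation rate about x.
                let gyroPitch = self.pitch + gyro.rotationRate.x * dt
                // Blend: trust the gyro short term, the accelerometer long term.
                self.pitch = self.alpha * gyroPitch + (1 - self.alpha) * accelPitch
            }
        }
    }

CMDeviceMotion's attitude already does this kind of fusion for you with a better filter; the sketch is only meant to show why the combination beats either sensor alone.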

Obtaining accurate displacement/position change from iPhone accelerometer

I'm currently trying to implement an augmented reality iPhone application (iOS 4.2) that uses accelerometer data to translate and rotate an OpenGL object on the screen. I have already succeeded in getting the object to respond to the phone's rotation, but that was always going to be the easy part.
For the translational part, I've tried implementing some of the techniques from this paper (http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf),
but it's still not very accurate. I'm in the process of implementing a Kalman filter to filter the accelerometer data.
Has anyone had any luck in determining phone translational movement? If so, how accurate did you get it, and what techniques did you use to obtain this accuracy?
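For the Kalman step, here is a minimal 1-D sketch of what filtering a single acceleration axis might look like. The process and measurement noise values are placeholders, and smoothing alone does not remove the drift that double integration introduces:

    // Minimal 1-D Kalman filter for smoothing one acceleration axis.
    // q and r are placeholder noise parameters that would need tuning.
    struct ScalarKalman {
        var estimate = 0.0      // current state estimate
        var errorCov = 1.0      // estimate error covariance
        let q = 1e-4            // process noise (assumed)
        let r = 1e-2            // measurement noise (assumed)

        mutating func update(measurement: Double) -> Double {
            // Predict: the state is modelled as constant, so only the covariance grows.
            errorCov += q
            // Update: blend prediction and measurement by the Kalman gain.
            let gain = errorCov / (errorCov + r)
            estimate += gain * (measurement - estimate)
            errorCov *= (1 - gain)
            return estimate
        }
    }

    // Usage: feed each new sample through one filter per axis, e.g.
    // var kx = ScalarKalman()
    // let smoothedX = kx.update(measurement: deviceMotion.userAcceleration.x)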

iPhone - detecting motion with gyroscope/accelerometer

I'm trying to detect a swinging motion with an iPhone 4 using the gyro/accelerometer. I searched for some posts on SO about this, but couldn't find anything specific to my issues.
Do I need to do any sort of calibration for data from the gyroscope/accelerometer?
Anyone think of how I would measure a swinging motion?
Thanks!
1: Most iPhone games using the accelerometer don't do any calibration, but not all iPhones are the same; there is some variation in accelerometer calibration. You could add a manual or automatic calibration to your program. If, however, detecting a swinging motion is all you want, calibration is not necessary.
2: The iPhone SDK includes a nice little Apple sample app that generates graphs of accelerometer motion. You can download and build that and see the measurements for the motion you want. Then you can write code to detect similar accelerometer measurements.
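A hedged sketch of point 2's "detect similar measurements" idea: watch the magnitude of userAcceleration and flag a swing when it crosses a threshold. The threshold and the debounce interval are made-up values you would calibrate against recorded swings:

    import CoreMotion
    import Foundation

    let motionManager = CMMotionManager()
    let swingThreshold = 1.5          // in g's; hypothetical, calibrate against recordings
    var lastSwing = Date.distantPast

    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.userAcceleration else { return }
        let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        // A swing shows up as a short burst of user acceleration well above the noise floor.
        if magnitude > swingThreshold, Date().timeIntervalSince(lastSwing) > 0.5 {
            lastSwing = Date()
            print("swing detected (\(magnitude) g)")
        }
    }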