How to determine relative position using accelerometer and gyro data

I am designing a robot and need to track the distance and direction of its motion. Nothing in 3D; I only need x, y, and the heading angle in the x-y plane.
My question:
Is it possible to use a gyro and an accelerometer, with Kalman filtering or any other method, to track this? (I do not have motor encoders.)
My constraints: I cannot include a GPS (due to power requirements) or motor encoders (due to motor support).

No, not really. If you integrate the accelerometer values twice you get position, but the error is horrible; it is useless in practice.
Here is an explanation why (Google Tech Talk) at 23:20.
A related question is probably this one.
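To get a feel for how fast that error grows, here is a minimal sketch (plain Python; the 0.05 m/s² bias and 100 Hz rate are assumptions, and optimistic for a cheap MEMS accelerometer) that double-integrates the output of a perfectly stationary sensor:

    import random

    # Dead-reckoning drift sketch: a *stationary* accelerometer with a small
    # constant bias plus noise, double-integrated to position.
    # The true position is 0 the whole time; watch the estimate run away.
    dt = 0.01                                # 100 Hz sample rate (assumed)
    bias = 0.05                              # m/s^2 (assumed, optimistic)
    velocity = 0.0
    position = 0.0
    for step in range(1, 6001):              # one simulated minute
        accel = bias + random.gauss(0, 0.1)  # measured value; true value is 0
        velocity += accel * dt               # first integration:  a -> v
        position += velocity * dt            # second integration: v -> x
        if step % 1000 == 0:
            print(f"t={step * dt:4.0f}s  position error ~ {position:7.1f} m")

Even with that optimistic bias the estimate is roughly 90 m off after one minute (the error grows as roughly 0.5 * bias * t²), and real sensors also drift with temperature and vibration.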

Related

How to obtain only linear acceleration from accelerometer using gyroscope?

I have a small remote-controlled car driving on the room floor. For simplicity, let us assume it is moving along the x-axis. Now, the floor seems flat, but every surface has very minute bumps, so whenever the car is not exactly level (as it was at the starting position), in other words whenever it has even the slightest tilt, then:
Total acceleration obtained from the accelerometer = linear acceleration + acceleration due to tilt
My question is: how do I remove the acceleration due to tilt so that I get only the linear acceleration? Can I somehow use a gyroscope to do that?
I have implemented sensor fusion for the Shimmer platform based on this manuscript, which is basically a tutorial:
Direction Cosine Matrix IMU: Theory
This manuscript pretty much answers your question.
These have also been a big help:
An introduction to inertial navigation
An Introduction to the Kalman Filter
Pedestrian Localisation for Indoor Environments
Combine Gyroscope and Accelerometer Data
Just promise me you won't try double-integrating the linear acceleration, because it won't work, and I suspect that is what you are trying to do.
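In the spirit of those references, here is a minimal sketch of the idea (plain Python, made-up sample data, and a simple complementary filter standing in for the full DCM or Kalman machinery): use the gyro to track pitch, correct its drift with the accelerometer, then subtract the gravity component that the tilt leaks into the x-axis.

    import math

    G = 9.81       # gravity, m/s^2
    DT = 0.01      # sample period, s (100 Hz, assumed)
    ALPHA = 0.98   # complementary-filter weight: trust the gyro short-term

    def linear_ax(samples):
        """samples: (ax, az, gyro_y) tuples in m/s^2, m/s^2, rad/s.
        Yields the x acceleration with the tilt (gravity) component removed."""
        pitch = 0.0                              # car assumed level at start
        for ax, az, gyro_y in samples:
            pitch_gyro = pitch + gyro_y * DT     # gyro: smooth, but drifts
            pitch_accel = math.atan2(ax, az)     # accel: noisy, but drift-free
            pitch = ALPHA * pitch_gyro + (1 - ALPHA) * pitch_accel
            yield ax - G * math.sin(pitch)       # remove gravity leaked into x

    # Example: car sitting still on a 2-degree slope; true linear accel is 0.
    tilt = math.radians(2.0)
    fake = [(G * math.sin(tilt), G * math.cos(tilt), 0.0)] * 300  # 3 s of data
    for i, a in enumerate(linear_ax(fake)):
        if i % 100 == 0:
            print(f"t={i * DT:3.1f}s  linear ax ~ {a:+.3f} m/s^2")  # decays to 0

Note that the accelerometer-derived pitch is only a valid reference while the linear acceleration is small compared to gravity; handling that properly is exactly what the Kalman-filter references above are for.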

How can I measure distance using an accelerometer?

My object starts from rest. As time goes on, it covers some distance; how can I measure this?
Oh, it's simple. All you have to do is implement an Inertial Measurement Unit and then an Inertial Navigation System. It's going to be hard to do without rotation sensors; it would probably require a Kalman filter for accuracy; and typically it is done with ring laser gyros or fiber-optic gyros, "solid state" devices that work by measuring relativistic effects and sell for rather higher prices than the silicon micromachined sensors in the iPhone. But you might get it to work.
Or you could just use the GPS.
Other than alerting you that the device moved, the accelerometer will not be much use. You will not get a reading like "device moved 10 cm"; as far as I know, you will just get a value for how much acceleration occurred.
If you need to track your device's movement in the physical world, you will need to use the Location APIs.
You can figure this out, but it won't be that accurate, mainly due to the sample rate and the inaccuracy of the accelerometer.
First figure out the direction and magnitude of the movement. If the user moves the iPhone at +0.1 g along the x-axis and 0 g along the y and z axes, then the acceleration is +0.1 g on the x-axis. 1 g is 9.8 m/s², so that is 0.98 m/s²; starting from rest, the phone has moved about 0.49 m (d = ½at²) after traveling for 1 second.
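As a minimal sketch of that bookkeeping (plain Python, constant made-up samples; it inherits all the drift problems described in the first answer above):

    G = 9.8    # m/s^2
    DT = 0.02  # sample period, s (50 Hz, assumed)

    def integrate_distance(samples_g):
        """Naively integrate accelerometer readings (in g, one axis) twice
        to get distance travelled, starting from rest."""
        velocity = 0.0
        distance = 0.0
        for a_g in samples_g:
            a = a_g * G                 # convert g -> m/s^2
            velocity += a * DT          # first integration:  a -> v
            distance += velocity * DT   # second integration: v -> d
        return distance

    # One second of a constant +0.1 g push; exact answer is 0.5*a*t^2 = 0.49 m.
    print(integrate_distance([0.1] * 50))  # ~0.50 (small rectangle-rule error)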

Getting level of rotation with UIAcceleration

Games like FroggyJump for iPhone figure out the rotation of the iPhone. I'm getting confused by the acceleration values. How do I calculate the level of rotation? I suppose I need to consider when the iPhone isn't perfectly upright.
Thank you.
I also want to use the new Core Motion framework with "Device Motion" on the iPhone 4 for extra precision. I guess I'll have to use that low-pass filter for the other devices.
It's the yaw.
Having given Froggy Jump a quick go, I think it's likely directly using the accelerometer's x value as the left/right acceleration on the frog. If it is stationary, you can think of an accelerometer as giving you the vector that points upward into space, relative to the local axes. For something like a ball rolling or anything else accelerating due to tilt, you want to use the values directly.
For anything that involves actually knowing angles, you're probably best picking the axis around which you want to detect rotation then using the C function atan2f on the accelerometer values for the other two axes. With just an accelerometer, there are some scenarios in which you can't detect rotation — for example, if the device is flat on a table then an accelerometer can't detect yaw. The general rule is that rotations around the gravity vector can't be detected with an accelerometer alone.
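As a minimal sketch of the atan2 idea (plain Python standing in for the C atan2f call, with made-up readings in g and a hypothetical sign convention): for rotation about the z-axis (the screen normal), feed the x and y readings to atan2.

    import math

    def rotation_about_z(ax, ay):
        """Angle of the gravity vector in the device's x-y plane, i.e. the
        rotation about z. Convention here is hypothetical: 0 when gravity
        lies along +y. If the device lies flat on a table, ax and ay are
        both ~0 and the angle is meaningless -- the 'rotation around the
        gravity vector' case described above."""
        return math.atan2(ax, ay)

    print(math.degrees(rotation_about_z(0.0, 1.0)))    # 0.0   (no tilt)
    print(math.degrees(rotation_about_z(0.5, 0.866)))  # ~30.0
    print(math.degrees(rotation_about_z(1.0, 0.0)))    # 90.0  (quarter turn)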

Calibrating code to iPhone accelerometer and gyro data

I'm planning an iPhone app that will require precise calibration to the iPhone's accelerometer and gyro data. I will have to detect specific movements that I would eventually like to trigger code (think shake-to-shuffle, or shake-to-undo).
Is there a good way of doing this already, or something you can come up with? Perhaps some way to generate a time/value graph of the movement data as it is being captured?
For capturing movement data, see the AccelerometerGraph sample app, which shows the data in real time: http://developer.apple.com/library/ios/#samplecode/AccelerometerGraph/Introduction/Intro.html
The data is pretty noisy: the gyro and accelerometer aren't good enough right now to track where the phone is in local 3D space, for example. The rotation, however, is very solid, and the orientation of the device can be tracked pretty accurately. You may get the best results by making gestures out of rotation data instead of movement along an axis. Or basic gestures like shakes along an axis will work, as Jacob Jennings said.
A good starting point for accelerometer gesture recognition is this tutorial by Kevin Bomberry at AblePear:
http://blog.ablepear.com/2010/02/iphone-sdk-shake-rattle-roll.html
He sets a blanket threshold for the absolute value of acceleration on any axis. I would generate an 'event' for the axis that had the highest acceleration when the threshold was exceeded (Z POSITIVE, X NEGATIVE, etc.) and push these onto an 'event history' queue. At the end of each didAccelerate call, evaluate the queue for patterns that match a gesture; for example:
X POSITIVE, X NEGATIVE, X POSITIVE, X NEGATIVE might be considered a 'shake' along that axis. This should provide a couple of different gesture commands; a sketch of the idea follows the queue link below.
See the following for a simple queue category addition to NSMutableArray:
How do I make and use a Queue in Objective-C?
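Putting those pieces together, here is a minimal sketch of the event-history idea (plain Python; the 1.5 g threshold, the queue length, and the sample data are all made up):

    from collections import deque

    THRESHOLD = 1.5                   # g; hypothetical, tune per device
    SHAKE = ["X+", "X-", "X+", "X-"]  # the 'shake along X' pattern from above

    events = deque(maxlen=8)          # bounded 'event history' queue

    def did_accelerate(ax, ay, az):
        """Call once per accelerometer sample (values in g).
        Returns 'shake' when the pattern shows up in the recent history."""
        # Record an event for the axis with the largest magnitude,
        # but only if it breaks the threshold.
        axis, value = max(zip("XYZ", (ax, ay, az)), key=lambda p: abs(p[1]))
        if abs(value) > THRESHOLD:
            events.append(axis + ("+" if value > 0 else "-"))
        # Does the recent history end with the shake pattern?
        if list(events)[-len(SHAKE):] == SHAKE:
            events.clear()            # consume the gesture
            return "shake"
        return None

    # Four alternating hard jolts along X, as from a side-to-side shake.
    samples = [(2.0, 0.1, 0.0), (-2.0, 0.0, 0.1),
               (1.8, 0.0, 0.0), (-1.9, 0.1, 0.0)]
    for s in samples:
        gesture = did_accelerate(*s)
        if gesture:
            print("recognized:", gesture)   # fires on the fourth jolt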