Getting level of rotation with UIAcceleration - iPhone

Games like Froggy Jump for iPhone figure out the rotation of the iPhone. I'm getting confused by the acceleration values. How do I calculate the level of rotation? I suppose I need to account for the case where the iPhone isn't held perfectly upright.
Thank you.
I also want to use the new Core Motion framework's "Device Motion" on iPhone 4 for extra precision. I guess I'll have to use a low-pass filter for the other devices.
It's the yaw.

Having given Froggy Jump a quick go, I think it's most likely using the accelerometer's x value directly as the left/right acceleration on the frog. When the device is stationary, you can think of the accelerometer as giving you the vector that points upward into space, expressed in the device's local axes. For something like a rolling ball, or anything else accelerating due to tilt, you want to use the values directly.
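For illustration, a minimal sketch of that direct approach using the era's UIAccelerometer API (the frogVelocity property and kTiltSensitivity constant are hypothetical tuning points, not anything taken from Froggy Jump itself):

    #import <UIKit/UIKit.h>

    static const float kTiltSensitivity = 0.5f; // hypothetical tuning constant

    // During setup, in a class adopting UIAccelerometerDelegate:
    //   [UIAccelerometer sharedAccelerometer].updateInterval = 1.0 / 60.0;
    //   [UIAccelerometer sharedAccelerometer].delegate = self;

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        // acceleration.x is in G and runs roughly -1..+1 as the device
        // tilts left/right in portrait; feed it straight into the motion.
        self.frogVelocity += acceleration.x * kTiltSensitivity;
    }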
For anything that involves actually knowing angles, you're probably best off picking the axis around which you want to detect rotation, then using the C function atan2f on the accelerometer values for the other two axes. With just an accelerometer there are some scenarios in which you can't detect rotation at all; for example, if the device is flat on a table, an accelerometer can't detect yaw. The general rule is that rotations around the gravity vector can't be detected with an accelerometer alone.
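As a sketch of the atan2f idea (the function name is illustrative; it assumes a portrait device and rotation about the screen's z axis, and the signs are worth verifying on a real device):

    #include <math.h>

    // Tilt angle about the screen's z axis, in radians, from raw
    // accelerometer values in G. Holding the device upright in portrait
    // gives 0; tilting the top to the right should give a positive
    // angle under Apple's sign conventions (verify on hardware).
    float tiltAboutZ(float ax, float ay) {
        // Gravity projected onto the x/y plane; atan2f recovers its angle.
        // -ay so that upright portrait (a y reading of -1) maps to 0.
        return atan2f(ax, -ay);
    }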

Related

iPhone - Core Motion (relative rotation)

Is there a way to obtain a relative rotation from Core Motion?
What I need is: how much the device rotated about one axis, and in which direction (+ sign = anti-clockwise, - = clockwise, according to the right-hand rule).
I have found the rotationRate property, but I am not sure how I would extract an angle from it, as it gives me radians per second.
I have tried all kinds of things over the last few days, but nothing gives me stable values. I tried taking timed samples of the Core Motion data with an NSTimer and calculating the difference between two samples, so I would know how much the device had rotated since the last sample, but from time to time it gives me crazy numbers like 13600 degrees even when the iPhone is resting on the table.
Any thoughts on how this can be accomplished?
thanks
There is indeed. You can get what you're looking for by drilling down into the properties of CMMotionManager, through CMDeviceMotion and finally to CMAttitude. The attitude of the device is defined as:
the orientation of a body relative to a given frame of reference.
In the case of DeviceMotion's CMAttitude, that frame of reference is established by the framework when starting device motion updates. From that point in time on, the attitude of the device is reported relative to that reference frame (not relative to the previous frame).
The CMAttitude class provides some handy built-in functionality to convert a CMAttitude to a form that is actually useful, like Euler angles, a rotation matrix, or a quaternion. It sounds like you're looking for the Euler angle representation (pitch, yaw, roll).
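As a rough sketch of how that fits together (the motionManager and referenceAttitude properties are assumptions of this sketch; note that multiplyByInverseOfAttitude: mutates its receiver):

    #import <CoreMotion/CoreMotion.h>

    // Reports rotation relative to a stored reference attitude.
    // Assumes retained properties: CMMotionManager *motionManager,
    // CMAttitude *referenceAttitude.
    - (void)startTrackingRelativeRotation {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
            withHandler:^(CMDeviceMotion *motion, NSError *error) {
                if (self.referenceAttitude == nil) {
                    // The first sample becomes the reference frame.
                    self.referenceAttitude = motion.attitude;
                    return;
                }
                CMAttitude *attitude = motion.attitude;
                // After this call, 'attitude' describes the rotation since
                // the stored reference, not since updates started.
                [attitude multiplyByInverseOfAttitude:self.referenceAttitude];
                // Euler angles in radians; sign follows the right-hand rule.
                NSLog(@"roll %.3f pitch %.3f yaw %.3f",
                      attitude.roll, attitude.pitch, attitude.yaw);
            }];
    }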
The answer provided above isn't quite accurate, though it's probably sufficient for this question. Core Motion tries to determine the device's absolute attitude at all times, which means that the definition of the axes can vary with the device's orientation. For example, if the device is face up, then pitching up/down is a rotation about the y-axis, but if the device is in landscape orientation, then pitch is a rotation about the z-axis (perpendicular to the plane of the screen). This is fine if your application will only be used in one orientation, or if you want a delta like the question asks for, but it makes things excessively complicated if you want to know absolute orientation.

iPhone - What does the gyroscope measure? Can I get an absolute degree measurement on all axes?

I am relatively new to iPhone development and I am playing with the gyroscope, using Core Motion. After a few tests, this is my question.
What information exactly is the gyroscope measuring? Absolute angles? I mean, suppose I hold my phone in portrait, at exactly 90 degrees, and start sampling. These may not be the correct values, but suppose that in this position the gyroscope gives me 0, 0 and 0 degrees for yaw, pitch and roll.
Now I throw my iPhone in the air, and as it goes up it rolls at random through a number of full turns on all axes, then returns to my hand in the same position as before. Will the gyroscope read 0, 0, 0 (meaning that it has the same position as before, i.e. an absolute angle) or not?
If not, is there a way to measure absolute degrees on all axes? By absolute degrees I mean taking 0, 0, 0 as the position the device was in when sampling started.
thanks
The gyroscope measures many things for you, and yes, one of these is "absolute angles". Take a look at the docs on CMDeviceMotion. It can give you a rotation rate, which is how fast the device is spinning, and it can give you a CMAttitude. The CMAttitude is what you're calling "absolute angles". It is technically defined as:
the orientation of a body relative to a given frame of reference.
Normal gyroscopes, as noted in the other answer, are prone to drift. The really nice thing is that the Core Motion framework does a lot of processing behind the scenes to compensate for that drift before the measurements are reported. Practically, I've found that the framework does a remarkable (though not perfect) job at this task. Unless you need long-term precision relative to a magnetic pole or something, the attitude reported by the framework can be treated as a perfect relative attitude measurement for all intents and purposes.
The iPhone uses accelerometers for its internal angle measurements, which means they are relative to the Earth's gravity. That's about as absolute as you're going to get, unless you need this program to work in space, too.

How can I measure distance using the accelerometer?

My object starts from rest. As time goes on, it covers some distance; how can I measure this?
Oh, it's simple. All you have to do is implement an Inertial Measurement Unit and then an Inertial Navigation System. It's going to be hard to do without rotation sensors, it would probably require a Kalman Filter for accuracy, and typically it is done with ring laser gyros or fiber optic gyros, which are "solid state" devices that work by measuring relativistic effects and sell for rather higher prices than the silicon micromachined sensors in the iPhone, but you might get it to work.
Or, you could just use the GPS.
Other than just being alerted that the device did move, the accelerometer will not be much use. You will not get a reading like "device moved 10 cm"; as far as I know, you'll just get a value for how much acceleration occurred.
If you need to track your device's movement in the physical world you'll need to use the Location APIs.
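A minimal sketch of that with Core Location (the totalDistance property is an assumption of this sketch; the delegate method shown is the pre-iOS 6 form, and newer iOS versions also require requesting location authorization first):

    #import <CoreLocation/CoreLocation.h>

    // Assumes a class adopting CLLocationManagerDelegate with retained
    // properties: CLLocationManager *locationManager, double totalDistance.
    - (void)startTrackingDistance {
        self.locationManager = [[CLLocationManager alloc] init];
        self.locationManager.delegate = self;
        self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
        [self.locationManager startUpdatingLocation];
    }

    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        if (oldLocation != nil) {
            // Meters between fixes; accumulate for total distance traveled.
            self.totalDistance += [newLocation distanceFromLocation:oldLocation];
        }
    }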
You can figure this out, but it won't be very accurate, mainly due to the sample rate and the inaccuracy of the accelerometer.
First figure out the direction and magnitude of the movement. If the user moves the iPhone at +0.1 G along the X axis and 0 G along the Y and Z axes, then the acceleration is +0.1 G on the X axis. 1 G is 9.8 m/s², so 0.1 G is about 0.98 m/s²; starting from rest, the phone will have moved ½ × 0.98 × 1² ≈ 0.49 m after traveling for 1 second.
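To make the arithmetic concrete, here is a naive dead-reckoning sketch using Core Motion's userAcceleration (which already has gravity removed). Accumulated noise makes the position estimate drift badly within seconds, so treat it as an illustration of the integration, not a usable odometer; the motionManager, velocityX, and positionX properties are assumptions of this sketch:

    #import <CoreMotion/CoreMotion.h>

    static const double kGravity = 9.81; // m/s^2 per 1 G

    - (void)startNaiveIntegration {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 1.0 / 100.0;
        __block NSTimeInterval lastTimestamp = 0;
        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
            withHandler:^(CMDeviceMotion *motion, NSError *error) {
                if (lastTimestamp == 0) { lastTimestamp = motion.timestamp; return; }
                NSTimeInterval dt = motion.timestamp - lastTimestamp;
                lastTimestamp = motion.timestamp;
                // userAcceleration is in G with gravity already removed.
                double ax = motion.userAcceleration.x * kGravity; // m/s^2
                self.velocityX += ax * dt;             // v = v0 + a*dt
                self.positionX += self.velocityX * dt; // s = s0 + v*dt
            }];
    }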

Calibrating code to iPhone accelerometer and gyro data

I'm sketching out an iPhone app that will require precise calibration against the iPhone's accelerometer and gyro data. I will have to recognize specific movements that I would eventually like to use to trigger code. (Think shake-to-shuffle, or undo.)
Is there a good way of doing this already, or something you can come up with? Perhaps some way to generate a time/value graph of the movement data as it is being captured?
For movement data being captured, see the AccelerometerGraph sample app, which shows the data in real time: http://developer.apple.com/library/ios/#samplecode/AccelerometerGraph/Introduction/Intro.html
The data is pretty noisy; the gyro and accelerometer aren't good enough right now to track where the phone is in local 3D space, for example. Rotation, however, is very solid, and the orientation of the device can be tracked quite accurately. Basic gestures like shakes along an axis will work, as Jacob Jennings said, but you may have better results making gestures out of rotation data instead of movement along an axis, as sketched below.
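For example, a rotation-based gesture could watch the gyro's rotation rate (the threshold value and the twistGestureDetected handler here are hypothetical and would need tuning):

    #import <CoreMotion/CoreMotion.h>
    #include <math.h>

    static const double kTwistThreshold = 4.0; // rad/s; tune by experiment

    // Fires when the device is twisted sharply about the screen's z axis.
    // Assumes a retained CMMotionManager *motionManager property.
    - (void)startTwistDetection {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
            withHandler:^(CMDeviceMotion *motion, NSError *error) {
                if (fabs(motion.rotationRate.z) > kTwistThreshold) {
                    [self twistGestureDetected]; // hypothetical handler
                }
            }];
    }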
A good starting point for accelerometer gesture recognition is this tutorial by Kevin Bomberry at AblePear:
http://blog.ablepear.com/2010/02/iphone-sdk-shake-rattle-roll.html
He sets a blanket threshold for the absolute value of acceleration on any axis. I would generate an 'event' for the axis that had the highest acceleration when the threshold is exceeded (Z POSITIVE, X NEGATIVE, etc.) and push these onto an 'event history' queue. At the end of each didAccelerate call, evaluate the queue for patterns that match a gesture. For example, X POSITIVE, X NEGATIVE, X POSITIVE, X NEGATIVE might be considered a 'shake' along that axis. This should provide a couple of different gesture commands.
See the following for a simple queue category addition to NSMutableArray:
How do I make and use a Queue in Objective-C?
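A sketch of that event-queue idea (the threshold value, the eventHistory NSMutableArray property, and the shakeDetected handler are illustrative assumptions, not code from the tutorial; real code would also debounce repeated events from a single swing):

    #import <UIKit/UIKit.h>
    #include <math.h>

    static const double kShakeThreshold = 1.5; // G; blanket threshold

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        double ax = fabs(acceleration.x);
        double ay = fabs(acceleration.y);
        double az = fabs(acceleration.z);
        double peak = MAX(ax, MAX(ay, az));
        if (peak > kShakeThreshold) {
            // Record an event for the axis with the highest acceleration.
            NSString *event;
            if (peak == ax)      event = (acceleration.x > 0) ? @"X+" : @"X-";
            else if (peak == ay) event = (acceleration.y > 0) ? @"Y+" : @"Y-";
            else                 event = (acceleration.z > 0) ? @"Z+" : @"Z-";
            [self.eventHistory addObject:event];
        }
        // Crude pattern match: alternating X+/X- reads as a shake on X.
        if (self.eventHistory.count >= 4) {
            NSRange tailRange = NSMakeRange(self.eventHistory.count - 4, 4);
            NSArray *tail = [self.eventHistory subarrayWithRange:tailRange];
            NSArray *shake = [NSArray arrayWithObjects:@"X+", @"X-", @"X+", @"X-", nil];
            if ([tail isEqualToArray:shake]) {
                [self.eventHistory removeAllObjects];
                [self shakeDetected]; // hypothetical handler
            }
        }
    }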

How to determine absolute orientation

I have an XYZ accelerometer and a magnetometer. Now I want to determine the orientation of the device using both. The problem I see is that, depending on the device's orientation, I'd need to use the sensors in a different order.
Let me give an example. If I have the device facing me then changes in both the roll and pitch can be determined with the accelerometer. For yaw I use the magnetometer.
But if I put the device horizontally (i.e. turn it 90 degrees so it faces the ceiling), then any change in the up vector (now horizontal) isn't noticed, as the accelerometer doesn't detect any change. That change can now be detected with the magnetometer.
So the question is, how do I determine when to use one or the other? Are these two sensors enough, or do I need something else?
Thanks
The key is to use the cross product of the two vectors, gravity and the magnetometer reading. The cross product gives a new vector perpendicular to both, which means it is horizontal (perpendicular to down) and 90 degrees away from north. The gravity and magnetic field vectors themselves are not perpendicular to each other, which is a little ugly, but that is easy to fix: cross this new vector back with the gravity vector, and you get a third vector perpendicular to both the gravity vector and the first cross product. Now you have three mutually perpendicular vectors that define a 3D orientation coordinate system. The original accelerometer (gravity) vector defines Z (up/down), and the two cross-product vectors define the east/west and north/south components of the orientation.
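In code, the construction looks something like this (plain C vector math; "down" is assumed to point toward the Earth, so flip your accelerometer reading if it reports the upward vector, and verify the sign conventions for your hemisphere):

    #include <math.h>

    typedef struct { double x, y, z; } Vec3;

    static Vec3 cross(Vec3 a, Vec3 b) {
        Vec3 c = { a.y * b.z - a.z * b.y,
                   a.z * b.x - a.x * b.z,
                   a.x * b.y - a.y * b.x };
        return c;
    }

    static Vec3 normalize(Vec3 v) {
        double len = sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        Vec3 n = { v.x / len, v.y / len, v.z / len };
        return n;
    }

    // Builds mutually perpendicular axes from a gravity estimate and a
    // magnetometer reading, both expressed in device coordinates.
    void orientationBasis(Vec3 down, Vec3 mag, Vec3 *east, Vec3 *north) {
        *east  = normalize(cross(down, mag));   // horizontal, 90° from north
        *north = normalize(cross(*east, down)); // horizontal, toward magnetic north
    }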
Here is some documentation that walks through this kind of calculation. As is clear from the other answers, the math can be tricky.
http://www.freescale.com/files/sensors/doc/app_note/AN4248.pdf
I think the question "how to determine when to use one or the other" is misguided. You should always use both sensors for orientation. There are cases where one of them is useless. However, these are edge cases.
If I understand you correctly, you'll need something to detect pitch (tilting) and orientation according to the cardinal points (North, East, South and West).
The pitch can be read from the accelerometer.
The orientation according to the cardinal points can be read from a compass.
Combining the output from these two sensors correctly with the right math in your software will most likely give you the absolute orientation.
I think it's doable that way.
Good luck.
In the event you still need absolute orientation, you can check out this breakout board from Adafruit: https://www.adafruit.com/products/2472. The nice thing about this board is that it has an ARM Cortex-M0 processor on it to do all of the calculations for you.