I'm concepting an iPhone app that will require precise calibration to the iPhone's accelerometer and gyro data. I will have to capture specific movements that I would eventually like to trigger code (think shake-to-shuffle, or undo).
Is there a good way of doing this already, or something you can come up with? Perhaps some way to generate a time/value graph of the movement data as it is being captured?
For movement data being captured, see the AccelerometerGraph sample app, which shows the data in real time: http://developer.apple.com/library/ios/#samplecode/AccelerometerGraph/Introduction/Intro.html
The data is pretty noisy - the gyro and accelerometer aren't good enough right now to track where the phone is in local 3D space, for example. The rotation, however, is very solid, and the orientation of the device can be tracked pretty accurately. You may get the best results by making gestures out of rotation data instead of movement along an axis. Or, basic directional gestures like shakes along an axis will work, as Jacob Jennings said.
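If you just want raw time/value pairs to graph yourself, a minimal sketch along these lines might do. It uses the same UIAccelerometer delegate API as the sample app; the 60 Hz rate and the startLogging method name are my own choices, not anything from the sample:

    // Logs CSV-style time/value rows you can paste into any plotting tool.
    - (void)startLogging
    {
        [UIAccelerometer sharedAccelerometer].updateInterval = 1.0 / 60.0;
        [UIAccelerometer sharedAccelerometer].delegate = self;
    }

    // UIAccelerometerDelegate
    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration
    {
        NSLog(@"%f,%f,%f,%f", acceleration.timestamp,
              acceleration.x, acceleration.y, acceleration.z);
    }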
A good starting point for accelerometer gesture recognition is this tutorial by Kevin Bomberry at AblePear:
http://blog.ablepear.com/2010/02/iphone-sdk-shake-rattle-roll.html
He sets a blanket threshold for the absolute value of acceleration on any axis. I would generate an 'event' for the axis that had the highest acceleration when the threshold was crossed (Z POSITIVE, X NEGATIVE, etc.), and push these onto an 'event history' queue. At the end of each didAccelerate call, evaluate the queue for patterns that match a gesture. For example, X POSITIVE, X NEGATIVE, X POSITIVE, X NEGATIVE might be considered a 'shake' along that axis. This should provide a couple of different gesture commands.
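A minimal sketch of that approach might look like the following. The threshold value, the AxisEvent names, and the eventHistory property are all assumptions of mine, not from the tutorial:

    typedef enum {
        XPositive, XNegative, YPositive, YNegative, ZPositive, ZNegative
    } AxisEvent;

    static const double kAccelThreshold = 1.5; // in G; tune empirically

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration
    {
        double ax = acceleration.x, ay = acceleration.y, az = acceleration.z;
        double maxMag = MAX(fabs(ax), MAX(fabs(ay), fabs(az)));

        if (maxMag > kAccelThreshold) {
            AxisEvent event;
            if (maxMag == fabs(ax))      event = (ax > 0) ? XPositive : XNegative;
            else if (maxMag == fabs(ay)) event = (ay > 0) ? YPositive : YNegative;
            else                         event = (az > 0) ? ZPositive : ZNegative;
            [self.eventHistory addObject:@(event)]; // eventHistory: NSMutableArray
        }
        [self evaluateEventHistory];
    }

    - (void)evaluateEventHistory
    {
        // X POSITIVE, X NEGATIVE, X POSITIVE, X NEGATIVE = shake along X.
        NSArray *shakeX = @[@(XPositive), @(XNegative), @(XPositive), @(XNegative)];
        if (self.eventHistory.count < shakeX.count) return;

        NSRange tail = NSMakeRange(self.eventHistory.count - shakeX.count,
                                   shakeX.count);
        if ([[self.eventHistory subarrayWithRange:tail] isEqualToArray:shakeX]) {
            [self.eventHistory removeAllObjects];
            // ...fire the shake gesture here...
        }
    }

In practice you would also want to drop events older than a second or so, otherwise a slow sequence of tilts can accumulate into a false 'shake'.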
See the following for a simple queue category addition to NSMutableArray:
How do I make and use a Queue in Objective-C?
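A simple FIFO category along those lines might look like this (the enqueue/dequeue method names are illustrative; under manual reference counting you would retain/autorelease the dequeued object before removing it):

    @interface NSMutableArray (QueueAdditions)
    - (void)enqueue:(id)obj;
    - (id)dequeue;
    @end

    @implementation NSMutableArray (QueueAdditions)
    - (void)enqueue:(id)obj
    {
        [self addObject:obj]; // newest at the tail
    }

    - (id)dequeue
    {
        if (self.count == 0) return nil;
        id head = [self objectAtIndex:0]; // oldest at the head
        [self removeObjectAtIndex:0];
        return head;
    }
    @end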
The Apple documentation for the UIAcceleration class says:
"When a device is laying still with its back on a horizontal surface, each acceleration event has approximately the following values:
x: 0
y: 0
z: -1"
Now, I am confused! How can the acceleration be non-zero, when you clearly say the "device is laying still"?
UPDATE
Judging by the responses, I think this should be called something like 'forceometer' or 'gravitometer' and not accelerometer!
You get a -1 on the Z axis because gravity is acting on the device, applying a constant acceleration of 1G. I assume you want user acceleration, which you can get from the DeviceMotion object using a device motion handler as opposed to an acceleration handler. The userAcceleration property filters out the effects of gravity on the device and only gives you how much the user is accelerating it.
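A minimal sketch of that, assuming a motionManager property you keep alive somewhere:

    #import <CoreMotion/CoreMotion.h>

    - (void)startMotionUpdates
    {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMDeviceMotion *motion,
                                                              NSError *error) {
            if (!motion) return;
            // userAcceleration has gravity filtered out; motion.gravity
            // holds the gravity vector on its own.
            CMAcceleration user = motion.userAcceleration;
            NSLog(@"user acceleration: %.3f %.3f %.3f (in G)",
                  user.x, user.y, user.z);
        }];
    }

Note that device motion requires a gyro, so on older devices you would fall back to filtering the raw accelerometer data yourself.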
I found the answer in the CoreMotion Reference guide, thanks to bensnider:
The accelerometer measures the sum of two acceleration vectors: gravity and user acceleration. User acceleration is the acceleration that the user imparts to the device.
You'll find the best answers in the datasheet of the accelerometer used (the LIS302DL).
It measures gravity. The unit is chosen so that gravity, 9.81 m/s^2, equals 1 unit. The sign tells you how the phone's axis is directed - in other words, what the phone considers downwards.
The phone measures 0 as acceleration in free fall. I don't know how much you want to throw your phone up and down to test it :)
When you're sitting, gravity is pulling you down into your chair. If it weren't for the chair - or the ground, for that matter - you'd be falling with an acceleration of about 9.8 m/s^2. In order for the chair to prevent you from falling, it has to act with a force in the opposite direction of at least the same value.
The accelerometer shows the value of this force, and it's a three-dimensional vector. In this case it's directed straight down, and the value given is expressed in G, i.e., as a multiple of the acceleration of gravity.
Answerers keep missing the right wording that should set it straight for you... The device is "laying still" only relative to you. It is actually not laying still at all. The centripetal force (http://en.wikipedia.org/wiki/Centripetal_force) of gravity gives it (and you) centripetal acceleration. It is real, it is what keeps you from flying off Earth on a tangent, and it is what the accelerometer dutifully shows. (Earth is nothing special - we also rotate about the Sun, etc., whose centripetal accelerations are far smaller, but an accelerometer sensitive enough would show them all.)
I don't yet have sufficient reputation to reply directly to the comment by @gigahari above, but as an addendum, folks should be aware that some apps (such as the physics apps phyphox and PhysicsToolbox Sensor Suite) do not report (a+g) - both phyphox's "with g" option and PhysicsToolbox report the vector sum (a-g), which is sometimes referred to as the "operational definition of weight". A brief discussion of this version of the operational definition of weight is on Wikipedia, at https://en.wikipedia.org/wiki/Weight#Operational_definition
Is there a way to obtain a relative rotation from core motion?
What I need is: how much it rotated in one axis and which direction (+ sign = anti-clockwise, - = clockwise, according to the right-hand rule).
I have found the rotationRate property, but I am not sure how I would extract the angle from it, as it gives me radians per second.
I have tried all kinds of things over the last few days, but nothing gives me stable values. I tried taking timed samples of the Core Motion data with an NSTimer and calculating the difference between two samples, so I would know how much it rotated since the last sample, but from time to time it gives me crazy numbers like 13600 degrees even when the iPhone is resting on the table.
Any thoughts on how this can be accomplished?
Thanks.
There is indeed. You can get what you're looking for by drilling down into the properties of CMMotionManager, through CMDeviceMotion and finally to CMAttitude. The attitude of the device is defined as:
the orientation of a body relative to a given frame of reference.
In the case of DeviceMotion's CMAttitude, that frame of reference is established by the framework when starting device motion updates. From that point in time on, the attitude of the device is reported relative to that reference frame (not relative to the previous frame).
The CMAttitude class provides some handy built-in functionality to convert a CMAttitude to a form that is actually useful for something, like Euler angles, a rotation matrix, or a quaternion. It sounds like you're looking for the Euler angle representation (pitch, yaw, roll).
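For the "how much did it rotate since X" case in the question, one common pattern is to capture a reference attitude and multiply later readings by its inverse. A rough sketch, where referenceAttitude is an assumed property and device motion updates are assumed to be running already:

    - (void)logRotationSinceReference
    {
        CMDeviceMotion *motion = self.motionManager.deviceMotion;
        if (!motion) return;

        if (!self.referenceAttitude) {
            // Capture the frame we'll measure against.
            self.referenceAttitude = motion.attitude;
            return;
        }

        CMAttitude *attitude = motion.attitude;
        [attitude multiplyByInverseOfAttitude:self.referenceAttitude];

        // pitch/yaw/roll now describe rotation since the reference,
        // in radians, with signs following the right-hand rule.
        NSLog(@"pitch %.3f  yaw %.3f  roll %.3f",
              attitude.pitch, attitude.yaw, attitude.roll);
    }

Note that multiplyByInverseOfAttitude: mutates the attitude object in place, which is why the reference is captured once and reused.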
The answer provided above isn't quite accurate, though it's probably sufficient to answer this question. Core Motion tries to determine the device's absolute attitude at all times, meaning that the definition of the axes can vary depending on the device's orientation. For example, if the device is face-up, then pitch up/down is a rotation about the y-axis, but if the device is in landscape orientation, then pitch is a rotation about the z-axis (perpendicular to the plane of the screen). This is somewhat helpful if your application will only be used in one orientation, or you want a delta like the question asked for, but makes it excessively complicated if you want to know absolute orientation.
My object starts from zero. As time goes on, it covers some distance; how can I measure this?
Oh, it's simple. All you have to do is implement an inertial measurement unit and then an inertial navigation system. It's going to be hard to do without rotation sensors, it would probably require a Kalman filter for accuracy, and typically it is done with ring laser gyros or fiber optic gyros - "solid state" devices that work by measuring relativistic effects and sell for rather higher prices than the silicon micromachined sensors in the iPhone - but you might get it to work.
Or, you could just use the GPS.
Other than just being alerted that the device moved, the accelerometer will not be much use. You will not get a reading of "device moved 10 cm" or anything similar; as far as I know, you'll just get a value for how much acceleration occurred.
If you need to track your device's movement in the physical world you'll need to use the Location APIs.
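For example, a minimal Core Location setup that reports the distance covered between fixes (locationManager is an assumed property; the callback shown is the delegate method from this era of the SDK):

    #import <CoreLocation/CoreLocation.h>

    - (void)startTracking
    {
        self.locationManager = [[CLLocationManager alloc] init];
        self.locationManager.delegate = self;
        self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
        [self.locationManager startUpdatingLocation];
    }

    // CLLocationManagerDelegate
    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation
    {
        if (oldLocation) {
            CLLocationDistance metres =
                [newLocation distanceFromLocation:oldLocation];
            NSLog(@"moved %.1f m since the last fix", metres);
        }
    }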
You can figure this out, but it won't be that accurate, mainly due to sample rate and the inaccuracy of the accelerometer.
First figure out the direction and force of the movement. If the user moves the iPhone at +0.1G along the X axis and 0G along the Y and Z axes, then our acceleration is +0.1G on the X axis. 1G is 9.8 m/s^2, so +0.1G is about 0.98 m/s^2; starting from rest, the phone has moved about 0.5 m (1/2 x 0.98 x 1^2) after traveling for 1 second.
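To make that concrete, here is a naive double-integration sketch. All the names are mine, and drift from sensor noise makes this unusable after a second or two, which is exactly the inaccuracy warned about above:

    static const double kG  = 9.81;       // m/s^2 per G
    static const double kDt = 1.0 / 60.0; // assumed sample interval, seconds

    // Called once per sample; _vx.._pz are assumed double instance variables.
    // Gravity should be removed first (e.g. via userAcceleration).
    - (void)integrateSample:(UIAcceleration *)acceleration
    {
        // v += a * dt (readings are in G, so convert to m/s^2 first)
        _vx += acceleration.x * kG * kDt;
        _vy += acceleration.y * kG * kDt;
        _vz += acceleration.z * kG * kDt;

        // p += v * dt
        _px += _vx * kDt;
        _py += _vy * kDt;
        _pz += _vz * kDt;
    }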
Games like FroggyJump for iPhone figure out the rotation of the iPhone. I'm getting confused by the acceleration values. How do I calculate the amount of rotation? I suppose I need to consider when the iPhone isn't perfectly upright.
Thank you.
I also want to use the new Core Motion framework with "Device Motion" on iPhone 4 for extra precision. I guess I'll have to use the low-pass filter for the other devices.
It's the yaw.
Having given Froggy Jump a quick go, I think it's likely using the accelerometer's x value directly as the left/right acceleration on the frog. If the device is stationary, you can think of an accelerometer as giving you the vector that points upward into space, relative to the local axes. For something like a ball rolling, or anything else accelerating due to tilt, you want to use the values directly.
For anything that involves actually knowing angles, you're probably best picking the axis around which you want to detect rotation then using the C function atan2f on the accelerometer values for the other two axes. With just an accelerometer, there are some scenarios in which you can't detect rotation — for example, if the device is flat on a table then an accelerometer can't detect yaw. The general rule is that rotations around the gravity vector can't be detected with an accelerometer alone.
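As an illustration, the tilt about the axis perpendicular to x and y (roll, with the device held in portrait) might be estimated like this - a sketch only, with no smoothing or low-pass filtering applied:

    #include <math.h>

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration
    {
        // Angle of the gravity vector in the device's x/y plane.
        // Upright portrait reads roughly (0, -1), i.e. an angle of 0.
        float angle = atan2f(acceleration.x, -acceleration.y);
        NSLog(@"tilt: %.1f degrees", angle * 180.0f / (float)M_PI);
    }

Near the undetectable case described above (device flat on a table), x and y both approach zero and the result degrades into noise, so you may want to ignore readings where the in-plane magnitude is small.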
Is it possible to determine the speed at which someone is shaking their iPhone? This would be from the time they start moving to the ending point, where they begin heading back toward the origin. Basically it is one swipe that I'd like to measure the speed of. This discussion comments on initial speed: http://discussions.apple.com/message.jspa?messageID=8297689#8297689. It seems that the iPhone lacks the important component of distance needed to get a good measure of speed.
Sure, it sounds like all you'd need to do would be to numerically integrate the acceleration twice to get the distance traveled. For instance, look at
Calculate the position of an accelerating body after a certain time
Note that you'll have to subtract gravity from the measured acceleration to get the kinetic acceleration, which is what you should integrate. As for how to do that (re: GoatRider's comment): I might try storing the last measured acceleration whose magnitude was equal to gravity (I think that's 1 in iPhone units?). Then, for each acceleration measurement whose magnitude is greater than 1, subtract the last known acceleration of gravity - this will need to be a vector subtraction - and use that as the kinetic acceleration. Of course, this assumes that the user keeps the phone in the same orientation throughout the swipe, which I think would be approximately true.
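A rough sketch of that idea - kDt, _lastGravity, and _velocity are assumptions (e.g. CMAcceleration-style structs of doubles), and the 0.05 tolerance for "magnitude equal to gravity" is arbitrary:

    static const double kDt = 1.0 / 60.0; // assumed update interval, seconds

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration
    {
        double mag = sqrt(acceleration.x * acceleration.x +
                          acceleration.y * acceleration.y +
                          acceleration.z * acceleration.z);

        if (fabs(mag - 1.0) < 0.05) {
            // Roughly 1G: treat this as our current gravity estimate.
            _lastGravity.x = acceleration.x;
            _lastGravity.y = acceleration.y;
            _lastGravity.z = acceleration.z;
        } else {
            // Kinetic acceleration = measured - gravity (vector subtraction);
            // integrate once for velocity, in G-seconds (x 9.81 for m/s).
            _velocity.x += (acceleration.x - _lastGravity.x) * kDt;
            _velocity.y += (acceleration.y - _lastGravity.y) * kDt;
            _velocity.z += (acceleration.z - _lastGravity.z) * kDt;
        }
    }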
Unfortunately, there's no technique you can use to distinguish between gravitational acceleration and kinetic acceleration in general - that is, a determined user could always find a way to fool whatever algorithm you might come up with. (Trivia: that's called the equivalence principle, and it's the foundation of Einstein's theory of general relativity)
You'll have to do the calculations yourself. Each acceleration event you receive will tell you the relative G-forces registering on the accelerometer and the time at which the event was recorded. You'll have to sample over several events and interpolate. Here's more info on the acceleration event itself:
UIAcceleration Class Reference