I don't have an iPhone 4 with me right now, and I am trying to find documentation that shows the ranges of yaw, pitch, and roll and the corresponding positions of the device.
I know that the accelerometer values vary from -1 to +1, but my tests yesterday on my iPhone showed that roll varies from -M_PI to +M_PI, while the yaw and pitch ranges are only half of that. Is this correct?
Where do I find documentation about these ranges? I can't find it in Apple's rather vague docs.
Thanks.
This is not a full answer, but in the interest of starting the ball rolling:
I'm assuming you are talking about the device attitude rather than the raw gyro data.
Anecdotally (I have a 4th-generation iPod touch sitting in front of me displaying these values):
pitch: looks to be a range of -(M_PI/2) -> +(M_PI/2) although mine caps at ~ +1.55 / -1.51
roll: -M_PI -> +M_PI
yaw: -M_PI -> +M_PI
Just a note: at least on my device, pitch doesn't differentiate tilting "forward" vs. "backward"; it just gives the angle of the device relative to the direction of gravity. To figure out whether the screen is pointed down or up, you can of course check gravity.z.
If you're using CMDeviceMotion, there is a property on it called gravity; just grab gravity.z. It will be negative if the device's display is tilting upward (away from gravity) and positive if the display is facing down (toward gravity).
Note that the algorithms used by CMDeviceMotion are pretty good at separating gravity from user acceleration, but under certain kinds of motion there may be some lag before the values become correct. I would love to hear from someone with a better solution.
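For reference, here is a minimal sketch of reading these values with CMMotionManager on a gyro-equipped device (iOS 4+); the motionManager property, the update interval, and the logging are just illustrative:

    #import <CoreMotion/CoreMotion.h>

    // Assumes a strong motionManager property declared on the class (e.g. a view controller).
    - (void)startWatchingAttitude {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 1.0 / 30.0;

        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (error) return;
            // Attitude: roll and yaw span roughly [-pi, +pi], pitch roughly [-pi/2, +pi/2].
            NSLog(@"roll: %f  pitch: %f  yaw: %f",
                  motion.attitude.roll, motion.attitude.pitch, motion.attitude.yaw);
            // gravity.z is negative while the display faces up, positive while it faces down.
            NSLog(@"gravity.z: %f", motion.gravity.z);
        }];
    }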
I recently faced the same problem for an iOS app that counts the number of flips the phone does. Apple rejected it, so I have published it on GitHub; it may be useful for you:
Flip Your Phone! - https://github.com/apascual/flip-your-phone
I never thought of the solution using the gravity Z variable; I will try it soon and come back with some updates.
I have an idea for an iPhone game / app that needs to be able to track the height position of the iPhone. I am new to iPhone development, so I don't know how the accelerometer works. The idea is that the user places the iPhone on a flat surface (with the iPhone's back against the surface). The user will then lower and raise the surface periodically, and the iPhone should be able to track this movement. We can assume that the surface goes back to its original position, so we only care about how much it was lowered / raised from its original position during the movement.
The amount raised / lowered will be a few centimeters. Is this possible to track, and how would you go about solving it?
Thank you very much for your help!
Best regards,
Lukas
This is not possible to track directly. However, the accelerometer data can be used to approximate it. Acceleration is the time-derivative of velocity, which is the time-derivative of position. By integrating the acceleration twice, you can track position.
Caveat though: this will probably not be very accurate, with significant drift errors.
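To illustrate the double integration, here is a naive sketch using CMDeviceMotion's userAcceleration. It assumes the device stays flat so that its z axis is vertical; the motionManager property, the _velocityZ / _positionZ ivars, and the update rate are made up for the example, and the drift caveat above applies in full:

    #import <CoreMotion/CoreMotion.h>

    // Naive vertical-position tracker: integrate user acceleration twice.
    - (void)startTrackingHeight {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 0.01;              // 100 Hz
        const double dt = self.motionManager.deviceMotionUpdateInterval;
        const double g  = 9.81;   // Core Motion reports acceleration in g; convert to m/s^2

        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (error) return;
            // The device lies flat on the surface, so its z axis is vertical
            // and userAcceleration already has gravity removed.
            double az = motion.userAcceleration.z * g;
            _velocityZ += az * dt;          // acceleration -> velocity
            _positionZ += _velocityZ * dt;  // velocity -> position (meters from start)
        }];
    }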
Now, you can also track orientation with the magnetometer, and you can use the camera to "watch" the environment. This suggests the possibility of fixing the position by triangulation.
I don't expect that to be easy though.
I remember a talk from WWDC showing a teapot in OpenGL ES that rotated with the movement of the device, so that the teapot appeared to stand still in space.
When the app launched, the teapot started in a specific position. Then, when the device was rotated, the teapot rotated too, so that it kept appearing to stand still in space.
At this talk, they mentioned that we must get the "reference frame" e.g. upon app launch, which tells us how the user initially held the device.
For instance, here are the accelerometer axes:
I want to know rotation around Y axis, but relative to how the user holds the device.
So when the user holds it upright and rotates around Y, I need to know that rotation value.
I think the key is removing gravity from the readings? Also, I'm targeting the iPhone 4 / 4S, which have gyros, but I think CoreMotion would sensor-fuse them automatically.
How could I figure out by how much the user rotated the device around the Y-axis?
From your other question, Why is this CMDeviceMotionHandler never called by CoreMotion?, I know that you are working on iOS 4 - things have changed slightly in iOS 5. In general, gyro data or, even better, the sensor fusion of accelerometer and gyro data done in DeviceMotion is the best approach for getting proper results.
So once you have this up and running, you will need to work with CMAttitude's multiplyByInverseOfAttitude: method to get all CMDeviceMotion results relative to your reference frame. Just store the very first CMAttitude in a class member and call multiplyByInverseOfAttitude: with it on all subsequent calls. Then all members of CMDeviceMotion.attitude will refer to this reference frame.
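A sketch of that pattern, assumed to run inside a device-motion handler, with a referenceAttitude property that you declare yourself:

    // Inside the CMDeviceMotion handler; referenceAttitude is an assumed CMAttitude property.
    if (self.referenceAttitude == nil) {
        self.referenceAttitude = motion.attitude;   // remember how the device was held at start
        return;
    }
    CMAttitude *attitude = motion.attitude;
    [attitude multiplyByInverseOfAttitude:self.referenceAttitude];
    // attitude.roll / .pitch / .yaw are now relative to the initial reference frame.
    double rotationAroundY = attitude.roll;   // first approximation; see the gimbal-lock caveat below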
For getting the rotation around the Y axis, a first approach is to take the Euler angles, i.e. CMAttitude.roll. If you just need to track small motions this might be fine. If motions are more extensive, you will run into trouble with gimbal lock. Then you need more advanced techniques like quaternion operations to get stable results, but that sounds like a question of its own.
I'm working on an iPhone app for motorcyclists that will detect a crash after it has occurred. Currently we're in the data-acquisition phase, plotting graphs and looking at the data. What I need to log is the forward user acceleration and the tilt angle of the bike relative to the bike standing upright on the road.

I can get the magnitude of the user acceleration, i.e. how hard the rider is accelerating in the direction of travel, as the square root of the sum of the squared x, y and z accelerometer values. But for the tilt angle I need a reference that is constant, so I thought: let's use the gravity vector.

Now, I realize that the deviceMotion API has gravity and userAcceleration values. Where do these values come from and what do they mean? If I take the square root of the sum of the squared x, y and z components of gravity, will that always give me my up direction? How can I use that to find the tilt angle of the bike relative to an upright bike on the road? Thanks.
Setting aside "why" do this...
You need a very low-pass filter. Once the phone is put wherever-it-rides on the bike, you'll see various accelerations from maneuvers, with the acceleration from gravity ever present in the background. That gives you an ongoing vector for "down", and you can then interpret the accelerometer data in that context. Forward acceleration would tip the bike the opposite way from braking, so I think you could sort out the forward direction in real time too.
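As a sketch, such a filter over the raw accelerometer readings could look like this, using the pre-CoreMotion UIAccelerometer delegate; kFilteringFactor is only a tuning guess, and gravityX/Y/Z are assumed ivars:

    #import <UIKit/UIKit.h>

    // Exponential low-pass filter over the raw accelerometer signal: the slowly
    // changing gravity component survives, maneuvering accelerations are smoothed out.
    static const double kFilteringFactor = 0.05;

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        gravityX = acceleration.x * kFilteringFactor + gravityX * (1.0 - kFilteringFactor);
        gravityY = acceleration.y * kFilteringFactor + gravityY * (1.0 - kFilteringFactor);
        gravityZ = acceleration.z * kFilteringFactor + gravityZ * (1.0 - kFilteringFactor);
        // (gravityX, gravityY, gravityZ) now approximates the "down" vector in
        // device coordinates, which is the ongoing reference described above.
    }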
Very interesting idea.
Assuming that it's not a "joke question", you will need a reference point to compare with, i.e. the position captured when the user taps "start". Then you can use acos(currentGravity.z / |referenceGravity|) with |referenceGravity| == 1, because Core Motion measures accelerations in g.
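As a sketch of the more general form of that formula (a dot product between the current gravity and the reference gravity captured at "start"; referenceGravity is an assumed CMAcceleration property, and the code is meant to live inside a device-motion handler):

    #include <math.h>

    // Core Motion reports gravity in g, so both vectors have magnitude ~1 and the
    // dot product is already the cosine of the lean angle.
    CMAcceleration g0 = self.referenceGravity;   // captured when the rider tapped "start"
    CMAcceleration g1 = motion.gravity;
    double cosTilt = g0.x * g1.x + g0.y * g1.y + g0.z * g1.z;
    double tiltRadians = acos(fmax(-1.0, fmin(1.0, cosTilt)));   // clamp against rounding
    double tiltDegrees = tiltRadians * 180.0 / M_PI;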
But to be honest there are a couple of problems, for instance:
The device has to be in a fixed position when taking the reference frame; if you put it in a pocket and it moves around even a little bit inside, your measurement is rubbish
Hmm, the driver is dead but the device is alive? Chances are good that the iPhone won't survive either
If an app goes to the background Core Motion falls asleep and stops delivering values
It would have to be an in-house app, because you can forget about getting approval for the App Store
Or did we misunderstand you and it's just a game?
Since this is not a joke, I would like to address the mounting issue. How you interpret the data depends largely on how the iPhone is positioned. Some issues might not be apparent to those who don't actually ride motorcycles.
This particularly matters when going around curves/corners. In low-speed turns the motorcycle leans but the rider does not, or leans only slightly. In higher-speed turns both the rider and the motorcycle lean. This could present an issue if not addressed. I won't cover all scenarios, but...
For example, most modern textile motorcycle jackets have a cell-phone pocket just inside on the left. If the user were to put their phone in this pocket, you could expect to see only 'accelerating' and 'braking' (~z) acceleration. In this scenario you would almost never see significant side-to-side (~x) acceleration, because the rider leans proportionally into the g-force of the turn. So while going around a curve one would expect to see an increase in (y) down from its general 1 g state. Essentially the rider's torso is indexed to gravity as far as (x) measurements go.
If the device were mounted to the bike you would have to adjust for what you would expect to see given that mounting point.
As far as the heuristics of the crash-detection algorithm go, they are very hard to define. Some crashes are like the ones you see on television, the bike flipping and ripping into a million pieces; that kind of crash should be extremely easy to detect: 3 g's measured, crash! But what about simple downs (the bike lays on its side, oops, the rider gets up, picks up the bike and rides away)? They might occur without any particularly remarkable g-forces (with the exception of about 1 g left or right on the x axis).
A couple more suggestions:
Sensitivity adjustment, maybe even with some sort of learning mode (where the user puts the device in this mode and rides; the device then records/learns average riding for that user)
An "I've stopped" or similar button; maybe the rider didn't crash, maybe he/she just broke down, it does happen and since you have some sort of ad-hoc network setup it should be easy to spread the news.
I'd like to transform the yaw, pitch and roll of the iPhone from the body frame to the world frame, i.e. azimuth, pitch and roll. On Android this is easily done with the
SensorManager.remapCoordinateSystem(), SensorManager.getOrientation methods as detailed here: http://blog.mysticlakesoftware.com/2009/07/sensor-accelerometer-magnetics.html
Are similar methods available for the iPhone, or can someone point me in the right direction on how to do this transformation?
Thanks
The accelerometer is good enough to get the gravity direction vector in the device coordinate system. That is, in the case where the device has come to rest.
The next step for full device orientation is to use CLLocationManager and get the true-north vector in the device coordinate system.
With the normalized true-north vector and the gravity vector you can easily get all other directions using dot and cross products.
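A sketch of that vector math in plain C: it assumes both input vectors are normalized, expressed in device coordinates, and that the north vector has already been made orthogonal to gravity; all names are illustrative:

    typedef struct { double x, y, z; } Vec3;

    static Vec3 Cross(Vec3 a, Vec3 b) {
        Vec3 c = { a.y * b.z - a.z * b.y,
                   a.z * b.x - a.x * b.z,
                   a.x * b.y - a.y * b.x };
        return c;
    }

    static double Dot(Vec3 a, Vec3 b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Given normalized "down" (gravity) and true-north vectors in device
    // coordinates, build the east axis and project any device-frame vector v
    // into north/east/down world coordinates.
    static void DeviceToWorld(Vec3 down, Vec3 north, Vec3 v,
                              double *n, double *e, double *d) {
        Vec3 east = Cross(down, north);   // completes a right-handed frame
        *n = Dot(v, north);
        *e = Dot(v, east);
        *d = Dot(v, down);
    }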
The accelerometer (UIAccelerometer) will give you a vector from the device's accelerometer chip straight down. If you can assume that the device is being held fairly steady (i.e., that you're not reading acceleration from actual movement), then you can use simple trig (acos(), asin()) to determine the device's orientation.
If you're worried that the device might be moving, you can wait for several accelerometer readings in a row that are nearly the same. You can also filter out any vector whose length differs from 1.0 by more than a TOLERANCE you define.
In more general terms, the device has no way of knowing its orientation, other than by "feeling gravity", which is done via the accelerometer. The challenges you'll have center around the fact that the accelerometer feels all acceleration, of which gravity is only one possible source.
If you're targeting a device with a gyroscope (iPhone 4 at the time of writing), the CoreMotion framework's CMMotionManager can supply you with CMDeviceMotion updates. The framework does a good job of processing the raw sensor data and separating gravity and userAcceleration for you. You're interested in the gravity vector, which can define the pitch and roll with a little trig. To add yaw, (device rotation around the gravity vector) you'll also need to use the CoreLocation framework's CLLocationManager to get compass heading updates.
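For illustration, pitch and roll can be pulled out of the gravity vector with a little trig, and yaw taken from the compass. This is a sketch with one common angle convention (not the only one); motionManager and locationManager are assumed to be already-running instances with device-motion and heading updates started elsewhere:

    #import <CoreMotion/CoreMotion.h>
    #import <CoreLocation/CoreLocation.h>
    #include <math.h>

    CMAcceleration gravity = motionManager.deviceMotion.gravity;

    // Pitch about the device x axis, roll about the y axis, both read off
    // the direction gravity points in device coordinates (radians).
    double pitch = atan2(-gravity.y, -gravity.z);
    double roll  = atan2(-gravity.x,
                         sqrt(gravity.y * gravity.y + gravity.z * gravity.z));

    // Yaw (rotation around the gravity vector) comes from the compass instead.
    double yawDegrees = locationManager.heading.trueHeading;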
Is there a way to detect if an iPhone lying on a table face up is rotating? I do realize that this kind of movement is not reported by the accelerometer, and neither is it reported to the - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation method.
Is there a way to detect angle variations for the phone rotating this way? Thank you.
The gravity vector will be constant as the phone rotates on a flat table, so you won't see anything on the accelerometers. You could follow compass heading changes to detect this rotation, but only on an iPhone 3GS. See CLLocationManager for details; look at the heading methods.
EDIT - With an iPhone 4 you can detect the rotation using the gyros. There is a new class in iOS 4 called CMMotionManager for getting rotation rate from the gyros.
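A sketch of the gyro approach (rotationRate is in radians per second; the motionManager property and the 0.1 rad/s threshold are only illustrative):

    #import <CoreMotion/CoreMotion.h>
    #include <math.h>

    - (void)startWatchingForTableSpin {
        self.motionManager = [[CMMotionManager alloc] init];
        if (!self.motionManager.gyroAvailable) return;
        self.motionManager.gyroUpdateInterval = 1.0 / 60.0;

        [self.motionManager startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMGyroData *gyroData, NSError *error) {
            if (error) return;
            // Lying flat on the table, spinning the phone shows up as rotation
            // around the device z axis (radians per second).
            double spin = gyroData.rotationRate.z;
            if (fabs(spin) > 0.1) {
                NSLog(@"Rotating on the table at %f rad/s", spin);
            }
        }];
    }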
When the phone is stationary, the magnitude of the acceleration vector should be 1 (gravity alone). When the phone is rotating (assuming the sensor is off-center), the magnitude should be more than 1, thanks to the added centripetal acceleration, and (hopefully) somewhat constant.
If you look at the decay of that curve, I wouldn't be surprised if its shape is distinctive enough to be used to determine whether the phone is rotating or not.
This is the AccelerometerGraph sample app from Apple.
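As a quick illustration of the magnitude check described above, computed per reading in the pre-CoreMotion UIAccelerometer delegate callback (a sketch; what threshold counts as "rotating" is up to you):

    #include <math.h>

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        // Magnitude of the measured acceleration, in g.
        double magnitude = sqrt(acceleration.x * acceleration.x +
                                acceleration.y * acceleration.y +
                                acceleration.z * acceleration.z);
        // ~1.0 while the phone sits still; persistently above 1.0 while it spins
        // (centripetal acceleration on an off-center sensor) is the hint described above.
        NSLog(@"|a| = %f g", magnitude);
    }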
I guess you could do it if the iPhone has a compass. Other than that I don't think it will be possible or reliable.
This would really depend on the location of the accelerometer on the device. I just tested this using the AccelerometerGraph sample application on a 2nd-generation iPod touch, and you can see the initial acceleration on the x and y axes (the 2nd generation does not have the accelerometer in the center of the device, I guess). So in a sense you could detect the rotation; however, I think the challenge would be differentiating that acceleration from directional acceleration. And I'm sure the values would change if Apple placed the accelerometer in different locations on different models.

There would definitely not be any way of doing it via shouldAutorotateToInterfaceOrientation:. I recommend you load the AccelerometerGraph sample application from the SDK and experiment with the acceleration vectors to see if you can isolate a rotation vector reliably on multiple devices.