I noticed that if I call the startDeviceMotionUpdates method of the CMMotionManager class, rotate the device, and then put it back on the table, the reported device attitude changes constantly; every time, the reported rotation matrix is different from the last time. Has anyone else noticed the same behavior?
Apparently drift of the gyro reference frame is a common issue, and it happens because the current attitude is computed by integrating rotation rates around the axes over time, so small measurement errors accumulate.
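You can watch the drift yourself with a minimal sketch like the following (assuming a retained CMMotionManager property named motionManager, which is my naming):

    #import <CoreMotion/CoreMotion.h>

    // Log the yaw while the device lies still on the table; the value
    // creeps because attitude is integrated from rotation rates.
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 10.0;
    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (motion) {
            NSLog(@"yaw = %f", motion.attitude.yaw);
        }
    }];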
I remember a talk from WWDC showing a teapot in OpenGL ES that rotated with the movement of the device, so that the teapot appeared to stand still in space.
When the app launched, the teapot started in a specific position. Then, when the device was rotated, the teapot rotated too, so that it stood still in space.
In that talk they mentioned that we must get the "reference frame", e.g. upon app launch, which tells us how the user initially held the device.
For instance, take the accelerometer axes: X points right across the screen, Y points up the screen, and Z points out of the screen toward the user.
I want to know the rotation around the Y axis, but relative to how the user holds the device.
So when the user holds it upright and rotates around Y, I need to know that rotation value.
I think the key is removing gravity from the readings? Also, I'm targeting the iPhone 4 / 4S, which have gyros, but I think Core Motion fuses the sensors automatically.
How could I figure out by how much the user rotated the device around the Y-axis?
From your other question, Why is this CMDeviceMotionHandler never called by CoreMotion?, I know that you're working on iOS 4; things have changed slightly in iOS 5. In general, gyro data, or better yet the sensor fusion of accelerometer and gyro data as done in Device Motion, is the best approach for getting proper results.
So once you have this up and running, you will need CMAttitude's multiplyByInverseOfAttitude: method to get all CMDeviceMotion results relative to your reference frame. Just store the very first CMAttitude in a class member and call multiplyByInverseOfAttitude: with it on all subsequent updates. Then all members of CMDeviceMotion.attitude will refer to this reference frame.
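A minimal sketch of that pattern (referenceAttitude is a hypothetical CMAttitude property of my naming):

    // Capture the first attitude as the reference frame, then express
    // every subsequent attitude relative to it.
    - (void)handleDeviceMotion:(CMDeviceMotion *)motion {
        if (self.referenceAttitude == nil) {
            self.referenceAttitude = motion.attitude; // first sample = reference
            return;
        }
        CMAttitude *attitude = motion.attitude;
        [attitude multiplyByInverseOfAttitude:self.referenceAttitude];
        // attitude.pitch / .roll / .yaw are now relative to the reference frame
        NSLog(@"relative roll = %f", attitude.roll);
    }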
For getting the rotation around the Y axis, a first approach is to take the Euler angles, i.e. CMAttitude.roll. If you just need to track small motions, this might be fine. If the motions are more extensive, you will run into trouble with gimbal lock. Then you need more advanced techniques like quaternion operations to get stable results, but that sounds like a question of its own.
I'm testing the response-time differences between the native accelerometer delegate method and the cocos2d update method (which runs every frame, i.e. 60 times a second at the maximum frame rate), where update reads values stored by the accelerometer callback.
When moving sprites across the screen using just the accelerometer method, they are not smooth. Even though the sprite's position is calculated from the acceleration the same way in both accelerometer and update, the accelerometer method clearly doesn't fire as often, since the sprite moves much more slowly across the screen.
I'm guessing this is because iOS does not natively update the UIAccelerometer anywhere near 60 times a second, so does anyone know where I can find out how often it does?
It's up to you: the UIAccelerometer object has an updateInterval property that defines how often the accelerometer will update. For example:

    UIAccelerometer *accel = [UIAccelerometer sharedAccelerometer];
    accel.updateInterval = 1.0f / 30.0f;

which means 30 updates per second. It can go up to 60.
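For completeness, here is a sketch of the delegate wiring that goes with it (your class must adopt the UIAccelerometerDelegate protocol):

    accel.delegate = self;

    // Called at roughly the requested interval:
    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        NSLog(@"x=%f y=%f z=%f", acceleration.x, acceleration.y, acceleration.z);
    }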
I never really understood the applications of the gyroscope on the iPhone/iPad. Does it serve a similar function to the accelerometer, like an improvement on the accelerometer? What is its practical use?
"An accelerometer is a direct measurement of orientation, while a gyro is a measurement of the time rate of change of orientation." (1) By combing the output of the two sensors, called sensor fusion, one can determine the orientation of the device precisely and fast.
If you only use accelerometer with a low-pass filter, you still get a reasonable estimate for the orientation but it will lag.
Here is an excellent live demo of both (Google Tech Talk), starting at 21:50.
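For reference, the classic accelerometer low-pass filter looks like this sketch (kFilteringFactor is a tuning constant of my choosing, and gravityX/Y/Z are accumulators you keep between callbacks):

    // Exponential low-pass filter to isolate gravity from raw accelerometer
    // samples; a smaller factor gives a smoother but laggier estimate.
    static const double kFilteringFactor = 0.1;
    gravityX = acceleration.x * kFilteringFactor + gravityX * (1.0 - kFilteringFactor);
    gravityY = acceleration.y * kFilteringFactor + gravityY * (1.0 - kFilteringFactor);
    gravityZ = acceleration.z * kFilteringFactor + gravityZ * (1.0 - kFilteringFactor);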
A gyroscope measures rotation, whereas an accelerometer measures linear acceleration. Both have useful applications (gyroscope: which way is my device turning? accelerometer: did I just shake my device?)
The accelerometer tells you the difference in the force being experienced by the device and the force it would experience if it were in free fall. So if the device is static, the accelerometer tells you which way up is. When it's being shaken around, you get a summation of up plus the direction of the shake. Hence the accelerometer can detect some rotation, but not around the gravity vector and only if the device is otherwise static.
The gyroscope tells you the velocity at which the device is being rotated. So you can integrate values coming from it to keep track of orientation. That works across all axes and irrespective of device movement.
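As a rough sketch of that integration (ignoring drift correction; lastTimestamp and angleAroundZ are accumulators of my own naming):

    // Integrate the rotation rate around one axis to track an angle.
    // Without sensor fusion this accumulates gyro bias over time.
    CMRotationRate rate = motion.rotationRate;            // radians per second
    NSTimeInterval dt = motion.timestamp - self.lastTimestamp;
    self.lastTimestamp = motion.timestamp;
    self.angleAroundZ += rate.z * dt;                     // radians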
I've been trying to rotate my view based on the CMAttitude returned from CMMotionManager, specifically mapping pitch to x and roll to y. I'm using a reference attitude to set my horizon.
This works great in portrait mode, but the minute I try to do it in a landscape view it goes wrong.
As the phone is now rotated 90° counterclockwise, I was hoping that Core Motion would know landscape was in place and keep the pitch and roll useful. Instead, the axes still point their original way.
To try to compensate, I simply swapped the mappings, using roll (with its sign flipped) for x and pitch for y.
This appeared to work until I held the device in front of me and turned around 180 degrees. The view spun upside down and inverted.
My spidey sense is telling me I need to apply a proper transformation to the pitch, roll and yaw to reorient the attitude.
I'm hoping some geniuses or genii can help me. Maths is obviously not a strong point of mine.
You are right: swapping pitch and roll will lead to serious trouble. The simplest approach seems to be to work with a new reference attitude, as in the CoreMotionTeapot sample. When the orientation change is detected, grab the current attitude before it gets multiplied by the inverse of your former reference attitude, and set it as the new reference attitude.
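In code, that could look like this sketch (how you detect the orientation change is up to you; motionManager and referenceAttitude are properties of my naming):

    // On an interface-orientation change, take the attitude of the
    // current sample as the new reference frame.
    - (void)interfaceOrientationDidChange {
        CMDeviceMotion *motion = self.motionManager.deviceMotion;
        if (motion) {
            self.referenceAttitude = motion.attitude;
        }
    }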
I'd like to transform the yaw, pitch and roll of the iPhone from the body frame to the world frame, i.e. azimuth, pitch and roll. On Android this is easily done with the SensorManager.remapCoordinateSystem() and SensorManager.getOrientation() methods, as detailed here: http://blog.mysticlakesoftware.com/2009/07/sensor-accelerometer-magnetics.html
Are similar methods available for the iPhone or can someone point me in the right direction how to do this transformation?
Thanks
The accelerometer is good enough to get the gravity direction vector in the device coordinate system, at least once the device has come to rest.
The next step toward full device orientation is to use CLLocationManager to get the true-north vector in the device coordinate system.
With the normalized true-north vector and the gravity vector, you can easily derive all the other directions using dot and cross products.
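A sketch of that construction in plain C vector math (the exact signs depend on your axis conventions, so treat this as an illustration):

    #include <math.h>

    typedef struct { double x, y, z; } Vec3;

    // Standard cross product and normalization helpers.
    static Vec3 cross(Vec3 a, Vec3 b) {
        return (Vec3){ a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
    }
    static Vec3 normalize(Vec3 v) {
        double len = sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
        return (Vec3){ v.x/len, v.y/len, v.z/len };
    }

    // Build an orthonormal device-to-world basis from the gravity and
    // true-north vectors, both expressed in device coordinates.
    static void buildBasis(Vec3 gravity, Vec3 north, Vec3 *east, Vec3 *northH, Vec3 *down) {
        *down   = normalize(gravity);             // points toward the ground
        *east   = normalize(cross(*down, north)); // perpendicular to both
        *northH = cross(*east, *down);            // horizontal north component
    }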
The accelerometer (UIAccelerometer) will give you a vector pointing straight down, expressed in the device's coordinate system. If you can assume that the device is being held fairly steady (i.e., that you're not reading acceleration from actual movement), then you can use simple trig (acos(), asin()) to determine the device's orientation.
If you're worried that the device might be moving, you can wait for several accelerometer readings in a row that are nearly the same. You can also filter out any vector whose length is more than ± TOLERANCE (as you define it) away from 1.0 g.
In more general terms, the device has no way of knowing its orientation, other than by "feeling gravity", which is done via the accelerometer. The challenges you'll have center around the fact that the accelerometer feels all acceleration, of which gravity is only one possible source.
If you're targeting a device with a gyroscope (the iPhone 4 at the time of writing), the Core Motion framework's CMMotionManager can supply you with CMDeviceMotion updates. The framework does a good job of processing the raw sensor data and separating gravity and userAcceleration for you. You're interested in the gravity vector, which defines pitch and roll with a little trig. To add yaw (device rotation around the gravity vector), you'll also need the Core Location framework's CLLocationManager to get compass heading updates.
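For example, a sketch of that trig on the separated gravity vector (the sign conventions here are my guess and may need adjusting for your setup):

    // Derive pitch and roll from CMDeviceMotion's gravity vector;
    // yaw would come from CLLocationManager heading updates.
    CMAcceleration g = motion.gravity;
    double pitch = asin(-g.y);          // tilt of the top edge, radians
    double roll  = atan2(g.x, -g.z);    // tilt around the long axis, radians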