I've been trying to rotate my view based on the CMAttitude returned from CMMotionManager, specifically pitch (x) and roll (y). I'm using a reference attitude to set my horizon.
This works great in portrait mode, but the minute I try to do it for a landscape view it goes wrong.
As the phone is now rotated 90° counterclockwise, I was hoping CoreMotion would know landscape was in effect and keep pitch and roll useful. Instead, the axes still point their original way.
To try to compensate, I simply changed the sign on roll (x) and swapped in pitch (y).
This appeared to work until I held the device in front of me and then turned around 180 degrees. The view spun upside down and inverted.
My spidey sense tells me I need to apply a proper transformation to pitch, roll, and yaw to reorient the attitude.
I'm hoping some geniuses or genii can help me. Maths is obviously not a strong point of mine.
You are right: swapping pitch and roll will lead to serious trouble. The simplest way seems to be working with a new reference attitude, as in the CoreMotionTeapot sample. When an orientation change is detected, grab the current attitude (before multiplying it by your former reference attitude) and set it as the new reference attitude.
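A minimal sketch of that approach, assuming a class with motionManager and referenceAttitude members (both names are illustrative, not from the sample):

```objc
#import <CoreMotion/CoreMotion.h>

// Somewhere during setup, e.g. viewDidLoad:
//   [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
//   [[NSNotificationCenter defaultCenter] addObserver:self
//                                            selector:@selector(orientationChanged:)
//                                                name:UIDeviceOrientationDidChangeNotification
//                                              object:nil];

- (void)orientationChanged:(NSNotification *)notification
{
    // Grab the raw attitude at the moment the interface rotates and make
    // it the new reference frame, mirroring the CoreMotionTeapot approach.
    CMAttitude *attitude = self.motionManager.deviceMotion.attitude;
    if (attitude != nil) {
        self.referenceAttitude = attitude;
    }
}

- (void)handleDeviceMotion:(CMDeviceMotion *)motion
{
    CMAttitude *attitude = motion.attitude;
    // Express the attitude relative to the stored reference frame
    // (assumes referenceAttitude has already been captured).
    [attitude multiplyByInverseOfAttitude:self.referenceAttitude];
    // attitude.pitch / .roll / .yaw now refer to the horizon captured
    // at the last orientation change.
}
```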
I have an idea for an iPhone game / app that needs to be able to track the height position of the iPhone. I am new to iPhone development, so I don't know how the accelerometer works. But the idea is that the user places the iPhone on a flat surface (with the iPhone's back against the surface). The user will then lower and raise the surface periodically, and the iPhone should be able to track this movement. We can assume that the surface will return to its original position, so we only care about how much it was lowered / raised from its original position during the movement.
The amount raised / lowered will be a few centimeters. Is this possible to track, and how would you go about solving it?
Thank you very much for your help!
Best regards,
Lukas
This is not possible to track directly. However, the accelerometer data can be used to approximate it. Acceleration is the time derivative of velocity, which in turn is the time derivative of position. By integrating the acceleration twice, you can track position.
Caveat though: this will probably not be very accurate, with significant drift errors.
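To make that concrete, here is a deliberately naive double-integration sketch using CMDeviceMotion's userAcceleration (which already has gravity removed). The member names are assumptions, it requires a gyro-equipped device, and the drift mentioned above will swamp the result within seconds:

```objc
#import <CoreMotion/CoreMotion.h>

// Assumed instance variables:
//   double velocity, position;
//   NSTimeInterval lastTimestamp;

[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error) {
    if (lastTimestamp == 0) {
        lastTimestamp = motion.timestamp;
        return;
    }
    NSTimeInterval dt = motion.timestamp - lastTimestamp;
    lastTimestamp = motion.timestamp;

    // The phone lies flat on its back, so the surface's vertical movement
    // shows up on the Z axis. userAcceleration is in units of g;
    // convert to m/s^2.
    double az = motion.userAcceleration.z * 9.81;

    velocity += az * dt;        // first integration: acceleration -> velocity
    position += velocity * dt;  // second integration: velocity -> position
    // `position` now approximates vertical displacement in meters,
    // but drift errors accumulate quickly.
}];
```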
You can also track orientation with the magnetometer, and you can use the camera to "watch" the environment. This suggests the possibility of fixing the position by triangulation.
I don't expect that to be easy though.
I remember from WWDC that there was a talk showing a teapot in OpenGL ES that rotated with the movement of the device, so that it appeared to stand still in space.
When the app launched, the teapot started in a specific position. Then, when the device was rotated, the teapot rotated with it so as to stand still in space.
In that talk, they mentioned that we must get the "reference frame", e.g. upon app launch, which tells us how the user initially held the device.
For instance, here are the accelerometer axes: X points to the right of the device, Y points up through the top of the screen, and Z points out of the screen toward the user.
I want to know the rotation around the Y axis, but relative to how the user holds the device.
So when the user holds it upright and rotates around Y, I need to know that rotation value.
I think the key is removing gravity from the readings? Also, I'm targeting the iPhone 4 / 4S, which have gyros, but I think CoreMotion fuses the sensors automatically.
How could I figure out how much the user has rotated the device around the Y axis?
From your other question, Why is this CMDeviceMotionHandler never called by CoreMotion?, I know that you're working on iOS 4 - things have changed slightly in iOS 5. In general, gyro data - or, even better, the sensor fusion of accelerometer and gyro data done in DeviceMotion - is the best approach for getting proper results.
So once you have this up and running, you will need to work with CMAttitude's multiplyByInverseOfAttitude method to get all CMDeviceMotion results relative to your reference frame. Just store the very first CMAttitude in a class member and call multiplyByInverseOfAttitude with it on all subsequent calls. Then all members of CMDeviceMotion.attitude will refer to this reference frame.
For getting the rotation around the Y axis, a first approach is to take the Euler angles, i.e. CMAttitude.roll. If you just need to track small motions, this might be fine. If motions are more extensive, you will run into trouble with gimbal lock. Then you will need advanced techniques like quaternion operations to get stable results, but that sounds like a question of its own.
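Putting both steps together, a rough sketch (referenceAttitude is an assumed class member, nil at launch):

```objc
#import <CoreMotion/CoreMotion.h>

[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error) {
    if (self.referenceAttitude == nil) {
        // Capture the very first attitude as the reference frame,
        // i.e. how the user initially held the device.
        self.referenceAttitude = motion.attitude;
        return;
    }
    CMAttitude *attitude = motion.attitude;
    [attitude multiplyByInverseOfAttitude:self.referenceAttitude];
    // The Euler angles are now relative to the initial frame; for small
    // motions, roll approximates the rotation around the device's Y axis.
    double rotationAroundY = attitude.roll;
    // ... drive your scene with rotationAroundY ...
}];
```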
I noticed that if I call the startDeviceMotionUpdates method of the CMMotionManager class, then rotate the device and put it back on the table, the reported device attitude changes constantly; every time, the reported rotation matrix is different from the last. Has anyone else noticed the same behavior?
Apparently, drift of the gyro reference frame is a common issue; it arises because the current attitude is computed by integrating rotation rates around the axes over time.
I don't have an iPhone 4 with me right now, and I am trying to find documentation that shows the ranges of yaw, pitch, and roll and the corresponding positions of the device.
I know that the accelerometer varies from -1 to +1, but my tests yesterday on my iPhone showed that roll varies from -M_PI to +M_PI, while the yaw and pitch ranges are half of that. Is this correct?
Where do I find documentation about these ranges? I don't see it in Apple's vague docs.
Thanks.
This is not a full answer, but in the interest of starting the ball rolling:
I'm assuming you are talking about the device attitude rather than the raw gyro data.
Anecdotally (I have an iPod touch (4th gen) sitting in front of me displaying these values):
pitch: looks to be a range of -(M_PI/2) -> +(M_PI/2) although mine caps at ~ +1.55 / -1.51
roll: -M_PI -> +M_PI
yaw: -M_PI -> +M_PI
Just a note: at least on my device, pitch doesn't differentiate tilting "forward" vs. "backward"; it just gives the angle of the device relative to the direction of gravity. To figure out whether the screen is pointing down or up, you can of course check gravity.z.
If you're using CMDeviceMotion, there is a property called gravity on it; just grab gravity.z. It will be negative if the device's display is tilting upward (away from gravity) and positive if the display is facing down (toward gravity).
Note that the algorithms used by CMDeviceMotion are pretty good at separating gravity from user acceleration, but under certain kinds of motion there may be some lag before the values become correct. I would love to hear from someone with a better solution.
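For illustration, a minimal check along those lines (motionManager is an assumed, already-started CMMotionManager):

```objc
CMDeviceMotion *motion = self.motionManager.deviceMotion;
if (motion != nil) {
    double gz = motion.gravity.z;
    if (gz < 0.0) {
        // The display tilts upward: the screen faces away from gravity.
    } else if (gz > 0.0) {
        // The display faces downward, toward gravity.
    }
    // |gz| near 1 means the device is lying flat; near 0 means it is on edge.
}
```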
I recently faced the same problem for an iOS app that counts the number of flips the phone does. Apple rejected it, so I have published it on GitHub; it may be useful for you:
Flip Your Phone! - https://github.com/apascual/flip-your-phone
I never thought of the solution using the gravity Z value; I will try it soon and come back with some updates.
I am developing a game where I want to move a UIImageView based on the accelerometer. When I rotate the iPhone from left to right or right to left, the UIImageView has to rotate by a particular angle. It does move, but the problem occurs when I play a background sound: because of that sound, the accelerometer reports some acceleration even when my iPhone is idle.
So my UIImageView moves as well, which should not happen. When I decrease the iPhone's volume, it works very well. What should I do about this?
Also, does anyone know how to get acceleration readings only when the iPhone moves from left to right or right to left? It should not detect motion in the XZ or YZ plane.
If anybody knows the solution, please reply.
Have you got any filtering on the input from the accelerometer? I would expect the noise the accelerometer picks up from the speaker to be vastly different in amplitude and frequency from the game-control motion.
There is a simple low-pass filter in Apple's AccelerometerGraph sample code.
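Along the lines of that sample, a basic low-pass filter looks something like this; the kFilteringFactor of 0.1 is a typical starting value, not a prescription:

```objc
// Smaller values filter more aggressively (slower response, less noise).
#define kFilteringFactor 0.1

// Assumed instance variables holding the smoothed readings:
//   UIAccelerationValue filteredX, filteredY, filteredZ;

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration
{
    // Blend each raw sample into a running average; high-frequency noise
    // (such as speaker vibration) is smoothed away, while slower deliberate
    // tilting of the device passes through.
    filteredX = (acceleration.x * kFilteringFactor) + (filteredX * (1.0 - kFilteringFactor));
    filteredY = (acceleration.y * kFilteringFactor) + (filteredY * (1.0 - kFilteringFactor));
    filteredZ = (acceleration.z * kFilteringFactor) + (filteredZ * (1.0 - kFilteringFactor));

    // Drive the UIImageView from the filtered values instead of the raw ones.
}
```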