I'm trying to detect a swinging motion with an iPhone 4 using the gyro/accelerometer. I searched for some posts on SO about this, but couldn't find anything specific to my issues.
Do I need to do any sort of calibration for data from the gyroscope/accelerometer?
Anyone think of how I would measure a swinging motion?
Thanks!
1: Most iPhone games using the accelerometer don't do any calibration, but not all iPhones are identical; there is some unit-to-unit variation in accelerometer calibration. You could add a manual or automatic calibration step to your program. If, however, detecting a swinging motion is all you want, calibration isn't necessary.
2: The iPhone SDK includes a nice little sample app (AccelerometerGraph) that graphs accelerometer readings. You can download and build it, watch the measurements for the motion you want, and then write code that detects similar accelerometer readings.
I have been experimenting with the Core Motion framework to detect a user spinning around, say on a merry-go-round, holding an iPhone in his hand.
There are ways to detect the device motion around its own axes, but what is a good way to detect the iPhone spinning in circles?
Thanks
You can use the gyroscope. Take a look here: Gyroscope example
Bear in mind that the gyroscope is only available on the iPhone 4 and iPhone 4S.
There is one degenerate case where you can run into trouble, only magnetometer (compass) can help in that particular case.
If you put the device (a) stationary on a desk or (b) on a perfectly horizontal turntable rotating slowly, you get the same qualitative sensor readings: both the gyro and the accelerometer readings are constant in the two cases, although they differ quantitatively. The sad part is that gyro bias error can make case (a) look like case (b) and vice versa. In this particular case you need a compass to cancel the gyro drift. Case (a) is typical for a phone.
Apart from this degenerate case, gyroscopes and accelerometers with sensor fusion are sufficient to track arbitrary rotations of the device.
Morning,
I have hunted around Stack Overflow for about an hour and found lots of sample code (mainly on GitHub) for creating augmented reality apps that display where a second location (e.g. New York) is relative to your current location.
However, I noticed that none of these use the gyroscope introduced with the iPhone 4, which gives a far smoother experience to the end user.
Does anyone know if such an example of sample code exists?
Cheers,
Charlie
You can definitely use CoreMotion to get data from the gyro. The basic approach would be to get CMAttitude.rotationMatrix and multiply its inverse (transpose) by a reference matrix which you initially set. The Teapot sample project on developer.apple.com shows the basic approach of working with CoreMotion.
For a true augmented reality app you will need to create a model using OpenGL ES. I personally found v1.1 to be more reliable on iOS, after having tried GL ES 2.0. The Teapot sample also uses GLES 1.1.
Using the gyro is much more accurate and "smooth" than using the magnetometer for getting the device's rotation around its reference axis. The trick is how to initially calibrate the reference matrix in order to get the true "heading" of the device and to place your GL ES model objects in the correct position around the camera. After you have achieved that, you can rotate your model in 3D by multiplying GL's view matrix by the inverse of CMAttitude.rotationMatrix.
Lastly, if you intend to support the iPhone 3GS as well, don't forget to check the gyroAvailable property of CMMotionManager and provide an alternative implementation using the magnetometer.
You can try using CMMotionManager instance methods
startDeviceMotionUpdatesToQueue:withHandler: or startGyroUpdatesToQueue:withHandler:
[CMMotionManagerObject startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                           withHandler:^(CMDeviceMotion *devMotion, NSError *error) {
    CMAttitude *currentAttitude = devMotion.attitude;
    xRotation = currentAttitude.roll  * 180 / M_PI;   // radians to degrees
    yRotation = currentAttitude.pitch * 180 / M_PI;
    zRotation = currentAttitude.yaw   * 180 / M_PI;
}];
If you use startGyroUpdatesToQueue:withHandler: you can get the result through the gyroData property.
I don't know of any code sample, unfortunately.
A problem common to all AR apps is that you need to find out the orientation of your device. You can do that with atan2 and the accelerometer, but it has an unholy amount of noise (as seen in Apple's AccelerometerGraph sample project). If you try to fix it with an adaptive low pass filter you reduce the noise but you also make it less responsive.
The gyro doesn't have noise but error accumulates fast enough that you have to constantly reset the position using the accelerometer. It seems good to rotate an object, but not to replace the compass.
I'm currently trying to implement an augmented reality iPhone application (iOS 4.2) that uses accelerometer data to translate and rotate an OpenGL object on the screen. I have already succeeded in getting the object to respond to the phone's rotation, but that was always going to be the easy part.
For the translational part, I've tried implementing some of the techniques from this paper (http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf), but it's still not very accurate. I'm in the process of implementing a Kalman filter to filter the accelerometer data.
Has anyone had any luck in determining phone translational movement? If so, how accurate did you get it, and what techniques did you use to obtain this accuracy?
Has anyone developed 6DOF pose estimation using only the iPhone sensors, not video? Drift from the accelerometer and gyroscope is understood.
The gyroscope provides fairly reliable relative orientation rates. I've been able to develop with the gyroscope data.
However, I'm having more problems deriving translation from the accelerometer. Double integration of the acceleration leads to useless position data very quickly (less than half a second).
I have attempted to remove the bias with a calibration step, but the position estimate is still poor. What's worse, the bias isn't constant: it changes over time, and the noise drowns out the signal.
I'm interested if anyone has been able to develop a 6DOF with only the accelerometer and gyroscope that works reliably for 5-10 seconds with little drift in both translation and orientation.
The gyro yaw from DeviceMotion drifts when you first start updates; skip those initial samples and everyone will be happy.
I made a post about this: Get orientation device in the iPhone for OpenGL ES. I'm having the same issue; I'm trying to make a filter, but it's not working well. There is a book about this, http://www.amazon.com/iOS-Sensor-Programming-Augmented-Location/dp/1449382657, but I haven't read it.
Is there a way to detect if an iPhone lying face up on a table is rotating? I do realize that this kind of movement is not reported by the accelerometer, and neither is it reported to the - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation method.
Is there a way to detect angle variations for a phone rotating this way? Thank you.
The gravity vector stays constant as the phone rotates on a flat table, so you won't see anything on the accelerometers. You could follow compass heading changes to detect this rotation, but only on an iPhone 3GS or later. See CLLocationManager for details; look at the heading methods.
EDIT - With an iPhone 4 you can detect the rotation using the gyros. There is a new class in iOS 4 called CMMotionManager for getting rotation rate from the gyros.
When the phone is stationary the magnitude of the acceleration vector should be 1 g. When the phone is rotating (assuming the sensor is off-center) the magnitude should be more than 1 g and (hopefully) roughly constant.
If you look at the decay of that curve, I wouldn't be surprised if that shape is distinctive enough to be used to determine whether the phone is rotating or not.
This is the AccelerometerGraph sample app from Apple.
I guess you could do it if the iPhone has a compass. Other than that I don't think it will be possible or reliable.
This would really depend on the location of the accelerometer on the device. I just tested this using the AccelerometerGraph sample application on a second-generation iPod touch, and you can see the initial acceleration on the x and y axes (the 2G does not have the accelerometer in the center of the device, I guess). So in a sense you could detect the rotation; however, I think the challenge would be differentiating that acceleration from directional acceleration. And I'm sure the values would change if Apple placed the accelerometer in different locations on different models. There would definitely not be any way of doing it via shouldAutorotateToInterfaceOrientation. I recommend you load the AccelerometerGraph sample application from the SDK and experiment with the acceleration vectors to see if you can isolate a rotation vector reliably on multiple devices.