Spinning an iPhone flat on the ground - iphone

I am rotating the iPhone while it lies flat on the ground,
but the UIAcceleration values (x, y, z) stay the same. Can you tell me exactly what is happening here?
I would also like to know how the UIAcceleration values (x, y, z) are calculated.
I am really interested in working with UIAcceleration values.
Can anyone help me?
Thanks in advance.

UIAcceleration can only report acceleration relative to the orientation of the device, i.e. along the device's own axes.
When you spin the iPhone flat on the ground, in its frame of reference there is a constant gravity force (along the z axis) and a constant centrifugal force (in the xy-plane), so the UIAcceleration values will not change.
With the accelerometer alone you cannot distinguish gravity, linear acceleration, and rotation. You need another sensor, e.g. the compass (magnetometer), to detect a rotational change.
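To see this for yourself you can log the raw values. Below is a minimal sketch using the UIAccelerometer delegate API of that era (since superseded by CoreMotion); the values are reported in g along the device's own axes, which is why spinning the phone flat on the ground leaves them essentially unchanged:

    #import <UIKit/UIKit.h>

    @interface AccelerationLogger : NSObject <UIAccelerometerDelegate>
    @end

    @implementation AccelerationLogger

    - (id)init {
        if ((self = [super init])) {
            UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
            accelerometer.updateInterval = 1.0 / 60.0;   // 60 Hz
            accelerometer.delegate = self;
        }
        return self;
    }

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        // Flat on a table the reading stays near (0, 0, -1) no matter how you spin the phone.
        NSLog(@"x = %+.2f  y = %+.2f  z = %+.2f",
              acceleration.x, acceleration.y, acceleration.z);
    }

    @end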

Related

How do I get absolute acceleration in Unity?

I want to define a coordinate system when Unity starts, based on the phone's pose at that moment, but then have it stay constant from then on, so that when I measure acceleration it disregards how the phone is tilted (where it is facing) and instead just measures how it is moved. Is there an API for this? The normal Input.acceleration or gyro doesn't work.
Using Gyroscope.userAcceleration is a possible solution. It measures the acceleration the user applies to the device, i.e. it discards the effect of gravity.
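For reference, on iOS this gravity-free acceleration is what CoreMotion exposes natively as CMDeviceMotion.userAcceleration (Unity presumably wraps the same sensor fusion). A minimal native sketch, assuming your class keeps a strong motionManager property:

    #import <CoreMotion/CoreMotion.h>

    // Assumes a property: @property (strong, nonatomic) CMMotionManager *motionManager;
    - (void)startUserAccelerationUpdates {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (motion == nil) return;
            CMAcceleration a = motion.userAcceleration;   // in g, gravity already removed
            NSLog(@"user acceleration: x = %+.2f  y = %+.2f  z = %+.2f", a.x, a.y, a.z);
        }];
    }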

How to get velocity from a varying accelerometer reading (in simple linear motion)?

I want to compute the iPhone's current velocity at any moment from the accelerometer, whose readings vary over time. Can anyone give me an idea?
It's basically impossible. The only way is to integrate the acceleration, but that magnifies the inaccuracy of the iPhone's not-very-accurate accelerometer, and because you don't have an independent orientation sensor (the iPhone uses gravity to figure that out!), you can't distinguish lateral acceleration from tilting the phone.
How people do this in the real world is to measure velocity with something else, such as GPS, and use the accelerometer to interpolate.
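Just to illustrate why integration drifts, here is a naive dead-reckoning sketch on a gyro-equipped device using CoreMotion's gravity-free userAcceleration (the property and ivar names are assumptions). The sensor bias and noise get integrated along with the signal, so the estimate runs away within seconds:

    #import <CoreMotion/CoreMotion.h>

    // Assumes a strong property CMMotionManager *motionManager, plus ivars
    // double vx, vy, vz and NSTimeInterval lastTimestamp, all initialised to 0.
    - (void)startNaiveVelocityEstimate {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 0.01;   // 100 Hz

        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (motion == nil) return;
            if (lastTimestamp > 0) {
                NSTimeInterval dt = motion.timestamp - lastTimestamp;
                // v += a * dt, converting g to m/s². Bias and noise are integrated too,
                // which is why this drifts badly; it is also only in the device frame.
                vx += motion.userAcceleration.x * 9.81 * dt;
                vy += motion.userAcceleration.y * 9.81 * dt;
                vz += motion.userAcceleration.z * 9.81 * dt;
            }
            lastTimestamp = motion.timestamp;
            NSLog(@"naive velocity (device frame, m/s): %+.2f %+.2f %+.2f", vx, vy, vz);
        }];
    }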

Transform device orientation to the world frame in Objective-C

I'd like to transform the yaw, pitch and roll of the iPhone from the body frame to the world frame, i.e. azimuth, pitch and roll. On Android this is easily done with the
SensorManager.remapCoordinateSystem() and SensorManager.getOrientation() methods, as detailed here: http://blog.mysticlakesoftware.com/2009/07/sensor-accelerometer-magnetics.html
Are similar methods available for the iPhone, or can someone point me in the right direction for how to do this transformation?
Thanks
The accelerometer is good enough to get the gravity direction vector in the device coordinate system, at least once the device has settled down.
The next step towards full device orientation is to use CLLocationManager and get the true-north vector in the device coordinate system.
With the normalized true-north vector and gravity vector you can easily derive all the other directions using dot and cross products, as in the sketch below.
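A sketch of that vector math in plain C, assuming you already have the measured gravity vector (pointing towards the ground) and a north vector, both expressed in device coordinates (e.g. from the accelerometer and from CLHeading's raw x/y/z fields). The type and function names are made up for illustration:

    #include <math.h>

    typedef struct { double x, double; } Vec3;  /* see corrected definition below */
    typedef struct { double x, y, z; } Vec3;

    static Vec3 cross(Vec3 a, Vec3 b) {
        return (Vec3){ a.y * b.z - a.z * b.y,
                       a.z * b.x - a.x * b.z,
                       a.x * b.y - a.y * b.x };
    }

    static Vec3 normalize(Vec3 v) {
        double len = sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return (Vec3){ v.x / len, v.y / len, v.z / len };
    }

    // g: measured gravity direction (towards the ground); north: north vector, which may
    // contain a vertical (dip) component that the cross products project out.
    static void worldFrameInDeviceCoords(Vec3 g, Vec3 north,
                                         Vec3 *outEast, Vec3 *outNorth, Vec3 *outUp) {
        *outUp    = normalize((Vec3){ -g.x, -g.y, -g.z });  // up is opposite to gravity
        *outEast  = normalize(cross(north, *outUp));        // east is ⟂ to up and to north
        *outNorth = cross(*outUp, *outEast);                // completes the right-handed frame
    }

The three resulting vectors are the rows of the device-to-world rotation matrix, from which azimuth, pitch and roll can be read off.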
The accelerometer (UIAccelerometer) will give you a vector pointing straight down, expressed in the device's coordinate system. If you can assume that the device is being held fairly steady (i.e., that you're not reading acceleration from actual movement), then you can use simple trig (acos(), asin()) to determine the device's orientation.
If you're worried that the device might be moving, you can wait for several accelerometer readings in a row that are nearly the same. You can also discard any vector whose length differs from 1.0 by more than a TOLERANCE you define.
In more general terms, the device has no way of knowing its orientation, other than by "feeling gravity", which is done via the accelerometer. The challenges you'll have center around the fact that the accelerometer feels all acceleration, of which gravity is only one possible source.
If you're targeting a device with a gyroscope (iPhone 4 at the time of writing), the CoreMotion framework's CMMotionManager can supply you with CMDeviceMotion updates. The framework does a good job of processing the raw sensor data and separating gravity and userAcceleration for you. You're interested in the gravity vector, which can define the pitch and roll with a little trig. To add yaw, (device rotation around the gravity vector) you'll also need to use the CoreLocation framework's CLLocationManager to get compass heading updates.
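As a sketch of the gravity-plus-trig part, assuming a retained CMMotionManager (the pitch/roll sign conventions below are one possible choice, not the only one):

    #import <CoreMotion/CoreMotion.h>
    #include <math.h>

    // Assumes a property: @property (strong, nonatomic) CMMotionManager *motionManager;
    - (void)startPitchRollUpdates {
        self.motionManager = [[CMMotionManager alloc] init];
        self.motionManager.deviceMotionUpdateInterval = 1.0 / 30.0;

        [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (motion == nil) return;
            CMAcceleration g = motion.gravity;    // unit vector in the device frame
            double pitch = atan2(-g.y, -g.z);                         // about the device x axis
            double roll  = atan2(-g.x, sqrt(g.y * g.y + g.z * g.z));  // about the device y axis
            NSLog(@"pitch = %.1f  roll = %.1f (degrees)",
                  pitch * 180.0 / M_PI, roll * 180.0 / M_PI);
            // Yaw (rotation about gravity) additionally needs a compass heading from
            // CLLocationManager, or motion.attitude.yaw on a gyro-equipped device.
        }];
    }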

How can I detect if an iPhone is rotating while lying face up on a table?

Is there a way to detect whether an iPhone lying face up on a table is rotating? I do realize that this kind of movement is not reported by the accelerometer, and neither is it reported to the - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation method.
Is there a way to detect angle variations when the phone rotates this way? Thank you.
The gravity vector stays constant as the phone rotates on a flat table, so you won't see anything on the accelerometer. You could follow compass heading changes to detect this rotation, but only on an iPhone 3GS. See CLLocationManager for details and look at the heading methods.
EDIT - With an iPhone 4 you can detect the rotation using the gyros. iOS 4 has a new class, CMMotionManager, for getting the rotation rate from the gyros.
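A minimal sketch of the gyro route (iOS 4+, gyro-equipped hardware): with the phone flat on the table, the spin shows up as rotation about the device's z axis. The threshold below is just an example value, and the motionManager property is assumed to be retained by your class:

    #import <CoreMotion/CoreMotion.h>
    #include <math.h>

    // Assumes a property: @property (strong, nonatomic) CMMotionManager *motionManager;
    - (void)startTableSpinDetection {
        self.motionManager = [[CMMotionManager alloc] init];
        if (!self.motionManager.gyroAvailable) return;   // pre-iPhone 4 hardware has no gyro

        self.motionManager.gyroUpdateInterval = 1.0 / 60.0;
        [self.motionManager startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMGyroData *gyroData, NSError *error) {
            if (gyroData == nil) return;
            // Face up on a table, spinning the phone rotates it about its own z axis.
            double spin = gyroData.rotationRate.z;       // rad/s
            if (fabs(spin) > 0.2) {                      // example threshold, tune as needed
                NSLog(@"rotating on the table at %.2f rad/s", spin);
            }
        }];
    }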
When the phone is stationary, the magnitude of the acceleration vector should be 1 g. When the phone is rotating (assuming the sensor is off-center), the magnitude should be greater than 1 and (hopefully) somewhat constant.
If you look at the decay of that curve, I wouldn't be surprised if that shape is distinctive enough to be used to determine whether the phone is rotating or not.
This is the AccelerometerGraph sample app from Apple.
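If you want to experiment with that idea, the test itself is just the vector magnitude. A rough sketch in a UIAccelerometer delegate callback; the 1.05 g threshold is an arbitrary example you would tune against the graph:

    #import <UIKit/UIKit.h>
    #include <math.h>

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        double magnitude = sqrt(acceleration.x * acceleration.x +
                                acceleration.y * acceleration.y +
                                acceleration.z * acceleration.z);
        // At rest |a| is about 1 g; a sustained excess could be the centripetal
        // acceleration of an off-centre sensor while the phone spins on the table.
        if (magnitude > 1.05) {    // tolerance chosen arbitrarily for illustration
            NSLog(@"possible rotation: |a| = %.3f g", magnitude);
        }
    }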
I guess you could do it if the iPhone has a compass. Other than that I don't think it will be possible or reliable.
This would really depend on the location of the accelerometer in the device. I just tested this using the AccelerometerGraph sample application on a 2G iPod touch, and you can see an initial acceleration on the x and y axes (the 2G apparently does not have the accelerometer at the center of the device). So in a sense you could detect the rotation; however, I think the challenge would be differentiating that acceleration from directional acceleration. And I'm sure the values would change if Apple placed the accelerometer in different locations on different models. There would definitely not be any way of doing it via shouldAutorotateToInterfaceOrientation:. I recommend you load the AccelerometerGraph sample application from the SDK and experiment with the acceleration vectors to see whether you can isolate a rotation vector reliably on multiple devices.

Accelerometer reports acceleration while sound is playing

I am developing a game where I want to move a UIImageView based on the accelerometer. When I rotate the iPhone from left to right or right to left, the UIImageView has to rotate by a particular angle. It does move, but the problem occurs when I play a background sound: because of that sound, the accelerometer reports small accelerations even when my iPhone is idle.
So my UIImageView also moves, which should not happen. When I decrease the iPhone's volume it works very well. What should I do about this?
Also, does anyone know how to get acceleration only when the iPhone moves from left to right or right to left? It should not react when the iPhone tilts in the xz or yz plane.
If anybody knows the solution, please reply.
Have you got any filtering on the input from the accelerometer? I would expect the noise the accelerometer picks up from the speaker to be vastly different in amplitude and frequency from the game-control motion.
There is a simple low-pass filter in Apple's accelerometer graph sample code.
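In the style of that sample, a minimal low-pass filter looks like the sketch below. kFilteringFactor, the filteredX/Y/Z instance variables and the rotateImageViewWithX:y: method are hypothetical names standing in for your own game code:

    #import <UIKit/UIKit.h>

    #define kFilteringFactor 0.1   // smaller = heavier smoothing, more lag

    // Assumes double ivars filteredX, filteredY, filteredZ initialised to 0, and a
    // hypothetical game method -rotateImageViewWithX:y: that moves the UIImageView.
    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        // Low-pass filter: keep the slow tilt of the device, suppress the high-frequency
        // vibration coming from the speaker.
        filteredX = (acceleration.x * kFilteringFactor) + (filteredX * (1.0 - kFilteringFactor));
        filteredY = (acceleration.y * kFilteringFactor) + (filteredY * (1.0 - kFilteringFactor));
        filteredZ = (acceleration.z * kFilteringFactor) + (filteredZ * (1.0 - kFilteringFactor));

        [self rotateImageViewWithX:filteredX y:filteredY];
    }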