I have the following code
playerPosition.x += acceleration.x * 10;
playerPosition.y += acceleration.y * 20;
The acceleration currently works. I would like to detect whether the player should move backwards (y decreasing) and, if so, change it to
playerPosition.y += acceleration.y * 10;
How does one detect whether the acceleration from the accelerometer is negative?
How does one retrieve the direction of the movement (left, right, etc.)?
You just compare the acceleration with 0
acceleration.y < 0.0
or
acceleration.y > 0.0
There is a free app, iSimulate, where you can see what values the accelerometer outputs.
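For example, a minimal sketch of that check (in Swift just for illustration; playerPosition and acceleration stand in for the question's own variables, and the 10/20 factors come from the question):
import CoreGraphics

var playerPosition = CGPoint(x: 0, y: 0)
let acceleration = CGPoint(x: 0.1, y: -0.3)   // one sample accelerometer reading

// y decreasing ("backwards") -> use the smaller factor
let yFactor: CGFloat = acceleration.y < 0.0 ? 10 : 20
playerPosition.x += acceleration.x * 10
playerPosition.y += acceleration.y * yFactor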
I am trying to calculate jerk (http://en.wikipedia.org/wiki/Jerk_(physics)) and jounce (http://en.wikipedia.org/wiki/Jounce) with the acceleration data from the accelerometer. I think I have jerk figured out, but I am not sure whether what I am doing for jounce is correct. Can anyone confirm or deny that what I am doing gives me correct values? (Do I need to take time into consideration?)
#define kFilteringFactor 0.4
float prevAccelerationX;
float prevAccelerationY;
float prevAccelerationZ;
float prevJerkX;
float prevJerkY;
float prevJerkZ;
- (void)viewDidLoad
{
    [super viewDidLoad];

    prevAccelerationX = 0;
    prevAccelerationY = 0;
    prevAccelerationZ = 0;
    prevJerkX = 0;
    prevJerkY = 0;
    prevJerkZ = 0;

    [self changeFilter:[LowpassFilter class]];
    [[UIAccelerometer sharedAccelerometer] setUpdateInterval:1.0 / kUpdateFrequency];
    [[UIAccelerometer sharedAccelerometer] setDelegate:self];
}

// UIAccelerometerDelegate method, called when the device accelerates.
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    float pax = prevAccelerationX;
    float pay = prevAccelerationY;
    float paz = prevAccelerationZ;
    float pjx = prevJerkX;
    float pjy = prevJerkY;
    float pjz = prevJerkZ;

    prevAccelerationX = acceleration.x - ( (acceleration.x * kFilteringFactor) +
                                           (prevAccelerationX * (1.0 - kFilteringFactor)) );
    prevAccelerationY = acceleration.y - ( (acceleration.y * kFilteringFactor) +
                                           (prevAccelerationY * (1.0 - kFilteringFactor)) );
    prevAccelerationZ = acceleration.z - ( (acceleration.z * kFilteringFactor) +
                                           (prevAccelerationZ * (1.0 - kFilteringFactor)) );

    // Compute the derivative (which represents change in acceleration).
    float jerkX = ABS((prevAccelerationX - pax));
    float jerkY = ABS((prevAccelerationY - pay));
    float jerkZ = ABS((prevAccelerationZ - paz));

    prevJerkX = jerkX - ( (jerkX * kFilteringFactor) +
                          (prevJerkX * (1.0 - kFilteringFactor)) );
    prevJerkY = jerkY - ( (jerkY * kFilteringFactor) +
                          (prevJerkY * (1.0 - kFilteringFactor)) );
    prevJerkZ = jerkZ - ( (jerkZ * kFilteringFactor) +
                          (prevJerkZ * (1.0 - kFilteringFactor)) );

    // Compute the derivative of the jerk (which represents change in jerk, i.e. jounce).
    float jounceX = ABS((prevJerkX - pjx));
    float jounceY = ABS((prevJerkY - pjy));
    float jounceZ = ABS((prevJerkZ - pjz));
}
In order to calculate derivatives, yes, you need to take time into consideration. Basically, you can estimate jerk with just (a2 - a1) / samplingTime, and its time derivative (jounce) works the same way on the jerk values. Your way of using kFilteringFactor seems weird to me, but it might work for your particular sampling time. You should not take ABS(), as it is perfectly valid for the derivative to be negative.
However, one big issue is probably going to be the low sampling frequency. Sampling frequencies in phones are usually around 60 Hz, which means your actual bandwidth for acceleration is 30 Hz (the Nyquist frequency). Halve that and you have your jerk bandwidth; halve it again and you have your jounce bandwidth, namely 7.5 Hz, roughly speaking. All jerks (still a funny word) over 15 Hz and jounces over 7.5 Hz do not disappear but instead are aliased on top of your results. So not only do you miss some information, the information you miss actually causes even more damage to your results. Properly done, you'd need low-pass filtering before each derivative.
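To make that concrete, here is a minimal sketch of the finite-difference approach described above (Swift used just for illustration; dt is assumed to be your accelerometer update interval, and no ABS() is taken so the sign is preserved):
import Foundation

struct MotionDerivatives {
    private var previousAcceleration: Double?
    private var previousJerk: Double?

    // Returns (jerk, jounce) for one axis; nil until enough samples have arrived.
    // Per the advice above, you would low-pass filter the input (and ideally the jerk)
    // before differentiating; that is not shown here.
    mutating func update(acceleration a: Double, dt: Double) -> (jerk: Double?, jounce: Double?) {
        var jerk: Double?
        var jounce: Double?
        if let prevA = previousAcceleration {
            let j = (a - prevA) / dt            // first derivative of acceleration
            if let prevJ = previousJerk {
                jounce = (j - prevJ) / dt       // second derivative of acceleration
            }
            previousJerk = j
            jerk = j
        }
        previousAcceleration = a
        return (jerk, jounce)
    }
}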
Take several time-series points from the accelerometer and perform B-spline interpolation to find the control points.
Take those control points and use a 3rd-degree Bernstein polynomial: take its first derivative and feed the control points from the B-spline solution into the derived polynomial, where t is between 0 and 1 (assuming a 1 Hz sampling rate, 0 to 1 interpolates everything in that second of time). Those values will be the jerk/jounce. You'll compute this separately for each acceleration axis.
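If it helps, here is a rough sketch of just that last step (Swift for illustration): given four control points p0...p3 for one axis, the first derivative of the 3rd-degree Bernstein (cubic Bezier) polynomial can be evaluated directly for t in [0, 1]. Obtaining the control points from the B-spline fit is assumed and not shown.
import Foundation

// Derivative of a cubic Bezier curve with control points p0...p3, evaluated at t in [0, 1].
func cubicBezierDerivative(_ p0: Double, _ p1: Double, _ p2: Double, _ p3: Double, at t: Double) -> Double {
    let u = 1.0 - t
    return 3 * u * u * (p1 - p0) +
           6 * u * t * (p2 - p1) +
           3 * t * t * (p3 - p2)
}

// Example: sample the derivative mid-interval.
let jerkEstimate = cubicBezierDerivative(0.0, 0.2, -0.1, 0.3, at: 0.5)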
In an iOS prototype I use a combination of CMDeviceMotion.deviceMotion.yaw and CLHeading.trueHeading to make a stable compass heading that is responsive and accurate. This works well when the iPhone is held flat, where I have a graphical arrow that points to a stable compass heading.
The problem appears when the iPhone is held vertically in portrait mode. The UIDeviceOrientation constantly changes from UIDeviceOrientationFaceDown to UIDeviceOrientationFaceUp and back. This makes the yaw value skip back and forth by +/-180 degrees based on small changes in the pitch. Is it possible to lock the device to one orientation that gives a stable yaw value, predict the change without glitches, or compute the gyro yaw (or roll, in this orientation) in other ways?
This poor guy has the same problem, with no answers. Double points possible, people! :)
https://stackoverflow.com/questions/10470938/euler-angle-yaw-not-working-when-iphone-orientation-changes
I was just searching for an answer to this problem. It broke my heart a bit to see that you posted this over a year ago, but I figured maybe you or someone else could benefit from the solution.
The issue is gimbal lock. When pitch is about 90 degrees, yaw and roll match up and the gyro loses a degree of freedom. Quaternions are one way of avoiding gimbal lock, but I honestly didn't feel like wrapping my mind around that. Instead, I noticed that yaw and roll actually match up and can simply be summed to solve the problem (assuming you only care about yaw).
SOLUTION:
float yawDegrees = currentAttitude.yaw * (180.0 / M_PI);
float pitchDegrees = currentAttitude.pitch * (180.0 / M_PI);
float rollDegrees = currentAttitude.roll * (180.0 / M_PI);

double rotationDegrees;

if (rollDegrees < 0 && yawDegrees < 0) // This is the condition where simply
                                       // summing yawDegrees with rollDegrees
                                       // wouldn't work.
                                       // Suppose yaw = -177 and roll = -165.
                                       // rotationDegrees would then be -342,
                                       // making your rotation angle jump all
                                       // the way around the circle.
{
    rotationDegrees = 360 - (-1 * (yawDegrees + rollDegrees));
}
else
{
    rotationDegrees = yawDegrees + rollDegrees;
}

// Use rotationDegrees with range 0 - 360 to do whatever you want.
I hope this helps someone else!
If somebody is interested in the implementation in iOS Swift, the code is given below:
let queue = OperationQueue()
motionManager.startDeviceMotionUpdates(to: queue) { [weak self] data, _ in
    guard let self = self, let attitude = data?.attitude else { return }

    let yawDegrees = attitude.yaw * (180.0 / Double.pi)
    let rollDegrees = attitude.roll * (180.0 / Double.pi)

    if rollDegrees < 0 && yawDegrees < 0 {
        self.rotationDegrees = 360 - (-1 * (yawDegrees + rollDegrees))
    } else {
        self.rotationDegrees = yawDegrees + rollDegrees
    }
}
However, I am having some problems, and I hope #blkhp19 can help me with this: at certain points the angles go into negative values, which then messes up the entire calculation, and I can't figure out what the problem is.
The problem is a bit confusing because there are at least two different ways to think about Yaw. One is from the phone's perspective, and one from the world perspective.
I'll use this image from Apple to explain further:
If the phone is flat on a table:
Rotations along the phone's yaw (or Z axis): change the compass heading.
Rotations along the phone's roll (or Y axis): do not change compass heading.
Rotations along the phone's pitch (or X axis): do not change compass heading.
If the phone is flat against a wall:
Rotations along the phone's yaw (or Z axis): change the compass heading.
Rotations along the phone's roll (or Y axis): change the compass heading.
Rotations along the phone's pitch (or X axis): do not change compass heading.
For the remainder of this answer, I'll assume the phone is upright and yaw, pitch, and roll refer to exactly what's in the photo above.
Yaw
You'll need to use atan2 and inspect gravity as in this example.
let yaw = -Angle(radians: .pi - atan2(motion.gravity.x, motion.gravity.y))
Pitch
Similar to the above, I primarily just swapped x and z and it seems to be returning the correct values:
let pitch = Angle(radians: .pi - atan2(motion.gravity.z, motion.gravity.y))
Roll (aka Compass Heading)
Use blkhp19's code above, which sums the attitude yaw and roll. If you import SwiftUI, you can leverage the Angle struct to make radians/degrees conversions easier:
func roll(motion: CMDeviceMotion) -> Angle {
    let attitudeYaw = Angle(radians: motion.attitude.yaw)
    let attitudeRoll = Angle(radians: motion.attitude.roll)
    var compassHeading: Angle = attitudeYaw + attitudeRoll

    if attitudeRoll.degrees < 0 && attitudeYaw.degrees < 0 {
        compassHeading = Angle(degrees: 360 - (-1 * compassHeading.degrees))
    }

    return compassHeading
}
Also note that if you don't need the actual angle, and all you need is the relationship (e.g. isPhoneUpright), you can simply read gravity values for those.
extension CMDeviceMotion {
    var yaw: Angle {
        -Angle(radians: .pi - atan2(gravity.x, gravity.y))
    }

    var pitch: Angle {
        Angle(radians: .pi - atan2(gravity.z, gravity.y))
    }

    var roll: Angle {
        let attitudeYaw = Angle(radians: attitude.yaw)
        let attitudeRoll = Angle(radians: attitude.roll)
        var compassHeading: Angle = attitudeYaw + attitudeRoll

        if attitudeRoll.degrees < 0 && attitudeYaw.degrees < 0 {
            compassHeading = Angle(degrees: 360 - (-1 * compassHeading.degrees))
        }

        return compassHeading
    }
}
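For completeness, one way to use that extension (a sketch; the motion manager setup and the 60 Hz update interval are my assumptions, not part of the answer above):
import CoreMotion
import SwiftUI

let manager = CMMotionManager()
manager.deviceMotionUpdateInterval = 1.0 / 60.0
manager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let motion = motion else { return }
    // "roll" here is the compass-style heading from the extension above.
    print("heading:", motion.roll.degrees)
}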
I have set up an event system in FMOD with 3D sounds that are triggered when the listener is close to them. I want to be able to change the orientation of the listener so that my entire sound landscape shifts by the same number of degrees as my listener rotates. Nice, huh?
Is it possible? What would be the iPhone code for that?
Would I pass something to the method set3DListenerAttributes?
I'm not sure I fully understand: do you just want to rotate the listener in 3D space? If so, you adjust the listener's forward vector to point in the new direction.
** Based on your response **
FMOD works with a Cartesian coordinate system; it has a unit-length vector which points in the direction an object (or the listener) is facing. I would recommend you do some reading on trigonometry to fully understand how to convert an angle (in degrees or radians) into a vector.
For your forward vector the equation is (if I remember correctly):
x = cos(angle)
z = sin(angle)
y = 0;
This assumes the angle is in radians; to convert from degrees to radians:
radians = degrees * (Pi / 180)
where Pi is roughly 3.14159265
This was the code I used in the end:
float degree = 90;
float radians = (degree) * (M_PI/180);
float fx = cos(radians);
float fz = sin(radians);
forward.x = fx;
forward.z = fz;
listenerpos.x = lxPos * DISTANCEFACTOR;
listenerpos.z = lyPos * DISTANCEFACTOR;
result = eventSystem->set3DListenerAttributes(0, &listenerpos,&vel,&forward,NULL);
I have an OpenGL ES view in Android that's controlled by a matrix for translation. I'm trying to figure out a way to get a hint of the momentum scrolling seen in the Google Maps app or on the iPhone. Thanks.
If your problem is in 2D, it is quite simple.
You need to get the elapsed time in each frame.
Your onTouch function will find the acceleration of your finger. I forgot the formula for getting the acceleration from a distance; it should be the second derivative of position with respect to time. But you should always convert your deltaX, deltaY into an acceleration. To make it easy, you don't really need to put something accurate there. Edit: I don't know why I didn't see it, but the function was all there...
acceleration.x = 2 * (newposition.x - position.x - speed.x * elapsedTime) / (elapsedTime * elapsedTime);
Once you have your acceleration, you can set your new position with the code below. This is simple 2D dynamics: with your acceleration you can find your speed, and with your speed you can find your next position.
speed.x = (float) (mass * acceleration.x * elapsed + speed.x);
speed.y = (float) (mass * acceleration.y * elapsed + speed.y);
position.x += mass * acceleration.x / 2 * elapsed * elapsed + speed.x * elapsed;
position.y += mass * acceleration.y / 2 * elapsed * elapsed + speed.y * elapsed;
speed.x *= friction;
speed.y *= friction;
Mass and friction let you define how fast it goes and how fast it will slow down by itself. You will probably have to tweak the code, because this dynamic isn't exactly nice if you have to scroll backward to slow down.
At the end of each frame, you will have to reset your acceleration to (0, 0). And on each new frame after a touch event, the acceleration should be set to something. It should work very well :)
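Putting the pieces above together, a compact sketch of that per-frame update (written in Swift purely for illustration; the names mirror the snippets above, and mass/friction are the same tuning knobs):
import CoreGraphics

struct MomentumScroller {
    var position = CGPoint.zero
    var speed = CGPoint.zero
    var acceleration = CGPoint.zero        // set from your touch handling, as described above
    let mass: CGFloat = 1.0
    let friction: CGFloat = 0.95

    mutating func step(elapsed: CGFloat) {
        speed.x = mass * acceleration.x * elapsed + speed.x
        speed.y = mass * acceleration.y * elapsed + speed.y
        position.x += mass * acceleration.x / 2 * elapsed * elapsed + speed.x * elapsed
        position.y += mass * acceleration.y / 2 * elapsed * elapsed + speed.y * elapsed
        speed.x *= friction
        speed.y *= friction
        acceleration = .zero               // reset at the end of each frame
    }
}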
Measure the speed that the view is scrolling at.
Detect when the user stops scrolling.
Gradually decrease the speed that the scroll view is scrolling at.
Something like this:
public void redraw() {
    myScrollView.ySpeed = myScrollView.lastY - myScrollView.y;
    myScrollView.xSpeed = myScrollView.lastX - myScrollView.x;

    if (!userIsScrolling && myScrollView.ySpeed > 0) {
        myScrollView.ySpeed--;
    }
    if (!userIsScrolling && myScrollView.xSpeed > 0) {
        myScrollView.xSpeed--;
    }

    myScrollView.lastY = myScrollView.y;
    myScrollView.y += myScrollView.ySpeed;
    myScrollView.lastX = myScrollView.x;
    myScrollView.x += myScrollView.xSpeed;
}

public void userStoppedScrolling() {
    userIsScrolling = false;
}
I'm asking the accelerometer for data at 50 Hz (50 times per second). When I suddenly flip the device by 90 degrees on the x-axis, after it was lying flat on a table with the display facing up, the values move pretty slowly to the "target" value for that position.
Now the weird thing is: if I increase the measurement rate, the value moves faster to the new value when I suddenly flip the device by 90 degrees. But if I only ask once per second for a new value, it takes very long until the value reaches the target. What can be the reason for this?
I don't do any kind of data aggregation, and I don't accumulate anything. I just do some simple filtering to get rid of the noise. My method looks like this:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    // Use a basic low-pass filter to only keep the gravity in the accelerometer values for the X and Y axes
    // accelerationX is an instance variable
    accelerationX = acceleration.x * 0.05 + accelerationX * (1.0 - 0.05);

    // truncate to two decimal places
    int i = accelerationX * 100;
    float clippedAccelerationValue = i;
    clippedAccelerationValue /= 100;

    [self moveViews:clippedAccelerationValue];
}
later on, in my -moveViews: method, I do this:
-(IBAction)moveSceneForPseudo3D:(float)accelerationValue {
    if (fabs(lastAccelerationValue - accelerationValue) > 0.02) { // a little threshold to prevent flickering when it lies on a table
        float viewAccelerationOffset = accelerationValue * 19 * -1;
        newXPos = initialViewOrigin + viewAccelerationOffset;
        myView.frame = CGRectMake(newXPos, myView.frame.origin.y, myView.frame.size.width, myView.frame.size.height);
        lastAccelerationValue = accelerationValue;
    }
}
As a result, if the device gets turned 90 degrees on the x-axis, or 180 degrees, the view just moves pretty slowly to its target position. I don't know if that's because of the physics of the accelerometer, or if it's a bug in my filtering code. I only know that there are fast-paced games where the accelerometer is used for steering, so I can hardly imagine it's a hardware problem.
This line:
accelerationX = acceleration.x * 0.05 + accelerationX * (1.0 - 0.05);
is a low-pass filter, which works by computing an exponentially weighted moving average of the x acceleration. In other words, each time that callback is called, you're only moving accelerationX 5% of the way towards the new accelerometer value. That's why it takes many iterations before accelerationX reflects the new orientation.
What you should do is increase the 0.05 value to, say, 0.2. I'd make it a global #define and play around with different values, along with different refresh rates.
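If it helps, here's the same filter with the smoothing factor pulled out into a constant you can tweak (a quick sketch in Swift; the 0.2 starting value is just the suggestion above):
import CoreGraphics

let kLowPassFactor: CGFloat = 0.2   // 0.05 = very smooth but sluggish, 1.0 = no filtering

func lowPass(previous: CGFloat, new: CGFloat, factor: CGFloat = kLowPassFactor) -> CGFloat {
    return new * factor + previous * (1.0 - factor)
}

// Example: feed in successive x-axis readings.
var filteredX: CGFloat = 0
filteredX = lowPass(previous: filteredX, new: 0.3)   // moves 20% of the way toward 0.3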