I am using a Murata SCL3400 2 Axis Inclinometer to do some measurements.
I can talk to the unit using a USB-SPI converter chip (the inclinometer is digital and can only be communicated with via the SPI protocol).
One issue I have is that I would like to know how to convert the gravity projection values, ACCx and ACCy (in g), to angles in degrees.
I tried using the equation
Angle = Tanh(ACCx/ACCy)
and converted the result from radians to degrees by multiplying it by 180/pi.
However, this has not worked.
Can anyone help out with my query?
Thanks
LabMat
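For reference, a minimal sketch of the conversion usually applied to two-axis gravity projections: the arcsine of each projection (or an arctangent form), rather than a hyperbolic tangent. The function and variable names below are illustrative, not taken from the SCL3400 datasheet.

import math

def tilt_angles_deg(acc_x, acc_y):
    # Sketch: standard two-axis tilt conversion, assuming acc_x and acc_y are the
    # gravity projections already scaled to g (illustrative names).
    acc_x = max(-1.0, min(1.0, acc_x))   # clamp against noise slightly above 1 g
    acc_y = max(-1.0, min(1.0, acc_y))
    angle_x = math.degrees(math.asin(acc_x))   # tilt of the X axis from horizontal
    angle_y = math.degrees(math.asin(acc_y))   # tilt of the Y axis from horizontal
    return angle_x, angle_y

If a Z projection were also available, atan2(acc_x, acc_z) would give the same angle without the clamping issue near +/-90 degrees.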
Related
I need to calibrate a magnetometer using the method of Merayo with matlab.
I have found this code:
But I do not understand how to apply this technique. In fact, I have distorted magnetic data to which I apply the magnetic calibration.
[U,c] = MgnCalibration(X)
So I get U, the ellipsoid shape parameter, and c, the ellipsoid center.
And the calibrated measurement is:
w = U*(v-c)
The problem is that when I calculate the corrected data, the values are on a completely different scale.
Data=[1750 1460 -3940]
CalibratedData=[0.4042 0.3820 -0.6860]
What have I not understood properly?
How can I use my magnetic data after this calibration?
The calibrated data for each axis (mx = 0.4042, my = 0.3820, mz = -0.6860) should now be independent of the orientation in 3D space in which the fluxgate finds itself for that specific measurement at that point in space. The total field calculated with sqrt(mx*mx + my*my + mz*mz) will be the calibrated Earth's total magnetic field at that point.
If scaled properly, this total field value should be the same as that measured by a proton or cesium-vapor magnetometer.
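To illustrate, a minimal numpy sketch of applying w = U*(v-c) and then computing the total field; the U, c and local-field values below are placeholders, not real calibration results:

import numpy as np

# Placeholders standing in for the outputs of MgnCalibration (not real results).
U = np.eye(3) * 2.5e-4                      # ellipsoid shape matrix
c = np.array([12.0, -8.0, 5.0])             # ellipsoid center

v = np.array([1750.0, 1460.0, -3940.0])     # raw measurement from the question
w = U @ (v - c)                             # calibrated vector, w = U*(v - c)
total_field = np.linalg.norm(w)             # should be nearly constant over orientations

# To get physical units, rescale by the known local field strength, e.g. from a
# reference magnetometer (the 48000 nT value is hypothetical):
w_nT = w * (48000.0 / total_field)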
Still need the math: I am trying to calculate the yxy rotation sequence given a quaternion transformation. I can easily do this using Matlab's quat2angle function. However, I need to calculate this by hand using a python script.
This part solved: Please look at this awesome presentation which helped me resolve these issues below:
http://www.udel.edu/biology/rosewc/kaap686/reserve/shoulder/shoulder/BluePresentation.ppt
Also, with MATLAB, I am seeing strange results with the way it calculates yxy. I have a quaternion transformation of [1.0000 -0.0002 -0.0011 -0.0006] and I get y = 112.4291, x = -0.0719, y1 = -112.5506 (in degrees).
I don't expect to see any rotations here (my sensors aren't rotating). Why is MATLAB showing me rotation? And when I try to rotate only around x, I see y and y1 also rotate; however, I don't expect y or y1 to rotate. Any thoughts?
UPDATE:
When I add y + y1 I seem to get the value of the first y (when doing a simple rotation around the first y), and this smooths out the data. However, when I combine the three rotations of the shoulder, the data doesn't make sense. I am trying to define shoulder movement based on plane of elevation, elevation and rotation (yxy) in a way that's easy to interpret. When I rotate around x and then the second y, I get "clipping" (the data jumps from 180 to -180 while trending positive for y1, and the opposite happens for y), even though I start my sensors at the zero position. Also, if I try to rotate only around the second y, I see rotation in x. That doesn't make any sense either. Any additional thoughts?
Note:
I am using 2 IMU sensors, taring them in the same orientation, holding one constant and rotating the other, calculating the relative rotation between them using quaternions, and then calculating the yxy rotation sequence angles.
In case anyone is interested in quaternion calculations and transformations: I solved it using this transformations library:
http://www.lfd.uci.edu/~gohlke/code/transformations.py.html
It contains several functions for matrices, quaternions, and Euler rotations, and you can convert quaternions to several different Euler rotation sequences. Thanks to the person who created this script.
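For anyone who still wants the by-hand version, a minimal self-contained sketch of extracting an intrinsic y-x-y sequence from a quaternion. This assumes [w, x, y, z] ordering; conventions differ between libraries, so check the output against quat2angle or transformations.py:

import math

def quat_to_yxy(w, x, y, z):
    # Quaternion -> rotation matrix (assumes the quaternion is normalized).
    R = [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]
    # For R = Ry(t1) * Rx(t2) * Ry(t3):
    #   R[1][1] = cos(t2), R[0][1] = sin(t1)sin(t2), R[2][1] = cos(t1)sin(t2),
    #   R[1][0] = sin(t2)sin(t3), R[1][2] = -sin(t2)cos(t3).
    t2 = math.acos(max(-1.0, min(1.0, R[1][1])))
    t1 = math.atan2(R[0][1], R[2][1])
    t3 = math.atan2(R[1][0], -R[1][2])
    return t1, t2, t3

# Note: when t2 is close to 0 or pi, t1 and t3 are degenerate (only their sum or
# difference is defined), which is why a near-identity quaternion can return large,
# nearly opposite values for the two y angles.
angles_deg = [math.degrees(a) for a in quat_to_yxy(1.0, -0.0002, -0.0011, -0.0006)]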
I am writing an iPhone/iPad app. I need to compute the acceleration and deceleration in the direction of travel of a vehicle traveling in close to a straight horizontal line with erratic acceleration and deceleration. I have the sequence of 3 readings from the X,Y,Z orthogonal accelerometers. But the orientation of the iphone/ipad is arbitrary and the accelerometer readings include vehicle motion and the effect of gravity. The result should be a sequence of single acceleration values which are positive or negative depending on whether the vehicle is decelerating or accelerating. The positive and negative direction is arbitrary so long as acceleration has the opposite sign to deceleration. Gravity should be factored out of the result. Some amount of variable smoothing of the result would be useful.
The solution should be as simple as possible and must be computationally efficient. The answer should be some kind of pseudo-code algorithm, C code or a sequence of equations which could easily be converted to C code. An iPhone specific solution in Objective C would be fine too.
Thanks
You will need some trigonometry for this; for example, to get the magnitude you need
magn = sqrt(x*x+y*y+z*z);
To get the angle you will need atan; the C function atan2 is better:
xyangle = atan2(y,x);
xymagn = sqrt(x*x+y*y);
vertangle = atan2(z,xymagn);
Now, how you get negative and positive magnitude is arbitrary; you could, for example, interpret π/2 < xyangle < 3π/2 as negative. That would be taking the sign of x for the sign of magn, but it would be equally valid to take the sign from y.
It is really tough to separate gravity and motion. It's easier if you can analyze the data together with a gyroscope and compass signal.
The gyroscope measures the rate of angular rotation. Its integral is theoretically the angular orientation (plus an unknown constant), but the integral is subject to drift, so it is useless on its own. The accelerometer measures gravity plus linear acceleration, from which angular orientation can be inferred when the device is not accelerating. With some moderately complex math, you can isolate all three of those quantities from the two sensors' values. Adding the compass fixes the XY plane (where Z is gravity) to an absolute coordinate frame.
See this great presentation.
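One common way to do that fusion is a complementary filter; a minimal single-axis sketch (the 0.98 weight and the sample period dt are illustrative, not taken from the answer above):

import math

def complementary_filter(angle, gyro_rate, acc_x, acc_z, dt, alpha=0.98):
    # gyro_rate is the angular rate about one axis in rad/s; acc_x and acc_z are
    # the gravity projections used to estimate the same tilt angle.
    gyro_angle = angle + gyro_rate * dt       # integrate the gyro (drifts over time)
    accel_angle = math.atan2(acc_x, acc_z)    # noisy but drift-free tilt estimate
    return alpha * gyro_angle + (1 - alpha) * accel_angle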
Use userAcceleration.
You don't have to figure out how to remove gravity from the accelerometer readings or how to take the orientation into account: it is already implemented in the Core Motion framework.
Track the mean value of acceleration. That will give you a reference for "down". Then subtract the mean from individual readings.
You'll need to play around with the sensitivity of the mean calculation, since, e.g., making a long slow turn on a freeway will cause the mean to slowly drift outwards.
If you wanted to compensate for this, you could use GPS tracking to compute a coarse-grained global acceleration to calibrate the accelerometer. In fact, you might find that differentiating the GPS velocity reading gives a good enough absolute acceleration all by itself (I haven't tried, so I can't say).
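A minimal sketch of that running-mean approach; the smoothing factor alpha is illustrative and corresponds to the sensitivity you would tune:

def make_gravity_filter(alpha=0.01):
    # Exponential running mean of the raw accelerometer vector as the "down" reference.
    mean = [0.0, 0.0, 0.0]
    def update(x, y, z):
        sample = (x, y, z)
        for i in range(3):
            mean[i] += alpha * (sample[i] - mean[i])            # slow gravity estimate
        return tuple(sample[i] - mean[i] for i in range(3))     # dynamic acceleration
    return update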
I am using this device (http://www.sparkfun.com/products/10724) and have successfully implemented a quite well working orientation estimation based on a fusion of magnetometer, accelerometer and gyroscope data, following this http://www.x-io.co.uk/node/8#open_source_imu_and_ahrs_algorithms implementation. Now I want to calculate the dynamic acceleration (acceleration without the static gravity component). To do this I came up with the following idea.
Calculate a running average of the raw accelerometer data. If the raw acceleration is stable for some time (small difference between the running average and the current raw measurement), we assume the device is not moving and we are measuring pure gravity. Now save the gravity vector and also the current orientation as a quaternion. This approach assumes that the device cannot be constantly accelerated by anything other than gravity.
For calculating the acceleration without gravity I am now doing following quaternion calculation:
RA = Quaternion with current x,y,z raw acceleration values
GA = Quaternion with x,y,z raw acceleration values of estimated gravity
CO = Quaternion of current orientation
GO = saved gravity orientation
DQ = GO^-1 * CO // difference of orientation between last gravity estimation and current orientation
DQ = DQ^-1 // get the inverse of the difference
SQ = DQ * GA * DQ^-1 // rotate gravity vector
DA = RA - SQ // get the dynamic acceleration by subtracting the rotated gravity from the raw acceleration
Could someone check if this is correct? I am not sure, because when testing it I get some large acceleration values while rotating my sensor board, but I do get usable acceleration data (much smaller than the values during rotation) if the device is moved without rotating it.
Moreover, I wonder whether the accelerometer also measures acceleration when it is rotated in place or not.
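For reference, a minimal numpy sketch that transcribes the pseudocode above (assuming [w, x, y, z] quaternion ordering and unit quaternions; it mirrors the question's approach rather than vouching for it):

import numpy as np

def q_mul(a, b):
    # Hamilton product of two quaternions in [w, x, y, z] order.
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def q_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])   # inverse for unit quaternions

def dynamic_acceleration(raw_acc, gravity_acc, current_q, gravity_q):
    # Rotate the saved gravity into the current frame and subtract it.
    DQ = q_mul(q_conj(gravity_q), current_q)   # orientation change since gravity estimate
    DQ = q_conj(DQ)                            # inverse of the difference
    GA = np.array([0.0, *gravity_acc])          # pure quaternion from the gravity vector
    SQ = q_mul(q_mul(DQ, GA), q_conj(DQ))       # rotated gravity
    return np.asarray(raw_acc) - SQ[1:]         # dynamic acceleration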
Another way is to differentiate the accel signal to give jerk (using a finite difference, j = (a2 - a1) / dt). Run the jerk through a decay/leakage function (use a half-life based decay value rather than a simple multiplier). Then integrate the jerk (trapezoidal rule, a = dt * (j1 + j2) * 0.5) and it will remove the DC offset (gravity). Again run this signal through a decay function.
The decay functions avoid the value spiraling off but will reduce the magnitude of dynamic acceleration values that you see and will introduce some shaping to the signal. So you won't get values that are 'accurate' m/s/s readings any longer. But it is useful for short-time movements.
Of course you could just use a highpass filter instead but that generally requires a fixed sampling rate and is probably more computationally expensive if you are using convolution (finite impulse response filter).
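One possible reading of that differentiate/decay/integrate recipe as a sketch (dt and the half-life are placeholder values; as noted above, the output is shaped and no longer in true m/s²):

class JerkFilter:
    def __init__(self, dt=0.01, half_life=1.0):
        self.dt = dt
        self.decay = 0.5 ** (dt / half_life)   # half-life based per-sample decay factor
        self.prev_a = None
        self.prev_j = 0.0
        self.acc = 0.0

    def update(self, a_raw):
        if self.prev_a is None:
            self.prev_a = a_raw
        j = (a_raw - self.prev_a) / self.dt                      # finite-difference jerk
        # Leaky trapezoidal re-integration: the decay bleeds off the DC offset (gravity).
        self.acc = self.decay * self.acc + self.dt * (self.prev_j + j) * 0.5
        self.prev_a, self.prev_j = a_raw, j
        return self.acc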
It's easier than you think. You may want to have a look at my post about it here:
http://www.varesano.net/blog/fabio/simple-gravity-compensation-9-dom-imus
I am trying to work with 3D rotations, but I can't seem to figure out how the framework does the calculations.
For example I get this data:
yaw -1.010544 pitch 0.508249 roll 1.128918
Then I print the corresponding rotation matrix
0.599901 -0.128495 -0.789689
0.740043 0.464230 0.486649
0.304065 -0.876344 0.373584
After reading the API and the wiki, I am pretty sure there must be a general way to create a rotation matrix out of Euler angles. I tried all of these, with no results.
Am I missing something, or how is this done?
Well, iOS/Core Motion provides both: the rotation matrix and the Euler angle representation.
The documentation is here.
But if you are interested in how this calculation works, then take a look here.
To sum it up: each Euler angle can be represented by a rotation matrix which describes a rotation around one unit axis. Since there are three angles, you just have to combine them. But watch out! Euler angles are ambiguous, since there are different orders in which the rotation matrices can be multiplied.
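For illustration, a minimal sketch of combining the three elemental rotations; the Rz*Rx*Ry order and the axis assignments below are just one convention and are not guaranteed to match Core Motion's, which is exactly the ambiguity mentioned above:

import math

def rotation_matrix(yaw, pitch, roll):
    # Build R = Rz(yaw) * Rx(pitch) * Ry(roll); changing the multiplication order
    # gives a different (equally valid) matrix.
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    Rz = [[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]    # rotation about z (yaw)
    Rx = [[1, 0, 0], [0, cp, -sp], [0, sp, cp]]    # rotation about x (pitch)
    Ry = [[cr, 0, sr], [0, 1, 0], [-sr, 0, cr]]    # rotation about y (roll)

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(Rz, matmul(Rx, Ry))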
EDIT: You linked to a quaternion calculation. Or am I completely wrong?