Is the iPhone accelerometer calibrated? Gravity measurement changes depending on orientation

I'm doing some tests with the iPhone 4S accelerometer. If I take the raw data on the Z-axis (phone resting flat on a desk), I get an acceleration of 9.65-9.70 m/s² (after g conversion by multiplying by 9.8261).
But if the phone is resting on its edge, the accelerometer value on the X-axis is noticeably different, approx. 9.80-9.85 m/s² (after the same g conversion).
My question is: if gravity is the same, why this difference? Is the sensor not calibrated?
On the other hand, I checked the magnitude of the full acceleration vector in both situations, and the difference is the same.
Thanks.

I don't know what kind of answer you expect, but you should be more precise when talking about calibration.
Of course the g-sensors are calibrated, and as always, every calibration comes with an error. In your case the error is under 1%.
So if you want an answer:
Yes, the iPhone accelerometer is calibrated, and in your case it shows an error under 1%. If you collected measurements from hundreds of other users, you could calculate the mean error of the device model (I would guess it is about 1% as well).

The problem is that it's not possible to determine gravity 100% exactly when all of the sensors (gyroscope and compass as well) show an intrinsic error. The lack of a precise external reference system leads to this error. Accelerometer and gyroscope are corrected mutually, and if there is a slight drift, it affects the direction in which the sensor-fusion algorithm (a Kalman filter or similar) estimates gravity to be.
While the gyroscope is very fast at detecting changes in direction, it tends to drift. Accelerometers are slower to react but provide an absolute way to detect gravity. Magnetometers are slower still, but they can contribute to stabilising the overall result. Combine Gyroscope and Accelerometer Data shows some graphs of the raw and the processed sensor data.
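
To make the fusion idea concrete, here is a minimal one-axis complementary filter in Swift. The 0.98/0.02 blend and all names are illustrative only; this is a sketch of the general technique, not what Apple's production sensor fusion actually does:

    import Foundation

    // A minimal one-axis complementary filter: the gyro angle integrates
    // quickly but drifts; the accelerometer tilt is noisy but drift-free,
    // so the two are blended. The 0.98/0.02 split is illustrative only.
    struct ComplementaryFilter {
        private(set) var pitch = 0.0        // estimated angle, radians
        let alpha = 0.98                    // weight given to the gyro path

        mutating func update(gyroRate: Double,   // rad/s about the X axis
                             accelY: Double,     // m/s^2
                             accelZ: Double,     // m/s^2
                             dt: Double) {       // seconds since last sample
            let gyroPitch  = pitch + gyroRate * dt     // fast, but drifts
            let accelPitch = atan2(accelY, accelZ)     // absolute, but noisy
            pitch = alpha * gyroPitch + (1 - alpha) * accelPitch
        }
    }

The accelerometer term slowly pulls the estimate back toward true gravity, which is exactly the mutual correction described above; a Kalman filter does the same job with statistically chosen gains.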

I continued working with accelerometers, and the results are not bad. Regarding iPhone accelerometer calibration, I can say that STMicroelectronics calibrates its own sensors. Later, the iPhone factory assembles the accelerometer onto the circuit board. The soldering affects the accelerometer's accuracy (thermal effects), so the accelerometer probably requires a new calibration. For consumer requirements, the accuracy is already good enough; but if you have high-accuracy requirements, you need to recalibrate.
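
For illustration, here is a hypothetical two-point recalibration of a single axis in Swift: rest the axis pointing straight up, then straight down, average many samples in each pose, and derive bias and scale. The function and the single-axis treatment are my own simplification, not ST's or Apple's procedure:

    // Hypothetical two-point recalibration of one axis. The bias is the
    // midpoint of the up/down means and the scale maps the half-span to g.
    let g = 9.80665                                   // m/s^2

    func axisCalibration(upMean: Double, downMean: Double)
            -> (bias: Double, scale: Double) {
        let bias  = (upMean + downMean) / 2           // ideally 0
        let scale = g / ((upMean - downMean) / 2)     // ideally 1
        return (bias, scale)
    }

    // A corrected reading is then (raw - bias) * scale.

With the question's Z reading of about 9.67 m/s² face-up, measuring face-down as well would let you separate a pure offset from a scale error.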

Related

Impact of motors on IMU

Currently, we are using a BNO055 in one of our projects. The IMU is placed next to a DC motor due to space constraints within the hardware setup. Because of motor vibrations, we are applying a low-pass filter to the quaternion values read by this (https://github.com/adafruit/Adafruit_BN ... awdata.ino) sketch, with a 5 Hz cut-off frequency. We have also placed the IMU on Sorbothane (a damping material) to minimize the vibrations. However, we are still seeing the error in the orientation.
What could be done to reduce the impact of motor vibrations on the IMU, both from a software and a hardware point of view? Any inputs are highly appreciated.
Motor vibration may not be the only problem here.
Orientation estimation can go wrong due to multiple factors, such as:
Bias due to incorrect calibration. Keep the sensor level and idle. Make sure the gyroscope reads close to (0, 0, 0) on average; the accelerometer should read either (0, 0, 9.81) or (0, 0, -9.81) m/s² depending on the sign convention. (A sketch of this check follows below.)
Bias due to temperature changes. Even a 10 °C change in PCB temperature can shift the gyro bias by 0.3 dps (according to the datasheet).
Motor noise. It seems you have already tried reducing this one.
If none of these help, you could try implementing your own Kalman or complementary filter based on the raw data from the gyro, accelerometer and magnetometer. That way you can be sure about the calibration process, the estimator gains, and how the estimator works.
If implementing a Kalman filter is difficult, you could try the AHRS filter block/algorithm given here:
https://in.mathworks.com/help/nav/ref/ahrsfilter-system-object.html
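
As a concrete version of the first check in the list, here is a small Swift sketch. The read closure stands in for whatever driver call your setup provides; everything here is a placeholder, not BNO055-specific code:

    // Average N samples while the sensor is level and idle, then compare
    // against (0, 0, 0) for the gyro and (0, 0, ±9.81) m/s^2 for the
    // accelerometer. The read() closure is a placeholder for your driver.
    func meanOfSamples(_ read: () -> (x: Double, y: Double, z: Double),
                       count: Int) -> (x: Double, y: Double, z: Double) {
        var sum = (x: 0.0, y: 0.0, z: 0.0)
        for _ in 0..<count {
            let s = read()
            sum.x += s.x; sum.y += s.y; sum.z += s.z
        }
        let n = Double(count)
        return (sum.x / n, sum.y / n, sum.z / n)
    }

    // e.g. (readGyro is hypothetical):
    // let gyroBias = meanOfSamples(readGyro, count: 2000)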

quadrotor accelerometer unstable

I am currently working on my quadrotor project. I am using an ADXL335 accelerometer and an L3G4200D gyroscope interfaced with an ATmega128. When I check readings from the accelerometer without running the motors, the values are accurate and stable. But when I start the motors, the values start to fluctuate, and the more I increase the speed, the more they fluctuate. I tried a Kalman filter; the results seem about the same, just with less fluctuation, but still not enough for a stable flight. My gyroscope readings also drift too much. Is this supposed to happen, or am I doing something wrong?
It is quite difficult to help you, as "fluctuations" can be caused by several things.
I just checked the datasheet of the ADXL335 accelerometer. Have you added the bandwidth-limiting capacitors on the outputs CX, CY and CZ? If not, they might help you reduce the fluctuations.
Another thing that might cause your fluctuations is interference from the motor cables coupling into the signal cables.
If you haven't used screened/shielded cables for the accelerometer, change them, and make sure you try to reduce whatever interference you might find.
You might find some hints on good EMC design here.
From your description, I would assume that the motors are causing the interference. The way I see it, it could be caused in one of two ways:
The PCB was custom designed, houses both the power electronics and the sensitive measurement units, and not enough care was taken to isolate the sensitive parts from the interference generated on the board.
The magnetic field from the motors is causing the fluctuations. This could be because the IMU is too close to the motors, improperly isolated, or improperly positioned. Try to avoid installing the IMU in the same plane as the motors. We currently have our IMU placed about 20 cm above the center of gravity of our drone. I cannot confirm that proximity caused the accelerometer to fluctuate, but it did have an enormous influence on the compass when it was placed only 10 cm above the center of gravity.

CMDeviceMotion userAcceleration drift

I'm getting the acceleration data using -[CMDeviceMotion userAcceleration].
I've noticed one interesting thing: I always get a small bias on the Z axis. It is about 0.0155 g (with a variance of 0.002), while on the other axes the average values are near 0.
I'm testing this with an iPod Touch 4G (and it is just lying on the table during testing). The question is: where does this bias come from, and is it device specific?
I noticed similar values, although Core Motion tries to eliminate bias. If you rotate your device so that x (or y) is parallel to gravity, you will probably see the bias in the x direction. Using raw sensor data showed the same tendency, but with larger values and some more superimposed effects like temperature dependency, time-based shifting, and so on.
18 months ago I read a specification of the iPhone 3 devices' accelerometers, and according to it the accuracy was about 1.8% of g (what a pity the bookmark to the STM product page I set now leads to a 404).
Basically, this should not be a problem as long as you don't try to estimate exact positions (displacements), which seems to be impossible with acceptable accuracy anyway; see the several discussions here on SO.
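
For anyone who wants to reproduce the measurement, here is a minimal Core Motion sketch along the lines of what the question describes; the 100 Hz rate and the ~10 s window are arbitrary choices:

    import CoreMotion

    // Leave the device flat and average a few seconds of userAcceleration
    // samples per axis. userAcceleration is reported in units of g.
    let manager = CMMotionManager()
    var sum = (x: 0.0, y: 0.0, z: 0.0)
    var count = 0

    manager.deviceMotionUpdateInterval = 1.0 / 100.0
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.userAcceleration else { return }
        sum.x += a.x; sum.y += a.y; sum.z += a.z
        count += 1
        if count == 1000 {                        // ~10 s at 100 Hz
            manager.stopDeviceMotionUpdates()
            let n = Double(count)
            print("mean bias (g): \(sum.x / n), \(sum.y / n), \(sum.z / n)")
        }
    }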

Accelerometer to relative position

Before I reinvent the wheel, I wanted to see if anyone can share code or tips for the following:
In order to get the relative position of the iPhone, one needs to
Set the accelerometer read rate
Noise-filter the accelerometer response
Convert it to a vector
Low-pass filter the vector to find gravity
Subtract gravity from the raw reading to find the user-caused acceleration
Filter the user-caused acceleration to get the frequencies you are interested in (probably band-pass, depending on the application)
Integrate to find relative speed
Integrate to find position (a rough sketch of these steps follows the list)
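
A bare-bones, one-axis Swift sketch of steps 4, 5, 7 and 8. The smoothing constant and the simple Euler integration are placeholder choices, and, as discussed in the answers, the double integration drifts within seconds in practice:

    // One-axis sketch: low-pass for gravity, subtract, integrate twice.
    final class PositionEstimator {
        private var gravity  = 0.0     // low-pass state         (step 4)
        private var velocity = 0.0     // first integral         (step 7)
        private var position = 0.0     // second integral        (step 8)
        private let alpha    = 0.1     // low-pass smoothing factor

        func process(rawAccel: Double, dt: Double) -> Double {
            gravity = alpha * rawAccel + (1 - alpha) * gravity   // step 4
            let userAccel = rawAccel - gravity                   // step 5
            velocity += userAccel * dt                           // step 7
            position += velocity * dt                            // step 8
            return position
        }
    }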
So what I'm hoping is that people have already written some or all of the above and can provide tips, or better yet code.
A few questions I haven't found the answer to:
What is the frequency response of the iPhone accelerometer? What hardware filters exist between the accelerometer and the analog to digital converter?
What is the fastest rate at which the accelerometer delegate can be called without receiving duplicate readings?
Differences in the above for the various phones?
Any good tips for designing the filters, such as cutoff frequency for separating gravity and user motion?
Any code or tips for the integration steps? Any reason to integrate in the Cartesian coordinate system rather than as a vector, or vice versa?
Any other experiences, tips, or information that one should know prior to implementing this?
As I find information out, I'll be collecting it in this answer.
Hardware
The 3GS uses an ST LIS331DL 3-axis ±2g/±8g digital accelerometer.
The iPhone 4 and iPad use an ST LIS331DLH 3-axis ±2g/±4g/±8g digital accelerometer.
They are both capable of being read at 100Hz and 400Hz, although on the iPhone 3G (under iOS 4.1) the accelerometer delegate is not called more frequently than 100Hz even if setUpdateInterval is set for faster updates. I do not know if the API permits faster updates on the iPhone 4, and Apple's documentation merely states that the maximum is determined by the hardware of the iPhone. (TBD)
The A/D converter is on the same silicon as the MEM sensor, which is good for noise immunity.
The DL version is 8 bits (3GS) while the DLH version is 12 bits (iPhone 4). The maximum bias (offset) in the DL version is twice the bias of the DLH (0.04g vs 0.02g) version.
The data sheet for the DLH reports acceleration noise density, but that value is not reported on the DL datasheet. Noise density is reasonably low at 218 μg/√Hz for the DLH.
Both sensors give either 100Hz sampling or 400Hz sampling speeds, with no custom rate. The sensor discards values if the iPhone doesn't read the output register at the set sampling rate.
The "typical" full scale value for the DL sensor is ±2.3g, but ST only guarantees that it's at least ±2g.
Temperature effects on the sensor are present and measurable, but not very significant.
TBD:
Is the hardware filter turned on, and what are the filtering characteristics?
How noisy is the power supply to the accelerometer? (Anybody just happen to have the iPhone schematic laying around?)
The accelerometer uses an internal clock to provide timing for the sample rate and A/D conversion. The datasheet does not indicate the accuracy, precision, or temperature sensitivity of this clock. For accurate time analysis the iPhone must use an interrupt to sense when a sample is done and record the time in the interrupt. (whether this is done or not is unknown, but it's the only way to get accurate timing information)
API
Requesting lower than 100Hz sampling rates results in getting selected samples, while discarding the rest. If a sampling rate that is not a factor of 100Hz is requested in software, the time intervals between real sensor readings cannot be even. Apple does not guarantee even sampling rates even if a factor of 100 is used.
It appears that the API provides no software filtering.
The API does scale the raw accelerometer value into a double representing Gs. The scaling factor used is unknown; whether it differs per phone (i.e., is calibrated), and whether calibration occurs on an ongoing basis to account for sensor drift, is also unknown. Online reports seem to suggest that the iPhone does re-calibrate itself on occasion when it is lying flat on a surface.
Results from simple testing suggest that the API sets the sensor to ±2g for the 3GS, which is generally fine for handheld movements.
TBD:
Does Apple calibrate each unit so that the UIAccelerometer reports 1G as 1G? Apple's documentation specifically warns against using the device for sensitive measurement applications.
Does the reported NSTimeInterval represent when the values were read from the accelerometer, or when the accelerometer interrupt indicated that new values were ready?
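
One way to attack the timing questions empirically: request 100 Hz raw updates and log the deltas between consecutive CMAccelerometerData timestamps. This is a sketch that measures observed behaviour; it cannot tell where in the pipeline the timestamp is taken:

    import CoreMotion

    // Log the interval between consecutive raw accelerometer samples to
    // see how even the real sampling intervals are at a requested 100 Hz.
    let manager = CMMotionManager()
    var lastTimestamp: TimeInterval?

    manager.accelerometerUpdateInterval = 1.0 / 100.0
    manager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let data = data else { return }
        if let last = lastTimestamp {
            print(String(format: "dt = %.4f s", data.timestamp - last))
        }
        lastTimestamp = data.timestamp
    }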
I'm just dealing with the same problem. The only difference from your approach is that I don't want to rely on the low-pass filter to find gravity. (TBH, I don't see how I can reliably tell the gravity vector from the accelerometer readings alone.)
I am trying it with the gyros right now.
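
Worth noting: on iOS 4 and later, CMDeviceMotion already exposes a fused gravity estimate (gyro plus accelerometer), so you don't have to low-pass filter yourself. A minimal Swift sketch:

    import CoreMotion

    // CMDeviceMotion separates the measured acceleration into gravity and
    // userAcceleration (both in g) via its own sensor fusion.
    let manager = CMMotionManager()
    manager.deviceMotionUpdateInterval = 1.0 / 100.0
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        print("gravity (g): \(m.gravity.x), \(m.gravity.y), \(m.gravity.z)")
        print("user (g): \(m.userAcceleration.x), \(m.userAcceleration.y), \(m.userAcceleration.z)")
    }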

Core Location and speed measurements

Does anyone know if Core Location in the iPhone OS uses anything but simple vector math to calculate speed? I've read that the GPS system can provide speed measurements that are accurate even when the position is not (I believe using the Doppler shift of the signals).
I've tried and failed to determine whether the iPhone does this. The question is basically: does this value carry its own information, or is it just a convenience function computed from (filtered?) location data?
I suppose my question is whether anyone has tried this in reality, or knows more than what is in the documentation.
The Core Location documentation describes the speed reading thus:
This value reflects the instantaneous speed of the device in the direction of its current heading.
While not absolutely definitive, this strongly suggests that the reading is direct, rather than an interpolation of positions, which cannot be described as "instantaneous" by any reasonable definition.
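
For completeness, here is how the value under discussion is read. CLLocation.speed is in m/s and is negative when invalid; whether it is Doppler-derived or position-derived is exactly what this thread debates:

    import CoreLocation

    // Minimal speed logger. A real app must also request location
    // authorization before starting updates.
    final class SpeedLogger: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        override init() {
            super.init()
            manager.delegate = self
            manager.desiredAccuracy = kCLLocationAccuracyBest
            manager.startUpdatingLocation()
        }

        func locationManager(_ manager: CLLocationManager,
                             didUpdateLocations locations: [CLLocation]) {
            guard let loc = locations.last, loc.speed >= 0 else { return }
            print("speed: \(loc.speed) m/s")
        }
    }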
The GPS system in itself is not able to provide speed measurements. The only practical way this can be done is by comparing discrete position measurements and the time between them; it is then just a matter of applying simple math to get the speed and direction of travel. More samples can be used to get a more accurate measurement.
It is not feasible for simple GPS receivers to measure speed directly, e.g. by use of the Doppler shift. This is because each satellite is itself traveling at very high speed around the globe: each satellite orbits the globe twice every day, resulting in a speed of almost 14,000 km/h. Since the direction of the satellite relative to the GPS unit varies depending on where it is in the sky, the satellite's contribution to the measured Doppler shift would be huge compared to the Doppler shift resulting from the movement of the GPS receiver itself.
I'm not saying this couldn't be done with very sophisticated hardware and algorithms, but the cost/benefit would probably not make it worth considering.