IMU/MEMS with high-limit accelerometer and magnetometer

Does anyone know of an IMU sensor in the $5-17 price range that has a high accelerometer range of 50-200 g and a magnetometer to measure orientation relative to the Earth's magnetic field?
Update rates on both the magnetometer and accelerometer need to be 100 Hz or higher.
I know I can get a separate sensor for each, but I would like to have them combined to save space.
I'm using it to measure high RPM and high-g crashes.
I'm also designing the PCB myself, so just a datasheet of the IC is fine.

Related

How do I calculate the maximum spatio-temporal resolution of an IMU sensor?

I am trying to calculate the trajectory of an IMU sensor by integrating the acceleration twice, but I keep getting distorted data. I would like to know what the maximal resolution is that I can actually produce from this sensor (MPU9250).
The sensor is operating at a sampling rate of 200 Hz.
The LSB is 0.6 mm/s^2.
The noise covariance matrix of the sensor has the diagonal elements [19.7, 19.7, 52.2] mm/s^2.
Assuming that I know the exact orientation of the sensor (using a camera/marker-based system for the purpose of debugging), what is the actual highest spatio-temporal resolution I can get with this sensor?
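For intuition about the distortion, here is a minimal Python/NumPy sketch (the bias and noise figures are lifted from the numbers above and are only illustrative; the covariance diagonal is read here as a per-axis standard deviation): a constant residual bias b alone grows into a 0.5*b*t^2 position error, and white noise integrates into a velocity random walk.

```python
import numpy as np

# Double-integrating a zero-motion accelerometer signal: any residual bias and
# noise become position error. Figures loosely follow the question (200 Hz,
# LSB 0.6 mm/s^2, ~19.7 mm/s^2 noise); treat them as illustrative only.
FS = 200.0
DT = 1.0 / FS
t = np.arange(0.0, 10.0, DT)

bias = 0.6e-3          # residual bias of one LSB [m/s^2]
noise_std = 19.7e-3    # accelerometer noise standard deviation [m/s^2]

rng = np.random.default_rng(0)
accel_err = bias + rng.normal(0.0, noise_std, t.size)   # true acceleration is zero

vel = np.cumsum(accel_err) * DT    # first integration  -> velocity error
pos = np.cumsum(vel) * DT          # second integration -> position error

print(f"position error after {t[-1]:.0f} s: {pos[-1] * 1e3:.1f} mm")
print(f"bias alone contributes 0.5*b*t^2 = {0.5 * bias * t[-1]**2 * 1e3:.1f} mm")
```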

How can I get an accurate relative yaw angle from a 6DOF IMU like the LSM6DS3 with minimal gyro drift?

I currently have a platform with a 6DOF IMU (the LSM6DS3) and no magnetometer. I want to get as accurate a relative yaw angle reading as possible, avoiding or minimizing any gyro drift. I tried a simple approach:
a) Calculate the gyro angular-rate zero-offset by averaging the gyro angular-rate readings for a few seconds while my platform is known to be stationary.
b) Read the angular rate whenever the LSM6DS3 reports a new reading is available (so this should essentially be the output data rate I configured, i.e. 416 Hz). I subtract the zero-offset calculated in (a) from this reading to get the angular rate in degrees/s, and then integrate (multiply the time delta by the degrees/s angular rate and add to the current yaw angle).
Is this the best I can do to avoid gyro drift if I want as accurate a relative yaw reading as possible?
I looked a little at Kalman filters, Madgwick filters, and Mahony filters, but they don't seem to offer a way to improve yaw angle readings, since it seems the accelerometer is not useful for calculating the yaw angle. Is that correct?
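For reference, here is a minimal Python sketch of the (a)/(b) approach, with the LSM6DS3 replaced by a simulated stationary gyro (the 1.3 dps offset and the noise level are invented), to show how much drift remains after zero-offset removal:

```python
import random

def read_gyro_z():
    """Stand-in for an LSM6DS3 gyro Z read: true rate 0 dps plus offset and noise."""
    return 1.3 + random.gauss(0.0, 0.07)   # hypothetical zero-rate offset and noise

DT = 1.0 / 416.0                           # configured output data rate of 416 Hz

# (a) estimate the zero-rate offset while the platform is known to be stationary
N_CAL = 2080                               # ~5 s of samples at 416 Hz
bias = sum(read_gyro_z() for _ in range(N_CAL)) / N_CAL

# (b) integrate the bias-corrected rate into a relative yaw angle
yaw_deg = 0.0
for _ in range(416 * 60):                  # 60 s of data, platform still stationary
    yaw_deg += (read_gyro_z() - bias) * DT

print(f"offset estimate: {bias:.3f} dps, residual yaw drift after 60 s: {yaw_deg:.3f} deg")
```

The residual drift comes from the offset estimate's own error (noise averaged over a finite window) plus any offset change with temperature, which is why re-zeroing whenever the platform is known to be stationary helps.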

Impact of motors on IMU

Currently, we are using a BNO055 in one of our projects. The IMU is placed next to a DC motor due to space constraints within the hardware setup. Because of motor vibrations, we are applying a low-pass filter to the quaternion values read by this (https://github.com/adafruit/Adafruit_BN ... awdata.ino) script, with a cut-off frequency of 5 Hz. We have also placed the IMU on Sorbothane (a damping material) to minimize the vibrations. However, we are still seeing the error in the orientation.
What can be done to reduce the impact of motor vibrations on the IMU, from both a software and a hardware point of view? Any input is highly appreciated.
Motor vibration may not be the only problem here.
Orientation estimation can go wrong due to multiple factors:
Bias due to incorrect calibration. Keep the sensor level and idle. Make sure the gyroscope reads close to (0, 0, 0) on average. The accelerometer should read either (0, 0, 9.81) or (0, 0, -9.81) m/s^2, depending on the convention.
Bias due to temperature changes. Even a 10 degree change in PCB temperature can change the gyro bias by 0.3 dps (according to the datasheet).
Motor noise. It seems you have already tried reducing this one.
If none of these help, you could try implementing your own Kalman or complementary filter based on the raw data from the gyro, accelerometer, and magnetometer. This way you can be sure about the calibration process, the estimator gains, and how the estimator works.
If implementing a Kalman filter is difficult, you could try the AHRS filter block/algorithm given here:
https://in.mathworks.com/help/nav/ref/ahrsfilter-system-object.html
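For example, a complementary filter for roll/pitch is only a few lines. Below is a minimal Python sketch (simulated readings, hand-picked gain, not the BNO055's internal fusion) showing how the accelerometer term bounds the drift a gyro bias would otherwise cause:

```python
import math

ALPHA = 0.98   # weight on the gyro path; 1 - ALPHA pulls toward the accelerometer
DT = 0.01      # 100 Hz sample period [s]

def complementary_roll(roll, gyro_x_dps, ay, az):
    """One filter step: integrate the gyro, then blend in the accel roll estimate."""
    roll_gyro = roll + gyro_x_dps * DT                 # fast, but drifts
    roll_accel = math.degrees(math.atan2(ay, az))      # noisy, but drift-free
    return ALPHA * roll_gyro + (1.0 - ALPHA) * roll_accel

# Simulated stationary sensor: the gyro has a 0.3 dps bias, the accel sees gravity on Z.
roll = 0.0
for _ in range(1000):                                  # 10 s at 100 Hz
    roll = complementary_roll(roll, gyro_x_dps=0.3, ay=0.0, az=9.81)
print(f"roll after 10 s with a 0.3 dps gyro bias: {roll:.2f} deg")
```

With pure integration that bias would accumulate to 3 degrees over 10 s; the accelerometer term caps it at roughly ALPHA / (1 - ALPHA) * bias * DT, about 0.15 degrees here.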

Is the iPhone accelerometer calibrated? Gravity measurement changes depending on orientation

I'm doing some tests with the iPhone 4S accelerometer. If I take the raw data on the Z-axis (phone resting flat on a desk), I get an acceleration of 9.65-9.70 m/s2 (after g conversion by 9.8261).
But if the phone is resting on its edge, the accelerometer value measured on the X-axis is quite different, approx. 9.80-9.85 m/s2 (after the same g conversion).
My question is: if gravity is the same, why this difference? Is it not calibrated?
On the other hand, I checked the magnitude of the acceleration vector in both situations, and the difference is the same.
Thanks.
I don't know what kind of answer you expect, but you should be more precise when you talk about calibration.
Of course the g-sensors are calibrated, and, as always, every calibration comes with an error. In your case the error is under 1%.
So if you want an answer:
Yes, the iPhone accelerometer is calibrated and, in your case, has an error under 1%. If you collected measurements from hundreds of other users, you could calculate the mean error of the device (I guess it's about 1%, though).
The problem is that it's not possible to determine gravity 100% exactly, because all of the sensors (gyro and compass as well) exhibit an intrinsic error. The lack of a precise external reference system leads to this error. The accelerometer and gyroscope are corrected mutually, and if there is a slight drift, it affects the direction in which the sensor fusion algorithm (a Kalman filter or similar) estimates gravity to be.
While the gyroscope is very fast at detecting orientation changes, it tends to drift. Accelerometers are slower to react but provide a way to detect gravity. Magnetometers are even slower but can contribute to stabilizing the overall result. Combine Gyroscope and Accelerometer Data shows some graphs of the raw and the processed sensor data.
I continued working with accelerometers, and the results are not bad. About calibration of iPhone accelerometers: STMicroelectronics calibrates its own sensors, and the iPhone factory later assembles the accelerometer onto the circuit board. The soldering affects accelerometer accuracy (thermal effects), so the accelerometer probably requires a new calibration. For consumer requirements the accuracy is already good, but if you have high-accuracy requirements, you need a new calibration.
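If a new calibration is needed, a common approach is a six-position calibration: rest the device on each of its six faces, record the steady readings, and solve for a per-axis bias and scale. A rough Python sketch with invented readings:

```python
# Six-position accelerometer calibration sketch: for each axis, record the
# steady reading with that axis pointing up and then down. The bias is the
# midpoint of the two readings; the scale maps the half-span to 1 g.
# The readings below are invented for illustration.
G = 9.80665                      # standard gravity [m/s^2]

readings = {                     # measured [m/s^2] with axis up / axis down
    "x": (9.83, -9.79),
    "y": (9.85, -9.81),
    "z": (9.68, -9.66),
}

for axis, (up, down) in readings.items():
    bias = (up + down) / 2.0             # offset common to both orientations
    scale = (up - down) / (2.0 * G)      # sensitivity relative to true 1 g
    print(f"{axis}: bias = {bias:+.3f} m/s^2, scale = {scale:.4f}")
```

A corrected reading is then (raw - bias) / scale per axis; this ignores cross-axis misalignment, which would need a full 3x3 calibration matrix.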

Accelerometer to relative position

Before I reinvent the wheel I wanted to see if anyone can share code or tips for the following:
In order to get the relative position of the iPhone, one needs to (see the sketch after this list):
Set the accelerometer read rate
Noise filter the accelerometer response
Convert it to a vector
Low pass filter the vector to find gravity
Subtract gravity from the raw reading to find the user-caused acceleration
Filter the user-caused acceleration to get the frequencies you are interested in (probably band-pass, depending on the application)
Integrate to find relative speed
Integrate to find position
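As a rough illustration of the last few steps above (low-pass for the gravity estimate, subtract, then integrate twice), here is a Python/NumPy sketch with simulated data and placeholder rate/cutoff values; it also shows how fast the result drifts even when the device is stationary:

```python
import numpy as np

# Low-pass the raw reading to estimate gravity, subtract it, integrate twice.
# The phone is simulated as stationary, so any position output is pure error.
FS = 100.0                       # accelerometer read rate [Hz]
DT = 1.0 / FS
FC = 0.5                         # low-pass cutoff for the gravity estimate [Hz]
alpha = DT / (DT + 1.0 / (2.0 * np.pi * FC))   # first-order IIR coefficient

rng = np.random.default_rng(1)
t = np.arange(0.0, 5.0, DT)
raw = 9.81 + rng.normal(0.0, 0.02, t.size)     # gravity + sensor noise, no motion

gravity = np.empty_like(raw)                   # low-pass -> slowly varying gravity
gravity[0] = raw[0]
for i in range(1, raw.size):
    gravity[i] = gravity[i - 1] + alpha * (raw[i] - gravity[i - 1])

user = raw - gravity                           # user-caused acceleration
vel = np.cumsum(user) * DT                     # integrate once  -> velocity
pos = np.cumsum(vel) * DT                      # integrate twice -> position

print(f"spurious position after {t[-1]:.0f} s stationary: {pos[-1] * 100:.2f} cm")
```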
So what I'm hoping is that people have already written some or all of the above and can provide tips, or better yet code.
A few questions I haven't found the answer to:
What is the frequency response of the iPhone accelerometer? What hardware filters exist between the accelerometer and the analog to digital converter?
What is the fastest rate at which the accelerometer delegate can be called without duplicated reading values?
Differences in the above for the various phones?
Any good tips for designing the filters, such as cutoff frequency for separating gravity and user motion?
Any code or tips for the integration steps? Any reason to integrate in the Cartesian coordinate system rather than as a vector, or vice versa?
Any other experiences, tips, or information that one should know prior to implementing this?
As I find information out, I'll be collecting it in this answer.
Hardware
The 3GS uses an ST LIS331DL 3-axis ±2g/±8g digital accelerometer.
The iPhone 4 and iPad use an ST LIS331DLH 3-axis ±2g/±4g/±8g digital accelerometer.
They are both capable of being read at 100Hz and 400Hz, although on the iPhone 3G (under iOS 4.1) the accelerometer delegate is not called more frequently than 100Hz even if setUpdateInterval is set for faster updates. I do not know if the API permits faster updates on the iPhone 4, and Apple's documentation merely states that the maximum is determined by the hardware of the iPhone. (TBD)
The A/D converter is on the same silicon as the MEMS sensor, which is good for noise immunity.
The DL version is 8 bits (3GS) while the DLH version is 12 bits (iPhone 4). The maximum bias (offset) in the DL version is twice the bias of the DLH (0.04g vs 0.02g) version.
The data sheet for the DLH reports acceleration noise density, but that value is not reported on the DL datasheet. Noise density is reasonably low at 218 μg/√Hz for the DLH.
Both sensors give either 100Hz sampling or 400Hz sampling speeds, with no custom rate. The sensor discards values if the iPhone doesn't read the output register at the set sampling rate.
The "typical" full scale value for the DL sensor is ±2.3g, but ST only guarantees that it's at least ±2g.
Temperature effects on the sensor are present and measurable, but not very significant.
TBD:
Is the hardware filter turned on, and what are the filtering characteristics?
How noisy is the power supply to the accelerometer? (Anybody just happen to have the iPhone schematic laying around?)
The accelerometer uses an internal clock to provide timing for the sample rate and A/D conversion. The datasheet does not indicate the accuracy, precision, or temperature sensitivity of this clock. For accurate time analysis the iPhone must use an interrupt to sense when a sample is done and record the time in the interrupt. (whether this is done or not is unknown, but it's the only way to get accurate timing information)
API
Requesting lower than 100Hz sampling rates results in getting selected samples, while discarding the rest. If a sampling rate that is not a factor of 100Hz is requested in software, the time intervals between real sensor readings cannot be even. Apple does not guarantee even sampling rates even if a factor of 100 is used.
It appears that the API provides no software filtering.
The API does scale the raw accelerometer value into a double representing Gs. The scaling factor used is unknown, as is whether it differs per phone (i.e., calibrated) and whether the calibration occurs on an ongoing basis to account for sensor drift. Online reports seem to suggest that the iPhone does re-calibrate itself on occasion when it's lying flat on a surface.
Results from simple testing suggest that the API sets the sensor to ±2g for the 3GS, which is generally fine for handheld movements.
TBD:
Does Apple calibrate each unit so that the UIAccelerometer reports 1G as 1G? Apple's documentation specifically warns against using the device for sensitive measurement applications.
Does the reported NSTimeInterval represent when the values were read from the accelerometer, or when the accelerometer interrupt indicated that new values were ready?
I'm just dealing with the same problem. The only difference from your approach is that I don't want to rely on the low-pass filter to find gravity. (TBH, I don't see how I can reliably tell the gravity vector apart from the accelerometer readings.)
I am trying it with the gyros right now.
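One way to do that with the gyros is to initialize the gravity vector from the accelerometer while stationary, then rotate it by the measured angular rate at each step, dg/dt = -omega x g (gravity turns opposite to the sensor's own rotation). A minimal sketch, assuming calibrated gyro data in rad/s:

```python
import numpy as np

DT = 0.01                          # 100 Hz sample period [s]
g = np.array([0.0, 0.0, 9.81])     # initial gravity from a stationary accel read

def propagate(g, omega):
    """One Euler step: rotate g opposite to the sensor rotation omega [rad/s]."""
    return g - DT * np.cross(omega, g)

# Simulate a constant 90 dps roll about X for one second:
omega = np.array([np.radians(90.0), 0.0, 0.0])
for _ in range(100):
    g = propagate(g, omega)

print(f"gravity in the sensor frame after 1 s: {np.round(g, 2)}")
# Expect roughly (0, 9.81, 0): gravity rotated 90 deg from +Z toward +Y.
# Plain Euler integration slowly inflates |g|, so renormalize to 9.81 in
# practice; gyro bias still drifts this estimate, so it needs occasional
# accelerometer corrections during quiet periods.
```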