MPU 6050 DMP Output Values Units Issue - gyroscope

I'm running into an odd issue with a motion processor (MPU 6050 / 9150) that returns raw gyroscope values between ±32768, which, with the way this gyro is configured, corresponds to ±2000 deg/sec. This makes total sense, as it matches the documentation.
However, when I process data from the digital motion processor (6-axis sensor fusion), I get a maximum of 1429. I don't think there's an error reading the data from the MPU buffer, as the other measurements look good (they come from the same FIFO buffer). I assume this is likely some units issue that has to do with the 6-axis sensor fusion, and probably involves radians.
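For reference, this is the raw-count-to-deg/sec conversion I am assuming for the ±2000 deg/sec full-scale setting (a Swift sketch just to show the arithmetic; the DMP's own FIFO packet format may well differ, which is the point of the question):

    import Foundation

    // Scale a raw 16-bit gyro sample to deg/sec, assuming the ±2000 deg/sec
    // full-scale range from the question. This is only the plain register
    // scaling; it says nothing about how the DMP formats its own output.
    func rawGyroToDegPerSec(_ raw: Int16) -> Double {
        return Double(raw) * (2000.0 / 32768.0)
    }

    print(rawGyroToDegPerSec(32767))  // ~1999.9 deg/sec at full scale
    print(rawGyroToDegPerSec(1429))   // ~87.2 deg/sec, if the DMP used the same scale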
Wondering if any of you bright people with your light bulb brains have some insight here?
Thanks!

Related

How to measure change in altitude at high frequency with Apple Watch

I am trying to measure changes in altitude with an Apple Watch during a sport activity (kite surfing). Currently my app is just collecting data for analysis. I am recording barometric and GPS altitude for comparison, at a frequency of 10 measurements per second. Basically it works and data is recorded, but the data seems to be just worthless. In both measurements there are sudden jumps in the dataset of up to ±10 m, and spikes in the GPS readings of up to 75 m. Does anyone have an idea how to get somewhat accurate readings? I basically do not care about absolute altitude; I am just interested in the change of altitude.
Use startRelativeAltitudeUpdates(to:withHandler:) and, when you're done, remember to call stopRelativeAltitudeUpdates(). The CMAltimeter documentation covers the details.
Also, you can ignore anomalies. For example, if the maximum possible change in altitude in 100 milliseconds is 2 meters (72 km/h), then whenever you see a change of more than 2 meters in 100 milliseconds, just ignore that reading and wait for the next one.
Remember, when you ignore a reading, to account for the time difference when judging the next one.
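Here is a rough Swift sketch of that idea, assuming a maximum plausible vertical speed of 20 m/s (2 m per 100 ms, as in the example above); the class name and the speed limit are made up for illustration:

    import CoreMotion
    import Foundation

    final class AltitudeTracker {
        private let altimeter = CMAltimeter()
        // Assumed physical limit: anything faster than this is treated as a spike.
        private let maxVerticalSpeed = 20.0  // meters per second
        private var lastAccepted: (altitude: Double, timestamp: TimeInterval)?

        func start() {
            guard CMAltimeter.isRelativeAltitudeAvailable() else { return }
            altimeter.startRelativeAltitudeUpdates(to: .main) { [weak self] data, error in
                guard let self = self, let data = data, error == nil else { return }
                let altitude = data.relativeAltitude.doubleValue  // meters since start()
                let now = data.timestamp

                if let last = self.lastAccepted {
                    let dt = now - last.timestamp
                    // Spike filter: if the implied vertical speed is impossible,
                    // drop this reading but keep the old baseline, so the time
                    // difference is accounted for on the next sample.
                    if dt > 0, abs(altitude - last.altitude) / dt > self.maxVerticalSpeed {
                        return
                    }
                }
                self.lastAccepted = (altitude, now)
                // Record `altitude` here for later analysis.
            }
        }

        func stop() {
            altimeter.stopRelativeAltitudeUpdates()
        }
    }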

Why did my temperature sensors suddenly become unstable?

I have four DS18B20 temperature sensors connected to my Raspberry Pi. I use 1-Wire and a pull-up resistor.
I read the values directly via cat from the 1-Wire devices and put the values, without any calculation, into a gnuplot data file.
The setup has been running fine for weeks now, measuring a coolbox at different places in a range between about 0°C and 30°C. I got quite accurate readings and plots.
But suddenly the values of all sensors started to "flutter"; they became unstable. They also dropped, all four of them, by about a quarter of a °C. The flutter is about 0.1°C to 0.2°C. Two of the sensors are actually submerged in liquid (0.5 l and 25 l), so it is virtually impossible that they really drop or flutter like that.
The time of the event coincides with me checking the cool box. I might have shifted or touched some of the sensor parts. But can that explain the temperature shift? What happened? How can I fix it?
It seems that the cause of the problem could be the resolution being lowered. This is a (volatile) setting stored in the sensor itself. It can be set to 9, 10, 11 or 12 bits. The higher the resolution, the more precise the measurements will be, but at the cost of a longer measurement time.
According to the DS18B20 datasheet, the resolution is set to 12 bits by default after power-up. In addition, the drivers handling 1-Wire communication with the sensor often also set the highest possible resolution by default during startup. This could explain why rebooting fixed the problem in the OP's case, but doesn't explain why the change in resolution happened in the first place. That likely depends on the specific setup and will have to be resolved on a case-by-case basis.
Furthermore, to confirm that the measurements are indeed being taken at a lower resolution, you can take the numeric values of the samples and check the minimum step by which the measurements change. For example, at 12-bit resolution the minimum delta is 0.0625 degrees, while at 9-bit resolution a value can only change by 0.5 degrees and nothing in between.
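A minimal Swift sketch of that delta check, assuming the readings have already been parsed from the w1_slave output into °C (the sample values here are made up):

    import Foundation

    // Smallest non-zero step between consecutive readings. At 12-bit
    // resolution this should be around 0.0625 °C; steps of 0.5 °C suggest
    // the sensor has fallen back to 9-bit mode.
    func smallestStep(in readings: [Double]) -> Double? {
        let deltas = zip(readings.dropFirst(), readings)
            .map { pair in abs(pair.0 - pair.1) }
            .filter { $0 > 0 }
        return deltas.min()
    }

    let samples = [21.5, 21.5, 22.0, 21.5, 21.0]  // hypothetical readings in °C
    if let step = smallestStep(in: samples) {
        print("smallest observed step: \(step) °C")  // 0.5 here, hinting at 9-bit mode
    }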

How do I detect whether an ultrasonic range sensor is out of range?

I am using a JSN-SRF04T ranging sensor (25 cm to 5 m range). I want to know when it goes out of range (under 25 cm).
The problem is that when the target is closer than 25 cm, the sensor output sometimes jumps to 90 to 95 cm or 100 to 120 cm, which makes it impossible to tell whether it is really out of range or not!
Is there any solution?
This question is not directly related, but I thought I'd post a suggestion/answer anyway.
SRF04s can detect distances as small as 3 cm. Please measure the width of the output echo pulse using an oscilloscope. It can be from 100 µs to 18 ms, and if there is no object within range, the echo pulse is 36 ms.
If the pulse width measured on the oscilloscope agrees with what you say, then presumably the SRF04 is faulty, or there is a problem with its mounting, etc.
If the width of the pulse is measured in µs, then dividing by 58 will give you the distance in cm, and dividing by 148 will give the distance in inches.
The SRF sensors can be triggered as fast as every 50 ms, or 20 times each second. You should wait 50 ms before the next trigger to ensure the ultrasonic "beep" has faded away and will not cause a false echo on the next ranging.
Otherwise, check your timer configuration. Ensure that it can measure a pulse on the order of hundreds of microseconds with a resolution of at least tens of microseconds.
If you are using this, then perhaps you are at the lowest possible level.
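If it helps, here is a small Swift sketch of the conversion above, treating an echo near 36 ms as "no object in range" and anything below the sensor's rated 25 cm minimum as out of range (the 35,000 µs threshold is an assumption for illustration, not a measured figure):

    // Interpret a measured echo pulse width from the JSN-SRF04T in the question.
    enum RangeReading {
        case outOfRange             // no echo, or below the rated 25 cm minimum
        case distance(cm: Double)
    }

    func interpretEcho(pulseWidthMicroseconds us: Double) -> RangeReading {
        if us >= 35_000 { return .outOfRange }  // ~36 ms means no object in range
        let cm = us / 58.0                      // µs / 58 = distance in cm
        if cm < 25.0 { return .outOfRange }     // below the sensor's rated minimum
        return .distance(cm: cm)
    }

    print(interpretEcho(pulseWidthMicroseconds: 1_740))   // distance(cm: 30.0)
    print(interpretEcho(pulseWidthMicroseconds: 36_000))  // outOfRange (no echo)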

Why does MATLAB change the sample rate while trying to acquire data?

I am using a DataQ acquisition device in 32-bit MATLAB with the Data Acquisition Toolbox.
On occasion, when I have my sample rate set to 300, it tells me:
Warning: This hardware could not support the requested value of 300
for SampleRate. SampleRate has been set to 1000
However, if I set SampleRate to 1000, it sometimes sets it back to 300 with the same error message.
Also, if I have the program return SampleRate after the warning has been displayed and the device has started recording, it is always whatever I set it to, not what the warning claims it was changed to.
Anyone have any idea how I can find out what the actual sample rate was, or how to keep it from resetting mine? I need to know how many samples there are per second for further calculations.
The problem is not with MATLAB but with the DAQ. I have a similar "problem" with an NI DAQ. The hardware is set to sample at a very high rate to avoid aliasing. You could sample at a higher rate than you need and then use the MATLAB command resample to reduce the sampling rate. resample will avoid any aliasing of higher frequencies.
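For illustration only, here is a crude Swift sketch of the oversample-then-decimate idea, using a boxcar average over each block as the low-pass step before dropping samples; MATLAB's resample applies a proper polyphase FIR filter, so this is only a rough stand-in:

    // Reduce the sample rate by an integer factor. Averaging each block acts
    // as a crude anti-alias filter before the decimation step.
    func decimate(_ samples: [Double], by factor: Int) -> [Double] {
        guard factor > 1 else { return samples }
        var out: [Double] = []
        out.reserveCapacity(samples.count / factor)
        var i = 0
        while i + factor <= samples.count {
            let block = samples[i..<(i + factor)]
            out.append(block.reduce(0, +) / Double(factor))
            i += factor
        }
        return out
    }

    // e.g. data captured at 3000 Hz reduced to 300 Hz:
    // let at300Hz = decimate(at3000Hz, by: 10)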

How do I measure sound pressure using the iPhone internal microphone?

I want to measure the loudness of ambient sound. Having read a number of posts on Stack Overflow I feel more confused than I was originally. I'm not a sound engineer, just a programmer.
I think I need to calculate dB SPL with the formula 20 * log10(voltage / Voltage_Ref).
So for this I need to sample the internal microphone voltage (or pressure in pascals?) level. The class AVAudioRecorder lets me meter-read peakPowerForChannel, but this gives a dBFS reading between -160 and 0, where 0 is full power. How do I access the voltage/pressure levels, with another API perhaps?
I had read that roughly 0 dBFS = 99 dB SPL. But that would mean the maximum dB SPL I could read using the peakPowerForChannel reading would be 99 dB SPL. I'm looking to read levels higher than this.
Any information on this would be most appreciated - I'm somewhat stuck at this point.
Thanks
Mike
The only way to do this is to test your particular iOS device model (and perhaps production batch) against a known sound source at a given distance and relationship to the mic in an anechoic chamber. The voltage and pressure relationship is neither specified by Apple nor available from any API.
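Once you have measured such a calibration figure for your device, applying it is just arithmetic. A hedged Swift sketch, where calibrationOffsetDB is a made-up placeholder you would determine against a reference SPL meter (the ~99 dB figure from the question is only an example, not a guaranteed value):

    import AVFoundation
    import Foundation

    // Hypothetical per-device calibration: the dB SPL that corresponds to 0 dBFS,
    // measured as described above. Not available from any API.
    let calibrationOffsetDB: Float = 99.0

    func startMetering() throws -> AVAudioRecorder {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.record, mode: .default)
        try session.setActive(true)

        let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("meter.caf")
        let settings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatLinearPCM),
            AVSampleRateKey: 44_100.0,
            AVNumberOfChannelsKey: 1
        ]
        let recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.isMeteringEnabled = true
        _ = recorder.record()   // returns false if recording could not start
        return recorder
    }

    func estimatedSPL(from recorder: AVAudioRecorder) -> Float {
        recorder.updateMeters()
        let dbFS = recorder.peakPower(forChannel: 0)  // -160 ... 0 dBFS
        return dbFS + calibrationOffsetDB             // rough dB SPL estimate
    }

The one number this sketch cannot supply is calibrationOffsetDB; that is exactly the per-device measurement described in the answer above.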