Does the Android/iOS accelerometer/gyro API use a different ADC sampling rate OR mean/filtering to perform resampling?
I am trying to find out which sampling rate is best for my application. My approach is to collect data at 100 Hz, resample it at different frequencies, and determine which frequency is good enough for my use case.
However, the performance in the real application would be closer to what I see in my development environment if I knew how this resampling is implemented in the Android API / Core Motion API. Does the API simply change the ADC sampling frequency, or does it sample at a higher frequency and then apply interpolation or mean filtering? (Note: the answers for Android and iOS may differ, and I do not have this level of detailed knowledge of either system.)
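One way to bound the question, whatever the OS does internally: emulate both candidate behaviors offline against the same 100 Hz log and check whether your application metric even notices the difference. A minimal Swift sketch, assuming a steady 100 Hz capture (the helper names are mine, not any platform API):

    // Picking every Nth sample emulates "the ADC just runs slower"
    // (no anti-alias filtering); averaging each block of N emulates a
    // mean/decimation filter applied after sampling at a higher rate.
    func decimate(_ samples: [Double], by factor: Int) -> [Double] {
        stride(from: 0, to: samples.count, by: factor).map { samples[$0] }
    }

    func blockAverage(_ samples: [Double], by factor: Int) -> [Double] {
        stride(from: 0, to: samples.count, by: factor).map { start in
            let block = samples[start..<min(start + factor, samples.count)]
            return block.reduce(0, +) / Double(block.count)
        }
    }

If both variants perform comparably for your use case, the unknown platform behavior stops mattering.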
Any links/sources are appreciated.
I can see that the Android Beacon Library features two algorithms for measuring distance: the running average filter and the ARMA filter.
How are these related to the library's implementation (apart from using the filter's formula)?
Where can I find some background information about these that explains the theory behind it?
Are there any known alternative algorithms that can be studied and tried out for measuring the distance?
There are two basic steps to producing a distance estimate for BLE beacons.
Collect RSSI samples
Convert RSSI samples to a distance estimate.
Both of these steps have different possible algorithms. The ARMA and Running Average filters are two different algorithms used for collecting the RSSI samples.
Understand that beacons send out packets at a periodic rate, typically 1-10 times per second. Each of these packets, when received by the phone, has its own signal level measurement called RSSI. Because of radio noise and measurement error, there is a large amount of variance in each RSSI sample, which can lead to big swings in distance estimates. So you typically want to take a number of RSSI samples and average them together to reduce this error.
The running average algorithm simply takes 20 seconds' worth of samples (by default; the time period is configurable), throws out the top and bottom 10 percent of the RSSI readings, and takes the mean of the remainder. This is similar to how iOS averages samples, so it is the default algorithm for the library for cross-platform compatibility reasons. But it has the disadvantage that the distance estimate lags behind, telling you where the phone was relative to the beacon an average of 10 seconds ago. This may be inappropriate for use cases where the phone is moving relative to the beacon.
The ARMA (Autoregressive Moving Average) algorithm statistically weights the more recent samples more heavily than the older samples, leading to less lag in the distance estimates. But its behavior can be a bit unpredictable and is subject to more varying performance in different radio conditions.
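For intuition, here is a Swift sketch of the two collection strategies as described above (my paraphrase, not the library's actual code; the 10 percent trim and the 0.1 smoothing constant are illustrative defaults):

    // Running average: trim the top/bottom 10% of a window (e.g. 20 s of
    // samples), then take the mean of what remains.
    func trimmedMean(window: [Double]) -> Double {
        precondition(!window.isEmpty)
        let sorted = window.sorted()
        let cut = sorted.count / 10
        let kept = sorted[cut..<(sorted.count - cut)]
        return kept.reduce(0, +) / Double(kept.count)
    }

    // ARMA-style: exponentially weight recent samples; `speed` controls
    // how strongly a new sample pulls the estimate (assumed constant).
    struct ArmaFilter {
        var estimate: Double? = nil
        let speed = 0.1
        mutating func add(rssi: Double) -> Double {
            let next = estimate.map { $0 - speed * ($0 - rssi) } ?? rssi
            estimate = next
            return next
        }
    }

Note how the trimmed mean weights every sample in the window equally (hence the lag), while the ARMA form forgets old samples geometrically.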
Which algorithm is right for you depends on your use case. Testing both to see which performs better for you is generally the best approach. While there are other possible algorithms for data collection, these are the only two built into the library. Since it is open source, you are welcome to create your own and submit them as a pull request.
For step 2, there are also a number of possible algorithms. The two most common are a curve-fitted formula and a path-loss formula. The curve-fitted formula is the library default; the path-loss alternative is available only in a branch of the library under development. You are welcome to use the latter, but it requires building the library from source. Again, as an open source library, you are welcome and encouraged to develop your own alternative algorithms.
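As a rough illustration of the two families (the constants are placeholders you would fit or calibrate for your own devices, not the library's shipped values):

    import Foundation

    // Curve-fitted: distance modeled as a power curve over the ratio of
    // measured RSSI to the beacon's calibrated 1 m txPower.
    func curveFitDistance(rssi: Double, txPower: Double,
                          a: Double, b: Double, c: Double) -> Double {
        a * pow(rssi / txPower, b) + c
    }

    // Log-distance path loss: n is the environment-dependent path-loss
    // exponent (about 2 in free space, higher indoors).
    func pathLossDistance(rssi: Double, txPower: Double, n: Double = 2.0) -> Double {
        pow(10.0, (txPower - rssi) / (10.0 * n))
    }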
My team and I are planning to build an external accessory for iOS that will sample ultrasonic sound at 256 kHz. That is a lot, and I am wondering whether iOS vDSP can do the conversion from the time domain to the frequency domain for 256,000 samples/sec, or whether we need a hardware-based solution for the FFT.
Sample projects from Apple such as aurioTouch are very helpful, but I couldn't find one that deals with sampling rates above professional audio frequencies. I need help figuring out the following:
Can vDSP FFTs process 256,000 samples/second? If not, are there other creative ways to do the same thing, aside from doing the conversion in hardware?
The closest discussion I found related to this is:
How many FFTs per second can I do on my smartphone? (for performing voice recognition)
A 256 kHz data rate is less than 6 times faster than normal 44.1 kHz audio. And float FFTs of real-time audio data using the vDSP/Accelerate framework use only in the neighborhood of 1% or less of one CPU on recent iOS devices.
The FFT computation time will be a tiny portion of the time available.
Source: I wrote the vDSP FFTs.
Why not see how the devices handle upsampled signals, starting with aurioTouch?
If you need it faster, you should measure the speed of an integer-based FFT implementation.
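One way to measure on your own hardware is a micro-benchmark along these lines (a Swift sketch; the 1024-point size and iteration count are arbitrary, and at 256 kHz with 1024 samples per frame you only need about 250 such FFTs per second to keep up):

    import Accelerate
    import Foundation

    let log2n = vDSP_Length(10)        // 1024-point real FFT
    let n = 1 << 10
    guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else {
        fatalError("FFT setup failed")
    }
    // A 1 kHz test tone sampled at 256 kHz.
    let input = (0..<n).map { Float(sin(2 * Double.pi * 1_000 * Double($0) / 256_000)) }
    var real = [Float](repeating: 0, count: n / 2)
    var imag = [Float](repeating: 0, count: n / 2)

    let iterations = 10_000
    let start = Date()
    real.withUnsafeMutableBufferPointer { rp in
        imag.withUnsafeMutableBufferPointer { ip in
            var split = DSPSplitComplex(realp: rp.baseAddress!, imagp: ip.baseAddress!)
            input.withUnsafeBufferPointer { inp in
                inp.baseAddress!.withMemoryRebound(to: DSPComplex.self, capacity: n / 2) { packed in
                    for _ in 0..<iterations {
                        // Deinterleave into split-complex form, then FFT in place.
                        vDSP_ctoz(packed, 2, &split, 1, vDSP_Length(n / 2))
                        vDSP_fft_zrip(setup, &split, 1, log2n, FFTDirection(FFT_FORWARD))
                    }
                }
            }
        }
    }
    print("FFTs/sec:", Double(iterations) / Date().timeIntervalSince(start))
    vDSP_destroy_fftsetup(setup)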
I think Shopkick is detecting a very high-frequency signal that is not audible to the human ear. But the real question is how they can detect a signal above 22 kHz on the iPhone. I have checked the frequency response of the iPhone mic; it seems to span 20 Hz to 22 kHz, within the human audible range.
http://blog.faberacoustical.com/2009/iphone/iphone-microphone-frequency-response-comparison/
http://www.businessinsider.com/shopkick-crate-barrel-2010-12?op=1
Can you guide me on this? If it is possible with the iPhone mic, then we can do some signal processing, specifically an FFT, to extract the frequency.
Well, I am currently working on a similar system for transmitting data using these high frequencies, and this is what I found out. Although keep in mind that I am doing this with Android phones, mostly the Galaxy S line.
First of all, the 20 kHz to 22 kHz band seems quite promising, because it can be detected by all the phones we tested and even reproduced by some of them. These frequencies are inaudible to humans of any age, and even dogs and cats seem not to notice them. If you are targeting (actually avoiding) detection by humans, you could even go as low as 18 kHz, since most people wouldn't hear that. This gives you a bandwidth of 4,000 Hz into which you can frequency-modulate data. Of course, don't expect to transmit 8 MP images, but some small amount of data can be transmitted. You are right that you could then use an FFT to move into the frequency domain and analyse those frequencies; this can be done even on older phones in Java (I think doing it in Objective-C would be even faster).
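If you only care about one or a few carrier frequencies in that band, you don't even need a full FFT; a Goertzel filter computes the power at a single frequency more cheaply. A Swift sketch (the target frequency and sample rate below are just examples):

    import Foundation

    // Returns the signal power at `target` Hz in the buffer; compare it
    // against a noise-floor threshold to decide whether the tone is present.
    func goertzelPower(samples: [Float], target: Double, sampleRate: Double) -> Double {
        let coeff = 2 * cos(2 * Double.pi * target / sampleRate)
        var s1 = 0.0, s2 = 0.0
        for x in samples {
            let s0 = Double(x) + coeff * s1 - s2
            s2 = s1
            s1 = s0
        }
        return s1 * s1 + s2 * s2 - coeff * s1 * s2
    }

    // Example: look for a 20 kHz carrier in audio captured at 44.1 kHz.
    // let present = goertzelPower(samples: buffer, target: 20_000, sampleRate: 44_100) > threshold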
Also, if you have a few iPhones at your disposal, you could install any frequency analyser and play the frequencies you want on another iPhone or a speaker to test what they can detect. Just keep in mind that standard desktop speakers can probably play the given frequencies but will introduce lower-frequency noise. Piezo tweeters are probably best for this type of sound, although I must say I am using an iPhone 4 to play these frequencies for testing quite efficiently.
I read somewhere that Shopkick now even plays their sound codes over store PA systems, and since those speakers are not really optimised for response above 20 kHz, I too am starting to suspect they are using frequencies below that. Take a look at this website for the store codes some people are using to cheat the system: http://www.ceploitips.com/2011/03/shopkick-walk-in-files.html
Keep in mind that using these might get your account banned, since they have improved their misuse-detection algorithms.
Also, I too would like to read more about the Shopkick implementation, so if anyone viewing this has a link, please share.
First, human hearing pretty much tops out at 20 kHz, and even that requires a very young listener; sensitivity at those upper frequencies is very low and erratic. For example, I can produce an 18 kHz tone at full iPad volume at a 48 kHz sample rate that even my dog doesn't notice. Read up on psychoacoustics and you will see that humans filter out echoes even at very low frequencies; they are there, but we don't notice them.
But in the case of ShopKick, I don't think they are going above even 21 kHz. I have created several digital audio modulations on the iPhone, and 21 kHz seems to be the upper limit for any distance at all.
It would help if you gave more detail on what you are doing. I assume from the question that you want to modulate a digital signal between two devices.
My best guess is that they are using maximal length sequences. These are almost like a weak background hiss that covers a large range of the audio spectrum. The key to detection is that the pattern repeats exactly, and the phone detects the sound by correlating a stored key against the incoming audio.
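In sketch form, that detection step is a sliding cross-correlation. The naive time-domain Swift version below shows the idea; a real detector would normalize the score and do the correlation in the frequency domain for speed:

    // `key` is the known pattern (e.g. an m-sequence mapped to ±1);
    // a sharp peak in the returned score marks where it occurs in `audio`.
    func correlate(key: [Float], audio: [Float]) -> (offset: Int, peak: Float) {
        precondition(audio.count >= key.count)
        var best: (Int, Float) = (0, -Float.infinity)
        for offset in 0...(audio.count - key.count) {
            var sum: Float = 0
            for i in 0..<key.count { sum += key[i] * audio[offset + i] }
            if sum > best.1 { best = (offset, sum) }
        }
        return best
    }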
I am using NSTimer to update peak power on the iPhone. From monitoring, it does not update very fast. I need to update peak power at a high frequency, on the order of 100 microseconds (100 µs). I also tried usleep(100) to update every 100 µs; it is still very slow. Can someone point out how to achieve this? I am planning to use this code to measure distance. Thank you.
You capture the audio (record, input, or file), access its samples from the PCM CBR (uncompressed, fixed-sampling-rate) stream, and read the samples in the range you are interested in. Considering the high frequency, you will only have to analyze a small number of samples (2-5, depending on the sampling rate). You may need to interpolate to improve accuracy with so few samples.
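For that interpolation step, three-point parabolic peak interpolation is a common trick. A small Swift sketch, where y0, y1, y2 are the samples just before, at, and after the coarse peak:

    // Fits a parabola through three neighboring samples and returns the
    // fractional offset (between -0.5 and 0.5) of the true peak relative
    // to the middle sample.
    func refinedPeakOffset(y0: Float, y1: Float, y2: Float) -> Float {
        let denom = y0 - 2 * y1 + y2
        return denom == 0 ? 0 : 0.5 * (y0 - y2) / denom
    }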
Before I reinvent the wheel I wanted to see if anyone can share code or tips for the following:
In order to get the relative position of the iPhone, one needs to (a rough sketch follows this list):
Set the accelerometer read rate
Noise filter the accelerometer response
Convert it to a vector
Low pass filter the vector to find gravity
Subtract gravity from the raw reading to find the user-caused acceleration
Filter the user-caused acceleration to get the frequencies you are interested in (probably band-pass, depending on the application)
Integrate to find relative speed
Integrate to find position
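A minimal Swift sketch of that pipeline, using the modern Core Motion API rather than the UIAccelerometer delegate, and omitting steps 2 and 6 (the noise and band-pass filters). The low-pass coefficient is a guess you would need to tune, and double integration drifts quickly, so treat this as an illustration of the structure, not a working tracker:

    import CoreMotion

    let manager = CMMotionManager()
    var gravity = (x: 0.0, y: 0.0, z: 0.0)
    var velocity = (x: 0.0, y: 0.0, z: 0.0)
    var position = (x: 0.0, y: 0.0, z: 0.0)
    let alpha = 0.1            // low-pass coefficient (assumed; tune per update rate)
    let dt = 1.0 / 100.0       // step 1: request 100 Hz updates

    manager.accelerometerUpdateInterval = dt
    manager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Steps 3-4: low-pass the raw vector to estimate gravity.
        gravity.x = alpha * a.x + (1 - alpha) * gravity.x
        gravity.y = alpha * a.y + (1 - alpha) * gravity.y
        gravity.z = alpha * a.z + (1 - alpha) * gravity.z
        // Step 5: user-caused acceleration = raw - gravity (in g; x9.81 for m/s^2).
        let user = (x: (a.x - gravity.x) * 9.81,
                    y: (a.y - gravity.y) * 9.81,
                    z: (a.z - gravity.z) * 9.81)
        // Steps 7-8: integrate twice; drift accumulates rapidly.
        velocity.x += user.x * dt; velocity.y += user.y * dt; velocity.z += user.z * dt
        position.x += velocity.x * dt; position.y += velocity.y * dt; position.z += velocity.z * dt
    }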
So what I'm hoping is that people have already written some or all of the above and can provide tips, or better yet code.
A few questions I haven't found the answer to:
What is the frequency response of the iPhone accelerometer? What hardware filters exist between the accelerometer and the analog to digital converter?
What is the fastest rate at which the accelerometer delegate can be called without duplicated reading values?
Differences in the above for the various phones?
Any good tips for designing the filters, such as cutoff frequency for separating gravity and user motion?
Any code or tips for the integration steps? Any reason to integrate in a Cartesian coordinate system rather than as a vector, or vice versa?
Any other experiences, tips, or information that one should know prior to implementing this?
As I find information out, I'll be collecting it in this answer.
Hardware
The 3GS uses an ST LIS331DL 3-axis ±2g/±8g digital accelerometer.
The iPhone 4 and iPad use an ST LIS331DLH 3-axis ±2g/±4g/±8g digital accelerometer.
They are both capable of being read at 100 Hz and 400 Hz, although on the iPhone 3G (under iOS 4.1) the accelerometer delegate is not called more frequently than 100 Hz, even if setUpdateInterval is set for faster updates. I do not know if the API permits faster updates on the iPhone 4; Apple's documentation merely states that the maximum is determined by the hardware of the iPhone. (TBD)
The A/D converter is on the same silicon as the MEMS sensor, which is good for noise immunity.
The DL version is 8 bits (3GS) while the DLH version is 12 bits (iPhone 4). The maximum bias (offset) in the DL version is twice the bias of the DLH (0.04g vs 0.02g) version.
The data sheet for the DLH reports acceleration noise density, but that value is not reported on the DL datasheet. Noise density is reasonably low at 218 μg/√Hz for the DLH.
Both sensors offer either 100 Hz or 400 Hz sampling, with no custom rates. The sensor discards values if the iPhone doesn't read the output register at the set sampling rate.
The "typical" full scale value for the DL sensor is ±2.3g, but ST only guarantees that it's at least ±2g.
Temperature effects on the sensor are present and measurable, but not very significant.
TBD:
Is the hardware filter turned on, and what are the filtering characteristics?
How noisy is the power supply to the accelerometer? (Anybody just happen to have the iPhone schematic laying around?)
The accelerometer uses an internal clock to provide timing for the sample rate and A/D conversion. The datasheet does not indicate the accuracy, precision, or temperature sensitivity of this clock. For accurate time analysis the iPhone must use an interrupt to sense when a sample is done and record the time in the interrupt. (whether this is done or not is unknown, but it's the only way to get accurate timing information)
API
Requesting sampling rates lower than 100 Hz results in getting selected samples while the rest are discarded. If the requested software sampling rate does not evenly divide 100 Hz, the time intervals between real sensor readings cannot be even. Apple does not guarantee even sampling intervals even if a divisor of 100 is used.
It appears that the API provides no software filtering.
The API does scale the raw accelerometer value into a double representing Gs. The scaling factor used is unknown, as is whether it differs for each phone (i.e., calibrated per unit) and whether calibration occurs on an ongoing basis to account for sensor drift. Online reports seem to suggest that the iPhone does re-calibrate itself on occasion when it's lying flat on a surface.
Results from simple testing suggest that the API sets the sensor to ±2g for the 3GS, which is generally fine for handheld movements.
TBD:
Does Apple calibrate each unit so that the UIAccelerometer reports 1G as 1G? Apple's documentation specifically warns against using the device for sensitive measurement applications.
Does the reported NSTimeInterval represent when the values were read from the accelerometer, or when the accelerometer interrupt indicated that new values were ready?
I'm just dealing with the same problem. The only difference from your approach is that I don't want to rely on the low-pass filter to find gravity. (TBH, I don't see how I can reliably distinguish the gravity vector from the accelerometer readings.)
I am trying it with the gyros right now.