Counting steps using accelerometer or gyroscope - iPhone

I am developing an application to count the user's steps while the phone is in their trouser pocket.
I need to know which sensor is better to use: the accelerometer or the gyroscope?
I have tried the accelerometer and it worked, but I'm asking whether the gyroscope is more accurate for this purpose than the accelerometer.
Thanks in advance for the help.

I think you may have already seen my ideas on this subject, but for completeness, I think it's good to record them here.
I think the best results will come from using all sensors available. However, I got reasonable results from just using accelerometer data, see my answer here. What I did was to get a lot of friends to walk for me, and I counted how many steps they took. As they were walking, my Android device was logging all sensor output. I then developed a program in C# (because that's my favourite language) that analysed all the log files, and hence optimised a methodology for counting steps that I then ported to Android java.
Whatever sensors you end up using, logging a whole load of data and then analysing it to work out how best to count the steps is what I would recommend.
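The question is about iPhone, so for completeness, here is a minimal sketch of what that logging step might look like with CoreMotion (the file name, CSV format, and 50 Hz rate are my own choices, not from the original answer):

import CoreMotion
import Foundation

// Log timestamped raw accelerometer samples to a CSV file for
// offline analysis.
let motionManager = CMMotionManager()
let logURL = FileManager.default.temporaryDirectory.appendingPathComponent("walk_log.csv")
FileManager.default.createFile(atPath: logURL.path, contents: nil)
let logHandle = try! FileHandle(forWritingTo: logURL)  // sketch: no error handling

motionManager.accelerometerUpdateInterval = 1.0 / 50.0  // 50 Hz is plenty for gait
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let d = data else { return }
    let line = "\(d.timestamp),\(d.acceleration.x),\(d.acceleration.y),\(d.acceleration.z)\n"
    logHandle.write(line.data(using: .utf8)!)
}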

The accelerometer detects acceleration along each axis, while the gyroscope detects rotation, so they have different (and complementary) uses.
Take a look at this more detailed explanation of their differences and of the values you can derive from their raw data:
https://github.com/hadimichael/V-Tracker/wiki/Hardware
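On iOS, CMDeviceMotion exposes both signals in one callback, which makes their complementary roles easy to see; a minimal sketch:

import CoreMotion

let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 0.02  // 50 Hz
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let m = motion else { return }
    // Accelerometer-derived: linear acceleration in g, gravity removed.
    print("accel:", m.userAcceleration.x, m.userAcceleration.y, m.userAcceleration.z)
    // Gyroscope-derived: angular velocity in rad/s around each axis.
    print("gyro:", m.rotationRate.x, m.rotationRate.y, m.rotationRate.z)
}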

You can try this; it works perfectly for me.
import CoreMotion

let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 0.1
motionManager.startDeviceMotionUpdates(to: .main) { deviceMotion, error in
    // userAcceleration excludes gravity, so quiet standing reads near zero.
    guard let userAcceleration = deviceMotion?.userAcceleration else { return }
    let accelerationThreshold = 1.0  // in g; tune against logged data
    if abs(userAcceleration.x) > accelerationThreshold ||
       abs(userAcceleration.y) > accelerationThreshold ||
       abs(userAcceleration.z) > accelerationThreshold {
        print("LowPassFilterSignal")
    }
}
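Note that the snippet only flags threshold crossings; to turn those into a step count you would typically debounce them so a single stride isn't counted several times. A rough sketch (the 0.3 s minimum interval is an assumption to tune):

import Foundation

var stepCount = 0
var lastStepTime = Date.distantPast
let minStepInterval: TimeInterval = 0.3  // fastest plausible cadence (assumed)

// Call this each time the acceleration threshold above is exceeded.
func registerThresholdCrossing() {
    let now = Date()
    if now.timeIntervalSince(lastStepTime) > minStepInterval {
        stepCount += 1
        lastStepTime = now
    }
}

On iOS 8 and later you can also skip the manual work entirely and use CMPedometer, which reports step counts directly.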

Related

How does the complementary filter work?

I'm trying to combine the data from an accelerometer and a gyroscope to accurately measure the pitch and yaw angles of an object. After researching the complementary filter and attempting to implement it, I have a few questions about how it works.
I've read that the filter "trusts" the gyroscope data if there is a lot of angular movement and that it "trusts" the accelerometer data if the object is stable.
http://www.pieter-jan.com/node/11
In this article the complementary filter is described in this way:
angle = 0.98 * (angle + gyrData * dt) + 0.02 * accData
To me, it seems as if the gyroscope data is being favoured. In the image at the bottom of the page, http://www.pieter-jan.com/images/resize/Complementary_Filter.png , the filtered data seems to stay close to the accelerometer data even though the gyroscope data has drifted. I don't understand why this occurs when the calculation suggests the gyroscope data is being favoured, and I have observed the same in other plots. During my own testing I needed to swap the 0.98 and 0.02 (suggesting the accelerometer data is being favoured) to obtain similar results. Am I completely misunderstanding how this filter works? Is it normal to favour the accelerometer data?
Furthermore, when the angle of an object needs to be monitored for a long time, doesn't the gyroscope data become useless because the drift grows so large? How does the filter compensate?
I realised where I was going wrong.
I had essentially calculated the angle using only the gyroscope data and used that in the filter, i.e.:
GyroAngle += d°/s * time_between_cycles
FilteredAngle = 0.98*GyroAngle + 0.02*AccelerometerAngle
Instead I should've been doing this:
FilteredAngle = 0.98*(FilteredAngle + d°/s * time_between_cycles) + 0.02*AccelerometerAngle
Doing this has yielded much better results.
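For concreteness, here is a minimal sketch of that corrected update loop in Swift (the 0.98 weight follows the formulas above; `gyroRate` and `accelAngle` are assumed to be supplied by the sensor-reading code):

import Foundation

var filteredAngle = 0.0   // degrees
let alpha = 0.98          // gyro weight, as in the question

// gyroRate: angular velocity in deg/s from the gyroscope.
// accelAngle: absolute angle in degrees derived from the gravity
// vector (e.g. via atan2 on two accelerometer axes).
func update(gyroRate: Double, accelAngle: Double, dt: Double) {
    // Integrate the gyro on top of the previous *filtered* angle,
    // then nudge toward the accelerometer's absolute estimate.
    filteredAngle = alpha * (filteredAngle + gyroRate * dt) + (1 - alpha) * accelAngle
}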
I'm trying to do something similar to this. I'm implementing a complementary filter in my Arduino code using an LSM9DS0 sensor (it has a gyro and accelerometer built in: https://www.adafruit.com/product/2021).
There's a filter value I have been playing around with, and a calibration method I'm trying to use, but I can't seem to get rid of the error completely. There is always a small deviation from the true angle, and I can never get a perfectly filtered angle.

Processing accelerometer data

I would like to know if there are some libraries/algorithms/techniques that help to extract the user context (walking/standing) from accelerometer data (extracted from any smartphone)?
For example, I would collect accelerometer data every 5 seconds for a definite period of time and then identify the user context (ex. for the first 5 minutes, the user was walking, then the user was standing for a minute, and then he continued walking for another 3 minutes).
Thank you very much in advance :)
Check the new activity recognition APIs:
http://developer.android.com/google/play-services/location.html
It's still a research topic; take a look at this paper, which discusses the algorithm:
http://www.enggjournals.com/ijcse/doc/IJCSE12-04-05-266.pdf
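The question asks about "any smartphone", so it is worth noting that iOS has a comparable built-in classifier, CMMotionActivityManager; a minimal sketch:

import CoreMotion

let activityManager = CMMotionActivityManager()
if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let a = activity else { return }
        // CMMotionActivity also exposes .running, .cycling, .automotive.
        if a.walking { print("walking") }
        else if a.stationary { print("standing") }
    }
}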
I don't know of any such library.
It is a very time-consuming task to write such a library. Basically, you would build a database of the "user contexts" you wish to recognize.
Then you collect data and compare it to those in the database. As for how to compare, see Store orientation to an array - and compare, the same holds for accelerometer.
Walking/running data is analogous to heart-rate data in a lot of ways. In terms of getting the noise filtered and getting smooth peaks, look into noise filtering and peak detection algorithms. The following is used to obtain heart-rate information for heart patients, it should be a good starting point : http://www.docstoc.com/docs/22491202/Pan-Tompkins-algorithm-algorithm-to-detect-QRS-complex-in-ECG
Think about how you want to filter out the noise and detect peaks; the filters will obviously depend on the raw data you gather, but it's good to have a general idea of what kind of filtering you'd want to do. Then think about what needs to be done once you have filtered data.
In your case, think about how you would design an algorithm to work out when the data indicates activity (walking, running, etc.) and when it shows the user being stationary. This is a fairly challenging problem to solve once you consider the dynamics of the device itself (how it's positioned while the user is walking or running), and the fact that there are few (if any) benchmarked algorithms that do this with raw smartphone data.
Start with determining the appropriate algorithms, and then tackle the complexities (mentioned above) one by one.
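As a starting point, here is a sketch of the smooth-then-find-peaks idea on accelerometer magnitude samples, in Swift (the window size and 1.2 g threshold are illustrative assumptions that would need tuning against real logged data):

import Foundation

// Count peaks in a series of acceleration magnitudes sampled at a
// fixed rate: low-pass with a moving average, then count local maxima
// above a threshold.
func smoothedPeakCount(magnitudes: [Double], window: Int = 5, threshold: Double = 1.2) -> Int {
    var smoothed = [Double]()
    for i in magnitudes.indices {
        let lo = max(0, i - window + 1)
        let slice = magnitudes[lo...i]
        smoothed.append(slice.reduce(0, +) / Double(slice.count))
    }
    guard smoothed.count > 2 else { return 0 }
    var peaks = 0
    for i in 1..<(smoothed.count - 1) {
        if smoothed[i] > threshold, smoothed[i] > smoothed[i - 1], smoothed[i] >= smoothed[i + 1] {
            peaks += 1
        }
    }
    return peaks
}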

How can I Compare 2 Audio Files Programmatically?

I want to compare 2 audio files programmatically.
For example: I have a sound file in my iPhone app, and then I record another one. I want to check if the existing sound matches the recorded sound or not ( - similar to voice recognition).
How can I accomplish this?
Audio fingerprinting computation is not really suitable for a mobile device anyway, so have a server do it: your mobile app uploads the files to the server and gets back the analysis result for display. For that reason, the language you implement it in doesn't matter much. Following are a few audio-fingerprinting implementations:
Java: http://www.redcode.nl/blog/2010/06/creating-shazam-in-java/
VC++: http://code.google.com/p/musicip-libofa/
C#: https://web.archive.org/web/20190128062416/https://www.codeproject.com/Articles/206507/Duplicates-detector-via-audio-fingerprinting
I know the question has been asked a long time ago, but a clear answer could help someone else.
The libraries from Echoprint (website: echoprint.me/start) will help you solve the following problems:
De-duplicate a big collection
Identify (Track, Artist ...) a song on a hard drive or on a server
Run an Echoprint server with your data
Identify a song on an iOS device
PS: For more music-oriented features, you can check the list of APIs here.
If you want to implement Fingerprinting by yourself, you should read the docs listed as references here, and probably have a look at musicip-libofa on Google Code
Hope this will help ;)
Apply a bandpass filter to reduce noise.
Normalize for amplitude.
Calculate the cross-correlation.
It can be fairly CPU-intensive.
The DSP details are in the well-known text Digital Signal Processing by Alan V. Oppenheim and Ronald W. Schafer.
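For a rough idea of the first two steps, here is a sketch in Swift using one-pole filters (the coefficients are illustrative; a real implementation would derive them from the sample rate and the frequency band of interest):

import Foundation

// Crude band-pass (one-pole high-pass feeding a one-pole low-pass),
// followed by peak normalization.
func bandpassAndNormalize(_ samples: [Double], hp: Double = 0.99, lp: Double = 0.2) -> [Double] {
    var out = [Double](repeating: 0, count: samples.count)
    var prevIn = 0.0, prevHP = 0.0, prevLP = 0.0
    for (i, x) in samples.enumerated() {
        let highPassed = hp * (prevHP + x - prevIn)   // removes DC and rumble
        prevIn = x
        prevHP = highPassed
        prevLP = prevLP + lp * (highPassed - prevLP)  // removes high-frequency hiss
        out[i] = prevLP
    }
    let peak = out.map { abs($0) }.max() ?? 0
    return peak > 0 ? out.map { $0 / peak } : out
}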
I also think you could select a few-second sample from both audio tracks, normalise them in amplitude, reduce noise with a band-pass filter, and then run them through a correlator.
For instance, you could take a 5-second sample of one of the two tracks and slide it over the second one, computing a cross-correlation at each shift. (Be careful: if you take too small a window, you may get high correlation where none is expected, and you will suffer side effects from cropping the signal before cross-correlating.)
You can then collect an array with all the results of the cross-correlation and take the index of the maximum.
You should then experimentally set a threshold to decide when to assume the two samples are the same; this will change depending on the quality of the audio tracks you are comparing.
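Here is a minimal Swift sketch of that sliding normalized cross-correlation (plain loops for clarity; a production version would use something like Accelerate's vDSP, and the function and parameter names here are my own):

import Foundation

// Slide `pattern` (e.g. a 5-second excerpt) across `signal`, computing
// a normalized cross-correlation at each offset; return the offset with
// the maximum score, plus the score itself for thresholding.
func bestMatch(pattern: [Double], signal: [Double]) -> (offset: Int, score: Double) {
    precondition(!pattern.isEmpty && signal.count >= pattern.count)
    let patternNorm = sqrt(pattern.map { $0 * $0 }.reduce(0, +))
    var best = (offset: 0, score: -Double.infinity)
    for offset in 0...(signal.count - pattern.count) {
        var dot = 0.0
        var windowEnergy = 0.0
        for j in 0..<pattern.count {
            let s = signal[offset + j]
            dot += pattern[j] * s
            windowEnergy += s * s
        }
        let denom = patternNorm * sqrt(windowEnergy)
        let score = denom > 0 ? dot / denom : 0
        if score > best.score { best = (offset, score) }
    }
    return best
}
// Compare best.score against an experimentally chosen threshold
// (a value like 0.7 is an assumption to tune per recording quality).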
I implemented a correlator to receive and distinguish preambles in wireless communication. My script is actually done in MATLAB; if you are interested, I can try to find the common part and send it to you.
The code would be too long to paste here in the forum. If you want it, just let me know and I will send it to you as soon as possible.
Cheers

iPhone Temperature Sensor

My question is very similar to this one: iPhone Proximity Sensor. There's clearly some manner of thermometer within the iPhone that's readable by the OS. Has anyone uncovered the super-secret undocumented APIs to read this sensor?
I doubt this sensor is for ambient temperature; rather, I suspect it is for detecting overheating of the circuits. If that is all you want, then great, but I think it would be useless for ambient temperature.
Just my opinion.
All I could find was CTGetTemperature in CoreTelephony, of all places.
I don't know about previous models, but my iPhone 4 goes from cool-ish to very warm in a matter of minutes depending on radio usage. So unless "good enough" means within 20 °F or so, it's probably not useful for ambient measurement.
Unless (maybe you meant this) you also track radio usage and subtract a temperature correction depending on it. Phew, complicated. Easier to just query the NWS.
A command to list all the super-secret temperature-related symbols in the CoreTelephony framework:
nm "/Applications/Xcode463.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport/6.1 (10B141)/Symbols/System/Library/Frameworks/CoreTelephony.framework/CoreTelephony" | grep empera

iPhone CoreLocation: How to get the most accurate speed

I'm experimenting with adding the GPS functionality to my iPhone app. It's a workout app that will be used while walking or running. So what I want to use GPS for is to show the speed that the person is moving in Mph and minute/mile.
How should I configure the CLLocationManager so I get the best possible results? What should I set desiredAccuracy and distanceFilter?
I've tried with:
distanceFilter = 10 and desiredAccuracy = kCLLocationAccuracyNearestTenMeters
and reading
CLLocation.speed property
Testing while driving around in my car the accuracy seems good compared to the car speedometer although it takes a while to update. I realize that the update delay may very well be the time it takes to query the GPS location, but I'm not sure if changing the above two parameters would give better results.
Should I use kCLLocationAccuracyBest and some other value for distanceFilter?
I'm interested to hear from others using CoreLocation to get speed. What are you doing to get more accurate results?
For best results, you should use kCLLocationAccuracyBest. What you put into your distance filter depends on which faults you're willing to put up with. Basically, you're going to have to make decisions based on accuracy vs. availability; that is, during periods when a best-accuracy answer is not available, what will you display?
One approach is to let the phone deliver less-accurate answers and, using a projection of what was happening the last time you had best-accuracy information, see if what you have makes sense.
That is, suppose I'm jogging at 6 mph to the north. You plot me along point-A, point-B, point-C... then you get a low-accuracy answer (maybe kCLLocationAccuracyHundredMeters). Look at the spot where it says I am and figure out "could I have gotten to that spot from point-C if I'd continued along my current path, making reasonable adjustments for possible changes in speed?" If so, then the new point is within the realm of possibility. (If not, toss it out.) Then project from point-C at my last-known speed and figure out where you think I probably am, ballistically. Save that as ballistic-point-D.
Of course, you're using the accelerometer to get some sort of inertial sense of which way I went, right? So, you can't know direction (you don't know what way the phone is pointing), but you can make a reasonable stab at distance.
Using all this information, plot the most likely spot where you think I probably am.
NOTE: When testing, don't just drive in good-cell coverage areas. See how your app performs out in the hills, away from cell phones. A lot of people like to bike & jog those areas!
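A sketch of that plausibility check in Swift (the `slack` fudge factor for speed changes is an assumption):

import CoreLocation

// Could the runner plausibly have reached `newFix` from the last good
// fix at roughly the last known speed?
func isPlausible(newFix: CLLocation, lastGoodFix: CLLocation,
                 lastKnownSpeed: CLLocationSpeed, slack: Double = 1.5) -> Bool {
    let elapsed = newFix.timestamp.timeIntervalSince(lastGoodFix.timestamp)
    guard elapsed > 0 else { return false }
    let traveled = newFix.distance(from: lastGoodFix)  // meters
    // Reject fixes that would require moving much faster than the last
    // known speed allows, allowing for the fix's own error radius.
    return traveled <= lastKnownSpeed * slack * elapsed + newFix.horizontalAccuracy
}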
Disclaimer: I've only played with CoreLocation a bit, I've not tested the accuracy very closely.
I'd expect that you'd get the most accurate results by using the defaults for distanceFilter and desiredAccuracy. Less-frequent updates are only going to give you less data to work with.
One issue you're likely to run into is when the location fix is lost for a while, then comes back. The naive, connect-the-dots approach to figuring out distance traveled is going to tend to under-estimate the actual speed of the runner. Rather than using CLLocation.speed, you might get better results calculating speed based on some heuristic approximation to the line the runner is actually following.
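One possible shape for such a heuristic, sketched in Swift (the 15-second window is an assumption to tune):

import CoreLocation

// Estimate speed by summing segment distances over a short rolling
// window of recent fixes, which smooths out GPS jitter better than
// reading CLLocation.speed directly.
func rollingSpeed(fixes: [CLLocation], window: TimeInterval = 15) -> CLLocationSpeed? {
    guard let last = fixes.last else { return nil }
    let recent = fixes.filter { last.timestamp.timeIntervalSince($0.timestamp) <= window }
    guard recent.count >= 2, let first = recent.first else { return nil }
    let elapsed = last.timestamp.timeIntervalSince(first.timestamp)
    guard elapsed > 0 else { return nil }
    var distance: CLLocationDistance = 0
    for (a, b) in zip(recent, recent.dropFirst()) {
        distance += b.distance(from: a)
    }
    return distance / elapsed  // meters per second
}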