Find the walking distance using the accelerometer - iPhone

I need to make an app in which the user can find the walking distance between two points.
The concept: the user starts the app and starts walking, and after taking a few steps he taps a button that shows him the distance travelled from the point where he started the app to where he stopped.
I know we can find the distance between two locations using the CLLocation class, but the challenge is to get measurements accurate to about 3 meters.
I'm not even sure we can use the accelerometer, because when I'm at rest it detects only the acceleration due to gravity. That obviously sets the walking distance to 0, so I don't know how I would detect the starting point.
Any hint/suggestion on this would be a great help to me.

Check this: iphone accelerometer speed and distance
As far as I know, it is almost impossible to get accurate distance results using the accelerometer.

Related

Get distance height with accelerometer

I have searched for hours now and still haven't found a definitive answer to my problem.
The scenario is this: the user throws an iPhone as high as he can, and I want to measure the height the iPhone reached.
I want to use the accelerometer with Core Motion, and I have successfully implemented a simple system that gives me the acceleration on the three axes. That is only an acceleration, though.
Based on my physics knowledge, the formula for the maximum height is h = v0²/(2·g), where v0 is the launch velocity.
I only have the acceleration, though.
Any idea how I can convert the acceleration to velocity, or get the velocity directly from my accelerometer?
I know it's not a completely programming related question, but I just want to have some help on this :)
First of all, there is already an app that does exactly what you are up to: Send Me To Heaven. You won't find it in Apple's App Store because it never passed review; guess why ;-)
As you stated, you only have access to accelerations. h = v0²/(2·g) is correct. To get the starting velocity v0 you need to integrate the acceleration numerically over time. The trickiest part will be to find necessary and sufficient conditions to determine the time interval [t1, t2], i.e. when the acceleration (throw) phase started and when it stopped.
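For illustration, a minimal sketch of that numerical integration using Core Motion's device-motion updates; the 100 Hz rate, the simple Euler step, and the projection onto the gravity direction are my assumptions, and detecting [t1, t2] is deliberately left out:

```swift
import CoreMotion

let motionManager = CMMotionManager()
let g = 9.81                        // Core Motion reports acceleration in g; convert to m/s²
var v0 = 0.0                        // accumulated vertical launch velocity in m/s
var lastTimestamp: TimeInterval?

motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let motion = motion else { return }
    defer { lastTimestamp = motion.timestamp }
    guard let last = lastTimestamp else { return }

    let dt = motion.timestamp - last
    // Project user acceleration onto the gravity direction to get the vertical
    // component (sign flipped so "up" is positive), then do a simple Euler step.
    let a = motion.userAcceleration
    let gr = motion.gravity
    let vertical = -(a.x * gr.x + a.y * gr.y + a.z * gr.z) * g
    // In a real implementation you would only accumulate between t1 and t2 (the throw phase).
    v0 += vertical * dt
}

// After the throw phase ends: h = v0²/(2·g)
func estimatedHeight(launchVelocity: Double) -> Double {
    return (launchVelocity * launchVelocity) / (2.0 * g)
}
```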
Another thing to consider is how to avoid cheating users who just perform a rotation around their axis for a couple of seconds and then simulate the flying phase. For that you might consider the landing phase too: when the user grabs the device you should register a strong deceleration.
However, don't expect this app ever to get into the store, and on Google Play the competitors were faster.
Problems:
There are a number of problems that physics imposes on you (besides getting the velocity) before you can get the result.
The angle of the throw relative to the direction of gravity: you can't know the relative vertical distance unless you know this angle.
The orientation of the frame of reference throughout the throw: you cannot deduce the speed from the acceleration measured by the device itself unless you account for the changes in rotation while the phone accelerates.
However! You can decide to assume certain things, which will make these annoying problems go away!
Reasonable assumption:
The device is caught again, at the same relative height it was thrown.
This assumption reduces the problem to a much simpler one, in which we only need to find the duration of time during which the device is in free fall in order to determine the relative height of the throw.
All you have to do:
Determining whether the device is in free fall is relatively easy, since the total measured acceleration will be near 0 m/s².
However, there is still one small problem: the accelerometer is probably not located at the center of mass of the phone, so if the phone rotates around itself it will experience a constant acceleration in exactly one particular direction throughout the free fall.
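A rough sketch of that free-fall check, assuming raw accelerometer updates from CMMotionManager; the 0.2 g threshold (left a bit loose precisely because of the rotation effect just mentioned) is an assumption to tune, not a tested value:

```swift
import CoreMotion

let motionManager = CMMotionManager()
let freeFallThreshold = 0.2         // in g; assumed value, tune against real throws
var freeFallStart: TimeInterval?

motionManager.accelerometerUpdateInterval = 1.0 / 100.0
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let data = data else { return }
    let a = data.acceleration
    let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)   // total acceleration in g

    if magnitude < freeFallThreshold {
        // Entering or continuing free fall
        if freeFallStart == nil { freeFallStart = data.timestamp }
    } else if let start = freeFallStart {
        // Free fall just ended; the airtime is what you need.
        let airtime = data.timestamp - start
        print("Airtime: \(airtime) s")
        // Converting airtime to height is the exercise left to the reader below.
        freeFallStart = nil
    }
}
```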
The maths of determining the height of a vertical throw, based on the airtime duration is left as an exercise to the reader :-)

Gravity as frame of reference in accelerometer data in iOS

I'm working on an iPhone app for motorcyclists that will detect a crash after it has occurred. Currently we're in the data-acquisition phase, plotting graphs and looking at the data. What I need to log is the forward user acceleration and the tilt angle of the bike relative to the bike standing upright on the road. I can get the user acceleration vector, i.e. the forward direction the rider is heading, from the square root of the sum of the squared x, y and z accelerometer values. But for the tilt angle I need a reference that is constant, so I thought: let's use the gravity vector. Now, I realize that the deviceMotion API has gravity and userAcceleration values; where do these values come from and what do they mean? If I take the square root of the sum of the squared x, y and z components of the gravity, will that always give me my 'up' direction? How can I use that to find the tilt angle of the bike relative to an upright bike on the road? Thanks.
Setting aside "why" do this...
You need a very low-pass filter. Once the phone is put wherever-it-rides on the bike, you'll have various accelerations from maneuvers plus the ever-present acceleration from gravity in the background. That gives you an ongoing vector for "down", and you can then interpret the accelerometer data in that context... Forward acceleration would tip the bike the opposite way from braking, so I think you could sort out the forward direction in real time too.
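A sketch of what that very low-pass filter could look like on the raw accelerometer stream; the smoothing factor and update rate are assumed values you would tune:

```swift
import CoreMotion

// Very low-pass filter: the slowly varying part of the accelerometer signal
// approximates the gravity ("down") vector; the remainder is maneuver/user acceleration.
var gravityEstimate = (x: 0.0, y: 0.0, z: 0.0)
let alpha = 0.05   // smoothing factor (assumed; smaller = slower and smoother)

let motionManager = CMMotionManager()
motionManager.accelerometerUpdateInterval = 1.0 / 50.0
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    gravityEstimate.x = alpha * a.x + (1 - alpha) * gravityEstimate.x
    gravityEstimate.y = alpha * a.y + (1 - alpha) * gravityEstimate.y
    gravityEstimate.z = alpha * a.z + (1 - alpha) * gravityEstimate.z
    // Whatever is left over is acceleration relative to that "down" vector.
    let userX = a.x - gravityEstimate.x
    let userY = a.y - gravityEstimate.y
    let userZ = a.z - gravityEstimate.z
    _ = (userX, userY, userZ)   // feed into your tilt/crash logic
}
```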
Very interesting idea.
Assuming it's not a "joke question", you will need a reference point to compare against, i.e. the gravity vector captured when the user taps "start". Then you can use acos(currentGravity.z / |referenceGravity|), with |referenceGravity| == 1 because Core Motion reports accelerations in units of g.
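As a sketch of that calculation, assuming you capture CMDeviceMotion.gravity once as the reference when the rider taps "start" (the function and variable names here are just illustrative):

```swift
import Foundation
import CoreMotion

// Angle between the current gravity vector and the reference one captured at start.
// Core Motion reports gravity in units of g, so normalise before taking acos.
func tiltAngle(current: CMAcceleration, reference: CMAcceleration) -> Double {
    let dot = current.x * reference.x + current.y * reference.y + current.z * reference.z
    let magCurrent = sqrt(current.x * current.x + current.y * current.y + current.z * current.z)
    let magReference = sqrt(reference.x * reference.x + reference.y * reference.y + reference.z * reference.z)
    return acos(dot / (magCurrent * magReference))   // radians; 0 = same attitude as the reference
}
```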
But to be honest there are a couple of problems, for instance:
The device has to be in a fixed position when taking the reference frame; if you put it in a pocket and it's just moving around a little inside, your measurement is rubbish.
Hmm, the driver is dead but the device is alive? Chances are good that the iPhone won't survive either.
If an app goes to the background, Core Motion falls asleep and stops delivering values.
It has to be an in-house app, because you can forget about getting approval for the App Store.
Or did we misunderstand you and it's just a game?
Since this is not a joke, I would like to address the mounting issue. How to interpret the data depends largely on how the iPhone is positioned, and some issues might not be apparent to those who don't actually ride motorcycles.
This matters particularly when going around curves/corners: in low-speed turns the motorcycle leans but the rider does not, or leans only slightly; in higher-speed turns both the rider and the motorcycle lean. This could present an issue if not addressed. I won't cover all the scenarios, but...
For example, most modern textile motorcycle jackets have a cell-phone pocket just inside on the left. If the user were to put their phone in this pocket, you could expect to see only 'accelerating' and 'braking' (~z) acceleration. In this scenario you would almost never see significant side-to-side (~x) acceleration, because the rider leans proportionally into the g-force of the turn. While going around a curve, you would instead expect to see an increase in the downward (y) reading from its usual 1 g state. Essentially the rider's torso is indexed to gravity as far as (x) measurements go.
If the device were mounted to the bike you would have to adjust for what you would expect to see given that mounting point.
As far as the heuristics of the crash-detection algorithm go, that is very hard to define. Some crashes are like you see on television, with the bike flipping and ripping into a million pieces; that kind of crash should be extremely easy to detect (huh, 3 g's measured... crash!). But what about simple downs? (The bike lies on its side, oops, the rider gets up, picks up the bike and rides away.) Those might occur without any particularly remarkable g-forces (with the exception of roughly 1 g left or right on the x axis).
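For the "television" kind of crash, a naive first cut is just a threshold on the user-acceleration magnitude; the 3 g value below is purely an assumption to illustrate the idea, and the simple downs are exactly what it will miss:

```swift
import CoreMotion

let motionManager = CMMotionManager()
let crashThreshold = 3.0   // in g; assumed for illustration, needs tuning against real ride data

motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let a = motion?.userAcceleration else { return }
    let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
    if magnitude > crashThreshold {
        // Candidate crash event; in practice you would also require a follow-up
        // period of stillness or a sustained lean before raising an alert.
        print("Possible crash: \(magnitude) g")
    }
}
```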
A couple more suggestions:
Sensitivity adjustment, maybe even with some sort of learn mode (where the user puts the device in this mode and rides; the device then records/learns average riding for that user).
An "I've stopped" or similar button; maybe the rider didn't crash, maybe he/she just broke down (it does happen), and since you have some sort of ad-hoc network set up it should be easy to spread the news.

iOS: Core Motion used to detect larger movements over distance?

I have a GPS app in which I would like to detect whether the user is standing still and not moving. Using Core Location works for this, but it is sometimes not accurate, because new updates jump around and give the illusion of speed and motion.
So I am wondering if, in addition to that, I can also use Core Motion. Is it a good idea for detecting motion such as someone walking, running, driving, etc., and knowing when they are no longer doing that? Or is Core Motion only for small movements such as tilting the device or lifting it to your ear?
I wanted to tell others who visit this question what I've learned and what I think about this approach.
I have been doing some research of my own to find out whether this is possible and, more importantly, what the battery consumption and accuracy of the detected location change would be. For Android, this question was asked quite some time back. The answer provides links to this Google Tech Talk. At 23:20 the speaker talks about how difficult it is to achieve this and about the accuracy you can expect in the results.
Even though I have come to realize that the battery consumption of the sensors on the iPhone is a little lower than on most Android phones, I still think this is a costly affair in terms of accuracy and battery consumption.
You can use the GPS together with the sensor readings to distinguish between walking, running, etc., if you combine the frequency of the tilt-angle changes with the GPS speed information (you need to do some work to get some of this info, of course, but that's the way to do it).
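A toy sketch of that combination; the speed bands and the "bounciness" threshold are assumptions for illustration, not calibrated values:

```swift
import CoreLocation

// Classify activity by combining GPS speed with how "bouncy" the motion signal is.
// `tiltChangeVariance` would come from the variance of the device-motion pitch (or
// vertical user acceleration) over the last couple of seconds; `speed` from
// CLLocation.speed (m/s, -1 when unknown). All thresholds below are assumptions.
func classifyActivity(speed: CLLocationSpeed, tiltChangeVariance: Double) -> String {
    guard speed >= 0 else { return "unknown (no speed fix)" }
    switch (speed, tiltChangeVariance) {
    case (..<0.3, ..<0.005): return "stationary"
    case (..<2.5, _):        return "walking"
    case (..<5.5, _):        return "running"
    default:                 return "driving / cycling"
    }
}
```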
You are talking about 4 different measurements from 4 different sensors (technically more than 4 but..) -
Latitude & Longitude - from CoreLocation. It uses a mix of GPS + cell tower triangulation.
Accelerometer - the current orientation of the device in 3D space.
Gyroscope - orientation of the device on its own axis.
Magnetometer - tells you which direction the device is pointing w.r.t. north, south, east and west.
Of all these, I think only Latitude & Longitude are of use to you. Basically, what you do is make the sensitivity (i.e. the update rate from the sensor) a bit more relaxed. With some tweaking you should be able to tell with good accuracy whether a person is standing or moving.
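A sketch of that relaxed setup with CLLocationManager; the distanceFilter, accuracy, and speed cutoff are assumed values, and the authorization prompts are omitted for brevity:

```swift
import CoreLocation

final class StillnessDetector: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyNearestTenMeters
        manager.distanceFilter = 20   // metres; ignore jitter below ~20 m (assumed value)
        // Authorization requests omitted for brevity.
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        // speed is in m/s and -1 when unknown; treat slow, reasonably accurate fixes as "standing still".
        let standingStill = location.speed >= 0 && location.speed < 0.5
            && location.horizontalAccuracy <= 20
        print(standingStill ? "probably standing still" : "moving (or the fix is too noisy to tell)")
    }
}
```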

Android/ iOS how to determine small changes in distance using sensors?

I have been doing a bit of research, but I cannot seem to find a way to determine small distances (centimeters and meters) using the sensors in Android or iOS devices.
Bluetooth appears too inaccurate and requires more than one device, GPS only works over larger distances, and small variations in rotation seem to make using the accelerometer nearly impossible.
Is there a method I am unaware of that would allow me to do such a thing? I am familiar with calculus, so using integrals to determine distance from changes in time and velocity/acceleration is not a problem for me; I just do not know how to obtain those quantities.
Thank you.
There is no sensor in these devices that is able to give you the desired accuracy without external help.
If your use case allows for a bit of external setup, here are some ideas:
You could use the camera and computer vision to calculate device movement. You could, for example, use ARToolkit to measure the distance to a visual tag fixed to a wall. At close distances you can get pretty high accuracy (millimetres) with this technique.
Another idea would be to measure the distance to a solid object, like a wall, by emitting a short audio signal from the speaker and measuring the time until the echo arrives at the microphone. This would be more of a research project, though.
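Just to put a number on the echo idea: the one-way distance is half the round-trip delay times the speed of sound; the hard, research-y part is timing the echo accurately, which this sketch simply assumes you can do:

```swift
// Distance to the reflecting surface from a measured round-trip echo delay.
// Assumes ~343 m/s speed of sound (about 20 °C); measuring roundTripSeconds
// precisely from the microphone signal is the genuinely hard part.
func echoDistance(roundTripSeconds: Double) -> Double {
    let speedOfSound = 343.0            // m/s
    return speedOfSound * roundTripSeconds / 2.0
}
// e.g. a 10 ms round trip corresponds to roughly 1.7 m to the wall
```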
You CAN use the accelerometer to measure the distance travelled (if ONLY absolute displacement is involved).
Have the user hold the device flat and walk from point A to point B. The user presses a "Start" button in your app as he starts from A and presses an "End" button as he reaches B.
Calculate the double integral of AccelX and AccelY separately over the time between the two button presses. These will be distX and distY respectively. The total displacement will be sqrt(distX² + distY²).
Good luck!!
Regards,
CVS#2600Hertz
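For anyone who wants to try it anyway, here is roughly what that double integration looks like with Core Motion; using userAcceleration (gravity already removed) and plain Euler steps are my assumptions, and in practice the drift grows badly within seconds, which is why other answers here discourage this approach:

```swift
import CoreMotion

let motionManager = CMMotionManager()
let g = 9.81                            // convert Core Motion's g units to m/s²
var velocity = (x: 0.0, y: 0.0)
var distance = (x: 0.0, y: 0.0)
var lastTimestamp: TimeInterval?

func startMeasuring() {
    velocity = (x: 0, y: 0); distance = (x: 0, y: 0); lastTimestamp = nil
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        defer { lastTimestamp = motion.timestamp }
        guard let last = lastTimestamp else { return }
        let dt = motion.timestamp - last
        // First integration: acceleration (g to m/s²) -> velocity.
        velocity.x += motion.userAcceleration.x * g * dt
        velocity.y += motion.userAcceleration.y * g * dt
        // Second integration: velocity -> displacement.
        distance.x += velocity.x * dt
        distance.y += velocity.y * dt
    }
}

func stopMeasuring() -> Double {
    motionManager.stopDeviceMotionUpdates()
    // Total planar displacement; sensor noise and bias make this drift quickly.
    return (distance.x * distance.x + distance.y * distance.y).squareRoot()
}
```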
Just as a thought experiment, you should be able to do this using a combination of the accelerometer and the compass on each device.
However, whether the accuracy of these sensors is enough for what you want to do... well, I think you'd just have to try it.

Detecting movement with an iphone

Can the iPhone detect its movement in terms of distance?
Would one be able to use a built-in function on an iPhone to determine the distance the phone has moved, so that the speed of movement can be calculated?
Basically, my question is: can an iPhone detect its position and the distance it has moved without using the GPS?
Thanks
You probably could with some clever math.
Basically, integrate over the accelerometer data.
For all the details, see http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf
No, the only way the device can calculate "distance" is via the Location API, which makes use of the GPS. The accelerometer and gyro (on the iPhone 4) can give precise measurements of changes in orientation, but not of distance travelled.
Not easily. There are a couple of ways you can do this, but they have severe limitations and you'll have to write all the code yourself.
One way is to use the accelerometer and try to calculate the distance from the forces on the phone; this is never going to be very reliable.
Another way is to use Wi-Fi, essentially looking at the signal strength to determine the distance from the router (I think this is only possible using private APIs, and it requires several routers to be at all accurate), or listening from a router to find out how far away the iPhone is.