iOS: Can Core Motion be used to detect larger movements over distance?

I have a GPS app and I would like to detect whether the user is standing still and not moving. Using Core Location works for this, but it is sometimes inaccurate because new location updates jump around and give the illusion of speed and motion.
So I am wondering whether, in addition to that, I can also use Core Motion. Is it a good idea to use it to detect motion such as walking, running, or driving, and to know when the user is no longer doing that motion? Or is Core Motion only for small movements such as tilting the device or lifting it to your ear?

I wanted to tell others who visit this question what I've learned and what I think about this approach.
I have been doing some research of my own to find out whether this is possible and, more importantly, what the battery consumption and accuracy of the detected location changes would be. A similar question was asked for Android quite some time back, and the answer provides links to this Google Tech Talk. At 23:20, the speaker talks about how difficult this is to achieve and what accuracy you can expect in the results.
Even though I have come to realize that the battery consumption of the iPhone's sensors is a little lower than on most Android phones, I still think this is a costly affair in terms of both accuracy and battery consumption.

You can use the GPS together with the sensor readings to distinguish between walking, running, etc. if you combine the change in tilt-angle frequency with the GPS speed information (you need to do some work to extract some of this information, of course, but that's the way to do it); a rough sketch follows below.
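To make the combination concrete, here is a rough, hedged sketch in Swift. The class name, the window size, and the speed/bounce thresholds are all illustrative assumptions rather than values from the answer; the accelerometer variance stands in for the "tilt angle frequency" measurement and would need refining on real recordings.

```swift
import CoreLocation
import CoreMotion

// Hypothetical sketch: combine GPS speed with the "bounce" energy seen by the
// accelerometer to guess standing / walking / running / driving.
// All names and thresholds here are illustrative assumptions.
final class ActivityGuesser: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()
    private var recentAccelMagnitudes: [Double] = []

    func start() {
        locationManager.delegate = self
        locationManager.desiredAccuracy = kCLLocationAccuracyBest
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()

        motionManager.accelerometerUpdateInterval = 1.0 / 30.0
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            // Magnitude of acceleration in g; roughly 1.0 when the device is still.
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            self.recentAccelMagnitudes.append(magnitude)
            if self.recentAccelMagnitudes.count > 90 {   // keep roughly the last 3 s
                self.recentAccelMagnitudes.removeFirst()
            }
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last, location.speed >= 0 else { return }
        let speed = location.speed          // metres per second, from GPS
        let bounce = accelVariance()        // how "shaky" the device currently is

        // Illustrative thresholds only; they need tuning against real data.
        if speed < 0.3 && bounce < 0.01 {
            print("standing still")
        } else if speed < 2.5 {
            print("walking")
        } else if speed < 4.5 && bounce > 0.05 {
            print("running")
        } else {
            print("driving or other fast movement")
        }
    }

    private func accelVariance() -> Double {
        guard !recentAccelMagnitudes.isEmpty else { return 0 }
        let mean = recentAccelMagnitudes.reduce(0, +) / Double(recentAccelMagnitudes.count)
        return recentAccelMagnitudes.reduce(0) { $0 + ($1 - mean) * ($1 - mean) }
            / Double(recentAccelMagnitudes.count)
    }
}
```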

You are talking about four different measurements from four different sensors (technically more than four, but still):
Latitude & longitude - from Core Location. It uses a mix of GPS and cell tower triangulation.
Accelerometer - the acceleration along the device's three axes, from which you can infer its orientation in 3D space.
Gyroscope - the rate of rotation of the device around its own axes.
Magnetometer - tells you which direction the device is pointing with respect to north, south, east, and west.
Of all of these, I think only latitude and longitude are of use to you. Basically, what you do is make the sensitivity (i.e. the update rate from the sensor) a bit more relaxed. With some tweaking of this you should be able to tell with good accuracy whether a person is standing still or moving; a sketch of the idea follows below.
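Below is a minimal sketch of that relaxed-sensitivity idea using CLLocationManager's distanceFilter. The 20 m filter and the 60 s timeout are assumptions for illustration and would need tuning.

```swift
import CoreLocation

// Minimal sketch of the "relaxed sensitivity" idea: only ask Core Location
// for an update after the device has moved a meaningful distance, and treat
// a long silence as "standing still". The 20 m filter and 60 s timeout are
// illustrative assumptions, not values from the answer.
final class StillnessDetector: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var lastMovementAt = Date()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyNearestTenMeters
        manager.distanceFilter = 20          // metres moved before the next update
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Any update that passes the distance filter counts as movement.
        lastMovementAt = Date()
    }

    var userAppearsStationary: Bool {
        // No qualifying update for a minute -> probably standing still.
        return Date().timeIntervalSince(lastMovementAt) > 60
    }
}
```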

Related

iPhone 4, iOS 5: is there a physics engine to convert CMDeviceMotion events into displacement?

I'm running a CMDeviceMotion processing queue on iPhone 4, which gives me user-induced acceleration, along with the rotation rates. I can filter this data myself.
What I'm trying to understand is how to convert these discrete samples of acceleration, device attitude, and rotation rate into a three-dimensional displacement. This is possible with classical mechanics for straight lines, but I'm thinking of more advanced calculations - for example, curves. This can be handled with GPS, but I'm looking for a much better resolution - let's say within 10 feet. GPS under clear sky has an average accuracy of about 30 feet.
Is there some sort of a physics engine or physics processor that can take a set of device motion or acceleration/turn rate events and give me a distance of how far the phone is from the original location?
I know that there are various pedometer and bike GPS trackers for iPhone. Are they based on GPS or do they actually do the acceleration integration like I'm describing?
Unfortunately, the acceleration integration you are describing won't work by itself.
However, you may improve the accuracy by fusing it with the GPS signal and/or by making domain-specific assumptions. For details, see the link above.
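For context, here is a hedged sketch of the naive dead reckoning the question describes, reduced to a single axis. The class name and the 100 Hz rate are assumptions; the point is only to show where the drift the answer warns about comes from.

```swift
import CoreMotion

// Sketch of naive dead reckoning: integrate the user acceleration twice to
// get displacement. It deliberately leaves out the device-to-world rotation
// (a real attempt would rotate userAcceleration by motion.attitude first).
// Noise and bias make the result drift badly within seconds, so this is an
// illustration of the approach, not a working tracker.
final class NaiveDeadReckoner {
    private let motionManager = CMMotionManager()
    private var velocity = 0.0       // m/s along the device's x axis
    private var displacement = 0.0   // m along the device's x axis

    func start() {
        let dt = 1.0 / 100.0
        motionManager.deviceMotionUpdateInterval = dt
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            // userAcceleration is gravity-compensated and reported in g.
            let ax = motion.userAcceleration.x * 9.81

            // First integration: acceleration -> velocity.
            self.velocity += ax * dt
            // Second integration: velocity -> displacement.
            self.displacement += self.velocity * dt

            // Any constant bias in ax is integrated twice, so the error in
            // `displacement` grows quadratically with time.
        }
    }
}
```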

Detecting the user's spinning motion

I have been experimenting with the Core Motion framework to detect a user spinning around, say on a merry-go-round, holding an iPhone in his hand.
There are ways to detect the device motion around its own axes, but what is a good way to detect the iPhone spinning in circles?
Thanks
You can use the gyroscope. Take a look here: Gyroscope example
Keep in mind that the gyroscope is only available on the iPhone 4 and iPhone 4S.
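A hedged sketch of how one might use the gyroscope (via device motion) to detect sustained spinning, regardless of how the phone is held; the 1 rad/s threshold and the 2 s window are assumptions for illustration.

```swift
import CoreMotion

// Sketch of spin detection with the gyroscope: a merry-go-round shows up as
// a sustained rotation rate around the axis that is currently vertical.
// The 1 rad/s threshold and the 2 s window are illustrative assumptions.
final class SpinDetector {
    private let motionManager = CMMotionManager()
    private var spinningSince: Date?

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }

            // Project the rotation rate onto gravity to get the component of
            // rotation around the vertical axis, whatever the phone's attitude.
            let g = motion.gravity
            let r = motion.rotationRate
            let verticalSpin = abs(r.x * g.x + r.y * g.y + r.z * g.z)

            if verticalSpin > 1.0 {                    // about 1 rad/s
                if self.spinningSince == nil { self.spinningSince = Date() }
                if let start = self.spinningSince,
                   Date().timeIntervalSince(start) > 2.0 {
                    print("User appears to be spinning in circles")
                }
            } else {
                self.spinningSince = nil
            }
        }
    }
}
```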
There is one degenerate case where you can run into trouble; only the magnetometer (compass) can help in that particular case.
If you put the device (a) on a desk in a stationary position and then (b) on a perfectly horizontal turntable rotating slowly, you will get the same qualitative sensor readings. Both the gyro and the accelerometer readings are constant in the two cases, although they differ quantitatively. The sad part is that gyro bias error can make case (a) look like case (b) and vice versa. In this particular case you need a compass to cancel the gyro drift. Case (a) is typical for a phone.
Apart from this degenerate case, gyroscopes and accelerometers with sensor fusion are sufficient to track arbitrary rotations of the device.

iOS: Get how fast user is moving

I want to figure out whether a user is not moving at all, walking, or running, using the iPhone. I'm not trying to implement a pedometer. I just want to know roughly whether someone is moving briskly, slowly, or not at all. I don't need mph or anything like that.
I think the accelerometer may be able to do this for me, but I was wondering if someone knows of any tutorials or example code that might be able to point me in the right direction?
Thanks to all who reply.
The accelerometer won't do you any good here - it will only capture changes in velocity.
Just track the current location periodically and calculate the speed.
There are no hard thresholds for walking vs. running motion, so you will have to experiment a bit. The AccelerometerGraph sample code should get you started on how to get and interpret accelerometer data.
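A minimal sketch of the "track the location periodically and calculate the speed" approach; the classification thresholds are illustrative guesses, not established values.

```swift
import CoreLocation

// Sketch: compute speed from successive location fixes and bucket it into
// "not moving", "walking", or "running". Thresholds are illustrative only.
final class SpeedClassifier: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var previous: CLLocation?

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last else { return }
        if let last = previous {
            let metres = current.distance(from: last)
            let seconds = current.timestamp.timeIntervalSince(last.timestamp)
            if seconds > 0 {
                let speed = metres / seconds   // m/s
                if speed < 0.3 {
                    print("not moving")
                } else if speed < 2.5 {
                    print("walking")
                } else {
                    print("running (or faster)")
                }
            }
        }
        previous = current
    }
}
```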
The accelerometer is good, but if the user has an iPhone 4 or iPad 2, you should use the gyroscope.
CMMotionManager and the Event Handling Guide - Motion Events
Apple's documentation is the best example you can get!
People have a different bounce in their step when walking versus running, which can be measured with the accelerometer, but this differs between individuals (what shoes they are wearing, what surface they are on, which part of the body the iPhone is attached to, etc.), and this motion can probably be imitated by shaking the iPhone just right while standing still.
Experiment by recording the two types of acceleration profiles, and then use some sort of pattern matching to pick the most likely profile candidate from the current recorded acceleration data.
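As a hedged illustration of that profile-matching idea, the sketch below reduces each recorded profile to a single hand-picked feature (the standard deviation of the acceleration magnitude) and picks the nearest profile. The profile values are made up; a real implementation would use richer features such as step frequency.

```swift
import CoreMotion

// Crude sketch of "record profiles, then match": summarise a window of
// accelerometer samples with one feature and pick the closest pre-recorded
// profile. The numbers below are made up; measure your own from recordings.
struct ActivityProfile {
    let name: String
    let typicalStdDev: Double   // measured from your own recordings
}

let profiles = [
    ActivityProfile(name: "standing still", typicalStdDev: 0.01),
    ActivityProfile(name: "walking",        typicalStdDev: 0.15),
    ActivityProfile(name: "running",        typicalStdDev: 0.45),
]

func classify(window: [CMAccelerometerData]) -> String {
    guard !window.isEmpty else { return "unknown" }
    let magnitudes = window.map {
        sqrt($0.acceleration.x * $0.acceleration.x +
             $0.acceleration.y * $0.acceleration.y +
             $0.acceleration.z * $0.acceleration.z)
    }
    let mean = magnitudes.reduce(0, +) / Double(magnitudes.count)
    let variance = magnitudes.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(magnitudes.count)
    let stdDev = sqrt(variance)

    // Nearest-profile match on the single feature.
    return profiles.min { abs($0.typicalStdDev - stdDev) < abs($1.typicalStdDev - stdDev) }!.name
}
```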

Detecting movement with an iPhone

Can the iPhone detect its movement in terms of distance?
Would one be able to use a built-in function on an iPhone to determine the distance the phone has moved, so that the speed of movement can be calculated?
Basically, my question is:
can an iPhone detect its position and the distance it has moved without using the GPS?
Thanks
You probably could with some clever math.
Basically, integrate over the accelerometer data.
For all the details, see http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf
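For illustration, a hedged sketch of the double integration the app note describes, without the filtering, calibration, and movement-end detection the note adds (which is why this bare version drifts quickly):

```swift
// Sketch of double integration: given acceleration samples (m/s^2) taken at a
// fixed interval dt, integrate once for velocity and again for distance using
// the trapezoidal rule. Bias and noise are integrated along with the signal,
// so the raw result drifts quickly without the app note's extra processing.
func integrateDistance(accelerationSamples: [Double], dt: Double) -> Double {
    var velocity = 0.0
    var distance = 0.0
    var previousAcceleration = 0.0
    var previousVelocity = 0.0

    for a in accelerationSamples {
        // Trapezoidal integration: acceleration -> velocity -> distance.
        velocity += (previousAcceleration + a) * dt / 2.0
        distance += (previousVelocity + velocity) * dt / 2.0
        previousAcceleration = a
        previousVelocity = velocity
    }
    return distance
}
```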
No, the only way the device can calculate "distance" is via the Location API, which makes use of the GPS. The accelerometer and gyroscope (on the iPhone 4) can give precise measurements of changes in orientation, but not of distance travelled.
Not easily. There are a couple of ways you can do this, but they have severe limitations and you'll have to write all the code yourself.
One way is to use the accelerometer and try to calculate the distance from the forces on the phone; this is never going to be very reliable.
Another way is to use Wi-Fi, essentially looking at the signal strength to determine the distance from the router (I think this is only possible using private APIs and requires several routers to be at all accurate), or to listen from a router to find out how far away the iPhone is.

How to detect height of iPhone (for use in augmented reality game)?

I'm working on locating an iPhone device in 3D space.
I can use lat/long to detect physical location, I can use the magnetometer to figure out the direction they're facing, and I might be able to use the accelerometer to figure out how their device is oriented, but I can't figure out a way to get the height of the device off the floor.
Specifically, I need to know whether the user is squatting down or raising their hand toward the ceiling (a difference of about 2 meters / 6 feet).
I posted a more detailed description of what I'm trying to do on my blog: http://pushplay.net/blog_detail.php?id=36
I would love any suggestions as to how to even fake this sort of info. I really want the sort of interactivity and movement that would require ducking and bobbing, versus just letting someone sit back and angle the phone -- kind of the way people can "cheat" playing with a Wii...
The closest I could see you getting to what you're looking for is using the accelerometer/magnetometer as an inertial tracker. You'd have to calibrate the user's initial position on startup to a "base" position, then continuously sample the sensors on a background thread to build a movement model. This post talks about boosting the default sample rate of the accelerometer functions so that you can get a pretty fine-grained picture of the user's movements.
I'm not sure this will solve your concern about people simply angling the device to produce the desired action, but you will have to strike a balance between being too strict in interpreting movements and allowing for differences in how people move.
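A hedged sketch of that calibrated inertial idea, limited to vertical movement. The 100 Hz rate, the background queue, and the reset-at-start calibration are assumptions for illustration; drift remains the fundamental limitation.

```swift
import CoreMotion

// Sketch of the calibrated inertial idea: sample device motion at a high rate
// on a background queue, treat the pose at startup as the "base" position,
// and integrate the vertical component of user acceleration to estimate how
// far the phone has moved up or down since then. Drift is still a real
// problem; the rate and reset logic below are illustrative assumptions.
final class VerticalOffsetEstimator {
    private let motionManager = CMMotionManager()
    private let queue = OperationQueue()          // background processing
    private var verticalVelocity = 0.0            // m/s, up is positive
    private(set) var verticalOffset = 0.0         // m relative to the base pose

    func startFromBasePosition() {
        verticalVelocity = 0
        verticalOffset = 0                        // calibrate: "here" is height zero

        let dt = 1.0 / 100.0
        motionManager.deviceMotionUpdateInterval = dt
        motionManager.startDeviceMotionUpdates(to: queue) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }

            // Project user acceleration onto gravity to get its vertical
            // component, independent of how the phone is held. Both are in g;
            // gravity points down, so negate to make "up" positive.
            let g = motion.gravity
            let a = motion.userAcceleration
            let verticalG = -(a.x * g.x + a.y * g.y + a.z * g.z)
            let verticalAcceleration = verticalG * 9.81           // m/s^2

            self.verticalVelocity += verticalAcceleration * dt
            self.verticalOffset += self.verticalVelocity * dt
        }
    }
}
```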
Core Location gives you elevation as well as lat/long, so you could potentially use that (a sketch of reading it follows after this answer), although there are some significant problems with this:
It won't work well indoors (not a problem for sat-nav, but a problem for games).
Your users would have to "calibrate" (probably by placing the phone on the floor) each location they use!
In fact, you'd need to start keeping a list of "previously calibrated locations"... which could vary hugely even in one house (e.g. multiple rooms and floors). This could get in the way of the game.
It can't be used on moving transport (trains, planes, automobiles... even walking) because the elevation changes so frequently.
Therefore I'd have thought that using the accelerometer as a proxy for height is a substantially preferable route to determining absolute elevation.
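For reference, a minimal sketch of reading the elevation Core Location reports, along with its reported vertical accuracy, which is usually far coarser than the 2 m difference the question cares about. The class name is an illustrative assumption.

```swift
import CoreLocation

// Minimal sketch: read the altitude and vertical accuracy Core Location
// reports. verticalAccuracy is often tens of metres, which is the core
// problem described in the answer above.
final class AltitudeReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // A negative verticalAccuracy means the altitude value is invalid.
        guard let location = locations.last, location.verticalAccuracy > 0 else { return }
        print("Altitude: \(location.altitude) m, accurate to about \(location.verticalAccuracy) m")
    }
}
```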
I am not intimately familiar with the iPhone, but this might require a hardware add-on (which you probably don't want). After thinking about it, the only way I know of is through light, or more specifically a laser: you shoot a laser at the floor and record the time it takes to get back. It's actually not a lot of work to put this hardware together, and I am sure the iPhone has connections for peripherals. Unless someone can trump me, I say there is no way to do that with an image.