Execute action on tilt with accelerometer - iPhone

I want to create an iPhone application that executes certain actions based on the direction you tilt the device. For instance, if you tilt the device so the screen is pointing toward the floor, execute action x; if you tilt it so the screen is pointing toward the sky, execute action y. I found some examples that use the iPhone's accelerometer to detect the tilt of the phone, but the values the accelerometer generated were so erratic that it was hard to trigger a specific action from specific values.
I'm relatively new to using the accelerometer within applications, so I might be going about this completely the wrong way; any help would be appreciated.

The raw data out of the accelerometer is pretty jittery. At a minimum, you'll want to apply a high-pass or low-pass filter to the raw data. See the Apple sample code AccelerometerFilter.m for the basics of how to do that. I found that wasn't sufficient on its own, so I also keep a moving average of the data over time to accomplish what I need. You'll certainly need to play around with this to get it to do what you want.
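For anyone landing here, a minimal Swift/CoreMotion sketch of that low-pass idea; the 0.1 smoothing factor and the ±0.8 thresholds are assumptions to tune, not values from Apple's sample:

```swift
import CoreMotion

// Low-pass filter the raw accelerometer samples (the same exponential-smoothing
// idea as Apple's AccelerometerFilter.m), then threshold the filtered z axis.
final class TiltDetector {
    private let motionManager = CMMotionManager()
    private var filtered = (x: 0.0, y: 0.0, z: 0.0)
    private let kFilteringFactor = 0.1  // assumed value; tune for your app

    func start() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 60.0
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            // Keep mostly the old value, blend in a little of the new sample.
            self.filtered.x = a.x * self.kFilteringFactor + self.filtered.x * (1 - self.kFilteringFactor)
            self.filtered.y = a.y * self.kFilteringFactor + self.filtered.y * (1 - self.kFilteringFactor)
            self.filtered.z = a.z * self.kFilteringFactor + self.filtered.z * (1 - self.kFilteringFactor)

            // Face down (screen toward floor): z near +1. Face up (toward sky): z near -1.
            if self.filtered.z > 0.8 { self.screenFacesFloor() }
            else if self.filtered.z < -0.8 { self.screenFacesSky() }
        }
    }

    private func screenFacesFloor() { /* action x */ }
    private func screenFacesSky() { /* action y */ }
}
```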

Related

How can I detect if the user is walking/running with his device?

Simple question, hard answer:
I'd like to be able to detect whether the user is running or walking while holding the device. I know that the iPhone accelerometer measures acceleration, so if the user moves at a constant speed there will be no signal to detect.
Any help on that?
I actually used to work on that... What you can do is use the accelerometer and gyro to detect the frequency of the movement. If you plot a chart, you will see periodic behavior when you walk or run. Do some "field" testing and you can see how those frequencies differ between walking and running. It's pretty cool.
Try dynamic time warping (DTW).
First, you build a small "database" of motions that you would like to recognize.
Then, in your application you compare the current sensor readings with DTW to the ones in the database and pick the most similar one.
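A rough sketch of what that could look like in Swift, with the "database" reduced to a hypothetical dictionary of labeled 1-D acceleration-magnitude recordings:

```swift
// Classic dynamic time warping over two 1-D sequences. Returns the warped
// distance; smaller means more similar.
func dtwDistance(_ a: [Double], _ b: [Double]) -> Double {
    guard !a.isEmpty, !b.isEmpty else { return .infinity }
    let n = a.count, m = b.count
    var cost = Array(repeating: Array(repeating: Double.infinity, count: m + 1), count: n + 1)
    cost[0][0] = 0
    for i in 1...n {
        for j in 1...m {
            let d = abs(a[i - 1] - b[j - 1])
            // Best of insertion, deletion, and match from the neighboring cells.
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
        }
    }
    return cost[n][m]
}

// Pick the labeled recording in the (hypothetical) database most similar
// to the current sensor window.
func classify(_ sample: [Double], against database: [String: [Double]]) -> String? {
    database.min { dtwDistance(sample, $0.value) < dtwDistance(sample, $1.value) }?.key
}
```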

Game Design: Checking for object intersection or getting values from accelerometer

I'm currently developing an iPhone game where the player needs to tilt the device to do something. The game is somewhat of a memory game with the four corners of the screen being possible targets. The object of the game is to remember the order and then move the device to the right place.
My question is more about the design of the moving mechanic. The first option I thought of was to read the values from the accelerometer directly and, when they exceed a specific threshold, return whether that was the correct place to go (i.e. the right corner for the given instruction). My second idea is that each corner would have its own CGRect, the accelerometer would move another CGRect, and when the two intersect it would return whether the move was right or wrong.
In your opinion which one would be best? I think that the accelerometer data would be quicker but it might be affected by sudden movements while the other way might be slower but more accurate. Let me know what you think.
I think you should try both, test each of them on players, and see which works better. Drive game design decisions from user testing whenever possible.
My speculation is that you are going to need to damp or accumulate the accelerometer data somehow, since it is noisy; and if you are integrating that data into a moving average, then you should show where that moving average is with, e.g., an onscreen sprite.
You probably don't even need the CGRect intersection -- if you're just trying to determine whether <x0,y0> is within r units of <x1,y1>, you can do it with a simple Pythagorean distance check. But the important thing is that if there is internal state in the algorithm that accumulates the accelerometer data, you need to show that state onscreen for the control to feel responsive.
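For illustration, a minimal Swift sketch of that distance check; `cursor` here stands for whatever smoothed position your accelerometer integration produces:

```swift
import CoreGraphics

// Treat each corner target as a point with a hit radius instead of a CGRect.
func isHit(cursor: CGPoint, target: CGPoint, radius: CGFloat) -> Bool {
    let dx = cursor.x - target.x
    let dy = cursor.y - target.y
    // Compare squared distances to avoid taking a square root every frame.
    return dx * dx + dy * dy <= radius * radius
}
```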

iOS: Get how fast user is moving

I want to figure out whether a user is not moving at all, walking, or running, using the iPhone. I'm not trying to implement a pedometer. I just want to know roughly whether someone is moving briskly, slowly, or not at all. I don't need mph or anything like that.
I think the accelerometer may be able to do this for me, but I was wondering if someone knows of any tutorials or example code that might be able to point me in the right direction?
Thanks to all who reply
The accelerometer won't do you any good here - it will only capture changes in velocity.
Just track the current location periodically and calculate the speed.
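A minimal Swift sketch of that location-based approach; note that CLLocation already exposes a speed value in meters per second, so you don't even have to difference coordinates yourself (the walking/running thresholds below are assumptions to tune):

```swift
import CoreLocation

// Classify pace from Core Location's speed readings.
final class PaceMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // A negative speed means the reading is invalid.
        guard let speed = locations.last?.speed, speed >= 0 else { return }
        switch speed {                 // m/s; hypothetical cutoffs
        case ..<0.5:  print("not moving")
        case ..<2.5:  print("walking")
        default:      print("running")
        }
    }
}
```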
There are no hard thresholds for walking vs. running motion, so you will have to experiment a bit. The AccelerometerGraph sample code should get you started on how to get and interpret accelerometer data.
The accelerometer is good, but if the user has an iPhone 4 or iPad 2 you should use the gyroscope.
See CMMotionManager and the Event Handling Guide - Motion Events.
Apple's documentation is the best example you can get!
People have a different bounce in their step between walking and running, which can be measured with the accelerometer, but this differs between individuals (what shoes they are wearing, what surface they are on, where on the body the iPhone is carried, etc.), and the motion can probably be imitated by shaking the iPhone just right while standing still.
Experiment by recording the two types of acceleration profiles, and then use some sort of pattern matching to pick the most likely profile candidate from the current recorded acceleration data.
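One simple way to start such an experiment is to record a windowed statistic of the acceleration magnitude; a Swift sketch, where the window size and thresholds are made-up values to calibrate against your own recordings:

```swift
import CoreMotion

// Classify motion from the variance of acceleration magnitude over a
// sliding window; walking and running tend to produce different variances.
final class MotionProfiler {
    private let motion = CMMotionManager()
    private var window: [Double] = []
    private let windowSize = 128  // ~2 seconds at the assumed 64 Hz rate

    func start() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 64.0
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            self.window.append(magnitude)
            if self.window.count > self.windowSize { self.window.removeFirst() }
            guard self.window.count == self.windowSize else { return }

            let mean = self.window.reduce(0, +) / Double(self.windowSize)
            let variance = self.window.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(self.windowSize)
            // Hypothetical thresholds; calibrate from recorded profiles.
            if variance < 0.01 { print("still") }
            else if variance < 0.2 { print("walking") }
            else { print("running") }
        }
    }
}
```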

Detecting movement with an iphone

Can the iPhone detect its movement in terms of distance?
Would one be able to use a built-in function on an iPhone to determine the distance the phone has moved, so that the speed of movement can be calculated?
Basically, my question is: can an iPhone detect its position and distance moved without using GPS?
Thanks
You probably could, with some clever math.
Basically, integrate the accelerometer data twice: acceleration integrates to velocity, and velocity integrates to position.
For all the details, see http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf
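In outline, the double integration looks like the following sketch (single axis, plain Euler steps; the app note adds filtering and a zero-velocity check, and without those the position estimate drifts badly within seconds):

```swift
// Acceleration -> velocity -> position, stepped once per sensor sample.
struct InertialIntegrator {
    private(set) var velocity = 0.0  // m/s, single axis for simplicity
    private(set) var position = 0.0  // m

    mutating func step(acceleration: Double, dt: Double) {
        // Simple Euler integration; sensor noise and bias accumulate in
        // both integrals, which is why this is unreliable in practice.
        velocity += acceleration * dt
        position += velocity * dt
    }
}
```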
No, the only sensor that the device has that can calculate "distance" is via the Location API, which will make use of the GPS. Accelerometer and gyros (in iPhone 4) can give precise measurements of changes in orientation, but not distance travelled.
Not easily. There are a couple of ways you could do this, but they have severe limitations and you'd have to write all the code yourself.
One way is to use the accelerometer and try to calculate the distance from the forces on the phone; this is never going to be very reliable.
Another way is to use Wi-Fi, essentially looking at the signal strength to determine the distance from the router (I think this is only possible using private APIs, and it requires several routers to be at all accurate), or measuring from the router's side how far away the iPhone is.

How to detect height of iPhone (for use in augmented reality game)?

I'm working on locating an iPhone device in 3D space.
I can use lat/long to detect physical location, I can use the magnetometer to figure out the direction they're facing, and I might be able to use the accelerometer to figure out how their device is oriented, but I can't figure out a way to get height of the device off the floor.
Specifically, I need to know if the user is squatting down or raising their hand toward the ceiling (a difference of about 2 meters/6 feet).
I posted a more detailed description of what I'm trying to do on my blog: http://pushplay.net/blog_detail.php?id=36
I would love any suggestions as to how to even fake this sort of info. I really want the sort of interactivity and movement that would require ducking and bobbing, versus just letting someone sit back and angle the phone -- kind of the way people can "cheat" playing with a Wii...
The closest I could see you getting to what you're looking for is using the accelerometer/magnetometer as an inertial tracker. You'd have to calibrate the user's initial position on startup to a "base" position, then continuously sample the sensors on a background thread to build a movement model. This post talks about boosting the default sample rate of the accelerometer functions so that you can get a pretty fine-grained picture of the user's movements.
I'm not sure this will solve your concern about people simply angling the device to produce the desired action, but you will have to strike a balance between interpreting movements too strictly and allowing for differences in movement.
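As a starting point, a Swift sketch of the sampling side of that idea, raising the device-motion rate and collecting samples on a background queue; building the actual movement model is the hard part and is left out here:

```swift
import CoreMotion

let motion = CMMotionManager()
let queue = OperationQueue()  // background thread, as suggested above

func startInertialSampling() {
    guard motion.isDeviceMotionAvailable else { return }
    motion.deviceMotionUpdateInterval = 1.0 / 100.0  // well above the default rate
    motion.startDeviceMotionUpdates(to: queue) { data, _ in
        guard let d = data else { return }
        // d.userAcceleration has gravity already removed; d.attitude gives the
        // orientation, so samples can be rotated into a world frame before
        // being fed into whatever movement model you build.
        _ = d.userAcceleration
        _ = d.attitude
    }
}
```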
The CoreLocation stuff gives you elevation as well as lat/long, so you could potentially use that, although there are some significant problems with this:
Won't work well indoors (not a problem for sat nav, but a problem for games).
Your users would have to "calibrate" (probably by placing the phone on the floor) each location they use!
In fact, you'd need to start keeping a list of "previously calibrated locations"... which could vary hugely just in one house (e.g. multiple rooms and floors). Could get in the way of the game.
Can't be used on moving transport (trains, planes, automobiles... even walking) because the elevation changes so frequently.
Therefore I'd have thought that using the accelerometer as a proxy for height is a substantially better route than determining absolute elevation.
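If you do try the Core Location route despite those problems, the reading itself is simple; a sketch, where `floorAltitude` is a hypothetical calibration value captured when the user places the phone on the floor:

```swift
import CoreLocation

// Compare the current GPS altitude against a calibrated floor altitude.
// Note GPS altitude is far too coarse indoors to reliably resolve a 2 m
// squat-vs-reach difference on its own.
var floorAltitude: CLLocationDistance?

func handle(_ location: CLLocation) {
    guard location.verticalAccuracy >= 0 else { return }  // altitude invalid
    if let floor = floorAltitude {
        let heightAboveFloor = location.altitude - floor
        print("about \(heightAboveFloor) m above the calibrated floor")
    } else {
        floorAltitude = location.altitude  // calibrate on the first fix
    }
}
```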
I am not intimately familiar with the iPhone, but it might require a hardware add-on (which you probably don't want). After thinking about this, the only way I know of is through light, or more specifically a laser: you shoot a laser at the floor and record the time it takes to get back. It's actually not a lot of hardware to put together, and I am sure the iPhone has connections for peripherals. Unless someone can trump me, I say there is no way to do that with an image.