Accelerometer detecting side-to-side movements - iPhone

Can someone please give me an example of how to detect whether the iPhone is moving left or moving right? It's like a long version of a shake: I want my app to know whether the user's arm is swinging right or left. Imagine your hand as a windscreen wiper with the phone at the end.
Any ideas?
Thanks, Sam :P

That would be the acceleration.x value. Imagine 3 lines going through your iPhone:
1) a line perpendicular to your iPhone screen and going through it, that would be Z.
2) a horizontal and parallel line to your screen, that would be X (what you're looking for).
3) a vertical and parallel line to your screen, that would be Y.
So, moving the iPhone left or right will generate changes in the X graph. Moving the iPhone up or down will generate changes in the Y graph. Moving your iPhone forward or backward will generate changes in the Z graph.
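For concreteness, here is a minimal sketch of watching acceleration.x with Core Motion (the modern API; the original question predates it, but UIAccelerometer worked along the same lines). The 0.5 g threshold is just an illustrative guess you would tune:

```swift
import CoreMotion

let motionManager = CMMotionManager()

func startWatchingForSideSwings() {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 1.0 / 60.0   // sample at 60 Hz
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let x = data?.acceleration.x else { return }
        // x is in g's; it goes positive when the device is pushed
        // toward its right edge, negative toward its left edge.
        if x > 0.5 {
            print("pushed right")
        } else if x < -0.5 {
            print("pushed left")
        }
    }
}
```

Note that a wiper-style swing produces a push in one direction followed by a braking push in the other, so in practice you would key off the sign of the first large spike.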
Hope this helps you grasp how the accelerometer works,
~ Natanavra.

Related

Calculate the distance between the iPhone and an object, knowing both their physical widths (the iPhone's as well)

If you check this thread I started (Calculate the distance between the iPhone and a door, knowing their physical widths), I accepted an answer which states, correctly, that if you do not know the focal-length data of the iPhone's camera, there is no easy way to calculate the distance between an iPhone and, let's say, a door, even knowing its width.
I have to start another thread now, asking:
I know the physical (not only in pixels) size of the iPhone's screen (for the iPhone 5 it is 2.31 inches).
I also know the width of a door.
Now, if I am in a position where the width of the door fits perfectly within the width of the iPhone itself (not of the camera image), and the iPhone is held perfectly vertical, is it possible to know the distance between the iPhone and the door at that moment?
Thank you for your kind help!
I assume you mean that there is some outside image-capturing device (be it a human eye or another camera), and that the capturing device, the phone, and the door are all in a line such that the phone appears to be the same width as the door.
In this case, you still need a) the distance between the image capturing device and the phone and b) the optical information (such as focal length) of the image capturing device. Just sit down with a pen and paper and draw out the geometry for a little bit and you'll see that.
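To make the geometry concrete, here is the similar-triangles relation as a tiny sketch (all names are hypothetical, and eyeToPhone is exactly the missing quantity this answer points out):

```swift
/// Similar triangles: a phone of width `phoneWidth`, held `eyeToPhone`
/// away from the eye, exactly occludes a door of width `doorWidth`, so
///     phoneWidth / eyeToPhone == doorWidth / doorDistance
func doorDistance(eyeToPhone: Double, phoneWidth: Double, doorWidth: Double) -> Double {
    return eyeToPhone * doorWidth / phoneWidth
}

// e.g. a 2.31 in wide iPhone 5 held 12 in from the eye, occluding a 32 in door:
// doorDistance(eyeToPhone: 12, phoneWidth: 2.31, doorWidth: 32) ≈ 166 in (~14 ft)
```

Without the eye-to-phone distance, the ratio alone cannot pin down the door's distance.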
This is going to involve a trigonometric calculation. I think you may already have done some research on the gyroscope; if not, you should certainly look into it.
1) Find the angle your phone makes with the ground, e.g. when you point the device's camera at the bottom of the object.
2) Now you have one angle, and the vertical from your hand to the ground makes 90 degrees with it, so you are forming a right-angled triangle and have just found the angle at your hand.
3) You can approximate the height of the phone above the ground (from the surface to your hand). So you have one side of the triangle and one angle, and from those you can find the second side, i.e. the distance between you and the object, as sketched below.
Hope this helps. :)
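A rough sketch of that idea using Core Motion's attitude (hypothetical names; treating the attitude pitch as the line-of-sight angle is an approximation that depends on how the phone is held):

```swift
import CoreMotion
import Foundation

let motion = CMMotionManager()

/// Estimate the horizontal distance to an object's base, assuming the phone
/// is held `handHeight` metres above the ground and its camera is pointed
/// at the point where the object meets the ground.
func estimateDistance(handHeight: Double, completion: @escaping (Double) -> Void) {
    motion.deviceMotionUpdateInterval = 1.0 / 30.0
    motion.startDeviceMotionUpdates(to: .main) { data, _ in
        guard let attitude = data?.attitude else { return }
        // Treat pitch as the angle between the line of sight and the ground.
        let angle = abs(attitude.pitch)           // radians
        guard angle > 0.01 else { return }        // avoid dividing by ~0
        let distance = handHeight / tan(angle)    // tan(angle) = height / distance
        completion(distance)
        motion.stopDeviceMotionUpdates()
    }
}
```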

Is there any way to draw lines on screen by swiping and check whether they are crossing each other

We have a situation in which the user will be drawing many lines on the screen, and we need to find the lines that cross and get their intersection points.
Is there any way to achieve this on the iPhone using Cocoa Touch? Please point me in a direction; if possible, a code sample would be great, as I have been stuck on this for the last two days.
Please help.
As I see it, this is purely mathematics.
When the user touches the screen (and moves their finger around) you will get some coordinates. The current touch location (x, y) together with the previous one can be considered a line segment. Store each segment in an array.
Now, when a new touch takes place, check whether the new segment (current touch coordinates and the previous ones) crosses any other segment in your array.
I don't remember the formula for checking this intersection, but after a quick search I found this; maybe it will help.
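The standard parametric segment-intersection test looks roughly like this in Swift (a sketch; the original answer's link presumably pointed at an equivalent formula):

```swift
import CoreGraphics

struct Segment {
    let p1: CGPoint
    let p2: CGPoint
}

/// Returns where two segments cross, or nil if they don't.
func intersection(_ a: Segment, _ b: Segment) -> CGPoint? {
    let r = CGPoint(x: a.p2.x - a.p1.x, y: a.p2.y - a.p1.y)   // direction of a
    let s = CGPoint(x: b.p2.x - b.p1.x, y: b.p2.y - b.p1.y)   // direction of b
    let denom = r.x * s.y - r.y * s.x                          // 2D cross product
    guard denom != 0 else { return nil }                       // parallel/collinear
    let qp = CGPoint(x: b.p1.x - a.p1.x, y: b.p1.y - a.p1.y)
    let t = (qp.x * s.y - qp.y * s.x) / denom                  // position along a
    let u = (qp.x * r.y - qp.y * r.x) / denom                  // position along b
    guard (0...1).contains(t), (0...1).contains(u) else { return nil }
    return CGPoint(x: a.p1.x + t * r.x, y: a.p1.y + t * r.y)
}
```

In touchesMoved you would build a Segment from the previous and current touch locations, test it against every stored segment, and then append it to the array.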

Manipulating Gyroscope / Accelerometer Values obtained from iPhone 4

I'm developing a project for my university that manipulates the gyroscope/accelerometer values obtained from the iPhone 4, but I'm stuck on a mathematical issue and I hope you guys can help me out.
I'll give you an example of what's about:
Your iPhone is face up and you move it UP, on the Y axis.
Your iPhone is face right and you move it UP, on the X axis this time (since you rotated the iPhone 90 degrees).
The second time, the computer interprets that I've moved the iPhone to the RIGHT, but that's wrong. I moved it up, but my axes were rotated because the iPhone was facing right.
What do I need?
I need a way to VIRTUALLY position the iPhone back face up (where the three axes are correct) and give each axis its correct movement value.
If the iPhone is turned 90 degrees, then I can easily swap the X and Z axes and it's correct. But I want to make it work for any angle of rotation.
I would be really thankful if anyone could help me with some sort of pseudo-algorithm or mathematical description of what to do.
NOTE: I only need a way to compensate all three axes according to the iPhone's rotation.
Update:
I don't actually need precise values, since I'm making a graph comparison between all the records I get from the gyroscope. I'll make it clearer:
-> You draw a LETTER just by moving the iPhone in the air, and my application will recognize the letter you just drew. The method I use for recognition is based on the TFT algorithm, comparing against a database of sample values recorded from letters I've previously drawn.
My point is: it doesn't really matter what the values are or what they represent. All I need is for the graphs to be equal even if the iPhone is in a different position. It's quite hard to explain, but if you draw the letter 'P' with the iPhone facing UP, the resulting graph will be different from drawing the 'P' with the iPhone facing RIGHT.
So I need to compensate the axes back to their original orientation; that way I'll always get similar graphs.
This post was written before iOS 5 was released. FYI to anyone coming here: DeviceMotion relative to world - multiplyByInverseOfAttitude shows how to transform device-relative acceleration values to earth-relative values on iOS 5 and above.
So, what you want is to convert from the iPhone's coordinate system (or object/tangent space) to the world coordinate system (or vice versa, as you prefer; it doesn't matter). You know the iPhone's orientation because you have gyroscope data. So what you want is to create the object-to-world transformation matrix and multiply each of the acceleration vectors (from the accelerometer) by it.
Take a look here for a good explanation of tangent space and how to create a tangent-space-to-world transformation matrix. If you aren't familiar with 3D/linear math it might be a bit tricky, but it's worth the trouble.
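As a rough illustration of both answers, here is a Swift/Core Motion sketch (anachronistic for the original post, which predates iOS 5) that rotates device-frame user acceleration into the reference frame via the attitude rotation matrix. Depending on the row/column convention you may need the transpose, so verify against known orientations:

```swift
import CoreMotion

let motion = CMMotionManager()

/// Stream user acceleration rotated from the device frame into a fixed
/// reference frame, so the same gesture produces the same graph
/// regardless of how the phone is held.
func startWorldFrameUpdates(handler: @escaping (_ x: Double, _ y: Double, _ z: Double) -> Void) {
    motion.deviceMotionUpdateInterval = 1.0 / 60.0
    motion.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: .main) { data, _ in
        guard let d = data else { return }
        let a = d.userAcceleration            // device frame, gravity removed
        let m = d.attitude.rotationMatrix     // device -> reference rotation
        // Apply the rotation matrix to the acceleration vector.
        let x = m.m11 * a.x + m.m12 * a.y + m.m13 * a.z
        let y = m.m21 * a.x + m.m22 * a.y + m.m23 * a.z
        let z = m.m31 * a.x + m.m32 * a.y + m.m33 * a.z
        handler(x, y, z)
    }
}
```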

Is there a way to see which pixels are what coordinates with crosshairs on iPhone simulator?

Is there a tool that will do this? I want to be running the simulator and then be able to put the mouse over some point and have it tell me what the (x,y) coordinates are. Surely there's a simple tool that does this.
I just use the built-in screenshot snapper in OS X: press Command+Shift+4, and as you drag, it shows the dimensions of the area you'd capture. Press Escape to cancel. Works great.
In the Developer Tools -> Applications -> Graphics Tools there is a program called Pixie. It will do what you want. In Preferences you can set it up so that an option-drag will count pixels. You can also set it to just show the pixel coordinates and do the math yourself.
I've used the Iconfactory's xScope for this before. If you create rulers that are the size of the display in the Simulator, you can get a readout of the X and Y coordinates of the mouse pointer as you move across the Simulator screen. Getting the rulers precisely aligned with the edge of the Simulator screen can be a little tricky for applications with dark backgrounds, though.

How to recognize diagonal swipes on the screen?

I'm successfully using UISwipeGestureRecognizer to capture and act on gestures made on the screen with single or multiple touches in the up, down, left, or right directions using the standard tools.
However, what I really need to do now is have the device recognize diagonal swipes in the same way.
Does anyone have any ideas where to start?
Do I simply subclass UIGestureRecognizer myself and try to work out how to roll my own UIDiagonalSwipeRecognizer? Or is there a way of detecting if a swipe is, say, up && left?
Your help is appreciated...
I'd simply use the old-school touchesBegan and touchesEnded methods and check the x and y deltas there; if both are greater than some minimum constant, the swipe should be a diagonal one.
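A minimal sketch of that delta check (hypothetical class name and threshold; remember that in UIKit the y axis grows downward):

```swift
import UIKit

class DiagonalSwipeView: UIView {
    private var startPoint: CGPoint?
    private let minDelta: CGFloat = 50   // minimum travel per axis; tune to taste

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        startPoint = touches.first?.location(in: self)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let start = startPoint,
              let end = touches.first?.location(in: self) else { return }
        let dx = end.x - start.x
        let dy = end.y - start.y
        // A swipe counts as diagonal when it moved far enough on BOTH axes.
        if abs(dx) > minDelta && abs(dy) > minDelta {
            let horizontal = dx > 0 ? "right" : "left"
            let vertical = dy > 0 ? "down" : "up"   // y grows downward in UIKit
            print("Diagonal swipe: \(vertical)-\(horizontal)")
        }
        startPoint = nil
    }
}
```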
Correct me if I'm wrong, but UISwipeGestureRecognizer is only available from iOS 3.2 onward, and if so, that's a limitation.