Is there any way to detect iPhone movement? I need to know whether the iPhone is still (i.e. not moving) and do one thing, or moving (in a vertical or horizontal direction) and run some other logic.
Do I need to read the accelerometer's didAccelerate method, or is there some other way, like reading the current location (lat/long) values?
[I DON'T have 3GS]
Please suggest the best and most robust way to read iPhone movement, and how to know whether the iPhone is still.
Thanks
Yes, you will need to use the UIAccelerometer class. Set the delegate and do some calculations in the accelerometer:didAccelerate: delegate method.
The trick is that you will want to set up some threshold values. If there is no significant movement within your range on the x, y, and z axes, the device is not moving.
Of course, slight vibrations will cause value changes, which is why you will need to determine what "not moving" means for your purposes.
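A minimal sketch of that idea, using the era's UIAccelerometer API; the 0.05 g threshold and the 10 Hz update interval are assumptions you will want to tune:

```objc
#import <UIKit/UIKit.h>

@interface MotionDetector : NSObject <UIAccelerometerDelegate>
@end

@implementation MotionDetector {
    UIAccelerationValue _lastX, _lastY, _lastZ;
}

- (id)init {
    if ((self = [super init])) {
        UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
        accelerometer.updateInterval = 0.1; // 10 Hz is plenty for a still/moving check
        accelerometer.delegate = self;
    }
    return self;
}

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // If no axis changed by more than the threshold since the last
    // reading, treat the device as still.
    const UIAccelerationValue kThreshold = 0.05; // in g; tune empirically
    BOOL still = fabs(acceleration.x - _lastX) < kThreshold &&
                 fabs(acceleration.y - _lastY) < kThreshold &&
                 fabs(acceleration.z - _lastZ) < kThreshold;
    _lastX = acceleration.x;
    _lastY = acceleration.y;
    _lastZ = acceleration.z;

    if (still) {
        // device is (approximately) not moving: run your "still" logic here
    } else {
        // device is moving: run your "moving" logic here
    }
}
@end
```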
I want to count the number of shakes, horizontal and vertical. I have referred to UIAcceleration.
I have also referred to Motion Events.
But I couldn't come up with a good approach.
Any kind of help is highly appreciated: code, a reference, or anything else.
I just want to count the number of shakes the user makes with the iPhone. The shake can be vertical or horizontal, with the iPhone held the normal way (home button at the bottom).
Try DiceShaker. You'll need the code for "Isolating Instantaneous Motion from Acceleration Data" given in Listing 4-6 of the Motion Events documentation (a high-pass filter computation) to detect the acceleration contributed by the user.
EDIT: The accelerometer's readings constantly include the gravity component, because the accelerometer works with a set of springs that measure the force component (along each spring's length) by the increase/decrease in the spring's length. So just remove the constant gravity component (the force that's ALWAYS acting) to isolate the changes contributed by the user (hence the name high-pass). Luckily, we don't need to figure out how, because Apple has done the hard work and given the equations in their documentation!
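For illustration, here is a sketch that pairs that filter (in the form Apple's documentation gives it) with a debounced threshold check to count shakes; the ShakeCounter class, kFilteringFactor, and kShakeThreshold are names and values I've assumed:

```objc
#import <UIKit/UIKit.h>

#define kFilteringFactor 0.1
#define kShakeThreshold  1.5 // user acceleration, in g; tune to taste

@interface ShakeCounter : NSObject <UIAccelerometerDelegate>
@end

@implementation ShakeCounter {
    UIAccelerationValue _gravityX, _gravityY, _gravityZ;
    BOOL _wasShaking;
    NSUInteger _shakeCount;
}

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // Low-pass filter: slowly track the constant gravity component.
    _gravityX = (acceleration.x * kFilteringFactor) + (_gravityX * (1.0 - kFilteringFactor));
    _gravityY = (acceleration.y * kFilteringFactor) + (_gravityY * (1.0 - kFilteringFactor));
    _gravityZ = (acceleration.z * kFilteringFactor) + (_gravityZ * (1.0 - kFilteringFactor));

    // High-pass: what remains is the motion the user contributed.
    UIAccelerationValue userX = acceleration.x - _gravityX;
    UIAccelerationValue userY = acceleration.y - _gravityY;
    UIAccelerationValue userZ = acceleration.z - _gravityZ;

    // Count one shake per threshold crossing (debounced so a single
    // shake isn't counted many times), horizontal or vertical.
    BOOL shaking = fabs(userX) > kShakeThreshold ||
                   fabs(userY) > kShakeThreshold ||
                   fabs(userZ) > kShakeThreshold;
    if (shaking && !_wasShaking) {
        _shakeCount++;
    }
    _wasShaking = shaking;
}
@end
```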
I'm developing a project for my university to manipulate the gyroscope/accelerometer values obtained from the iPhone 4, but I'm stuck on a mathematical issue and I hope you guys can help me out.
I'll give you an example of what it's about:
Your iPhone is face up and you move it UP, along the Y axis.
Your iPhone is face right and you move it UP, along the X axis this time (since you rotated the iPhone 90 degrees).
The second time, the computer interprets that I've moved the iPhone to the RIGHT, but that's wrong. I moved it up, but my axes were rotated since the iPhone was face right.
What do I need?
I need a way to VIRTUALLY position the iPhone back face up (where the 3 axes are correct) and give each axis its correct movement value.
If the iPhone is turned 90 degrees, I can easily swap the X-Z axes and it's correct. But I want to make it work for any angle of rotation.
I would be really thankful if anyone can help me with some sort of pseudo-algorithm or mathematical description of what to do.
NOTE: I only need a way to compensate all three axes according to the iPhone's rotation.
Update:
I don't actually need the precise values, since I'm making a graph comparison between all the records I get from the gyroscope. I'll make it clearer.
-> You draw a LETTER just by moving the iPhone in the air, and my application will recognize the letter you just drew. The method I use for recognition is based on the TFT algorithm, comparing against a database of sample values recorded from letters I've previously drawn.
My point is: it doesn't really matter what values I get, or what they represent. All I need is for all the graphs to be equal even if the iPhone is in a different position. It's quite hard to explain, but if you draw the letter 'P' with the iPhone facing UP, the resulting graph will be different than if you draw the 'P' with the iPhone facing RIGHT.
So I need to compensate the axes back to their original orientation; that way I'll always get similar graphs.
This post was before iOS5 was released. FYI to anyone coming here, DeviceMotion relative to world - multiplyByInverseOfAttitude shows how to transform device-relative acceleration values to earth-relative for iOS5 and above.
So, what you want is to convert from the iPhone's coordinate system (or object/tangent space) to the world coordinate system (or vice versa, however you see it; it doesn't matter). You know the iPhone's coordinate system because you have gyroscope data. So what you want is to create the object->world transformation matrix and multiply each of the acceleration vectors (from the accelerometer) by it.
Take a look here for a good explanation of tangent space and how to create tangent space -> world transformation matrix. If you aren't familiar with 3D/linear math it might be a bit tricky, but worth the trouble.
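On iOS 5+, here is a sketch of that object->world multiply using CMDeviceMotion (in the spirit of the answer linked above); the motionManager property is an assumption, and depending on the frame convention you may need the transpose of the rotation matrix, so verify with a quick test:

```objc
#import <CoreMotion/CoreMotion.h>

- (void)startWorldSpaceUpdates {
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
        withHandler:^(CMDeviceMotion *motion, NSError *error) {
            CMRotationMatrix r = motion.attitude.rotationMatrix;
            CMAcceleration a = motion.userAcceleration; // gravity already removed

            // Standard 3x3 matrix * vector multiply: rotate the device-space
            // acceleration into the reference (world) frame.
            double worldX = r.m11 * a.x + r.m12 * a.y + r.m13 * a.z;
            double worldY = r.m21 * a.x + r.m22 * a.y + r.m23 * a.z;
            double worldZ = r.m31 * a.x + r.m32 * a.y + r.m33 * a.z;

            // worldX/Y/Z are now independent of how the device is held.
            NSLog(@"world acceleration: %f %f %f", worldX, worldY, worldZ);
        }];
}
```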
I want to create a 360-degree turntable showing lots of pictures (12, 24 or 36), controlling the rotation with touch events (like that example, but coded natively for an iOS app).
The simplest idea is to load the specific UIImage depending on the touch position.
Any ideas what the best practice for that is? Is there a chance to build that image turntable faster with the help of Core Animation? Any other hints on that? Any other known projects where I can get some help on that?
Thanks for your time and hints in the right direction.
Here's another example, from the iPad app for the Audi A8.
From the first example it becomes obvious that the objects have actually been photographed at each angle of rotation. This is the really tricky part. You will need a tripod and a camera with a remote control, and if possible also a rotating platter to keep the angles consistent.
Implementation is relatively straightforward. As you guessed, you just track the touch positions and, depending on delta to the last touch position, show the appropriate image.
Well, you could just use the HTML/CSS/JS from the same example: load it in a UIWebView in your app, with your site embedded as a resource.
Subclass UIImageView, load an array of your frames, handle touch movement along the screen's y-axis, and change the active image accordingly. Don't forget to loop your images. :)
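A minimal sketch of that subclass; the answer tracks the y-axis, while this sketch uses horizontal drags (swap the axis as needed), and the 10-points-per-frame step is an assumption to tune:

```objc
#import <UIKit/UIKit.h>

@interface TurntableView : UIImageView
- (id)initWithImages:(NSArray *)images;
@end

@implementation TurntableView {
    NSArray *_images;   // one UIImage per angle step
    NSInteger _index;
    CGFloat _lastX;
}

- (id)initWithImages:(NSArray *)images {
    if ((self = [super initWithImage:[images objectAtIndex:0]])) {
        _images = images;
        self.userInteractionEnabled = YES; // UIImageView disables touches by default
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _lastX = [[touches anyObject] locationInView:self].x;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGFloat x = [[touches anyObject] locationInView:self].x;
    NSInteger steps = (NSInteger)((x - _lastX) / 10.0); // 10 points per frame
    if (steps != 0) {
        NSInteger count = (NSInteger)[_images count];
        _index = ((_index + steps) % count + count) % count; // wrap in both directions
        self.image = [_images objectAtIndex:_index];
        _lastX = x;
    }
}
@end
```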
Is there a way to capture the amount of screen area that is making contact with the user's finger? I assume there is, since this finger-painting app shows the iPad responding only to the pixels the user touches.
Thanks so much in advance for your help!
The size of the touch is abstracted away by the framework, and UITouches only contain calculated (“best estimated”) points instead of the raw, actual areas that were touched. I would guess that the “pressure” was calculated from the duration and the direction of the touch.
In a nutshell, there is no public API to get the contact area.
I don't think Apple provides APIs for the size of the touch or, as @nickthedude said (I think), any kind of way to measure pressure. Basically, you need to implement your own algorithm/policy for determining line thickness/opacity/other effects. I believe a common way to do this is to measure the time spent on the stroke and work from there: for instance, if the user moved more quickly, you might want a thinner line segment. Apple really should just provide a canvas view of some kind. Best of luck!
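A sketch of that speed-to-thickness heuristic inside a UIView subclass; the constants and the drawLineFrom:to:width: helper are hypothetical, so plug in your own drawing code:

```objc
// Assumed ivars on the view: CGPoint _lastPoint; NSTimeInterval _lastTimestamp;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    _lastPoint = [touch locationInView:self];
    _lastTimestamp = touch.timestamp;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    NSTimeInterval now = touch.timestamp;

    CGFloat dx = point.x - _lastPoint.x;
    CGFloat dy = point.y - _lastPoint.y;
    CGFloat distance = sqrt(dx * dx + dy * dy);
    NSTimeInterval dt = now - _lastTimestamp;

    // Faster strokes get thinner lines; clamp so the width stays sensible.
    CGFloat speed = (dt > 0) ? distance / dt : 0; // points per second
    CGFloat width = MAX(1.0, 12.0 - speed / 100.0);

    [self drawLineFrom:_lastPoint to:point width:width]; // your drawing code here

    _lastPoint = point;
    _lastTimestamp = now;
}
```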
To get the exact area you may have to roll your own, but you can get UIEvents pretty easily and then do some magic from there. Basically, implement/override touchesBegan, touchesMoved, and touchesEnded on the UIView in question and put your custom code there.
Looking at the video, maybe the number of touches in the UIEvent set corresponds to the "pressure" of the touch; then again, maybe not.
What if you laid down a series of successively smaller square UIViews wherever the user touched? Then, if the touches "spilled" into the larger UIViews behind the smaller front ones, you could conjecture that the touch pressure was harder. Something to try, I guess. Good luck.
Why not just describe what you want to do and focus on asking about that instead? It may not have anything at all to do with the example that has you so enthralled. I could use a camera to monitor your hand from across the table and paint pixels on the screen via BT, completely ignoring any contact between your fingers and the screen.
How do I calculate dip and strike on the iPhone? They are used to measure rock formations.
Do I have to use the compass API, and how do I get the angle from it?
If anyone has an idea, please give me some pointers.
Thank you.
The "compass API" is not so much a compass as a magnetometer. (In other words, if you put your phone near some strong local magnetic field - say, a CRT monitor - you won't only be measuring the Earth's magnetic field.)
This sample ought to help you on your way. It demonstrates the use of the CLLocationManager.
To determine the orientation of the rockface you need to use the UIAccelerometer class. Have your ViewController or whatever implement the UIAccelerometerDelegate protocol. The BubbleLevel sample application will show you how to use the API.
The magnetometer then allows you to turn that phone orientation into an orientation relative to magnetic North.
Instead of asking the same question, I'll offer a partial answer.
1. Strike => (Geology) the horizontal line crossing the bedding in question: this can be read from the heading, CLHeading.magneticHeading
2. Dip.angle => (Geology) the angle between the horizontal surface and the bedding: this can be taken from either CMMotionManager.CMDeviceMotion.CMAttitude.roll or pitch, depending on how the iOS device is held against the bedding being measured.
EDIT:
3. Dip.direction => (Geology) the actual NSWE orientation of the tilt: Strike + 90, because they are perpendicular, clockwise (I think).
I may not have written my answer correctly, but my geologist colleagues confirm my readings, which are basically 1, 2, and 3 (e.g. 43/11SE).
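Here's a rough sketch of readings 1-3, assuming the device is laid flat on the bedding plane; the pitch-vs-roll choice and the sign conventions are assumptions to verify in the field:

```objc
#import <CoreLocation/CoreLocation.h>
#import <CoreMotion/CoreMotion.h>

@interface StrikeDipReader : NSObject <CLLocationManagerDelegate>
@property (nonatomic, strong) CLLocationManager *locationManager;
@property (nonatomic, strong) CMMotionManager *motionManager;
@property (nonatomic) double strike;       // 1. degrees clockwise from magnetic North
@property (nonatomic) double dipAngle;     // 2. degrees from the horizontal
@property (nonatomic) double dipDirection; // 3. strike + 90, wrapped to 0..360
@end

@implementation StrikeDipReader

- (void)start {
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;
    [self.locationManager startUpdatingHeading];

    self.motionManager = [[CMMotionManager alloc] init];
    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
        withHandler:^(CMDeviceMotion *motion, NSError *error) {
            // Pitch or roll (radians -> degrees), depending on how the
            // device is held against the bedding.
            self.dipAngle = fabs(motion.attitude.pitch) * 180.0 / M_PI;
        }];
}

- (void)locationManager:(CLLocationManager *)manager
       didUpdateHeading:(CLHeading *)newHeading {
    self.strike = newHeading.magneticHeading;
    self.dipDirection = fmod(self.strike + 90.0, 360.0); // perpendicular to strike
}

@end
```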
If someone can improve this answer, I appreciate it.