How do I calculate dip and strike on the iPhone? They are used to measure the orientation of rock beds.
I have to use the compass API; how do I get the angle from it?
If anyone has an idea, please share it.
Thank you.
The "compass API" is not so much a compass as a magnetometer. (In other words, if you put your phone near some strong local magnetic field - say, a CRT monitor - you won't only be measuring the Earth's magnetic field.)
This sample ought to help you on your way. It demonstrates the use of CLLocationManager.
To determine the orientation of the rock face you need to use the UIAccelerometer class. Have your view controller (or whichever object is convenient) adopt the UIAccelerometerDelegate protocol. The BubbleLevel sample application will show you how to use the API.
The magnetometer then allows you to turn that phone orientation into an orientation relative to magnetic North.
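To sketch the accelerometer half of that idea in plain Python (rather than Objective-C), assuming the iOS convention that a resting, face-up device reads roughly <0, 0, -1>, the device's tilt away from horizontal can be derived from the raw gravity vector:

```python
import math

def tilt_from_vertical_deg(ax, ay, az):
    """Angle (degrees) between the device's screen normal and 'straight up',
    given a raw accelerometer reading: 0 = lying flat face up, 90 = on edge.
    Assumes a resting, face-up device reads ~<0, 0, -1>."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    # -az / norm is the cosine of the angle between the screen normal and vertical
    cos_tilt = max(-1.0, min(1.0, -az / norm))
    return math.degrees(math.acos(cos_tilt))
```

Laid flat against a bedding plane, this tilt is essentially the dip angle; the magnetometer heading then supplies the direction.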
Rather than asking the same question, I'll offer a partial (not complete) answer.
1. Strike => (geology) the horizontal line where the bedding plane in question crosses a horizontal surface: this can be obtained by reading the heading from CLHeading.magneticHeading.
2. Dip angle => (geology) the angle between the horizontal surface and the bedding: this can be obtained from either the roll or the pitch of CMMotionManager's CMDeviceMotion.attitude, depending on how the iOS device is held against the bedding being measured.
EDIT:
3. Dip direction => (geology) the actual compass (N/S/E/W) orientation of the tilt: strike + 90°, because the two are perpendicular, measured clockwise (I think).
I may not have written my answer perfectly, but my geologist colleagues confirm the readings, which are basically 1, 2 and 3 combined (e.g. 43/11SE).
If someone can improve this answer, I'd appreciate it.
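Putting 1-3 together as a sketch (Python for the arithmetic; `heading_deg` stands in for a CLHeading.magneticHeading reading and `roll_rad` for CMDeviceMotion.attitude.roll, both hypothetical inputs here, and the phone is assumed to be held with its long edge along the strike line):

```python
import math

def strike_dip(heading_deg, roll_rad):
    """Combine a magnetic heading (degrees) and a device roll (radians)
    into (strike, dip, dip_direction), all in degrees."""
    strike = heading_deg % 360.0
    dip = abs(math.degrees(roll_rad))
    dip_direction = (strike + 90.0) % 360.0  # perpendicular to strike, clockwise
    return strike, dip, dip_direction
```

With a 43° heading and an 11° roll this yields the 43/11 part of a reading like 43/11SE.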
I want to count the number of shakes, horizontal and vertical. I have referred to UIAcceleration,
and I have also referred to Motion Events,
but couldn't come up with a good approach.
Any kind of help is highly appreciated: code, references, or anything else.
I just want to count the number of shakes the user makes with the iPhone; a shake can be vertical or horizontal, holding the iPhone the normal way (home button at the bottom).
Try DiceShaker. You'll need the code from "Isolating Instantaneous Motion from Acceleration Data", Listing 4-6 of the Motion Events documentation (also called a high-pass filter computation), to detect the acceleration supplied by the user.
EDIT: The accelerometer constantly includes the gravity component in its readings, because it works with a set of springs that measure the force component (along each spring's length) from the increase or decrease in the spring's length. So just remove the constant gravity component (the force that's ALWAYS acting) to isolate the change supplied by the user, hence the name high-pass. Luckily, we don't need to work out how: Apple has done the hard part and given the equations in the documentation.
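A minimal sketch of that computation in Python (the 0.1 smoothing factor is an assumption to tune for your sample rate; Apple's listings use a similar constant): a low-pass filter tracks the slowly-changing gravity component, and subtracting it from each raw sample leaves the user-generated acceleration.

```python
FILTER_FACTOR = 0.1  # assumed smoothing constant; tune for your sample rate

def make_highpass():
    """Returns a stateful filter: feed it raw (x, y, z) accelerometer
    samples, get back the sample with the gravity estimate removed."""
    gravity = [0.0, 0.0, 0.0]

    def step(sample):
        for i in range(3):
            # low-pass estimate of the constant gravity component
            gravity[i] = sample[i] * FILTER_FACTOR + gravity[i] * (1.0 - FILTER_FACTOR)
        # high-pass result: raw sample minus the gravity estimate
        return tuple(sample[i] - gravity[i] for i in range(3))

    return step
```

For a device at rest the filtered output settles near zero after a moment, while a shake shows up as a large transient; counting shakes then reduces to counting threshold crossings on this output.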
I'm developing a project for my university that manipulates gyroscope/accelerometer values obtained from the iPhone 4, but I'm stuck on a mathematical issue and I hope you can help me out.
Here's an example of what it's about:
Your iPhone is face up and you move it UP, along the Y axis.
Your iPhone is face right and you move it UP, along the X axis this time (since you rotated the iPhone 90 degrees).
The second time, the computer interprets that I've moved the iPhone to the RIGHT, but that's wrong: I moved it up, but my axes were rotated because the iPhone was facing right.
What do I need?
I need a way to VIRTUALLY rotate the iPhone back to face up (where the three axes are correct) and give each axis its correct movement value.
If the iPhone is turned 90 degrees, I can simply swap the X and Z axes and it's correct, but I want this to work for any angle of rotation.
I would be really thankful for any sort of pseudo-algorithm or mathematical description of what to do.
NOTE: I only need a way to compensate all three axes according to the iPhone's rotation.
Update:
I don't actually need precise values, since I'm making a graph comparison between all the records I get from the gyroscope. To make it clearer:
-> You draw a LETTER just by moving the iPhone in the air, and my application recognizes the letter you drew. The recognition method is based on an FFT algorithm and a database of sample values recorded from letters I previously drew.
My point is: it doesn't really matter what the values are or what they represent. All I need is for the graphs to be equal even if the iPhone is in a different position. It's quite hard to explain, but if you draw the letter 'P' with the iPhone facing UP, the resulting graph will be different than if you draw the 'P' with the iPhone facing RIGHT.
So I need to compensate the axes back to their original orientation; that way I'll always get similar graphs.
This post predates iOS 5. FYI for anyone coming here: "DeviceMotion relative to world - multiplyByInverseOfAttitude" shows how to transform device-relative acceleration values to earth-relative ones on iOS 5 and above.
So, what you want is to convert from the iPhone's coordinate system (or object/tangent space) to the world coordinate system (or vice versa, depending on how you look at it). You know the iPhone's coordinate system because you have gyroscope data. So what you want is to build the object->world transformation matrix and multiply each of the velocity vectors (from the accelerometer) by it.
Take a look here for a good explanation of tangent space and how to create tangent space -> world transformation matrix. If you aren't familiar with 3D/linear math it might be a bit tricky, but worth the trouble.
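A sketch of that transform in Python (on iOS 5+ CMAttitude exposes a rotationMatrix directly; here one is built by hand from roll/pitch/yaw with an assumed Rz·Rx·Ry Euler order, so treat the convention as illustrative rather than exactly what Core Motion uses):

```python
import math

def rx(a):  # rotation about the x axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(a):  # rotation about the y axis
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(a):  # rotation about the z axis
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def device_to_world(roll, pitch, yaw):
    """Device->world rotation; assumed Euler order Rz(yaw) * Rx(pitch) * Ry(roll)."""
    return matmul(rz(yaw), matmul(rx(pitch), ry(roll)))
```

For example, with the device yawed 90°, an acceleration along the device's x axis comes out along the world's y axis, which is exactly the compensation the question asks for.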
So I'm making an Augmented Reality app, and I'm a little unsure how to tell, given your current location, your heading information, and a second location, whether you're actually facing that second location. I think it has to do with a specific part of CLHeading, but I'm not sure. Any help would be awesome; thanks a lot, everyone.
I did this once, and if I recall correctly, all you need are the coordinates of the two locations and the heading. If I understand correctly, you have to check whether your current position is north or south, and east or west, of your destination.
This can be done easily by comparing the latitude/longitude values. Adding the heading, you can calculate a single 'line' that represents the direction you are looking.
Anyway, I did this in OpenGL, drawing a scene on the iPhone. All I needed to do then was take the calculations above and draw the scenery.
I can't tell you the code from memory, but I can tell you that drawing it on a piece of paper with some examples did help :) - and you already seem to have everything you need!
I had a quick look at CLLocation APIs, and it doesn't look like they do relative heading. Sounds like you'll want to implement something like a "Great Circle Bearing" algorithm, to get the bearing from your position to the object relative to North, and then use the compass reading to determine when your camera is aligned in that direction. This page is a goldmine of geodetic algorithms.
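For reference, the standard initial-bearing ("forward azimuth") formula looks like this, sketched in Python with inputs and output in degrees, bearing measured clockwise from North:

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from North. Inputs in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0
```

You would then compare this bearing against the compass reading (trueHeading, or magneticHeading corrected for declination) and treat the camera as "facing" the target when the difference falls within some tolerance.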
Is there any way to detect iPhone movement? I need to know whether the iPhone is still (i.e. not moving) and do one thing, or moving (in a vertical or horizontal direction) and do some other logic.
Do I need to handle the accelerometer's didAccelerate method, or is there some other way, like reading the current location (lat/long) values?
[I DON'T have a 3GS]
Please suggest the best and most robust way to read iPhone movement and to know whether the iPhone is still.
Thanks
Yes, you will need to use the UIAccelerometer class. Set the delegate and do some calculations in the accelerometer:didAccelerate: delegate message.
The trick is that you will want to set up some threshold values: if there is no significant movement within your range on the x, y, and z axes, the device is not moving.
Of course, slight vibrations will cause value changes, which is why you will need to define what "not moving" means for your purposes.
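A sketch of that thresholding in Python; the window of samples and the 0.02 g threshold are assumptions you would tune for your hardware:

```python
def is_still(samples, threshold=0.02):
    """True if the device looks stationary: over a recent window of
    (x, y, z) accelerometer samples, each axis stays within `threshold` g
    of its own minimum. Window length and threshold are values to tune."""
    for axis in range(3):
        values = [s[axis] for s in samples]
        if max(values) - min(values) > threshold:
            return False
    return True
```

Keeping the last second or so of samples in a ring buffer and calling this on each update gives a simple still/moving flag without needing GPS.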
I think there are some issues with the Earth's gravity component, so I wonder whether there are any examples where the accelerometer readings are corrected by subtracting it.
(It's too bad you can't file a bug with planet Earth to fix its gravity issue!)
There are several examples listed under "Related sample code" in the UIAcceleration Class Reference.
As that doc states, a value of 1.0 represents roughly +1g. So if you point the iPhone straight up in portrait orientation, you should see roughly <0, -1, 0>. As you rotate the phone around, the magnitude of the acceleration vector sqrt(x*x+y*y+z*z) should stay around 1.
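That invariant is easy to check numerically; a quick Python sketch of the magnitude computation:

```python
import math

def accel_magnitude(x, y, z):
    """Length of the acceleration vector: roughly 1.0 g for a device
    held at rest, regardless of its orientation."""
    return math.sqrt(x * x + y * y + z * z)
```

Both accel_magnitude(0, -1, 0) (portrait, pointing straight up) and accel_magnitude(0, 0, -1) (flat, face up) come out to 1.0; a value far from 1 means the user is accelerating the device.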
Apple's AccelerometerGraph demo includes sample code that implements a high-pass filter, effectively filtering out the effect of gravity. It's pretty simple to just grab it and modify it to suit your needs.