I'm using the gyro (are the accelerometer and the gyro the same thing on the iPhone?) and the X and Y axes are working great.
The Z axis is called yaw, but I just can't understand it. I expected it to read 0 when the iPhone is on its side, and to go from -1 to 1 as the phone moves from face up to face down.
In reality, every move on the X/Y axes also changes Z...
The Z reading also seems to depend on the X axis somehow (are they coupled?), because Z never settles on a clean value.
I just can't understand the Z axis, and I'd like to know whether there is any other way to read the iPhone's current orientation besides this one.
The iPhone has an accelerometer and a gyroscope. Here is a very good video explaining both:
http://www.gametrailers.com/user-movie/wiimotion-plus-explained/244882
I may be wrong, but I believe you are using the accelerometer, not the gyro. There are going to be two values that keep changing as you rotate the iPhone. Imagine or draw your iPhone vertically as a rectangle, then draw a horizontal line and a vertical line across it. Those are your X and Y axes. If you rotate the iPhone 90 degrees with the screen facing you (from portrait to landscape), you will see the X and Y values change from 1 to 0 and from 0 to 1, because the lines you drew swap from horizontal to vertical and vice versa.
Imagine the iPhone facing you again. The Z axis is a line going through the iPhone, front to back. So if you lay the iPhone on a table, you will see in the app that X and Y read the same (I believe 0), because both lines are flat, while Z reads (I believe) 1, because the line going through the iPhone is now vertical.
Here is an image showing the lines:
(source: garagegames.com)
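The axis swap described above can be sketched numerically. Everything here (the function name, the starting vector, the rotation sign) is illustrative, not Apple's API; it only shows how gravity migrates between the Y and Z readings as you pitch the phone:

```python
import math

def gravity_after_pitch(pitch_deg):
    # Start flat on a table, screen up: all of gravity on the Z axis.
    gx, gy, gz = 0.0, 0.0, -1.0
    a = math.radians(pitch_deg)
    # Rotate the gravity vector about the device's X (left-right) axis.
    y = gy * math.cos(a) - gz * math.sin(a)
    z = gy * math.sin(a) + gz * math.cos(a)
    return gx, y, z

flat = gravity_after_pitch(0)      # gravity entirely on Z
upright = gravity_after_pitch(90)  # gravity entirely on Y
```

At 45 degrees both Y and Z carry about 0.707 g in magnitude, which is why two readings always change together whenever you rotate about a single axis.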
I did some research on OpenPose, and its output is x and y coordinates plus a confidence score per keypoint. The x and y coordinates are good for detecting up, down, left, and right movements. I was wondering whether it is possible to detect turning movements, which usually happen around the z axis. Is there a way to tell that a body part has rotated 180 degrees using only x and y coordinates?
I had a few ideas, like calculating the slope of the hand line. The slope tells us whether the hand is tilted: if the slope is very high or very low, then the hand has rotated. The same concept applies to the other body parts. But I don't think that will work in all cases. Please check the 2nd figure at https://cmu-perceptual-computing-lab.github.io/openpose/web/html/doc/md_doc_02_output.html to understand the output of OpenPose.
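One way around the infinite-slope problem in the idea above is to compute the segment's angle with atan2 from two keypoints and compare it across frames. This is a sketch with made-up keypoints; the (x, y, confidence) layout matches OpenPose's output, but the function and threshold are assumptions:

```python
import math

def limb_angle_deg(p1, p2, min_conf=0.3):
    """In-plane angle of the segment p1 -> p2 in degrees, or None if
    either keypoint is unreliable. Keypoints are (x, y, confidence)."""
    if p1[2] < min_conf or p2[2] < min_conf:
        return None
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

# A jump of ~180 degrees between frames suggests the limb flipped.
before = limb_angle_deg((100, 100, 0.9), (150, 100, 0.9))  # pointing right
after = limb_angle_deg((100, 100, 0.9), (50, 100, 0.9))    # pointing left
```

Unlike a slope, atan2 never blows up on vertical segments and distinguishes "pointing left" from "pointing right", which a slope alone cannot do.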
If a camera is looking at, say, 6 evenly spaced dots in the real world, it would look like the image below when the camera views them straight on, with no rotation about the x, y, or z axis.
The z axis is perpendicular to the image sensor, so rotation around the z axis is simple: it's just an in-plane tilt of the image. If I were to rotate the camera (or the objects being looked at) around the x axis (taking the x axis as up-down), the rows and columns would no longer be parallel and would project off toward a vanishing point, like this.
What I would like to do is take a 2-dimensional image of, say, dots and apply rotations around the x, y, and z axes independently. I've experimented with reading my image into Matlab and multiplying by a rotation matrix, or even a full camera matrix, but I can't figure out how to take a 2D image, simulate rotating it around the x axis, and save the result back to an image, so that my original grid of dots looks like the bottom image, with lines going off to a vanishing point. I've seen some examples using imwarp, but I didn't see how to set the angle of rotation. I'm working on camera calibration, so I really want to be able to specify an angle of rotation around each axis.
Thanks for any help.
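For what it's worth, the effect you describe can be simulated by rotating the dot plane in 3D and projecting through a pinhole model; the Matlab route would be to build a homography from the intrinsics and rotation and feed it to imwarp via a projective transform object. Here is a Python sketch with assumed intrinsics (focal length, principal point), not calibrated code:

```python
import math

F = 500.0          # assumed focal length, pixels
CX, CY = 320, 240  # assumed principal point

def rotate_x(p, angle_deg):
    """Rotate a 3D point about the x axis."""
    a = math.radians(angle_deg)
    x, y, z = p
    return (x, y * math.cos(a) - z * math.sin(a),
               y * math.sin(a) + z * math.cos(a))

def project(p, depth=5.0):
    """Pinhole projection after pushing the plane in front of the camera."""
    x, y, z = p
    return (F * x / (z + depth) + CX, F * y / (z + depth) + CY)

# A 2x3 grid of dots on the z = 0 plane, tilted 30 degrees about x.
grid = [(i, j, 0.0) for j in (-1, 1) for i in (-1, 0, 1)]
tilted = [project(rotate_x(p, 30)) for p in grid]
```

After the tilt, the two rows sit at different depths, so the nearer row projects wider than the farther one; that widening difference is exactly the vanishing-point convergence in your bottom image.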
I am working on a project where I have to determine the position and rotation of an IoT object using an accelerometer, gyroscope, and magnetometer. The goal is to show this object in a Unity3D virtual world.
To do this, I have an Arduino Nano 33 BLE, which includes an LSM9DS1 as its inertial module.
So I started by determining the rotation of my object. This link was very useful: http://gilles.thebault.free.fr/spip.php?article32
This is the line that computes the Y-axis angle:
angle = 0.98*(angle + float(gy)*0.01/131) + 0.02*atan2((double)ax, (double)az)*180/PI;
With this, I can get the X, Y, and Z angles. When I test one axis at a time, each rotation is fine. The problem appears when I use 2 or more axes at the same time. For example, using only the X and Y axes: when I turn only the X axis by 90°, the Y axis turns by 90° too. This video explains it more accurately than words: https://www.youtube.com/watch?v=IeuIuEcjUBc&feature=youtu.be
I searched a lot of things to fix it, but now I have no more ideas. Can anyone guide me?
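For reference, the complementary filter in that line can be written out in Python; the constants (0.98/0.02 blend, 131 LSB per °/s gyro scale, 10 ms loop period) are taken from the C line above:

```python
import math

def complementary_step(angle, gy_raw, ax, az, dt=0.01, alpha=0.98):
    gyro_rate = gy_raw / 131.0                      # raw LSB -> deg/s
    accel_angle = math.degrees(math.atan2(ax, az))  # tilt angle from gravity
    # Trust the integrated gyro short-term, the accelerometer long-term.
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Note that three such independent per-axis filters cannot represent combined rotations correctly: Euler angles computed axis-by-axis interact exactly the way the video shows. Fused orientation is usually tracked as a quaternion or rotation matrix instead, e.g. with a Madgwick or Mahony filter.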
I noticed a really puzzling behavior on the iPhone:
If I hold the phone vertically and tilt it, the compass reading changes.
I already figured out that the amount it changes is the same amount it would change for the same tilt with the phone held horizontally (i.e., calling the vector coming out of the screen Y, turning around Y produces a compass change regardless of the iPhone's attitude).
I want to compensate for that; my app was not made to be held horizontally (although I do plan to allow some tilting around what I'll call the X axis, from about 10 degrees to 135).
But I really could not figure out how the iPhone calculates the heading, and thus where the heading vector actually points...
After some scientific-style experiments, I found:
The iPhone's magnetometer has 3 axes: X goes from left to right across the screen; Y goes from bottom to top; and Z comes out of the back of the phone through the front.
Earth's magnetic field, as the laws of physics predict, is not horizontal everywhere; in my location (Brazil) it is inclined about 30 degrees (meaning I have to hold the phone at a 30-degree angle to zero out two axes).
One possible technique to calculate north is to take the cross product of a vector tangential to the magnetic field (i.e., the vector the magnetometer reports to you) and gravity. The result will be a vector that points east. If you wish, you can take another cross product between east and gravity, resulting in a vector that points north.
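A sketch of that cross-product construction, with made-up readings; here "gravity" means the accelerometer vector at rest (which points up, opposite the pull), and the sign conventions are assumptions that may need flipping for a given device frame:

```python
def cross(a, b):
    """Cross product of two (x, y, z) tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def east_and_north(mag, gravity):
    east = cross(mag, gravity)    # perpendicular to both field and up
    north = cross(gravity, east)  # horizontal component toward north
    return east, north

# Made-up readings: field tilted out of the horizontal plane, device flat.
east, north = east_and_north((0.0, 0.87, 0.5), (0.0, 0.0, 1.0))
```

Because both outputs are built from cross products with gravity, they are horizontal by construction, which is what makes this heading tilt-independent.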
Know that the iPhone's sensors are quite good and pick up every minor fluctuation and vibration, so it is a good idea to use a low-pass filter to remove the noise from the signal.
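The usual low-pass here is a one-line exponential filter applied per axis; the smoothing factor below is an assumption to tune per app:

```python
def lowpass(prev, sample, alpha=0.1):
    # Small alpha = heavy smoothing, slow response; large alpha = the opposite.
    return prev + alpha * (sample - prev)

smoothed = 0.0
for raw in (10.0, 10.0, 10.0):   # a steady reading arriving tick by tick
    smoothed = lowpass(smoothed, raw)
```

The filtered value chases the raw reading gradually, so momentary vibration spikes barely move it while sustained changes still come through.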
The iPhone itself has a complex routine to determine the "true heading". I haven't figured it out completely, but it uses the accelerometer in some way to compensate for tilt. You can use the accelerometer to compensate back if you wish; for example, if the phone is tilted 70 degrees, you can shift the true heading by 70 degrees too, and the result is that the phone ignores tilting.
The true-heading routine also checks whether the iPhone is upside down. If we take the phone held horizontally in front of you as 0 degrees, then at roughly 135 degrees it decides it is upside down and flips the results.
Note that the same coordinate system also applies to the accelerometer, allowing vector operations between accelerometer and magnetometer data without much fiddling.
How do I move an image when I move the iPhone?
This question really needs to be improved. Your best bet would be to look at the UIAccelerometer and UIImage documentation. If you provide more details about what you want to do, I can provide a more detailed response.
First of all, as zPesk said, read the docs. But, as a first approximation:
Start the accelerometer, setting your class as the sharedAccelerometer delegate. Then implement accelerometer:didAccelerate: in your class and check the X and Y axes (if you want to move the image in 2D).
If the X axis is negative, move your image to the left; if positive, to the right. If the Y axis is negative, move it toward the bottom; if positive, toward the top.
If you want to accelerate the image's movement based on the accelerometer measurement, multiply a pixel constant by the axis measurement and add the result to the X and Y of the image's frame. The more you tilt the device, the faster the movement.
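The arithmetic in that last step can be sketched outside of any iOS API; the speed constant and the Y sign flip (screen coordinates usually grow downward) are assumptions to tune:

```python
SPEED = 20.0  # pixels moved per 1 g of acceleration, hand-tuned

def move_image(frame_x, frame_y, accel_x, accel_y):
    """New top-left position of the image frame for one accelerometer sample."""
    return frame_x + accel_x * SPEED, frame_y - accel_y * SPEED

x, y = move_image(100.0, 100.0, 0.5, 0.5)  # a moderate tilt right and up
```

Scaling the step by the reading (rather than just its sign) is what gives the "tilt more, move faster" behavior described above.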