Unity3D relative rotation of objects on an axis

In my project I am using IMU sensors to track the real player's hand and apply the resulting transformation to a 3D hand inside Unity.
Because the IMU reports its Y-axis orientation relative to the Earth's magnetic north, the in-game hand initializes facing a specific direction.
What I want is to calculate the offset between the IMU's initial Y value and the 3D hand's original Y rotation, so that I can subtract that value from the 3D model's Y rotation (so it will look as if the player's initial Y rotation is the same as the 3D hand's). The code would be: transform.Rotate(Vector3.up, offset, Space.World);
The IMU sends Euler angles (and does so reliably; I have not run into gimbal lock).
As I understand it, I need to find the angle difference between the 3D hand's initial rotation and the IMU's initial rotation on the XZ plane (i.e. around the Y axis).
How do I calculate the offset?

You can use Quaternion.FromToRotation to calculate the offset, something like:
var offset = Quaternion.FromToRotation(Vector3.up, imuUp);
transform.rotation *= offset;
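If what you need is specifically the yaw-only offset described in the question, a rough sketch of that idea looks like the following (yawOffset, Calibrate, ApplyImuRotation, imuInitialYaw and imuEuler are placeholder names, and Calibrate is assumed to run once, before any IMU data has been applied to the model):
float yawOffset;

void Calibrate(float imuInitialYaw)
{
    // Mathf.DeltaAngle returns the shortest signed difference, in the (-180, 180] range.
    yawOffset = Mathf.DeltaAngle(imuInitialYaw, transform.eulerAngles.y);
}

void ApplyImuRotation(Vector3 imuEuler)
{
    // Apply the IMU reading, then correct the heading by the constant offset around world up.
    transform.rotation = Quaternion.Euler(imuEuler);
    transform.Rotate(Vector3.up, yawOffset, Space.World);
}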

Related

How to convert left handed coordinate system to right handed? [duplicate]

I want to convert from Unity's left-handed coordinate system to ROS's right-handed coordinate system. You can see the frames here. How can I convert roll, pitch, and yaw in Unity to ROS? Additionally, for the Unity camera position, how should I convert (x, y, z) positions in Unity to ROS?
To convert roll, pitch, and yaw from Unity's left-handed frame (x right, y up, z forward) to ROS's right-handed frame (x forward, y left, z up), you will need to perform the following steps:
Relabel the axes: Unity's rotation around the z-axis corresponds to ROS roll (around the x-axis), Unity's rotation around the x-axis corresponds to ROS pitch (around the y-axis), and Unity's rotation around the y-axis corresponds to ROS yaw (around the z-axis).
Flip the signs required by the handedness change: roll and yaw are negated, while pitch keeps its sign.
Convert the range and units: Unity's eulerAngles are degrees in [0, 360), while ROS conventionally works in radians wrapped to (-π, π]. Wrap the angles to (-180, 180] and convert them to radians so that they are consistent with the conventions used by ROS.
Here is some example code in C# that demonstrates the idea:
// Convert roll, pitch, yaw from Unity (left-handed) to ROS (right-handed).
// unityEuler is the model's transform.eulerAngles, i.e. (x, y, z) in degrees.
float roll  = -WrapAngle(unityEuler.z); // Unity z rotation -> ROS roll  (around x), sign flipped
float pitch =  WrapAngle(unityEuler.x); // Unity x rotation -> ROS pitch (around y), same sign
float yaw   = -WrapAngle(unityEuler.y); // Unity y rotation -> ROS yaw   (around z), sign flipped
// Convert degrees to radians if the ROS side expects radians (REP 103 does)
roll  *= Mathf.Deg2Rad;
pitch *= Mathf.Deg2Rad;
yaw   *= Mathf.Deg2Rad;
// Roll, pitch, and yaw now follow the conventions used by ROS
// Function to wrap an angle into the range (-180, 180]
float WrapAngle(float angle)
{
    while (angle <= -180f) angle += 360f;
    while (angle > 180f) angle -= 360f;
    return angle;
}
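The question also asks about positions. A commonly used mapping between the standard frames (Unity: x right, y up, z forward; ROS: x forward, y left, z up) is sketched below, together with the equivalent quaternion form; treat the exact signs as an assumption to verify against your own setup.
// Unity position (x right, y up, z forward) to ROS position (x forward, y left, z up)
Vector3 UnityToRosPosition(Vector3 u)
{
    return new Vector3(u.z, -u.x, u.y);
}

// The same physical rotation expressed as a ROS-order quaternion (x, y, z, w)
Quaternion UnityToRosQuaternion(Quaternion u)
{
    return new Quaternion(-u.z, u.x, -u.y, u.w);
}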

Why does CMDeviceMotion attitude quaternion not always rotate to earth frame?

I wrote an app that writes gravity, userAcceleration, and attitude quaternion to CSV while driving in a vehicle. The intent is to capture the dynamics of the vehicle (e.g. braking, accelerating, cornering) in the earth frame using an iPhone.
Then I sum gravity and userAcceleration and rotate the resulting raw acceleration vector by the quaternion provided by CMAttitude to get the acceleration in the earth frame. In about 60% of recording sessions the average z value is not +9.81 m/s^2 and jumps to varying magnitudes other than +9.81 m/s^2. For example (each tick mark on the y-axis represents 5 m/s^2):
But I expect a plot with a consistent average value for acceleration on the z axis, like the following:
When I start device motion updates, I use the xMagneticNorthZVertical attitude reference frame, like so:
motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: OperationQueue(), withHandler: didUpdateDeviceMotion)
The following computes raw acceleration in the global frame using the attitude quaternion:
let accel = CMAcceleration(x: motion.userAcceleration.x + motion.gravity.x, y: motion.userAcceleration.y + motion.gravity.y, z: motion.userAcceleration.z + motion.gravity.z)
let a = SIMD3<Double>(accel.x, accel.y, accel.z)
// CMQuaternion has to be repacked as simd_quatd before it can rotate a vector
let q = simd_quatd(ix: motion.attitude.quaternion.x, iy: motion.attitude.quaternion.y, iz: motion.attitude.quaternion.z, r: motion.attitude.quaternion.w)
let a_vehicle = simd_act(q, a) // raw acceleration expressed in the reference (earth) frame
I have also written the equivalent in MATLAB, with the same problem.
xMagneticNorthZVertical should give an attitude whose reference frame has Z vertical (aligned with gravity); the direction of X or Y does not matter to me.
I do not have any magnets in the vicinity that could skew the computed attitude.
In contrast, Android's rotationVector consistently rotates the accelerometer readings to the earth frame, and surely the iPhone's sensors are of no lower quality than Android's.
What might be the cause of the attitude quaternion to not always rotate the device frame to the earth frame such that Z is in the direction of gravity?

Find absolute world axes rotation angles (x, y, z) from "transform.rotation"

We rotated a transform by setting Euler angles at runtime.
obj.transform.eulerAngles = new Vector3(0,270,90);
We are trying to export the obj.transform.rotation quaternion as absolute rotation angles in degrees about the world X, Y and Z axes, because we need to send this information to a non-Unity system where rotations are applied in world axes only.
How can we calculate absolute rotation in world axes?
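One possible starting point (a sketch, not a complete recipe): transform.rotation.eulerAngles already reports the world-space rotation as degrees about the world axes, but Unity composes those three angles in Z, then X, then Y order, so the receiving system has to apply them in the same order (or the angles have to be converted) for the result to match.
// World-space Euler angles in degrees, each in the range [0, 360)
Vector3 worldEuler = obj.transform.rotation.eulerAngles;
// Unity composes these as: rotate around world Z, then world X, then world Y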

Find angle between 2 points ignoring origin.forward in Unity

Background: I am creating an AR treasure hunt app. It is simple: it has a locator that tells you where the treasure is relative to you. The camera is the origin and the treasure is an object in the AR world.
Question: I would like to rotate my arrow according to where in space the treasure is, but in 2D. Basically, I want to ignore the component along camera.forward.
Example: If the camera rotation is default, the angle can be calculated as atan2(dy,dx). If the camera is looking straight down, the angle is atan2(dz,dx).
What I have tried:
Quaternion lookRot = Quaternion.LookRotation(target.transform.position - origin.transform.position);
Quaternion relativeRot = Quaternion.Inverse(origin.transform.rotation) * lookRot;
The relative rotation is correct in 3D space, but I would like to convert it into 2D, ignoring the camera.forward component. So whether the treasure is in front of or behind the camera should not change the angle.
Okay, so I'm hoping this makes sense. You're going to need some sort of if statement to determine whether your character is looking along the x, y or z axis. To be looking along the "x" axis, for example, the y rotation would have to be between 45° and -45° or between 135° and -135°, AND the z rotation would have to be between 45° and -45° or between 135° and -135°.
Essentially what you've got is a sphere split into six parts, two parts for each axis along which the character could look. Once you've determined which axis the character is looking along, you can determine the direction by finding the difference in position between the character and the treasure along the two axes the character isn't looking along, then use trig to calculate the angle.
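A minimal sketch of this idea (AngleIgnoringDominantAxis is a hypothetical helper; it assumes the camera transform is origin and the treasure is target, and matches the atan2(dy,dx) / atan2(dz,dx) examples from the question):
float AngleIgnoringDominantAxis(Transform origin, Transform target)
{
    Vector3 diff = target.position - origin.position;
    Vector3 f = origin.forward;
    float ax = Mathf.Abs(f.x), ay = Mathf.Abs(f.y), az = Mathf.Abs(f.z);

    if (az >= ax && az >= ay)       // looking roughly along world Z
        return Mathf.Atan2(diff.y, diff.x) * Mathf.Rad2Deg;
    if (ay >= ax)                   // looking roughly along world Y (straight up or down)
        return Mathf.Atan2(diff.z, diff.x) * Mathf.Rad2Deg;
    return Mathf.Atan2(diff.y, diff.z) * Mathf.Rad2Deg; // looking roughly along world X
}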
Replying to an old thread, but I was struggling with the same problem and found a relatively simple solution:
Project the position of the target (relative to the origin) on a plane defined by the forward vector of the camera. Then just rotate towards the projected point:
Vector3 diff = target.transform.position - origin.transform.position;
Vector3 projected = Vector3.ProjectOnPlane(diff, Camera.main.transform.forward);
origin.transform.rotation = Quaternion.LookRotation(projected);
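If you only need the signed angle rather than a rotation, a small follow-up (assuming the arrow's current up direction is its zero reference) could be:
float angle = Vector3.SignedAngle(origin.transform.up, projected, Camera.main.transform.forward);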
Calculate the difference in x and y coordinates by subtracting transform.position.x of one object from transform.position.x of the other, do the same for the y coordinates, and then use Mathf.Atan(differenceInY / differenceInX) to calculate the angle. Then set the z rotation to this angle and leave the x and y rotations as they already are.
Turns out there is a very simple way to get relative X and Y of the target.
Vector2 ExtractRelativeXY(Transform origin, Transform target) {
    // Get the absolute look rotation from origin to target.
    Quaternion lookRot = Quaternion.LookRotation(target.transform.position - origin.transform.position);
    // Create a relative look rotation with respect to origin's forward.
    Quaternion relativeRot = Quaternion.Inverse(origin.transform.rotation) * lookRot;
    // Obtain a Matrix4x4 from the rotation.
    Matrix4x4 m = Matrix4x4.Rotate(relativeRot);
    // Get the 3rd column (which is the forward vector of the rotation).
    Vector4 mForward = m.GetColumn(2);
    // Simply extract the x and y.
    return new Vector2(mForward.x, mForward.y);
}
Once x and y are obtained, turn them into an angle using angle = atan2(y, x), as suggested by both MBo and Tom.
This works because the columns of the rotation matrix built from the quaternion are the rotation's basis vectors, so the third column is the rotated forward direction. A better illustration is found here: https://stackoverflow.com/a/26724912.

Detecting the direction of the accelerometer movement on the y axis

I currently detect movement on the y axis. How does one calculate the direction it moved on the axis?
I get the same values when moving up or down.
Is the Gyro needed for this?
Remember that the accelerometer also reflects the force of gravity, so movement up and down will generally show up as 9.81 m/s^2 plus or minus the actual acceleration of the device relative to the Earth.
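A rough sketch of one common way to recover the direction (Unity C#, assuming Input.acceleration inside a MonoBehaviour; the field names and thresholds are placeholders): low-pass filter the raw reading to estimate gravity, subtract it, and look at the sign of what remains on the y axis. A gyro is not strictly required for this, although fusing one in gives a cleaner estimate.
Vector3 gravityEstimate;

void Update()
{
    Vector3 raw = Input.acceleration;                           // raw reading, includes gravity (in g units)
    gravityEstimate = Vector3.Lerp(gravityEstimate, raw, 0.1f); // simple low-pass filter tracking gravity
    Vector3 linear = raw - gravityEstimate;                     // movement-only component

    if (linear.y > 0.1f)
    {
        // device is accelerating upward along its y axis
    }
    else if (linear.y < -0.1f)
    {
        // device is accelerating downward along its y axis
    }
}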