Mapping outside quaternions to Unity - unity3d

I'm trying to map rotations from a sensor into Unity using Quaternions, and I cannot
seem to figure out why rotations do not map correctly.
I'm using the Adafruit BNO055 to pull absolute orientation in the form of Quaternions. The
source for their Quaternion implementation can be found here. From what I understand about
Quaternions, which is almost nothing, I should be able to pull a Quaternion out of the sensor and
pump it into any GameObject inside Unity so that they share the same orientation. If I had a loop
set up that read the Quaternion data from the sensor and pumped it into Unity, the GameObject should
rotate exactly like the sensor in the physical world. Unfortunately, this is not happening.
An example of the data sent from the sensor to Unity:
w: 0.903564
x: 0.012207
y: 0.009094
z: -0.428223
Is the Quaternion sent from the sensor not equal to the Quaternions used in Unity? If not, how
would I go about getting these mapped correctly?
Thanks in advance for any help!

When I've received a Quaternion from an external sensor as a comma-separated list, this works for me:
string[] parts = SensorReceivedText.Split(',');
float x = Convert.ToSingle(parts[0]);
float y = Convert.ToSingle(parts[1]);
float z = Convert.ToSingle(parts[2]);
float w = Convert.ToSingle(parts[3]);
Quaternion rotation = new Quaternion(x, y, z, w);
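For context, a minimal sketch of how that parsed rotation might be applied every frame. ReadSensorLine() here is a hypothetical placeholder for however you actually receive the comma-separated string (serial port, UDP, etc.):
using System;
using UnityEngine;

public class SensorRotation : MonoBehaviour
{
    void Update()
    {
        // Placeholder: replace with your own transport code (serial, UDP, ...)
        string sensorReceivedText = ReadSensorLine();
        string[] parts = sensorReceivedText.Split(',');

        float x = Convert.ToSingle(parts[0]);
        float y = Convert.ToSingle(parts[1]);
        float z = Convert.ToSingle(parts[2]);
        float w = Convert.ToSingle(parts[3]);

        // Apply the orientation directly; if the axes look wrong you will need
        // to remap/negate components as discussed in the next answer.
        transform.rotation = new Quaternion(x, y, z, w);
    }

    // Dummy implementation so the sketch compiles; returns the identity rotation.
    string ReadSensorLine() => "0,0,0,1";
}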

Just as an example of different conversions (including Unity and OpenGL):
https://developers.google.com/project-tango/overview/coordinate-systems
I don't know your device or its coordinate convention, but you can recover it by making some experiments with the orientation.
The main problem with such a conversion is that the conversion matrix may contain MIRRORING (a -1 matrix component), which can't be solved by just rearranging the rotation axes.
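For example, if it turns out your sensor reports a right-handed frame with Z up (a common convention for IMUs, but an assumption here, not something taken from the BNO055 docs), one possible remap into Unity's left-handed, Y-up frame looks like the sketch below. Treat it as a starting point for the experiments mentioned above:
using UnityEngine;

public static class SensorQuaternionUtil
{
    // Assumption: the sensor frame is right-handed with Z up.
    // Swapping Y and Z and negating the vector part flips the handedness;
    // if the result is still mirrored or offset, experiment with which
    // components you swap/negate for your particular device mounting.
    public static Quaternion ToUnity(float x, float y, float z, float w)
    {
        return new Quaternion(-x, -z, -y, w);
    }
}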

Related

How to recreate pose of model from Unity in Blender?

So, I've got some models posed in Unity that I would like to recreate in Blender as part of a larger, static scene. I'm using the same model in Blender that I was in Unity and I have a file of all of the positions and rotations of each part/bone, adjusted for the move from XYZ to XZY. However, all of the rotations in Blender are wrong, and in such a way that I can't seem to find a pattern.
What I've been doing: I've made a basic little model of a box person in Blender and given it a basic armature (It's a very bad model- I'm not a modeler- but it works and it's simple). I've been attaching a script to each bone on the model so I can get the Quaternion values for their rotations in Unity, and negating the x, y, and z, and swapping the y and z values to convert from left-handed to right-handed coordinates before putting them back into Blender. You can see the results in the image below.
Model posed in Unity on left, same rotation values in Blender on right
For the record I've also tried using Euler values since I'm more familiar with them, but figured using Quaternions would give a more reliable result since the Euler values from Unity can be inconsistent from what I've read.
Everything I can find that seems related is, reasonably, in the other direction, from Blender to Unity.
Honestly, in this particular case I can just re-create the poses in Blender and move on. However, I'm interested in learning what it is that's so different between Blender and Unity that is causing these strange rotations.
Here are the .blend and .fbx of the block man for reference.
https://drive.google.com/drive/folders/1p4jhh8CVMwN7RlP0YQvwqOQJ3Ft569G6?usp=sharing
"what it is that's so different between Blender and Unity that is causing these strange rotations."
You basically already know that: Blender uses a right-handed coordinate system whereas Unity uses a left-handed one.
Afaik, something like this should do it.
For positions it is quite trivial:
Flip Z and Y
The same for the scale:
Flip Z and Y
For the rotation:
Negate X, Y, Z
Flip Z and Y
It should work the same way in both directions (left-handed -> right-handed and right-handed -> left-handed).
public class Test : MonoBehaviour
{
    public Vector3 BlenderPosition;
    public Quaternion BlenderRotation;
    public Vector3 BlenderScale;

    [ContextMenu(nameof(ApplyTransformValuesFromBlender))]
    private void ApplyTransformValuesFromBlender()
    {
        transform.localPosition = FlipLeftRightHanded(BlenderPosition);
        transform.localRotation = FlipLeftRightHanded(BlenderRotation);
        transform.localScale = FlipLeftRightHanded(BlenderScale);
    }

    [ContextMenu(nameof(ConvertTransformValuesToBlender))]
    private void ConvertTransformValuesToBlender()
    {
        BlenderPosition = FlipLeftRightHanded(transform.localPosition);
        BlenderRotation = FlipLeftRightHanded(transform.localRotation);
        BlenderScale = FlipLeftRightHanded(transform.localScale);
    }

    private Vector3 FlipLeftRightHanded(Vector3 vector)
    {
        return new Vector3(vector.x, vector.z, vector.y);
    }

    private Quaternion FlipLeftRightHanded(Quaternion quaternion)
    {
        return new Quaternion(-quaternion.x, -quaternion.z, -quaternion.y, quaternion.w);
    }
}

transform.LookAt and Quaternion.LookRotation are flipping the body's rotation when looking?

I need help please: the body is flipping even though it is looking at the player. I want it to have the correct rotation when looking.
Here is my code:
Vector3 direction = Player.position - transform.position;
Quaternion rotation = Quaternion.LookRotation(direction);
transform.rotation = rotation;
You can also watch my video so I can show you the issue clearly:
https://streamable.com/ygnbhq
Please help!
Both, as described in the API, will rotate your object so its forward vector points towards the target ... your 3D model seems "wrong" and doesn't have its forward (Z) vector pointing forward by default.
If I look at your model in the video, I can see in the rotation Gizmo that your model is rotated by default!
It looks like its forward (Z) vector is pointing up, its right (X) vector is pointing forward and its up (Y) vector is pointing left.
That means that in order to be oriented correctly it needs to be additionally rotated by -90° about its local X axis and -90° about its local Z axis.
You could fix that by applying the additional rotation needed to make your object stand upright again, e.g.
transform.LookAt(Player.position);
transform.localRotation *= Quaternion.Euler(-90, 0, -90);
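Put together, a minimal component might look like the sketch below. Player is assumed to be a Transform you assign in the Inspector, and the -90/-90 offset is the one from above; adjust it to whatever offset your particular model needs:
using UnityEngine;

public class LookAtPlayer : MonoBehaviour
{
    public Transform Player;

    void Update()
    {
        // First aim the forward (Z) axis at the player ...
        transform.LookAt(Player.position);

        // ... then compensate for the model's default orientation
        transform.localRotation *= Quaternion.Euler(-90f, 0f, -90f);
    }
}
A cleaner long-term fix is to parent the mesh under an empty GameObject, apply the corrective rotation to the child once, and call LookAt on the parent only.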

Which rotation is shown in the Inspector?

The chest bone of my player can be rotated while aiming.
Now I wanted to evaluate how much (minimum and maximum rotation) I should let the chest be rotatable.
To do that, I allowed all degrees of rotation and took a look at the Inspector.
For example, the minimum value that the chest should be rotatable to the left should be Y=-15.
At Y=-15 (seen in the Inspector), it still looked natural.
Now I wanted to code this.
To my surprise, Chest.localRotation.y was a completely different value than what the Inspector is showing.
I have then taken a look at the chest variable and extended the view.
I just can't see the rotation value that the Inspector is showing.
How should I go on in this case, please?
I'm using this to rotate the bone:
Chest.LookAt(ChestLookTarget.position);
Chest.rotation = Chest.rotation * Quaternion.Euler(Offset);
Thank you!
The reason why it doesn't work:
A Quaternion is not a human-readable value.
A Quaternion describes a rotation uniquely, but the same rotation can have multiple (infinitely many?) different representations in Euler angles! The other way round, one set of Euler angles always corresponds to exactly one Quaternion value.
If you look at the docs it explicitly says
Don't modify this directly unless you know quaternions inside out.
Then, as said, what you see in the Inspector is the localRotation relative to the parent Transform.
Better said, it is one of the many possible Euler inputs that result in that Quaternion. What you see in the debugger under localEulerAngles is another possible Euler representation. Unity usually also gives you only values > 0 in localEulerAngles.
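A quick throwaway check makes this concrete: two different Euler inputs can describe exactly the same rotation, which is why comparing Inspector values to localEulerAngles component by component is unreliable:
// These two Euler inputs describe the same physical rotation,
// so the resulting quaternions are effectively identical
Quaternion a = Quaternion.Euler(180f, 0f, 0f);
Quaternion b = Quaternion.Euler(0f, 180f, 180f);
Debug.Log(Quaternion.Angle(a, b)); // prints ~0
Debug.Log(a.eulerAngles);          // Unity picks just one representation when converting back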
It seems the chest will only rotate around the Y axis anyway, right?
If that is the case, you can simply get the angle between the chest's original forward vector and the target direction. It is way easier to handle Vector3 values than Quaternions ;)
It seems to be the same use case as in this post
// Get the target direction
Vector3 targetDir = ChestLookTarget.position - Chest.position;

// Reset any difference in the Y axis, since a height difference
// between the two objects would change the angle as well
targetDir.y = 0;

// However you currently rotate: rotate only the Vector3 direction first,
// without applying it to the transform yet
Vector3 newDir = Vector3.RotateTowards(Chest.forward, targetDir, RotationSpeed * Time.deltaTime, 0.0f);

// Compare the target direction to the parent's forward vector
float newAngle = Vector3.Angle(Chest.parent.forward, newDir);
if (newAngle > MaxRotationAngle)
{
    // What should happen if the angle gets too big?
    return;
}

// If the angle is still okay, set the new direction
Chest.rotation = Quaternion.LookRotation(newDir);
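If you also need to distinguish left from right (e.g. different limits per side), Vector3.SignedAngle gives a signed angle around a chosen axis. Here is a rough variation of the snippet above; MaxLeftRotationAngle and MaxRightRotationAngle are made-up field names for this sketch:
Vector3 targetDir = ChestLookTarget.position - Chest.position;
targetDir.y = 0;

Vector3 newDir = Vector3.RotateTowards(Chest.forward, targetDir, RotationSpeed * Time.deltaTime, 0.0f);

// Signed angle around the world up axis: negative = left, positive = right
float signedAngle = Vector3.SignedAngle(Chest.parent.forward, newDir, Vector3.up);
if (signedAngle < -MaxLeftRotationAngle || signedAngle > MaxRightRotationAngle)
{
    return;
}

Chest.rotation = Quaternion.LookRotation(newDir);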

Get simplified rotation of object

In my game, I'm always rotating a GameObject (a cube) with Quaternion.Lerp, and I need to know which face of this cube is looking up. I think I can use the current rotation of the object to get this (if there is a better way, please say so).
Initially this GameObject's rotation is (0,0,0). When I rotate it forward it becomes (90,0,0); there is no problem. But when I rotate it a second time it doesn't become (180,0,0), it becomes (0,180,180). How can I prevent Unity from doing this?
Try using Euler angles instead of quaternions; they are much simpler. Here is an example:
transform.eulerAngles = new Vector3(180, 0, 0);
There are multiple ways of representing a rotation with Euler angles. When you convert the rotation of your GameObject (stored internally as a quaternion) to Euler angles (using either transform.eulerAngles or transform.rotation.eulerAngles) you may not get the result you're expecting.
You could get around this by storing a Vector3 (let's call it currentRotation) containing the rotation of your object in Euler angles, and construct Quaternions from it using Quaternion.Euler(currentRotation). You can do your interpolation by calculating your target rotation (targetRotation) as a Vector3, then constructing a Quaternion for it before doing Quaternion.Lerp:
var targetRotationQuaternion = Quaternion.Euler(targetRotation);
var currentRotationQuaternion = Quaternion.Euler(currentRotation);
transform.rotation = Quaternion.Lerp(currentRotationQuaternion, targetRotationQuaternion, *your t*);
Incidentally, this answer is very similar to a comment on http://answers.unity3d.com/questions/667037/transformrotationeulerangles-not-accuarate.html but it's worth reproducing here in case that ever gets taken down.
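As for the original "which face of the cube is looking up" part: one way to avoid reading rotation values entirely (a sketch, not the only option) is to transform the cube's six local face directions into world space and pick the one that points most nearly upwards:
using UnityEngine;

public static class CubeFaces
{
    // Returns the local-space direction of the face currently pointing up,
    // e.g. Vector3.up if the cube's own top face is up, Vector3.forward
    // if its front face is up, and so on.
    public static Vector3 GetUpFace(Transform cube)
    {
        Vector3[] localDirections =
        {
            Vector3.up, Vector3.down,
            Vector3.left, Vector3.right,
            Vector3.forward, Vector3.back
        };

        Vector3 best = Vector3.up;
        float bestDot = float.NegativeInfinity;

        foreach (Vector3 local in localDirections)
        {
            // Compare each face normal (in world space) against world up
            float dot = Vector3.Dot(cube.TransformDirection(local), Vector3.up);
            if (dot > bestDot)
            {
                bestDot = dot;
                best = local;
            }
        }

        return best;
    }
}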

Measuring vertical movement of non-fixed Android device

My goal is to make an Android mobile app (SDK16+) that measures road quality while riding a bike.
I have found a Sensor fusion demo for Android that I assume will do all the measurements for me.
How can I get only the vertical movement when the phone is not fixed in a certain orientation?
The problem
The problem here is that you have two systems of coordinates, the dx, dy, dz, of your device and the wX, wY, wZ of the world around you. The relationship between the two changes as you move your device around.
Another way to formulate your question would be to say:
Given a sensor reading [dx, dy, dz], how do I find the component of the reading which is parallel to wZ?
A solution
Luckily, the orientation sensors (such as Android's own ROTATION_VECTOR sensor) provide the tools for transformation between these two coordinate systems.
For example, the output of the ROTATION_VECTOR sensor comes in the form of an axis-angle representation of the rotation your device has from some certain "base" rotation fixed in the world frame (see also: Quaternions).
Android then provides the method SensorManager#getRotationMatrixFromVector(float[] R, float[] rotationVector), which takes the axis-angle representation from your sensor and translates it into a rotation matrix. A rotation matrix is used to transform a vector given in one frame of reference into another (in this case [World -> Device]).
Now, you want to transform a measurement in the device frame into the world frame? No problem. One nifty characteristic of rotation matrices is that the inverse of a rotation matrix is the rotation matrix of the opposite transformation ([Device -> World] in our case). Another, even niftier thing is that the inverse of a rotation matrix is simply its transpose.
So, your code could go along the lines of:
public void findVerticalComponentOfSensorValue() {
    float[] rotationVectorOutput = ... // Get latest value from ROTATION_VECTOR sensor
    float[] accelerometerValue = ... // Get latest value from ACCELEROMETER sensor

    float[] rotationMatrix = new float[9]; // Both 9 and 16 work, depending on what you're doing with it
    SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVectorOutput);

    // Multiply by the transposed (i.e. inverted) rotation matrix
    // to move the reading from the device frame into the world frame
    float[] accelerationInWorldFrame = matrixMult(
            matrixTranspose(rotationMatrix),
            accelerometerValue);
    // Make your own methods for the matrix operations or find an existing library

    float verticalAcceleration = accelerationInWorldFrame[2]; // This is your vertical acceleration
}
Now, I'm not saying this is the best possible solution, but it should do what you're after.
Disclaimer
Strapping a device that uses sensor fusion including a magnetometer to a metal frame may produce inconsistent results. Since your compass heading doesn't matter here, I'd suggest using a sensor fusion method that doesn't involve the magnetometer.