So, I've got some models posed in Unity that I would like to recreate in Blender as part of a larger, static scene. I'm using the same model in Blender that I was in Unity and I have a file of all of the positions and rotations of each part/bone, adjusted for the move from XYZ to XZY. However, all of the rotations in Blender are wrong, and in such a way that I can't seem to find a pattern.
What I've been doing: I've made a basic little box-person model in Blender and given it a basic armature (it's a very bad model, I'm not a modeler, but it works and it's simple). I've been attaching a script to each bone on the model so I can get the quaternion values for their rotations in Unity, then negating the x, y, and z components and swapping the y and z values to convert from left-handed to right-handed coordinates before putting them back into Blender. You can see the results in the image below.
Model posed in Unity on left, same rotation values in Blender on right
For the record, I've also tried using Euler values since I'm more familiar with them, but I figured quaternions would give a more reliable result, since from what I've read the Euler values Unity reports can be inconsistent.
Everything I can find that seems related is, reasonably, in the other direction, from Blender to Unity.
Honestly, in this particular case I can just re-create the poses in Blender and move on. However, I'm interested in learning what it is that's so different between Blender and Unity that is causing these strange rotations.
Here are the .blend and .fbx of the block man for reference.
https://drive.google.com/drive/folders/1p4jhh8CVMwN7RlP0YQvwqOQJ3Ft569G6?usp=sharing
"what it is that's so different between Blender and Unity that is causing these strange rotations."
You basically already know that: Blender uses a right-handed, Z-up coordinate system, whereas Unity uses a left-handed, Y-up one.
AFAIK something like this should do it.
For positions it is quite trivial:
Flip Z and Y
Same for the scale:
Flip Z and Y
For the rotation:
Negate X, Y, and Z
Flip Z and Y
The same mapping works in both directions (left-handed -> right-handed and right-handed -> left-handed).
using UnityEngine;

public class Test : MonoBehaviour
{
    public Vector3 BlenderPosition;
    public Quaternion BlenderRotation;
    public Vector3 BlenderScale;

    [ContextMenu(nameof(ApplyTransformValuesFromBlender))]
    private void ApplyTransformValuesFromBlender()
    {
        transform.localPosition = FlipLeftRightHanded(BlenderPosition);
        transform.localRotation = FlipLeftRightHanded(BlenderRotation);
        transform.localScale = FlipLeftRightHanded(BlenderScale);
    }

    [ContextMenu(nameof(ConvertTransformValuesToBlender))]
    private void ConvertTransformValuesToBlender()
    {
        BlenderPosition = FlipLeftRightHanded(transform.localPosition);
        BlenderRotation = FlipLeftRightHanded(transform.localRotation);
        BlenderScale = FlipLeftRightHanded(transform.localScale);
    }

    // Swap the Y and Z components (right-handed Z-up <-> left-handed Y-up).
    private Vector3 FlipLeftRightHanded(Vector3 vector)
    {
        return new Vector3(vector.x, vector.z, vector.y);
    }

    // Swap Y and Z and negate the vector part; W is unchanged.
    private Quaternion FlipLeftRightHanded(Quaternion quaternion)
    {
        return new Quaternion(-quaternion.x, -quaternion.z, -quaternion.y, quaternion.w);
    }
}
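As a sanity check outside Unity, the swap-and-negate rule can be verified numerically. The Python sketch below (helper names are mine, not from either application) builds the rotation matrix of a quaternion and confirms that converting (x, y, z, w) to (-x, -z, -y, w) gives exactly the rotation you get by conjugating with the Y/Z axis swap, which is a reflection (determinant -1):

```python
# Verify that the Blender<->Unity quaternion rule (x, y, z, w) -> (-x, -z, -y, w)
# equals conjugating the rotation by the Y/Z axis swap (a reflection, det = -1).

def quat_to_matrix(q):
    """3x3 rotation matrix for a unit quaternion (x, y, z, w)."""
    x, y, z, w = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ]

def convert_handedness(q):
    """Swap Y and Z and negate the vector part: (x, y, z, w) -> (-x, -z, -y, w)."""
    x, y, z, w = q
    return (-x, -z, -y, w)

def swap_yz_conjugate(m):
    """Compute P @ m @ P, where P is the Y/Z axis-swap matrix (its own inverse)."""
    p = [0, 2, 1]  # index permutation implementing the swap
    return [[m[p[i]][p[j]] for j in range(3)] for i in range(3)]
```

Because the axis swap is an improper transform (a mirror), simply permuting the quaternion components is not enough; negating the vector part is what accounts for the handedness flip.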
Goal:
Calculate a rotation that can be used to correct the Transform of the bones, so they get rotated as expected when rotated manually.
In Detail:
The character's bone Transforms are not imported into Unity correctly, i.e. the left hand's Z axis does not look towards the top of the hand but backwards, while the right hand's Z axis looks forward.
The bones' default rotations are not what you would expect if you have seen the rig in another application before.
I had the idea to create a class that puts the character in T-pose, then maps the bones' rotations.
(I would assume that Unity does something similar under the hood; that's how I got the idea.)
So the class would be used with an API like this:
/// Should return the rotation in which the bone points forward, rotated by `rotation`.
/// I.e. when rotation == Quaternion.identity, the left hand should look forward.
Quaternion CorrectRotation(HumanBodyBones bone, Quaternion rotation)
{
    return Quaternion.Inverse(tPoseRotations[bone]) * rotation;
}
The problem is that I can't seem to find a good way to map these rotations.
My last attempt was this:
Vector3 boneDirection = (boneTransform.position - parentTransform.position).normalized;
Quaternion mappedRotation = Quaternion.LookRotation(boneDirection, parentTransform.up);
As you can see in this image, with this method the hands look in the same direction (forward), but they are still not rotated correctly; they have an offset from the desired result. (The correct rotation is shown by the purple hands.)
When given other rotations, the hands follow with the same offset, so other than this unwanted offset they work correctly.
So basically, I would appreciate any help with fixing my problem, either along these lines or with a different solution that works with different rigs.
I'm new to Unity and I'm making a simple game: a kind of Earth orbiting the Sun. The Earth rotates using RotateAround(). I want to change the Earth's distance from the Sun based on user input, so the Earth can be farther away or closer, but still orbiting the Sun. I can't figure out how to program it.
Here's the code :
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Player : MonoBehaviour
{
    public float speed = 100;
    public Rigidbody2D rb;
    float movement;

    void Update()
    {
        movement = Input.GetAxisRaw("Horizontal");
    }

    private void FixedUpdate()
    {
        transform.RotateAround(new Vector3(movement, 0, 0), Vector3.forward, Time.fixedDeltaTime * -speed);
    }
}
All kinds of help mean a lot to me. Sorry for my bad English.
RotateAround won't work very well in your situation.
https://docs.unity3d.com/ScriptReference/Transform.RotateAround.html
You can take a simple approach: put the Earth inside an empty GameObject, then rotate the empty and move the Earth along its local position to make it orbit farther out or closer in. The pivot of the empty should be at the same position as the Sun.
The hard way is to use math: define a distance, then calculate the x and z (y in 2D) position from the sine and cosine of an angle that you increase over time. But since you are new, I advise the first approach.
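The core of the math route can be sketched outside Unity in plain Python (all names here are mine): keep an angle that grows over time and a radius driven by input, and rebuild the position from sine and cosine each step.

```python
import math

def orbit_position(center, radius, angle):
    """2D point at `angle` radians on a circle of `radius` around `center`."""
    return (center[0] + radius * math.cos(angle),
            center[1] + radius * math.sin(angle))

def simulate(center, radius, angular_speed, dt, steps):
    """A few steps of a simulated orbit: the angle advances each frame,
    while the radius could change freely without breaking the orbit."""
    angle = 0.0
    points = []
    for _ in range(steps):
        points.append(orbit_position(center, radius, angle))
        angle += angular_speed * dt
    return points
```

In Unity this logic would live in Update(), advancing the angle by angularSpeed * Time.deltaTime and assigning the result to transform.position relative to the Sun; changing the radius moves the Earth closer or farther without interrupting the orbit.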
I'm learning Unity3D and having some trouble on my 3D radar. My radar is a child of my Player game object which is rotating around as I fly. The radar itself is rotated 45 degrees so it faces the user. There is a Cube that is supposed to be the radar blip of the enemy plane. The Cube is a child of Radar so should inherit its rotation. There is a script on the Cube to update itself every update(). Here is the hierarchy:
Enemy Plane
Player
-- Camera
-- Radar
------ Cube (radar representation of Enemy Plane)
The problem is that while the Cube itself is rotated along with the Radar, its motion is not. As I get closer to the enemy plane, the Cube gets closer to the camera (which is good), but I would expect its motion to follow the 45-degree rotation of the parent Radar object.
using UnityEngine;

public class RadarGlyph : MonoBehaviour
{
    GameObject radarSource;
    GameObject trackedObject;
    Vector3 radarScaler;

    void Start()
    {
        this.radarSource = GameObject.Find("Radar");
        this.trackedObject = GameObject.Find("Enemy Fighter");
        this.radarScaler = new Vector3(0.001f, 0.001f, 0.001f);
    }

    void Update()
    {
        Vector3 vDelta = this.trackedObject.transform.position - this.radarSource.transform.position;
        vDelta.Scale(this.radarScaler);
        this.transform.localPosition = this.transform.InverseTransformDirection(vDelta);
    }
}
For a complete solution, you have to get the position of the target with respect to the ship first, and then recreate it within the context of the blip and the radar.
As a quick fix, you can try changing your last line like this:
this.transform.localPosition = this.transform.parent.localRotation * this.transform.InverseTransformDirection(vDelta);
or (apparently not good as you mentioned)
this.transform.localPosition = Quaternion.Inverse(this.transform.parent.localRotation) * this.transform.InverseTransformDirection(vDelta);
One of these is bound to work. (The first one did.)
Edit: here's a third alternative
this.transform.localPosition = this.transform.parent.parent.InverseTransformDirection(vDelta);
This one gets the position in Player's space and applies it in radar's space.
The first and third are trying to do the same thing. Since you were transforming the direction into the blip's coordinate frame, any rotations that its parents have are canceled out. Instead, the correct thing to do is to get the position relative to the Player first. Then apply it to the blip in the radar. The third line of code I have here is attempting to do that.
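The frame bookkeeping can be illustrated with plain 2D rotation math in Python (all names are mine): express the world-space delta in the player's frame by applying the inverse of the player's rotation, then hand that to the blip as a local position, so the radar's extra tilt gets layered on automatically by the hierarchy.

```python
import math

def rotate2d(v, angle):
    """Rotate a 2D vector by `angle` radians (counter-clockwise)."""
    c, s = math.cos(angle), math.sin(angle)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def world_delta_to_player_local(delta_world, player_angle):
    """Undo the player's rotation: the 2D analogue of InverseTransformDirection
    done in the player's frame rather than the blip's."""
    return rotate2d(delta_world, -player_angle)
```

The key point is which frame the inverse is taken in: undoing the blip's own accumulated rotation cancels the radar tilt out, while undoing only the player's rotation preserves it.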
I'm trying to map rotations from a sensor into Unity using quaternions, and I cannot seem to figure out why the rotations do not map correctly.
I'm using the Adafruit BNO055 to pull absolute orientation in the form of quaternions. The source for their quaternion implementation can be found here. From what I understand about quaternions, which is almost nothing, I should be able to pull a quaternion out of the sensor and pump it into any GameObject inside Unity so that they share the same orientation. If I had a loop set up that read the quaternion data from the sensor and pumped it into Unity, the GameObject should rotate exactly like the sensor in the physical world. Unfortunately, this is not happening.
An example of the data sent from the sensor to Unity
w: 0.903564
x: 0.012207
y: 0.009094
z: -0.428223
Is the quaternion sent from the sensor not equal to the quaternions used in Unity? If not, how would I go about getting these mapped correctly?
Thanks in advance for any help!
When I have created a Quaternion from an external sensor in a comma separated list, this works for me:
// requires: using System; (for Convert)
string[] parts = SensorReceivedText.Split(',');
float x = Convert.ToSingle(parts[0]);
float y = Convert.ToSingle(parts[1]);
float z = Convert.ToSingle(parts[2]);
float w = Convert.ToSingle(parts[3]);
Quaternion rotation = new Quaternion(x, y, z, w);
Just as an example of different conversions (including Unity and OpenGL):
https://developers.google.com/project-tango/overview/coordinate-systems
I don't know your device and its coordinate notation, but you can recover it by making some experiments with orientation.
The main problem with such conversions is that the conversion matrix may contain MIRRORING (a -1 matrix component), which can't be fixed by just rearranging the rotation axes.
In my game, I'm always rotating a GameObject (a cube) with Quaternion.Lerp, and I need to know which face of the cube is looking up. I think I can use the object's current rotation to get this (if there is a better way, please say so).
Initially this GameObject's rotation is (0, 0, 0). When I rotate it forward it becomes (90, 0, 0); there is no problem. But when I rotate it a second time it doesn't become (180, 0, 0), it is (0, 180, 180). How can I prevent Unity from doing this?
Try using Euler angles instead of quaternions; they are much simpler. Here is an example:
transform.eulerAngles = new Vector3(180, 0, 0);
There are multiple ways of representing a rotation with Euler angles. When you convert the rotation of your GameObject (stored internally as a quaternion) to Euler angles (using either transform.eulerAngles or transform.rotation.eulerAngles) you may not get the result you're expecting.
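This ambiguity is easy to demonstrate outside Unity. In the Python sketch below (helper names are mine), (180, 0, 0) and (0, 180, 180) produce the same rotation matrix; for multiples of 180 degrees this holds regardless of the axis order in which the angles are applied:

```python
import math

def rot_x(deg):
    a = math.radians(deg); c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(deg):
    a = math.radians(deg); c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(deg):
    a = math.radians(deg); c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_matrix(x, y, z):
    # Composed as Ry * Rx * Rz (Unity applies Z, then X, then Y); for
    # 180-degree values the order is irrelevant since each factor is diagonal.
    return matmul(rot_y(y), matmul(rot_x(x), rot_z(z)))
```

Since both Euler triples reduce to the same matrix (and thus the same quaternion), Unity is free to report either one when converting back from its internal quaternion.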
You could get around this by storing a Vector3 (let's call it currentRotation) containing the rotation of your object in Euler angles, and construct Quaternions from it using Quaternion.Euler(currentRotation). You can do your interpolation by calculating your target rotation (targetRotation) as a Vector3, then constructing a Quaternion for it before doing Quaternion.Lerp:
var targetRotationQuaternion = Quaternion.Euler(targetRotation);
var currentRotationQuaternion = Quaternion.Euler(currentRotation);
transform.rotation = Quaternion.Lerp(currentRotationQuaternion, targetRotationQuaternion, t); // t = your interpolation factor in [0, 1]
Incidentally, this answer is very similar to a comment on http://answers.unity3d.com/questions/667037/transformrotationeulerangles-not-accuarate.html but it's worth reproducing here in case that ever gets taken down.