Unity: Create AnimationClip With World Scale AnimationCurves

I've been looking for a solution to this for quite a while now (meaning several days) and I haven't found anything yet. Maybe I'm thinking about it wrong and there isn't a way, but let's try!
I'm recording hand data on a HoloLens (using the Unity HoloLens input simulation for now). This essentially gives me one float AnimationCurve per hand joint for each of transform.position.x through z and rotation.x through w. My goal is to put these curves into an AnimationClip, add it to an AnimatorController (via an AnimatorOverrideController) that animates a hand rig, and replay the recordings. Everything so far works!
However, the recorded hand data from the HoloLens is in world space, not local space (which makes sense, since you usually want absolute coordinates when you want to know where the hand is). But to animate the hand, it seems I'm only able to set local coordinates, which I don't have.
Example:
clip.SetCurve("", typeof(Transform), "localPosition.x", curve.PositionX);
Here, the clip takes the x-coordinates from some hand joint and writes them to the localPosition.x of the corresponding hand-rig joint. The problem: curve.PositionX is in world space (absolute coordinates), but localPosition.x expects local-space values (coordinates relative to the parent).
I can't simply change "localPosition.x" to "position.x", like so:
clip.SetCurve("", typeof(Transform), "position.x", curve.PositionX);
even though the Transform class has both properties and position is the object's world-space position. I'm not sure why this doesn't work, but it gives me the following error:
Cannot bind generic curve on Transform component, only position, rotation and scale curve are supported.
I'm aware that it doesn't make much sense to use absolute coordinates for an animation, but I simply don't have anything else.
Does anyone have an approach how I can deal with this in a sensible, not-too-cumbersome way? It seems I have all the important parts, I just can't figure out how to put them together. Thanks so much already! :)

From my basic understanding, it seems like you are using the input animation recording service provided by MRTK. Unfortunately, MRTK does not provide a localPosition version of the curve data. However, you can modify the data in the recordingBuffer after the InputRecordingService stops recording.
So here is a method worth trying: the handJointCurves dictionary of the recordingBuffer field stores a set of pose curves for each joint (see the joint pose curves table). In every keyframe, subtract the position value of the None key from the position value of each other joint, so that you obtain a localPosition relative to the None key.
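To illustrate the idea outside of MRTK's own types, here is a minimal sketch using plain Unity AnimationCurves (the method name is mine, not an MRTK API): it builds a new curve whose keyframe values are the joint's world-space values minus the reference joint's value at the same time. It assumes both curves cover the same time range and it ignores the reference joint's rotation, so it is only an approximation of a true local pose.

using UnityEngine;

public static class CurveUtils
{
    // Builds a curve of per-keyframe differences: joint value minus reference value.
    // jointWorld and referenceWorld are assumed to be the same axis (e.g. position.x)
    // of a hand joint and of the reference (None) joint, both in world space.
    public static AnimationCurve MakeRelativeCurve(AnimationCurve jointWorld, AnimationCurve referenceWorld)
    {
        Keyframe[] source = jointWorld.keys;
        var keys = new Keyframe[source.Length];
        for (int i = 0; i < source.Length; i++)
        {
            float t = source[i].time;
            keys[i] = new Keyframe(t, source[i].value - referenceWorld.Evaluate(t));
        }
        return new AnimationCurve(keys);
    }
}

You would run this once per joint and per axis (x, y, z) and then pass the resulting curves to clip.SetCurve on the localPosition properties, as in the question.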

Related

Unity: skinned mesh vertexbuffer output slightly different than correct results

It's like the hip is stuck in place. In the pictures (taken at different times in the animation, since I can't upload a gif), the red wireframe is the drawing of the raw output triangles and everything else is default Unity, i.e. the correct output.
What am I doing wrong? What am I missing? This has been driving me nuts for 2-3 days now; any help is appreciated.
As you did not post any code, I will do some guessing about what is going on. Disclaimer: I actually happened to run into a similar problem myself.
First, you must know that the Update calls of MonoBehaviours are executed in an unspecified order (not necessarily truly random, but you see the point). If you bake the mesh in one component, it can still be one frame late relative to the Animator component.
There are actually two solutions to this: the first is to adjust the Script Execution Order in Project Settings, while the second is to use LateUpdate instead of Update when updating the mesh after skinning, as sketched below.
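A minimal sketch of the LateUpdate approach (the class and field names are illustrative, not taken from the question's code):

using UnityEngine;

public class BakeAfterSkinning : MonoBehaviour
{
    public SkinnedMeshRenderer skinnedRenderer; // assign in the Inspector
    Mesh bakedMesh;

    void Awake()
    {
        bakedMesh = new Mesh();
    }

    void LateUpdate()
    {
        // LateUpdate runs after all Update calls and after the Animator has been
        // evaluated for this frame, so the baked vertices match what is rendered.
        skinnedRenderer.BakeMesh(bakedMesh);
        // ... read bakedMesh.vertices here instead of baking in Update ...
    }
}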
The second problem you might have run into is the scale/position of the skinned mesh. Even if it does not contribute at all to the animation movement, it can still wreck the baked mesh (and any collision built from it) later on.
To fix that, make sure all your SkinnedMeshRenderer objects have their Local Position, Local Rotation and Local Scale set to identity in the Transform component (0,0,0 for position and rotation, 1,1,1 for scale). If not, reset those values in the editor. It will not change the animations, but it will change the baked mesh output.
If neither solution works, please describe the problem more thoroughly.

(UNITY) Plane not rotating to normal vector of three points?

I am trying to get a stretched out cube (which we can call a plane for the sake of discussion) to orient itself to the normal vector of a plane described by three points. I wrote a script to find the normal of three points, and then used transform.LookAt to have the planes align. However, I am finding that this script is not working at all how it is intended to, and despite my best efforts I cannot figure out why.
Drastic movements of the individual points hardly affect the plane's rotation.
The rotation of the object, when using the existing points in the script, should be 0,0,0 in the Inspector. However, it is always off by a few degrees and, as I said, it does not align itself when I move the points around.
This is the script. I can also post photos showing the behavior or share a small Unity package.
First of all, Transform.LookAt takes a position as a parameter, not a direction!
And then it
Rotates the transform so the forward vector points at worldPosition.
Doesn't sound like what you are trying to achieve.
If you want your object to look with its forward vector in the given normal direction (assuming you are calculating the normal correctly) then you could rather use Quaternion.LookRotation
transform.rotation = Quaternion.LookRotation(doNormal(cpit, cmit, ctht));
Alternatively, you can also simply assign the corresponding vector directly, e.g.
transform.forward = doNormal(cpit, cmit, ctht);
or
transform.up = doNormal(cpit, cmit, ctht);
depending on your needs.
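For completeness, here is a hedged sketch of what a doNormal-style helper could look like (the question's actual implementation isn't shown, so this is an assumption): the normal of the plane through three points is the normalized cross product of two edge vectors.

using UnityEngine;

public static class PlaneMath
{
    // Normal of the plane through the positions of a, b and c.
    // The sign of the result depends on the winding order of the three points.
    public static Vector3 Normal(Transform a, Transform b, Transform c)
    {
        return Vector3.Cross(b.position - a.position, c.position - a.position).normalized;
    }
}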

Explanation of how to calculate transforms in Unity

I am getting started with Unity and am just trying to get my head around the units. What are these units? It seems they are their own 'quantity', and 2 units are simply twice the value of 1 unit.
Anyway, I am trying to work out how to optimally calculate transforms so that objects sit exactly where I want them to.
In my scene I have a terrain and a cylinder as so:
As you can see my cylinder is floating. I want the cylinder to sit perfectly on top of the terrain.
My terrain is at the following transform: 0,0,0 and scale 0,0,0 (not sure how to tell its dimensions yet).
My cylinder is part of a new object, as so:
My FirstPersonPlayer is at transform: 85.9,2.165,51.8 and scale 1,1,1. My Cylinder is at 'localposition' 0,0,0 and local scale 1.2,1.8,1.2
Now - the transform of FirstPersonPlayer on the y axis appears to be what I need to correct.
Currently it is set to 2.165 and is floating a bit above the terrain.
Through manually shifting it, around 1.85 looks about right - but I want to know how to calculate that, rather than doing a finger in the air 'that looks about right'.
Can anyone help me? (Before you suggest using gravity etc., I actually am, but I don't want the player falling as soon as they start, however slight that may look or feel.)
Many thanks,
As per Nikola Dimitroff, the answer is:
You don't have to compute anything, hold Shift + Control and drag the object. Every game engine ever made calls this "Snap to Ground"
I appreciate and agree with the other comments.
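If you do want to calculate the Y position instead of snapping in the editor, here is a minimal sketch using the active terrain's sampled height (it assumes the object's visible height is described by its Collider bounds; the class name is illustrative):

using UnityEngine;

public class SnapToTerrain : MonoBehaviour
{
    void Start()
    {
        Terrain terrain = Terrain.activeTerrain;
        // SampleHeight returns the terrain height at this X/Z, relative to the terrain's own Y.
        float groundY = terrain.SampleHeight(transform.position) + terrain.transform.position.y;
        // Lift the object by half its height so its bottom rests on the ground.
        float halfHeight = GetComponent<Collider>().bounds.extents.y;
        transform.position = new Vector3(transform.position.x, groundY + halfHeight, transform.position.z);
    }
}

In the question's setup you would do the same calculation for the FirstPersonPlayer's Y instead of eyeballing 1.85.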

Get path points on applied impulse

I added an impulse to a GameObject like
rigidbody.AddForce(SomeVector3, ForceMode.Impulse);
and my gravity multiplier is 4. How do I calculate the points of my path before I apply that impulse to the body?
I have all the data like mass, gravity multiplier and so on, and I would like to draw that path before applying the impulse, so I need a number of points.
Using common physics you are of course able to predict the path the object will take. The problem is that your formulas, tick rate etc. will most likely not be exactly the same as the ones Unity uses internally, so your results won't be exact. A rough analytic prediction is sketched below.
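For a rough preview, here is a minimal sketch of such a prediction, assuming only gravity acts after the impulse and ignoring drag (the names are illustrative; with ForceMode.Impulse the initial velocity is the impulse divided by the mass):

using UnityEngine;

public static class PathPredictor
{
    // Samples the ballistic path p(t) = p0 + v0 * t + 0.5 * g * t^2.
    public static Vector3[] Predict(Vector3 start, Vector3 impulse, float mass,
                                    float gravityMultiplier, int pointCount, float timeStep)
    {
        Vector3 v0 = impulse / mass;                      // ForceMode.Impulse
        Vector3 g = Physics.gravity * gravityMultiplier;  // e.g. multiplier of 4
        var points = new Vector3[pointCount];
        for (int i = 0; i < pointCount; i++)
        {
            float t = i * timeStep;
            points[i] = start + v0 * t + 0.5f * t * t * g;
        }
        return points;
    }
}

Because Unity integrates physics in discrete FixedUpdate steps and may apply drag, this analytic curve will drift slightly from the simulated one.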
If that is not accurate enough, the only other thing I can think of is having a second, invisible version of your GameObject, applying the impulse to it and tracking its position, then displaying the resulting path for your visible GameObject. You might consider accelerating Unity's time using Time.timeScale during the prediction, so you don't have to wait too long for the simulation's results.

iTween consider up vector of transforms

I'm using iTween.MoveTo and for the "path" argument I give an array of transforms. The transforms are placed on a sphere, though. The object moves very nicely over the path, but it stays oriented upwards instead of being properly oriented on the sphere.
I tried using the "lookat" argument and giving it the center of my sphere, and that works (after tweaking my character a bit so that if it looks at the center it's actually standing on it), but then he won't look ahead on the path.
So, is there any way I can make iTween take the up vector of the transforms into account?
Thanks!
One way to do this is to disable the "lookat" and turn a flag on while the iTween is happening (I suggest a coroutine that uses WaitForSeconds for the duration of the tween).
Then, in your Update/FixedUpdate routine, you can use the following code:
outSideObjectTransform.forward = outSideObjectTransform.position - lookTarget.position;
This will make the outsideObject's forward axis point away from the center object. The same can be done for any of the axes.
Also, if you want the exact opposite (that the object looks towards its "look target"), just reverse the subtraction of the two vectors (lookTarget.position - outSideObjectTransform.position).
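A minimal sketch of the flag-plus-update idea described above (the field names are illustrative):

using UnityEngine;

public class KeepOrientedOnSphere : MonoBehaviour
{
    public Transform lookTarget;  // the center of the sphere
    public bool tweening;         // set by a coroutine for the duration of the tween

    void Update()
    {
        if (!tweening) return;
        // Point the chosen axis away from the sphere's center; use transform.up
        // instead of transform.forward if the up axis is what you need.
        transform.forward = (transform.position - lookTarget.position).normalized;
    }
}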
I hope this helps.