2d Look at Function - unity3d

I am creating a 2d sidescroller game. I have a point in space (where the mouse is) and I need the weapon to look and "follow" that point.
Does anyone know where to begin?

wikihow: How to Find the Angle Between Two Vectors
After you have the angle, you can appropriately rotate the thing to be rotated.
I am not sure if JavaScript (UnityScript) also has an atan2(y, x) function, which could be used to get the angle.
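A minimal C# sketch of that idea (assuming an orthographic 2D camera and a weapon Transform assigned in the Inspector; the names here are only illustrative, not from the question):

using UnityEngine;

public class WeaponAim : MonoBehaviour
{
    public Transform weapon; // hypothetical reference to the weapon, assigned in the Inspector

    void Update()
    {
        // Convert the mouse position from screen space to world space.
        Vector3 mouseWorld = Camera.main.ScreenToWorldPoint(Input.mousePosition);
        mouseWorld.z = weapon.position.z; // stay on the 2D plane

        // Direction from the weapon towards the mouse.
        Vector2 dir = mouseWorld - weapon.position;

        // Mathf.Atan2 takes (y, x) and returns the angle in radians.
        float angle = Mathf.Atan2(dir.y, dir.x) * Mathf.Rad2Deg;

        // Rotate around the Z axis so the weapon's right side points at the mouse.
        weapon.rotation = Quaternion.Euler(0f, 0f, angle);
    }
}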

Related

(UNITY) Plane not rotating to normal vector of three points?

I am trying to get a stretched-out cube (which we can call a plane for the sake of discussion) to orient itself to the normal vector of a plane described by three points. I wrote a script to find the normal of the three points, and then used transform.LookAt to have the planes align. However, I am finding that this script is not working at all how it is intended to, and despite my best efforts I cannot figure out why.
Drastic movements of the individual points hardly affect the plane's rotation.
The rotation of the object when using the existing points in the script should be 0,0,0 in the Inspector. However, it is always off by a few degrees and, as I said, does not align itself when I move the points around.
This is the script. I can also post photos showing the behavior or share a small Unity package.
First of all, Transform.LookAt takes a position as a parameter, not a direction!
And then it
Rotates the transform so the forward vector points at worldPosition.
Doesn't sound like what you are trying to achieve.
If you want your object to look with its forward vector in the given normal direction (assuming you are calculating the normal correctly) then you could rather use Quaternion.LookRotation
transform.rotation = Quaternion.LookRotation(doNormal(cpit, cmit, ctht));
Alternatively, you can also simply assign the corresponding vector directly, e.g.
transform.forward = doNormal(cpit, cmit, ctht);
or
transform.up = doNormal(cpit, cmit, ctht);
depending on your needs.
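A rough sketch of the whole idea (assuming doNormal computes the plane normal as the cross product of two edge vectors; cpit, cmit and ctht are the three point transforms from the question):

using UnityEngine;

public class AlignToNormal : MonoBehaviour
{
    public Transform cpit, cmit, ctht; // the three points

    // Normal of the plane through three points: cross product of two edges.
    Vector3 doNormal(Transform a, Transform b, Transform c)
    {
        return Vector3.Cross(b.position - a.position, c.position - a.position).normalized;
    }

    void Update()
    {
        Vector3 normal = doNormal(cpit, cmit, ctht);

        // Forward axis along the normal ...
        transform.rotation = Quaternion.LookRotation(normal);

        // ... or, if the flattened cube should lie flat on the plane,
        // point its up axis along the normal instead:
        // transform.up = normal;
    }
}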

Unity: Create AnimationClip With World Scale AnimationCurves

I've been looking for a solution to this for quite a while now (meaning several days) and I haven't found anything yet. Maybe I'm thinking about it wrong and there isn't a way, but let's try!
I'm recording hand-data on a Hololens (the Unity Hololens Input Simulation for now). This essentially gives me one float AnimationCurve for each hand joint for each transform.position.x to z and rotation.x to w. Now my goal is to put these curves into an AnimationClip and add it to an AnimatorController (via an AnimatorOverrideController) that animates a hand rig and replay the recordings. Everything so far works!
However, the recorded hand data from the Hololens is in world space, not in local space (which makes sense, since you usually want absolute coordinates when you want to know where the hand is). But to animate the hand, it seems I'm only able to set local coordinates, which I don't have.
Example:
clip.SetCurve("", typeof(Transform), "localPosition.x", curve.PositionX);
Here, the clip takes the x-coordinates from some hand joint and puts them into the localPosition.x of the corresponding hand rig joint. The problem: curve.PositionX is in world space (absolute coordinates), but localPosition.x expects local space (coordinates relative to its parent).
I can't simply change "localPosition.x" to "position.x", like so:
clip.SetCurve("", typeof(Transform), "position.x", curve.PositionX);
even though the Transform class has both properties and position is the object's world-space position. I'm not sure why this doesn't work, but it gives me the following error:
Cannot bind generic curve on Transform component, only position, rotation and scale curve are supported.
I'm aware that it doesn't make much sense to use absolute coordinates for an animation, but I simply don't have anything else.
Does anyone have an approach how I can deal with this in a sensible, not-too-cumbersome way? It seems I have all the important parts, I just can't figure out how to put them together. Thanks so much already! :)
From my basic understanding, it seems like you are using the input animation recording service provided by MRTK. Unfortunately, MRTK does not provide a localPosition version of the curve data. However, you can modify the data in the recordingBuffer after the InputRecordingService stops recording.
So this is a method worth trying: in the handJointCurves dictionary property of the recordingBuffer field, a set of pose curves is stored for each joint. Then, based on the joint pose curves table, subtract the position value of the None key from the position value of every other joint at each keyframe, so that you obtain a localPosition relative to the None joint.
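A hedged sketch of that per-keyframe subtraction using plain Unity AnimationCurves (the curve variable names are hypothetical; how you pull the matching world-space curves for a joint and for the None reference out of the recording buffer depends on your MRTK version):

using UnityEngine;

public static class CurveUtils
{
    // Builds a new curve whose value at each keyframe is (joint - reference),
    // i.e. the joint position expressed relative to the reference joint.
    // Note: tangents are not preserved in this simple version.
    public static AnimationCurve SubtractCurves(AnimationCurve jointCurve, AnimationCurve referenceCurve)
    {
        var localKeys = new Keyframe[jointCurve.length];

        for (int i = 0; i < jointCurve.length; i++)
        {
            Keyframe key = jointCurve.keys[i];
            float localValue = key.value - referenceCurve.Evaluate(key.time);
            localKeys[i] = new Keyframe(key.time, localValue);
        }

        return new AnimationCurve(localKeys);
    }
}

// Usage (hypothetical names):
// AnimationCurve localX = CurveUtils.SubtractCurves(jointCurves.PositionX, noneJointCurves.PositionX);
// clip.SetCurve("", typeof(Transform), "localPosition.x", localX);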

Explanation of how to calculate transforms in Unity

I am getting started with Unity and am just trying to get my head around the units. What are these units? It seems they are their own 'quantity', and 2 units are simply treated as twice the value of 1 unit.
Anyway, I am trying to work out how to calculate transforms so that objects sit exactly where I want them to.
In my scene I have a terrain and a cylinder as so:
As you can see my cylinder is floating. I want the cylinder to sit perfectly on top of the terrain.
My terrain is at the following transform: 0,0,0 and scale 0,0,0 (not sure how to tell its dimensions yet).
My cylinder is part of a new object, as so:
My FirstPersonPlayer is at transform: 85.9,2.165,51.8 and scale 1,1,1. My Cylinder is at 'localposition' 0,0,0 and local scale 1.2,1.8,1.2
Now - the transform of FirstPersonPlayer on the y axis appears to be what I need to correct.
Currently it is set to 2.165 and is floating a bit above the terrain.
Through manually shifting it, around 1.85 looks about right - but I want to know how to calculate that, rather than doing a finger in the air 'that looks about right'.
Can anyone help me? (Before you suggest using gravity etc., I actually am, but I don't want the player falling as soon as they start, however slight that may look or feel.)
Many thanks,
As per @Nikola Dimitroff, the answer is:
You don't have to compute anything, hold Shift + Control and drag the object. Every game engine ever made calls this "Snap to Ground"
I appreciate and agree with the other comments.
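If you do want to calculate it instead of snapping by hand, here is a minimal sketch (assuming the object has a Collider and the ground is a standard Terrain): sample the terrain height under the object and offset upward by half the object's rendered height.

using UnityEngine;

public class PlaceOnTerrain : MonoBehaviour
{
    void Start()
    {
        // World-space height of the terrain directly below this object.
        float terrainY = Terrain.activeTerrain.SampleHeight(transform.position)
                       + Terrain.activeTerrain.transform.position.y;

        // Half the rendered height of the object (bounds already account for scale).
        float halfHeight = GetComponent<Collider>().bounds.extents.y;

        // Sit the object exactly on top of the terrain.
        transform.position = new Vector3(transform.position.x, terrainY + halfHeight, transform.position.z);
    }
}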

iTween consider up vector of transforms

I'm using iTween.MoveTo and, for the "path" argument, I give an array of transforms. The transforms, though, are placed on a sphere. The object moves very nicely over the path, but it stays oriented upwards instead of being properly oriented on the sphere.
I tried using the "lookat" argument and giving it the center of my sphere, and that works (after tweaking my character a bit so that if it looks at the center it's actually standing on it), but then he won't look ahead on the path.
So, is there any way I can make iTween take the up vector of the transforms into account?
Thanks!
One way to do this is to disable the "lookat" and turn a flag on while the iTween is happening (I suggest a coroutine that uses WaitForSeconds for the duration of the tween).
Then, on your Update/FixedUpdate routine, you can use the following code:
outSideObjectTransform.forward = outSideObjectTransform.position - lookTarget.position;
This will make the outsideObject's forward axis point away from the center object. The same can be done for any of the axes.
Also, if you want the exact opposite (so that the object looks towards its "look target"), just reverse the order of the subtraction, i.e. lookTarget.position - outSideObjectTransform.position.
I hope this helps.
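A minimal sketch of that approach, attached to the moving object itself (the field names and the way the flag is toggled are assumptions, not part of iTween):

using System.Collections;
using UnityEngine;

public class OrientAwayFromCenter : MonoBehaviour
{
    public Transform lookTarget; // the center of the sphere
    bool isTweening;

    // Start this coroutine when you call iTween.MoveTo, passing the tween's duration.
    public IEnumerator TrackTween(float duration)
    {
        isTweening = true;
        yield return new WaitForSeconds(duration);
        isTweening = false;
    }

    void Update()
    {
        if (!isTweening) return;

        // Point the forward axis away from the sphere's center, as described above.
        // Use transform.up instead if the up axis should point away from the center.
        transform.forward = transform.position - lookTarget.position;
    }
}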

XCode translate matrix ala ActionScript?

I am wondering if there is a way to translate the underlying matrix of a layer much like you can in ActionScript3.
In AS3 I can get the transform of a layer and shift it to, let's say, make the center of the layer the anchor point, rather than the upper-left corner.
The reason I ask is because I am trying to rotate a layer (containing a square) along a diagonal axis. I thought it might be easy if I could rotate the matrix by 45 degrees, then I could just rotate around the X-axis and be done.
But I cannot figure out how to do that.
Any help, greatly appreciate, as always.
Cheers,
Chris
Use a CGAffineTransform.
Edit:
I am afraid I don't know what you mean by "rotating an object along a diagonal axis". What you most likely need to do is to concatenate two or more transforms.
See Figure 5-8 in the Quartz 2D Programming Guide: the concatenation of two transforms creates the appearance of the image rotating around its lower-left corner.