I'm using iTween.MoveTo and, for the "path" argument, I give an array of transforms. The transforms are placed on a sphere, though. The object moves very nicely over the path, but it stays oriented upwards instead of being properly oriented on the sphere.
I tried using the "lookat" argument and giving it the center of my sphere, and that works (after tweaking my character a bit so that if it looks at the center it's actually standing on it), but then he won't look ahead on the path.
So, is there any way I can make iTween take the up vector of the transforms into account?
Thanks!
One way to do this is to disable the "lookat" and turn a flag on while the iTween is happening (I suggest a coroutine that uses WaitForSeconds for the duration of the tween).
Then, in your Update/FixedUpdate routine, you can use the following code:
outSideObjectTransform.forward = outSideObjectTransform.position - lookTarget.position;
This will make the outside object have its forward axis pointing away from the center object. The same can be done for any of the axes.
Also, if you want the exact opposite (so that the object looks towards its "look target"), just reverse the order of the subtraction, i.e. lookTarget.position - outSideObjectTransform.position.
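A minimal sketch of that approach in Unity C#, assuming you know the tween's duration and have a reference to the sphere's center (sphereCenter and the coroutine wiring are illustrative, not part of iTween):

```csharp
using System.Collections;
using UnityEngine;

public class OrientOnSphere : MonoBehaviour
{
    public Transform sphereCenter;   // assumed reference to the center of the sphere
    bool tweenRunning;

    // Start this together with iTween.MoveTo, passing the same duration.
    public IEnumerator TrackTween(float duration)
    {
        tweenRunning = true;
        yield return new WaitForSeconds(duration);
        tweenRunning = false;
    }

    void Update()
    {
        if (!tweenRunning) return;
        // Point the chosen axis away from the sphere's center;
        // use transform.up instead if your model "stands" on its up axis.
        transform.forward = transform.position - sphereCenter.position;
    }
}
```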
I hope this helps.
I am trying to get a stretched out cube (which we can call a plane for the sake of discussion) to orient itself to the normal vector of a plane described by three points. I wrote a script to find the normal of three points, and then used transform.LookAt to have the planes align. However, I am finding that this script is not working at all how it is intended to and despite my best efforts I can not figure out why.
Drastic movements of the individual points hardly affect the plane's rotation.
The rotation of the object, when using the existing points in the script, should be 0,0,0 in the inspector. However, it is always off by a few degrees and, as I said, it does not align itself when I move the points around.
This is the script. I can also post photos showing the behavior or share a small Unity package.
First of all, Transform.LookAt takes a position as a parameter, not a direction!
And then it
Rotates the transform so the forward vector points at worldPosition.
Doesn't sound like what you are trying to achieve.
If you want your object to look with its forward vector in the given normal direction (assuming you are calculating the normal correctly), then you could rather use Quaternion.LookRotation:
transform.rotation = Quaternion.LookRotation(doNormal(cpit, cmit, ctht));
Alternatively, you can also simply assign the corresponding vector directly, e.g.
transform.forward = doNormal(cpit, cmit, ctht);
or
transform.up = doNormal(cpit, cmit, ctht);
depending on your needs.
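Putting that together, a minimal sketch (the DoNormal helper here is a stand-in for your own doNormal, assuming it returns the normal of the plane through the three points):

```csharp
using UnityEngine;

public class AlignToNormal : MonoBehaviour
{
    public Transform cpit, cmit, ctht;   // the three points defining the plane

    // Stand-in for your doNormal: normal of the plane through three points.
    static Vector3 DoNormal(Vector3 a, Vector3 b, Vector3 c)
    {
        return Vector3.Cross(b - a, c - a).normalized;
    }

    void Update()
    {
        Vector3 normal = DoNormal(cpit.position, cmit.position, ctht.position);
        // Align the stretched cube's up axis with the plane normal.
        transform.up = normal;
    }
}
```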
I've been looking for a solution to this for quite a while now (meaning several days) and I haven't found anything yet. Maybe I'm thinking about it wrong and there isn't a way, but let's try!
I'm recording hand-data on a Hololens (the Unity Hololens Input Simulation for now). This essentially gives me one float AnimationCurve for each hand joint for each transform.position.x to z and rotation.x to w. Now my goal is to put these curves into an AnimationClip and add it to an AnimatorController (via an AnimatorOverrideController) that animates a hand rig and replay the recordings. Everything so far works!
However, the recorded hand-data from the Hololens is in world space, not in local space (which makes sense, since you usually want absolute coordinates when you want to know where the hand is). But to animate the hand, it seems I'm only able to set local coordinates, which I don't have.
Example:
clip.SetCurve("", typeof(Transform), "localPosition.x", curve.PositionX);
Here, the clip takes the x-coordinates from some hand joint and puts them into the localPosition.x of the corresponding hand-rig joint. The problem: curve.PositionX is in world space (absolute coordinates), but localPosition.x expects local space (coordinates relative to its parent).
I can't simply change "localPosition.x" to "position.x", like so:
clip.SetCurve("", typeof(Transform), "position.x", curve.PositionX);
even though the Transform class has both properties and position is the object's world-space position. I'm not sure why this doesn't work, but it gives me the following error:
Cannot bind generic curve on Transform component, only position, rotation and scale curve are supported.
I'm aware that it doesn't make much sense to use absolute coordinates for an animation, but I simply don't have anything else.
Does anyone have an approach how I can deal with this in a sensible, not-too-cumbersome way? It seems I have all the important parts, I just can't figure out how to put them together. Thanks so much already! :)
From my basic understanding, it seems like you are using the Input animation recording service provided by MRTK. Unfortunately, MRTK does not provide a localPosition version of the curve data. However, you can modify the data from the recordingBuffer after the InputRecordingService stops recording.
So this is a method worth trying: in the handJointCurves dictionary property of the recordingBuffer field, a set of pose curves is stored for each joint. Then, based on this table: Joint pose curves, subtract the position value of the key None from the position value of every other joint in each keyframe, so that you obtain a localPosition relative to the key None.
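I'm not certain of the exact MRTK types involved, but the per-keyframe subtraction itself is straightforward once you have the joint curve and the reference (None) curve as plain AnimationCurves; a rough sketch:

```csharp
using UnityEngine;

static class CurveUtil
{
    // Build a "local" curve by subtracting the reference joint's curve
    // (the None entry in the recording buffer) from a joint's curve,
    // sampled at the joint curve's keyframe times. Assumes both curves
    // cover the same time range; tangents are left at their defaults.
    public static AnimationCurve SubtractReference(AnimationCurve joint, AnimationCurve reference)
    {
        var result = new AnimationCurve();
        foreach (Keyframe key in joint.keys)
        {
            result.AddKey(key.time, key.value - reference.Evaluate(key.time));
        }
        return result;
    }
}
```

Note that this only removes the reference joint's translation; if you also need to undo its rotation, you would have to rotate each offset by the inverse of the reference joint's rotation at every keyframe.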
I am creating a 2d sidescroller game. I have a point in space (where the mouse is) and I need the weapon to look and "follow" that point.
Does anyone know where to begin?
wikihow: How to Find the Angle Between Two Vectors
After you have the angle, you can appropriately rotate the thing to be rotated.
I am not sure if JavaScript also has an atan2(y, x) function, which could be used to get the angle.
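In Unity you can let Atan2 do the work directly; a minimal C# sketch, assuming an orthographic 2D camera and a weapon whose barrel points along its local +x axis:

```csharp
using UnityEngine;

public class AimAtMouse : MonoBehaviour
{
    void Update()
    {
        // Mouse position in world space (fine for an orthographic 2D camera).
        Vector3 mouseWorld = Camera.main.ScreenToWorldPoint(Input.mousePosition);
        Vector2 dir = (Vector2)mouseWorld - (Vector2)transform.position;

        // Angle between the +x axis and the aim direction, in degrees.
        float angle = Mathf.Atan2(dir.y, dir.x) * Mathf.Rad2Deg;
        transform.rotation = Quaternion.Euler(0f, 0f, angle);
    }
}
```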
I am tearing my hair out trying to figure out what seems to be a very easy problem. I know a lot of this stuff has been talked about tangentially, so apologies if this treads on well-covered ground, but I can't find anything specific to my solution (believe me, I've looked).
Basically I want to drag an object/sprite along a pre-defined, curved path (not just move it, but DRAG IT). Think of the iPhone's "Slide to unlock" thing, but instead of just dragging the slider left-to-right, make the path an arc or a wavy line.
My basic thinking was:
define a bezier path, set the object at the start point.
if the object is touched, check for hit detection on the bezier path in touchesMoved (or some similar function). if touches stay on the path, advance the sprite along the path until the path ends (in which case, task is finished) or the user's finger goes off the path (in which case, the object should go back to the beginning).
None of this is trivial (at least, that's how it seems). For example:
Doing hit detection on a Bezier path is a royal pain since you actually need to do it on the stroked portion, not the fill portion. And even then, I can't seem to find a way to do it on a path of any width -- only on the 1-point-wide path of the Bezier.
Moving an object partially along a path doesn't even seem possible: all of the animation methods move the sprite along the ENTIRE path. Also, doing this requires you to find the point on the path closest to the user's touch, which, if you've ever looked it up, involves astoundingly complicated math.
I've thought of using rigid bodies to occupy all of the space EXCEPT the path, so the object can only move in the path. However, this requires the definition of curved rigid bodies some of which must be concave. Dead end.
Am I making this too hard? It doesn't seem that complicated. I don't need a whole solution, just a new way to think about this and kick in the right direction. Any help would be really appreciated.
How about this?
Consider the X axis of your bezier path.
Each time the user taps or interacts with the screen, just look at the x portion of the touch.
Map that x coordinate to your path and move the object to the corresponding position.
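A rough sketch of that x-mapping idea in Unity-style C# (assuming the path is a single cubic bezier whose x increases monotonically with t; the touch handling itself is omitted):

```csharp
using UnityEngine;

public class SlideAlongCurve : MonoBehaviour
{
    // Control points of the cubic bezier path (assumed monotonic in x).
    public Vector2 p0, p1, p2, p3;

    static Vector2 Bezier(Vector2 a, Vector2 b, Vector2 c, Vector2 d, float t)
    {
        float u = 1f - t;
        return u * u * u * a + 3f * u * u * t * b + 3f * u * t * t * c + t * t * t * d;
    }

    // Find the parameter t whose x matches the touch x (by bisection),
    // then snap the object to that point on the curve.
    public void MoveToTouchX(float touchX)
    {
        float lo = 0f, hi = 1f;
        for (int i = 0; i < 20; i++)
        {
            float mid = 0.5f * (lo + hi);
            if (Bezier(p0, p1, p2, p3, mid).x < touchX) lo = mid; else hi = mid;
        }
        transform.position = Bezier(p0, p1, p2, p3, 0.5f * (lo + hi));
    }
}
```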
Yes, you are making this too hard.
Take the simplification suggested above (or along a circle, line, etc.) if it works for you; or, if you really want to do it against a bézier curve, consider the following:
Look at the definition of the bézier curve
What you're looking for is to define a new object position P' from a current position P and a change in touch position D.
If you rephrase the original P(x,y) in terms of t (bézier curves are parametric), then the problem becomes finding how much t offset to add based on D.
Something involving the differential of the bézier function at P might be a good way to do that. I.e., how much t would have been added had the curve just been a straight line coming from point P along the curve.
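A rough sketch of that differential idea for a single cubic segment (Unity-style C#; the control-point names and the caller's clamping/segment handling are assumptions):

```csharp
using UnityEngine;

public static class BezierDrag
{
    // First derivative (tangent) of a cubic bezier with control points a, b, c, d.
    static Vector2 Tangent(Vector2 a, Vector2 b, Vector2 c, Vector2 d, float t)
    {
        float u = 1f - t;
        return 3f * u * u * (b - a) + 6f * u * t * (c - b) + 3f * t * t * (d - c);
    }

    // Project the drag delta onto the tangent at the current t and turn it
    // into a change in t, as if the curve were locally a straight line.
    // The caller clamps t or hands off to the next segment when t leaves [0, 1).
    public static float Advance(Vector2 a, Vector2 b, Vector2 c, Vector2 d,
                                float t, Vector2 dragDelta)
    {
        Vector2 tangent = Tangent(a, b, c, d, t);
        return t + Vector2.Dot(dragDelta, tangent) / tangent.sqrMagnitude;
    }
}
```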
EDIT:
Transition between segments:
If each segment has t in [0,1), then you can detect t >= 1 and move on to the next segment, setting P to the end of the previous segment, and evaluating the movement again in relation to that point. There might have to be some heuristics involved if you have a lot of small points, etc.
I'm playing with OpenGL ES on iPhone and I'm trying to rotate a model by panning with the finger. I discovered the open source app Molecules that lets you do that, and I'm looking at that code, but when it comes to rotating a model of mine I'm only able to rotate it around a distant point in space (as if it were in orbit like a satellite and I were the fixed planet).
Any suggestion on what can be wrong?
I can post the code later, maybe on demand (many lines).
For the most part, refer to Molecules; you can find it here: MOLECULES
If my memory serves me correctly, I think you need to translate the model to the origin, rotate, and then translate back to the starting position to get the effect you are after.
I think there is a glTranslate() function. Say the object is at (1, 0, 0); you should then translate by (-1, 0, 0) to go to the origin. That is, translate by a vector going from the center of the object to the origin.
The draw code probably looks roughly like this:
glLoadIdentity();
glTranslate(0, 0, -10);
glRotate(...);
drawMolecule();
Now it's important to realize that these transformations are applied in reverse order. If, in drawMolecule, we specify a vertex, then this vertex will first be rotated about the axis given to glRotate (which by definition passes through the local origin of the molecule), and then be translated 10 units in the −z direction.
This makes sense, because glTranslate essentially means: "translate everything that comes after this". This includes the glRotate call itself, so the result of the rotation also gets translated. Had the calls been reversed, then the result of the translation would have been rotated, which results in a rotation about an axis that does not pass through the origin anymore.
Bottom line: to rotate an object about its local origin, put the glRotate call last.