I have two GameObjects, A and B. They are rotated by 90 degrees, which makes their local y axis face forward.
1st Case
In this case, the local y position of B is ahead of the local y position of A.
2nd Case
Even though their global positions are the same as in the 1st case, here the local y position of A is ahead of the local y position of B.
I tried comparing A.transform.localPosition.y and B.transform.localPosition.y to find which is greater, but it doesn't work. What can I do to find which object is in front in these two different cases?
Vector projections are your friend here. Project both positions onto a line and compare their magnitudes (or square magnitudes, which is faster).
Case 1:
Vector3 a = Vector3.Project(A.position, Vector3.up);
Vector3 b = Vector3.Project(B.position, Vector3.up);
if (a.sqrMagnitude > b.sqrMagnitude)
{
    // a is ahead
}
else
{
    // b is ahead
}
Case 2: Project both positions onto Vector3.left.
Maybe you can even always simply project the two positions onto one of the two objects' forward vector (A.forward or B.forward assuming they're rotated equally).
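A minimal sketch of that idea, assuming A and B are Transforms with identical rotations; it compares the signed distance along the axis so it also behaves correctly when a position projects behind the world origin:
Vector3 forward = A.forward; // same as B.forward when the rotations match
Vector3 a = Vector3.Project(A.position, forward);
Vector3 b = Vector3.Project(B.position, forward);
// Compare signed distances along 'forward' instead of raw magnitudes.
float aAlongForward = Vector3.Dot(a, forward);
float bAlongForward = Vector3.Dot(b, forward);
if (aAlongForward > bAlongForward)
{
    // A is ahead
}
else
{
    // B is ahead
}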
Hope this helps.
You could compare Vector3.Dot(A.position, A.forward) and Vector3.Dot(B.position, B.forward) to find the one in front in relation to their forward.
The object with the bigger Dot product is in front, and this works in all rotations, including 3D ones.
You can use the following snippet to test for yourself:
// Assign these values on the Inspector
public Transform a, b;
public float RotationZ;
void Update() {
    a.eulerAngles = new Vector3(0, 0, RotationZ);
    b.eulerAngles = new Vector3(0, 0, RotationZ);
    Debug.DrawRay(a.position, a.right, Color.green);
    Debug.DrawRay(b.position, b.right, Color.red);
    var DotA = Vector2.Dot(a.position, a.right);
    var DotB = Vector2.Dot(b.position, b.right);
    if (DotA > DotB) { Debug.Log("A is in front"); }
    else { Debug.Log("B is in front"); }
}
This is my first time posting on here. I'm working on a game using the new Unity multiplayer networking solution.
In summary, the issue is that the player is not moving as intended.
I am trying to take player input as follows:
Vector3 worldSpaceDir = new Vector3(Input.GetAxisRaw("Vertical"), 0, Input.GetAxisRaw("Horizontal"));
then convert it to the object space coordinates of the player character:
_inputDirection = transform.InverseTransformDirection(worldSpaceDir);
The issue I'm having is that with a rotation of 0 or 180 the player moves as expected with the WASD inputs; however, at 90 or 270 all the inputs are flipped (A = right, D = left, W = backward, S = forward).
I found a question that asks exactly what I'm asking, but no one responded with an answer. The question is quite old now, so I wanted to ask it again for more visibility.
Here's a link to the original question.
Firstly, you are building worldSpaceDir wrong; it should be as follows:
Vector3 worldSpaceDir = new Vector3(Input.GetAxisRaw("Horizontal"), 0, Input.GetAxisRaw("Vertical"));
Here we take the horizontal input as X and the vertical input as Z, because in Unity forward points along Z, not Y.
Secondly, we do not need to use InverseTransformDirection(); we just need TransformDirection(), something like the following:
Vector3 inputDirection = transform.TransformDirection(worldSpaceDir);
Here we are telling Unity to convert worldSpaceDir, which is relative to the transform (a local direction), into a world-space direction, so we might as well give worldSpaceDir a more appropriate name.
The following would work for you.
private void Update() {
    Move();
}
private void Move() {
    Vector3 directionToMove = new Vector3(Input.GetAxisRaw("Horizontal"), 0, Input.GetAxisRaw("Vertical"));
    Vector3 inputDirection = transform.TransformDirection(directionToMove);
    transform.position += inputDirection * Time.deltaTime;
}
I think you want to go the other way round actually!
Transform.InverseTransformDirection converts a vector from world space into local space.
What you get as input, however, is a local vector on the XZ plane. You want to apply this direction according to your player object's orientation; e.g. when pressing right (your input is 1,0,0), the object should move towards its transform.right vector.
So you rather want to convert in the opposite direction, from local into world space, in order to move the object in Unity's world space.
You should rather be using Transform.TransformDirection!
var worldMove = transform.TransformDirection(input);
Or alternatively you can also just multiply by the rotation like
var worldMove = transform.rotation * input;
Note that if you are also using a Rigidbody like the question you linked there is also Rigidbody.AddRelativeForce which basically works the same way and expects a local space vector which is then internally converted into a world space force.
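For illustration, a minimal sketch of that Rigidbody variant inside a MonoBehaviour; the rb and moveForce fields are assumptions, not part of the question's code:
public Rigidbody rb; // assumed reference to the player's Rigidbody
public float moveForce = 10f; // assumed tuning value
private void FixedUpdate()
{
    // Local-space input on the XZ plane, same convention as above.
    Vector3 localInput = new Vector3(Input.GetAxisRaw("Horizontal"), 0, Input.GetAxisRaw("Vertical"));
    // AddRelativeForce expects a local-space vector and converts it to world space internally.
    rb.AddRelativeForce(localInput * moveForce);
}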
I am working on a simple AR engine and I am having a problem with matching 3d object with a camera image.
For better understanding, I illustrated it with the picture. Points A and B are in 3d space. Points C and D are given on a texture plane. The distance to the plane from the camera is known.
I know how to obtain the coordinates of Anear, Bnear, Afar, Bfar, Cnear, Dnear, Cfar and Dfar.
The problem is how to find points A' and B' in 3d space such that vector d == d' and points Anear == Cnear and Bnear == Dnear (the projection of the 3d points to the screen should result in the same coordinates).
Could anyone please help me with the math here, or at least point me to where to look for the answer?
PS. It seems my problem description is not clear enough, so to put it in other words: I have a pair of points in 3d space and a pair of points on the texture plane (an image from a webcam). I need to put the points in 3d space at the correct distance from the camera, so that after the perspective transformation they overlay the points on the texture plane. The spatial relation of the 3d points needs to be preserved. In the drawing, the visual solution is the pair of points A' and B'. The dashed line illustrates the perspective transformation (where they are cast onto the near plane at the same locations as points C and D).
So if I understand correctly, the given points in world space are A, B, C and D; also known are the distance d and, implicitly, the Camera.position origin and the Camera.transform.forward direction. Searched are A' and B'.
As I understand it, you could find the first point A' as the intersection point of the line (origin = A, direction = camera forward) and the line (origin = Camera.position, direction = Camera.position -> C).
Similarly, the second point B' is the intersection point of the line (origin = B, direction = camera forward) and the line (origin = Camera.position, direction = Camera.position -> D).
There are some handy community Math3d helpers that come to the rescue here, e.g.:
//Calculate the intersection point of two lines. Returns true if lines intersect, otherwise false.
//Note that in 3d, two lines do not intersect most of the time. So if the two lines are not in the
//same plane, use ClosestPointsOnTwoLines() instead.
public static bool LineLineIntersection(out Vector3 intersection, Vector3 linePoint1, Vector3 lineVec1, Vector3 linePoint2, Vector3 lineVec2)
{
    Vector3 lineVec3 = linePoint2 - linePoint1;
    Vector3 crossVec1and2 = Vector3.Cross(lineVec1, lineVec2);
    Vector3 crossVec3and2 = Vector3.Cross(lineVec3, lineVec2);
    float planarFactor = Vector3.Dot(lineVec3, crossVec1and2);
    //is coplanar, and not parallel
    if (Mathf.Abs(planarFactor) < 0.0001f && crossVec1and2.sqrMagnitude > 0.0001f)
    {
        float s = Vector3.Dot(crossVec3and2, crossVec1and2) / crossVec1and2.sqrMagnitude;
        intersection = linePoint1 + (lineVec1 * s);
        return true;
    }
    else
    {
        intersection = Vector3.zero;
        return false;
    }
}
So you could probably do something like
public static bool TryFindPoints(Vector3 cameraOrigin, Vector3 cameraForward, Vector3 A, Vector3 B, Vector3 C, Vector3 D, out Vector3 AMapped, out Vector3 BMapped)
{
    AMapped = default;
    BMapped = default;
    if (LineLineIntersection(out AMapped, A, cameraForward, cameraOrigin, C - cameraOrigin))
    {
        if (LineLineIntersection(out BMapped, B, cameraForward, cameraOrigin, D - cameraOrigin))
        {
            return true;
        }
    }
    return false;
}
and then use it like
// assuming the main camera here
if (TryFindPoints(Camera.main.transform.position, Camera.main.transform.forward, A, B, C, D, out var aMapped, out var bMapped))
{
    // do something with aMapped and bMapped
}
else
{
    Debug.Log("It was mathematically impossible to find valid points");
}
Note: Typed on a smartphone, but I hope the idea is clear.
Given K the camera position, and writing X = A' and Y = B', this is essentially the law of sines applied to the triangle formed by K, A' and B':
var angleK = Vector3.Angle(C - K, D - K);
var angleB = Vector3.Angle(D - K, A - B);
// Vector3.Angle returns degrees, but Mathf.Sin expects radians.
var XK = Mathf.Sin(angleB * Mathf.Deg2Rad) * Vector3.Distance(A, B) / Mathf.Sin(angleK * Mathf.Deg2Rad);
var X = K + (C - K).normalized * XK;
var Y = B + X - A;
As per my game requirements, I am applying a manual force when two cars collide with each other, so that they bounce back.
So I want the correct code to achieve this. Here is an example of the collision response that I want to get:
As per my understanding, I have written this code:
Vector3 reboundDirection = Vector3.Normalize(transform.position - other.transform.position);
reboundDirection.y = 0f;
int i = 0;
while (i < 3)
{
    myRigidbody.AddForce(reboundDirection * 100f, ForceMode.Force);
    appliedSpeed = speed * 0.5f;
    yield return new WaitForFixedUpdate();
    i++;
}
I am moving my cars using this code:
//Move the player forward
appliedSpeed += Time.deltaTime * 7f;
appliedSpeed = Mathf.Min(appliedSpeed, speed);
myRigidbody.velocity = transform.forward * appliedSpeed;
Still, as per my observation, I am not getting the collision response in the proper direction. What is the correct way to get the collision response shown in the reference image above?
Until you clarify why you have to use manual forces, or how you handle the forces generated by the Unity engine, I would like to stress one problem in your approach. You calculate the direction based on positions, but those positions are the centers of your cars. Therefore, you are not getting a correct direction, as you can see from the image below:
You calculate the direction between the two pivot or center points; therefore, your force is a bit tilted in the left image. Instead of this, you can use ContactPoint and then calculate the direction.
To give more detail so that the OP can understand what I said: in the above image you can see the region marked with the blue rectangle. You will get all the contact points for the corresponding region using Collision.contacts,
then calculate the center point, or centroid, like this:
Vector3 centroid = new Vector3(0, 0, 0);
foreach (ContactPoint contact in col.contacts)
{
    centroid += contact.point;
}
centroid = centroid / col.contacts.Length;
This is the center of the rectangle. To find the direction, you need to find its projection on your car, like this:
Vector3 projection = gameObject.transform.position;
projection.x = centroid.x;
gameObject.GetComponent<Rigidbody>().AddForce((projection - centroid) * 100, ForceMode.Impulse);
Since I do not know your setup, I just took the y and z values from the car's position and the x value from the centroid; therefore you get a straight blue line, not an arrow tilted to the left like in the first image, even in case two of the second image. I hope I am being clear.
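Putting the pieces together, here is a hedged sketch of how this could look inside a collision callback. The OnCollisionEnter placement and the reboundForce field are assumptions for illustration, and it pushes the car away from the contact centroid rather than using the axis-specific projection above; adapt it to your setup.
public float reboundForce = 100f; // assumed tuning value
private void OnCollisionEnter(Collision col)
{
    // Average all contact points to get the centroid of the contact region.
    Vector3 centroid = Vector3.zero;
    foreach (ContactPoint contact in col.contacts)
    {
        centroid += contact.point;
    }
    centroid /= col.contacts.Length;
    // Push this car away from the contact centroid, keeping the force horizontal.
    Vector3 reboundDirection = transform.position - centroid;
    reboundDirection.y = 0f;
    reboundDirection.Normalize();
    GetComponent<Rigidbody>().AddForce(reboundDirection * reboundForce, ForceMode.Impulse);
}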
I am using the MVN-Unity plug-in to access Xsens motion capture data and animate a Unity character in real-time. The character in white and green and yellow is a Unity skeleton that is animated based on Xsens motion data.
I am now trying to animate a different (non-Unity) character using Xsens (the other character that looks like a human) so similar to what the plug-in does, the motion data (positions & orientations) are being mapped to his joints/bones.
But as you can see below, something is wrong with orientations...
I think the reason might be that the rotations from MVN are not properly offset. As you can see in the next two pictures, the MVN hips have the x-axis (red) pointing to the puppet's backside, whereas for the guy's hips, the x-axis points to the right of him.
It might also be that the plug-in is using global rotations somewhere it should use local rotations. That this must be the case can be demonstrated by rotating the guy before starting the Unity app: select the guy's root game object, try setting the y-rotation to 0/90/180/270 before pressing play, and compare the results; the distortions are different every time.
I don't know how to properly fix this. The code snippet that updates the Unity model (mapped to the MVN puppet or the guy) is as follows. I took this from the plug-in scripts:
private void updateModel(Transform[] pose, Transform[] model)
{
    // Re-set the target, then set it up based on the segments.
    Vector3 pelvisPos = new Vector3();
    Vector3 lastPos = target.position;
    target.position = Vector3.zero;
    // Map only 23 joints.
    for (int i = 0; i < 23; i++)
    {
        switch (i)
        {
            // Position only on y axis, and leave x and z to the body. Apply the 'global' position & orientation to the pelvis.
            case (int)XsAnimationSegment.Pelvis:
                pelvisPos = pose[i].position * scale;
                model[i].position = new Vector3(model[i].position.x, pelvisPos.y, model[i].position.z);
                model[i].rotation = pose[i].rotation * modelRotTP[i];
                break;
            // Update only the 'orientation' for the rest of the segments.
            default:
                if (model[i] != null)
                {
                    model[i].rotation = pose[i].rotation * modelRotTP[i];
                }
                break;
        }
    }
    // Apply root motion if the flag is enabled; i.e. true.
    if (applyRootMotion)
    {
        // Only update x and z, since pelvis is already modified by y previously.
        target.position = new Vector3(pelvisPos.x + pelvisPosTP.x, lastPos.y, pelvisPos.z + pelvisPosTP.z);
    }
    // Set the final rotation of the full body, but only position it to face similar as the pelvis.
    Quaternion q = Quaternion.Inverse(modelRotTP[(int)XsAnimationSegment.Pelvis]) * model[(int)XsAnimationSegment.Pelvis].rotation;
    target.rotation = new Quaternion(target.rotation.x, q.y, target.rotation.z, target.rotation.w);
}
I sort of understand what the code does, but I don't know how to fix this problem. Most probably to do with the axes being different? I would appreciate any help...
You can modify the XsLiveAnimator.cs script at line 496 with this:
model[segmentOrder[i]].transform.rotation = orientations[segmentOrder[i]];
model[segmentOrder[i]].transform.Rotate(rotOffset, Space.World);
rotOffset is a Vector3 containing your rotation offset (Euler angles).
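For context, a hedged sketch of how that might sit in the script; the field name matches the snippet above, but the example offset value is an assumption you would tune for your character's rig:
public Vector3 rotOffset = new Vector3(0, 90, 0); // assumed example offset, tune per rig
// Around line 496 of XsLiveAnimator.cs, inside the segment-update loop:
model[segmentOrder[i]].transform.rotation = orientations[segmentOrder[i]];
model[segmentOrder[i]].transform.Rotate(rotOffset, Space.World);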
I have 2 objects. They can be in various directions and at various distances from each other.
How can I instantiate objects between them, spaced at a specific distance?
var centerLocation : Vector3 = Vector3.Lerp(object1.transform.position, object2.transform.position, 0.5);
Vector3.Lerp will determine the Vector3 location between 2 Vector3s at a specified percentage. 0.5 = 50%.
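In C#, a minimal sketch of the same idea inside a MonoBehaviour; the field names and the spawned prefab are assumptions for illustration:
public Transform object1;
public Transform object2;
public GameObject prefab; // assumed prefab to spawn at the midpoint
void Start()
{
    // Lerp with t = 0.5 returns the point halfway between the two positions.
    Vector3 centerLocation = Vector3.Lerp(object1.position, object2.position, 0.5f);
    Instantiate(prefab, centerLocation, Quaternion.identity);
}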
My suggestion would be to calculate the vector between the two objects, like this
Vector3 objectLine = (object2.transform.position - object1.transform.position);
Store the magnitude of that vector
float distance = objectLine.magnitude;
Then, normalise the vector;
objectLine = objectLine.normalized;
Iterate along the line, instantiating the object you want to create at specific distances:
Vector3 creationPoint = object1.transform.position;
float creationPointDistance = (creationPoint - object1.transform.position).magnitude;
while (creationPointDistance < distance)
{
    creationPoint += objectLine * NEW_OBJECT_DISTANCE;
    creationPointDistance = (creationPoint - object1.transform.position).magnitude;
    if (creationPointDistance < distance)
    {
        objects.Add((GameObject)Instantiate(newObject, creationPoint, Quaternion.identity));
    }
}
What that will do is set the initial point to be object1's position. It will then move a set distance along the vector between object 1 and object 2, check the point is still between the two objects, and if it is, instantiate the object, storing it in a list of GameObjects.
That hopefully should do it. I don't have Unity (or any IDE) in front of me to check the syntax.
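For completeness, the snippet above assumes a few declarations along these lines (the names mirror the snippet; the values are placeholders):
public GameObject newObject; // prefab to instantiate
public float NEW_OBJECT_DISTANCE = 1.0f; // assumed spacing between spawned objects
private List<GameObject> objects = new List<GameObject>(); // needs using System.Collections.Generic;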