NavMeshAgent.SetDestination Y axis problem - unity3d

I'm trying to create a simple movement AI for GameObjects. Each GameObject has a NavMeshAgent.
Vector3 destination = new Vector3(Random.Range(-walkArea, walkArea), 0, Random.Range(-walkArea, walkArea));
navAgent.SetDestination(destination);
That's what I'm trying to do, but my baked ground is not flat; the Y coordinate can vary by 30-40 units.
So if there are mountains around the GameObject, it just gets stuck and can't climb over them.
What can I do about it? If I just call navAgent.Move(destination), everything works fine: the GameObject teleports to the X-Z position without worrying about the Y axis.
How can I do the same thing with SetDestination?

I found a solution. Inside the main GameObject I created an empty GameObject with its own NavMeshAgent.
Vector3 destination = new Vector3(Random.Range(-walkArea, walkArea), 0, Random.Range(-walkArea, walkArea));
navDestinationObject.Move(destination);
navAgent.destination = navDestinationObject.transform.position;
navDestinationObject gets the right Y coordinate from .Move, then the main GameObject is simply sent to navDestinationObject's position.
But I think there must be a better solution...

Get the ground Y by casting a ray, where groundLayerMask is the LayerMask of your ground, so the ray can't hit the wrong colliders.
public void WalkTo(Vector3 position)
{
    // Cast from 64 units above the target straight down onto the ground layer.
    if (Physics.Raycast(position + Vector3.up * 64, Vector3.down, out RaycastHit hit, 128, groundLayerMask))
    {
        position.y = hit.point.y;
    }
    navAgent.SetDestination(position);
}
So we cast a ray from 64 units above the position to find the ground Y and then set it manually. (RaycastHit is a struct, so if nothing is hit, hit.point is just (0, 0, 0); checking the return value avoids using that bogus Y.)
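If you would rather not raycast at all, another option (a minimal sketch, not from the answer above; walkArea and the 50-unit search radius are assumptions) is NavMesh.SamplePosition, which snaps an arbitrary point to the nearest point on the baked NavMesh:
using UnityEngine;
using UnityEngine.AI;

public class RandomWalker : MonoBehaviour
{
    public float walkArea = 50f;      // assumed range, same name as in the question
    private NavMeshAgent navAgent;

    private void Awake()
    {
        navAgent = GetComponent<NavMeshAgent>();
    }

    public void PickNewDestination()
    {
        Vector3 candidate = new Vector3(Random.Range(-walkArea, walkArea), 0,
                                        Random.Range(-walkArea, walkArea));

        // Snap the flat X-Z point to the nearest spot on the baked NavMesh,
        // searching up to 50 units around it, then path to it normally.
        if (NavMesh.SamplePosition(candidate, out NavMeshHit navHit, 50f, NavMesh.AllAreas))
        {
            navAgent.SetDestination(navHit.position);
        }
    }
}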

Related

Unity 2D, rotate Vector2 to NEW Vector2 in animation

My game is 2D from a top-down-ish perspective; my character's movement Vector2 is fed into the animator blend tree to determine which direction my sprite faces:
Vector2 movement = new Vector2(Input.GetAxis("Horizontal"),
Input.GetAxisRaw("Vertical"));
anim.SetFloat("hor", movement.x);
anim.SetFloat("ver", movement.y);
However, I would like my sprite to rotate to its new target Vector2 rather than instantly switch to it. So if I was facing right and I pushed left, the movement Vector2 would travel over time to a new movementTarget Vector2, the sprite changing from facing right, to up, to left.
I cannot figure out or find a way to do this and have been on it for many hours. I've tried things like Vector3.RotateTowards and angles, but with each approach I can't get what I'm looking for, as this aspect of the math just confuses me.
Could someone point me in the right direction?
Vector2 targetMovement = new Vector2(Input.GetAxis("Horizontal"),
Input.GetAxisRaw("Vertical"));
if (targetMovement != movement) coroutine?????
I don't want to rotate the sprite image, or the object transform, just the movement Vector2 variable over time. So if I am facing right (1,0) and press left, I want the Vector to travel through (0,1 - up) then finally to (-1,0 - left) but gradually.
Managed to do it like this. Unsure if it's the best way?
// These need to survive between frames, so keep them as fields rather than locals.
float currAngle, targAngle;

void Update()
{
    Vector2 movement = new Vector2(Input.GetAxis("Horizontal"),
                                   Input.GetAxisRaw("Vertical"));
    if (movement != Vector2.zero)
        targAngle = Mathf.Atan2(movement.y, movement.x) * Mathf.Rad2Deg;

    if (Mathf.Abs(currAngle - targAngle) > 1f)
    {
        currAngle = Mathf.LerpAngle(currAngle, targAngle, 10f * Time.deltaTime);
        Vector2 newVec = new Vector2(Mathf.Cos(currAngle * Mathf.Deg2Rad),
                                     Mathf.Sin(currAngle * Mathf.Deg2Rad));
        anim.SetFloat("hor", newVec.x);
        anim.SetFloat("ver", newVec.y);
    }
}
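For what it's worth, a small variation on that (a sketch using the same assumed anim field; the turnSpeed value is arbitrary) is Mathf.MoveTowardsAngle, which turns at a constant speed instead of easing and drops the 1-degree dead-band check:
float currAngle;               // persists between frames
public float turnSpeed = 360f; // degrees per second, assumed value

void Update()
{
    Vector2 movement = new Vector2(Input.GetAxis("Horizontal"),
                                   Input.GetAxisRaw("Vertical"));
    if (movement != Vector2.zero)
    {
        float targAngle = Mathf.Atan2(movement.y, movement.x) * Mathf.Rad2Deg;
        // Step toward the target angle at a fixed angular speed, wrapping correctly past 360.
        currAngle = Mathf.MoveTowardsAngle(currAngle, targAngle, turnSpeed * Time.deltaTime);
    }

    Vector2 facing = new Vector2(Mathf.Cos(currAngle * Mathf.Deg2Rad),
                                 Mathf.Sin(currAngle * Mathf.Deg2Rad));
    anim.SetFloat("hor", facing.x);
    anim.SetFloat("ver", facing.y);
}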

Unity rotation calculation

//RAYCAST
RaycastHit hit = new RaycastHit();
if(Physics.Raycast(impact.position, bloom, out hit, 1000f, canBeShot))
{
GameObject newBulletHole = Instantiate(bulletHolePrefab, hit.point + hit.normal * 0.001f, Quaternion.identity) as GameObject;
newBulletHole.transform.LookAt(hit.point + hit.normal);
Destroy(newBulletHole, 5f);
}
//Bullet
bulletSpawnPoint = GameObject.Find("BulletSpawn").transform;
var bullet = Instantiate(bulletPrefab, bulletSpawnPoint.position, bulletSpawnPoint.rotation);
bullet.GetComponent<Rigidbody>().velocity = bulletSpawnPoint.forward * loadout[currentIndex].bulletSpeed;
I need to get the correct "bulletSpawnPoint.rotation" depending on the bullet hole created by the raycast hit. Thanks.
So you want to spawn the bullet and then use the Rigidbody to move it toward the newBulletHole GameObject.
The easy way would be to store the position of the bullet hole and then pass it into the LookAt method of the bullet's Transform before adding velocity.
A bit more advanced way (and this is what you are asking in your question, as far as I can tell): calculate the direction vector (newBulletHole.transform.position - bulletSpawnPoint.position) and then get the Quaternion (i.e. rotation) with Quaternion.LookRotation, as sketched below.
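A minimal sketch of that second approach, reusing the fields from the question (bulletSpawnPoint, bulletPrefab, bulletHolePrefab, loadout and currentIndex are assumed to be declared as shown there):
// Spawn the bullet hole exactly as before.
GameObject newBulletHole = Instantiate(bulletHolePrefab, hit.point + hit.normal * 0.001f, Quaternion.identity) as GameObject;
newBulletHole.transform.LookAt(hit.point + hit.normal);
Destroy(newBulletHole, 5f);

// Aim the bullet from the spawn point toward the hole.
Vector3 direction = (newBulletHole.transform.position - bulletSpawnPoint.position).normalized;
Quaternion aim = Quaternion.LookRotation(direction);

var bullet = Instantiate(bulletPrefab, bulletSpawnPoint.position, aim);
bullet.GetComponent<Rigidbody>().velocity = direction * loadout[currentIndex].bulletSpeed;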

Input.mousePosition.x or y equivalent on Raycast with World Space canvas

I'm trying to convert a mouse event into a VR raycast event.
I have Input.mousePosition.x and Input.mousePosition.y as the coordinates of a mouse click event. I want to apply the same event to a VR raycast hit.
I have the following code:
Ray ray = new Ray(CameraRay.transform.position, CameraRay.transform.forward);
RaycastHit hit;
if (GetComponent<Collider>().Raycast(ray, out hit, 100)) {
Debug.Log ("True");
Vector3 point = camera.WorldToScreenPoint(hit.point);
} else {
Debug.Log ("False");
}
From the Raycast hit point, how do I get the equivalent x-y coordinates of where the Ray hits the collider?
Update:
The following script is attached to a color picker object in my Unity setup for Google Cardboard. On hovering over the color picker, I want to get the coordinates of where the raycast hits the collider (so that I can get the color at that coordinate).
Question 1:
In FixedUpdate, I have an if statement, if (GetComponent<Collider>().Raycast(ray, out hit, 100)), and it's returning false. What am I missing here?
Question 2:
Am I correct to assume that if hit.point is set, I could get the x, y, z coordinates of the point where the ray hits the collider as point[0], point[1], and point[2]?
I'm not a huge expert on VR in Unity, but you would usually use Camera.WorldToScreenPoint for these purposes. To use it on the main camera, use these lines:
Ray ray = new Ray(CameraRay.transform.position, CameraRay.transform.forward);
RaycastHit hit;
if (GetComponent<Collider>().Raycast(ray, out hit, 100)) {
Vector3 point = camera.WorldToScreenPoint(hit.point);
}
The z value gives you the distance from the camera, in case you were interested in that. The Unity documentation: https://docs.unity3d.com/ScriptReference/Camera.WorldToScreenPoint.html
EDIT:
Question 1:
RaycastHit hit;
if (Physics.Raycast(CameraRay.transform.position, CameraRay.transform.forward, out hit)) {
Debug.Log ("True");
Vector3 point = camera.WorldToScreenPoint(hit.point);
} else {
Debug.Log ("False");
}
Try using this code instead; from what I've read, Physics.Raycast() is almost always preferable to Collider.Raycast().
Question 2:
float[] point = new float[3] { hit.point.x, hit.point.y, hit.point.z };
This creates an array named point with the three coordinates assigned, although the z coordinate can be discarded for use on a 2D color wheel.
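Since the stated goal is reading the picker's color at the hit, one possible shortcut (a sketch that is not part of the answer above; it assumes the color picker has a MeshCollider and a readable Texture2D) is RaycastHit.textureCoord:
RaycastHit hit;
if (Physics.Raycast(CameraRay.transform.position, CameraRay.transform.forward, out hit)) {
    // textureCoord is only filled in when the hit collider is a MeshCollider.
    Vector2 uv = hit.textureCoord;

    // Assumes the material's main texture has Read/Write enabled in its import settings.
    Texture2D tex = hit.collider.GetComponent<Renderer>().material.mainTexture as Texture2D;
    Color picked = tex.GetPixelBilinear(uv.x, uv.y);
}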

transform.LookAt() causes Y angle to increase by 180

I am trying to create third-person spaceship movement.
The spaceship rotates about all axes at its position and has a throttle control to move in its forward direction. There is a camera which is always behind it. I am not making the camera a child because I do NOT want the camera to follow the rotation about the z axis.
The camera has a script which keeps its position a fixed distance behind the spaceship and then calls transform.LookAt(spaceShipTarget).
The problem is that as I rotate the ship 90 degrees around the global x axis, the camera's y rotation suddenly jumps by 180 degrees. The camera control script is below:
using UnityEngine;

namespace UnityStandardAssets.Utility
{
    public class FollowBehind : MonoBehaviour
    {
        public Transform target;
        public float distance;
        public float delay;

        private Vector3 velocity = Vector3.zero;

        private void LateUpdate()
        {
            Vector3 offset = target.transform.TransformVector(0, 0, distance);
            Vector3 currentPosition = transform.position;
            Vector3 finalPosition = target.position + offset;
            transform.position = Vector3.SmoothDamp(currentPosition,
                finalPosition, ref velocity, delay);
            transform.LookAt(target);
        }
    }
}
Why would that happen and how can I fix it?
The rotation problem is most likely caused by the script you use to make the camera follow the spaceship: when you rotate the spaceship, the rotation (and probably the position) of the camera is affected as well.
What you could do instead is make both the spaceship and the camera children of another object, and then add a script to this parent. Code in the parent's script can move the parent itself (this way both the camera and the spaceship move together, and you don't need to keep them together manually), and it can also rotate the spaceship and the camera individually or together based on specific inputs, as in the sketch below.
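A minimal sketch of that parent-object idea (the input axis names, speeds, and field names here are assumptions, not code from the question):
using UnityEngine;

public class ShipRig : MonoBehaviour
{
    public Transform ship;              // child: the spaceship model
    public float pitchYawSpeed = 90f;   // degrees per second
    public float rollSpeed = 120f;
    public float throttleSpeed = 20f;

    private void Update()
    {
        float pitch = Input.GetAxis("Vertical");
        float yaw = Input.GetAxis("Horizontal");
        float roll = Input.GetAxis("Roll");         // assumed custom axis
        float throttle = Input.GetAxis("Throttle"); // assumed custom axis

        // Pitch and yaw rotate the whole rig, so the camera (also a child) follows them.
        transform.Rotate(pitch * pitchYawSpeed * Time.deltaTime,
                         yaw * pitchYawSpeed * Time.deltaTime,
                         0f, Space.Self);

        // Roll rotates only the ship, so the camera does not bank with it.
        ship.Rotate(0f, 0f, roll * rollSpeed * Time.deltaTime, Space.Self);

        // Throttle moves the whole rig forward.
        transform.position += transform.forward * throttle * throttleSpeed * Time.deltaTime;
    }
}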

Instantiate prefab around a object

I have a scene with a body made with MakeHuman, and I need to add a simple prefab (a torus) around the arm of the body when the user touches the arm.
I tried:
Instantiating the prefab at the point where the user touches, but the prefab appears on the surface of the arm.
Instantiating the prefab at the center of the arm, with this code:
CapsuleCollider cc = (CapsuleCollider)hit.collider; // the arm has a CapsuleCollider
float radio = cc.radius;
Ray r = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
Vector3 origin = r.origin;
float distance = (origin - hit.point).magnitude;
RaycastHit ou;
Vector3 position = hit.point;
Ray r2 = new Ray(r.GetPoint(distance + 10f), -r.direction);
if (cc.Raycast(r2, out ou, distance + 10f))
position = (hit.point + ou.point) / 2;
Instantiate(Prefab, position, Quaternion.identity);
This tries to select the center of the arm and instantiate a torus there.
The second option works in some cases, but the general impression is that this is the wrong way to do it.
How can I add a prefab around a collider? Or, how can I modify the mesh to add a visual indicator?
This should work a lot better as well as look a lot cleaner:
Vector3 center = hit.collider.bounds.center;
Instantiate(Prefab, center, Quaternion.identity);
hit.collider is the vital part of this process, and you already got that far. collider.bounds is the bounding box that surrounds the collider (http://docs.unity3d.com/ScriptReference/Collider-bounds.html), and bounds.center is the center of that bounding box (http://docs.unity3d.com/ScriptReference/Bounds-center.html). The Vector3 that bounds.center returns is where you want to spawn your prefab.
From there, you should be able to rotate the prefab to the desired angle and perform any number of operations you want.
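For example, since a default capsule collider's long axis is its transform's local Y, one way to wrap the torus around the arm (a sketch; it assumes the torus prefab is modelled with its hole along its local up axis) might be:
// Spawn the torus at the collider's center and align it with the arm's long axis.
Vector3 center = hit.collider.bounds.center;
Quaternion alignment = Quaternion.FromToRotation(Vector3.up, hit.collider.transform.up);
Instantiate(Prefab, center, alignment);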