Unity - Debug.DrawRay keeps drawing lines on every Update call

I have a skeleton game object which I move by setting its velocity in the Update function, like this:
body.velocity = new Vector2((isFacingLeft ? -1 : 1) * speed, body.velocity.y);
The movement is also controlled by a direction flag, isFacingLeft, which determines whether the skeleton moves left or right:
int direction = isFacingLeft ? -1 : 1;
But since my skeleton walks on a platform, I want to use a raycast to detect when there is no platform ahead (so the cast points diagonally down in front of the skeleton). I prepare a few variables to get a proper origin for the raycast; then, if no floor is found, the isFacingLeft variable is flipped, so the skeleton turns around as it approaches an edge:
Vector2 beginFloorCast = new Vector2(transform.position.x + ((enemyWidth / 2) * direction), transform.position.y - enemyHeight / 2);
Vector2 floorCastDirection = transform.TransformDirection(new Vector2(1 * (direction), -1));
float floorCastLength = 1f;
RaycastHit2D hit = Physics2D.Raycast(beginFloorCast, floorCastDirection, floorCastLength);
if (!hit) {
    isFacingLeft = !isFacingLeft;
}
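(For reference, here is a minimal sketch of how these snippets might fit together in a single Update pass; field names such as body, speed, enemyWidth, and enemyHeight are assumed from the code above.)
void Update() {
    int direction = isFacingLeft ? -1 : 1;
    body.velocity = new Vector2(direction * speed, body.velocity.y);

    // Cast diagonally down in front of the skeleton to look for floor
    Vector2 beginFloorCast = new Vector2(transform.position.x + (enemyWidth / 2) * direction,
                                         transform.position.y - enemyHeight / 2);
    Vector2 floorCastDirection = transform.TransformDirection(new Vector2(direction, -1));
    float floorCastLength = 1f;

    RaycastHit2D hit = Physics2D.Raycast(beginFloorCast, floorCastDirection, floorCastLength);
    if (!hit) {
        isFacingLeft = !isFacingLeft; // no floor ahead: turn around
    }
}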
While this all works, I am now trying to draw a debug ray so I can visualize, while debugging, how long the floor ray is (and so that in the future I can use another ray to detect whether the player is in range of the skeleton), so I use the following:
Debug.DrawRay(beginFloorCast, floorCastDirection.normalized * floorCastLength, Color.green, 5f);
This also works, but in the Scene view a new ray is drawn for every Update call, and the old ones stay visible.
Is there a way for me to clear the debug rays after the Update function is done? Or what could I change in my code so that the ray is only drawn once and follows the skeleton?

Related

Unintended player movement from transform.InverseTransformDirection

This is my first time posting on here. I'm working on a game using the new Unity multiplayer networking solution.
In summary, the issue is that the player is not moving as intended.
I am trying to take player input as follows:
Vector3 worldSpaceDir = new Vector3(Input.GetAxisRaw("Vertical"), 0, Input.GetAxisRaw("Horizontal"));
then convert it to the object space coordinates of the player character:
_inputDirection = transform.InverseTransformDirection(worldSpaceDir);
The issue I'm having is that with a rotation of 0 or 180 the player moves as expected with the WASD inputs; however, at 90 or 270 all the inputs are flipped (A = right, D = left, W = backward, S = forward).
I found a question that is exactly my question but no one responded with an answer. The question is quite old now so I wanted to ask it again for more visibility.
Here's a link to the original question.
Firstly, you are constructing worldSpaceDir wrong; it should be as follows:
Vector3 worldSpaceDir = new Vector3(Input.GetAxisRaw("Horizontal"), 0, Input.GetAxisRaw("Vertical"));
Here we take the horizontal input as X and the vertical input as Z, because in Unity, forward points along Z, not Y.
Secondly, we do not need InverseTransformDirection(); we just need TransformDirection(), something like the following:
Vector3 inputDirection = transform.TransformDirection(worldSpaceDir);
Here we are telling Unity to convert worldSpaceDir, which is relative to the transform (a local direction), into a world-space direction, so we should really give worldSpaceDir a more fitting name.
The following would work for you.
private void Update() {
    Move();
}

private void Move() {
    // Raw WASD/arrow-key input on the XZ plane (X = horizontal, Z = vertical)
    Vector3 directionToMove = new Vector3(Input.GetAxisRaw("Horizontal"), 0, Input.GetAxisRaw("Vertical"));
    // Convert this local direction into world space based on the player's rotation
    Vector3 inputDirection = transform.TransformDirection(directionToMove);
    // Move one unit per second in that direction (scale by a speed field if needed)
    transform.position += inputDirection * Time.deltaTime;
}
I think you want to go the other way round actually!
Transform.InverseTransformDirection converts a vector from world space into local space.
What you get as input, however, is a local vector on the XZ plane. You want to apply this direction according to your player object's orientation: if, for example, you press right (your input is (1,0,0)), the object should move along its transform.right vector.
So you actually want to convert in the opposite direction, from local into world space, to move the object in Unity's world space.
You should rather be using Transform.TransformDirection!
var worldMove = transform.TransformDirection(input);
Alternatively, you can also just multiply by the rotation, like
var worldMove = transform.rotation * input;
Note that if you are also using a Rigidbody, as in the question you linked, there is also Rigidbody.AddRelativeForce, which basically works the same way: it expects a local-space vector, which is then internally converted into a world-space force.
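A minimal sketch of that variant, assuming a cached Rigidbody field _rigidbody and a moveForce value of your choosing (both names are placeholders):
private void FixedUpdate() {
    // Local-space input on the XZ plane, just like above
    Vector3 input = new Vector3(Input.GetAxisRaw("Horizontal"), 0, Input.GetAxisRaw("Vertical"));

    // AddRelativeForce expects a local-space vector and converts it to world space internally
    _rigidbody.AddRelativeForce(input * moveForce);
}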

Rotate Object around point and move it along sine function

First off: I am very new to Unity, as in VERY new.
I want to do the following: I want to rotate a cube around a stationary point (in my case a camera) with a radius that is adjustable in the inspector. The cube should always have its Z-axis oriented towards the camera's position. While the cube is orbiting around the camera, it should additionally follow a sine function to move up and down with a magnitude of 2.
I have some working code; the only problem is an increase in distance over time. The longer the game runs, the greater the distance between the cube and the camera.
Here is what I currently have:
void Awake()
{
    cameraPosition = GameObject.FindGameObjectWithTag("MainCamera").transform;
    transform.position = new Vector3(x: transform.position.x,
                                     y: transform.position.y,
                                     z: cameraPosition.position.z + radius);
    movement = transform.position;
}
I initialize some variables in the Awake() method and set the cube's position to where it should be (is Awake() the right place to do that?). I'll use the Vector3 movement later in my code for the "swinging" of the cube.
void Update()
{
    transform.LookAt(cameraPosition);
    transform.RotateAround(cameraPosition.position, cameraPosition.transform.up, 30 * Time.deltaTime * rotationSpeed);
    MoveAndRotate();
}
Here I set the orientation of the cube's z-axis and rotate it around the camera. 30 is just a constant I am using for tests.
void MoveAndRotate()
{
    movement += transform.right * Time.deltaTime * movementSpeed;
    transform.position = movement + Vector3.up * Mathf.Sin(Time.time * frequency) * magnitude;
}
To be quite frank, I do not completely understand this bit of code. I do, however, understand that it involves a rotation, as it moves the cube along its x-axis as well as along the world's y-axis. I have yet to get into vectors and matrices, so if you could share your knowledge on that topic as well, I'd be grateful.
It seems like I have found the solution for my problem, and it is an easy one at that.
First of all, we need the initial position of our cube, because we need access to its original y-coordinate to account for the offset.
So in Awake(), instead of
movement = transform.position;
we simply write
initialPosition = transform.position;
to have more readable code.
Next, we change our MoveAndRotate()-method to only be a single line long.
void MoveAndRotate()
{
    transform.position = new Vector3(transform.position.x,
                                     Mathf.Sin(Time.time * frequency) * magnitude + initialPosition.y,
                                     transform.position.z);
}
What exactly does that line do, then? It sets the position of our cube to a new Vector3. This vector consists of:
its current x-value,
our newly calculated y-value (our height, if you will) plus the y-offset from our original position, and
its current z-value.
With this, the cube will only bob up and down, without distancing itself from the camera.
I have also found the reason for the increase in distance: my method of movement does not describe a sphere (which would keep the distance the same no matter how the cube rotates) but rather a plane. Of course, moving the cube along a plane will increase its distance from the camera at some points of the movement.
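Putting the pieces together, a compact sketch of the corrected script might look like this (fields such as radius, rotationSpeed, frequency, and magnitude are assumed from the snippets above):
Transform cameraPosition;
Vector3 initialPosition;

void Awake()
{
    cameraPosition = GameObject.FindGameObjectWithTag("MainCamera").transform;
    transform.position = new Vector3(transform.position.x,
                                     transform.position.y,
                                     cameraPosition.position.z + radius);
    initialPosition = transform.position;
}

void Update()
{
    transform.LookAt(cameraPosition);
    // RotateAround keeps the cube on a circle around the camera, so the radius stays constant
    transform.RotateAround(cameraPosition.position, cameraPosition.transform.up, 30 * Time.deltaTime * rotationSpeed);
    MoveAndRotate();
}

void MoveAndRotate()
{
    // Only the y-coordinate oscillates; x and z stay where RotateAround put them
    transform.position = new Vector3(transform.position.x,
                                     Mathf.Sin(Time.time * frequency) * magnitude + initialPosition.y,
                                     transform.position.z);
}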
Initializing variables in Awake() should be fine, but you could also do it in the Start() method that Unity provides if you wanted to.
As for the main problem itself, I'm guessing that calling this every frame is the problem, because you keep adding onto the position:
movement += transform.right * Time.deltaTime * movementSpeed;
It would be nice if you could try replacing it with this line and see if it helps:
movement = transform.right * Time.deltaTime * movementSpeed;

How to make a model appear in front of AR Camera after the session starts using ARFoundation?

I was looking to update the AR camera position. I am doing an image-tracking project: it detects an image and a corresponding prefab is shown in front of the camera, which then starts playing an animation. After the animation I want the prefab to come really close to the camera. When I put the code prefab.position = ARcamera.position; after the animation code, I think the prefab goes to the initial position where the AR camera was when the app started, that is (0,0,0).
How do I make the prefab come really close to the front of the camera?
float speed = 10f;
float step = speed * Time.deltaTime;
Showprefabs.transform.GetChild(0).position = Vector3.MoveTowards(
    Showprefabs.transform.GetChild(0).position,
    new Vector3(Arcam.transform.position.x,
                Arcam.transform.position.y + 0.2f,
                Arcam.transform.position.z + 6.3f),
    step);
// The values 0.2f and 6.3f were added in the Editor to make the prefab come near the camera
// (but in world position it is different).
First of all, I hope that by "prefab" you mean an already instantiated GameObject; it makes no sense to move a prefab ;)
You tried to calculate the target position, but you did it with world-space coordinates (fixed offsets along the world axes) instead of offsets relative to the camera.
You probably want to do something like this:
var targetObject = Showprefabs.transform.GetChild(0);
var currentPosition = targetObject.position;
var targetPosition = Arcam.transform.position
                     // Place it 60cm in front of the camera
                     + Arcam.transform.forward * 0.6f
                     // additionally move it "up" 20cm perpendicular to the view direction
                     + Arcam.transform.up * 0.2f;

// Note: `step` from the question already contains Time.deltaTime, so don't multiply by it again
targetObject.position = Vector3.MoveTowards(currentPosition, targetPosition, step);
If you want that movement a bit smoother, so it moves slower when already close and faster when further away, you might be interested in using Vector3.Lerp here instead:
public float smoothFactor = 0.5f;
targetObject.position = Vector3.Lerp(currentPosition, targetPosition, smoothFactor);
Where a smoothFactor of 0.5 reads: every frame, set the object to the position halfway between currentPosition and targetPosition. A value closer to 0 results in slower movement; a value closer to 1 reaches the targetPosition faster.
Note that this approach never fully arrives at the targetPosition, only very close to it, but that usually doesn't matter in AR, where the camera constantly moves a bit anyway.
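For context, a rough sketch of calling this every frame once the animation has finished (the animationFinished flag is an assumption; Showprefabs and Arcam are taken from the question):
public float smoothFactor = 0.5f;

void Update() {
    if (!animationFinished) return; // assumed flag set when the intro animation ends

    var targetObject = Showprefabs.transform.GetChild(0);
    var targetPosition = Arcam.transform.position
                         + Arcam.transform.forward * 0.6f // 60cm in front of the camera
                         + Arcam.transform.up * 0.2f;     // 20cm "up" relative to the view

    targetObject.position = Vector3.Lerp(targetObject.position, targetPosition, smoothFactor);
}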

How to lock the position of a physics body on the z-axis in Unity 3D

I am developing a 2.5D game. In that game I want my character (which has a Rigidbody component attached) to move only on the x and y axes, so I use this code snippet:
private void LockZAxis() {
    Vector3 currentPosition = _rigidbody.position;
    currentPosition.z = 0;
    _rigidbody.position = currentPosition;
}
I call this LockZAxis method at the end of Update, FixedUpdate, and LateUpdate, but it doesn't work. When my character runs forward for a while, its z position still changes.
For additional information, there are two places in my code where I manipulate the position of the Rigidbody. The first is when my character jumps; there I use this:
jumpVelocityVector = Vector3.up * jumpForceUp + transform.forward * jumpForceForward;
_rigidbody.velocity = jumpVelocityVector;
The second is each frame, when I want my character to move a bit faster; in the Update method I have this:
void Update() {
    Vector3 newPosition = transform.position + transform.forward * speed * Time.deltaTime;
    newPosition.z = 0;
    _rigidbody.MovePosition(newPosition);
    LockZAxis();
}
A Rigidbody is used to simulate physics; by setting the Rigidbody's position every frame, you're essentially teleporting the character every frame. Instead, you can constrain movement on the z-axis, which prevents the body from moving along z when physics is applied, which is what a Rigidbody is typically used for.
You can restrict positional change on the Rigidbody via its constraints (in the Inspector: Rigidbody > Constraints > Freeze Position).
If you run your LockZAxis() after you've changed the position, it should teleport the object to a z-position of 0 every frame. Please make sure that the z-axis is really the axis in question; you can check this by pausing a running game and manipulating the Transform values to see how each axis moves your object.
Here is how you can do it with a C# script.
Freeze All Positions:
rigidbody.constraints = RigidbodyConstraints.FreezePosition;
Freeze Specific Positions:
rigidbody.constraints = RigidbodyConstraints.FreezePositionY | RigidbodyConstraints.FreezePositionZ;
Unity Documentation
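A minimal sketch of setting that constraint from code, assuming the script sits on the same GameObject as the Rigidbody:
private Rigidbody _rigidbody;

private void Awake() {
    _rigidbody = GetComponent<Rigidbody>();
    // Physics keeps driving X and Y, but the body can no longer drift along Z
    _rigidbody.constraints = RigidbodyConstraints.FreezePositionZ;
}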
Is physics gravity set to only affect the Y position?
Physics.gravity = new Vector3(0, -1.0F, 0);
Also set these:
rigidbody.angularVelocity = Vector3.zero;
Vector3 velocity = rigidbody.velocity;
velocity.z = 0f; // velocity is a property, so copy it, modify it, and assign it back
rigidbody.velocity = velocity;
Make sure your Rigidbody is set to kinematic since you are using Rigidbody.MovePosition(); using MovePosition() will directly affect the velocity internally on a kinematic Rigidbody.
Try using MovePosition() for your jump instead of setting the velocity.

Check if centerPlane is not near an edge in Unity

I'm trying to make a whack-a-mole game using Project Tango.
When the user starts the game, the program creates holes at random points for the moles to come out of. Right now I can already make the hole and spawn the mole at random, but I have a problem with the created hole.
The hole sometimes spawns at the edge contour of a table and the like, which leaves the hole partially floating in the air. Is there any way to check whether the centerPlane is near an edge or not?
Here's the screenshot of the problem I mean. I want the "hole" not to spawn on an area that doesn't fit its height and width. Currently, I'm using the Farandole release.
EDIT 1:
I'm trying to do as Hristo suggests, but it doesn't work: FindClosestPoint always returns -1, even when I use the center of the screen. Here's the script I used. For some additional info, I'm using the Unity SDK and Unity 5.5.2f1.
bool CheckTheCorner(Camera cam, Vector3 planeCenter) {
    Vector2 firstPointInScreen  = WorldToScreenConverter(cam, new Vector3(planeCenter.x, planeCenter.y, planeCenter.z - holeheight));
    Vector2 secondPointInScreen = WorldToScreenConverter(cam, new Vector3(planeCenter.x, planeCenter.y, planeCenter.z + holeheight));
    Vector2 thirdPointInScreen  = WorldToScreenConverter(cam, new Vector3(planeCenter.x - holewidth, planeCenter.y, planeCenter.z));
    Vector2 fourthPointInScreen = WorldToScreenConverter(cam, new Vector3(planeCenter.x + holewidth, planeCenter.y, planeCenter.z));

    DebugText.text = m_pointCloud.FindClosestPoint(cam, new Vector2(Screen.width / 2, Screen.height / 2), 1).ToString();

    Vector3 firstPoint  = m_pointCloud.m_points[m_pointCloud.FindClosestPoint(cam, firstPointInScreen, 1)];
    Vector3 secondPoint = m_pointCloud.m_points[m_pointCloud.FindClosestPoint(cam, secondPointInScreen, 1)];
    Vector3 thirdPoint  = m_pointCloud.m_points[m_pointCloud.FindClosestPoint(cam, thirdPointInScreen, 1)];
    Vector3 fourthPoint = m_pointCloud.m_points[m_pointCloud.FindClosestPoint(cam, fourthPointInScreen, 1)];

    return false;
}

Vector2 WorldToScreenConverter(Camera cam, Vector3 worldPos) {
    Vector3 screenPos = cam.WorldToScreenPoint(worldPos);
    return new Vector2(screenPos.x, screenPos.z);
}
Ah yes, don't mind the return false for the moment; I just put it there to avoid a compile error since I'm still figuring out FindClosestPoint.
What you can do is take the 4 corners of your plane and decide whether they are all lying on a surface in the real world; if not, you can place the plane elsewhere.
That can be done with the FindClosestPoint() method in TangoPointCloud.cs. That method casts a ray from your camera through a certain point on the screen into the real-world environment and returns the index of the closest point it hits. The list you look up with that index is called m_points.
So let's split it into steps:
Make 4 vectors using FindClosestPoint().
Check if all 4 vectors are on the same plane (simple math; see the sketch at the end of this answer).
If step 2 is true, instantiate your GameObject on that plane.
To get one of the vectors, your code should be something like this:
Vector3 firstPoint = m_pointCloud.m_points[m_pointCloud.FindClosestPoint(Camera.main, new Vector2(Screen.width / 2, Screen.height / 2), 1)];
In this example I'm using the center of the screen as my Vector2 parameter. However, you don't want the center of the screen; instead, you want the position of one corner of your plane translated into a screen point.
Hope that solves your problem. Cheers.
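For step 2, a minimal sketch of the "simple math", using a small tolerance since point-cloud samples are never exactly coplanar (the helper name is hypothetical):
// Checks whether four points are approximately coplanar by measuring the distance
// of the fourth point from the plane spanned by the first three.
bool ArePointsCoplanar(Vector3 a, Vector3 b, Vector3 c, Vector3 d, float tolerance = 0.01f) {
    Vector3 normal = Vector3.Cross(b - a, c - a).normalized;
    float distance = Mathf.Abs(Vector3.Dot(d - a, normal));
    return distance <= tolerance;
}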