Unity3D Input.GetAxis for Mouse not consistent

I am working on camera rotations based on mouse input and I want consistent sensitivity across all framerates. According to the Unity documentation, Input.GetAxis should be inherently framerate independent for mouse movement.
What I am finding is that when I use the code below and change the framerate from 200+ to 30, Input.GetAxis returns very different outputs. At 200+ fps GetAxis is low, around 2-5, whereas at 30 fps it returns around 10-15 for the same mouse movement. This causes a very drastic sensitivity difference in-game. Am I missing something? Thanks
public Rigidbody _playerRigidBody;
private Vector2 _lookInput;

private void Start()
{
    //Application.targetFrameRate = 30;
}

private void Update()
{
    _lookInput.x = Input.GetAxis("Mouse X");
}

void FixedUpdate()
{
    var sensitivity = 300f;
    var newPlayerRotation = _playerRigidBody.rotation * Quaternion.Euler(_lookInput.x * sensitivity * Vector3.up * Time.deltaTime);
    _playerRigidBody.MoveRotation(newPlayerRotation);
}
I have tested everything in this little function and the only thing that causes the issue is Input.GetAxis. It's inconsistent. Any ideas, solutions, or workarounds?

"framerate independent" means the value is depend on the movement of the mouse, it doesn't mean the value is consistent.
For framerate = 200, value = 2-5 means the mouse move 2-5 points in 1/200 seconds.
So to get the move distance during 1 second use:
_lookInput.x = Input.GetAxis("Mouse X") / Time.deltaTime;
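For illustration, here is a minimal sketch of that fix applied to the asker's script (the MouseLook class name is made up for this example, and the guard against a zero deltaTime is an extra precaution). Inside FixedUpdate, Time.deltaTime returns the fixed timestep, so multiplying the per-second rate back by it yields a rotation per physics step that no longer depends on the render framerate:

using UnityEngine;

public class MouseLook : MonoBehaviour // hypothetical class name for this sketch
{
    public Rigidbody _playerRigidBody;
    private Vector2 _lookInput;

    private void Update()
    {
        // Convert the per-frame mouse delta into a rate (points per second).
        if (Time.deltaTime > 0f)
            _lookInput.x = Input.GetAxis("Mouse X") / Time.deltaTime;
    }

    private void FixedUpdate()
    {
        var sensitivity = 300f;
        // Inside FixedUpdate, Time.deltaTime returns the fixed timestep,
        // so rate * step gives a framerate-independent rotation per step.
        var newPlayerRotation = _playerRigidBody.rotation *
            Quaternion.Euler(_lookInput.x * sensitivity * Vector3.up * Time.deltaTime);
        _playerRigidBody.MoveRotation(newPlayerRotation);
    }
}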


When calculating FPS in Unity, why does it count like this? [duplicate]

When I start up my game it stays around 95-101, rapidly changing between all of those numbers, but when I open up the stats bar I'm getting upper 200s, low 300s,
so I'm wondering why that is. Still new to C# so be easy on me lol. Here's the code,
thanks in advance as always ^_^.
float deltaTime = 0.0f;

void Update()
{
    deltaTime += (Time.deltaTime - deltaTime) * 0.1f;
}

void OnGUI()
{
    int w = Screen.width, h = Screen.height;
    GUIStyle style = new GUIStyle();
    Rect rect = new Rect(0, 0, w, h * 2 / 100);
    style.alignment = TextAnchor.UpperRight;
    style.fontSize = h * 2 / 100;
    style.normal.textColor = new Color(255.0f, 255.0f, 255.0f, 1.0f);
    float msec = deltaTime * 1000.0f;
    float fps = 1f / deltaTime;
    string text = string.Format("{0:0.0} ms ({1:0.} fps)", msec, fps);
    GUI.Label(rect, text, style);
}
In order to display a meaningful FPS rate you need to measure how many frames were rendered over a constant period of time, for example one second. Then only after that period do you display the calculated value on screen. This will provide for an average frames per second as opposed to an instantaneous frames per second, the latter of which is not particularly useful in most cases as it leads to widely fluctuating values.
Code
First define some fields:
DateTime _lastTime; // marks when the current measurement period began
int _framesRendered; // an increasing count
int _fps; // the FPS calculated from the last measurement
Then in your render method you increment _framesRendered. You also check whether one second has elapsed since the start of the period:
void Update()
{
    _framesRendered++;
    if ((DateTime.Now - _lastTime).TotalSeconds >= 1)
    {
        // one second has elapsed
        _fps = _framesRendered;
        _framesRendered = 0;
        _lastTime = DateTime.Now;
    }
    // draw FPS on screen here using current value of _fps
}
Cross-technology
It should be pointed out that the above code makes no particular use of Unity whilst still being reasonably accurate, and it is compatible with many frameworks and APIs such as DirectX, OpenGL, XNA, WPF, or even WinForms.
When I start up my game it stays around 95-101, rapidly changing between all of those numbers, but when I open up the stats bar I'm getting upper 200s, low 300s
The ASUS VG248QE is a 1ms panel and the max it can do is 144Hz, so it is unlikely you are getting "upper 200s, low 300s". FPS is meaningless when VSYNC is turned off on a non-GSYNC monitor. Is your VSYNC turned on?
In Unity, FPS is equivalent to the number of Update calls that occur in 1 second. This is because Update() is called every Time.deltaTime seconds.
InvokeRepeating method
You can also use InvokeRepeating to implement your own FPS counter while using only integers, like this:
private int FrameCounter = 0;
private int Fps = 0;

void Start()
{
    InvokeRepeating("CountFps", 0f, 1f);
}

void Update()
{
    FrameCounter++;
}

private void CountFps()
{
    Fps = FrameCounter;
    FrameCounter = 0;
}
Then just display the Fps variable in the OnGUI() method. Using this method, your Fps value will get updated every second; if you want more frequent updates, change the last argument of the InvokeRepeating call and then adjust the Fps calculation accordingly.
Note, however, that InvokeRepeating takes Time.timeScale into account, so e.g. if you pause the game with Time.timeScale = 0f; the counter will stop updating until you unpause the game.
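If you do need the counter to keep running while the game is paused, one alternative (a sketch, not part of the original answer; the class name is made up) is a coroutine that waits on unscaled time via WaitForSecondsRealtime, available since Unity 5.4:

using System.Collections;
using UnityEngine;

public class UnscaledFpsCounter : MonoBehaviour // hypothetical class name
{
    private int FrameCounter = 0;
    private int Fps = 0;

    void Start()
    {
        StartCoroutine(CountFps());
    }

    void Update()
    {
        FrameCounter++;
    }

    private IEnumerator CountFps()
    {
        while (true)
        {
            // WaitForSecondsRealtime ignores Time.timeScale, so the counter
            // keeps updating even while the game is paused.
            yield return new WaitForSecondsRealtime(1f);
            Fps = FrameCounter;
            FrameCounter = 0;
        }
    }
}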
FixedUpdate method
Another approach is to count the FPS in FixedUpdate() method instead of OnGUI() or Update(). This gets called every Time.fixedDeltaTime seconds which is always the same, no matter what. The value of Time.fixedDeltaTime can be set globally for the project via menu Edit->Project Settings->Time, item Fixed Timestep.
In this case, you would count frames the same way (in Update), but update your FPS counter in FixedUpdate - which is basically the same as calling your own method with InvokeRepeating("CountFps", 0f, 0.02f) (0.02f being a typical Time.fixedDeltaTime value, but this depends on your project settings as per above).
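A minimal sketch of that variant (FixedTimeElapsed is a field introduced just for this example):

private int FrameCounter = 0;
private int Fps = 0;
private float FixedTimeElapsed = 0f;

void Update()
{
    FrameCounter++; // count rendered frames, as before
}

void FixedUpdate()
{
    // Time.fixedDeltaTime is constant, so this accumulates fixed steps
    // until roughly one second of game time has passed.
    FixedTimeElapsed += Time.fixedDeltaTime;
    if (FixedTimeElapsed >= 1f)
    {
        Fps = FrameCounter;
        FrameCounter = 0;
        FixedTimeElapsed = 0f;
    }
}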
Conclusion
Most of the time, you won't need to update the displayed FPS that often, so I personally like to use the InvokeRepeating method and 1 second intervals.
The OnGUI function is called at least twice per frame (sometimes more). You are calculating your "FPS" inside OnGUI, so it will almost never be accurate.
Defining:
deltaTime += (Time.deltaTime - deltaTime) * 0.05f;
will bring your FPS values closer to the real value, but it will still not be accurate if you calculate it in the OnGUI method.
I guess (not sure) that you should use FixedUpdate() instead of OnGUI() to calculate your FPS. (Also, you don't need to multiply deltaTime by 0.05f if you use FixedUpdate.)

Forward and Sideways Player movement

So my friends and I are developing a game where you play as a snowball rolling down a hill avoiding obstacles. We're having trouble with our movement, however. We want our snowball to gain speed the larger it gets and the longer it goes without hitting something. Our forward movement is controlled by:
void FixedUpdate()
{
    rb.velocity += Physics.gravity * 3f * rb.mass * Time.deltaTime;
    // to accelerate
    rb.AddForce(0, 0, 32 * rb.mass);
}
We're applying a sideways force on key input
if (Input.GetKey(ControlManager.CM.left))
{
    if (rb.velocity.x >= -15 - (rb.mass * 8))
    {
        rb.AddForce(Vector3.left * sidewaysForce * (rb.mass * .5f), ForceMode.VelocityChange);
    }
}
if (Input.GetKey(ControlManager.CM.right))
{
    if (rb.velocity.x <= 15 + (rb.mass * 8))
    {
        rb.AddForce(Vector3.right * sidewaysForce * (rb.mass * .5f), ForceMode.VelocityChange);
    }
}
The mass increases as the scale increases and vice versa.
The issue comes when the snowball gets larger than a certain scale. Once it hits that point, it massively accelerates left and right while you push the keys, then snaps back to its forward speed when you let go. I assume it's something to do with the mass and the way the applied forces are compounding.
Is there a better way to accelerate our snowball downhill or move it left and right? I've tried transform.Translate and transform.MovePosition, but they lead to choppy left and right movement.
For one, most movement code should multiply acceleration by Time.deltaTime. In a hypothetical game where you increase the velocity by a fixed amount each frame, somebody with a beefy 60 fps computer will go twice as fast as a poor 30 fps laptop gamer, because they accelerate twice as often. To fix this, we multiply the acceleration by Time.deltaTime, which is the time since the last frame.
Instead of this code, where framerate would determine speed:
Vector3 example = new Vector3(1,1,1);

void Update()
{
    rb.AddForce(example);
}
We would use this code. If the framerate is half as much, Time.deltaTime will be twice as much, so the acceleration will be constant for everyone using it.
Vector3 example = new Vector3(1,1,1);

void Update()
{
    rb.AddForce(example * Time.deltaTime);
}
There may be another source of this problem, although getting rid of framerate dependencies is a good place to start. It looks like you have already used Time.deltaTime for the downward velocity, but not for the sideways movement. Even if it doesn't fix the problem, using Time.deltaTime is essential for any consistent game.
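As a concrete illustration, the left-hand branch of the asker's input code with Time.deltaTime applied might look like this (sidewaysForce, ControlManager.CM and rb are the asker's own fields; the right-hand branch would mirror it):

if (Input.GetKey(ControlManager.CM.left))
{
    if (rb.velocity.x >= -15 - (rb.mass * 8))
    {
        // Scaling by Time.deltaTime makes the per-frame velocity change
        // proportional to frame time, so the total acceleration is the
        // same at 30 fps and at 200 fps.
        rb.AddForce(Vector3.left * sidewaysForce * (rb.mass * .5f) * Time.deltaTime,
                    ForceMode.VelocityChange);
    }
}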

Eliminating camera jitters caused by repositioning camera within defined bounds

This is the image of what I'm trying to achieve, in a 3D space. I want the object to follow my finger within the green zone, but stay on the edge of the green zone if I move my finger outside of it. I have achieved this with the code below, but when moving my finger around the red zone a lot of jitter and clipping occurs as the object keeps snapping back within its bounds. The jitter appears when holding my finger in the red zone, outside the player's circle bounds: instead of being "stuck" at the bounds, the player tries to keep going and is then positioned back within the bounds, causing jitter. I'm looking for a way to limit the movement of the player within the bounds without having to reset its position. My main camera is attached to the moving object, so it's important that I eliminate the jitter. How can I smooth this out?
public class Player : MonoBehaviour
{
    public Camera movementCam;
    readonly float radius = 0.45f;
    readonly float speed = 3f;
    Ray firstTouchPos;
    Vector2 playerPos;
    [SerializeField] Vector3 targetPosition;
    readonly float followDelay = 20f;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            firstTouchPos = movementCam.ScreenPointToRay(Input.mousePosition);
            playerPos = transform.position;
        }
        if (Input.GetMouseButton(0))
        {
            Ray currentTouchPos = movementCam.ScreenPointToRay(Input.mousePosition);
            Vector2 direction = currentTouchPos.origin - firstTouchPos.origin;
            float distance = Vector3.Distance(transform.position, Vector3.zero);
            targetPosition = distance >= radius ? (Vector3)(direction.normalized * radius) : (Vector3)(playerPos + direction * speed);
        }
        transform.position = Vector3.Lerp(transform.position, targetPosition, followDelay);
    }
}
Your issue is incorrect clamping; a simple fix would be:
if (Input.GetMouseButton(0))
{
    Ray currentTouchPos = movementCam.ScreenPointToRay(Input.mousePosition);
    Vector2 direction = currentTouchPos.origin - firstTouchPos.origin;
    targetPosition = (Vector3)(playerPos + direction * speed);
    if (targetPosition.sqrMagnitude > radius * radius) // if our calculated position is beyond the radius...
        targetPosition = targetPosition.normalized * radius; // ...set it to lie exactly on the radius.
}
The jitter was caused by your object leaving the radius on one frame, being clamped back to the radius on the next frame, and then attempting to move outside the radius again on the frame after that.
This version removes the ternary operator, which means it behaves consistently across frames, rather than switching between clamping and movement from one frame to the next.
Here are some additional pieces of advice for this issue, once you fix the above problem:
You should multiply speed and followDelay by Time.deltaTime in order to scale them across frames correctly.
You should probably apply your camera motion during LateUpdate() instead of Update(). LateUpdate() happens after all Update() calls have finished; during Update(), objects can move around before and after your camera code is called, which makes the camera behave slightly inconsistently from frame to frame. Applying the motion in LateUpdate() means the camera only moves once all your objects have 'settled' into place after their update.
Additionally, you're technically using Lerp() wrong here. It shouldn't cause jitter, but it's not exactly how lerp is meant to be used. Are you sure you don't want Vector3.MoveTowards() instead?
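Putting those suggestions together, a minimal sketch (using the asker's existing targetPosition field; followSpeed is a made-up tuning value, not from the original code) might look like this:

[SerializeField] float followSpeed = 5f; // hypothetical speed in world units per second

void LateUpdate()
{
    // MoveTowards advances at a constant speed and stops exactly at the
    // target; scaling by Time.deltaTime keeps it framerate independent.
    transform.position = Vector3.MoveTowards(
        transform.position,
        targetPosition,
        followSpeed * Time.deltaTime);
}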

How to move an object by a certain angle over a period of time in a circle [duplicate]

I am new here and I am trying to start working with the Unity Engine.
Could somebody explain to me how Quaternion.Slerp works? I want to rotate an object by different angles: 90, 180 and 270. My code is below. Unfortunately, when I add 180 degrees, the object does crazy things and then ends up with a rotation of (0, 180, 180). I would like to get (180, 0, 0).
public float speed = 0.1F;
private float rotation_x;

void Update()
{
    if (Input.GetButtonDown("Fire1"))
    {
        rotation_x = transform.rotation.eulerAngles.x;
        rotation_x += 180;
    }
    transform.rotation = Quaternion.Slerp(transform.rotation, Quaternion.Euler(rotation_x, transform.eulerAngles.y, transform.eulerAngles.z), Time.time * speed);
}
Most examples out there, including Unity examples from their official website, use Lerp in the wrong way. They didn't even bother to describe how it works in the API documentation. They just put it in the Update() function and call it a day.
Mathf.Lerp, Vector3.Lerp, and Quaternion.Slerp work by changing from one position/rotation to another with the t value (the last parameter) being passed in. That t value is also known as time.
The min of the t value is 0f and the max is 1f.
I will explain this with Mathf.Lerp to make it easier to understand. The Lerp functions work the same way across Mathf, Vector3 and Quaternion.
Remember that Lerp takes two values and returns values between them. If we have a value of 1 and 10 and we do Lerp on them:
float x = Mathf.Lerp(1f, 10f, 0f); will return 1.
float x = Mathf.Lerp(1f, 10f, 0.5f); will return 5.5
float x = Mathf.Lerp(1f, 10f, 1f); will return 10
As you can see, t = 0 returns the min value passed in, t = 1 returns the max value, and t = 0.5 returns the midpoint between the min and the max. You are doing it wrong when you pass any t value that is < 0 or > 1. The code in your Update() function is doing just that: Time.time increases every second and will be > 1 after one second, so you have problems with that.
It is recommended to use Lerp in another function/coroutine instead of the Update function.
Note:
Using Lerp has a downside when it comes to rotation: Lerp does not know how to rotate an Object along the shortest path. So bear that in mind. For example, say you have an Object with rotation 0,0,90 and you want to move the rotation from that to 0,0,120. Lerp can sometimes rotate left instead of right to reach the new rotation, which means it takes longer to reach it.
Let's say we want to make the rotation (0,0,90) from whatever the current rotation is. The code below will change the rotation to 0,0,90 in 3 seconds.
ROTATION OVER TIME:
void Start()
{
    Quaternion rotation2 = Quaternion.Euler(new Vector3(0, 0, 90));
    StartCoroutine(rotateObject(objectToRotate, rotation2, 3f));
}

bool rotating = false;
public GameObject objectToRotate;

IEnumerator rotateObject(GameObject gameObjectToMove, Quaternion newRot, float duration)
{
    if (rotating)
    {
        yield break;
    }
    rotating = true;

    Quaternion currentRot = gameObjectToMove.transform.rotation;
    float counter = 0;
    while (counter < duration)
    {
        counter += Time.deltaTime;
        gameObjectToMove.transform.rotation = Quaternion.Lerp(currentRot, newRot, counter / duration);
        yield return null;
    }
    rotating = false;
}
INCREMENTAL ANGULAR ROTATION OVER TIME:
And to just rotate the Object by 90 on the z axis, the code below is a great example of that. Please understand there is a difference between moving an Object to a new rotation and just rotating it by an amount:
void Start()
{
    StartCoroutine(rotateObject(objectToRotate, new Vector3(0, 0, 90), 3f));
}

bool rotating = false;
public GameObject objectToRotate;

IEnumerator rotateObject(GameObject gameObjectToMove, Vector3 eulerAngles, float duration)
{
    if (rotating)
    {
        yield break;
    }
    rotating = true;

    Vector3 newRot = gameObjectToMove.transform.eulerAngles + eulerAngles;
    Vector3 currentRot = gameObjectToMove.transform.eulerAngles;
    float counter = 0;
    while (counter < duration)
    {
        counter += Time.deltaTime;
        gameObjectToMove.transform.eulerAngles = Vector3.Lerp(currentRot, newRot, counter / duration);
        yield return null;
    }
    rotating = false;
}
All my examples are based on the frame rate of the device. You can use real time by replacing Time.deltaTime with Time.unscaledDeltaTime, but more calculation is required.
Before anything: you can't add 180 to Euler angles like that, and that's mainly what is causing your problem. You'd better use quaternions directly instead, or work on the transform itself.
You can think of a quaternion as an orientation in space. Contrary to what has been said, I do recommend learning how to use them if you can. However, I don't recommend using Euler angles at all... as they're subject to different writing conventions and will fail sometimes. You can look up 'gimbal lock' if you want details about that.
Simply put, a slerp or lerp (standing for spherical linear interpolation and linear interpolation respectively) is a way to interpolate between orientations A and B (going from one orientation to the other by increasing t from 0 to 1, in a coroutine or anywhere else). The difference between the two is that slerp gives you the shortest path from A to B.
In the end, when t = 1, lerp(A,B,t) and slerp(A,B,t) will both give you B.
In your case, if you want to instantly rotate an object in space to a specific orientation, I suggest you use Quaternion.AngleAxis, which is the most straightforward way to describe a quaternion mathematically.
If you want to add a rotation, say 90° to your current orientation (without animation between the two), you can do something like this:
transform.rotation *= Quaternion.AngleAxis(angle, axis_of_rotation);
(note that the angle comes first), or use transform.Rotate (depending on the parameters, it can be a right or left multiply: local or world transform).
Programmer's answer details how to animate your transform. But I do suggest you investigate quaternions themselves, as it will give you a global understanding of space transforms.
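As a small illustration of those two options (note that Quaternion.AngleAxis takes the angle first, then the axis), a 90° world-space rotation about the X axis could be written either way:

// Compose a quaternion rotation with the current orientation...
transform.rotation = Quaternion.AngleAxis(90f, Vector3.right) * transform.rotation;

// ...or let Transform.Rotate do the same thing in world space.
transform.Rotate(Vector3.right, 90f, Space.World);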

Unity coroutine movement over time is not consistent/accurate?

I have a coroutine that moves my Camera upwards each time the player reaches a certain point in the game. I used a coroutine so that the camera will move smoothly over time.
Here's a snippet of my code:
private IEnumerator MoveCameraUpCoroutine(Vector3 startPos, Vector3 endPos, float duration)
{
    float elapsedTime = 0;
    while (elapsedTime < duration)
    {
        transform.position = Vector3.Lerp(startPos, endPos, (elapsedTime / duration));
        elapsedTime += Time.deltaTime;
        yield return null;
    }
}

public void MoveCameraUp(Vector3 startPos, Vector3 endPos, float duration)
{
    StartCoroutine(MoveCameraUpCoroutine(startPos, endPos, duration));
}
In my controller script, I just call my coroutine like this:
cam.GetComponent<CameraMovement>().MoveCameraUp(
    cam.transform.position,
    new Vector3(cam.transform.position.x, cam.transform.position.y + setupLevel.heightOfBlock, cam.transform.position.z),
    0.1f);
The problem with this is that the camera's movement is not always consistent in terms of where it's supposed to stop. I did some debugging: on the first run, the camera moved to yPos 0.7864508; on the second run, the camera moved to yPos 0.7789915; and so on. It's not consistent.
But when I simply use Translate instead of my coroutine:
cam.transform.Translate(0, setupLevel.heightOfBlock, 0);
I get a consistent end value for the camera's yPos at 0.7876318, which is what I need. But this code does not move the camera smoothly over time, which is not what I want.
Does anyone know how to fix this coroutine issue? I don't know but I think there's something wrong with my coroutine code. Any help is greatly appreciated.
The result is actually quite consistent with what's expected. There's absolutely nothing wrong with your code; rather, it's because of differences in frame times.
The reason you're seeing the minor differences is that there's no guarantee that two frames will take the exact same amount of time to render.
Take a look at this line
elapsedTime += Time.deltaTime;
What you're doing here is adding an inconsistent value to your elapsed time. (i.e. Time.deltaTime is different every frame).
One partial fix could be to use Time.smoothDeltaTime instead, which is a smoothed-out value of deltaTime over several frames. This, however, is not going to be perfect.
A second approach (not entirely an answer per se, but I'm leaving this here for others as well) is to use a tweening engine such as DOTween or iTween.
These engines have methods that essentially do what you're trying to do.
float elapsedTime = 0;
float ratio = elapsedTime / duration;
while (ratio < 1f)
{
    elapsedTime += Time.deltaTime;
    ratio = elapsedTime / duration;
    transform.position = Vector3.Lerp(startPos, endPos, ratio);
    yield return null;
}
With this setup, your loop will run until ratio reaches 1. Since Lerp clamps t between 0 and 1, once ratio is 1 (or more), endPos is returned exactly and the loop exits on the next check.
I think the issue was that you compare elapsedTime to duration. On the last pass, you move, then you increase, then you compare. As a result, the last increase is never used for movement, and you end up somewhere near the end but not exactly at the end.
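Following that observation, a common sketch of the fix is to snap to the exact end position once the loop exits, so every run lands on endPos regardless of how the frame times divided up the duration:

private IEnumerator MoveCameraUpCoroutine(Vector3 startPos, Vector3 endPos, float duration)
{
    float elapsedTime = 0;
    while (elapsedTime < duration)
    {
        transform.position = Vector3.Lerp(startPos, endPos, elapsedTime / duration);
        elapsedTime += Time.deltaTime;
        yield return null;
    }
    transform.position = endPos; // guarantee a consistent final value
}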