How can I predict rigid-body movement in a multi-player network game using cannon.js?

How can I simulate rigid-body movement? I'm building a multi-player network racing game, and I need to predict the physical behavior a few milliseconds ahead and send position/rotation over the network to reduce the movement gap caused by synchronization.
My current implementation just sends my current position/rotation to the other player, and I can see jerky movement on the other PC. My guess is that the local cannon.js simulation conflicts with the latest received position/rotation, which causes the slight delay.
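A common approach here is dead reckoning: send position, velocity, and a timestamp, extrapolate on the receiving side, and blend toward each correction instead of snapping to it. A minimal sketch in plain JavaScript (no cannon.js types; the snapshot field names are assumptions for illustration):

```javascript
// Minimal dead-reckoning sketch. Assumes each network snapshot carries
// { pos, vel, time }; quaternions would need slerp instead of lerp.

function extrapolate(snapshot, now) {
  const dt = now - snapshot.time; // seconds since the snapshot was taken
  return {
    x: snapshot.pos.x + snapshot.vel.x * dt,
    y: snapshot.pos.y + snapshot.vel.y * dt,
    z: snapshot.pos.z + snapshot.vel.z * dt,
  };
}

// Instead of teleporting to the predicted state, blend toward it each frame
// to hide correction jumps (the "jerky movement").
function blend(current, target, alpha) {
  return {
    x: current.x + (target.x - current.x) * alpha,
    y: current.y + (target.y - current.y) * alpha,
    z: current.z + (target.z - current.z) * alpha,
  };
}

const snap = { pos: { x: 0, y: 0, z: 0 }, vel: { x: 10, y: 0, z: 0 }, time: 0 };
const predicted = extrapolate(snap, 0.1); // 100 ms after the snapshot
// predicted.x ≈ 1 (10 m/s * 0.1 s)
```

On the receiving side you would feed `blend` a small per-frame alpha so the remote car eases onto the predicted path rather than jumping.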

Related

How to prevent jitter with Network Transforms in Netcode?

We are currently working on a multiplayer game with Unity Netcode. The game is a racing game with relatively high speeds. The player objects are synchronized with the Network Transform component.
We have the problem that the player lags a little (or jumps some steps forward instead of moving smoothly).
The Interpolate setting on the Network Transform makes it a little better, but not completely.
Something we observed is that when our own camera isn't moving, the other player isn't lagging at all; but since this is a racing game, the camera moves all the time.
We were finally able to fix the problem. More information: https://forum.unity.com/threads/how-to-prevent-jitter-with-network-transforms-in-netcode.1387044/#post-8733624
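The linked post has the actual fix; for context, the standard technique the Interpolate setting approximates is snapshot interpolation, where remote players are rendered a small, fixed delay in the past so there are always two buffered snapshots to interpolate between. A rough sketch (buffer shape and delay value are assumptions):

```javascript
// Snapshot-interpolation sketch, reduced to one position component.
// Remote players are rendered RENDER_DELAY seconds in the past so there are
// always two snapshots straddling the render time.

const RENDER_DELAY = 0.1; // seconds; roughly 2x the send interval is typical

function sampleBuffer(buffer, now) {
  const renderTime = now - RENDER_DELAY;
  // Find the two snapshots straddling renderTime and lerp between them.
  for (let i = buffer.length - 1; i > 0; i--) {
    const a = buffer[i - 1], b = buffer[i];
    if (a.time <= renderTime && renderTime <= b.time) {
      const t = (renderTime - a.time) / (b.time - a.time);
      return a.x + (b.x - a.x) * t;
    }
  }
  return buffer[buffer.length - 1].x; // fall back to the newest snapshot
}
```

The trade-off is a constant added latency of `RENDER_DELAY` on remote players in exchange for smooth motion regardless of camera movement.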

Client side prediction with a Shared object(Player - Ball relationship)

Our team is developing a football game, but we are stuck on one step and can't move forward.
With client-side prediction, the local player plays at the current time, but the other objects, like the ball and the other players, play in the past (due to latency).
Normally we use interpolation for the other objects, but this doesn't work for the ball, because the local player interacts with it while moving, which causes the other players to intertwine with the ball.
We tried bringing the ball up to the current time by simulating physics and running the whole workflow locally. But then, when other players interact with the ball, they seem to be moving it from far behind, because the other players are in the past.
What is the best method to use for this situation?
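One standard building block for this situation is a server-authoritative ball with client-side prediction and reconciliation: predict locally, keep unacknowledged inputs, and replay them whenever an authoritative state arrives. A minimal 1-D sketch (the message shapes and sequence numbering are assumptions for illustration, not any engine's API):

```javascript
// Client-side prediction with server reconciliation for a 1-D "ball"
// pushed by local inputs.

let state = { x: 0, seq: 0 };
const pendingInputs = []; // inputs not yet acknowledged by the server

function applyInput(s, input) {
  return { x: s.x + input.push, seq: input.seq };
}

function localInput(push) {
  const input = { push, seq: state.seq + 1 };
  pendingInputs.push(input); // remember it until the server acknowledges it
  state = applyInput(state, input); // predict immediately, no waiting
}

// When an authoritative snapshot arrives, drop acknowledged inputs and
// replay the remaining ones on top of the server state.
function onServerState(server) {
  while (pendingInputs.length && pendingInputs[0].seq <= server.seq) {
    pendingInputs.shift();
  }
  state = { x: server.x, seq: server.seq };
  for (const input of pendingInputs) state = applyInput(state, input);
}
```

This keeps the locally controlled interaction responsive while the server remains the single timeline everyone reconciles against, which is what breaks the "ball moved from far behind" effect.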

Moving Leap Motion Hands Coordinate System Following Spawned iPhone Player Camera

I'm fixing a legacy project from two years ago. It uses a Windows Unity host with a Leap Motion device to capture hand movements, and an iPhone player (with a Cardboard headset) to control how the viewport moves in the "game world".
Now I find that everything is okay only when my Leap Motion device stays still (e.g. pinned on my chest) and only the iPhone player moves with my head. When I wear both the Leap Motion device and the iPhone on my head, the hand model sways as my head moves.
I've concluded that the hand positions captured by the Leap Motion device are being interpreted as positions relative to the world coordinate system, when in fact they should be local positions relative to my headset (i.e. the iPhone player camera, which is spawned as a game object on my Windows host).
I've made a simplified scene to illustrate my situation. The hierarchy when network is not connected is like below:
The hierarchy when the Windows program is connected to itself as the host:
When iPhone End is also connected:
I'm trying to issue a command to "Hands" so that it rotates with "Camera(Clone)/Head", but it doesn't work. (In the following picture, "RotateWith" and "CameraFacing" are different attempts to make it move with "Camera(Clone)/Head".)
It sounds like the problem is caused by the camera and the Leap Motion having different latencies and operating at different frame-rates, which can be solved with temporal warping. This has already been implemented by Leap Motion and is done automatically if you use the Leap XR Service Provider.
Attach the LeapXRServiceProvider component to your Main Camera and ensure the Temporal Warping Mode is set to "Auto". This will tell the Leap Motion code to compensate for the differences between the hand tracking frame and the Unity frame.
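For intuition, temporal warping amounts to interpreting each hand frame against the head pose recorded at the frame's capture time rather than the current one. A simplified 1-D sketch of that idea (this is the concept only, not the Leap Motion implementation):

```javascript
// Temporal-warping sketch: buffer head poses with timestamps, and when a
// hand frame arrives, resolve it against the head pose at its *capture*
// time instead of the latest one.

const headPoses = []; // { time, x }, oldest first

function recordHeadPose(time, x) {
  headPoses.push({ time, x });
  if (headPoses.length > 64) headPoses.shift(); // bounded ring buffer
}

function headPoseAt(time) {
  // Nearest-neighbour lookup; a real implementation would interpolate.
  let best = headPoses[0];
  for (const p of headPoses) {
    if (Math.abs(p.time - time) < Math.abs(best.time - time)) best = p;
  }
  return best;
}

// handLocalX is the hand position relative to the head-mounted sensor.
function warpedHandWorldX(handLocalX, captureTime) {
  return headPoseAt(captureTime).x + handLocalX;
}
```

Without the lookup, a fast head turn pairs an old hand frame with a new head pose, which is exactly the swaying described above.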

How to synchronize projectiles with Photon in Unity3D

I have a multiplayer game that synchronizes character movement via Photon Transform View, applying Lerp interpolation for smooth movement; the shots are synchronized on each client using Photon RPC calls.
I have the following problem: when a player moves and shoots, the other clients see the projectiles start at a position the player has not yet reached (because Lerp is used to synchronize the movement).
I need to see the projectiles, so I can't make them invisible and only show a shooting animation.
What is the best way to do this?
What you should do is take the player's position into account when you start animating the projectile, so that it starts from where the player is rendered rather than from the position in the RPC. This also means you need to adjust the trajectory so that it corrects itself to match the projectile's real position and direction.
Have you tried minimizing the Lerp so that it doesn't lag? Are you already at its limit?
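The correction described above can be sketched like this: spawn the remote projectile at the player's rendered position, then blend its path onto the true trajectory from the RPC over a short window (1-D, constant speed; the window length is an assumed tuning value):

```javascript
// Projectile trajectory correction: visually spawn at the rendered player
// position, then converge onto the authoritative trajectory.

const CONVERGE_TIME = 0.2; // seconds to merge onto the true path (tune this)

function projectilePosition(spawnX, trueStartX, speed, t) {
  const trueX = trueStartX + speed * t;   // authoritative trajectory (RPC)
  const visualX = spawnX + speed * t;     // path from the rendered spawn point
  const blend = Math.min(t / CONVERGE_TIME, 1);
  return visualX + (trueX - visualX) * blend; // fully authoritative afterwards
}
```

At `t = 0` the projectile appears exactly at the rendered muzzle, and by `CONVERGE_TIME` it is exactly on the real trajectory, so hit detection can stay authoritative.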

Unity: Multiplayer transform lagging

I am building a multiplayer game in Unity. I am currently facing an issue: the player's transform lags on the other device it is connected to, but the transform is fine on its own device. It also does not show the player's transform animations, like walking.
NetworkTransform of Player
You can increase the network send rate to avoid lag in movement. It is currently set to 9 (as your image shows). For animations over the network, please see NetworkAnimator, which is used to synchronize animations across the network.
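For intuition, a higher network send rate just means state goes out on a shorter fixed interval. A sketch of such a send loop in plain JavaScript (the rate and threshold values are assumptions, and this is not Unity's implementation):

```javascript
// Fixed-rate state sender: send at most SEND_RATE times per second, and
// skip sends when the position barely changed.

const SEND_RATE = 30;                // sends per second (9 in the screenshot)
const SEND_INTERVAL = 1 / SEND_RATE;
const MOVE_THRESHOLD = 0.01;         // skip near-identical updates

let lastSendTime = -Infinity;
let lastSentX = NaN;                 // NaN forces the very first send through

function maybeSend(now, x, send) {
  if (now - lastSendTime < SEND_INTERVAL) return false; // rate limit
  if (Math.abs(x - lastSentX) < MOVE_THRESHOLD) return false; // no movement
  lastSendTime = now;
  lastSentX = x;
  send(x);
  return true;
}
```

Raising `SEND_RATE` shortens the gap each interpolation step has to cover, at the cost of more bandwidth.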
In this case I didn't use NetworkTransform; I synchronized the position myself and updated each client using SyncVar:
https://docs.unity3d.com/ScriptReference/Networking.SyncVarAttribute.html
Second, I used NetworkAnimator to solve the animation synchronization issue:
https://docs.unity3d.com/Manual/class-NetworkAnimator.html