OculusGo remote random position issue

I am developing an application using Unity and the Oculus Go headset. The application interacts with objects using the Oculus Go remote controller. But sometimes the remote appears at a random position: sometimes too far behind its original position, sometimes too far ahead. When this happens, restarting the application gives me the accurate position again.
Note: I am not changing the position of the camera or the remote at runtime; both have a fixed position throughout the application.
Can you please help me with this problem?

Related

HoloLens/Unity shared experience: How to track a user's "world" position instead of Unity's position?

I have here an AR game I'm developing for the HoloLens that involves rendering holograms according to the users' relative positions. It's a multiplayer shared experience where everyone in the same physical room connects to the same instance (shared Unity scene) hosted via cloud or LAN, and the players who have joined can see holograms rendered at the other players' positions.
For example: Players A and B join an instance; they're in the same room together. Player A can see a hologram above Player B tracking Player B's position (a Sims cursor, if you will). Then, once Player A gets closer to Player B, a couple more holographic panels open up displaying Player B's stats. These panels also track Player B's position and are always rendered with a slight offset relative to Player B's headset position. Player B sees the same on Player A, and vice versa.
That's fundamentally what my AR game does for the time being.
Problem:
The problem I'm trying to solve is tracking the user's position absolutely, relative to the room itself, instead of using the coordinates Unity reports for Player A's and Player B's game objects.
My app works beautifully if I mark a physical position on the floor, plus a facing direction, that all the players must assume when starting the Unity app. This forces the coordinate systems in all the players' Unity apps to have a matching origin point and initial heading in the real world. Only then am I able to render holograms relative to a user's position and have it correlate 1:1 between the Unity space and the real physical space around the headset.
But what if I want Player A to start the app on one side of the room and Player B to start it on the other side? When I do this, the origin point of Player A's Unity world is at a different physical spot than Player B's. This results in holograms rendering A's or B's position with a tremendous offset.
I have some screenshots showing what I mean.
In this one, I have 3 HoloLenses: two on the floor, plus the one I'm wearing to take screenshots.
There's a blue X on the floor (it's the sheet of paper; I realized you can't see it in the image) where I started my Unity app on all three HoloLenses. So the origin of the Unity world for all three is that specific physical location. As you can see, the blue cursor showing connected players tracks the headset's location beautifully. You can even see the headsets' locations relative to the screenshooter on the minimap.
The gimmick here that makes the hologram tracking accurate is that all three started in the same spot.
Now in this one, I introduced a red X. I restarted the Unity app on one of the headsets and used the red X as its starting spot. As you can see in this screenshot, the tracking is still precise, but it comes with a tremendous offset, because my relative origin point in Unity (the blue X) is different from the other headset's relative origin point (the red X).
Problem:
So this is the problem I'm trying to solve. I don't want all my users to have to initialize the app in the same physical spot, one after the other, to make the holograms appear in the users' correct positions. The HoloLens does a scan of the whole room, right?
Is there not a way to synchronize these maps across all the connected HoloLenses so they can share what their absolute coordinates are? Then I could use those as a transform point in the Unity scene instead of having to track multiplayer game objects.
Here's a map on my headset that I used to get the screenshots from the same angle.
This is tricky with inside-out tracking, as everything is relative to the observer (as you've discovered). What you need is to be able to identify a common, unique real-world location that your system will then treat as a 'common origin'. Either a QR code or a unique object that the system can detect and localise should suffice; then keep track of your users' (and other tracked objects') offsets from that known origin within the virtual world.
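As a minimal sketch of that bookkeeping, assuming each device has already detected the shared marker and placed a commonOrigin Transform at it (both names below are illustrative, not from any SDK): convert positions into the origin's local space before sending them over the network, and back to world space on receipt.

using UnityEngine;

// Sketch only: `commonOrigin` is an assumed Transform positioned at the
// detected shared marker (QR code, tracked object, etc.) on each device.
public class SharedOriginSpace : MonoBehaviour
{
    public Transform commonOrigin;

    // Before sending over the network: world -> shared-origin space.
    public Vector3 ToSharedSpace(Vector3 worldPosition)
    {
        return commonOrigin.InverseTransformPoint(worldPosition);
    }

    // After receiving from the network: shared-origin space -> this
    // device's world space.
    public Vector3 FromSharedSpace(Vector3 sharedPosition)
    {
        return commonOrigin.TransformPoint(sharedPosition);
    }
}

Rotations need the same treatment (send them relative to commonOrigin.rotation), but the idea is identical: the shared marker, not each headset's boot location, defines the coordinate frame everyone exchanges.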
My answer was deleted, apparently for being link-only, so round #2.
So, here's the link again:
https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/tutorials/mr-learning-sharing-05
And to avoid the same situation again, I'll add that whoever wants a synchronized multiplayer experience with HoloLens should read through the whole tutorial series. I am not going to summarize how to do this here, since that would just mean copying and pasting the docs. Just know that you need a spatial anchor that the other devices load into their scenes.
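For reference, in the older built-in Unity workflow (which predates the Azure Spatial Anchors approach the linked tutorial uses), the sharing step came down to exporting a world anchor on one device and importing it on the others. A rough sketch using the legacy UnityEngine.XR.WSA.Sharing API; SendToOtherDevices is an assumed placeholder for your networking layer, and you should follow the tutorial series for the current, supported flow:

using UnityEngine;
using UnityEngine.XR.WSA;
using UnityEngine.XR.WSA.Sharing;

// Legacy HoloLens anchor-sharing sketch. One device exports the anchor
// data; every other device imports it and locks the same object to it.
public class AnchorSharer : MonoBehaviour
{
    public GameObject sharedOrigin;   // the object all devices anchor to

    public void ExportAnchor()
    {
        var anchor = sharedOrigin.AddComponent<WorldAnchor>();
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor("sharedOrigin", anchor);
        WorldAnchorTransferBatch.ExportAsync(
            batch,
            data => SendToOtherDevices(data),          // stream chunks out
            reason => Debug.Log("Export finished: " + reason));
    }

    public void ImportAnchor(byte[] receivedData)
    {
        WorldAnchorTransferBatch.ImportAsync(receivedData, (reason, batch) =>
        {
            if (reason == SerializationCompletionReason.Succeeded)
                batch.LockObject("sharedOrigin", sharedOrigin);
        });
    }

    void SendToOtherDevices(byte[] chunk) { /* networking omitted */ }
}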

how to make an object move automatically within a specified path

I am a beginner with Unity 5.6 and game development in general.
I have a train and its rails in my scene. I need the train to start moving automatically on the rails (within a specified path) when the game starts. When it reaches the end of the path, it should return to the start and keep moving. I also want to control its speed. I've tried many things and searched for answers, but haven't succeeded yet.
How can I do this, please? What kind of code do I have to write? Do I need to attach it to the train object?
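For reference, a minimal sketch of one common approach: approximate the rails with an ordered set of empty waypoint Transforms placed along the track, and attach a small follower script to the train. The waypoints array and speed value below are assumptions you would configure in the Inspector.

using UnityEngine;

// Minimal waypoint follower; attach to the train object.
// `waypoints` is an assumed array of empty Transforms laid along the rails.
public class TrainMover : MonoBehaviour
{
    public Transform[] waypoints;
    public float speed = 5f;          // units per second, tune in Inspector

    int current = 0;

    void Update()
    {
        if (waypoints.Length == 0) return;

        Transform target = waypoints[current];
        transform.position = Vector3.MoveTowards(
            transform.position, target.position, speed * Time.deltaTime);

        // Close enough to the current waypoint? Advance to the next one,
        // wrapping back to index 0 so the train loops forever.
        if (Vector3.Distance(transform.position, target.position) < 0.01f)
            current = (current + 1) % waypoints.Length;
    }
}

Making speed public lets you control it at runtime; for smoother curves you could interpolate between waypoints or use a spline asset instead of straight segments.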

Keep object in place when AR track is found

I'm pretty new to Vuforia and Unity. What I'm looking for is to have a full-screen, in-place image/object (without movement) appear when the target is found, and keep it on screen until the target is lost.
I was thinking of making the image target a child of the AR camera (to keep its position relative to the camera) and putting my image on top of it (a sample cube for now).
This gives me the correct positional result (the image is not moving), but when I run it the object blinks, which I guess is a rendering issue.
Any comments and suggestions would be really helpful.
The way Vuforia works is that the camera is manipulated so that it has the same relative position to the target as your device has to the physical target in the real world.* Making one the child of the other won't allow this to happen. The whole point is to replace the "marker" with the 3D object so that it looks like it actually exists in the real world, moving and keeping itself fixed in place as the user moves their device around.
If you want something to be displayed relative to the screen, you should have it check whether the target is detected and enabled and, depending on that state, enable or disable itself. This object can then be a child of the camera (so it doesn't move).
*Alternatively it moves the object around relative to the camera.
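A minimal sketch of that enable/disable check, assuming the classic Vuforia ITrackableEventHandler interface (API names vary between Vuforia versions, so treat this as illustrative); screenOverlay is an assumed reference to the camera-parented object:

using UnityEngine;
using Vuforia;

// Attach to the ImageTarget. Turns a camera-parented overlay object on
// while the target is tracked and off when tracking is lost.
// `screenOverlay` is an assumed reference set in the Inspector.
public class OverlayToggler : MonoBehaviour, ITrackableEventHandler
{
    public GameObject screenOverlay;
    TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED
                  || newStatus == TrackableBehaviour.Status.TRACKED
                  || newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        screenOverlay.SetActive(found);
    }
}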

Kudan in Unity: how to stop or reset markerless tracking?

I am creating an application with Kudan where a photograph (a 2D sprite) appears via markerless tracking. Based on the sample project I've successfully made adjustments so that the 2D plane is always perpendicular to the camera and placed on the screen in the position I want. Really wonderful!
But I am unable to figure out how to restart/reset the tracking via a script. I can always force the tracking to restart by blocking the camera or shaking the phone, but I want to do it via a button. It is exactly the same behavior described in the "ArbiTrack Basics" guide for Android and iOS, but I am unable to reproduce it in Unity. To what script should I send a stop-tracking command in order to get the tracking instance to restart (with exactly the same effect as blocking the camera when running one of the sample Unity projects in Markerless Mode)?
The situation is described here for Android coding: https://wiki.kudan.eu/ArbiTrack_Basics#Stopping_ArbiTrack
where it says to call these three things:
// Stop ArbiTrack
arbiTrack.stop();
// Display target node
arbiTrack.getTargetNode().setVisible(true);
//Change enum and label to reflect ArbiTrack state
arbitrack_state = ARBITRACK_STATE.ARBI_PLACEMENT;
I have found one way to do this, though I'm not sure it's ideal.
Looking in the TrackingMethodMarkerless.cs script, it seems that StopTracking() does not work: it disables the updating of the tracking but doesn't actually disable the detection instance. Taking a note from it, I added an if statement to the ProcessFrame() function:
if (disableMarkerless == false)
    trackable.isDetected = _kudanTracker.ArbiTrackIsTracking();
else
    trackable.isDetected = false;
Now, setting the disableMarkerless boolean to true disables the tracking.
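To trigger this from a button, as the question asks, a hypothetical handler can simply flip that flag. This assumes disableMarkerless is exposed as a public field on the modified TrackingMethodMarkerless component:

using UnityEngine;

// Hypothetical UI hook: wire OnResetClicked to a Button's OnClick event
// to stop markerless tracking via the flag added above.
// `markerlessTracking` is an assumed reference to the modified component.
public class ArbiTrackResetButton : MonoBehaviour
{
    public TrackingMethodMarkerless markerlessTracking;

    public void OnResetClicked()
    {
        markerlessTracking.disableMarkerless = true;
    }
}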

In Unity, how to prevent animations from messing with the movement?

The problem:
I have a character model with a Nav Mesh Agent component. It moves perfectly well to any destination I tell it to move (using the NavMeshAgent.destination property).
But this suddenly fails as soon as I use an animation controller I downloaded from the store. The character won't run to its destination; instead, it endlessly runs around it in circles.
I'm not sure why this happens, but I suppose the running animation somehow cripples the character's ability to turn. The Inspector, in the import settings of the relevant .fbx file, shows: Average Angular Y Speed: 0.0 deg/s.
What I really, really fail to understand is why this keeps happening even though I have explicitly set the NavMeshAgent.updatePosition and NavMeshAgent.updateRotation properties to true. The way I understand the documentation, this should make the character move as the Nav Mesh Agent wants it to move, and not as anything else (animations included) wants it to move.
How should I fix this problem? How should I force the animation not to meddle in the movement?
Do all your animations in place and use code to do the movement: you can uncheck root motion and use state machine values to get better movement, or use root motion and let Mecanim's retargeting engine do the blending. Try both and see which gets you the better result. My guess is that your problem is that your animations are not in place.
First: one of the biggest pluses of Unity is its Mecanim. Disabling root motion negates a big advantage.
Second: the reason your character is running around is probably that the animator and the navmesh agent are issuing conflicting orders. Set updatePosition to false and updateRotation to true; that way, the animator controls how fast you move and the navmesh agent controls the angular speed. Another possible cause is that your destination is unreachable. Check the Y component of the vectors and ensure they are coplanar.