HoloLens 2 + Vuforia ImageTarget: tracked image has a position offset - unity3d

I'm having an issue with a project built in Unity for Hololens 2, using
MRTK 2.7.0
OpenXR 1.0.0
Vuforia 9.8.8
Unity 2020.3.10f
The MRTK is installed using the MRTK Feature Tool. I installed Foundation, Extensions, Tools and Standard Assets.
The Vuforia ImageTargets get a random offset from their expected position.
I have the scene, hand tracking and the general setup for the HoloLens running as intended using the MRTK. I also want to use the image tracking capabilities of Vuforia, but that's where my issue comes in. When starting the app on the HoloLens 2 device, the ImageTargets are detected, but their position has a random offset after each restart of the app. I suspect that the world coordinate system is slightly off. In the Vuforia behaviour, the World Center Mode is predefined as DEVICE and cannot be changed.
When testing in the Editor with a Webcam, everything is working as expected.
The scene is setup similar to the tutorial found at https://arvrjourney.com/hololens-2-marker-tracking-with-vuforia-engine-and-mrtk-fb582c8f8ac0
My scene hierarchy looks as follows; it is similar to the scene setup found in the Vuforia HoloLens 2 samples.
One of the targets has no child; instead it toggles another scene GameObject after detection and then switches over to extended tracking to use the more stable tracking of the HoloLens. That scene GameObject follows the image marker's position, roughly as in the sketch below. To verify the behaviour, there is also a second marker with a default setup and a cube as a child. Both show the same behaviour and the same offset.
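The follower logic is essentially this (simplified; FollowTarget and the marker field are just illustrative names, not from the Vuforia samples):

using UnityEngine;

// Minimal sketch of the "scene GameObject follows the image marker" behaviour.
public class FollowTarget : MonoBehaviour
{
    [SerializeField] private Transform marker;  // transform of the detected ImageTarget

    private void LateUpdate()
    {
        if (marker == null) return;

        // Mirror the marker's world pose onto this scene object.
        transform.SetPositionAndRotation(marker.position, marker.rotation);
    }
}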
I also tried changing the ImageTargets' place in the hierarchy, for example making them children of the Playspace or the Main Camera, to no avail.
TLDR: Vuforia ImageTargets do not work correctly when using OpenXR on the HoloLens 2.
EDIT: I changed from OpenXR back to Windows Mixed Reality, which seems to have eliminated the issue, but I'm really puzzled about it nonetheless, especially since OpenXR is supposed to replace Windows Mixed Reality in the future.

This problem was caused by Vuforia not fully supporting OpenXR. As of version 10.4.4, they have fixed it.
If you, like me, came to this question because you were implementing your own image tracking service and suddenly saw an offset after updating to Unity 2020 and/or OpenXR, you are likely hitting the same problem Vuforia had to fix. I fixed it by changing how I obtain the world origin:
The proper way to obtain the world origin in OpenXR is using:
using Microsoft.MixedReality.OpenXR;
using Windows.Perception.Spatial;  // SpatialCoordinateSystem
using UnityEngine;                 // Pose

var worldOrigin = PerceptionInterop.GetSceneCoordinateSystem(Pose.identity) as SpatialCoordinateSystem;
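If you need to hand that coordinate system to native or interop code, a minimal sketch of a helper could look like this (WorldOriginHelper and its method names are my own, not part of MRTK or the OpenXR plugin; I'm assuming the call can return null before an OpenXR session is running, and using Marshal.GetIUnknownForObject as one common way to turn the WinRT object into a raw pointer):

using System;
using System.Runtime.InteropServices;
using Microsoft.MixedReality.OpenXR;
using UnityEngine;
using Windows.Perception.Spatial;

public static class WorldOriginHelper  // illustrative helper
{
    // Unity's world origin as a WinRT SpatialCoordinateSystem, or null if not available yet.
    public static SpatialCoordinateSystem GetWorldOrigin()
    {
        return PerceptionInterop.GetSceneCoordinateSystem(Pose.identity) as SpatialCoordinateSystem;
    }

    // Some native/interop APIs expect the coordinate system as a raw IUnknown pointer.
    public static IntPtr GetWorldOriginPtr()
    {
        var origin = GetWorldOrigin();
        return origin == null ? IntPtr.Zero : Marshal.GetIUnknownForObject(origin);
    }
}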

Related

Unity Oculus Rift S - Unable to change camera's field of view

In my current project, I am using the Oculus Integration package to interface my app with my Oculus Rift S headset and Unity 2021.3.6f1 URP.
While working on the project I played with the camera settings to get better visuals, and noticed that the field of view attribute always goes back to 90 (even when I set it at run time).
I went over the scripts that were imported from the Oculus package, mainly the following ones (since those are the ones used in the project): OVRCameraRig, OVRManager, OVRHeadsetEmulator.
But I wasn't able to find the cause anywhere in the code. I even tried searching through all the scripts (using my IDE) for any piece of code that changes the fieldOfView property, and found some scripts, but none of them is used in the project, and commenting those lines out made no difference…
So, my question is why can't I change my camera's field of view? What caused it to constantly be set to 90?
It's a bad idea to change the default FOV on a VR camera - it's meant to match the actual FOV of the headset, and most users will experience heavy nausea if you change it more than a few degrees from the correct value. If you want to experience just how bad this feels, place a quad in front of the main camera with an unlit textured material, and write to its texture from another camera with a different FOV. This simple setup will not give you stereoscopy, but it should be enough to experience just how bad of an idea this is.
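A rough sketch of that experiment (FovMismatchDemo and its field names are mine; assign a small quad with an unlit material roughly a metre in front of the main camera, on a layer the second camera does not render to avoid feedback):

using UnityEngine;

// A second camera renders the scene with a deliberately wrong field of view into a
// RenderTexture, which is shown on a quad pinned in front of the tracked VR camera.
public class FovMismatchDemo : MonoBehaviour
{
    public Camera fovCamera;        // duplicate of the main camera, not rendering to the HMD
    public Renderer quadRenderer;   // quad parented roughly 1 m in front of the main camera
    public float fakeFov = 60f;     // anything noticeably different from the headset's FOV

    private void Start()
    {
        var rt = new RenderTexture(1024, 1024, 24);
        fovCamera.fieldOfView = fakeFov;         // deliberately wrong FOV
        fovCamera.targetTexture = rt;            // render off-screen instead of to the headset
        quadRenderer.material.mainTexture = rt;  // show the mismatched image in front of the user
    }
}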

Hololens + Unity: GameObjects are invisible

After I build my Unity project and send it to the Hololens, I have the following problem:
The splash screen appears, followed by a debugging window at the bottom. In the background is a white net (the spatial mesh). However, you can't see any game objects. I've tested a lot but haven't found a solution. Visual Studio does not display any error messages. What I've looked at roughly:
These are my modules. I'm using Unity 2019.4.22f1 and the MRTK Foundation package 2.7.2.
My build settings
My project settings
I tried placing the objects in the middle of the camera's view and changed their colors.
MRTK settings (I haven't changed much from the defaults)
Main camera settings
My scene
When I start the scene I get this error in the console. I don't know if this has anything to do with my problem.
I have two possible solutions (no guarantee):
You could spawn the objects on input directly in front of the camera and add a Debug.Log("object in front of you"); so you can narrow down the issue - see the sketch below.
If this doesn't work, I would try testing different types of materials, like you would with HDRP.
If this does not work either, I probably can't help you out right now.
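A quick test script for the first idea (SpawnInFront is just an illustrative name; swap the keyboard check for whatever MRTK input you use):

using UnityEngine;

// Spawns a small cube right in front of the camera and logs it, to rule out
// placement or scale issues when nothing seems to render.
public class SpawnInFront : MonoBehaviour
{
    private void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))  // replace with your MRTK input of choice
        {
            var cam = Camera.main.transform;
            var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.position = cam.position + cam.forward * 1.5f;
            cube.transform.localScale = Vector3.one * 0.2f;
            Debug.Log("object in front of you");
        }
    }
}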
It seems like your GameObject is far enough away that it is hidden behind the spatial mesh. Please make the spatial mesh invisible by setting the Display Option property of the Spatial Mesh Observer settings to None; this item can be found under the Spatial Awareness profile of the MRTK profile.
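If you prefer to do it from code instead of the profile, something along these lines should work (a sketch only; I'm assuming the CoreServices data-provider helper and mesh observer interface available in recent MRTK 2.x versions - doing it in the profile is enough):

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.SpatialAwareness;
using UnityEngine;

// Runtime equivalent of setting Display Option = None on the Spatial Mesh Observer.
public class HideSpatialMesh : MonoBehaviour
{
    private void Start()
    {
        var observer = CoreServices.GetSpatialAwarenessSystemDataProvider<IMixedRealitySpatialAwarenessMeshObserver>();
        if (observer != null)
        {
            observer.DisplayOption = SpatialAwarenessMeshDisplayOptions.None;
        }
    }
}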

Setting up Hololens MRTK 2.0 with Vuforia in Unity 2019.1

I'm developing a Unity app for the HoloLens 1 that uses Vuforia. Unfortunately, I cannot get the camera to work with Vuforia: it remains frozen in place and does not follow head movement. When I disable Vuforia, the camera tracks fine.
My setup is as follows:
* Windows 10
* Unity 2019.1.4f1
* MRTK v2.0.0 RC2
* Vuforia 8.1.11
I tried following the steps outlined here:
https://github.com/Microsoft/MixedRealityToolkit-Unity/issues/1461#issuecomment-373714387
To no avail. I also tried having both cameras active, with the same result. The Vuforia HoloLens sample that can be found in the Unity Asset Store is severely outdated (it uses the old HoloToolkit, not MRTK), so it is not very useful to me. I noticed that older versions of Vuforia allow the script on the camera to have its World Center Mode set to CAMERA, but this option is now forced to DEVICE when Vuforia is configured for the HoloLens.
Can anyone tell me how to properly configure my scene for MRTK 2 and Vuforia? I'd be eternally grateful for a link to an up to date example project.
EDIT:
This seems to be an issue only when using Unity's Holographic Remoting. I would still very much like to resolve it though, since deploying is very time-consuming and makes debugging almost impossible.
This worked for me:
Import the MRTK package and add it to the scene. This will create a MainCamera under the MixedRealityPlayspace GameObject.
Then GameObject > Vuforia Engine > ARCamera. This will create an ARCamera with two components: Vuforia Behaviour and Default Initialization Error Handler. Copy these two components and add them to the MainCamera created when you added MRTK.
Finally, delete the ARCamera.
I use Windows 10, Unity 2018.4, MRTKv2.0 and Vuforia 8.
Good luck.

Unity gear VR reticle pointer shows double when focusing on close objects

I'm developing a VR app in Unity for the Samsung Gear VR and I'm trying to implement a pointer so the user can interact with the objects in the scene. When you look at distant objects it looks fine, but when you focus on close objects (which is highly needed for the app mechanics) the pointer appears to be duplicated, so you need to center the desired object in the middle of the two points :P
What I've tried
-Using the GvrReticlePointer that comes with the GoogleVR package for cardboard
-Creating my own pointer by adding a canvas to the main camera with an image in the center
-Changing some of the Camera settings like field of view, stereo separation, etc.
-Configure my phone via a QR code http://imgur.com/fVrNrQk
Steps to reproduce (With canvas added to camera)
1.- Create a simple scene with a few objects to look at in Unity
2.- Set build settings for android
3.- Configure player settings to enable "Virtual Reality Supported"
4.- Add Oculus as Virtual Reality SDK
5.- Set package name and minimum API level
6.- Add a canvas to the camera
7.- Add an image to the canvas, a cross will do the job
Observations
I'm using Unity 5.6.0b10 since google cardboard's site recommends using this version for the GoogleVR package. And I'm using the Samsung Gear VR with a Samsung Galaxy S6 edge + phone.
Solved
Apparently this is a well-documented issue called voluntary diplopia, and it's a human limitation, not a software bug (see Unity's documentation, section "The Reticle Interaction in VR").
The problem is trying to put the reticle at a fixed point in the user interface, like traditional 3D games do. When looking at closer objects in VR, this causes the double-vision problem.
The solution is to position the reticle at the point in 3D space the user is looking at: if they're looking at something closer, the reticle is drawn closer. Of course you also have to scale the reticle accordingly, so the user sees it at the same size no matter where they're looking.
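A minimal sketch of that approach (WorldSpaceReticle and its fields are my own names; the reticle is a small quad or sprite that this script repositions every frame):

using UnityEngine;

// Place the reticle at the point the user is looking at and scale it with distance,
// so it keeps a constant apparent size and both eyes converge on it correctly.
public class WorldSpaceReticle : MonoBehaviour
{
    public Transform reticle;
    public float defaultDistance = 10f;   // used when the gaze ray hits nothing
    public float sizePerMeter = 0.02f;    // tune so the reticle looks right at 1 m

    private void LateUpdate()
    {
        Transform cam = Camera.main.transform;
        float distance = defaultDistance;

        RaycastHit hit;
        if (Physics.Raycast(cam.position, cam.forward, out hit))
        {
            distance = hit.distance;
        }

        reticle.position = cam.position + cam.forward * distance;
        reticle.rotation = Quaternion.LookRotation(cam.forward);
        reticle.localScale = Vector3.one * sizePerMeter * distance;  // constant apparent size
    }
}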
Unity also provides some example scripts about this; you can find them in the Asset Store in a package called VR Samples.
Now I have performance issues (I'm working on mobile platforms): sometimes, when you turn your head fast, you can see the reticle where it was drawn before. But it looks way better than the double-reticle version.

Physics messed up with cardboard scene in Unity

I am in the process of putting together an app using the Google Cardboard SDK. The user will be able to use the app with or without a Cardboard viewer, so there is a switch button inside the app that activates and deactivates stereo rendering.
The app also uses the Vuforia SDK to track image targets. If a specific target is recognized, some 3D objects above the target and a particle system starts to emit particles.
Everything works fine in non-stereo mode. Particles are emitted and falling correctly as intended. They should simulate snow. Also if the user turns the image target to an angle, the 3D objects above fall down.
When switching to stereo mode, the physics are messed up completely. The snow particles are not falling anymore; they seem to "teleport" around the screen. Also the 3D objects fall upwards, as if under a really heavy negative gravity. Timescale seems multiplied several times, but it is not - I double-checked that. Gravity also does not change when switching between non-stereo and stereo rendering.
Everything works fine in the Unity Editor in both modes. It only appears on the device, which is an iPhone 5.
Cardboard SDK is version 0.52, which is the newest.
Unity is version 5.3.1.
Vuforia is 5.0.6, which is not the newest, but the release notes do not indicate a fix concerning physics. I will update it anyway as a next step.
Update: Vuforia is now 5.0.10, which is the latest version.
I double checked gravity and timescale, which are not changing when switching between modes. I have a hard time figuring out what might cause the physics to mess up.
EDIT:
I did some further investigation. I made myself a little gizmo that always sits in front of the camera but keeps the rotation of the Unity world-space axes, so I can see how the 3D world is oriented in relation to the camera (roughly the script below). And it turns out that, when in VR mode with the Google Cardboard camera system, the world spins heavily around the camera. I managed to hold the test device in a way that slows it down and almost freezes it, but I have no explanation for the effect yet.
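The gizmo is nothing more than this (WorldAxesGizmo is my own name; attach it to any small axis model and assign the head camera):

using UnityEngine;

// Pins an object in front of the camera while keeping world-space orientation,
// so any unexpected spin of the world relative to the camera becomes visible.
public class WorldAxesGizmo : MonoBehaviour
{
    public Transform cameraTransform;  // the (Cardboard) head camera
    public float distance = 2f;

    private void LateUpdate()
    {
        transform.position = cameraTransform.position + cameraTransform.forward * distance;
        transform.rotation = Quaternion.identity;  // stay aligned with the world axes
    }
}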
I managed to get my setup right again. Unfortunately I did not find the source of the weird behavior, but by deleting the Vuforia prefab and the Cardboard prefab and adding them to the scene again, the problem was solved.