HoloLens 2 gaze cursor not executing buttons - unity3d

We're working on HoloLens 2 and have created our own button design, following an MRTK tutorial.
Sadly, we now cannot execute the buttons using the gaze cursor on HoloLens 2.
We are using our own configuration profile, but the same happens when using the DefaultHoloLens2ConfigurationProfile.
There is also some weird behaviour (valid for both profiles mentioned before): when starting the app the gaze cursor is visible; the moment my hands are recognized, the gaze cursor disappears (all good until now), but when I move my hands behind my back the gaze cursor doesn't reappear.
Does anybody have a similar problem, knows how to solve it or has observed something similar?
We are using:
Unity 2020.3.6f1
MRTK 2.7.0
All XR Packages up to date, except XR Plugin Management 4.0.1
Here are some screenshots of the components our buttons have attached:
Cheers and thanks for the help

The reason is that MRTK is currently designed so that, at a distance, hand rays act as the prioritized focus pointers, and eye gaze is suppressed as a cursor input while hand rays are in use.
If you want to use eye focus and hand rays at the same time, please follow this documentation: Use hand rays and eye-gaze input together. However, in that case, voice commands will be the only way to interact with the hologram being focused on.
Besides, if you want to support a 'look and pinch' interaction, you need to disable the hand rays according to this document: How to support look + hand motions (eye gaze & hand gestures).
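A minimal sketch of that second approach, assuming MRTK 2.x's PointerUtils API (the script name is ours):

using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Turns hand rays off at runtime so eye gaze regains focus priority
// and "look and pinch" can drive the buttons.
public class DisableHandRays : MonoBehaviour
{
    void Start()
    {
        // AlwaysOff suppresses the hand ray pointers for both hands.
        PointerUtils.SetHandRayPointerBehavior(PointerBehavior.AlwaysOff);
    }
}

Attach it to any object in the scene; PointerBehavior.Default restores the standard behavior.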

I filed the following GitHub issue and it's being investigated - "Select" voice command does not fire the appropriate events when using OpenXR on HoloLens 2

Related

HoloLens + Unity: GameObjects are invisible

After I build my Unity project and send it to the Hololens, I have the following problem:
The splash screen appears, followed by a debugging window at the bottom. In the background there is a white net. However, you can't see any game objects. I've tested a lot but haven't found a solution. Visual Studio does not display any error messages. Here is roughly what I've looked at:
These are my modules. I'm using Unity 2019.4.22f1 and the MRTK Foundation Toolkit 2.7.2.
My build settings
My project settings
I tried placing the objects in the middle of the camera's view and changed their colors.
MRTK settings (I haven't changed anything for the most part)
Main camera settings
My scene
When I start the scene I get this error in the console. I don't know if it has anything to do with my problem.
I have two possible solutions (no guarantee):
You could spawn the objects on input directly in front of the camera and add a Debug.Log("object in front of you"); so you can find the issue; see the sketch after this list.
If this doesn't work, I would try testing different types of materials, as you would with HDRP.
If this does not work either, I probably can't help you out for now.
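A quick test script for the first suggestion (a sketch only; the cube, the 1.5 m distance, and the class name are placeholders):

using UnityEngine;

// Debug helper: spawns a small cube 1.5 m in front of the camera on each
// click and logs, so you can verify that objects render at all.
public class SpawnInFront : MonoBehaviour
{
    void Update()
    {
        // Left mouse click in the editor; hook this to your preferred input on device.
        if (Input.GetMouseButtonDown(0))
        {
            var cam = Camera.main.transform;
            var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.position = cam.position + cam.forward * 1.5f;
            cube.transform.localScale = Vector3.one * 0.2f;
            Debug.Log("object in front of you");
        }
    }
}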
It seems like your GameObject is far enough away that it is hidden behind the spatial mesh. Please make the spatial mesh invisible by setting the Display Option property of the Spatial Mesh Observer settings to None; this item can be found under the Spatial Awareness profile of the MRTK profile.
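If you prefer doing that from code, here is a sketch using MRTK 2.x's data provider access (class name ours):

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.SpatialAwareness;
using UnityEngine;

// Sets every spatial mesh observer's display option to None at runtime,
// so the spatial mesh is no longer rendered.
public class HideSpatialMesh : MonoBehaviour
{
    void Start()
    {
        var access = CoreServices.SpatialAwarenessSystem as IMixedRealityDataProviderAccess;
        if (access == null) return;

        foreach (var observer in access.GetDataProviders<IMixedRealitySpatialAwarenessMeshObserver>())
        {
            observer.DisplayOption = SpatialAwarenessMeshDisplayOptions.None;
        }
    }
}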

Hand animation supported by OVRHandPrefab in Unity (Oculus Quest 2)

I am new to Unity and trying to get basic hands working, in terms of being able to see the hands and having them move in accordance with my own hands (preferably using the controllers, which I know have limited ability to detect what the hands are doing).
I configured OVRHandPrefab as shown in this article, but I do not see the hands. I have tried using only my (physical) hands as well, but I still don't see them. I tried disabling hand-tracking support, but that didn't help either.
I've tried all the options for "Hand Tracking Support" in OVRCameraRig,
and I am using the default values for the two OVRHandPrefab objects, except for changing one of them to match the right hand (since the left hand seems to be the default).
I also tried using OVRCustomHandPrefab_L and ..._R, but while I do see the hands, they don't animate at all in accordance with my pressing buttons or triggers. I'm not sure if these prefabs are supposed to animate out of the box, though.
If anyone can suggest any troubleshooting suggestions or any steps where I can get basic animated hand models working, I'd appreciate it.
I'm using Unity 2020.3.18f1.
Use OVRCustomHandPrefab_L and ..._R and click the "automap bones" button under OVR Custom Skeleton for each one.

No hand joints over Holographic Remoting

I use the Holographic Remoting Player to project my Unity UWP program to the HoloLens. I get the Unity picture on the HoloLens, can move around to change the field of view, and the hand laser and air tap work well, but the hand joint visualization doesn't show up.
The player settings are OK, and I have followed the troubleshooting steps in this link; everything checks out, but the hand joints still don't show:
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/Tools/HolographicRemoting.html#msbuildforunity-package-import-via-writing-into-the-packagemanifest
I have tried Unity 2019.2.4 and 2019.4.1, both with the same result. Is there anything misconfigured that I need to check?
According to this document, the hand tracking profile has been updated to allow setting the hand joint visualization to Nothing, Everything, Editor, or Player. This means it is possible to turn hand joint visualization on or off in the editor, on the device, or both. So it is a good idea to double-check this field in your MRTK profile, and the following is worth trying:
Click the MixedRealityToolkit object in the Hierarchy window, then navigate to Input -> Hand Tracking in the Inspector window, find the HandJointVisualizationModes field, and set it to Everything.
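The same thing can be done from code; a sketch against MRTK 2.x, assuming the HandJointVisualizationModes property the profile exposes (script name ours; note that changing a profile at runtime in the editor edits the shared asset):

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Forces hand joint visualization on both in the editor and on device.
public class EnableHandJointViz : MonoBehaviour
{
    void Start()
    {
        var profile = CoreServices.InputSystem?.InputSystemProfile?.HandTrackingProfile;
        if (profile != null)
        {
            profile.HandJointVisualizationModes =
                SupportedApplicationModes.Editor | SupportedApplicationModes.Player;
        }
    }
}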

Listen for GazeInput down event without selecting anything - Google VR Unity

I'm working with the Google VR Unity SDK and I'm trying to create a VR application where users can switch between multiple environments (places). I want them to switch to a different environment just by pressing the Cardboard's magnetic trigger, pointing anywhere. My problem is that every link I've found (like this one) deals with object selection. I've tried adding an Event Trigger to my Main Camera and adding a Mesh Collider to my building, but neither worked.
So, is it possible to listen for the magnetic trigger press in the full scene without having to select an object?
Turns out it's simpler than I thought.
// the Cardboard magnetic trigger maps to the "Fire1" button
if (Input.GetButtonDown("Fire1"))
{
    // some code
}
The thing is, Google VR removed magnetic button support in version 0.9.0, and I was using 1.0.3. So if you want to implement a trigger for the Cardboard's magnetic button, you need to use v0.8.5.
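Put together with the environment switching from the question, a minimal version could look like this (a sketch; the scene name is a placeholder):

using UnityEngine;
using UnityEngine.SceneManagement;

// Loads another environment whenever the Cardboard magnetic trigger fires.
public class AmbientSwitcher : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Fire1"))
        {
            SceneManager.LoadScene("NextAmbient"); // placeholder scene name
        }
    }
}

Remember that the scene has to be added to the Build Settings for LoadScene to find it.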
You could put up a Canvas attached to the camera in World Space, so that it always stays in the line of sight. Add a Button to the Canvas at the location of the gaze input cursor, and you should always hit that when triggering.

Unity3d 5 DirectInput

I have an issue in Unity3d v5 where my joystick does not work as intended. When I plug in the joystick, moving it right from the center gives me values from -1 up to 1.
Keeping it completely centered gives me 1, and moving it left also gives me 1 (so there is no change in value when moving the stick left).
From what I've read, this has to do with Unity using RawInput rather than DirectInput.
I have read a post where someone suggests a registry change to force Unity to use DirectInput, but it does nothing for me in Unity3d v5.
Can anyone please help me? I am completely stuck on this, and getting this joystick to work is essential for my game :)
It's very difficult to find any information about this issue from an official Unity source, but take a look at the answer here: Joystick not working
And follow through the threads referred to there. The upshot of all this is that RawInput is all you can rely on for joystick input on Windows. The best solution to this problem I can think of is runtime calibration in your game/app. If you don't care to implement that yourself, there are 3rd-party options, such as this one, that integrate with Unity: cInput 2
I am running into this same problem with my Unity projects, and my plan is simply to present joystick users with the option to calibrate. You could save the calibration settings so the user doesn't have to calibrate every time the game is run. Not ideal, but again, it's the most realistic solution I can come up with; a sketch follows.
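What that runtime calibration could look like (a sketch; the axis name, PlayerPrefs keys, and class name are all placeholders):

using UnityEngine;

// Minimal joystick calibration: call the Calibrate* methods while the user
// holds the stick at rest / full left / full right, then read
// GetCalibratedAxis() in gameplay. Values persist via PlayerPrefs.
public class JoystickCalibration : MonoBehaviour
{
    float center, min, max;

    void Start()
    {
        center = PlayerPrefs.GetFloat("joyCenter", 0f);
        min = PlayerPrefs.GetFloat("joyMin", -1f);
        max = PlayerPrefs.GetFloat("joyMax", 1f);
    }

    public void CalibrateCenter() { center = Input.GetAxisRaw("Horizontal"); Save(); }
    public void CalibrateLeft()   { min = Input.GetAxisRaw("Horizontal"); Save(); }
    public void CalibrateRight()  { max = Input.GetAxisRaw("Horizontal"); Save(); }

    // Remaps the raw reading to a clean -1..1 range around the measured center.
    public float GetCalibratedAxis()
    {
        float raw = Input.GetAxisRaw("Horizontal");
        return raw >= center
            ? Mathf.InverseLerp(center, max, raw)
            : -Mathf.InverseLerp(center, min, raw);
    }

    void Save()
    {
        PlayerPrefs.SetFloat("joyCenter", center);
        PlayerPrefs.SetFloat("joyMin", min);
        PlayerPrefs.SetFloat("joyMax", max);
        PlayerPrefs.Save();
    }
}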