Hand animation supported by OVRHandPrefab in Unity (Oculus Quest 2) - unity3d

I am new to Unity and am trying to get basic hands working: I want to see the hand models and have them move in accordance with my own hands (preferably using the controllers, which I know can only detect a limited range of what my hands are doing).
I configured OVRHandPrefab as shown in this article, but I do not see the hands. I have also tried using only my (physical) hands, but the hands still don't appear. Disabling hand-tracking support didn't help either.
I've tried all the options under "Hand Tracking Support" on the OVRCameraRig, and I'm using the default values for the two OVRHandPrefab objects, except for changing one of them to the right hand (the left hand seems to be the default).
I also tried the OVRCustomHandPrefab_L and ..._R prefabs; while I do see those hands, they don't animate at all when I press buttons or triggers. I'm not sure whether these prefabs are supposed to animate out of the box, though.
If anyone can suggest any troubleshooting steps for getting basic animated hand models working, I'd appreciate it.
I'm using Unity 2020.3.18f1.

Use the OVRCustomHandPrefab_L and ..._R prefabs and click the "Automap Bones" button under the OVR Custom Skeleton component on each one.
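If the hands still don't show up or animate after that, a quick sanity check is to confirm that hand-tracking data is actually reaching the prefabs. Below is a minimal troubleshooting sketch assuming the Oculus Integration package; the property names (IsTracked, IsDataValid, IsInitialized) are the OVRHand/OVRSkeleton members as I recall them for that package, so verify them against your version.

```csharp
using UnityEngine;

// Minimal troubleshooting sketch (assumes the Oculus Integration package):
// attach this to each hand prefab to log whether the hand is actually being
// tracked and whether the skeleton has initialized.
[RequireComponent(typeof(OVRHand))]
[RequireComponent(typeof(OVRSkeleton))]
public class HandTrackingDebug : MonoBehaviour
{
    OVRHand hand;
    OVRSkeleton skeleton;

    void Awake()
    {
        hand = GetComponent<OVRHand>();
        skeleton = GetComponent<OVRSkeleton>();
    }

    void Update()
    {
        // If IsTracked never becomes true, hand-tracking data is not reaching
        // the app; if the skeleton never initializes, the bones cannot animate.
        Debug.Log($"{name}: tracked={hand.IsTracked} dataValid={hand.IsDataValid} " +
                  $"skeletonReady={skeleton.IsInitialized}");
    }
}
```

If IsTracked never becomes true, the headset isn't delivering hand-tracking data to the app at all, so it's worth re-checking both the device's hand-tracking setting and the "Hand Tracking Support" option mentioned in the question.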

Related

How to modify behaviour of a VR controller's pointer

My university colleagues and I are developing a Virtual Reality project for university, using an Oculus headset, where we create a scene in which you can select, click, and drag different objects with a mouse-like controller. You are supposed to stay stationary and move one of the controllers as if it were a mouse. However, we want to modify the behaviour of the controller's pointer to better fit the 3D environment: when no object is selected, we want to interpolate the depth of the cursor based on the depth of the nearest objects. There is a paper we were shown in class that we are supposed to draw inspiration from; it achieved this kind of cursor behaviour with a normal mouse, but I can't find any information on how they did it. Our final goal is to compare both ways of managing the scene and assess which one is better. We are using Unity with VRTK, as suggested by our professor, but we can't seem to find or access the code that controls how the pointer moves or behaves, and we are kind of lost on where to go from here. Could someone help with this?
Here is the paper where they talk about it:
https://dl.acm.org/doi/pdf/10.1145/3491102.3501884
So far we have tried creating a simple scene and adding objects with different behaviours, as well as a controller instance, but we only seem to be able to modify the pointer's events and not its actual movement behaviour.
Kind regards and thanks
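Not an answer to the VRTK specifics, but the core depth-interpolation idea described in the question can be prototyped in plain Unity without touching VRTK's pointer internals. A rough sketch, with made-up names (DepthAdaptiveCursor, probeRadius, and so on), that eases the cursor's depth toward the nearest object under the controller's ray:

```csharp
using UnityEngine;

// Sketch of a depth-adaptive cursor: cast a sphere along the controller's
// forward direction and smoothly interpolate the cursor's distance toward
// the nearest hit. This is illustrative only and is not VRTK API.
public class DepthAdaptiveCursor : MonoBehaviour
{
    public Transform controller;      // the tracked controller transform
    public Transform cursor;          // a small visual marker (e.g. a sphere)
    public float probeRadius = 0.05f; // how "thick" the probe ray is
    public float maxDistance = 10f;   // cursor distance when nothing is nearby
    public float smoothing = 10f;     // higher = snappier depth changes

    float currentDepth;

    void Update()
    {
        // Target depth = distance to the nearest collider under the probe,
        // or the maximum distance if nothing is hit.
        float targetDepth = maxDistance;
        if (Physics.SphereCast(controller.position, probeRadius,
                               controller.forward, out RaycastHit hit, maxDistance))
        {
            targetDepth = hit.distance;
        }

        // Ease the cursor's depth toward nearby objects instead of letting it
        // jump when the ray enters or leaves geometry.
        currentDepth = Mathf.Lerp(currentDepth, targetDepth, smoothing * Time.deltaTime);
        cursor.position = controller.position + controller.forward * currentDepth;
    }
}
```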

Unity Tilemap: Level-specific data on tiles in palette

I've just started my first Unity project by porting a match-3 style game to Unity, and I want to be able to attach data to each available tile type specifying how likely that tile is to be dropped onto the board (and to be able to override that value for each level). There are a few match-3 tutorials floating around, but besides a tendency not to explain how to make things happen in Unity, I haven't found any that leverage the Tilemap, which seems a shame since it looks like it provides a fair bit of the necessary functionality. The only problem is that I don't see anywhere to attach data or a script to a base Tile, so I've been looking into whether the Scriptable Tile / custom editor route would get me what I need; but then I run into the lack of instructions on how to get basic functionality into the custom editor (the ability to specify the underlying sprite seems rather fundamental, but I'm not finding anything).
Anyway, is the Scriptable Tile interface my best bet to get level-specific data attached to individual tiles, or should I be looking for a different way to get this data on there? Or should I just ditch the Tilemap and the functionality it does include out of the box altogether like all the tutorials seem to be doing?
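For what it's worth, attaching data per tile type does not necessarily require a custom editor: since Tile is already a ScriptableObject that knows how to render its sprite, one possible approach is simply to subclass it and add extra serialized fields. A minimal sketch under that assumption (names such as WeightedTile, dropWeight, and LevelDropTable are made up for illustration):

```csharp
using UnityEngine;
using UnityEngine.Tilemaps;

// One possible way to attach data to a tile type: subclass Tile (which already
// handles the sprite) and add the drop probability as a plain serialized field.
[CreateAssetMenu(menuName = "Tiles/Weighted Tile")]
public class WeightedTile : Tile
{
    [Tooltip("Relative likelihood of this tile being dropped onto the board.")]
    public float dropWeight = 1f;
}

// Per-level overrides can then live in a separate asset rather than on the
// palette tile itself, e.g. a list of (tile, weight) pairs per level.
[CreateAssetMenu(menuName = "Tiles/Level Drop Table")]
public class LevelDropTable : ScriptableObject
{
    [System.Serializable]
    public struct Entry
    {
        public WeightedTile tile;
        public float weightOverride;
    }

    public Entry[] entries;
}
```

The palette would then use WeightedTile assets directly, and a level's drop table can override the default weights without touching the tiles in the palette.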

Unable to add interactions to oculus handtracking in Unity

I'm using the Oculus Integration package to implement hand tracking and have added the OVRCameraRig. Under the LeftHandAnchor I have added OVRCustomHandPrefab_L, and I've done the same for the right hand. I have also enabled physics, so the hands work and I can push items off the table in game.
However, I want to implement a grabbing mechanism, so I added the HandsManager prefab to the scene and assigned the above-mentioned hand prefabs to the left and right hand fields. Finally, I added the InteractableToolsSDKDriver, but none of the interactable tools are working; the little blue dots aren't visible at all.
I'm not getting any errors either. How should I fix this to get the interactable tools to work?
That's probably because you haven't added the "XR Grab Interactable" component to the item that you want to grab.
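Note that "XR Grab Interactable" is a component from Unity's XR Interaction Toolkit rather than the Oculus Interactable Tools mentioned in the question, so this only applies if you go the XR Interaction Toolkit route. Under that assumption, a rough sketch of the minimum a grabbable object needs (the scene also needs an XR Interaction Manager and interactors on the hands or controllers); the helper class name MakeGrabbable is made up, and normally you would just add these components in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative helper: ensures a grabbable object has a Collider, a Rigidbody
// and an XRGrabInteractable at runtime. Purely a sketch of the required setup.
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        if (GetComponent<Rigidbody>() == null)
            gameObject.AddComponent<Rigidbody>();

        if (GetComponent<XRGrabInteractable>() == null)
            gameObject.AddComponent<XRGrabInteractable>();
    }
}
```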

Hololens 2 Gaze cursor not executing buttons

We're working on HoloLens 2 and have created our own button design, following an MRTK tutorial.
Sadly, we cannot activate the buttons using the gaze cursor on HoloLens 2.
We are using our own configuration profile, but the same happens when using the DefaultHoloLens2ConfigurationProfile.
There is also a weird behaviour (valid for both profiles mentioned before): when starting the app the gaze cursor is visible; the moment my hands are recognized, the gaze cursor disappears (all good until now), but when I move my hands behind my back the gaze cursor doesn't reappear.
Has anybody had a similar problem, does anybody know how to solve it, or has anybody observed something similar?
We are using:
Unity 2020.3.6f1
MRTK 2.7.0
All XR Packages up to date, except XR Plugin Management 4.0.1
Here are some screenshots showing which components our buttons have attached:
Cheers and thanks for the help
The reason is that MRTK is currently designed so that, at a distance, hand rays act as the prioritized focus pointers, so eye gaze is suppressed as a cursor input while hand rays are in use.
If you want to use eye focus and hand rays at the same time, please follow this documentation: Use hand rays and eye-gaze input together. However, in that setup, voice commands are the only way to interact with the hologram being focused on.
Besides, if you want to support a 'look and pinch' interaction, you need to disable the hand ray, as described in this document: How to support look + hand motions (eye gaze & hand gestures)
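As an illustration of the "disable the hand ray" step, here is a minimal sketch assuming MRTK 2.7; PointerUtils.SetHandRayPointerBehavior is the MRTK 2.x call as I recall it, so check it against the linked documentation for your exact version:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Sketch for a "look + pinch" setup: turn hand rays off so that gaze
// (head or eye) remains the focus pointer instead of being suppressed.
public class DisableHandRays : MonoBehaviour
{
    void Start()
    {
        // Suppress hand rays for both hands; gaze keeps driving focus.
        PointerUtils.SetHandRayPointerBehavior(PointerBehavior.AlwaysOff, Handedness.Any);
    }
}
```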
I filed the following GitHub issue, and it is being investigated: "Select" voice command does not fire the appropriate events when using OpenXR on HoloLens 2

Unity3d 5 Direct Input

I have an issue in Unity3D v5 where my joystick does not work as intended. When I plug in the joystick, moving it right from the center gives me values from -1 up to 1.
Keeping it completely centered gives me 1, and moving it left also gives me 1 (so there is no change in value when moving the stick left).
From what I've read, this has to do with Unity using RawInput and not DirectInput.
I have read a post where someone suggests a registry change to force Unity to use DirectInput, but it does nothing for me on Unity3D v5.
Can anyone please help me? I am completely stuck on this, and getting this joystick to work is essential for my game :)
It's very difficult to find any information about this issue from an official Unity source, but take a look at the answer here: Joystick not working
And follow through the threads referred to there. The upshot of all this is that RawInput is all you can rely on for joystick input on Windows. The best solution to this problem I can think of is runtime calibration in your game/app. If you don't care to implement that yourself, there are third-party options, such as this one, that integrate with Unity: cInput 2
I am running into this same problem with my Unity projects, and my plan is simply to present joystick users with the option to calibrate. You could save the calibration settings so the user doesn't have to calibrate every time the game is run. Not ideal, but again the most realistic solution to the problem I can come up with.
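For reference, here is a rough sketch of what such runtime calibration could look like, using only Unity's standard input calls; the axis name "Horizontal", the Space key for confirming each step, and the remapping scheme are all just illustrative choices.

```csharp
using UnityEngine;

// Runtime calibration sketch: sample the raw axis value at "centered",
// "full left" and "full right" prompts, then remap subsequent raw readings
// into a proper -1..1 range.
public class JoystickCalibration : MonoBehaviour
{
    float center = 0f, left = -1f, right = 1f;
    int step; // 0 = capture center, 1 = capture left, 2 = capture right, 3 = done

    void Update()
    {
        float raw = Input.GetAxisRaw("Horizontal");

        // Advance the calibration with a key press, capturing the raw value
        // the stick reports at each requested position.
        if (step < 3 && Input.GetKeyDown(KeyCode.Space))
        {
            if (step == 0) center = raw;
            else if (step == 1) left = raw;
            else right = raw;
            step++;
        }

        if (step == 3)
        {
            Debug.Log($"Calibrated horizontal = {Remap(raw)}");
        }
    }

    // Map the raw reading back onto -1..1 using the captured reference points.
    float Remap(float raw)
    {
        if (raw >= center)
            return Mathf.Approximately(right, center) ? 0f
                 : Mathf.Clamp((raw - center) / (right - center), 0f, 1f);
        return Mathf.Approximately(left, center) ? 0f
             : -Mathf.Clamp((raw - center) / (left - center), 0f, 1f);
    }
}
```

Persisting the captured center/left/right values (for example with PlayerPrefs) would let the user skip recalibration on later runs, as suggested above.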