Unable to add interactions to Oculus hand tracking in Unity

I'm using the Oculus Integration package to implement hand tracking and have added the OVRCameraRig. Under the LeftHandAnchor I have added OVRCustomHandPrefab_L, and I have done the same for the right hand. I have also enabled physics, so the hands work and I can push items off the table in game.
However, I want to implement a grabbing mechanism, so I added the HandsManager prefab to the scene and assigned the above-mentioned hand prefabs to its left and right hand fields. Finally, I added the InteractableToolsSDKDriver, but none of the interactable tools are working; the little blue dots aren't visible at all.
I'm not getting any errors either. How should I fix this to get the interactable tools to work?

That is probably because you haven't added the XR Grab Interactable component to the item you want to grab.
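For reference, here is a minimal sketch of what that looks like in code, assuming the XR Interaction Toolkit package is installed (the component and namespace names come from that package, not from the question):

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Sketch: make an object grabbable, the script equivalent of adding
    // the "XR Grab Interactable" component in the Inspector.
    public class MakeGrabbable : MonoBehaviour
    {
        void Awake()
        {
            // XRGrabInteractable needs a Collider; it adds a Rigidbody
            // automatically if one is missing.
            if (GetComponent<Collider>() == null)
                gameObject.AddComponent<BoxCollider>();

            gameObject.AddComponent<XRGrabInteractable>();
        }
    }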

Related

Hololens + Unity: GameObjects are invisible

After I build my Unity project and deploy it to the HoloLens, I have the following problem:
The splash screen appears, followed by a debugging window at the bottom. In the background is a white net. However, you can't see any game objects. I've tested a lot but haven't found a solution. Visual Studio does not display any error messages. Here is roughly what I've looked at:
These are my modules. I'm using Unity 2019.4.22f1 and the MRTK Foundation Toolkit 2.7.2.
My build settings
My project settings
I tried placing the objects in the middle of the camera's view and changed the colors.
MRTK settings (I mostly haven't changed anything)
Main camera settings
My scene
When I start the scene I get this error in the console. I don't know if this has anything to do with my problem.
I have two possible solutions (no guarantee):
You could spawn the objects on input directly in front of the camera, and add a Debug.Log("object in front of you"); so you can pin down the issue. A sketch of this is shown below.
If that doesn't work, I would try testing different types of materials, as you do with HDRP.
If neither works, I probably can't help you out for now.
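A rough sketch of the first suggestion (the names here are placeholders, not from the question):

    using UnityEngine;

    // Sketch of suggestion 1: spawn the missing object directly in front
    // of the camera and log, so you can tell whether it renders at all.
    public class SpawnInFrontOfCamera : MonoBehaviour
    {
        public GameObject prefab; // placeholder for the invisible object

        void Update()
        {
            // Stand-in for whatever input event you use on the device.
            if (Input.GetKeyDown(KeyCode.Space))
            {
                Transform cam = Camera.main.transform;
                Instantiate(prefab, cam.position + cam.forward * 1.5f,
                            Quaternion.identity);
                Debug.Log("object in front of you");
            }
        }
    }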
It seems like your GameObject may be far enough away that it is hidden behind the spatial mesh. Make the spatial mesh invisible by setting the Display Option property of the Spatial Mesh Observer settings to None; this item can be found under the Spatial Awareness profile of the MRTK profile.
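If you prefer doing this from code rather than through the profile Inspector, something like the following should work (a sketch based on MRTK's documented spatial awareness API; verify the names against your MRTK version):

    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.SpatialAwareness;
    using UnityEngine;

    // Sketch: hide the spatial mesh at runtime, the script equivalent of
    // setting Display Option to None on the Spatial Mesh Observer.
    public class HideSpatialMesh : MonoBehaviour
    {
        void Start()
        {
            var observer = CoreServices
                .GetSpatialAwarenessSystemDataProvider<IMixedRealitySpatialAwarenessMeshObserver>();
            if (observer != null)
                observer.DisplayOption = SpatialAwarenessMeshDisplayOptions.None;
        }
    }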

Hand animation supported by OVRHandPrefab in Unity (Oculus Quest 2)

I am new to Unity and am trying to get basic hands working, in terms of being able to see the hands and having them move in accordance with my own hands (preferably using the controllers, which I know have limited ability to detect what the hands are doing).
I configured OVRHandPrefab as shown in this article, but I do not see the hands. I have tried using only my (physical) hands as well, but I still don't see the hands. I tried disabling hand-tracking support, but that didn't help either.
I've tried all the options in "Hand Tracking Support" in OVRCameraRig, and am using the default values for the two OVRHandPrefab objects except for changing one of them to match the right hand (since the left hand seems to be the default).
I also tried using OVRCustomHandPrefab_L and ..._R, but while I do see the hands, they don't animate at all when I press buttons or triggers. I'm not sure if these prefabs are supposed to animate out of the box, though.
If anyone can suggest any troubleshooting steps for getting basic animated hand models working, I'd appreciate it.
I'm using Unity 2020.3.18f1.
Use OVRCustomHandPrefab_L and ..._R and click the "Automap Bones" button under OVR Custom Skeleton for each one.
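As a quick sanity check that tracking data is actually reaching the prefabs, a small diagnostic script along these lines can help (a sketch; it assumes the Oculus Integration OVRHand component sits on the same object):

    using UnityEngine;

    // Diagnostic sketch: logs whether hand-tracking data is arriving.
    // Attach next to the OVRHand component on each hand prefab.
    public class HandTrackingDebug : MonoBehaviour
    {
        OVRHand hand;

        void Awake()
        {
            hand = GetComponent<OVRHand>();
        }

        void Update()
        {
            Debug.Log("IsTracked=" + hand.IsTracked +
                      " Confidence=" + hand.HandConfidence +
                      " DataValid=" + hand.IsDataValid);
        }
    }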

No hand joints over Holographic Remoting

I use the Holographic Remoting Player to project my Unity UWP program to the HoloLens. I get the Unity picture on the HoloLens, can move around to move the field of view, and the hand laser and air tap work well, but the hand joint visualization doesn't show up.
The player settings are OK, and I have followed the troubleshooting steps at the link below; everything checks out, but the hand joints still don't show.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/Tools/HolographicRemoting.html#msbuildforunity-package-import-via-writing-into-the-packagemanifest
I have tried Unity 2019.2.4 and 2019.4.1, both with the same result. Is there anything misconfigured that I need to check?
According to this document, the hand tracking profile has been updated to allow setting the hand joint visualization to: Nothing, Everything, Editor, or Player. This means it is possible to turn hand joint visualizations on or off in the editor, on the device, or both. So it might be a good idea to double-check this field in your MRTK profile; the following is worth trying:
Click the MixedRealityToolkit object in the Hierarchy window, then navigate to Input -> Hand Tracking in the Inspector window, find the Hand Joint Visualization Modes field, and set it to Everything.
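If you would rather set this from a script, the rough equivalent is below (a sketch against the MRTK 2.x profile API; verify the property names against your version, and note that "Everything" in the Inspector corresponds to all flags set):

    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Sketch: enable hand joint visualization in both editor and player,
    // the script equivalent of setting the field to Everything.
    public class EnableHandJointVisualization : MonoBehaviour
    {
        void Start()
        {
            var profile = CoreServices.InputSystem?.InputSystemProfile?.HandTrackingProfile;
            if (profile != null)
                profile.HandJointVisualizationModes =
                    SupportedApplicationModes.Editor | SupportedApplicationModes.Player;
        }
    }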

Cannot Animate Interactable Gameobject using Mixed Reality Toolkit

I am a bit of a novice with the Unity engine and mixed reality app development, so please bear with me.
I have been working with the Microsoft Mixed Reality Toolkit for Unity to try to animate a game object and move it to the side. It is a simple action, very similar to an example scene provided by Microsoft with the toolkit called "InteractableObject" (information links provided below):
Interactable Object - Mixed Reality (Microsoft Docs)
Mixed Reality Toolkit-Unity Interactable Objects and Receivers (Github)
This example scene in Unity has multiple objects to be used as "buttons". With the Mixed Reality Toolkit, even an object that you want the user to interact with to perform some action when selected is considered a button, at least according to the documentation I have been able to find on the subject. This is a series of screenshots depicting the Inspector panels for my GameObject and the container for my object:
GameObject Inspector Panel
GameObject Container Inspector Panel (Part 1)
GameObject Container Inspector Panel (Part 2)
I am trying to make a single game object move to the side when I place the standard cursor on it. The same action is done with a balloon object in the example scene I mentioned. I have created the animator and the state machine the same way they did in their example, and I have set up my game object in an almost identical format. The only real difference is that they created the balloon object themselves, while I am using a different set of custom models from my company.
When I play the app in the Unity editor, the state does not change when I place the cursor on the object. I can force the state to change using the editor, and the required animation plays, but the state will not change on its own. I configured my state machine the same as the Microsoft example and set up my state variable the same way as well. It should move from an "Observation" state to a "Targeted" or "ObservationTargeted" state when the cursor moves onto the object. A screenshot of the GameObject state machine and the Inspector panel for the specific transition in question are provided below:
GameObject Animator State Machine Setup
Observation to ObservationTargeted Transition Inspector Panel
I went through and verified that all the components added by the Mixed Reality Toolkit are the same, and they are. This includes the DefaultCursor, InputManager, MixedRealityCameraParent, and Directional Light. I also checked that all the scripts were coded the same, and they are. I am running out of places to look. I attached the Visual Studio debugger to the project in Unity and verified that the state just isn't changing on its own, but I cannot figure out why. I believe the problem has something to do with the setup of the transition, but I haven't been able to find the issue. All of the other mentioned components are provided by Microsoft and are not changed by me, nor are they changed in the sample scene.
If anyone else has had a similar problem or may know where I can look to find the problem, please let me know. I haven't even built the project into a UWP application yet.
I know it's been a few months, but are you still looking for a solution?
With the newest version of the Mixed Reality Toolkit you can make any GameObject act as a button. Simply read this documentation. I have some cubes as buttons in my Unity project, and the only extra component I had to add to make them work was Interactable, which comes with the Mixed Reality Toolkit.
If you want to trigger an animation when you place the cursor on the object (or look at it, if you're going to use it with the HoloLens), you can add it to the Interactable object by adding a new event (for example, an OnFocus() event).
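If you prefer wiring this up from a script instead of through the Interactable events, a focus handler sketch like the following works with MRTK's input system ("Targeted" is a hypothetical Animator parameter standing in for the transition described above):

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Sketch: drive the Animator from MRTK focus events instead of the
    // Interactable component's event list.
    public class FocusAnimationTrigger : MonoBehaviour, IMixedRealityFocusHandler
    {
        [SerializeField] Animator animator;

        public void OnFocusEnter(FocusEventData eventData)
        {
            animator.SetBool("Targeted", true);
        }

        public void OnFocusExit(FocusEventData eventData)
        {
            animator.SetBool("Targeted", false);
        }
    }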
Hope this helps in any way.

Unable to use Unity UI elements on my Android

I just created a very simple 3D game, and I used Unity UI for the first time.
Inside the game editor I can click the buttons and move the sliders.
I built the game for my Android device, and I can see all the UI elements on it; however, for some reason, I just can't use them.
Can you please help me and explain what I am doing wrong? Is it a bug?
I found the problem.
The Canvas element comes with an "EventSystem" object. All I had to do was check the checkbox called "Allow Activation On Mobile Device"...
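For anyone who prefers to set this from code, the rough equivalent on the EventSystem's input module is below (a sketch; in newer Unity versions the checkbox is called "Force Module Active", exposed as forceModuleActive):

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch: force the standalone input module to stay active on mobile,
    // the script equivalent of the checkbox on the EventSystem object.
    public class ForceModuleActiveOnMobile : MonoBehaviour
    {
        void Start()
        {
            var module = GetComponent<StandaloneInputModule>();
            if (module != null)
                module.forceModuleActive = true;
        }
    }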