Oculus Integration custom hand movements in my VR Unity project not working on my system

I am working in Unity on creating an application for the Quest VR system (Quest 2) and have added the Oculus Integration package to my project (while following an online course).
When I try to use any of the custom hands provided in the package, or run any of the example scenes that use them, the finger movements follow my controller grip presses (hand trigger) perfectly, making a semi-fist with the bottom three fingers, but the index trigger presses do not seem to cause any animation. Neither does a thumb press.
I have added a crude log to show that the triggers are being picked up by the system, but the animation is not showing.
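For context, the crude log is just something along these lines (the component name is a placeholder; the OVRInput calls are the standard ones from the Oculus Integration package):

```csharp
using UnityEngine;

// Debug-only component: logs the raw controller input every frame so I can
// confirm the trigger and thumb values are actually reaching the app.
public class TriggerDebugLog : MonoBehaviour
{
    void Update()
    {
        float index = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
        float grip  = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger,  OVRInput.Controller.RTouch);
        bool thumb  = OVRInput.Get(OVRInput.Touch.PrimaryThumbRest,     OVRInput.Controller.RTouch);

        Debug.Log($"index={index:F2} grip={grip:F2} thumbRestTouched={thumb}");
    }
}
```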
What makes this very vexing is that I shared my project with the course provider I was following, and it worked perfectly for him.
So the problem is somewhere on my system.
Implementing the same thing from scratch using the XR Interaction Toolkit works perfectly, by the way, but the Oculus-provided system does not.
If anyone has any ideas on where to even begin looking for the problem, I would appreciate it greatly.
Happy to share system details but not even sure which details would be relevant :)

Related

Touch is not working with Unity UI Toolkit buttons

I'm having a weird issue; it may be a simple fix.
I've got a UI-only "game" using the new UI Toolkit. It's a little drawing program of sorts. I've got a draw area in the middle with "tool buttons" on the sides. Everything works fine with mouse, pen, and touch when drawing (using scripts I can access all types of pointers), but for some reason touch doesn't work with the UI buttons only.
What's even weirder is that touch on the UI buttons works when testing directly in Unity Play mode (I've got a touch-screen laptop), but doesn't work when I make a build.
In my Project Settings -> Input System Package, I've got Pen, Mouse, and Touchscreen active under "Supported Devices".
The UI Toolkit is so new that there's no help or similar issues I can find online.
If it's still relevant:
I had the same issue and used "Standalone Input Module" instead of "Input System UI Input Module" in the EventSystem. It says it's the old option, but it works for me :D
I added the touch screen here and it works now.
Just a follow-up, since I ended up finding my answer somewhere else.
In the "Input System UI Input Module" component on the EventSystem, I changed "Pointer Behavior" to "Single Unified Pointer" and that fixed it. Not sure if that's just a workaround, but it works great now.

Having an issue where the on-screen stick component on unity doesnt get dragged anymore

I was working on a game of mine, and the stick was working initially, but in the latest build the stick doesn't move anymore. I can't tell whether it isn't getting clicked or there is an issue with the component. I'm fairly new to Unity, please help!
Found the solution myself: I had accidentally changed the action asset in the EventSystem to a non-default one that didn't have the button interactions.
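For anyone who wants to catch this from code, a rough sketch that checks (and, if needed, re-assigns) the action asset on the EventSystem's UI input module; the component and field names here are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.UI;

// Warns if the EventSystem's UI input module is not using the expected
// action asset (e.g. the default UI actions) and swaps it back in.
public class UiActionsCheck : MonoBehaviour
{
    [SerializeField] InputActionAsset expectedActions; // assign the default UI actions asset here

    void Awake()
    {
        var module = GetComponent<InputSystemUIInputModule>();
        if (module != null && module.actionsAsset != expectedActions)
        {
            Debug.LogWarning($"EventSystem uses '{module.actionsAsset?.name}', expected '{expectedActions?.name}'");
            module.actionsAsset = expectedActions;
        }
    }
}
```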

Oculus Quest shows the wrong controller

When exporting a demo project (sphere + OVR controller and avatar) to Oculus Quest, instead of the Oculus Quest controllers, the scene insists on showing me the Oculus Go controller.
I use Unity 2018.3.14 and 2019.1.9, Oculus Integration v1.35 and v1.38.
Windows 10.
On Oculus Rift, the whole scene works perfectly.
Among the issues this causes:
1. controller movement is very limited
2. only one hand is shown at a time
3. the trigger does not execute the scripts attached to the event.
I followed the proper configuration of the Oculus scene shown here:
https://www.youtube.com/watch?v=qiJpjnzW-mw&t=1s
In OVRCameraRig -> Target Devices I tried all the options (Quest, Gear+Go, and both), but generally made sure it's on Quest.
https://www.dropbox.com/s/chbhpvz5u5fv9b2/oculus%20state.PNG?dl=0
(is there another place where the controller should be set?)
I made sure the right controller is chosen in the models prefab
https://www.dropbox.com/s/ejof63acjlb491z/oculus%20prefabs.PNG?dl=0
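One more thing that can help narrow this down is logging which controller type the runtime itself reports; a minimal sketch using the standard OVRInput calls (the component name is a placeholder):

```csharp
using UnityEngine;

// Logs which controller the Oculus runtime considers active/connected,
// to see whether the app is being told "Go controller" or "Touch".
public class ControllerTypeLog : MonoBehaviour
{
    void Update()
    {
        Debug.Log($"active={OVRInput.GetActiveController()} connected={OVRInput.GetConnectedControllers()}");
    }
}
```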
I tried to update the integration to v1.39 (it only got worse, both controllers became invisible, but according to the Oculus forum that's another problem).
I tried different Unity versions.
I tried to factory reset the device.
I tested Beat Saber to be certain that the controllers work just fine in other apps.
Has anyone encountered a similar issue?
I had the same issue but resolved it by following these steps.
In the Unity menu, click: Oculus > Tools > Remove AndroidManifest.xml
In the Unity menu, click: Oculus > Tools > Create store-compatible AndroidManifest.xml
In the Unity window, open: Assets > Plugins > Android > AndroidManifest.xml.
Make sure AndroidManifest.xml has
<category android:name="android.intent.category.LAUNCHER" />
but not
<category android:name="android.intent.category.INFO" />.
Try Build and Run.
Good Luck.

Intercept Vive controller input?

I'm building an OpenVR app for SteamVR to assist with seated play (my room is small, so my tracking area isn't ideal). My app pretty much just adjusts the play-area height when I hold the grip button and "scroll" on the touchpad, so that I can reach objects that are too low/high at variable heights. (I tried "OpenVR Advanced Settings", but its options for keybinding are limited to simple button presses, so I decided to make my own version.)
I'd like to prevent touchpad input from being sent to the game while the grip button is held, so that moving on the touchpad doesn't cause movement in the game. Is this possible at all?
I'm assuming it's not possible, but I'm wondering whether anyone has had any experience with this.
After your clarification in the comments, the answer is no, you cannot "eat up" device inputs in an application. I usually work on OpenVR drivers, and there, after you submit a device input and/or any other event, it is available to anything that expects pose update events, and event subscribers cannot stop others from receiving those events.
However, there might be a workaround (if it's still an issue). I know of at least one application that can do what you want, and that application is OVR Toolkit: when the overlay is active and you try to click something in the overlay, the game running in parallel will not receive the input. However, that only happens if the OVR Toolkit overlay surface receives input; it may be a built-in OpenVR overlay feature where you don't have to do anything, or it may be defined by the developer (I don't really want to test this right now).
Sadly, OVR Toolkit is not open source, but there is an open-source toolkit for Unity for making overlays, which might be the solution you're looking for; it can be found here.
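If you do go down the overlay route, creating a bare overlay with the OpenVR C# bindings looks roughly like this (a sketch, assuming the standard openvr_api.cs bindings; the key/name strings are placeholders, and how input focus is routed to overlays is something you would still need to verify):

```csharp
using Valve.VR;

// Minimal sketch: initialise OpenVR as an overlay app, then create and show an overlay.
public static class OverlayBootstrap
{
    public static ulong CreateAndShow()
    {
        var initError = EVRInitError.None;
        OpenVR.Init(ref initError, EVRApplicationType.VRApplication_Overlay);

        ulong handle = 0;
        OpenVR.Overlay.CreateOverlay("my.playarea.helper", "Play Area Helper", ref handle);
        OpenVR.Overlay.SetOverlayWidthInMeters(handle, 0.3f);
        OpenVR.Overlay.ShowOverlay(handle);
        return handle;
    }
}
```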

Unity-iOS: Native iOS code development in Unity project?

I am new to Unity. I have done a character walk animation in 3ds Max and imported it into Unity. I created an Xcode project for iOS through Unity, and the animation works as expected.
I want to develop some UI button controls on screen and play this animation in my iOS app only when the button is clicked. How do I code these UI controls and events? Do I need to add these controls and events in the Xcode project (which was created by Unity), or can I do everything, including this kind of native code, in Unity itself?
Please advise!
Thank you!
Getsy.
You can do all of the UI in Unity itself. You have these options:
Old Unity GUI system. It is pretty easy to program, but it is awful in terms of performance and usability for designers — it's created completely from code, with no editors. It's almost never used in commercial products except for debugging and developer tools. However, it's still a good option for prototypes.
Use another GUI package. There are a lot of 2D and UI packages for Unity of all kinds. Currently the most popular one is NGUI, which is paid, but it also has an evaluation version.
Create your own UI framework (still in Unity). Just wanted to mention that this is a viable option, but it's obviously the worst one for your case.
Wait for the new 2D/GUI Unity framework. It's supposed to come out in version 4.3, which is just around the corner; more than that, the original NGUI author is working on it.
In your place, I'd create basic prototype controls with the built-in Unity3D GUI, and by the time I needed to create something more presentable, the new Unity GUI would hopefully already be there.
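For what it's worth, the prototype route can be as small as this kind of sketch with the old immediate-mode GUI ("Walk" is a placeholder clip name; use whatever the clip imported from 3ds Max is called):

```csharp
using UnityEngine;

// Draws one on-screen button and plays the character's walk animation
// only when the button is clicked.
public class WalkButton : MonoBehaviour
{
    public Animation characterAnimation; // drag the character's Animation component here

    void OnGUI()
    {
        if (GUI.Button(new Rect(20, 20, 160, 60), "Walk"))
            characterAnimation.Play("Walk");
    }
}
```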