Empty Array of IMotionControllers for Unreal using Oculus Quest - unreal-engine4

The typical way of setting up a MotionController in Unreal Engine is through Blueprints, but I'm trying to identify different types of motion controllers for various VR setups and from there apply the orientation and position data. I have no problems getting Oculus Rift and Vive controllers and trackers using C++, but on Oculus Quest the array that's populated with IMotionControllers is empty when debugging over ADB. Why might this be?
The following code will print a length of 0 on Oculus Quest only.
#include "Features/IModularFeatures.h"
#include "IMotionController.h"

const FName feature = FName(TEXT("MotionController")); // same name IMotionController::GetModularFeatureName() returns
TArray<IMotionController*> controllers = IModularFeatures::Get().GetModularFeatureImplementations<IMotionController>(feature);
UE_LOG(LogGloveController, Warning, TEXT("Number of Controllers: %d"), controllers.Num()); // LogGloveController is a custom log category

I checked the array again inside the TickComponent update and it was populated there. There is sometimes a delay before a controller becomes available (even with Rift/Vive), but on Quest it always happens, so querying the list too early simply returns an empty array.

Related

Struggling with mobile build on phone, looking different on phone and PC

Firstly, I'm still fairly new to coding. I saw people making games and fell in love with the idea of making my own, so for the past few months I've been following Unity tutorials and practicing on basic games. I've come far enough to be basically done with my first game. Everything is ready and I want to build it and post it on the Play Store, but there's one last issue. When I run it in Unity on my PC it looks perfect, but when I load it onto my phone the UI and some objects are either not showing or look different than on the PC. Example1 Example2 Example3 These are examples of my problem: the image above is the way it should look and the one underneath is how it shows on my phone.
It doesn't look the same because of the different resolutions.
Try using a Canvas Scaler component. Set a reference resolution and the scale mode you want.
If you want your UI elements anchored to the center/top/left etc., you should also set the anchor points. Here is a good Tutorial.
A good way to check the result instantly is the "Device Simulator"; it is a Unity package.
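If you prefer to configure the Canvas Scaler from a script rather than in the Inspector, a minimal sketch could look like this (the 1920x1080 reference resolution and the 0.5 match value are just example numbers):

using UnityEngine;
using UnityEngine.UI;

[RequireComponent(typeof(CanvasScaler))]
public class CanvasScalerSetup : MonoBehaviour
{
    private void Awake()
    {
        CanvasScaler scaler = GetComponent<CanvasScaler>();
        // Scale the UI relative to a fixed reference resolution instead of a constant pixel size.
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920, 1080); // example reference resolution
        // Blend between matching the width and the height (0 = width, 1 = height).
        scaler.screenMatchMode = CanvasScaler.ScreenMatchMode.MatchWidthOrHeight;
        scaler.matchWidthOrHeight = 0.5f;
    }
}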
That's because of your Canvas settings and your UI GameObjects' anchors. But I think the easiest way for you to solve this, since you don't have much experience with it yet, is to use separate canvases for mobile and PC. This is the code:
using UnityEngine;

public class PlatformCanvasSwitcher : MonoBehaviour
{
    [SerializeField] private GameObject pcCanvas;
    [SerializeField] private GameObject mobileCanvas;

    private void Start()
    {
        if (Application.platform == RuntimePlatform.WindowsPlayer) // Game is running as a Windows build
        {
            pcCanvas.SetActive(true);      // Use the PC-designed canvas
            mobileCanvas.SetActive(false); // Disable the mobile-designed canvas
        }
        else // Game is running on mobile (or any other platform)
        {
            pcCanvas.SetActive(false);
            mobileCanvas.SetActive(true);
        }
    }
}
Add this to a GameObject, design two canvases, and assign them to the script. (This will be a little complicated, but it will work.) See this link for more info.
But if you want to use one canvas, you have to configure its settings and its GameObjects' anchors.

Oculus Integration custom hand movements in my VR Unity project not working on my system

I am working in Unity on creating an application for the Quest VR system (Quest 2) and have added the Oculus Integration package to my project (while following an online course).
When I try to use any of the custom hands provided in the package, or run any of the example scenes that use them, the finger movements follow my controller grip presses (hand trigger) perfectly, making a semi-fist with the bottom three fingers, but the index trigger presses do not seem to cause any animation. Neither does a thumb press.
I have added a crude log to show that the triggers are being picked up by the system, but the animation does not play.
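(A sketch of what such a crude log might look like, polling the Oculus Integration's OVRInput API each frame; the choice of the right Touch controller and of these particular axes is an assumption:)

using UnityEngine;

public class TriggerDebugLog : MonoBehaviour
{
    private void Update()
    {
        // Poll the right Touch controller; swap to LTouch for the left hand.
        float index = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
        float grip  = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
        bool thumb  = OVRInput.Get(OVRInput.NearTouch.PrimaryThumbButtons, OVRInput.Controller.RTouch);

        Debug.Log($"Index: {index:F2}  Grip: {grip:F2}  ThumbNear: {thumb}");
    }
}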
What makes this very vexing is that I shared my project with the course provider I was following and it worked perfectly for him.
So the problem is on my system somewhere.
Implementing the same thing from scratch using the XR Interaction Toolkit works perfectly, by the way, but the Oculus-provided system does not.
If anyone has any ideas on where to even begin looking for the problem I would appreciate it greatly.
Happy to share system details but not even sure which details would be relevant :)

I want to use MRTK (Mixed Reality Toolkit) gestures from a script in Unity

I want to control MRTK input actions (select, scroll, hold, etc.) from a script.
I'm trying to make a custom controller using an EMG sensor.
When I receive data from the sensor, I need to trigger the MRTK input actions depending on that data.
I tried to use some profiles (DefaultMixedRealityInputActionsProfile, MixedRealityInputSystemProfile, ...) but they only provide data and are not writable.
I also tried ViGEm, a virtual game controller, but it only worked in Unity testing, not on the HoloLens, because I don't know how to connect ViGEm and the HoloLens wirelessly.
It is recommended to create a custom input system data provider based on your EMG sensor. For a step-by-step guide to getting started please refer to this link: Create data provider. And then, you can raise HoloLens gesture events corresponding to controller state changes by invoking the RaiseGestureStarted method in your custom data provider class. For example, have a look at WindowsMixedRealityDeviceManager which implements raising gesture input events and wraps the Unity XR.WSA.Input.GestureRecognizer to consume Unity's gesture events from HoloLens devices.
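As a rough illustration of just the raising part, a minimal sketch (assuming MRTK 2.x) might look like the following. For brevity it polls from a MonoBehaviour rather than from a proper data provider, and EmgGestureRaiser, ReadEmgValue(), SelectThreshold, and the emgController field are all hypothetical placeholders for the EMG-specific pieces:

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class EmgGestureRaiser : MonoBehaviour
{
    // Hypothetical: an IMixedRealityController implementation registered by your custom data provider.
    private IMixedRealityController emgController;

    // The input action mapped to "Select" in the MRTK input actions profile.
    [SerializeField] private MixedRealityInputAction selectAction;

    private const float SelectThreshold = 0.7f; // hypothetical EMG activation threshold
    private bool selectActive;

    private void Update()
    {
        float emg = ReadEmgValue();

        if (!selectActive && emg > SelectThreshold)
        {
            selectActive = true;
            // Raise the gesture through the MRTK input system.
            CoreServices.InputSystem?.RaiseGestureStarted(emgController, selectAction);
        }
        else if (selectActive && emg <= SelectThreshold)
        {
            selectActive = false;
            CoreServices.InputSystem?.RaiseGestureCompleted(emgController, selectAction);
        }
    }

    private float ReadEmgValue()
    {
        // Hypothetical placeholder: read and normalize the EMG sensor value to 0..1.
        return 0f;
    }
}

In a real data provider you would register your own IMixedRealityController implementation and raise these events from its update loop, as the WindowsMixedRealityDeviceManager mentioned above does.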

Oculus Quest shows the wrong controller

When exporting a demo project (a sphere + OVR controller and avatar) to the Oculus Quest, instead of the Oculus Quest controllers, the scene insists on showing me the Oculus Go controller.
I use Unity 2018.3.14 and 2019.1.9 with Oculus Integration v1.35 and v1.38.
Windows 10.
On the Oculus Rift, the whole scene works perfectly.
Among the issues this causes:
1. controller movement is very limited
2. only one hand is shown at a time
3. the trigger does not execute the scripts attached to the event
I followed the proper configuration of the Oculus scene shown here:
https://www.youtube.com/watch?v=qiJpjnzW-mw&t=1s
In OVRCameraRig -> Target Devices I tried all the options (Quest, Gear+Go, and both) but generally made sure it is set to Quest.
https://www.dropbox.com/s/chbhpvz5u5fv9b2/oculus%20state.PNG?dl=0
(is there another place where the controller should be set?)
I made sure the right controller is chosen in the models' prefab.
https://www.dropbox.com/s/ejof63acjlb491z/oculus%20prefabs.PNG?dl=0
I tried updating the integration to v1.39 (it only got worse: both controllers became invisible, but according to the Oculus forum that's a different problem).
I tried different Unity versions.
I tried factory resetting the device.
I tested Beat Saber to be certain that the controllers work just fine in other apps.
Has anyone encountered a similar issue?
I had the same issue but resolved it with the following steps.
In the Unity menu, click: Oculus > Tools > Remove AndroidManifest.xml
In the Unity menu, click: Oculus > Tools > Create store-compatible AndroidManifest.xml
In the Unity Project window, open: Assets > Plugins > Android > AndroidManifest.xml
Make sure AndroidManifest.xml has
<category android:name="android.intent.category.LAUNCHER" />
but not
<category android:name="android.intent.category.INFO" />.
Try Build and Run.
Good Luck.

Can I use multiple Gamepads with Unity3d?

I have a couple questions about Unity3d and Game Controllers on PC.
In Unity3d is it possible to use a game pad as an input source?
If so, which game pads are supported (e.g. Xbox)? Do I need a plugin or someone else's code? Can I use the vibration?
Can I receive input from multiple game pads at the same time (co-op for up to 4 players on the same machine)?
I have looked in several places, and it seems that using XInput will allow for Xbox controller support in Unity on Windows. I have seen nothing on multiple-controller support (e.g. 2-4 controllers on the same PC). Thank you so much for your time!
You can get input from a specific joystick using this:
Joystick Buttons (from a specific joystick): “joystick 1 button 0”, “joystick 1 button 1”, “joystick 2 button 0”, …
http://docs.unity3d.com/Manual/ConventionalGameInput.html
Near the end of the page
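For example, a minimal sketch using the legacy Input Manager's per-joystick button names (button 0 is the "A" button on an Xbox pad; the player numbering here is just an assumption) could look like this:

using UnityEngine;

public class MultiGamepadExample : MonoBehaviour
{
    private void Update()
    {
        // The same physical button is exposed once per joystick, so each pad can be read separately.
        if (Input.GetKeyDown("joystick 1 button 0"))
            Debug.Log("Player 1 pressed button 0");

        if (Input.GetKeyDown("joystick 2 button 0"))
            Debug.Log("Player 2 pressed button 0");

        // The KeyCode enum works too, e.g. Input.GetKeyDown(KeyCode.Joystick1Button0).
    }
}

For analogue sticks you would add axes in Edit > Project Settings > Input and set each axis's "Joy Num" field to a specific joystick so that the players can be told apart.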