New Unity Input System not recognizing keys

So we are developing a custom controller, and Windows recognizes it as it should: it detects all the buttons and the Throttle (Left Pad).
All other software detects it too; even online testing tools like https://devicetests.com/controller-tester recognize it and show the Left Pad moving.
The problem is that Unity's new Input System v1.4.3 does not recognize Left Pad movement. When I click "Listen" under the binding tab, Unity picks up every other button press, but not Left Pad movement. I have tried the Value and Button action types, as well as all of the control types, and Unity still does not register movement on the Left Pad.
Do you have any idea how to proceed with debugging this? We are lost.
Edit:
This is how Windows sees everything:
This is an example of the controller test on devicetests:
But Unity recognizes every input except the Left Pad (or Throttle, as Windows sees it).
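One way to proceed with debugging (a hedged suggestion, not from the original thread): open Window > Analysis > Input Debugger and watch the device while moving the pad, and also dump the raw events Unity receives from it. A minimal sketch, assuming the com.unity.inputsystem package is installed (the class name RawInputDump is made up for illustration):

using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

// Attach to any GameObject. Logs every control whose value changes in
// incoming state events, so you can see whether the throttle axis ever
// reaches Unity and, if it does, under which control path it appears.
public class RawInputDump : MonoBehaviour
{
    void OnEnable()  { InputSystem.onEvent += OnInputEvent; }
    void OnDisable() { InputSystem.onEvent -= OnInputEvent; }

    static void OnInputEvent(InputEventPtr eventPtr, InputDevice device)
    {
        // Only state and delta-state events carry control values.
        if (!eventPtr.IsA<StateEvent>() && !eventPtr.IsA<DeltaStateEvent>())
            return;

        foreach (var control in eventPtr.EnumerateChangedControls(device))
            Debug.Log($"{device.displayName}: {control.path} -> {control.ReadValueFromEventAsObject(eventPtr)}");
    }
}

If the throttle never shows up here, the axis is probably not being mapped by the device's HID layout, in which case writing a custom layout for the controller would be the next thing to try.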

Related

Is there a way to create a floating JoyStick using the On-Screen Stick component?

I'm learning the new Input System with On-Screen Stick. It all hooks in with my InputActions.
However, I'm not sure how to build a floating joystick, where the stick only shows up when I touch the screen and hides when I release the touch.
I can find many tutorials for either a fully custom solution that doesn't use the built-in On-Screen Stick component, or for a static stick, which is what I have so far.
Below is what I have so far.
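The snippet itself didn't make it into the post, but here is a rough sketch of the floating behaviour, assuming a Screen Space - Overlay canvas, with the built-in OnScreenStick on a child object and this component on a full-screen transparent Image (the class and field names are illustrative):

using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.InputSystem.OnScreen;

// Full-screen touch area that shows the stick where the finger lands,
// forwards the pointer events to the built-in OnScreenStick, and hides
// the stick again when the touch is released.
public class FloatingStickArea : MonoBehaviour, IPointerDownHandler, IDragHandler, IPointerUpHandler
{
    [SerializeField] RectTransform stickRoot;  // parent object holding the stick visuals
    [SerializeField] OnScreenStick stick;      // the built-in On-Screen Stick component

    void Start() => stickRoot.gameObject.SetActive(false);

    public void OnPointerDown(PointerEventData eventData)
    {
        stickRoot.gameObject.SetActive(true);
        stickRoot.position = eventData.position;  // screen position is valid for an Overlay canvas
        ExecuteEvents.Execute(stick.gameObject, eventData, ExecuteEvents.pointerDownHandler);
    }

    public void OnDrag(PointerEventData eventData)
    {
        ExecuteEvents.Execute(stick.gameObject, eventData, ExecuteEvents.dragHandler);
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        ExecuteEvents.Execute(stick.gameObject, eventData, ExecuteEvents.pointerUpHandler);
        stickRoot.gameObject.SetActive(false);
    }
}

The OnScreenStick keeps driving whatever InputActions it is bound to; this wrapper only repositions, shows, and hides it.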

Hololens 2 Gaze cursor not executing buttons

We're working on HoloLens 2 and have created our own button design, following an MRTK tutorial.
Sadly, we now cannot trigger the buttons using the gaze cursor on HoloLens 2.
We are using our own configuration profile, but the same happens when using the default HoloLens 2 configuration profile.
There is also some odd behaviour (valid for both profiles mentioned above): when the app starts, the gaze cursor is visible; the moment my hands are recognized, the gaze cursor disappears (all good so far), but when I move my hands behind my back the gaze cursor no longer reappears.
Does anybody have a similar problem, know how to solve it, or have observed something similar?
We are using:
Unity 2020.3.6f1
MRTK 2.7.0
All XR Packages up to date, except XR Plugin Management 4.0.1
Here are some screenshots of the components attached to our buttons:
Cheers and thanks for the help
The reason is that MRTK is currently designed so that, at a distance, hand rays act as the prioritized focus pointers, so eye gaze is suppressed as a cursor input while hand rays are in use.
If you want to use eye focus and hand rays at the same time, please follow this documentation: Use hand rays and eye-gaze input together. However, in that case, voice commands will be the only way to interact with the hologram that is being focused on.
In addition, if you want to support a 'look and pinch' interaction, you need to disable the hand ray, as described in this document: How to support look + hand motions (eye gaze & hand gestures).
I filed the following GitHub issue and it's being investigated - "Select" voice command does not fire the appropriate events when using OpenXR on HoloLens 2

Moving mouse cursor with gamepad question

I can create a GameObject that I move around like a cursor with my Xbox controller; however, when it comes to pressing buttons I have one issue.
It seems there is no way to set the actual mouse cursor position without importing user32.dll, and even that is a Windows-only solution. https://answers.unity.com/questions/330661/setting-the-mouse-position-to-specific-coordinates.html
The reason I want to move the actual mouse cursor as well as the GameObject representing a cursor is that I still want to be able to click buttons: when I call Input.GetMouseButtonDown(0), the click lands at the point where the actual mouse cursor is, and if the actual cursor isn't moving with us it isn't going to work.
Also, I don't strictly need a custom mouse cursor; if the Xbox controller could move the mouse cursor directly, that's fine too.
Any advice appreciated.
There is no other way to set the mouse position. If you need a solution for Mac as well, you can find one at the bottom of that page.
Also, making a custom cursor is a better solution because it is cross-platform, and you can implement a custom mouse-down event for it, so I don't see why you wouldn't.
If you are sure you need the actual mouse cursor to move, the first solution (importing user32.dll) is your only option.
Moving the mouse with an Xbox controller through third-party software will only work on your own PC, and that software is most likely using the first solution anyway.
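For completeness, here is a rough sketch of what that first, Windows-only solution looks like: a user32.dll P/Invoke that moves the real OS cursor from the gamepad stick. SetCursorPos and GetCursorPos are the actual Win32 calls; the class name, the speed value, and the use of the new Input System's Gamepad class are just illustrative.

using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.InputSystem;

// Windows-only: moves the real OS cursor with the right stick, so that
// clicks resolved through the normal mouse position land where the
// on-screen cursor actually is.
public class GamepadMouseMover : MonoBehaviour
{
    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int X; public int Y; }

    [DllImport("user32.dll")] static extern bool GetCursorPos(out POINT point);
    [DllImport("user32.dll")] static extern bool SetCursorPos(int x, int y);

    [SerializeField] float speed = 800f;  // cursor speed in pixels per second

    void Update()
    {
        var gamepad = Gamepad.current;
        if (gamepad == null)
            return;

        Vector2 stick = gamepad.rightStick.ReadValue();
        if (stick.sqrMagnitude < 0.01f)   // small dead zone
            return;

        if (!GetCursorPos(out POINT p))
            return;

        // Windows screen coordinates grow downwards, hence the minus on Y.
        SetCursorPos(p.X + Mathf.RoundToInt(stick.x * speed * Time.deltaTime),
                     p.Y - Mathf.RoundToInt(stick.y * speed * Time.deltaTime));
    }
}

A simulated click would still have to come from somewhere (for example mapping a gamepad button to a click, or another user32 call), which is part of why the custom, cross-platform cursor is usually the nicer route.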

How to update runtime scene via the editor

Is it possible to update the position/rotation of an object in a running game by setting its properties in the editor? In other words, when I change something in the Unreal editor I'd like to see it change immediately in a running instance of the game.
Yes, it is possible. Steps to do so (tested in the 3rd person template, UE 4.17):
Play in editor.
Hit Shift + F1 to show the mouse cursor and unlock it from the viewport.
Click on the Eject button (it is on the upper toolbar, where the Compile, Play and Launch buttons are).
Click on a mesh in the scene and edit its properties.
Hit the Possess button (shown now instead of Eject).
Play in the modified world.
Be aware that the editor will show some warnings if you move a mesh with Static mobility. Also, shadows could be off (especially if you are using pre-built lighting).
EDIT: there is another procedure: instead of PIE, you can start Simulate (Alt + S). Then you don't need the Unpossess / Possess steps; just edit the properties. There is a catch: your game needs to be playable in Simulate mode. In my experience, for various reasons I couldn't run one of my games in Simulate, so I had to use the six steps mentioned above.

UE4 Enabling Mouse During Play?

I'm extremely new to UE and doing a few easy tutorials to get started, so I don't exactly know the correct terminology to use to find what I'm looking for. Whenever I hit Play and the game starts, my mouse disappears and I can only use the input that I set up. My question is: even though I don't need mouse input for movement (I'm using WASD), how do I keep my mouse unlocked and free to move around without being locked to the camera?
When you hit play and the game starts, your mouse gets captured by the game to control the camera. If your play button is set to play in the viewport, you can release the mouse from the viewport by pressing Shift-F1. The game will still be running, but input (including from the keyboard) will be suspended and you can interact with the editor.
You can also change the default behavior of how the mouse is captured and whether it is constrained to the viewport boundaries. To see these options, go into the project settings (Settings button above the main viewport > Project Settings...). On the left side of the Project Settings window, select 'Input' under the Engine heading. On the right side are some mouse preferences that change its behavior.
For instance, to allow the mouse to travel outside the viewport, change 'Default Viewport Mouse Lock Mode' to 'Do Not Lock'. I don't recommend this, but you might experiment with these to get a feel for what they do. Also, you might look up these settings in the Unreal documentation for more detail.
There are 3 blueprint nodes that change the input method:
Set Input Mode Game
Set Input Mode Game and UI
Set Input Mode UI
You want a combination of the second one and a "Show Mouse Cursor" node. However, if you are making an FPS and you use the mouse to look around, you may lose that ability with the second node above. It comes down to what your game is and how you want to use the mouse.
I had a similar issue, so I did something like this in the Level Blueprint to achieve what I wanted: