I'm having a weird issue that may be a simple fix.
I've got a UI-only "game" using the new UI Toolkit. It's a small drawing program of sorts: a draw area in the middle with "tool buttons" on the sides. Drawing works fine with mouse, pen, and touch (my scripts can read all the pointer types), but for some reason touch alone doesn't work on the UI buttons.
What's even weirder is that touch on the UI buttons works when testing directly in Unity Play mode (I've got a touch-screen laptop), but not when I make a build.
In Project Settings -> Input System Package, I've got Pen, Mouse, and Touchscreen active under "Supported Devices".
UI Toolkit is so new that I can't find any help or similar issues online.
If it's still relevant:
I had the same issue and used the "Standalone Input Module" instead of the "Input System UI Input Module" on the EventSystem. It says it's the old option, but it works for me :D
I added the touchscreen here and it works now.
Just a follow-up, since I ended up finding my answer somewhere else.
In the "Input System UI Input Module" component on the EventSystem, I changed "Pointer Behavior" to "Single Unified Pointer" and that fixed it. Not sure if that's just a workaround, but it works great now.
I am working in Unity on an application for the Quest VR system (Quest 2) and added the Oculus Integration package to my project (while following an online course).
When I use any of the custom hands provided in the package, or run any of the example scenes that use them, the finger movements follow my controller grip presses (hand trigger) perfectly, curling the bottom three fingers into a semi-fist, but index trigger presses do not seem to cause any animation. Neither does a thumb press.
I have added a crude debug log to show that the triggers are being picked up by the system, but the animation still doesn't play.
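The log is roughly along these lines (a sketch; OVRInput ships with the Oculus Integration package, and the right Touch controller here is just an example):

    using UnityEngine;

    // Crude check that the trigger values actually change when pressed.
    public class TriggerDebugLog : MonoBehaviour
    {
        void Update()
        {
            float indexTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
            float handTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
            Debug.Log("index: " + indexTrigger.ToString("F2") + "  grip: " + handTrigger.ToString("F2"));
        }
    }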
What makes this especially vexing is that I shared my project with the course provider I was following, and it worked perfectly for him.
So the problem is somewhere on my system.
Implementing the same thing from scratch with the XR Interaction Toolkit works perfectly, by the way, but the Oculus-provided system does not.
If anyone has any ideas on where to even begin looking for the problem, I would appreciate it greatly.
Happy to share system details, but I'm not even sure which details would be relevant :)
I created a Third Person project in Unreal Engine. Everything was working fine for a while, meaning I was able to control the player when I previewed the game (Play). Due to something I probably did, at one point I could no longer control the player in preview mode. Instead, it seems that I am controlling the default pawn, a sort of camera that hangs up in the sky. I checked all the settings, including the "Default Pawn Class" in the project settings as well as in the world settings.
I'm not sure what I did wrong or what settings I need to change. I would love to get some help.
Thanks.
By mistake, I was running in Simulate mode. I changed back to "Selected Viewport" mode and everything is back to normal.
I'm building an OpenVR app for SteamVR to assist with seated play (my room is small, so my tracking area isn't ideal). My app pretty much just adjusts the play-area height while I hold the grip button and "scroll" on the touchpad, so that I can reach objects that are too low or too high at variable heights. (I tried "OpenVR Advanced Settings", but its keybinding options are limited to simple button presses, so I decided to make my own version.)
I'd like to prevent touchpad input from being sent to the game while the grip button is held, so that moving on the touchpad doesn't cause movement in the game. Is this possible at all?
I'm assuming it's not possible, but I'm wondering whether anyone has had any experience with this.
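For reference, the grip + touchpad reading looks roughly like the sketch below (simplified, not my exact code; it assumes SteamVR/OpenVR is already initialised and that axis 0 is the touchpad, as on Vive wands):

    using UnityEngine;
    using Valve.VR;

    // Sketch: detect "grip held, then read the touchpad's vertical position".
    public class GripScrollReader : MonoBehaviour
    {
        void Update()
        {
            uint index = OpenVR.System.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.RightHand);
            if (index == OpenVR.k_unTrackedDeviceIndexInvalid)
                return;

            var state = new VRControllerState_t();
            uint size = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRControllerState_t));
            if (!OpenVR.System.GetControllerState(index, ref state, size))
                return;

            bool gripHeld = (state.ulButtonPressed & (1UL << (int)EVRButtonId.k_EButton_Grip)) != 0;
            if (gripHeld)
            {
                // rAxis0 is the touchpad on a Vive wand; y is the vertical "scroll" position.
                Debug.Log("grip held, touchpad y = " + state.rAxis0.y);
            }
        }
    }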
After your clarification in the comments, the answer is no: you cannot "eat up" device inputs in an application. I usually work on OpenVR drivers, and there, once a device input and/or any other event is submitted, it is available to anything that expects pose update events; event subscribers cannot stop others from receiving those events.
However, there might be a workaround (if it's still an issue). I know of at least one application that can do what you want: OVR Toolkit. When its overlay is active and you click something in the overlay, the game running in parallel does not receive the input; that only happens while the OVR Toolkit overlay surface is receiving input, though. It may be a built-in OpenVR overlay feature that requires nothing from you, or it may have to be defined by the developer; I don't really want to test this right now.
Sadly, OVR Toolkit is not open source, but there is an open-source toolkit for Unity for making overlays that might be the solution you're looking for; it can be found here.
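To give a concrete idea of the overlay route, here is a rough sketch using the OpenVR C# bindings. The overlay key and name are placeholders, and whether requesting mouse-style overlay input is by itself what stops the game from seeing the controller is exactly the open question above, so treat this as a starting point rather than a confirmed recipe:

    using System.Runtime.InteropServices;
    using UnityEngine;
    using Valve.VR;

    // Sketch: create an overlay, ask OpenVR to route mouse-style input to it,
    // and poll the events the overlay receives.
    public class HeightAdjustOverlay : MonoBehaviour
    {
        private ulong handle;

        void Start()
        {
            OpenVR.Overlay.CreateOverlay("example.heightadjust", "Height Adjust", ref handle);
            OpenVR.Overlay.SetOverlayInputMethod(handle, VROverlayInputMethod.Mouse);
            OpenVR.Overlay.ShowOverlay(handle);
        }

        void Update()
        {
            var vrEvent = new VREvent_t();
            uint size = (uint)Marshal.SizeOf(typeof(VREvent_t));
            while (OpenVR.Overlay.PollNextOverlayEvent(handle, ref vrEvent, size))
            {
                if ((EVREventType)vrEvent.eventType == EVREventType.VREvent_MouseButtonDown)
                {
                    // Input that reached the overlay is handled here; the hope is that
                    // the game in the background does not also receive it.
                    Debug.Log("Overlay received a click");
                }
            }
        }
    }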
I'm new to augmented reality. I'm using Vuforia 4.2.3 and Unity 5, and I followed all the steps to make a test run, but whenever the camera detects the target the whole screen turns white. I've tried many things but none of them worked. Can someone help me?
I had a problem that sounds similar.
There is a known bug that causes a white screen; this could be related. See here for more info.
What worked for me was to change the VideoBackground.shader code, as described in that thread.
Go to Project >> Qualcomm Augmented Reality >> Shaders.
Double-click VideoBackground. This opens it in MonoDevelop.
In the code, change where it says:
"queue"="geometry-11"
to this:
"queue"="Geometry"
Save, rebuild etc.
Worked for me.
I am new to Unity. I have made a character walk animation in 3ds Max and imported it into Unity. I created an Xcode project for iOS through Unity, and the animation works as expected.
I want to add some UI button controls on screen and play this animation in my iOS app only when a button is clicked. How do I code these UI controls and events? Do I need to add the controls and events in the Xcode project that Unity generated, or can I do everything in Unity itself?
Please advise!
Thank you!
Getsy.
You can do all of the UI in Unity itself. You have these options:
1. Old Unity GUI system. It is pretty easy to program, but it is awful in terms of performance and usability for designers: everything is created from code, with no visual editor. It's almost never used in commercial products except for debugging and developer tools. However, it's still a good option for prototypes (a minimal example is sketched at the end of this answer).
2. Use another GUI package. There are a lot of 2D and UI packages for Unity of all kinds. Currently the most popular one is NGUI, which is paid but also has an evaluation version.
3. Create your own UI framework (still in Unity). Just mentioning this as a viable option, but it's obviously the worst one for your case.
4. Wait for the new 2D/GUI Unity framework. It's supposed to ship in version 4.3, which is just around the corner; more than that, the original NGUI author is working on it.
In your place, I'd create basic prototype controls with the built-in Unity GUI, and by the time I needed something more presentable, the new Unity GUI would hopefully already be there.
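To show what option 1 looks like in practice, here is a minimal sketch: an immediate-mode button that plays an imported animation clip when clicked. The clip name "walk" and the Animation reference are placeholders for whatever your character actually uses:

    using UnityEngine;

    // Minimal immediate-mode GUI example: one on-screen button that plays a clip.
    public class WalkButton : MonoBehaviour
    {
        // Drag the character's Animation component onto this field in the Inspector.
        public Animation characterAnimation;

        void OnGUI()
        {
            // OnGUI runs every frame; the button is declared and drawn each time.
            if (GUI.Button(new Rect(10, 10, 120, 40), "Walk"))
            {
                characterAnimation.Play("walk");
            }
        }
    }

Attach this to any object in the scene; on iOS, GUI.Button responds to taps the same way it responds to clicks.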