Google VR Unity SDK - Unable to Detect Touch in VR Mode

I am using GVR Unity SDK version 1.1 with Unity 5.5.0f3. I need to display a button similar to the gear icon that is rendered by the Google VR SDK. This button should accept touches and should be visible at all times. The issue with using a Button created in a Unity Canvas is that it does not process touches, because the GVR Input Module takes precedence over the Standalone Input Module.
I have looked at the Google VR SDK code and found that the gear icon is rendered using OpenGL calls in the PostRender.cs file, but I am still unable to find where the touch is processed. Beyond this I am stuck, as I don't have much knowledge of OpenGL.
My question: how do I render a button on top of the existing UI (so that it is visible at all times, just like the gear icon) and get it to accept touches (by calling a function in my code)?

I know that you have to do some raycasting from the controller to the UI. There is a demo scene called "Scrolling UI"; try checking there to see if there is something that can help out.
I'm not a hundred percent sure, but hopefully this points you in the right direction.
Hope you find your answer!

The simplest way I found to enable regular touch input on top of the VR scene is to use a Screen Space - Overlay canvas with a regular Graphic Raycaster and add a Standalone Input Module to the EventSystem game object.
This way the buttons in that canvas are clickable.
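As a minimal sketch of that setup done from code (the class name EnableOverlayTouch is just illustrative; it assumes an EventSystem already exists in the scene):

using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: make sure the EventSystem also carries a
// StandaloneInputModule so touch/mouse events reach the overlay canvas.
public class EnableOverlayTouch : MonoBehaviour
{
    void Awake()
    {
        EventSystem eventSystem = FindObjectOfType<EventSystem>();
        if (eventSystem != null &&
            eventSystem.GetComponent<StandaloneInputModule>() == null)
        {
            eventSystem.gameObject.AddComponent<StandaloneInputModule>();
        }
    }
}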
Hope it helps.

Related

Unity custom canvas button not registering clicks

I have been trying for several hours to fix a button for a group project, with no luck. I have the basics covered, such as the object being a child of the canvas, a Graphic Raycaster, the Button script, an EventSystem, etc.
There is no code going into this at the moment. Everything is done strictly with Unity's canvas and buttons. Upon clicking, the image should disappear, but it doesn't. The event handler does not register anything when the mouse hovers over the image.
I have looked through several posts on the Unity forum to try and find an answer that would work and none have helped.
The problem was that I needed to attach a camera that looks at the canvas to the "Event Camera" field of the Canvas component.
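As a minimal sketch of the same assignment done from code (the script and field names are illustrative; uiCamera would be assigned in the Inspector):

using UnityEngine;

// Minimal sketch: assign the camera that looks at the canvas to the
// Canvas component's Event Camera field (exposed as worldCamera in code).
public class AssignEventCamera : MonoBehaviour
{
    public Camera uiCamera; // the camera looking at the canvas

    void Start()
    {
        Canvas canvas = GetComponent<Canvas>();
        canvas.worldCamera = uiCamera;
    }
}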
In my case, I had rotated the image (on the Canvas) to 180 degrees and was trying to click it, which did not work. But as soon as I set the rotation back to 0 degrees, the mouse click detection started working.
I had a similar issue. In my case, I had a canvas all set up, but I did not have an EventSystem in the scene graph.
Create -> UI -> EventSystem
This was the solution for me.
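If you want to guard against this from code, here is a minimal sketch (the class name is illustrative) that creates the missing EventSystem at startup:

using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: create an EventSystem at startup if the scene lacks one.
public class EnsureEventSystem : MonoBehaviour
{
    void Awake()
    {
        if (FindObjectOfType<EventSystem>() == null)
        {
            GameObject go = new GameObject("EventSystem");
            go.AddComponent<EventSystem>();
            go.AddComponent<StandaloneInputModule>();
        }
    }
}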

Unity UI canvas not working with VR

I have been trying to get a very simple demo of a native Unity UI canvas working with VR.
I have read the Oculus blog post here: https://developer3.oculus.com/blog/unitys-ui-system-in-vr/ but I need to use the native Unity UI, as I want to redistribute the code without license worries. I followed this tutorial https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr?playlist=22946 and downloaded the Unity VR samples project from the Asset Store. In it they provide some scripts to place on the camera (VRInput and VREyeRaycaster) and some scripts to place on the target object (VRInteractiveItem and ExampleInteractiveItem).
When I apply the target scripts to a regular GameObject in the scene (e.g. a cube), the raycast works fine and the appropriate calls are made when Fire1 is activated. When I try to do this for a canvas object (e.g. a button), no hit is detected. I have tried placing the two target scripts (VRInteractiveItem and ExampleInteractiveItem) on the canvas, on the image containing the button, and on the button itself, and none work. What am I doing wrong? Why would it work on a regular GameObject and not on a UI canvas? I have made sure all my canvas elements have their Raycast Target property ticked.
EDIT:
It seems to work when I attach a Box Collider to the UI element. Is this required? I thought it should just work with a Graphic Raycaster attached, but this configuration doesn't work when the Box Collider is disabled and the Graphic Raycaster is enabled.
I don't have a problem using box colliders if I have to, but I wanted to take advantage of the UI Button's highlighted and pressed color transitions.
In Unity, physics raycasting works only with game objects that have colliders. A raycast returns true when it hits a collider; without colliders there is nothing the ray can hit.
Unity Physics.Raycast documentation
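To illustrate the point, here is a minimal sketch (not the sample project's code) of the kind of forward raycast the VREyeRaycaster-style scripts perform; the result only ever contains colliders, which is why a plain UI element is invisible to it:

using UnityEngine;

// Minimal sketch: Physics.Raycast only reports objects that have a Collider.
// A UI Graphic without a Collider will never show up in "hit".
public class ForwardRaycastExample : MonoBehaviour
{
    void Update()
    {
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, 100f))
        {
            Debug.Log("Hit collider: " + hit.collider.name);
        }
    }
}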
I believe, for anyone just seeing this for the first time, a potential reason it is not working is that the canvas is using a Graphic Raycaster component and not an OVR Raycaster. The OVR Raycaster is meant to replace the Graphic Raycaster to connect Oculus to Unity UI.
If you want to use Unity's UI in VR, you might want to take a look at this asset: VRTK.
There are some examples of VR UI using controllers or camera targeting.
Go to your Canvas; there is an option called "Plane Distance" that is set to 100 by default. I changed it to 0.5 and it works quite well.
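For completeness, a minimal sketch of changing that value from code (Plane Distance only applies to a Screen Space - Camera canvas; the class name is illustrative):

using UnityEngine;

// Minimal sketch: Plane Distance controls how far in front of the camera
// a Screen Space - Camera canvas is drawn.
public class SetPlaneDistance : MonoBehaviour
{
    void Start()
    {
        Canvas canvas = GetComponent<Canvas>();
        canvas.planeDistance = 0.5f;
    }
}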

Listen for GazeInput down event without selecting anything - Google VR Unity

I'm working with the Google VR Unity SDK and I'm trying to create a VR application where users can switch between multiple environments (places). I want them to switch to a different environment just by pushing down the magnetic trigger of the Cardboard, pointing anywhere. My problem is that every link (like this one) I've found works with object selection. I've tried adding an Event Trigger to my Main Camera and adding a Mesh Collider to my building, but neither worked.
So, is it possible to listen for the magnetic trigger push in the full scene without having to select an object?
Turns out it's simpler than I thought.
// Inside any MonoBehaviour's Update() method:
if (Input.GetButtonDown("Fire1")) {
    // some code
}
The thing is, Google VR removed magnetic button support in version 0.9.0, and I was using 1.0.3. So if you want to implement a trigger for the Cardboard's magnetic button itself, you need to use v0.8.5.
You could put up a Canvas in World Space that's attached to the camera, so that it always stays in the line of sight. Add a Button to the canvas at the location where the gaze input cursor is, and you should always hit that button when triggering.
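A minimal sketch of that idea (the field name worldSpaceCanvas is an illustrative placeholder to be assigned in the Inspector):

using UnityEngine;

// Minimal sketch: parent a World Space canvas to the camera so it always
// stays in the line of sight, a short distance ahead of the viewer.
public class AttachCanvasToCamera : MonoBehaviour
{
    public Canvas worldSpaceCanvas; // assign a World Space canvas in the Inspector

    void Start()
    {
        worldSpaceCanvas.transform.SetParent(Camera.main.transform, false);
        worldSpaceCanvas.transform.localPosition = new Vector3(0f, 0f, 2f);
        worldSpaceCanvas.transform.localRotation = Quaternion.identity;
    }
}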

simple UI menu and canvas for DK2

I am using Unity 5.1.2p3 with DK2 SDK 0.6.0.1, and I understand from this post that Screen Space - Overlay is not supported in Unity VR. It is recommended to use Screen Space - Camera (which in my case does not work) or World Space (which I am using now), but I need help understanding how to get a simple menu with buttons and toggles to show as a still image, and how to make selections and button presses with my mouse cursor.
I have created a menu for my app, with 4 toggles and 1 button. When I check the Virtual Reality Supported option, with the Oculus in Direct Mode and the Canvas in World Space, I can see the menu in VR, but I cannot see or find my mouse cursor to tick one of the toggles.
When I take off the headset, I can see and even use the mouse to select a toggle in my monitor's Game View tab. Obviously, I have to keep the headset steady so things do not shake in the Game View!
Another thing I notice is that the VR camera is the same as the Main Camera in the Unity Hierarchy, but when I take the headset off and move it around, the position of the camera does not change; only looking up, down and around is reflected.
How do I simply make a static menu, like a 2D surface, that does not move in VR, and that a user can operate with button presses and mouse clicks while wearing the headset? What settings are required for this way of doing UI and canvas work?
Are you specifically wanting to use the mouse? If you look through the blog entry I wrote below, it shows how to use gaze input to trigger menu buttons:
http://talesfromtherift.com/vr-gaze-input/
You can achieve this with some code I list there that raycasts from the center of the screen; if it hits any UI, it will trigger the correct events and make it clickable by button (or by time).
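The linked post has the full version; the following is only a minimal sketch of the same idea, using Unity's EventSystem to raycast from the screen centre and forward a click to whatever UI it hits:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: raycast from the centre of the screen through the
// EventSystem and send a click to the topmost UI element found there.
public class GazeClickExample : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetButtonDown("Fire1"))
            return;

        PointerEventData pointerData = new PointerEventData(EventSystem.current);
        pointerData.position = new Vector2(Screen.width / 2f, Screen.height / 2f);

        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);

        if (results.Count > 0)
        {
            ExecuteEvents.ExecuteHierarchy(results[0].gameObject,
                pointerData, ExecuteEvents.pointerClickHandler);
        }
    }
}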

Unable to use Unity UI elements on my android

I just created a very simple 3D game, and I used Unity UI for the first time.
Inside the game editor I can click the buttons and move the sliders.
I built the game for my Android, and I can see all the UI elements on my device; however, for some reason, I just can't use them.
Can you please help me and explain what I am doing wrong? Is it a bug?
I found the problem.
The Canvas element comes with an "EventSystem" object. All I had to do was check the checkbox called "Allow Activation On Mobile Device"...
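For what it's worth, a minimal sketch of the same fix from code; note that in later Unity versions this checkbox was replaced by the Force Module Active flag, which is what the property below sets:

using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: keep the input module active on mobile devices.
// In later Unity versions the Inspector checkbox corresponds to
// StandaloneInputModule.forceModuleActive.
public class ForceInputModuleActive : MonoBehaviour
{
    void Awake()
    {
        StandaloneInputModule module = GetComponent<StandaloneInputModule>();
        if (module != null)
        {
            module.forceModuleActive = true;
        }
    }
}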