Canvas UI does not render in GVR (Google Cardboard) in Unity 5.3.5

Background
I am using Unity3D 5.3.5 to develop a Google VR (Cardboard) project.
Introduction
I added a Canvas button to my scene. It shows up in the Scene view and sometimes in Game mode, but never when I run the project.
What I have tried
Turning off Direct Render for Main Camera
Setting up Render Mode of canvas to World Space
Adding Main Camera to Event Camera
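For reference, those attempted settings map onto the Canvas API roughly as follows (a minimal sketch; the component and field names are illustrative, and normally you would set these in the Inspector):

    using UnityEngine;

    // Sketch of the World Space canvas setup described above.
    public class WorldSpaceCanvasSetup : MonoBehaviour
    {
        public Canvas canvas;        // the canvas holding the button
        public Camera mainCamera;    // the GVR main camera

        void Awake()
        {
            canvas.renderMode = RenderMode.WorldSpace;  // Render Mode -> World Space
            canvas.worldCamera = mainCamera;            // Event Camera -> Main Camera
        }
    }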
Observations
The button shows up in the Scene and Game views, but not during Play mode.
Though the UI does not show up, the Physics Raycaster on the reticle still hits the button.
Screenshots Below

This is a Unity bug. In the forums they mentioned it will be fixed in 5.3.5p5.
It's also noted in the known issues for GVR:
Starting with 5.3.4p2, a bug in Unity prevents rendering World Space
uGUI Canvases into a RenderTexture
https://developers.google.com/vr/unity/release-notes#v080_initial_release
It works in an older version (5.3.4f1), if you need to test it right now.

In your canvas, change Render Mode to Screen Space - Camera,
set Render Camera to your camera,
and set Plane Distance to a very low but non-negative number.
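The same workaround can be done from a script (a sketch, assuming the standard Canvas API; the field names are illustrative, and you would normally change these in the Canvas Inspector):

    using UnityEngine;

    // Sketch of the Screen Space - Camera workaround described above.
    public class ScreenSpaceCameraCanvas : MonoBehaviour
    {
        public Canvas canvas;     // the canvas with the button
        public Camera vrCamera;   // your VR camera

        void Awake()
        {
            canvas.renderMode = RenderMode.ScreenSpaceCamera;  // Render Mode -> Screen Space - Camera
            canvas.worldCamera = vrCamera;                     // Render Camera -> your camera
            canvas.planeDistance = 0.5f;                       // a very low, non-negative Plane Distance
        }
    }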

Related

Unity - World canvas not seen in build executable

I am working on a MOBA game in Unity with Photon, and I use a world-space canvas to display the player's health. In the Editor I can play without any problem, but in the built executable I can't see the world-space canvas. I tried using other cameras; the only way I can see those canvases is to set a camera's target to a render texture and display the rendered result in a Raw Image. Here are the settings:
Is this a Unity bug? I am using the Lightweight Render Pipeline in Unity 2018.1.3. Or am I doing something wrong?
I fixed it by using 2D sprites.
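For reference, the render-texture workaround mentioned in the question looks roughly like this (a sketch; the camera, texture size, and component names are illustrative):

    using UnityEngine;
    using UnityEngine.UI;

    // Sketch: render the world space canvas with a dedicated camera into a
    // RenderTexture, then display that texture on a RawImage.
    public class CanvasRenderTextureWorkaround : MonoBehaviour
    {
        public Camera uiCamera;       // a camera that sees the world space canvas
        public RawImage targetImage;  // a RawImage on a screen space canvas

        void Start()
        {
            var rt = new RenderTexture(1024, 1024, 16);
            uiCamera.targetTexture = rt;  // the camera now renders into the texture
            targetImage.texture = rt;     // show the rendered result in the UI
        }
    }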

How can I turn off camera video background in Unity ARKit

I'm trying to build a "lights-off" feature in my ARKit app, where I basically turn off the camera video background and only show my 3D content, sort of like VR.
Could someone please help me figure out how to turn off the video background? I can see that a material called YuVMaterial is being used to render the camera texture, but setting it to a single color covers the entire screen and doesn't show my 3D content either.
You can uncheck the UnityEngine.XR.ARFoundation.ARCameraBackground component on the main AR camera; it will then just render a black background.
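The same thing can be toggled at runtime (a sketch, assuming the component above sits on the main AR camera; the class and method names here are illustrative):

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Sketch of a "lights off" toggle: disabling ARCameraBackground leaves a
    // plain black background behind the 3D content, the same as unchecking it
    // in the Inspector.
    public class LightsOffToggle : MonoBehaviour
    {
        public ARCameraBackground cameraBackground;  // from the main AR camera

        public void SetLightsOff(bool lightsOff)
        {
            cameraBackground.enabled = !lightsOff;
        }
    }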
In Unity, you can switch between cameras while using ARKit. The main camera has a runtime spherical video applied to it, so it's not actually your device camera, but a rendering of what the device camera sees. By switching cameras, you can effectively "turn off" the background video image but still take advantage of the ARKit properties. Have fun.

Unity UI canvas not working with VR

I have been trying to get a very simple demo of a native Unity UI canvas working with VR.
I have read the Oculus blog post here: https://developer3.oculus.com/blog/unitys-ui-system-in-vr/ but I need to use the native Unity UI, as I want to redistribute the code without license worries. I followed this tutorial https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr?playlist=22946 and downloaded the Unity VR Samples project from the Asset Store. In it they provide some scripts to place on the camera (VRInput and VREyeRaycaster) and some scripts to place on the target object (VRInteractiveItem and ExampleInteractiveItem).
When I apply the target scripts to a regular GameObject in the scene (e.g. a cube), the raycast works fine and the appropriate calls are made when Fire1 is activated. When I try to do this for a canvas object (e.g. a button), no hit is detected. I have tried placing the two target scripts (VRInteractiveItem and ExampleInteractiveItem) on the canvas, on the image containing the button, and on the button itself, and none of them work. What am I doing wrong? Why would it work on a regular GameObject and not on a UI canvas? I have made sure all my canvas elements have their Raycast Target property ticked.
EDIT:
It seems to work when I attach a Box Collider to the UI element. Is this required? I thought it should just work with a Graphic Raycaster attached, but the configuration below doesn't work (when the Box Collider is disabled and the Graphic Raycaster is enabled).
This is what is on my player's camera:
I don't have a problem using box colliders if I have to, but I wanted to take advantage of the UI button's highlighted and pressed color properties.
In Unity, physics raycasting only works with GameObjects that have colliders. Physics.Raycast returns true when it hits a collider; without a collider there is nothing for the ray to hit.
Unity Physics.Raycast documentation
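A minimal illustration of that point, matching what the edit above found: give the UI element a collider so a physics ray can hit it (a sketch; the sizing is illustrative and assumes the script sits on the UI element itself):

    using UnityEngine;

    // Sketch: add a BoxCollider roughly matching the UI element's RectTransform,
    // so Physics.Raycast (and the VREyeRaycaster) has something to hit.
    public class AddUICollider : MonoBehaviour
    {
        void Start()
        {
            var rect = GetComponent<RectTransform>().rect;
            var box = gameObject.AddComponent<BoxCollider>();
            box.size = new Vector3(rect.width, rect.height, 1f);
        }
    }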
I believe, for anyone seeing this for the first time, a potential reason it is not working is that the canvas in the picture above is using a Graphic Raycaster component and not an OVR Raycaster component. The OVR Raycaster is meant to replace the Graphic Raycaster in order to connect Oculus to the Unity UI.
If you want to use Unity's UI in VR, you might want to take a look at this asset: VRTK.
There are some examples of VR UI using controllers or camera targeting.
Go to your canvas; you should have a "Plane Distance" option that is set to 100. I changed it to 0.5 and it works quite well.

Google Cardboard VR HUD UI Issue

I'm currently creating a Google Cardboard VR app and I'm running into an issue with the UI. The VR app is set up to be on rails by animating an empty object that contains the Google Cardboard camera prefab. I have a world-space canvas with 3 UI objects (backwards, pause, and forward, similar to the iPhone). In the first case the UI works for pausing (time scale set to 0), as seen in the image below.
However, the issue I'm having is getting those UI icons to show up correctly while in motion. The icons seem to go off in a weird direction and have a mind of their own, even though I can track the canvas and know it's in the same position relative to the camera. If I pause in game or in the Editor, it returns to the correct state from example 1.
I have the canvas set to World Space and the Event Camera is the Google Cardboard camera prefab. I'm not sure what's going on.

VR Unity splash freezes with large scene

When I start an app whose first scene contains too much data, after 3 seconds the Unity splash screen appears to freeze. If I then move my head to look at any part of the VR world, everything is black.
I can only see a rectangular area with the Unity logo from the splash screen, while the remaining space is black.
I suppose this happens because all of the scene's data is being loaded, but is there any way to avoid this problem?
I'm using a Gear VR and a Samsung Galaxy S7 Edge.
Check this image to see what the user sees if they move their head after the splash screen has "frozen":
Thanks in advance.