Unity 5.5.2f1
I'm trying to get started with making an Oculus app, following the instructions here, modifying a very simple scene I made. However, when I disable the MainCamera, drag in the OVRCameraRig, and run it in the Editor, I still see only a single view, not the two per-eye views like in the picture. Player settings have "VR Supported" selected. Do I have something set up wrong with the app, or is the tutorial out of date?
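For reference, a quick diagnostic you can drop into the scene to confirm VR actually initialized at runtime (a sketch using the Unity 5.x UnityEngine.VR API, which was later renamed to UnityEngine.XR):

```csharp
using UnityEngine;
using UnityEngine.VR; // Unity 5.x VR namespace (renamed to UnityEngine.XR in later versions)

// Diagnostic sketch: logs whether VR actually initialized. If "VR enabled"
// is false or no device is present, the OVRCameraRig renders like a normal
// camera instead of two per-eye views.
public class VrStatusCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("VR enabled: " + VRSettings.enabled);
        Debug.Log("Loaded device: " + VRSettings.loadedDeviceName);
        Debug.Log("Device present: " + VRDevice.isPresent);
    }
}
```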
I'm building a simple 360 video project and mostly have it working. I've successfully imported the video and configured the project so I can pan and 'look around' the environment in the Scene view while the Unity editor is playing the video.
When I try to build the project or run it in Game mode, it looks as if I'm staring at a wall.
None of the articles on 360 video mention anything about camera placement.
I've noticed a button labeled '2D' in the Scene view toolbar that renders eerily similar to how the project looks in Game mode.
I feel like I'm missing a switch or a configuration somewhere.
Thanks for your assistance.
I tried a number of different Unity Player build settings, but none of them worked.
I'm attaching screenshots of what I'm seeing.
(screenshot)
Here's the Game view:
(screenshot)
There's an additional step that's not covered in most 360 video tutorials.
Even though the Scene view lets you look around once a camera is placed, the MainCamera needs a script attached in order to respond to input in the Game view and in builds.
The camera code here: http://mountainpath.ch/cmsimplexh/index.php?Unity-3D/Create-a-3D-model-viewer-with-Unity-3D
got this working as expected.
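For reference, a minimal mouse-look script along those lines (a sketch, not the exact code from the linked page; attach it to the MainCamera):

```csharp
using UnityEngine;

// Minimal mouse-look sketch: attach to the MainCamera so the player can
// pan around the 360 video sphere in the Game view and in builds.
// The sensitivity value is an arbitrary choice for illustration.
public class SimpleMouseLook : MonoBehaviour
{
    public float sensitivity = 3f;

    private float yaw;   // rotation around the Y axis (left/right)
    private float pitch; // rotation around the X axis (up/down)

    void Update()
    {
        yaw += Input.GetAxis("Mouse X") * sensitivity;
        pitch -= Input.GetAxis("Mouse Y") * sensitivity;
        pitch = Mathf.Clamp(pitch, -90f, 90f);
        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```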
After I build my Unity project and deploy it to the HoloLens, I have the following problem:
The splash screen appears, followed by a debugging window at the bottom. In the background is a white net (the spatial mesh). However, you can't see any game objects. I've tested a lot but haven't found a solution. Visual Studio does not display any error messages. Here's roughly what I've looked at:
These are my modules. I'm using Unity 2019.4.22f1 and the MRTK Foundation Toolkit 2.7.2.
My build settings
My project settings
I tried placing the objects in the middle of the camera's view and changed the colors.
MRTK settings (I haven't changed anything for the most part)
Main camera settings
My scene
When I start the scene I get this error in the console. I don't know if it has anything to do with my problem.
I have two possible solutions (no guarantee):

1. You could spawn the objects on input directly in front of the camera and add a Debug.Log("object in front of you"); so you can find the issue (see the sketch below).
2. If this doesn't work, I would try testing different types of materials, like you do with HDRP.

If neither works, I probably can't help you out for now.
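A minimal sketch of the first suggestion; the prefab field and the space-bar trigger are placeholders for whatever spawn input your project uses:

```csharp
using UnityEngine;

// Sketch for suggestion 1: spawn a test object directly in front of the
// camera on input, and log so you can confirm the code path runs.
// "testPrefab" is a placeholder; assign any visible prefab in the Inspector.
public class SpawnInFront : MonoBehaviour
{
    public GameObject testPrefab;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            Transform cam = Camera.main.transform;
            // 1.5 m in front of the camera, roughly arm's length on HoloLens.
            Vector3 pos = cam.position + cam.forward * 1.5f;
            Instantiate(testPrefab, pos, Quaternion.identity);
            Debug.Log("object in front of you");
        }
    }
}
```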
It seems like your GameObject is too far away, so it is hidden behind the spatial mesh. Please make the spatial mesh invisible by setting the Display Option property of the Spatial Mesh Observer Settings to None; this item can be found under the Spatial Awareness profile of the MRTK profile.
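If you prefer to do this from code rather than in the profile, something like the following should work with the MRTK 2.x spatial awareness API (a sketch; verify against your MRTK version):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.SpatialAwareness;
using UnityEngine;

// Sketch: hide the spatial mesh at runtime instead of editing the profile.
public class HideSpatialMesh : MonoBehaviour
{
    void Start()
    {
        // Grab the first registered mesh observer and turn off its rendering.
        var observer = CoreServices.GetSpatialAwarenessSystemDataProvider<IMixedRealitySpatialAwarenessMeshObserver>();
        if (observer != null)
        {
            observer.DisplayOption = SpatialAwarenessMeshDisplayOptions.None;
        }
    }
}
```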
I have been trying to get a very simple demo of a native Unity UI canvas working with VR.
I have read the Oculus blog post here: https://developer3.oculus.com/blog/unitys-ui-system-in-vr/ but I need to use the native Unity UI, as I want to redistribute the code without license worries. I followed this tutorial https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr?playlist=22946 and downloaded the Unity VR samples project from the Asset Store. It provides some scripts to place on the camera (VRInput and VREyeRaycaster) and some scripts to place on the target object (VRInteractiveItem and ExampleInteractiveItem).
When I apply the target scripts to a regular GameObject in the scene (e.g. a cube), the raycast works fine and the appropriate calls are made when Fire1 is activated. When I try to do this for a canvas object (e.g. a button), no hit is detected. I have tried placing the two target scripts (VRInteractiveItem and ExampleInteractiveItem) on the canvas, on the image containing the button, and on the button itself, and none work. What am I doing wrong? Why would it work on a regular GameObject and not on a UI canvas? I have made sure all my canvas elements have their Raycast Target property ticked.
EDIT:
It seems to work when I attach a box collider to the UI element. Is this required? I thought it should just work with a Graphic Raycaster attached, but the configuration below doesn't work (box collider disabled, Graphic Raycaster enabled).
This is what is on my player's camera:
I don't have a problem using box colliders if I have to, but I wanted to take advantage of the UI button's highlighted and pressed color transitions.
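For reference, a sketch of attaching the collider from code, sized to match the element's RectTransform (this assumes the VR raycaster uses physics raycasts):

```csharp
using UnityEngine;

// Sketch: add a BoxCollider to a UI element and size it to match its
// RectTransform, so physics-based raycasters (like VREyeRaycaster) can hit it.
[RequireComponent(typeof(RectTransform))]
public class FitColliderToRect : MonoBehaviour
{
    void Start()
    {
        RectTransform rect = GetComponent<RectTransform>();
        BoxCollider box = gameObject.AddComponent<BoxCollider>();
        box.size = new Vector3(rect.rect.width, rect.rect.height, 0.1f);
        // Center the collider on the rect, accounting for the pivot.
        box.center = new Vector3(
            rect.rect.width * (0.5f - rect.pivot.x),
            rect.rect.height * (0.5f - rect.pivot.y),
            0f);
    }
}
```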
In Unity, physics raycasting works only with GameObjects that have colliders. Physics.Raycast returns true when it hits a collider; without a collider there is nothing for the ray to hit.
Unity Physics.Raycast documentation
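A minimal illustration, assuming the ray originates at the camera; note that a default UI Button has no collider, so it will never be hit this way:

```csharp
using UnityEngine;

// Minimal physics raycast sketch: only objects with a Collider can be hit.
public class GazeRaycastExample : MonoBehaviour
{
    void Update()
    {
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, 100f))
        {
            Debug.Log("Hit collider on: " + hit.collider.gameObject.name);
        }
    }
}
```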
I believe, for anyone just seeing this for the first time, a potential reason it is not working is because the canvas in the above picture is using a "Graphic Raycaster" component and not an "OVR Raycaster" component. The OVR Raycaster is meant to replace the Graphic Raycaster to connect Oculus to Unity UI.
If you want to use Unity's UI in VR, you might want to take a look at this asset: VRTK
There are some examples of VR UI using controllers or camera targeting.
Go to your Canvas; there's an option called "Plane Distance" that defaults to 100. I changed it to 0.5 and it works quite well.
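The same tweak from code, in case you want to set it at runtime (a sketch; Plane Distance only applies when the Canvas render mode is Screen Space - Camera):

```csharp
using UnityEngine;

// Sketch: set the Canvas "Plane Distance" from code instead of the Inspector.
public class CanvasPlaneDistance : MonoBehaviour
{
    void Start()
    {
        Canvas canvas = GetComponent<Canvas>();
        canvas.planeDistance = 0.5f; // default is 100
    }
}
```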
The scene content is shown in the Editor's Game view but not in the Windows build.
Could anyone suggest what can cause this kind of problem, or ways to troubleshoot it?
Please note that:
The scene is properly defined in the build settings
I am using windows 10
I am building for windows
The scene contains some sprites with physics, an imported character with animations (imported from the Asset Store), a UI canvas with a button, and some scripts attached to the objects.
The scripts do almost nothing.
The game content is properly shown inside the Game view (Unity's Editor player); the problem is only with the build.
When running the build output, Unity's splash screen is shown and then the scene is loaded, but its content isn't shown.
I changed the camera background color, and the color is indeed changed in the build; it's just the scene content that isn't shown.
I added a UI canvas with a button, and the button is shown.
I am using Unity 5.5.1f1.
It is a 2D project.
Make sure you are adding the scene to the Scenes in Build list.
Go to File -> Build Settings and in the dialog make sure your scene has a check mark on the left.
When running in the Editor, Unity runs whatever scene you are working on, but when building a player you need to include the scene in the list.
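If you want to double-check from the build itself, a small runtime log can confirm which scenes made it into the player (a sketch using the standard SceneManager API):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Quick runtime check: logs how many scenes were included in the build and
// the path of the active one, to confirm the Build Settings list.
public class SceneBuildCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Scenes in build: " + SceneManager.sceneCountInBuildSettings);
        Debug.Log("Active scene: " + SceneManager.GetActiveScene().path);
    }
}
```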
Problem solved: I deleted the camera and created a new one.
I am developing an app with Vuforia Cloud Recos. I want to add a feature allowing the user to pause the page so she does not have to keep pointing the device at the target to view the trackable. This is pretty useful when I want to show text. Is there any way to achieve that in Unity3D? A good example is Microsoft's Here City Lens app, which includes a button to pause the page, as the screenshot shows:
You could take a screenshot of the screen and apply it to an Image UI object. That is, if you do not need the camera feed anymore.
If you need interaction with the elements, I would take a screenshot of only the camera feed, without the items: get the AR camera transform, apply it to a new camera, and disable the AR camera. Then apply the screenshot to a background plane covering the whole screen, and keep the items on as well; they no longer listen to Vuforia. You are pretty much recreating a basic Unity scene: the items were not moving with Vuforia, the camera was, so they are still in the middle, and you need to know where the camera was when you took the shot. Your scene is complete.
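A rough sketch of the screenshot approach; the arCamera and background fields are placeholders you would assign in the Inspector:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Sketch of the pause idea: grab the rendered frame at end of frame,
// show it on a full-screen RawImage, then disable the live AR camera.
public class PauseView : MonoBehaviour
{
    public Camera arCamera;      // placeholder: the Vuforia AR camera
    public RawImage background;  // placeholder: full-screen RawImage behind the UI

    public void Pause()
    {
        StartCoroutine(CaptureAndFreeze());
    }

    IEnumerator CaptureAndFreeze()
    {
        yield return new WaitForEndOfFrame(); // wait until rendering is done

        var tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        tex.Apply();

        background.texture = tex; // show the frozen frame behind the UI
        arCamera.enabled = false; // stop the live AR feed
    }
}
```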