How to set up a fast development workflow (plug and play) for Oculus Quest 2 with Unity - unity3d

How would I go about setting up Unity and Oculus so I can hit Play in Unity and see the results immediately instead of building and running the game on Oculus?

First off, these are the essentials you need in order to develop for Oculus Quest 2 with Unity. After completing these steps you can, if you want, follow the later ones for the fastest possible workflow.
Unity side:
Install Unity (anything above 2018 will work, preferably the latest). During installation, check the Android Build Support module, as well as the Android SDK & NDK and OpenJDK checkboxes.
Create a new empty 3D project (preferably targeting the Android platform from the start).
In Build Settings, switch to the Android platform (if it isn't already selected).
In the Package Manager, install the XR Plugin Management and Oculus XR Plugin packages (a scripted alternative is sketched after this list).
In Project Settings -> XR Plugin Management, go to the Android tab and check the Oculus option. Wait for the import to finish.
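If you prefer scripting that Package Manager step, here is a minimal editor sketch (my own, not part of the official setup) that requests both packages through the Package Manager scripting API. It assumes Unity 2020.2+ for Client.AddAndRemove; the class name and menu path are placeholders:

```csharp
// Editor-only helper (place it in an Editor/ folder). Adds the same two
// packages as the Package Manager step above via the scripting API.
using UnityEditor;
using UnityEditor.PackageManager;

public static class XRPackageInstaller
{
    [MenuItem("Tools/Install Oculus XR Packages")]
    public static void Install()
    {
        // One batched request; these are the standard package IDs for
        // XR Plugin Management and the Oculus XR Plugin.
        Client.AddAndRemove(new[]
        {
            "com.unity.xr.management",
            "com.unity.xr.oculus"
        });
    }
}
```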
For the Oculus side:
Make an Oculus Developer account
Put on your Quest 2 headset and log in to your developer account.
In your Quest 2 headset, go to Settings -> System -> Developer and turn on the USB Connection Dialog option. Alternatively, you can do the same from the Oculus app on your phone.
Connect the headset to the PC/laptop with a USB-C to USB cable (the Oculus Link cable works, but so do third-party cables) and accept all the dialogs that show up, namely "Allow USB Debugging", "Always allow from this computer", and "Allow access to data".
Finally, in Unity's Build Settings, choose the connected Oculus device under Run Device and click Build & Run (make sure you have at least one scene added to Build Settings). This builds the application and pushes it to your Oculus over USB; a scripted version of this step is sketched below. When you put your headset on, you will see the application load.
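For reference, here is what that Build & Run step looks like scripted. This is just a sketch under assumptions: the scene path and output path are placeholders you would adapt to your project.

```csharp
// Editor-only helper: builds an Android APK and auto-runs it on the
// connected headset, mirroring the Build & Run button.
using UnityEditor;

public static class QuestBuild
{
    [MenuItem("Tools/Build and Run on Quest")]
    public static void BuildAndRun()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" }, // placeholder scene path
            locationPathName = "Builds/quest.apk",         // placeholder output path
            target = BuildTarget.Android,
            options = BuildOptions.AutoRunPlayer           // install and launch on the device
        };
        BuildPipeline.BuildPlayer(options);
    }
}
```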
That's it. This is the bare minimum you need to develop VR games for Oculus. But if we leave it at that, developing VR apps will be really slow and tedious: we would have to code basic VR behaviour ourselves and waste time building and pushing the app to the device every time we want to change and test something. So we are going to install a few more things to speed up development.
Install the Oculus Developer Hub. It lets you check that your device is connected to the PC properly, and it has other useful features as well (you can take screen captures and record video directly from the device). Once installed, connect your device with a USB-C to USB cable and make sure it shows up properly in the Oculus Developer Hub. (This step is NOT a must, but I recommend it.)
Install the Oculus app for Quest 2. You can find it on the Oculus website. We need it for the Oculus Link feature to work, which will let us test in real time instead of building and running the app on the Oculus.
Run the Oculus app; it will show a setup guide. Follow the guide. You can choose to connect via cable (Link) or via Wi-Fi (Air Link).
In your connected Oculus headset, a dialog will pop up asking whether you wish to enable Oculus Link. Accept it.
In Unity, go to the Package Manager and install the XR Interaction Toolkit. This plugin greatly reduces the hassle of setting up a VR rig.
In a new scene, right-click in the Hierarchy and go to XR -> Device-based -> XR Rig (in newer versions it may say XR Origin).
Click Play in Unity and put on your headset. You should see the headset and controller tracking mirrored in the Unity editor (the sketch below gives a quick scripted sanity check).
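If you want a scripted sanity check that tracking data is actually reaching the editor over Link, here is a minimal sketch using the built-in UnityEngine.XR input API (the script name and the logging are my own additions):

```csharp
// Attach to any GameObject in the scene. While Play mode is running over
// Link, this logs the right controller's position once per frame.
using UnityEngine;
using UnityEngine.XR;

public class TrackingCheck : MonoBehaviour
{
    void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (rightHand.isValid &&
            rightHand.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos))
        {
            Debug.Log($"Right controller position: {pos}");
        }
    }
}
```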
That's pretty much it for the development side. If you wish to build for Oculus, just go to Build Settings, choose the connected Oculus Quest 2 under Run Device, and click Build & Run. And of course, save the scene with the XR Rig and add it to Build Settings.
Tested to work with Unity 2020.3.25f1 on an ASUS TUF laptop.
Feel free to correct me on some of the steps if you hit an obstacle. I would love this to be as comprehensive a guide as possible.

Related

Unity Build targets Mixed Reality but does not appear in apps

I have created a project using the Windows XR Plugin and XR Plugin Management. I am NOT using the XR Interaction Toolkit; I have created my own tools, including my own XR rig using the Tracked Pose Driver.
My project is NOT built with UWP, since I need file access that UWP does not provide or makes too tricky to provide (after many attempts and work with a senior developer here, I just gave up). So I'm using standard Unity, and my build target is "PC, Mac & Linux Standalone". In XR Plug-in Management, my provider is "Windows Mixed Reality". I am using a few UWP functions for file access within the app.
Now, once the build is built, I obviously don't see it within Steam, but it also does not appear in the Mixed Reality list of applications. I have to start it manually by clicking the icon on the desktop. It works great, but wtf...?
I know that for applications to appear in Mixed Reality they must be built with UWP, but if this build is neither UWP nor Steam, what is it then? How do I add it (or sideload it) to the Mixed Reality applications that the Windows menu brings up within the Cliff House, for example?
To answer the first question, on what type of application is being built if UWP and Steam are removed as targets in Unity: it is built as a plain PC desktop application.
To answer the second question, on how to access this application inside the Mixed Reality Cliff House shell when it is not a UWP application: it can be launched via the "Classic Apps" pin inside the shell.
Here is more information on the "Classic Apps" pin:
https://learn.microsoft.com/en-us/windows/mixed-reality/whats-new/release-notes-may-2019#how-to-launch
That should answer the question of how to launch it from inside the VR experience itself.

Hololens Unity app always deploys in 2D instead of 3D

I'm developing an app for the HoloLens 1 in Unity, and it runs perfectly fine on the device when using Holographic Remoting. However, whenever I build and deploy the application through Visual Studio, it launches only in 2D mode on the HoloLens (as a flat "window" you can position in space). What settings control this behaviour?
Unity version is 2019.1.4f1,
Visual Studio is 2017 Community Edition,
I'm on Windows 10.
Developer mode is turned on on both the HoloLens and my desktop. Virtual Reality Supported is ticked in Unity, the Mixed Reality SDK is added to the list, and the build settings are x86 / D3D project.
I tried replacing my scene with one of the examples from the MRTK, but to no avail. Strangely enough, if I make a clean new project with nothing except the MRTK example in it, it does deploy properly, so there must be something in my project interfering. I just can't figure out what.
Expected behaviour is that the application launches in "room scale" mode, i.e. all other applications disappear and the objects in my scene can be viewed in 3D.
EDIT: This has been marked as a possible duplicate. The answers given there do not solve my problem, however. I already made sure that "Virtual Reality Supported" is ticked in the XR settings and the SDK is added to the list. I don't think I have a Windows Insider preview, but since I was able to deploy perfectly fine from a fresh project, I don't think that's really the problem...
It appears Vuforia was causing the issues. I got it to deploy in 3D with 'Vuforia Augmented Reality Supported' ticked and the following settings in VuforiaConfiguration:
Camera Device Mode: MODE_OPTIMIZED_SPEED
Device Type: Digital Eyewear
Device Config: Hololens
Video Background DISABLED
Device Tracker DISABLED
Furthermore, 'Vuforia' must not be added to the list of Virtual Reality SDKs in XR Settings.
Note that I have not tried all subsets of these settings individually; some of them might not have an impact whatsoever (except for the last one: I am quite certain that adding that SDK will force the app into 2D mode).
Also note that I haven't verified that Vuforia actually works correctly on the HoloLens, just that I can deploy the app in 3D mode with it enabled, given the above settings. Can someone confirm whether Vuforia is even supported by MRTK v2?
EDIT: apparently the problem is also caused by ticking "WSA Holographic Remoting Supported" in the XR Settings, so be sure to disable that.
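As a side note, if you want to inspect those legacy XR flags without clicking through menus, here is a small editor sketch (class name and menu path are mine; the APIs are the pre-XR-Plug-in-Management ones available in this Unity version):

```csharp
// Editor-only sketch: logs the legacy XR settings discussed above.
using UnityEditor;
using UnityEngine;
using UnityEngine.XR;

public static class XRSettingsCheck
{
    [MenuItem("Tools/Log Legacy XR Settings")]
    public static void Log()
    {
        // Backs the "Virtual Reality Supported" checkbox (legacy XR pipeline).
        Debug.Log($"Virtual Reality Supported: {PlayerSettings.virtualRealitySupported}");
        // SDKs available to the legacy pipeline, e.g. "WindowsMR".
        Debug.Log($"VR devices: {string.Join(", ", XRSettings.supportedDevices)}");
    }
}
```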

How to develop a Gear VR app with emulators?

I currently can't afford to buy a Gear VR headset along with one of Samsung's flagship phones (S6, S7, or S8), but I want to start developing.
I tried installing a Gear VR app built with Unity in the Genymotion emulator. The app installed in Genymotion, but it didn't show in dual/stereoscopic mode.
Meanwhile, when I installed the same app on an S6 Edge, it seemed to work fine: it showed a message on screen to insert the phone into the Gear VR device.
So, can I test a Gear VR app developed with Unity in any emulator? Is there any way?
If you own an S6 Edge that already has the Gear VR software installed, you can put the Gear VR software into developer mode, which allows it to run VR apps without being inserted into a headset.
Go to Settings > Application Manager
Select Gear VR Service
Select Manage Storage
Click on VR Service Version several times until the Developer Mode toggle shows up
Toggle Developer Mode
Note: If you do not see the Developer Mode toggle after tapping VR Service Version several times, close the Gear VR Service and relaunch it, and you should see it.
To get the software installed in the first place, you just need to find someone else's Gear VR or a display model and plug your phone in; this will trigger the software installation. You can then take your phone home and develop on your own using developer mode.

How do OpenVR, SteamVR and Unity3D work together?

I am trying to understand the VR platform stack of the Vive and how its games are developed.
I am struggling to understand where exactly OpenVR, SteamVR, and Unity fit into the picture.
My understanding so far has been that:
OpenVR - a hardware-independent layer providing APIs for peripheral access. That is, it can provide access to either Oculus or Vive hardware via a defined interface.
SteamVR - provides hardware access to games developed in either Unity or Unreal.
Unity3D - a game engine for developing games.
If anyone can correct me, I would be most grateful.
And if my understanding is correct, why can't games developed in Unity3D access hardware directly via OpenVR?
OpenVR is an API and runtime that allows access to VR hardware from multiple vendors without requiring applications to have specific knowledge of the hardware they are targeting (ref1). SteamVR is the customer-facing name we use for what users actually install and use (for details, check this video: Using Unity at Valve).
Also check whether you can use the Vive with OpenVR without Steam.
Finally, let's look at all these terms (thanks to a Reddit post):
How a game appears on your head-mounted display (HMD):
A game renders an image and sends it to its corresponding runtime. The runtime then renders it to the HMD:
[OVR/OpenVR] SDK -> [Oculus/SteamVR] runtime -> [Rift/Vive]
SDKs:
SDKs are used to build the games. A game can implement OVR, OpenVR, or both; this means the game has access to native functionality in the corresponding runtime. SDKs do not handle asynchronous timewarp or reprojection; those are handled by the runtime!
OVR: made by Oculus for the Oculus Rift. The current version (14 May 2016) is 1.3.1, which can access all features of the Oculus runtime.
OpenVR: made by Valve; supports the Vive and the Rift via the SteamVR runtime.
Side note on SDKs and Unity games: Unity 5.3 currently has VR optimizations in its native mode. Native mode supports the Rift, Gear VR, and PSVR, but not SteamVR. A game compiled with Unity 5.3 can use those optimizations with the Oculus SDK but not with the OpenVR SDK. The OpenVR SDK has its own optimizations, which may or may not result in similar performance. However, the upcoming Unity 5.4 will support SteamVR natively, and performance should be more or less identical. Please note: this is Unity-specific, and other engines might have similar or different optimizations for some or all headsets.
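To make the SDK/runtime split concrete, here is a minimal sketch of a game talking to the SteamVR runtime directly through the OpenVR C# bindings (openvr_api.cs, which ships with the SteamVR plugin). The class name is mine, and this is illustrative rather than a full integration:

```csharp
// Initializing OpenVR as a "scene application" is what tells the SteamVR
// runtime to treat this process as a VR game.
using UnityEngine;
using Valve.VR;

public class OpenVRBootstrap : MonoBehaviour
{
    void Start()
    {
        EVRInitError error = EVRInitError.None;
        CVRSystem system = OpenVR.Init(ref error, EVRApplicationType.VRApplication_Scene);
        if (error != EVRInitError.None || system == null)
        {
            Debug.LogError($"OpenVR init failed: {error}");
            return;
        }
        // From here the runtime handles reprojection, device detection, etc.;
        // the SDK only exposes that functionality to the game.
        Debug.Log("Connected to the SteamVR runtime via the OpenVR SDK.");
    }

    void OnDestroy()
    {
        OpenVR.Shutdown();
    }
}
```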
Runtimes
Oculus runtime: responsible for asynchronous timewarp; handles device detection, display, etc. It (the runtime service) needs to be running for Oculus Home to launch.
SteamVR runtime: responsible for reprojection; supports the Rift and the Vive.
Software Distribution Platforms
Oculus Home: needs to be running for the Rift to work. By default it only supports apps from the store (there is a checkbox in the settings of the 2D desktop client to enable other sources). It downloads games and runs them, and it also handles the universal menu on the Xbox button.
Steam/SteamVR: technically does not need to be running when launching OpenVR games, but it is highly recommended (room setup and configuration are pulled from there). It also handles the overlay menu on the Xbox button; when running on the Rift, it launches by pressing the select/start button in the Oculus universal menu.
Finally, this is worth reading.

Testing on a Gear VR

I am building my first VR game for the Gear VR (since I got it for free). My question: is there a way I can plug in the Gear VR and run the game through Unity, so I can see what is going on in the game while it is being played? Currently I test my game by building it, transferring it to my phone, installing it, and then testing it, which is very tedious. Here is my setup in case it helps:
Unity 5.5
Android Studio
Oculus Mobile SDK
Unity should be able to load the code and profile it without you installing it manually. If you follow the instructions from Oculus, all you need to do is check the "Development Build" and "Autoconnect Profiler" boxes.
You may also want to enable Developer Mode on the phone so the app can run without the phone being in the VR headset.
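For reference, those two checkboxes correspond to editor settings you can also toggle from a script; here is a small sketch (class name and menu path are my own):

```csharp
// Editor-only helper: sets the same flags as the "Development Build" and
// "Autoconnect Profiler" checkboxes in Build Settings.
using UnityEditor;

public static class ProfilingBuildFlags
{
    [MenuItem("Tools/Enable Profiling Build Flags")]
    public static void Enable()
    {
        EditorUserBuildSettings.development = true;     // "Development Build"
        EditorUserBuildSettings.connectProfiler = true; // "Autoconnect Profiler"
    }
}
```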