Testing on a Gear VR - Unity3D

I am building my first VR game for the Gear VR (since I got one for free). My question: is there a way to plug in the Gear VR and run the game from Unity, so I can see what is going on in the game while it is being played? Currently I test my game by building it, transferring it to my phone, installing it, and then testing it, which is very tedious. Here is my setup in case it helps:
Unity 5.5
Android Studio
Oculus Mobile SDK

Unity should be able to launch the build and profile it without you installing it manually. If you follow the instructions from Oculus, all you need to do is check the "Development Build" and "Autoconnect Profiler" boxes in Build Settings.
You may also want to enable Developer Mode on the phone so the app can run without the phone being in the VR headset.
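For reference, those two checkboxes correspond to BuildOptions flags, so you can also kick off a development build from an editor script. This is only a minimal sketch: the menu entry, scene path, and output path below are placeholders, not anything prescribed by the Oculus instructions.

```csharp
using UnityEditor;
using UnityEngine;

// Place this file in an Editor folder.
public static class GearVrDevBuild
{
    [MenuItem("Build/Gear VR Development Build")] // placeholder menu entry
    public static void Build()
    {
        // Development + ConnectWithProfiler mirror the "Development Build"
        // and "Autoconnect Profiler" checkboxes in Build Settings.
        BuildPipeline.BuildPlayer(
            new[] { "Assets/Scenes/Main.unity" },   // replace with your scene(s)
            "Builds/GearVR-dev.apk",                // placeholder output path
            BuildTarget.Android,
            BuildOptions.Development | BuildOptions.ConnectWithProfiler);

        Debug.Log("Gear VR development build finished.");
    }
}
```

With the phone connected over USB (or on the same network), the profiler should then attach automatically when the app starts.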

Related

Unity 3D GameObjects not visible in HoloLens2, MRTK

I followed this tutorial https://learn.microsoft.com/en-us/training/modules/learn-mrtk-tutorials/1-1-introduction and did every step. But when I run it on the HoloLens, I don't see anything.
The build configuration is as in the tutorial, the HoloLens 2 is in Developer mode and has been paired before. The tools are as listed in https://learn.microsoft.com/de-de/windows/mixed-reality/develop/install-the-tools?tabs=unity
I am deploying the app over Wi-Fi.
When the app starts on the HoloLens, the mesh appears for a moment and I get a prompt asking whether the app is allowed to use my microphone.
The "Build with Unity" splash (or similar) does not appear, and no objects appear either. I see nothing.
What could be the problem?
I use:
Unity 2021.1.20f
MRTK v1.0.2206 Preview (from the Microsoft Download Center)
It was actually the wrong Unity version.
I am now using Unity 2020.3.4 and I can see and move the objects.

How to set up a fast development workflow (plug and play) for Oculus Quest 2 with Unity

How would I go about setting up Unity and Oculus so I can hit Play in Unity and see the results immediately instead of building and running the game on Oculus?
First off, these are the essentials you need in order to develop for the Oculus Quest 2 with Unity. After completing these steps, you can, if you want, do the later ones to get the fastest possible workflow.
Unity side:
Install Unity (anything above 2018 would work, preferably the latest). Check the Android Build Support module, as well as the Android SDK & NDK and OpenJDK checkboxes.
Create a new empty 3D project (preferably already opened for the Android platform).
In Build Settings, switch to the Android platform (if it isn't already).
In the Package Manager, install the XR Plugin Management and Oculus XR Plugin packages.
In Project Settings -> XR Plugin Management, go to the Android tab and check the Oculus option. Wait for the import to finish.
For the Oculus side:
Make an Oculus Developer account
Put on your Quest 2 headset and log in to your developer account.
In your Quest 2 headset, go to Settings -> System -> Developer and turn on the USB Connection Dialog option. Alternatively, you can do the same with the Oculus Android app; either one works.
Connect the headset to the PC/laptop with a USB-C to USB cable (like the Oculus Link cable, though third-party cables work as well) and accept all the dialogs that show up, namely "Allow USB Debugging", "Always allow from this computer", and "Allow access to data".
Finally, in Unity, in Build Settings -> Run Device, choose the connected Oculus device from the options and click Build & Run (make sure you have at least one scene added to Build Settings). This will build the application and push it to your Oculus (if connected via USB). When you put your headset on you will see the application load.
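If you end up repeating that Build & Run step a lot, the same behaviour can be scripted from the editor. A minimal sketch (the menu entry and output path are made up for illustration):

```csharp
using System.Linq;
using UnityEditor;

// Place this file in an Editor folder.
public static class QuestBuildAndRun
{
    [MenuItem("Tools/Build And Run On Quest")] // placeholder menu entry
    public static void BuildAndRun()
    {
        // Reuse whatever scenes are already added (and enabled) in Build Settings.
        string[] scenes = EditorBuildSettings.scenes
            .Where(s => s.enabled)
            .Select(s => s.path)
            .ToArray();

        // AutoRunPlayer is the scripted equivalent of "Build & Run":
        // the APK is built, installed, and launched on the connected device.
        BuildPipeline.BuildPlayer(
            scenes,
            "Builds/Quest-dev.apk",   // placeholder output path
            BuildTarget.Android,
            BuildOptions.AutoRunPlayer);
    }
}
```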
That's it; this is the bare minimum you need to develop VR games for Oculus. But of course, if we leave it at that, developing VR apps is going to be really slow and tedious, because we will have to code basic VR behaviour ourselves and waste time building and pushing the app to the device every time we want to change and test something. So we are going to install a few more things in order to speed up development.
Install the Oculus Developer Hub. This will allow you to check whether your device is connected to the PC properly, and it has other functionality as well (you can screen capture and record video directly from the device). Once installed, connect your device with a USB-C to USB cable and make sure it shows up properly in the Oculus Developer Hub. (This step is NOT a must, but I recommend it.)
Install the Oculus app for Quest 2. You can find it on the Oculus website. We need this for the Oculus Link feature to work, which will allow us to test in real time instead of building and running the app on the Oculus.
Run the Oculus app; it will show a setup guide. Follow the guide. You can choose a connection via cable (Link cable) or via Wi-Fi (Air Link).
In your connected Oculus headset, a dialogue will pop up asking you if you wish to enable Oculus Link. Accept it.
In Unity, go to Package Manager and install the XR Interaction Toolkit. This plugin greatly lessens the troubles of setting up a VR rig.
In a new scene, right-click in the Hierarchy and go to XR -> Device-based -> XR Rig (it may also say XR Origin).
Click Play in Unity and put on your headset. You should see the tracking of the headset and controllers mirrored in the Unity editor (a small verification script is sketched after these steps).
That's pretty much it for the development side. If you wish to build for Oculus, you just have to go to Build Settings and in the Run Device option choose the connected Oculus Quest 2 device and click Build and Run. And of course save the scene with the XR Rig and add it to Build Settings.
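To check that Link tracking is actually reaching the editor in Play mode, a throwaway script like the following can help. This is only a sketch using the UnityEngine.XR input API, not something the XR Interaction Toolkit requires; attach it to any GameObject in the scene with the XR Rig.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs head position and right-trigger presses while playing over Oculus Link.
public class LinkTrackingCheck : MonoBehaviour
{
    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (head.isValid &&
            head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 headPos))
        {
            Debug.Log("HMD tracked at " + headPos);   // spams the Console; debug only
        }

        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (rightHand.isValid &&
            rightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) &&
            pressed)
        {
            Debug.Log("Right trigger pressed");
        }
    }
}
```

If nothing is logged with the headset on, Oculus Link (or the Oculus plug-in setup from the earlier steps) is the first thing to re-check.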
Tested to work with Unity 2020.3.25f1 on ASUS TUF laptop.
Feel free to correct me on some of the steps if you hit an obstacle. I would love to have as comprehensive a guide as possible.

Is there a way I can test an Augmented Reality app on a phone that doesn't support ARCore?

I have a Xiaomi Redmi Note 8 Pro and it doesn't seem to support ARCore. I found some ways to work around this, but they appear to be quite complicated and not very safe.
My question is:
What other tools would you recommend if I want to create an app in Unity that also needs to use GPS modules, maybe altimeter and of course camera (AR stuff)?
I heard that Vuforia might do the trick, and I also read something about AR Foundation from Unity. But it looked to me like, depending on the chosen deployment target, they use ARCore or ARKit (even Vuforia).
Could anyone clarify this?
I suggest you don't try messing with your device; it doesn't support ARCore for a good reason. You could also try the Android Studio emulator, but alas, for some unknown reason the APK generated from Unity couldn't be installed on the emulator when I tried it (some issue with the architecture).
If you want to use Unity anyway, I suggest you use Vuforia. It works on most modern devices and doesn't even need a device to test: just hit Play mode in Unity and you can test from your PC (a webcam is needed).
Vuforia Engine provides a simulator mode in the Game view that you can activate by pressing the Play button. You can use this feature to evaluate and rapidly prototype your scene(s) without having to deploy to a device. (Source: https://library.vuforia.com/articles/Training/getting-started-with-vuforia-in-unity.html)
For Unity with AR Foundation, you can't test on your PC the way you can with Vuforia; you need an ARCore/ARKit-supported device.
Lastly, if you want AR with GPS modules (although this is not with Unity), check out AR.js: https://github.com/AR-js-org/AR.js
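On the GPS/altimeter part of the question: inside Unity, location data comes from the built-in LocationService (Input.location), which works regardless of whether you use Vuforia or AR Foundation. A minimal sketch (it needs the location permission on Android; the accuracy and wait values are arbitrary):

```csharp
using System.Collections;
using UnityEngine;

// Starts the device's location service and logs one GPS fix.
public class GpsReader : MonoBehaviour
{
    IEnumerator Start()
    {
        if (!Input.location.isEnabledByUser)
        {
            Debug.LogWarning("Location services are disabled by the user.");
            yield break;
        }

        // Desired accuracy (m) and minimum movement (m) before an update.
        Input.location.Start(5f, 1f);

        int maxWait = 20;   // seconds to wait for initialization
        while (Input.location.status == LocationServiceStatus.Initializing && maxWait-- > 0)
            yield return new WaitForSeconds(1f);

        if (Input.location.status != LocationServiceStatus.Running)
        {
            Debug.LogError("Unable to determine device location.");
            yield break;
        }

        LocationInfo data = Input.location.lastData;
        Debug.Log("Lat " + data.latitude + ", Lon " + data.longitude + ", Alt " + data.altitude);
    }
}
```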

Setting up Oculus Rift in Unity with OpenVR + SteamVR

Hi, I'm trying to set up my Oculus Rift in Unity to develop with OpenVR and Steam. I'm using Unity 2017.4 and have added the SteamVR package from the Asset Store to my project. I'm guessing I need to download the OpenVR folder from GitHub and add it to my project, but is that all? Will the Rift be recognised then? I'm not sure if I need the Oculus Integration tool from the Asset Store as well, or whether it will interfere. Any step-by-step assistance would be amazing, thank you in advance. (Also, I know developing with Oculus and not Steam might be easier, but I need to use Steam for this project.)
I'm currently working with Oculus Rift so...
First, you need to download the Oculus Rift software. It will help you set up your Rift. You can download it HERE.
Second, you probably need the Oculus Rift SDK for Unity. It's working perfectly well for me; I just click Play and the app starts on the Oculus Rift. You can download it HERE, and use Import Asset to add it to your Unity project. I also highly recommend the Oculus Rift Avatar SDK (for hands and nice support for the Touch controllers) and the Oculus Rift Sample Framework for prefabs (like grabbers for grabbing things, obviously).
Also, if you run into any problems, remember to update your USB drivers (especially for USB 3.0).
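Since the question is about OpenVR/SteamVR specifically: with Unity 2017.4's built-in VR support you can also check or switch the active VR SDK from a script, provided "OpenVR" (and/or "Oculus") is listed under Player Settings -> XR Settings -> Virtual Reality SDKs. A minimal sketch under that assumption:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;   // this API lives in UnityEngine.VR before Unity 2017.2

// Switches the running app to the OpenVR (SteamVR) runtime.
public class LoadOpenVR : MonoBehaviour
{
    IEnumerator Start()
    {
        Debug.Log("Currently loaded VR device: " + XRSettings.loadedDeviceName);

        // The device switch only takes effect at the end of the frame.
        XRSettings.LoadDeviceByName("OpenVR");
        yield return null;
        XRSettings.enabled = true;

        Debug.Log("Now running on: " + XRSettings.loadedDeviceName);
    }
}
```

If SteamVR is installed and the Rift is set up in the Oculus software, the headset should then be driven through SteamVR rather than the native Oculus path.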

Unity3D 5.5: enable VR for Cardboard

I'm fairly new to Unity3D. I've watched this presentation by a Unity evangelist made in February 2016:
https://www.youtube.com/watch?v=pK0ZD53gOoE
The evangelist said and showed that to bring a project to VR you need to select one checkbox (Virtual Reality Supported). Now, in the just-downloaded 5.5 version, when I select Virtual Reality Supported it also says "you must add at least one VR SDK", and with that checkbox selected, when I play the scene I do not see the two-eye view, just the ordinary view. Last year, when I tried to make VR for Cardboard and loaded the Cardboard SDK, the view showed two screens, one for each eye, in Play mode.
So the question: how do I now make VR that works on Cardboard in Unity3D? Do I still need the Cardboard SDK (I don't need magnet input support or similar, just stereo rendering and head tracking)?
Added: somehow I don't have the Cardboard SDK option in Unity.
According to the Unity Blog, Cardboard support is exclusive to Android for now; iOS Cardboard support will be added soon.
"Do I still need Cardboard SDK?"
I don't think you need the SDK. There is now native support for Cardboard. Although, you can still download the Cardboard SDK and Unity will automatically use it.
And here is how to enable Cardboard SDK in Unity 5.5.
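If you also want to enable or switch to the Cardboard device from code (for example to toggle between VR and a normal view), something like the following works with the built-in VR support. This is only a sketch: in Unity 5.5 the API lives in UnityEngine.VR (it became UnityEngine.XR.XRSettings in 2017.2+), and it assumes "Cardboard" appears in the Virtual Reality SDKs list in Player Settings.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.VR;   // UnityEngine.XR in Unity 2017.2 and later

// Switches the running app to the native Cardboard VR device.
public class EnableCardboard : MonoBehaviour
{
    IEnumerator Start()
    {
        // The device switch only takes effect at the end of the frame.
        VRSettings.LoadDeviceByName("cardboard");
        yield return null;
        VRSettings.enabled = true;
        Debug.Log("Active VR device: " + VRSettings.loadedDeviceName);
    }
}
```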
Thanks Programmer for your answer, it helped me find what my issue was. However, I've spent literally all of my time over the past few days trying to get Google Cardboard and Unity set up, so I figured I could at least post my answer too.
One problem I initially ran into was that I was trying to use GVR Unity SDK v1.1. There are several bugs in this version, so I reverted to GVR Unity SDK v1.0.3. It can be downloaded from the GitHub repo: https://codeload.github.com/googlevr/gvr-unity-sdk/zip/f391c2436426857899af1c37f0720b3985631eb3
Then I ran into multiple problems just getting things to run on Android, and found that I have to use build-tools version 24.0.1. This can be downloaded using the Android SDK Manager executable.
Lastly, the "cardboard" option wasn't appearing for me in the drop-downs, just like in the picture posted by the asker. The problem was that I was using a regular version of Unity, when I actually needed to be using the technical preview. This can be downloaded from https://unity3d.com/partners/google/daydream, and I'm running Unity v5.4.2f2-GVR13.
Then the drop down "cardboard" appeared, and I was able to follow google's official instructions (https://developers.google.com/vr/unity/get-started) and get things set up.
I ended up installing the Unity 5.6 beta and got my Cardboard app built quite simply using the EasyMovieTexture store asset, turning on VR in Player Settings, and adding Google Cardboard. It is working great on iOS and Android. I am now working on turning off the Google overlay, as my app toggles between Cardboard and a simple 360 view, and that is proving a challenge. From what I have read, it seems Google does not want you to turn off their overlay. I might try to do this natively on iOS.