How can I set the display to stereoscopic programmatically in Unity for an app deployed to an Android device?
I want a UI menu where the user can toggle between "VR mode" and normal mode. I do not want VR mode by default as it should be an option at run-time. I know there is a setting for "Virtual Reality Supported" in the build settings, but again, I do not want this enabled by default.
Include using UnityEngine.XR; at the top.
Call XRSettings.LoadDeviceByName("") with an empty string, followed by XRSettings.enabled = false;, in the Start function to disable VR.
When you want to enable it later on, call XRSettings.LoadDeviceByName("daydream") with the device name, followed by XRSettings.enabled = true;.
You should wait one frame between the two calls, which means this has to be done in a coroutine.
Also, on some VR devices you must go to Edit -> Project Settings -> Player and make sure that the Virtual Reality Supported check-box is ticked before this will work. You can then disable VR in the Start function and enable it whenever you want.
EDIT:
This is known to work on some VR devices but not all of them; it should work on Daydream VR. Complete code sample:
IEnumerator LoadDevice(string newDevice, bool enable)
{
XRSettings.LoadDeviceByName(newDevice);
yield return null;
XRSettings.enabled = enable;
}
void EnableVR()
{
StartCoroutine(LoadDevice("daydream", true));
}
void DisableVR()
{
StartCoroutine(LoadDevice("", false));
}
Call EnableVR() to enable VR and DisableVR() to disable it. If you are using anything other than Daydream, pass the name of that VR device to the LoadDevice function inside EnableVR().
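To wire this up to the run-time UI menu the question asks for, here is a minimal sketch (the Toggle field and the "daydream" device name are assumptions; the coroutine is the same one shown above):

using System.Collections;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR;

public class VRModeToggle : MonoBehaviour
{
    // Assumed: a UI Toggle assigned in the Inspector.
    public Toggle vrToggle;

    void Start()
    {
        // Start in normal (non-VR) mode.
        StartCoroutine(LoadDevice("", false));
        // Switch whenever the user flips the toggle.
        vrToggle.onValueChanged.AddListener(on =>
            StartCoroutine(LoadDevice(on ? "daydream" : "", on)));
    }

    IEnumerator LoadDevice(string newDevice, bool enable)
    {
        XRSettings.LoadDeviceByName(newDevice);
        yield return null; // must wait one frame before changing XRSettings.enabled
        XRSettings.enabled = enable;
    }
}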
For newer builds of Unity (e.g. 2019.4.0f1) you can use the XR Plugin Management package.
To enable call:
XRGeneralSettings.Instance.Manager.InitializeLoader();
To disable call:
XRGeneralSettings.Instance.Manager.DeinitializeLoader();
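A minimal sketch of how these calls are typically combined with the subsystem start/stop calls from the package manual (this assumes XR Plug-in Management is installed and a loader such as Cardboard or Oculus is assigned for the build target; note that InitializeLoader() is a coroutine):

using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

public class XRModeSwitcher : MonoBehaviour
{
    // Call with StartCoroutine(EnableXR()).
    public IEnumerator EnableXR()
    {
        // InitializeLoader is a coroutine; wait for it to pick an available loader.
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader == null)
        {
            Debug.LogError("Failed to initialize an XR loader.");
            yield break;
        }

        // Start the display and input subsystems so rendering switches to stereo.
        XRGeneralSettings.Instance.Manager.StartSubsystems();
    }

    public void DisableXR()
    {
        // Stop subsystems before deinitializing the loader.
        XRGeneralSettings.Instance.Manager.StopSubsystems();
        XRGeneralSettings.Instance.Manager.DeinitializeLoader();
    }
}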
I'm using Unity 2021, but this probably works in earlier versions; I'm also using XR Plug-in Management.
Start:
XRGeneralSettings.Instance.Manager.StartSubsystems();
Stop:
XRGeneralSettings.Instance.Manager.StopSubsystems();
Full documentation at:
https://docs.unity3d.com/Packages/com.unity.xr.management@4.0/manual/EndUser.html
Unity 2020.3.14f1
This doesn't work for me; I get this error when running my Android app:
Call to DeinitializeLoader without an initialized manager. Please make sure wait for initialization to complete before calling this API.
[RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterAssembliesLoaded)]
static void TryToDeinitializeOculusLoader()
{
XRGeneralSettings.Instance.Manager.DeinitializeLoader();
}
More context:
I am trying to unload the Oculus loader before it manages to load the plugin.
I have an Android app, and the Oculus loader calls Application.Quit because the device is not an Oculus headset.
Waiting for XRGeneralSettings.Instance.Manager.isInitializationComplete takes too long.
Tried all RuntimeInitializeLoadType annotations.
OculusLoader.cs
#elif (UNITY_ANDROID && !UNITY_EDITOR)
[RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterAssembliesLoaded)]
static void RuntimeLoadOVRPlugin()
{
var supported = IsDeviceSupported();
if (supported == DeviceSupportedResult.ExitApplication)
{
Debug.LogError("\n\nExiting application:\n\nThis .apk was built with the Oculus XR Plugin loader enabled, but is attempting to run on a non-Oculus device.\nTo build for general Android devices, please disable the Oculus XR Plugin before building the Android player.\n\n\n");
Application.Quit();
}
if (supported != DeviceSupportedResult.Supported)
return;
try
{
if (!NativeMethods.LoadOVRPlugin(""))
Debug.LogError("Failed to load libOVRPlugin.so");
}
catch
{
// handle Android standalone build with Oculus XR Plugin installed but disabled in loader list.
}
}
#endif
SOLUTION
I made my build class implement IPreprocessBuildWithReport:
public void OnPreprocessBuild(BuildReport report)
{
DisableXRLoaders(report);
}
/// https://docs.unity3d.com/Packages/com.unity.xr.management@3.2/manual/EndUser.html
/// Do this as a setup step before you start a build, because the first thing that XR Plug-in Manager does at build time
/// is to serialize the loader list to the build target.
void DisableXRLoaders(BuildReport report)
{
XRGeneralSettingsPerBuildTarget buildTargetSettings;
EditorBuildSettings.TryGetConfigObject(XRGeneralSettings.k_SettingsKey, out buildTargetSettings);
if (buildTargetSettings == null)
{
return;
}
XRGeneralSettings settings = buildTargetSettings.SettingsForBuildTarget(report.summary.platformGroup);
if (settings == null)
{
return;
}
XRManagerSettings loaderManager = settings.AssignedSettings;
if (loaderManager == null)
{
return;
}
var loaders = loaderManager.activeLoaders;
// If there are no loaders present in the current manager instance, then the settings will not be included in the current build.
if (loaders.Count == 0)
{
return;
}
var loadersForRemoval = new List<XRLoader>();
loadersForRemoval.AddRange(loaders);
foreach (var loader in loadersForRemoval)
{
loaderManager.TryRemoveLoader(loader);
}
}
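For completeness, a sketch of how this might be wrapped up as an editor class (the class name is illustrative; IPreprocessBuildWithReport also requires a callbackOrder property, and the usings listed are the ones the snippet above needs):

using System.Collections.Generic;
using UnityEditor;
using UnityEditor.Build;
using UnityEditor.Build.Reporting;
using UnityEditor.XR.Management;
using UnityEngine.XR.Management;

// Illustrative name; put this in an Editor folder.
class DisableXRLoadersBuildStep : IPreprocessBuildWithReport
{
    // Required by IPreprocessBuildWithReport; 0 runs it early in the build.
    public int callbackOrder => 0;

    public void OnPreprocessBuild(BuildReport report)
    {
        DisableXRLoaders(report);
    }

    // ... DisableXRLoaders exactly as shown above ...
}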
public void Awake() {
StartCoroutine(SwitchToVR(()=>{
Debug.Log("Switched to VR Mode");
}));
// To disable VR mode instead, just set:
// XRSettings.enabled = false;
}
IEnumerator SwitchToVR(Action callback) {
// Device names are lowercase, as returned by `XRSettings.supportedDevices`.
// The Google original makes you specify a single device:
// string desiredDevice = "daydream"; // Or "cardboard".
// XRSettings.LoadDeviceByName(desiredDevice);
// Passing a prioritized list is slightly better:
string[] Devices = new string[] { "daydream", "cardboard" };
XRSettings.LoadDeviceByName(Devices);
// Must wait one frame after calling `XRSettings.LoadDeviceByName()`.
yield return null;
// Now it's ok to enable VR mode.
XRSettings.enabled = true;
callback.Invoke();
}
I'm trying to use a cloud TTS within my Unity game.
With the newer versions (I am using 2019.1), they have deprecated WWW in favour of UnityWebRequest(s) within the API.
I have tried the Unity Documentation, but that didn't work for me.
I have also tried other threads and they use WWW which is deprecated for my Unity version.
void Start()
{
StartCoroutine(PlayTTS());
}
IEnumerator PlayTTS()
{
using (UnityWebRequestMultimedia wr = new UnityWebRequestMultimedia.GetAudioClip(
"https://example.com/tts?text=Sample%20Text&voice=Male",
AudioType.OGGVORBIS)
)
{
yield return wr.Send();
if (wr.isNetworkError)
{
Debug.LogWarning(wr.error);
}
else
{
//AudioClip ttsClip = DownloadHandlerAudioClip.GetContent(wr);
}
}
}
The URL in a browser (I used Firefox) successfully loaded the audio clip allowing me to play it.
What I want it to do is play the TTS when something happens in the game, it has been done within the "void Start" for testing purposes.
Where am I going wrong?
Thanks in advance
Josh
UnityWebRequestMultimedia.GetAudioClip automatically adds a default DownloadHandlerAudioClip, which has a streamAudio property.
Set this to true and add a check on UnityWebRequest.downloadedBytes to delay playback until enough data has arrived.
Something like
public AudioSource source;
IEnumerator PlayTTS()
{
using (var wr = UnityWebRequestMultimedia.GetAudioClip(
"https://example.com/tts?text=Sample%20Text&voice=Male",
AudioType.OGGVORBIS)
)
{
((DownloadHandlerAudioClip)wr.downloadHandler).streamAudio = true;
wr.SendWebRequest();
while(!wr.isNetworkError && wr.downloadedBytes <= someThreshold)
{
yield return null;
}
if (wr.isNetworkError)
{
Debug.LogWarning(wr.error);
}
else
{
// Here I'm not sure if you would use
source.PlayOneShot(DownloadHandlerAudioClip.GetContent(wr));
// or rather
//source.PlayOneShot(((DownloadHandlerAudioClip)wr.downloadHandler).audioClip);
}
}
}
Typed on a smartphone, so no warranty, but I hope you get the idea.
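Note also that the question's snippet won't compile as written: GetAudioClip is a static factory method on UnityWebRequestMultimedia, not a nested type, so it is called without new. A minimal non-streaming sketch (the URL is the placeholder from the question and source is an AudioSource you assign yourself):

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class TTSPlayer : MonoBehaviour
{
    public AudioSource source;

    IEnumerator PlayTTS()
    {
        // GetAudioClip returns a UnityWebRequest with a DownloadHandlerAudioClip attached.
        using (UnityWebRequest wr = UnityWebRequestMultimedia.GetAudioClip(
            "https://example.com/tts?text=Sample%20Text&voice=Male",
            AudioType.OGGVORBIS))
        {
            yield return wr.SendWebRequest();

            if (wr.isNetworkError || wr.isHttpError)
            {
                Debug.LogWarning(wr.error);
            }
            else
            {
                // Download the whole clip first, then play it.
                source.PlayOneShot(DownloadHandlerAudioClip.GetContent(wr));
            }
        }
    }
}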
Working on a Unity hybrid VR (cardboard) /2D app. The cardboard side of it works fine. I am having trouble with the 2D/VR switching.
When I am in 2D mode, the reticle does not move, although screen taps register, so the app seems unaware of the gyro.
I feel like I am missing something fundamental here. I have a GvrEventSystem prefab that has both an EventSystem and GvrPointerInputModule components.
What obvious thing am I overlooking?
ETA:
I have been asked to add relevant code. Here is the code for 2D/VR switching on the fly. This code executes without error, and the app switches between VR and 2D mode every 3 seconds:
readonly string NONE_STRING = "";
readonly string CARDBOARD_STRING = "cardboard";
void Start()
{
Invoke("GoPhone", 3.0f);
}
void GoPhone()
{
SetVREnabled(false);
Invoke("GoVR", 3.0f);
}
void GoVR()
{
SetVREnabled(true);
Invoke("GoPhone", 3.0f);
}
void SetVREnabled(bool isEnabled)
{
if (isEnabled)
{
StartCoroutine(LoadDevice(CARDBOARD_STRING));
}
else
{
StartCoroutine(LoadDevice(NONE_STRING));
}
}
IEnumerator LoadDevice(string newDevice)
{
if (String.Compare(XRSettings.loadedDeviceName, newDevice, true) != 0)
{
XRSettings.LoadDeviceByName(newDevice);
yield return null;
if (!XRSettings.loadedDeviceName.Equals(NONE_STRING))
XRSettings.enabled = true;
}
}
Although I feel like my problem is a configuration problem, and not a code problem. In the editor, which does not support VR mode, the app behaves in 2D mode as expected.
User error! I did not follow the "Magic Window" instructions as detailed at https://developers.google.com/vr/develop/unity/guides/magic-window... let my folly be a warning to future generations!
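For reference, the "magic window" setup essentially means driving the camera from the device's orientation yourself while VR is disabled. A minimal gyroscope-based sketch (not the exact Google sample; axis conventions may need tweaking per device):

using UnityEngine;
using UnityEngine.XR;

public class MagicWindowCamera : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // In VR mode, head tracking drives the camera, so do nothing here.
        if (XRSettings.enabled)
            return;

        // Convert the right-handed gyro attitude into Unity's left-handed camera space.
        Quaternion att = Input.gyro.attitude;
        transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(att.x, att.y, -att.z, -att.w);
    }
}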
I have 2 issues when using PhotoCapture on the Hololens 2, which are probably connected. They did not occur on the Hololens 1.
The only obtainable resolution is 3904x2196
Getting the CameraToWorldMatrix always fails
As can be seen in the docs, this resolution has no framerate associated with it.
My presumption is that the CameraToWorldMatrix is only obtainable for camera profiles with a lower resolution.
How can I change the resolution and get the matrix within Unity?
Minimal Reproducible Example
I am using Unity 2019.2.19f1 and Visual Studio 2019 Community (16.4.5)
Create a new Unity Project following the steps here, except using IL2CPP as the scripting backend instead of .NET
In Player Settings -> Capabilities, enable InternetClient, InternetClientServer, PrivateNetworkClientServer, Webcam, Microphone and SpatialPerception; in Player Settings -> Supported Device Families, select Holographic
Create a new empty gameobject and add the following script:
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.WebCam;
public class PhotoCaptureController : MonoBehaviour
{
PhotoCapture photoCaptureObject = null;
bool isRunning = false;
void Start()
{
StartCoroutine(StartCameraCapture());
}
private IEnumerator StartCameraCapture()
{
if (!Application.HasUserAuthorization(UserAuthorization.WebCam))
{
yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);
}
if (Application.HasUserAuthorization(UserAuthorization.WebCam))
{
Debug.Log("Creating PhotoCapture");
PhotoCapture.CreateAsync(false, OnPhotoCaptureCreated);
}
else
{
Debug.Log("Webcam Permission not granted");
}
}
private void Update()
{
if (isRunning)
{
photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
}
}
void OnPhotoCaptureCreated(PhotoCapture captureObject)
{
photoCaptureObject = captureObject;
IEnumerable<Resolution> availableResolutions = PhotoCapture.SupportedResolutions;
foreach (var res in availableResolutions)
{
Debug.Log("PhotoCapture Resolution: " + res.width + "x" + res.height);
}
Resolution cameraResolution = availableResolutions.OrderByDescending((res) => res.width * res.height).First();
CameraParameters c = new CameraParameters();
c.hologramOpacity = 0.0f;
c.cameraResolutionWidth = cameraResolution.width;
c.cameraResolutionHeight = cameraResolution.height;
c.pixelFormat = CapturePixelFormat.BGRA32;
captureObject.StartPhotoModeAsync(c, OnPhotoModeStarted);
}
private void OnPhotoModeStarted(PhotoCapture.PhotoCaptureResult result)
{
if (result.success)
{
isRunning = true;
photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
}
}
void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame frame)
{
if (result.success)
{
if (frame.TryGetCameraToWorldMatrix(out Matrix4x4 cameraToWorldMatrix))
{
Debug.Log("Successfully obtained CameraToWorldMatrix: " + cameraToWorldMatrix.ToString());
}
else
{
Debug.Log("Failed to obtain CameraToWorldMatrix");
}
}
frame.Dispose();
}
}
Change Build Settings:
Build, open the VS solution, set the build target to ARM64 (Debug), and deploy to the device
This CameraCapture plugin may work for you. It's a native plugin for Unity. The readme.txt contains information on building it. Basically, you need to build the native plugin variants before trying to open up the Unity sample. If you haven’t built C++/WinRT components, you may need to go through the setup steps here. The SpatialCameraTracker.cs file shows how to receive the camera matrices.
There is minimal support for this, but all the code is in the repo.
Question asked just to give solution :)
The main idea is to recognize a QR code in Unity without ANY additional action like tapping on the screen or similar.
(For me it doesn't matter that the free version of Vuforia has a watermark, so here is my solution.)
(Also, Vuforia works with the camera much faster and there is no need to implement autofocus manually.)
Continuous QR code recognition using Vuforia as the webcam source and the ZXing library as the QR recognizer:
using UnityEngine;
using Vuforia;
using ZXing;
public class QRCodeReader : MonoBehaviour {
private bool _isFrameFormatSet;
IBarcodeReader _barcodeReader = new BarcodeReader();
void Start () {
InvokeRepeating("Autofocus", 2f, 2f);
}
void Autofocus () {
CameraDevice.Instance.SetFocusMode(CameraDevice.FocusMode.FOCUS_MODE_TRIGGERAUTO);
RecognizeQR();
}
private Vuforia.Image GetCurrFrame()
{
return CameraDevice.Instance.GetCameraImage(Vuforia.Image.PIXEL_FORMAT.GRAYSCALE);
}
void RecognizeQR()
{
if (!_isFrameFormatSet)
{
_isFrameFormatSet = CameraDevice.Instance.SetFrameFormat(Vuforia.Image.PIXEL_FORMAT.GRAYSCALE, true);
}
var currFrame = GetCurrFrame();
if (currFrame == null)
{
Debug.Log("Camera image capture failure;");
}
else
{
var imgSource = new RGBLuminanceSource(currFrame.Pixels, currFrame.BufferWidth, currFrame.BufferHeight, true);
var result = _barcodeReader.Decode(imgSource);
if (result != null)
{
Debug.Log("RECOGNIZED: " + result.Text);
}
}
}
}
It's also possible to do this without Vuforia, of course. Unity can access a camera and show its input on a WebCamTexture; you can find more documentation here.
You can find the ZXing library here, or build it yourself from the source code on GitHub.
Both libraries are cross-platform, so there should be no issues on different devices.
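A minimal sketch of that Vuforia-free variant, assuming the Unity build of ZXing.Net (which exposes a Decode(Color32[], width, height) overload):

using UnityEngine;
using ZXing;

public class WebCamQRReader : MonoBehaviour
{
    WebCamTexture _camTexture;
    readonly IBarcodeReader _reader = new BarcodeReader();

    void Start()
    {
        _camTexture = new WebCamTexture();
        _camTexture.Play();
        // Poll once per second instead of every frame to keep decoding cheap.
        InvokeRepeating(nameof(Decode), 1f, 1f);
    }

    void Decode()
    {
        // The camera reports a tiny placeholder size until it is actually running.
        if (_camTexture.width <= 16)
            return;

        var result = _reader.Decode(
            _camTexture.GetPixels32(), _camTexture.width, _camTexture.height);
        if (result != null)
            Debug.Log("RECOGNIZED: " + result.Text);
    }
}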