https://youtu.be/dtKArIh8UTQ
The video above shows MRTK2 working with Quest Pro's color passthrough. Has anyone been able to get it working with MRTK3?
I can't get the OpenXR and Oculus XR plugins to run simultaneously. MRTK3 won't work without OpenXR, and passthrough won't work without the Oculus plugin.
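For context, the passthrough setup I'm referring to is the one from the Oculus Integration package (OVRManager / OVRPassthroughLayer), which is what ties it to the Oculus XR plugin. A rough sketch of that setup, assuming those Oculus Integration component names (this is not MRTK code):

```csharp
using UnityEngine;

// Minimal passthrough bootstrap, assuming the Oculus Integration package
// (OVRManager, OVRPassthroughLayer) is imported and the Oculus XR plugin
// is the active loader. A sketch for context, not MRTK3-specific code.
public class PassthroughBootstrap : MonoBehaviour
{
    void Start()
    {
        // Insight Passthrough has to be enabled on the OVRManager in the scene.
        OVRManager.instance.isInsightPassthroughEnabled = true;

        // The camera feed is rendered through an OVRPassthroughLayer,
        // typically as an underlay behind the virtual content.
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay;
    }
}
```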
Related
I am building an Oculus app using MRTK (Mixed Reality Toolkit) and need Agora's video feed streaming capabilities.
While building my app, I am able to hear sound but not able to see the Unity scene; it's all blank and appears as a black screen.
Has anyone encountered this before?
I did try the Agora SDK with Oculus Integration (without MRTK) and it works fine.
Yes, change the XR Plug-in Management setting in Project Settings to the Oculus provider for the Android build target.
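If you'd rather do this from an editor script than through the Project Settings UI, something along these lines should work (a sketch using the XR Plug-in Management API; the "Unity.XR.Oculus.OculusLoader" type name is an assumption based on the Oculus XR plugin package):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.XR.Management;
using UnityEditor.XR.Management.Metadata;
using UnityEngine.XR.Management;

// Sketch: assign the Oculus loader for the Android build target via script,
// equivalent to ticking "Oculus" under Project Settings > XR Plug-in Management.
public static class EnableOculusForAndroid
{
    [MenuItem("Tools/Enable Oculus XR Loader (Android)")]
    public static void Enable()
    {
        // XR Plug-in Management stores per-build-target settings in an asset
        // registered under XRGeneralSettings.k_SettingsKey.
        EditorBuildSettings.TryGetConfigObject(
            XRGeneralSettings.k_SettingsKey,
            out XRGeneralSettingsPerBuildTarget perBuildTarget);

        var androidSettings = perBuildTarget.SettingsForBuildTarget(BuildTargetGroup.Android);

        // Assumed fully-qualified loader type name for the Oculus XR plugin.
        XRPackageMetadataStore.AssignLoader(
            androidSettings.Manager,
            "Unity.XR.Oculus.OculusLoader",
            BuildTargetGroup.Android);
    }
}
#endif
```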
I am using Unity 2019.3.15f1 with MRTK 2.4.0 to develop an augmented reality app for the HoloLens 2. Unfortunately, the stereoscopic rendering is not working correctly: there is a mismatch between the images rendered for the right and the left eye, so you see two images of the scene when running the app on the HoloLens. It doesn't matter whether I use the Remote Player or run a built app on the HoloLens; in both cases the images rendered for the two eyes don't match. The HoloLens is calibrated. Other apps from the Microsoft Store, and the home menu on the HoloLens, show no such mismatch. But when I run the MRTK example scenes on my HoloLens 2, there is also a mismatch between the images rendered for the two eyes.
Has anyone had the same issue and can provide some ideas on how the images can be matched correctly for both eyes?
It seems like something went wrong in your Unity project settings; it is recommended to double-check the XR settings to troubleshoot. For XR SDK, please follow this link: Getting started with MRTK and XR SDK. For Legacy XR, have a look at Legacy XR.
Besides, we always recommend the latest Unity LTS (Long Term Support) release and the latest MRTK version, so that known issues can be avoided; the current recommendation is Unity 2019.4.20f1 and Mixed Reality Toolkit v2.5.4.
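As a quick sanity check alongside the settings review, a small runtime script like the one below (a sketch, assuming the XR SDK / XR Plug-in Management pipeline) can log whether an XR loader actually initialized and whether a display subsystem is running, which helps narrow down whether the mismatch comes from the XR setup or from the scene itself:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Management;

// Sketch: log the state of the XR SDK pipeline at startup to help
// troubleshoot rendering issues such as mismatched per-eye images.
public class XrSetupCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager == null || manager.activeLoader == null)
        {
            Debug.LogWarning("No active XR loader - check XR Plug-in Management.");
            return;
        }

        Debug.Log($"Active XR loader: {manager.activeLoader.name}");

        // A running XRDisplaySubsystem means the stereo display pipeline is up.
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetInstances(displays);
        foreach (var display in displays)
        {
            Debug.Log($"Display subsystem running: {display.running}");
        }
    }
}
```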
I have my Unity hand-tracking application running on my Oculus Quest, but I need to debug it in the Unity Editor.
I have Unity 2019.3.0f6, and the features for this version include a connection via Oculus Link, but I can't get the app to show in the Editor. Help!
Hi, I'm trying to set up my Oculus Rift in Unity to develop with OpenVR and Steam. I'm using Unity 2017.4 and have added the SteamVR package from the Asset Store to my project. I'm guessing I need to download the OpenVR folder from GitHub and add it to my project, but is that all, and will the Rift be recognised then? I'm not sure if I need the Oculus Integration tool from the Asset Store as well, or whether it will interfere. Any step-by-step assistance would be amazing; thank you in advance. (Also, I know developing with Oculus rather than Steam might be easier, but I need to use Steam for this project.)
I'm currently working with Oculus Rift so...
First, you need to download the Oculus Rift software. It will help you set up your Rift. You can download it HERE.
Second, you probably need the Oculus Rift SDK for Unity. It works perfectly well for me: I just click Play and the app starts in the Oculus Rift. You can download it HERE, and use Import Asset to add it to your Unity project. I also highly recommend the Oculus Avatar SDK (for hands and nice Touch controller support) and the Oculus Sample Framework for prefabs (like grabbers for grabbing things, obviously); a minimal grabber setup is sketched below.
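For the grabbing part specifically, the Sample Framework's pattern is roughly: an OVRGrabber on each hand/controller anchor and an OVRGrabbable on each object you want to pick up. A minimal sketch (component names are from the Oculus Sample Framework; adding them from code here is just for illustration, you would normally do it in the Inspector):

```csharp
using UnityEngine;

// Sketch: make an object grabbable with the Oculus Sample Framework.
// This only shows which pieces are involved; in practice the components
// are usually added in the Inspector rather than from code.
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // A grabbable object needs a collider and a Rigidbody...
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }
        if (GetComponent<Rigidbody>() == null)
        {
            gameObject.AddComponent<Rigidbody>();
        }

        // ...plus the OVRGrabbable component from the Sample Framework.
        if (GetComponent<OVRGrabbable>() == null)
        {
            gameObject.AddComponent<OVRGrabbable>();
        }
        // The hand/controller anchors need OVRGrabber components; the
        // Sample Framework's grabber prefabs already include them.
    }
}
```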
Also, if you have any problems, remember to update your USB drivers (especially for USB 3.0).
I'm trying to develop a game in Unity with the Oculus Rift. I have a DK1 as my environment. I have a demo game called RiftCoaster and it works fine, but the default Oculus Rift demo doesn't work, and the demo Unity projects I download from Oculus don't work either. What might be the problem? Although some samples work, I couldn't get the Unity projects of the default demos to work.