Autodesk Revit in particular doesn't work in WMR desktop

I'm trying to use Revit in VR while inside a real-time rendering of the model. The WMR desktop doesn't let me click on the toolbars, but the SteamVR desktop does.
See this link to a video of the issue: https://www.dropbox.com/s/5inunjyj9jx8w0i/WMR%20Issue.mp4?dl=0
Thanks

Related

HoloLens video recording not showing UI elements

We're working with the HoloLens 2, and unfortunately Unity UI elements are not shown in the image/video when recording a video, taking an image, or going into the live view.
Does anybody know how we can make Unity UI elements appear in video and photo capture on HoloLens?
Here is an example where there is text present below the title, but it is not captured in the image.
The title part uses a TextMeshPro component, while the text part uses a TextMeshProUGUI component (due to the scrolling window of the text).
We're using Unity 2020.3.6f1, MRTK 2.7.2 with OpenXR backend.
Thanks for any help and recommendations.
To create mixed-reality photos and videos, you can use the Start gesture to go to Start, then select the Camera icon. For more information, please refer to this link: Create mixed reality photos and videos.
If you want to seamlessly integrate mixed reality capture and insertion into your apps, you need to enable the Windows Mixed Reality Camera Settings provider in your MRTK profile and check Render from PV Camera.
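If you are triggering captures from your own scripts rather than the system UI, Unity's PhotoCapture API (in the UnityEngine.Windows.WebCam namespace on recent Unity versions) also lets you request that holograms be rendered into the photo. A minimal sketch, assuming that namespace is available in your Unity version; the script name is ours, the API names are Unity's:

```csharp
using System.IO;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.WebCam;

// Minimal sketch: take one photo with holograms (including UI) composited in.
public class HologramPhotoCapture : MonoBehaviour
{
    PhotoCapture photoCaptureObject;

    void Start()
    {
        // 'true' asks the capture pipeline to render holograms into the photo.
        PhotoCapture.CreateAsync(true, OnPhotoCaptureCreated);
    }

    void OnPhotoCaptureCreated(PhotoCapture captureObject)
    {
        photoCaptureObject = captureObject;

        // Pick the highest supported camera resolution.
        Resolution resolution = PhotoCapture.SupportedResolutions
            .OrderByDescending(res => res.width * res.height)
            .First();

        CameraParameters cameraParameters = new CameraParameters
        {
            hologramOpacity = 1.0f, // fully opaque holograms in the capture
            cameraResolutionWidth = resolution.width,
            cameraResolutionHeight = resolution.height,
            pixelFormat = CapturePixelFormat.BGRA32
        };

        captureObject.StartPhotoModeAsync(cameraParameters, result =>
        {
            string path = Path.Combine(Application.persistentDataPath, "capture.png");
            photoCaptureObject.TakePhotoAsync(path, PhotoCaptureFileOutputFormat.PNG, OnCapturedToDisk);
        });
    }

    void OnCapturedToDisk(PhotoCapture.PhotoCaptureResult result)
    {
        photoCaptureObject.StopPhotoModeAsync(r => photoCaptureObject.Dispose());
    }
}
```

If holograms still don't appear in the output, the capture is subject to the same mixed reality capture pipeline, so the Unity-version fix mentioned below may still be required.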
The issue was that our Unity version was not up to date, as hinted at in this GitHub issue. Simply updating to the newest Unity version solved the problem.
https://github.com/microsoft/MixedRealityToolkit-Unity/issues/10155

Unity: project works in editor but black screen when built

I'm working on a project using Unity 2019 LTS and some Unity SDKs/packages:
Mapbox SDK
DreamWorld SDK (the SDK of my AR headset)
some other default AR packages (AR Foundation, AR Subsystems)
I would like to reuse the Mapbox World-scale AR example in order to move the scene according to my AR headset position.
To do so, I removed the default main camera of the example (in AR Root) and added the camera for my headset instead, as explained in the headset's docs (DW Developer Kit SDK).
Here are some pictures of what I've done:
Now here's my problem: when I run the project in the editor in play mode, everything works perfectly fine and I see the camera rotation following the position of my AR headset.
However, when I build the project, I cannot see the camera's view. I know the project runs because I can still see the overlay menu provided by the Mapbox World-scale example, but not my camera.
Editor:
Build:
I searched online for a solution to my issue, but I only found answers about building for Android and iPhone, while I am trying to build for my laptop.
The fact that I see a black screen (and the overlay) suggests to me that Unity cannot find a camera to render the scene.
I just started using Unity, so it is possible that I missed something obvious, but I don't know what.
If someone has any idea what my problem is...
In case the suggestion from the comments about the in-game logs does not work, you can check the external log file.
According to https://docs.unity3d.com/Manual/LogFiles.html
it is found under "C:\Users\YOUR_USERNAME\AppData\LocalLow\CompanyName\ProductName\Player.log"
CompanyName and ProductName are two values you can set in the Unity Project Settings, but there are defaults if you never changed them.
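Since the black screen suggests a missing or misconfigured camera, a small diagnostic script whose output ends up in that Player.log can confirm it. A minimal sketch; the script is a hypothetical helper, not part of Mapbox or the DreamWorld SDK:

```csharp
using UnityEngine;

// Minimal diagnostic sketch: attach to any GameObject so the build's
// Player.log shows whether a camera is actually rendering.
public class CameraDiagnostics : MonoBehaviour
{
    void Start()
    {
        // Camera.allCameras lists only the currently enabled cameras.
        Debug.Log($"Enabled cameras: {Camera.allCamerasCount}");
        foreach (Camera cam in Camera.allCameras)
        {
            Debug.Log($"'{cam.name}': depth={cam.depth}, " +
                      $"targetDisplay={cam.targetDisplay}, cullingMask={cam.cullingMask}");
        }

        if (Camera.main == null)
        {
            // A common cause of a black screen: no enabled camera tagged "MainCamera".
            Debug.LogWarning("No enabled camera tagged MainCamera found.");
        }
    }
}
```

If the log shows no enabled camera, or a camera with an unexpected targetDisplay or culling mask, that points at the cause of the build-only black screen.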

How to draw lines from browser on remote mobile AR app?

I am looking for a solution to share the screen from a mobile AR app (ARKit or Unity AR Foundation).
The screen needs to be shared to a browser on the desktop, and it should be possible to draw lines on that screen from the browser with the mouse, so that the lines appear in the AR environment on the mobile app that is sharing the screen.
After some investigation, there does not seem to be a viable solution to truly share the same AR instance between a browser and a mobile device the way you can with two mobile devices.
There should, however, be some sort of workaround possible, as it can be done with Vuforia Chalk AR.
Here is a GIF showing how it works:
AR Drawing demo
Sharing a video stream seems to be possible.
Specifically, I'm trying to figure out how the line is drawn from the browser and then displayed in the mobile AR app.
How can you achieve the same functionality with open-source alternatives, or with Unity and custom code (Vuforia is not an option)?
I'm looking for a tutorial or some directions on how this can be implemented.
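One possible direction, sketched under assumptions rather than tested: if the browser streams the mouse strokes as normalized screen coordinates over some transport (for example a WebSocket or a WebRTC data channel), the mobile app can unproject each point a fixed distance in front of the AR camera and render the stroke with a LineRenderer. The class and the OnRemotePoint entry point below are hypothetical:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the mobile-side rendering. Assumes some transport
// delivers normalized screen coordinates drawn in the browser. Each point is
// unprojected into world space a fixed distance in front of the AR camera
// and appended to a LineRenderer.
public class RemoteLineReceiver : MonoBehaviour
{
    public Camera arCamera;            // the AR Foundation camera
    public LineRenderer lineRenderer;  // configured with a simple unlit material
    public float drawDistance = 1.0f;  // metres in front of the camera

    readonly List<Vector3> points = new List<Vector3>();

    // Call this for every (x, y) in [0,1] received from the browser.
    public void OnRemotePoint(float normalizedX, float normalizedY)
    {
        Vector3 screenPos = new Vector3(
            normalizedX * Screen.width,
            normalizedY * Screen.height,
            drawDistance);

        // Convert the 2D browser stroke into a 3D point in the AR scene.
        Vector3 worldPos = arCamera.ScreenToWorldPoint(screenPos);

        points.Add(worldPos);
        lineRenderer.positionCount = points.Count;
        lineRenderer.SetPositions(points.ToArray());
    }
}
```

Anchoring the points to detected planes or feature points, instead of a fixed camera distance, would make the drawing stick to the environment, which appears to be what Chalk does.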

Unity: How to test Oculus OVRInput through the editor? Workflow question

Here is my problem,
I'm using Unity 2017.4.30 to develop an Oculus Go application. If anyone else has done this before, you know that you can get Unity to build an APK and upload it to the headset to test your software.
At the moment we are writing code around OVR Utilities, specifically OVRInput, which handles the Oculus Go controller interacting with canvases, objects, etc.
What I want is to test the code we are writing in the editor, without having to upload the APK each time. Some things, like the OVRCameraRig, work fine in the editor, but the controllers only show up when you run from the Oculus Go itself.
Any ideas how we could 'simulate' or 'virtualize' the OVR controller in the editor, so I can just click things with my mouse?
Thanks for reading,
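One way to avoid uploading an APK for every input change is a thin wrapper that substitutes mouse input in the editor and real OVRInput on device. A minimal sketch, assuming OVR Utilities is imported; the SimulatedInput class and its mapping are illustrative, not part of the Oculus SDK:

```csharp
using UnityEngine;

// Minimal sketch of an input wrapper: in the editor the Go controller's
// trigger is simulated with the left mouse button; on device it maps to
// OVRInput. (Class name and mapping are illustrative, not Oculus SDK API.)
public static class SimulatedInput
{
    public static bool TriggerDown()
    {
#if UNITY_EDITOR
        return Input.GetMouseButtonDown(0);
#else
        return OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger);
#endif
    }

    public static Ray PointerRay(Transform controllerAnchor, Camera editorCamera)
    {
#if UNITY_EDITOR
        // In the editor, cast from the mouse position instead of the controller.
        return editorCamera.ScreenPointToRay(Input.mousePosition);
#else
        return new Ray(controllerAnchor.position, controllerAnchor.forward);
#endif
    }
}
```

Game code then calls SimulatedInput.TriggerDown() and raycasts from SimulatedInput.PointerRay instead of using OVRInput directly, so the same click logic runs in both places.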
If you do not want to write your own controller simulator, I'd suggest using the method Oculus Quest devs have been using to quickly debug their games, which involves emulating a Rift using programs like ALVR or Virtual Desktop.
If set up correctly, you'll be able to just press the Play button in Unity and instantly see your game on the Oculus Go.
The answer was Oculus Link. When the Quest was connected to the PC via Link, the Quest controllers were emulated inside Unity. Problem solved.

In Unity 5.6, how to get inputs in Daydream mode without installing GVR SDK?

I am working with Unity 5.6.5f1 with VR enabled and Daydream selected, for an Android app. My application works with Daydream, but I would like to get input from the Daydream remote (just a simple click). Is that possible without including the GVR SDK in my project?
Alternatively, can I catch a click on the power button or the volume buttons of the phone in my app?
You need to import the GVR Unity package so that you can access the GvrController class. Only then can you get the click event.
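For reference, polling the clickpad with that class looks roughly like this; the names match the GVR SDK of the Unity 5.6 era, but verify them against the package version you import:

```csharp
using UnityEngine;

// Minimal sketch: poll the Daydream controller's clickpad each frame.
public class DaydreamClickListener : MonoBehaviour
{
    void Update()
    {
        // GvrController exposes static button state in GVR SDK 1.x.
        if (GvrController.ClickButtonDown)
        {
            Debug.Log("Daydream controller clicked.");
        }
    }
}
```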
I finally managed to catch clicks on the volume up and down buttons of the Daydream remote thanks to this post. It's quite an ugly solution, but it does the trick and provides two inputs.