Unity Build targets Mixed Reality but does not appear in apps - unity3d

I have created a project using the Windows XR Plugin and XR Plugin Management. I am NOT using the XR Interaction Toolkit; I have created my own tools, including my own XRRig using the Tracked Pose Driver.
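For context, a hand-rolled rig of that kind boils down to a camera whose pose is driven by a Tracked Pose Driver. A minimal sketch (names and structure are illustrative, not my exact code):
```
// Minimal hand-rolled rig: a root "play area" object with a camera child
// whose pose is driven by the Tracked Pose Driver (UnityEngine.SpatialTracking).
using UnityEngine;
using UnityEngine.SpatialTracking;

public class SimpleXRRig : MonoBehaviour
{
    void Awake()
    {
        // Create the head camera as a child of the rig root.
        var cam = new GameObject("Head").AddComponent<Camera>();
        cam.transform.SetParent(transform, false);

        // Drive the camera transform from the HMD's centre-eye pose.
        var driver = cam.gameObject.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice,
                             TrackedPoseDriver.TrackedPose.Center);
    }
}
```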
My project is NOT built as UWP, since I need file access that UWP does not provide or makes too tricky to provide (after many attempts and work with a senior developer here, I just gave up). So I'm using standard Unity and my build target is "PC, Mac & Linux Standalone". My Player settings under XR Plug-in Management are set to "Windows Mixed Reality". I am using a few UWP functions for file access within the app.
Now, once the project is built, I obviously don't see it within Steam, but it also does not appear in the Mixed Reality list of applications. I have to start it manually by clicking its icon on the desktop. It works great, but wtf...?
I know that for applications to appear in Mixed Reality they must be built with UWP, but if this build is neither UWP nor Steam, what is it then? How do I add it (or sideload it) to the Mixed Reality applications that the Windows menu brings up within the Cliff House, for example?

To answer the first question (what type of application is being built when neither UWP nor Steam is the target in Unity): it is built as a classic PC desktop application.
To answer the second question (how to access this application inside the Mixed Reality Cliff House shell when it is not a UWP application): it can be launched via the "Classic Apps" pin inside the Cliff House shell.
Here is more information on the "Classic Apps" pin:
https://learn.microsoft.com/en-us/windows/mixed-reality/whats-new/release-notes-may-2019#how-to-launch
That should answer the question of how to launch it from inside the VR experience itself.

Related

How to set up a fast development workflow (plug and play) for Oculus Quest 2 with Unity

How would I go about setting up Unity and Oculus so I can hit Play in Unity and see the results immediately instead of building and running the game on Oculus?
First off, these are the essentials you need in order to develop for Oculus Quest 2 with Unity. After completing these steps you can, if you want, follow the optional steps further down for the fastest possible workflow.
Unity side:
Install Unity (anything above 2018 will work, preferably the latest). During installation, check the Android Build Support module, as well as the Android SDK & NDK and OpenJDK checkboxes.
Create a new empty 3D project (preferably already set to the Android platform).
In Build Settings, switch to the Android platform (if it isn't already).
In the Package Manager, install the XR Plugin Management and Oculus XR Plugin packages.
In Project Settings -> XR Plugin Management, go to the Android tab and check the Oculus option. Wait for the import to finish.
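If you want to sanity-check at runtime that the Oculus plug-in actually initialized (rather than silently falling back to non-VR), a small script along these lines will log the active loader; this assumes XR Plug-in Management's default "Initialize XR on Startup" behaviour:
```
// Logs which XR loader (if any) was initialized by XR Plug-in Management.
using UnityEngine;
using UnityEngine.XR.Management;

public class XrLoaderCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager != null && manager.activeLoader != null)
            Debug.Log("Active XR loader: " + manager.activeLoader.name); // expect the Oculus loader
        else
            Debug.LogWarning("No XR loader initialized - check XR Plug-in Management.");
    }
}
```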
For the Oculus side:
Make an Oculus Developer account
Put on your Quest 2 headset and log in to your developer account.
In your Quest 2 headset, go to Settings -> System -> Developer and turn on the USB Connection Dialog option. Alternatively, you can do the same from the Oculus app on your phone; either way works.
Connect the headset to the PC/laptop with a USB-C to USB cable (such as the Oculus Link cable, though third-party cables work as well) and accept all the dialogs that show up, namely "Allow USB Debugging", "Always allow from this computer", and "Allow access to data".
Finally, in Unity, under Build Settings -> Run Device, choose the connected Oculus device from the options and click Build And Run (make sure you have at least one scene added to Build Settings). This will build the application and push it to your Oculus if connected via USB; a scripted equivalent is sketched below. When you put your headset on you will see the application load.
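If you prefer to drive this from code (for example, to bind it to a menu item or a CI step), a rough editor-script equivalent of Build And Run looks like the following; the scene and output paths are placeholders for whatever your project actually uses:
```
// Editor script: place in an Editor folder. Builds an Android APK and pushes it
// to the connected headset, roughly what Build And Run does for the Quest.
using UnityEditor;

public static class QuestBuild
{
    [MenuItem("Tools/Build and Run on Quest")]
    public static void BuildAndRun()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" }, // placeholder scene path
            locationPathName = "Builds/Quest/app.apk",     // placeholder output path
            target = BuildTarget.Android,
            options = BuildOptions.AutoRunPlayer // deploy and launch on the connected device
        };
        BuildPipeline.BuildPlayer(options);
    }
}
```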
That's it. This is the bare minimum you need to develop VR games for Oculus. If we leave it at that, though, development will be slow and tedious: we would have to code basic VR behaviour ourselves and waste time building and pushing the app to the device every time we want to change and test something. So we are going to install a few more things to speed up development.
Install the Oculus Developer Hub. This lets you check that your device is connected to the PC properly and has other useful features (you can screen-capture and record video directly from the device). Once installed, connect your device with a USB-C to USB cable and make sure it shows up properly in the Oculus Developer Hub. (This step is NOT a must, but I recommend it.)
Install the Oculus desktop app for Quest 2. You can find it on the Oculus website. We need this for the Oculus Link feature to work, which lets us test in real time instead of building and running the app on the Oculus.
Run the Oculus app; it will show a setup guide. Follow the guide. You can choose a connection via cable (Link cable) or via Wi-Fi (Air Link).
In your connected Oculus headset, a dialog will pop up asking if you wish to enable Oculus Link. Accept it.
In Unity, go to the Package Manager and install the XR Interaction Toolkit. This plugin greatly simplifies setting up a VR rig.
In a new scene, right-click in the Hierarchy and go to XR -> Device-based -> XR Rig (it may be called XR Origin in newer versions).
Click Play in Unity and put on your headset. You should see the tracking of the headset and controllers mirrored in the Unity editor.
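If you want to confirm that tracking data is really reaching the editor over Link, a throwaway script like this (plain UnityEngine.XR, no toolkit needed) will log the head position while in Play mode:
```
// Logs the HMD position each frame; useful for verifying Link tracking in Play mode.
using UnityEngine;
using UnityEngine.XR;

public class TrackingProbe : MonoBehaviour
{
    void Update()
    {
        var head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (head.isValid &&
            head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos))
        {
            Debug.Log("Head position over Link: " + pos);
        }
    }
}
```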
That's pretty much it for the development side. If you wish to build for Oculus, just go to Build Settings, choose the connected Oculus Quest 2 device under Run Device and click Build And Run. And of course, save the scene with the XR Rig and add it to Build Settings first.
Tested with Unity 2020.3.25f1 on an ASUS TUF laptop.
Feel free to correct any of the steps if you hit an obstacle; I would love for this guide to be as comprehensive as possible.

Open executable jar from Unity UWP

I'm building a Windows app using Unity3D and I want to include an executable .jar file that should open on a button click from the app. Is there any way to do this? Please help.
Unfortunately, I don't think there is a direct way to do this. Pure UWP apps (which is what Unity generates) are sandboxed and compiled using .NET Native. As such, they don't allow launching external code/processes, as that would pose a security risk.
If you used an external Desktop Bridge app (which has full permissions) and communicated with it via an app service (see this documentation article), you could theoretically achieve this, but it sounds a bit too complex. Another alternative would be to publish your app as a classic Win32 game. It would still be possible to publish it on the Microsoft Store; however, it would be limited to desktop devices.
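For completeness, if you do go the classic Win32 route, launching the .jar is just a process start. This sketch assumes Java is installed and on the PATH, and the .jar path is a placeholder:
```
// Only viable in a classic Win32 (standalone) build, not in a sandboxed UWP app.
using System.Diagnostics;
using UnityEngine;

public class JarLauncher : MonoBehaviour
{
    // Hook this up to a UI Button's OnClick event.
    public void LaunchJar()
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "java", // assumes Java is on PATH
            Arguments = "-jar \"" + Application.streamingAssetsPath + "/tool.jar\"", // placeholder jar
            UseShellExecute = true
        };
        Process.Start(startInfo);
    }
}
```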

Hololens Unity app always deploys in 2D instead of 3D

I'm developing an app for the HoloLens 1 in Unity, and it runs perfectly fine on the device when using Holographic Remoting. However, whenever I build and deploy the application through Visual Studio, it only launches in 2D mode on the HoloLens (as a flat "window" you can position in space). What settings control this behaviour?
Unity version is 2019.1.4f1,
Visual Studio is 2017 Community Edition,
I'm on Windows 10.
Developer mode is enabled on both the HoloLens and my desktop. Virtual Reality Supported is ticked in Unity, the Mixed Reality SDK is added to the list, and the Build Settings are on x86 / D3D Project.
I tried replacing my scene with one of the examples from the MRTK, but to no avail. Strangely enough, if I make a clean new project with nothing except the MRTK example in it, it does deploy properly, so there must be something in my project interfering. I just can't figure out what.
Expected behaviour is that the application launches in "room scale" mode, i.e. all other applications disappear and the objects in my scene can be viewed in 3D.
EDIT: This has been marked as a possible duplicate. The answers given there do not solve my problem, however. I already made sure that "Virtual Reality Supported" is ticked in the XR Settings and the SDK is added to the list. I don't think I'm on a Windows Insider preview build, but since I was able to deploy perfectly fine with a fresh project I don't think that's really the problem...
It appears Vuforia was causing the issues. I got it to deploy in 3D with 'Vuforia Augmented Reality Supported' ticked and the following settings in VuforiaConfiguration:
Camera Device Mode: MODE_OPTIMIZED_SPEED
Device Type: Digital Eyewear
Device Config: Hololens
Video Background: DISABLED
Device Tracker: DISABLED
Furthermore, 'Vuforia' must not be added to the list of Virtual Reality SDKs in XR Settings.
Note that I have not tried all subsets of these settings individually, some of them might not have an impact whatsoever (except for the last one, I am quite certain adding that SDK will force the app into 2D mode).
Also note that I haven't verified that Vuforia actually works correctly on the HoloLens, just that I can deploy the app in 3D mode with it enabled, given the above settings. Can someone confirm whether Vuforia is even supported by MRTK v2?
EDIT: Apparently the problem can also be caused by ticking "WSA Holographic Remoting Supported" in the XR Settings, so be sure to disable that.
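If it helps anyone debugging the same symptom, one way to check from inside the app whether it actually started immersively is a tiny startup log using the legacy XR API available in this Unity version:
```
// Reports whether the legacy XR pipeline is active; in an immersive (3D) launch
// a device name should be reported (legacy XR API, Unity 2019.x).
using UnityEngine;
using UnityEngine.XR;

public class ImmersiveCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("XR enabled: " + XRSettings.enabled +
                  ", loaded device: '" + XRSettings.loadedDeviceName + "'");
    }
}
```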

Merging a game into WinRT app

I'm wondering what options I have to merge a 2D game into a WinRT app that's already developed. I've developed a couple of games in Unity3D but am not sure if they can be played as part of a WinRT app (launched on a button click).
I've heard Microsoft provides the XNA framework for game development. Would that be of any help in my case? The requirement is to launch the game on a button click and get back to the app on the back button.
Please let me know if there's a better approach/tool available.
Firstly, you say you want to merge a game with a WinRT app. Do you mean you want to put a game inside another application, perhaps with the app as a launch pad for your game? If you want to target WinRT with Unity, I just looked at the platforms Unity targets and Universal Windows Platform is listed there. You could write the application part in Unity, couldn't you? So that the app is just the first scene you see. That might seem strange, but if you had to use Unity and the app was just a small application without too many requirements, you could do it that way.
XNA has been discontinued. You should use other frameworks.
If you want to write it from scratch, I would suggest writing it as a WinRT application (Universal Windows Platform) so that you can define the application UI in XAML and write the game with a dedicated graphics/game API such as SharpDX http://sharpdx.org/ (which is a wrapper around DirectX) or, perhaps even better, take a look at Win2D http://microsoft.github.io/Win2D/html/Introduction.htm which is a very impressive 2D graphics API.
In practice, both SharpDX and Win2D would target one of the XAML controls that give you access to a swap chain (e.g. SwapChainPanel, SwapChainBackgroundPanel, CanvasControl, etc.). These controls are integrated into the XAML UI and can simply pop up when you need them and, voilà, your game is running.
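To give an idea of how little glue that takes, here is a rough Win2D sketch; the page and control names are illustrative, and the XAML is assumed to declare a CanvasControl named GameCanvas with its Draw event wired to the handler below:
```
// Code-behind for a XAML page hosting a Win2D CanvasControl:
// <canvas:CanvasControl x:Name="GameCanvas" Draw="GameCanvas_Draw"/>
using Windows.UI;
using Windows.UI.Xaml.Controls;
using Microsoft.Graphics.Canvas.UI.Xaml;

public sealed partial class GamePage : Page
{
    public GamePage()
    {
        InitializeComponent();
    }

    private void GameCanvas_Draw(CanvasControl sender, CanvasDrawEventArgs args)
    {
        // Clear the surface and draw a single frame; a real game would
        // update its state and redraw here.
        args.DrawingSession.Clear(Colors.Black);
        args.DrawingSession.FillCircle(200, 150, 40, Colors.CornflowerBlue);
    }
}
```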
Since it was a 2D game that I wanted to integrate with my WinRT app, I decided to go with Scirra Construct 2, which is designed specifically for 2D games. It exported the project as a website, which I hosted on my server and loaded through a WebView inside my app. Not only is the performance good but, as a by-product, it made the game cross-platform.
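For illustration, the hosting side of that approach is just a WebView navigation on the button click; the page, control name, and URL below are placeholders:
```
// Code-behind for a page whose XAML declares <WebView x:Name="GameView"/>
// and a button wired to PlayButton_Click.
using System;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

public sealed partial class GameHostPage : Page
{
    public GameHostPage()
    {
        InitializeComponent();
    }

    private void PlayButton_Click(object sender, RoutedEventArgs e)
    {
        // Load the hosted Construct 2 export (placeholder URL).
        GameView.Navigate(new Uri("https://example.com/game/index.html"));
    }
}
```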

How to integrate USB communication in the Unity Engine for a desktop application?

Has anyone done a Unity project where USB communication is involved? I need to communicate with a board over USB to get sensor values. How can I go about doing this?
You can link .NET DLLs in Unity by adding them to the project (drag and drop worked, if I recall correctly). So, code your board-access library in a Visual Studio project using .NET, exposing the API you need, and add the DLL to your Unity project.
You will be able to access the contents of the DLL from your Unity code (although I have only done this with C#).
Of course, this only works on PC. For other platforms, I don't know if this is even possible.
EDIT: Minor correction.
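As a concrete (if simplified) illustration: if the board enumerates as a USB serial (COM) device, you can skip the custom DLL entirely and read it with System.IO.Ports from Unity, provided the Api Compatibility Level is set to .NET 4.x / .NET Framework. The port name, baud rate, and line format below are placeholders for whatever the board actually uses:
```
// Reads newline-terminated sensor samples from a USB serial device.
using System.IO.Ports;
using UnityEngine;

public class SensorReader : MonoBehaviour
{
    SerialPort port;

    void Start()
    {
        port = new SerialPort("COM3", 115200) { ReadTimeout = 50 }; // placeholder port/baud
        port.Open();
    }

    void Update()
    {
        try
        {
            string line = port.ReadLine(); // e.g. one sensor sample per line
            Debug.Log("Sensor: " + line);
        }
        catch (System.TimeoutException)
        {
            // No complete line available this frame.
        }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen)
            port.Close();
    }
}
```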