Can I use Steam VR interaction system scripts with XR toolkit? - unity3d

I am converting a SteamVR standalone build into an XR toolkit Android build for the Oculus Quest 2, and I need it to work without using Quest Link. Several scripts I use (like CircularDrive) are in the SteamVR Interaction System core library. I was wondering whether there is a way to keep using some of those scripts, or whether it is a better use of my time to rewrite them myself.
I've read the documentation for the XR toolkit and it seems incredibly lacking compared to SteamVR's.
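As a rough illustration of what a rewrite might involve, here is a minimal CircularDrive-style MonoBehaviour sketch. The class name and the SetHand/ClearHand hookup are assumptions, not part of SteamVR or the XR Interaction Toolkit; the idea is to call them from the toolkit's select-entered and select-exited events on the interactable.

    using UnityEngine;

    // Hypothetical stand-in for SteamVR's CircularDrive: rotates this object
    // around a local hinge axis so it follows the controller that grabbed it.
    // SetHand/ClearHand are meant to be wired to the XR Interaction Toolkit's
    // select-entered / select-exited events (an assumption, not a built-in link).
    public class SimpleCircularDrive : MonoBehaviour
    {
        public Vector3 localAxis = Vector3.up; // hinge axis in local space
        private Transform hand;                // driving controller, null when idle
        private Vector3 lastDirection;

        public void SetHand(Transform handTransform)
        {
            hand = handTransform;
            lastDirection = DirectionToHand();
        }

        public void ClearHand()
        {
            hand = null;
        }

        private void Update()
        {
            if (hand == null) return;

            Vector3 current = DirectionToHand();
            Vector3 worldAxis = transform.TransformDirection(localAxis);
            // Signed angle the hand has travelled around the hinge since last frame.
            float delta = Vector3.SignedAngle(lastDirection, current, worldAxis);
            transform.Rotate(localAxis, delta, Space.Self);
            lastDirection = current;
        }

        // Direction from the pivot to the hand, flattened onto the hinge plane.
        private Vector3 DirectionToHand()
        {
            Vector3 worldAxis = transform.TransformDirection(localAxis);
            Vector3 toHand = hand.position - transform.position;
            return Vector3.ProjectOnPlane(toHand, worldAxis).normalized;
        }
    }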

Related

Laggy Hololens App since update to XR SDK

Since I updated my Unity3D HoloLens 2 project, using MRTK 2.5.1, to also use the XR SDK for holographic remoting, the app has become laggy, even when deployed, although the profiler doesn't show much activity.
Is this a known issue that can be fixed in some setting?
Could you provide more information about your project? Without any specific context, the information above makes it difficult for us to profile your application. We recommend you follow this guide to optimize the performance of mixed reality apps in Unity: Performance recommendations for Unity
In addition, the GitHub issues page of MRTK is an important way for us to collect user feedback, and it currently has no reports of performance issues in the latest MRTK. As for the XR SDK, because it is a Unity API, we recommend that you submit new feedback to the Unity product team for a professional suggestion. We always recommend the latest Unity LTS (Long Term Support) stream as the best version for developing MR apps, and the current recommendation is Unity 2019.4.15f1: https://unity3d.com/unity/qa/lts-releases

Will a mixed-reality application and experience developed for Hololens 2 using Unity work on Oculus Quest as is?

To allow salespeople to meet clients in the COVID-19 era without travelling, the business wants to create a virtual meeting room.
The clients will get an Oculus Quest, as the HoloLens is hard to procure right now, whereas the business wants to use the HoloLens on their end.
Will an application/experience created for the HoloLens using Unity work as is on the Oculus Quest, or does it make sense to have the same device on both ends?
I am new to this area, so I'm not sure if this question makes sense, but is it something like developing two versions of code, one for iOS and one for Android, and using something like Xamarin to make the process easier?
Does Unity have features to make applications compatible between the HoloLens and the Oculus Quest?
MRTK makes it easy to make multiplatform XR applications.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html
Unity will allow you to run most applications on both Windows on ARM (HoloLens 2) and Android (Quest).
MRTK even has hand tracking support on both platforms.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/CrossPlatform/OculusQuestMRTK.html

How do OpenVR, SteamVR and Unity3D work together?

I am trying to understand the VR platform stack of the Vive, and how its games are developed.
I am struggling to understand where exactly OpenVR, SteamVR and Unity fit into the picture.
My understanding so far has been:
OpenVR - A hardware-independent layer providing APIs for peripheral access; it can provide access to either Oculus or Vive hardware via a defined interface.
SteamVR - Provides access to the hardware for games developed in either Unity or Unreal.
Unity3D - A game engine used to develop games.
If anyone can correct me, I would be very grateful.
Or, if my understanding is correct, why can't games developed in Unity3D access the hardware directly via OpenVR?
OpenVR is an API and runtime that allows access to VR hardware from multiple vendors without requiring applications to have specific knowledge of the hardware they are targeting (ref1). SteamVR is the customer-facing name we use for what users actually use and install (for details check this video: Using Unity at Valve).
Also check whether you can use the Vive with OpenVR without Steam.
Finally, let's look at all of these terms, thanks to a Reddit post:
How a Game Appears on your Head-Mounted Display (HMD):
A game renders an image and sends it to its corresponding runtime. The runtime then renders it to the HMD:
Rendered image using the:
[OVR/OpenVR] SDK -> [Oculus/SteamVR] Runtime -> [Rift/Vive]
SDKs:
SDKs are used to build the games. A game can implement OVR, OpenVR, or both. This means that the game has access to native functionality in its corresponding runtime. SDKs do not handle async timewarp or reprojection; those are handled by the runtime!
OVR: Made by Oculus for the Oculus Rift. Current version (14th May 2016) is 1.3.1 and can access all features of the Oculus runtime.
OpenVR: Made by Valve; supports the Vive and the Rift via the SteamVR runtime.
Sidenote on SDKs and Unity games: Unity 5.3 currently has optimizations for VR in its native mode. The native mode supports the Rift, Gear VR and PSVR, but not SteamVR. A game compiled with Unity 5.3 can use those optimizations with the Oculus SDK but not the OpenVR SDK. The OpenVR SDK has its own optimizations, which may or may not result in similar performance. However, the upcoming Unity 5.4 will support SteamVR natively and performance should be more or less identical. Please note: this is Unity specific, and other engines might have similar or different optimizations for some or all headsets.
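As a concrete illustration of that native mode, later Unity versions let you pick which VR device is loaded at runtime through the built-in XR API. A minimal sketch, assuming the legacy UnityEngine.XR.XRSettings path (pre XR Plug-in Management) and that the device names are enabled in Player Settings:

    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch only: switch the active VR backend at runtime with Unity's
    // legacy built-in XR API. "Oculus" and "OpenVR" must be listed under
    // Player Settings > XR Settings > Virtual Reality SDKs for this to work.
    public class VRDeviceSwitcher : MonoBehaviour
    {
        private void Start()
        {
            StartCoroutine(LoadDevice("OpenVR"));
        }

        private IEnumerator LoadDevice(string deviceName)
        {
            XRSettings.LoadDeviceByName(deviceName);
            yield return null;          // the device finishes loading next frame
            XRSettings.enabled = true;  // start rendering to the headset
        }
    }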
Runtimes
Oculus Runtime: Responsible for async timewarp and handles device detection, display, etc. It (the runtime service) needs to be running for Oculus Home to launch.
SteamVR Runtime: Responsible for reprojection; supports the Rift and the Vive.
Software Distribution Platforms
Oculus Home: Needs to be running for the Rift to work. By default it only supports apps from the store (there is a checkbox in the settings of the 2D desktop client to enable other sources). It downloads games and runs them, and also handles the Universal Menu on the Xbox button.
Steam/SteamVR: Technically does not need to be running when launching OpenVR games, but it is highly recommended (room setup and config are pulled from there). It also handles the overlay menu on the Xbox button; when running on the Rift, it launches by pressing the select/start button in the Oculus Universal Menu.
Finally, the full Reddit post is worth reading.

Javascript in Unity for Oculus Rift

I just want to know: can I use JavaScript to program items and a virtual reality world in Unity so I can deploy it to the Oculus Rift? I need to do this for a research project. I would try C++, but I am better with JavaScript.
If you feel confident in JavaScript then you should have very little trouble moving to C# for Unity3D scripting. There is a plugin for Unity to use the Oculus Rift. #jahroy is correct in stating that C# documentation is expansive and most of it applies very well to Unity scripting.
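For a sense of how small that jump is, here is a trivial, purely illustrative C# MonoBehaviour; the UnityScript (Unity's JavaScript-flavoured language) version of the same behaviour differs mostly in declaration syntax, so most of your JavaScript habits carry over.

    using UnityEngine;

    // Illustrative only: a minimal C# Unity script. In UnityScript the body of
    // Update would look almost identical; the main differences are the class
    // wrapper and typed declarations ("public float speed" vs "var speed = 45.0;").
    public class Spinner : MonoBehaviour
    {
        public float speed = 45f; // degrees per second

        private void Update()
        {
            // Rotate around the Y axis, frame-rate independent.
            transform.Rotate(0f, speed * Time.deltaTime, 0f);
        }
    }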
Here is a link to a tutorial. One caveat: it requires Unity Pro (I don't have a dev kit, so I haven't tried it personally). http://paddytherabbit.com/unity3d-oculus-rift-plugin-setup/
Good luck!
A-Frame is a JavaScript Virtual Reality Framework built on three.js, and has been gaining in popularity: https://aframe.io/docs/0.2.0/core/
I've not tried it yet, but it sounds like it integrates well with Oculus.

unity3D + kinect interaction

Guys, I am working on a project which uses the Unity engine and the Kinect as the input source. As far as I know, there is not much support between Unity and the Kinect SDK. I have heard about the Zigfu framework, but it does not give me all the functionality I need. So what are my options? I am thinking of taking some functionality from Zigfu and some from a background application built in .NET 4.0 using the official Kinect SDK. Can I connect to the Kinect via two interfaces at the same time, i.e. Zigfu and the Kinect SDK? My background app would connect to Unity via pipes. Is that a good option?
I've already done something similar. I wanted to use the Unity 3D engine and drive interactions to animate the model using the Kinect (Kinect SDK). Some functionality in the Kinect SDK is not available in Zigfu, such as hand-grip detection.
Because the Kinect SDK is suited to WPF applications, here is my solution:
Build your Unity project into a Unity Standalone (PC, Mac, Linux) build.
Create a WPF application with the Kinect stuff inside.
Add a WindowsFormsHost inside the XAML of your WPF application.
Embed your Unity Standalone build into the WPF application using the WindowsFormsHost.
For communication between WPF and Unity, you can use RakNet. It works like a socket (see the sketch below).
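If you would rather avoid RakNet, the same step can be sketched with plain .NET UDP sockets; the port number and the text-message format below are arbitrary choices for illustration, not part of the Kinect SDK or Unity.

    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    // Sketch only: plain UDP in place of RakNet. The WPF/Kinect process sends
    // small text messages and the embedded Unity standalone polls for them.

    // WPF/Kinect side: call Send("HAND_GRIP") from your gesture handler.
    public static class KinectEventSender
    {
        private static readonly UdpClient client = new UdpClient();

        public static void Send(string message)
        {
            byte[] data = Encoding.UTF8.GetBytes(message);
            client.Send(data, data.Length, "127.0.0.1", 9050); // port is arbitrary
        }
    }

    // Unity side: attach to a GameObject in the standalone build.
    public class KinectEventReceiver : UnityEngine.MonoBehaviour
    {
        private UdpClient listener;

        private void Start()
        {
            listener = new UdpClient(9050); // listen on the same arbitrary port
        }

        private void Update()
        {
            while (listener.Available > 0) // drain without blocking the frame
            {
                IPEndPoint from = null;
                string message = Encoding.UTF8.GetString(listener.Receive(ref from));
                UnityEngine.Debug.Log("Kinect event: " + message);
            }
        }

        private void OnDestroy()
        {
            listener.Close();
        }
    }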
In my experience, it's usually not a good idea to use two of something when they both do the same thing. I've never heard of Zigfu before, but it seems relatively easy to learn. Since it's available as a Unity plug-in, it may be best to use that over the Kinect SDK, the reason being that Unity isn't too "friendly" with third-party applications.
If you're aiming for XNA, it's possible to convert easily if the plug-in doesn't already do it for you.
I highly recommend looking over the Unity forums and the ZDK documentation.
http://forum.unity3d.com/threads/127924-Zigfu-dev-kit-for-Unity3D-Product-Launch