Oculus Rift Multiplayer - unity3d

We are working on a VR collaboration app in which a user imports CAD files at runtime, and multiple users then perform operations on them together. We are using Unity.
The challenges:
1. CAD files are very large.
2. Our app has desktop as well as Oculus Rift mode.
Which multiplayer approach should we follow? I have heard that the Oculus Platform SDK also supports generic networking. Should we go with that, or choose between UNet and Photon instead?
Any suggestion will be greatly appreciated.

Related

Can Anylogic be integrated with a VR device?

Do you know if there is any way to integrate Anylogic with a VR device to see the simulation run in VR? What type of software/methodology has worked for you?
This is only possible insofar as you can run any Windows application in VR (i.e. in "cinema mode"); there is no automated way to really "visit" your models in VR.
However, there may be some Java packages that could help, though that would require custom coding.

Will a mixed-reality application and experience developed for Hololens 2 using Unity work on Oculus Quest as is?

To allow sales people to meet clients in the COVID-19 era without travelling, the business wants to create a virtual meeting room.
The clients will get an Oculus Quest, as a HoloLens is hard to procure right now, whereas the business wants to use a HoloLens on their end.
Will an application/experience created for HoloLens using Unity work as-is on Oculus Quest, or does it make sense to have the same device on both ends?
I am new to this area, so I am not sure if this question makes sense, but is it something like developing two versions of the code, one for iOS and one for Android, and using something like Xamarin to make the process easier?
Does Unity have features to make applications compatible between Hololens and Oculus Quest ?
MRTK makes it easy to make multiplatform XR applications.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html
Unity will allow you to run most applications on both Windows on ARM (HoloLens 2) and Android (Quest).
MRTK even has hand tracking support on both platforms.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/CrossPlatform/OculusQuestMRTK.html

How do OpenVR, SteamVR and Unity3D work together?

I am trying to understand the VR platform stack of the Vive, and how its games are developed.
I am struggling to understand where exactly OpenVR, SteamVR and Unity fit into the picture.
My understanding so far has been that:
OpenVR - A hardware-independent layer providing APIs for peripheral access. That is, it can provide access to either Oculus or Vive hardware via a defined interface.
SteamVR - Provides access to the hardware for games developed in either Unity or Unreal.
Unity3D - A game engine used to develop games.
If anyone can correct me, I would be most grateful.
Or, if my understanding is correct, why can't games developed in Unity3D access the hardware directly via OpenVR?
OpenVR is an API and runtime that allows access to VR hardware from multiple vendors without requiring applications to have specific knowledge of the hardware they are targeting (ref1). SteamVR is the customer-facing name we use for what users actually use and install (for details, check this video: Using Unity at Valve).
Also check whether you can use the Vive with OpenVR without Steam.
Finally, let's look at all of these terms, thanks to a Reddit post:
How a game appears on your head-mounted display (HMD):
A game renders an image and sends it to its corresponding runtime. The runtime then renders it to the HMD:
[OVR/OpenVR] SDK -> [Oculus/SteamVR] Runtime -> [Rift/Vive]
SDKs:
SDKs are used to build the games. A game can implement OVR, OpenVR, or both, which means the game has access to native functionality in the corresponding runtime. SDKs do not handle asynchronous timewarp or reprojection; those are handled by the runtime!
OVR: Made by Oculus for the Oculus Rift. The current version (as of 14th May 2016) is 1.3.1, and it can access all features of the Oculus runtime.
OpenVR: Made by Valve; supports the Vive and the Rift via the SteamVR runtime.
A side note on SDKs and Unity games: Unity 5.3 currently has optimizations for VR in its native mode. The native mode supports the Rift, Gear VR and PSVR, but not SteamVR. A game compiled with Unity 5.3 can use those optimizations with the Oculus SDK, but not with the OpenVR SDK. The OpenVR SDK has its own optimizations, which may or may not result in similar performance. However, the upcoming Unity 5.4 will support SteamVR natively, and performance should be more or less identical. Please note: this is Unity-specific, and other engines might have similar or different optimizations for some or all headsets.
Runtimes
Oculus Runtime: responsible for asynchronous timewarp; handles device detection, display, etc. It (the runtime service) needs to be running for Oculus Home to launch.
SteamVR Runtime: responsible for reprojection; supports the Rift and the Vive.
Software Distribution Platforms
Oculus Home: needs to be running for the Rift to work. By default it only supports apps from the store (there is a checkbox in the settings of the 2D desktop client to enable other sources). It downloads games and runs them, and it also handles the Universal Menu on the Xbox button.
Steam/SteamVR: technically does not need to be running when launching OpenVR games, but it is highly recommended (room setup and configuration are pulled from there). It also handles the overlay menu on the Xbox button; when running on the Rift, it launches by pressing the select/start button in the Oculus Universal Menu.
Finally, the full Reddit post is worth reading.
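To illustrate the "access the hardware directly via OpenVR" point from the question: an application can talk to the SteamVR runtime directly through Valve's C# bindings (openvr_api.cs from the openvr repository), bypassing Unity's native VR integration entirely. Below is a minimal, non-authoritative sketch; it assumes the bindings are present and a SteamVR runtime is installed with an HMD connected.

```csharp
using Valve.VR;

class OpenVRDirect
{
    static void Main()
    {
        var error = EVRInitError.None;

        // Init hands control to the SteamVR runtime; it fails if the
        // runtime is not installed or no headset is connected.
        CVRSystem system = OpenVR.Init(ref error, EVRApplicationType.VRApplication_Scene);
        if (error != EVRInitError.None)
        {
            System.Console.WriteLine("OpenVR init failed: " + error);
            return;
        }

        // Poll poses for all tracked devices (HMD, controllers, base stations).
        var poses = new TrackedDevicePose_t[OpenVR.k_unMaxTrackedDeviceCount];
        system.GetDeviceToAbsoluteTrackingPose(
            ETrackingUniverseOrigin.TrackingUniverseStanding, 0f, poses);

        OpenVR.Shutdown();
    }
}
```

This is exactly the layer a game engine's SteamVR plugin wraps for you: in practice, a Unity game would use the SteamVR plugin (or, from 5.4 onward, the native integration) rather than calling these APIs by hand, which is why the direct route is rarely taken.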

Multiplayer game using unity free version for Android

I am going to start my first game in Unity, which is a real-time multiplayer game for Android. I want to ask a few things.
Do I need to buy the Pro version or any license for the whole process (from development to submission to the Play Store; I know that the splash screen cannot be changed)? My game is a lot like 8 Ball Pool, with all of its multiplayer features (create a user profile, play with friends, play with Facebook friends, challenge friends, leaderboard filters, etc.). I am planning to use Photon for multiplayer.
I also need the Web and Facebook version as well in future.
I have also seen a few pages about this, but I am still confused.
Thanks in advance.
The Unity3D Personal version does NOT support C# sockets on mobile, so no third-party realtime plugin, neither Photon nor any other, will work on Android (or on iOS) with the Personal version of Unity3D; they all require Unity3D Pro.
There is only one exception to this: PUN+ works even with Unity3D Personal on Android and iOS, because it applies a fallback to use C++ sockets through a native plugin on platforms, on which C# sockets are not available in Unity3D Personal. However, this is only true for PUN+ (Photon Unity Networking Plus), not for PUN (Photon Unity Networking).
Unity 5 no longer has any engine-specific Pro features, so essentially free = Pro (apart from some services and the splash screen). So you can go with Unity 5 Free and, for example, Photon.
Regarding costs: it could be that you need some sort of web service for your game (managing profiles, etc.), so the web server could incur some costs. But for getting started there are many free services out there, like https://www.heroku.com/
Furthermore, you will need to pay the $25 fee for Google Play to publish the game.
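To show how little code the Photon route needs to get a match going, here is a minimal, non-authoritative sketch using the classic PUN (Photon Unity Networking) API that was current around Unity 5. Attach it to a GameObject in the first scene; the "v1.0" game-version string is an arbitrary value of our choosing.

```csharp
using UnityEngine;

// Minimal PUN quick-match sketch: connect, join any open room,
// and create a room if none exists yet.
public class QuickMatch : Photon.PunBehaviour
{
    void Start()
    {
        // Reads the AppId from the PhotonServerSettings asset
        // and connects to the Photon Cloud.
        PhotonNetwork.ConnectUsingSettings("v1.0");
    }

    public override void OnJoinedLobby()
    {
        // Try to join any open room.
        PhotonNetwork.JoinRandomRoom();
    }

    public override void OnPhotonRandomJoinFailed(object[] codeAndMsg)
    {
        // No room exists yet: create one (null = server-assigned name)
        // and let other players join us.
        PhotonNetwork.CreateRoom(null);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log("Joined a room; gameplay can start.");
    }
}
```

Profiles, friend lists and leaderboards are not part of this matchmaking layer; those would live in the separate web service mentioned above.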

Is there a way to add water to a Unity3D project that is compatible with mobile platforms

I am working on a Unity3D project. I have Unity Pro, which comes with Water4, but I realize I will not be able to use this in a game that targets mobile devices. Does anyone have an idea or way of making water that moves and looks more realistic than simply painting it on the terrain, and that won't crash the game on a mobile device? I am also writing my scripts in UnityScript.
This water package looks pretty good: it is made for mobile, does not require Pro, and is on the Asset Store.
http://u3d.as/content/grespon/easy-water/2Q8