Unity multiplayer (Android and iOS) using Socket.IO - unity3d

I am trying to build a multiplayer game for Android and iOS. I have multiple options:
Photon
Mirror
MLAPI
Socket.io
Options 1, 2, and 3 do not work for me, as I found out that I cannot program a dedicated server for my specific needs and access a database from them.
That leaves Socket.IO, but I am not sure it supports Android and iOS games; the plugins I found are mostly for WebGL and standalone. Does anyone know whether Socket.IO works smoothly for Android and iOS games made with Unity?

Yes, it works well in Android and iOS games, but it has some limitations.
If your game is not a big game, e.g. a 2D or card game, it is fine to develop it with Socket.IO; but if it is a big 3D-style game (FPS, RPG, RTS...), you should use a proper multiplayer service such as Photon Server.
Using a database is not a problem either. Of course, you have to build a backend to use the database, but consider splitting the multiplayer server from the backend server that talks to the database. Seen that way, the situation is the same for Photon, Mirror, MLAPI, and Socket.IO; the only difference is that you can also use Socket.IO on the backend server itself, which is convenient.
So pick your multiplayer server based on the style of your game. I hope this helps your project.
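To make this concrete, here is a minimal sketch of a Unity client talking to a socket.io backend on Android/iOS. It assumes the community socket.io-client-csharp library (SocketIOClient), since Unity has no built-in socket.io support; the URL and the "join"/"state" event names are placeholders, not part of any real API.

```csharp
// Minimal sketch, not a drop-in implementation: a Unity client joining a
// match and receiving state updates through a socket.io backend.
// Assumes the community socket.io-client-csharp library (SocketIOClient);
// the URL and the "join"/"state" event names are placeholders.
using SocketIOClient;
using UnityEngine;

public class MatchClient : MonoBehaviour
{
    private SocketIO client;

    private async void Start()
    {
        client = new SocketIO("https://example.com:3000"); // placeholder URL

        client.OnConnected += async (sender, e) =>
        {
            // Tell the backend which match we want to join.
            await client.EmitAsync("join", new { room = "match-1" });
        };

        // Authoritative state pushed by the server. Note: this handler runs
        // on a background thread, so don't touch scene objects here directly.
        client.On("state", response =>
        {
            Debug.Log("state update: " + response.GetValue<string>());
        });

        await client.ConnectAsync();
    }

    private async void OnDestroy()
    {
        if (client != null)
            await client.DisconnectAsync();
    }
}
```

One caveat worth repeating: this library fires its handlers off the main thread, so any work that touches scene objects has to be handed back to Unity's main thread.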

Related

Will a mixed-reality application and experience developed for HoloLens 2 using Unity work on Oculus Quest as-is?

To allow sales people to meet clients in the COVID-19 era without travelling, the business wants to create a virtual meeting room.
The clients will get an Oculus Quest, as HoloLens is hard to procure right now, whereas the business wants to use HoloLens on its end.
Will an application/experience created for HoloLens using Unity work as-is on Oculus Quest, or does it make sense to have the same device on both ends?
I am new to this area, so I'm not sure this question makes sense, but is it something like developing two versions of the code, one for iOS and one for Android, and using something like Xamarin to make the process easy?
Does Unity have features to make applications compatible between HoloLens and Oculus Quest?
MRTK makes it easy to build multi-platform XR applications.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/GettingStartedWithTheMRTK.html
Unity will allow you to run most applications on both Windows on ARM (HoloLens 2) and Android (Quest).
MRTK even has hand tracking support on both platforms.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/CrossPlatform/OculusQuestMRTK.html
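As an illustration of that cross-platform hand tracking, here is a minimal sketch that pins an object to the user's right palm. It is written against MRTK v2's HandJointUtils API, so treat it as an assumption to check against the MRTK version you actually install; the same script should run on HoloLens 2 and, with the MRTK Oculus data provider configured, on Quest.

```csharp
// Minimal sketch, assuming MRTK v2: make this GameObject follow the user's
// right palm. The same code path works on HoloLens 2 and on Quest when the
// MRTK Oculus data provider is configured.
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class PalmFollower : MonoBehaviour
{
    private void Update()
    {
        // Query the tracked right-hand palm joint each frame.
        if (HandJointUtils.TryGetJointPose(
                TrackedHandJoint.Palm, Handedness.Right, out MixedRealityPose pose))
        {
            transform.SetPositionAndRotation(pose.Position, pose.Rotation);
        }
    }
}
```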

HoloLens applications using WebGL / ThreeJS

I've got a WebGL application built with JavaScript and ThreeJS. I was able to enable WebVR fairly easily to create an immersive environment, but I think my app is a better use case for mixed reality/AR, and HoloLens seems to be the big player in that hardware space.
As I look at the development tools around HoloLens, it's pretty much Unity and C#. Both are great tools, but as I start developing in this closed environment I kind of feel like I'm building a Silverlight application.
I've been trying to figure out whether there is a trick I can use to create an immersive experience with my WebGL app. I know that I can use the Edge browser; however, that's a flat experience, which is of no value for this use case.
I've found a few links:
is-it-possible-to-use-webgl-with-hololens-repost
can-i-make-a-universal-app-using-html-that-runs-on-hololens
augmented reality with awe.js
All of these seem to be either 2D experiences or 'fake' AR using cameras and WebVR. I also looked into porting my WebGL app to Unity using Unity's JavaScript language features, only to find out that it is really a subset fork of actual JavaScript (known as UnityScript), making it way more effort than it's worth.
Given all this, I'm wondering whether this is even possible to accomplish, and whether anyone knows if it is on Microsoft's roadmap?
There's this new tool from Microsoft called HoloJS. It's a framework for creating holographic apps using JavaScript and WebGL.
holographicjs is a C++ Windows Runtime Component for hosting Windows Holographic apps built with Javascript and WebGL.
It's interesting, and a huge hack, but it might be a good first start for the community!
Note: this answer is based on the following:
I do not know what Microsoft's roadmap plans are or will be.
The actual easy way to develop for HoloLens right now is using Visual Studio and Unity3D (so maybe there is a way to develop using WebGL, but as you can see, it is not the easy, direct, supported way).
My answer: considering that this is a new product with no direct competition, Microsoft will not move toward supporting other platforms unless forced to. Meanwhile, they are happy for you to use C#, Visual Studio, .NET, Edge, Windows, and Unity3D under Windows (I find it hard to believe you could do this with Unity3D on macOS or Linux). It is also normal for them to offer a limited ecosystem at the moment, with the same excuse: the product is new, so for stability and optimization reasons, support is limited to their most familiar context: Microsoft products.
But as soon as new devices come along and start offering new things (support for other programming languages, operating systems, or the web), you can be sure that they will either evolve or die.

Understanding the VR ecosystem

I have a background in Android and have developed a few apps of my own. Now I want to explore VR app development for Android. Going through forums etc., the first thing I understand is that I need some basic infrastructure: the Unity 3D SDK, the Cardboard SDK, a Cardboard device, and so on. What I am not able to understand is the role each of these components plays in the overall bigger picture.
For example, why do I need the Unity 3D SDK if I have the Android SDK, the Cardboard SDK, and Android Studio as my dev environment?
Also, if I plan to develop for something like Oculus, which SDKs and devices are needed, and which programming language can I work with?
Your options depend on which device you'll target:
Game engines like Unity: you need Unity, some plug-ins, and of course the device you are targeting:
Google Cardboard / Daydream
Samsung Gear VR
From-scratch application: your language is Java, and you need to download the SDK for your target device:
Google Cardboard / Daydream SDK
Samsung Gear VR, Oculus Mobile SDK
Regards
I think there is a lot of promise in web-based VR. Of course, the app will not be as high-fidelity as a native application built in Unity or Java, but you get the benefit of being able to target many platforms out of the box. React VR is a cool project coming out of Facebook that is making it easier and more performant to build VR apps with web technologies.
Here is a starter kit that can help you get going if you are interested: https://github.com/scaphold-io/react-vr-graphql
P.S. GraphQL is a great tool for enriching your VR apps with data, whether you build them with React, Unity, or Java.
You can check out A-Frame (https://aframe.io), a web framework for building VR experiences. It's been out for over a year and has a strong community and ecosystem. With web-based VR, you get cross-platform support across Rift, Vive, Cardboard, Daydream, and Gear VR out of the box. With A-Frame, you get all of the boilerplate with a single line of HTML; you just have to grab a VR-enabled browser.
A-Frame's architecture is entity-component, similar to Unity's, but A-Frame makes it declarative and closer to web development. With effort, the fidelity can rival native (https://blog.mozvr.com/a-painter/).

Multiplayer game using unity free version for Android

I am going to start my first game in Unity, a real-time multiplayer game for Android. I want to ask a few things.
Do I need to buy the Pro version or any license for the whole process, from development to submission to the Play Store (I know that the splash screen cannot be changed)? My game is almost like 8 Ball Pool, with all its multiplayer features (create a user profile, play with friends, play with Facebook friends, challenge friends, leaderboard filters, etc.). I am planning to use Photon for the multiplayer part.
I will also need web and Facebook versions in the future.
I have seen a few pages about this, but I am still confused.
Thanks in advance.
The Unity3D Personal edition does NOT support C# sockets on mobile, so no third-party realtime plugin, neither Photon nor any other, will work on Android (or on iOS) with the Personal edition of Unity3D; they all require Unity3D Pro.
There is only one exception: PUN+ works even with Unity3D Personal on Android and iOS, because on platforms where C# sockets are not available in Unity3D Personal it falls back to C++ sockets through a native plugin. However, this is only true for PUN+ (Photon Unity Networking Plus), not for PUN (Photon Unity Networking).
Unity 5 has no engine-specific Pro features anymore, so basically free = pro (besides some services and the splash screen). So you can go with Unity 5 free and, for example, Photon.
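For a first connection test, something like the following sketch is enough. It is written against the classic PUN API from the Unity 5 era (PUN 2 later moved these callbacks into MonoBehaviourPunCallbacks), and the game version string is just a placeholder.

```csharp
// Minimal matchmaking sketch, assuming classic PUN (Unity 5 era):
// connect, join any open room, or create one if none exists.
using UnityEngine;

public class QuickMatch : Photon.PunBehaviour
{
    private void Start()
    {
        PhotonNetwork.autoJoinLobby = true;         // enter the lobby after connecting
        PhotonNetwork.ConnectUsingSettings("v1.0"); // placeholder game version string
    }

    public override void OnJoinedLobby()
    {
        PhotonNetwork.JoinRandomRoom();             // look for any open match
    }

    public override void OnPhotonRandomJoinFailed(object[] codeAndMsg)
    {
        PhotonNetwork.CreateRoom(null);             // none found: host one ourselves
    }

    public override void OnJoinedRoom()
    {
        Debug.Log("Joined a room, ready to spawn players.");
    }
}
```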
Regarding costs: you may need some sort of web service for your game (managing profiles, etc.), so the web server could incur some costs, but for getting started there are many free offerings out there, such as https://www.heroku.com/
Furthermore, you will need to pay the $25 fee for Google Play to publish the game.

unity3D + Kinect interaction

Guys, I am working on a project which uses the Unity engine and a Kinect as the input source. As far as I know, there is not much support between Unity and the Kinect SDK. I have heard about the Zigfu framework, but it does not give me all the functionality I need. So what are my options? I am thinking of taking some functionality from Zigfu and some from a background application built in .NET 4.0 using the official Kinect SDK. Can I connect to the Kinect via two interfaces at the same time, i.e. Zigfu and the Kinect SDK? My background app would connect to Unity via pipes. Is that a good option?
I've already done something similar: I wanted to use the Unity 3D engine and animate a model through interactions captured with the Kinect (Kinect SDK). Some functionality in the Kinect SDK, such as hand-grip detection, is not available in Zigfu.
Because the Kinect SDK is geared toward WPF applications, here is my solution:
Build your Unity project into a Unity standalone (PC, Mac, Linux).
Create a WPF application with the Kinect code inside.
Add a WindowsFormsHost inside the XAML of your WPF application.
Embed your Unity standalone into the WPF application using the WindowsFormsHost.
For the communication between WPF and Unity, you can use RakNet; it works much like a plain socket (see the sketch below).
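If you would rather not pull in RakNet for a local link like this, plain .NET sockets also work. Below is a minimal sketch of the Unity-standalone side; it assumes the WPF Kinect app listens on localhost port 9000 and sends one newline-terminated line of joint data per frame (both the port and the wire format are invented for illustration).

```csharp
// Minimal sketch of the Unity-standalone side of the WPF <-> Unity link,
// using plain .NET sockets instead of RakNet. Assumes the WPF Kinect app
// listens on localhost:9000 and sends one newline-terminated line of joint
// data per frame (port and wire format are placeholders).
using System.IO;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

public class KinectLink : MonoBehaviour
{
    private Thread readerThread;
    private volatile string latestLine; // last message received from the Kinect app

    private void Start()
    {
        readerThread = new Thread(() =>
        {
            using (var client = new TcpClient("127.0.0.1", 9000))
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    latestLine = line; // hand the data off to the main thread
            }
        });
        readerThread.IsBackground = true; // don't keep the player alive on quit
        readerThread.Start();
    }

    private void Update()
    {
        // Unity APIs must be used on the main thread, so apply the data here.
        if (latestLine != null)
            Debug.Log("Kinect data: " + latestLine);
    }
}
```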
In my experience, it's usually not a good idea to use two of something when they both do the same thing. I've never heard of Zigfu before, but it seems relatively easy to learn, and since it's available as a Unity plug-in, it may be best to use it rather than the Kinect SDK, the reason being that Unity isn't too friendly with third-party applications.
If you're aiming for XNA, it's possible to convert easily if the plug-in doesn't already do it for you.
I highly recommend looking over the Unity forums and the ZDK documentation.
http://forum.unity3d.com/threads/127924-Zigfu-dev-kit-for-Unity3D-Product-Launch