How to integrate USB communication in Unity Engine for a desktop application? - unity3d

Has anyone done a Unity project that involves USB communication? I need to communicate with a board over USB to read sensor values. How should I go about this?

You can link .NET DLLs in Unity by adding them to the project (drag and drop worked, if I recall correctly). So write your board-access library as a .NET project in Visual Studio, expose the API you need, and add the resulting DLL to the Unity project.
You will then be able to access the contents of the DLL from your Unity code (although I have only done this with C#).
Of course, this only works on PC. For other platforms, I don't know if this is even possible.
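For reference, here is a minimal sketch of what such a board-access class might look like, assuming the board enumerates as a virtual serial (COM) port, which is common for USB sensor boards. The port name, baud rate, and one-reading-per-line protocol are all assumptions; match them to your hardware.

```csharp
// Hypothetical board-access class, compiled into a .NET class library (DLL).
// Assumes the board appears as a virtual COM port and sends one numeric
// reading per line. "COM3" and 9600 baud are placeholders.
using System;
using System.Globalization;
using System.IO.Ports;

public class SensorBoard : IDisposable
{
    private readonly SerialPort _port;

    public SensorBoard(string portName = "COM3", int baudRate = 9600)
    {
        _port = new SerialPort(portName, baudRate);
        _port.ReadTimeout = 500; // ms, so a silent board can't hang the caller forever
        _port.Open();
    }

    // Reads one line from the board and parses it as a sensor value.
    public float ReadSensorValue()
    {
        string line = _port.ReadLine();
        return float.Parse(line, CultureInfo.InvariantCulture);
    }

    public void Dispose()
    {
        if (_port.IsOpen) _port.Close();
    }
}
```

Drop the compiled DLL into Assets/Plugins and call it from a MonoBehaviour. Note that System.IO.Ports typically requires the project's API Compatibility Level to be set to .NET 4.x (or .NET Framework in newer Unity versions) rather than the .NET Standard subset.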

Related

Is it possible for Unity to build directly to Arduino?

I am working on a project for a client, and the request came in to bypass the Arduino IDE altogether, write all the code in C#, and have Unity build directly to the board. Is this something that would even be possible?
I know there are plenty of libraries I can use to communicate with serial devices directly; could I just do that instead?
Has anyone tried to do this before?
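As far as I know, Unity cannot compile C# to the Arduino's microcontroller (the board runs native firmware, not .NET), but the serial-library route mentioned above works: flash the board with its own firmware once, then exchange messages with it from Unity. A minimal sketch of the Unity side, assuming a hypothetical line-based command protocol and placeholder port settings:

```csharp
// Hypothetical Unity-side serial link to an Arduino running its own firmware.
// This does not build C# "to" the board; it only exchanges messages with it.
using System.IO.Ports;
using UnityEngine;

public class ArduinoLink : MonoBehaviour
{
    private SerialPort _port;

    void Start()
    {
        _port = new SerialPort("COM4", 115200); // placeholders: match your sketch
        _port.ReadTimeout = 50;
        _port.Open();
    }

    void Update()
    {
        _port.WriteLine("READ"); // assumed firmware command
        try
        {
            Debug.Log("Board says: " + _port.ReadLine());
        }
        catch (System.TimeoutException)
        {
            // No reply this frame; carry on.
        }
    }

    void OnApplicationQuit()
    {
        if (_port != null && _port.IsOpen) _port.Close();
    }
}
```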

Unity Build targets Mixed Reality but does not appear in apps

I have created a project using the Windows XR Plugin and XR Plugin management. I am NOT using the XR interaction toolkit, I have created my own tools, including my own XRRig using the Tracked Pose Driver.
My project is NOT built with UWP, since I need file access that UWP does not provide, or that is too tricky to provide (after many attempts and work with a senior developer here, I just gave up). So I'm using standard Unity, and my build target is "PC, Mac & Linux Standalone". My Player settings in the XR Plug-in Management are "Windows Mixed Reality". I am using a few UWP functions for file access within the app.
Now, once the build is done, I obviously don't see it within Steam, but it also does not appear in the Mixed Reality list of applications. I have to start it manually by clicking its icon on the desktop. It works great, but wtf...?
I know that for applications to appear in Mixed Reality they must be built with UWP, but if this build is neither UWP nor Steam, what is it then? How do I add it (or sideload it) into the Mixed Reality applications that the Windows menu brings up within the Cliff House, for example?
To answer the first question, about what type of application is built when UWP and Steam are removed as targets in Unity: it is built as a classic PC desktop (Win32) application.
To answer the second question, about how to access this application inside the Mixed Reality Cliff House shell when it is not a UWP application: it can be launched via the "Classic Apps" pin inside the shell.
Here is more information on the "Classic Apps" pin:
https://learn.microsoft.com/en-us/windows/mixed-reality/whats-new/release-notes-may-2019#how-to-launch
That should answer the question about how to launch it from inside the VR experience itself.

Open executable jar from Unity UWP

I'm building a Windows app using Unity3D, and I want to include an executable .jar file which should open when a button is clicked in the app. Is there any way to do this? Please help.
Unfortunately, I don't think there is a direct way to do this. Pure UWP apps (which Unity generates) are sandboxed and compiled using .NET Native. As such, they don't allow executing external code or processes, as that would pose a security risk.
If you used an external Desktop Bridge app (which has full permissions) and communicated with it via an app service (see this documentation article), you could theoretically achieve this, but it sounds a bit too complex. Another alternative would be to publish your app as a classic Win32 game. It would still be possible to publish it on the Microsoft Store; however, it would be limited to desktop devices.
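If you go the classic Win32 route, launching the .jar becomes straightforward process creation. A minimal sketch, assuming Java is installed and on the PATH; "MyTool.jar" is a hypothetical file name:

```csharp
// Sketch for a classic Win32 (non-UWP) Unity build: launch an executable .jar.
// Process.Start is exactly what the UWP sandbox forbids, so this only works
// in a desktop build.
using System.Diagnostics;
using UnityEngine;

public class JarLauncher : MonoBehaviour
{
    // Wire this method to a UI Button's OnClick event.
    public void LaunchJar()
    {
        Process.Start(new ProcessStartInfo
        {
            FileName = "java",              // assumes java.exe is on the PATH
            Arguments = "-jar MyTool.jar",  // hypothetical jar name
            UseShellExecute = true
        });
    }
}
```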

HoloLens applications using WebGL / ThreeJS

I've got a WebGL application built with JavaScript and ThreeJS. I was able to enable WebVR fairly easily to create an immersive environment. I think my app is a better use case for mixed reality/AR, and HoloLens seems to be the big player in that hardware space.
As I look at the development tools around HoloLens, it's pretty much Unity and C#. Both are great tools, but as I start developing in this closed environment I kinda feel like I'm building a Silverlight application.
I've been trying to figure out if there is a trick I can use to create an immersive experience with my WebGL app. I know that I can use the Edge browser; however, that's a flat experience, which is of no value for this use case.
I've found a few links:
is-it-possible-to-use-webgl-with-hololens-repost
can-i-make-a-universal-app-using-html-that-runs-on-hololens
augmented reality with awe.js
All of these seem to be either 2D experiences or 'fake' AR using cameras and WebVR. I also looked into porting my WebGL app to Unity using Unity's JavaScript language support, only to find that it is really a fork of a subset of actual JavaScript (known as UnityScript), making it way more effort than it's worth.
Given all this, I'm wondering if it's even possible to accomplish this, and whether anyone knows if it is on Microsoft's roadmap?
There's this new tool from Microsoft called HoloJS. It's a framework for creating holographic apps using JavaScript and WebGL.
holographicjs is a C++ Windows Runtime Component for hosting Windows Holographic apps built with Javascript and WebGL.
It's interesting, and a huge hack, but it might be a good first start for the community!
Note: this answer is based on the following:
I do not know what Microsoft's roadmap plans are or will be.
The actual easy way to develop for HoloLens is using Visual Studio and Unity3D (so maybe there is a way to develop using WebGL, but as you can see, it is not the easy, direct, supported way).
My answer: given that this is a new product with no direct competition, Microsoft will not move toward offering other platforms unless they are forced to. Meanwhile, they are happy for you to use C#, Visual Studio, .NET, Edge, and Unity3D under Windows (I find it hard to believe you can do this with Unity3D on macOS or Linux). It is also normal for them to offer a limited ecosystem at the moment, with the same excuse: the product is new, so, due to stability and optimization concerns, support is limited to their most familiar context: Microsoft products.
But as soon as new devices come in and start offering new things (support for more programming languages, OSes, or the web), you can be completely sure that they will either evolve or die.

unity3D + Kinect interaction

I am working on a project which uses the Unity engine and a Kinect as the input source. As far as I know, there is not much support between Unity and the Kinect SDK. I have heard about the Zigfu framework, but it does not give me all the functionality I need. So what are my options? I am thinking of taking some functionality from Zigfu and some from a background application built in .NET 4.0 using the official Kinect SDK. Can I connect to the Kinect via two interfaces at the same time, i.e. Zigfu and the Kinect SDK? My background app would connect to Unity via pipes. Is that a good option?
I've already done something similar: I wanted to use the Unity 3D engine and drive model animations through interactions captured with the Kinect (Kinect SDK). Some functionality in the Kinect SDK is not available in Zigfu, such as hand-grip detection.
Because the Kinect SDK is suited to WPF applications, here is my solution:
Build your Unity project as a standalone player (PC, Mac & Linux Standalone).
Create WPF application with Kinect stuff inside.
Add WindowsFormsHost inside your XAML of WPF application.
Embed your Unity Standalone into WPF using WindowsFormsHost.
For communication between WPF and Unity, you can use RakNet. It works much like a socket does.
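As a simpler stand-in for RakNet (my substitution, not part of the original answer), the same link can be sketched with plain UDP sockets. The port number, message format, and hand-grip event are all assumptions:

```csharp
// WPF side (where the Kinect SDK lives): push events to the embedded Unity player.
using System.Net.Sockets;
using System.Text;

public class KinectBridgeSender
{
    private readonly UdpClient _client = new UdpClient();

    public void SendHandState(bool gripping)
    {
        byte[] payload = Encoding.ASCII.GetBytes(gripping ? "GRIP" : "RELEASE");
        _client.Send(payload, payload.Length, "127.0.0.1", 9050); // port is arbitrary
    }
}
```

```csharp
// Unity side: a MonoBehaviour in the standalone build polling for those events.
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

public class KinectBridgeReceiver : MonoBehaviour
{
    private UdpClient _listener;

    void Start()
    {
        _listener = new UdpClient(9050); // must match the sender's port
    }

    void Update()
    {
        // Poll without blocking Unity's main thread.
        while (_listener.Available > 0)
        {
            IPEndPoint remote = null;
            string message = Encoding.ASCII.GetString(_listener.Receive(ref remote));
            Debug.Log("Kinect event: " + message); // e.g. trigger a grab animation
        }
    }

    void OnDestroy()
    {
        _listener.Close();
    }
}
```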
In my experience, it's usually not a good idea to use "two of" something when they both do the same thing. I've never heard of Zigfu before, but it seems relatively easy to learn. Since it's available as a Unity plug-in, it may be best to use that over the Kinect SDK, the reason being that Unity isn't too "friendly" with third-party applications.
If you're aiming for XNA, it's possible to convert easily if the plug-in doesn't already do it for you.
I highly recommend looking over the Unity forums and the ZDK documentation.
http://forum.unity3d.com/threads/127924-Zigfu-dev-kit-for-Unity3D-Product-Launch