Is there a Vivox standalone application for non-gamers? - unreal-engine4

I am wondering whether someone could communicate with in-game players from an external, standalone Vivox application.

You most likely can, as the Vivox SDK can be integrated into any type of project, not just Unreal Engine.
This would be better asked through one of their official channels.
https://developer.vivox.com/
Vivox is a Unity Technologies product, so its support is handled on the Unity forums.
https://unity.com/products/vivox

Related

Can I use Unity to build games for the KaiOS platform?

KaiOS is "a web-based mobile operating system based on a Linux kernel forked from B2G (Boot to Gecko), an open-source, community-driven successor of Firefox OS, which was discontinued by Mozilla in 2016" (as written in its Wikipedia article).
I wanted to use the Unity game engine to make a game for it, but I have not seen KaiOS listed among the available build platforms.
Is it possible that one of the platforms Unity does build for (such as HTML5) could somehow be made compatible with KaiOS? Are there any other tools I can use? Or will I have to write the game from scratch?
The answer is no:
https://docs.unity3d.com/Manual/UnityCloudBuildSupportedPlatforms.html
Sorry to be a letdown. It does support iOS, though.
EDIT
If you put a little more work into it using HTML5 (the WebGL build target), then yes.
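To make that more concrete, here is a minimal sketch of boot-strapping a Unity WebGL (HTML5) build from a plain browser page, which is roughly what a web-based OS like KaiOS would have to run. It assumes a recent Unity version whose WebGL loader exposes a global createUnityInstance function; the file names under Build/ are hypothetical, and whether a low-end KaiOS device can actually run a WebAssembly/WebGL build is a separate question.

```typescript
// Minimal sketch: loading a Unity WebGL build from a plain web page.
// Assumes a recent Unity WebGL template; the Build/ file names are hypothetical.
declare function createUnityInstance(
  canvas: HTMLCanvasElement,
  config: Record<string, unknown>
): Promise<unknown>;

function loadUnityWebGLBuild(): void {
  const canvas = document.querySelector<HTMLCanvasElement>("#unity-canvas");
  if (canvas === null) {
    throw new Error('Missing <canvas id="unity-canvas"> element in the page.');
  }

  // The loader script generated by Unity defines createUnityInstance.
  const loader = document.createElement("script");
  loader.src = "Build/mygame.loader.js";
  loader.onload = () => {
    createUnityInstance(canvas, {
      dataUrl: "Build/mygame.data",           // assets and scenes
      frameworkUrl: "Build/mygame.framework.js",
      codeUrl: "Build/mygame.wasm",           // compiled game code
    }).catch((err) => console.error("Unity WebGL failed to start:", err));
  };
  document.body.appendChild(loader);
}

loadUnityWebGLBuild();
```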

How to create Augmented Reality Web app using Unity & Vuforia?

I am developing an augmented reality app with Unity that needs to be integrated into a website, so I need WebGL output. I am using Vuforia to create the AR experience, but since Vuforia is not supported with WebGL, I am not able to build. Please suggest an alternative method, or another way to do augmented reality in Unity for the web. Is there any alternative to Vuforia?
The good news is yes, you definitely can build an AR experience on the web!
The bad news is that none of the current libraries built for doing so offer a Unity plugin, meaning you'll either have to write a wrapper, make some complicated RPC-style calls from Unity into the JS library, or scrap Unity altogether and use only the library. To my knowledge, the best browser-based AR library is AR.js. I know this isn't the answer you were hoping for, but I hope you're able to achieve your goals. Good luck!
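As a rough illustration of the wrapper/RPC idea above, the browser page can run the JS AR library and forward its tracking results into a running Unity WebGL instance via SendMessage, which is the standard way for a page to call into C#. This is only a sketch: the ArBridge game object, the OnMarkerPose method, and the pose shape are hypothetical names for whatever you define on the Unity side.

```typescript
// Sketch: pushing data from a browser-side AR library (e.g. AR.js) into Unity WebGL.
// The game object and method names below are hypothetical placeholders.

// Minimal shape of the object returned by Unity's createUnityInstance().
interface UnityInstance {
  SendMessage(gameObjectName: string, methodName: string, value?: string | number): void;
}

// Hypothetical pose produced by whatever JS AR library you end up using.
interface MarkerPose {
  x: number;
  y: number;
  z: number;
}

function forwardMarkerPose(unity: UnityInstance, pose: MarkerPose): void {
  // SendMessage only carries a single string or number, so the usual trick is
  // to serialize the payload as JSON here and parse it in the C# handler.
  unity.SendMessage("ArBridge", "OnMarkerPose", JSON.stringify(pose));
}

// Example wiring: call this from the AR library's per-frame tracking callback,
// e.g. forwardMarkerPose(unityInstance, { x: 0, y: 0.5, z: -1 });
```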
This is probably a bit late in the thread, but I'd like to add an option which might help. You can definitely build your AR app for the web with WebGL as the output, and there is an easy way to integrate it with a website too. SLAM-based AR like Google ARCore is a great example of this.
There are two options:
You can build such an app from scratch, which will obviously take more time, because apart from the development itself, setting up the hosting infrastructure is a challenge.
Otherwise, if you want to scale such AR web app development with low or no code and cloud-ready hosting, you can use a SaaS platform called Marvin XR.
You can log in and try it out for free: https://www.marvinxr.com:8443
Hope this helps the other folks who stumble upon this thread.

Hololens applications using WebGL / ThreeJS

I've got a WebGL application built with JavaScript and Three.js. I was able to enable WebVR fairly easily to create an immersive environment, but I think my app is a better use case for mixed reality/AR, and HoloLens seems to be the big player in that hardware space.
As I look at the development tools around HoloLens, it's pretty much Unity and C#. Both are great tools, but as I start developing in this closed environment I kind of feel like I'm building a Silverlight application.
I've been trying to figure out whether there is a trick I can pull off to create an immersive experience with my WebGL app. I know that I can use the Edge browser; however, that's a flat experience, which is of no value for this use case.
I've found a few links:
is-it-possible-to-use-webgl-with-hololens-repost
can-i-make-a-universal-app-using-html-that-runs-on-hololens
augmented reality with awe.js
All of these seem to be either 2D experiences or 'fake' AR using cameras and WebVR. Furthermore, I also looked into porting my WebGL app to Unity using Unity's JavaScript language support, only to find out that it is really a subset fork of actual JavaScript (known as UnityScript), making it far more effort than it's worth.
Given all this, I'm wondering whether it's even possible to accomplish this, and whether anyone knows if it's something on Microsoft's roadmap.
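For reference, the "somewhat easy" WebVR setup the question describes looks roughly like this in Three.js. This is only a sketch: newer Three.js releases replaced the original WebVR path with WebXR, so it assumes a recent version of the library and its optional VRButton helper module.

```typescript
// Minimal sketch: turning a plain Three.js scene into an immersive (WebXR) one.
// Assumes a recent Three.js release; older versions used the WebVR API instead.
import * as THREE from "three";
import { VRButton } from "three/examples/jsm/webxr/VRButton.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true; // opt the renderer into WebXR sessions
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // adds an "Enter VR" button

// A simple object so there is something to look at in the headset.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(0.3, 0.3, 0.3),
  new THREE.MeshNormalMaterial()
);
cube.position.set(0, 1.5, -1);
scene.add(cube);

// setAnimationLoop (rather than requestAnimationFrame) keeps rendering in sync
// with the XR device's refresh rate while a session is active.
renderer.setAnimationLoop(() => {
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```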
There's this new tool from Microsoft called HoloJS. It's a framework for creating holographic apps using JavaScript and WebGL.
holographicjs is a C++ Windows Runtime Component for hosting Windows Holographic apps built with Javascript and WebGL.
It's interesting and a huge hack, but it might be a good first start for the community!
Note: this answer is based on the following:
I do not know what Microsoft's roadmap plans are or will be.
The actual easy way to develop for HoloLens right now is using Visual Studio and Unity3D (so maybe there is a way to develop using WebGL, but as you can see, it is not the easy, direct, and supported way).
My answer: taking into account that this is a new product with no direct competition, they will not move toward offering other platforms unless they are forced to. Meanwhile, they are happy for you to use C#, Visual Studio, .NET, Edge, Windows, and Unity3D under Windows (I find it hard to believe you can do this with Unity3D on macOS or Linux). It's also normal that they offer a limited ecosystem at the moment, with the same excuse: it is new, so, due to stability and optimization concerns, support is limited to their most familiar context: Microsoft products.
But as soon as new devices come in and start offering new things (support for other programming languages, operating systems, or the web), you can be completely sure that they will either evolve or die.

Unity3D + Kinect interaction

Guys, I am working on a project which uses the Unity engine and Kinect as the input source. As far as I know, there is not much support between Unity and the Kinect SDK. I have heard about the Zigfu framework, but it does not give me all the functionality I need, so what are my options? I am thinking of taking some functionality from Zigfu and some from a background application built in .NET 4.0 using the official Kinect SDK. Can I connect to the Kinect via two interfaces at the same time, i.e. Zigfu and the Kinect SDK? My background app would connect to Unity via pipes. Is that a good option?
I've already done something similar. I wanted to use the Unity 3D engine and drive some interactions to animate the model using the Kinect SDK. Some functionality in the Kinect SDK is not available in Zigfu, such as hand-grip detection.
Because the Kinect SDK is geared toward WPF applications, here is my solution:
Build your Unity project as a standalone player (PC, Mac, Linux).
Create a WPF application with the Kinect code inside.
Add a WindowsFormsHost inside the XAML of your WPF application.
Embed your Unity standalone build into the WPF application using the WindowsFormsHost.
For communication between WPF and Unity, you can use RakNet; it works the way a socket does.
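The actual bridge in this setup would be C# on both the WPF and Unity sides (for example via RakNet, as mentioned above). Purely to illustrate the shape of the socket-based message passing being described, here is a minimal sketch using TypeScript and Node's net module; the port number and the fake skeleton payload are made up for the example.

```typescript
// Illustration only: shows the pattern of one process streaming input data
// (here, fake Kinect joint positions) to another over a local TCP socket.
import * as net from "net";

const PORT = 9050; // hypothetical port shared by both processes

// "Sensor host" side: accepts a connection and pushes updates as JSON lines.
const server = net.createServer((socket) => {
  const skeletonUpdate = { joint: "rightHand", x: 0.4, y: 1.1, z: 2.3 };
  socket.write(JSON.stringify(skeletonUpdate) + "\n");
});
server.listen(PORT, () => console.log(`bridge listening on ${PORT}`));

// "Game" side: connects and reacts to each newline-delimited message.
const client = net.createConnection({ port: PORT }, () => {
  console.log("connected to bridge");
});
client.on("data", (chunk) => {
  for (const line of chunk.toString().split("\n").filter(Boolean)) {
    const update = JSON.parse(line);
    console.log("apply to avatar:", update); // e.g. drive a bone transform
  }
});
```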
In my experience, it's usually not a good idea to use two of something when they both do the same thing. I've never heard of Zigfu before, but it seems relatively easy to learn. Since it's available as a Unity plug-in, it may be best to use it rather than the Kinect SDK directly, the reason being that Unity isn't too "friendly" with third-party applications.
If you're aiming for XNA, it's possible to convert easily if the plug-in doesn't already do it for you.
I highly recommend looking over the Unity forums and the ZDK documentation.
http://forum.unity3d.com/threads/127924-Zigfu-dev-kit-for-Unity3D-Product-Launch

Guide for developing J2ME applications

I am new to J2ME, and what I have now is the NetBeans 6.7.1 IDE. Is there any basic guide for developing mobile applications in NetBeans 6.7.1? Please provide the links.
NetBeans.org itself has great tutorials for mobile development in NetBeans, and that is what you need:
http://netbeans.org/kb/trails/mobility.html
For example, this is a very good quick start for NetBeans J2ME development:
http://netbeans.org/kb/docs/javame/quickstart.html
I think this book is the best source for J2ME with NetBeans:
Kicking Butt with MIDP and MSA: Creating Great Mobile Applications (The Java Series)
Book Description:
The release of MIDP 2.0 and the introduction of the new Mobile Service Architecture (MSA) are generating momentum for the Java ME platform. As more and more Java-enabled mobile devices become available and more service providers become open to third-party development, the demand for customized applications will grow dramatically. Now, there's a practical, realistic guide to building MIDP 2.0/MSA applications that are robust, responsive, maintainable, and fun.
Long-time Java ME author Jonathan Knudsen offers real solutions for the complex challenges of coding efficiency, application design, and usability in constrained mobile environments. Experienced Java developers will master MIDP 2.0 and MSA programming through clear, carefully designed examples. Downloadable code is available for both NetBeans Mobility Pack and the Sun Java Wireless Toolkit. Kicking Butt with MIDP and MSA's wide-ranging content covers:
Pushing MIDP's limits, and exploiting MSA's full power
Using MIDlets, Forms, commands, core classes, and invocation
Building effective mobile user interfaces
Designing graphics with the Canvas, the Game API, SVG, and 3D
Providing storage and resources: record stores, FileConnection, and PDA PIM
Internationalizing mobile applications
Networking via WMA, Bluetooth, Web services, and SIP
Parsing XML documents
Implementing audio and advanced multimedia
Securing mobile applications with SATSA and the Payment API
Building advanced location-based applications
Designing applications for multiple devices
Creating end-to-end mobile application architectures
Tell us what platform you are developing on so that the tools available to you can be suggested.
Also, you most definitely want an emulator so that you can test your applications directly on your computer (it saves time).
Honestly, I can't give much more advice beyond knowing Java well and using Google plus Stack Overflow. That is what I did, and I ended up developing a commercial app in J2ME just fine.