The idea is to create a HoloLens application that displays a hologram which can then be manipulated through a UWP application running on the desktop. The desktop application would contain various UI elements that manipulate the hologram (e.g. a rotation button to turn it 45 degrees) and would, of course, show the same object as the HoloLens. Naturally, I arrived at the 240 Academy tutorial, but that seems a bit outdated compared to the current version of the HoloToolkit. It also doesn't really fit my scenario, since I am not sharing between two HoloLens devices, but between a desktop and a HoloLens. I figured that shouldn't really matter since both still target UWP, but I wasn't sure.
What I've tried so far is editing the example scene "SharingSpawnTest" and targeting it for the PC to see what would happen, but I suspect this isn't the right approach, since the project settings are set up for Mixed Reality rather than a regular desktop UWP application.
My question is basically whether this is even possible and, if so, how I can achieve it. Do I have to create two separate projects, one for the desktop and one specifically for the HoloLens, and have them communicate that way?
Add a NetworkManagerHUD component to your NetworkManager object. Once you add it, type in the HoloLens IP address and connect to it. Currently, this only works with the HoloLens as the host and the desktop as the client.
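If you'd rather drive the connection from your own UI instead of typing the address into the HUD, a minimal sketch of the desktop-side script could look like this (the class name, IP and port below are placeholders, not anything from the toolkit):

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Minimal sketch: the desktop/UWP build acts as the UNET client,
// while the HoloLens build calls StartHost() instead.
public class DesktopClientConnector : MonoBehaviour
{
    public string hololensAddress = "192.168.1.42"; // placeholder HoloLens IP
    public int port = 7777;

    // Hook this up to a "Connect" button in the desktop UI.
    public void ConnectToHoloLens()
    {
        NetworkManager manager = NetworkManager.singleton;
        manager.networkAddress = hololensAddress;
        manager.networkPort = port;
        manager.StartClient(); // the HoloLens side runs NetworkManager.singleton.StartHost()
    }
}
```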
I am new to UNET (Unity networking) in general. What I made was a simple prototype game, and everything works perfectly, at least when I run two instances of the game on the same laptop: I was able to host from one instance and join from the other,
so it did exactly what I expected. But when I used another laptop connected to the same Wi-Fi,
I couldn't join the hosting device anymore. I also tried it on mobile, but the same thing happened.
This is what it says when I run the game on two devices.
You are trying to connect to localhost (the device's own local address). That obviously doesn't work for connecting to another device.
You have to configure, in the NetworkManager component, the IP address of the device that will later actually be hosting the game.
Alternatively, I can recommend the NetworkManagerHUD component (it has to be attached next to the NetworkManager component, on the same GameObject), so that in the game you can still dynamically adjust the host IP address, as shown in Using the NetworkManagerHUD.
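If you later want to replace the HUD with your own menu, a rough sketch of the equivalent script calls might look like this (the UI field and method names are just examples):

```csharp
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.UI;

// Rough equivalent of what the NetworkManagerHUD buttons do,
// so the host address can still be changed at runtime from your own UI.
public class ConnectionMenu : MonoBehaviour
{
    public InputField addressField; // e.g. "192.168.0.12" typed by the player

    public void OnHostButton()
    {
        NetworkManager.singleton.StartHost();
    }

    public void OnJoinButton()
    {
        NetworkManager.singleton.networkAddress = addressField.text;
        NetworkManager.singleton.StartClient();
    }
}
```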
Or, if you want to go crazy, you can instead use NetworkDiscovery to auto-discover a host on your local network (LAN only; it doesn't work for internet connections).
In the easiest mode, simply make sure to enable Use NetworkManager:
"Enable this to use the Network Manager settings for broadcasting, and to then auto-join found games."
Otherwise, you can also implement your own script for handling the sending and receiving of broadcast messages.
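A rough sketch of such a custom discovery setup, assuming the standard NetworkDiscovery base class (the class and method names are mine, and in practice fromAddress may need some cleanup before use):

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Sketch of a custom NetworkDiscovery: the host broadcasts on the LAN,
// clients listen and connect to the first host they hear from.
public class LanDiscovery : NetworkDiscovery
{
    public void StartAsHost()
    {
        Initialize();
        StartAsServer();                     // broadcast our presence
        NetworkManager.singleton.StartHost();
    }

    public void StartAsPlayer()
    {
        Initialize();
        StartAsClient();                     // listen for broadcasts
    }

    public override void OnReceivedBroadcast(string fromAddress, string data)
    {
        // fromAddress is the broadcasting host's address; connect to it once discovered.
        StopBroadcast();
        NetworkManager.singleton.networkAddress = fromAddress;
        NetworkManager.singleton.StartClient();
    }
}
```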
UNet deprecation
In general, keep in mind that
UNet is deprecated, and will be removed from Unity in the future. A new system is under development. For more information and next steps see this blog post and the FAQ.
I can't figure out the wording to research the following idea.
With the Holograms app, I can place a hologram and still see other apps' windows or use the web browser at the same time.
How can I create an app that does not occupy the whole system, but instead runs alongside the browser and other apps?
EDIT: I am trying to run a hologram within the shell.
https://developer.microsoft.com/en-us/windows/holographic/hololens_shell_overview
Mostly, this is for 2D apps, but the Holograms app runs 3D holograms, so is it possible to duplicate this?
3D HoloLens apps do not currently support running side by side with other applications. As of the May release you are able to run multiple "flat apps" (UWP apps) side by side:
https://developer.microsoft.com/en-us/windows/holographic/release_notes_-_may_2016
Currently there has not been any announcement about running multiple 3D apps side by side. I optimistically hope that this is coming in a future OS release.
If I understand you correctly, you are interested in creating an application for the HoloLens that, rather than "appearing in the world around you", would run in a window (like Edge) placed in your spatial world. If this is correct, any UWP application can run windowed and be placed in your virtual space; you don't need a specific HoloLens project for that. If, however, you want to pull in spatial data, you would need to wire up the required events so that spatial data callbacks handle the incoming data. I'm not 100% sure Unity is the best fit for this problem (at least currently), since they are focusing their attention on spatial hologram applications rather than desktop UWP applications that consume spatial data.
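For what it's worth, a minimal sketch of wiring up spatial callbacks from a plain UWP app might look something like the following; the class is purely illustrative, and whether a windowed 2D app actually receives useful spatial data this way is exactly the open question here:

```csharp
using System;
using Windows.Perception.Spatial;

// Very rough sketch of subscribing to spatial data in a UWP app.
public sealed class SpatialDataListener
{
    private SpatialLocator locator;
    private SpatialStationaryFrameOfReference referenceFrame;

    public void Start()
    {
        locator = SpatialLocator.GetDefault();
        if (locator == null)
            return; // no spatial sensing available in this view/device

        locator.LocatabilityChanged += OnLocatabilityChanged;
        referenceFrame = locator.CreateStationaryFrameOfReferenceAtCurrentLocation();
    }

    private void OnLocatabilityChanged(SpatialLocator sender, object args)
    {
        // React to tracking state changes (e.g. tracking lost/regained).
        System.Diagnostics.Debug.WriteLine("Locatability: " + sender.Locatability);
    }
}
```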
Just to note, a single 3D app (i.e. Unity-developed) takes over the whole HoloLens device and experience. If you bloom (Windows key in the emulator) you can switch to the UWP app view, but you can't see other apps at the same time.
I am currently thinking about building a multi-desktop application for the Oculus Rift. The idea is that the user can use multiple screens as on a regular computer, but sees three windows (left, center, right) when moving their head. (So far so good)
But this is where it gets tricky: On the three windows, I would like to use three different browser tabs or applications that the user can watch and use simultaneously with mouse and keyboard.
Can anybody please suggest how to start the whole thing, or is there a framework that I can use? UE4 or Unity3D will be able to give me multiple screens, but bringing the content/apps/browsers onto them is what I can't figure out...
Thanks for any help!
fj
You can install a VNC server on the machine and render three VNC clients. This is doable using a VNC Java client and rendering its output into a texture.
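As a sketch of the "render into a texture" part (here in Unity; the IVncFrameSource interface is hypothetical and stands in for whatever VNC client library you end up using):

```csharp
using UnityEngine;

// Hypothetical wrapper around your chosen VNC client library;
// the real library and its API will differ.
public interface IVncFrameSource
{
    int Width { get; }
    int Height { get; }
    bool TryGetFrame(Color32[] buffer); // fills buffer with the latest framebuffer
}

public class VncScreen : MonoBehaviour
{
    public Renderer targetRenderer;   // the quad acting as one virtual monitor
    private IVncFrameSource source;   // assigned elsewhere, e.g. one per VNC session
    private Texture2D texture;
    private Color32[] pixels;

    public void Bind(IVncFrameSource frameSource)
    {
        source = frameSource;
        texture = new Texture2D(source.Width, source.Height, TextureFormat.RGBA32, false);
        pixels = new Color32[source.Width * source.Height];
        targetRenderer.material.mainTexture = texture;
    }

    void Update()
    {
        if (source != null && source.TryGetFrame(pixels))
        {
            texture.SetPixels32(pixels);
            texture.Apply(); // push the new frame to the GPU
        }
    }
}
```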
Using Open Wonderland this is possible, since it has Oculus support.
Regards
I've received a project from someone that includes an Arduino (Uno) board with some sensors and lights, a USB cable, and a documented protocol for communicating with the board through a COM port. It works fine with some existing code, but I need to port the whole project to a Windows RT environment using an ARM processor, including a Metro interface for the application. And it's going to be completely rewritten...
First of all, my Windows RT device does have a USB port, so it can connect to the board. But the challenge is communicating with the board to read out the sensors and manipulate the lights, and I'm having problems finding useful libraries, tutorials or other information about how to make these work together.
This project works fine with other Windows versions, though. I just need something specific for Windows RT/ARM/Metro.
Currently it is not possible to do this on Windows RT, and here is an explanation why. As a workaround I am using a standard full-screen WPF application in combination with the Surface SDK for touch-enabled UI components. The obvious disadvantage here is that you cannot publish the app to the Store.
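For the serial side of that workaround, here is a rough sketch in desktop .NET/WPF; the COM port name, baud rate and command strings are placeholders for whatever the board's documented protocol actually specifies:

```csharp
using System;
using System.IO.Ports;
using System.Windows;

// Sketch of the workaround: a plain full-screen WPF window (desktop .NET)
// talking to the Arduino over its virtual COM port.
public partial class MainWindow : Window
{
    private SerialPort port;

    public MainWindow()
    {
        InitializeComponent();

        // Full-screen, borderless window in place of a Store/Metro app.
        WindowStyle = WindowStyle.None;
        WindowState = WindowState.Maximized;

        port = new SerialPort("COM3", 9600); // placeholder port and baud rate
        port.DataReceived += OnDataReceived;
        port.Open();
    }

    private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        string line = port.ReadLine();          // e.g. a sensor reading
        Dispatcher.Invoke(() => Title = line);  // marshal back to the UI thread
    }

    public void SetLight(bool on)
    {
        port.WriteLine(on ? "LIGHT ON" : "LIGHT OFF"); // placeholder protocol commands
    }
}
```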
I think we should actually try it on a real machine instead of the RT. The Surface RT is basically for documents and the internet.
You'd be better off trying all of this with a Toshiba 2032.
A PDA from about 2003.
I am currently working on a project where I need to access a built-in camera (the software will run on a tablet), stream what the camera is showing, and allow the user to take a picture from the stream. I have a version of what I am trying to accomplish working on my laptop with its built-in camera. The major difference is that the laptop is running Windows XP while the tablet is running Windows 7.
Running the software on the tablet I get an exception (with some research it appears the exception is caused by no WIA device being found). Is it possible that the built-in camera is not WIA compatible? The device does show in Device Manager as a USB Camera Device, but unlike the camera on my laptop I can't access it directly. I have to use 3rd-party software installed by the tablet maker to get the camera to work.
Has anyone experienced similar problems? I have to believe that if the tablet maker can do what I need, I should be able to do something similar.
There is also the Windows Portable Device API that can access cameras, but that appears to be written in C++, without a .NET wrapper. Does anyone know of a simple tutorial on how I could get .NET to play nice with it? EDIT: Just tried it; WPD didn't list any devices either. I am beginning to think this camera doesn't exist.
Any knowledge or pointers to resources would be appreciated. (So far Google has turned up the same few articles, no matter which way I approach the problem.)
Turns out my camera was not WIA compatible. I was able to get the tablet to do what I needed using DirectShow (actually DirectShow.NET).
Good links if others are trying to do something similar and hitting the same problems:
http://msdn.microsoft.com/en-us/library/dd375454%28VS.85%29.aspx
http://directshownet.sourceforge.net/faq.html
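As a starting point, a small sketch using DirectShow.NET (DirectShowLib) just to confirm the camera shows up as a capture device at all; the actual preview and still-capture graph is what the links above cover:

```csharp
using System;
using DirectShowLib; // DirectShow.NET library

// Quick sanity check that the tablet camera is visible to DirectShow
// before building the preview/snapshot graph.
class ListCaptureDevices
{
    static void Main()
    {
        DsDevice[] cameras = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice);

        if (cameras.Length == 0)
        {
            Console.WriteLine("No DirectShow video capture devices found.");
            return;
        }

        foreach (DsDevice cam in cameras)
        {
            // Name is the friendly name shown in Device Manager,
            // DevicePath uniquely identifies the device.
            Console.WriteLine("{0} ({1})", cam.Name, cam.DevicePath);
        }
    }
}
```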