My multiplayer VR project is based on Unity3D, Oculus Integration v1.39 and PUN2 running on Oculus Quest. I'm using the standard teleport script provided in the Oculus library. PhotonAvatarView is what I use to keep avatar position/rotation in sync across clients.
Unfortunately, when a player teleports to a new location, the other client doesn't see any change in the remote avatar. It seems like PhotonAvatarView doesn't pick up the change in the user's location, which is really strange. How can I fix it?
PhotonAvatarView code is available at this URL: https://doc.photonengine.com/en-us/pun/current/demos-and-tutorials/oculusavatarsdk
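For reference, this is roughly the kind of manual sync I'd expect to need if PhotonAvatarView can't be made to see the jump: observe the rig root explicitly via PUN2's IPunObservable. A sketch, untested; the class name is mine:

```csharp
using Photon.Pun;
using UnityEngine;

// Sketch: replicate the avatar root explicitly so a teleport's position
// jump is sent over the network. Attach this next to the PhotonView and
// add it to the PhotonView's "Observed Components" list.
public class RigRootSync : MonoBehaviourPun, IPunObservable
{
    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            // Local player: send the current root pose.
            stream.SendNext(transform.position);
            stream.SendNext(transform.rotation);
        }
        else
        {
            // Remote copy: snap rather than interpolate, so teleports don't smear.
            transform.position = (Vector3)stream.ReceiveNext();
            transform.rotation = (Quaternion)stream.ReceiveNext();
        }
    }
}
```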
This is my first time trying to track down an API; if I'm successful, I'll then need to figure out how to use it.
I want to run an immersive VR experience on an Oculus Quest that a person wears while sitting in a motion simulator, and I want the movement of a boat in the VR headset to correspond with the movement of the chair.
The software for the chair I have is called Actuate Motion v1.0.8.
Their website says they have a "C API" (for which I can't find any online documentation), but they also mention a Unity plugin you can use for your game. I would try their plugin before attempting their API.
We are using Vuforia for image tracking with HoloLens and the Unity engine. Vuforia works fine. We are also using Azure Spatial Anchors to fix the location of objects. However, the anchors do not seem to work with Vuforia. It appears that Vuforia captures camera events and does not pass them on to Azure Spatial Anchors, maybe?
Is there a way to get both technologies working at the same time?
The major issue is that Vuforia occupies the camera pipeline.
You could stop Vuforia, switch to ASA, and switch back when you're done (see the sketch below).
Or you could work from camera frames and timestamps and feed those to ASA.
This page may help you get the camera frame:
https://library.vuforia.com/platform-support/working-camera-unity
You could then transfer the frames to a service you host on a Linux server, with Spatial Anchors: https://github.com/microsoft/azure_spatial_anchors_ros
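A rough sketch of the stop-and-switch idea, assuming the ASA Unity SDK's SpatialAnchorManager and Vuforia's VuforiaBehaviour (method names as I understand those SDKs; the actual anchor watcher and error handling are omitted):

```csharp
using Microsoft.Azure.SpatialAnchors.Unity; // ASA Unity SDK
using UnityEngine;
using Vuforia;

public class CameraHandoff : MonoBehaviour
{
    public SpatialAnchorManager anchorManager; // from the ASA Unity package

    // Release the camera from Vuforia, run an ASA session, then hand it back.
    public async void RelocalizeWithAsa()
    {
        VuforiaBehaviour.Instance.enabled = false;  // stop Vuforia's camera use

        await anchorManager.StartSessionAsync();    // ASA takes over tracking
        // ... create a watcher here and wait for your anchors to be located ...
        anchorManager.StopSession();

        VuforiaBehaviour.Instance.enabled = true;   // resume Vuforia tracking
    }
}
```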
I usually use Firebase to sync every player in my multiplayer games, but this time I can't, because I want to create a desktop game and Firebase only supports mobile.
Can I use GunDB as an alternative to store the player position and animation, with every client automatically syncing the data?
@alucard555 yes, there is a very simple example of a browser-based game (Asteroids in 250 LOC!) that could work in a desktop app via Electron or something:
https://github.com/amark/gun/blob/master/examples/game/space.html
You can play the game (arrow keys to move, space to fire a shockwave, doesn't work on mobile or small screens) here:
http://gunjs.herokuapp.com/game/space.html
With regards to Unity3D specifically, you would need a JavaScript bridge. I have not done Unity3D development myself, but I have heard it supports JavaScript, or some variant of it?
GUN by itself is plain vanilla JS; the only porting UnityScript may need is changing the default localStorage and WebSocket adapters (these are modular and can easily be switched out for something Unity supports).
However, I do not have enough Unity3D experience to speak on this matter. (I just looked up Firebase's Unity support and noticed that it is not JS based, it is C++. This may mean JS is incompatible with Unity?)
I'm developing a Universal Windows Platform application to stream the HoloLens camera and interact with it. Can you help?
First attack angle: creating a Unity project and building it as UWP.
I have successfully created the SharingServer from HoloToolkit-Unity, and I can send custom messages to the HoloLens from my Unity app. But it's impossible to get the video stream from the Device Portal into a classic Unity video player
using https://[username]:[password]#[HololensIP]/api/holographic/stream/live.mp4?holo=true&pv=true&mic=true&loopback=true
which returns WindowsVideoMedia error 0x80072f8f while reading [url].
I suppose the problem comes from the video type (a live stream without a duration), but it's .mp4 H.264, which should be accepted.
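For what it's worth, 0x80072F8F is WinINet's generic secure-channel error, so the failure might actually be TLS (the Device Portal uses a self-signed certificate) rather than the container. If so, something like the following UWP fetch, which tolerates the untrusted certificate, is one way to pull the stream yourself (a sketch, untested; the helper name is mine):

```csharp
using System;
using System.Threading.Tasks;
using Windows.Security.Credentials;
using Windows.Security.Cryptography.Certificates;
using Windows.Web.Http;
using Windows.Web.Http.Filters;

public static class DevicePortalStream
{
    // Open the Device Portal MRC stream, accepting its self-signed certificate.
    public static async Task<Windows.Storage.Streams.IInputStream> OpenAsync(
        string ip, string user, string password)
    {
        var filter = new HttpBaseProtocolFilter();
        // The Device Portal certificate is self-signed, so ignore these errors.
        filter.IgnorableServerCertificateErrors.Add(ChainValidationResult.Untrusted);
        filter.IgnorableServerCertificateErrors.Add(ChainValidationResult.InvalidName);
        filter.ServerCredential = new PasswordCredential("portal", user, password);

        var client = new HttpClient(filter);
        var uri = new Uri($"https://{ip}/api/holographic/stream/live.mp4?holo=true&pv=true");
        var response = await client.GetAsync(uri, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsInputStreamAsync();
    }
}
```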
Facing that impossible wall, I started over in Visual Studio, making a UWP app directly.
Second attack angle: UWP straight from VS2017.
There I can use WindowsDevicePortalWrapper to get the stream and display it in a MediaElement.
But right now it's impossible to import SharingClient.dll into VS.
So I can't send data to my HoloLens from my UWP app.
Thank you for your help
I'm developing VR using the Google Cardboard SDK.
I want to move in the virtual environment when I walk in the real world, like this: https://www.youtube.com/watch?v=sZG5__Z9pzs&feature=youtu.be&t=48
Is it possible to make a VR application like that for Android, maybe using the accelerometer? How can I implement this using Unity?
I tried recording the accelerometer while walking with my smartphone; here are the results: https://www.youtube.com/watch?v=ltPwS7-3nOI [I think the accelerometer values are really noisy -___-]
Actually, it is not possible with the phone alone:
You're up against a fundamental limitation of the humble IMU (the primary motion sensor in a smartphone).
I won't go into detail, but basically position has to be double-integrated from acceleration, so tiny sensor errors accumulate fast, and you need an external reference frame to correct the drift. This is the topic of a lot of research right now, and it's why VR headsets that track position, like the Oculus Rift, have external tracking cameras.
Unfortunately, what you're trying to do is impossible without using the camera on your phone to track visual features in the scene and using those as the external reference point, which is a hell of a task better suited to a lab full of computer vision experts.
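That said, the video linked in the question looks like walking-in-place, which doesn't need true positional tracking: you only detect steps (spikes in accelerometer magnitude) and move the camera a fixed stride per step. A minimal Unity sketch; the thresholds are guesses you would have to tune against your own recordings:

```csharp
using UnityEngine;

// Walk-in-place locomotion sketch: treat a spike in acceleration magnitude
// as a step, and move the rig a fixed stride along the gaze direction.
public class StepLocomotion : MonoBehaviour
{
    public float stepThreshold = 1.3f;   // in g; magnitude is ~1 at rest
    public float minStepInterval = 0.3f; // seconds; rejects jitter
    public float strideLength = 0.7f;    // meters moved per detected step

    float lastStepTime;

    void Update()
    {
        float magnitude = Input.acceleration.magnitude;
        if (magnitude > stepThreshold && Time.time - lastStepTime > minStepInterval)
        {
            lastStepTime = Time.time;
            // Move along the gaze direction, keeping height fixed.
            Vector3 forward = Camera.main.transform.forward;
            forward.y = 0f;
            transform.position += forward.normalized * strideLength;
        }
    }
}
```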
Another possible but difficult way:
This may be possible if you connect the device to the internet and track its position via satellite (GPS, Google Maps, or something like that), but that is a very hard thing to do.
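For completeness, Unity does expose the phone's location service, though GPS accuracy (a few meters at best) makes it unusable for tracking walking at room scale. A minimal sketch, assuming location permission has been granted on the device:

```csharp
using System.Collections;
using UnityEngine;

public class GpsProbe : MonoBehaviour
{
    IEnumerator Start()
    {
        if (!Input.location.isEnabledByUser) yield break;

        Input.location.Start(1f, 0.1f); // desired accuracy / update distance, meters
        while (Input.location.status == LocationServiceStatus.Initializing)
            yield return new WaitForSeconds(1f);

        if (Input.location.status == LocationServiceStatus.Running)
        {
            var d = Input.location.lastData;
            Debug.Log($"lat {d.latitude} lon {d.longitude} ±{d.horizontalAccuracy} m");
        }
    }
}
```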