I'm developing a Universal Windows Application to stream the HoloLens camera and interact with it. Can you help?
First attack angle: creating a Unity project and building it as UWP.
I have successfully created the SharingServer from HoloToolkit-Unity and I can send custom messages to the HoloLens from my Unity app. But I can't get the video stream from the Device Portal into a classic Unity video player
using https://[username]:[password]@[HololensIP]/api/holographic/stream/live.mp4?holo=true&pv=true&mic=true&loopback=true
which returns WindowsVideoMedia error 0x80072f8f while reading [url].
I suppose the problem comes from the video type (it has no duration), but it is .mp4 H.264, which should be accepted. (0x80072f8f looks like a WinINet security/certificate error rather than a container problem, but I may be wrong.)
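For reference, here is roughly what the Unity-side attempt looks like (credentials and IP below are placeholders, and the component setup is just a minimal sketch):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal repro of the Unity-side attempt: point a VideoPlayer at the
// Device Portal live stream. Credentials and IP below are placeholders.
public class DevicePortalStream : MonoBehaviour
{
    public string streamUrl =
        "https://user:pass@192.168.0.42/api/holographic/stream/live.mp4?holo=true&pv=true&mic=true&loopback=true";

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = streamUrl;
        // Log the underlying error instead of failing silently.
        player.errorReceived += (vp, message) => Debug.LogError(message);
        player.Play();
    }
}
```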
Faced with that wall, I started over in Visual Studio, building a UWP app directly.
Second attack angle: a UWP app straight from VS2017.
There I can use WindowsDevicePortalWrapper to get the stream and display it in a MediaElement.
But right now I can't import SharingClient.dll into VS.
So I can't send data to my HoloLens from my UWP app.
Thank you for your help
Related
I'm working on a VR application (in Unity) for the Quest, one of whose features is the ability to record yourself from a camera. I have tried some assets like VR Capture (by RockVR), but they only work on certain platforms such as Windows or macOS. I have also tried capturing frames and then converting them to a .gif or .mp4 file, but that also requires a platform-specific library. Does anyone have a solution?
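In case it helps, here is a minimal sketch of the frame-grabbing approach I tried (the capture camera, resolution and output path are illustrative; turning the saved frames into an .mp4/.gif still needs a platform-specific encoder, which is exactly the part I'm stuck on):

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Renders a capture camera into a RenderTexture at a fixed rate and writes
// JPEG frames to persistentDataPath. Encoding the frames into a video
// afterwards still needs an external tool or library.
public class FrameRecorder : MonoBehaviour
{
    public Camera captureCamera;          // camera that films the player
    public int width = 1280, height = 720;
    public int frameRate = 30;

    private RenderTexture rt;
    private Texture2D readback;
    private int frameIndex;

    private IEnumerator Start()
    {
        rt = new RenderTexture(width, height, 24);
        readback = new Texture2D(width, height, TextureFormat.RGB24, false);
        captureCamera.targetTexture = rt;

        var wait = new WaitForSeconds(1f / frameRate);
        while (true)
        {
            yield return new WaitForEndOfFrame();

            // Read the rendered frame back to the CPU and save it as a JPEG.
            RenderTexture.active = rt;
            readback.ReadPixels(new Rect(0, 0, width, height), 0, 0);
            readback.Apply();
            RenderTexture.active = null;

            string path = Path.Combine(Application.persistentDataPath,
                                       $"frame_{frameIndex++:D5}.jpg");
            File.WriteAllBytes(path, readback.EncodeToJPG(75));

            yield return wait;
        }
    }
}
```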
I can already send data from the HoloLens (coded in C# in Unity) to a PC (also C#) over a socket connection. But how do I send video streaming in real time (the video should start recording when I open the application on the HoloLens) from the HoloLens to the PC over my original socket framework? In my view, maybe I should add some code to access the HoloLens camera, record video and encode it into bytes, then transmit the data over my existing socket. Is that right, and how do I realize it?
By the way, I hope the PC can receive the video in Python so that I can process it in the following steps.
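Here is a rough sketch of what I have in mind on the HoloLens/Unity side: grab camera frames, JPEG-encode them, and push them through the already-open socket as length-prefixed messages. Host, port and frame rate are placeholders, and the UWP app would need the webcam and internet-client capabilities in its manifest.

```csharp
using System;
using System.Net.Sockets;
using UnityEngine;

// Sketch only: grab camera frames, JPEG-encode them, and push them through an
// already-open TCP socket as length-prefixed messages.
public class CameraFrameSender : MonoBehaviour
{
    public string host = "192.168.0.10"; // hypothetical PC address
    public int port = 9000;
    public int frameRate = 10;

    private WebCamTexture cam;
    private Texture2D frame;
    private NetworkStream stream;
    private float nextSend;

    void Start()
    {
        cam = new WebCamTexture();
        cam.Play();
        stream = new TcpClient(host, port).GetStream();
    }

    void Update()
    {
        if (Time.time < nextSend || !cam.didUpdateThisFrame) return;
        nextSend = Time.time + 1f / frameRate;

        // The WebCamTexture reports its real size only after the first frame.
        if (frame == null || frame.width != cam.width || frame.height != cam.height)
            frame = new Texture2D(cam.width, cam.height, TextureFormat.RGB24, false);

        // Copy the current camera frame and compress it to JPEG.
        frame.SetPixels(cam.GetPixels());
        frame.Apply();
        byte[] jpg = frame.EncodeToJPG(60);

        // Length prefix so the receiver knows where each frame ends.
        byte[] len = BitConverter.GetBytes(jpg.Length);
        stream.Write(len, 0, len.Length);
        stream.Write(jpg, 0, jpg.Length);
    }
}
```

On the PC side the receiver would just read the 4-byte length, then that many bytes per frame, and decode each JPEG.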
To send video streaming in real time between the HoloLens and a PC client, WebRTC should meet your needs. Please check out the MixedReality-WebRTC project; it can help you integrate peer-to-peer real-time audio and video communication into your application. It also implements the local video capture you need and encapsulates it as a Unity3D component for rapid prototyping and integration.
You can read its official documentation via this link: MixedReality-WebRTC 1.0.0 documentation.
Moreover, this project can be used in desktop applications or even other non-mixed-reality applications, which can save you development costs.
My multiplayer VR project is based on Unity3D, Oculus Integration v1.39 and PUN2 running on Oculus Quest. I'm using the standard teleport script provided in the Oculus library. PhotonAvatarView is what I use to keep avatar position/rotation in sync across clients.
Unfortunately, when a player teleports to a new location, the other one doesn't detect any change in the remote avatar. It seems like PhotonAvatarView doesn't see the change in location of the user, which is really strange. How can I fix it?
PhotonAvatarView code is available at this URL: https://doc.photonengine.com/en-us/pun/current/demos-and-tutorials/oculusavatarsdk
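One workaround I'm considering (untested): as far as I can tell, PhotonAvatarView only streams Oculus Avatar SDK packets, which describe the avatar relative to its root, so a teleport of the root may never reach the remote client. The sketch below additionally syncs the root transform itself; the snap/lerp thresholds are arbitrary placeholders.

```csharp
using Photon.Pun;
using UnityEngine;

// Extra sync for the avatar root: attach to the networked avatar and add it to
// the PhotonView's list of observed components so teleports propagate.
[RequireComponent(typeof(PhotonView))]
public class AvatarRootSync : MonoBehaviourPun, IPunObservable
{
    private Vector3 targetPos;
    private Quaternion targetRot;

    void Start()
    {
        targetPos = transform.position;
        targetRot = transform.rotation;
    }

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            stream.SendNext(transform.position);
            stream.SendNext(transform.rotation);
        }
        else
        {
            targetPos = (Vector3)stream.ReceiveNext();
            targetRot = (Quaternion)stream.ReceiveNext();
        }
    }

    void Update()
    {
        if (photonView.IsMine) return;

        // Snap large jumps (teleports), smooth small corrections.
        if (Vector3.Distance(transform.position, targetPos) > 2f)
            transform.SetPositionAndRotation(targetPos, targetRot);
        else
        {
            transform.position = Vector3.Lerp(transform.position, targetPos, 10f * Time.deltaTime);
            transform.rotation = Quaternion.Slerp(transform.rotation, targetRot, 10f * Time.deltaTime);
        }
    }
}
```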
I'm developing an Android app where the user is in an ARCore Session and at the same time should be able to share the (raw, not augmented) camera video stream in a video call.
The used technologies are:
Unity 2018.1.6
Google ARCore Unity SDK package
WebRTC Video Chat Unity package
The required ARCore functionality and the WebRTC video chat work great, but only individually. I'm aware that having both simultaneously cannot work out of the box; in fact, having built a test app, I saw that starting the video call would stop the ARCore preview and vice versa, depending on which was started later.
So my questions are:
Is there a way to extract the video stream from ARCore?
How do I use this video stream as input for the WebRTC video call?
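For question 1, the closest I've found is ARCore's CPU image access; a sketch is below. The YUV-to-RGB conversion and the hand-off to the WebRTC Video Chat package as a custom/virtual camera source are left open, since I'm not sure what frame format that asset accepts.

```csharp
using System;
using System.Runtime.InteropServices;
using GoogleARCore;
using UnityEngine;

// Sketch: pull the raw CPU-side camera image out of the ARCore session every
// frame. Feeding it into the video call is left as a TODO because it depends
// on the WebRTC package's custom video input.
public class ArCoreFrameGrabber : MonoBehaviour
{
    private byte[] yPlane;

    void Update()
    {
        using (CameraImageBytes image = Frame.CameraImage.AcquireCameraImageBytes())
        {
            if (!image.IsAvailable)
                return;

            int width = image.Width;
            int height = image.Height;
            if (yPlane == null || yPlane.Length != width * height)
                yPlane = new byte[width * height];

            // Copy the luminance plane row by row (rows may be padded to YRowStride).
            for (int row = 0; row < height; row++)
            {
                IntPtr src = new IntPtr(image.Y.ToInt64() + (long)row * image.YRowStride);
                Marshal.Copy(src, yPlane, row * width, width);
            }

            // TODO: copy the U/V planes the same way, convert to RGB (or keep YUV),
            // and hand the frame to the video call as a virtual camera source.
        }
    }
}
```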
Hi, I am trying to create an iPhone app that streams a video from a remote server (which automatically creates the video files). I'm using Flash Builder 4.7 and mobile Flex.
My expertise with FB/Flex is not great, but I am getting there.
Using StageWebView I have made some progress and can stream a simple mp4 file. However, the files I want to stream have the wrong extension, so the iPhone's internal video player won't play them. I have no control over the remote server, so I can't change the MIME type or file extensions. The files are legitimate mp4 files; if I copy one and change the extension, it plays fine.
Does anybody have any idea how I can fool the iOS video player into playing them?
Any help gratefully appreciated. I have been working on this for weeks and it's driving me round the bend.
Cheers
Toby