How to integrate an IP camera into Unity?

I'm trying to map a remote web camera view onto a Unity 3D GameObject.
I want to map a network IP camera as a webcam texture on my GameObject (a Plane).
I've already tried mapping my local camera as a webcam texture, and that works.
Is there a way to map a remote one?
Failing that, can I at least play remote video over the internet, like a YouTube video?
If that is possible, then an IP camera should not be a big deal.

Take a look at the VideoPlayer class introduced in Unity 5.6:
Content can be either a VideoClip imported asset or a URL such as file:// or http://.
The manual explicitly mentions network streaming:
Unity’s video features include the hardware-accelerated and software decoding of video files, transparency support, multiple audio tracks, and network streaming.
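
To make that concrete, here is a minimal sketch of playing a network stream on a Plane with VideoPlayer. The stream URL is a placeholder for whatever HTTP endpoint your camera exposes; note that VideoPlayer relies on the platform's decoders (e.g. MP4/WebM), so a camera that only serves MJPEG may need transcoding first.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch: play a network video stream on this GameObject's material.
// Attach to a Plane; the URL below is a hypothetical camera endpoint.
public class IpCameraView : MonoBehaviour
{
    public string streamUrl = "http://192.168.1.10:8080/stream.mp4"; // placeholder

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = streamUrl;
        player.renderMode = VideoRenderMode.MaterialOverride;   // draw onto this renderer
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.targetMaterialProperty = "_MainTex";
        player.isLooping = true;
        player.errorReceived += (vp, msg) => Debug.LogError(msg);
        player.Play();
    }
}
```

The same component works for a local file by swapping the URL for a file:// path, which is why the manual treats clips and URLs interchangeably.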

Related

How do I play a YouTube live broadcast in my Unity WebGL project?

Every plugin I've found for playing video runs into CORS cross-domain problems. Is there another way to implement YouTube live playback in WebGL? One option would be to set up a server that downloads the video and relays it, but that is too much traffic for the server.

HoloLens 2 audio stream from desktop

I'm currently developing an app for the HoloLens 2 that needs to stream audio from a desktop PC.
The idea is to send control information (position, orientation, etc.) to a Cycling '74 Max/MSP application running on a Windows 10 computer, which processes the audio for 3D playback. I then need to somehow stream the resulting sound to the Unity app running on the HoloLens. Both devices are on the same network.
At the moment I've achieved something using MRTK WebRTC for Unity in combination with a virtual cable as input. My issue is that this seems to be optimized for microphone use, as it applies options such as noise reduction and a smaller bandwidth. I can't find a way to set the WebRTC options to stream what I need (music) at better quality.
Does anyone know how to change that in MRTK WebRTC, or have a better solution for streaming the audio to the HoloLens?
The WebRTC project for Mixed Reality is deprecated, and it is designed for real-time communication; if your requirement is media consumption, you need a different workaround.
For dedicated media streaming, you can set up a DLNA server on your PC for media access.
You may also set up Samba or NFS on your PC if you need to access files in other formats.

Can you stream a video in AR with ARKit without saving it to the device?

I am trying to make an app that plays videos in AR on a flat surface. It is working but the videos take up too much space on the device. I was wondering if there was a way to stream the videos live with an internet connection in AR.
Yes there is!
This year, as part of RealityKit, Apple revealed something called video materials: a VideoMaterial can be loaded from a URL.
See the following:
https://maxxfrazer.medium.com/realitykit-videomaterials-66ad05f396f4

Is there a way to stream a Unity3D camera view out as a real camera output?

I am thinking of streaming out a Unity3D camera view as if it were a real camera (same output, streams, and options). I would need to do the following:
Encode the frames in one of: MJPEG, MxPEG, MPEG-4, H.264, H.265, H.264+, or H.265+.
Send metadata: string input/output.
I have not seen anything about streaming Unity camera views out, except for one question (Streaming camera view to website in unity?).
Would anyone know if this is possible, and if so, what basic outline should I follow?
Thank you for the feedback.
I would probably start with Keijiro's FFmpegOut plugin. I have a strong feeling FFmpeg allows streaming video via the command line, which is exactly what Keijiro drives in his plugin, so it should be relatively easy to modify it to stream instead of recording to disk: https://github.com/keijiro/FFmpegOut
You can also do it via ROS by creating a publisher and publishing the camera stream from Unity to a ROS topic :)
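
For a rough idea of what the FFmpeg route looks like, here is a hedged sketch (this is not Keijiro's code) that renders a camera into a RenderTexture and pipes raw frames into an external ffmpeg process over stdin, letting ffmpeg do the encoding and streaming. The resolution, frame rate, and UDP destination are placeholders, and ffmpeg is assumed to be on the PATH.

```csharp
using System.Collections;
using System.Diagnostics;
using UnityEngine;

// Sketch: push raw RGB24 frames from a Unity camera into ffmpeg's stdin;
// ffmpeg encodes them and streams MPEG-TS to a placeholder UDP address.
public class CameraStreamer : MonoBehaviour
{
    public Camera sourceCamera;
    public int width = 640, height = 480, frameRate = 30;

    RenderTexture rt;
    Texture2D frame;
    Process ffmpeg;

    void Start()
    {
        rt = new RenderTexture(width, height, 24);
        sourceCamera.targetTexture = rt;             // camera now renders off-screen
        frame = new Texture2D(width, height, TextureFormat.RGB24, false);

        ffmpeg = new Process();
        ffmpeg.StartInfo.FileName = "ffmpeg";
        ffmpeg.StartInfo.Arguments =
            $"-f rawvideo -pix_fmt rgb24 -s {width}x{height} -r {frameRate} -i - " +
            "-vf vflip -f mpegts udp://127.0.0.1:1234"; // placeholder destination
        ffmpeg.StartInfo.UseShellExecute = false;
        ffmpeg.StartInfo.RedirectStandardInput = true;
        ffmpeg.Start();

        StartCoroutine(PushFrames());
    }

    IEnumerator PushFrames()
    {
        var wait = new WaitForEndOfFrame();
        while (true)
        {
            yield return wait;                        // camera has rendered by now
            RenderTexture.active = rt;
            frame.ReadPixels(new Rect(0, 0, width, height), 0, 0);
            RenderTexture.active = null;

            byte[] raw = frame.GetRawTextureData();   // blocking copy; fine for a sketch
            var stdin = ffmpeg.StandardInput.BaseStream;
            stdin.Write(raw, 0, raw.Length);
            stdin.Flush();
        }
    }

    void OnDestroy()
    {
        if (ffmpeg != null && !ffmpeg.HasExited) ffmpeg.Kill();
    }
}
```

In a real implementation you would replace the blocking ReadPixels with AsyncGPUReadback and point ffmpeg at whatever protocol your consumer expects (RTMP, MPEG-TS over UDP, or RTSP via an intermediary server).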

How do media browser plugins function?

If I want to use Google Video Chat in my browser, I have to download and install a plugin for it to work. I would like to make a piece of software that creates some interactions with a video displayed in the browser.
I assume it might be problematic to do this with one solution for all browsers, so if I need to focus on only one, let's talk about Firefox, although I think the Firefox Add-on SDK would not let me do something as complex as video interaction.
But how does the Google Video Chat plugin work in the browser? It's only an example of one of those plugins that let you do things with your browser (media, in this case) which are normally impossible.
As I understand it, Google Video Chat uses Flash.
I'm looking for something official-looking to back that up now...
Edit: I think this explains it pretty well.
Flash Player exposes certain audio/video functions to the (SWF) application, but it does not give the application access to the raw real-time audio/video data. The relevant ActionScript API classes and methods are: the Camera class, which lets you capture video from your camera; the Microphone class, which lets you capture audio from your microphone; the NetConnection/NetStream classes, which let you stream video from Flash Player to a remote server and vice versa; and the Video class, which renders video either captured by a Camera or received on a NetStream. Given these, video displayed in Flash Player must be either captured by a Camera object or received from a remote server on a NetStream. Luckily, ActionScript lets you choose which Camera to use for capture.
When the Google plugin is installed, it exposes itself as two Camera devices (actually virtual device drivers) called 'Google Camera Adaptor 0' and 'Google Camera Adaptor 1', which you can see in the Flash Player settings when you right-click on the video. One of the devices is used to display the local video and the other to display the remote participant's video. The Google plugin also implements the full networking protocol and stack, which I think are based on the GTalk protocol; in particular, it implements XMPP with the (P2P) Jingle extension, and UDP-based media transport for real-time audio/video. The audio path is completely independent of the Flash Player.

In the video path, the plugin captures video from the actual camera device installed on your PC and sends it to the Flash Player via one of the virtual camera device drivers; it also encodes that video and sends it to the remote user. In the reverse direction, it receives video (over UDP) from the remote user and gives it to the Flash Player via the second virtual camera device driver. The SWF application running in the browser creates two Video objects and attaches them to two Camera objects, one for each virtual video device, instead of attaching them to your real camera device. This way, the SWF application can display both the local and the remote video in the Flash application.