How to stream video from a Raspberry Pi to Elixir and to the web

I'm trying to stream high-quality live video from a Raspberry Pi 3 with the camera module. I want to publish the stream on a public webpage (using video.js, I think). I also want an Elixir server (on the same LAN as the Pi) to consume the stream to detect faces and motion in it. I am a video streaming noob.
What's the best way to do this? Specifically:
What transport mechanism should I use on the Pi? RTP? WebRTC? Is that even a transport mechanism?
How do I pull images out of a (whatever transport mechanism I used above) stream in Elixir? Is there a library that does it? (See the sketch below for the kind of per-frame processing I mean.)
If I want to support more simultaneous users than the Pi can handle, what's the right way to proxy the stream through Phoenix?
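For context, here is the kind of per-frame processing I mean, sketched in Python with OpenCV rather than Elixir just to show the shape of it (the RTSP URL is a hypothetical placeholder; on the Elixir side I imagine a Port wrapping ffmpeg, or a library like the Membrane framework, filling this role):

    import cv2  # OpenCV built with FFmpeg support can read RTSP streams directly

    # Hypothetical URL: assumes the Pi publishes its H.264 camera feed over
    # RTSP (e.g. via an RTSP server on the Pi); adjust to your setup.
    cap = cv2.VideoCapture("rtsp://raspberrypi.local:8554/cam")

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    while True:
        ok, frame = cap.read()  # one decoded BGR frame per iteration
        if not ok:
            break               # stream ended or network hiccup
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        print(f"{len(faces)} face(s) in this frame")

    cap.release()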
Thanks!

Related

How to transmit streaming video from HoloLens (C#) to PC (Python) by socket communication?

I can already send data from the HoloLens (a Unity app written in C#) to a PC (also C#) by socket communication. But how do I send streaming video in real time (the video starts being recorded when I open the application on the HoloLens) from the HoloLens to the PC over my existing socket framework? In my view, maybe I should add some code to access the HoloLens camera, record video, and encode the video into data, then transmit that data over my existing socket. Is that right, and how do I realize it?
By the way, I hope the PC can receive the video in Python so that I can process the video in the following steps.
To send streaming video in real time between a HoloLens and a PC client, WebRTC should meet your needs. Please check out the MixedReality-WebRTC project; it can help you integrate peer-to-peer real-time audio and video communication into your application. It also implements the local video capture you need and encapsulates it as a Unity3D component for rapid prototyping and integration.
You can read its official documentation via this link: MixedReality-WebRTC 1.0.0 documentation.
Moreover, this project can be used in desktop applications or even other non-mixed-reality applications, which can save you development costs.
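Since the PC side should be Python, here is a minimal sketch of receiving the video track with aiortc (a Python WebRTC implementation). The signaling below uses aiortc's simple TCP helper only as a stand-in; the HoloLens app would need a matching signaler, so treat the host/port and the offer/answer flow as assumptions:

    import asyncio
    from aiortc import RTCPeerConnection
    from aiortc.contrib.signaling import TcpSocketSignaling

    async def consume(track):
        while True:
            frame = await track.recv()              # one decoded video frame
            img = frame.to_ndarray(format="bgr24")  # numpy array, ready for OpenCV etc.
            print("frame", img.shape)

    async def run():
        # Stand-in signaling: listens on TCP 9999 for the offerer's SDP.
        signaling = TcpSocketSignaling("0.0.0.0", 9999)
        pc = RTCPeerConnection()

        @pc.on("track")
        def on_track(track):
            if track.kind == "video":
                asyncio.ensure_future(consume(track))

        await signaling.connect()
        offer = await signaling.receive()           # expect the remote SDP offer
        await pc.setRemoteDescription(offer)
        await pc.setLocalDescription(await pc.createAnswer())
        await signaling.send(pc.localDescription)
        await asyncio.Future()                      # run until interrupted

    asyncio.run(run())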

How do I record video from a remote webrtc camera?

I have a Raspberry Pi which has WebRTC via UV4L. It is awesome! I want to record the video from the camera on a server. It's your basic surveillance camera setup... central Linux server with lots of storage space, remote IP cameras, etc. I've read dozens of pages and still can't figure it out. I tried all this Kurento mumbo jumbo but it's all wretch and no vomit. It never gets there. What's the command to grab the RPi video and dump it to disk? Please help!!!
UV4L already supports audio+video recording on the server (as well as on the client) if you use it with the Janus WebRTC gateway. Have a look at the bottom of the Video Conference DEMO OS page for more details. At the moment you will have to use the REST API to log into a Janus room and turn recording on or off. The REST API is ideal if you want to control UV4L from a custom application, but there is also a panel which allows you to dynamically send REST requests to the UV4L server.
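As a rough illustration of driving that REST API from a custom application (the endpoint paths and JSON fields below are hypothetical placeholders, shown only for shape; check the UV4L/Janus demo pages for the actual requests):

    import requests

    UV4L = "http://raspberrypi.local:8080"  # assumed address of the UV4L server

    # Hypothetical endpoints/fields: join a Janus room, then toggle recording.
    # Consult the UV4L documentation for the real REST paths and payloads.
    requests.post(f"{UV4L}/janus/join", json={"room": 1234, "username": "recorder"})
    requests.post(f"{UV4L}/janus/record", json={"room": 1234, "record": True})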

passing real-time metadata synchronized with video frames using webrtc and text tracks

I'm using WebRTC (win c++ native client) to broadcast real-time video to peers (hosted in Chrome).
Goal: send metadata along with each video frame (the metadata changes at frame level).
Would it be possible to send the metadata within a text track to be consumed by a javascript at the peer side?
If not, is there an alternative way of synchronizing WebRTC real-time video with metadata? E.g., using a WebRTC DataChannel or WebSockets?
I guess you need this feature:
https://webrtc.googlesource.com/src/+/77c8e65b88c9d2d95442b66ada504e0f1c553d66
"Update multiplex encoder to support having augmenting data attached to the video"
No.
At the moment, the WebRTC implementation (and specification) comes with no way to synchronize data with video on a frame-by-frame basis. This is something being looked at for a future WebRTC specification.
There are vendors who offer such a capability in their SDKs, but it is limited to their native SDKs and doesn't work in their browser-based JS SDKs.
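If the looser DataChannel route from the question is acceptable, the usual trick is to stamp each metadata message with the capture timestamp of the frame it describes and pair them on the receiving side by nearest timestamp. A transport-agnostic sketch of that pairing logic (all names here are made up for illustration; real code would feed it from the DataChannel's message handler and from whatever per-frame hook the receiver exposes):

    import bisect
    from itertools import count

    class MetadataSync:
        # Pairs metadata messages with video frames by capture timestamp.
        # Assumes the sender stamps each DataChannel message with the same
        # clock it stamps frames with (e.g. capture time in milliseconds).
        def __init__(self, tolerance_ms=20):
            self.tolerance = tolerance_ms
            self.pending = []          # sorted: (ts_ms, tiebreak, metadata)
            self._tiebreak = count()   # keeps tuples comparable on equal ts

        def on_metadata(self, ts_ms, meta):
            bisect.insort(self.pending, (ts_ms, next(self._tiebreak), meta))

        def on_frame(self, ts_ms):
            # Closest buffered metadata to this frame's timestamp, if any.
            i = bisect.bisect_left(self.pending, (ts_ms,))
            near = self.pending[max(0, i - 1):i + 1]
            if not near:
                return None
            best = min(near, key=lambda item: abs(item[0] - ts_ms))
            if abs(best[0] - ts_ms) <= self.tolerance:
                self.pending.remove(best)
                return best[2]
            return None  # nothing close enough (reordering or loss)

This is best-effort synchronization, not frame-accurate: it tolerates reordering within the tolerance window, but metadata whose frame never arrives should eventually be evicted from the buffer.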

raspberry pi video streaming through SPI

I'm somewhat of a beginner with GStreamer.
I am trying to stream live video from a Raspberry Pi to an MSP430 MCU over an SPI connection.
What I am thinking right now is to get the buffers directly on the Raspberry Pi using appsink, and then send them through the SPI connection.
But I am not sure where the buffers are kept, or how they are stored.
I'm looking for examples using appsink, but I'm not really sure whether I can continuously get a stream of buffers.
Is there any way that I can do that?
Any other, better way to stream video over an SPI connection would also be helpful.
Thank you.
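A minimal sketch of continuously pulling buffers with appsink, using GStreamer's Python bindings (videotestsrc stands in for whatever camera element is available on the Pi, e.g. v4l2src; the SPI hand-off is only indicated in a comment, since it depends on your driver, e.g. spidev):

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import GLib, Gst

    Gst.init(None)

    # videotestsrc stands in for your real camera element; small frames
    # keep the data rate realistic for an SPI link.
    pipeline = Gst.parse_launch(
        "videotestsrc ! video/x-raw,width=160,height=120 ! "
        "appsink name=out emit-signals=true max-buffers=2 drop=true")
    appsink = pipeline.get_by_name("out")

    def on_new_sample(sink):
        sample = sink.emit("pull-sample")          # take the next buffer
        buf = sample.get_buffer()
        data = buf.extract_dup(0, buf.get_size())  # raw frame bytes
        # Hand `data` to your SPI driver here (e.g. spidev), probably in
        # chunks sized to the MSP430's receive buffer.
        print(f"got {len(data)} bytes")
        return Gst.FlowReturn.OK

    appsink.connect("new-sample", on_new_sample)
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()                          # keep the pipeline running

Here max-buffers=2 with drop=true keeps appsink from queueing frames faster than SPI can drain them; dropping stale frames is usually the right behavior for live video.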

How is a webcam stream usually done?

I am currently doing a small project, and one of the components is to capture the webcam stream from one side and send it to the other (client --> server). Right now I have the stream from the server as bytes, and as far as I know I should transfer these bytes using UDP. My question is how to do that:
Should it be enclosed in a file and then transferred?
Should I transfer the raw bytes?
Should I create a buffer at the client side and show it on the screen when it gets full?
In short, I would like to know how to implement the transfer of the stream from the server to the client (I need just one side); a sketch of what I mean follows.
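For the raw-bytes-over-UDP variant described above, a minimal Python sketch of the sender side (the address, header layout, and chunk size are assumptions for illustration; UDP datagrams must stay well under 64 KB, so each frame is split into numbered chunks the receiver can reassemble or discard on loss):

    import socket
    import struct

    MAX_CHUNK = 60000  # keep each datagram safely under the ~64 KB UDP limit
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_frame(frame_bytes, frame_id, addr=("127.0.0.1", 5005)):
        # Split one encoded frame into numbered chunks so the receiver can
        # reassemble a frame, or drop it whole when a chunk goes missing.
        chunks = [frame_bytes[i:i + MAX_CHUNK]
                  for i in range(0, len(frame_bytes), MAX_CHUNK)]
        for n, chunk in enumerate(chunks):
            header = struct.pack("!IHH", frame_id, n, len(chunks))  # id, index, total
            sock.sendto(header + chunk, addr)

In practice you would send compressed frames (e.g. JPEG or H.264) rather than raw pixels, both to fit more frames through the link and to make the occasional lost datagram cheaper.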
You can stream a webcam to a client in multiple ways.
Use Windows Media Server / Flash Media Server. Push your webcam to the server with Windows Media Encoder or Flash Media Encoder, and use the server's live link for playback on the client (Windows/web).
Use Windows Media Encoder to stream your webcam to anyone without a server involved. When your encoder starts, you will get a URL to view your stream, which you can use for playback on the client (Windows/web).
Use a third-party streaming service, which gives you a publishing point to publish your webcam stream and a link to play it back on the client (Windows/web). (Check Brightcove or Mogulus by Livestream.)
Hope this helps.