How can I stream a live video call between 2 people to thousands of people? I prefer to use WebRTC, but I can't find the answer to my question. The viewers should be able to watch the stream in a web app.
Streaming to thousands of people is not trivial! It's not as hard as it used to be 10 years ago but is still pretty hard.
WebRTC supports direct browser-to-browser (peer-to-peer) connections. This means that WebRTC is primarily targeted at 1:1 conversations. If you want the same conversation (video or audio) to be shared among more than 2 people, you have the following options:
Connect every user to every other user. This creates a fully connected graph between the viewers. It is easy to do because all you need is WebRTC; no special hardware or software. However, it is also very inefficient in terms of traffic and distribution, and it doesn't scale beyond 5-6 people.
Use a WebRTC video relay like the Jitsi Videobridge. According to the official performance claims, the Videobridge can scale to 500-1000 people given a fast and wide enough internet connection.
Direct the WebRTC stream between the two participants to a WebRTC-enabled streaming server. If needed, transcode the input stream to a suitable codec (H.264/VP8/VP9). Convert the input stream to a suitable protocol (RTMP/HLS/DASH). Distribute the content using the built-in functionality of the media server or via a CDN. Play the video on the client side with a player (Flowplayer/JW Player/Viblast Player/Video.js/your own custom player) or a combination of the above. This is the hardest solution, but it is also the best one in terms of scalability and platform reach: it can scale easily to thousands of people and reach all major browsers and mobile platforms.
I guess the third alternative is the one for you. You can read more about the whole capturing/publishing/transcoding/converting business in BlogGeek's great blog post.
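To make that third option concrete, here is a minimal sketch of the browser (publisher) side. It captures the camera and microphone and hands the WebRTC stream to a media server for transcoding and HLS/DASH distribution; the /publish endpoint and its JSON answer format are assumptions for illustration, not a standard API:

```typescript
// Publisher sketch: capture camera/mic and send the WebRTC stream
// to a media server that will transcode and redistribute it.
// The /publish endpoint and its JSON body are hypothetical.
async function publishToMediaServer(serverUrl: string): Promise<RTCPeerConnection> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  const pc = new RTCPeerConnection();
  stream.getTracks().forEach(track => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Send our SDP offer to the (hypothetical) server and apply its answer.
  const response = await fetch(`${serverUrl}/publish`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sdp: pc.localDescription!.sdp }),
  });
  const { sdp } = await response.json();
  await pc.setRemoteDescription({ type: "answer", sdp });

  return pc;
}
```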
A WebRTC-based peer-to-peer connection is not the right choice for one-to-n streaming. As there is no broadcast mechanism in WebRTC so far, you should consider another technique.
I have been searching for an API service that allows for browser-based video capturing and encoding for the purpose of live streaming. The company I work for needs an "all-inclusive" API, but it does not seem to exist. The closest we have come is streaming services that allow a player to be embedded and the stream output to be linked to that player. These services always seem to require that you use separate software to encode your live video.
Are there copyrights held by YouTube and Vimeo that prevent others from creating these technologies? Are there limitations with cost and scale?
Live streaming is typically broken down into two categories:
First is video conferencing, where there is a limited number of participants. Here video quality typically doesn't matter. This is what browser-based broadcasting solutions are designed for.
Second is a large audience, where there is a single broadcaster with many viewers. Here separate encoding software is preferred because it is much more feature-rich, allows for more options and controls, and allows for using good-quality cameras.
COVID-19 popularized new categories: broadcast conference calls and simple one-to-many broadcasts from a laptop.
Not many companies have built end-to-end services for this use case, as significant demand for them has only existed for a few months, and it takes years to build something like this. When COVID is over, this market may dry up again.
Qs: API service that allows for browser-based video capturing and encoding for the purpose of live streaming:
WebRTC
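For browser-based capture specifically, the getUserMedia API is the starting point. A minimal sketch that previews the local camera and returns the stream, ready to be fed into an RTCPeerConnection:

```typescript
// Capture camera and microphone in the browser and preview locally.
// The returned stream can be attached to an RTCPeerConnection.
async function captureLocalMedia(videoElement: HTMLVideoElement): Promise<MediaStream> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: 1280, height: 720 }, // resolution is just an example
    audio: true,
  });
  videoElement.srcObject = stream;
  await videoElement.play();
  return stream;
}
```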
Qs: A streaming player to be embedded, with the stream output linked to it:
HLS/DASH Player on Any Standard Browser
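As a sketch of the playback side: Safari can play HLS natively through the video element, while other browsers typically use the open-source hls.js library on top of Media Source Extensions:

```typescript
import Hls from "hls.js";

// Play an HLS stream in a standard browser.
function playHls(video: HTMLVideoElement, manifestUrl: string): void {
  if (Hls.isSupported()) {
    // Most browsers: hls.js on top of Media Source Extensions.
    const hls = new Hls();
    hls.loadSource(manifestUrl);
    hls.attachMedia(video);
    hls.on(Hls.Events.MANIFEST_PARSED, () => void video.play());
  } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Safari: native HLS support.
    video.src = manifestUrl;
    void video.play();
  }
}
```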
You can have a media gateway to convert from WebRTC to HLS/DASH (for the one-to-many or broadcasting scenario):
Janus
I am considering implementing Freshly Tilled Soil's jq webrtc plugin for a site I am building. I've tested it and it works quite nicely... my only worry and question is whether this will eat up all my clients' bandwidth.
Does anyone know how WebRTC's bandwidth usage compares to an average site visit?
I KNOW the standard is supposed to use as little bandwidth as possible, but I was hoping to hear from some developers who have used it on their sites.
WebRTC by itself is peer-to-peer, as mentioned by Hartley, and with the use of JavaScript libraries such as PeerJS, developers typically do not need to run their own media server.
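As a rough sketch of that flow with PeerJS (the API names below follow its documented surface; the stream wiring is simplified):

```typescript
import Peer, { MediaConnection } from "peerjs";

// Minimal 1:1 PeerJS call. PeerJS brokers signaling through its
// hosted PeerServer, so no custom server code is needed here.
function startCall(remotePeerId: string, localStream: MediaStream,
                   onRemote: (s: MediaStream) => void): void {
  const peer = new Peer(); // gets a random id from the PeerServer cloud

  // Call out once we are registered with the signaling server.
  peer.on("open", () => {
    const call: MediaConnection = peer.call(remotePeerId, localStream);
    call.on("stream", onRemote);
  });

  // Answer incoming calls with our own stream as well.
  peer.on("call", call => {
    call.answer(localStream);
    call.on("stream", onRemote);
  });
}
```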
However, the clients themselves will consume a lot of bandwidth if you are running a multi-party video chat. For example, in a 5-way video chat, each client has to upload 4 streams to the other peers and download 4 streams from them.
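The traffic therefore grows linearly per client and quadratically overall. A quick back-of-the-envelope calculation (the per-stream bitrate is an assumption):

```typescript
// Rough per-client bandwidth estimate for an n-way full-mesh call:
// each client uploads and downloads (n - 1) streams.
function meshBandwidthMbps(participants: number, perStreamMbps = 1): { up: number; down: number } {
  const streams = participants - 1;
  return { up: streams * perStreamMbps, down: streams * perStreamMbps };
}

// 5-way call at ~1 Mbps per stream: 4 Mbps up and 4 Mbps down per client.
console.log(meshBandwidthMbps(5));
```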
How do big websites deliver media content to a large number of users? Are there any video/audio streaming frameworks available? Do they store the big video clips in databases?
I know this is not a specific question; I am not looking for a particular website, just wondering what the most common approach is.
CDNs. Content Delivery Networks. There are a few free options you can go with, but the best ones you pay for. With CDNs you can deliver your media through multiple edge nodes, over both TCP and UDP, utilizing both unicast and multicast.
I'm looking for a solution to provide streaming video to a variety of clients. I have iPhone clients as well as Flash-based clients. I'd like to not have to provide two separate mechanisms for delivering streaming content. Apple has decreed that HTTP Live Streaming is the way to provide streaming video to the iPhone (though it does carve out an exception for small progressive downloads).
My question: Are there examples of Flash implementations consuming HTTP Live Streaming content? What challenges might be faced if I were to try and implement such a player? Are there other technologies I should consider?
Thanks!
Not yet. Maybe never. But...
What you could do is stream from a Wowza Media Server, which will allow you to publish one stream that can be consumed by various clients, including both Apple client devices and Flash browser clients.
I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration. But I don't know how to start designing this task. There are already a lot of tutorials on how to stream video TO the iPhone, but that is not what I am looking for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save it to a video file. See the AV Foundation Guide on how to do that. Once saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open-source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are lots of HTTP servers out there you could adapt (see the sketch below).
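The serving part is the easy bit: HLS is just static files over HTTP, so any web server works as long as the MIME types are right. A minimal Node.js sketch (the segment directory is an assumption; a production setup would also need path sanitization):

```typescript
import * as http from "http";
import * as fs from "fs";
import * as path from "path";

// Serve HLS playlists (.m3u8) and segments (.ts) from a local directory.
const SEGMENT_DIR = "/var/hls"; // assumed output directory of the segmenter
const MIME: Record<string, string> = {
  ".m3u8": "application/vnd.apple.mpegurl",
  ".ts": "video/mp2t",
};

http.createServer((req, res) => {
  const filePath = path.join(SEGMENT_DIR, path.normalize(req.url ?? "/"));
  const type = MIME[path.extname(filePath)];
  if (!type || !fs.existsSync(filePath)) {
    res.writeHead(404);
    res.end();
    return;
  }
  // Playlists change constantly during a live stream, so disable caching.
  res.writeHead(200, { "Content-Type": type, "Cache-Control": "no-cache" });
  fs.createReadStream(filePath).pipe(res);
}).listen(8080);
```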
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264. For that you want ffmpeg. Basically you shove the images into ffmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live H.264 stream. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need to have an HTTP server serving that stream.
What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open-source RTP implementations, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.