Why do you need a separate encoder for streaming live video? - encoding

I have been searching for an API service that allows for browser-based video capturing and encoding for the purpose of live streaming. The company I work for needs an "all inclusive" API, but it does not seem to exist. The closest we have come are streaming services that allow a player to be embedded and the stream output to be linked to that player. These services always seem to require that you use separate software to encode your live video.
Are there copyrights held by YouTube and Vimeo that prevent others from creating these technologies? Are there limitations with cost and scale?

Live streaming is typically broken down into two categories:
First is video conferencing, where there is a limited number of participants. Here video quality typically doesn't matter. This is what browser-based broadcasting solutions are designed for.
Second is the large-audience case, where there is a single broadcaster with many viewers. Here separate encoding software is preferred because it is much more feature rich, allows for more options and controls, and allows for using good-quality cameras.
COVID-19 made popular new categories of broadcast conference calls and simple "one-to-many" broadcasts from laptops.
Not many companies have built end-to-end services for this use case, as significant demand for them has only existed for a few months, and it takes years to build something like this. When COVID is over, this market may dry up again.

Q: API service that allows for browser-based video capturing and encoding for the purpose of live streaming:
WebRTC
Q: Streaming player to be embedded, with the stream output linked to it:
HLS/DASH player in any standard browser
You can use a media gateway to convert from WebRTC to HLS/DASH (the one-to-many or broadcasting scenario):
Janus
Here is a diagram to illustrate the same (WebRTC publisher, media gateway, HLS/DASH players).
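To make the browser-side "capture + encode" half concrete, here is a minimal sketch. The browser itself does the encoding (VP8/H.264, Opus) inside RTCPeerConnection, and a media gateway such as Janus terminates the connection and repackages the stream as HLS/DASH. The STUN server and the signaling helper below are assumptions, not part of any gateway API.

```typescript
// Minimal browser-side publisher sketch: capture camera/mic and push the tracks
// through a WebRTC peer connection that a media gateway would terminate.
async function publishFromBrowser(): Promise<void> {
  // 1. Capture camera and microphone.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  // 2. Hand the tracks to a peer connection; the browser handles encoding and transport.
  const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.l.google.com:19302" }] });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // 3. Standard offer/answer exchange with the gateway over your own signaling channel.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const answerSdp = await sendOfferToGateway(offer.sdp ?? "");
  await pc.setRemoteDescription({ type: "answer", sdp: answerSdp });
}

// Placeholder signaling: how the SDP reaches the gateway depends entirely on the
// gateway's own API (Janus, for example, speaks a JSON protocol over HTTP or WebSocket).
async function sendOfferToGateway(sdpOffer: string): Promise<string> {
  throw new Error("implement signaling against your media gateway");
}
```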

Related

Stream a live video call between 2 people to thousands of people

How can I stream a live video call between 2 people to thousands of people? I would prefer to use WebRTC, but I can't find the answer to my question. The viewers should be able to watch the stream in a web app.
Streaming to thousands of people is not trivial! It's not as hard as it used to be 10 years ago, but it is still pretty hard.
WebRTC supports direct browser-to-browser (peer-to-peer) connections. This means that WebRTC is primarily targeted at 1:1 conversations. If you want the same conversation (video or audio) to be shared among more than 2 people, you have the following options:
Connect every user to every other user. This creates a fully connected graph between the viewers. It is easy to do because all you need is WebRTC - no special hardware or software. However, it is also very inefficient in terms of traffic and distribution and doesn't scale beyond 5-6 people.
Use a WebRTC video relay like Jitsi Videobridge. According to the official performance claims, Videobridge can scale to 500-1000 people given a fast and wide enough internet connection.
Direct the WebRTC stream between the two participants to a WebRTC-enabled streaming server. If needed, transcode the input stream to a suitable codec - H.264/VP8/VP9. Convert the input stream to a suitable protocol - RTMP/HLS/DASH. Distribute the content using the built-in functionality of the media server or by use of a CDN. Play the video on the client side with a player - Flowplayer/JW Player/Viblast Player/Video.js/your own custom player, or a combination of the above. This is the hardest solution, but it is also the best one in terms of scalability and platform reach. Such a solution can scale easily to thousands of people and reach all major browsers and mobile platforms.
I guess the third alternative is the one for you. You can read more about the whole capturing/publishing/transcoding/converting business in BlogGeek.me's great blog post.
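For the playback end of that third option, here is a minimal viewer-side sketch. hls.js is just one possible player library (the answer above lists several alternatives), and the stream URL is a placeholder for whatever your media server or CDN exposes.

```typescript
// Viewer side of option 3: play the HLS rendition that the media server/CDN publishes.
// hls.js is one example player library; Safari/iOS can play HLS natively without it.
import Hls from "hls.js";

function playLiveStream(videoEl: HTMLVideoElement, hlsUrl: string): void {
  if (videoEl.canPlayType("application/vnd.apple.mpegurl")) {
    // Native HLS support (Safari, iOS).
    videoEl.src = hlsUrl;
  } else if (Hls.isSupported()) {
    // Other browsers play HLS through Media Source Extensions via hls.js.
    const hls = new Hls();
    hls.loadSource(hlsUrl);
    hls.attachMedia(videoEl);
  }
  void videoEl.play();
}

// Usage with a placeholder URL:
// playLiveStream(document.querySelector("video")!, "https://cdn.example.com/live/stream.m3u8");
```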
A WebRTC-based peer-to-peer connection is not the right choice for one-to-n streaming. As there is no broadcast mechanism in WebRTC so far, you should consider another technique.

Does WebRTC chew up lots of bandwidth?

I am considering implementing Freshly Tilled Soil's jQuery WebRTC plugin for a site I am building. I've tested it and it works quite nicely... my only worry and question is whether this will eat up all my clients' bandwidth.
So, does anyone know how WebRTC compares to average site visits in terms of bandwidth?
I KNOW the standard is supposed to use as little bandwidth as possible, but I was hoping to hear from some developers who have used it on their sites.
WebRTC by itself is peer-to-peer, as mentioned by Hartley, and with the use of JavaScript libraries such as PeerJS, developers typically do not need a server.
However, the clients themselves will consume a lot of bandwidth in a multi-party video chat. For example, in a 5-way video chat, each client has to upload 4 streams to the other peers and download 4 streams from them.
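To put rough numbers on that, here is the full-mesh arithmetic; the per-stream bitrate is an assumed figure, since the real value depends on resolution, frame rate, and codec.

```typescript
// Back-of-the-envelope bandwidth for a full-mesh WebRTC call.
// perStreamKbps is an assumed per-stream bitrate, not a WebRTC constant.
function meshBandwidthKbps(participants: number, perStreamKbps: number) {
  const streamsPerClient = participants - 1; // each peer sends to and receives from every other peer
  return {
    uploadPerClient: streamsPerClient * perStreamKbps,
    downloadPerClient: streamsPerClient * perStreamKbps,
  };
}

// 5-way chat at an assumed 1000 kbps per stream:
// each client uploads and downloads 4 x 1000 = 4000 kbps (4 Mbps) in each direction.
console.log(meshBandwidthKbps(5, 1000));
```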

How to encode live broadcast of the Local FM radio stations

We are in the midst of the research stage for our upcoming web project. We would like to make a website that streams (all) the local FM radio stations.
While researching the right tools to set up the said website, several questions arise.
What software do we need to encode the live broadcast of (all) the local FM radio stations? How can we connect to the FM radio stations?
Do we need a Virtual Private Server to run the software from question number one, 24/7? Can a VPS do that, run software 24/7?
If we manage to encode the live broadcast of (all) the local FM radio stations, how do we send this to our website? Can we use a simple audio player such as a QuickTime/Flash or HTML5 audio player and embed it in our website?
I hope someone will help us on this matter. Your help is greatly appreciated. :)
Audio Capture
The first thing you need to do is set up an encoder source for your streams. I highly recommend putting the encoder at each radio station. The quality of FM radio isn't the greatest. You will get much better audio quality at the station. In addition, at least here in the US, many radio stations have all of their studios in the same place. It isn't uncommon to find 8 stations all coming from the same set of offices. Therefore, you may only have to install equipment in 3 or 4 buildings to cover all the stations in your market.
Most stations these days are using digital mixing. Buy a sound card that has a compatible digital input. AES/EBU and S/PDIF are common, and sound cards that support these are affordable.
If you must capture audio over the air, make sure you are using high quality receivers (digital where available), with a high quality outdoor antenna. There are a variety of receivers you can purchase, many of which mount directly in a rack.
Encoding
Now for the actual encoding, you need software. I've always had good luck with EdCast (if you can find the version prior to "EdCast Reborn"). SAM is a good choice for stations that have their own music library they need to manage, but I don't suggest it in your case. You can even use VLC for this part.
You will need to pick a good codec. If you want compatibility with HTML5, you will want to encode in MP3 and Ogg Vorbis. aacPlus is a good choice for saving bandwidth while still providing a decent audio quality. Most stations these days use aacPlus when possible, but not all browsers can play it, which is why you also need the other two. You can (and should) use multiple codecs per station.
Server Software
I highly recommend Icecast or SHOUTcast. They take your encoded audio and distribute it to listeners. They serve up an HTTP-like stream, which is generally compatible. If you are interested, I also do Icecast/SHOUTcast compatible hosting, with the goal of being compatible with more devices, particularly mobile.
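Because the server just speaks HTTP, you can also check it from code. The sketch below reads Icecast's status-json.xsl endpoint (available in Icecast 2.4 and later) to list the active mount points; the host name is a placeholder.

```typescript
// Quick check that the streaming server is up, and which mounts are live.
// Icecast 2.4+ exposes its status as JSON at /status-json.xsl; the host below is a placeholder.
async function listIcecastMounts(host: string): Promise<string[]> {
  const res = await fetch(`${host}/status-json.xsl`);
  const status = await res.json();
  // icestats.source is a single object when one mount is active, or an array for several.
  const sources = status?.icestats?.source ?? [];
  const list = Array.isArray(sources) ? sources : [sources];
  return list.map((s: { listenurl: string }) => s.listenurl);
}

// Usage (placeholder host):
// listIcecastMounts("https://stream.example.com:8000").then(console.log);
```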
Playback
Many stations these days use a player that tries HTML5, and falls back to Flash if necessary. jPlayer is a common choice, but there are many others. It is also good to provide a link to a playlist file containing the URL of your stream, so that users can listen in their own audio player if they choose.
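A minimal HTML5-only player is just an audio element with one source per codec; libraries like jPlayer add the Flash fallback on top of this. The mount URLs below are placeholders for the MP3 and Ogg Vorbis encodes described above.

```typescript
// Minimal HTML5 playback sketch for an Icecast/SHOUTcast stream with MP3 and Ogg Vorbis
// mounts, so the browser picks whichever codec it can decode.
function attachRadioPlayer(container: HTMLElement, mp3Url: string, oggUrl: string): HTMLAudioElement {
  const audio = document.createElement("audio");
  audio.controls = true;

  const ogg = document.createElement("source");
  ogg.src = oggUrl;        // e.g. "https://stream.example.com/station1.ogg" (placeholder)
  ogg.type = "audio/ogg";

  const mp3 = document.createElement("source");
  mp3.src = mp3Url;        // e.g. "https://stream.example.com/station1.mp3" (placeholder)
  mp3.type = "audio/mpeg";

  audio.append(ogg, mp3);  // the browser uses the first source it can play
  container.appendChild(audio);
  return audio;
}
```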

Are there any tools/scripts for analyzing/retrieving Flash/HTML5 video information/metadata

I want to play a YouTube video at a certain resolution, like 360p,
and capture the packets, and then extract the video from the packets,
and then I want to analyze/retrieve Flash/HTML5 video information/metadata from these videos.
BTW, are the videos still at the same resolution when they are extracted from the captured packets?
Note that these videos may not be complete.
Are there any good tools for analyzing/retrieving Flash/HTML5 video information/metadata,
like video bit rate, video resolution (like 360p, 480p), the audio/video codecs used, video size, and duration?
If the video is not complete, the information would ideally include the original video size, the actual video size, the original video length/duration, and the actual video length/duration.
I hope it is a script; if it is a tool, I hope it can be run from the shell on the command line because I want automation.
A paper says Perl could do this, but I don't know how.
Thanks!
(long comment, not a complete answer)
IANAL, but your goals may not fit the YouTube Terms of Service:
Section 4. C
You agree not to access Content through any technology or means other than the video playback pages of the Service itself, the Embeddable Player, or other explicitly authorized means YouTube may designate.
Section 4. H
You agree not to use or launch any automated system, including without limitation, "robots," "spiders," or "offline readers," that […] sends more request messages to the YouTube servers […] than a human can reasonably produce in the same period by using a conventional on-line web browser. Notwithstanding the foregoing, YouTube grants the operators of public search engines permission to use spiders to copy materials from the site for the sole purpose of and solely to the extent necessary for creating publicly available searchable indices of the materials, but not caches or archives of such materials. […]
You may be able to access the required information directly using the YouTube Data API. Here is a reference, and here is a list of directly supported programming languages. Perl will work as well, as the underlying data format is plain XML or JSON.
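As a sketch of what that looks like against the current v3 JSON endpoint of the Data API (the API key and video ID below are placeholders; you need your own key from the Google API console):

```typescript
// Minimal sketch: look up duration and definition metadata for a YouTube video through
// the Data API v3 JSON endpoint instead of scraping packets.
async function fetchVideoMetadata(videoId: string, apiKey: string) {
  const url =
    "https://www.googleapis.com/youtube/v3/videos" +
    `?part=contentDetails,snippet&id=${videoId}&key=${apiKey}`;
  const res = await fetch(url);
  const data = await res.json();
  const item = data.items?.[0];
  return {
    title: item?.snippet?.title,
    duration: item?.contentDetails?.duration,     // ISO 8601, e.g. "PT4M13S"
    definition: item?.contentDetails?.definition, // "sd" or "hd"
  };
}

// Usage (placeholder video ID and key), easy to wrap in a shell-driven script:
// fetchVideoMetadata("dQw4w9WgXcQ", "YOUR_API_KEY").then(console.log);
```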
You might also find these SO questions enlightening: "YouTube Player API: How to get duration of a loaded/cued video without playing it?" and "Youtube API get video duration from the XML".

Flash Playback and HTTP Live Streaming

I'm looking for a solution to provide streaming video to a variety of clients. I have iPhone clients as well as Flash-based clients. I'd like to not have to provide two separate mechanisms for delivering streaming content. Apple has decreed that HTTP Live Streaming is the way to provide streaming video to the iPhone (though it does carve out an exception for small progressive downloads).
My question: Are there examples of Flash implementations consuming HTTP Live Streaming content? What challenges might be faced if I were to try and implement such a player? Are there other technologies I should consider?
Thanks!
Not yet. Maybe never. But...
What you could do is stream from a Wowza Media Server, which will allow you to publish one stream that can be consumed by various clients, including both Apple client devices and Flash browser clients.