Live Streaming - WebRTC to RTMP - Facebook

I am working on a live streaming project and came across many services like Wowza, Dacast, Ant, etc. The one that suits my requirements uses the RTMP protocol, so I would have to use encoding software like OBS to publish the stream. However, I actually want to publish the stream from the browser/iOS/Android.
I came across this FB presentation, and it seems they are using the RTMP protocol. Facebook is successfully broadcasting from the browser somehow.
Can I get some insight into how things work with Facebook and similar RTMP-based live streaming apps? Thanks in advance.

Facebook supports RTMP ingest of video (used by people who utilize the Live API), as well as WebRTC ingest for browser clients.
RTMP is not used as a distribution protocol. For that, there is DASH.
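As a rough illustration of the browser side, here is a minimal TypeScript sketch of WebRTC publishing, assuming a WHIP-style HTTP signaling endpoint (the URL is a placeholder; Facebook's actual ingest signaling is proprietary and not public):

    // Capture the webcam/mic and publish via WebRTC.
    // The endpoint is hypothetical: a WHIP-style POST of an SDP offer
    // that answers with the server's SDP.
    async function publishCamera(endpoint: string): Promise<RTCPeerConnection> {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      const pc = new RTCPeerConnection();
      stream.getTracks().forEach((track) => pc.addTrack(track, stream));

      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);

      // Send our SDP offer; the server replies with its SDP answer.
      const res = await fetch(endpoint, {
        method: "POST",
        headers: { "Content-Type": "application/sdp" },
        body: offer.sdp,
      });
      await pc.setRemoteDescription({ type: "answer", sdp: await res.text() });
      return pc;
    }

    publishCamera("https://example.com/whip").catch(console.error);

On the ingest side, a media server terminates the WebRTC session and can repackage the stream for distribution (e.g. as DASH, per the above).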

Related

How to create my own custom RTMP server from scratch and stream to multiple destinations

I am using StreamYard to live stream to multiple destinations like Facebook and YouTube. I want to create a mobile app using Flutter that can receive that stream and use it (I mean, display the live stream). StreamYard only supports RTMP for a custom destination.
My question is: is there a way to create and host a custom RTMP server online that can be shared between the StreamYard platform and my mobile app? I want it to work so that whenever I live stream from StreamYard, it is shared to Facebook, YouTube, and my mobile app as well. I have done a lot of research, but the only way I found is to use a Windows or Linux machine as the host, whereas I want it hosted online.
Alternative solutions are also welcome, such as using WebRTC.
Because you use StreamYard, I think you need to use the INVITE feature to start a video chat and then convert it to a live stream. It works like below:

UserA --WebRTC-->--+
                   +-->- StreamYard -->--RTMP-->- YouTube/Twitch.tv
UserB --WebRTC-->--+
You need to buy a paid plan to enable Custom RTMP destinations, so that you can publish the RTMP stream to your own media server, such as SRS or Nginx; then you can broadcast to multiple destinations, like this:

                     +->-- YouTube/Twitch.tv
                     |
StreamYard ->-RTMP---+->- Custom RTMP destination --+--RTMP----> YouTube/Twitch.tv
                          (SRS/Nginx media server)  |
                                                    +--HLS/FLV--> Flutter App
Note: once the stream reaches your RTMP server or a video cloud platform, you can convert it to HLS/HTTP-FLV for your Flutter app to play. About players and protocols, please read here. It depends on which parts you want to build yourself, and it's possible to build them from open-source projects.
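For the playback side, here is a minimal web sketch using the open-source hls.js library (the stream URL is a placeholder; in Flutter you would use a player plugin instead, but the protocol choice is the same):

    import Hls from "hls.js";

    // Play the HLS stream produced by the media server. Safari plays HLS
    // natively; other browsers need Media Source Extensions via hls.js.
    const video = document.querySelector("video") as HTMLVideoElement;
    const src = "https://my-server.example.com/live/stream.m3u8"; // placeholder

    if (Hls.isSupported()) {
      const hls = new Hls();
      hls.loadSource(src);
      hls.attachMedia(video);
    } else {
      video.src = src; // native HLS (Safari/iOS)
    }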
Note: you could use StreamYard to stream to YouTube and a custom RTMP server at the same time, or use FFmpeg to pull the stream from your custom server and republish it to any other live streaming platform.
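For the FFmpeg route, a minimal sketch (Node.js/TypeScript spawning the ffmpeg CLI; both rtmp:// URLs are placeholders for your own server and stream key):

    import { spawn } from "node:child_process";

    // Pull the stream from our own RTMP server and republish it to another
    // platform without re-encoding (-c copy keeps the original codecs).
    const source = "rtmp://my-server.example.com/live/stream"; // placeholder
    const dest = "rtmp://a.rtmp.youtube.com/live2/STREAM-KEY";  // placeholder

    const ffmpeg = spawn("ffmpeg", ["-i", source, "-c", "copy", "-f", "flv", dest], {
      stdio: "inherit", // surface ffmpeg's progress output
    });

    ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));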
In this solution, StreamYard actually acts as a video chat or video conferencing platform, like Zoom: it transcodes each WebRTC stream and mixes all the audio and video into one RTMP stream.
So you could use a WebRTC server to build your own StreamYard-like service, then use FFmpeg to transcode and mix the streams, but that is off topic, so let me stop here.

Is it possible to push media to Azure from a web browser?

1) I'm researching technology I can use for a browser application that streams video. It should capture video from the webcam and push it to a service where it is stored and can be watched later. One of the (possible?) options is Azure Media Services, but after a quick look at the documentation, it seems it's not possible with a pure modern browser, without plugins. Am I correct? If not, can you please share links to GitHub projects or example code to look at?
2) Another possible technology option is Amazon Kinesis Video Streams (looks like the best solution I have come up with so far), but maybe you can recommend some other cloud services?
Thanks!
Currently the short answer is no.
WebRTC is the right solution for broadcasting from a browser. It is the only live streaming protocol that is "somewhat" widely supported in modern browsers such as the latest Chrome.
AMS does not yet support receiving WebRTC. We only support RTMP and Smooth Streaming ingest right now (chunked MP4).
As far as I'm aware, Kinesis also expects you to send chunked MKV (like chunked MP4, but a less popular container format), which would need a browser plugin or JavaScript library to support. I don't see any Producer library from them in JavaScript.
WebRTC is your answer, but to catch that stream in the cloud, you may need to look at other solutions that run in an Azure Container. There are a bunch of third-party solutions out there for WebRTC.
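To show what plugin-free browser capture looks like today, here is a minimal TypeScript sketch using the MediaRecorder API to record the webcam and upload chunks over HTTP (the /upload endpoint is hypothetical; MediaRecorder emits WebM rather than the chunked MP4/MKV these services ingest, which is exactly the gap described above):

    // Capture the webcam and POST each recorded chunk to a server.
    // A server-side repackaging step would still be needed before
    // handing the media to AMS or Kinesis.
    async function captureAndUpload(endpoint: string): Promise<void> {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });

      recorder.ondataavailable = async (event: BlobEvent) => {
        if (event.data.size === 0) return;
        await fetch(endpoint, { method: "POST", body: event.data });
      };

      recorder.start(2000); // emit a chunk every 2 seconds
    }

    captureAndUpload("https://example.com/upload").catch(console.error);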

Which protocol should be used to send a video stream to a media server for live streaming?

I am working on an iOS application to stream live video from an iPhone to a media server, and then make it available to a larger audience using RTSP.
Which protocol or method should I use to send the video stream to the server?
Thanks.
HTTP Live Streaming is not designed for your needs; it's for server-to-client delivery, and I won't comment on the huge delay it implies.
You'd be better off checking the RTSP or RTMP protocols, and the LivU blog.
For cellular:
Apple seems to make a distinction between apps that are used just for streaming content from servers and those that are used for some type of conferencing.
I think VOIP types are safe, and it seems like GoCoder presenter-type apps don't have issues either. There's no official page detailing this, but there is some mention under what Apple considers a VOIP app.
No app has issues if it streams over Wi-Fi only.

Live streaming on iPhone

I just started working on live streaming on the iPhone, so I'd appreciate any help on how to do it. I think that if I add a video tag in HTML5 and then load that HTML in a UIWebView, it will work.
Am I right? If not, what is your suggestion for doing live streaming? I want to embed some news channels' live streaming links in the application, so where can I find those links?
You have to go through the HTTP Live Streaming documentation provided by Apple. There are some sample live streaming URLs in it; the file extension will be .m3u8. If you want to set up your own web server, you have to configure FFmpeg on that server to produce the HLS stream (a sketch of this step follows the links below). Links which will help you:
1) Apple document
2) stackoverflow
3) stackoverflow
4) stackoverflow
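A minimal sketch of that server-side step, assuming Node.js/TypeScript spawning the ffmpeg CLI (the ingest URL and output path are placeholders):

    import { spawn } from "node:child_process";

    // Transcode an incoming stream into HLS segments plus an .m3u8
    // playlist that the iPhone can play natively.
    const source = "rtmp://localhost/live/stream"; // placeholder ingest URL
    const ffmpeg = spawn("ffmpeg", [
      "-i", source,
      "-c:v", "libx264",     // H.264 video, supported by iOS
      "-c:a", "aac",         // AAC audio (safer than MP3 for HLS)
      "-f", "hls",
      "-hls_time", "4",      // 4-second segments
      "-hls_list_size", "5",
      "/var/www/live/stream.m3u8", // placeholder path served by the web server
    ], { stdio: "inherit" });

    ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));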
If you're making a web app in HTML5, then the video tag is a good choice.
But if you're developing a native app, then MPMoviePlayerController would be a much better choice. There are many examples of how to use it online.
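If you go the HTML5 route inside a UIWebView, a minimal sketch (the .m3u8 URL is a placeholder; iOS plays HLS natively through the video element, so no extra library is needed):

    // Point a <video> element at an HLS playlist.
    const video = document.createElement("video");
    video.src = "https://example.com/live/stream.m3u8"; // placeholder URL
    video.controls = true;
    video.autoplay = true;
    document.body.appendChild(video);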
iOS doesn't support RTMP or RTSP, so your stream would need to be an HTTP Live Stream. From memory, the codec choice is very limited too; e.g. if you supply H.264+MP3 you won't get any sound, despite iOS supporting MP3.
Also remember that streams from other people (such as the BBC) will normally be protected by international copyright law, so unless you have prior permission to use their stream in your app you may be breaking the law.
Apple has some nice resources on HTTP Live Streaming.

Flash Playback and HTTP Live Streaming

I'm looking for a solution to provide streaming video to a variety of clients. I have iPhone clients as well as Flash-based clients, and I'd like not to have to provide two separate mechanisms for delivering streaming content. Apple has decreed that HTTP Live Streaming is the way to provide streaming video to the iPhone (though it does carve out an exception for small progressive downloads).
My question: are there examples of Flash implementations consuming HTTP Live Streaming content? What challenges might be faced if I were to try to implement such a player? Are there other technologies I should consider?
Thanks!
Not yet. Maybe never. But...
What you could do is stream from a Wowza Media Server, which will allow you to publish one stream that can be consumed by various clients, including both Apple client devices and Flash browser clients.