I am making an iPhone application that will send video to a server for live streaming, and I wanted to know whether we require a media server for this.
Yes, you need a media server. You can send your streams to the server from the mobile device using one of the many SDKs available.
For the media server:
There are many ways you can set up a server. For now, let's look at an RTMP server used with nginx (via the nginx-rtmp-module). You can use HLS (HTTP Live Streaming), as stated above, with this package: the RTMP server receives the stream and converts it into the HLS format, and the HTTP server distributes the stream.
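A minimal nginx.conf sketch of that setup, assuming the nginx-rtmp-module is compiled in (the ports, paths, and the application name "live" are placeholders):

    rtmp {
        server {
            listen 1935;               # default RTMP port
            application live {
                live on;               # accept streams published to rtmp://host/live/<key>
                hls on;                # package the incoming stream as HLS
                hls_path /tmp/hls;     # where .m3u8 playlists and .ts segments are written
                hls_fragment 5s;       # target segment duration
            }
        }
    }

    http {
        server {
            listen 8080;
            location /hls {
                types { application/vnd.apple.mpegurl m3u8; video/mp2t ts; }
                root /tmp;             # serves /tmp/hls/<key>.m3u8 over HTTP
            }
        }
    }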
To distribute your media content you can use an ordinary HTTP server. If you want to provide live content, you need a server component that encapsulates your content into a format that can be distributed over HTTP.
Apple provides a suite of command line utilities that allow you to prepare your content. Just search for "HTTP Live Streaming Tools" at https://developer.apple.com/downloads
The HTTP Live Streaming Overview is also a good starting point.
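For reference, the media playlist (.m3u8) that a segmenter such as Apple's mediafilesegmenter produces looks roughly like this (segment names and durations are illustrative; a live playlist would omit the end tag and be updated continuously):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:9.976,
    fileSequence0.ts
    #EXTINF:9.976,
    fileSequence1.ts
    #EXT-X-ENDLIST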
Related
I am using the StreamYard service to do a live stream to multiple destinations like Facebook and YouTube. I want to create a mobile app using Flutter that can receive and display that live stream. StreamYard uses only RTMP for a custom destination.
My question is: is there a way to create and host a custom RTMP endpoint online that can be shared between the StreamYard platform and my mobile app? I want it so that whenever I do a live stream from StreamYard, it is shared to Facebook, YouTube, and my mobile app as well. I have done a lot of research, but all I found is hosting it myself on a Windows or Linux machine, while I want it hosted online.
Alternative solutions, such as using WebRTC, are also welcome.
Because you use StreamYard, I think you need to use the INVITE feature to start a video chat and then convert it to a live stream. It works like below:
UserA --WebRTC-->--+
                   +-->- StreamYard -->-RTMP-->- YouTube/Twitch.tv
UserB --WebRTC-->--+
You need to buy a paid plan to get the Custom RTMP destinations feature, so you can publish the RTMP stream to your own media server such as SRS or Nginx; then you can broadcast to multiple destinations, like this:
                   +->-- YouTube/Twitch.tv
                   |
StreamYard ->-RTMP-+->- Custom RTMP destinations --+--RTMP-> YouTube/Twitch.tv
                        (SRS/Nginx media server)   |
                                                   +--HLS/FLV--> Flutter App
Note: once you are streaming to your own RTMP server or a video cloud platform, you can convert the stream to HLS/HTTP-FLV for your Flutter app to play. Which player and protocol to choose depends on which parts you want to build yourself; it's possible to build them with open-source projects.
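As a quick sanity check of the HLS output before wiring up a Flutter player plugin, you could play it in a browser; a minimal TypeScript sketch using the hls.js library (the playlist URL is a placeholder):

    import Hls from "hls.js"; // npm install hls.js

    const video = document.querySelector("video") as HTMLVideoElement;
    const src = "http://your-server:8080/hls/stream.m3u8"; // placeholder URL

    if (Hls.isSupported()) {
      const hls = new Hls();
      hls.loadSource(src);    // fetch and parse the playlist
      hls.attachMedia(video); // feed segments to the <video> element via MSE
    } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
      video.src = src;        // Safari plays HLS natively
    }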
Note: you could use StreamYard to stream to both YouTube and the custom RTMP server, or use FFmpeg to pull the stream from your custom server and then publish it to any other live streaming platform.
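That FFmpeg re-publish step could be driven from Node.js, for example; a sketch, assuming ffmpeg is on the PATH, with placeholder URLs:

    import { spawn } from "node:child_process";

    const pull = "rtmp://your-server/live/stream";    // your SRS/Nginx server (placeholder)
    const push = "rtmp://a.rtmp.example.com/app/KEY"; // target platform ingest (placeholder)

    // -c copy re-muxes without transcoding; -f flv is the container RTMP expects.
    const ffmpeg = spawn("ffmpeg", ["-i", pull, "-c", "copy", "-f", "flv", push], {
      stdio: "inherit",
    });

    ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));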
In this solution, StreamYard actually acts as a video chat or video conferencing platform, like Zoom. It transcodes each WebRTC stream and mixes all the audio and video into one RTMP stream.
So you could use a WebRTC server to build your own StreamYard-like service, then use FFmpeg to transcode and mix the streams, but that is off topic, so let me stop here.
1) I'm researching the technology I can use for a browser application that streams video. It should capture video from the webcam and push it to a service where it's stored and can be watched later. One of the (possible?) options is Azure Media Services. But after a quick look at the documentation, it seems that it's not possible to use a pure modern browser without plugins. Am I correct? If not, can you please give some links to GitHub projects or an example of code to look at?
2) Another possible technology option is Amazon Kinesis Video Streams (looks like the best solution I've come up with so far), but maybe you can recommend some other cloud services?
Thanks!
Currently the short answer is no.
WebRTC is the right solution for broadcasting from a browser. That's the only protocol for live streaming that will be "somewhat" widely supported in modern browsers like latest Chrome.
AMS does not yet support receiving WebRTC. We only support RTMP and Smooth ingest (chunked MP4) right now.
As far as I'm aware, Kinesis also expects you to send chunked MKV (like chunked MP4, but a less popular container format), which would need a browser plugin or JavaScript library to support. I don't see any Producer library from them in JavaScript.
WebRTC is your answer - but to catch that in the cloud, you may need to look at other solutions that run in an Azure Container. There are a bunch of 3rd party solutions out there for WebRTC.
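To illustrate what browser-side capture looks like without plugins, here is a sketch that grabs the webcam with getUserMedia and POSTs recorded chunks to a hypothetical backend endpoint; note that this MediaRecorder-over-HTTP approach is simpler but higher-latency than real WebRTC ingest:

    // Capture the webcam and ship recorded chunks to a (hypothetical) /upload endpoint.
    async function startCapture(): Promise<void> {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });

      recorder.ondataavailable = async (event: BlobEvent) => {
        if (event.data.size > 0) {
          // In practice you would add auth, sequencing, and retries here.
          await fetch("/upload", { method: "POST", body: event.data });
        }
      };

      recorder.start(2000); // emit a chunk every 2 seconds
    }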
I am working on a live streaming project and came across many services like Wowza, Dacast, Ant, etc. The one that suits my requirement uses the RTMP protocol, so I would have to use encoding software like OBS to publish the stream. I actually want to publish the stream from the browser/iOS/Android.
I came across an FB presentation, and it seems they are using the RTMP protocol. FB is successfully broadcasting from the browser somehow.
Can I get some insight into how things work with FB and similar RTMP-based live streaming apps? Thanks in advance.
Facebook supports RTMP ingest of video (used by people who utilize the Live API), as well as WebRTC ingest for browser clients.
RTMP is not used as a distribution protocol. For that, there is DASH.
I am trying to create a web server (REST APIs) that should be able to store, organise, and stream videos in response to client requests.
My confusion:
1. How should a user upload videos? From my research, I decided that I would store all the metadata for the videos in a database (Google Datastore) and all the video files in separate storage (Google Cloud Storage). Now, what is the proper way to upload videos?
2. Once a video is uploaded and stored, how will the streaming happen? Suppose a user makes a request to watch a video; the server will get an HTTP request for it. But how do I stream the video? Is there a service for this? I guess serving the bytes directly from my own code would hurt performance.
From my understanding, I want a service that can stream videos from my storage to a client upon the server's request. I assume the server should call this "video streaming service" only after verifying the user's credentials.
For question 1 (how to enable customers to upload objects), signed URLs are a good bet.
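A sketch of generating a V4 signed upload URL with the @google-cloud/storage Node.js client (the bucket name is a placeholder):

    import { Storage } from "@google-cloud/storage";

    const storage = new Storage();

    // Returns a URL the client can PUT the video file to directly,
    // so the bytes never pass through your API server.
    async function getUploadUrl(objectName: string): Promise<string> {
      const [url] = await storage
        .bucket("my-video-bucket") // placeholder bucket name
        .file(objectName)
        .getSignedUrl({
          version: "v4",
          action: "write",
          expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
          contentType: "video/mp4",
        });
      return url;
    }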
Question 2 is a lot bigger. Depending on your needs you could simply point clients to GCS video files, but modern media serving is a bit more advanced than that. You may want to look into using GCE with a streaming video service, for example something like Wowza. Google offers a click-to-deploy experience for it: http://cloud.google.com/tryitnow/wowza
(Keep in mind that Wowza is a separate product requiring a paid license. I don't have any experience with it and neither advocate for nor disapprove of it.)
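If you do end up serving files from your own server, HTTP range requests are the basic mechanism players use to seek within a video; a minimal Node.js sketch (the file path is a placeholder):

    import { createServer } from "node:http";
    import { createReadStream, statSync } from "node:fs";

    const FILE = "/videos/sample.mp4"; // placeholder path

    createServer((req, res) => {
      const size = statSync(FILE).size;
      const range = req.headers.range; // e.g. "bytes=0-" sent by <video> elements

      if (!range) {
        res.writeHead(200, { "Content-Type": "video/mp4", "Content-Length": size });
        createReadStream(FILE).pipe(res);
        return;
      }

      // Parse "bytes=start-end" and reply with 206 Partial Content.
      const [startStr, endStr] = range.replace("bytes=", "").split("-");
      const start = Number(startStr);
      const end = endStr ? Number(endStr) : size - 1;

      res.writeHead(206, {
        "Content-Type": "video/mp4",
        "Content-Range": `bytes ${start}-${end}/${size}`,
        "Content-Length": end - start + 1,
        "Accept-Ranges": "bytes",
      });
      createReadStream(FILE, { start, end }).pipe(res);
    }).listen(8080);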
First, I wish to emphasize the keyword "from". There are a lot of questions and answers on this topic, but I found that no answer provides a step-by-step road map to achieve this.
What I wish to achieve:
I wish to stream the video and audio (live) being recorded from the camera of iPhone/iPad to my server. And that's it.
What I have figured out till now:
I guess that we can't use HTTP Live Streaming because it's meant for server-to-client, not client-to-server. The AVFoundation framework allows output only in the form of a .mov file.
What I am not able to figure out:
I don't know how to get individual frames (live) and send them to my server one by one.
PS: I really don't know anything about this... You are welcome to oversimplify things. I am writing the server in Node.js.
You may take a look at the Wowza GoCoder iOS app.
It requires Wowza as the media server, though, so you'll be able to provide full-featured streaming to anyone you want.
Server-side setup is done easily via Wowza configs or via third-party cloud control services.
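If you would rather prototype the ingest yourself in Node.js, as the question mentions, a minimal sketch that accepts a chunked POST body and appends it to a file (the endpoint and output path are hypothetical; a real deployment would use a media server as described above):

    import { createServer } from "node:http";
    import { createWriteStream } from "node:fs";

    // Accepts POST /ingest and appends the raw body to disk.
    createServer((req, res) => {
      if (req.method === "POST" && req.url === "/ingest") {
        const out = createWriteStream("/tmp/stream.bin", { flags: "a" }); // placeholder path
        req.pipe(out);                      // stream the request body straight to disk
        req.on("end", () => res.end("ok"));
      } else {
        res.statusCode = 404;
        res.end();
      }
    }).listen(3000);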