How to set up a media server infrastructure? - streaming

I need to set up a media server infrastructure to support live streaming. I have endless questions about this, as the area is completely new to me. I have done some research, but I received so many contradicting answers that I don't know whom to believe.
Context:
Wowza
Wowza Engine
Audio and Video live streaming
15 x 20-minute live streams per day
Between 7 and 15 concurrent live streams may happen at the same moment in time
720p quality is sufficient
Every live stream will be viewed by only between 1 and 5 viewers
Viewers will view the stream in an internet browser of their choice. If possible, they should also be able to view the live stream on their phones (even if it's via the website through the phone's browser).
Choppy/buffering streams are not acceptable
Streams do not need to be recorded and stored
Footage may be taken from webcams or phones
The audience is in the US (and so are the publishers of the live streams)
Questions:
1) Do I need Wowza transcoder?
Some suppliers told me I need the transcoder only if I require adaptive bitrate.
Others told me I need the transcoder only if I need to stream to iPhones or other Apple devices.
Others told me that I need transcoders because I want to do concurrent live streams, and I would consequently need one transcoder licence per concurrent live stream.
Others told me that concurrent live streams (multiple channels?) can happen even if I do not buy transcoder licences.
At this stage I do not know whom to believe. The Wowza documentation says transcoders are required to convert incoming streams from one format to another and to provide adaptive bitrate, but I am still not sure.
2) Can I host the website at a web hosting provider and buy (and consume) the media server from somewhere else?
- For example can I host the website on TSOHost but then have the media server from primcast or serverroom.net?
3) If the answer to the above is yes, will the bandwidth of both hosting providers be consumed when transmitting a live stream?
4) Since footage is taken either from phones or from webcams, which software do the users need to install in order to transmit the footage?
5) For 15 x 20 minute live streams per day, how much bandwidth is consumed? How do I calculate that?
6) Do I need adaptive bitrate streaming? Or is it required only if the audience can be expected to have bad internet speed?
7) Does adaptive bitrate streaming require special software on the encoding side, or do the regular Adobe Flash Media Live Encoder and Wowza GoCoder do the trick?
Thank you in advance. If you know a freelance expert I can hire, please share their details :P.

Quite a few questions, I'll try to add some answers (and you can contact me outside of SO of course)
1, Do you need Wowza Transcoder?
If the streams come from software that can send multiple bitrates, like the Flash Media Live Encoder, which can send the same stream in three different qualities, then you don't. Alternatively, you can use free software like ffmpeg on the publisher side to avoid transcoding, but the cost is more CPU load on the publisher's machine and, of course, more upstream bandwidth. Or you can receive a single stream on the server, produce the different qualities there with ffmpeg, and feed those to Wowza Streaming Engine. But if you are not cost-sensitive and want a robust and simple solution, the Transcoder AddOn is recommended.
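To make the ffmpeg option concrete, here is a minimal sketch that builds an ffmpeg command pushing the same input as three renditions to an RTMP ingest point. The URLs, bitrates, and resolutions are illustrative assumptions, not values from the question; the flags themselves are standard ffmpeg options.

```python
def multirate_ffmpeg_cmd(input_url, rtmp_base):
    """Return an ffmpeg argv list producing three H.264/AAC renditions.

    Rendition names, sizes, and bitrates below are illustrative only.
    """
    renditions = [
        ("720p", "1280x720", "2500k"),
        ("480p", "854x480", "1200k"),
        ("360p", "640x360", "700k"),
    ]
    cmd = ["ffmpeg", "-i", input_url]
    for name, size, vbitrate in renditions:
        # In ffmpeg, output options apply to the output URL that follows them,
        # so each block of options here configures one rendition.
        cmd += ["-c:v", "libx264", "-s", size, "-b:v", vbitrate,
                "-c:a", "aac", "-b:a", "128k",
                "-f", "flv", f"{rtmp_base}/stream_{name}"]
    return cmd

print(" ".join(multirate_ffmpeg_cmd("input.mp4", "rtmp://example.com/live")))
```

Note the trade-off mentioned above: each extra rendition costs the publisher more CPU and more upstream bandwidth, which is exactly what the server-side Transcoder AddOn avoids.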
2, Can I host the website at a web hosting provider and buy (and consume) the media server from somewhere else?
Sure, you can, this is a typical scenario. In your website you can embed a player like JW Player or similar and simply set them up to pull the stream from anywhere else. If you want to make sure that your streams are not reachable from other sites using the same technique, you can use (my) Wrench for authentication or build something similar.
3, will the bandwidth of both hosting providers be consumed when transmitting a live stream?
No, the player will receive the stream directly from the media server, not via the website's hosting provider.
4, Footage
What is footage?
5, Bandwidth
Multiply the bytes per second by the number of seconds and by the number of streams (and, for the outgoing side, by the number of viewers).
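A worked example for the scenario in the question, assuming roughly 2 Mbps for a 720p stream (the bitrate is an assumption, not a figure from the question):

```python
BITRATE_MBPS = 2       # assumed combined video+audio bitrate for 720p
SECONDS = 20 * 60      # one 20-minute stream
STREAMS_PER_DAY = 15
MAX_VIEWERS = 5

# Mbps * seconds gives megabits; divide by 8 for megabytes.
mb_per_stream = BITRATE_MBPS * SECONDS / 8              # 300 MB per stream

# Ingest side: each stream is uploaded to the server once per day.
ingest_gb_per_day = mb_per_stream * STREAMS_PER_DAY / 1000   # 4.5 GB/day

# Egress side: each stream is delivered once per viewer.
egress_gb_per_day = ingest_gb_per_day * MAX_VIEWERS          # 22.5 GB/day max

print(mb_per_stream, ingest_gb_per_day, egress_gb_per_day)
```

So at these assumed numbers the worst case is about 4.5 GB/day of ingest and 22.5 GB/day of egress; scale the bitrate up or down to match what your encoder actually sends.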
6, Adaptive
You need adaptivity if the bandwidth varies, so for mobile devices it is highly recommended, but it is best for everyone in general: the network speed can drop at any time, and if you don't want buffering spinners, you need this.
7, Does adaptive bitrate streaming require special software on the encoding side?
No, it's not the encoding side, it's the player and media server side. If multiple bitrate renditions are available on the media server and the chosen technology and player support it, then you get adaptivity.
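For HLS, for example, the available renditions are advertised to the player through a master playlist, and the player switches between them as bandwidth changes. A minimal sketch (the bandwidth and resolution values are illustrative):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=2628000,RESOLUTION=1280x720
720p/chunklist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1328000,RESOLUTION=854x480
480p/chunklist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=828000,RESOLUTION=640x360
360p/chunklist.m3u8
```

A media server like Wowza generates this playlist for you when the renditions exist; the encoder never sees it.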

Related

RTMP vs RTSP/RTP: Which to choose for an interactive livestream?

If you are trying to develop an interactive livestream application, you rely on ultra-low (real-time) latency, for example for a video conference or a remote laboratory.
The two protocols that should be suitable for these circumstances are:
RTSP, while transmitting the data over RTP
RTMP
*WebRTC: As I'm trying to give a bigger audience the possibility to interact with each other, WebRTC is not suitable, because as far as I know it is not designed for a bigger audience.
My questions:
Which one should I choose for this use-case? RTSP/RTP or RTMP?
Which protocol delivers better results regarding end-to-end latency, session start-up time?
Which one consumes more hardware resources?
RTMP seems to use a persistent TCP connection. But which protocol is used for the transmission? It cannot be TCP, because this could not ensure real-time latency?
What are in general the pros and cons for using either of the protocols?
I did not find any comparison of these two protocols in scientific papers or books. Only that the famous mobile live-streaming app Periscope is using RTMP.
Other apps like Instagram or Facebook are for example providing text-based interaction with the streamer. If developers want to build the next "killer application" based on interactive live-streams: I think this question is essential to answer.
You make a lot of assumptions in your question.
WebRTC: As I'm trying to give a bigger audience the possibility to interact with each other, WebRTC is not suitable, because as far as I know it is not designed for a bigger audience.
That's simply not true. WebRTC doesn't know or care how you structure your applications server-side. There are plenty of off-the-shelf services for handling large group calls and low latency video distribution via WebRTC.
You should also know that for the media streams, WebRTC is RTP under the hood.
It cannot be TCP, because this could not ensure real-time latency?
Of course it can. There's some overhead with TCP, but nothing that prevents you from using it in a real-time scenario; the overhead is minimal.
UDP is traditionally used for these sorts of scenarios, as reliability isn't required, but that doesn't mean TCP can't be used almost as performantly.
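As one concrete example of tuning TCP for real-time use, interactive applications commonly disable Nagle's algorithm so small packets are sent immediately instead of being batched. A minimal Python sketch (the socket is never connected anywhere; this only shows the option itself):

```python
import socket

# Create a TCP socket and disable Nagle's algorithm (TCP_NODELAY), a
# standard latency tweak for interactive traffic over TCP.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

# The option reads back as non-zero once set.
nodelay = sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY)
print(nodelay != 0)
sock.close()
```

Tweaks like this (plus sensible buffer sizing) are why TCP-based transports can stay within real-time budgets on decent networks.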
RTMP
RTMP is a dead protocol for Flash. No browsers support it. Other clients only support it for legacy reasons. You shouldn't use it for anything new going forward.
Only that the famous mobile live-streaming app Periscope is using RTMP.
Well, that's not a reason to do much of anything.
Which protocol delivers better results regarding end-to-end latency, session start-up time?
WebRTC
Which one consumes more hardware resources?
That's not the right question to ask. Your overhead in almost any other parts of the application is going to be far more than the transport overhead of the protocol used for distribution.
The real list of things you need to think about:
Client compatibility. What sort of clients must you support?
Do you really need low latency everywhere? Do you understand the tradeoffs you're making with that demand? Are you willing to destroy any sense of video quality and reliability for all your users if only a handful of them are going to be interactive?
What's your budget? Off-the-shelf solutions for distribution are much cheaper. If you can push off your stream to YouTube for non-interactive users, you can save yourself a ton of money. If you can't use existing infrastructure, be prepared to spend mountains of cash.
What are your actual latency requirements? Are you prepared to reduce the number of people that can use your application when these latency requirements cannot be met on crappier networks and mobile devices?
What are your quality requirements?
Where will you transcode video to a variety of bitrates?
Do your viewers need adaptive bitrate viewing?
Do you need to push streams to other platforms simultaneously?
Do you need to record the stream for on-demand watching or going back in time?
You might also find my post here helpful: https://stackoverflow.com/a/37475943/362536
In short, check your assumptions. Understand the tradeoffs. Make decisions based on real information, not sweeping generalizations.

HTML5 / Javascript client code to record from microphone to send it to IceCast / ShoutCast server? Another solution without desktop software?

I'm trying to help the people at this open radio station: radioqk.org. However, I'm quite new to the topic of streaming and radio servers. I'm quite surprised that everything I found is about desktop software clients (e.g. SAM Broadcaster, Butt, Radittcast, DarkSnow...). However, they are confusing to configure, so we are trying to embed the client in their website to make it easier to stream from any part of the world to any streaming server (e.g. giss.tv, caster.fm, listen2myradio.com...).
I have read that it's not possible at the moment, because there is no way to make a streaming HTTP PUT request.
However, if I have understood correctly, it is possible with Liquidsoap because its server supports the webcast.js protocol, using the following code: https://github.com/webcast/webcaster
On the other hand, I have searched for PHP code able to record from the microphone and store the audio on the server. Or maybe this is about HTML5 and its new getUserMedia() function? It seems this was difficult a few months ago, but now it is possible, so:
Is there any live-streaming service with the client integrated so it can record from the user's computer microphone / sound card? I mean, is there a similar service like giss.tv able to record from the user's computer microphone / sound card?
If I'm right, IceCast is the most common opensource implementation of radio streaming. Is there any implementation of IceCast able to record from the user's computer microphone / sound card?
By the way, the idea is to integrate it in a WordPress server. That's why I have based the search on PHP (I have not found a WordPress plugin able to solve this problem). However, it could be done in another language / server and embedded into WordPress afterwards.
Finally, a workaround could be the following article, which talks about including on the website a hyperlink to a Java-coded VNC viewer, to take a desktop application to the web in 15 minutes. On the VNC server side would run any of the desktop applications mentioned above.
Any light about this topic? I'm quite confused about what path I should take...
I have read that it's not possible at the moment, because there is no way to make a streaming HTTP PUT request.
That's correct. In the very near future we'll have Streams support in the Fetch API, which gets around this issue. In the meantime, it isn't possible directly.
As I mentioned in the post you linked to, you can use a binary websocket connection. That's what the Liquidsoap webcast.js uses: a binary websocket, and a server that supports it. Liquidsoap supports its own protocol, so you can use this to then stream to a server like Icecast.
Is there any live-streaming service with the client integrated so it can record from the user's computer microphone / sound card?
I run the AudioPump Web Encoder, which acts as a go-between for web based clients and your servers. The web-based client can be configured in the URL, so the users don't need to do anything. This might meet your needs.
If I'm right, IceCast is the most common opensource implementation of radio streaming. Is there any implementation of IceCast able to record from the user's computer microphone / sound card?
Yes, Icecast is a popular open source server. But the server itself cannot, and should not, be what records the audio. You wouldn't want to run the server in the same place you're doing the encoding; otherwise you'd need all your bandwidth there. The encoder is separate from the server so that you can encode in one place, upload one stream, and then have the server distribute it to thousands.

How to transmit a tv channels to the Internet/mobile app?

My partner owns a local TV channel and he wants a mobile app with a stream of his channel. How could I start?
This is quite a complicated end-to-end system you require. One simple first question is whether his channel is already broadcast by an existing broadcaster over satellite etc. If so, you may find they already have an OTT (internet-based delivery) option.
Assuming not then you have several key components:
video source - i.e. the live stream from the channel
some sort of streaming server to convert the video source into the formats required to stream over the internet. Take a look at Wowza as an example of a paid solution, or VideoLAN as an open source solution.
the mobile app - this will be slightly easier if you just have to stream from one channel.
some sort of user registration and programme guide / search backend if required.
You may also require a DRM solution if you need to protect the content against unauthorised playback, copying etc.

iOS bluetooth data transfer, Gamekit or Bonjour

I'm looking around to find the appropriate technology to implement an app that I've in mind.
Basically, I am using Bluetooth to establish a connection between two iOS devices. In the first part of the communication I only send messages, and everything works fine.
In the second part I would like to let the user share a video file; let's assume, for example, that the video file is 20 MB.
Now, what's the best option for transferring this large amount of data between the two devices?
Can I use GameKit and split the data into small packets? Will it take a reasonable amount of time?
Or is it better to use Bonjour and wait until the users are on the same WiFi network, or create a WiFi network between the two devices?
Thanks in advance
In case someone else (like me) stumbles upon this question, I'll share my experience:
Bluetooth
Pros: You don't need wifi
Cons:
Apple only allows you to access the BLE interface, which is a lot slower than regular Bluetooth. I tried this and it takes around 5 minutes to transfer 1 MB of data. Also, you need to chop your data into chunks of ~20 bytes and make sure the files are received correctly on the other side.
GameKit
I haven't actually tried it, but it seems GameKit is fine for sending small text messages (since it is designed for this); however, sending larger files will still be pretty slow. Also, you can only connect 8 devices simultaneously.
Bonjour and Wifi
Pros: This is as fast as it gets. You can send reasonably sized files (a few MB) within a few seconds.
Cons: You need to be in the same wifi network.
Using a remote server
Pros: Assuming you have a decent internet connection it's reasonably fast and you are not depending on wifi (if you have 3G/LTE).
As it turns out this is pretty easy when you use a Backend-as-a-Service provider like Parse.
Cons: Well, you're gonna have to write that server software... Your app users probably need a mobile data plan.
I ended up with solution #3, using Bonjour and WiFi, since I didn't want to write server-side code. If you want to do this, you need to learn about socket programming. I suggest using CocoaAsyncSocket, which uses TCP, so you don't have to verify yourself that the file was received correctly.
Since it is 2016 and Swift can be used in Obj-C projects, you can have a look at my solution, which will spare you almost all of the work: https://github.com/JojoSc/OverTheEther
It allows you to send any kind of object (implementing the NSCoding protocol) to another device.
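The ~20-byte chunking mentioned under the BLE option above can be sketched like this (written in Python for brevity rather than Swift; the 20-byte figure is the approximate usable BLE payload noted above, not an exact MTU):

```python
CHUNK_SIZE = 20  # approximate usable BLE payload per write, as noted above

def chunk(data: bytes, size: int = CHUNK_SIZE):
    """Split data into BLE-sized chunks for sequential transmission."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def reassemble(chunks):
    """Concatenate received chunks back into the original payload."""
    return b"".join(chunks)

payload = bytes(range(256)) * 4      # 1 KB of sample data
parts = chunk(payload)
assert reassemble(parts) == payload  # round-trips losslessly
print(len(parts))                    # 52 chunks of <= 20 bytes each
```

A real implementation would also number the chunks and acknowledge receipt, since this is exactly the "make sure the files are received correctly" bookkeeping the answer warns about.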

Streaming audio to mobile phones, what technology to use?

I'm planning on building an application where audio media is going to be streamed to mobile phones for users to listen to.
The targets are smartphones: iPhone/Blackberry/Android/(J2ME ?).
I see that streaming on iPhone has to be done with HTTP Live streaming, but I don't see it supported by other platforms.
Should I broadcast the streams via RTSP? HTTP? Is there any way to use a unified solution for all the different mobile platforms? If anyone has already been through this, help would be greatly appreciated.
One answer to the question "what technology to use?", for iPhone specifically, is WiFi. I know that's not the type of question you are asking, but it's a point worth making! Many apps that support streaming over 3G have been rejected by Apple due to bandwidth usage. You may need to be prepared to detect the network connection type and limit streaming to when you have a WiFi connection only.
BlackBerry works with HTTP and RTSP on OS 4.3 or later. I'm not familiar with other platforms, but I would think HTTP would be the most compatible.
Here is a PDF that lists the supported types by the major models.
http://docs.blackberry.com/en/smartphone_users/deliverables/15801/711-01774-123_Supported_Media_Types_on_BlackBerry_Smartphones.pdf
You will probably want to do RTSP, but it doesn't really matter; HTTP Live Streaming is, I'm pretty sure, just a client-side protocol. All these acronyms just describe ways of transmitting data. If a browser can access the data for a given protocol, chances are a phone can too. It sounds like you are asking more of a server-side question, but that question is the least of your worries. You are going to have to think more along the lines of "How am I going to scale this?" rather than "What protocol should I use to transmit data?". Also, the unified solution for all clients would be to have a server that they all hit for data. You still need to develop separate clients for each OS.
Both Android and BlackBerry support RTSP.
Note that some BlackBerry devices only support 15 fps video, so you may need separate streams to give the best experience to your users.
iPhone, starting from iPhone OS 3.0, needs HTTP Live Streaming.
The only software solution I know of that supports all of the above is Wowza, but you still need an encoder. (I think Wowza supports RTP as input, but that needs double-checking.)
The iPhone can play non-streamed audio (progressive download). Considering all platforms, you normally just need streams that are suitably transcoded. See e.g. https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html. The title says it's about 'HTTP Live Streaming', but a lot of it applies to simply downloading and playing streams.