HTML5 / JavaScript client code to record from the microphone and send it to an Icecast / SHOUTcast server? Any other solution without desktop software?

I'm trying to help the people behind this open radio station: radioqk.org. However, I'm quite new to the topic of streaming and radio servers. I'm quite surprised that everything I have found involves desktop client software (e.g. Sam Broadcaster, Butt, Radittcast, DarkSnow...), which is confusing to configure. So we are trying to embed the broadcasting client in their website, to make it easy to stream from any part of the world to any streaming server (e.g. giss.tv, caster.fm, listen2myradio.com...).
I have read that it's not possible at the moment, because there is no way to make a streaming HTTP PUT request.
However, if I have understood correctly, it is possible with liquidsoap.fm, because its server supports the webcast.js protocol, using the following code: https://github.com/webcast/webcaster
On the other hand, I have searched for PHP code able to record from the microphone and store the audio on the server. Or maybe this is a job for HTML5 and its new getUserMedia() function? It seems this was difficult a few months ago, but now it appears to be possible.
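For example, a minimal capture sketch with getUserMedia might look like the following (this is only an illustration I put together; the upload step to a server is left as a placeholder):

```javascript
// Minimal sketch: capture the microphone with getUserMedia and tap the
// samples with the Web Audio API. Everything here is illustrative; the
// upload/streaming part still needs a server that accepts the data.
navigator.mediaDevices.getUserMedia({ audio: true })
  .then(function (stream) {
    var audioContext = new (window.AudioContext || window.webkitAudioContext)();
    var source = audioContext.createMediaStreamSource(stream);
    var processor = audioContext.createScriptProcessor(4096, 1, 1);

    processor.onaudioprocess = function (event) {
      // Raw PCM samples from the microphone; from here they could be
      // encoded (e.g. to MP3/Ogg) and pushed to a server.
      var samples = event.inputBuffer.getChannelData(0);
      // sendToServer(samples); // hypothetical upload step
    };

    source.connect(processor);
    processor.connect(audioContext.destination);
  })
  .catch(function (err) {
    console.error('Microphone access failed:', err);
  });
```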
Is there any live-streaming service with an integrated client, so that it can record from the user's computer microphone / sound card? In other words, is there a service similar to giss.tv that can capture audio directly from the user's machine?
If I'm right, Icecast is the most common open-source implementation of radio streaming. Is there any implementation of Icecast able to record from the user's computer microphone / sound card?
By the way, the idea is to integrate this into a WordPress site. That's why I focused my search on PHP (I have not found a WordPress plugin that solves this problem). However, it could be built in another language or on another server and embedded into WordPress afterwards.
Finally, a workaround could be the approach from an article about taking a desktop application to the web in 15 minutes by putting a hyperlink to a Java-based VNC viewer on the website. On the VNC server side would run any of the desktop applications mentioned above.
Can anyone shed some light on this? I'm quite confused about which path I should take...

I have read that it's not possible at the moment, because there is no way to make a streaming HTTP PUT request.
That's correct. In the very near future we'll have Streams support in the Fetch API, which gets around this issue. In the meantime, it isn't possible directly.
As I mentioned in the post you linked to, you can use a binary WebSocket connection. That's what Liquidsoap's webcast.js uses: a binary WebSocket and a server that supports it. Liquidsoap supports its own protocol, so you can use it to receive the browser stream and then push it on to a server like Icecast.
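For illustration, the general shape of that approach is sketched below: capture the microphone, encode in the browser, and push binary chunks over a WebSocket to a server that understands them. This is not the exact webcast.js framing; the URL, codec and chunk interval are assumptions.

```javascript
// Sketch only: mic -> browser-side encoder -> binary WebSocket.
// A real setup would speak the webcast.js protocol to Liquidsoap,
// which then relays the stream to Icecast.
var socket = new WebSocket('wss://radio.example.com/webcast/mount'); // made-up URL
socket.binaryType = 'arraybuffer';

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  // Codec support varies by browser; Firefox accepts Ogg/Opus here.
  var recorder = new MediaRecorder(stream, { mimeType: 'audio/ogg; codecs=opus' });

  recorder.ondataavailable = function (event) {
    if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(event.data); // one encoded audio chunk
    }
  };

  recorder.start(1000); // emit a chunk roughly every second
});
```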
Is there any live-streaming service with the client integrated so it can record from the user's computer microphone / sound card?
I run the AudioPump Web Encoder, which acts as a go-between for web-based clients and your servers. The web-based client can be configured via the URL, so the users don't need to do anything. This might meet your needs.
If I'm right, IceCast is the most common opensource implementation of radio streaming. Is there any implementation of IceCast able to record from the user's computer microphone / sound card?
Yes, Icecast is a popular open-source server. But the server itself cannot, and should not, be what records the audio. You wouldn't want to run the server in the same place as you're doing the encoding... otherwise you'd need all your bandwidth there. The encoder is separate from the server so that you can encode in one place, upload one stream, and let the server distribute it to thousands of listeners.

Related

Block direct access to RTSP link

I'm planning to build a streaming server using Darwin Streaming Server and a streaming player client based on the VLC library. My goal is that the video on the Darwin Streaming Server can only be accessed through my client; I don't want anyone accessing my RTSP link without it, because once someone has an RTSP link like rtsp://localhost/myvideo.mp4 they can play it directly in the VLC player. Is it possible to block direct access to the RTSP link from anything other than the client I build, by configuring Darwin Streaming Server?
You can use an authentication scheme to only allow authenticated users to access content at a particular URL.
Essentially this means that the web application will only send responses to clients it has verified are who they say they are; otherwise it will return an error message or simply not reply.
There is an OWASP authentication cheat sheet, which is kept up to date and is a good starting point for an overview:
https://www.owasp.org/index.php/Authentication_Cheat_Sheet
OWASP is an open industry organisation focusing on application security. They describe themselves as:
OWASP is an open community dedicated to enabling organizations to conceive, develop, acquire, operate, and maintain applications that can be trusted
Note that this will not stop authorised users from being able to copy and potentially redistribute the content. For that, the typical defence is to encrypt the content and either build your own secure player and secure key-exchange solution, or use one of the standard DRM solutions, which cover nearly all devices now.
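As a very rough illustration of the "only authenticated clients" idea (not specific to Darwin Streaming Server; the secret, token format and helper names below are invented for the example), the backend could issue a signed, expiring token to a logged-in client and verify it before handing out or proxying the stream URL:

```javascript
// Sketch of a signed, expiring token in Node.js. Names and format are
// illustrative only; adapt to whatever component fronts your RTSP links.
var crypto = require('crypto');
var SECRET = 'replace-with-a-real-secret';

// Issued by the backend once the client has authenticated.
function issueToken(userId, ttlSeconds) {
  var expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  var payload = userId + ':' + expires;
  var signature = crypto.createHmac('sha256', SECRET).update(payload).digest('hex');
  return payload + ':' + signature;
}

// Checked before the stream URL is revealed or proxied to the client.
function verifyToken(token) {
  var parts = token.split(':');
  if (parts.length !== 3) return false;
  var payload = parts[0] + ':' + parts[1];
  var expected = crypto.createHmac('sha256', SECRET).update(payload).digest('hex');
  if (parts[2].length !== expected.length) return false;
  var notExpired = parseInt(parts[1], 10) > Math.floor(Date.now() / 1000);
  return notExpired &&
    crypto.timingSafeEqual(Buffer.from(parts[2]), Buffer.from(expected));
}
```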

WebChat via WebRTC

We are currently in the middle of a large infrastructure rebuild. We are replacing everything from the CRM to the ERP to the CTI.
We have decided to use WebRTC for the CTI. After working with WebRTC for a bit I really see the promise in this technology and started to think that maybe this is the way we want to go for our Webchat as well.
The premise behind this is to be able to add Voice / Video and Screensharing to our chat feature at some point in time.
Since WebRTC is not supported in Safari, IE, Edge, etc., I am thinking we may be just slightly ahead of ourselves in using WebRTC for text chat.
One thought would be to build it all out with WebRTC, detect whether the browser supports it, and fall back to XMPP or similar when it doesn't.
I have been researching this on my own and have found some options out there like talky.io, but in this rebuild we are focusing on not having any third parties involved in our applications (we have had a couple disappear with no warning).
Is there a framework / library / open source project out there that tackles part or all of this task?
Is this task as daunting as I think it is going to be or am I overreacting?
Am I crazy, should I be locked in a padded room, and should we just use an existing chat service?
Talky is built on top of https://github.com/legastero/stanza.io, which includes a Jingle/WebRTC module.
Take a look at the Jitsi project (specifically Jitsi Meet). A public version is running at meet.jit.si that you can try out; it uses WebRTC for the voice / video, and Jingle / XMPP for the signaling. It is all open source, so you can be sure you won't lose access if the company goes under or something else bad were to happen. The Jitsi team runs it using the Prosody XMPP server; they make a good combination.
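On the fallback question specifically: a minimal capability check in the browser could decide between WebRTC and plain XMPP. This is only a sketch; the two start functions are placeholders for your own chat code:

```javascript
// Placeholders for your real chat implementations.
function startWebrtcChat() { console.log('Using WebRTC (data channels now, voice/video later).'); }
function startXmppChat()   { console.log('Falling back to XMPP, e.g. stanza.io over WebSocket/BOSH.'); }

// Simple capability check: use WebRTC where the browser exposes it,
// otherwise fall back to a plain XMPP chat transport.
function browserSupportsWebrtc() {
  return !!(window.RTCPeerConnection ||
            window.webkitRTCPeerConnection ||
            window.mozRTCPeerConnection);
}

if (browserSupportsWebrtc()) {
  startWebrtcChat();
} else {
  startXmppChat();
}
```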

How to transmit a TV channel to the Internet / a mobile app?

My partner owns a local TV channel and he wants a mobile app that streams his TV channel. Where should I start?
This is quite a complicated end-to-end system you require. One simple first question is whether his channel is already broadcast by an existing broadcaster over satellite etc.; if so, you may find they already have an OTT (internet-based delivery) option.
Assuming not, then you have several key components:
video source - i.e. the live stream from the channel
some sort of streaming server to convert the video source into the formats required to stream over the internet. Take a look at Wowza as an example of a paid solution, or VideoLAN as an open-source solution.
the mobile app - this will be slightly easier if you just have to stream from one channel (see the playback sketch after this answer).
some sort of user registration and programme guide / search backend, if required.
You may also require a DRM solution if you need to protect the content against unauthorised playback, copying etc.
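To make the mobile app piece more concrete, assuming the streaming server exposes an HLS playlist (the URL below is a placeholder), a browser or webview player can be very small, e.g. using the open-source hls.js library:

```javascript
// Minimal browser/webview player for an HLS stream using hls.js.
// Assumes the hls.js script and a <video id="player"> element are on the page.
var video = document.getElementById('player');
var streamUrl = 'https://streaming.example.com/live/channel1/playlist.m3u8'; // placeholder

if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari and iOS play HLS natively.
  video.src = streamUrl;
} else if (window.Hls && Hls.isSupported()) {
  // Other browsers use Media Source Extensions via hls.js.
  var hls = new Hls();
  hls.loadSource(streamUrl);
  hls.attachMedia(video);
}
```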

How to set up a media server infrastructure?

I need to set up a media server infrastructure to support live streaming. I have endless questions in relation to this, as this area is totally new to me. I have done the research, but I have received so many conflicting answers that I don't know whom to believe.
Context:
Wowza
Wowza Engine
Audio and Video live streaming
15 x 20-minute live streams per day
Between 7 and 15 CONCURRENT live streams may happen at the same moment in time
720p quality is sufficient
Every live stream will be viewed by only between 1 and 5 viewers
Viewers will view the stream on an internet browser of their choice. However, if possible, they should also be able to view the live stream on their phones (even if it's via the website through the phone's browser).
Choppy/buffering streams are not acceptable
Streams do not need to be recorded and stored
Footage may be taken from webcams or phones
Audience is in the US (and so are the publisher of the live stream)
Questions:
1) Do I need Wowza transcoder?
Some suppliers told me I need the transcoders only if I require adaptive bitrate.
Others told me I need the transcoders only if I need to stream to iPhone or other Apple devices.
Others told me that I need transcoders because I want to do concurrent live streams and would consequently need one transcoder licence per concurrent live stream.
Others told me that concurrent live streams (multiple channels?) can happen even if I do not buy transcoder licences.
At this stage I do not know who to believe. The Wowza documentation says transcoders are required to convert incoming streams from one format to another and to provide adaptive bitrate, but I am still not sure.
2) Can I host the website at a web hosting provider and buy (and consume) the media server from somewhere else?
- For example, can I host the website on TSOHost but then have the media server from primcast or serverroom.net?
3) If the answer to the above is yes, will the bandwidth of both hosting providers be consumed when transmitting a live stream?
4) Since footage is taken either from phones or from webcams, which software do the users need to install in order to transmit the footage?
5) For 15 x 20 minute live streams per day, how much bandwidth is consumed? How do I calculate that?
6) Do I need adaptive bitrate streaming? Or is it required only if the audience can be expected to have bad internet speed?
7) Does adaptive bitrate streaming require special software on the encoding side or do the regular Adobe Flash Live Encoder and Wowza GoCoder do the trick?
Thank you in advance. If you know a freelance expert I can hire, give me his details :P.
Quite a few questions, I'll try to add some answers (and you can contact me outside of SO of course)
1, Do you need Wowza Transcoder?
If the streams come from software that can send multiple bitrates, like the Flash Media Live Encoder, which is capable of sending the same stream in 3 different qualities, then you don't. Alternatively, you can use free software like ffmpeg on the publisher side to avoid server-side transcoding, but the cost is more CPU load on the publisher side and, of course, more upstream bandwidth. Or you can still receive one stream and produce the different qualities internally with ffmpeg on the server, then feed those to Wowza Streaming Engine. But if you are not cost-sensitive and want a robust and simple solution, the Transcoder AddOn is recommended.
2, Can I host the website at a web hosting provider and buy (and consume) the media server from somewhere else?
Sure, you can; this is a typical scenario. On your website you can embed a player like JW Player or similar and simply set it up to pull the stream from anywhere else. If you want to make sure that your streams are not reachable from other sites using the same technique, you can use (my) Wrench for authentication or build something similar.
3, will the bandwidth of both hosting providers be consumed when transmitting a live stream?
No, the player will receive the stream directly from the media server, not via the website's hosting provider.
4, Footage
What is footage?
5, Bandwidth
Multiply the bytes per second by the number of seconds and by the number of streams.
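As a worked example with assumed numbers (roughly 2 Mbit/s for a single 720p stream is an assumption, not a measured figure):

```javascript
// Back-of-the-envelope bandwidth estimate with assumed numbers.
var bitrateMbps = 2;                      // assumed 720p bitrate
var secondsPerStream = 20 * 60;           // 20-minute streams
var streamsPerDay = 15;
var viewersPerStream = 5;                 // worst case from the question

// Ingest (publishers uploading to the media server):
var ingestGbPerDay = bitrateMbps * secondsPerStream * streamsPerDay / 8 / 1000;
// Egress (the server sending to viewers):
var egressGbPerDay = ingestGbPerDay * viewersPerStream;

console.log(ingestGbPerDay.toFixed(1) + ' GB in, ' +
            egressGbPerDay.toFixed(1) + ' GB out per day');
// => roughly 4.5 GB in and 22.5 GB out per day with these assumptions
```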
6, Adaptive
You need adaptivity if the bandwidth varies, so for mobile devices it is highly recommended, but it is best for everyone in general: network speed can drop at any time, and if you don't want buffering spinners, you need this.
7, Does adaptive bitrate streaming require special software on the encoding side?
No, it's not the encoding side, it's the player and media server side. If multiple bitrate streams are available on the media server and the chosen technology and player support it, then you get adaptivity.

How do iPhone apps interact with a server?

I am a new programmer who is new to iPhone development and server stuff. I have a lot of questions to ask.
You don't have to answer all the questions; any help is appreciated!
How do iPhone apps interact with a server?
Is there a particular kind of server I should use for an iPhone app to interact with?
If there is no particular kind of server, then what kinds of server can be used?
What are their advantages and disadvantages?
What should the iPhone app (which is the client) do in order to interact with the server?
How does the server know which iPhone to send data to?
What should the server do in order to interact with iPhone app (client)?
Your best bet is to have your iPhone make web requests to a web server. Your iPhone app acts just like a web browser, making HTTP requests to a web server and parsing the response.
I'm building an app right now that hits PHP scripts I've written that do database work, etc., and return JSON objects. It's not fancy: I could have built a whole SOAP or RPC web service, but I didn't do that; it just makes GET requests with query-string arguments.
There are handy libraries you want to know about. Google "iPhone JSON" to find the JSON library written by Stig Brautaset, that's the one most people seem to be using. Also, rather than putting yourself through all the hoops that the iPhone's built-in web client framework requires, go get ASIHTTPRequest, a very powerful and MUCH simplified web client library.
As a general rule, you want to do as much processing on the server as possible. For instance, there's a place in my app I'm searching for events happening within a user-specified range of their local coordinates ("within 10 miles of me"). I wrote PHP to build a latitude/longitude bounding box, and query from the database based on that. That's WAY faster than bringing a bunch of events down and then asking Core Location to calculate their distance from where I'm standing.
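For illustration, the bounding-box idea described above (the answer does it in PHP; here it is sketched in JavaScript with made-up coordinates) is just a bit of arithmetic plus a cheap range query:

```javascript
// Rough bounding box around a point, for a "within N miles of me" query.
// One degree of latitude is ~69 miles; longitude degrees shrink with cos(lat).
function boundingBox(lat, lon, radiusMiles) {
  var latDelta = radiusMiles / 69.0;
  var lonDelta = radiusMiles / (69.0 * Math.cos(lat * Math.PI / 180));
  return {
    minLat: lat - latDelta, maxLat: lat + latDelta,
    minLon: lon - lonDelta, maxLon: lon + lonDelta
  };
}

// The server-side query then becomes a simple range filter, e.g.:
// SELECT * FROM events WHERE lat BETWEEN :minLat AND :maxLat
//                        AND lon BETWEEN :minLon AND :maxLon
var box = boundingBox(40.7128, -74.0060, 10); // 10 miles around New York
console.log(box);
```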
You've asked quite a few questions so I'll try my best to answer them all:
First, you need to be a bit clearer, what type of server are you talking about? Email server, web server, lolcat server, it depends.
At the basic level, the iPhone communicates over the internet. The internet uses the Internet Protocol, and there are two standard protocols built atop IP: the Transmission Control Protocol and the User Datagram Protocol. Each has its own uses and functions.
TCP/IP and UDP/IP make up the backbone of internet communication.
A more specific application protocol is built atop these two internet protocols, with a format specific to a given application. For example, HTTP is the standard protocol for transferring HTML and other web information from a web server to a web browser client, over TCP.
So, your iPhone would use whatever protocol is required to communicate with the server. For the more common kinds of server communication, the iOS SDK provides methods to construct messages (for example, if you wish to make an HTTP request to a web server, you can use initWithContentsOfURL to send a GET request).
If you build a custom server, then you will need to construct the required message protocol on the iPhone and send it to the server, using either TCP or UDP (whatever your custom server expects).