Is it possible to include a SHOUTcast service (an audio broadcast service) inside an application built with Flutter (an application for sports commentators), i.e. one where a live audio broadcast is performed every day?
Related
I am working on a Flutter app (for iOS and Android). One of the functions of the app is to find DLNA renderers on the local network and let the user play the local music files on their phone on these renderers. I have incorporated the DART-DLNA package in my app and am able to list all DLNA/UPnP devices, and I can send the URL of a media file to the selected DLNA renderer and it starts playing. I can also control the play/pause and stop functions. So far so good.
Now I want to be able to play the music files that are on the device from this app. When working on native Android earlier, I used DroidUpnp, which mainly uses the Cling and NanoHTTPD libraries. Cling provides the UPnP stack, and NanoHTTPD is used to create a web server that serves the media files. The DroidUpnp app lists the music content like this.
Going deeper, once it reaches the actual music file, the app sends that file via UPnP/DLNA.
In Flutter I am not sure how to go about this. How do I list those files, get URLs for them like http://192.168.1.190:8192/a-24684.mp3, and send them to the renderer?
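For the serving part, here is a minimal sketch of what NanoHTTPD does on the Android side, written with dart:io's HttpServer (which works inside a Flutter app). The port, the music directory, and the path mapping are assumptions for illustration; the URL you hand to the renderer would then be http://<phone-ip>:8192/<file>.mp3:

```dart
import 'dart:io';

Future<void> startMediaServer() async {
  // Bind a tiny HTTP server on all IPv4 interfaces so the renderer
  // can reach the phone over the LAN (port 8192 is arbitrary).
  final server = await HttpServer.bind(InternetAddress.anyIPv4, 8192);
  print('Serving on port ${server.port}');

  await for (final request in server) {
    // Hypothetical mapping: request path -> file in the music folder.
    final file = File('/storage/emulated/0/Music${request.uri.path}');
    if (await file.exists()) {
      request.response.headers.contentType = ContentType('audio', 'mpeg');
      await request.response.addStream(file.openRead());
    } else {
      request.response.statusCode = HttpStatus.notFound;
    }
    await request.response.close();
  }
}
```

Listing the files themselves can be done with Directory('/storage/emulated/0/Music').list() (plus the usual storage permissions), and the resulting URLs are what you send to the renderer, exactly as you already do for remote URLs.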
I am using the StreamYard service to live stream to multiple destinations like Facebook and YouTube. I want to create a mobile app using Flutter that can receive that stream and display it. StreamYard only supports RTMP for a custom destination.
My question is: is there a way to create and host a custom RTMP endpoint online that can be shared between the StreamYard platform and my mobile app? I want it so that whenever I go live from StreamYard, the stream is shared to Facebook, YouTube, and my mobile app as well. I have done a lot of research, but the only way I have found is to use a Windows or Linux machine as the host, and I want this hosted online.
Alternative solutions are also welcome, such as using WebRTC.
Because you use StreamYard, I think you need to use the INVITE feature to start a video chat and then convert it to a live stream. It works like below:
UserA --WebRTC-->--+
                   +-->- StreamYard -->-RTMP-->- YouTube/Twitch.tv
UserB --WebRTC-->--+
You need to buy a paid plan to unlock Custom RTMP destinations, so you can publish the RTMP stream to your own media server such as SRS or Nginx; then you can broadcast to multiple destinations, like this:
                   +->-- YouTube/Twitch.tv
                   |
StreamYard -RTMP->-+->- Custom RTMP destination --+--RTMP----> YouTube/Twitch.tv
                        (SRS/Nginx media server)  |
                                                  +--HLS/FLV--> Flutter App
Note: once you are streaming to your RTMP server or a video cloud platform, you can convert the stream to HLS or HTTP-FLV for your Flutter app to play. About players and protocols, please read here. It depends on which parts you want to build yourself, and it is possible to build them from open-source projects.
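For the playback side in Flutter, here is a minimal sketch using the video_player plugin to play the HLS output (the URL is a placeholder for whatever your SRS/Nginx server exposes; HTTP-FLV would need a different player plugin):

```dart
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class LivePlayer extends StatefulWidget {
  const LivePlayer({super.key});

  @override
  State<LivePlayer> createState() => _LivePlayerState();
}

class _LivePlayerState extends State<LivePlayer> {
  late final VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    // Placeholder URL: the HLS playlist published by the media server.
    _controller = VideoPlayerController.networkUrl(
      Uri.parse('http://your-server:8080/live/livestream.m3u8'),
    )..initialize().then((_) {
        _controller.play();
        setState(() {}); // rebuild once the stream is ready
      });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return _controller.value.isInitialized
        ? AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          )
        : const Center(child: CircularProgressIndicator());
  }
}
```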
Note: you could use StreamYard to stream to both YouTube and your custom RTMP server, or use FFmpeg to pull the stream from your custom server and then publish it to any other live streaming platform.
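A sketch of that FFmpeg relay, with placeholder server addresses and stream key:

```sh
# Pull from the custom RTMP server and republish without re-encoding.
ffmpeg -i rtmp://your-server/live/livestream -c copy -f flv \
  rtmp://other-platform.example.com/app/STREAM_KEY
```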
For this solution, StreamYard actually acts as a video chat or video conferencing platform, like Zoom. It transcodes each WebRTC stream and mixes all the audio and video into one RTMP stream.
So you could use a WebRTC server to build your own StreamYard, then use FFmpeg to transcode and mix the streams, but that is off topic, so let me stop here.
I need to build a live broadcast app using Flutter on both iOS and Android.
While broadcasting, the video can be recorded by both the broadcaster and the viewers (up to 500).
I have done some research, and it seems like WebRTC is the way to go. But I need to know some additional things:
How is the broadcast done? Do I send the stream to a server, and the server sends it to each viewer separately?
Can I put overlays on the video while it is playing, for example text, icons, buttons, etc.?
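On the overlay question: in Flutter the video surface is just another widget, so overlays are straightforward with a Stack. A minimal sketch, where videoView stands in for whatever widget renders the stream (for example an RTCVideoView from the flutter_webrtc plugin):

```dart
import 'package:flutter/material.dart';

// Overlay text and a button on top of the playing video.
Widget buildBroadcastView(Widget videoView) {
  return Stack(
    children: [
      Positioned.fill(child: videoView),
      const Positioned(
        top: 16,
        left: 16,
        child: Text('LIVE', style: TextStyle(color: Colors.red, fontSize: 18)),
      ),
      Positioned(
        bottom: 16,
        right: 16,
        child: ElevatedButton(
          onPressed: () {
            // e.g. switch camera, mute, start recording, etc.
          },
          child: const Icon(Icons.flip_camera_ios),
        ),
      ),
    ],
  );
}
```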
I am trying to stream music/video on an iPhone using HTTP Live Streaming. I read the Apple docs on HTTP Live Streaming (http://developer.apple.com/library/mac/#documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html), and I get how it works.
What they don't say is how one would use the iPhone as a server. Do I have to bundle the tools (mediastreamsegmenter, variantplaylistcreator) into my iOS app and run them as an NSTask, or is there some kind of native support for streaming media files?
If you really want to stream from an iPhone app, you can't do it with the iPhone acting as the server. You need a separate server to which the iPhone app can send data. You can use the camera or the microphone in the app to capture live content, then send the data asynchronously to the server, which uses mediastreamsegmenter and variantplaylistcreator to convert the data into .ts segments and append them to the end of the .m3u8 playlist. Meanwhile, another iPhone app can act as a client and watch the live content you are streaming from the first app.
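To illustrate the playlist step, this is roughly what the live .m3u8 looks like while the segmenter keeps appending segments (filenames, durations, and sequence numbers are placeholders; the absence of an #EXT-X-ENDLIST tag is what marks the stream as live, so clients keep re-fetching the playlist):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:10.0,
fileSequence42.ts
#EXTINF:10.0,
fileSequence43.ts
#EXTINF:10.0,
fileSequence44.ts
```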
From my experience this is the only way to achieve that. Hope that helps.
I am working with a group developing an app that will essentially be a 'radio' app: one view that just plays whatever audio is streaming at the time, and another view or two of archives for listening to past programs. What I am working on right now is how to assemble the playback view. The site in question is on-this-rock.org, and the source for playing is here.
Any suggestions on how I can best go about building the player to stream the audio, without needing the rest of the site's graphics?
Thanks
The stream URL is actually:
http://s4.voscast.com:8080/
This is just a SHOUTcast stream. You can build your radio player to connect directly to it. No need for the HTML/Flash on the website itself.
You can find this easily by looking at the network tab in your browser's developer tools, or by using a tool such as Fiddler or Wireshark.
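If the player is being built in Flutter (as in the questions above), here is a minimal sketch using the just_audio plugin. The trailing '/;' is an assumption: some older SHOUTcast servers serve an HTML status page at the root and only send the raw audio on that path.

```dart
import 'package:just_audio/just_audio.dart';

Future<void> playRadio() async {
  final player = AudioPlayer();
  // The raw MP3/AAC stream lives at the server root; the '/;' path
  // asks older SHOUTcast servers for audio instead of the status page.
  await player.setUrl('http://s4.voscast.com:8080/;');
  player.play(); // a live stream has no duration; end it with player.stop()
}
```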