I want to build a server that I can register via wifiradiofrontier.com as my own internet radio station, so my hardware internet radio player can tune in to it. Reasoning: I want to listen to Spotify when I wake up, and there is no hardware internet radio that can play Spotify triggered by an alarm.
I'm thinking of the following setup on an Ubuntu machine:
A Spotify client connects to Spotify and plays a fixed playlist. Control from a smartphone via Spotify Connect would be a nice extra.
Audio, and hopefully metadata, are then transferred to Icecast2 or something similar. I am not sure how to do that. For the audio I could use JACK, I suppose, but how can I transfer both audio and metadata?
Once the data is in Icecast2, I can just set it up as a web radio station, register it on wifiradiofrontier.com, and there we go.
Any thoughts on this? Has somebody done this before?
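One way to wire this up, sketched here under the assumption that you use librespot (an open-source Spotify Connect client) and ffmpeg; the mount point, passwords, and bitrate below are placeholders:

    # librespot appears on the network as a Spotify Connect device ("Radio")
    # and, with the pipe backend, writes raw PCM (s16le, 44.1 kHz, stereo)
    # to stdout, which ffmpeg encodes and pushes to Icecast.
    librespot --name Radio --backend pipe | \
      ffmpeg -f s16le -ar 44100 -ac 2 -i - \
             -codec:a libmp3lame -b:a 128k -content_type audio/mpeg \
             -f mp3 icecast://source:hackme@localhost:8000/radio.mp3

    # Metadata travels separately: Icecast's admin interface can update
    # the "now playing" title for a mount point.
    curl -u admin:hackme "http://localhost:8000/admin/metadata?mount=/radio.mp3&mode=updinfo&song=Artist+-+Title"

This would also give you the Spotify Connect control more or less for free, since librespot shows up as a Connect device on your network.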
I'm using SIP.js to connect to a FusionPBX video conference room, but when callers join, every caller receives only a single caller's video stream.
How can I get all the streams in the room, so I can handle them and render them to each participant, so that everyone can see everyone else's video?
Is there an event I can use, like on("join") or something, so that when someone connects I get their stream?
SIP.js has such events, but they only work peer-to-peer, not in rooms.
Is there another way to make this work with FreeSWITCH? How can I make it work?
I don't know about the FusionPBX settings, but in its core (FreeSWITCH), mod_conference has a parameter called video-mode. The documentation describes it as:
The mode to run video conferencing in. passthrough is non-transcoded video following audio; transcode allows for better switching and multiple codecs; mux allows multiple parties on the video canvas at the same time.
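For the "everyone sees everyone" behavior, mux is the relevant mode. As an illustrative sketch (the profile name and canvas size are assumptions; set it in whichever conference profile your FusionPBX install actually uses), the parameter lives in conference.conf.xml:

    <!-- conference.conf.xml: excerpt of a conference profile (illustrative) -->
    <profile name="video-mcu">
      <!-- mux: FreeSWITCH composes every participant onto one shared canvas -->
      <param name="video-mode" value="mux"/>
      <!-- a canvas size for the mixed video layout; 1280x720 is an example -->
      <param name="video-canvas-size" value="1280x720"/>
    </profile>

With mux, each client still receives a single stream, but that stream is the composed canvas containing all participants, which sidesteps the SIP.js per-peer limitation.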
I have a Raspberry Pi which has WebRTC via UV4L. It is awesome! I want to record the video from the camera on a server. It's your basic surveillance camera setup... central Linux server with lots of storage space, remote IP cameras, etc. I've read dozens of pages and still can't figure it out. I tried all this Kurento mumbo jumbo but it's all retch and no vomit. It never gets there. What's the command to grab the RPi video and dump it to disk? Please help!!!
UV4L already supports audio+video recording on the server (as well as on the client), if you use it with the Janus WebRTC gateway. Have a look at the bottom of the Video Conference DEMO OS page for more details. At the moment, you will have to use the REST API to log into a Janus room and turn the recording on/off. The REST API is ideal if you want to control UV4L from a custom application, but there is also a panel which allows you to dynamically send REST requests to the UV4L server.
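To give a flavor of that REST flow against the Janus HTTP API (the host, port, room number, and especially the recording request in step 3 are assumptions that depend on your Janus/UV4L versions; check the VideoRoom plugin docs):

    # 1. Create a Janus session (8088 is the default Janus HTTP port)
    curl -s -X POST http://raspberrypi:8088/janus \
         -d '{"janus":"create","transaction":"t1"}'
    # -> returns {"data":{"id":<SESSION>}}

    # 2. Attach the session to the VideoRoom plugin
    curl -s -X POST http://raspberrypi:8088/janus/<SESSION> \
         -d '{"janus":"attach","plugin":"janus.plugin.videoroom","transaction":"t2"}'
    # -> returns {"data":{"id":<HANDLE>}}

    # 3. Ask the plugin to record the room (request body is illustrative;
    #    newer Janus versions expose an "enable_recording" room request)
    curl -s -X POST http://raspberrypi:8088/janus/<SESSION>/<HANDLE> \
         -d '{"janus":"message","transaction":"t3","body":{"request":"enable_recording","room":1234,"record":true}}'

Janus writes its recordings in its own .mjr format, which can be converted to a normal container afterwards with the janus-pp-rec tool that ships with Janus.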
I am working with a group on developing an app that will essentially be a 'radio' app: one view that just plays whatever audio is streaming at the time, and another view or two of archives for listening to past programs. What I am working on right now is how to assemble the playback view. The site in question is on-this-rock.org, and the source for playing is here.
Any suggestions for how I can best go about building the player to stream in the audio, without needing the rest of the site's graphics?
Thanks
The stream URL is actually:
http://s4.voscast.com:8080/
This is just a SHOUTcast stream. You can build your radio player to connect directly to it. No need for the HTML/Flash on the website itself.
You can find this easily by looking at the network tab of your browser's developer tools, or by using a tool such as Fiddler or Wireshark.
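For instance, a bare-bones player can point an HTML5 audio element straight at the stream. A sketch (the trailing ";" path is a convention that makes older SHOUTcast servers send the raw stream to browser user agents instead of their HTML status page):

    // Minimal player: connect directly to the SHOUTcast stream.
    const player = new Audio("http://s4.voscast.com:8080/;");
    player.play(); // modern browsers require a user gesture first

The same URL works as the source for a native audio player in a mobile app.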
I'm looking for help with my research.
I'm trying to generate live MIDI so that people can listen to it via a web browser.
I'm not sure, but I'm guessing there must be a way to set up a MIDI server that accepts connections from the MIDI sequencer on my desktop and forwards that data to an online server, where people can connect and listen in their web browsers to the MIDI as it is generated live.
Any help appreciated.
You are going to be better off writing a plugin to handle this. That being said, it is possible to dynamically play MIDI with JavaScript; see this question: generating MIDI in javascript.
You could read in the data just like any other feed from your web server, and play it back in chunks.
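A minimal sketch of that chunked approach, assuming the server exposes a JSON feed of note events (the /midi-feed endpoint and the event format are made up for illustration); it synthesizes notes with the Web Audio API rather than a real MIDI instrument:

    // Poll the server for note events and synthesize them with Web Audio.
    const ctx = new AudioContext();

    // MIDI note number -> frequency in Hz
    const midiToFreq = (note) => 440 * Math.pow(2, (note - 69) / 12);

    function playEvent(ev) {
      // ev = { note: 60, start: 0.1, duration: 0.5 }  (illustrative format)
      const osc = ctx.createOscillator();
      osc.frequency.value = midiToFreq(ev.note);
      osc.connect(ctx.destination);
      osc.start(ctx.currentTime + ev.start);
      osc.stop(ctx.currentTime + ev.start + ev.duration);
    }

    async function pollFeed() {
      const res = await fetch("/midi-feed"); // hypothetical endpoint
      const events = await res.json();       // one chunk of note events
      events.forEach(playEvent);
      setTimeout(pollFeed, 1000);            // then fetch the next chunk
    }

    pollFeed();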
I have seen plenty of articles and SO questions about streaming TO an iPhone app, but my question is the reverse, that is, streaming FROM an iPhone app.
I have audio content in an iPhone app that I want to stream to a browser. The idea is that the browser connects to a server running on the iPhone, the server on the iPhone gives the audio to the browser, and the browser plays the endless stream.
I already have seamlessly looping content on the phone with AudioQueue. I already know how to set up a server running on the phone with CocoaHTTPServer. Is there a third piece that can make the AudioQueue (or a FileStream) stream to a browser connected to the internal iPhone server?
Anybody have any thoughts on how to implement this?
Well, there are a few good open-source projects to dissect, port, or imitate for this. What I would suggest is looking at how Icecast and streamTranscoderv3 operate together. The latter takes an audio source and sends it to an Icecast server as a source client. Port parts of both, run them locally on the iPhone, and you'd have a solution. I imagine that Bonjour could be used so that other systems on the LAN could find and listen to the iPhone.
Or send the streamTranscoder output to an Icecast server elsewhere and make it available to the world.
Fortunately, neither project is over-engineered: the code isn't super modular, but it is comprehensible and modestly cross-platform.
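In either case, the third piece is essentially an HTTP handler that never finishes its response: it sends the stream headers once, then keeps writing encoded audio frames to the socket for as long as the client stays connected. As a protocol-level sketch (the icy-name header is an optional SHOUTcast convention, not anything CocoaHTTPServer-specific):

    HTTP/1.1 200 OK
    Content-Type: audio/mpeg
    icy-name: My iPhone Stream
    Connection: close

    <mp3 frame><mp3 frame><mp3 frame>...

Note the absence of a Content-Length header: the body is endless, so the browser just keeps reading and playing until the connection closes.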