How to continuously stream audio while the Icecast server changes streams

Problem:
Streaming live audio via an Icecast mountpoint. On the server side, when the live show stops, the server reverts to playing a music playlist (the actual mountpoint stays /live). However, when the live stream stops, the audio player stops too. Dev tools show the request has been cancelled. The player must be HTML5, so no Flash.
Mountpoint: http://198.154.112.233:8716/
Stream: http://198.154.112.233:8716/live
I've tried:
Listening for the stream to end and telling the player to reconnect. However, none of the events on the jPlayer and MediaElement.js APIs fire when the stream is interrupted.
Contacting the server host to ask for advice on dealing with their behind-the-scenes playlist switcher.
I'd like to find a client-side solution to this. Could WebSockets / WebRTC solve this problem by keeping a connection open?

Your problem isn't client-side, but how you are handling your encoding. No changes client-side can appropriately fix this problem.
The way your stream is configured, the encoder falls back to files on disk as a backup source. Unfortunately, it sounds like instead of re-encoding and splicing them (and matching sample rate and channel count where needed), it is just sending the raw file data.
This works some of the time, as MPEG decoders are often tolerant of corrupt streams, and will re-sync. However, sometimes the stream is too broken, and the decoder gives up. The decoder will also often stop if there is a change in sample rate or channel count. (Bitrate changes are generally not a large problem.)
To fix your problem, you must contact your host.

Yes, this is unfortunately a problem if the playlist and the live stream are not the same codec (and parameters). Additional tools such as Liquidsoap have solved the problem for me, as well as providing many more features:
savonet.sourceforge.net
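For what it's worth, the kind of setup that fixed it for me looks roughly like the Liquidsoap sketch below. It is only a sketch: the harbor port, password, mount name and playlist path are placeholders. The point is simply that fallback() plus a single encoder means the /live mount keeps the same codec, sample rate and channel count whether the show is live or not.

    # Minimal Liquidsoap sketch (placeholder names, ports and paths); both
    # sources go through the same %mp3 encoder, so listeners never see a
    # format change when the live show drops to the backup playlist.
    live   = input.harbor("live", port = 8005, password = "hackme")
    backup = playlist("/srv/music/backup.m3u")
    radio  = fallback(track_sensitive = false, [live, backup])

    output.icecast(%mp3(bitrate = 128),
      host = "127.0.0.1", port = 8000, password = "hackme",
      mount = "live", radio)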

Related

Streaming audio/video to iPhone other than http server

I find that most streaming-audio discussions are about streaming media from an HTTP server, e.g. AudioStreamer from Cocoa with Love, or MPMoviePlayerController. They both init with an NSURL. But my case is different: I use SMB to access the media files on a Windows shared server. The media content arrives in SMB messages (through a socket) and is accumulated in memory (NSMutableData).
So is there a way to play it (that NSMutableData) before the download is finished?
Update: for streaming audio I understand I need Audio Queue Services.
What about streaming video other than over HTTP? I think it is doable, because there is a free app called TIOD which streams not only audio but also video from an SMB server.
BTW, I never expect others to do the work for me. I checked all the documentation I could find and couldn't find a way to do it (for video). I had thought that might mean it can't be done, but then I found that TIOD can do it. That's why I raised the question in the first place, to see if others have experience with it.
Yeah, you can stream that as well; it's the same thing as getting the data from an NSURL. If you look at the audio streaming example by Matt Gallagher here, you'll see that he is getting data from some URL, but ultimately, when he calls the parse function, he is giving it bytes of data. The same thing should apply to your situation: with the data you get, you should be able to call the parse function and have the audio player stream your audio file.
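To make that concrete, here is a minimal sketch in Swift (the question is from the Objective-C / AudioStreamer era, so take it as an illustration of the idea rather than a drop-in): open an AudioFileStream parser once, then push each chunk of data you accumulate from SMB into AudioFileStreamParseBytes. The callbacks are where you would create and feed the Audio Queue, just as the parse step does in Matt Gallagher's example. The function and variable names here are made up.

    import AudioToolbox
    import Foundation

    var streamID: AudioFileStreamID?

    // Called when the parser discovers stream properties; when
    // kAudioFileStreamProperty_ReadyToProducePackets fires, read the data
    // format and create the output AudioQueue here.
    let propertyCallback: AudioFileStream_PropertyListenerProc = { _, _, _, _ in }

    // Called with parsed audio packets; copy them into AudioQueue buffers
    // and enqueue them here.
    let packetsCallback: AudioFileStream_PacketsProc = { _, _, _, _, _ in }

    // kAudioFileMP3Type is only a hint; use whatever matches your files.
    _ = AudioFileStreamOpen(nil, propertyCallback, packetsCallback,
                            kAudioFileMP3Type, &streamID)

    // Call this each time another chunk arrives over the SMB socket,
    // instead of waiting for the whole file to finish downloading.
    func didReceive(chunk: Data) {
        guard let streamID = streamID, !chunk.isEmpty else { return }
        chunk.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            _ = AudioFileStreamParseBytes(streamID, UInt32(chunk.count),
                                          raw.baseAddress!, [])
        }
    }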

Streaming audio from a microphone on a Mac to an iPhone

I'm working on a personal project where the iPhone connects to a server-type application running on a Mac. The iPhone sends and receives textual/ASCII data via standard sockets. I now need to stream the microphone from the Mac to the iPhone. I've done some work with AudioServices before, but wanted to check my thoughts here before getting too deep.
I'm thinking I can:
1. Create an Audio Queue in the standard Cocoa application on the Mac.
2. In my Audio Queue Callback function, rather than writing it to a file, write it to another socket I open for audio streaming.
3. On the iPhone, receive the raw sampled/encoded audio data from the TCP stream and dump it into an Audio Queue Player which outputs to headphone/speaker.
I know this is no small task and I've greatly simplified what I need to do but could it be as easy as that?
Thanks for any help you can provide,
Stateful
This looks broadly sensible, but you'll almost certainly need to do a few more things:
Buffering. On the "recording" end, you probably don't want to block the audio queue if the buffer is full. On the "playback" end, I don't think you can just pass buffers into the queue (IIRC you'll need to buffer it until you get a callback).
Concurrency. I'm pretty sure AQ callbacks happen on their own thread, so you'll need some sort of locking/barriers around your buffer accesses (a minimal sketch follows this list).
Buffer pools, if memory allocation ends up being a big overhead.
Compression. AQ might be able to give you "IMA4" frames (IMA ADPCM 4:1, or so); I'm not sure if it does hardware MP3 decompression on the iPhone.
Packetization, if e.g. you need to interleave voice chat with text chat.
EDIT: Playback sync (or whatever you're supposed to call it). You need to be able to handle different effective audio clock rates, whether it's due to a change in latency or something else. Skype does it by changing playback speed (with pitch-correction).
EDIT: Packet loss. You might be able to get away with using TCP over a short link, but that depends a lot on the quality of your wireless network. UDP is a minor pain to get right (especially if you have to detect an MTU hole).
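Here is the sketch promised above: a minimal, hedged illustration (in Swift, with made-up names) of the buffering and locking points. The recording-side Audio Queue callback appends and returns immediately, and the socket-writing code drains on its own thread.

    import Foundation

    // Shared byte buffer between the AudioQueue callback thread and the
    // network-writing thread. Only illustrates the locking/buffering idea;
    // a real implementation would bound the size and signal the writer.
    final class AudioByteBuffer {
        private var storage = Data()
        private let lock = NSLock()

        // Call from the AudioQueue "recording" callback: copy and return
        // quickly, never block the queue.
        func append(_ bytes: Data) {
            lock.lock(); defer { lock.unlock() }
            storage.append(bytes)
        }

        // Call from the socket-writing thread: take whatever has accumulated.
        func drain(upTo maxBytes: Int) -> Data {
            lock.lock(); defer { lock.unlock() }
            let chunk = Data(storage.prefix(maxBytes))
            storage.removeFirst(chunk.count)
            return chunk
        }
    }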
Depending on your data rates, it might be worthwhile going for the lower-level (BSD) socket API and potentially even using readv()/writev().
If all you want is an "online radio" service and you don't care about the protocol used, it might be easier to use AVPlayer/MPMoviePlayer to play audio from a URL instead. This involves implementing a server which speaks Apple's HTTP streaming protocol; I believe Apple has some sample code that does this.
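If you do take the "online radio" route, the client side can be this small. A hedged sketch follows (the URL is a placeholder; on current SDKs AVPlayer is the one to use, and in a real app you would keep a strong reference to the player):

    import AVFoundation

    // Plays an HTTP Live Streaming playlist (or a plain streaming audio URL)
    // directly; all the segmenting/serving work stays on the server.
    let url = URL(string: "https://example.com/radio/playlist.m3u8")!
    let player = AVPlayer(url: url)
    player.play()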

iPhone: HTTP live streaming without any server side processing

I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a Thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration. But I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, and it is not actually what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save it to a video file; see the AV Foundation guide on how to do that. Once saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are lots of HTTP servers out there you could adapt.
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264. For that you want ffmpeg. Basically you shove the images into ffmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live streaming H.264 device. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, then you need to have an HTTP server serving that stream.
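For the first step, collecting frames from the camera, a rough Swift sketch is below (the class and queue names are made up; permissions, orientation and error handling are ignored). Everything downstream, encoding to H.264 and segmenting/serving, is the separate work described above.

    import AVFoundation
    import CoreMedia

    // Grabs raw camera frames via AVCaptureSession; each CMSampleBuffer in
    // the delegate callback is one frame to hand to your encoder.
    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let output = AVCaptureVideoDataOutput()
        private let queue = DispatchQueue(label: "camera.frames")

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            let input = try AVCaptureDeviceInput(device: camera)
            session.beginConfiguration()
            if session.canAddInput(input) { session.addInput(input) }
            output.setSampleBufferDelegate(self, queue: queue)
            if session.canAddOutput(output) { session.addOutput(output) }
            session.commitConfiguration()
            session.startRunning()
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // One raw frame; pass it to the H.264 encoding stage here.
        }
    }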
What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open source implementations of RTP, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.

Streaming live H.264 video via RTSP to iphone does work! w/example

Using FFMPEG, Live555, JSON
Not sure how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects like FFMPEG, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras that's available to view with the free "Dropcam For iPhone" app on the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which looks like maybe some of the stream is tunneled?
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iPhone.
Thanks for the info TinC0ils. After digging a little deeper, I've read that they have modified the Axis camera with custom firmware to limit the streaming to just a single 320x240 H.264 feed, to better provide consistent-quality video over different networks and, as you point out, be less of a draw on the phone's hardware, etc. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the inherent latency involved with "HTTP live streaming". I think Dropcam have done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: the RTSP protocol DOES work on the iPhone!
They are using open source projects to receive the frames and decode them in software instead of using hardware decoders. This will work; however, it runs counter to Apple's requirement that you use their HTTP Live Streaming. It will also require greater CPU resources, such that it may not decode video at the desired fps/resolution on older devices and/or will decrease battery life compared to HTTP streaming.

Simultaneously stream and save a video?

I'm writing an app, part of which allows the user to stream/play videos. I want to restrict the functionality so that they can only stream videos if they have a WiFi connection. I will then save the video so that when they have a 3G-only (or lesser) connection, they can't stream videos and can only replay videos that are saved on the phone.
Ideally, I'd like to get MPMoviePlayerController to stream/play the movie and then access the movie data and save it. However, the MPMoviePlayerController API doesn't seem to support access to the movie data.
I'd like to avoid a download-then-play scenario. Any ideas?
Two solutions come to mind.
Both of these solutions require that the file be in a format that can be played progressively, i.e. that you don't need the whole file to be able to play it (but that would be a prerequisite anyway).
Use a thread to download the data and append it to a file, and play the file from another thread. This requires that you can handle EOF events in MPMoviePlayerController, pause playback until the cache file has been appended to, and then resume from the same point.
From what I've seen of people trying this, it doesn't work because MPMoviePlayerController can't handle the EOF event (I haven't tested it myself yet): Caching videos to disk after successful preload by MPMoviePlayerController
Skip playing from a file; instead, set up a local HTTP server and stream from that (on localhost). This is also untested.
The idea is that MPMoviePlayerController would handle missing data better coming from an HTTP stream than from reading the file directly.
The downside might be that it is less efficient, but I think that is only a minor increase in CPU usage. I don't know whether the loopback network interface would handle it, but I'm assuming it's not an issue.
I leave this answer as a wiki, because I don't have a working solution but I too want one.
There is a way to make this work, but you have to write your own HTTP Live Streaming downloader.
Basically, you parse the .m3u8 file (it's a pretty simple standard, but can get tricky with alternate streams and the possibility that the stream will simply drop out and need a new playlist to continue) and then download the chunks in .ts format to your local storage, say the Documents folder or Caches etc.
Then you'll have to set up a local HTTP server to allow the MPMoviePlayerController or AVPlayer to access the files over HTTP (since they won't touch a local file path), including a rewritten playlist file pointing to the local files, which you'll have to create yourself from the original playlist(s).
CocoaHTTPServer works great for this.
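As a rough illustration of the playlist-rewriting step, a hedged Swift sketch is below. The names are made up, the synchronous String(contentsOf:)/Data(contentsOf:) calls are only for brevity, and variant playlists, live playlist reloads and error handling are all left out.

    import Foundation

    // Downloads each .ts segment from a media playlist into saveDir and
    // returns a rewritten playlist whose segment lines point at a local
    // HTTP server (e.g. CocoaHTTPServer listening on localPort).
    func rewritePlaylist(remote: URL, localPort: Int, saveDir: URL) throws -> String {
        let original = try String(contentsOf: remote, encoding: .utf8)
        var rewritten: [String] = []
        for line in original.components(separatedBy: .newlines) {
            if line.hasSuffix(".ts"), let segmentURL = URL(string: line, relativeTo: remote) {
                let name = segmentURL.lastPathComponent
                let data = try Data(contentsOf: segmentURL)        // fetch the chunk
                try data.write(to: saveDir.appendingPathComponent(name))
                rewritten.append("http://127.0.0.1:\(localPort)/\(name)")
            } else {
                rewritten.append(line)                             // keep #EXTINF etc.
            }
        }
        return rewritten.joined(separator: "\n")
    }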
Once you've done all that, it works great. It's unavoidable that you get a little delay while you download the first chunk or two before presenting your local HTTP URL to the movie player, but after that you get seamless download, recording and preview playback.
Good luck!
The iPhone uses progressive download, so the video will not be saved on the device. For that you need to explicitly download it and then play the video from your local folder.