Streaming more than one file using Live555

The Live555 library has a nice example, testOnDemandRTSPServer.cpp, but it streams only one given file. I want to stream more than one file. Does Live555 have a playlist concept, or how else can I stream more than one file with Live555?
Best Wishes
PS: I tried adding more than one subsession, but in that case Live555 just streams the last subsession's file...

There is another application that comes with the live555 code: live555MediaServer, inside the source tree's mediaServer directory. This does the job. It uses the DynamicRTSPServer class. You run it from the folder containing all your media files and access them as rtsp://ip/filename.

My two cents:
I'm not sure that makes sense: how would you ensure that all the files are encoded in the same format, which is a requirement if you want to stream them in the same session? The RTSP DESCRIBE obtains a media session description of the file, and this is used to set up the streaming session, so it is crucial that all files are encoded similarly.
RTSP does not make any provision for playlists. Usually playlists are not transferred via RTSP but, say, via HTTP. IMO, if the playlist resides on the client, it would make more sense to wait for the RTCP BYE packet (at end of file) and then do a SETUP and PLAY for the next file/RTSP URI in the playlist.
If you just want to stream a sequence of files (the playlist is on the server) where the RTSP client initiates only one session, nothing of course prevents you from creating a custom file source in the live555 library that does what you want...

Recently I had to do a similar task with similar functionality.
Here is what you can do to play H.264 video stream files in a row, like a playlist (provided, of course, that they have the same resolution, encoding profile, etc.):
You would have to modify the ByteStreamFileSource::doGetNextFrame method.
There is code there that checks feof(fFid); change it along these lines:

if (feof(fFid)) {
    // Instead of signalling end-of-stream, close the finished file
    // and open the next one from your playlist.
    CloseInputFile(fFid);
    fFid = OpenInputFile(envir(), "test.264"); // next fileName
} else {
    // ... keep the original read logic ...
}
Of course, if you still need LGPL compliance, there will be more work to do... You will have to copy/rename this class outside the library, do the same with H264VideoFileServerMediaSubsession, and modify its createNewStreamSource method so that it uses your rewritten version of ByteStreamFileSource.

Related

How to stream audio in Gstreamer (Python)

I am making a Python application that takes text and converts it into audio using IBM Cloud Watson TTS, then returns the audio using
content = watson_tts.synthesize(text, voice=voice, accept=format).get_result().content
Then I want to take this content and stream it using GStreamer, without saving it to a file.
I know how to play files from a URI using this:
player = Gst.ElementFactory.make("playbin", "player")
player.set_property("uri", uri)
player.set_state(Gst.State.PLAYING)
but that's not what I want; what I want is being able to stream the audio directly without downloading
After executing
content = watson_tts.synthesize(text, voice=voice, accept=format).get_result()
the synthesized audio is already "downloaded" from IBM's service, so instead of "what I want is being able to stream the audio directly without downloading" I suppose it's better to say "... without saving it to a file".
Anyway... to "programmatically" feed GStreamer's pipeline with the (audio) bytes from Python's content object, you can utilize the appsrc element.
For example, the pipeline can be implemented something like the sketch below, and it will produce an MPEG transport stream with AAC-encoded audio streamed via UDP.
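A minimal sketch, assuming the audio was requested from Watson as WAV (accept="audio/wav"), that PyGObject and GStreamer 1.0 are installed, and that the avenc_aac encoder from gst-libav is available (voaacenc or fdkaacenc could be substituted); the UDP host/port are placeholders:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# appsrc feeds the in-memory bytes into the pipeline; the rest parses
# the WAV data, encodes it to AAC, muxes it into MPEG-TS, and sends it
# out via UDP.
pipeline = Gst.parse_launch(
    "appsrc name=src ! wavparse ! audioconvert ! audioresample ! "
    "avenc_aac ! mpegtsmux ! udpsink host=127.0.0.1 port=5000"
)
src = pipeline.get_by_name("src")
pipeline.set_state(Gst.State.PLAYING)

# `content` holds the bytes returned by watson_tts (see above).
# Push the whole synthesized buffer, then signal end-of-stream.
src.emit("push-buffer", Gst.Buffer.new_wrapped(content))
src.emit("end-of-stream")

# Wait until playback finishes (EOS) or an error occurs.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)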

How can I monitor an mp3 live stream to detect corruption?

Once a month the MP3 stream messes up, and the only way to tell it has messed up is by listening to it as it streams. Is there a script, program, or tool I can use to monitor the live stream at a given URL and raise some kind of flag when it corrupts?
Normally it plays a song or some music, but once a month, every month, at random, the stream corrupts and starts playing chipmunk-like trash audio. Any ideas on this? I am just getting started at this with no idea at all.
Typically, this will happen when you play a track of the wrong sample rate.
Most (all that I've seen) SHOUTcast/Icecast encoders (going straight from files) will compress to MP3 just fine, but assume a fixed sample rate of whatever they are configured for, typically 44.1 kHz. If you drop in a 48 kHz track, or a 22.05 kHz track, they will play at the wrong speed and cause all sorts of random issues with the stream.
The problem is easy enough to verify: simply create a file with a different sample rate and test it. I suspect you will reproduce the problem. If that is the case, to my knowledge there is no way to detect it on the stream, since your stream isn't actually corrupt... it just sounds wrong. You will have to scan all of your files for sample rate. FFmpeg in a script should be able to help you with that. For example:
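A small sketch using ffprobe (it ships with FFmpeg) to flag tracks that don't match the encoder's configured rate; the media directory, file glob, and expected rate are assumptions:

import pathlib
import subprocess

EXPECTED_RATE = "44100"  # whatever your encoder is configured for

def check_sample_rates(media_dir):
    for path in pathlib.Path(media_dir).glob("*.mp3"):
        # Ask ffprobe for the first audio stream's sample rate only.
        rate = subprocess.run(
            ["ffprobe", "-v", "error",
             "-select_streams", "a:0",
             "-show_entries", "stream=sample_rate",
             "-of", "default=noprint_wrappers=1:nokey=1",
             str(path)],
            capture_output=True, text=True,
        ).stdout.strip()
        if rate != EXPECTED_RATE:
            print(f"{path}: {rate} Hz (expected {EXPECTED_RATE})")

check_sample_rates("/path/to/your/media")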
Now, if the problem actually is a corrupt MP3 stream, then you have problems on your encoding side. I suspect simply swapping out whatever DLL or module you're using with a recent stable version of LAME will help.
To detect a genuinely corrupt MP3 stream, your encoder must be using CRC. If you enable it, you should be able to read through the header of each frame to find the CRC and then run it on the audio data. In the event you get an error (or several frames with errors), you can then trigger a warning.
You can find information on the MP3 stream header here:
http://www.mp3-tech.org/programmer/frame_header.html
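As a rough starting point for watching the stream itself, here is a sketch that scans the raw MP3 bytes for frame headers and alerts on a sample-rate change (the usual cause of the chipmunk audio described above). The stream URL is a placeholder, sync detection is heuristic, and a real monitor would additionally verify the CRC-16 whenever the header's protection bit indicates one is present:

import urllib.request

# MPEG-1 sample-rate table (header index 3 is reserved).
SAMPLE_RATES = {0: 44100, 1: 48000, 2: 32000}

def monitor(url):
    last_rate = None
    buf = b""
    with urllib.request.urlopen(url) as stream:
        while True:
            chunk = stream.read(4096)
            if not chunk:
                break
            buf += chunk
            i = 0
            while i + 3 < len(buf):
                # 11-bit frame sync: 0xFF then the top three bits set.
                if buf[i] == 0xFF and (buf[i + 1] & 0xE0) == 0xE0:
                    rate = SAMPLE_RATES.get((buf[i + 2] >> 2) & 0x03)
                    if rate and last_rate is None:
                        # Protection bit 0 means a 16-bit CRC follows
                        # the header (see the frame-header doc above).
                        crc = "with" if (buf[i + 1] & 0x01) == 0 else "without"
                        print(f"first frame: {rate} Hz, {crc} CRC")
                    elif rate and last_rate and rate != last_rate:
                        print(f"ALERT: sample rate {last_rate} -> {rate} Hz")
                    if rate:
                        last_rate = rate
                i += 1
            buf = buf[-3:]  # keep a partial header across chunk edges

monitor("http://example.com:8000/stream.mp3")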

Video recording and saving the video on a server

I just got handed a task: how should I go about capturing video in my app and saving it directly to a server? I have the sample code discussed at WWDC 2010, but I need some other, more helpful links or tutorials to complete this task.
Please give me your opinion or share any links you have.
Thanks,
There is nothing in the APIs that will allow you to do this directly. The only way is to use AVAssetWriter to segment the video. You would then stream the completed segments; these would need to be reassembled on the server side if you require a single file.

HTTP Live Streaming for iPhone

I'm following Apple's proposal in https://datatracker.ietf.org/doc/html/draft-pantos-http-live-streaming-01.
Trying out a dummy HTTP Live Streaming setup for my iPhone, I wrote a web service with Django that serves the .m3u8 file. I begin the response with:
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
I then write the URLs of the segments (6 segments of 10 seconds each) inside the response:
#EXTINF:10,
http://...../sample_low-1.ts
...
and that's all. I change the part containing the segment URLs every minute, so in theory I'm expecting a continuous live stream.
However, when I check the stream with my iPhone, I observe the following:
The phone connects to the .m3u8, gets its contents, starts downloading the .ts files, and starts showing the video. Then, after downloading the 6th segment (the last segment in the .m3u8), it reaches the end of the file, sees no
#EXT-X-ENDLIST
and looks for a new .m3u8. The new .m3u8 is ready on the server at this point, as I renew the contents of the .m3u8 every 60 seconds.
However, the phone pauses, and I cannot achieve a continuous stream on the phone.
So obviously I am making a big mistake somewhere. Any help and suggestions are very welcome.
Edit: It turns out that incrementing the media sequence works.
How do you send the response back?
If you return the Django response object, then the server simply sends a response with the six segments and then sits quietly, waiting for a new request from the client.
If you want to continuously send data from the server, you should instead yield the results, with some kind of synchronization so that you are sure you are not sending the same data over and over again.
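Given the asker's edit above (incrementing the media sequence fixed it), here is a minimal sketch of a Django view that rebuilds the playlist on each request with a monotonically increasing #EXT-X-MEDIA-SEQUENCE; the segment naming, duration, and host URL are assumptions:

import time
from django.http import HttpResponse

SEGMENT_SECONDS = 10
SEGMENTS_PER_PLAYLIST = 6

def playlist(request):
    # Derive the sequence number from wall-clock time so it advances
    # by one each time a segment rotates out of the window.
    seq = int(time.time()) // SEGMENT_SECONDS
    lines = [
        "#EXTM3U",
        f"#EXT-X-TARGETDURATION:{SEGMENT_SECONDS}",
        f"#EXT-X-MEDIA-SEQUENCE:{seq}",
    ]
    for n in range(seq, seq + SEGMENTS_PER_PLAYLIST):
        lines += [f"#EXTINF:{SEGMENT_SECONDS},",
                  f"http://example.com/sample_low-{n}.ts"]
    return HttpResponse("\n".join(lines) + "\n",
                        content_type="application/vnd.apple.mpegurl")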

how to read a http video stream with libavcodec (ffmpeg)

I'm trying to read a real-time HTTP video stream (I've set one up using VLC and my webcam) using libavcodec from FFmpeg. So far I've been able to read an MPEG file fine, but when I pass a URL instead of a file name, it hangs (I assume it's waiting for the end of the file).
Does anyone have experience or ideas on how to do this? I'm also open to alternatives if people have other suggestions.
The end goal is to do live video streaming on the iPhone. Apple's HTTP Live Streaming has too much lag for what I need, but that's for another post :)
Any help is greatly appreciated!
If you aren't using Apple's way, you can't.
I have to admit I don't quite understand what you want to achieve; however, based on my interpretation of the question, it seems to be related to segmenting the video.
The segmenting can be achieved using ffmpeg and libavcodec (look for an example here; the key line is the one with packet.flags). Just remember that the segment length (in time) depends on the keyframe interval (for H.264, at least). If you want an example of a full segmented streaming solution that works (most of the time), check here.
Otherwise, you have to dig into the codec and create the transport stream manually. MPEG2-TS, which is what iOS supports, can be a little difficult at times, but it is not too bad. Good luck!
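For instance, a sketch of driving ffmpeg's segment muxer from a script to cut a stream into fixed-length MPEG-TS chunks; the input URL and segment length are placeholders:

import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "http://localhost:8080/webcam.ts",  # e.g. the VLC HTTP stream
    "-codec", "copy",             # no re-encode; cut points still land
                                  # on the stream's existing keyframes
    "-f", "segment",
    "-segment_time", "10",        # target seconds per segment
    "-segment_format", "mpegts",
    "segment-%03d.ts",
], check=True)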
You can use libcurl to download the HTTP segment files.