Get RTSP stream on server from WebRTC - streaming

How can I get an RTSP stream on the server from WebRTC and then use it for my own purposes, for example to rebroadcast it somewhere? Is there a solution or library for this?
Any help is much appreciated!
Denis

If you are willing to hook into the WebRTC code, this can be done fairly easily as follows:
Build a native WebRTC C++ client on the server.
If the codec used is H.264, tap the output (encoded frames) from the H.264 RTP receiver.
Use a third-party library such as live555 (http://www.live555.com/openRTSP/) to build an RTSP server out of it.
If it is VP8, you might as well tap the output of the VP8 decoder and then use ffmpeg to create an RTSP stream out of it, or perhaps consider WebM; a re-streaming sketch follows below.
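For the ffmpeg re-streaming step, a minimal command-line sketch, assuming the native client writes the tapped H.264 frames as a raw Annex-B stream to stdout and that an RTSP server (e.g. mediamtx) is already listening on 127.0.0.1:8554; the client binary name and the URL are placeholders:

# Raw H.264 from the (hypothetical) native client is piped in and pushed, unchanged, to the RTSP server
./native_webrtc_client | ffmpeg -f h264 -i - -c copy -f rtsp rtsp://127.0.0.1:8554/live

# VP8 branch: re-encode decoded raw frames instead (resolution, pixel format and frame rate are assumptions)
./native_webrtc_client | ffmpeg -f rawvideo -pixel_format yuv420p -video_size 640x480 -framerate 30 -i - -c:v libx264 -f rtsp rtsp://127.0.0.1:8554/live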

Related

RTSP Server with GStreamer

I have to build a test server (a C++ application) to familiarize myself with the RTSP implementation in the GStreamer library.
My task is to make something like this:
MP4 video file ==> RTSP application ==> network stream ==> VLC client which plays the video.
I think this is one of the simplest things to do with this library, a kind of "Hello World" of RTSP, yet I cannot find an example of it.
How can I accomplish this? Where can I find a quick step-by-step explanation of how GStreamer works?
Cheers
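For reference, a minimal sketch using the test-launch example program that ships with gst-rtsp-server; the file path is a placeholder and this assumes the MP4 contains H.264 video:

# Serve the MP4 over RTSP (test-launch listens on port 8554, mount point /test by default)
./test-launch "( filesrc location=/path/to/video.mp4 ! qtdemux ! h264parse ! rtph264pay name=pay0 pt=96 )"
# Play it back from VLC
vlc rtsp://127.0.0.1:8554/test

Writing the equivalent C++ application mostly comes down to creating a GstRTSPServer and passing the same pipeline description to gst_rtsp_media_factory_set_launch().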

How do I publish a video compressed with H.265?

I set up an nginx server and I can broadcast using RTMP. I do this by compressing the video with H.264, but I also need to compress it with H.265. RTMP and FLV do not support H.265.
So, can you suggest a server and protocol for using H.265? Open source if possible.
Edit:
Okay, I should explain the problem a little more. I can send a video to my nginx server with a client and watch this video via HLS and DASH.
But what I need to do is compress this video with an H.265 encoder using a converter like ffmpeg and then watch it or send it to another client. I couldn't find anything other than RTMP for sending and receiving video.
MP4 and MKV both support H.265 video streams.
You can use DASH or HLS to stream your MP4 segments.
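For example, a hedged ffmpeg sketch (file names are placeholders) that re-encodes to H.265 and packages the result as HLS with fMP4 segments, which HLS generally requires for HEVC:

# Re-encode to H.265/HEVC and package as HLS with fMP4 segments
ffmpeg -i input.mp4 -c:v libx265 -tag:v hvc1 -c:a aac -f hls -hls_segment_type fmp4 -hls_time 4 -hls_playlist_type vod stream.m3u8

The resulting stream.m3u8 and segment files can then be served by nginx as static content, so RTMP is not needed on the playback side.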

Convert an H.264 live stream to an MJPEG live stream

Most browsers can directly display an HTTP MJPEG stream, while they cannot display H.264 via RTSP without the help of plugins.
I have a security cam that can only stream H.264 via RTSP, so I cannot view the live video in my browser (nor on an iPhone etc.), and I do not want to install any apps or plugins.
But I do have a Linux server. I would like to retrieve the H.264 stream on my server and restream it as MJPEG, so I can browse to my server and see the MJPEG stream via HTTP.
After three days of googling and experimenting with live555, ffmpeg, VLC and other tools, I still have not gotten it running.
What is the right way to achieve this goal (with free tools like ffmpeg, live555 or whatever else is needed)?
Thanks for any help.
I found the answer myself: with VLC it was important to wrap the :sout chain in quotation marks, which I did not know. Now it works as expected; a command along those lines is sketched below.
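A hedged example (the camera URL and port are placeholders, and on some VLC versions the avcodec MJPEG encoder additionally needs strict=1):

# Pull H.264 over RTSP from the camera and re-serve it as an HTTP MJPEG stream on port 8080
cvlc "rtsp://camera-ip:554/stream" --sout '#transcode{vcodec=MJPG,venc=ffmpeg{strict=1}}:standard{access=http,mux=mpjpeg,dst=:8080/video.mjpg}'
# Then browse to http://your-server:8080/video.mjpg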

How to use GStreamer as an RTSP proxy

I want to create an RTSP proxy where an RTSP stream is taken in on one end and the same stream is served out (again over RTSP) on the other side. This is essentially an RTSP relay, but I also want to transcode the relayed stream in between.
Can anyone suggest a way of doing this? I had considered GStreamer, but it seemed to act only as an RTSP server, and I still don't know how to take in the RTSP stream from the other side and how to relay RTSP events.
Is there an existing library that can help me achieve this, or can GStreamer be used in this way?
GStreamer can also consume RTSP on the input side; there is the rtspsrc element for that (see the sketch below).
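A minimal sketch using the test-launch example program from gst-rtsp-server (the source URL is a placeholder); it pulls the stream with rtspsrc, re-encodes it, and serves it again over RTSP:

# Pull an RTSP feed, transcode it, and serve it on port 8554 under the mount point /test
./test-launch "( rtspsrc location=rtsp://source-camera:554/stream latency=200 ! rtph264depay ! avdec_h264 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"

For a pure relay without transcoding you can drop the decode/encode elements and simply depay and repay the H.264 (rtph264depay ! h264parse ! rtph264pay name=pay0 pt=96).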

“Hook” libMMS to FFmpeg for iPhone Streaming

Recently I have been researching the software architecture for iPhone streaming (based on the MMS protocol).
As we know, in order to play back an MMS audio stream, we call libMMS to read the WMA stream data from the remote media server, then call FFmpeg to decode the stream data from WMA into a PCM buffer, and finally enqueue the PCM buffer into the iPhone's AudioQueue to produce the actual sound.
The description above covers the basic workflow of iPhone streaming. If we only need this simple functionality, it is not difficult: just call libMMS, FFmpeg and AudioQueue step by step as described, and streaming works. In fact, I implemented that code last week.
But what I need is not just a simple streaming function! I need an architecture that lets FFmpeg access libMMS just as if it were accessing the local filesystem.
Does anybody know how to hook the libMMS interfaces like mms_read/mms_seek onto FFmpeg filesystem interfaces like av_read_frame/av_seek_frame?
I think I have to answer my own question again this time…
After several weeks of research and debugging, I finally got to the bottom of it.
Actually, we don't need to "hook" libMMS onto FFmpeg at all. Why? Because FFmpeg already has its own native MMS protocol module, "mms_protocol" (see mms_protocol.c in FFmpeg).
All we need to do is configure FFmpeg to enable the MMS module like this (see config.h in FFmpeg):
#define ENABLE_MMS_PROTOCOL 1
#define CONFIG_MMS_PROTOCOL 1
After this configuration, FFmpeg adds the MMS protocol to its protocol list (which already contains the local file-system protocol). As a result, FFmpeg can treat a "mms://hostserver/abc" media URL just like a local media file. Therefore, we can still open and read the MMS media stream using:
av_open_input_file();
av_read_frame();
just as we did with local media files before!
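As a quick sanity check outside the app, you can verify from the command line that FFmpeg handles the URL natively (hedged: on newer FFmpeg builds the protocols are named mmsh/mmst, and the URL below is just the placeholder from above):

# List the compiled-in protocols; the MMS variants should appear if support is enabled
ffmpeg -protocols | grep mms
# Open the stream directly; FFmpeg resolves the mms:// URL itself, no libMMS involved
ffplay "mms://hostserver/abc"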
By the way, in my FFmpeg version there were still quite a few bugs in the libavformat code that handles the MMS protocol. It took me a week to debug them; however, I think it will take much less time for someone as smart as you :-)