DirectShow to RTSP to Kurento - streaming

I need to broadcast DirectShow samples to Kurento over RTSP. Is there a good way to send RTSP from DirectShow?
So far I have tried pushing raw samples to an HTTP endpoint and various LEADTOOLS solutions (RTSP, DASH), but no luck so far!
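One possible route (a sketch only, not verified against Kurento): capture the DirectShow device with FFmpeg's dshow input, encode to H.264, and publish to an RTSP server that Kurento's PlayerEndpoint can then pull from. The device name, server URL, and encoder settings below are placeholders.

```python
# Hedged sketch: FFmpeg reads the DirectShow device and publishes RTSP.
import subprocess

DEVICE = "video=Integrated Camera"              # hypothetical DirectShow device name
RTSP_URL = "rtsp://your-rtsp-server:8554/cam"   # hypothetical RTSP ingest URL

cmd = [
    "ffmpeg",
    "-f", "dshow", "-i", DEVICE,        # read frames from the DirectShow device
    "-c:v", "libx264",                  # encode to H.264 for RTSP delivery
    "-preset", "veryfast", "-tune", "zerolatency",
    "-f", "rtsp", RTSP_URL,             # publish to the RTSP server
]
subprocess.run(cmd, check=True)
```

Kurento would then consume the published URL with a PlayerEndpoint rather than receiving a push directly from DirectShow.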

Related

How do I publish a video compressed with h265?

I set up an nginx server and I can broadcast using RTMP. I do this by compressing with H.264, but I also need to compress with H.265. RTMP and FLV do not support H.265.
Can you suggest a server and protocol I can use with H.265? Open source if possible.
edit:
OK, I should explain the problem a little more. I can send a video to my nginx server with a client and watch it in HLS and DASH.
But what I need to do is re-encode this video to H.265 with a converter like FFmpeg and then watch it or send it to another client. I couldn't find anything other than RTMP for sending and receiving video.
MP4 and MKV both support H.265 video streams.
You can use DASH or HLS to stream your MP4 segments.
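If you take that route, here is a minimal sketch of the re-encode and packaging step with FFmpeg, assuming a libx265-enabled build; the input URL and output path are placeholders.

```python
# Hedged sketch: transcode the incoming stream to H.265 and package it as DASH
# segments that nginx (or any static HTTP server) can serve.
import subprocess

INPUT = "rtmp://localhost/live/stream"   # hypothetical RTMP input from your nginx server
OUT_MPD = "/var/www/dash/stream.mpd"     # hypothetical web-served output path

subprocess.run([
    "ffmpeg", "-i", INPUT,
    "-c:v", "libx265",                   # re-encode video to H.265/HEVC
    "-c:a", "aac",                       # AAC audio
    "-f", "dash",                        # DASH muxer writes the .mpd plus segments
    "-seg_duration", "4",                # roughly 4-second segments
    OUT_MPD,
], check=True)
```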

Get RTSP stream on server from WebRTC

How can I get an RTSP stream on the server from WebRTC and then use it for my own purposes, for example to restream it somewhere? Maybe there is some solution or library?
Any help is much appreciated!
Denis
If you are willing to hook into the WebRTC code, this can be done fairly easily by doing the following:
Build a native WebRTC C++ client on the server.
If the codec used is H.264, tap the output (encoded frames) from the H.264 RTP receiver.
Use a third-party library such as live555 (http://www.live555.com/openRTSP/) to build an RTSP server out of it.
If it is VP8, you might as well tap the output of the VP8 decoder and then use FFmpeg to create an RTSP stream out of it, or perhaps consider WebM instead. A sketch of that last hand-off is shown below.
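For illustration only, a rough sketch of the hand-off to FFmpeg, assuming you have already tapped an Annex-B H.264 elementary stream out of the native client; the dump file name and RTSP ingest URL are made up.

```python
# Hedged sketch: feed a tapped H.264 elementary stream to FFmpeg over a pipe
# and have FFmpeg republish it via RTSP without re-encoding.
import subprocess

RTSP_OUT = "rtsp://localhost:8554/webrtc"   # hypothetical RTSP server ingest URL

ffmpeg = subprocess.Popen([
    "ffmpeg",
    "-f", "h264", "-i", "pipe:0",    # raw Annex-B H.264 read from stdin
    "-c:v", "copy",                  # no re-encoding, just repacketize
    "-f", "rtsp", RTSP_OUT,
], stdin=subprocess.PIPE)

# In the real application the WebRTC receiver callback would write each encoded
# frame here; reading from a dump file stands in for that.
with open("tapped_frames.h264", "rb") as f:
    ffmpeg.stdin.write(f.read())
ffmpeg.stdin.close()
ffmpeg.wait()
```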

how to stream high quality video and audio from iPhone to remote server through Internet

I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
sample code for AVFoundation capturing to file chunks (~10 seconds) that are ready for compression?
sample code for compressing the video and audio for transmission across the Internet?
ffmpeg?
sample code for HTTP Live Streaming sending files from the iPhone to an Internet server?
My goal is to use the iPhone as a high-quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer on the iPhone.
thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length. The AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server. If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server-to-client streaming protocol; there is no notion of push as far as I know. You would have to create a custom server to accept pushing of segmented video streams, which does not really make a lot of sense given the way HLS is designed (see the RFC draft). A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could easily be done with FFmpeg, either on the command line or via a custom program.
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
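As a sketch of the server-side segmenting step suggested above (paths are placeholders): once an uploaded MP4 arrives, FFmpeg can cut it into HLS segments without re-encoding.

```python
# Hedged sketch: segment an uploaded H.264/AAC MP4 into HLS for client viewers.
import subprocess

UPLOADED_MP4 = "/srv/uploads/capture.mp4"      # hypothetical uploaded file
PLAYLIST = "/var/www/hls/capture.m3u8"         # served by any web server

subprocess.run([
    "ffmpeg", "-i", UPLOADED_MP4,
    "-c", "copy",                 # already H.264/AAC, so no re-encode needed
    "-f", "hls",
    "-hls_time", "10",            # ~10 second segments
    "-hls_list_size", "0",        # keep every segment in the playlist (VOD style)
    PLAYLIST,
], check=True)
```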
Capture video and audio using AVFoundation. You can set the video and audio codecs to kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC, and configure frame sizes and frame rates via the capture format description. This will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP/RTSP library such as Live555.

how to use Gstreamer for rtsp proxy

I want to create an RTSP proxy that takes an RTSP stream in on one end and re-streams it (again over RTSP) on the other side. This is essentially an RTSP relay, but I also want to transcode the relayed stream in between.
Can anyone suggest a way of doing this? I had considered GStreamer, but it seems it can only act as an RTSP server, and I still don't know how to pull the RTSP stream from the other side or how to relay RTSP events.
Is there an existing library that can help me achieve this, or can GStreamer be used in this way?
GStreamer should be able to consume RTSP on the input side as well; there is an rtspsrc element for that.
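As an illustration of that idea (a sketch, not a tested proxy): pull the upstream feed with rtspsrc, transcode it, and re-serve it with gst-rtsp-server via the Python bindings. The upstream URL, bitrate, and mount point are assumptions.

```python
# Hedged sketch of an RTSP relay with transcoding, using GStreamer's Python
# bindings (PyGObject) and gst-rtsp-server.
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstRtspServer", "1.0")
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

SOURCE_URL = "rtsp://camera.example.com/stream"  # hypothetical upstream source

# rtspsrc pulls from the upstream server, decode + x264enc transcodes, and the
# payloader must be named pay0 for gst-rtsp-server to pick it up.
PIPELINE = (
    "( "
    f"rtspsrc location={SOURCE_URL} latency=200 ! "
    "rtph264depay ! avdec_h264 ! videoconvert ! "
    "x264enc tune=zerolatency bitrate=1500 ! "
    "rtph264pay name=pay0 pt=96 "
    ")"
)

server = GstRtspServer.RTSPServer()
factory = GstRtspServer.RTSPMediaFactory()
factory.set_launch(PIPELINE)
factory.set_shared(True)   # share one upstream connection among all clients
server.get_mount_points().add_factory("/proxy", factory)
server.attach(None)

print("Relaying at rtsp://127.0.0.1:8554/proxy")
GLib.MainLoop().run()
```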

How to use FFMPEG on wowza to encode Live rtmp streams?

I am looking for a source that explains how to use FFmpeg with Wowza to transcode live RTMP to HTTP. Does anyone know anything about this, or where to point me for some info?
Thank you all,
Wowza does not currently support live transcoding of video; however, transcoding is only necessary when you need to create new video data. As long as you are outputting the bitrates/resolutions that you need, Wowza can re-package them into most protocols you may need for Internet delivery.
Example:
3 H.264 streams in an RTMP container --> WOWZA ---> 3 RTMP streams
---> 3 Apple HLS streams (http)
---> 3 Silverlight smooth streams (http)
Quickstart Guide Here
Wowza V3 (released in October 2011) has transcoding capabilities. It sounds like you're really just trying to rewrap the H.264 RTMP stream into Apple HLS, which doesn't require any transcoding, just segmenting into HTTP-delivered chunks.
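To illustrate the "repackage, don't transcode" point outside of Wowza: FFmpeg can pull the H.264 RTMP stream and cut it into HLS segments with a stream copy, no re-encoding. The RTMP URL and output path below are placeholders.

```python
# Hedged sketch: rewrap a live H.264/AAC RTMP feed into HLS without re-encoding.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "rtmp://localhost/live/stream",   # hypothetical live RTMP input
    "-c", "copy",                           # keep the existing H.264/AAC data
    "-f", "hls",
    "-hls_time", "6",                       # ~6 second segments
    "/var/www/hls/live.m3u8",
], check=True)
```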