Convert RTSP to RTMP stream [closed] - streaming

I have an IP camera which is streaming via RTSP and RTP.
Ideally I would like to convert RTSP to RTMP to stream it to LiveStream or similar streaming services.
Can anyone please let me know how I can convert RTSP to RTMP for the purpose of streaming it to these services?

Using FFmpeg you can convert an RTSP stream to RTMP.
For example:
ffmpeg -i "[your rtsp link]" -f flv -r 25 -s 640x480 -an "[your rtmp link]"
Here -r sets the output frame rate and -s the output resolution (the values above are only examples), and -an drops the audio. Run the command on Ubuntu or another Linux distribution and it will convert your RTSP stream to an RTMP stream.
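If the camera already delivers H.264, you can also avoid re-encoding entirely and just re-wrap the stream (the URLs below are placeholders for your own camera and RTMP endpoint):
ffmpeg -rtsp_transport tcp -i "rtsp://camera-ip:554/stream" -c:v copy -an -f flv "rtmp://your-server/live/stream-key"
-c:v copy passes the video through untouched, and -rtsp_transport tcp makes the RTSP pull more robust on lossy networks.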

After some extensive research, I have found that almost all RTSP->RTMP "solution" providers use Wowza 2 to convert RTSP to RTMP. That's it.
Once you tell them that you need anything else, such as converting MPEG-4 Part 2 to MPEG-4 Part 10 (H.264), they tell you that they can't do that.
Wowza 3, which will be released in October 2011, will have a transcoding module that should be able to transcode the content in addition to performing the RTSP->RTMP conversion.
Other potential options are:
VLC
MPlayer
FFmpeg
I am still researching and will update this topic once I am done.

To summarize your options, you can use one of the following streaming servers:
Wowza, Unreal Media Server, crtmpserver, erlyvideo. All of them will receive an RTSP stream and re-stream it over RTMP.

You can also use GStreamer for this. Just create an RTSP/RTP client (source), optionally pipe it through a muxer (if you need any transcoding, you can add it at this point), and sink to RTMP; a sample pipeline is sketched below. The performance will be better than with VLC, and unlike Wowza it is free.
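As a rough sketch (assuming the camera outputs H.264, ignoring audio, and using placeholder URLs for your own camera and RTMP endpoint):
gst-launch-1.0 rtspsrc location="rtsp://camera-ip:554/stream" latency=200 ! rtph264depay ! h264parse ! flvmux streamable=true ! rtmpsink location="rtmp://your-server/live/stream-key"
rtspsrc pulls the RTSP/RTP feed, rtph264depay and h264parse recover the raw H.264 stream, flvmux wraps it in FLV, and rtmpsink publishes it to the RTMP server.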

Related

Stream images with h.264 encoding [closed]

I am extracting frames from a camera and doing processing on each extracted frame. Once the processing is done, I want to stream these frames with H.264 encoding to another system. How can I do that?
You will generally want to put the H.264 into a video container like MP4 or AVI.
For example, the wrapping from raw frames to a streaming protocol for online video might be (a command-line sketch of this chain follows the list):
raw pixel bitmap
raw pixels encoded (e.g. H.264 encoded)
encoded video stream packaged into a container along with audio streams, subtitles etc. (e.g. an MP4 container)
container broken into 'chunks' or segments for streaming (on iOS, using the HLS streaming format)
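As a rough command-line illustration of that chain (assuming raw YUV frames in a file called frames.yuv at 1280x720 and 25 fps; all of these names and values are just examples):
ffmpeg -f rawvideo -pix_fmt yuv420p -s 1280x720 -r 25 -i frames.yuv -c:v libx264 output.mp4
ffmpeg -i output.mp4 -c copy -f hls -hls_time 6 -hls_playlist_type vod playlist.m3u8
The first command encodes the raw frames with H.264 and packages them in an MP4 container; the second breaks the MP4 into HLS segments plus a playlist for streaming.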
Another common approach is for a camera to stream content to a dedicated streaming server, and for the server to then provide streams to end devices using a streaming protocol like HLS or MPEG-DASH. An example (which, at the time of writing, appears to be kept updated) showing a stream going from a camera via RTSP to a server and then via HLS or MPEG-DASH from the server is here:
https://www.wowza.com/docs/how-to-re-stream-video-from-an-ip-camera-rtsp-rtp-re-streaming
If your use case is simple, you will possibly not want a segmented ABR streaming protocol like HLS or MPEG-DASH, so you could just serve the MP4 file from a regular HTTP server.
One way to approach this that will allow you to build on others' examples is to use OpenCV in Python - you can see an example of writing video frames to an AVI or MP4 container in this question and its answers: Writing an mp4 video using python opencv
Once you have created your MP4 file, you can place it in a folder and use a regular HTTP server to make it available for users to download or stream.
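For a quick test, any static HTTP server will do; for example, if Python 3 is installed you can serve the folder containing the MP4 like this (the port number is arbitrary):
python3 -m http.server 8000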
Note that if you want to stream the frames as a live stream, i.e. as you are creating them one by one, then this is trickier, as you won't have a complete MP4 file to stream. If you do want to do this, then leveraging an existing implementation would be a good place to start - this one is an example of a point-to-point, WebSocket-based live stream, and it is open source and Python based:
https://github.com/rena2damas/remote-opencv-streaming-live-video
If you want to stream data over a UDP socket, use the RTP protocol for streaming.
Please go through the specification in RFC 6184 (the RTP payload format for H.264 video).
Media pipeline for processing the camera data:
Camera raw data (RGB/YUV/NV12) -> H.264 encoder -> NALU packets -> RTP packetization -> socket communication.
You can use the ffmpeg Python interface to achieve this goal.
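If you would rather drive it from the command line than through the Python bindings, a minimal sketch (the file name, address and port are placeholders) is:
ffmpeg -re -i input.mp4 -an -c:v libx264 -f rtp rtp://192.168.1.10:5004
ffmpeg handles the H.264 encoding and the RFC 6184 RTP packetization, and prints the SDP description that the receiver needs to the console.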

How do I publish a video compressed with h265?

I set up an nginx server and I can broadcast using RTMP. I do this by compressing the video with H.264, but I also need to compress it with H.265. RTMP and FLV do not support H.265.
So, can you suggest a server and protocol I can use with H.265? Open source if possible.
Edit:
Okay, I should explain the problem a little more. I can send a video to my nginx server with a client and watch this video over HLS and DASH.
But what I need to do is compress this video with H.265 using a converter like ffmpeg and then watch it or send it to another client. I couldn't find anything other than RTMP for sending and receiving the video.
MP4 and MKV both support H.265 video streams.
You can use DASH or HLS to stream your MP4 segments.
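As a rough sketch (the paths and segment length below are just examples), ffmpeg can encode with H.265 and write HLS with fragmented-MP4 segments directly into a folder served by nginx:
ffmpeg -re -i input.mp4 -c:v libx265 -tag:v hvc1 -c:a aac -f hls -hls_time 6 -hls_segment_type fmp4 /var/www/html/live/stream.m3u8
Swapping the output for -f dash /var/www/html/live/stream.mpd produces a DASH presentation instead.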

How to use FFMPEG on wowza to encode Live rtmp streams?

I am looking for a source that explains how to use FFmpeg with Wowza to transcode live RTMP to HTTP. Does anyone know anything about it, or know where to point me to get some info?
Thank you all,
Wowza does not currently support live transcoding of video; however, transcoding is only necessary when you need to create new video data. As long as you are outputting the bitrates/resolutions that you need, Wowza can re-package the stream into most protocols you may need for internet delivery.
Example:
3 H.264 streams in an RTMP container --> WOWZA ---> 3 RTMP streams
---> 3 Apple HLS streams (http)
---> 3 Silverlight smooth streams (http)
Quickstart Guide Here
Wowza V3 (released in October 2011) has transcoding capabilities. It sounds like you're really just trying to re-wrap the H.264 RTMP stream into Apple HLS, which doesn't require any transcoding, just segmenting the stream for HTTP delivery.
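For completeness, feeding Wowza is typically just a matter of publishing an H.264/AAC stream to its RTMP application; a rough sketch (the host name, application and stream name are placeholders) is:
ffmpeg -re -i input.mp4 -c:v libx264 -preset veryfast -c:a aac -f flv "rtmp://your-wowza-host:1935/live/myStream"
Wowza then re-packages that stream into HLS, Smooth Streaming and so on without any further transcoding.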

live streaming in iPhone

Will the ALAC format support live streaming on the iPhone? The ALAC audio recording format is streamed to a server machine, so will I be able to play the audio chunk data? Does the ALAC format support this?
Thank You.
Assuming you mean "Apple Lossless" audio...
I don't see why it wouldn't, but I don't know the details. You'll probably need to embed it in a transport stream instead of an MPEG-4 container (but then, I don't know how HTTP Live Streaming works either).
I don't think streaming lossless audio is sensible, though.
Streaming lossless audio is possible: we have FLAC streaming using Icecast and it works beautifully. However, we are not using HTTP Live Streaming (HLS) to do it. We stream FLAC from the source generator to a number of servers, and they create HLS streams from there.
It is technically possible to mux ALAC into MPEG-TS (ffmpeg can do this) as well as play it back (using ffmpeg), but there isn't a format identifier for other clients. Adding this feature to HLS would be as easy as calling/writing Apple and asking them to add ALAC to this list:
http://www.smpte-ra.org/mpegreg/mpegreg.html
and updating their products accordingly. If you've purchased an Apple product less than 90 days ago, or you have AppleCare, give them a call. They have to work on the issue for you if you are covered. The more requests that get escalated to their engineers, the more likely they are to add support for ALAC in HLS.