Raspberry Pi not able to receive and view stream from another Pi using GStreamer

I've got a two-Pi setup: one 3B and one 4. The 4 is set up with the 7" Raspberry Pi screen; the 3B is set up with two cameras. I'll call them "sender" (3B) and "receiver" (4).
I'm trying to use GStreamer to send low-latency video from the sender to the receiver. I found and used this command on the sender:
gst-launch-1.0 v4l2src device=/dev/video0 ! \
'video/x-raw, width=352, height=288, framerate=25/1' ! \
videoconvert ! \
x264enc pass=qual quantizer=20 tune=zerolatency ! \
rtph264pay ! \
multiudpsink clients="10.0.0.200:5600,10.0.0.178:5600"
And using this code on the receiver:
export DISPLAY=:0
gst-launch-1.0 -v udpsrc port=5600 ! application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
First I tested the receiver command on my Windows PC, and it works flawlessly: sub-100 ms delay and everything.
Now I test this on the receiver Pi 4 and I get this:
pi#receiver4:~ $ gst-launch-1.0 -v udpsrc port=5600 ! application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
It stops here; then I start the sender and the following happens:
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)017a000dffe1001b677a000dbcb202c12d80a506060640000003004000000ca3c50a9201000668ebc1b2c8b0, level=(string)1.3, profile=(string)high-4:2:2
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)017a000dffe1001b677a000dbcb202c12d80a506060640000003004000000ca3c50a9201000668ebc1b2c8b0, level=(string)1.3, profile=(string)high-4:2:2
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)Y42B, width=(int)352, height=(int)288, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt601, framerate=(fraction)25/1
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:03.494063865
Setting pipeline to NULL ...
Freeing pipeline ...
So why is this not streaming on the Raspberry Pi screen as it does on my Windows screen? I tested with videotestsrc and I get a window with the expected output:
export DISPLAY=:0
gst-launch-1.0 videotestsrc ! autovideosink

You may try this as sender:
gst-launch-1.0 videotestsrc do-timestamp=1 is-live=1 ! video/x-raw, width=352, height=288, framerate=25/1 ! videoconvert ! x264enc pass=qual quantizer=20 tune=zerolatency insert-vui=1 key-int-max=16 ! h264parse ! rtph264pay ! udpsink host=<target_host_IP_v4_or_v6> port=5004 -ev
and this as receiver:
gst-launch-1.0 udpsrc address=<same_IP_v4_or_v6_address> port=5004 ! application/x-rtp,media=video,encoding-name=H264,clock-rate=90000 ! rtpjitterbuffer latency=300 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink -v
In case the network between the two Pis is WiFi, you may also try disabling the udpsink auto-multicast property.
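The verbose caps in the log already hint at the root cause: the stream negotiated profile=high-4:2:2, so avdec_h264 outputs Y42B frames, which the Pi's default video sink typically cannot accept directly. That is consistent with the not-negotiated error, and it is why the suggested receiver pipeline ends with videoconvert before autovideosink. As a quick check (a sketch, assuming the codec_data blob is a standard avcC configuration record, which stream-format=avc implies), the profile and level bytes can be read straight out of the logged hex:

```shell
# codec_data from the receiver's verbose caps output (an avcC record
# per ISO/IEC 14496-15, since the caps say stream-format=avc)
codec_data="017a000dffe1001b677a000dbcb202c12d80a506060640000003004000000ca3c50a9201000668ebc1b2c8b0"

profile_idc=$((16#${codec_data:2:2}))  # byte 1: AVCProfileIndication
level_idc=$((16#${codec_data:6:2}))    # byte 3: AVCLevelIndication

echo "profile_idc=$profile_idc level_idc=$level_idc"
# prints: profile_idc=122 level_idc=13
# 122 = High 4:2:2 profile, 13 = level 1.3 -- matching the caps in the log
```

On the Windows PC the decoded Y42B happens to be handled, but on the Pi the format never negotiates with the sink, so the pipeline dies as soon as real buffers arrive.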

Related

Use GStreamer to pack existing h264 stream and send it over network to VLC

I'm trying to stream from the Raspberry Pi camera over the network using raspivid and the GStreamer CLI. I want to be able to view the stream using VLC's "open network stream" on the client.
This is related to the question GStreamer rtp stream to vlc, but mine is not quite the same. Instead of encoding the raw output from my Pi camera, my idea is to leverage the existing H.264 output of raspivid, mux it into an appropriate container, and send it over TCP or UDP.
I was able to successfully capture the h264 output from raspivid into an mp4 file (with correct fps and length information) using this pipeline:
raspivid -n -w 1280 -h 720 -fps 24 -b 4500000 -a 12 -t 30000 -o - | \
gst-launch-1.0 -v fdsrc ! video/x-h264, width=1280, height=720, framerate=24/1 ! \
h264parse ! mp4mux ! filesink location="videofile.mp4"
However, when I try to stream this over a network:
raspivid -n -w 1280 -h 720 -fps 24 -b 4500000 -a 12 -t 0 -o - | \
gst-launch-1.0 -v fdsrc ! video/x-h264, width=1280, height=720, framerate=24/1 ! \
h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=192.168.1.20 port=5000
...and try to open the stream using rtp://192.168.1.20:5000 in VLC, it reports an error.
Edit: OK, I was mistaken to assume that udpsink listens for incoming connections. However, after changing the last part of the pipeline to use my client's IP address (! udpsink host=192.168.1.77 port=5000) and trying to open that with udp://@:5000 in VLC, the player does not display anything (both the Pi and the receiving computer are on the same LAN, and I can see the incoming network traffic on the client).
Does anyone know how to properly construct a gstreamer pipeline to transmit existing h264 stream over a network which can be played by vanilla VLC on the client?
This is likely due to missing SPS/PPS data. For example, it probably works if you start VLC first and then start the video pipeline on the Raspberry Pi. By default, the SPS/PPS headers are most likely sent only once, at the beginning of the stream.
If the receiver misses the SPS/PPS headers, it will not be able to decode the H.264 stream. This can be fixed with the config-interval=-1 property of h264parse.
With that option, SPS/PPS data should be sent before each IDR frame, which should occur every couple of seconds, depending on the encoder.
Another thing is that you don't need the rtpmp2tpay element. Just sending MPEG-TS over UDP directly should be enough.
Having said that, the pipeline should look like this:
raspivid -n -w 1280 -h 720 -fps 24 -b 4500000 -a 12 -t 0 -o - | \
gst-launch-1.0 -v fdsrc ! \
video/x-h264, width=1280, height=720, framerate=24/1 ! \
h264parse config-interval=-1 ! mpegtsmux ! udpsink host=192.168.1.77 port=5000
192.168.1.77 is the IP address of the client running VLC with udp://@:5000. Also, make sure no firewalls are blocking the incoming UDP traffic towards the client (the Windows firewall, in particular).
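Before involving VLC at all, the MPEG-TS path can be sanity-checked on a single machine with a loopback pair (a sketch, assuming GStreamer 1.x with videotestsrc, x264enc, and tsdemux installed; no camera is needed):

```shell
# Terminal 1 -- sender: synthetic video -> H.264 -> MPEG-TS -> UDP on localhost
gst-launch-1.0 -v videotestsrc is-live=true ! \
  video/x-raw,width=640,height=360,framerate=24/1 ! \
  x264enc tune=zerolatency ! h264parse config-interval=-1 ! \
  mpegtsmux ! udpsink host=127.0.0.1 port=5000

# Terminal 2 -- receiver: demux and decode the same TS, as VLC would
gst-launch-1.0 udpsrc port=5000 \
  caps="video/mpegts,systemstream=true,packetsize=188" ! \
  tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```

If the loopback plays but VLC still shows nothing, the problem is on the VLC/firewall side rather than in the mux.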

GStreamer streaming has to start client first

I am able to stream H.264 or MPEG-4 video, but the client (player) has to be started first, and only then can streaming begin (start the server). What I need is to be able to connect to an already-playing stream (start the client after the server).
Both server and client run on the same i.MX6 device.
Server example:
gst-launch-0.10 -vvv
filesrc location=bruce.mp4 typefind=true \
! qtdemux name=demux \
demux.video_00 \
! queue ! h264parse split-packetized=true \
! mpegtsmux ! rtpmp2tpay ! gdppay \
! udpsink host=239.255.1.1 port=5004
Client example:
gst-launch-0.10 -vvv
udpsrc port=5004 multicast-group=239.255.1.1 caps="application/x-gdp" \
! gdpdepay ! gstrtpjitterbuffer \
! rtpmp2tdepay ! mpegtsdemux name=demux \
! queue max-size-buffers=0 max-size-time=0 \
! vpudec low-latency=true framedrop=true \
! mfw_isink sync=false

Gstreamer how to record, take screenshot of a stream

I am streaming video from my Raspberry Pi using GStreamer and H.264 as below:
raspivid -t 999999 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=192.168.2.7 port=5000
From my desktop I am able to play the stream as below:
gst-launch-1.0 -v tcpclientsrc host=192.168.2.7 port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! gdkpixbufoverlay location=gstreamer-logo.svg offset-x=20 offset-y=20 ! autovideosink sync=false
I am looking for the following; any help is appreciated. Thanks.
Capture the screenshot of the video while streaming
Record the stream to a local file
I was able to add a logo overlay using gdkpixbufoverlay, and I want to add the time as well. I tried clockoverlay, but the stream got stuck.
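These requirements aren't answered in the thread, but one common pattern (a sketch, not from the original post; capture.mp4 is a placeholder filename) is to tee the parsed H.264 on the client, sending one branch to the screen and the other, still compressed, into a muxer for recording:

```shell
# -e makes gst-launch send EOS on Ctrl-C so mp4mux can finalize the file.
# The recording branch takes the H.264 before decode, so nothing is re-encoded.
gst-launch-1.0 -e tcpclientsrc host=192.168.2.7 port=5000 ! gdpdepay ! \
  rtph264depay ! h264parse ! tee name=t \
  t. ! queue ! avdec_h264 ! videoconvert ! autovideosink sync=false \
  t. ! queue ! mp4mux ! filesink location=capture.mp4
```

For a one-off screenshot, a similar decoded branch ending in something like jpegenc ! filesink, limited to a single buffer, is a common approach.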

no element "rtpmp2tdepay" . How can i install it

I am new to GStreamer. I am trying to run the following command on my Raspberry Pi:
gst-launch-0.10 -v udpsrc port=1234 caps='application/x-rtp,payload=(int)96,encoding-name=(string)H264' ! queue ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=True
and I am getting the following error:
WARNING: erroneous pipeline: no element "rtpmp2tdepay"
Any idea how I can solve it? How can I install the element "rtpmp2tdepay"?
Thanks again for your response.
rtpmp2tdepay is included in the GStreamer "good" plugins set.
On the command line, run:
sudo apt-get install gstreamer0.10-plugins-good
(The 0.10 package matches the gst-launch-0.10 pipeline above; for GStreamer 1.x pipelines the package is gstreamer1.0-plugins-good instead.)
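To confirm the element is actually available after installing, gst-inspect can be queried directly (a quick check; use the tool version matching your pipeline):

```shell
# Prints the element's pads, caps, and properties if the plugin is
# installed; fails with "No such element or plugin" otherwise.
gst-inspect-0.10 rtpmp2tdepay

# On GStreamer 1.x systems the equivalent is:
gst-inspect-1.0 rtpmp2tdepay
```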

gstreamer flvmux and rtmp error

I'm trying to stream RTMP from a Raspberry Pi (the omx hardware encoder works really nicely, by the way), so I'm running:
gst-launch-1.0 v4l2src ! "video/x-raw,width=640,height=480,framerate=30/1" ! \
omxh264enc target-bitrate=1000000 control-rate=variable ! \
video/x-h264,profile=high ! h264parse ! queue ! \
flvmux name=mux alsasrc device=plughw:1 ! audioresample ! \
audio/x-raw,rate=48000,channels=1 ! queue ! voaacenc bitrate=32000 ! queue ! mux. mux. ! \
rtmpsink location='rtmp://my_rtmp_for_ustream.tv_url'
And there is an error:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstAudioSrcClock
ERROR: from element /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2812): gst_base_src_loop (): /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 535913298 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
libv4l2: warning v4l2 mmap buffers still mapped on close()
Freeing pipeline ...
At first I thought there was some ALSA problem or similar, but then I tried writing a simple MPEG-TS file and it worked:
gst-launch-1.0 v4l2src ! "video/x-raw,width=640,height=480,framerate=30/1" ! \
omxh264enc target-bitrate=1000000 control-rate=variable ! \
video/x-h264,profile=high ! h264parse ! queue ! \
mpegtsmux name=mux alsasrc device=plughw:1 ! audioresample ! \
audio/x-raw,rate=48000,channels=1 ! queue ! voaacenc bitrate=32000 ! \
queue ! mux. mux. ! filesink location=1.ts
But I can't just change filesink location=1.ts to rtmpsink location='rtmp://my_rtmp_for_ustream.tv_url', because I'll get an error:
WARNING: erroneous pipeline: could not link mux to rtmpsink0
So, what can I do to get it work? Thanks.
rtmpsink requires data in "video/x-flv" format. Your first pipeline is clearly showing a negotiation error. Can you post the caps negotiation dump of your pipeline by adding -v on the command line?
The real problem was that rtmpsink needs raw AAC, so I added aacparse and it worked out, something like this:
gst-launch-1.0 v4l2src ! \
"video/x-raw, framerate=25/1, width=320, height=240" ! \
omxh264enc target-bitrate=300000 control-rate=variable ! \
h264parse ! queue ! flvmux name=muxer alsasrc device=hw:1 ! \
audioresample ! "audio/x-raw,rate=48000" ! queue ! \
voaacenc bitrate=32000 ! aacparse ! queue ! muxer. muxer. ! \
rtmpsink location="$RTMP_URL"
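The aacparse fix can also be verified without a live RTMP endpoint by muxing the audio branch alone into a local FLV file (a sketch using audiotestsrc so no ALSA hardware is needed; audio-test.flv is a placeholder):

```shell
# If flvmux accepts this audio branch and writes a playable FLV,
# the same branch should negotiate with rtmpsink as well.
gst-launch-1.0 -e audiotestsrc num-buffers=300 ! audioconvert ! audioresample ! \
  "audio/x-raw,rate=48000,channels=1" ! voaacenc bitrate=32000 ! aacparse ! \
  flvmux ! filesink location=audio-test.flv
```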