Streaming arbitrary data with GStreamer over a network

How can we use GStreamer to stream arbitrary data?
In this very informative talk (https://www.youtube.com/watch?v=ZphadMGufY8) the speaker mentions that GStreamer is media agnostic and describes a use case where GStreamer is used for a non-media application, so this should be possible, but I haven't found anything useful on the internet so far.
The particular use case I am interested in: a high-speed USB Bayer camera is connected to an RPi4, and the RPi4 reads the camera frames and forwards them over the network. As far as I know, GStreamer doesn't support sending Bayer-formatted frames via UDP/RTP, so I need to convert them to something else first, e.g. to RGB using the bayer2rgb element. This conversion, however, consumes part of the RPi4's processing power, so the rate at which the RPi4 can read and send frames from the camera drops significantly.
On top of that, I am using the RPi4 as a data acquisition system for other sensors as well, so it would be great if I could use GStreamer to stream them all.

The sender pipeline is
gst-launch-1.0 videotestsrc ! video/x-bayer,format=bggr,width=1440,height=1080,framerate=10/1 ! rtpgstpay mtu=1500 ! queue ! multiudpsink clients=127.0.0.1:5000
The receiver pipeline is
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)application, clock-rate=(int)90000, encoding-name=(string)X-GST" ! queue ! rtpgstdepay ! bayer2rgb ! videoconvert ! autovideosink
Take care with the MTU: as far as I know, the Pi supports 1500 bytes only, no jumbo frames.
Also, expect missing packets.
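For the sensor part of the question: rtpgstpay accepts arbitrary caps, so non-video data can ride the same mechanism. Below is an untested sketch; `sensor-reader` is a hypothetical process that writes raw samples to stdout, and `application/x-sensor` is a made-up caps name used purely for illustration.

```shell
# Sender: pipe sensor bytes into fdsrc (fd=0 is stdin) and payload them
# with rtpgstpay; the caps string is arbitrary and is carried in the stream.
sensor-reader | gst-launch-1.0 fdsrc fd=0 ! application/x-sensor ! \
    rtpgstpay ! udpsink host=127.0.0.1 port=5001

# Receiver: rtpgstdepay restores the original caps; dump the bytes to stdout.
gst-launch-1.0 udpsrc port=5001 \
    caps="application/x-rtp, media=(string)application, clock-rate=(int)90000, encoding-name=(string)X-GST" ! \
    rtpgstdepay ! fdsink fd=1
```

The receiver side is the same shape as the Bayer receiver above; only the element after rtpgstdepay changes, because the depayloader hands you back whatever caps the sender declared.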

Related

Real Time streaming of USB cam on windows

I am looking for a Real Time pipeline for the following:
(if anyone of you ever streamed a USB camera from one PC to another PC into MATLAB as video/images objects, that is exactly what I'm after... if not:)
I have a USB camera on a Windows PC as the server, and I need to get JPEGs on the client.
I need it to stream ~15 fps; the resolution is not that important atm.
I succeeded in doing the streaming, parsing the JPEGs, as follows (but it was slow as hell, in both latency and frame rate):
Tx (server): gst-launch-1.0 videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host=<HOST> port=1234
Rx (client): gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, payload=26, encoding-name=JPEG" ! rtpjpegdepay ! jpegparse ! multifilesink location="%d.jpeg"
It works at around 0.7 fps over a very fast local network, which is worthless to me.
I've tried "pairing" my camera (ksvideosrc) with some kind of faster protocol (payload? I'm new to this world...) instead of the JPEGs I've used, but I can't find anything that gst will agree on...

How to measure intra GStreamer/gst-launch latency

I'm currently building a GStreamer pipeline on my Raspberry Pi as follows:
v4lsrc - h264enc - mpegtsmux - udpsink
I'm trying to figure out how to measure the time in milliseconds spent in one or several elements, e.g. the time consumed by h264enc and mpegtsmux.
Of course I found the following entry which solves this problem by using a gstreamer plugin:
How can I quantitatively measure gstreamer H264 latency between source and display?
But I'm not really sure how to compile such a plugin, especially on the Raspberry Pi! I've read that every GStreamer element needs to calculate its latency anyway. It would be super cool if anybody could help me with that!
Cheers,
Markus
I don't think there's an out-of-the-box way to do this. In the past we used to wrap a pair of identity elements around the element to be measured, and added it to a bin. Very clunky.
I went ahead and uploaded some code that uses the new GstMeta API and two new elements: markin and markout. So you can use it like so:
export GST_DEBUG=markout:5
gst-launch-1.0 -f -e videotestsrc pattern=ball ! video/x-raw,width=320,height=240 ! markin name=moo ! videoscale ! markout ! video/x-raw,width=1280,height=720 ! ximagesink
https://github.com/mpekar/gstreamer_timestamp_marking
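As an alternative that needs no compiling: recent GStreamer versions ship a built-in latency tracer (available since 1.8, with per-element reporting in newer releases). A minimal sketch, substituting videotestsrc and x264enc for the Pi-specific elements:

```shell
# Enable the latency tracer and capture its output (it goes to stderr).
# flags=element asks for per-element figures in addition to end-to-end.
GST_TRACERS="latency(flags=element)" GST_DEBUG="GST_TRACER:7" \
gst-launch-1.0 videotestsrc num-buffers=100 ! x264enc ! mpegtsmux ! fakesink \
    2> latency-trace.log

# Inspect the per-element latency entries afterwards.
grep latency latency-trace.log | head
```

This reports how long buffers spend between an element's sink and src pads, which is exactly the h264enc/mpegtsmux measurement asked about, without wrapping elements in identity pairs.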

AVB Streaming Stored file from server to client : TimeStamping issue

I am working on an AVB application. We have created GStreamer plugins for the talker side and the listener side, and we use those plugins to transfer stored media.
I am using the pipelines below.
Talker side :
gst-launch-1.0 filesrc location=/home/input.mp4 ! queue ! avbsink interface=eth0 fd=0 (here avbsink is a custom element we created to transfer AVB packets)
Listener side :
gst-launch-1.0 avbsrc interface=eth0 dataSync=1 mediaType=0 fd=0 ! queue ! qtdemux name=mux mux.video_0 ! queue ! avdec_h264 ! autovideosink mux.audio_0 ! queue ! decodebin ! autoaudiosink
(I tried vaapidecode and vaapisink instead of avdec_h264 and autovideosink for hardware acceleration.)
The warning coming up on the listener side is:
"WARNING: from element /GstPipeline:pipeline0/GstVaapisink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2683) : gst_base_sink_is_too_late(): /GstPipeline:pipeline0/GstVaapiSink:vaapisink0;
There may be a timestamping problem, or this computer is too slow. "
I have seen one suggested solution, sync=false; after adding sync=false to vaapisink the warning went away, but the video still does not play smoothly: it continuously stops and starts again.
Is there any solution to play the video continuously? (Only high-quality video (720p or more) fails to play; the application works for low-quality video.)
It looks like the buffer size is not enough, since a frame of HD video has more pixels. The other point I can propose: maybe you can apply some sort of compression prior to sending the frames to the listener, but I am not sure whether compression contradicts any of the AVB protocols.
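If buffering really is the problem, making the listener-side queues unbounded is a cheap thing to try. An untested sketch of the listener pipeline from the question (avbsrc and its properties come from the asker's custom plugin, not a stock element):

```shell
# max-size-* set to 0 removes the queue's default limits, so large HD
# frames are not dropped at the queue; sync=false renders as frames arrive.
gst-launch-1.0 avbsrc interface=eth0 dataSync=1 mediaType=0 fd=0 ! \
    queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! \
    qtdemux name=mux \
    mux.video_0 ! queue ! avdec_h264 ! autovideosink sync=false \
    mux.audio_0 ! queue ! decodebin ! autoaudiosink
```

Unbounded queues trade memory for smoothness, so this only masks the problem if the decoder genuinely cannot keep up with 720p.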

Video streaming over RTP using gstreamer

I am trying to stream a video file using gstreamer from one device to another over RTP. At the sender side I am using the following command :
gst-launch filesrc location=/home/kuber/Desktop/MELT.MPG ! mpegparse ! rtpsend ip=localhost
But this gives the following error: no element "rtpsend". I downloaded all the RTP tools and still get the same error. Am I using rtpsend in some wrong way?
Also, can someone give me the command-line pipeline for streaming a video file (stored locally on my laptop, not videotestsrc) from one device to another?
Assuming this is an MPEG-1/2 elementary stream (because you are using mpegparse) that you want to send out, you need to use rtpmpvpay after your mpegparse and then give the output to udpsink.
mpegparse ! rtpmpvpay ! udpsink host="hostipaddr" port="someport"
I am not aware of any rtpsend plugin as such. The above holds true for any streaming over RTP.
Do a gst-inspect | grep rtp to see all the payloaders and depayloaders.
If it is an MPEG-PS stream you need to first do a mpegpsdemux before the rest of the pipeline.
EDIT:
Why not remove mpegparse? I don't see why you need it. You should learn to look at the source and sink pad requirements in the gst-inspect output of each component; that will tell you the compatibility needed between nodes. Receiving will be the reverse: udpsrc port="portno" ! capsfilter caps="application/x-rtp, pt=32, ..enter caps here" ! rtpmpvdepay !
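For completeness, a full receive pipeline along those lines might look like the sketch below (untested; the caps values are the usual rtpmpvpay defaults, and avdec_mpeg2video assumes the gst-libav plugins are installed):

```shell
# Receive the MPEG video elementary stream payloaded by rtpmpvpay,
# depayload, parse, decode, and display it.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=MPV, payload=32" ! \
    rtpmpvdepay ! mpegvideoparse ! avdec_mpeg2video ! videoconvert ! autovideosink
```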

Correct gstreamer pipeline for particular rtsp stream

I'm trying to convert this RTSP URL to something else (anything!) using this gst pipeline:
gst-launch rtspsrc location=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov ! rtpmp4vdepay ! filesink location=somebytes.bin
This gives the following error:
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2791): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-linked (-1)
So I guess it's something about connecting the RTSP source to the depayloader. If I change the pipeline to use rtpmp4gdepay rather than rtpmp4vdepay, it works and produces something, but I'm not sure what the output format is.
Does anyone know what pipeline I should be using to get at the video from this URL? I'm assuming it's mp4/h264/aac, but maybe it's not.
Try this first:
gst-launch-0.10 -v playbin2 uri=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov
or
gst-launch-1.0 -v playbin uri=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov
A .mov file is not directly streamable, so your RTSP source is probably sending you two elementary streams. (I am guessing this is Darwin or some other similar server.) So you may have to set up two outputs from rtspsrc, one for audio and one for video.
rtpmp4vdepay is for elementary MPEG-4 video streams. Does your source file use the MPEG-4 video codec? If it is H.264, replace it with rtph264depay. You can pass the output to a decoder and play it if you want; feed it to decodebin. To dump it as H.264 you will first have to parse it and add NAL headers (h264parse, perhaps?).
rtpmp4gdepay is most probably accepted for the audio stream.
I am guessing your file is H.264/AAC, which is why rtpmp4vdepay won't work and rtpmp4gdepay will. But you are not doing anything with the video when you set up rtpmp4gdepay, so you need to handle that as well.
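Assuming the stream really is H.264 video plus AAC audio, a receive pipeline with both branches might look like this untested sketch (the codecs are guesses until you confirm them with the -v output of playbin; rtspsrc's request pads are linked lazily by caps, which is why naming the element and referencing it twice works in gst-launch):

```shell
# Name rtspsrc "src" and attach one branch per elementary stream; each
# branch only links once a pad with matching caps appears.
gst-launch-1.0 rtspsrc location=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov name=src \
    src. ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink \
    src. ! rtpmp4gdepay ! aacparse ! avdec_aac ! audioconvert ! autoaudiosink
```

To dump to disk instead of playing, replace the decoder/sink tail of a branch with a muxer and filesink (e.g. h264parse ! mp4mux ! filesink location=out.mp4 for the video branch).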