How to measure intra GStreamer/gst-launch latency - raspberry-pi

I'm currently building a GStreamer pipeline on my Raspberry Pi as follows:
v4lsrc - h264enc - mpegtsmux - udpsink
I'm trying to figure out how to measure the time in milliseconds spent in one or several elements, e.g. the time consumed by h264enc and mpegtsmux.
I did find the following entry, which solves this problem by using a GStreamer plugin:
How can I quantitatively measure gstreamer H264 latency between source and display?
But I'm not really sure how to compile such a plugin, especially on the Raspberry Pi! I've read that every GStreamer element has to calculate its latency anyway. It would be super cool if anybody could help me with that!
Cheers,
Markus

I don't think there's an out-of-the-box way to do this. In the past we used to wrap a pair of identity elements around the element to be measured and add them to a bin. Very clunky.
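If you just want a quick-and-dirty version of that identity trick from the command line, two identity elements with silent=false will print a last-message line for every buffer that passes them, and matching up the same PTS at the in and out markers gives a rough per-element delay. A minimal sketch (the element names and num-buffers value are arbitrary):
gst-launch-1.0 -v videotestsrc num-buffers=100 ! identity name=enc-in silent=false ! x264enc ! identity name=enc-out silent=false ! fakesink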
I went ahead and uploaded some code that uses the new GstMeta API and two new elements: markin and markout. You can use it like so:
export GST_DEBUG=markout:5
gst-launch-1.0 -f -e videotestsrc pattern=ball ! video/x-raw,width=320,height=240 ! markin name=moo ! videoscale ! markout ! video/x-raw,width=1280,height=720 ! ximagesink
https://github.com/mpekar/gstreamer_timestamp_marking
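For the record, newer GStreamer versions (1.8+) also ship a built-in latency tracer, so on a current Raspberry Pi image you can measure this without compiling anything. A sketch, assuming a recent enough GStreamer (the per-element flags syntax needs 1.18+, if I remember correctly; plain GST_TRACERS=latency works on older 1.x too, but only reports end-to-end latency):
GST_TRACERS="latency(flags=element)" GST_DEBUG=GST_TRACER:7 gst-launch-1.0 videotestsrc ! x264enc ! mpegtsmux ! fakesink
Each TRACE line then reports how long a buffer spent inside a given element.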

Related

Streaming arbitrary data with Gstreamer over network

How can we use GStreamer to stream arbitrary data?
In this very informative talk (https://www.youtube.com/watch?v=ZphadMGufY8) the lecturer mentions that GStreamer is media-agnostic and describes a use case where GStreamer is used for a non-media application, so this should be possible, but I haven't found anything useful on the internet so far.
The particular use case I am interested in: a high-speed USB Bayer camera is connected to an RPi4, and the RPi4 reads the camera frames and forwards them over the network. Now, GStreamer doesn't support (as far as I know) sending Bayer-formatted frames via UDP/RTP, so I need to convert them to something else, i.e. to RGB using the bayer2rgb element. This conversion, however, consumes part of the RPi4's processing power, so the rate at which the RPi4 can read and send frames from the camera drops significantly.
On top of that, I am using the RPi4 as a data acquisition system for other sensors as well, so it would be great if I could use GStreamer to stream them all.
The sender pipeline is
gst-launch-1.0 videotestsrc ! video/x-bayer,format=bggr,width=1440,height=1080,framerate=10/1 ! rtpgstpay mtu=1500 ! queue ! multiudpsink clients=127.0.0.1:5000
The receiver pipeline is
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)application, clock-rate=(int)90000, encoding-name=(string)X-GST" ! queue ! rtpgstdepay ! bayer2rgb ! videoconvert ! autovideosink
Take care with the MTU; as far as I know, the Pi supports 1500 bytes only, no jumbo frames.
Also expect missing packets.
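To soften the missing-packet problem, one option is to put an rtpjitterbuffer in front of the depayloader and enlarge the kernel receive buffer on udpsrc. A sketch only; the 200 ms latency and 2 MB buffer-size are arbitrary starting points to tune:
gst-launch-1.0 -v udpsrc port=5000 buffer-size=2097152 caps="application/x-rtp, media=(string)application, clock-rate=(int)90000, encoding-name=(string)X-GST" ! rtpjitterbuffer latency=200 ! queue ! rtpgstdepay ! bayer2rgb ! videoconvert ! autovideosink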

Real Time streaming of USB cam on windows

I am looking for a Real Time pipeline for the following:
(if any of you have ever streamed a USB camera from one PC to another PC into MATLAB as video/image objects, that is exactly what I'm after... if not:)
I have a USB camera on a windows PC as server, I need to get JPEGs on the client.
I need it to stream at ~15 fps; the resolution is not that important atm.
I succeeded in getting the streaming working, parsing the JPEGs (but it was slow as hell, in both latency and frame rate), as follows:
Tx (server): gst-launch-1.0 videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host=<HOST> port=1234
Rx (client): gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, payload=26, encoding-name=JPEG" ! rtpjpegdepay ! jpegparse ! multifilesink location="%d.jpeg"
It works at around 0.7 fps over a very fast local network, which is worthless to me.
I've tried "pairing" my camera (ksvideosrc) to some kind of faster protocol (payload? I'm new to this world...), instead of the JPEGs I've used, but I can't find anything that gst would agree on...

AVB Streaming Stored file from sever to client : TimeStamping issue

I am working on an AVB application. We have created GStreamer plugins for the talker side and the listener side, and we use those plugins to transfer stored media.
I am using the pipelines below.
Talker side :
gst-launch-1.0 filesrc location=/home/input.mp4 ! queue ! avbsink interface=eth0 fd=0 (here avbsink is the element we created to transfer AVB packets)
Listener side :
gst-launch-1.0 avbsrc interface=eth0 dataSync=1 mediaType=0 fd=0 ! queue ! qtdemux name=mux mux.video_0 ! queue ! avdec_h264 ! autovideosink mux.audio_0 ! queue ! decodebin ! autoaudiosink
(I tried vaapidecode and vaapisink instead of avdec_h264 and autovideosink for hardware acceleration.)
The error coming up on the listener side is:
"WARNING: from element /GstPipeline:pipeline0/GstVaapisink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2683) : gst_base_sink_is_too_late(): /GstPipeline:pipeline0/GstVaapiSink:vaapisink0;
There may be a timestamping problem, or this computer is too slow. "
I have seen the suggestion to use sync=false; after adding sync=false to vaapisink the error message went away, but the video still does not play smoothly. It continuously stops and then starts again.
Is there any solution to play the video continuously? (Only high-quality video (720p or more) fails to play; the application works for low-quality video.)
It looks like the buffer size is not sufficient, since a frame of HD video has more pixels. The other point I can propose: maybe you can apply some sort of compression prior to sending the frames to the listener, but I am not sure whether compression contradicts any of the AVB protocols.
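If undersized buffering is indeed the problem, one cheap experiment is to unbound the listener-side queues so a burst of large HD frames cannot stall the pipeline. A sketch against the custom avbsrc element from the question (max-size values of 0 mean unlimited):
gst-launch-1.0 avbsrc interface=eth0 dataSync=1 mediaType=0 fd=0 ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! qtdemux name=mux mux.video_0 ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! avdec_h264 ! autovideosink mux.audio_0 ! queue ! decodebin ! autoaudiosink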

Video streaming over RTP using gstreamer

I am trying to stream a video file using gstreamer from one device to another over RTP. At the sender side I am using the following command :
gst-launch filesrc location=/home/kuber/Desktop/MELT.MPG ! mpegparse ! rtpsend ip=localhost
But this gives the following error: no element "rtpsend". I downloaded all the RTP tools and still get the same error. Am I using rtpsend in some wrong way?
Also, can someone give me the command-line code for streaming a video file (stored locally on my laptop, not the videotestsrc stream) from one device to another?
Assuming this is an MPEG-1/2 elementary stream that you want to send out (because you are using mpegparse), you need to use rtpmpvpay after mpegparse and then give its output to udpsink:
mpegparse ! rtpmpvpay ! udpsink host="hostipaddr" port="someport"
I am not aware of any rtpsend plugin. The above holds true for any streaming over RTP.
Run gst-inspect | grep rtp to see all the payloaders and depayloaders.
If it is an MPEG-PS stream, you first need a mpegpsdemux before the rest of the pipeline.
EDIT:
Why not remove mpegparse? I don't see why you need it. You should learn to look at a component's source and sink pad requirements in gst-inspect; that will tell you the compatibility needed between nodes. Receiving is the reverse: udpsrc port="portno" ! capsfilter caps="application/x-rtp, pt=32, ..enter caps here" ! rtpmpvdepay ! ...
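Putting both halves together, a hedged sender/receiver sketch for an MPEG-PS file (the receiver IP and port are placeholders, payload type 32 is the static RTP payload type for MPEG video, and mpeg2dec stands in for whatever MPEG-2 decoder you have installed):
gst-launch filesrc location=/home/kuber/Desktop/MELT.MPG ! mpegpsdemux ! rtpmpvpay ! udpsink host=<RECEIVER_IP> port=5004
gst-launch udpsrc port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MPV, payload=(int)32" ! rtpmpvdepay ! mpeg2dec ! autovideosink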

gstreamer pipeline that was working now requiring a bunch of queue components, why?

I have a C program that records video and audio from a v4l2 source into FLV format. I noticed that the program did not work on newer versions of Ubuntu. I decided to run the problematic pipeline in gst-launch and find the simplest pipeline that would reproduce the problem. Focusing just on the video side, I have reduced it to what you see below.
So I have a gstreamer pipeline that was working:
gst-launch v4l2src ! tee name="vtee" ! queue ! videorate ! ffmpegcolorspace ! ffdeinterlace ! x264enc ! flvmux name="mux" ! filesink location=vid.flv vtee. ! queue ! xvimagesink
Now it will only work if I add a bunch of queues one after another before the xvimagesink:
gst-launch v4l2src ! tee name="vtee" ! queue ! videorate ! ffmpegcolorspace ! ffdeinterlace ! x264enc ! flvmux name="mux" ! filesink location=vid.flv vtee. ! queue ! queue ! queue ! queue ! queue ! xvimagesink
Although this second pipeline works, there is a ~2 second pause before it starts running, and I get the message below (I don't think this system is too slow; it's a Core i7 with tons of RAM):
Additional debug info:
gstbasesink.c(2692): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
There may be a timestamping problem, or this computer is too slow.
Can any one explain what is happening here? What am I doing wrong?
You claim that the first pipeline stopped working but you don't explain what happened. Things stop working because something else changed:
- version of GStreamer and submodules?
- version of OS?
- version of camera?
It shouldn't be necessary to add a bunch of queues in a row. In practice each queue creates a thread boundary, separating the parts before and after it into different threads, and the chain of queues adds the delay you see, which affects latency and sync.
An old message, but the problem is still not fixed. It broke somewhere between Ubuntu 9.10 and 11.10 (I upgraded a few times before noticing). I got around it by avoiding x264enc and using ffenc_mpeg4 instead.
I just noticed this note from the GStreamer Cheat Sheet:
Note: We can replace theoraenc+oggmux with x264enc+someothermuxer, but then the pipeline will freeze unless we make the queue element in front of the xvimagesink leaky, i.e. "queue leaky=1".
That doesn't work for me, so I'll stick with ffenc_mpeg4.
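For what it's worth, the usual explanation for x264enc stalling a tee'd live pipeline is its encoder lookahead: it buffers dozens of frames before emitting the first one, so the queue on the display branch fills up and both branches stall. Instead of stacking queues or making them leaky, a sketch worth trying (assuming your x264enc build exposes the tune property) is zero-latency mode plus a non-syncing preview sink:
gst-launch v4l2src ! tee name="vtee" ! queue ! videorate ! ffmpegcolorspace ! ffdeinterlace ! x264enc tune=zerolatency ! flvmux name="mux" ! filesink location=vid.flv vtee. ! queue ! xvimagesink sync=false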