Real-time streaming of a USB cam on Windows - MATLAB

I am looking for a Real Time pipeline for the following:
(If any of you have ever streamed a USB camera from one PC to another PC into MATLAB as video/image objects, that is exactly what I'm after... if not:)
I have a USB camera on a Windows PC acting as the server, and I need to get JPEGs on the client.
I need it to stream ~15 fps; the resolution is not that important at the moment.
I managed to get the streaming working, parsing out the JPEGs, but it was slow as hell in both latency and frame rate:
Tx (server): gst-launch-1.0 videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host=<HOST> port=1234
Rx (client): gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, payload=26, encoding-name=JPEG" ! rtpjpegdepay ! jpegparse ! multifilesink location="%d.jpeg"
It works at around 0.7 fps over a very fast local network, which is worthless to me.
I've tried "pairing" my camera (ksvideosrc) to some kind of faster protocol (payload? I'm new to this world...), instead of the JPEGs I've used, but I can't find anything that gst would agree on...

Related

Broadcast visualization along with live audio: lower CPU usage?

I need to broadcast a live audio signal together with its wavescope visualization and a background image. I actually built a working gst-launch-1.0 command; it's just that it takes up a lot of CPU and I wish to lower the resource demands. I'm testing my signal with nginx's rtmp module on localhost, and playing back with ffplay.
I'm aware that this question comes quite close, but I believe the problem at hand is more concrete: I'm looking for a way to send fewer wavescope frames, in the expectation that it will require fewer CPU cycles.
My inputs are a sample PNG image and an alsasrc that is a microphone.
gst-launch-1.0 compositor name=comp sink_1::alpha=0.5 ! videoconvert ! \
x264enc quantizer=35 tune=zerolatency ! h264parse ! 'video/x-h264,width=854,height=480' ! queue ! \
flvmux name=muxer streamable=true ! rtmpsink location='rtmp://localhost/streamer/test' \
filesrc location='sample.png' ! pngdec ! videoconvert ! imagefreeze is-live=true ! queue ! comp.sink_0 \
alsasrc ! audioconvert ! tee name=audioTee \
audioTee. ! audioresample ! muxer. \
audioTee. ! wavescope ! 'video/x-raw,width=854,height=480' ! queue2 ! comp.sink_1
I tried quantizer values up to 50, and adding a framerate=(fraction)5/1 to the caps after h264parse. The former didn't make a difference, and the latter brought back clock issues, with the muxer reporting "unable to configure latency". I assumed that if h264parse asked for fewer frames, then wavescope would render them on demand; now I'm not so sure. I also tried specifying a framerate on wavescope's output caps, and the issue is the same. I'm afraid it's rather the alsasrc that is dictating the frame rate to everybody.
I'm sorry my example command still has an RTMP sink; I wasn't able to reproduce this with a playbin. You'll also obviously need a valid PNG image as ./sample.png.
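One thing that might be worth trying (a sketch only, not verified; the 5/1 target rate and the drop-only flag are assumptions): let wavescope render at whatever rate it negotiates, but drop frames with videorate before the compositor, so the compositor and encoder see fewer video frames:
audioTee. ! wavescope ! 'video/x-raw,width=854,height=480' ! videorate drop-only=true ! 'video/x-raw,framerate=5/1' ! queue2 ! comp.sink_1
If the compositor then complains about mismatched rates, the imagefreeze branch may need a matching framerate cap as well.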

Streaming arbitrary data with GStreamer over the network

How can we use GStreamer to stream arbitrary data?
In this very informative talk (https://www.youtube.com/watch?v=ZphadMGufY8) the lecturer mentions that GStreamer is media-agnostic and describes a use case where GStreamer is used for a non-media application, so this should be possible, but I haven't found anything useful on the internet so far.
The particular use case I am interested in: a high-speed USB Bayer camera is connected to an RPi4, and the RPi4 reads camera frames and forwards them over the network. Now, GStreamer doesn't support (as far as I know) sending Bayer-formatted frames via udp/rtp, so I need to convert them to something else, e.g. to RGB using the bayer2rgb element. This conversion, however, consumes part of the RPi4's processing power, so the rate at which the RPi4 can read and send frames from the camera drops significantly.
On top of that, I am using the RPi4 as a data acquisition system for other sensors as well, so it would be great if I could use GStreamer to stream them all.
The sender pipeline is
gst-launch-1.0 videotestsrc ! video/x-bayer,format=bggr,width=1440,height=1080,framerate=10/1 ! rtpgstpay mtu=1500 ! queue ! multiudpsink clients=127.0.0.1:5000
The receiver pipeline is
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)application, clock-rate=(int)90000, encoding-name=(string)X-GST" ! queue ! rtpgstdepay ! bayer2rgb ! videoconvert ! autovideosink
Take care with the MTU; as far as I know, the Pi supports 1500 bytes only, no jumbo frames.
Also expect missing packets.
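If lost or reordered packets become an issue, one tweak worth trying (a sketch, not benchmarked on the Pi; the 50 ms latency value is an assumption) is an rtpjitterbuffer in front of the depayloader on the receiver:
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)application, clock-rate=(int)90000, encoding-name=(string)X-GST" ! rtpjitterbuffer latency=50 ! queue ! rtpgstdepay ! bayer2rgb ! videoconvert ! autovideosink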

How to measure intra GStreamer/gst-launch latency

I'm currently building a GStreamer pipeline on my Raspberry Pi as follows:
v4lsrc - h264enc - mpegtsmux - udpsink
I'm trying to figure out how to measure the time in milliseconds spent in one or several elements, e.g. the time consumed by h264enc and mpegtsmux.
Of course I found the following entry, which solves this problem with a GStreamer plugin:
How can I quantitatively measure gstreamer H264 latency between source and display?
But I'm not really sure how to compile such a plugin, especially on the Raspberry Pi! I've read that every GStreamer element needs to calculate its latency anyway. It would be super cool if anybody could help me with that!
Cheers,
Markus
I don't think there's an out-of-the-box way to do this. In the past we used to wrap a pair of identity elements around the element to be measured and add them to a bin. Very clunky.
I went ahead and uploaded some code that uses the new GstMeta API and two new elements, markin and markout. You can use it like so:
export GST_DEBUG=markout:5
gst-launch-1.0 -f -e videotestsrc pattern=ball ! video/x-raw,width=320,height=240 ! markin name=moo ! videoscale ! markout ! video/x-raw,width=1280,height=720 ! ximagesink
https://github.com/mpekar/gstreamer_timestamp_marking
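As an aside, newer GStreamer releases (1.8+) ship a built-in latency tracer, which may save you from compiling anything on the Pi; a minimal sketch (the test pipeline below is just an illustration):
export GST_TRACERS=latency
export GST_DEBUG="GST_TRACER:7"
gst-launch-1.0 videotestsrc num-buffers=300 ! x264enc tune=zerolatency ! mpegtsmux ! fakesink
The tracer prints source-to-sink latency measurements to the debug log; on sufficiently recent versions (1.18+), GST_TRACERS="latency(flags=element)" additionally reports per-element numbers.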

AVB streaming of a stored file from server to client: timestamping issue

I am working on an AVB application. We have created GStreamer plugins for the talker side and the listener side, and we use those plugins to transfer stored media.
I am using the pipelines below.
Talker side :
gst-launch-1.0 filesrc location=/home/input.mp4 ! queue ! avbsink interface=eth0 fd=0 (here avbsink is the custom element we created to transmit AVB packets)
Listener side :
gst-launch-1.0 avbsrc interface=eth0 dataSync=1 mediaType=0 fd=0 ! queue ! qtdemux name=mux mux.video_0 ! queue ! avdec_h264 ! autovideosink mux.audio_0 ! queue ! decodebin ! autoaudiosink
(I tried vaapidecode and vaapisink instead of avdec_h264 and autovideosink for hardware acceleration.)
The error coming up on the listener side is:
"WARNING: from element /GstPipeline:pipeline0/GstVaapisink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2683) : gst_base_sink_is_too_late(): /GstPipeline:pipeline0/GstVaapiSink:vaapisink0;
There may be a timestamping problem, or this computer is too slow. "
I have seen one suggested solution, sync=false, so I added sync=false to vaapisink and the error message went away, but the video still does not play smoothly; it continuously stops and starts again.
Is there any solution to play the video continuously? (Only high-quality video, 720p or more, fails to play; the application works for low-quality video.)
It looks like the buffer size is not big enough, since a frame of HD video has more pixels. The other thing I can suggest is to apply some sort of compression prior to sending the frames to the listener, but I am not sure whether compression conflicts with any of the AVB protocols.
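If buffering really is the limit, one thing to try (a sketch only; the queue limits are assumptions, and avbsrc is the custom element from the question) is to give the listener-side queue more headroom before the demuxer:
gst-launch-1.0 avbsrc interface=eth0 dataSync=1 mediaType=0 fd=0 ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=2000000000 ! qtdemux name=mux mux.video_0 ! queue ! avdec_h264 ! autovideosink mux.audio_0 ! queue ! decodebin ! autoaudiosink
A limit of 0 means unbounded, so only the 2-second time limit applies here.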

GStreamer pipeline that was working now requires a bunch of queue elements, why?

I have a C program that records video and audio from a v4l2 source into FLV format. I noticed that the program did not work on newer versions of Ubuntu, so I decided to run the problematic pipeline in gst-launch and find the simplest pipeline that reproduces the problem. Focusing just on the video side, I have reduced it to what you see below.
So I have a GStreamer pipeline that was working:
gst-launch v4l2src ! tee name="vtee" ! queue ! videorate ! ffmpegcolorspace ! ffdeinterlace ! x264enc ! flvmux name="mux" ! filesink location=vid.flv vtee. ! queue ! xvimagesink
Now it will only work if I add a bunch of queues one after another before the xvimagesink:
gst-launch v4l2src ! tee name="vtee" ! queue ! videorate ! ffmpegcolorspace ! ffdeinterlace ! x264enc ! flvmux name="mux" ! filesink location=vid.flv vtee. ! queue ! queue ! queue ! queue ! queue ! xvimagesink
Although the second pipeline above works, there is a pause before the pipeline starts running and I get the message below (I don't think this system is too slow; it's a Core i7 with tons of RAM):
Additional debug info:
gstbasesink.c(2692): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
There may be a timestamping problem, or this computer is too slow.
Can anyone explain what is happening here? What am I doing wrong?
You claim that the first pipeline stopped working but you don't explain what happened. Things stop working because something else changed:
- version of GStreamer and submodules?
- version of OS?
- version of camera?
It shouldn't be necessary to add a bunch of queues in a row. In practice each one creates a thread boundary, separating the part before it and the part after it into different threads, and the extra buffering adds the delay you see, which affects latency and sync.
An old message, but the problem is still not fixed. It appeared somewhere between Ubuntu 9.10 and 11.10 (I upgraded a few versions before noticing). I got around it by avoiding x264enc and using ffenc_mpeg4 instead.
I just noticed this note from the GStreamer Cheat Sheet:
Note: We can replace theoraenc+oggmux with x264enc+someothermuxer but then the pipeline will freeze unless we make the queue [19] element in front of the xvimagesink leaky, i.e. "queue leaky=1".
That doesn't work for me, so I'll stick with ffenc_mpeg4.
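For what it's worth, a sketch that sometimes avoids the tee + x264enc stall (untested here; tune=zerolatency assumes your x264enc version exposes that property): reduce the encoder's internal frame buffering and make the display-branch queue leaky so it never blocks the tee:
gst-launch v4l2src ! tee name="vtee" ! queue ! videorate ! ffmpegcolorspace ! ffdeinterlace ! x264enc tune=zerolatency ! flvmux name="mux" ! filesink location=vid.flv vtee. ! queue leaky=1 ! xvimagesink
The stall typically happens because x264enc buffers dozens of frames before emitting the first one; with the default blocking queue, the display branch fills up and the tee stops feeding the encoder, so neither branch makes progress. A leaky queue (or a queue with raised size limits) breaks that cycle.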