I am new to GStreamer. I am trying to run the following command on my Raspberry Pi:
gst-launch-0.10 -v udpsrc port=1234 caps='application/x-rtp,payload=(int)96,encoding-name=(string)H264' ! queue ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=True
I am getting the following error:
WARNING: erroneous pipeline: no element "rtpmp2tdepay"
Any idea how I can solve it? How can I install the element "rtpmp2tdepay"?
Thanks again for your response.
rtpmp2tdepay is included in the GStreamer good plugins set.
On the command line, type (on Debian/Raspbian the 0.10 package is named gstreamer0.10-plugins-good):
sudo apt-get install gstreamer0.10-plugins-good
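After installing, you can check that the element is actually registered, using the 0.10 inspection tool to match the pipeline above:

```shell
# Prints the element's pads, properties, and rank if the plugin is installed;
# prints "No such element or plugin 'rtpmp2tdepay'" otherwise.
gst-inspect-0.10 rtpmp2tdepay
```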
I've got a double Pi setup. One 3B and one 4. The 4 is set up with the 7" Raspberry Pi screen. The 3B is set up with two cameras. I'll name these "sender" (3B) and "player" (4).
I'm trying to use gstreamer to send low-latency video from the sender to the receiver. I've found and used this command on the sender:
gst-launch-1.0 v4l2src device=/dev/video0 ! \
'video/x-raw, width=352, height=288, framerate=25/1' ! \
videoconvert ! \
x264enc pass=qual quantizer=20 tune=zerolatency ! \
rtph264pay ! \
multiudpsink clients="10.0.0.200:5600,10.0.0.178:5600"
And using this code on the receiver:
export DISPLAY=:0
gst-launch-1.0 -v udpsrc port=5600 ! application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
So first I tested the receiver command on my Windows PC and it works flawlessly. Sub-100ms delay and everything.
Now I test this on the receiver pi 4 and I get this:
pi@receiver4:~ $ gst-launch-1.0 -v udpsrc port=5600 ! application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
It stops here; then I start the sender, and the following happens:
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)017a000dffe1001b677a000dbcb202c12d80a506060640000003004000000ca3c50a9201000668ebc1b2c8b0, level=(string)1.3, profile=(string)high-4:2:2
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)017a000dffe1001b677a000dbcb202c12d80a506060640000003004000000ca3c50a9201000668ebc1b2c8b0, level=(string)1.3, profile=(string)high-4:2:2
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)Y42B, width=(int)352, height=(int)288, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt601, framerate=(fraction)25/1
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:03.494063865
Setting pipeline to NULL ...
Freeing pipeline ...
So why is this not streaming on the Raspberry Pi screen as it does on my Windows screen? I tested with the test video source and I get a window with the expected output:
export DISPLAY=:0
gst-launch-1.0 videotestsrc ! autovideosink
You may try this as sender (your receiver log shows avdec_h264 producing 4:2:2 Y42B frames from a high-4:2:2 stream, which the Pi's video sink likely cannot display directly; the pipelines below keep the encoder output parseable and add videoconvert on the receiver to handle the decoded format):
gst-launch-1.0 videotestsrc do-timestamp=1 is-live=1 ! video/x-raw, width=352, height=288, framerate=25/1 ! videoconvert ! x264enc pass=qual quantizer=20 tune=zerolatency insert-vui=1 key-int-max=16 ! h264parse ! rtph264pay ! udpsink host=<target_host_IP_v4_or_v6> port=5004 -ev
and this as receiver:
gst-launch-1.0 udpsrc address=<same_IP_v4_or_v6_address> port=5004 ! application/x-rtp,media=video,encoding-name=H264,clock-rate=90000 ! rtpjitterbuffer latency=300 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink -v
If the network between the two Pis is Wi-Fi, you may also try disabling udpsink's auto-multicast property.
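Disabling it is just a property on the sink; an untested sketch of the same sender with auto-multicast turned off:

```shell
# Same sender pipeline, with udpsink's auto-multicast property disabled
# (only relevant when the target address is a multicast group).
gst-launch-1.0 videotestsrc do-timestamp=1 is-live=1 \
  ! video/x-raw,width=352,height=288,framerate=25/1 ! videoconvert \
  ! x264enc pass=qual quantizer=20 tune=zerolatency insert-vui=1 key-int-max=16 \
  ! h264parse ! rtph264pay \
  ! udpsink host=<target_host_IP_v4_or_v6> port=5004 auto-multicast=false
```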
I'm exploring the iOS integration with Stripe and I am stuck on the getting-started page. I am at step 3 of the example iOS app from:
https://github.com/stripe/stripe-ios
I hit the 'Deploy to Heroku' button from https://github.com/stripe/example-ios-backend and input my secret test Stripe key, but it throws the following build error on the Heroku dashboard:
-----> Ruby app detected
-----> Compiling Ruby/Rack
 !
 !     An error occurred while installing ruby-2.1.2
 !
 !     Heroku recommends you use the latest supported Ruby version listed here:
 !     https://devcenter.heroku.com/articles/ruby-support#supported-runtimes
 !
 !     For more information on syntax for declaring a Ruby version see:
 !     https://devcenter.heroku.com/articles/ruby-versions
 !
 !     Note: Only the most recent version of Ruby 2.1 is supported on Cedar-14
 !
 !     Debug Information
 !     Command: 'set -o pipefail; curl -L --fail --retry 5 --retry-delay 1 --connect-timeout 3 --max-time 30 https://s3-external-1.amazonaws.com/heroku-buildpack-ruby/heroku-16/ruby-2.1.2.tgz -s -o - | tar zxf - ' failed unexpectedly:
 !
 !     gzip: stdin: unexpected end of file
 !     tar: Child returned status 1
 !     tar: Error is not recoverable: exiting now
 !
 !     Push rejected, failed to compile Ruby app.
 !     Push failed
I am not quite sure where I have gone wrong, as I simply followed the steps. Any advice, please?
The project's Gemfile was set to use Ruby 2.1.2, which is no longer supported by Heroku.
The project has just been updated to use Ruby 2.2.7, so you should now be able to deploy it!
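If you deployed from an older clone, you can make the same fix yourself by pinning a supported Ruby in the project's Gemfile (2.2.7 is just the version the maintainers chose; any release Heroku currently supports works):

```ruby
# Gemfile: declare a Ruby version Heroku still supports
source 'https://rubygems.org'
ruby '2.2.7'
```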
I am able to stream H.264 or MPEG-4 video. The client (player) has to be started first, and only then can streaming begin (the server is started second).
What I need is to be able to connect to an already-playing stream (start the client after the server).
Both server and client run on the same i.MX6 device.
Server example:
gst-launch-0.10 -vvv \
filesrc location=bruce.mp4 typefind=true \
! qtdemux name=demux \
demux.video_00 \
! queue ! h264parse split-packetized=true \
! mpegtsmux ! rtpmp2tpay ! gdppay \
! udpsink host=239.255.1.1 port=5004
Client example:
gst-launch-0.10 -vvv \
udpsrc port=5004 multicast-group=239.255.1.1 caps="application/x-gdp" \
! gdpdepay ! gstrtpjitterbuffer \
! rtpmp2tdepay ! mpegtsdemux name=demux \
! queue max-size-buffers=0 max-size-time=0 \
! vpudec low-latency=true framedrop=true \
! mfw_isink sync=false
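A likely culprit for the late-join failure is gdppay: GDP serializes the stream caps only once, at startup, so a client that joins later never receives them and cannot negotiate. An untested sketch that drops GDP and instead declares the RTP caps explicitly on the receiver (the caps string, with the standard MP2T payload type 33, is an assumption to verify against your build):

```shell
# Sender: plain RTP/MP2T over multicast, no gdppay
gst-launch-0.10 -vvv \
  filesrc location=bruce.mp4 typefind=true \
  ! qtdemux name=demux demux.video_00 \
  ! queue ! h264parse split-packetized=true \
  ! mpegtsmux ! rtpmp2tpay \
  ! udpsink host=239.255.1.1 port=5004

# Receiver: caps are stated up front, so it can join mid-stream
gst-launch-0.10 -vvv \
  udpsrc port=5004 multicast-group=239.255.1.1 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33" \
  ! gstrtpjitterbuffer ! rtpmp2tdepay ! mpegtsdemux name=demux \
  ! queue max-size-buffers=0 max-size-time=0 \
  ! vpudec low-latency=true framedrop=true \
  ! mfw_isink sync=false
```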
I am streaming video from my Raspberry Pi using gstreamer and the H.264 encoder as below:
raspivid -t 999999 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=192.168.2.7 port=5000
From my desktop I am able to stream the video as below:
gst-launch-1.0 -v tcpclientsrc host=192.168.2.7 port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! gdkpixbufoverlay location=gstreamer-logo.svg offset-x=20 offset-y=20 ! autovideosink sync=false
I am looking for the following; any help is appreciated. Thanks.
Capture the screenshot of the video while streaming
Record the stream to a local file
I was able to overlay a logo using gdkpixbufoverlay and want to add the time as well. I tried clockoverlay, but the stream got stuck.
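All three items can be handled on the receiving desktop pipeline. An untested sketch that adds clockoverlay after the logo and uses tee to split the decoded video into a display branch and a recording branch; note the queue after each tee branch, since a missing queue after tee is a common cause of a frozen pipeline and may be what happened when clockoverlay was added:

```shell
# Display + record simultaneously, with logo and clock overlays.
# -e makes Ctrl-C send EOS so mp4mux can finalize the file.
gst-launch-1.0 -e -v tcpclientsrc host=192.168.2.7 port=5000 ! gdpdepay ! rtph264depay \
  ! avdec_h264 ! videoconvert \
  ! gdkpixbufoverlay location=gstreamer-logo.svg offset-x=20 offset-y=20 \
  ! clockoverlay halignment=right valignment=top \
  ! tee name=t \
    t. ! queue ! videoconvert ! autovideosink sync=false \
    t. ! queue ! videoconvert ! x264enc tune=zerolatency ! mp4mux ! filesink location=record.mp4

# Screenshots: run separately while the server is streaming; multifilesink
# writes one JPEG per decoded frame, so keep any frame you want as the shot.
gst-launch-1.0 tcpclientsrc host=192.168.2.7 port=5000 ! gdpdepay ! rtph264depay \
  ! avdec_h264 ! videoconvert ! jpegenc ! multifilesink location=frame-%05d.jpg
```

If re-encoding for the recording is too heavy, the tee can instead be placed before avdec_h264 and the H.264 stream muxed directly (h264parse ! mp4mux) without another x264enc pass.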
I am trying to stream RTMP from a Raspberry Pi (the omx hardware encoder worked really nicely, by the way), so I'm running:
gst-launch-1.0 v4l2src ! 'video/x-raw,width=640,height=480,framerate=30/1' ! \
omxh264enc target-bitrate=1000000 control-rate=variable ! \
video/x-h264,profile=high ! h264parse ! queue ! \
flvmux name=mux alsasrc device=plughw:1 ! audioresample ! \
audio/x-raw,rate=48000,channels=1 ! queue ! voaacenc bitrate=32000 ! queue ! mux. mux. ! \
rtmpsink location='rtmp://my_rtmp_for_ustream.tv_url'
And there is an error:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstAudioSrcClock
ERROR: from element /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2812): gst_base_src_loop (): /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 535913298 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
libv4l2: warning v4l2 mmap buffers still mapped on close()
Freeing pipeline ...
At first I thought there was some ALSA problem or such, but then I tried writing a simple MPEG-TS file and it worked:
gst-launch-1.0 v4l2src ! 'video/x-raw,width=640,height=480,framerate=30/1' ! \
omxh264enc target-bitrate=1000000 control-rate=variable ! \
video/x-h264,profile=high ! h264parse ! queue ! \
mpegtsmux name=mux alsasrc device=plughw:1 ! audioresample ! \
audio/x-raw,rate=48000,channels=1 ! queue ! voaacenc bitrate=32000 ! \
queue ! mux. mux. ! filesink location=1.ts
But I can't just change "filesink location=1.ts" to rtmpsink location='rtmp://my_rtmp_for_ustream.tv_url', because I'll get an error:
WARNING: erroneous pipeline: could not link mux to rtmpsink0
So, what can I do to get it to work? Thanks.
rtmpsink requires data in "video/x-flv" format. Your first pipeline is clearly showing a negotiation error. Can you get me the caps negotiation dump of your pipeline by adding -v on the command line?
The real problem was that rtmpsink (via flvmux) needs raw AAC, so I added aacparse and it worked out, something like this:
gst-launch-1.0 v4l2src ! \
"video/x-raw, framerate=25/1, width=320, height=240" ! \
omxh264enc target-bitrate=300000 control-rate=variable ! \
h264parse ! queue ! flvmux name=muxer alsasrc device=hw:1 ! \
audioresample ! "audio/x-raw,rate=48000" ! queue ! \
voaacenc bitrate=32000 ! aacparse ! queue ! muxer. muxer. ! \
rtmpsink location="$RTMP_URL"