GStreamer RTSP pipeline for video stream error

I have an issue when trying to connect to an RTSP video stream using GStreamer. The stream originates from an IP camera.
Using the same RTSP address in VLC and OpenCV (without GStreamer) works as it should.
This is the pipeline used for testing:
GST_DEBUG=1 gst-launch-1.0 -v rtspsrc location=rtsp://admin:password$@192.168.2.1:554/ch1/main/av_stream ! decodebin ! autovideosink
This error occurs when running the pipeline:
Setting pipeline to PAUSED ...
error: XDG_RUNTIME_DIR not set in the environment.
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://admin:password192.168.2.1:554/ch1/main/av_stream
0:00:03.105886290 6248 0x559fd7cf1520 ERROR default gstrtspconnection.c:1046:gst_rtsp_connection_connect_with_response: failed to connect: Error resolving “admin”: Name or service not known
0:00:03.105953476 6248 0x559fd7cf1520 ERROR rtspsrc gstrtspsrc.c:5047:gst_rtsp_conninfo_connect:<rtspsrc0> Could not connect to server. (Generic error)
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7893): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
If I use a password that is not correct, I get the same error message. This makes me believe that GStreamer does not manage to log in to the IP camera's video stream.
For trial and error purposes, I tried this pipeline without any user or password:
GST_DEBUG=1 gst-launch-1.0 -v rtspsrc location=rtsp://192.168.2.1:554/ch1/main/av_stream ! decodebin ! autovideosink
Setting pipeline to PAUSED ...
error: XDG_RUNTIME_DIR not set in the environment.
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.2.1:554/ch1/main/av_stream
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Unauthorized
Additional debug info:
gstrtspsrc.c(6540): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Unauthorized (401)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

rtspsrc has dedicated properties for authentication.
Try using:
gst-launch-1.0 -v rtspsrc user-id=admin user-pw=password location=rtsp://192.168.2.1:554/ch1/main/av_stream ! decodebin ! autovideosink
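If the real password contains characters the shell treats specially (such as $ or @), single-quoting the property value keeps the shell from mangling it before GStreamer sees it, and gst-inspect-1.0 confirms which authentication properties your rtspsrc version exposes. A quick sketch, with 'secret$pass' as a placeholder password:
gst-inspect-1.0 rtspsrc | grep -i user
gst-launch-1.0 -v rtspsrc user-id=admin user-pw='secret$pass' location=rtsp://192.168.2.1:554/ch1/main/av_stream ! decodebin ! autovideosink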

ERROR IPAProxy ipa_proxy.cpp:149 Configuration file 'ov5640.json' not found for IPA module 'raspberrypi'?

I'm trying to interface an ov5640 with a Raspberry Pi 4 Model B (kernel version 5.15.61). So far I have loaded the driver successfully, bound it to the ov5640, and the video nodes were created successfully.
I tried to stream video using GStreamer with this command:
gst-launch-1.0 libcamerasrc ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! videoscale ! clockoverlay time-format="%D %H:%M:%S" ! autovideosink
and it threw this error:
Setting pipeline to PAUSED ...
[0:02:06.704696651] [1320] INFO Camera camera_manager.cpp:293 libcamera v0.0.0+3866-0c55e522
[0:02:06.726785010] [1324] WARN CameraSensor camera_sensor.cpp:212 'ov5640 10-003c': Recommended V4L2 control 0x009a0922 not supported
[0:02:06.726879464] [1324] WARN CameraSensor camera_sensor.cpp:264 'ov5640 10-003c': The sensor kernel driver needs to be fixed
[0:02:06.726907245] [1324] WARN CameraSensor camera_sensor.cpp:266 'ov5640 10-003c': See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information
[0:02:06.728038265] [1324] WARN CameraSensor camera_sensor.cpp:411 'ov5640 10-003c': Failed to retrieve the camera location
[0:02:06.746434072] [1324] ERROR IPAProxy ipa_proxy.cpp:149 Configuration file 'ov5640.json' not found for IPA module 'raspberrypi'
[0:02:06.746582013] [1324] ERROR IPARPI raspberrypi.cpp:213 Could not create camera helper for ov5640
[0:02:06.746635055] [1324] ERROR RPI raspberrypi.cpp:1253 Failed to load a suitable IPA library
[0:02:06.746922788] [1324] ERROR RPI raspberrypi.cpp:1184 Failed to register camera ov5640 10-003c: -22
ERROR: from element /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: Could not find any supported camera on this system.
Additional debug info:
../src/gstreamer/gstlibcamerasrc.cpp(354): gst_libcamera_src_open (): /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0:
libcamera::CameraMananger::cameras() is empty
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...
Can anyone give me some direction on how to solve this?

Setting up a USB webcam RTSP stream with GStreamer

I'm using GStreamer to send the camera feed from /dev/video1 (a Raspberry Pi's USB webcam) through an RTSP server that I can connect to from another Raspberry Pi.
Result of v4l2-ctl -d /dev/video1 --list-formats:
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'MJPG' (Motion-JPEG, compressed)
[1]: 'YUYV' (YUYV 4:2:2)
The pipeline I'm using is:
./gst-rtsp-launch --port 8555 '( v4l2src device=/dev/video1 ! image/jpeg,width=800,height=600,framerate=30/1 ! jpegparse ! rtpjpegpay name=pay0 pt=96 )' --gst-debug-level=3
When running it, and letting the other machine connect, the console gives this message:
0:00:02.097412343 3234 0xb4c1c0c0 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<appsrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:02.102907578 3234 0xb5a07600 WARN v4l2src gstv4l2src.c:692:gst_v4l2src_query:<v4l2src0> Can't give latency since framerate isn't fixated !
0:00:02.170888076 3234 0xb4c1b980 WARN v4l2bufferpool gstv4l2bufferpool.c:790:gst_v4l2_buffer_pool_start:<v4l2src0:pool:src> Uncertain or not enough buffers, enabling copy threshold
0:00:02.410829991 3234 0x166ba90 FIXME rtspmedia rtsp-media.c:3581:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
0:00:02.414457433 3234 0x166ba90 FIXME rtspmedia rtsp-media.c:3581:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
0:00:02.414551635 3234 0x166ba90 WARN rtspmedia rtsp-media.c:3607:gst_rtsp_media_suspend: media 0xb5a34130 was not prepared
0:00:03.878249884 3234 0x166ba90 WARN rtspmedia rtsp-media.c:3868:gst_rtsp_media_set_state: media 0xb5a34130 was not prepared
On the client Raspberry Pi, using VLC against the static IP (vlc rtsp://192.168.0.10:8555/video) gives this error (and triggers the previous messages on the other board):
mmal: mmal_component_create_core: could not create component 'vc.ril.hvs' (1)
mmal: mmal_vc_component_create: failed to create component 'vc.ril.hvs' (1:ENOMEM)
mmal: mmal_component_create_core: could not create component 'vc.ril.hvs' (1)
mmal: mmal_vc_component_create: failed to create component 'vc.ril.hvs' (1:ENOMEM)
mmal: mmal_component_create_core: could not create component 'vc.ril.hvs' (1)
mmal: mmal_vc_component_create: failed to create component 'vc.ril.hvs' (1:ENOMEM)
mmal: mmal_component_create_core: could not create component 'vc.ril.hvs' (1)
mmal: mmal_vc_port_info_set: failed to set port info (3:0): EINVAL
mmal: mmal_vc_port_set_format: mmal_vc_port_info_set failed 0x909bcaa0 (EINVAL)
Falha de segmentação
The last line means "Segmentation fault". The screen on the client board flickers black before giving this error, and the board connected to the webcam only shows its messages after the client connects.
Connecting to localhost on the same board using vlc rtsp://127.0.0.1:8555/video works for a little bit, then it breaks.
How can I fix this pipeline so the video is shown correctly over the connection between the two boards?
For the record:
I asked in the comments which version of GStreamer you were using, and the answer was "1.14.4".
I suggested updating to the latest version (1.20.1), because a segmentation fault at that point sounds like a potential bug in GStreamer.
It turns out that was correct: updating GStreamer (to 1.18.4) resolved the problem!
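For reference, a quick way to check which GStreamer version is installed (assuming the standard command-line tools are present):
gst-launch-1.0 --version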

Error while using a persistent datasource with MongoDB in Hyperledger Composer

I am trying to use a persistent datasource with MongoDB in Hyperledger Composer on an Ubuntu droplet, but after starting the REST server and then issuing the command docker logs -f rest I get the following error:
webuser@ubuntu16:~$ docker logs -f rest
[2018-08-29T12:38:31.278Z] PM2 log: Launching in no daemon mode
[2018-08-29T12:38:31.351Z] PM2 log: Starting execution sequence in -fork mode- for app name:composer-rest-server id:0
[2018-08-29T12:38:31.359Z] PM2 log: App name:composer-rest-server id:0 online
WARNING: NODE_APP_INSTANCE value of '0' did not match any instance config file names.
WARNING: See https://github.com/lorenwest/node-config/wiki/Strict-Mode
Discovering types from business network definition ...
(node:15) DeprecationWarning: current URL string parser is deprecated, and will be removed in a future version. To use the new parser, pass option { useNewUrlParser: true } to MongoClient.connect.
Connection fails: Error: Error trying to ping. Error: Failed to connect before the deadline
It will be retried for the next request.
Exception: Error: Error trying to ping. Error: Failed to connect before the deadline
Error: Error trying to ping. Error: Failed to connect before the deadline
at _checkRuntimeVersions.then.catch (/home/composer/.npm-global/lib/node_modules/composer-rest-server/node_modules/composer-connector-hlfv1/lib/hlfconnection.js:806:34)
at <anonymous>
[2018-08-29T12:38:41.021Z] PM2 log: App [composer-rest-server] with id [0] and pid [15], exited with code [1] via signal [SIGINT]
[2018-08-29T12:38:41.024Z] PM2 log: Starting execution sequence in -fork mode- for app name:composer-rest-server id:0
[2018-08-29T12:38:41.028Z] PM2 log: App name:composer-rest-server id:0 online
WARNING: NODE_APP_INSTANCE value of '0' did not match any instance config file names.
WARNING: See https://github.com/lorenwest/node-config/wiki/Strict-Mode
Discovering types from business network definition ...
(node:40) DeprecationWarning: current URL string parser is deprecated, and will be removed in a future version. To use the new parser, pass option { useNewUrlParser: true } to MongoClient.connect.
Connection fails: Error: Error trying to ping. Error: Failed to connect before the deadline
It will be retried for the next request.
I don't understand what the problem is or what I am doing wrong, because I have followed all the steps in the Hyperledger Composer documentation successfully.
Is it because I am using it on an Ubuntu droplet? Can anyone help?
EDIT
I followed all the steps mentioned in this tutorial, but instead of using Google authentication I am using GitHub authentication.
I have also changed localhost to the IP of my Ubuntu droplet in the connection.json file, and also in this command:
sed -e 's/localhost:7051/peer0.org1.example.com:7051/' -e 's/localhost:7053/peer0.org1.example.com:7053/' -e 's/localhost:7054/ca.org1.example.com:7054/' -e 's/localhost:7050/orderer.example.com:7050/' < $HOME/.composer/cards/restadmin@trade-network/connection.json > /tmp/connection.json && cp -p /tmp/connection.json $HOME/.composer/cards/restadmin@trade-network/
but still with no success. I now get the following error:
webuser@ubuntu16:~$ docker logs rest
[2018-08-30T05:03:02.916Z] PM2 log: Launching in no daemon mode
[2018-08-30T05:03:02.989Z] PM2 log: Starting execution sequence in -fork mode- for app name:composer-rest-server id:0
[2018-08-30T05:03:02.997Z] PM2 log: App name:composer-rest-server id:0 online
WARNING: NODE_APP_INSTANCE value of '0' did not match any instance config file names.
WARNING: See https://github.com/lorenwest/node-config/wiki/Strict-Mode
Discovering types from business network definition ...
(node:15) DeprecationWarning: current URL string parser is deprecated, and will be removed in a future version. To use the new parser, pass option { useNewUrlParser: true } to MongoClient.connect.
Discovering the Returning Transactions..
Discovered types from business network definition
Generating schemas for all types in business network definition ...
Generated schemas for all types in business network definition
Adding schemas for all types to Loopback ...
Added schemas for all types to Loopback
SyntaxError: Unexpected string in JSON at position 92
at JSON.parse ()
at Promise.then (/home/composer/.npm-global/lib/node_modules/composer-rest-server/server/server.js:141:34)
at
at process._tickDomainCallback (internal/process/next_tick.js:228:7)
[2018-08-30T05:03:09.942Z] PM2 log: App [composer-rest-server] with id [0] and pid [15], exited with code 1 via signal [SIGINT]
This error, Error trying to ping. Error: Failed to connect before the deadline, means that the composer-rest-server process in the container cannot see or connect to the underlying Fabric at the URLs in the connection.json of the card you are using to start the REST server.
There are a number of possible reasons:
The Fabric is not started
You are using a Business Network Card that has localhost in the URLs of the connection.json, and localhost just re-directs back into the rest container.
Your rest container is started on a different Docker network bridge to your Fabric containers and cannot connect to the Fabric.
Have you followed this tutorial in the Composer documentation? If followed completely, it will avoid the three problems mentioned above.
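As a quick way to rule out the third cause, you can check which Docker networks the containers share. A sketch, assuming the REST container is named rest and the Fabric containers sit on the default composer_default bridge network (adjust both names to match your setup):
docker network ls
docker inspect rest --format '{{json .NetworkSettings.Networks}}'
docker network inspect composer_default --format '{{range .Containers}}{{.Name}} {{end}}'
If the rest container is not attached to the same network as the peer, orderer and CA containers, attaching it with docker network connect composer_default rest is usually enough to make the ping succeed.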

How to stream via RTMP using Gstreamer?

I am attempting to stream video and audio using GStreamer to an RTMP server (Wowza), but there are a number of issues.
There is almost no documentation on how to properly use rtmpsink, the plugin that sends media via RTMP to a specified server. On top of that, crafting a GStreamer pipeline that is compatible with rtmpsink is currently just a trial-and-error exercise.
My current GStreamer pipeline is:
sudo gst-launch-1.0 -e videotestsrc ! queue ! videoconvert ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://<ip_address>/live live=true'
Running the above on my Linux machine spits out this error:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstRTMPSink:rtmpsink0: Could not open resource for writing.
Additional debug info:
gstrtmpsink.c(246): gst_rtmp_sink_render (): /GstPipeline:pipeline0/GstRTMPSink:rtmpsink0:
Could not connect to RTMP stream "rtmp://31.24.217.8/live live=true" for writing
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
ERROR: from element /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0:
streaming task paused, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstQueue:queue0: Internal data flow error.
Additional debug info:
gstqueue.c(992): gst_queue_handle_sink_event (): /GstPipeline:pipeline0/GstQueue:queue0:
streaming task paused, reason error (-5)
Due to the lack of documentation on the Wowza side, another issue is pin-pointing the correct IP address to point rtmpsink at; and due to the lack of documentation on the GStreamer side, proper RTMP authentication remains elusive, aside from some forum examples that cannot be confirmed as working because of other variables.
What is the correct GStreamer pipeline for streaming via RTMP using rtmpsink, and how do I properly implement rtmpsink for this, with and without authentication?
Actually, the pipeline you're using works fine.
However, disabling Wowza's RTMP security is a must, as is pointing the pipeline at the correct application URL.
Follow the guidelines on this page: https://www.wowza.com/forums/content.php?36-How-to-set-up-live-streaming-using-an-RTMP-based-encoder
Re-check that RTMP is enabled in the application's Playback Types.
Disable all security options to ensure GStreamer compatibility: in the Playback Security tab, check that No client restrictions is selected (the default).
In the Sources tab, in the left column, you can check the server settings.
Once all these steps are done, we can launch the previous pipeline:
gst-launch-1.0 -e videotestsrc ! queue ! videoconvert ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://192.168.1.40:1935/livertmp/myStream'
It works, and you can check the result by clicking the Test Players button.
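If you prefer to verify from the command line instead of the Test Players page, a playback sketch (assuming the same server address and stream name as above, and that gst-plugins-bad provides rtmpsrc on your system):
gst-launch-1.0 rtmpsrc location='rtmp://192.168.1.40:1935/livertmp/myStream live=1' ! decodebin ! autovideosink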
Although it is probably out of scope, it is possible to add audio to the pipeline, and to improve it by adding some properties that were missing:
gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! x264enc bitrate=1000 tune=zerolatency ! video/x-h264 ! h264parse ! video/x-h264 ! queue ! flvmux name=mux ! rtmpsink location='rtmp://192.168.1.40:1935/livertmp/myStream' audiotestsrc is-live=true ! audioconvert ! audioresample ! audio/x-raw,rate=48000 ! voaacenc bitrate=96000 ! audio/mpeg ! aacparse ! audio/mpeg, mpegversion=4 ! mux.
Regarding password-protected publishing, it is not straightforward to achieve with GStreamer.

gst-launch-1.0 two pipelines/sinkfiles

I am working on a flying drone that sends a live stream from a Raspberry Pi 2 to my computer through a 3G modem/Wi-Fi. The stream is produced with this command:
sudo raspivid -t 999999999 -w 320 -h 240 -fps 20 -rot 270 -b 100000 -o - | gst-launch-1.0 -e -vvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=192.168.0.103 port=5000
The stream works very well, but I have a problem: while raspivid is running I want to take a picture every five seconds, and when I execute this command while raspivid is running I get this:
root@raspberrypi:/var/www/camera# /usr/bin/raspistill -o cam2.jpg
mmal: mmal_vc_component_enable: failed to enable component: ENOSPC
mmal: camera component couldn't be enabled
mmal: main: Failed to create camera component
mmal: Failed to run camera app. Please check for firmware updates
Now what solutions do I have? Another idea is to use GStreamer with both a udpsink and a filesink to an .avi, but then I get another error:
WARNING: erroneous pipeline: could not link multifilesink0 to filesink0
What can I do in this case?
Thanks.
AFAIK only one Raspberry Pi program can grab the camera at a time. Since you're always streaming live video, that precludes adding the five-second snapshots on the Pi side (unless you write something custom from scratch).
What I'd suggest instead is handling the five-second snapshots on the receiving side, using the same encoded video data you're already sending for the live stream. This eases the load on your drone, and all the data you need is already being transmitted.
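A minimal sketch of that receiving-side approach (assuming the sender pipeline above is unchanged, that gst-libav is installed for avdec_h264, and that one decoded frame every five seconds is an acceptable picture; the snapshot filename pattern is just a placeholder):
gst-launch-1.0 -e udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! avdec_h264 ! tee name=t t. ! queue ! videoconvert ! autovideosink t. ! queue ! videorate ! video/x-raw,framerate=1/5 ! videoconvert ! jpegenc ! multifilesink location=snapshot-%05d.jpg
The videorate element together with the framerate=1/5 caps keeps one frame every five seconds, so the snapshots are taken from the same H.264 stream that feeds the live view, with no extra work on the Pi.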