Problem when playing RTSP stream with Gstreamer - raspberry-pi

Hardware & Software: Raspberry Pi 4, IP camera, Raspbian Buster, Gstreamer 1.14.1 (from repository). Raspberry and camera are on the local network.
I'm trying to play the RTSP video stream with the following pipeline:
gst-launch-1.0 rtspsrc location='rtsp://web_camera_ip' ! rtph264depay ! h264parse ! v4l2h264dec ! autovideosink
Within one minute, the playback stops.
Log:
0:00:00.681624278 1491 0xb4810980 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<fakesrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
Progress of execution: (request) Sent PLAY request
0:00:01.155264612 1491 0xb1507fb0 WARN v4l2 gstv4l2object.c:4186:gst_v4l2_object_probe_caps:<v4l2h264dec0:src> Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: An inadmissible argument
0:00:01.166871436 1491 0xb1507fb0 WARN v4l2 gstv4l2object.c:4186:gst_v4l2_object_probe_caps:<v4l2h264dec0:src> Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: An inadmissible argument
0:00:01.170107746 1491 0xb1507fb0 FIXME basesink gstbasesink.c:3145:gst_base_sink_default_event:<autovideosink0-actual-sink-xvimage> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:01.174576265 1491 0xb1507fb0 WARN v4l2videodec gstv4l2videodec.c:808:gst_v4l2_video_dec_decide_allocation:<v4l2h264dec0> Duration invalid, not setting latency
0:00:01.211967620 1491 0xb48105b0 WARN v4l2bufferpool gstv4l2bufferpool.c:1189:gst_v4l2_buffer_pool_dqbuf:<v4l2h264dec0:pool:src> Driver should never set v4l2_buffer.field to ANY
This line appears at the moment playback stops:
0:00:13.102438914 1491 0xb48105b0 WARN v4l2allocator gstv4l2allocator.c:1349:gst_v4l2_allocator_dqbuf:<v4l2h264dec0:pool:src:allocator> V4L2 provided buffer has bytesused 0 which is too small to include data_offset 0
What I tried (none of it solved the problem):
Replacing autovideosink with fakesink
Replacing v4l2h264dec with avdec_h264_mmal
Various rtspsrc parameters
Playback with playbin
The error does not appear if the decoder is replaced with fakesink:
gst-launch-1.0 rtspsrc location='rtsp://web_camera_ip' ! rtph264depay ! h264parse ! fakesink
Additional information:
My camera overlays the time (hours, minutes, seconds) on the image. Playback always stops at a particular seconds value. When the camera restarts, this value changes randomly: 17, 32, 55... Changing the time on the camera does not solve the problem.
VLC on the Raspberry Pi plays the stream from this camera without any problems.
Gstreamer plays local h264 files without any problems.
Gstreamer plays an RTSP TV channel broadcast from the Internet without any problems.
I also tried playing the low-resolution substream from the IP camera and an RTSP stream from a smartphone (the IP Webcam application). The same problem appears.
When running the same SD card on a Raspberry Pi 3, the problem remains.
On a Raspberry Pi 3 with Raspbian Stretch and Gstreamer 1.10 from the repository there was no problem.
Thank you for the answers!

The problem was in my local network: the RTSP stream from any device was periodically interrupted for a split second. With VLC this is not visible, because it instantly restarts the playback. Gstreamer, however, stops the stream and generates an error message.
I connected my IP camera directly to the Raspberry Pi over Ethernet, and everything works fine.
Broadcasting over the Internet is also stable.
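For readers hitting the same symptom on a flaky network, one common mitigation is forcing RTP over TCP so packets cannot be silently dropped; a sketch based on the original pipeline (rtspsrc does have a protocols property that accepts tcp, but this does not fix the underlying network fault):
gst-launch-1.0 rtspsrc location='rtsp://web_camera_ip' protocols=tcp ! rtph264depay ! h264parse ! v4l2h264dec ! autovideosink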

I was having this exact same issue. Luckily, it appears to be fixed with Gstreamer 1.16.2. I built it using a variation of the script at
https://github.com/PietroAvolio/Building-Gstreamer-Raspberry-Pi-With-SRT-Support/blob/master/gstreamer-build.sh
Using 1.16.2 it just keeps on going and doesn't hang.
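For reference, the version actually in use after such a build can be checked with:
gst-launch-1.0 --version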

Related

How to record video (1080p 30fps) from the Raspberry Pi camera using Gstreamer?

I am a beginner in Gstreamer.
My objective is to record video at 1080p resolution and 30 fps in H264 format from my Raspberry Pi camera, using the following Gstreamer pipeline:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw, width=1920,height=1080,framerate=30/1" ! videoflip method=rotate-180 ! gst-debug ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.h264
After I run the above pipeline, I do not get any errors, and the recorded video is 1080p at 30 fps, but frames are dropped heavily.
Is this the right pipeline or not?
Am I missing any elements in this pipeline?
x264enc is a software encoder and uses a lot of CPU power. The Raspberry Pi has an OpenMAX hardware H264 encoder which can be accessed through Gstreamer's gst-omx plugin. You can use the hardware encoder by inserting an omxh264enc element in place of the x264enc element.
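As a rough sketch, the pipeline from the question with the encoder swapped out could look like the following (this assumes the gst-omx plugin is installed; the gst-debug element from the question is omitted, and the added h264parse stage and .avi filename are illustrative choices rather than part of the original question):
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=30/1 ! videoflip method=rotate-180 ! videoconvert ! videorate ! omxh264enc ! h264parse ! avimux ! filesink location=test_video.avi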

Gstreamer crackling sound on Raspberry Pi 3 while playing video

I am playing a hardware accelerated video on the Raspberry Pi 3 using this simple pipeline:
gst-launch-1.0 playbin uri=file:///test/test.mp4
As soon as the video begins to play, any sound being played in parallel using ALSA begins to crackle (tested with gstreamer and mplayer). It's a simple WAV-file and I am using a USB audio interface.
The headphone jack already crackles even without playing an audio file (but this jack is very low quality and I don't know whether that's a different effect).
Playing the audio in the same pipeline as the video does not help. CPU load is only around 30% and there is plenty of free memory. I have already overclocked the SD card. Playing two videos in parallel with omxplayer has no impact and the sound still plays well, but as soon as I start the pipeline above, the sound begins to crackle.
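A combined pipeline of that kind might look roughly like this, with hw:1,0 standing in for the USB interface (playbin accepts an element description for its audio-sink property):
gst-launch-1.0 playbin uri=file:///test/test.mp4 audio-sink="alsasink device=hw:1,0"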
I tried "stress" to simulate high CPU load. This had no impact either, so CPU does not seem to be the problem (but maybe the GPU?).
This is the gstreamer pipeline to test the audio:
gst-launch-1.0 filesrc location=/test/test.wav ! wavparse ! audioconvert ! alsasink device=hw:1,0
GST_DEBUG=4 shows no problems.
I tried putting queues in different places but nothing helped. Playing a video without audio tracks works a little better. But I have no idea where the resource shortage may lie, if it even is one.
It somehow seems like gstreamer is disturbing audio streams.
Any ideas where the problem may be are highly appreciated.
It seems that the USB driver of my interface expects a very responsive system. I bought a cheap new USB audio interface with a bInterval value of 10 instead of 1, and everything works fine now. More details can be found here: https://github.com/raspberrypi/linux/issues/2215
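For anyone checking their own interface, the bInterval of a USB device's endpoints can be read with lsusb; the bus:device pair below is only an example. The first command lists the devices, the second dumps the descriptors for one of them and filters out the polling interval:
lsusb
lsusb -v -s 001:004 | grep bInterval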

How to route / pipe output of Watson text to speech to local speaker vs terminal

I'm using the Watson speech-to-text and text-to-speech services. So far, I've been able to get everything working except getting the sound to output to the speakers on a Raspberry Pi running Node.js.
I'm connecting to the text-to-speech service. It seems that I am getting a response back from the Watson service, but it is being displayed on the Pi terminal instead of going to the USB speaker. At the end of the text output, it shows that the audio is being sent to hw device 0:0, which is wrong. It should go to 1:0.
When I test the Raspberry Pi sound without Watson, the audio works fine using aplay, which plays back on a different hw device (1:0).
So my question is: is there a parameter in the HTTPS interface that controls which hw sound device the speech is routed to, or does that all have to be controlled locally, somewhere within my JS code?
It turns out that the hw device setting of 0:0 was hard-coded in the JS code (which I did not write). Changing this parameter routed the sound to the correct sound card, the USB device running on 1:0.
Mystery solved!
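As a quick check from the shell (independent of the JS code), the ALSA cards can be listed and the USB speaker tested directly; the file name here is just a placeholder:
aplay -l
aplay -D plughw:1,0 test.wav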

Raspberry Pi Motion-Mmal: I want to disable live stream capabilities

I have a few questions about Motion-MMAL for the Raspberry Pi B+ model, running on Raspbian. Sorry if these are noob questions.
1) I want to disable the live stream capability completely; however, I can only find information on how to keep the live stream local.
2) If I do not visit the local address for the live stream, is it still technically uploading data and live streaming?
I think if you comment out the webcam_port value in motion.conf then you will not be streaming; the default is 0, which means disabled.
Motion will still record video when there is actual motion, but nobody will be able to view a live stream.
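A minimal sketch of the relevant motion.conf line, assuming an older Motion release that still uses the webcam_port name (newer releases call the same option stream_port):
# disable the built-in live stream server
webcam_port 0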

Omxplayer stops with "Have a nice day"

I have C# code that streams a video file encoded to raw H264 over UDP. When I receive that stream with omxplayer, it plays for about 30 s and then stops and displays 'have a nice day'. But I can receive the same stream with ffplay and VLC (on Windows) without any problem.
Then I tried streaming with VLC as well and got the same issue: the stream stopped and displayed 'have a nice day'. If anyone can help me figure this out, I would be very grateful.
Command used to receive the stream in omxplayer:
omxplayer udp://224.1.1.1:1234
I have included a screenshot of omxplayer while receiving the stream. According to this, omxplayer's buffer size drops to 0 after playing for some time. I tried changing the URL of the stream as given below:
omxplayer -s udp://224.1.1.1:1234?overrun_nonfatal=1
That fixed the stopping problem, but the stream does not run smoothly. Can you suggest a better solution?
Thank You!
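If the stall really is a buffer underrun, two omxplayer options that may be worth trying are --live and a larger --threshold (both exist in recent omxplayer builds; whether they help with this particular stream is untested):
omxplayer --live --threshold 5 udp://224.1.1.1:1234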