I'm building an Electron streaming app that I deploy on a Raspberry Pi 3 with an attached camera (OV5647, 5 MP) that supposedly supports YUV/RAW RGB formats. When I try to access it via:
const constraints = {"video": true}
navigator.mediaDevices.getUserMedia(constraints)
I get an error: DOMException: Requested device not found, with basically no further details.
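As a diagnostic, one thing worth trying is listing what Chromium actually enumerates before requesting the stream (a minimal sketch; device labels may stay empty until a getUserMedia call has succeeded):
navigator.mediaDevices.enumerateDevices()
  .then((devices) => {
    // Log every video input the browser can see; an empty list means the
    // backend never discovered the camera at all.
    devices
      .filter((d) => d.kind === 'videoinput')
      .forEach((d) => console.log('video input:', d.deviceId, d.label));
  })
  .catch((err) => console.error('enumerateDevices failed:', err.name, err.message));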
I also have a different app, based on GStreamer, and it is able to stream video from this camera with the following input settings (it works on the same device):
gst_parse_launch ("webrtcbin bundle-policy=max-bundle name=sendrecv "
TURN_SERVER
"v4l2src ! video/x-raw,width=640,height=480,framerate=30/1 ! v4l2h264enc ! video/x-h264,level=(string)3.1,stream-format=(string)byte-stream ! h264parse ! rtph264pay ! "
"" RTP_CAPS_H264 "96 ! sendrecv. "
"alsasrc device=hw:0 ! audioamplify amplification=15 ! audio/x-raw,channels=(int)2,format=(string)S32LE,rate=(int)44100,layout=(string)interleaved ! deinterleave name=d d.src_0 ! queue ! audioconvert ! audioresample ! webrtcdsp echo-suppression-level=2 noise-suppression-level=3 voice-detection=true ! queue ! audioconvert ! opusenc ! rtpopuspay ! "
"queue ! " RTP_CAPS_OPUS "97 ! sendrecv. ", &error);
My question is: what would be the next steps to analyse why Chromium can't recognize my camera? Is it a matter of drivers, or rather of app configuration?
Also, would it be possible to acquire the video stream directly from GStreamer somehow, and pass it on to a WebRTC PeerConnection within the Electron app, not using getUserMedia at all?
EDIT:
I've tried opening the same application via a browser on a desktop version of Raspbian (same code hosted as a webpage on an external server). It failed with the same error in Chrome, but worked in Firefox. So it seems like the Chrome backend cannot communicate with the camera. How is it different from the Firefox backend, and how can I include the missing bits manually?
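In case it helps with the analysis: Chromium's Linux camera capture goes through V4L2, so one of the first things I can check is what the device exposes there (commands from the v4l-utils package; /dev/video0 is an assumption):
v4l2-ctl --list-devices
v4l2-ctl -d /dev/video0 --list-formats-ext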
FYI:
I've filed a bug report in Chromium, as it appears to be a problem since version 89.0.4389.128: https://bugs.chromium.org/p/chromium/issues/detail?id=1259138
I bought a HotPi a while ago and decided to use it. So I followed the procedure to configure the IR, and after a few hours I was able to receive IR signals on my Raspberry Pi 1.
But my purpose here is to send IR signals, which I tried without any luck.
So this is the command I run (just for a test):
irsend SEND_START devinput KEY_POWER ; sleep 3
And this is what lircd tells me:
lircd-0.9.4c[907]: Notice: accepted new client on /var/run/lirc/lircd
lircd-0.9.4c[907]: Info: Cannot configure the rc device for /dev/lirc0
lircd-0.9.4c[907]: Error: invalid send buffer
lircd-0.9.4c[907]: Error: this remote configuration cannot be used to transmit
lircd-0.9.4c[907]: Error: error processing command: SEND_START devinput KEY_POWER
lircd-0.9.4c[907]: Error: transmission failed
lircd-0.9.4c[907]: Info: removed client
Edit:
It seems I'm not using the right drivers. According to the HotPi documentation, I'm supposed to use lirc-rpi, which I'm supposed to load with
sudo modprobe lirc-rpi
Which, at least, doesn't return an error. But trying to configure the interface tells me that the driver doesn't exist:
pi@raspberrypi:~ $ mode2 --driver lirc-rpi --device /dev/lirc0
Driver `lirc-rpi' not found. (Missing -U/--plugins option?)
Available drivers:
accent
alsa_usb
asusdh
atilibusb
atwf83
audio
audio_alsa
awlibusb
bte
bw6130
commandir
creative
creative_infracd
default
devinput
dfclibusb
dsp
dvico
ea65
file
ftdi
ftdi-exp
ftdix
girs
i2cuser
irlink
irtoy
livedrive_midi
livedrive_seq
logitech
macmini
mouseremote
mouseremote_ps2
mp3anywhere
mplay
mplay2
pcmak
pinsys
pixelview
samsung
sb0540
silitek
slinke
sonyir
srm7500libusb
tira
tira_raw
udp
uirt2
uirt2_raw
usb_uirt_raw
usbx
zotac
There is no info here on which lirc version you are using. There are vast differences between the legacy 0.9.0 still used in some distros and modern lirc.
That said, the logs seem pretty clear. You are using the devinput driver, right? This driver does not support sending data, reflecting the fact that the underlying kernel interface doesn't either.
You then need to use another driver - the first stop might be the default one. If/when using this other driver, you will also need another lircd.conf.
Please refer to http://lirc.org/html/configuration-guide.html
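For what it's worth, on modern lirc the driver and device are normally selected in /etc/lirc/lirc_options.conf rather than on the command line; a minimal sketch of the relevant section (values assumed, adjust to your setup):
[lircd]
driver          = default
device          = /dev/lirc0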
EDIT: Ah, the log says lirc-0.9.4c. Sorry, my bad. The reply should still be valid, though.
When you record the remote, use:
irrecord -d /dev/lirc0 -f name.conf
The -f flag records in raw mode. This then worked for me on the transmit side, whereas before I got the same error as you.
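With a remote recorded that way, a transmit attempt would look something like this (the remote name is whatever irrecord wrote into name.conf, assumed here to be 'name'):
irsend SEND_ONCE name KEY_POWER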
I am attempting to stream video and audio using GStreamer to an RTMP server (Wowza), but there are a number of issues.
There is almost no documentation about how to properly utilise rtmpsink, a plugin that sends media via RTMP to a specified server. Not only that, but crafting a correct, rtmpsink-compatible GStreamer pipeline is currently simply a trial-and-error exercise.
My current Gstreamer pipeline is:
sudo gst-launch-1.0 -e videotestsrc ! queue ! videoconvert ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://<ip_address>/live live=true'
Running the above on my Linux machine spits out this error:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstRTMPSink:rtmpsink0: Could not open resource for writing.
Additional debug info:
gstrtmpsink.c(246): gst_rtmp_sink_render (): /GstPipeline:pipeline0/GstRTMPSink:rtmpsink0:
Could not connect to RTMP stream "rtmp://31.24.217.8/live live=true" for writing
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
ERROR: from element /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0:
streaming task paused, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstQueue:queue0: Internal data flow error.
Additional debug info:
gstqueue.c(992): gst_queue_handle_sink_event (): /GstPipeline:pipeline0/GstQueue:queue0:
streaming task paused, reason error (-5)
Due to the lack of documentation on the Wowza side, another issue is actually pinpointing the correct IP address to point rtmpsink at; and due to the lack of documentation on the GStreamer side, proper RTMP authentication remains elusive, aside from some forum examples that cannot be confirmed as working because of other variables.
What is the correct GStreamer pipeline for streaming via RTMP using rtmpsink, and how do I properly implement rtmpsink for this, with and without authentication?
Actually, the pipeline you're using is working fine. However, disabling Wowza's RTMP security is a must, as is pointing the sink at the correct address.
Follow the guidelines on this page: https://www.wowza.com/forums/content.php?36-How-to-set-up-live-streaming-using-an-RTMP-based-encoder
Re-check that RTMP is enabled in the application's Playback Types.
Disable all security options to ensure GStreamer compatibility.
In the Playback Security tab, check that No client restrictions is selected (it is selected by default).
In the Sources tab, in the left column, you can check the server settings.
Once all these steps are done, we can launch the previous pipeline:
gst-launch-1.0 -e videotestsrc ! queue ! videoconvert ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://192.168.1.40:1935/livertmp/myStream'
It works, and it is possible to check the result by clicking the Test Players button.
Although it is probably out of scope, it is possible to add audio to the pipeline and improve it by adding some properties that were missing:
gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! x264enc bitrate=1000 tune=zerolatency ! video/x-h264 ! h264parse ! video/x-h264 ! queue ! flvmux name=mux ! rtmpsink location='rtmp://192.168.1.40:1935/livertmp/myStream' audiotestsrc is-live=true ! audioconvert ! audioresample ! audio/x-raw,rate=48000 ! voaacenc bitrate=96000 ! audio/mpeg ! aacparse ! audio/mpeg, mpegversion=4 ! mux.
As for password-protected publishing, it is not straightforward to achieve with GStreamer.
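One possible avenue, untested here: newer GStreamer releases (1.14+) ship an rtmp2sink element in gst-plugins-bad that exposes username and password properties, so something along these lines might work (the credentials and address are placeholders):
gst-launch-1.0 -e videotestsrc is-live=true ! videoconvert ! x264enc tune=zerolatency ! flvmux streamable=true ! rtmp2sink location='rtmp://192.168.1.40:1935/livertmp/myStream' username=publisher password=secret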
I am working on a flying drone that sends a live stream from a Raspberry Pi 2 to my computer through a 3G modem/Wi-Fi, and the stream is made with this command:
sudo raspivid -t 999999999 -w 320 -h 240 -fps 20 -rot 270 -b 100000 -o - | gst-launch-1.0 -e -vvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=192.168.0.103 port=5000
The stream works very well, but I have a problem: while raspivid is running I want to take a picture every five seconds, and when I execute this command while raspivid is running I get this:
root@raspberrypi:/var/www/camera# /usr/bin/raspistill -o cam2.jpg
mmal: mmal_vc_component_enable: failed to enable component: ENOSPC
mmal: camera component couldn't be enabled
mmal: main: Failed to create camera component
mmal: Failed to run camera app. Please check for firmware updates
Now what solutions do I have? Another idea is to use GStreamer with both a udpsink and a filesink writing to an .avi, but I get an error again:
WARNING: erroneous pipeline: could not link multifilesink0 to filesink0
What can i do in this case?
Thanks.
AFAIK only one Raspberry Pi program can grab the camera at a time. Since you're always streaming live video, that precludes adding the five-second snapshots on the Pi side (unless you write something custom from scratch).
What I'd suggest doing instead is handling the five-second snapshots on the receiving side, using the same encoded video data you're using for the live stream. This will ease battery usage on your drone, and all the data you need is being sent already.
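As a rough sketch of that idea (assuming the sender pipeline from the question, i.e. H.264 over RTP on UDP port 5000, and a receiver with avdec_h264 from gst-libav), the receiving side could decode once and tee off one JPEG every five seconds:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! avdec_h264 ! tee name=t t. ! queue ! videoconvert ! autovideosink sync=false t. ! queue ! videorate ! video/x-raw,framerate=1/5 ! videoconvert ! jpegenc ! multifilesink location=snapshot-%05d.jpg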
I am able to access my Raspberry Pi webcam server from all the PCs on the same network, but I found lag (>5 sec) in the streaming. Is it possible to reduce the lag? Any ideas? Please share...
Thanks
Regards
Hema Chowdary
Well, this depends on what you are using to send and receive the stream, but for fun let's say you are using GStreamer. People have reported sub-100 ms latency over Wi-Fi with the following setup, even if I never got below about 110 ms.
Sender:
gst-launch-0.10 alsasrc device=hw:0 ! audio/x-raw-int, rate=48000, channels=1, endianness=1234, width=16, depth=16, signed=true ! udpsink host=192.168.1.255 port=5000
Receiver:
gst-launch-0.10 udpsrc buffer-size=1 port=5000 ! audio/x-raw-int, rate=48000, channels=1, endianness=1234, width=16, depth=16, signed=true ! alsasink sync=false
I did not come up with this configuration for GStreamer myself; SWAP_File did, on the official Raspberry Pi forum.
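The pipelines above are audio-only; for video lag the same ideas apply (small buffers, sync=false on the sink). A comparable video sketch on a modern GStreamer 1.x install, assuming the camera shows up as /dev/video0, the Pi's V4L2 H.264 encoder (v4l2h264enc) is available, and the receiver has avdec_h264 from gst-libav:
Sender:
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! v4l2h264enc ! video/x-h264,level=(string)3.1,stream-format=(string)byte-stream ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.255 port=5000
Receiver:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=false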
I get "segmentation fault" error when I play the following RTSP plugin:
./TEST "( v4l2src always-copy=FALSE input-src=composite ! video/x-raw-yuv,format=\(fourcc\)NV12, width=320,height=240 ! queue ! dmaiaccel ! dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=1000000 ! rtph264pay name=pay0 pt=96 )"
TEST is the test-launch application from the RTSP server examples. I get the following error:
davinci_resizer davinci_resizer.2: RSZ_G_CONFIG:0:1:124
vpfe-capture vpfe-capture: IPIPE Chained
vpfe-capture vpfe-capture: Resizer present
tvp514x 1-005d: tvp5146 (Version - 0x03) found at 0xba (DaVinci I2C adapter)
vpfe-capture vpfe-capture: dma_alloc_coherent size 7168000 failed
Segmentation fault
Can anyone tell me what is going wrong?
Thanks,
Maz
See: vpfe-capture vpfe-capture: dma_alloc_coherent size 7168000 failed
A memory allocation has failed somewhere in your capture driver. This question is better suited for TI's e2e list, no? I don't think this is a generic GStreamer issue, but an issue specific to the embedded hardware.
Why don't you get a simple filesrc ! h264parse ! rtph264pay pipeline up first, and then slowly make it more and more complicated? (Replace the bitstream with YUV and do the encoding, then add the capture.)
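For example, something along these lines as a starting point (sample.h264 is a placeholder for any raw H.264 elementary stream):
./TEST "( filesrc location=sample.h264 ! h264parse ! rtph264pay name=pay0 pt=96 )"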