How to get AR.Drone 2 Video with UDP Network - ar.drone

Has anyone already tried to get the video stream of the Parrot AR.Drone over UDP?
We have already bricked 2 drones trying to change their software! :D

Related

How to transimit video streaming from HoloLens (C#) to PC (python) by socket communication?

I can already send data from the HoloLens (a Unity app written in C#) to a PC (also C#) over a socket connection. But how do I send the video stream in real time (recording starts when I open the application on the HoloLens) from the HoloLens to the PC over my existing socket framework? My guess is that I need to access the HoloLens camera, record video, encode the frames to bytes, and then transmit those bytes over my existing socket. Is that right, and how do I implement it?
By the way, I would like the PC to receive the video in Python so that I can process it in later steps.
To stream video in real time between the HoloLens and a PC client, WebRTC should meet your needs. Check out the MixedReality-WebRTC project; it helps you integrate peer-to-peer real-time audio and video communication into your application. It also implements the local video capture you need and encapsulates it as a Unity3D component for rapid prototyping and integration.
You can read its official documentation via this link: MixedReality-WebRTC 1.0.0 documentation.
Moreover, this project can also be used in desktop applications, or even other non-mixed-reality applications, which can save you development costs.
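If you do stay with the plain-socket approach from the question rather than WebRTC, the Python side needs to know where each encoded frame ends. A minimal sketch of length-prefixed framing (the payload here is dummy bytes standing in for encoded camera data; the C# sender would need to write the same 4-byte big-endian length header):

```python
import socket
import struct

def send_frame(sock, frame: bytes) -> None:
    # Prefix each frame with its length so the receiver can split the stream.
    sock.sendall(struct.pack(">I", len(frame)) + frame)

def recv_exact(sock, n: int) -> bytes:
    # TCP recv() may return partial data; loop until n bytes have arrived.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock) -> bytes:
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)

if __name__ == "__main__":
    # socketpair() stands in for the HoloLens -> PC connection.
    a, b = socket.socketpair()
    send_frame(a, b"\x00\x01fake-encoded-frame")
    print(recv_frame(b) == b"\x00\x01fake-encoded-frame")
```

This only covers framing; the actual camera capture and H.264/JPEG encoding on the HoloLens side is what MixedReality-WebRTC would otherwise handle for you.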

Problem when playing RTSP stream with Gstreamer

Hardware & software: Raspberry Pi 4, IP camera, Raspbian Buster, GStreamer 1.14.1 (from the repository). The Raspberry Pi and the camera are on the same local network.
I'm trying to play the RTSP video stream with the following pipeline:
gst-launch-1.0 rtspsrc location='rtsp://web_camera_ip' ! rtph264depay ! h264parse ! v4l2h264dec ! autovideosink
Within one minute, the playback stops.
Log:
0:00:00.681624278 1491 0xb4810980 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<fakesrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
Progress of execution: (request) Sent PLAY request
0:00:01.155264612 1491 0xb1507fb0 WARN v4l2 gstv4l2object.c:4186:gst_v4l2_object_probe_caps:<v4l2h264dec0:src> Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: An inadmissible argument
0:00:01.166871436 1491 0xb1507fb0 WARN v4l2 gstv4l2object.c:4186:gst_v4l2_object_probe_caps:<v4l2h264dec0:src> Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: An inadmissible argument
0:00:01.170107746 1491 0xb1507fb0 FIXME basesink gstbasesink.c:3145:gst_base_sink_default_event:<autovideosink0-actual-sink-xvimage> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:01.174576265 1491 0xb1507fb0 WARN v4l2videodec gstv4l2videodec.c:808:gst_v4l2_video_dec_decide_allocation:<v4l2h264dec0> Duration invalid, not setting latency
0:00:01.211967620 1491 0xb48105b0 WARN v4l2bufferpool gstv4l2bufferpool.c:1189:gst_v4l2_buffer_pool_dqbuf:<v4l2h264dec0:pool:src> Driver should never set v4l2_buffer.field to ANY
This line appears at the moment playback stops:
0:00:13.102438914 1491 0xb48105b0 WARN v4l2allocator gstv4l2allocator.c:1349:gst_v4l2_allocator_dqbuf:<v4l2h264dec0:pool:src:allocator> V4L2 provided buffer has bytesused 0 which is too small to include data_offset 0
What I tried (none of it solved the problem):
Replacing autovideosink with fakesink
Replacing v4l2h264dec with avdec_h264_mmal
Various rtspsrc parameters
Playback with playbin
The error does not appear if the decoder and sink are replaced with fakesink:
gst-launch-1.0 rtspsrc location='rtsp://web_camera_ip' ! rtph264depay ! h264parse ! fakesink
Additional information:
My camera overlays the time (hours, minutes, seconds) on the image. Playback always stops at a particular seconds value. When the camera restarts, that value changes randomly (17, 32, 55, ...). Changing the time on the camera does not solve the problem.
VLC on the Raspberry Pi plays the stream from this camera without any problems.
GStreamer plays local H.264 files without any problems.
GStreamer plays an RTSP TV-channel broadcast from the Internet without any problems.
I also tried playing the substream (low resolution) from the IP camera and an RTSP stream from a smartphone (the IP Webcam application). The same problem appears.
When running the same SD card on a Raspberry Pi 3, the problem remains.
On a Raspberry Pi 3 with Raspbian Stretch and GStreamer 1.10 from the repository there was no problem.
Thank you for the answers!
The problem was in my local network. The RTSP stream from any device is periodically interrupted for a split second. With VLC this is not visible, because it instantly restarts the broadcast; GStreamer, however, interrupts the stream and generates an error message.
I connected my IP camera directly to the Raspberry Pi over Ethernet, and everything works fine.
Broadcast over the Internet is also stable.
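Until the underlying network drops are fixed (or GStreamer is upgraded, as the next answer suggests), a workaround is a small watchdog that relaunches the pipeline whenever it exits. A minimal Python sketch; the gst-launch-1.0 command line is the one from the question, and the restart limit and delay are illustrative assumptions:

```python
import subprocess
import time

def run_with_restart(cmd, max_launches=None, delay=1.0):
    """Run cmd, relaunching it whenever it exits; returns the launch count."""
    launches = 0
    while max_launches is None or launches < max_launches:
        launches += 1
        subprocess.run(cmd)  # blocks until the pipeline exits (e.g. on a stream drop)
        time.sleep(delay)    # brief pause before restarting
    return launches

# Example with the pipeline from the question (runs until interrupted):
# run_with_restart(["gst-launch-1.0", "rtspsrc", "location=rtsp://web_camera_ip",
#                   "!", "rtph264depay", "!", "h264parse",
#                   "!", "v4l2h264dec", "!", "autovideosink"])
```

This mirrors what VLC effectively does by silently restarting the broadcast, at the cost of a visible gap on each restart.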
I was having exactly the same issue. Luckily, it appears to be fixed in GStreamer 1.16.2, which I built using a variation of the script at
https://github.com/PietroAvolio/Building-Gstreamer-Raspberry-Pi-With-SRT-Support/blob/master/gstreamer-build.sh
With 1.16.2 it just keeps going and doesn't hang.

How do I record video from a remote webrtc camera?

I have a Raspberry Pi running WebRTC via UV4L. It is awesome! I want to record the video from the camera on a server. It's your basic surveillance-camera setup: a central Linux server with lots of storage space, remote IP cameras, etc. I've read dozens of pages and still can't figure it out. I tried all this Kurento mumbo jumbo, but it's all retch and no vomit; it never gets there. What's the command to grab the RPi video and dump it to disk? Please help!
UV4L already supports audio+video recording on the server (as well as on the client) if you use it with the Janus WebRTC gateway. Have a look at the bottom of the Video Conference DEMO OS page for more details. At the moment, you have to use the REST API to log into a Janus room and turn recording on/off. The REST API is ideal if you want to control UV4L from a custom application, but there is also a panel which allows you to dynamically send REST requests to the UV4L server.

TCP traffic slows down udp traffic

I am trying to write a VR remote-gaming system:
a TCP video-streaming program that sends H.264 frames of desktop screenshots to my Android phone, and an Android app that sends orientation data over UDP to the desktop. Everything was tested on the local network.
If I run only the sensor-sender program, I can see the game camera rotating in real time.
But if I run the two programs at the same time, the TCP screen streaming stays smooth, but the sensor-data reception starts to lag.
Lowering the sending FPS makes it smoother, but it is still not acceptable.
TCP traffic is 500-1000 KB/s.
UDP traffic is about 600 B/s.
Does this mean network congestion is occurring? How can I overcome it?
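If both flows share one uplink (typically Wi-Fi here), the TCP video sender can queue enough data in its send buffer to delay the small UDP datagrams behind it. Two socket-level mitigations worth trying, sketched in Python; the 64 KB buffer cap is an illustrative assumption, and IP_TOS prioritisation is best-effort on consumer network gear:

```python
import socket

# Video sender (TCP): avoid deep send queues that everything else waits behind.
video = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
video.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)       # send writes immediately
video.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 64 * 1024)  # cap queued video bytes

# Sensor sender (UDP): mark packets as low-delay so capable networks can
# prioritise them (0x10 = IPTOS_LOWDELAY).
sensor = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sensor.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, 0x10)
```

Capping SO_SNDBUF makes the sender block (or fail fast) instead of silently buffering seconds of video, which keeps the latency of everything sharing the link bounded.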
What if I change the video-streaming protocol to UDP? But an I-frame may be larger than 65 KB (the maximum UDP datagram size).
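On the 65 KB point: a single UDP datagram tops out at about 65,507 payload bytes, and anything above the path MTU gets IP-fragmented anyway, so video-over-UDP senders normally split each frame into MTU-sized chunks with a small sequence header (this is essentially what RTP does). A minimal sketch of that split/reassemble step, with an illustrative header format:

```python
import struct

CHUNK = 1200  # stay under a typical ~1500-byte MTU minus headers (assumption)
HDR = struct.Struct(">IHH")  # frame_id, chunk_index, chunk_count

def fragment(frame_id: int, frame: bytes):
    """Split one encoded frame into UDP-sized packets with a reassembly header."""
    chunks = [frame[i:i + CHUNK] for i in range(0, len(frame), CHUNK)] or [b""]
    return [HDR.pack(frame_id, i, len(chunks)) + c for i, c in enumerate(chunks)]

def reassemble(packets):
    """Rebuild the frame; returns None while any chunk is still missing."""
    parts, total = {}, None
    for p in packets:
        frame_id, idx, count = HDR.unpack_from(p)
        parts[idx] = p[HDR.size:]
        total = count
    if total is None or len(parts) < total:
        return None  # a lost datagram means the frame is incomplete
    return b"".join(parts[i] for i in range(total))
```

With UDP you then have to decide what to do with incomplete frames (drop them, or request a new I-frame), which is the complexity TCP was hiding.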

Raspberry Pi Motion-Mmal: I want to disable live stream capabilities

I have a few questions about Motion-MMAL for the Raspberry Pi B+ model, running on Raspbian. Sorry if these are noob questions.
1) I want to disable the live-stream capability completely; however, I can only find information on how to keep the live stream local.
2) If I do not visit the local address for the live stream, is it still technically uploading data and live streaming?
I think if you comment out the webcam_port value in motion.conf, you will not be streaming; the default is 0, which means disabled.
Motion will still record video when there is actual motion, but nobody will be able to view a live stream.
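For reference, the answer above corresponds to setting the option explicitly in motion.conf rather than relying on the default. Note that webcam_port is the option name in older Motion versions; newer releases renamed it to stream_port, so treat the exact name as an assumption and check your version's documentation:

```
# motion.conf fragment: disable the HTTP live stream entirely
webcam_port 0
```

With the stream disabled, motion-triggered recording to disk continues to work as before.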