Transmit a live feed from a webcam over RF and receive the same feed on a different Pluto - raspberry-pi

Is there any way we can transmit a live feed from a Raspberry Pi camera / webcam over an RF channel with a software-defined radio, and receive the feed somewhere else?
I tried to connect everything in MATLAB Simulink, but the delay is totally unacceptable. Is there any way we can transmit that over RF?
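For illustration, here is a minimal Python sketch of the Pluto side, assuming pyadi-iio and a camera whose hardware-encoded H.264 output is read from a subprocess pipe. It only shows how encoded video bytes could be mapped to baseband samples and handed to the radio; it is not a working video link, since a real one also needs framing, pulse shaping, synchronization and error correction.

# Minimal sketch: push H.264 bytes from the camera to an ADALM-Pluto as naive BPSK.
# Assumes pyadi-iio is installed and the Pluto is reachable at its default address.
import subprocess
import numpy as np
import adi  # pyadi-iio

SPS = 8  # samples per symbol (arbitrary choice for this sketch)

sdr = adi.Pluto("ip:192.168.2.1")
sdr.sample_rate = int(2e6)
sdr.tx_lo = int(915e6)              # choose a band you are allowed to transmit in
sdr.tx_hardwaregain_chan0 = -10

# libcamera-vid (or raspivid on older images) writes raw H.264 NAL units to stdout.
cam = subprocess.Popen(
    ["libcamera-vid", "-t", "0", "--width", "640", "--height", "480",
     "--framerate", "25", "-o", "-"],
    stdout=subprocess.PIPE,
)

while True:
    chunk = cam.stdout.read(4096)
    if not chunk:
        break
    bits = np.unpackbits(np.frombuffer(chunk, dtype=np.uint8))
    symbols = 2.0 * bits - 1.0                    # naive BPSK, no pulse shaping
    samples = np.repeat(symbols, SPS).astype(np.complex64)
    sdr.tx(samples * 2**14)                       # Pluto expects roughly +/-2**14 scaling

On the receive side, a second Pluto would have to recover symbol timing and reassemble the byte stream before the video could be decoded, which is the part this sketch deliberately leaves out.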

Related

How to stream live video in H.264 using a Raspberry Pi

So I have been trying for (embarrassingly) long to efficiently stream video from my Pi to my computer over the internet. Video encoding and streaming still look fuzzy to me, since most examples hide the things that happen behind the scenes.
Things I've tried:
OpenCV capture, encode as JPEG, send through a socket and decode - I get a working video feed, but the data usage is crazy.
Using the Picamera module, capture and send over a socket, receive and pipe into VLC - also worked, data usage is low, but the latency is even higher than with the first approach. Also, I want to play the video in my own PyQt UI.
Using the Picamera module, capture and send over a socket, pipe into FFmpeg, and try to use OpenCV to capture from its stdout - I got "bad argument" errors from the cv2.VideoCapture(stdout) call.
All I want is an efficient way to transmit video with low bandwidth and latency using H.264, without pre-written protocols like RTMP or having to install servers, and then blast it onto my PyQt UI. Something like:
data = video.capture().encode('h264')
udp_socket.sendto(data, (ip, port))
And on the receiving side:
data, addr = udp_socket.recvfrom(65534)
frame = data.decode()
update_frame(frame)
The closest thing I got to this was with the first approach.
Any help would be greatly appreciated.
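A minimal sketch of something close to that, assuming raspivid (or libcamera-vid on newer Raspberry Pi OS images) so the H.264 encoding happens in the camera's hardware encoder, and ffmpeg on the receiving machine to decode. The address, port, resolution and bitrate are placeholder values.

Sender (on the Pi):

# Chop the hardware-encoded H.264 byte stream into UDP datagrams.
import socket
import subprocess

RECEIVER = ("192.168.1.10", 5000)   # hypothetical address of the receiving PC
CHUNK = 1400                        # keep datagrams under a typical MTU

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

cam = subprocess.Popen(
    ["raspivid", "-t", "0", "-w", "640", "-h", "480",
     "-fps", "25", "-b", "1000000", "-o", "-"],
    stdout=subprocess.PIPE,
)

while True:
    chunk = cam.stdout.read(CHUNK)
    if not chunk:
        break
    sock.sendto(chunk, RECEIVER)

Receiver (on the computer):

# Feed UDP payloads into ffmpeg, read decoded BGR frames back out.
import socket
import subprocess
import threading
import numpy as np

W, H = 640, 480
FRAME_BYTES = W * H * 3

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5000))

dec = subprocess.Popen(
    ["ffmpeg", "-loglevel", "quiet", "-fflags", "nobuffer",
     "-f", "h264", "-i", "pipe:0",
     "-f", "rawvideo", "-pix_fmt", "bgr24", "pipe:1"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE,
)

def feed_decoder():
    # Push every datagram into ffmpeg's stdin.
    while True:
        payload, _addr = sock.recvfrom(65536)
        dec.stdin.write(payload)

threading.Thread(target=feed_decoder, daemon=True).start()

while True:
    raw = dec.stdout.read(FRAME_BYTES)
    if len(raw) < FRAME_BYTES:
        break
    frame = np.frombuffer(raw, dtype=np.uint8).reshape((H, W, 3))
    # hand `frame` to the PyQt UI here, e.g. convert it to a QImage in update_frame(frame)

One caveat: raw H.264 over plain UDP has no recovery, so a lost datagram corrupts the picture until the next keyframe arrives; that is the trade-off for skipping protocols like RTP/RTMP.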

How to get real-time audio data from a Raspberry Pi into Simulink and apply real-time signal processing

I am facing a problem getting live (real-time) audio data from a Raspberry Pi into Simulink.
I want to get the data continuously from the microphone that is connected to the Raspberry Pi and process it in Simulink continuously too.
I am using the ALSA Audio Capture block, but it's not working for me.
Has anyone tried this before and could help, please?
Thank you.
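One check that may help narrow this down (a suggestion, not a confirmed fix): verify on the Pi itself that ALSA can capture from the microphone outside Simulink, and note the capture device name, which is typically what the ALSA Audio Capture block has to point at. A minimal Python sketch, where the device name plughw:1,0 is only an example:

# Sanity-check ALSA capture on the Pi before involving Simulink.
import subprocess

# List the ALSA capture devices (card and device numbers).
subprocess.run(["arecord", "-l"], check=True)

# Record 5 seconds of 16 kHz, 16-bit mono audio from the chosen device.
subprocess.run(
    ["arecord", "-D", "plughw:1,0", "-f", "S16_LE", "-r", "16000",
     "-c", "1", "-d", "5", "test.wav"],
    check=True,
)

If test.wav plays back correctly, the problem is on the Simulink side (device name, sample rate or frame size in the block) rather than with the microphone or ALSA itself.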

How to stream video from a Raspberry Pi to Elixir and to the web

I'm trying to stream high-quality live video from a Raspberry Pi 3 + camera module. I want to publish the stream on a public webpage (using video.js, I think). I also want an Elixir server (on the same LAN as the Pi) to consume the stream to detect faces and motion in it. I am a video streaming noob.
What's the best way to do this? Specifically:
What transport mechanism should I use on the Pi? RTP? WebRTC? Is that even a transport mechanism?
How do I pull images out of a (whatever transport mechanism I used above) stream in Elixir? Is there a library that does it?
If I want to support more simultaneous users than the Pi can handle, what's the right way to proxy the stream through Phoenix?
Thanks!

Get video stream from TCP/IP port (AR.Drone) into a Matlab figure

I am trying to obtain video from an IP camera, in particular the HD camera on the Parrot AR.Drone 2.0. I can get the video stream with either the MPEG-4.2 software encoder, an H.264-like codec, or an MJPEG-like codec, and it is sent by the drone over a TCP/IP port.
The point is that I have already been able to display this video stream in ffplay (a program external to Matlab) on Windows 7, but I would like to store the video in a variable in Matlab in order to apply image processing to it and control the drone depending on what it is "seeing".
I have already looked into this and have not found anything illuminating, so I would be very grateful if any of you know how to get these images straight from the TCP/IP port, or even from the ffplay program.

Raspberry Pi Motion-MMAL: I want to disable live stream capabilities

I have a few questions about Motion-MMAL for the Raspberry Pi B+ model, running on Raspbian. Sorry if these are noob questions.
1) I want to disable the live stream capability completely; however, I can only find information on how to keep the live stream local.
2) If I do not visit the local address for the live stream, is it still technically uploading data and live streaming?
I think if you comment out the webcam_port value in motion.conf, or set it to 0 (the default, which means disabled), then you will not be streaming.
Motion will still record video when there is actual motion, but nobody will be able to view a live stream.
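For reference, the relevant motion.conf lines would look something like this (the parameter names vary with the Motion version: older builds use webcam_port / webcam_localhost, newer ones use stream_port / stream_localhost):

# Disable the live stream entirely (0 = disabled)
webcam_port 0

# Or keep the stream but restrict it to the Pi itself:
# webcam_port 8081
# webcam_localhost on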