How to stream live video in H.264 using a Raspberry Pi - raspberry-pi

So I have been trying for an (embarrassingly) long time to efficiently stream video from my Pi to my computer over the internet. Video encoding and streaming still look fuzzy to me, since most examples hide what happens behind the scenes.
Things I've tried:
OpenCV capture, encode each frame as JPEG, send through a socket and decode - I get a working video feed, but the data usage is huge.
Using the picamera module, capture and send over a socket, receive and pipe into VLC - this also worked, and data usage is low, but latency is even higher than with the first approach. Also, I want to play the video in my own PyQt UI.
Using the picamera module, capture and send over a socket, pipe into FFmpeg, and try to use OpenCV to capture from its stdout - I got "bad argument" errors from the cv2.VideoCapture(stdout) call.
All I want is an efficient way to transmit video with low bandwidth and latency using H.264, without pre-written protocols like RTMP or installing servers, and then display it in my PyQt UI. Something like:
bytes = video.capture().encode('h264')
udp_socket.sendto(bytes, (ip, port))
And on the receiving side:
data = udp_socket.recvfrom(65534)
frame = data.decode()
update_frame()
The closest thing I got to this was with the first approach.
Any help would be greatly appreciated.
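To make the capture-and-send half more concrete, here is a minimal sketch of what it could look like with the legacy picamera module; the UdpOutput helper is made up for illustration, and the address, port, chunk size and duration are placeholders:

# Sketch: let the Pi's hardware encoder produce H.264 and push each encoded
# chunk into a UDP socket. picamera calls write() on any file-like object
# passed to start_recording().
import socket
import picamera

class UdpOutput:
    def __init__(self, ip, port):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.addr = (ip, port)

    def write(self, buf):
        # One encoded chunk can exceed a single datagram; send ~1400-byte pieces.
        for i in range(0, len(buf), 1400):
            self.sock.sendto(buf[i:i + 1400], self.addr)

with picamera.PiCamera(resolution=(640, 480), framerate=25) as camera:
    camera.start_recording(UdpOutput('192.168.1.10', 5000), format='h264')
    camera.wait_recording(60)  # stream for one minute, then stop
    camera.stop_recording()

The receiving side is the harder part: the datagrams carry raw H.264 NAL units, so something like FFmpeg or GStreamer still has to decode them into frames before they can be drawn onto a PyQt widget.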

Related

How is webcam streaming usually done?

I am currently doing a small project, and one of the components is to capture the webcam stream on one side and transfer it to the other (Client --> Server). Right now I have the stream on the server as bytes, and as far as I know I should transfer these bytes using UDP. My question is how to do that:
Should it be enclosed in a file and then transferred?
Should I transfer the raw bytes?
Should I create a buffer on the client side and show it on the screen when it gets full?
In short, I would like to know how to implement the transfer of the stream from the server to the client (I need just one side).
You can stream a webcam to a client in several ways:
Use Windows Media Server / Flash Media Server. Push your webcam to the server with Windows Media Encoder or Flash Media Encoder, and use the server's live link for playback on the client (Windows/web).
Use Windows Media Encoder to stream your webcam to anyone without a server involved. When your encoder starts, you will get a URL to view your stream, which you can use for playback on the client (Windows/web).
Use a third-party streaming service, which gives you a publishing point to publish your webcam stream, and use the link they provide for playback on the client (Windows/web). (Check Brightcove or Mogulus by Livestream.)
Hope this helps.
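For the raw-bytes option listed in the question, a minimal sketch of pushing frames over UDP in fixed-size chunks; the port, chunk size and 4-byte length-prefix framing are assumptions, and in practice each frame would be compressed (JPEG, H.264, ...) before sending, since raw frames are very large:

# Sketch: send each (already compressed) frame as a 4-byte length header
# followed by fixed-size chunks; the client reassembles it and then shows it.
import socket
import struct

CHUNK = 1400                  # stay under a typical MTU
ADDR = ('127.0.0.1', 9999)    # placeholder address/port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_frame(frame_bytes):
    sock.sendto(struct.pack('>I', len(frame_bytes)), ADDR)  # announce the size
    for i in range(0, len(frame_bytes), CHUNK):
        sock.sendto(frame_bytes[i:i + CHUNK], ADDR)

def recv_frame(server_sock):
    # Client side: read the size header, then collect chunks until complete.
    (size,) = struct.unpack('>I', server_sock.recvfrom(4)[0])
    buf = b''
    while len(buf) < size:
        data, _ = server_sock.recvfrom(CHUNK)
        buf += data
    return buf                 # one complete frame, ready to decode and display

A real implementation would also have to handle lost and reordered datagrams, which this sketch ignores.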

How to get libVLC to play segmented packets from a stream?

I am curious as to how to use libVLC to play segments of a clip downloaded over the network. We are developing a sort of P2P program where packets can take several paths to the destination. Once they get there, I need to know how to play them. My idea is that they will be placed in a buffer and the player will play the packets in FIFO order.
The code I have is a modified version of this tutorial. I know that it can directly receive streams and pretty much play whatever file we give it. However, I don't know how to set it up so that I can place a segment of a file (obtained from the packets) into the buffer and let it play.
A lot of my searching turns up libvlc_video_set_callbacks(), but I fail to see how that helps. If it matters, we are only considering MP4 files for now and will expand later.
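The buffering half of that idea (collect incoming segments and hand them to the player in FIFO order even when they arrive out of order) could look like the sketch below; the (seq, payload) packet format is an assumption, and getting the ordered bytes into libVLC afterwards, for example by writing them to a pipe or temporary file that VLC plays, is a separate step:

# Sketch of a FIFO/reorder buffer: packets may arrive out of order over the
# different P2P paths, but payloads are released strictly in sequence order.
import heapq

def in_order(packets):
    """packets: iterable of (seq, payload) tuples, possibly out of order."""
    pending = []
    next_seq = 0
    for seq, payload in packets:
        heapq.heappush(pending, (seq, payload))
        # Flush everything contiguous with what has already been released.
        while pending and pending[0][0] == next_seq:
            yield heapq.heappop(pending)[1]
            next_seq += 1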

How to stream high-quality video and audio from an iPhone to a remote server over the Internet

I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
Sample code for AVFoundation capturing to file chunks (~10 seconds) that are ready for compression?
Sample code for compressing the video and audio for transmission across the Internet?
FFmpeg?
Sample code for HTTP Live Streaming, sending files from the iPhone to an Internet server?
My goal is to use the iPhone as a high quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer at the iPhone.
Thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length. The AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server. If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server->client streaming protocol. There is no notion of push as far as I know; you would have to create a custom server to accept pushed, segmented video streams (which does not really make a lot of sense given the way HLS is designed - see the RFC draft). A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could easily be done with FFmpeg, either on the command line or via a custom program.
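The server-side segmenting step mentioned above could be as simple as invoking FFmpeg on each uploaded file; a sketch, assuming ffmpeg is on the PATH, with placeholder paths and segment length:

# Sketch: split an uploaded H.264/AAC MP4 into an HLS playlist plus ~10 s
# segments without re-encoding, so clients can play it via HTTP Live Streaming.
import subprocess

def segment_for_hls(mp4_path, out_dir):
    subprocess.run([
        'ffmpeg', '-i', mp4_path,
        '-c', 'copy',            # already H.264/AAC, so just remux
        '-f', 'hls',
        '-hls_time', '10',       # target segment duration in seconds
        '-hls_list_size', '0',   # keep all segments in the playlist (VOD style)
        out_dir + '/index.m3u8',
    ], check=True)

segment_for_hls('upload.mp4', 'www/stream')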
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
Capture video and audio using AVFoundation. You can set the video and audio codecs to kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC, and specify frame sizes and frame rates, in AVCaptureformatDescription. This will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP server/library such as Live555.

Streaming live H.264 video via RTSP to the iPhone does work! (with example)

Using FFmpeg, Live555, JSON
I'm not sure how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects like FFmpeg, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras that can be viewed with the free "Dropcam for iPhone" app on the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which looks like maybe some of the stream is tunneled?
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iPhone.
Thanks for the info, TinC0ils. After digging a little deeper, I've read that they have modified the Axis camera with custom firmware to limit the streaming to a single 320x240 H.264 feed, to better provide consistent-quality video over different networks and, as you point out, to be less of a draw on the phone's hardware, etc. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the inherent latency of "HTTP Live Streaming". I think Dropcam have done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: THE RTSP PROTOCOL DOES WORK ON THE IPHONE!
They are using open source projects to receive the frames and decode them in software instead of using hardware decoders. This will work; however, it runs counter to Apple's requirement that you use their HTTP Live Streaming. It also requires more CPU, so it may not decode video at the desired fps/resolution on older devices and/or will decrease battery life compared to HTTP streaming.

Best way to create FLV stream from screenshots

I want to create an FLV stream generated from images taken from my DirectX application, to end up on a webpage.
My current plan is (and has been) to send screenshots as JPEGs from the DX app to a client running on Linux. This client converts the JPEGs to an MJPEG stream, and ffmpeg converts the MJPEG stream to FLV, ending up in Flash Player in the browser.
Something like:
run the DX app on a Windows machine; it listens for a connection to send screenshot JPEGs to
on the Linux machine: ./jpg_to_mjpeg_client | ffmpeg -f mjpeg -i - output.flv
I thought the plan was good, but I'm stuck now: ffmpeg doesn't seem to handle the MJPEG stream coming from the client correctly. I used some code I found on the net for creating the MJPEG stream from the JPEGs, and I understand that there is no real specification for the MJPEG format, so maybe they don't use the same MJPEG variant or something.
Right now I'm sending [size of JPEG buffer], [JPEG buffer] for every frame from the DX app. I guess I could do some encoding there too somehow, but on the other hand I don't want to waste too much CPU on the rendering machine either.
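Concretely, that framing could be read on the receiving (Linux) side roughly like the sketch below; the 4-byte length prefix in network byte order is an assumption, so adjust it to whatever the DX app actually writes:

# Sketch: read "[size of JPEG buffer][JPEG buffer]" frames from a TCP socket.
import socket
import struct

def recv_exact(sock, n):
    # Read exactly n bytes, or fail if the sender disconnects mid-frame.
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('socket closed mid-frame')
        buf += chunk
    return buf

def read_frames(sock):
    while True:
        (size,) = struct.unpack('>I', recv_exact(sock, 4))
        yield recv_exact(sock, size)   # one complete JPEG, ready to pass along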
How would you do it? Any tips are highly appreciated: libraries/APIs to use, other solutions... I don't have much experience with video encoding at all, but I know my way around "general programming" pretty well.
C or C++ is preferred, but Java or Python might be OK too. I want it pretty fast though - it has to be created in real time; a frame from the DX app should end up in the browser as soon as possible :-)
Oh, and in the future, the plan is for it to be interactive, so that I can communicate with/control the DX app from the web app in the browser. Might be good to add that information too. Sort of like a web-based VCR where the movie is rendered in real time from the DX app.
Thanks,
Use GStreamer on Linux. You can patch together almost any combination of inputs and outputs using whatever codecs you like. It is a bit of a hassle to learn.
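For example, a pipeline along these lines could replace the custom client entirely; this is only a sketch, assuming GStreamer 1.x with the jpeg, x264 and flv plugins installed and a sender that writes bare JPEG frames to the socket (no length prefix), with a placeholder port and output location:

# Sketch: accept a stream of JPEG frames over TCP, decode them, re-encode to
# H.264 with low-latency settings and mux into a streamable FLV.
import subprocess

pipeline = (
    'tcpserversrc port=5000 ! jpegparse ! jpegdec ! videoconvert ! '
    'x264enc tune=zerolatency ! flvmux streamable=true ! '
    'filesink location=output.flv'
)
# gst-launch-1.0 expects each element, property and "!" as a separate argument,
# which is why the pipeline string is simply split on whitespace.
subprocess.run(['gst-launch-1.0'] + pipeline.split(), check=True)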