I want to live stream with HTTP Live Streaming, but I have a problem with libx264, or maybe with something else.
My hardware and software environment:
MacBook Pro
VirtualBox with Ubuntu 16.04
Nginx and FFmpeg (in Ubuntu)
I am able to stream a static video file (in Ubuntu with Nginx and FFmpeg).
The FFmpeg command is as follows:
$ffmpeg -i /my/sample/video.mp4 -codec:v libx264 -f hls /output/file.m3u8
But when it comes to the webcam (live streaming), Safari on Mac OS fails to open the stream. I used the following command:
$ffmpeg -i /dev/video0 -codec:v libx264 -f hls /output/file.m3u8
I suspect the problem is with libx264, because when I use the mpeg2video encoder, Safari on Mac OS can indeed play the stream:
$ffmpeg -i /dev/video0 -codec:v mpeg2video -f hls /output/file.m3u8
I know there is a library called video4linux2; should I use it for capturing my webcam? I still don't know the appropriate FFmpeg command for HTTP Live Streaming, though (I tried FFserver, but it gave an error along the lines of "cannot rename hls").
Can anyone shed some light on my problem?
I have figured it out!
The reason Mac OS Safari cannot open an HTTP Live Stream encoded by libx264 is that the decoder on Mac OS does not support the pixel format libx264 picks by default!
Simply add -pix_fmt yuv420p to the FFmpeg command and everything works fine:
$ffmpeg -f video4linux2 -i /your/webcam/path -codec:v libx264 -pix_fmt yuv420p /output/file.m3u8
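If you also want control over segment length and playlist size, the hls muxer's -hls_time and -hls_list_size options can be added; the values below are just an illustration, not something required for the fix:
$ffmpeg -f video4linux2 -i /your/webcam/path -codec:v libx264 -pix_fmt yuv420p -f hls -hls_time 4 -hls_list_size 5 /output/file.m3u8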
SOLVED & EDITED -
Catch: doing a live stream on Facebook with FFmpeg.
In the past it was easy; I did it many times, since Facebook used RTMP.
But now Facebook uses RTMPS, so I am getting various errors. I have tried 100 commands.
I have an image test.png and an audio file test.m4a (it's a podcast), and the Facebook stream key is 1234.
(I have tried 100 kinds of commands, so I can't post them here, and I can't post the errors either.)
So please, can someone help me go live on my Facebook page with the image + m4a file?
I prefer CentOS, but I will manage Ubuntu if you prefer.
Regards.
Solved: see my answer, it might help someone.
SOLVED
Hope it will be useful for someone.
I tried everything I could find on Google and Stack Overflow.
Nothing worked.
Then I did it my own way, and it worked after 2 hours.
I will stream the video out.mp4 from my server to Facebook.
Install FFmpeg 4 (older versions have issues with RTMPS):
ffmpeg -re -y -i out.mp4 -c:a copy -ac 1 -ar 44100 -b:a 96k -vcodec libx264 -pix_fmt yuv420p -tune zerolatency -f flv -maxrate 2000k -preset veryfast "rtmps://live-api-s.facebook.com:443/rtmp/key"
You can stream on as many platforms as you want.
PS: if you want to stream image + audio, replace out.mp4.
In my case I used FFmpeg to make a video from the m4a file first (it then streams without lag or buffering).
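For completeness, a video like that can be built from the question's test.png and test.m4a with something along these lines (the exact options here are my own sketch, not the command from this answer):
ffmpeg -loop 1 -i test.png -i test.m4a -c:v libx264 -tune stillimage -pix_fmt yuv420p -c:a aac -b:a 128k -shortest out.mp4
-loop 1 repeats the still image for the duration of the audio, and -shortest ends the output when the audio ends.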
If you are on CentOS or Red Hat, you may find it difficult to install ffmpeg, or your installation may be missing some of the libraries/protocols ffmpeg needs for Facebook Live. In that case, running the Docker image is a great idea:
Install Docker and pull the ffmpeg Docker image
Run the ffmpeg Docker image with the necessary arguments
To install Docker, run the following commands:
Remove older versions of Docker, if installed:
sudo yum remove docker docker-client docker-client-latest docker-common docker-latest docker-latest-logrotate docker-logrotate docker-engine
Add the Docker yum repository:
sudo yum install -y yum-utils
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
Install Docker and start/enable the Docker service:
sudo yum install docker-ce docker-ce-cli containerd.io
sudo systemctl start docker
sudo systemctl enable docker
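Optionally, you can confirm the installation works by running the standard hello-world test image (this check comes from the Docker docs, not from the original answer):
sudo docker run hello-world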
Reference : https://docs.docker.com/engine/install/centos/
To run ffmpeg for Facebook Live, run the following command:
docker run -v $(pwd):$(pwd) -w $(pwd) jrottenberg/ffmpeg -re -y -i [VIDEO_FILE] -c:a copy -ac 1 -ar 44100 -b:a 96k -vcodec libx264 -pix_fmt yuv420p -tune zerolatency -f flv -maxrate 2000k -preset veryfast "rtmps://live-api-s.facebook.com:443/rtmp/[KEY]"
Replace [VIDEO_FILE] with the file you want to stream live to Facebook and [KEY] with the stream key from Facebook. Reference: https://hub.docker.com/r/jrottenberg/ffmpeg/
Please note that you need to visit the Facebook Live producer page (https://www.facebook.com/live/producer/) before you can run the above command.
Thanks, mate, for your answer. Even though you are late, I did something similar anyway.
I had already figured it out using Docker,
using Restreamer: https://datarhei.github.io/restreamer/
It works on both CentOS and Ubuntu.
I also tried setting it up manually; it was very difficult to install and configure, and although I managed it, I would not recommend it.
I'm trying to stream from the Raspberry Pi camera over the network using raspivid and the GStreamer CLI. I want to be able to view the stream using VLC's "Open Network Stream" on the client.
This is related to the question GStreamer rtp stream to vlc, but mine is not quite the same. Instead of encoding the raw output from my Pi camera, my idea is to take the existing H.264 output of raspivid, mux it into an appropriate container, and send it over TCP or UDP.
I was able to successfully capture the h264 output from raspivid into an mp4 file (with correct fps and length information) using this pipeline:
raspivid -n -w 1280 -h 720 -fps 24 -b 4500000 -a 12 -t 30000 -o - | \
gst-launch-1.0 -v fdsrc ! video/x-h264, width=1280, height=720, framerate=24/1 ! \
h264parse ! mp4mux ! filesink location="videofile.mp4"
However, when I try to stream this over a network:
raspivid -n -w 1280 -h 720 -fps 24 -b 4500000 -a 12 -t 0 -o - | \
gst-launch-1.0 -v fdsrc ! video/x-h264, width=1280, height=720, framerate=24/1 ! \
h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=192.168.1.20 port=5000
...and try to open the stream using rtp://192.168.1.20:5000 in VLC, it reports an error.
Edit: OK, I was mistaken in assuming that udpsink listens for incoming connections. However, after changing the last part of the pipeline to use my client's IP address (! udpsink host=192.168.1.77 port=5000) and trying to open that with udp://@:5000 in VLC, the player does not display anything (both the Pi and the receiving computer are on the same LAN, and I can see the incoming network traffic on the client).
Does anyone know how to properly construct a GStreamer pipeline to transmit an existing H.264 stream over the network so that it can be played by vanilla VLC on the client?
I assume this is due to missing SPS/PPS data. It would probably work if you started VLC first and then the video pipeline on the Raspberry Pi, because by default the SPS/PPS headers are most likely sent only once, at the beginning of the stream.
If the receiver misses the SPS/PPS headers, it will not be able to decode the H.264 stream. I guess this can be fixed by using the config-interval=-1 property of h264parse.
With that option, SPS/PPS data should be sent before each IDR frame, which should occur every couple of seconds, depending on the encoder.
Another thing is that you don't need the rtpmp2tpay element. Just sending MPEG-TS over UDP directly should be enough.
Having said that, the pipeline should look like this:
raspivid -n -w 1280 -h 720 -fps 24 -b 4500000 -a 12 -t 0 -o - | \
gst-launch-1.0 -v fdsrc ! \
video/x-h264, width=1280, height=720, framerate=24/1 ! \
h264parse config-interval=-1 ! mpegtsmux ! udpsink host=192.168.1.77 port=5000
192.168.1.77 is the IP address of the client running VLC, which opens udp://@:5000. Also, make sure no firewalls are blocking incoming UDP traffic towards the client (the Windows firewall, in particular).
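As a sanity check on the client side (if you want to rule out VLC itself), a receiving pipeline along these lines should display the stream; this is my own sketch and assumes gst-plugins-bad (for tsdemux) and gst-libav (for avdec_h264) are installed:
gst-launch-1.0 udpsrc port=5000 caps="video/mpegts,systemstream=true,packetsize=188" ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink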
I want to live stream to YouTube from my Raspberry Pi 3. The script works properly when I run it manually from the shell. When I add the script to /etc/rc.local (via sudo nano /etc/rc.local) to run it automatically on startup, it runs only the first time the Raspberry Pi starts; after that it stops working and gives the error 'cannot open connection network is unreachable'.
Here is the script that I use for live streaming to YouTube from the Raspberry Pi:
raspivid -o - -t 0 -vf -hf -fps 30 -b 6000000 | avconv -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/[your-secret-key-here]
I want to run this script automatically each time the Raspberry Pi starts up, without any error.
For more information, check this link.
After carrying out many attempts, I found the solution.
It is very simple: just make a new Python file with any name (in my case livestream.py) and paste in the code.
import os

# Run the raspivid | avconv pipeline through a shell
os.system("raspivid -o - -t 0 -vf -hf -fps 25 -b 600000 | avconv -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/[your-secret-key-here]")
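To start it at boot, a line like the following could then go in /etc/rc.local before the final exit 0 (the path /home/pi/livestream.py is only my assumption about where you keep the file):
python /home/pi/livestream.py &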
It can also be run by installing PHP on the Raspberry Pi with
sudo apt-get install php5-fpm php5-mysql
and running the file livestream.php. The PHP code is:
<?php
exec("raspivid -o - -t 0 -vf -hf -fps 25 -b 600000 | avconv -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/[your-secret-key-here]");
?>
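Assuming the PHP command-line interpreter is also installed (php5-fpm alone provides the FastCGI service, not the php command; php5-cli would be needed for that), the script could then be started with:
php livestream.php &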
I'm very new to the Raspberry Pi and have no notable prior experience with Linux, so this is all new to me...
OctoPrint is a 3D printer spooler that you can run on your Raspberry Pi. One of its features is the ability to set up a USB camera to view either still images or a stream of your print.
I am using the prepackaged OctoPi image of OctoPrint.
OctoPrint's GitHub wiki contains the following info referring to my USB camera, but I have no idea how to implement it.
Hama PC-Webcam "AC-150" on Raspberry Pi
./mjpg_streamer -o output_http.so -w ./www -i input_uvc.so -y -r 640x480 -f 10
https://github.com/foosel/OctoPrint/wiki/Webcams-known-to-work
I'm guessing this is an easy command that I enter via the console, but I've winged a few commands with no luck. Can someone shed some light on how I use this? Like I said, I'm an absolute beginner with the Pi...
Any help is greatly appreciated!
Try this:
camera_usb_options="-r VGA -f 10 -y"
sudo service octoprint stop
fuser /dev/video0
/dev/video0: 1871m
$ ps axl | grep 1871    (replace 1871 with the PID fuser reported on your system)
$ kill -9 1871
./mjpg_streamer -i "input_uvc.so $camera_usb_options" -o "output_http.so -w ./www"
sudo service octoprint start
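To check that the camera stream is actually up (assuming the default output_http.so port of 8080 and the standard mjpg-streamer URL scheme, which are not spelled out in this answer), you can grab a single frame with:
curl -o test.jpg "http://localhost:8080/?action=snapshot"
The live stream itself would then be at http://<pi-address>:8080/?action=stream, which is the URL OctoPrint's webcam settings typically point at.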
I bought a USB audio controller for the Raspberry Pi in order to capture audio input. I have already done the steps below on Raspbian, but I am still unsure about the actual audio capture.
Can you please guide me on how to do it?
Type the following command to install the audio tools:
pi@raspberrypi ~ $ sudo apt-get install alsa-utils
The device is detected successfully, as verified by:
pi@raspberrypi ~ $ lsusb
pi@raspberrypi ~ $ amixer
pi@raspberrypi ~ $ alsamixer
Also configure the USB audio device to make it the default by editing
/etc/modprobe.d/alsa-base.conf
and adding a pound/hash symbol, changing
options snd-usb-audio index=2
to
#options snd-usb-audio index=2
Simply run this command in the terminal in order to record audio:
arecord -f cd -D plughw:0 -d 10 a.wav
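For reference: -f cd records 16-bit 44.1 kHz stereo, -D plughw:0 selects sound card 0 (the card number for your USB device may differ; arecord -l lists the capture devices), and -d 10 stops after 10 seconds. The result can be played back with:
aplay a.wav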
Let me know if you run into any further issues recording sound.