I want to analyse an MP4 video with:
gst-launch-0.10 filesrc location=file:~/examples/fullstream.mp4 ! tsparse ! tsdemux ! queue ! ffdec_h264 max-threads=0 ! ffmpegcolorspace ! autovideosink name=video
or
gst-launch-0.10 filesrc location=http://192.168.40.228:8080/fullstream.mp4 ! mpegtsdemux ! queue ! ffdec_h264 max-threads=1 ! ffmpegcolorspace ! autovideosink name=video
But the terminal shows:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Resource not found.
Additional debug info:
gstfilesrc.c(1042): gst_file_src_start (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
No such file "file:/home/zhaozhenjie/examples/fullstream.mp4"
Setting pipeline to NULL ...
Freeing pipeline ...
The file ~/examples/fullstream.mp4 does exist, so what is wrong with the command? I am using Ubuntu 14.04.
You cannot use filesrc for a network stream. For a file served over HTTP you have to use souphttpsrc, which was also available in GStreamer 0.10:
gst-launch-0.10 souphttpsrc location=http://192.168.40.228:8080/fullstream.mp4 ! qtdemux name=demuxer demuxer. ! queue ! faad ! audioconvert ! audioresample ! autoaudiosink demuxer. ! queue ! ffdec_h264 ! ffmpegcolorspace ! autovideosink
UPDATE
This is for using filesrc: the location is a normal absolute path, or a path relative to the directory where you execute the command (if needed, wrap it in double quotes):
gst-launch-0.10 filesrc location=/home/user/examples/fullstream.mp4 ! qtdemux name=demuxer demuxer. ! queue ! faad ! audioconvert ! audioresample ! autoaudiosink demuxer. ! queue ! ffdec_h264 ! ffmpegcolorspace ! autovideosink
This may also work, since bash (though not every shell) expands ~ after an unquoted =:
gst-launch-0.10 filesrc location=~/examples/fullstream.mp4 ...
I am trying to send a video source to three outputs: multicast, filesystem, and (resized video) display with gst-launch-1.0.
This is the command,
gst-launch-1.0 videotestsrc ! x264enc ! tee name=t \
t. ! queue ! rtph264pay ! udpsink host=224.1.1.1 port=20000 auto-multicast=true \
t. ! queue ! h264parse ! splitmuxsink location=./vid%02d.mkv max-size-time=10000000000 \
t. ! queue ! videoconvert ! videoscale ! video/x-raw,width=100 ! autovideosink
and this is the error,
WARNING: erroneous pipeline: could not link queue2 to videoconvert0
Your problem is that you are sending an H.264 stream to videoconvert, which expects raw video. So you would just add decoding on that branch:
gst-launch-1.0 -e videotestsrc ! video/x-raw,width=640,height=480,framerate=30/1 ! queue ! x264enc ! tee name=t \
t. ! queue ! rtph264pay ! udpsink host=224.1.1.1 port=20000 auto-multicast=true \
t. ! queue ! h264parse ! splitmuxsink location=./vid%02d.mkv max-size-time=10000000000 \
t. ! queue ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw,width=100 ! autovideosink
I am developing a video chat application and I need realtime streaming with audio and video in sync. This is what I did:
video encoding with x264 encoder and decoding
audio encoding with lamemp3 encoder and decoding
mpegts muxing and demuxing
my sender command:
gst-launch-1.0 -e mpegtsmux name="muxer" ! udpsink host=172.20.4.19 port=5000 v4l2src ! video/x-raw, width=640,height=480 ! x264enc tune=zerolatency byte-stream=true ! muxer. pulsesrc ! audioconvert ! lamemp3enc target=1 bitrate=64 cbr=true ! muxer. rtph264pay
my receiver command:
gst-launch-1.0 udpsrc port=5000 ! decodebin name=dec ! queue ! autovideosink dec. ! queue ! audioconvert ! audioresample ! autoaudiosink
However, I am getting a delay of more than 1 second. What is causing this delay (assuming that I did something inherently wrong)? And how would I minimize it?
I am looking for an explanation of how to use named elements when muxing two inputs into one element, for instance muxing audio and video into one mpegtsmux element.
gst-launch filesrc location=surround.mp4 ! decodebin name=dmux ! queue ! audioconvert ! lamemp3enc dmux. ! queue ! x264enc ! mpegtsmux name=mux ! queue ! filesink location=out.ts
The above pipeline gives the plugin interconnection shown below.
It shows that the audio does not connect to mpegtsmux.
How do I modify the command line to have audio and video muxed in mpegtsmux?
Thanks!
I'll try to give the basic idea though I'm not that proficient and could be plain wrong.
A pipeline can consist of several sub-pipelines. If a pipeline description ends not with a pipe (!) but with the start of another element, a new sub-pipeline begins: filesrc location=a.mp4 ! qtdemux name=demp4 demp4. ! something
A named bin (usually a muxer), or its pads like somedemux.audio_00 can be a source and/or a sink in other sub-pipelines: demp4. ! queue ! decodebin ! x264enc ! mux.
Usually a sub-pipeline ends with a named bin/muxer, either declared: mpegtsmux name=mux or referenced by name: mux. The dot at the end is a syntax of a reference.
Then the named muxer can be piped to a sink in a yet another sub-pipeline: mux. ! filesink location=out.ts
If you're using only one audio or video stream from a source, you don't have to specify a pad like muxname.audio_00; muxname. is a shortcut for "a suitable audio/video pad of muxname".
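The syntax described above can be sketched with test sources instead of a file. This is a hypothetical minimal example, assuming a GStreamer 1.x install with the stock base/good/ugly and x264 plugin sets; the file name demo.ts is arbitrary:

```shell
# Two unnamed sub-pipelines each end with "mux.", a reference to the
# named muxer; the muxer is declared once (mpegtsmux name=mux) and
# forms a third sub-pipeline ending in the sink.
gst-launch-1.0 -e \
  videotestsrc num-buffers=100 ! x264enc ! h264parse ! mux. \
  audiotestsrc num-buffers=100 ! audioconvert ! lamemp3enc ! mux. \
  mpegtsmux name=mux ! filesink location=demo.ts
```

With num-buffers set, both sources send EOS after a few seconds, so the command finishes on its own and leaves a playable demo.ts.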
The example
That said, I assume that your mp4 file has both audio and video. In this case, you need to demux it into 2 streams first, decode, re-encode and then mux them back.
Indeed, your audio is not connected to mpegtsmux.
If you really need to decode the streams, this is what I would try. It didn't work for me, though:
gst-launch-1.0 filesrc location=surround.mp4 ! \
qtdemux name=demp4 \
demp4. ! queue ! decodebin ! audioconvert ! lamemp3enc ! mpegtsmux name=mux \
demp4. ! queue ! decodebin ! x264enc ! mux. \
mux. ! filesink location=out.ts
or let's use decodebin to magically decode both streams:
gst-launch-1.0 filesrc location=surround.mp4 ! \
decodebin name=demp4 \
demp4. ! queue ! audioconvert ! lamemp3enc ! mpegtsmux name=mux \
demp4. ! queue ! x264enc ! mux. \
mux. ! filesink location=out.ts
It is not linked because your launch line doesn't do it. Notice how the lamemp3enc element is not linked downstream.
Update your launch line to:
gst-launch filesrc location=surround.mp4 ! decodebin name=dmux ! queue ! audioconvert ! lamemp3enc ! mux. dmux. ! queue ! x264enc ! mpegtsmux name=mux ! queue ! filesink location=out.ts
The only change is " ! mux." after the lamemp3enc to tell it to link to the mpegtsmux.
While you are updating things, please note that you are using GStreamer 0.10, which is years obsolete and unmaintained; please upgrade to the 1.x series to get the latest improvements and bugfixes.
I'm currently trying to stream two side-by-side webcams from my Raspberry Pi.
I found a pipeline for GStreamer:
gst-launch v4l2src device=/dev/video1 ! videoscale ! ffmpegcolorspace ! \
video/x-raw-yuv, width=640, height=480 ! videobox border-alpha=0 left=-640 ! \
videomixer name=mix ! ffmpegcolorspace ! jpegenc ! tcpserversink \
host=192.168.1.108 port=8080 sync=false v4l2src ! videoscale ! \
ffmpegcolorspace ! video/x-raw-yuv, width=640, height=480 ! \
videobox right=-640 ! mix.
Both webcams indicate by their lights that they are active, but I can only see the right side.
Could someone please help me with this?
Regards,
Carsten
I ran the line fine on my Linux box, but just as a wild guess, try adding a queue element before every videomixer input pad.
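As a sketch of that suggestion, this is the same pipeline with a queue in front of each mixer input; note it also assumes the second camera is /dev/video0 (the original command left the second v4l2src without a device):

```shell
# queue before each videomixer pad decouples the two capture branches;
# /dev/video0 for the second camera is an assumption, adjust as needed.
gst-launch v4l2src device=/dev/video1 ! videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480 ! videobox border-alpha=0 left=-640 ! \
  queue ! mix. \
  v4l2src device=/dev/video0 ! videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480 ! videobox right=-640 ! \
  queue ! mix. \
  videomixer name=mix ! ffmpegcolorspace ! jpegenc ! \
  tcpserversink host=192.168.1.108 port=8080 sync=false
```

The queues give each camera branch its own thread, so one slow capture cannot stall the other before the mixer.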
I see device=/dev/video1 on the first v4l2src, but the second v4l2src has no device set; you might want to specify /dev/video0 (or /dev/video2) there.
Also I was having trouble with a pipeline similar to yours, this one worked for me:
gst-launch-0.10 v4l2src device=/dev/video1 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox border-alpha=0 ! videomixer name=mixme ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=sbs-3d-video.mov v4l2src device=/dev/video0 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox left=-320 ! mixme.
Sorry, here is the same pipeline adapted for your version of gst-launch:
gst-launch v4l2src device=/dev/video1 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox border-alpha=0 ! videomixer name=mixme ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=sbs-3d-video.mov v4l2src device=/dev/video0 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox left=-320 ! mixme.
This works:
gst-launch-0.10 \
videotestsrc ! ffmpegcolorspace ! 'video/x-raw-yuv' ! mux. \
audiotestsrc ! audioconvert ! 'audio/x-raw-int,rate=44100,channels=1' ! mux. \
avimux name=mux ! filesink location=gst.avi
I can let it run for a while, kill it, and then totem gst.avi displays a nice test card with tone.
However, trying to do something more useful like
gst-launch-0.10 \
filesrc location=MVI_2034.AVI ! decodebin name=dec \
dec. ! ffmpegcolorspace ! 'video/x-raw-yuv' ! mux. \
dec. ! audioconvert ! 'audio/x-raw-int,rate=44100,channels=1' ! mux. \
avimux name=mux ! filesink location=gst.avi
it just displays
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
and then stalls indefinitely.
What's the trick to get the version with decodebin rolling?
Aha... this does what I want:
gst-launch-0.10 \
filesrc location=MVI_2034.AVI ! decodebin name=dec \
dec. ! queue ! ffmpegcolorspace ! 'video/x-raw-yuv' ! queue ! mux. \
dec. ! queue ! audioconvert ! 'audio/x-raw-int,channels=1' ! audioresample ! 'audio/x-raw-int,rate=44100' ! queue ! mux. \
avimux name=mux ! filesink location=gst.avi
The queue elements (both leading and trailing) do appear to be crucial.
Further experiments adding things like videoflip or
videorate ! 'video/x-raw-yuv,framerate=25/1'
into the video part of the pipeline all work as expected.
Your pipeline seems to be correct. However, gst-launch is a limited tool; I would suggest coding the pipeline in Python or Ruby for better debugging.
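For example, a rough Python 2 sketch of the same pipeline, assuming the GStreamer 0.10 Python bindings (package python-gst0.10) matching the era of this thread. The debugging win over gst-launch is the bus watch, which reports exactly which element raised an error:

```python
import gobject
import pygst
pygst.require("0.10")
import gst

gobject.threads_init()

# Same pipeline description the gst-launch line used.
pipeline = gst.parse_launch(
    "filesrc location=MVI_2034.AVI ! decodebin name=dec "
    "dec. ! queue ! ffmpegcolorspace ! video/x-raw-yuv ! queue ! mux. "
    "dec. ! queue ! audioconvert ! audio/x-raw-int,channels=1 ! "
    "audioresample ! audio/x-raw-int,rate=44100 ! queue ! mux. "
    "avimux name=mux ! filesink location=gst.avi")

loop = gobject.MainLoop()

def on_message(bus, message):
    # Errors name the failing element, which gst-launch does not show well.
    if message.type == gst.MESSAGE_ERROR:
        err, debug = message.parse_error()
        print "ERROR from %s: %s\n%s" % (message.src.get_name(), err, debug)
        loop.quit()
    elif message.type == gst.MESSAGE_EOS:
        loop.quit()

bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", on_message)

pipeline.set_state(gst.STATE_PLAYING)
try:
    loop.run()
finally:
    pipeline.set_state(gst.STATE_NULL)
```

From code you can also dump the pipeline graph or poll individual element states while it is prerolling, which makes stalls like the one above much easier to pin down.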