How does Rust work with process arguments?

I'm really confused about Rust processes. I'm trying to call something like this:
ffmpeg -i path/to/test-video.webm -ab 160k -ac 2 -vn -f mp3 -
This should extract the audio from the video and send it to stdout. So I've done this:
let sound: std::process::Output = Command::new("ffmpeg")
    .arg(format!("-i {}", args.input.to_str().unwrap()))
    .arg("-ab 160k")
    .arg("-ac 2")
    .arg("-vn")
    .arg("-f mp3")
    .arg("-")
    .stdout(Stdio::piped())
    .stdin(Stdio::inherit())
    .stderr(Stdio::inherit())
    .output()
    .unwrap();
But for some reason, this doesn't work. It prints this to stderr:
Unrecognized option 'i path/to/test-video.webm'.
Error splitting the argument list: Option not found
When I remove the dashes from the args (so it looks like .arg(format!("i {}", ...)).arg("ab 160k")...), I get this:
Output file #0 does not contain any stream
I think I misunderstood how this works, but I tested it on other applications and it seemed to work the way I'm doing it now. What did I miss? How does Rust handle these arguments?
And just to be clear, I know about the ffmpeg crates, but they don't work for me for some reason; I can't even compile them.

Each .arg() call passes exactly one argument to the child process; nothing is split on whitespace the way a shell would split it. So .arg(format!("-i {}", ...)) hands ffmpeg the single argument "-i path/to/test-video.webm", which ffmpeg then tries to parse as an option named "i path/to/test-video.webm". Pass each flag and its value as separate arguments:
let sound: std::process::Output = Command::new("ffmpeg")
    .arg("-i")
    .arg(args.input.to_str().unwrap())
    .arg("-ab")
    .arg("160k")
    .arg("-ac")
    .arg("2")
    .arg("-vn")
    .arg("-f")
    .arg("mp3")
    .arg("-")
    .stdout(Stdio::piped())
    .stdin(Stdio::inherit())
    .stderr(Stdio::inherit())
    .output()
    .unwrap();
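If you prefer, the whole flag list can also go through a single .args() call. A minimal sketch, assuming the same args.input as in the question:

let sound: std::process::Output = Command::new("ffmpeg")
    // every element of the slice becomes one argument, exactly as with .arg()
    .args(["-i", args.input.to_str().unwrap(),
           "-ab", "160k", "-ac", "2", "-vn", "-f", "mp3", "-"])
    .stdout(Stdio::piped())
    .stdin(Stdio::inherit())
    .stderr(Stdio::inherit())
    .output()
    .unwrap();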

Related

ffmpeg: What is the best practice to keep a live connection/socket with a camera, and save time on ffprobe

I am using the following command, with subprocess.PIPE and subprocess.Popen, in Python 3:
ffmpeg -i udp://{address_of_camera} \
  -vf "select='if(eq(pict_type,I),st(1,t),gt(t,ld(1)))',setpts=N/FRAME_RATE/TB" \
  -f rawvideo -an -vframes {NUM_WANTED_FRAMES} pipe:
This command helps me to capture NUM_WANTED_FRAMES frames from a live camera at a given moment.
However, it takes about 4 seconds to read the frames, and about 2.5 seconds to open a socket between my computer and the camera's computer.
Is there a way, to have a socket/connection always open between my computer and the camera's computer, to save the 2.5 seconds?
I read something about fifo_size and overrun_fatal. I thought that maybe I can set fifo_size to be equal to NUM_WANTED_FRAMES, and overrun_fatal to True? Will this solve my problem? Or is there a different and simpler/better solution?
Should I just record continuously (no -vframes flag), store the frames in a queue (with a max size), and read from that queue buffer whenever I want to slice the video? Will that work well with keyframes?
Also, what should I do when ffmpeg fails? Restart the ffmpeg command?
FFmpeg itself is a one-and-done type of app, so to keep the camera running the best option is to "record always (no -vframes flag)" and decide whether to drop or record frames in Python.
So, a rough sketch of the idea:
import subprocess as sp
from threading import Thread, Event
from queue import Queue

NUM_WANTED_FRAMES = 4               # whatever it is
width = 1920
height = 1080
ncomp = 3                           # rgb
framesize = width * height * ncomp  # bytes per frame
nbytes = framesize * NUM_WANTED_FRAMES

proc = sp.Popen(<ffmpeg command>, stdout=sp.PIPE)
stdout = proc.stdout

buffer = Queue(NUM_WANTED_FRAMES)
req_frame = Event()                 # set to record, clear (default) to drop

def reader():
    while True:
        if req_frame.is_set():
            # frames requested: read a burst of frames and queue it
            buffer.put(stdout.read(nbytes))
            req_frame.clear()
        else:
            # frames not requested: read one frame and drop it
            stdout.read(framesize)

rd_thread = Thread(target=reader)
rd_thread.start()

...
# elsewhere in your program, do this when you need the camera data
req_frame.set()
framedata = buffer.get()
...
Will it work well with the keyframe?
Yes, if your FFmpeg command has -discard nokey it'll read just keyframes.
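For example, -discard is an input option, so it would go before -i; adapting the command from the question (the camera address placeholder is unchanged), it might look like this:

ffmpeg -discard nokey -i udp://{address_of_camera} -f rawvideo -an pipe: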
What to do when ffmpeg fails? restart the ffmpeg command?
Have another thread monitor the health of proc (the Popen object); if it has died, restart the subprocess with the same command and replace stdout with the new pipe. You probably want to protect your code with try/except blocks as well, and adding timeouts to the queue operations would be a good idea too.
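A minimal sketch of such a watchdog, reusing the names from the snippet above (the <ffmpeg command> placeholder and the restart/polling policy are up to you):

import time

def monitor():
    global proc, stdout
    while True:
        if proc.poll() is not None:                        # ffmpeg has exited
            proc = sp.Popen(<ffmpeg command>, stdout=sp.PIPE)
            stdout = proc.stdout                           # reader() picks up the new pipe
        time.sleep(1.0)

Thread(target=monitor, daemon=True).start()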

How to create MPEG2 Transport Stream Pipeline Using Python and Gstreamer

In developing a streaming audio application I used the gst-launch-1.0 command-line tool to generate an MPEG Transport stream for testing. This worked as intended (I was able to serve the stream from a simple http server and hear it using VLC media player). I then tried to replicate the encoding part of that stream in Python gstreamer code. The Python version connected to the server ok, but no audio could be heard. I'm trying to understand why the command-line implementation worked, but the Python one did not. I am working on Mac OS 10.11 and Python 2.7.
The command line that worked was as follows:
gst-launch-1.0 audiotestsrc freq=1000 ! avenc_aac ! aacparse ! mpegtsmux ! tcpclientsink host=127.0.0.1 port=9999
The Python code that created the gstreamer pipeline is below. It instantiated without producing any errors and it connected successfully to the http server, but no sound could be heard through VLC. I verified that the AppSrc in the Python code was working, by using it with a separate gstreamer pipeline that played the audio directly. This worked fine.
def create_mpeg2_pipeline():
    play = Gst.Pipeline()
    src = GstApp.AppSrc(format=Gst.Format.TIME, emit_signals=True)
    src.connect('need-data', need_data, samples())  # need_data and samples defined elsewhere
    play.add(src)

    capsFilterOne = Gst.ElementFactory.make('capsfilter', 'capsFilterOne')
    capsFilterOne.props.caps = Gst.Caps('audio/x-raw, format=(string)S16LE, rate=(int)44100, channels=(int)2')
    play.add(capsFilterOne)
    src.link(capsFilterOne)

    audioConvert = Gst.ElementFactory.make('audioconvert', 'audioConvert')
    play.add(audioConvert)
    capsFilterOne.link(audioConvert)

    capsFilterTwo = Gst.ElementFactory.make('capsfilter', 'capsFilterTwo')
    capsFilterTwo.props.caps = Gst.Caps('audio/x-raw, format=(string)F32LE, rate=(int)44100, channels=(int)2')
    play.add(capsFilterTwo)
    audioConvert.link(capsFilterTwo)

    aacEncoder = Gst.ElementFactory.make('avenc_aac', 'aacEncoder')
    play.add(aacEncoder)
    capsFilterTwo.link(aacEncoder)

    aacParser = Gst.ElementFactory.make('aacparse', 'aacParser')
    play.add(aacParser)
    aacEncoder.link(aacParser)

    mpegTransportStreamMuxer = Gst.ElementFactory.make('mpegtsmux', 'mpegTransportStreamMuxer')
    play.add(mpegTransportStreamMuxer)
    aacParser.link(mpegTransportStreamMuxer)

    tcpClientSink = Gst.ElementFactory.make('tcpclientsink', 'tcpClientSink')
    tcpClientSink.set_property('host', '127.0.0.1')
    tcpClientSink.set_property('port', 9999)
    play.add(tcpClientSink)
    mpegTransportStreamMuxer.link(tcpClientSink)
My question is, how does the gstreamer pipeline that I've implemented in Python differ from the command-line pipeline? And more generally, how do you DEBUG this sort of thing? Does gstreamer have any 'verbose' mode?
Thanks.
One question at a time:
1) How does it differ from gst-launch-1.0?
It is hard to tell without seeing your full code but I'll try to guess:
gst-launch-1.0 does proper pad linking. When you have a muxer like this you can't link to it directly, because it is created without any sink pads; you need to request one before you can link. Take a look at dynamic pads: https://gstreamer.freedesktop.org/documentation/application-development/basics/pads.html
Also, gst-launch-1.0 has error handling, so it checks that every action succeeded and otherwise reports an error. I'd recommend you add a GstBus message handler so you get notified of error messages at least. You should also check the return values of the GStreamer functions you call; that would let you catch this linking error in your program.
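A rough sketch of both points, using the element names from the question (get_request_pad with the 'sink_%d' template name is the usual mpegtsmux convention, and the bus watch assumes a GLib main loop is running):

# request a sink pad from the muxer and link the parser to it explicitly
mux_sink_pad = mpegTransportStreamMuxer.get_request_pad('sink_%d')
parser_src_pad = aacParser.get_static_pad('src')
if parser_src_pad.link(mux_sink_pad) != Gst.PadLinkReturn.OK:
    print('Failed to link aacparse to mpegtsmux')

# get notified of error messages posted on the pipeline's bus
bus = play.get_bus()
bus.add_signal_watch()
def on_error(bus, message):
    err, debug = message.parse_error()
    print('GStreamer error: %s (%s)' % (err, debug))
bus.connect('message::error', on_error)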
2) Gstreamer debugging?
Mostly done by setting the GST_DEBUG variable: https://gstreamer.freedesktop.org/documentation/tutorials/basic/debugging-tools.html#the-debug-log
Run your application with: GST_DEBUG=6 ./yourapplication and you should see lots of logging.

Correct gstreamer pipeline for particular rtsp stream

I'm trying to convert this RTSP URL to something else (anything!) using this gst pipeline:
gst-launch rtspsrc location=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov ! rtpmp4vdepay ! filesink location=somebytes.bin
This gives the following error:
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2791): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-linked (-1)
So I guess it's something about connecting the rtsp source to the depayloader. If I change the pipeline to use rtpmp4gdepay rather than rtpmp4vdepay, it works and produces something, but I'm not sure what the output format is.
Does anyone know what pipeline I should be using to get at the video from this URL? I'm assuming it's mp4/h264/aac, but maybe it's not.
Try this first:
gst-launch-0.10 -v playbin2 uri=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov
or
gst-launch-1.0 -v playbin uri=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov
A .mov file is not directly streamable, so your RTSP source is probably sending you two elementary streams (I am guessing this is Darwin Streaming Server or something similar). You may have to set up two outputs from rtspsrc, one for audio and one for video.
rtpmp4vdepay is for elementary MPEG-4 video streams. Does your source file actually use the MPEG-4 video codec? If it is H.264, replace it with rtph264depay. You can pass the output to a decoder and play it if you want, e.g. feed it to decodebin. To dump it as raw H.264 you will first have to parse it and add NAL headers (h264parse, perhaps?).
rtpmp4gdepay most probably matches the audio stream.
I am guessing your file is H.264/AAC, which is why rtpmp4vdepay won't work and rtpmp4gdepay will. But you are not doing anything with the video when you set up rtpmp4gdepay, so you need to handle that too.
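If the stream really is H.264 video plus AAC audio, a pipeline along these lines should dump both streams (GStreamer 1.0 element names; the output filenames are just examples):

gst-launch-1.0 rtspsrc location=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov name=src \
  src. ! application/x-rtp,media=video ! rtph264depay ! h264parse ! filesink location=video.h264 \
  src. ! application/x-rtp,media=audio ! rtpmp4gdepay ! aacparse ! filesink location=audio.aac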

Make MPlayer show all playback state change message in output

I'm currently using MPlayer in slave mode for a video player I'm making.
Currently, the media player shows ==== PAUSED ==== when it's paused, and I can read the output for this status to know when the video is paused.
The command-line arg I'm using at the moment is msglevel identify=6:statusline=-1 (I disabled the status line because it produced A: 0.7 V: 0.6 A-V: 0.068 ... and other unnecessary output).
What do I need to set the msglevel (or anything else) to so that it will also show ==== PLAYING ==== or any other indication that it started playing, stopped, the media ended, is loading, etc.?
I found out how to check whether the video is paused.
By sending command pausing_keep_force get_property pause to mplayer, it responds with ANS_pause=no if not paused, and ANS_pause=yes if paused. Problem solved.
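For reference, a minimal sketch of how that command can be sent from a shell when MPlayer runs in slave mode with a FIFO (the file names here are made up):

mkfifo /tmp/mplayer-ctl
mplayer -slave -quiet -input file=/tmp/mplayer-ctl video.mp4 > /tmp/mplayer-out &
echo "pausing_keep_force get_property pause" > /tmp/mplayer-ctl
grep ANS_pause /tmp/mplayer-out | tail -n 1   # prints ANS_pause=yes or ANS_pause=no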
Based on what I can decipher from the OP's answer to his/her own question, he/she was looking for a way to determine whether mplayer was paused or playing. I've written a little bash script that can handle this task with some simple function calls.
You can actually inspect the last couple of lines of mplayer's output to see if mplayer is paused. I put together a little bash library that can be used to query some status information about mplayer. Take a look at my GitHub; there are instructions for integrating my script with other bash scripts.
If you implement my script, you will need to play your media file using the playMediaFile function. Then you can simply call the isPaused function as a condition in bash like this:
if isPaused; then
# do something
fi
# or
if ! isPaused; then
# do something
fi
# or
isPaused && # do something

Redirecting printf?

How do you redirect the output of printf to, for example, a stream or something? I have a GUI app that links with a console library. The library makes repeated calls to printf, and I need a way to intercept those and have them processed by a function. Also, creating a console is not an option. I'm using Windows, btw.
Edit - Also I was hoping not to redirect to a file.
freopen(filename, mode, stdout);
If you want to avoid using a file, you can use a named pipe, redirect stdout to it, and read from it in a different thread or process.
Some pseudocode with omitted error checking:
HANDLE hPipe = CreateNamedPipe("\\\\.\\pipe\\SomePipeName", ...);
int pipeFileDescriptor = _open_osfhandle((intptr_t)hPipe, someFlags);
_dup2(pipeFileDescriptor, _fileno(stdout));
Now what printf writes to stdout should go to the pipe.
In another thread or process you can read from the pipe into a buffer:
HANDLE hPipeClient = CreateFile("\\\\.\\pipe\\SomePipeName", ...);
ReadFile(hPipeClient, ...);
I think it will work but I haven't tested it yet.