Why does FFMPEG always make large WebM files?

I'm trying to encode my movies into WebM:
ffmpeg -i input.MOV -codec:v libvpx -quality good -cpu-used 0 -b:v 10k \
  -qmin 10 -qmax 42 -maxrate 10k -bufsize 20k -threads 8 -vf scale=-1:1080 \
  -codec:a libvorbis -b:a 192k \
  output.webm
I want to encode at a couple of different bit rates (video and audio combined):
2192 kbps
1692 kbps
1000 kbps
The problem is that no matter which bit rates I enter, I always get a file with a bit rate higher than 1900 kbps. (1914 kbps with the code example above.)
What am I doing wrong?

libvpx is a little complicated with regard to rate control and quality settings. Please refer to the vpx Encoding Guide and the VP8 Encode Parameter Guide for more info. It took me an hour of digging through the source code to understand it.
If you want a constant bitrate, you have to set b:v, maxrate and minrate to the same value, for example like so (note that I left out the audio options here for brevity):
ffmpeg -i input.mov -c:v libvpx -b:v 1900K -maxrate 1900K -minrate 1900K output.webm
If instead you want to use variable quality and just specify an upper bound for the bitrate, then you need to set both b:v and crf. If you leave out crf, the specified bitrate will just be taken as an average. Only with crf does the encoder change the meaning of b:v to the maximum allowed rate.
ffmpeg -i input.mov -c:v libvpx -b:v 1900K -crf 10 output.webm
A value of 10 for the CRF is a good starting point, but libvpx may change the quality per frame within the bounds of qmin ≤ q ≤ qmax, which you can also specify if you want. Setting a lower bound of 10 for qmin seems a little high to me, but in essence you'll have to do some trial and error anyway, since if the maximum bitrate is too low, you'll constantly saturate it.
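For example, a constrained-quality call that also pins the quantizer range could look like the sketch below; the -qmin/-qmax values here are only illustrative starting points, not recommendations from the guides:
ffmpeg -i input.mov -c:v libvpx -crf 10 -b:v 1900K -qmin 4 -qmax 50 -c:a libvorbis -b:a 192k output.webm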


How to get FFMPEG to use more GPU when encoding

So the situation is as follows:
I'm receiving 20-30 uncompressed images per second. The format is either PNG or bitmap. Each individual image is between 40 and 50 MB (all have the same size, since they are uncompressed).
I want to encode them into a lossless H.265 video and stream it to an HTTP server using FFMPEG.
The output video is 1920x1080, so there is some downsampling.
Compression is allowed, but nothing may be lost other than the downsampling.
Right now I'm still in the testing phase. I have 500 sample images, and I'm trying to encode them as efficiently as possible.
I'm using commands such as:
ffmpeg -hwaccel cuvid -f image2 -i "0(%01d).png" -framerate 30 -pix_fmt p010le -c:v hevc_nvenc -preset lossless -rc vbr_hq -b:v 6M -maxrate:v 10M -vf scale=1920:1080 -c:a aac -b:a 240k result.mp4
I have a powerful modern Quadro GPU, a 6-core Intel CPU and an NVMe drive.
GPU usage while encoding is exactly 10%; CPU usage is around 30-40%.
How can I get GPU usage to 80%? The machine on which I'm going to run the code will have at least a Quadro 4000 (maybe stronger), and I want to use it to the fullest.
That's not how it works. GPUs do not use their standard vector processing units for video encoding. (Well, they do a little, for things like color conversion and scaling, but not for everything.) The GPU has dedicated circuitry for video encoding primitives. When that circuitry is saturated, it doesn't matter how many GPU cores you have; they will sit idle.
So to use "more" GPU, you don't buy a beefier GPU, you buy a card that has more NVENC cores.
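One way to see this in practice (assuming the NVIDIA driver tools are installed) is to watch the dedicated encoder's utilization separately from the general GPU utilization, for example with nvidia-smi; exact column names can vary by driver version:
nvidia-smi dmon -s u
# the "enc" column shows NVENC utilization, "sm" shows the shader cores
If "enc" is already near 100% while "sm" stays low, the encoder block is the bottleneck, not the GPU cores.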
If your ffmpeg was compiled with --enable-libnpp, then consider using the GPU-based scale_npp filter instead of scale, which runs on the CPU only. Example adapted from the FFmpeg Wiki: Hardware Acceleration page (note that the encoder is named hevc_nvenc, and -hwaccel_output_format cuda keeps the decoded frames in GPU memory so scale_npp can read them):
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i input -vf scale_npp=1920:1080 -c:v hevc_nvenc output.mp4
You may see an improvement in performance or GPU utilization.
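In your specific case, note that PNG input cannot be decoded by the GPU, so -hwaccel cuvid has no effect; the frames have to be uploaded to the GPU before a GPU scaler can touch them. A rough sketch, assuming an ffmpeg build with CUDA, libnpp and NVENC support (filter and preset names vary between versions):
ffmpeg -framerate 30 -f image2 -i "0(%01d).png" \
  -vf "format=yuv420p,hwupload_cuda,scale_npp=1920:1080" \
  -c:v hevc_nvenc -preset lossless \
  result.mp4
Keep in mind that the conversion to yuv420p happens on the CPU and subsamples the chroma, so you will have to decide whether that still counts as "lossless" for your purposes.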

Linphone opus codec sampling rate

I would like to use the Opus codec in linphone, but I have a few problems using it. If someone with Opus codec knowledge could help me out, I would appreciate it.
How can I force the audio sampling rate to 8000 Hz? Currently, it uses 48000 Hz only.
Thanks in advance.
If you look at RFC 7587, Section 4.1, you can read this:
Opus supports 5 different audio bandwidths, which can be adjusted
during a stream. The RTP timestamp is incremented with a 48000 Hz
clock rate for all modes of Opus and all sampling rates. The unit
for the timestamp is samples per single (mono) channel. The RTP
timestamp corresponds to the sample time of the first encoded sample
in the encoded frame. For data encoded with sampling rates other
than 48000 Hz, the sampling rate has to be adjusted to 48000 Hz.
Reading further in RFC 7587, you will find that, in SDP, you will always see the codec declared as "OPUS/48000/2", no matter what the real sampling rate is. As explained above, the RTP timestamp will always be incremented with a 48000 Hz clock rate regardless of the real sampling rate.
If you wish to control the real sampling rate of the codec (and thus the audio bandwidth), maxplaybackrate and maxaveragebitrate are the SDP parameters to use.
Section 3.1.1 lists the relation between the bitrate and the audio bandwidth:
3.1.1. Recommended Bitrate
For a frame size of 20 ms, these are the bitrate "sweet spots" for Opus in various configurations:
o 8-12 kbit/s for NB speech,
o 16-20 kbit/s for WB speech,
o 28-40 kbit/s for FB speech,
o 48-64 kbit/s for FB mono music, and
o 64-128 kbit/s for FB stereo music.
Conclusion: to use only 8000 Hz with Opus, you must negotiate with parameters like the following, where 12 kbit/s is the maximum recommended bitrate for Opus NB speech:
m=audio 54312 RTP/AVP 101
a=rtpmap:101 opus/48000/2
a=fmtp:101 maxplaybackrate=8000; sprop-maxcapturerate=8000; maxaveragebitrate=12000
I don't know whether linphone honors all of these parameters, but this is the theory!

ffplay keep video/audio sync when using select filter

I'm trying to play/skip some clips of a video using ffplay. My first approach to skip say frames 100 to 400 was:
ffplay -vf "select='lte(n\,100)+gte(n\,400)'" -i INPUT
This skips the desired frames; however, it also freezes the video during the skipped frames. I tried to fix this by modifying the video presentation timestamp (PTS) with the setpts filter:
ffplay -vf "select='lte(n\,100)+gte(n\,400)',setpts='PREV_OUTPTS'" -i INPUT
This seems to work (it still freezes a bit, I guess because of buffering), but now the audio is out of sync. I've tried applying a select filter and modifying the PTS on the audio as well:
ffplay -vf "select='lte(n\,100)+gte(n\,400)',setpts='PREV_OUTPTS'" -af "aselect='lte(n\,100)+gte(n\,400)',asetpts='PREV_OUTPTS'" -i INPUT
This skips some audio frames, but the audio is still out of sync. I've tried the aresample=async=10000 option with similar results. Moving some or all of the filters to the output side (placing them after -i INPUT) doesn't work either.
Does anyone know how to skip parts of a video using ffplay? Many thanks.
Audio frame numbers != video frame numbers. AAC audio generated by FFmpeg's encoder is 1024 samples per frame, so a 48kHz stream has 48000/1024 = 46.875 audio frames per second. Other codecs may have different rates.
Use t instead of n, and generate a continuous series of timestamps.
ffplay -vf "select='lte(t\,4)+gte(t\,16)',setpts=N/FRAME_RATE/TB" \
       -af "aselect='lte(t\,4)+gte(t\,16)',asetpts=N/SR/TB" \
       -i INPUT
I assume a video frame rate of 25 fps. Modify accordingly.
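To convert frame numbers to seconds, divide by the frame rate: at 25 fps, frame 100 becomes 100/25 = 4 s and frame 400 becomes 400/25 = 16 s. For a hypothetical 30 fps source, the same skip would use roughly t = 3.33 and t = 13.33:
ffplay -vf "select='lte(t\,3.33)+gte(t\,13.33)',setpts=N/FRAME_RATE/TB" \
       -af "aselect='lte(t\,3.33)+gte(t\,13.33)',asetpts=N/SR/TB" \
       -i INPUT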

Apple Quick Time Mov Files Slow Down Playback Rate Via Command Line

I am looking for a command line to slow down QuickTime-formatted MOV files, most likely using FFMPEG. I do not mind converting to MP4 format either.
To slow down your video, you have to use a multiplier greater than 1:
ffmpeg -i input.mov -filter:v "setpts=2.0*PTS" output.mov
I am not sure if this works right now.
batch slow down .mov speed (No answer here either)
Almost impossible without full re-encoding (or transcoding).
If the source is video only, it can easily be done with simple hex editing. Just change the track timescale value in the mdhd box =>
http://wiki.multimedia.cx/?title=QuickTime_container#mdhd
The lower the timescale, the slower the playback rate.
I've tested it, and it works as follows:
1) Find out the current frame rate with the MediaInfo tool
2) Open the file with HxD
3) Search for 'mdhd'
4) Between 'mdhd' and 'hdlr', find the big-endian timescale value (a 32-bit field, often equal to the frame rate or a multiple of it) and change it
I'm not sure, but this kind of hack doesn't seem to be supported by ffmpeg.
If the file also has an audio track, changing its timescale will produce noisy sound, so re-encoding is unavoidable.
Transcoding is rather straightforward work; I'd recommend HandBrake or other GUI frontends.
Use this line, which slows the video to half speed and stretches the audio to match:
ffmpeg -i input.mov -filter_complex "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]" -map "[v]" -map "[a]" output.mov
I used this link:
https://trac.ffmpeg.org/wiki/How%20to%20speed%20up%20/%20slow%20down%20a%20video

Converting mp4 to webm in a batch

I have a large batch (400+) of mp4 files I wish to convert to webm.
I've tried:
ffmpeg -i myfile.mp4 -c:v libvpx -minrate 1M -maxrate 1M -b:v 1M myfile.webm
but the file is corrupt when I try to play it. Could anyone help? Here is input data for one of the mp4 files... I'm not really good enough with these things to know what it all means, but I tried my best to pull out bits I thought might be relevant.
Format : MPEG-4
File size : 2.18 MiB
Duration : 1s 857ms
Overall bit rate mode : Variable
Overall bit rate : 9 829 Kbps
Video
Codec ID : 20
Bit rate mode : Constant
Bit rate : 9 808 Kbps
Width : 1 280 pixels
Height : 720 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 59.940 fps
Bit depth : 8 bits
Scan type : Progressive
Compression mode : Lossy
Bits/(Pixel*Frame) : 0.178
Stream size : 2.15 MiB (99%)
Writing library : Lavc54.59.100
When I do that with an MP4 file of my own, the output actually plays fine in both mplayer and VLC; you might want to read ffmpeg's official examples on this.
You should also define the audio codec you want to use; whatever tells you the file is "broken" may simply not like what it finds there, which is probably whatever was in the MP4 container to begin with:
ffmpeg -i input.mp4 -c:v libvpx -qmin 0 -qmax 50 -crf 5 -b:v 1M -c:a libvorbis output.webm
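Since you have 400+ files, you can wrap whichever command works for you in a shell loop; a minimal bash sketch (assuming all the .mp4 files sit in the current directory) would be:
# convert every .mp4 in the current directory to .webm
for f in *.mp4; do
  ffmpeg -i "$f" -c:v libvpx -qmin 0 -qmax 50 -crf 5 -b:v 1M -c:a libvorbis "${f%.mp4}.webm"
done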