Windows command line execute pdftk.exe with arguments length > 8192 characters - command-line

I need to merge ~1000 PDF files. I call pdftk from my program for this purpose. However, the input arguments can exceed the maximum Windows command-line length of 8192 characters, and I get an error.
I tried to store my input in a txt file, but without success:
pdftk < file.txt
type file.txt | pdftk
It looks like pdftk does not take its input from redirection or pipe operators.
I tested it on Windows 10.
Does anyone know how it can be achieved?
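One way around the limit, assuming you can drive the merge from PowerShell and pdftk.exe is on the PATH (a rough, untested sketch; the folder, file names and chunk size are just placeholders), is to merge in chunks so that no single command line comes near 8192 characters, and then merge the intermediate parts:
# collect the full paths of the PDFs to merge
$files = Get-ChildItem .\input\*.pdf | ForEach-Object { $_.FullName }
$chunkSize = 100
$parts = @()
for ($i = 0; $i -lt $files.Count; $i += $chunkSize) {
    $part = ".\part_$i.pdf"
    # PowerShell expands the array into separate arguments for pdftk.exe
    $chunk = $files[$i..([Math]::Min($i + $chunkSize, $files.Count) - 1)]
    & pdftk $chunk cat output $part
    $parts += $part
}
# finally merge the (much smaller) set of intermediate files
& pdftk $parts cat output .\merged.pdf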

Related

Exiftool: Want to output to one text file using -w command

I'm currently trying to use exiftool from the Windows command prompt to read metadata from multiple files and then output it to a single text file.
The exact command I last tried looked like this:
exiftool.exe -FileName -GPSPosition -CreateDate -d "%m:%d:%Y %H:%M:%S" -c "%d° %d' %.2f"\" -charset UTF-8 -ext jpg -w _Coordinate_Date.txt S:\Nick\Test\
When I run this, I get 7 individual text files with the content for one corresponding file in each of them. However, I simply want to output all of it to one single text file. Any help is greatly appreciated
The -w (textout) option can only be used to write multiple files. It is not meant to be used to output to a single file. As per the docs on -w:
It is not possible to specify a simple filename as an argument -- creating a single output file from multiple source files is typically done by shell redirection
Which is what you're doing with the >> ./output.txt part of your command. The -w _Coordinate_Date.txt isn't doing anything and would, I think, throw an Invalid TAG name: "w _Coordinate_Date.txt" error if quoted together like that, because it gets treated as a single argument. The -w option requires two arguments: the -w itself and either an extension or a format string.
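For example, dropping -w entirely and letting the shell collect everything into one file (untested; same options as in the question):
exiftool.exe -FileName -GPSPosition -CreateDate -d "%m:%d:%Y %H:%M:%S" -c "%d° %d' %.2f"\" -charset UTF-8 -ext jpg S:\Nick\Test\ > _Coordinate_Date.txt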
I actually figured it out: if you wrap the entire -w _Coordinate_Date.txt command in quotation marks and append it to a file, you can send all of the output to one text file.
i.e. "-w _Coordinate_Date.txt >> ./output.txt"

ffmpeg concat command not reading input file correctly

I am trying to concatenate two video files using ffmpeg, and I am receiving an error.
ffmpeg -f concat -safe 0 -i list.txt -c copy concat.mp4
And the error output I receive is....
[concat @ 0x7ff922000000] Line 1: unknown keyword '43.mp4'
list.txt: Invalid data found when processing input
It looks like the file names in the list have to be specially formatted to look like:
file '/path/to/file1.wav'
with the keyword file included. I spent a lot of time trying to guess why ffmpeg encountered an error reading the file names; it didn't matter whether they were in the list or on the command line. Only after I used the command from ffmpeg's manual to build the list,
for f in *.wav; do echo "file '$f'" >> mylist.txt; done
did I succeed. The only difference was the additional keyword file.
Here you can read it yourself: https://trac.ffmpeg.org/wiki/Concatenate#demuxer
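If you need to build the list on Windows, a rough PowerShell equivalent of that bash loop would be (untested; -Encoding ascii just keeps the list as plain text for the concat demuxer):
Get-ChildItem *.wav | ForEach-Object { "file '$($_.Name)'" } | Set-Content -Encoding ascii mylist.txt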

Random linebreaks in PowerShell standard error output

I want to convert many .iso files to .mp4 with HandBrake, so I am trying to use the command line interface. I would prefer to write my scripts for this in PowerShell instead of batch files. However, the standard error output contains line breaks at random locations if I use PowerShell.
For troubleshooting, I created a simplified script both in powershell and in batch.
Powershell:
& "$Env:ProgramFiles\HandBrake\HandBrakeCLI.exe" #(
'--input', 'V:\',
'--title', '1', '--chapter', '1',
'--start-at', 'duration:110', '--stop-at', 'duration:15',
'--output', 'pmovie.mp4',
'--format', 'av_mp4'
) > ".\pstd.txt" 2> ".\perr.txt"
Batch file:
"%ProgramFiles%\HandBrake\HandBrakeCLI.exe" --input V:\ --title 1 --chapter 1 --start-at duration:110 --stop-at duration:15 --output ".\cmovie.mp4" --format av_mp4 > ".\cstd.txt" 2> ".\cerr.txt"
Both scripts create the same .mp4 file, the difference is only the standard error output they create:
Powershell:
HandBrakeCLI.exe : [10:41:44] hb_init: starting libhb thread
At C:\Test\phandbrake.ps1:1 char:2
+ & <<<< "$Env:ProgramFiles\HandBrake\HandBrakeCLI.exe" @(
+ CategoryInfo : NotSpecified: ([10:41:44] hb_i...ng libhb thread
:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
[10:41:44] thread 541fc20 started ("libhb")
HandBrake 1.1.2 (2018090500) - MinGW x86_64 - https://handbrake.fr
8 CPUs detected
O
pening V:\...
[10:41:44] CPU: Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz
[10:41:44] - Intel microarchitecture Sandy Bridge
[10:41:44] - logical processor count: 8
[10:41:44] Intel Quick Sync Video support: no
[10:41:44] hb_scan: path=V:\, title_index=1
src/libbluray/disc/disc.c:424: error opening file BDMV\index.bdmv
src/libbluray/disc/disc.c:424: error opening file BDMV\BACKUP\index.bdmv
[10:41:44] bd: not a bd - trying as a stream/file instead
libdvdnav: Using dvdnav version 6.0.0
l
ibdvdnav: Unable to open device file V:\.
libdvdnav: vm: dvd_read_name failed
libdvdnav: DVD disk re
ports i
tself wi
th Region mask 0x
0000000
0. Reg
ions:
1 2 3 4 5
6 7 8
Batch file:
[10:41:35] hb_init: starting libhb thread
[10:41:35] thread 5a2cc30 started ("libhb")
HandBrake 1.1.2 (2018090500) - MinGW x86_64 - https://handbrake.fr
8 CPUs detected
Opening V:\...
[10:41:35] CPU: Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz
[10:41:35] - Intel microarchitecture Sandy Bridge
[10:41:35] - logical processor count: 8
[10:41:35] Intel Quick Sync Video support: no
[10:41:35] hb_scan: path=V:\, title_index=1
src/libbluray/disc/disc.c:424: error opening file BDMV\index.bdmv
src/libbluray/disc/disc.c:424: error opening file BDMV\BACKUP\index.bdmv
[10:41:35] bd: not a bd - trying as a stream/file instead
libdvdnav: Using dvdnav version 6.0.0
libdvdnav: Unable to open device file V:\.
libdvdnav: vm: dvd_read_name failed
libdvdnav: DVD disk reports itself with Region mask 0x00000000. Regions: 1 2 3 4 5 6 7 8
libdvdread: Attempting to retrieve all CSS keys
libdvdread: This can take a _long_ time, please be patient
libdvdread: Get key for /VIDEO_TS/VIDEO_TS.VOB at 0x00000130
libdvdread: Elapsed time 0
This bothers me because I would like to check these text files to be sure that there was no error during the encoding.
I suppose this may be related to a lack of synchronization between threads that write to the same stream but I am not sure about it.
The question: What can I do to get the standard error output from PowerShell without these random line breaks?
You might try the Start-Process command, with -RedirectStandardError, -RedirectStandardInput, and -Wait options.
These -Redirect... options on Start-Process do OS-level I/O redirection directly to the target file, as most shells do. As I understand it, that's not how PowerShell angle-bracket redirection works; instead, the angle brackets pipe the output through another PowerShell pipeline, using Out-File (or something similar), which inserts line breaks between the strings it receives.
I'm not sure of the exact details of this, but I'm glad to hear it seems to address the problem for you as it has for me.
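Roughly along these lines, reusing the arguments from the question (untested sketch):
Start-Process -FilePath "$Env:ProgramFiles\HandBrake\HandBrakeCLI.exe" `
    -ArgumentList '--input', 'V:\', '--title', '1', '--chapter', '1',
                  '--start-at', 'duration:110', '--stop-at', 'duration:15',
                  '--output', 'pmovie.mp4', '--format', 'av_mp4' `
    -RedirectStandardOutput '.\pstd.txt' -RedirectStandardError '.\perr.txt' `
    -NoNewWindow -Wait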
I think the issue here is that there is a certain width to the console, and the console itself is essentially being redirected to a file.
My solution to this is to redirect the output directly to the pipeline, using:
2>&1 #Interpreted by the console
2>&1 | x #Output directly to x
And then using Out-File with the available -Width parameter:
$(throw thisisnotsometthingyoucanthrowbutisinfactaverylongmessagethatdemonstratesmypoint) 2>&1 |
Out-File "test.txt" -Width 10000
In this case, PowerShell will write 10,000 characters before wrapping the text.
However, you also have some odd line breaks in there that I can't replicate right now. That said, now that you know how to send output through the pipeline, you can use other methods to remove the line breaks.
For example, you can use this function which prints out the exact control characters that cause line breaks.
$(throw error) 2>&1 | Out-String | Debug-String
Then, you can go through the output and replace the problem characters, like so:
$(throw error) 2>&1 | Out-String | % {$_ -replace "`r"} | Out-File "test.txt" -Width 10000
Burt Harris' helpful answer shows you one way to avoid the problem, via Start-Process, which requires you to structure the command fundamentally differently, however.
If the output that an equivalent batch file produces is sufficient, there's an easier way: simply call cmd /c and let cmd handle the output redirections, as in your batch file:
cmd /c "`"`"$Env:ProgramFiles\HandBrake\HandBrakeCLI.exe`"`"" @(
'--input', 'V:\',
'--title', '1', '--chapter', '1',
'--start-at', 'duration:110', '--stop-at', 'duration:15',
'--output', 'pmovie.mp4',
'--format', 'av_mp4'
) '> .\pstd.txt 2> .\perr.txt'
Note how the two output redirections are passed as a single, quoted string, to ensure that they are interpreted by cmd.exe rather than by PowerShell.
Also note the embedded escaped double quotes (`") around the executable path to ensure that cmd.exe sees the entire path as a single, double-quoted string.
As for the extra line breaks you're seeing:
I have no specific explanation, but I can tell you how > and 2> work differently in PowerShell - both compared to cmd.exe (batch files) and Start-Process with -RedirectStandard*:
cmd.exe's redirection operator (>) writes raw bytes to the specified target file, both when redirecting stdout (just > or, explicitly, 1>) and stderr (2>); as such, text output by external programs such as HandBrakeCLI.exe is passed through as-is.
Start-Process, which uses the .NET API under the hood, does essentially the same when -RedirectStandardOutput and/or -RedirectStandardError parameters are specified.
By contrast, PowerShell's own > operator functions differently:
For PowerShell's own commands (cmdlets, functions, scripts), it converts input objects (that aren't already strings) to strings using PowerShell's rich output-formatting system before sending them to the output file(s), using the character encoding detailed below.
Output received from external programs is assumed to be text, whose encoding is assumed to be the system's OEM character encoding by default, as reflected in [console]::OutputEncoding and chcp. The decoded text is loaded into .NET strings (which are inherently UTF-16-based) line by line.
For redirected stdout output, these strings are re-encoded on output to the target file, using the following encoding by default:
Windows PowerShell: UTF-16LE ("Unicode")
PowerShell Core: UTF-8 without BOM
Note: Only in Windows PowerShell v5.1 or higher and PowerShell Core can you change these defaults - see this answer for details.
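For example, in Windows PowerShell v5.1+ you can make > (and Out-File) default to UTF-8 for the current session (a sketch of the approach referenced above):
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'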
By contrast, when redirecting stderr output, via stream 2 (PowerShell's error stream), the strings are wrapped in error objects (instances of type [System.Management.Automation.ErrorRecord]) before being output, and the resulting objects are converted to strings based on PowerShell's output-formatting system, and the same character encoding as above is applied on output to the target file.
You can see evidence of that in your output containing extra information and lines such as HandBrakeCLI.exe : [10:41:44] hb_init: starting libhb thread and
At C:\Test\phandbrake.ps1:1 char:2, ...
It also means that extra line breaks can be introduced, because text produced by the output-formatting system assumes a fixed line width based on the console window's width.
That said, that doesn't explain the oddly placed line breaks in your case.

How to prevent the output being truncated if the output from windbg is too large?

If the output of a windbg command is very large, such as 100k rows, windbg only displays a few thousand rows and most of the output gets truncated. My question is: how can I prevent the output from being truncated, or write all of the rows to a local file so that nothing is lost? The "write Windows text to file" option isn't helpful.
Not sure if it solves your case, but the .logopen and .logclose commands might help (they respectively open and close a log file that keeps a copy of the events and commands from the Debugger Command window).
See also Keeping a Log File in WinDbg.
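For example (the path is just a placeholder):
.logopen C:\temp\windbg-output.log
<run the command whose output you want to keep>
.logclose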
Sometimes simply piping works, especially when running cdb and quitting after executing just one command:
cdb -c "tc 100;q" calc >> foo.txt
You should have 100 calls; let's check:
grep -c !.*: foo.txt
256
Let's check how many sysenter instructions were executed and what the syscall indexes were:
grep sysenter -B 4 foo.txt | grep eax | awk "{print $1}"
eax=000000ea
eax=0000014d
eax=000000fb
If .logopen / .logclose isn't an option, this approach lets you use the output even while a command runs for an infinite amount of time, without running into file-locking issues.
Try opening an additional command window with Ctrl+N and executing the command with the long output there.

redirect input in command line

I don't really understand input redirection in DOS mode.
I know the working example: sort < list.txt
which sorts the content of my list.txt
but why doesn't this work:
dir < arguments.txt
the content of my arguments.txt file is for instance just: -D
I would expect the command
dir < arguments.txt
to be equal to
dir /D
why isn't this working?
thanks
juergen
The < operator redirects console input to come from a file. The corresponding > operator redirects console output to go to a file.
The sort command reads the console until it reaches end of file (Ctrl+Z) and then produces the sorted result.
The dir command does not accept console input, only arguments on the command line, so the file containing /D is never read.
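If you really do want the switch to come from a file, one option is to have cmd read the file and build the command line itself, e.g. with for /f (a sketch; at the prompt use %a, inside a batch file use %%a):
for /f "usebackq delims=" %a in ("arguments.txt") do dir %a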