Is redirected stdout/stderr buffered in install4j? - install4j

I'm using install4j to generate Windows executables.
The launcher is configured to redirect stderr and stdout to log\error.log and log\output.log, respectively.
This all works as intended: the log files are written in the expected location and with the expected content.
However, I do not know whether output is flushed or buffered.
That is, if I kill the program via the Task Manager, can I expect to see the last line that was printed to stderr, or should I expect to lose some output?
(Both outcomes would be fine; I just need to know what will happen so I know how to interpret the log files I'm getting, and what to ask of customers to make sure that I get complete logs.)

The redirection files are flushed for every newline, but not for every character. In other words, complete (newline-terminated) lines should survive a kill, but a partial line that has not yet been terminated by a newline may be lost.
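If you want to verify this on your own machine, a rough PowerShell sketch of the experiment would look like the following (myapp.exe and the log path are placeholders for your own launcher and redirection settings):

$p = Start-Process -FilePath '.\myapp.exe' -PassThru
Start-Sleep -Seconds 5
Stop-Process -Id $p.Id -Force             # roughly what ending the task in Task Manager does
Get-Content '.\log\error.log' -Tail 5     # newline-terminated lines written before the kill should appear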

Related

Having PowerShell autofill a command line prompt

There is an old command line tool my company uses to deploy log files to various servers. Whoever wrote it made it very, very repetitive.
There is a lot of prompting that happens, and I want to automate this process. We have a long-term goal of replacing this .exe file down the line, but for now automation works for the short term.
Example
./logdeploy.exe
Enter the destination folder:
I would like the PowerShell script to just enter the folder automatically, since it's literally the same folder every time. This exe is going to ask for it at least 20 times throughout the process, so copy-paste gets annoying.
Is this even possible to do?
If there really is no way around simulating interactive user input in order to automate your external program, a solution is possible under the following assumption:
Your external program reads interactive responses from stdin (the standard input stream).
While doing so is typical, it's conceivable that a given program's security-sensitive prompts, such as for passwords, deliberately accept input from the terminal only, so as to expressly prevent automating responses.
If the first assumption holds, the specific method that must be used to send the response strings via stdin depends on whether the external program clears the keyboard buffer before each prompt.
(a) If it does not, you can simply send all strings in a single operation.
(b) If it does, you need to insert delays between sending the individual strings, so as to ensure that input is only sent when the external program is actively prompting for input.
This approach is inherently brittle, because in the absence of being able to detect when the external program is ready to read a prompt response, you have to guess how much time needs to elapse between sending responses - and that time may vary based on many runtime conditions.
It's best to use longer delays for better reliability, which, however, results in increased runtime overall.
Implementation of (a):
As zett42 and Mathias R. Jessen suggest, use the following to send strings C:\foo and somepass 20 times to your external program's stdin stream:
('C:\foo', 'somepass') * 20 | ./logdeploy.exe
Again, this assumes that ./logdeploy.exe buffers keyboard input it receives before it puts up the next prompt. (The * 20 replication builds a 40-element array, and PowerShell sends each element to the program's stdin as a separate line.)
Implementation of (b):
Note: The following works in PowerShell (Core) 7+ only, because only there is output piped to an external program streamed properly (sent line by line, as it becomes available); Windows PowerShell unfortunately collects all output first.
# PowerShell 7+ only
# Adjust the Start-Sleep intervals as needed.
1..20 | ForEach-Object {
    Start-Sleep 1    # wait for the folder prompt to appear
    'C:\foo'         # response to the folder prompt
    Start-Sleep 2    # wait for the next prompt
    'somepass'       # response to the password prompt
} | ./logdeploy.exe

DBeaver: Redirect server output to file

I'm using DBeaver to execute a large script file which produces a lot of output (via PostgreSQL's RAISE NOTICE statement). I can see the output in the server output tab; however, the buffer size seems to be limited, so a lot of output is lost at the end of the execution.
Is it somehow possible to either increase the server output tab buffer size or redirect the server output directly to a file?
I was experiencing the same issue as you, and I have been unable to find any setting which limits the output length.
In my case, what I eventually discovered was that there was an error in my script which was causing it to fail silently. It looked like part of the output was missing, but in fact the script was terminating prematurely.
I encourage you to consider this option, and check your script for errors. Be aware that errors in the script don't appear in the output log.
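If the goal is simply to get all of the notices into a file, one possible workaround outside DBeaver is to run the script with psql, which prints RAISE NOTICE output on stderr, so it can be redirected to a file (the database and file names below are just examples):

# psql prints notices on stderr; 2> captures them to a file.
psql -d mydb -f script.sql 2> notices.log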

PowerShell monitoring external processes as they run

I have a script that runs an external process which creates some output (a file), and I also capture the console output to a file (the log):
try
{
    p4d -r $root -jc | Out-File $output
}
I later check the log output, grab some info and the script carries on.
The problem is that the external process could stall (and has, once), and I need a way to detect that on the fly so I can handle the error.
The best way I can think of is to monitor the file that the process creates for increasing size. Obviously this isn't without issues, as it could potentially stall at any point and we don't know the resulting file size.
I will likely take the size of the last successful run and use that to set some limits.
My question is: how do I check on a process while it's still running?
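In case it helps, here is a minimal sketch of the file-growth idea, assuming p4d, $root, and $output as in the question, and assuming that a stalled run stops growing the output file (you could equally point the check at the checkpoint file the process creates; the 30-second interval is a guess to adjust):

$proc = Start-Process -FilePath 'p4d' -ArgumentList '-r', $root, '-jc' -RedirectStandardOutput $output -PassThru
$lastSize = 0
while (-not $proc.HasExited) {
    Start-Sleep -Seconds 30
    $size = if (Test-Path $output) { (Get-Item $output).Length } else { 0 }
    if ($size -le $lastSize) {
        Write-Warning "Output has not grown in the last 30 seconds - p4d may have stalled."
        # Handle the stall here, e.g. Stop-Process -Id $proc.Id and report the error.
    }
    $lastSize = $size
}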

Tell if STDOUT of Perl Script is redirected

So I wrote a simple HTTP 1.0 server, and I have some Perl scripts on the server. In particular, I have a histogram script that emits an HTML form whose form action is another CGI file. Here's the code:
print "<form action=\"tallyandprint.cgi\" method=\"GET\">";
Now, when I call tallyandprint.cgi, it plots a graph with gnuplot and sends it to the user's browser (STDOUT is redirected in the HTTP server code, so Perl inherits it). I also want to be able to run tallyandprint.cgi from bash, but with a different style of arguments. Right now, I grab the patterns in Perl by parsing the URL and splitting the contents on the + symbol (for example, the URL ends in ?pattern=1+2+3+4).
That's fine and dandy, but I don't want my arguments to be written in bash as 1+2+3+4; I'd rather separate them differently. I tried to use Perl's version of isatty(), but since the input is always from the terminal (because the server executes the script), I cannot distinguish whether the input is from bash or from a web browser this way.
My next thought was to find out whether STDOUT is redirected: if the web server runs the CGI, STDOUT will be redirected to the socket the user is connected to, whereas if it's run from bash, STDOUT should be the normal tty. How can I check this in Perl?
use v5.10;  # enables say

# -t tests whether a filehandle is attached to a terminal (a tty).
if (-t STDOUT) {
    say "STDOUT is probably not redirected";
}
else {
    say "STDOUT is probably redirected";
}

Check progress of silent Terminal command writing a file?

Not really sure if this is possible, but I am running this on Terminal:
script -q /my/directory/$outfile ./lexparser.csh $file
Explanation
This is run from a Perl script. /my/directory/$outfile is where I am saving the output of the Terminal command, and ./lexparser.csh $file calls that script to work on the input file, $file.
Problem
I passed -q because I didn't want to save the unnecessary prints to the file. The file is big, ~30 thousand lines of text, so it has been running for some time now, which was expected.
Question
I would like to check and ensure everything is going smoothly. The output file shows up in Finder, but I'm afraid that if I click on it, it will ruin the output. How can I check the progress (possibly by viewing the current text file) without disrupting the process?
Thanks for your time, let me know if the question is unclear.
Open a new Terminal, navigate to the output directory, and:
tail -f <output_file>
You will continue to see new data appended to the file without any interruption to the writing process. Just leave the Terminal open with the tail going, and you can watch it all day long. Grab some popcorn.
In addition to tail, you can also learn about tee. The point of tee is to write output to a file while also passing it through to STDOUT in your terminal - for example, ./lexparser.csh $file | tee /my/directory/$outfile. Best of both worlds! Well, some good aspects of two possible worlds.
You could tail the file via the command line, which shouldn't cause problems.
Additionally, you could have the program print to stderr as well as stdout, redirect stdout to the file, and let stderr through so the program can report its progress. Though that is more of a 20/20 hindsight solution.