troubles while redirecting stderr in csh - perl

I'm writing a Perl script that should execute shell commands and parse their output. The shell I intend to use is csh. I started with this:
my $out = `cmd`;
but it doesn't capture STDERR, which I need too. Running sh with output redirection does nothing either:
my $out = `sh -c "cmd 2>&1"`;
still captures only STDOUT, not STDERR.
Even redirecting to a file in csh doesn't work for me:
tcsh$ cmd >& logfile.log
still captures STDOUT only %)
The command I'm trying to execute is actually an sh script, and some commands in that script print to STDERR; I want to capture that output. If I execute sh -c "cmd 2>/dev/null", STDERR really does go to /dev/null and only STDOUT is printed in the terminal.
Could anyone help me with this?

I believe there is something you are not telling us. Are you on cygwin? Or Windows? Do you have a PERL5SHELL environment variable set?
There is something that you are not telling us because both of these work fine on the five platforms I can easily test on:
% perl -le '$out = `sh -c "grep missing /dev/nowhere 2>&1" | cat -n`; chomp $out; print "got <<<$out>>>"'
got <<< 1 grep: /dev/nowhere: No such file or directory>>>
But in fact, there is no reason to call sh(1) explicitly for shelling out. That's because Perl always calls sh(1) for all its backtick, pipe-open, and system() shell-outs:
% perl -le '$out = `grep missing /dev/nowhere 2>&1 | cat -n`; chomp $out; print "got <<<$out>>>"'
got <<< 1 grep: /dev/nowhere: No such file or directory>>>
The only exception to this I can think of occurs on non-Unix systems, where, because they have no /bin/sh, something else is defined.
But under no circumstances will simple shell-outs be calling tcsh(1) behind your back. You'd have had to seriously hack the perl(1) source to get that to happen. I also rather doubt you could (easily) hack the binary, since the string "/bin/tcsh" is longer than "/bin/sh", and it isn't very often going to be found in /bin/ anyway.
That you can’t get stderr redirection working even from the shell says something pretty weird is going on. I think we need more information.

Here, you are capturing the STDOUT of sh, which is not the STDERR of cmd:
my $out = `sh -c "cmd 2>&1"`;
Can you just run cmd directly?
my $out = `cmd 2>&1`;

Backquotes capture STDOUT, not STDERR.
system sends both STDOUT and STDERR to wherever the parent's handles point.
If you want to capture STDERR, you need something like IPC::Open3:
Extremely similar to open2(), open3() spawns the given $cmd and connects CHLD_OUT for reading from the child, CHLD_IN for writing to the child, and CHLD_ERR for errors. If CHLD_ERR is false, or the same file descriptor as CHLD_OUT, then STDOUT and STDERR of the child are on the same filehandle.
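For instance, a minimal open3() sketch (the command name is a placeholder, and anything that produces a lot of output would need select() or IPC::Run instead of two sequential reads, to avoid pipe deadlocks) might look like this:
use strict;
use warnings;
use IPC::Open3;
use Symbol 'gensym';      # gensym() gives a fresh glob for the stderr handle

my $cmd = 'cmd';          # placeholder for the real command
my $err = gensym;         # must be a real glob, or stderr gets merged into stdout
my $pid = open3(my $in, my $out, $err, $cmd);

close $in;                        # nothing to write to the child
my @stdout_lines = <$out>;        # what the child wrote to STDOUT
my @stderr_lines = <$err>;        # what it wrote to STDERR

waitpid($pid, 0);                 # reap the child; its exit status lands in $?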

You said that running the command cmd >& logfile.log in tcsh sends only cmd's stdout to the log file, not its stderr. That doesn't make sense.
Try replacing cmd with the following script:
#!/bin/sh
echo stdout
echo STDERR 1>&2
Both "stdout" and "STDERR" should show up in logfile.log.
If so, then perhaps your "cmd" is doing something odd. My best guess is that cmd is writing to /dev/tty, not to either stdout or stderr; that wouldn't be affected by redirection.
To see what I mean, add this line to the above script:
echo tty > /dev/tty

I don't really have time to mock up an example as I normally would, nor even test one. I am thinking that you might try using Capture::Tiny to see if that helps.
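For what it's worth, a minimal Capture::Tiny sketch (untested, and cmd is only a stand-in for the real command) would look something like this:
use strict;
use warnings;
use Capture::Tiny qw(capture);

# capture() returns whatever the block wrote to STDOUT and STDERR,
# plus (in list context) the return value of the block itself.
my ($stdout, $stderr, $exit) = capture {
    system('cmd');        # placeholder for the real command
};

print "STDOUT: $stdout";
print "STDERR: $stderr";
print "exit status: ", $exit >> 8, "\n";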

Related

Windows command prompt creating but not redirecting output to file

I'm having the opposite problem to the one in so many posts I've seen on here.
I'm running a perl command written by someone else, and the output is all being forced to the screen despite using the ">" operator.
Windows clearly knows what I'm intending, because the file name I give is created fresh every time I execute my command, but the log file stays empty at 0 bytes.
My perl executable lives in a different place than my perl routine/.pl file.
I tried running as administrator and not.
This is not something wrong with the program. Some of my coworkers execute it just fine and there is no output to their screens.
The general syntax is:
F:\git\repoFolderStructure\bin>
F:\git\repoFolderStructure\bin>perl alog.pl param1 param2 commaSeparatedParam3 2020-12-17,18:32:33 2020-12-17,18:33:33 > mylogfile.log
>>>>>Lots and lots of output I wish was in a file
I also attempted this from the directory containing my perl.exe, giving the path to my repo folder's bin.
Is there something weird about Windows that could create the file but prevent the > operator from redirecting into it?
Here's the kicker: I did ipconfig > out.txt just fine, though...nothing written to the screen.
Thanks for any tips for what I could do to try and change the behavior!
It could be that the output is being sent to STDERR, while you are capturing STDOUT. Append 2>&1 to capture both to the same file.
>perl -e"print qq{STDOUT\n}; warn qq{STDERR\n};" >stdout
STDERR
>type stdout
STDOUT
>perl -e"print qq{STDOUT\n}; warn qq{STDERR\n};" 2>stderr
STDOUT
>type stderr
STDERR
>perl -e"print qq{STDOUT\n}; warn qq{STDERR\n};" >stdout 2>stderr
>type stdout
STDOUT
>type stderr
STDERR
>perl -e"print qq{STDOUT\n}; warn qq{STDERR\n};" >both 2>&1
>type both
STDERR
STDOUT
Note that 2>&1 must come after you redirect STDOUT if you want to combine both streams.

Perl STDERR printed in the wrong order with Tee

I'm trying to redirect STDOUT and STDERR from a perl script - executed from a bash script - to both screen and log file.
perlscript.pl
#!/usr/bin/perl
print "This is a standard output";
print "This is a second standard output";
print STDERR "This is an error";
bashscript.sh
#!/bin/bash
./perlscript.pl 2>&1 | tee -a logfile.log
If I execute the perlscript directly, the screen output is printed in the correct order:
This is a standard output
This is a second standard output
This is an error
But when I execute the bash script, the STDERR line is printed first (both on screen and in the file):
This is an error
This is a standard output
This is a second standard output
With a bash script as the child, the output is ordered flawlessly. Is it a bug in perl or tee? Am I doing something wrong?
The usual trick to turn off buffering is to set the variable $|. Add the line below at the beginning of your script:
$| = 1;
This turns the buffering off. Also refer to this excellent article by MJD explaining buffering in Perl: Suffering from Buffering?
I guess this has to do with the way STDOUT and STDERR buffers are flushed. Try
autoflush STDOUT 1;
at the beginning of your perl script so that STDOUT is flushed after each print statement.
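Put together, a sketch of the test script with STDOUT unbuffered (STDOUT->autoflush(1) is the method form of autoflush STDOUT 1, and the trailing newlines are assumed from the sample output) looks like:
#!/usr/bin/perl
use IO::Handle;            # supplies the autoflush method on older perls

STDOUT->autoflush(1);      # flush STDOUT after every print

print "This is a standard output\n";
print "This is a second standard output\n";
print STDERR "This is an error\n";   # STDERR is unbuffered by default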

From Perl, spawn a shell, configure it, and fork the STDOUT

I use a Perl script to configure and spawn a compiled program that needs a subshell configured a certain way, so I use $returncode = system("ulimit -s unlimited; sg ourgroup 'MyExecutable.exe'");
I want to capture and parse the STDOUT from that, but I need it forked, so that the output can be checked while the job is still running. This question comes close:
How can I send Perl output to both STDOUT and a variable? The highest-rated answer there describes a function called backtick() that creates a child process, captures STDOUT, and runs a command in it with exec().
But the calls I have require multiple lines to configure the shell. One solution would be to create a disposable shell script:
#disposable.sh
#!/bin/sh
ulimit -s unlimited
sg ourgroup 'MyExecutable.exe'
I could then get what I need either with backtick(disposable.sh) or open(PROCESS,'disposable.sh|').
But I'd really rather not make a scratch file for this. system() happily accepts multi-line command strings. How can I get exec() or open() to do the same?
If you want to use the shell's power (which includes loops and variables, but also multiple-command execution), you have to invoke the shell (open(..., 'xxx|') doesn't do that).
You can pass your shell script to the shell with the -c option of the shell (another possibility would be to pipe the commands to the shell, but that's more difficult IMHO).
That means calling the backtick function from the other answer like this:
backtick("sh", "-c", "ulimit -s unlimited; sg ourgroup 'MyExecutable.exe'");
The system tee with backticks will do this, no?
my $output = `(ulimit -s unlimited; sg ourgroup 'MyExecutable.exe') | tee /dev/tty`;
or modify Alnitak's backticks (so it does use a subshell)?
my $cmd = "ulimit -s unlimiited ; sg ourgroup 'MyExecutable.exe'";
my $pid = open(CMD, "($cmd) |");
my $output;
while (<CMD>) {
print STDOUT $_;
$output .= $_;
}
close CMD;
Expect can be used here, since you are interacting with your program: http://metacpan.org/pod/Expect
Assuming the /bin/bash prompt on your *nix matches something like bash-3.2$, the program below can be used to launch a number of commands on the bash console with $exp->send, and the output of each command can then be parsed for further action.
#!/usr/bin/perl
use Expect;
my $command="/bin/bash";
my @parameters;
my $exp= new Expect;
$exp->raw_pty(1);
$exp->spawn($command);
$exp->expect(5, '-re', 'bash.*$');
$exp->send("who \n");
$exp->expect(10, '-re', 'bash.*$');
my @output = $exp->before();
print "Output of who command is @output \n";
$exp->send("ls -lt \n");
$exp->expect(10, '-re', 'bash.*$');
@output = $exp->before();
print "Output of ls command is @output \n";

stop console output from unix command execution in perl

I am running unix commands in perl as
$cmd="ls -l";
$result=`$cmd`;
print $log_file $result;
but the output from execution is also printed on screen.
How do I execute the command without printing its result on screen?
The standard output stream isn't printed to the screen -- it's captured into $result. But there is a second output stream, called the standard error stream, that programs can write to, and which by default also goes to the screen. Many programs use this stream for logging, or (in the case of ls) for writing error messages. To capture this in addition, use
$cmd = "ls -l 2>&1";
To discard it instead, use
$cmd = "ls -l 2>/dev/null";
I think the mistake is elsewhere. Try this in a shell:
perl -e 'my $x = `ls -l`;'
That doesn't produce any output.
It probably happens because $log_file is somehow attached to STDOUT -- duplicated from it, something like:
open(my $log_file, ">&STDOUT")

Capturing the output of STDERR while piping STDOUT to a file

I have a rather odd situation. I'm trying to automate the backup of a collection of SVN repositories with Perl. I'm shelling out to the svnadmin dump command, which sends the dump to STDOUT, and any errors it encounters to STDERR.
The command I need to run will be of the form:
svnadmin dump $repo -q >$backupFile
STDOUT will go to the backup file, but STDERR is what I need to capture in my Perl script.
What's the right way to approach this kind of situation?
EDIT:
To clarify:
STDOUT will contain the SVN Dump data
STDERR will contain any errors that may happen
STDOUT needs to end up in a file, and STDERR needs to end up in Perl. At no point can ANYTHING but the original content of STDOUT end up in that stream, or the dump will be corrupted and I'll have something worse than no backup at all: a bad one!
Well, there are generic ways to do it within Perl too, but the bash solution (which the above makes me think you're looking for) is to redirect stderr to stdout first, and then redirect stdout to a file. Intuitively this doesn't make a whole lot of sense until you see what's happening internally in bash. But this works:
svnadmin dump $repo -q 2>&1 >$backupFile
However, do not do it the other way around (i.e., put the 2>&1 at the end), or else all the output of both stdout and stderr will go to your file.
Edit, to address some people's confusion about whether this works:
What you want is this:
# perl -e 'print STDERR "foo\n"; print "bar\n";' 2>&1 > /tmp/f
foo
# cat /tmp/f
bar
and specifically you don't want this:
# perl -e 'print STDERR "foo\n"; print "bar\n";' > /tmp/f 2>&1
# cat /tmp/f
foo
bar
Here's one way:
{
local $/; # allow reading stderr as a single chunk
open(CMD, "svnadmin dump $repo -q 2>\&1 1>$backupFile |") or die "...";
$errinfo = <CMD>; # read the stderr from the above command
close(CMD);
}
In other words, use the shell 2>&1 mechanism to get stderr to a place where Perl can easily read it, and use 1> to get the dump sent to the file. The stuff I wrote about $/ and reading the stderr as a single chunk is just for convenience -- you could read the stderr you get back any way you like of course.
While tchrist is certainly correct that you can use handle redirection and backticks to make this work, I can also recommend David Golden's Capture::Tiny module. It gives generic interfaces to capturing or tee-ing STDOUT and STDERR; from there you can do with them what you will.
This stuff is really easy. It’s what backticks were invented for, for goodness’ sake. Just do:
$his_error_output = `somecmd 2>&1 1>somefile`;
and voilà you’re done!
I don’t understand what the trouble is. Didn’t have your gazzintas drilled into you as a young child the way Jethro did? :)
From perldoc perlop for qx:
To read both a command's STDOUT and its STDERR separately, it's
easiest to redirect them separately to files, and then read from those
files when the program is done:
system("program args 1>program.stdout 2>program.stderr");