Adding logging to my Perl install script

I am working on a Perl script that will install software changes for our application. Mostly this involves running SQL files in sqlplus. While I do log each SQL file that gets run and grep each log file for errors, I would like a single log file of everything. That way, if something weird happens during the install, I will have one file with everything that happened while the patch was being installed. If I were writing this in bash I would do the following:
exec >my_log_file
exec 2>&1
... some code runs
exec 1>&-
exec 2>&-
The exec command will redirect anything that goes to stdout and stderr to my log file. At the end of the script I turn that off so that I can grep the log for errors. I can also redirect stdout back to /dev/tty and cat the log if I want.
Is there a pure Perl way to get the same effect? I'm guessing that if I run the above code with the system command in Perl, I would be creating a new process and thus would not capture all of the output from the script. I also thought about creating a bash wrapper script for the logging, but that would require the customer to run the wrapper script. It would be easier if I could do this in Perl.
Thanks.

The following is loosely adapted from an example in "perldoc -f open".
# Take copies of the original STDOUT and STDERR.
open(my $oldout, ">&", \*STDOUT) or die "Can't dup STDOUT: $!";
open(my $olderr, ">&", \*STDERR) or die "Can't dup STDERR: $!";
# Open STDOUT to a log file
open(STDOUT, '>', 'my_file.log') or die "Can't redirect STDOUT: $!";
# Copy STDERR to STDOUT
open(STDERR, ">&STDOUT") or die "Can't dup STDOUT: $!";
# Unbuffer STDOUT and STDERR
select STDERR; $| = 1;
select STDOUT; $| = 1;
# Do stuff that prints to STDOUT and STDERR
...
# Restore original STDOUT and STDERR
open(STDOUT, ">&", $oldout) or die "Can't dup \$oldout: $!";
open(STDERR, ">&", $olderr) or die "Can't dup OLDERR: $!";

Related

IPC::Open2 output to already open file handle as per doc example

The documented example in perldoc IPC::Open2 (read from the parent's STDIN and write to an already open handle) is a simplified version of what I'm trying to achieve: the parent writes a preamble to an output file, then a subprocess writes its output directly to the same file.
I've made a simple child script which reads input lines and prints to both STDERR and STDOUT, STDOUT being the 'already open handle' from the parent.
#!/usr/bin/env perl
##parent.pl
use IPC::Open2;
# read from parent STDIN and write to already open handle
open my $file, '>', 'outfile.txt' or die "open failed: $!";
my $pid = open2($file, "<&STDIN", "./child.pl");
# reap zombie and retrieve exit status
waitpid( $pid, 0 );
my $child_exit_status = $? >> 8;
#!/usr/bin/env perl
##child.pl
while (<STDIN>) {
    print STDOUT "STDOUT: ", $_;
    print STDERR "STDERR: ", $_;
}
print STDERR "END OF CHILD\n";
An example run of parent.pl:
Hello
^D
STDERR: Hello
STDERR: END OF CHILD
However, I don't see the expected "STDOUT: Hello" in the output file 'outfile.txt'.
Is there some additional setup I've missed to get this example to work?
open my $file, '>', 'outfile.txt' or die "open failed: $!";
my $pid = open2($file, "<&STDIN", "./child.pl");
This will create a new pipe and overwrite the $file variable with a handle referring to the read end of the pipe, closing the old file handle in the process ;-)
In order to pass an existing file handle to open2 or open3, you want to use the >&FILEHANDLE form, but I wasn't able to figure out any way to do that when FILEHANDLE is a lexical variable like your my $file.
But the undocumented >&NUM or >&=NUM forms (where NUM is a file descriptor number) just work:
open my $file, '>', 'outfile.txt' or die "open failed: $!";
my $pid = open2('>&'.fileno($file), '<&STDIN', './child.pl');
Example:
$ perl -MIPC::Open2 -e '
open my $f, ">foo";
open2(">&".fileno($f), "<&STDIN", "echo bar")
'; cat foo
bar
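The same trick appears to carry over to IPC::Open3 when you also want the child's STDERR on a separate handle. A sketch reusing child.pl from above; it relies on the same undocumented >&NUM form, so treat that part as an assumption:
use strict;
use warnings;
use IPC::Open3;
use Symbol 'gensym';
# Read from parent STDIN, send child stdout to the file, capture child stderr.
open my $file, '>', 'outfile.txt' or die "open failed: $!";
my $err = gensym;   # open3 wants a real glob for the error handle
my $pid = open3('<&STDIN', '>&' . fileno($file), $err, './child.pl');
print "child stderr: $_" while <$err>;
waitpid($pid, 0);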

Redirect STDERR of parent process to file handle of child process

I need to call an external logging process from a Perl script that takes data passed to it and writes it to a network service. That is easy enough to do. However, I have the additional requirement that any writes to STDERR from the parent process gets redirected to the external process.
What I've tried to do is open a file handle to a write pipe of the external process, then redirect STDERR to that file handle. Here is my test script, which unfortunately does not work yet.
#!/usr/bin/perl
use strict;
use warnings;
# open write filehandle to external process
open my $fh, '|-', 'pipefile_http'
    or die "Couldn't open logfile: $!\n";
# redirect STDERR from parent process to same write filehandle to child process
my $fileno = fileno($fh);
open STDERR, ">&$fileno" or die "Couldn't switch STDERR to fileno $fileno: $!\n";
print $fh "1. print to file handle\n";
print STDERR "2. print to STDERR\n";
print "3. print to STDOUT\n";
close $fh;
exit 0;
When I run this script, it successfully redirects the print call to STDERR to the external logging process, but the print call to $fh does not work (the message disappears). Also, the script hangs indefinitely after it successfully prints message #3 to STDOUT. When I run the script with strace, I can see that the script is hanging on a waitpid() call (the pid of the external process).
Any advice on how I can do this?
Just reassign STDERR. The glob assignment below makes STDERR an alias for the very same handle as $fh, rather than a second file descriptor duped onto the pipe. With the dup approach, closing $fh still leaves the duplicated STDERR descriptor holding the pipe's write end open, so the child never sees EOF and the implicit waitpid in close hangs:
#!/usr/bin/perl
use strict;
use warnings;
# open write filehandle to external process
open my $fh, '|-', 'pipefile_http'
    or die "Couldn't open logfile: $!\n";
# reassign STDERR
*STDERR = $fh;
print $fh "1. print to file handle\n";
print STDERR "2. print to STDERR\n";
print "3. print to STDOUT\n";
close $fh;
exit 0;
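For contrast, a sketch of how the dup approach from the question could be made to terminate: every descriptor for the pipe's write end has to be closed before the final close, so that the child sees EOF. Here cat stands in for pipefile_http:
#!/usr/bin/perl
use strict;
use warnings;
# 'cat' is a stand-in for the external logging process.
open my $fh, '|-', 'cat' or die "Couldn't open pipe: $!\n";
# Dup STDERR onto the pipe: its write end now has two descriptors.
open STDERR, '>&', $fh or die "Couldn't dup STDERR: $!\n";
print $fh "1. print to file handle\n";
print STDERR "2. print to STDERR\n";   # output order may vary due to buffering
close STDERR;   # close the duplicated descriptor first...
close $fh;      # ...so this close sees the child exit and reaps it
exit 0;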

Perl: Tee both merged STDOUT + STDERR and STDERR to separate files without CPAN modules

I have searched the internet for different solutions, and the closest I have come is an exercise in open statements:
use strict;
use warnings;
use Carp;
# Save previous state
my ($old_out, $old_err);
open($old_out, ">&", \*STDOUT)
    or croak("Cannot save STDOUT: $!\n");
open($old_err, ">&", \*STDERR)
    or croak("Cannot save STDERR: $!\n");
# Make a filehandle for the tee output
open(my $fh_OUTLOG, "|-", "tee out_err.log")
    or croak("Cannot open filehandle to tee to log file: $!\n");
# Duplicate STDOUT and STDERR to the filehandle
open(STDOUT, ">&", $fh_OUTLOG)
    or croak("Cannot duplicate STDOUT to filehandle: $!\n");
open(STDERR, ">&", $fh_OUTLOG)
    or croak("Cannot duplicate STDERR to filehandle: $!\n");
# Code...
# Restore STDOUT and STDERR before closing the tee pipe,
# so that tee sees EOF and exits
open(STDOUT, ">&", $old_out)
    or croak("Cannot restore STDOUT: $!\n");
open(STDERR, ">&", $old_err)
    or croak("Cannot restore STDERR: $!\n");
# Close the filehandle (this reaps the tee process)
close $fh_OUTLOG
    or croak( ($!) ? "Error closing fh_OUTLOG: $!"
                   : "Exit status $? from tee out_err.log");
But this code will not produce a separate log file with STDERR, and I don't know how to add that. The criteria for the solution are:
Perl v5.8.3.
No CPAN modules.
No extra Bash command to start the script.
I was thinking of using some tee solutions for STDERR as well:
open(STDERR, "| tee log.err");
open(STDERR, ">&", $fh_OUTLOG);
But using this hangs the script at the end of execution (I think the tee child process is still alive). The script still hangs even if I try to kill the child at the end:
my $pid_std_err = open(STDERR, "| tee log.err")
    or croak("Cannot tee filehandle STDERR to log.err: $!\n");
# Code...
kill -9, $pid_std_err;
Create two pipes and associate them with STDOUT and STDERR.
Create a task (thread, process, etc.) that uses IO::Select to read from those two pipes.
The task places what it reads from those pipes into the appropriate file(s), as sketched below.
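A minimal sketch of that approach using only core modules (the file names out_err.log and log.err come from the question; everything else is illustrative):
#!/usr/bin/perl
use strict;
use warnings;
use IO::Select;
# One pipe per stream.
pipe(my $out_r, my $out_w) or die "pipe: $!";
pipe(my $err_r, my $err_w) or die "pipe: $!";
my $pid = fork();
die "fork: $!" unless defined $pid;
if ($pid == 0) {
    # Child: drain both pipes; write the merged log and the STDERR-only log.
    close $out_w; close $err_w;
    open my $merged, '>', 'out_err.log' or die "out_err.log: $!";
    open my $errlog, '>', 'log.err'     or die "log.err: $!";
    my $sel = IO::Select->new($out_r, $err_r);
    while (my @ready = $sel->can_read) {
        for my $fh (@ready) {
            if (sysread($fh, my $buf, 4096)) {
                print {$merged} $buf;
                print {$errlog} $buf if fileno($fh) == fileno($err_r);
            }
            else {
                $sel->remove($fh);   # EOF on this pipe
            }
        }
    }
    exit 0;
}
# Parent: point STDOUT and STDERR at the write ends of the pipes.
close $out_r; close $err_r;
open STDOUT, '>&', $out_w or die "Cannot dup STDOUT: $!";
open STDERR, '>&', $err_w or die "Cannot dup STDERR: $!";
print STDOUT "goes to out_err.log\n";
print STDERR "goes to out_err.log and log.err\n";
# Close every copy of the write ends so the reader sees EOF, then reap it.
close STDOUT; close STDERR;
close $out_w; close $err_w;
waitpid($pid, 0);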

How can I print to STDOUT in a Perl module if it is redirected in the main script?

I have redirected STDOUT in a Perl script. Everything I print in my module is redirected to a file. Is there a way to restore STDOUT in a Perl module?
Here is my example
require my_module;
open(STDOUT, ">$outlog") || die "Error stdout: $!";
open(STDERR, ">>$outlog") || die "Error stderr: $!";
my_module::my_func();
So I want to print a message to the original STDOUT in my_module::my_func() and exit.
Actually, you can't restore STDOUT unless you have saved it somewhere else first.
You can do the following:
# Save current STDOUT handle in OLDOUT
open (OLDOUT, ">&STDOUT") or die "Can't open OLDOUT: $!";
# Set STDOUT to a your output file
open (STDOUT, ">$youroutputfile") or die "Can't open STDOUT: $!";
# Do whatever you want to do here.......
# ...........
# Close STDOUT output stream
close (STDOUT);
# Reset STDOUT stream to previous state
open (STDOUT, ">&OLDOUT") or die "Can't open STDOUT: $!";
# Close OLDOUT handle
close (OLDOUT);
# Here your previous STDOUT is restored....
:)
It seems I found a solution. First I saved STDOUT in the main script, then I used it in the module.
require my_module;
open(SAVEOUT, ">&STDOUT") || die "Unable to save STDOUT: $!";
open(STDOUT, ">$outlog") || die "Error stdout: $!";
open(STDERR, ">>$outlog") || die "Error stderr: $!";
my_module::my_func();
In my_module::my_func() I added the following lines before exiting:
open (STDOUT, ">&main::SAVEOUT") or die "Unable to restore STDOUT : $!";
print "a_module!!!\n";
My printed message was sent to STDOUT.
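If you can change the module's interface, a variant that avoids the package global SAVEOUT is to keep the saved handle in a lexical and pass it in. A sketch; my_func taking the handle as an argument is an assumption:
# main script
open my $saved_out, '>&', \*STDOUT or die "Unable to save STDOUT: $!";
open STDOUT, '>',  $outlog or die "Error stdout: $!";
open STDERR, '>>', $outlog or die "Error stderr: $!";
my_module::my_func($saved_out);
# in my_module
sub my_func {
    my ($saved_out) = @_;
    print {$saved_out} "a_module!!!\n";   # goes to the original STDOUT
}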

How can I reinitialize Perl's STDIN/STDOUT/STDERR?

I have a Perl script which forks and daemonizes itself. It's run by cron, so in order not to leave a zombie around, I shut down STDIN, STDOUT, and STDERR:
open STDIN, '/dev/null' or die "Can't read /dev/null: $!";
open STDOUT, '>>/dev/null' or die "Can't write to /dev/null: $!";
open STDERR, '>>/dev/null' or die "Can't write to /dev/null: $!";
if (!fork()) {
    do_some_fork_stuff();
}
The question I have is: I'd like to restore at least STDOUT after this point (it would be nice to restore the other two). But what magic symbols do I need to use to re-open STDOUT as what STDOUT used to be?
I know that I could use "/dev/tty" if I was running from a tty (but I'm running from cron and depending on stdout elsewhere). I've also read tricks where you can put STDOUT aside with open SAVEOUT,">&STDOUT", but just the act of making this copy doesn't solve the original problem of leaving a zombie around.
I'm looking to see if there's some magic like open STDOUT,"|-" (which I know isn't it) to open STDOUT the way it's supposed to be opened.
# take a copy of the file descriptor
open(CPERR, ">&STDERR");
# redirect stderr into the log file
open(STDERR, ">>xyz.log") || die "Error stderr: $!";
# close the redirected filehandle
close(STDERR) || die "Can't close STDERR: $!";
# restore stderr
open(STDERR, ">&CPERR") || die "Can't restore stderr: $!";
If it's still useful, two things come to mind:
You can close STDOUT/STDERR/STDIN in just the child process (i.e. inside if (!fork()) { ... }). This will allow the parent to still use them, because they'll still be open there.
I think you can use the simpler close(STDOUT) instead of opening it to /dev/null.
For example:
if (!fork()) {
    close(STDIN)  or die "Can't close STDIN: $!\n";
    close(STDOUT) or die "Can't close STDOUT: $!\n";
    close(STDERR) or die "Can't close STDERR: $!\n";
    do_some_fork_stuff();
}
Once closed, there's no way to get it back.
Why do you need STDOUT again? To write messages to the console? Use /dev/console for that, or write to syslog with Sys::Syslog.
Honestly though, the other answer is correct. You must save the old stdout (cloned to a new fd) if you want to reopen it later. It does solve the "zombie" problem, since you can then redirect fd 0 (and 1 & 2) to /dev/null.
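A minimal sketch of that save-and-restore approach (the daemonized work itself is omitted):
# Clone STDOUT to a new file descriptor before redirecting the standard handles.
open my $saved_out, '>&', \*STDOUT or die "Can't dup STDOUT: $!";
open STDIN,  '<',  '/dev/null' or die "Can't read /dev/null: $!";
open STDOUT, '>>', '/dev/null' or die "Can't write to /dev/null: $!";
open STDERR, '>>', '/dev/null' or die "Can't write to /dev/null: $!";
# ... daemonized work ...
# Restore STDOUT from the saved clone.
open STDOUT, '>&', $saved_out or die "Can't restore STDOUT: $!";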