Perl Cleartool Accept Statement

I am using the following code to execute the ClearCase statement "Accept" from a perl script.
$acceptA = `accept $Component`;
After execution, inside my perl script, the value of $acceptA is blank.
The text displayed on the screen during execution of this line is: "ERROR You do not have permissions to ACCEPT this work."
How do I read this line? I thought it would be returned in the variable $acceptA, as it is with the "cleartool checkin" command.

As I do not know ClearCase or how that accept command works, I can only guess. Since it is an error message, it is probably written to STDERR rather than STDOUT, and backticks only capture the STDOUT of the executed command.
In that case, redirecting the command's STDERR to STDOUT should work. Try
$acceptA = `accept $Component 2>&1`;
and see whether that captures the output in the error case as well.
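A small sketch of that, also checking the exit status via $? (accept and $Component are taken straight from the question; I don't know ClearCase, so the command name itself is the question's):
# Capture STDOUT and STDERR of the command in one string.
my $acceptA = `accept $Component 2>&1`;
# After backticks, $? holds the child's status; non-zero usually means failure,
# e.g. the "ERROR You do not have permissions to ACCEPT this work." case.
if ($? != 0) {
    warn "accept failed for $Component:\n$acceptA";
}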

I eventually redirected STDERR to an output file which I could read and write.
open STDERR, ">/ellipse/el6.3.3_ry_sup/src/0/$logfile" or die "Cannot redirect STDERR: $!";
All the error messages that were displayed on the screen as part of the system command went into $logfile.
I was also able to write to STDERR myself with the following:
print STDERR "\nAccepting $Component";
Thanks for all the help.
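A condensed sketch of that approach, reusing the path, $logfile, and $Component from the posts above:
# Route this script's STDERR (inherited by child commands) into a log file,
# so both our own messages and the command's error output land there.
open STDERR, ">/ellipse/el6.3.3_ry_sup/src/0/$logfile"
    or die "Cannot redirect STDERR: $!";

print STDERR "\nAccepting $Component";   # our own progress note
my $acceptA = `accept $Component`;       # the command's error text now goes to $logfile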

Related

I have a system call that somehow is causing a "malformed header from script: Bad header" error.

I have a Perl CGI program. myperlcgi.pl
Within this program I have the following:
my $retval = system('extprogram');
extprogram has a print statement within.
The output from extprogram is being included in the response generated by myperlcgi.pl.
I tried adding
> output.txt 2>&1
to the system call, but it did not change anything.
How do I prevent the output from extprogram being included in myperlcgi.pl's output?
I am surprised that stdout from the system call ends up in myperlcgi.pl's response.
The system command just doesn’t give you complete control over capturing STDOUT and STDERR of the executed command.
Use backticks or open to execute the command instead. That will capture the STDOUT of the command’s execution. If the command also outputs to STDERR, then you can append 2>&1 to redirect STDERR to STDOUT for capture in backticks, like so:
my $output = `$command 2>&1`;
If you really need the native return status of the executed command, you can get that information using $? or ${^CHILD_ERROR_NATIVE}. See perldoc perlvar for details.
Another option is to use the IPC::Open3 Perl library, but I find that method to be overkill for most situations.
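Applied to the question, a minimal sketch might look like this (extprogram is the question's placeholder name; logging via warn is just one option):
# Capture extprogram's STDOUT and STDERR so nothing leaks into the CGI response.
my $output = `extprogram 2>&1`;
my $status = $? >> 8;            # extprogram's exit status

if ($status != 0) {
    # Log instead of printing; printing before the CGI header causes "malformed header" errors.
    warn "extprogram exited with status $status: $output";
}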

How to hide the output of a command run inside system()?

I have looked all over and found plenty of answers, and after much trial and error, I have come to this conclusion: I don't know.
I am running a thing in Perl:
my $command = sprintf('commandHereDoingSCPishThingsThatHasToRunInMyShell', options, options, options);
system($command);
When I run my script in my terminal window, I see the output of the command inside the sprintf() function.
I would like for the output to be re-directed to a file, and to not have to see the program executing on my terminal window.
Thanks!
Raw use of perl's system() is discouraged because it isn't trivial to detect errors and come up with a relevant error message. Consider using IPC::System::Simple. Since you want to capture the output, use its capture().
use IPC::System::Simple 'capture';
use File::Slurp ();
my $output = capture($command); # dies if command fails; catch this with eval if necessary
File::Slurp::write_file('somefile', $output);
Alternatively, just redirect the output to a file in your command:
use IPC::System::Simple 'system';
system("$command >somefile");

Perl - impdp from perl

I am not able to redirect the output of the impdp command, run from Perl's system(), to a file.
#!/usr/bin/perl
$a="impdp GLS_UCELL_80TC5_DEV/comverse directory=DUMP_DIR_1028704 dumpfile=ACCESS_REGION_VALUES.dmp CONTENT=data_only";
system("$a 1>t.tmp");
However, upon changing the system command to system("$a 2>t.tmp"), t.tmp gets created.
I want to understand why the redirection 1>t.tmp is not working in my case.
1>t.tmp (or just >t.tmp) redirects STDOUT. Anything printed to STDOUT will end up in the file.
2>t.tmp redirects STDERR. Anything printed to STDERR will end up in the file.
You want to capture what your program writes to STDERR, so you need to use the latter, or >t.tmp 2>&1 which redirects both.
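With the question's $a command string, that would be something like:
# impdp writes its log messages to STDERR, so send both streams to the file.
system("$a >t.tmp 2>&1");
die "impdp failed with exit status " . ($? >> 8) . "\n" if $? != 0;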

How can I suppress system output when using nohup from Perl?

In Perl I am starting a process using the nohup command. The command is below:
system("nohup myproc pe88 &");
This works fine and the process starts as expected. However, I would like to suppress the following output of this command, which is:
Sending output to nohup.out
I must have this process redirect all of its output to nohup.out, but I just don't want that message displayed when I run my Perl program. I want instead to print my own user-friendly message. I've tried a few variants but nothing has worked for me yet.
"Sending output to nohup.out" message is sent to STDERR, so you can catch the STDERR via the usual methods
either via shell: system("nohup myproc pe88 2> /tmp/error_log.txt &");
Use /dev/null instead of /tmp/error_log.txt if you don't need stderr at all; and add "> /tmp/myout.txt" to redirect stdout.
Or by capturing it via Perl (don't use the system() call; instead use IPC::Open3 or the capture command from IPC::System::Simple).
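For example, combining those two suggestions (myproc pe88 and the /tmp paths are the question's and answer's placeholders):
# stdout of myproc goes to a file; stderr (including nohup's message) is discarded.
system("nohup myproc pe88 > /tmp/myout.txt 2> /dev/null &");
print "myproc started; its output is being written to /tmp/myout.txt\n";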
How about:
system("nohup myproc pe88 >nohup.out 2>&1 &");
The man page for nohup says:
If standard output is a terminal, append output to 'nohup.out' if possible, '$HOME/nohup.out' otherwise. If standard error is a terminal, redirect it to standard output. To save output to FILE, use `nohup COMMAND > FILE'.
So if you explicitly redirect STDOUT and STDERR to nohup.out, then nohup doesn't print that message. Granted, you don't get the automatic fallback to $HOME/nohup.out if nohup.out is unwritable, but you can check for that first if that's an issue.
Note that if you redirect just STDOUT, nohup prints a "redirecting stderr to stdout" message.
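A short sketch of that, with the user-friendly message the question asks for (myproc pe88 is the question's command):
print "Starting myproc in the background; output will go to nohup.out\n";
# With both streams explicitly redirected, nohup has nothing to announce.
system("nohup myproc pe88 >nohup.out 2>&1 &");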

wget not behaving via IPC::Open3 vs bash

I'm trying to stream a file from a remote website to a local command and am running into some problems when trying to detect errors.
The code looks something like this:
use IPC::Open3;
my @cmd = ('wget','-O','-','http://10.10.1.72/index.php'); # any website will do here
my ($wget_pid,$wget_in,$wget_out,$wget_err);
if (!($wget_pid = open3($wget_in,$wget_out,$wget_err,@cmd))){
print STDERR "failed to run open3\n";
exit(1);
}
close($wget_in);
my @wget_outs = <$wget_out>;
my @wget_errs = <$wget_err>;
print STDERR "wget stderr: ".join('',@wget_errs);
# page and errors are printed on the next line, which seems wrong
print STDERR "wget stdout: ".join('',@wget_outs);
# clean-up after this; not shown: running the filtering command, closing and waitpid'ing
When I run that wget command directly from the command-line and redirect stderr to a file, something sane happens - the stdout will be the downloaded page, the stderr will contain the info about opening the given page.
wget -O - http://10.10.1.72/index.php 2> stderr_test_file
When I run wget via open3, I'm getting both the page and the info mixed together in stdout. What I expect is the loaded page in one stream and STDERR from wget in another.
I can see I've simplified the code to the point where it's not clear why I want to use open3, but the general plan is that I wanted to stream stdout to another filtering program as I received it, and then at the end I was going to read the stderr from both wget and the filtering program to determine what, if anything, went wrong.
Other important things:
I was trying to avoid writing the wget'd data to a file, then filtering that file to another file, then reading the output.
It's key that I be able to see what went wrong, not just reading $? >> 8 (i.e. I have to tell the user, hey, that IP address is wrong, or isn't the right kind of website, or whatever).
Finally, I'm choosing system/open3/exec over other perl-isms (i.e. backticks) because some of the input is provided by untrustworthy users.
You are passing an undefined value as the error handle argument to open3, and as IPC::Open3 says:
If CHLD_ERR is false, or the same file descriptor as CHLD_OUT, then STDOUT and STDERR of the child are on the same filehandle (this means that an autovivified lexical cannot be used for the STDERR filehandle, see SYNOPSIS) ...
The workaround is to initialize $wget_err to something before calling open3:
use Symbol qw(gensym);
my ($wget_pid, $wget_in, $wget_out, $wget_err);
$wget_err = gensym();
if (!($wget_pid = open3( ... ))) { ...
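Putting the fix together with the question's code, a minimal sketch (same wget command line as above; note that reading the two streams one after the other can still block if the command writes a lot to STDERR before STDOUT is drained):
use IPC::Open3;
use Symbol qw(gensym);

my @cmd = ('wget', '-O', '-', 'http://10.10.1.72/index.php');
my ($wget_in, $wget_out);
my $wget_err = gensym();   # pre-created handle so STDERR gets its own stream

my $wget_pid = open3($wget_in, $wget_out, $wget_err, @cmd)
    or die "failed to run open3: $!";
close($wget_in);

my @wget_outs = <$wget_out>;   # the downloaded page
my @wget_errs = <$wget_err>;   # wget's progress and error messages
waitpid($wget_pid, 0);

print STDERR "wget stderr: ", join('', @wget_errs);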