How to capture STDOUT from executable (cap) executed within a perl script executed from a crontab - perl

Whew that is a long-winded title. But it explains my issue:
I have a crontab that runs a perl script.
That perl script runs a cap task, which outputs to STDOUT some status messages.
The perl script is supposed to capture the STDOUT (currently using backticks) from cap and parse it.
Now, this works 100% fine when I run the script by hand from a bash shell. However, when I run the script from a crontab, the perl script doesn't capture any output from the cap task.
Has anyone dealt with anything like this before? Thanks.

Maybe your cap executable died without writing anything to stdout. Did you check the exit status of the command?
Could you try this?
$check_result = `$cmd 2>&1`;
if ($?) {
    die "$cmd failed with status @{[$? >> 8]}: $check_result";
}
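Cron also starts with a minimal environment, so `cap` may not even be found on PATH, and its error message goes to stderr, which plain backticks don't capture. A fuller sketch of the answer above, using `echo` as a harmless stand-in for the cap invocation (the PATH value is an assumption; adjust it and the command for your install):

```perl
use strict;
use warnings;

# cron starts with a minimal environment, so set PATH explicitly;
# cap is commonly in /usr/local/bin (an assumption -- adjust).
$ENV{PATH} = '/usr/local/bin:/usr/bin:/bin';

# 'echo deployed ok' stands in for the real cap task here.
my $cmd = 'echo deployed ok';

# Capture stdout AND stderr so failure messages are not silently lost.
my $output = `$cmd 2>&1`;
die "'$cmd' failed with status @{[$? >> 8]}: $output" if $?;

print $output;    # parse the captured status messages here
```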

Related

Perl executing an exe using system()?

I am not able to run the exe using Perl code.
my $XSD = "C:\\IParser\\Iparser.exe --xsds \"$dir\\${out}_xsd.xml\"";
system($XSD);
The above $XSD command executes fine when run at a command prompt. When I run it through the Perl code, it shows the error:
the handle is invalid
I don't know what the problem is. Please help.
This could be because of closed handles to STDIN, STDOUT and STDERR (or any of them). Most probably if you run this as a daemon or from service.
Try opening the standard handles before running the process (assuming you're on windows according to path):
open(STDIN, "<NUL");
open(STDOUT, ">NUL");
open(STDERR, ">NUL");
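The same idea can be sketched portably with File::Spec, which picks the right null device name on both Windows (NUL) and Unix (/dev/null):

```perl
use strict;
use warnings;
use File::Spec;

# Null device name: 'NUL' on Windows, '/dev/null' elsewhere.
my $devnull = File::Spec->devnull;

# A daemon/service may have closed the standard handles; reopen them
# so child processes inherit valid handles instead of failing.
open STDIN,  '<', $devnull or die "Cannot reopen STDIN: $!";
open STDOUT, '>', $devnull or die "Cannot reopen STDOUT: $!";
open STDERR, '>', $devnull or die "Cannot reopen STDERR: $!";
```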

Perl -- command executing inside a script hangs

When I run the following script, it does exactly what I want it to do and exits:
setDisplay.sh:
#!/bin/bash
Xvfb -fp /usr/share/fonts/X11/misc/ :22 -screen 0 1024x768x16 2>&1 &
export DISPLAY=:22
When I run ./setDisplay.sh, everything works fine.
OK, here's where the fun starts...
I have a Perl script that calls setDisplay...
Here is the eamorr.pl script:
#!/usr/bin/perl
use strict;
use warnings;
my $homeDir="/home/eamorr/Dropbox/site/";
my $cmd;
my $result;
print "-----Setting display...\n";
$cmd="sh $homeDir/setDisplay.sh";
print $cmd."\n";
$result=`$cmd`;
print $result;
It just hangs when I run ./eamorr.pl
I'm totally stuck...
When you do this:
$result=`$cmd`;
a pipe is created connecting the perl process to the external command, and perl reads from that pipe until EOF.
Your external command creates a background process which still has the pipe on its stdout (and also its stderr since you did 2>&1). There will be no EOF on that pipe until the background process exits or closes its stdout and stderr or redirects them elsewhere.
If you intend to collect the stdout and stderr of Xvfb into the perl variable $result, you'll naturally have to wait for it to finish. If you didn't intend that, I can't guess what you were trying to do with the 2>&1.
Also a script that ends with an export command is suspect. It can only modify its own environment, and then it immediately exits so there's no noticeable effect. Usually that's a sign that someone is trying to modify the parent process's environment, which is not possible.

Perl not executing command when in debugger or as a Win32::Daemon

Synopsis
I execute a shell command from Perl and when run from the command line it works, but when run in the debugger it does not work. Running it as a Win32::Daemon shows the same behaviour.
The Source Code
I execute a command either with backticks
print `$cmd`
or like this:
open FH, "$cmd |" or die "Couldn't execute $cmd: $!\n";
while(defined(my $line = <FH>)) {
chomp($line);
print "$line\n";
}
close FH;
The command reads like this:
$cmd = '"C:\path\to\sscep.exe" getca -f "C:\path\to\config\capi_sscep.cnf"'
Even a small test script that does nothing but execute this command only works when run from the command line.
The System
Windows x64
Active Perl v5.16.0, MSWin32-x64-multi-thread
Eclipse Juno 20120614-1722
What works
It works to open an administrator prompt (necessary for script execution) and run:
perl script.pl
Output gets printed to screen, $? is 0.
What does not work
Starting Eclipse and running a debug session with the same perl script.pl call.
Also not working is adding a service and executing the command (created with Win32::Daemon). The daemon itself is working perfectly fine and starting the perl script as expected. Only the command does not get executed. $? is 13568 or 53 if shifted with $? >> 8, no output gets printed. The exit code does not belong to the program.
Further Details
The tool I am calling is called sscep and is extended by me. It uses the OpenSSL API and loads the capi engine (Windows CryptoAPI). But the command itself does at least print output before any serious action starts. I can happily provide the source code for this, but I doubt it will help.
I was able to narrow this further down: The problem only exists in the combination of the Perl program (CertNanny) and the binary (sscep). Calling dir inside CertNanny works, calling sscep in a test Perl script works, too. So what could possibly be done in Perl to break a single binary from being called...?
Any ideas where this problem might originate from or how I can possibly narrow it down?
Here is what I believe the problem to be: when you run your program on the command line, the system() command goes through the shell (cmd.exe); when you run your program elsewhere, it does not. Unfortunately, the two methods handle command line arguments differently. Here is an article that seems like it should help you solve the problem.
In my experience, this sort of thing is a mess in Windows. I have had trouble with this issue in Perl, also.
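If shell quoting is indeed the culprit, passing system() a list avoids the shell entirely, so the arguments reach the program untouched. A minimal sketch, with $^X (the running perl binary) standing in for sscep.exe:

```perl
use strict;
use warnings;

# List-form system() bypasses cmd.exe, so its quoting rules never
# apply. For the real call, substitute something like:
#   system('C:\path\to\sscep.exe', 'getca',
#          '-f', 'C:\path\to\config\capi_sscep.cnf');
my $status = system($^X, '-e', 'print "ok\n"');
die "command failed: exit status @{[$? >> 8]}" if $status != 0;
```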

How to set a crontab job for a perl script

I have a Perl script which I want to run every 4 hours through cron. But somehow it fails to execute through cron and runs fine if I run it through command line. Following is the command which I set in crontab:
perl -q /path_to_script/script.pl > /dev/null
Also, when I run this command at a command prompt it does not execute, but when I cd into the leaf folder under path_to_script and execute the file there, it runs fine.
Also, where will the log files of this cron job be created so that I can view them?
You should probably change the working directory to "leaf folder".
Try this in your crontab command:
cd /path_to_script; perl script.pl >/dev/null
Regarding log files: cron will mail you the output. But since you sent stdout to /dev/null, only stderr will be mailed to you.
If you want the output saved in a log file, redirect both stdout and stderr of the script into the file, like so (the file redirection must come first, then 2>&1):
cd /path_to_script; perl script.pl >my_log_file 2>&1
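Putting the pieces together, a crontab entry that runs every 4 hours, fixes the working directory, and logs both streams might look like this (the PATH value and log location are illustrative):

```
PATH=/usr/bin:/bin
0 */4 * * * cd /path_to_script && ./script.pl >> /var/log/script.log 2>&1
```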
Usually cron will send you mail with the output of your program. When you're figuring it out, you probably want to check the environment. It won't necessarily be the same environment as your login shell (since it's not a login shell):
foreach my $key ( keys %ENV ) {
    print "$key: $ENV{$key}\n";
}
If you're missing something you need, set it in your crontab:
SOME_VAR=some_value
HOME=/Users/Buster
If you need to start in a particular directory, you should chdir there. The starting directory from a cron job probably isn't what you think it is. Without an argument, chdir changes to your home directory. However, sometimes those environment variables might not be set in your cron session, so it's probably better to have a default value:
chdir( $ENV{HOME} || '/Users/Buster' );
At various critical points, you should give error output. This is a good thing even in non-cron programs:
open my $fh, '<', $some_file or die "Didn't find the file I was expecting: $!";
If you redirect things to /dev/null, you lose all that information that might help you solve the problem.
It looks like you may have missed the
#!/usr/bin/perl
line at the start of your Perl script, which is why you need perl -q to run it.
Once you have added that line, you can run the script directly from the command line:
/path_to_script/script.pl
If you use a command in your Perl program, I advise you to put the full path to the command in your program.
I tried loading the environment, but it did not help.
After discussing it with a colleague, I think it comes from an interaction between Perl and the system environment.
Best regards,
Moustapha Kourouma

How do I redirect the output of Perl script executed from within another perl script to a file?

I am running a perl script via crontab and redircting its output to a file:
30 1 * * * /full/path/to/my_script.pl >> /full/path/to/my_log_file
Within my_script.pl, I'm executing several other perl scripts via the system() command:
#!/usr/bin/env perl
system( "/full/path/to/another_script.pl" );
And within those scripts, I am using 'print' to write to STDOUT:
#!/usr/bin/env perl
print "Standard output...\n";
It appears, however, that none of the child scripts' output is getting redirected to my_log_file. The only output I see there is that of the parent Perl script. Am I missing something obvious? This is on a Linux system.
Instead of system(), use qx:
print qx(/full/path/to/another_script.pl);
Hmmm, if using system() then STDOUT should end up back in your log.
Do you have an example?
My immediate thought is that the system() call isn't actually running the other scripts. Do you use a full path to each script? Remember cron won't have the same $PATH that your shell has, so it may not find the scripts you are trying to run unless you give the full path.
You could also capture STDERR in your log (note that 2>&1 must come after the file redirection, or stderr will not reach the file):
30 1 * * * /my_script.pl >> my_log_file 2>&1
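The underlying behaviour is easy to verify: a child started with system() inherits the parent's stdout, so its prints land wherever the cron line redirects. A minimal sketch, with a perl one-liner ($^X is the running perl) standing in for another_script.pl:

```perl
use strict;
use warnings;

# The child inherits our stdout; whatever it prints goes to the same
# place our own prints go (e.g. the cron log, once redirected there).
my $status = system($^X, '-e', 'print "child output\n"');
die "child failed: status @{[$? >> 8]}" if $status != 0;
```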