How to set a crontab job for a perl script - perl

I have a Perl script which I want to run every 4 hours through cron. But somehow it fails to execute through cron, while it runs fine if I run it from the command line. The following is the command I set in crontab:
perl -q /path_to_script/script.pl > /dev/null
Also, when I run this command at the command prompt, it does not execute, but when I go into the leaf folder in path_to_script and execute the file, it runs fine.
Also, where will the log files of this cron job be created so that I can view them?

You should probably change the working directory to "leaf folder".
Try this in your crontab command:
cd /path_to_script; perl script.pl >/dev/null
Regarding log files: cron will mail you the output. But since you sent stdout to /dev/null, only stderr will be mailed to you.
If you want the output saved in a log file, redirect both stdout and stderr of the script into a file, like so:
cd /path_to_script; perl script.pl >my_log_file 2>&1
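For reference, a complete crontab entry along these lines that runs every 4 hours and keeps a log might look like this (the log path is just an example):
0 */4 * * * cd /path_to_script && perl script.pl >>/path_to_script/cron.log 2>&1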

Usually cron will send you mail with the output of your program. When you're figuring it out, you probably want to check the environment. It won't necessarily be the same environment as your login shell (since it's not a login shell):
foreach my $key ( keys %ENV ) {
    print "$key: $ENV{$key}\n";
}
If you're missing something you need, set it in your crontab:
SOME_VAR=some_value
HOME=/Users/Buster
If you need to start in a particular directory, you should chdir there. The starting directory from a cron job probably isn't what you think it is. Without an argument, chdir changes to your home directory. However, sometimes those environment variables might not be set in your cron session, so it's probably better to have a default value:
chdir( $ENV{HOME} || '/Users/Buster' );
At various critical points, you should give error output. This is a good thing even in non-cron programs:
open my $fh, '<', $some_file or die "Didn't find the file I was expecting: $!";
If you redirect things to /dev/null, you lose all that information that might help you solve the problem.
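As a rough sketch, you could also log what cron actually sees rather than discarding it (the log path here is only an example):

use Cwd qw(getcwd);

# Append the working directory and environment of each run to a log file,
# so you can compare the cron run with an interactive run.
open my $log, '>>', '/tmp/cronjob.log' or die "Can't open log: $!";
print {$log} scalar(localtime), "  cwd=", getcwd(), "\n";
print {$log} "  $_=$ENV{$_}\n" for sort keys %ENV;
close $log;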

It looks like you may have missed the
#!/usr/bin/perl
line at the start of your Perl script, which is why you might need perl -q to run it.
Once you have added that line you can run it directly from the command line using
/path_to_script/script.pl
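If you still cannot run it directly, also check that the script has execute permission, for example:
chmod +x /path_to_script/script.pl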

If you use an external command in your Perl program, I advise you to put the full path to that command in your program.
I have tried loading the environment, but it did not help.
After reviewing this with a colleague, I think the problem comes from the interaction between Perl and the system environment.
Best regards,
Moustapha Kourouma
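As a minimal sketch of that advice (the command path below is only an example; check the real location with which):

# Call external commands by absolute path so cron's limited PATH
# doesn't matter. /usr/bin/uptime is just a placeholder command.
system('/usr/bin/uptime') == 0
    or die "command failed: $?";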

Related

How to capture STDOUT from executable (cap) executed within a perl script executed from a crontab

Whew that is a long-winded title. But it explains my issue:
I have a crontab that runs a perl script.
That perl script runs a cap task, which outputs to STDOUT some status messages.
The perl script is supposed to capture the STDOUT (currently using backticks) from cap and parse it.
Now, this works 100% fine when I run the script from a bash user. However, when I run the script from a crontab, the perl script doesn't capture any output from the cap task.
Has anyone dealt with anything like this before? Thanks.
Maybe your cap executable died without emitting any message to stdout. Did you check the exit status of the execution?
Could you try this?
my $check_result = `$cmd 2>&1`;
if ($?) {
    die "$cmd failed with $check_result, $!";
}
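One common culprit is cron's minimal environment: cap may not even be on PATH under cron, so the backticks capture nothing. A rough sketch of working around that (the PATH value and the cap task name are assumptions for illustration):

# Give the cron job the PATH your interactive shell has, then capture
# both stdout and stderr from the cap task.
$ENV{PATH} = '/usr/local/bin:/usr/bin:/bin';

my $output = `cap deploy 2>&1`;
die "cap failed ($?): $output" if $?;
print $output;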

Persistent effects of modifying process environment via system

I am making a few calls to the system, mainly cd commands, as certain functions need to be called from certain directories on my system. However, I have noticed that once a call is finished, the effects of that call are lost.
For example, let's say that I start in /home/project and then call:
system("setenv home/project/env/NeededEnvironment");
system("make cfile.o");
The second system call doesn't know about the first call setting the environment needed for the file to compile. I have tried putting them into one system call separated by ; as well, but I have the same problem. Is there any way to get the effect of the first call to be saved?
That is how system works: it creates a subshell to execute your command, and when the command is complete, the subshell exits leaving your perl process unaffected.
Section 8 of the Perl FAQ also answers this question.
I {changed directory, modified my environment} in a perl script. How come the change disappeared when I exited the script? How do I get my changes to be visible?
Unix
In the strictest sense, it can't be done—the script executes as a different process from the shell it was started from. Changes to a process are not reflected in its parent—only in any children created after the change. There is shell magic that may allow you to fake it by eval()ing the script's output in your shell; check out the comp.unix.questions FAQ for details.
You want code along the lines of
system("cd /home/project/env/NeededEnvironment && make cfile.o") == 0
    or warn "$0: make failed";
or use the -C option to make and avoid shell argument parsing, as in
system("make", "-C", "/home/project/env/NeededEnvironment", "cfile.o") == 0
    or warn "$0: make failed";
If you are writing a Perl script, use Perl itself and shell-out as rarely as possible.
If you need to change your directory:
chdir 'some/other/dir';
If you need to set an environment variable:
$ENV{ SOME_VAR } = 'Some value';
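Putting those two pieces together, a minimal sketch along the lines of the question (the environment variable name here is hypothetical):

chdir '/home/project/env/NeededEnvironment' or die "chdir failed: $!";
$ENV{NEEDED_VAR} = 'needed value';   # hypothetical variable the build needs
# Both the cwd and the variable are now visible to child processes:
system('make', 'cfile.o') == 0 or warn "$0: make failed";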
Update
Here are some more commands where the shell equivalent should not be used:
mkdir
unlink
rmdir
Modules everyone should know about:
File::Copy
File::Path
File::Basename
File::Spec
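For instance, a rough sketch of the pure-Perl equivalents of the shell commands above (file names are placeholders):

use File::Copy qw(copy);
use File::Path qw(make_path remove_tree);

make_path('build/output');                   # mkdir -p build/output
copy('data.txt', 'build/output/data.txt')
    or die "copy failed: $!";                # cp data.txt build/output/
unlink 'build/output/old.txt' or warn "unlink failed: $!";   # rm old.txt
remove_tree('build/tmp');                    # rm -r build/tmp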

Running a program that requires a password on the command line from a Perl script

I have written a Perl wrapper around a shell script. I am using IPC::Run::Simple to execute system commands. As an example:
run ("mkdir ~$usr/12.2.0_cp/faiz_cpv/$pdate") or die "Error $ERR";
run ("cp ~$usr/12.2.0_cp/faiz_cpv/MPlist.lst ~$usr/12.2.0_cp/faiz_cpv/$pdate") || die "Error: $ERR";
run ("cd ~$usr/12.2.0_cp/faiz_cpv/$pdate; sh /opsutils/mfg_top/rel/CPV/bin/list_generation.sh . MPlist.lst mfg_relall_us\#oracle.com") or die "error $ERR";
.
.
One of these shell scripts requires the user of the script to enter their password. That is, a message is printed on stdout and the password is accepted via the shell. A number of calls are made to this shell script during the entire process which means a user must reenter his password a number of times.
Is there a way by which I can request user for the password at the command line itself, and pass that password implicitly instead of prompting user for the password again and again?
Perl has mkdir and chdir built in, and File::Copy provides a copy routine. It's generally safer and faster to use them than shelling out, though they will not translate ~ for you. File::chdir makes changing a directory and running a command a little safer.
For the rest, use the full IPC::Run to control interacting with your program and Term::ReadLine::Gnu to read the password without displaying it. Sorry this is just a sketch and not a full answer.
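As a partial sketch of the password half (using Term::ReadKey here rather than Term::ReadLine::Gnu for brevity; how the password is then handed to the shell script depends on that script, and the environment variable below is purely hypothetical):

use Term::ReadKey;

# Prompt once and read the password without echoing it.
print 'Password: ';
ReadMode('noecho');
chomp(my $password = <STDIN>);
ReadMode('restore');
print "\n";

$ENV{SCRIPT_PASSWORD} = $password;   # hypothetical; the script must read it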

How to grab the Unix command that I had run using Perl script?

As I am still new to Unix and Perl, I'm looking for a simple and direct method to grab the Unix commands that I have run, using a Perl script.
What I know is that "history" can track back the commands I have run, but running history through backticks in Perl does not work.
I also tried putting "history > filename" into a temporary file with vi and running it with the "source" command, which works interactively, but "source" does not work from a Perl script through backticks either.
Can anyone guide me on my problem and point me to the correct method to solve it? T.T
Thanks.
You can't. Shells (well, bash and tcsh, anyway, your shell might, but probably doesn't, vary) only save command history in interactive mode. Commands run in a subshell by a perl script won't be added to the history file.
This will get the history of commands that were run by the user in interactive mode:
$data_file = "~/.bash_history";
open(DAT, $data_file) || die("Could not open file!");
#fileData = <DAT>;
close(DAT);
foreach $command (#fileData) {
# Do things here.
}
As mentioned by Wobble, though, this history file will not include commands run from a Perl script - you'll have to have the script append each command to a file when it runs it, thus creating its own history file (or append to ~/.bash_history, which will make it share the history file with interactive shells).
If you have access to the perl script (that is, you can change it), you can simply write each command run in the perl script to a chosen text file:
sub run_program
{
    my $program = shift;
    open PROGS, '>>', 'my-commands.txt' or die $!;
    print PROGS $program . "\n";
    `$program`;
    close(PROGS);
}
then just call run_program($command) every time you wish to run a command in the script.

Missing output when running system command in perl/cgi file

I need to write a CGI program and it will display the output of a system command:
script.sh
echo "++++++"
VAR=$(expect -c " spawn ssh -o StrictHostKeyChecking=no $USER#$HOST $CMD match_max
100000 expect \"*?assword:*\" send -- \"$PASS\r\" send -- \"\r\" expect eof ")
echo $VAR
echo "++++++"
In CGI file:
my $command= "ksh ../cgi-bin/script.sh";
my #output= `$command`;
print #output;
Finally, when I run the CGI file on Unix, $VAR is a very long string including \n and some delimiters. However, when I run it via the web server, the output is
++++++
++++++
So $VAR goes missing when the script runs through the web interface/browser.
I suspect the problem may be that $VAR is a very long string.
But anyway, is there any way to solve this problem other than writing the output to a file and then retrieving it from the browser?
Thanks if you are interested in my question.
script.sh uses several environment variables: $USER, $HOST, $CMD and $PASS. The CGI environment will have different environment variables set than a login shell. You may need to set these variables from your CGI script before calling script.sh.
Try finding where the commands you are calling, such as expect and ssh, live on your system, and add their directory paths to the PATH used by your script.
For example, if
which expect
returns /usr/bin/expect, then add the line:
PATH=$PATH:/usr/bin && export PATH
near the beginning of the ksh script. While debugging, you may also want to redirect stderr to a file by appending 2>/tmp/errors.txt to the end of your command, since stderr is not shown in the browser.
my $command= "ksh ../cgi-bin/script.sh 2>/tmp/errors.txt";