Persistent effects of modifying process environment via system - perl

I am making a few calls to the system, mainly cd commands, as certain functions need to be called from certain directories on my system. However, I have noticed that once a call is finished, the effects of that call are lost.
For example, let's say that I start in /home/project and then call:
system("setenv home/project/env/NeededEnvironment");
system("make cfile.o");
The second system call doesn't know about the first call setting the environment needed for the file to compile. I have tried putting them into one system call separated by ; as well, but I have the same problem. Is there any way to get the effect of the first call to be saved?

That is how system works: it creates a subshell to execute your command, and when the command is complete, the subshell exits, leaving your Perl process unaffected.
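A minimal sketch of the effect (assuming a POSIX shell is available):
use Cwd qw(getcwd);

system('cd /tmp');        # the cd happens inside a short-lived child shell
print getcwd(), "\n";     # the Perl process is still in its original directory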
Section 8 of the Perl FAQ also answers this question.
I {changed directory, modified my environment} in a perl script. How come the change disappeared when I exited the script? How do I get my changes to be visible?
Unix
In the strictest sense, it can't be done—the script executes as a different process from the shell it was started from. Changes to a process are not reflected in its parent—only in any children created after the change. There is shell magic that may allow you to fake it by eval()ing the script's output in your shell; check out the comp.unix.questions FAQ for details.
You want code along the lines of
system("cd /home/project/env/NeededEnvironment && make cfile.o") == 0
or warn "$0: make failed";
or use the -C option to make and avoid shell argument parsing as in
system("make", "-C", "/home/project/env/NeededEnvironment", "cfile.o") == 0
or warn "$0: make failed";

If you are writing a Perl script, use Perl itself and shell out as rarely as possible.
If you need to change your directory:
chdir 'some/other/dir';
If you need to set an environment variable:
$ENV{ SOME_VAR } = 'Some value';
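Putting both together, here is a sketch of the original two steps done in Perl itself (assuming NeededEnvironment is the directory the build must run from, and SOME_VAR stands in for whatever that environment script sets):
chdir '/home/project/env/NeededEnvironment' or die "chdir failed: $!";
$ENV{SOME_VAR} = 'some value';   # inherited by every child process started afterwards
system('make', 'cfile.o') == 0 or die "make failed";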
Update
Here are some more commands where the shell equivalent should not be used:
mkdir
unlink
rmdir
Modules everyone should know about:
File::Copy
File::Path
File::Basename
File::Spec
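As a hedged sketch of what those modules replace (the paths below are made up for illustration):
use File::Copy qw(copy);
use File::Path qw(make_path remove_tree);
use File::Basename qw(basename);
use File::Spec;

my $dir = File::Spec->catdir('build', 'objects');      # portable path joining
make_path($dir);                                        # like mkdir -p
copy('cfile.o', File::Spec->catfile($dir, 'cfile.o'))
    or die "copy failed: $!";                           # like cp
unlink 'cfile.o' or warn "unlink failed: $!";           # like rm
remove_tree($dir);                                      # like rm -rf
print basename('/home/project/cfile.o'), "\n";          # prints "cfile.o"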

Related

Calling perl script from perl

I have a Perl script which takes 2 arguments as follows and calls the appropriate function depending on the arguments. I call this script from bash, but I want to call it from Perl. Is it possible?
/opt/sbin/script.pl --group="value1" --rule="value2";
Also the script exits with a return value that I would like to read.
The Perl equivalent of the sh command
/opt/sbin/script.pl --group="value1" --rule="value2"
is
system('/opt/sbin/script.pl', '--group=value1', '--rule=value2');
You could also launch the command in a shell by using the following, though I'd avoid doing so:
system(q{/opt/sbin/script.pl --group="value1" --rule="value2"});
Just like you'd have to do in sh, you'll have to follow up with some error checking (whichever approach you took). You can do so by using use autodie qw( system );. Check the docs for how to do it "manually" if you want more flexibility.
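If you'd rather check manually, here is a sketch following the pattern documented in perldoc -f system:
system('/opt/sbin/script.pl', '--group=value1', '--rule=value2');
if ($? == -1) {
    print "failed to execute: $!\n";
}
elsif ($? & 127) {
    printf "child died with signal %d, %s coredump\n",
        ($? & 127), ($? & 128) ? 'with' : 'without';
}
else {
    printf "child exited with value %d\n", $? >> 8;
}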
If you want to capture the output:
$foo = `/opt/sbin/script.pl --group="value1" --rule="value2"`;
If you want to capture the exit status, but send script.pl's output to stdout:
$status = system "/opt/sbin/script.pl --group=value1 --rule=value2";
If you want to read its output like from a file:
open my $script_fh, '-|', '/opt/sbin/script.pl --group=value1 --rule=value2' or die $!;
while (<$script_fh>) { ... }
Yes. You can use system, exec, or <backticks>.
The main difference between system and exec is that exec "executes a command and never returns".
Example of system:
system("perl", "foo.pl", "arg");

Shell Programming inside Perl

I am writing code in Perl with an embedded shell script in it:
#!/usr/bin/perl
use strict;
use warnings;

our sub main {
    my $n;
    my $n2 = 0;
    $n = chdir("/home/directory/");
    if ($n) {
        print "change directory successful $n \n";
        $n2 = system("cd", "section");
        system("ls");
        print "$n2 \n";
    }
    else {
        print "no success $n \n";
    }
    print "$n\n";
}
main();
But it doesn't work. When I do the ls, it doesn't show the new files. Does anyone know another way of doing it? I know I can use chdir(), but that is not the only problem, as I have other commands which I have created that are simply shell commands put together. So does anyone know how exactly to use the CLI in Perl, so that the interpreter will keep the shell script attached to the same process rather than making a new process for each system call? I really don't know what to do.
The question has been edited to improve it; please don't mind the edits if the question is clear.
Edit: good point made by mob that each system call is a single process, so it dies every time. But what I am trying to do is create a Perl script which follows an algorithm that decides the flow of control of the shell script. So how do I make all these shell commands run in the same process?
system spawns a new process, and any changes made to the environment of the new process are lost when the process exits. So calling system("cd foo") will change the directory to foo inside of a very short-lived process, but won't have any effect on the current process or any future subprocesses.
To do what you want to do (*), combine your commands into a single system call.
$n2 = system("cd section; ls");
You can use Perl's rich quoting features to pass longer sequences of commands, if necessary.
$n2 = system q{cd section
    if ls foo ; then
        echo we found foo in section
        ./process foo
    else
        echo we did not find foo\!
        ./create_foo > foo
    fi};
$n2 = system << "EOSH";
cd section
./process bar > /tmp/log
cd ../sekshun
./process foo >> /tmp/log
if grep -q Error /tmp/log ; then
echo there were errors ...
fi
EOSH
(*) of course there are easy ways to do this entirely in Perl, but let's assume that the OP eventually will need some function only available in an external program
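For the record, here is a pure-Perl sketch of "cd section; ls", along the lines the footnote alludes to:
chdir 'section' or die "chdir failed: $!";
opendir my $dh, '.' or die "opendir failed: $!";
print "$_\n" for sort grep { !/^\.\.?$/ } readdir $dh;   # like ls
closedir $dh;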
system("cd", "section"); attempts to execute the program cd, but there is no such program on your machine.
There is no such program because each process has its own current work directory, and one process cannot change another process's current work directory. (Programs would malfunction left and right if it was possible.)
It looks like you are attempting to have a Perl program execute a shell script. That requires recreating the shell in Perl. (More specific goals might have simpler solutions.)
What I am trying to do is create a perl script which follows an algorithm which decides the flow of control of the shell script.
Minimal change:
Create a shell script that prompts for instructions/commands. Have your Perl script launch the shell script using Expect and feed it answers/commands.
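A hedged sketch with the Expect module; the driver script name and its prompt string are made up for illustration:
use Expect;

my $exp = Expect->spawn('./driver.sh') or die "spawn failed: $!";
$exp->expect(10, 'command> ');   # wait up to 10s for the hypothetical prompt
$exp->send("cd section\n");
$exp->expect(10, 'command> ');
$exp->send("ls\n");
$exp->send("exit\n");
$exp->soft_close();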

Perl: flock() works on Linux, ignores previous lock on AIX

In a nutshell: I wrote a Perl script using flock(). On Linux, it behaves as expected. On AIX, flock() always returns 1, even though another instance of the script, using flock(), should be holding an exclusive lock on the lockfile.
We ship a Bash script to restart our program, relying on flock(1) to prevent simultaneous restarts from making multiple processes. Recently we deployed on AIX, where flock(1) doesn't come by default and won't be provided by the admins. Hoping to keep things simple, I wrote a Perl script called flock, like this:
#!/usr/bin/perl
use Fcntl ':flock';
use Getopt::Std 'getopts';

getopts("nu:x:");
%switches = (LOCK_EX => $opt_x, LOCK_UN => $opt_u, LOCK_NB => $opt_n);
my $lockFlags = 0;
foreach $key (keys %switches) {
    if ($switches{$key}) { $lockFlags |= eval($key) }
}
$fileDesc = $opt_x || $opt_u;
open(my $lockFile, ">&=$fileDesc") || die "Can't open file descriptor: $!";
flock($lockFile, $lockFlags) || die "Can't change lock - $!\n";
I tested the script by running (flock -n -x 200; sleep 60) 200>lockfile twice, nearly simultaneously, from two terminal tabs.
On Linux, the second run dies with "Resource temporarily unavailable", as expected.
On AIX, the second run acquires the lock, with flock() returning 1, as most definitely not expected.
I understand that flock() is implemented differently on the two systems, the Linux version using flock(2) and the AIX one using, I think, fcntl(2). I don't have enough expertise to understand how this causes my problem, or how to solve it.
Many thanks for any advice.
This isn't anything to do with AIX; the open() call in your script is incorrect.
It should be something like:
open(my $lockFile, '>>', $fileDesc) or die "Can't open $fileDesc: $!"; # for LOCK_EX, must be writable
You were using the "dup() previously opened file handle" syntax with >&=, but the script had not opened any files to duplicate, nor should it.
My quick test shows the correct behavior (debugging added):
first window:
$ ./flock.pl -n -x lockfile
opened lockfile
locked
second window:
$ ./flock.pl -n -x lockfile
opened lockfile
Can't change lock - Resource temporarily unavailable
$
It's not about different commands, I suppose; it's more about global differences between AIX and Linux.
On POSIX systems, file locks are advisory: each program is expected to check the lock's state itself and decide what to do about it. No explicit checks = no locking.
On Linux one can try to enforce a mandatory lock, although the documentation itself states that it would be unwise to rely on this: the implementation is (and probably always will be) buggy.
Therefore, I suggest implementing such checks of advisory flags within the script itself.
More about it: man 2 fcntl, man 2 flock.
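To make that concrete, a minimal sketch of cooperative (advisory) locking in Perl; it only works if every participating script performs the same check:
use Fcntl qw(:flock);

open my $fh, '>>', 'lockfile' or die "Can't open lockfile: $!";
flock($fh, LOCK_EX | LOCK_NB)
    or die "another instance holds the lock\n";
# ... critical section ...
flock($fh, LOCK_UN);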

How to set a crontab job for a perl script

I have a Perl script which I want to run every 4 hours through cron. But somehow it fails to execute through cron, while it runs fine from the command line. Following is the command which I set in crontab:
perl -q /path_to_script/script.pl > /dev/null
Also, when I run this command at the command prompt, it does not execute, but when I go into the leaf folder in path_to_script and execute the file, it runs fine.
Also, where will the log files of this cron job be created so that I can view them?
You should probably change the working directory to the "leaf folder".
Try this in your crontab command:
cd /path_to_script; perl script.pl >/dev/null
Wrt. log files: cron will mail you the output. But since you sent stdout to /dev/null, only stderr will be mailed to you.
If you want the output saved in a log file, redirect the script's stdout and stderr into a file, like so:
cd /path_to_script; perl script.pl > my_log_file 2>&1
Usually cron will send you mail with the output of your program. When you're figuring it out, you probably want to check the environment. It won't necessarily be the same environment as your login shell (since it's not a login shell):
foreach my $key ( keys %ENV ) {
    print "$key: $ENV{$key}\n";
}
If you're missing something you need, set it in your crontab:
SOME_VAR=some_value
HOME=/Users/Buster
If you need to start in a particular directory, you should chdir there. The starting directory from a cron job probably isn't what you think it is. Without an argument, chdir changes to your home directory. However, sometimes those environment variables might not be set in your cron session, so it's probably better to have a default value:
chdir( $ENV{HOME} || '/Users/Buster' );
At various critical points, you should give error output. This is a good thing even in non-cron programs:
open my $fh, '<', $some_file or die "Didn't find the file I was expecting: $!";
If you redirect things to /dev/null, you lose all that information that might help you solve the problem.
It looks like you may have missed the
#!/usr/bin/perl
line at the start of your Perl script, which is why you might need perl -q to run it.
once you have added that line you can run it directly from the command line using
/path_to_script/script.pl
If you use a command in your Perl program, I advise you to put the full path to the command in your program.
I tried loading the environment, but it did not help any further.
After going over it with a colleague, I think it comes from the interaction between Perl and the system environment.
Best regards,
Moustapha Kourouma

How can I run a shell script from inside a Perl script run by cron?

Is it possible to run a Perl script (vas.pl) with shell scripts inside it (date.sh & backlog.sh) from cron, or vice versa?
Thanks.
0 19 * * * /opt/perl/bin/perl /reports/daily/scripts/vas_rpt/vasCIO.pl 2> /reports/daily/scripts/vas_rpt/vasCIO.err
Error encountered:
date.sh: not found
backlog.sh: not found
Perl script:
#!/opt/perl/bin/perl
system("sh date.sh");
open(FH, "/reports/daily/scripts/vas_rpt/date.txt");
@date = <FH>;
close FH;
open(FH, "/reports/daily/scripts/vas_rpt/$cat1.txt");
@array = <FH>;
system("sh backlog.sh $date[0] $array[0]");
close FH;
cron runs your perl script in a different working directory than your current working directory. Use the full path of your script file:
# I'm assuming your shell scripts reside in the same
# dir as your perl script:
system("sh /reports/daily/scripts/vas_rpt/date.sh");
Or, if you're allergic to hardcoding paths like I am, you can use the core FindBin module:
use FindBin qw($Bin);
system("sh $Bin/date.sh");
If your shell script also needs to start in the correct path then it's probably better to first change your working directory:
use FindBin qw($Bin);
chdir $Bin;
system("sh date.sh");
You can do what you want as long as you are careful.
The first thing to remember with cron jobs is that you get almost no environment set.
The chances are, the current directory is / or perhaps $HOME. And the value of $PATH is minimal - your profile has not been run, for example.
So, your script didn't find 'date.sh' because it wasn't in the correct directory.
To get the data from the shell script into your program, you need to pipe it there - or arrange for the 'date.sh' to dump the data into the file successfully. Of course, Perl has built-in date and time handling, so you don't need to use the shell for it.
You also did not run with use warnings; or use strict; which would also help you. For example, $cat1 is not a defined variable.
Personally, I run a simple shell script from cron and let it deal with all the complexities; I don't use I/O redirection in the crontab file. That's partly a legacy of working on ancient systems - but it also leads to portable and reliable running of cron jobs.
It's possible. Just keep in mind that your working directory when running under cron may not be what you think it is: it's the value of your HOME environment variable, or whatever is specified in /etc/passwd. Consider fully qualifying the paths to your .sh scripts.
There are a lot of things that need care in your script, and I talk about most of them in the "Secure Programming Techniques" chapter of Mastering Perl. You can also find some of it in perlsec.
Since you are taking external data and passing them to other external programs, you should use taint checking to ensure that the data are what you expect. What if someone were able to sneak something extra into those files?
When you want to pass data to external programs, use system in the list form so the shell doesn't get a chance to interpret possible meta-characters.
Instead of relying on the PATH to find the programs that you expect to run, specify their full paths explicitly to ensure you are at least running the file you think you are (and not something someone snuck into a directory that is earlier in PATH). If you were really paranoid (like taint checking is), you might also check that those files and directories had suitable permissions (e.g., not world-writeable).
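Here is a sketch combining those three points under taint checking; the date format is an assumption made for illustration:
#!/opt/perl/bin/perl -T
use strict;
use warnings;

$ENV{PATH} = '/bin:/usr/bin';   # taint mode insists on a trusted PATH

open my $fh, '<', '/reports/daily/scripts/vas_rpt/date.txt' or die $!;
chomp(my $raw = <$fh>);
close $fh;

# Untaint by matching against the exact format we expect.
my ($date) = $raw =~ /\A(\d{4}-\d{2}-\d{2})\z/
    or die "unexpected date format: $raw";

# List form of system: no shell, so no meta-character surprises,
# and a full path so PATH cannot be abused.
system('/reports/daily/scripts/vas_rpt/backlog.sh', $date) == 0
    or die "backlog.sh failed: $?";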
Just as a bonus note, if you only want one line from a filehandle, you can use the line-input operator in scalar context:
my $date = <$fh>;
You probably want to chomp the data too to get rid of possible ending newlines. Even if you don't think a terminating newline should be there because another program created the file, someone looking at the file with a text editor might add it.
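For example, a combined sketch of those two points:
open my $fh, '<', 'date.txt' or die "Can't open date.txt: $!";
chomp( my $date = <$fh> );   # read one line, drop any trailing newline
close $fh;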
Good luck, :)