Perl script: different results from command line and CGI

Warning: I'm a Perl and CGI beginner, so this may be a stupid question.
I wrote a really simple Perl script which should get info about open files and running processes on the system. The function for processes looks something like this:
sub num_processes {
    my @lines = `/bin/ps -ef`;
    return scalar @lines;
}
If I run it from bash, it returns all running processes on the system, but when I run it via Apache and CGI it returns only 2 processes (the running script and the running 'ps -ef'). This CGI script runs under a user with a shell (/bin/bash) enabled. Is there any possibility to get all the processes via Apache and CGI?

Your CGI script will run as the Apache user account, while your shell call runs as your own account. That is probably why you get two different answers. Take a look at something like suEXEC to manage the user under which CGI scripts are run.
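If you want to verify which account each invocation runs under, here is a minimal sketch (not specific to the question's setup) that prints the real and effective user from inside the script:
#!/usr/bin/perl
use strict;
use warnings;

# getpwuid in scalar context returns the user name for a numeric uid.
# Under Apache this will typically print something like 'www-data'.
my $real_user      = getpwuid($<);
my $effective_user = getpwuid($>);

print "Content-Type: text/plain\n\n";    # CGI header; harmless when run from a shell
print "real: $real_user, effective: $effective_user\n";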

Related

how to collect ARGV using Perl CGI?

I want to run a script from my website using CGI.pm. The script I am running is usually run from the command line and requires several command-line ARGV inputs. How do I deal with this using CGI.pm? Can I insert a system($command) into a Perl CGI script? The script can be seen here - http://www.ncbi.nlm.nih.gov/IEB/ToolBox/C_DOC/lxr/source/doc/blast/web_blast.pl
how to collect ARGV using Perl CGI?
@ARGV didn't go anywhere, but CGI doesn't use command line arguments, so there are no command line arguments to collect.
can I insert a system($command) into Perl CGI script?
Yes.
You can dual-purpose the script by checking if you are connected to a terminal:
if (-t STDOUT) {
    # Command line mode: use @ARGV
}
else {
    # CGI mode: get the @ARGV equivalent from CGI->param
}
You will have to adjust the output to work in CGI mode by adding content headers before you output anything.
If you use system($foo) in a web page, make sure the logic controlling what's in $foo is secure, otherwise you might end up hacked.
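Putting it together, a minimal sketch of the dual-purpose pattern (the 'input' parameter name is a hypothetical stand-in for whatever your script expects in @ARGV):
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my @args;
if (-t STDOUT) {
    # Command line mode: arguments arrive in @ARGV as usual.
    @args = @ARGV;
}
else {
    # CGI mode: emit a content header first, then read the hypothetical
    # 'input' parameter as the @ARGV equivalent. multi_param needs
    # CGI.pm 4.08+; older versions use param in list context.
    my $q = CGI->new;
    print $q->header('text/plain');
    @args = $q->multi_param('input');
}
print "got: @args\n";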

Shell Programming inside Perl

I am writing code in Perl with an embedded shell script in it:
#!/usr/bin/perl
use strict;
use warnings;

sub main {
    my $n;
    my $n2 = 0;
    $n = chdir("/home/directory/");
    if ($n) {
        print "change directory successful $n \n";
        $n2 = system("cd", "section");
        system("ls");
        print "$n2 \n";
    }
    else {
        print "no success $n \n";
    }
    print "$n\n";
}
main();
But it doesn't work: when I do the ls, it doesn't show the new files. Does anyone know another way of doing it? I know I can use chdir(), but that is not the only problem, as I have other commands which I have created that are simply shell commands put together. So does anyone know how exactly to use the CLI in Perl, so that the interpreter keeps the shell script attached to the same process rather than making a new process for each system call? I really don't know what to do.
The question has been edited for clarity; please don't mind the edits if the question is clear.
Edit: good point made by mob that each system call is a single process, so it dies every time. What I am trying to do is create a Perl script which follows an algorithm that decides the flow of control of the shell script. So how do I make all these shell commands run in the same process?
system spawns a new process, and any changes made to the environment of the new process are lost when the process exits. So calling system("cd foo") will change the directory to foo inside of a very short-lived process, but won't have any effect on the current process or any future subprocesses.
To do what you want to do (*), combine your commands into a single system call.
$n2 = system("cd section; ls");
You can use Perl's rich quoting features to pass longer sequences of commands, if necessary.
$n2 = system q{cd section
    if ls foo ; then
        echo we found foo in section
        ./process foo
    else
        echo we did not find foo\!
        ./create_foo > foo
    fi};
$n2 = system << "EOSH";
cd section
./process bar > /tmp/log
cd ../sekshun
./process foo >> /tmp/log
if grep -q Error /tmp/log ; then
    echo there were errors ...
fi
EOSH
(*) of course there are easy ways to do this entirely in Perl, but let's assume that the OP eventually will need some function only available in an external program
system("cd", "section"); attempts to execute the program cd, but there is no such program on your machine.
There is no such program because each process has its own current working directory, and one process cannot change another process's current working directory. (Programs would malfunction left and right if it were possible.)
It looks like you are attempting to have a Perl program execute a shell script. That requires recreating the shell in Perl. (More specific goals might have simpler solutions.)
What I am trying to do is create a perl script which follows an algorithm which decides the flow of control of the shell script.
Minimal change:
Create a shell script that prompts for instructions/commands. Have your Perl script launch the shell script using Expect and feed it answers/commands.
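A rough sketch of that approach, assuming the Expect module from CPAN is installed (the prompt pattern and the commands are placeholders):
use strict;
use warnings;
use Expect;

# One long-lived shell: every command sent here runs in the same
# process, so a cd persists for the commands that follow it.
my $exp = Expect->spawn('/bin/bash') or die "Cannot spawn bash: $!";

$exp->send("cd section\n");
$exp->send("ls\n");
$exp->expect(5, '-re', '\$\s*$');   # wait up to 5s for a prompt-like pattern
$exp->send("exit\n");
$exp->soft_close();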

Redirect Perl print statement to Apache log

We have a Java web application running on Apache that calls Perl scripts in certain use cases. I would like to redirect the print statements of those Perl scripts (which print to STDOUT by default) to the Apache log.
What is the best way to do this?
Using this in the Perl scripts worked:
print STDERR "my comment";
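If there are too many print statements to change one by one, another option (a sketch, assuming you want everything from STDOUT in the error log) is to re-open STDOUT onto STDERR at the top of the script:
# Duplicate STDERR onto STDOUT so every plain print also lands in
# Apache's error log, with no changes to the individual statements.
open STDOUT, '>&', \*STDERR or die "Cannot redirect STDOUT: $!";
print "this now goes to the Apache error log\n";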

How to set up a crontab job for a Perl script

I have a Perl script which I want to run every 4 hours through cron, but somehow it fails to execute through cron and runs fine if I run it from the command line. Following is the command I set in crontab:
perl -q /path_to_script/script.pl > /dev/null
Also, when I run this command at the command prompt, it does not execute, but when I go into the leaf folder in path_to_script and execute the file there, it runs fine.
Also, where will the log files of this cron job be created, so that I can view them?
You should probably change the working directory to the "leaf folder".
Try this in your crontab command:
cd /path_to_script; perl script.pl >/dev/null
Regarding log files: cron will mail you the output. But since you sent stdout to /dev/null, only stderr will be mailed to you.
If you want the output saved in a log file, redirect the script's stdout and stderr into the file, like so:
cd /path_to_script; perl script.pl >my_log_file 2>&1
Usually cron will send you mail with the output of your program. When you're figuring it out, you probably want to check the environment. It won't necessarily be the same environment as your login shell (since it's not a login shell):
foreach my $key ( keys %ENV ) {
    printf "%s: %s\n", $key, $ENV{$key};
}
If you're missing something you need, set it in your crontab:
SOME_VAR=some_value
HOME=/Users/Buster
If you need to start in a particular directory, you should chdir there; the starting directory of a cron job probably isn't what you think it is. Without an argument, chdir changes to your home directory. However, sometimes those environment variables might not be set in your cron session, so it's probably better to have a default value:
chdir( $ENV{HOME} || '/Users/Buster' );
At various critical points, you should give error output. This is a good thing even in non-cron programs:
open my $fh, '<', $some_file or die "Didn't find the file I was expecting: $!";
If you redirect things to /dev/null, you lose all that information that might help you solve the problem.
It looks like you may have missed the
#!/usr/bin/perl
line at the start of your Perl script, which is why you need perl -q to run it.
Once you have added that line (and made the file executable with chmod +x), you can run it directly from the command line using
/path_to_script/script.pl
If you use a command in your Perl program, I advise you to put the full path to the command in your program.
I tried loading the environment, but it did not help much.
After going over it with a colleague, I think the problem comes from the interaction between Perl and the system environment.
Best regards,
Moustapha Kourouma

How can I run a shell script from inside a Perl script run by cron?

Is it possible to run a Perl script (vas.pl) with shell scripts inside (date.sh & backlog.sh) in cron, or vice versa?
Thanks.
0 19 * * * /opt/perl/bin/perl /reports/daily/scripts/vas_rpt/vasCIO.pl 2> /reports/daily/scripts/vas_rpt/vasCIO.err
Error encountered:
date.sh: not found
backlog.sh: not found
Perl script:
#!/opt/perl/bin/perl
system("sh date.sh");
open(FH, "/reports/daily/scripts/vas_rpt/date.txt");
@date = <FH>;
close FH;
open(FH, "/reports/daily/scripts/vas_rpt/$cat1.txt");
@array = <FH>;
system("sh backlog.sh $date[0] $array[0]");
close FH;
cron runs your Perl script in a different working directory than your current one. Use the full path of your script file:
# I'm assuming your shell script reside in the same
# dir as your perl script:
system("sh /reports/daily/scripts/date.sh");
Or if you're allergic to hardcoding paths like I am, you can use the FindBin package from CPAN:
use FindBin qw($Bin);
system("sh $Bin/date.sh");
If your shell script also needs to start in the correct path then it's probably better to first change your working directory:
use FindBin qw($Bin);
chdir $Bin;
system("sh date.sh");
You can do what you want as long as you are careful.
The first thing to remember with cron jobs is that you get almost no environment set.
The chances are, the current directory is / or perhaps $HOME. And the value of $PATH is minimal - your profile has not been run, for example.
So, your script didn't find 'date.sh' because it wasn't in the correct directory.
To get the data from the shell script into your program, you need to pipe it there - or arrange for the 'date.sh' to dump the data into the file successfully. Of course, Perl has built-in date and time handling, so you don't need to use the shell for it.
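For instance, a shell call like `date +%Y-%m-%d` has a direct pure-Perl equivalent via the core POSIX module (the format string here is only an illustration):
use POSIX qw(strftime);

# Same output as the shell's `date +%Y-%m-%d`, without a subprocess.
my $today = strftime('%Y-%m-%d', localtime);
print "$today\n";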
You also did not run with use warnings; or use strict; which would also help you. For example, $cat1 is not a defined variable.
Personally, I run a simple shell script from cron and let it deal with all the complexities; I don't use I/O redirection in the crontab file. That's partly a legacy of working on ancient systems - but it also leads to portable and reliable running of cron jobs.
It's possible. Just keep in mind that your working directory when running under cron may not be what you think it is - it's the value in your HOME environment variable, or the one specified in the /etc/passwd file. Consider fully qualifying the paths to your .sh scripts.
There are a lot of things that need care in your script, and I talk about most of them in the "Secure Programming Techniques" chapter of Mastering Perl. You can also find some of it in perlsec.
Since you are taking external data and passing them to other external programs, you should use taint checking to ensure that the data are what you expect. What if someone were able to sneak something extra into those files?
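A minimal sketch of taint checking in this script's style (the YYYY-MM-DD pattern is just an illustration; tighten it to whatever your data really looks like):
#!/opt/perl/bin/perl -T
use strict;
use warnings;

# Under -T, data from outside the program (@ARGV, %ENV, file reads)
# is tainted and will be rejected by system(). Untaint by capturing
# from an explicit pattern that describes exactly what you expect.
my $raw = shift // die "No argument given\n";
my ($date) = $raw =~ /\A(\d{4}-\d{2}-\d{2})\z/
    or die "Unexpected date format: $raw";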
When you want to pass data to external programs, use system in the list form so the shell doesn't get a chance to interpret possible meta-characters.
Instead of relying on the PATH to find the programs that you expect to run, specify their full paths explicitly to ensure you are at least running the file you think you are (and not something someone snuck into a directory that is earlier in PATH). If you were really paranoid (like taint checking is), you might also check that those files and directories had suitable permissions (e.g., not world-writeable).
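Putting both points together, a sketch of the list-form call with explicit paths (the values here are placeholders for the data read from the files):
use strict;
use warnings;

my ($date, $backlog) = ('2024-01-01', 'item42');   # placeholders

# List form: the shell never parses these arguments, so shell
# metacharacters in the data are harmless, and the explicit paths
# avoid whatever happens to be first in PATH.
system('/bin/sh', '/reports/daily/scripts/vas_rpt/backlog.sh', $date, $backlog) == 0
    or die "backlog.sh failed: exit status $?";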
Just as a bonus note, if you only want one line from a filehandle, you can use the line-input operator in scalar context:
my $date = <$fh>;
You probably want to chomp the data too, to get rid of a possible trailing newline. Even if you don't think a terminating newline should be there because another program created the file, someone looking at the file with a text editor might add one.
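So the date-file read from the question might look like this (path copied from the question):
open my $fh, '<', '/reports/daily/scripts/vas_rpt/date.txt'
    or die "Didn't find the file I was expecting: $!";
my $date = <$fh>;   # scalar context: reads only the first line
close $fh;
chomp $date;        # drop the trailing newline, if any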
Good luck, :)