Why can't two CGI scripts with the same URL run simultaneously? - server

I start two instances of this Perl CGI script approximately simultaneously:
#!/usr/bin/perl
use strict;
use warnings;
print "Content-Type: text/plain\n\n";
print "Started at: " . time;
sleep 10;
Comparing the start times of the two scripts (which each outputs to the browser), I find that the difference is 10 seconds, which is exactly the run time of the first script. This experiment suggests that two instances of the same Perl CGI script cannot run simultaneously.
Why is that, and can it be corrected (to allow two instances of the same Perl script to run simultaneously)?
We use a combination of Apache 2 and Nginx with CGI Perl scripts (not mod_perl).

Related

Writing to a text file using Perl CGI

I am using CGI to call a Perl function, and inside the Perl function I want it to write something to a text file. But it's not working. I have tried running the CGI script on my Apache server. The CGI prints the web page as I required, but I can't see the file I want it to write. It seems like the Perl script is not executed by the server.
The Perl script is quite simple. The file name is predict_seq_1.pl:
#!/usr/bin/perl -w
use strict;
use warnings;
my $seq = $ARGV[0];
my $txtfile = "test.txt";
open(my $FH, '>', $txtfile) or die "Cannot open $txtfile: $!";
print $FH "test $seq success\n";
close $FH;
And the cgi part, I just call it by
system('perl predict_seq_1.pl', $geneSeq);
Your CGI program is going to be run by a user who has very low permissions on your system - certainly lower than your own user account on that same system.
It seems likely that one of two things is happening:
Your CGI process can't see the program you're trying to run.
Your CGI program doesn't have permission to execute the program you're trying to run.
You should check the return value from system() and look at the value of $?.
Either give system a single string, or give it individual arguments. Your script is attempting and failing to run the program perl predict_seq_1.pl, rather than having perl run the script predict_seq_1.pl.
system('perl', 'predict_seq_1.pl', $geneSeq);
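A minimal sketch of checking system()'s return value and $?, as suggested above (the command here is just an illustrative perl one-liner, not the asker's script):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# List form of system(): no shell is involved, arguments are passed as-is.
# $^X is the path to the currently running perl binary.
my @cmd = ($^X, '-e', 'exit 3');
my $rc = system(@cmd);

if ($rc == -1) {
    print "failed to execute: $!\n";
}
elsif ($? & 127) {
    printf "child died with signal %d\n", $? & 127;
}
else {
    printf "child exited with value %d\n", $? >> 8;
}
```

The same three-way check works after any system() call, and is the quickest way to distinguish "program not found" from "program ran and failed".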

Perl: What is the fastest way to run a perl script from within a perl script?

I am writing a Perl script that uses other Perl scripts (not mine). Some of them receive inputs with flags and some don't. Another thing that I need is to redirect the outputs of these scripts to different files. For example:
Without flags: script1.pl arg1 arg2 arg3 > output1.log
With flags: script2.pl -a 1 -b 2 -c 3 > output2.log
Bottom line - I was using system() to do this, but then I found out that the script takes too long.
I tried doing this with do() but it didn't work (like here).
So what is the fastest way to achieve that?
You need to make your test script define a subroutine that executes everything you want to run, then have your main script read the Perl code of the test script and invoke that subroutine. The test script will look something like this:
#!/usr/bin/env perl
#
# defines how to run a test
use strict;
use warnings;
sub test
{
    my ($arg1, $arg2, $arg3) = @_;
    # run the test
    (...)
}
1; # require() needs the file to return a true value
The main script:
#!/usr/bin/env perl
#
# runs all of the tests
use strict;
use warnings;
require 'testdef.pl'; # the other script
foreach (...)
{
(...)
test($arg1, $arg2, $arg3);
}
This is still a very basic way of doing it.
The proper way, as ikegami says, is to turn the test script into a module.
That will be worthwhile if you will be creating more test script files than just these two or if you want to install the scripts in various locations.
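A minimal sketch of the module approach (the package and subroutine names here are hypothetical; in practice the package would live in its own TestDef.pm file and be loaded with use, but it is inlined below so the example is self-contained):

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# In a real setup this package would live in TestDef.pm.
package TestDef;

sub run_test {
    my ($arg1, $arg2, $arg3) = @_;
    return "ran test with $arg1, $arg2, $arg3";
}

package main;

# The main script calls the subroutine directly - no new process is spawned.
print TestDef::run_test(1, 2, 3), "\n";
```

Because everything runs in one interpreter, you avoid the per-invocation cost of starting a new perl process, which is where most of the time went with system().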
Use a multi-argument system call: http://perldoc.perl.org/functions/system.html
This does not invoke a shell, so you save a few CPU cycles. Note that without a shell, > redirection is not interpreted - it would be passed to the program as a literal argument - so redirect the output some other way:
system('script1.pl', 'arg1', 'arg2', 'arg3');
"As an optimization, may not call the command shell specified in $ENV{PERL5SHELL}. system(1, @args) spawns an external process and immediately returns its process designator, without waiting for it to terminate."
If you are not interested in the return status, you could use exec instead, or you could use fork/threads for parallel execution.
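Since the list form bypasses the shell, one way to still redirect the child's output is to fork, reopen STDOUT in the child, and then exec. A sketch, with a placeholder command and log file name standing in for the real script:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Fork a child, point its STDOUT at a log file, then exec the command.
# No shell is involved at any point.
my $pid = fork() // die "fork failed: $!";
if ($pid == 0) {
    open(STDOUT, '>', 'output1.log') or die "cannot open output1.log: $!";
    exec($^X, '-e', 'print "hello from child\n"')
        or die "exec failed: $!";
}
waitpid($pid, 0);
```

After waitpid() returns, output1.log holds everything the child printed, and $? carries its exit status, just as with system().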

Perl script: different results form command line and CGI

Warning: I'm a Perl and CGI beginner, so this may be a stupid question.
I wrote a really simple Perl script which should get info about open files and running processes on the system. There is a function like this for processes:
sub num_processes {
    my @lines = `/bin/ps -ef`;
    return scalar @lines;
}
If I run it from bash, it returns all running processes on the system, but when I run it via Apache and CGI it returns only 2 processes (the running script and the running 'ps -ef'). This CGI script runs under a user with a shell (/bin/bash) enabled. Is there any way to get all the processes via Apache and CGI?
Your CGI script runs as the Apache user account, while your command-line run was as your own user account. This is probably why you get two different answers. Take a look at something like suEXEC to manage the user under which CGI scripts are run.
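A quick way to confirm which account a script is actually running under is to print the real user ID's login name (a sketch; drop the same two lines into the CGI script and compare with the command-line output):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# $< is the real UID of the running process; getpwuid maps it to a name.
my $user = getpwuid($<);
print "Content-Type: text/plain\n\n";
print "Running as: $user\n";
```

From the command line this prints your own login; via Apache it will typically print something like www-data or apache, which confirms the answer above.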

Writing into a pipe in perl

I am passing the commands to some application through the Perl script using the pipe.
So I write the commands to the pipe, but the pipe does not wait until the application has finished executing a command before accepting the next one. It does not block input until the command execution is over. I need my Perl script to work like a UNIX shell, but instead it behaves as if the process were running in the background. I use Term::ReadLine to read the input.
#!/usr/bin/perl -w
use strict;
use Term::ReadLine;
use FileHandle;

open(GP, "|/usr/bin/gnuplot -noraise") or die "no gnuplot";
GP->autoflush(1);

# define readline
my $term = Term::ReadLine->new('ProgramName');
while (defined( $_ = $term->readline('plot>') )) {
    print GP "$_\n";
}
close(GP);
I recommend using a dedicated CPAN module, such as Chart::Gnuplot, so you have a higher level of control.
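If you need to know when the application has finished processing a command, a one-way pipe is not enough: you have to read the application's responses back. The core IPC::Open2 module gives you handles for both directions. A sketch, using cat as a stand-in for gnuplot since it simply echoes each line back:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Open2;

# Open a child process with handles for both its stdin and its stdout.
my $pid = open2(my $reader, my $writer, 'cat');

print {$writer} "plot command\n";
close $writer;              # signal end of input to the child

my $reply = <$reader>;      # blocks until the child answers
print "got back: $reply";

waitpid($pid, 0);
```

Blocking on the read is what gives you shell-like behavior: the loop cannot move on to the next command until the application has replied. Beware that reading a response only works if the application actually writes one per command, and buffering deadlocks are possible with chattier programs.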

Perl flock does not work from CGI

(Before running the below script replace /home/porton/t/MOVE with a path to a file you have the right to create or erase.)
When I start this script from the command line, and within 10 seconds start the same script from the command line again, they print what I expect:
Flock: 1
and
Flock: 0
correspondingly.
But when I run it twice (with an interval between the requests of less than 10 seconds) as CGI, that is as http://test.localhost/cgi-bin/test2.pl, it prints
Flock: 1
for both CGI requests.
What is the error? Why does it behave differently, and unexpectedly, when run as CGI?
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);
print "Content-Type: text/plain\n\n";
open(my $lock_fh, '>', "/home/porton/t/MOVE") or die "Cannot open lock file: $!";
print "Flock: " . flock($lock_fh, LOCK_EX|LOCK_NB) . "\n";
sleep 10;
Are you sure the two requests are running in parallel? They might be handled sequentially, i.e. the second request could be processed after the first one is completed, and after the lock has been released.
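flock itself does work across processes; you can verify that locally by taking the lock in a parent and trying again in a forked child while the parent still holds it (a sketch using a temporary file instead of the hard-coded path):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);
use File::Temp qw(tempfile);

my ($fh_tmp, $path) = tempfile();

# Parent takes an exclusive, non-blocking lock.
open(my $fh1, '>', $path) or die "open: $!";
print "Parent lock: ", (flock($fh1, LOCK_EX | LOCK_NB) ? 1 : 0), "\n";

# Child tries the same lock while the parent still holds it.
my $pid = fork() // die "fork: $!";
if ($pid == 0) {
    open(my $fh2, '>', $path) or die "open: $!";
    exit(flock($fh2, LOCK_EX | LOCK_NB) ? 0 : 1);
}
waitpid($pid, 0);
print "Child lock: ", (($? >> 8) == 0 ? 1 : 0), "\n";
```

This prints Parent lock: 1 and Child lock: 0, the same pattern the command-line experiment showed. So if both CGI requests report Flock: 1, the likely explanation is the one above: the requests never overlapped in time.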