I am trying to run a Perl FastCGI process in Azure. I have built a hello world test page (test.pl), and if I run it from the command line it works:
d:\home\site\wwwroot\bin\perl\bin\perl.exe -MFCGI::IIS=do test.pl
I have a handler for *.pl that runs d:\home\site\wwwroot\bin\perl\bin\perl.exe (Strawberry Perl) with the arguments -MFCGI::IIS=do (i.e. the same as the above).
I get a 500 error. When I view the detailed log the only information I get back is FastCgiModule Error Code 0x8007000d.
My file runs from the command line but it won't run within the web app. Why not?
The problem was simple. My Perl hello world worked from the command line, but as a web page it obviously needs to set the content type. I was missing the first line!
print "Content-type: text/html", "\n\n";
print "Hello World.\n";
I am trying to run a Perl script directly from GitHub. This thread seems to address my issue (indeed, it helped me run dofiles in Stata directly from GitHub). However, when I type the following in a command prompt:
"perl https://rawgit.com/EconJoe/medline2014-xmlparsers/master/desc2014_meshtreenumbers.pl"
I get the following error message:
"Can't open perl script "https://rawgit.com/EconJoe/medline2014-xmlparsers/master/desc2014_meshtreenumbers.pl": Invalid argument"
Thanks for any help.
perl can't fetch a script from a URL. You have to do that separately.
curl -L https://rawgit.com/EconJoe/medline2014-xmlparsers/master/desc2014_meshtreenumbers.pl | perl
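If you want to stay entirely in Perl, a rough alternative is to download the script to a temporary file first and then run it. A minimal sketch, assuming the core modules HTTP::Tiny and File::Temp (plus IO::Socket::SSL for the https URL) are available:
use HTTP::Tiny;
use File::Temp qw(tempfile);

my $url = 'https://rawgit.com/EconJoe/medline2014-xmlparsers/master/desc2014_meshtreenumbers.pl';
my $res = HTTP::Tiny->new->get($url);
die "Download failed: $res->{status} $res->{reason}\n" unless $res->{success};

# Save the downloaded source to a temporary .pl file and run it with the current perl
my ($fh, $tmp) = tempfile(SUFFIX => '.pl', UNLINK => 1);
print {$fh} $res->{content};
close $fh;
system($^X, $tmp) == 0 or die "Script exited with status $?\n";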
I have a Perl file (e.g. test.pl) which does some DB operations.
While testing, it works fine.
I execute this file as a background process by using the command
perl test.pl &
It works properly for some days.
But after some days, the execution stops.
How can I find the reason or view the error?
I checked the log file "/var/log/httpd/error_log", but can't find anything.
I keep the Perl file on a server running CentOS.
Does anyone have an idea?
There is no 'perl error log'.
But you can define a destination for the output to be saved to; just run your script like this:
perl test.pl >> /var/log/some-log-file.log 2>&1 &
This will redirect STDOUT (the script's normal output) and STDERR (error output) to /var/log/some-log-file.log instead of to the terminal.
You may also wish to use nohup in order to have the script ignore HANGUP (logout) signals, which could be causing your unexpected terminations:
nohup perl test.pl >> /var/log/some-log-file.log 2>&1 &
Obviously, whichever user you run the script as will need to have write access to the log file.
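If you would rather not depend on shell redirection, roughly the same effect can be had from inside the script itself; a minimal sketch, with the log path being just an example:
# Reopen STDOUT and STDERR onto a log file, so output is captured however the script is started
open STDOUT, '>>', '/var/log/some-log-file.log' or die "Cannot open log file: $!";
open STDERR, '>&', \*STDOUT or die "Cannot redirect STDERR: $!";
$| = 1;  # flush immediately so the last lines are not lost if the process dies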
I'm beginning to write a simple Perl program on my Mac, and I understand that the first line needs to be the location of Perl itself. Every example or tutorial I find tells me the first line should be:
"#!/usr/bin/perl"
However, with that line there, when I attempt to run the file under localhost I get this error:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator, you@example.com and inform them of the time the error occurred, and anything you might have done that may have caused the error.
More information about this error may be available in the server error log.
Anyone have any idea why this is happening?
Thanks in advance, and let me know if any more information is needed!
P.S. If it helps, when I execute the command "perl -v", it tells me:
This is perl, v5.10.0 built for darwin-thread-multi-2level
(with 2 registered patches, see perl -V for more detail)
As Erik said, /usr/bin/perl is the standard location for Perl on OSX. You can also verify this by running which -a perl from terminal (this will list all instances of Perl on your path).
Can you run your script from the command-line, i.e. ./<myscript>.pl? It's possible that you haven't made the script executable.
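For example (the script name is hypothetical):
chmod +x myscript.pl
./myscript.pl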
#!/usr/bin/perl is the correct path for 10.6. If you're running from the webserver, the first thing you output should be an HTTP header (such as Content-Type), before any other output. You may have forgotten one:
#!/usr/bin/perl
use CGI;
print CGI->header('text/html');
print "hello world";
I have a program which reads the output from an external application. The external app produces a set of output lines. My program reads the output from this external app with while ($line = <handle to external app>) and prints it to STDOUT. But print STDOUT $line prints only some of the lines; when the error occurs, printing to STDOUT stops working, yet my other logging statement, push @arr, $line, has stored the complete output from the external app. From this I gathered that STDOUT is not working properly when the error happens.
E.g. if the external app's output is:
Starting command
First command executed successfully
Error:123 :next command failed
Program terminated
Here, STDOUT prints only:
Starting command
First command executed successfully
But if I check the array, it has the complete output, including the error details. So I guessed STDOUT had been redirected or lost.
So I tried saving STDOUT at the beginning of the program into $old_handle using open, and then restoring it before the print statement using select($old_handle) (thinking that something redirects STDOUT when the error happens).
But I was not successful, and I don't know what is wrong here. Please help me.
It's possible the output is being buffered. Try setting
$| = 1;
at the start of your program. This will cause the output to be displayed straight away, rather than being buffered for later.
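An alternative, per-handle way of doing the same thing is IO::Handle's autoflush:
use IO::Handle;
STDOUT->autoflush(1);  # disable output buffering on STDOUT, equivalent to $| = 1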
Just a guess: maybe it's because the error output goes to STDERR rather than STDOUT. Redirect it:
first_program |& perl_program
or
first_program 2>&1 | perl_program
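If your Perl program is the one that starts the external application, here is a sketch of the same idea from inside Perl (first_program stands in for the real command):
my @arr;
# Run the external program through the shell so its STDERR is merged into the pipe
open my $app, '-|', 'first_program 2>&1' or die "Cannot start first_program: $!";
while (my $line = <$app>) {
    push @arr, $line;    # logging copy, as in the question
    print STDOUT $line;  # error lines now show up here as well
}
close $app or warn "first_program exited with status $?\n";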
I have a CGI (perl) script that is attempting to call curl using the open command:
@curl = ('/usr/bin/curl', '-S','-v','--location', $url,
'-H', 'Content-Type:'.$content_type,
'-H', "Authorization: $authorization",
'-H', "X-Gdata-Key:$gdata_key",
'-H', "Content-Length:$content_length",
'-H','GData-Version:2',
'--data',"\#$filename");
And then executed like so:
open CURL, "-|", #curl;
The program works perfectly from the command line, but when I try to run it in the browser, the page ends up timing out.
What do I need to change on my server or in my script to get this to work properly?
You should check whether the open succeeded, and also close the pipe explicitly and check for errors. In case of an error, die with the error message; then find that message in the server's error log.
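A rough sketch of those checks (variable names are my own), using the @curl list from the question:
open my $curl_fh, '-|', @curl or die "Cannot start curl: $!";
my @response = <$curl_fh>;  # read everything curl sends back
unless (close $curl_fh) {
    # $! is set for an OS-level error; otherwise $? holds curl's exit status
    die $! ? "Error closing curl pipe: $!\n"
           : "curl exited with status " . ($? >> 8) . "\n";
}
print @response;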
In addition to Sinan's suggestion, the timeout you are getting might point to a long-running process - always an issue when run under CGI. Please look at other solutions like a queue manager. I use Beanstalk for these situations, but I have heard good things about Gearman and The Schwartz.
I also learned a lot about running processes that take a long time under CGI from this article.
After looking at the error log and seeing the error
[Mon Nov 30 14:59:07 2009] [error] slurp_filename('/var/www/vhosts/mydomain.net/httpdocs /youtube/youtube.pl') / opening: (2) No such file or directory at /usr/lib64/perl5/vendor_perl/5.8.6/x86_64-linux-thread-multi/ModPerl/RegistryCooker.pm line 540
I thought it had something to do with the fact that I'm passing the XML to curl as a file instead of as a string. Here is the new command, which works with the XML passed as a string instead:
@curl = ('/usr/bin/curl', '-S','-v','--location', $url, '-H',
'Content-Type:'.$content_type,'-H',"Authorization: $authorization",'-H',
"X-Gdata-Key:$gdata_key",'-H',"Content-Length:$content_length",'-H',
'GData-Version:2','--data',"$xml");
And I am still using the command below to open/call curl:
open CURL, "-|", #curl;
It now runs successfully in the browser and returns me the values I am requesting with it.