CGI script doesn't run in browser - perl

I am making a .cgi file which prints all the values from a database table on a webpage in table format. The problem is that when I run the file in the PuTTY terminal emulator it works fine, but when I try to run the file in my browser I get the error message "file not found", even though I typed the correct location of the file on the server.
I can't understand what I am doing wrong. I set my file's permissions with chmod 755 * using PuTTY, but it's still not working. Is this a problem with file permissions, is the table structure wrong for running in a browser, or is it something else?
Please help...
people.cgi file
#!/usr/bin/perl
use CGI;
use DBI;
use strict;
#use warnings;
#use diagnostics;
print "Content-type:text/html\r\n\r\n";
#$q = CGI->new;
#print $q->header;
my $dsn = "DBI:mysql:Demo:localhost"; # Data source name
my $username = "mint"; # User name
my $password = "MINT123"; # Password
my $dbh;
my $sth; # Database and statement handles
$dbh = DBI->connect($dsn, $username, $password);
$sth = $dbh->prepare("SELECT * from people");
$sth->execute();
print "<h1>ganesh</h1>";
print "<table >
<tr>
<th>ID</th>
<th>Name of People Involved</th>
<th>Position</th>
<th>Roles(A user can have multiple roles)</th>
<th>Notes</th>
</tr>";
while( my $href = $sth->fetchrow_hashref )
{
print "<tr>";
print "<td>$$href{'id'}</td>";
print "<td>$$href{'name'}</td>";
print "<td>$$href{'pos'}</td>";
print "<td>$$href{'role'}</td>";
print "<td>$$href{'notes'}</td>";
#print "<td><input type='text' value=\"$$href{'display_name'}\" id =\"dis-$$href{'windows_id'}\" readonly> </td>";
#print "<td><input type='text' value=\"$$href{'email_id'}\" readonly> </td>";
print "</tr>";
}
print "</table>";
$sth->finish();
$dbh->disconnect();
Database table structure...
Table data...
Output when I run the file in PuTTY...
Message when I try running the file in my browser...

The two answers you have received previously are complete nonsense. You don't need to use a CGI object in order to run a CGI program. Of course, it makes things easier, but it's not necessary.
The only part of the CGI protocol that your program needs to handle is the Content-Type header. And you're doing that with your print line.
No, your problem is somewhere else completely. But, unfortunately, it's somewhere where we can be of very little help without knowing a lot more. You're getting a file not found error because the web server can't find your code. In other words, the address that you're typing into your browser (128.9.45.170/~pankaj.yadav/Test/cgi/people.cgi) doesn't match a filename on your web server.
This all comes down to how your web server is configured. How are web addresses mapped onto file paths? We don't know. Only your web server administrator will know the answer for sure.
You might get a clue if you look at the web server error log. You'll see a file not found error in the log which will (hopefully) contain the actual file path that the web server is trying to find. And that might help you work out where you should put your CGI program.
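One cheap way to narrow it down is to drop a trivial CGI script into the same directory and request it with the same kind of URL. This is only a sketch (the name test.cgi is made up), but if even this returns "file not found", the problem is the URL-to-directory mapping rather than anything in people.cgi:
#!/usr/bin/perl
# test.cgi - hypothetical minimal script, placed next to people.cgi
use strict;
use warnings;
print "Content-type: text/plain\r\n\r\n";
print "CGI is working\n";
print "This script was executed from: $0\n"; # the path the server actually ran
Remember to chmod 755 test.cgi as well. If this works and people.cgi still doesn't, the problem is specific to that file; if neither works, it's the server configuration.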

Related

I could not download a specific page via Perl get, bash command GET, or wget

I have an issue with downloading a page.
my $url='http://www.ncbi.nlm.nih.gov/nuccore?linkname=pcassay_nucleotide&from_aid=504934,1806,1805,1674';
I can browse this URL with a browser, but when I run the bash command from Perl or a Linux shell,
GET $url >OUTPUT1; # it does not even write anything to the file "OUTPUT1"
When I try wget, it downloads, but not the correct page; I mean it comes back with <title>Error - Nucleotide - NCBI</title>. I want the page with the items, but it returns a page without items.
my $html = qx{wget --quiet --output-document=OUTPUT1 $url};
Note: I noticed a few minutes ago that the URL is OK with Mozilla Firefox, but it cannot be browsed via Google Chrome. That is weird; my issue is probably related to this too. Any idea?
Code from link:
my $url='http://www.ncbi.nlm.nih.gov/nuccore?linkname=pcassay_nucleotide&from_aid=504934,1806,1805,1674';
my $html = qx{wget --quiet --output-document=OUTPUT11 $url};
# wget gets something, but it does not get the items; it gets what I get via Google Chrome
`GET $url2 >OUTPUT11`; # it does not write anything to the file
OK, given your code, the problem is almost certainly one of interpolation, because the & in your URL is going to be interpreted by the shell you're spawning as 'background this process'.
That's almost certainly not what you want. Why not just use LWP natively?
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple;

my $url = 'http://www.ncbi.nlm.nih.gov/nuccore?linkname=pcassay_nucleotide&from_aid=504934,1806,1805,1674';

# get() returns undef on failure, so check before using the content
my $content = get $url;
defined $content or die "Couldn't fetch $url";

print $content;

open( my $output_fh, '>', 'output.html' ) or die $!;
print {$output_fh} $content;
close($output_fh);
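If you do want to keep using wget or GET through the shell, the immediate fix for the & problem is to quote the interpolated URL so the shell sees it as a single argument; a sketch:
my $html = qx{wget --quiet --output-document=OUTPUT1 '$url'};
That stops the shell from backgrounding half the command, though as you saw with Chrome, the site may still serve you a different page than your interactive browser gets.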

Perl Cgi: How to popup confirm with password

How can I pop up a confirmation prompt asking for a password (a different password, not the same as the SSH password), and then execute a command with SSH?
Package
use CGI qw/:standard/;
use DBI;
use strict;
use Time::Local;
use Net::SSH2;
#use Term::ReadKey; # maybe?
Main Code
my $pw;
my $pws = "1234";
if ($lqcgi->param('sel_cl')) {
    $selected_action = $lqcgi->param('sel_cl');
    print "Password: \n"; #How to popup confirm with password. I don't know!
    $pw = <STDIN>;
    chomp($pw);
    if ($pw eq $pws) {
        $ssh2->connect($host{$mxEnv}) or die $!;
        $ssh2->auth_password($usr{$mxEnv}, $pwd{$mxEnv});
    }
}
I had this problem too. The solution I found was to maintain an .htaccess file on my web server with the appropriate entries in the web server's configuration.
For more information, go here:
https://httpd.apache.org/docs/2.4/howto/htaccess.html
This will allow the CGI script to show an authentication pop-up and authenticate against your LDAP, if you have LDAP maintained; otherwise you can use the REMOTE_USER variable to get the user ID and cross-check it against an array or a file, as in the sketch below.
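As a rough illustration of the REMOTE_USER approach (the user names in the allowed list are made up), the CGI script could read the variable the web server sets after the .htaccess authentication succeeds and check it against its own list:
#!/usr/bin/perl
use strict;
use warnings;
use CGI qw/:standard/;
print header('text/html');
# REMOTE_USER is filled in by the web server once the .htaccess auth has passed
my $user = $ENV{REMOTE_USER} || '';
# hypothetical list of users allowed to trigger the SSH action
my %allowed = map { $_ => 1 } qw(alice bob);
if ($allowed{$user}) {
    print "<p>Welcome $user, running the command...</p>";
    # ... $ssh2->connect(...) and the rest of the SSH code would go here ...
}
else {
    print "<p>Sorry, $user is not allowed to run this.</p>";
}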
I hope this helps.
-Shantesh

Perl CGI To Write File

I have written a Java applet as a school project and I need a CGI file to create a file in the cgi-bin directory. The problem is that when I run the code from the browser, the code executes but no file is created with the variable name. Nothing is created. Here is the code:
#!/usr/bin/perl -wT
use CGI;
print "content-type: text/plain\n\n";
my $q = CGI->new();
my $name = $q->param('username');
my $pw = $q->param('param');
my $bool = $q->param('bool');
my $rel = $q->param('rel');
my $ext = ".txt";
my $strt = "../cgi-bin/";
my $app = $strt . $name . $ext;
print $app;
open (FILE,'>',$app) or print "Error";
print FILE $pw . "\n";
print FILE $bool . "\n";
print FILE $rel;
close(FILE);
When I run the cgi it prints the $app variable and it is the correct address I want but the file is not created. If I change the line
open (FILE,'>',$app) or print "Error";
to
open (FILE,'>','../cgi-bin/test.txt') or print "Error";
it creates the file where I want it. Any ideas why this would happen when using the variable $app? Either way, I never get "Error" printed to the browser.
SOLUTION:
Thanks guys for the help. When using:
use CGI::Carp qw(fatalsToBrowser);
I got this error:
Content-type: text/html
<H1>Software error:</H1>
<PRE>Insecure dependency in open while running with -T switch
</PRE>
<P>
For help, please send mail to the webmaster (or webmaster), giving this error message
and the time and date of the error.
It seems it did not like the -T. Once I removed that, it worked. Thanks again.
Why do you use ../cgi-bin to write into cgi-bin?
Just use:
open (FILE, '>', "$name$ext") or die $!;
and use CGI::Carp qw(fatalsToBrowser); to carp fatals to the browser (suitable for debugging this file-creation problem).
-T is Perl's "tainted data" flag. It stops you from doing unsafe operations with untrusted data. Yes, your script works without the -T flag, but now you have a very insecure script.
If someone passes in a username value of ../../../../../../../../home/badguy/secret, then you will write the username and password into secret.txt in badguy's home directory. -T prevents you from doing that. That's why -T exists.
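For completeness, here is a sketch of how the original script could keep -T and still create the file, by untainting the username against a whitelist pattern (the pattern is only an example; tighten it to whatever your applet actually sends):
my $name = $q->param('username');
# A successful regex capture is treated as untainted by Perl,
# so only values made purely of word characters get through.
if (defined $name && $name =~ /\A(\w+)\z/) {
    $name = $1;
}
else {
    print "Invalid username\n";
    exit;
}
my $app = "../cgi-bin/" . $name . ".txt"; # now safe to open under -T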

perl script can not be opened with browser and is ok with Linux shell command

I have a few Perl script files which have been up and running for the past few years. All of a sudden, for the past few days, they have been up, down, up, down, ..... There are no syntax errors in them, since they sometimes work and have been there for quite a while, and I did not change them recently. Also, I can run them from the Linux shell command line without any problem, and the file permissions are 755, so everything seems to be set up properly. They are hosted by a web hosting company, and I have no access to the server log file.
The error message is the typical Perl error message:
"Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator and inform them of the time the error occurred, and the actions you performed just before this error.
More information about this error may be available in the server error log."
Add use CGI::Carp qw( fatalsToBrowser ); early in your program to have the error returned to the browser.
Alternatively, you could use the same technique CGI::Carp uses, or add a wrapper to your script, to save the errors in your own log file.
Add the following to the start of a script to have it log errors and warnings to a log file of your choice.
sub self_wrap {
    my $log_qfn = "$ENV{HOME}/.web.log"; # Adjust as needed.
    open(my $log_fh, '>>', $log_qfn)
        or warn("Can't append to log file \"$log_qfn\": $!"), return;

    require IPC::Open3;
    require POSIX;

    # Prefix/suffix for each logged line, in an Apache-like format.
    my $prefix = sprintf("[%s] [client %s] ",
        POSIX::strftime('%a %b %d %H:%M:%S %Y', localtime),
        $ENV{REMOTE_ADDR} || '???',
    );
    my $suffix = $ENV{HTTP_REFERER} ? ", $ENV{HTTP_REFERER}" : '';

    # Re-run this same script as a child, capturing only its STDERR.
    my $pid = IPC::Open3::open3(
        '<&STDIN',
        '>&STDOUT',
        local *CHILD_STDERR,
        $^X, $0, @ARGV,
    );

    # Copy everything the child writes to STDERR into the log file,
    # and pass it through to the real STDERR as well.
    while (<CHILD_STDERR>) {
        print(STDERR $_);
        chomp;
        print($log_fh $prefix, $_, $suffix, "\n");
    }

    waitpid($pid, 0);
    POSIX::_exit(($? & 0x7F) ? ($? & 0x7F) | 0x80 : $? >> 8);
}

BEGIN { self_wrap() if !$ENV{WRAPPED}++; }
If your site has recently been transferred onto a different server by your hosting company, or the server settings have recently been changed, try saving the file with HTML-Kit using 'Save as extra' >> 'Save as UNIX format' and then uploading it again.
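Line endings are another thing worth checking if the files may have been re-saved or transferred from Windows: a carriage return after the #! line can make the server fail to find the interpreter. One possible way to strip them without an editor is a Perl one-liner run on a copy of the script (the file name here is just a placeholder):
perl -pi -e 's/\r\n/\n/g' myscript.cgi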

Display Output In Browser Perl CGI SSH

I'm executing remote commands with Net::OpenSSH using a web front end. My commands complete without failure on the command line, but I get nothing in a web browser. I've done a couple of hours of research to no avail. Any ideas?
Here is some code to give you an example (some removed for obvious reasons).
#!/usr/bin/perl -w
use strict;
use CGI ':standard';
use Net::OpenSSH;

# Here in the code is just the header and standard tags

print "1";
print "2"; # both display

my $ssh = Net::OpenSSH->new($host, user => $uname, key_path => $key); # all works
$ssh->error and die "Can't ssh to host" . $ssh->error;

print "3";

$ssh->system("uname -a") or
    die "remote command failed: " . $ssh->error;

my @lsa = $ssh->capture("ls -a");
$ssh->error and
    die "remote ls command failed: " . $ssh->error;

print "4";
print "5";

print @lsa; # won't display in browser, just terminal/CLI
Cheers!
I maintain CGI.pm. I recommend these additions to your simple script:
Before you print anything else, print the standard HTTP header: print header();
Add this after the use CGI line: use CGI::Carp qw(fatalsToBrowser); ... that will display any run-time problems in the browser. If you don't get any output after these changes, check that the script compiles with perl -cw script.pl
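Putting those suggestions together, the top of the script might look something like this sketch (the rest of your code stays as it is):
#!/usr/bin/perl -w
use strict;
use CGI ':standard';
use CGI::Carp qw(fatalsToBrowser); # report run-time errors in the browser
use Net::OpenSSH;
print header(); # send the HTTP header before printing anything else
print "1";
print "2";
# ... the SSH code continues unchanged ...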
Below is about the minimum Perl code that worked for me on a Debian machine. I suggest you go through it and compare it to your actual code.
However, it did not work out of the box on my Debian; I had to make some decisions, most of which probably aren't very safe, but that's more about my specific environment:
make the home directory of the user the server runs as writable (/var/www)
add the host to ~/.ssh/known_hosts beforehand
use strict_mode => 0 to bypass Net::OpenSSH's security checks instead of finding a proper ctl_dir (Net::OpenSSH requires that the folder and all folders above it are 0755 or stricter, so the /tmp I used is normally not good enough)
I believe there are much safer practices than that, but as I said, that's specific to my environment.
So the code:
#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;
use File::Temp qw/ tempdir /;

# necessary minimum for CGI
print "Content-type: text/plain\n\n";

# prepare temp dir
my $temp = tempdir("/tmp/sshme.pl-XXXXXXXX", CLEANUP => 1);

# open SSH session
my %opts = (
    user        => "user",
    password    => "password",
    ctl_dir     => $temp,
    strict_mode => 0,    ## NOT recommended - see my comments
);
my $ssh = Net::OpenSSH->new("host", %opts);
$ssh->error
    and die "Couldn't establish SSH connection: " . $ssh->error;

# perform command and print output
my @lines = $ssh->capture("ls")
    or die "remote command failed: " . $ssh->error;
print @lines;
Perhaps your errors get directed to standard error, not standard output. In that case, they'll usually end up in the server log, not the browser window. Perhaps you can use POSIX::dup2 to avoid this:
use POSIX;
# Make sure to send HTTP headers before redirecting streams!
POSIX::close(2); # close original stderr stream. No more output to apache logs!!!
POSIX::dup2(1,2); # redirect error stream to standard output, so errors will appear in browser.
Processes launched by Perl, e.g. an ssh binary, will inherit these streams.