check return code from curl and if not 0 email admin - perl

I am trying to check the status of our AWS servers using something similar to the line below:
time curl -k https://blablabla.azazonaws.com/ratingsvc
If the connection fails, times out, etc., I would like to email the admin. I am having trouble getting the error code into a variable.

To get further control of the HTTP GET (status codes, message contents, etc.), I recommend LWP::UserAgent:
use LWP::UserAgent;
my $ua = LWP::UserAgent->new;
my $response = $ua->get('https://blablabla.azazonaws.com/ratingsvc');
warn "GET failed: ", $response->status_line, "\n" unless $response->is_success;
$response is an instance of HTTP::Response, so you get total control of the GET response.

You can obtain the command's exit code like this:
use strict;
use warnings;
my $command = 'curl -k https://blablabla.azazonaws.com/ratingsvc';
my $status = system($command);
if ($status == 0) {
    print "Everything went well...\n";
} else {
    # system() returns the raw wait status; shift it right by
    # eight bits to recover curl's actual exit code
    printf "Oops... curl exited with code %d\n", $status >> 8;
}
That will also print curl's STDOUT and STDERR to the screen, but it will not affect your program. If you don't want to see the command's output, append something like >/dev/null 2>&1; you'll still have the exit code.
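For completeness, the value returned by system is the same wait status found in $?, and it packs more than just the exit code. A short sketch, using perl itself as a stand-in for curl so it runs anywhere, shows how to unpack it:

```perl
use strict;
use warnings;

# Run a command that exits with code 6 (the code curl uses for
# "could not resolve host"); perl itself stands in for curl here.
my $status = system($^X, '-e', 'exit 6');

if ($status == -1) {
    # the command could not be launched at all
    print "failed to launch command: $!\n";
}
elsif ($status & 127) {
    # the command was killed by a signal
    printf "command died from signal %d\n", $status & 127;
}
else {
    # normal termination: high byte holds the exit code
    printf "command exited with code %d\n", $status >> 8;
}
```

A non-zero exit code in the final branch is the place to hook in your email-the-admin logic.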
HTH
Francisco

Related

Perl url encoding using Curl

I am having an issue where I am using cURL inside a Perl script to execute an HTTP request. I believe my issue is related to special characters in the URL string, but I cannot figure out how to make it work.
I can confirm that the URL is correct, as I can run it from my browser.
My Perl script is:
#!/usr/bin/perl
use strict;
use warnings;
my $url = "http://machine/callResync?start=2017-02-01 00.00.00.000&end=2017-02-01 23.23.999";
system "curl $url";
It fails when it reaches the first whitespace. I tried to escape that using %20.
After that I put in %26 to escape the &, but then I get another issue. I have tried a number of different combinations but it keeps failing.
Any ideas?
Use the URI module to correctly build the URL, and rather than shelling out to cURL you should use a Perl library like LWP::Simple to access the page.
The disadvantage of LWP::Simple is that it may be too simple, in that it provides no diagnostics if the transaction fails. If you find you need something more elaborate then you should look at
HTTP::Tiny,
LWP::UserAgent, or
Mojo::UserAgent.
If you need help with these then please ask.
use strict;
use warnings 'all';
use URI;
use LWP::Simple 'get';
my $url = URI->new('http://machine/callResync');
$url->query_form(
    start => '2017-02-01 00.00.00.000',
    end   => '2017-02-01 23.23.999',
);
my $content = get($url) or die "Failed to access URL";
Problem number 1: You used an invalid URL. Spaces can't appear in URLs.
my $url = "http://machine/callResync?start=2017-02-01%2000.00.00.000&end=2017-02-01%2023.23.999";
Problem number 2: Shell injection error. You didn't correctly form your shell command.
system('curl', $url);
or
use String::ShellQuote qw( shell_quote );
my $cmd = shell_quote('curl', $url);
system($cmd);
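If installing URI is not an option, HTTP::Tiny (in the Perl core since 5.14) can build the same percent-encoded query string. A sketch using the values from the question:

```perl
use strict;
use warnings;
use HTTP::Tiny;   # core module since Perl 5.14

# www_form_urlencode percent-encodes each key and value (so the
# spaces become %20) and joins the pairs with & for you.
my $query = HTTP::Tiny->new->www_form_urlencode([
    start => '2017-02-01 00.00.00.000',
    end   => '2017-02-01 23.23.999',
]);
my $url = "http://machine/callResync?$query";
print "$url\n";
```

Passing an array reference (rather than a hash reference) preserves the parameter order.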

Email script not working on IIS

I have the following test script to send an email:
use strict;
use Net::SMTP;
print "Content-Type: text/plain\n\n";
print "Sending email...\n";
my $smtp = Net::SMTP->new('10.0.0.1', Port => 25, Timeout => 10, Debug => 1);
$smtp->mail('user1@domain.local');
$smtp->to('user2@domain.local');
$smtp->data();
$smtp->datasend("From: user1\@domain.local\n");
$smtp->datasend("To: user2\@domain.local\n");
$smtp->datasend("Subject: Test\n\n");
$smtp->datasend("Testing 1 2 3\n");
$smtp->dataend();
$smtp->quit;
It works fine when I run it from the command line, I get the email right away. But when I put it in C:\inetpub\wwwroot and run it from a web browser, I get the Sending email... text but then nothing. No email is sent, no error message is shown. I looked at the mail server log and no connection is even made. I'm not sure why it's working from cmd but not from IIS. Is there some extra configuration needed for the script to do this through IIS?
I also tried with sendmail() and got similar results.
First of all, add the following headers to the file
use strict 'vars';
use warnings;
use diagnostics;
use feature qw/say/;
use CGI::Carp qw(warningsToBrowser fatalsToBrowser);
... your code...
# Print Warnings
warningsToBrowser(1);
That will give you more information if it's failing or throwing a warning on something.
Secondly, how did you install Net::SMTP? Make sure it, and all of its dependencies, are readable by the IIS worker process.
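One extra diagnostic, since the script behaves differently from the command line and from IIS: a CGI request sets environment variables such as GATEWAY_INTERFACE, so the script can detect which way it was started and log accordingly (a generic CGI sketch, not IIS-specific):

```perl
use strict;
use warnings;

# Web servers set GATEWAY_INTERFACE for CGI requests; it is absent
# when the script is started from a plain command prompt.
my $under_server = defined $ENV{GATEWAY_INTERFACE};
if ($under_server) {
    print "Content-Type: text/plain\n\n";
    print "Running as a CGI request under ",
        $ENV{SERVER_SOFTWARE} // 'unknown server', "\n";
}
else {
    print "Running from the command line\n";
}
```

Seeing the CGI branch fire confirms at least that IIS is executing the script; the missing SMTP connection is then down to the worker process's network or file permissions.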

Why isn't this perl cgi script redirecting?

I have a Perl CGI script that is exactly the following:
#!/usr/bin/perl -w
use CGI;
my $query = CGI->new;
print $query->redirect("http://www.yahoo.com");
At the command line things look OK:
$ perl test.pl
Status: 302 Moved
Location: http://www.yahoo.com
When I load it in the browser, http://localhost/cgi-bin/test.pl, the request gets aborted, and depending on the browser I get various messages:
Connection reset by server.
No data received.
The only research I could find on this issue, stated that a common problem is printing some data or header before the redirect call, but I am clearly not doing that here.
I'm hosting it from a QNX box with the default slinger server.
The code works fine on my machine. Check the following:
Check the error logs, e.g. tail /var/log/http/error_log
Do the chmod/chown permissions match other working CGI scripts? Compare using ls -l
Does printing the standard hello world work? Change your print statement to
print $query->header(), 'Hello World';
Add the following for better errors
use warnings;
use diagnostics;
use CGI::Carp 'fatalsToBrowser';
Running slinger at the command line will print some basic usage options. For logging you need both syslogd running and the -d option enabled in slinger, i.e.
slinger -d &
Then look to /var/log/syslog for errors
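If CGI.pm itself turns out to be the issue under slinger, the module can be ruled out by printing the redirect headers by hand. A minimal sketch (the blank line that ends the header block is mandatory):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A CGI redirect is just two headers followed by a blank line;
# build the block in a variable so it can be inspected or logged.
my $headers = "Status: 302 Found\r\n"
            . "Location: http://www.yahoo.com\r\n"
            . "\r\n";
print $headers;
```

If this bare-bones version also resets the connection, the problem is in the server configuration rather than in CGI.pm.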

Perl expect output of executed command

I wrote a little script which executes the ls command on a remote server. Now I want to store the output of that command in a variable to work with it further in the Perl script. Unfortunately I cannot grab the output of the ls command.
my $core = "servername";
my $tel_con = "ssh $core -l $user";
my $ssh = Expect->spawn($tel_con);
$ssh->log_stdout(1);
unless ( $ssh->expect( 5, -re => '$' ) ) {
return "Never connected " . $ssh->exp_error() . "\n";
}
sleep 1;
$ssh->send_slow(0, "ls -l\r");
$ssh->clear_accum();
my $str = $ssh->exp_after();
print "STR = '$str'\n";
Maybe you can give me some help please?
use Net::OpenSSH;
my $ssh = Net::OpenSSH->new($core, user => $user);
$ssh->error and die $ssh->error;
my $output = $ssh->capture('ls -l');
print "command output:\n$output\n\n";
In case you cannot or don't want to use Net::OpenSSH you may do:
my @output = $exp->expect(5);
print 'OUT: '.$output[3].'END';
To get the whole output (including the command used, return string, and console information)
you could call expect in a separate process and grab the output via qx// or open a pipe.
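The qx// and pipe approaches mentioned above look like this; a local command is used so the sketch runs anywhere, but over SSH you would substitute something like ssh $core -l $user ls -l:

```perl
use strict;
use warnings;

# qx// (backticks) captures the command's entire stdout as one string.
my $out = qx(echo hello);
chomp $out;

# A pipe open reads the output line by line; the list form of open
# bypasses the shell entirely.
open my $fh, '-|', 'echo', 'hello' or die "cannot fork: $!";
my @lines = <$fh>;
close $fh;

print "qx captured '$out', pipe captured '$lines[0]'";
```

Either way the output lands in a plain Perl variable, which is what the Expect accumulator was making difficult.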
What is send_slow? Depending on how this command sends the ls command, the output can be received in different ways.
Most probably, the exit status of the command is stored in the $? variable (byte-shifted, so $? >> 8 gives the actual code).
It seems that Expect will redirect everything to STDOUT and log it internally. Since you enabled output with $ssh->log_stdout(1), you should be able to get the results of ls -l directly from STDOUT (perhaps by redirecting standard out to a variable). You can also try to grab the data from the internal logging. Just make sure to grab the output before calling clear_accum().
From CPAN: $object->send_slow($delay, @strings);
... After each character $object will be checked to determine whether or not it has any new data ready and if so update the accumulator for future expect() calls and print the output to STDOUT and @listen_group if log_stdout and log_group are appropriately set.

How do I get the output of curl into a variable in Perl if I invoke it using backtics?

I'm trying to get the response of a curl call into a variable in Perl.
my $foo = `curl yadd yadda`;
print $foo;
does not work. When I run this at the command line the curl call prints all its output correctly in the terminal, but the variable is not filled with that data.
Is there a way to do this without installing and calling the Perl curl lib?
curl probably sends its progress output to stderr. Try
my $foo = `curl yadd yadda 2>&1`;
You also might consider looking at LWP::UserAgent or even LWP::Simple.
What do you really want to do? Use curl at all costs, or grab the contents of a web page?
A more perlish way of doing this (which relies on no external programs that may or may not be installed on the next machine where you need to do this) would be:
use LWP::Simple;
my $content = get("http://stackoverflow.com/questions/1015438/")
or die "no such luck\n";
If you want to see why the GET failed, or grab multiple pages from the same site, you'll need to use a bit more machinery. perldoc lwpcook will get you started.
In the shell, 2> means redirect fileno 2. Fileno 2 is always what a program sees as stderr. Similarly, fileno 0 is stdin and fileno 1 is stdout. So, when you say 2>&1 you are telling the shell to redirect stderr (fileno 2) into stdout (fileno 1). Since the backticks operator uses the shell to run the command you specify, you can use shell redirection, so
my $foo = `curl yadda yadda 2>&1`;
tells the shell to redirect curl's stderr into stdout, and since the backtick operator captures stdout, you get what you were looking for.
Very old post, but the real way to use curl in backticks is with the appropriate curl switch.
This switch is -o, which defines where to send the output.
More from the curl man page:
Specifying the output as '-' (a single dash) will force the output to
be done to stdout.
This also keeps possible error messages out of $foo, which would end up there if you redirected the complete STDERR to STDOUT:
my $foo = `curl -o - yadd yadda`;
Try this:
$var = `curl "http://localhost" 2>/dev/null`;
print length($var)
curl displays progress information on stderr, redirecting that to /dev/null makes it easier to see what's going on.
This works on my system:
#!/usr/bin/perl
use strict;
use warnings;
my $output = `curl www.unur.com`;
print $output;
__END__
C:\> z1
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd"><html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
etc.
You can open a pipe as if it were a file.
my $url = "http://download.finance.yahoo.com/d/quotes.csv?s=$symbol&f=sl1d1t1c1ohgvper&e=.csv";
open my $curl, '-|', 'curl', '-s', $url
    or die "single_stock_quote: Can't open curl: $!\n";
my $line = <$curl>;
close $curl;
It might be that some of the output you want to capture is on standard error, not standard out. Try this:
my $foo = `curl http://www.stackoverflow.com 2>&1`;
print $foo;