Why does my Net::Telnet program time out? - perl

I've written a small script to connect to a remote server using Perl, but I'm seeing errors:
#!/usr/bin/perl -w
use Net::Telnet;
$telnet = new Net::Telnet ( Timeout=>60, Errmode=>'die');
$telnet->open('192.168.50.40');
$telnet->waitfor('/login:/');
$telnet->print('queen');
$telnet->waitfor('/password:/');
$telnet->print('kinG!');
$telnet->waitfor('/:/');
$telnet->print('vol >> C:\result.txt');
$telnet->waitfor('/:/');
$telnet->cmd("mkdir vol");
$telnet->print('mkdir vol234');
$telnet->cmd("mkdir vol1");
$telnet->waitfor('/\$ $/i');
$telnet->print('whoamI');
print $output;
But when I run it I get the following error:
C:\>perl -c E:\test\net.pl
E:\test\net.pl syntax OK
C:\>perl E:\test\net.pl
command timed-out at E:\test\net.pl line 13
C:\>
Help me in this regard. I'm new to Perl.

I'm not sure about Net::Telnet specifically, but I suspect the '/:/' (with quotes) is the problem. /:/ (within slashes) is a regular expression, but with quotes it's simply a string that should appear on the terminal (i.e. it waits for the literal string '/:/' - slash, colon, slash).
To debug such programs (if Net::Telnet doesn't echo the interaction with the remote system), you can simply put print "I'm waiting for login...\n" before each waitfor() call.
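For example, a minimal sketch around the script in the question (the log file name is just an example; input_log is a standard Net::Telnet method that records everything read from the remote side):
# Record everything Net::Telnet reads from the remote side, so you can
# see which prompt the script is actually stuck waiting for.
$telnet->input_log('telnet_session.log');

print "I'm waiting for the login prompt...\n";
$telnet->waitfor('/login:/');
print "Got it, sending the user name...\n";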

Why aren't you using Net::Telnet's login method to log in? When you're working at such a low level, you have to handle all of the details yourself. If you look at the source of that method, you'll see it does quite a bit of work, including a kludge to get around a login bug on Linux.
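A minimal sketch of that approach, reusing the host, credentials and a command from the question (the prompt handling may need adjusting for your server):
#!/usr/bin/perl
use strict;
use warnings;
use Net::Telnet;

my $telnet = Net::Telnet->new( Timeout => 60, Errmode => 'die' );
$telnet->open('192.168.50.40');

# login() waits for the login: and password: prompts and then for the
# shell prompt, so you don't have to script each waitfor() yourself.
$telnet->login('queen', 'kinG!');

# cmd() sends a command, waits for the prompt again, and returns the output.
my @output = $telnet->cmd('vol');
print @output;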

Related

Perl as a batch-script tool - fully piping child processes?

Apologies if some of the terminology is slightly off here. Feel free to correct me if I use a wrong term for something.
Is it possible to use Perl as an "advanced shell" for running "batch" scripts? (on Windows)
The problem I face when replacing a .bat/.cmd script that's getting too complicated with a Perl script is that I can't easily run sub-processes the way a shell does.
That is, I would like to do the same thing from my Perl script as a shell does when invoking a child process: fully "connect" STDIN, STDOUT and STDERR.
Example:
foo.bat -
#echo off
echo Hello, this is a simple script.
set PARAM_X=really-simple
:: The next line will allow me to simply "use" the tool on the shell I have open,
:: that is STDOUT + STDERR of the tool are displayed on my STDOUT + STDERR and if
:: I enter something on the keyboard it is sent to the tools STDIN
interactive_commandline_tool.exe %PARAM_X%
echo The tool returned %ERRORLEVEL%
However, I have no clue how to fully implement this in perl (is it possible at all?):
foo.pl -
print "Hello, this is a not so simple script.\n";
my $param_x = get_more_complicated_parameter();
# Magic: This sub executes the process and hooks up my STDIN/OUT/ERR and
# returns the process error code when done
my $errlvl = run_executable("interactive_commandline_tool.exe", $param_x);
print "The tool returned $errlvl\n";
How can I achieve this in perl? I played around with IPC::Open3 but it seems this doesn't do the trick ...
You'll probably find IPC::Run3 useful. It allows you to capture both STDOUT and STDERR (but not pipe them in real time). The command's exit status is returned in $?.
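A minimal sketch, reusing the tool name from the question (the parameter value is made up):
use strict;
use warnings;
use IPC::Run3;    # from CPAN

my @cmd = ('interactive_commandline_tool.exe', 'really-simple');
my ($stdout, $stderr);

# run3() runs the command, feeds it nothing on STDIN (\undef) and captures
# STDOUT and STDERR into scalars once the command has finished.
run3 \@cmd, \undef, \$stdout, \$stderr;

my $errlvl = $? >> 8;    # exit status, just as with system()
print "The tool returned $errlvl\n";
print "stdout:\n$stdout\nstderr:\n$stderr\n";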
Why not this way:
print "Hello, this is a not so simple script.\n";
my $param_x = get_more_complicated_parameter();
system('cmd.exe', $param_x);
my $errorlevel = $? >> 8;
print "The tool returned $errorlevel\n";
sub get_more_complicated_parameter { 42 }
I don't have your interactive program, but the shell it executed let me enter commands, it inherited the environment defined in Perl, and so on.
I have been using Perl as a replacement for more complicated shell scripts on Windows for a long time, and so far everything I needed has been possible.
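If you want something shaped like the run_executable() from the question, here is a sketch built on the same list form of system() (the helper name comes from the question, not from any module):
sub run_executable {
    my @cmd = @_;

    # system() with a list bypasses the shell; the child inherits this
    # script's STDIN, STDOUT and STDERR, just as a shell-launched process would.
    system(@cmd);

    die "failed to run $cmd[0]: $!" if $? == -1;
    return $? >> 8;    # the tool's exit code
}

my $errlvl = run_executable('interactive_commandline_tool.exe', 'really-simple');
print "The tool returned $errlvl\n";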

How can I get entire command line string?

I'm writing a Perl script that mimics gcc; the script needs to process some of gcc's stdout. The processing part is done, but I can't get the simple part working: how can I forward all the command-line parameters as-is to the next process (gcc in my case)? Command lines sent to gcc tend to be very long and can contain lots of escape sequences, and I don't want to play the escaping game myself - I know it's tricky to get right on Windows in complicated cases.
Basically,
gcc.pl some crazies\ t\\ "command line\"" - and gcc.pl has to forward that same command line to the real gcc.exe (I use Windows).
I currently do it like this: open("gcc.exe $cmdline 2>&1 |"), so that stderr from gcc is fed to stdout and my Perl script processes that stdout. The problem is that I can't find anywhere how to construct that $cmdline.
I would use AnyEvent::Subprocess:
use feature 'say';    # for say()
use AnyEvent::Subprocess;

my $process_line = sub { say "got line: $_[0]" };

my $gcc = AnyEvent::Subprocess->new(
    code      => [ 'gcc.exe', @ARGV ],
    delegates => [
        'CompletionCondvar',
        'StandardHandles',
        { MonitorHandle => {
            handle   => 'stdout',
            callback => $process_line,
        }},
        { MonitorHandle => {
            handle   => 'stderr',
            callback => $process_line,
        }},
    ],
);

my $running = $gcc->run;
my $done    = $running->recv;
$done->is_success or die "OH NOES";
say "it worked";
The MonitorHandle delegate works like redirection, except you have the option of using a separate filter for each of stdout and stderr. The "code" arg is an arrayref representing a command to run.
"Safe Pipe Opens" in the perlipc documentation describes how to get another command's output without having to worry about how the shell will parse it. The technique is typically used for securely handling untrusted inputs, but it also spares you the error-prone task of correctly escaping all the arguments.
Because it sidesteps the shell, you'll need to create the effect of 2>&1 yourself, but as you'll see below, it's straightforward to do.
#! /usr/bin/perl
use warnings;
use strict;

my $pid = open my $fromgcc, "-|";
die "$0: fork: $!" unless defined $pid;

if ($pid) {
    while (<$fromgcc>) {
        print "got: $_";
    }
}
else {
    # 2>&1
    open STDERR, ">&STDOUT" or warn "$0: dup STDERR: $!";
    no warnings "exec";    # so we can write our own message
    exec "gcc", @ARGV or die "$0: exec: $!";
}
Windows proper does not support open FH, "-|", but Cygwin does so happily:
$ ./gcc.pl foo.c
got: gcc: foo.c: No such file or directory
got: gcc: no input files
Read up on the exec function and the system function in Perl.
If you provide either of these with an array of arguments (rather than a single string), it invokes the Unix execve() function or a close relative directly, without letting the shell interpret anything, exactly as you need it to do.
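A minimal sketch of that list form, assuming the wrapper only needs to hand its own arguments straight through (merging stderr into stdout, as discussed above, still has to be arranged separately):
use strict;
use warnings;

# No shell is involved, so each element of @ARGV reaches gcc.exe
# exactly as this wrapper received it.
my $status = system('gcc.exe', @ARGV);
die "could not run gcc.exe: $!" if $status == -1;
exit($status >> 8);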
Thanks for the answers. I came to the conclusion that I made a big mistake touching Perl again: hours of time wasted to find out that it can't be done properly.
Perl splits command-line parameters differently from all other apps that use the MS stdlib (which is the standard on Win32).
Because of that, some command-line parameters that were meant to be interpreted as a single command-line argument can be interpreted by Perl as more than one argument. That means everything I'm trying to do is a waste of time, because of that buggy behaviour in Perl. It's impossible to get this task done correctly if 1) I can't access the original command line as-is and 2) Perl doesn't split command-line arguments correctly.
As a simple test:
script.pl """test |test"
on Win32, Perl will incorrectly interpret the command line as:
ARGV=['"test', '|test']
Whereas the correct "answer" on Windows has to be:
ARGV=['"test |test']
I used ActiveState Perl, and I also tried the latest version of Strawberry Perl: both suck. It appears that the perl that comes with MSYS works properly, most likely because it was built against MinGW instead of the Cygwin runtime?..
The problem with Perl is that it has a buggy command-line parser, and it won't work on Windows NO MATTER WHAT Cygwin supports or not.
I have a simple case where an environment variable (which I cannot control) expands to
perl gcc.pl -c "-IC:\ffmpeg\lib_avutil\" rest of args
Perl sees that I have only two args: -c and '-IC:\ffmpeg\lib_avutil" rest of args',
whereas any conforming Windows implementation receives the second command-line arg as '-IC:\ffmpeg\lib_avutil\'. That means Perl is a huge pile of junk for my simple case, because it doesn't provide adequate means to access the command-line arguments. I'm better off using boost::regex and doing all my parsing in C++ directly; at least I won't ever make dumb mistakes like ne vs != for comparing strings. Windows' escaping rules for command-line arguments are quite strange, but they are the standard on Windows, and Perl for some strange reason doesn't want to follow the OS's rules.

How can I send POST and GET data to a Perl CGI script via the command line?

I am trying to send a GET or a POST through a command-line argument. That is, I want to test the script from the command line before testing it through a browser (the server has issues). I tried searching online, and I suppose I was using incorrect terminology, because I got nothing. I know this is possible because I saw someone do it; I just don't remember how it was done.
Thanks! :)
To test a CGI program from the command line, you fake the environment that the server creates for the program. CGI.pm has a special offline mode, but I often find it easier not to use it because of the extra setup I need for everything else my programs typically expect.
Depending on the implementation of your script, this involves setting many environment variables, which you can do from a wrapper script that pretends to be the server:
#!/bin/bash
export HTTP_COOKIE=...
export HTTP_HOST=test.example.com
export HTTP_REFERER=...
export HTTP_USER_AGENT=...
export PATH_INFO=
export QUERY_STRING=$(cat query_string);
export REQUEST_METHOD=GET
perl program.cgi
If you're doing this for a POST request, the environment is slightly different and you need to supply the POST data on standard input:
#!/bin/bash
export CONTENT_LENGTH=$(perl -e "print -s q/post_data/");
export HTTP_COOKIE=...
export HTTP_HOST=test.example.com
export HTTP_REFERER=...
export HTTP_USER_AGENT=...
export PATH_INFO=...
export QUERY_STRING=$(cat query_string);
export REQUEST_METHOD=POST
perl program.cgi < post_data
You can make this as fancy as you need, and each time you want to test the program you just change the data in the query_string or post_data files. If you don't want to do this in a shell script, it's just as easy to write a wrapper Perl script, as sketched below.
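A minimal sketch of such a Perl wrapper, using the same hypothetical host and file names as the shell version above:
#!/usr/bin/perl
use strict;
use warnings;

# Fake the environment the web server would normally provide.
$ENV{HTTP_HOST}      = 'test.example.com';
$ENV{REQUEST_METHOD} = 'GET';

# Slurp the query string from a file, as the shell wrapper does with $(cat ...).
my $qs = do { local ( @ARGV, $/ ) = ('query_string'); <> };
chomp $qs;
$ENV{QUERY_STRING} = $qs;

# Run the CGI program with the same perl interpreter as this wrapper.
system( $^X, 'program.cgi' ) == 0 or die "program.cgi failed: $?";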
Are you using the standard CGI module?
For example, with the following program (notice -debug in the arguments to use CGI)
#! /usr/bin/perl
use warnings;
use strict;
use CGI qw/ :standard -debug /;
print "Content-type: text/plain\n\n",
map { $_ . " => " . param($_) . "\n" }
param;
you feed it parameters on the command line:
$ ./prog.cgi foo=bar baz=quux
Content-type: text/plain
foo => bar
baz => quux
You can also do so via the standard input:
$ ./prog.cgi
(offline mode: enter name=value pairs on standard input; press ^D or ^Z when done)
foo=bar
baz=quux
^D
Content-type: text/plain
foo => bar
baz => quux
Old discussion, but I was looking for the same answers - so for those who follow, this is what I found out.
RTFM! From the CGI man page (and there is more):
DEBUGGING
If you are running the script from the command line or in the perl
debugger, you can pass the script a list of keywords or parameter=value
pairs on the command line or from standard input (you don't have to
worry about tricking your script into reading from environment
variables). You can pass keywords like this:
your_script.pl keyword1 keyword2 keyword3
or this:
your_script.pl keyword1+keyword2+keyword3
or this:
your_script.pl name1=value1 name2=value2
or this:
your_script.pl name1=value1&name2=value2
To turn off this feature, use the -no_debug pragma.
If you don't want to alter the Perl script, you can call it with at least two environment variables set, as others have mentioned already. To simulate a GET request:
shell$ QUERY_STRING=limit=20 REQUEST_METHOD=GET ./events_html.pl
That's the console shortcut for www.myserver.org/events_html.pl?limit=20
Yes, it's possible to do this from the command line, bypassing your server. This page explains all: Perl CGI debugging (sitewizard.com) (Especially item 6 on that page). Here I quote the most important part:
To test the script offline using the GET method, simply set the QUERY_STRING environment variable accordingly. If you are using Windows, you might use the following command line in a DOS window prior to running the script in the same window:
set QUERY_STRING=recipient=John@Doe.com&Fullname=M+Name
To test the script offline using the POST method, put the line below into a text file named, say, testinput.txt.
recipient=John@Doe.com&Fullname=M+Name
Then redirect that file as an input to the script. On Unix systems as well as under Windows' MSDOS prompt, you can do it this way:
perl -w scriptname.pl < testinput.txt
Your script will then receive that input as though it was sent to it by a form on the website. Check the error messages that perl spouts, if any, to help you track the problem in the script.
To give a CGI script POST data:
$ echo -n 'a=b;c=d' | REQUEST_METHOD=POST CONTENT_LENGTH=999 perl index.cgi
To give a CGI script GET data:
$ perl index.cgi 'a=b;c=d'
LWP comes with ready-made scripts that can be used from the command line. Check for the GET and POST scripts on your system.
In Windows, you can use VBScript to write a command line util that calls into the MS XML library:
Dim XMLHttp : Set XMLHttp = CreateObject("Microsoft.XMLHTTP")

On Error Resume Next
strIPAddress = WScript.Arguments(0)
strMACAddress = WScript.Arguments(1)
strSubnetMask = WScript.Arguments(2)
On Error Goto 0

WScript.Echo "Attempting to wake host " & strIPAddress & " on NIC " & strMACAddress & _
    " using netmask " & strSubnetMask

strGetUrl = "http://wolService/WolService/WolService.asmx/WakeBroadcast?hostIP=" & _
    strIPAddress & "&macAddress=" & strMACAddress & "&subnetMask=" & strSubnetMask

XMLHttp.Open "GET", strGetUrl, False
XMLHttp.Send ""
WScript.Echo XMLHttp.ResponseText
Edit: This script sends HTTP requests and can be used from the command line. I got confused by the question 'How can I send POST and GET data to a Perl CGI script via the command line' and thought this was about sending POST and GET data to a Perl CGI script via the command line from an unspecified client OS.

Why does TextMate always complain 'Can't find string terminator '"'' when it runs a Perl script?

I have a long-ish Perl script that runs just fine, but always gives this warning:
Can't find string terminator '"' anywhere before EOF at -e line 1
I've read elsewhere online that this is because of a misuse of single or double quotes and the error generally stops the script from running, but mine works. I'm pretty sure I've used my quotes correctly.
Is there anything else that could cause this warning?
EDIT:
I'm running the script via TextMate, which may be spawning a new Perl process to run my script.
I actually get the error when I run simple scripts as well, like this one:
#!/usr/bin/perl -w
use strict;
use warnings;
print "Hello world.";
Yes, you are right, your script does that in TextMate when I try it too.
Simple solution: don't run it using TextMate; just use the command line:
cd Projectdirectory
chmod +x myscript.pl
./myscript.pl
Hello world
More complex solution: tell TextMate that their application is broken and wait for them to fix it. The error is coming from some other Perl script that TextMate is invoking. Even a completely blank file run as Perl in TextMate fails with this error.
-Alex
The "at -e line 1" bit means it's coming from a one-liner. I suspect your long script is somewhere starting a separate perl process (possibly indirectly), and that perl is what is giving the error (and not doing whatever it is supposed to do.)
Start the debugger by doing
perl -d ./yourscript.pl
Then keep pressing n[ENTER] (or just ENTER after you press n once) until you see the warning - the line that was just executed is your culprit. (n stands for the "next" debugger command, by the way.)

How can I get the command-line output of a DOS tool using Perl?

I want to measure the throughput of a link using the Windows built-in FTP tool inside a Perl script. The script therefore creates the following command script:
open <ip>
<username>
<password>
hash
get 500k.txt
quit
Afterwards I run the command script using the following Perl code:
@args = ("ftp", "-s:c:\\ftp_dl.txt");
system(@args);
If I run the command inside a DOS-box the output looks like this:
ftp> open <ip>
Connected to <ip>
220 "Welcome to the fast and fabulous DUFTP005 ftp-server :-) "
User (<ip>:(none)):
331 Please specify the password.
230 Login successful.
ftp> hash
Hash mark printing On ftp: (2048 bytes/hash mark) .
ftp> get 500k.txt
200 PORT command successful. Consider using PASV.
150 Opening BINARY mode data connection for 500k.txt (14336 bytes).
#######
226 File send OK.
ftp: 14336 bytes received in 0.00Seconds 14336000.00Kbytes/sec.
ftp> quit
221 Goodbye.
To get the throughput, I need to extract this line:
ftp: 14336 bytes received in 0.00Seconds 14336000.00Kbytes/sec.
I'm not very familiar with Perl. Does anybody have an idea how to get that line?
Use either open in pipe mode:
open($filehandle, "$command|") or die "did not work: $! $?";
while (<$filehandle>) {
    # do something with $_
}
or use backticks:
my @programoutput = `$command`;
You can't get the output with system().
Instead, use backticks:
my $throughput = 0;
my $output = `ftp -s:c:\\ftp_dl.txt`;
if (($? == 0) && ($output =~ /([\d+\.]+)\s*K?bytes\/sec/m)) {
    $throughput = $1;
}
$output will contain all the lines from the execution of the ftp command (but not any error message sent to STDERR).
Then we check if ftp returned success (0) and if we got a throughput somewhere in the output.
If so, we set $throughput to it.
This being Perl, there are many ways to do this:
You could also use the Net::FTP module, which supports Windows, to handle the file transfer, and a timing module like Time::HiRes to time it and calculate the throughput yourself (see the sketch below).
That way you don't depend on the ftp program (your script would not work on a localised version of Windows, for instance, without a lot of rework, and you have to rely on the ftp program being installed and in the same location).
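A minimal sketch of that approach; the host, credentials and file name are placeholders matching the <ip>/<username>/<password> in the question's command script:
use strict;
use warnings;
use Net::FTP;
use Time::HiRes qw(gettimeofday tv_interval);

my $ftp = Net::FTP->new('<ip>')         or die "connect failed: $@";
$ftp->login('<username>', '<password>') or die "login failed: ", $ftp->message;
$ftp->binary;

my $start = [gettimeofday];
$ftp->get('500k.txt')                   or die "get failed: ", $ftp->message;
my $elapsed = tv_interval($start);

my $bytes = -s '500k.txt';
printf "ftp: %d bytes received in %.2f seconds (%.2f Kbytes/sec)\n",
    $bytes, $elapsed, $bytes / 1024 / ($elapsed || 1);

$ftp->quit;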
See perlfaq8, which has several answers that deal with this topic. The ones you probably need for this question are:
Why can't I get the output of a command with system()?
How can I capture STDERR from an external command?
Also, you might be interested in some of the IPC (Interprocess Communication) Perl modules that come in the standard library:
IPC::Open2
IPC::Open3
Some of the Perl documentation might also help:
perlipc - Perl interprocess communication
perlopentut - Perl open tutorial
If you're not familiar with the Perl documentation, you might check out my Perl documentation documentation.
Good luck,
You should try libcurl, which is better suited to the task.
It has an easy-to-use API.