Perl: check if a file handle is STDIN

I am using select with a TCP server. I want to add STDIN to the select filehandle set.
#!/usr/bin/perl
use IO::Select;
use IO::Socket::INET;

my $sock = IO::Socket::INET->new(LocalPort => $serv_listen_port, Proto => 'tcp', Listen => 1);
my $s = IO::Select->new();
$s->add(\*STDIN);    # want to be responsive to user input (allow me to type commands, for example)
$s->add($sock);

@readytoread = $s->can_read(1);    # timeout = 1 sec
foreach $readable (@readytoread) {
    if ($readable == $sock) {
        # This was a listen request; I accept and add the new client here
    }
    if ($readable == STDIN) {    # what to do on this line?
        # This is the user typing input into the server on the terminal
    }
}
I need help with the 4th-to-last line in the code here.

$readable->fileno == fileno STDIN
Or, if you're comfortable with that, fileno STDIN is zero, which you can check directly.

can_read returns the exact value passed to add, so you can simply use
$readable == \*STDIN
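Putting the two answers together, a minimal sketch of the loop might look like this (the port number, the plain line-reads, and the accept handling are placeholders I've added, not part of the original question):

#!/usr/bin/perl
use strict;
use warnings;
use IO::Select;
use IO::Socket::INET;

my $sock = IO::Socket::INET->new(LocalPort => 7000, Proto => 'tcp', Listen => 1)
    or die "listen failed: $!";
my $s = IO::Select->new(\*STDIN, $sock);

while (1) {
    foreach my $readable ($s->can_read(1)) {     # timeout = 1 sec
        if ($readable == $sock) {
            # a listen request: accept and watch the new client too
            $s->add($sock->accept());
        }
        elsif ($readable == \*STDIN) {           # or: fileno($readable) == fileno(STDIN)
            my $line = <STDIN>;                  # user typed a command on the terminal
            print "command: $line" if defined $line;
        }
        else {
            # data from an already-accepted client
            my $data = <$readable>;
        }
    }
}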


Perl print backticks STDOUT

I am writing a very simple bind shell in Perl that is supposed to open a specific port on Windows; when I connect to it with ncat, I should be able to type commands in ncat, have them executed on the machine, and get STDOUT back.
Here is my code so far:
use strict; use IO::Socket;

my ($sock, $newmsg, $cmd, $MAXLEN, $PORTNO);
$MAXLEN = 2048;
$PORTNO = 4444;
$sock = IO::Socket::INET->new(LocalPort => $PORTNO, Proto => 'udp');
while ($sock->recv($newmsg, $MAXLEN)) {
    my ($port, $ipaddr) = sockaddr_in($sock->peername);
    my @cmd = qx($newmsg);
    print(@cmd);
    $sock->send(@cmd);
}
If I type something easy like echo Hello world, it returns Hello world as expected. The problem arises when I type dir or something else that includes newlines (or so I have guessed; I'm pretty new to Perl).
When typing dir, the result gets printed in the Perl shell (as expected, due to print(@cmd)), but then it also prints usage: $sock->send(BUF, [FLAGS, [TO]]) at -e line 1. and exits, and I never receive the message in ncat.
I am running the command from a command shell as a one-liner:
perl -e "use strict; use IO::Socket; my($sock, $newmsg, $cmd, $MAXLEN, $PORTNO); $MAXLEN = 1024; $PORTNO = 4444; $sock = IO::Socket::INET->new(LocalPort => $PORTNO, Proto => 'udp'); while ($sock->recv($newmsg, $MAXLEN)) {my($port, $ipaddr) = sockaddr_in($sock->peername);my #cmd = qx($newmsg);print(#cmd);$sock->send(#cmd);}"
How can I make it so that it returns the output of commands like dir?
Your problem is with $sock->send(@cmd). The send method expects the message to be a scalar, not an array. You can fix this as follows:
my $CRLF = "\x0D\x0A";            # network line break
$sock->send(join $CRLF, @cmd);    # join @cmd into a single string
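For reference, the receive loop with that fix applied would look roughly like this (a sketch that keeps the question's UDP socket and variable names):

use strict;
use IO::Socket;

my $MAXLEN = 2048;
my $PORTNO = 4444;
my $CRLF   = "\x0D\x0A";   # network line break

my $sock = IO::Socket::INET->new(LocalPort => $PORTNO, Proto => 'udp')
    or die "socket: $!";

my $newmsg;
while ($sock->recv($newmsg, $MAXLEN)) {
    my @cmd = qx($newmsg);             # run the received command, capture its output lines
    print @cmd;                        # echo locally
    $sock->send(join $CRLF, @cmd);     # join into a single scalar before sending
}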

use of uninitialized value. How can I fix this error?

#!/usr/bin/perl
use Net::SSH::Expect;
use warnings;
use strict;

#my ($stdout, $stderr, $exit) = $ssh->cmd("ls -l /home/$usr");

# Making an ssh connection with user-password authentication
# 1) construct the object
my $ssh = Net::SSH::Expect->new(
    host     => "host",
    password => 'pwd',
    user     => 'user',
    raw_pty  => 1
    #Expect => log_file("finally.txt")
);

# 2) logon to the SSH server using those credentials.
#    Test the login output to make sure we had success.
my $login_output = $ssh->login();
if ($login_output !~ /Welcome/) {
    die "Login has failed. Login output was $login_output";
}

# disable terminal translations and echo on the SSH server
# by executing the stty command on the server:
$ssh->exec("stty raw -echo");

my $stdout  = $ssh->send(chr(13));
my $stdout2 = $ssh->send("SDT-FI");
my $stdout3 = $ssh->send("ENG");
my $stdout4 = $ssh->send('SORT FI-WIP "84144"');
my $stdout5 = $ssh->send(chr(13));
my $stdout6 = $ssh->send("OFF");
my $stdout7 = $ssh->send(chr(13));
print($stdout3);

#$expect->log_file("adp-n.txt");
#my $line;
# returns the next line, removing it from the input stream:
#while ( defined ($line = $ssh->read_all()) ) {
#    print $line . "\n";
#}
So I am trying to print $stdout3 so I can get information about the output,
but I keep getting "use of uninitialized value $stdout3 in print at connnn3.pl line 50".
Is there something wrong in my code?
How can I fix it?
UPDATE, SOLVED!
The reason it was returning "use of uninitialized value" is that send() is void, so I used exec() instead, and that solved it.
From the documentation of Net::SSH::Expect:
void send($string) - sends $string to the SSH server, returns nothing
Thus, send obviously returns nothing (void) and that's why you get this warning when trying to print the (non-existing) return value of send. If you want to get data back from the server use peek, eat, read_all or similar as documented.
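As a rough sketch of what that change looks like in the question's code (exec and read_all are documented Net::SSH::Expect methods; the command string is the question's own):

# send() returns nothing, so capture output with exec() instead:
my $stdout3 = $ssh->exec("ENG");     # runs the command and returns its output
print $stdout3;

# or keep send() and then drain whatever the server printed:
$ssh->send("ENG");
my $output = $ssh->read_all(2);      # read everything available within ~2 seconds
print $output;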

Better way to handle Perl sockets to read/write to an active process

First of all, I would ask you not to offer a workaround as a solution (although it would be cool to know other ways to do it). I was setting up the tg-master project (telegram for cli) to be used by a check_mk alert plugin. I found out that telegram runs as a stdin/stdout process, so I thought it would be cool to "glue" it together, so I wrote, with a lot of building blocks from blogs and CPAN, the next two pieces of code. They already work (I need to handle broken pipes sometimes), but I was wondering whether sharing this could bring some new ideas from experts.
As you can see, my code relies on an eval with a die when reading from the spawned process, and I know that is not the best way to do it. Any suggestions? :D
Thank you guys
Server
use strict;
use IO::Socket::INET;
use IPC::Open2;
use POSIX;

our $pid;
use sigtrap qw/handler signal_handler normal-signals/;

sub signal_handler {
    print "what a signal $!\nlets kill $pid\n";
    kill 'SIGKILL', $pid;
    #die "Caught a signal $!";
}

# auto-flush on socket
$| = 1;

# creating a listening socket
my $socket = new IO::Socket::INET(
    LocalHost => '0.0.0.0',
    LocalPort => '7777',
    Proto     => 'tcp',
    Listen    => 5,
    Reuse     => 1
);
die "cannot create socket $!\n" unless $socket;
print "server waiting for client connection on port 7777\n";

my ( $read_proc, $write_proc );
my ( $uid, $gid ) = ( getpwnam "nagios" )[ 2, 3 ];
POSIX::setgid($gid);    # GID must be set before UID!
POSIX::setuid($uid);
$pid = open2( $read_proc, $write_proc, '/usr/bin/telegram' );

# flush first messages
eval {
    local $SIG{ALRM} = sub { die "Timeout" };    # alarm handler
    alarm(1);
    while (<$read_proc>) { }
};

while (1) {
    my $client_socket  = $socket->accept();
    my $client_address = $client_socket->peerhost();
    my $client_port    = $client_socket->peerport();
    print "connection from $client_address:$client_port\n";

    # read until \n
    my $data = "";
    $data = $client_socket->getline();

    # write to the spawned process's stdin the line we got in $data
    print $write_proc $data;
    $data = "";

    eval {
        local $SIG{ALRM} = sub { die "Timeout" };    # alarm handler
        alarm(1);
        while (<$read_proc>) {
            $client_socket->send($_);
        }
    };

    # notify client that response has been sent
    shutdown( $client_socket, 1 );
}
$socket->close();
Client
echo "contact_list" | nc localhost 7777
or
echo "msg user#12345 NAGIOS ALERT ... etc" | nc localhost 7777
or
some other perl script =)
If you are going to implement a script that performs both reads and writes from/to different handles, consider using select (the one defined as select RBITS,WBITS,EBITS,TIMEOUT in the documentation). In this case you will totally avoid using alarm with a signal handler in eval to handle a timeout, and will only have one loop with all of the work happening inside it.
Here is an example of a program that reads from both a process opened with open2 and a network socket, not using alarm at all:
use strict;
use warnings;
use IO::Socket;
use IPC::Open2;

use constant MAXLENGTH => 1024;

my $socket = IO::Socket::INET->new(
    Listen    => SOMAXCONN,
    LocalHost => '0.0.0.0',
    LocalPort => 7777,
    Reuse     => 1,
);

# accepting just one connection
print "waiting for connection...\n";
my $remote = $socket->accept();
print "remote client connected\n";

# simple example of the program writing something
my $pid = open2(my $localread, my $localwrite,
                "sh -c 'while : ; do echo boom; sleep 1 ; done'");

for ( ; ; ) {
    # cleanup vectors for select
    my $rin = '';
    my $win = '';
    my $ein = '';

    # we will wait for the possibility to read from these two descriptors
    vec($rin, fileno($localread), 1) = 1;
    vec($rin, fileno($remote),    1) = 1;

    # now wait
    select($rin, $win, $ein, undef);

    # check which one is ready; read with sysread, not <>, as the select doc warns
    if (vec($rin, fileno($localread), 1)) {
        print "read from local process: ";
        sysread($localread, my $data, MAXLENGTH);
        print $data;
    }
    if (vec($rin, fileno($remote), 1)) {
        print "read from remote client: ";
        sysread($remote, my $data, MAXLENGTH);
        print $data;
    }
}
In real production code you will need to carefully check for errors returned by the various functions (socket creation, open2, accept, and select).
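For illustration, the kind of checks meant here might look like the following fragments (not a complete program; the variables are the ones from the example above):

# select() returns -1 if it fails, e.g. when interrupted by a signal
my $nfound = select($rin, $win, $ein, undef);
die "select failed: $!" if $nfound == -1;

# sysread() returns undef on error and 0 on end-of-file
my $n = sysread($localread, my $data, MAXLENGTH);
die "read from child process failed: $!" unless defined $n;
last if $n == 0;    # the child closed its end of the pipe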

Resolving issue with Net::OpenSSH and passing multiple commands to a router

I'm working on migrating a Perl script that pushes commands to routers. We have turned off telnet, so I'm working on getting SSH to work. After looking at a number of SSH libraries in Perl, I've opted to use Net::OpenSSH. I have no problem logging in and passing commands to the routers, but the problem I'm having is with entering config mode and subsequently passing a command.
The problem is that with each command entered, the underlying system appears to log out and then log back in for the next command. For example, with a Juniper router I'm trying to do the following:
edit private
set interfaces xe-1/3/2 description "AVAIL: SOMETHING GOES HERE"
commit
exit
quit
Tailing the syslog from the router I'm seeing something like this...
(...)
UI_LOGIN_EVENT: User 'tools' login, class 'j-remote-user' [65151], ssh-connection 'xxx.xxx.xxx.xxx 42247 xxx.xxx.xxx.xxx 22', client-mode 'cli'
UI_CMDLINE_READ_LINE: User 'tools', command 'edit private '
UI_DBASE_LOGIN_EVENT: User 'tools' entering configuration mode
UI_DBASE_LOGOUT_EVENT: User 'tools' exiting configuration mode
UI_LOGOUT_EVENT: User 'tools' logout
UI_AUTH_EVENT: Authenticated user 'remote' at permission level 'j-remote-user'
UI_LOGIN_EVENT: User 'tools' login, class 'j-remote-user' [65153], ssh-connection 'xxx.xxx.xxx.xxx 42247 xxx.xxx.xxx.xxx 22', client-mode 'cli'
UI_CMDLINE_READ_LINE: User 'tools', command 'set interfaces '
UI_LOGOUT_EVENT: User 'tools' logout
(...)
As you can see, I'm getting a LOGOUT_EVENT after each command entered. Of course, exiting config mode immediately after entering it causes the set interfaces command to fail, since it's no longer in config mode.
The Perl code I'm using is as follows...
#!/usr/bin/perl -w
use strict;
use lib qw(
/usr/local/admin/protect/perl
/usr/local/admin/protect/perl/share/perl/5.10.1
);
use Net::OpenSSH;
my $hostname = "XXXXX";
my $username = "tools";
my $password = "XXXXX";
my $timeout = 60;
my $cmd1 = "edit private";
my $cmd2 = 'set interfaces xe-1/3/2 description "AVAIL: SOMETHING GOES HERE"';
my $cmd3 = "commit";
my $cmd4 = "exit";
my $ssh = Net::OpenSSH->new($hostname, user => $username, password => $password, timeout => $timeout,
master_opts => [-o => "StrictHostKeyChecking=no"]);
$ssh->error and die "Unable to connect to remote host: " . $ssh->error;
my #lines = eval { $ssh->capture($cmd1) };
foreach (#lines) {
print $_;
};
#lines = eval { $ssh->capture($cmd2) };
foreach (#lines) {
print $_;
};
#lines = eval { $ssh->capture($cmd3) };
foreach (#lines) {
print $_;
};
#lines = eval { $ssh->capture($cmd4) };
foreach (#lines) {
print $_;
};
$ssh->system("quit");
The sequence of events is the same as when telnet was used. The only real change was in using SSH objects versus Telnet objects. I'm stumped. Any ideas you could provide would be quite helpful.
[SOLVED, sort of]
The suggestion to let Net::Telnet do the driving was the correct one. The following code works...
#!/usr/bin/perl -w
use strict;
use Net::OpenSSH;
use Net::Telnet;
use Data::Dumper;

my $promptEnd = '/\w+[\$\%\#\>]\s{0,1}$/o';
my $cmd1 = "show system uptime | no-more";
my $cmd2 = "show version brief | no-more";
my $hostname = "xxx.xxx";
my $username = "xxxxxxx";
my $password = "xxxxxxx";
my $timeout  = 60;

my $ssh = Net::OpenSSH->new(
    $hostname,
    user        => $username,
    password    => $password,
    timeout     => $timeout,
    master_opts => [ -o => "StrictHostKeyChecking=no" ]
);
$ssh->error and die "Unable to connect to remote host: " . $ssh->error;

my ( $fh, $pid ) = $ssh->open2pty( { stderr_to_stdout => 1 } );
my %params = (
    fhopen  => $fh,
    timeout => $timeout,
    errmode => 'return',
);
my $conn = Net::Telnet->new(%params);
$conn->waitfor($promptEnd);

my @lines = $conn->cmd($cmd1);
foreach (@lines) {
    print $_;
}
@lines = $conn->cmd($cmd2);
foreach (@lines) {
    print $_;
}
$conn->cmd("quit");
The problem I'm having is that I can't seem to separate the code into subroutines. Once the $conn object is returned from a subroutine, the underlying SSH connection drops. I need to separate this logic in order not to have to rewrite the many, many programs and lines of code that rely on this pusher routine. However, I'll direct that problem to another question.
[Edit, fully solved]
Just an update in case anyone needs to do something similar.
While the above worked very well when run under a single subroutine, I found that any time I passed the handle to another subroutine, the telnet handle remained open, but the ssh connection dropped.
To solve this I found that if I passed the ssh handle to another subroutine and attached the open2pty and Net::Telnet there, I could then pass the Net::Telnet handle between subroutines without the underlying ssh connection dropping. This also worked for Net::Telnet::Cisco. I have this code working well with Cisco, Juniper, and Brocade routers.
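A rough sketch of that structure (the subroutine names, host, and prompt regex are illustrative, not from the original code):

use strict;
use warnings;
use Net::OpenSSH;
use Net::Telnet;

# Connect once and keep the Net::OpenSSH object alive in the caller.
sub connect_ssh {
    my ($host, $user, $pass) = @_;
    my $ssh = Net::OpenSSH->new($host, user => $user, password => $pass,
                                master_opts => [ -o => "StrictHostKeyChecking=no" ]);
    $ssh->error and die "Unable to connect: " . $ssh->error;
    return $ssh;
}

# Only here is the pty opened and Net::Telnet attached; the returned
# Net::Telnet handle can then be passed between subroutines without the
# underlying ssh connection dropping (as long as $ssh stays in scope).
sub attach_cli {
    my ($ssh, $prompt) = @_;
    my ($fh, $pid) = $ssh->open2pty({ stderr_to_stdout => 1 });
    my $conn = Net::Telnet->new(fhopen => $fh, errmode => 'return');
    $conn->waitfor($prompt);
    return $conn;
}

my $ssh  = connect_ssh('router.example.net', 'tools', 'secret');
my $conn = attach_cli($ssh, '/\w+[\$\%\#\>]\s{0,1}$/');
print $conn->cmd('show system uptime | no-more');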
You should also consider adding a few more parameters to the Net::Telnet->new() call, because it is interacting with ssh rather than a TELNET server:
-telnetmode => 0
-output_record_separator => "\r",
-cmd_remove_mode => 1,
Because there is no TELNET server on the remote side, -telnetmode => 0 turns off TELNET negotiation.
The end-of-line is most likely just a carriage return (i.e. -output_record_separator => "\r") rather than the TCP or TELNET combination of carriage return and line feed ("\r\n").
Always strip the echoed-back input: -cmd_remove_mode => 1.
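Applied to the %params hash from the solution above, that might look like this (a sketch; I've written the new options in the same lowercase style as the existing keys, which Net::Telnet also accepts):

my %params = (
    fhopen                  => $fh,
    timeout                 => $timeout,
    errmode                 => 'return',
    telnetmode              => 0,      # no TELNET negotiation: the peer is sshd, not telnetd
    output_record_separator => "\r",   # routers usually expect a bare carriage return
    cmd_remove_mode         => 1,      # strip the echoed-back command from the output
);
my $conn = Net::Telnet->new(%params);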
There are several possibilities:
Some routers accept having the sequence of commands sent up front via stdin (see the sketch after this list):
my $out = $ssh->capture({stdin_data => join("\r\n", @cmds, '')})
In other cases you will have to use something like Expect to send a command, wait for the prompt to appear again, send another command, etc.
If you were using Net::Telnet before, the Net::OpenSSH docs explain how to integrate both (though I have to admit that combination is not very tested).
Also, some routers provide some way to escape to a full Unix-like shell, i.e., prepending the commands with a bang:
$ssh->capture("!ls");
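Expanding the first possibility with the question's Juniper commands (a sketch; whether a given router accepts its CLI commands piped in this way depends on the platform):

my @cmds = (
    'edit private',
    'set interfaces xe-1/3/2 description "AVAIL: SOMETHING GOES HERE"',
    'commit',
    'exit',
    'quit',
);

# One remote session: the whole sequence is fed to the router CLI on stdin,
# so there is no login/logout between commands.
my $out = $ssh->capture({ stdin_data => join("\r\n", @cmds, '') });
$ssh->error and warn "capture failed: " . $ssh->error;
print $out;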

Using select to poll connections - TCP server

use strict; use warnings;
use IO::Socket;
use IO::Select;

my $read_select  = IO::Select->new();
my $write_select = IO::Select->new();

my $socket = IO::Socket::INET->new(
    LocalHost => '127.0.0.1',
    LocalPort => '5556',
    Proto     => 'tcp',
    Listen    => 50,
    Reuse     => 1,
) or die "Could not create socket: $!";
print "Socket Created . Waiting for connection ...\n";

## poll to accept new connection or to receive data from a connection
$read_select->add($socket);
print "Added socket to read list ";

my $reade;
my $newconn;
my @read;
my @write;

while (1) {
    @read = $read_select->can_read();
    foreach my $reade (@read) {
        if ($reade == $socket) {
            print "New conn received";
            my $newconn = $reade->accept();
            $write_select->add($newconn);
        }
        else {
            print "data received";
        }
    }
}

@write = $write_select->can_write();
foreach my $write (@write) {
    $write->send("got ur data");
}
I am trying to poll for connections using select. Why is it that if I use an infinite loop, no connection is accepted? It works fine without while(1).
I think you are being bitten by I/O buffering here. Perl buffers all input and output. It generally doesn't print to the terminal until it has received an entire line.
Your code is probably working with the while(1), but you can't see the output of your debug print statements because the output to the terminal is being buffered. Once you get to the second time through the loop, $read_select->can_read() blocks forever, so you never see the output of the print statements.
You can probably fix this just by adding \n to the end of each print statement. Another option is setting $| = 1;. This disables buffering. See perlvar's discussion of $| for more information on buffering.
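For example, either of these at the top of the script makes the debug output appear immediately (a small illustration of the two fixes mentioned above):

# option 1: turn off output buffering on the currently selected handle (STDOUT)
$| = 1;                 # or: use IO::Handle; STDOUT->autoflush(1);

# option 2: end each debug message with a newline so line-buffered output is flushed
print "New conn received\n";
print "data received\n";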