Passing the name of a file via a socket in Perl

I have 2 scripts written in Perl. The first one takes a file and sends it via a socket to a server. The server is my second script, and it saves the data to a file.
The server saves the file under a name that is fixed in the code. How can I take the name of the file being sent and pass it to the server before sending the file itself?
My code is below:
Client:
my $socket = IO::Socket::INET->new(
    PeerAddr => $local_host,
    PeerPort => $local_port,
    Proto    => 'tcp',
) or die "Alert!";

my $bandwidth = 1024*5; # 5 KB/s

open my $fh, '<', "$direc/$my_data"
    or die "couldn't open the file";

my $buffer;
while ( sysread($fh, $buffer, $bandwidth) ) {
    print $socket $buffer;
    sleep(1);
}

print "Data sent. End\n";
close($fh);
close($socket);
My server:
my $my_socket = IO::Socket::INET->new(
    LocalHost => $local_host,
    LocalPort => $local_port,
    Proto     => 'tcp',
    Listen    => 5,
    Reuse     => 1,
);
die "Couldn't open my_socket $!\n" unless $my_socket;
print "You can send the data now \n";
my $accepter = $my_socket->accept();
my $count=0;
#print "$directory.$save_dir/$my_data";
open my $fh, '>', "$direc/$save_dir/$my_data"   # my data is the name, and it's "fixed", how to take it from client?
    or die "Couldn't open the file";

while (<$accepter>) {
    chomp;
    last if $count++ == 10;
    say $fh $_;
}
print "End \n";
close $fh;
close $my_socket;

Having the server write a filename specified by the client is a security risk. The client could tell the server to overwrite files, including the server itself.
Instead, use a UUID for the real filename. Store the client filename / real filename pair elsewhere.
You need to come up with a protocol so the server can distinguish between the filename and content. We could use an existing format such as JSON or YAML, but they require slurping the whole file into memory and encoding the content. You could make something up, like "the first line is the filename", but we can do a little better.
If you want to stream, we can use a stripped down HTTP protocol. Send headers as Key: Value lines. A blank line ends headers and begins sending content. For just a little extra effort we have a simple protocol that's extensible.
Here's the main loop of the server using UUID::Tiny and also autodie.
# Read Key: Value headers until we see a blank line.
my %headers;
while (my $line = <$accepter>) {
    chomp $line;
    last if $line eq "";
    my ($key, $value) = split /\s*:\s*/, $line, 2;   # limit of 2 so the value may contain ':'
    $headers{$key} = $value;
}

# Store the file in a random filename. Do not use the client's filename
# to avoid a host of security issues.
my $filename = create_uuid_as_string(UUID_V4);
open my $fh, ">", "incoming/$filename";

# Read the content and save it to the file.
my $buf;
while ( read($accepter, $buf, 1024) ) {
    print $fh $buf;
}

say "$headers{Filename} was stored in incoming/$filename";
close $my_socket;
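If you want to keep the client filename / real filename pair around, one option is to append it to a simple manifest file. This is just a sketch (the incoming/manifest.txt name is my invention, and autodie is assumed to be in effect as above):

# Sketch: remember which client filename maps to which stored UUID.
open my $manifest, '>>', 'incoming/manifest.txt';
say {$manifest} join "\t", $headers{Filename}, $filename;
close $manifest;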
And the client simply sends a Filename header before sending the file's content.
open my $fh, '<', $filename;
print $socket "Filename: $filename\n\n";

my $buffer;
while ( sysread($fh, $buffer, $bandwidth) ) {
    print $socket $buffer;
}

Related

How to flush pipe handle obtained from open?

I am trying to flush a pipe handle obtained from open using the autoflush() and flush() methods from the IO::Handle module, but it does not seem to be working. Here is an example:
host.pl:
use feature qw(say);
use strict;
use warnings;
my $client_pid = open ( my $fh, '|-', 'client.pl' )
or die "Could not open client: $!";
#$fh->autoflush(1); # adding this line does not help
sleep 2;
say "Host: sent message";
print $fh "Hello";
#print $fh "Hello\n"; # adding a newline works fine
$fh->flush() or warn "$!"; # this does not work
sleep 2;
say "Host exits.";
close $fh;
client.pl:
use feature qw(say);
use strict;
use warnings;
say "Client running..";
chomp (my $line = <STDIN>);
say "Client got line: '$line'";
sleep 1;
say "Client exits..";
The output of running host.pl is:
Client running..
Host: sent message
Host exits.
Client got line: 'Hello'
Client exits..
Expected output would be:
Client running..
Host: sent message
Client got line: 'Hello'
Client exits..
Host exits.
I know I can fix this by adding a newline at the end of string to be printed:
print $fh "Hello\n";
but I am curious why $fh->flush() is not working here?
The data is being sent to the client immediately, but the client waits for a newline to arrive.
readline (for which <> is a shortcut in your program) reads until a newline is encountered before returning (although changing $/ can change that behaviour). If you want a call that returns as soon as data is available, use sysread.
use constant BLOCK_SIZE => 64*1024;

say "Client running..";
while (1) {
    my $rv = sysread(\*STDIN, my $buf, BLOCK_SIZE);
    die($!) if !defined($rv);
    last if !$rv;
    say "Got: $buf";
}
Note a single print can result in data being received in multiple chunks. In practice, especially with a socket instead of a pipe, you'd need some way of framing your messages in order to reliably identify them. For example, the following client expects sentinel-terminated messages (the sentinel being a newline):
use constant BLOCK_SIZE => 64*1024;

say "Client running..";
my $buf = '';
while (1) {
    my $rv = sysread(\*STDIN, $buf, BLOCK_SIZE, length($buf));
    die($!) if !defined($rv);
    last if !$rv;
    while ($buf =~ s/^([^\n]*)\n//) {
        my $msg = $1;
        say "Got: $msg";
    }
    say "Got a partial message" if length($buf);
}
die("Premature EOF\n") if length($buf);
Try sending:
$fh->autoflush();
print($fh "abc");
sleep(1);
print($fh "def\n");
sleep(1);
print($fh "ghi\njkl\nmno");
sleep(1);
print($fh "pqr\n");
This can be adapted to handle length-prefixed messages or any other message format.
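For instance, a length-prefixed variant could frame each message as a 4-byte big-endian length followed by the payload. The sender would do something like print $fh pack('N', length($msg)), $msg; and the receiving loop might look like this (a sketch only, reusing the buffer-accumulating style from above):

use constant BLOCK_SIZE => 64*1024;

my $buf = '';
while (1) {
    my $rv = sysread(\*STDIN, $buf, BLOCK_SIZE, length($buf));
    die($!) if !defined($rv);
    last if !$rv;

    # Extract every complete message currently sitting in the buffer.
    while (length($buf) >= 4) {
        my $len = unpack('N', $buf);
        last if length($buf) < 4 + $len;    # payload not fully arrived yet
        my $msg = substr($buf, 4, $len);
        substr($buf, 0, 4 + $len, '');      # drop the framed message from the buffer
        say "Got: $msg";
    }
}
die("Premature EOF\n") if length($buf);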

Perl script to send e-mail reading file content

I wrote a Perl script to extract the contents of two arrays and store them in a file (out.log) in the format shown below:
", "out.log"">
open my $fh, ">", "out.log" or die "Cannot open out.log: $!";
for my $i (0..$#arr_1) {
    print $fh "$arr_1[$i]\t$arr_2[$i]\@gmail.com\n";
}
close $fh;

12345 joe@gmail.com
67890 jack@gmail.com
45678 john@gmail.com
Now, by reading the out.log file content, I have to send an e-mail to joe@gmail.com with the e-mail body
Your balance is:12345
to jack@gmail.com with
Your balance is:67890
and to john@gmail.com with
Your balance is:45678
I can read the log file and form the mail body content like below, but I'm unsure how to achieve the above-mentioned scenario.
my $mail_body = "Your balance is:\n";
{
    local $/ = undef;
    open FILE, "file" or die "...: $!";
    $mail_body .= <FILE>;
    close FILE;
}
Looking forward to your help. Thanks in advance.
I haven't tested this, but it should solve your problem.
use strict;
use warnings;
use Mail::Sendmail;
open my $fh, '<', 'out.log' or die "could not open file out.log: $!";
while (my $line = <$fh>) {    # read the content of the file line by line

    # match and capture the balance and the email address
    my ($balance, $to_email) = $line =~ /(\d+)\s+(\S+)/;

    my %mail = (
        To      => $to_email,
        From    => 'your_email@address.com',
        Message => "your balance is: $balance",
    );
    sendmail(%mail);
}
close $fh;
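One small addition worth considering (mine, not part of the answer above): skip any line that doesn't match the pattern, so a blank or malformed line doesn't produce uninitialized-value warnings or send a broken mail.

my ($balance, $to_email) = $line =~ /(\d+)\s+(\S+)/;
next unless defined $balance && defined $to_email;    # ignore lines that don't match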

Perl Socket File Transfer Problems

I have my connection and data transfer working... to an extent. I am designing a small program that sets up a client and a server socket. Once connected, the client can send files to the server.
My problem is that when I begin to send my "test" file, the server never ends its while loop. It keeps concatenating data into the output file. Even stranger, the data in the output file is correct, except that there is extra white space between the lines.
I know this is because I am not chomping the \n character, and I add another \n character on the server. But if I chomp on the client, then everything is on one line. Therefore the server (regardless of it adding a newline character) outputs it all on one line, because it only received one line. If I chomp on the server side, I get an empty text file... which confuses me.
Furthermore, the server never stops concatenating... it makes an infinite loop even after the client has disconnected. The terminal outputs this indefinitely:
Use of uninitialized value $data in concatenation (.) or string at ./tcp_server.pl line 51, <GEN2> line 14.
Here is my code for the server:
#!/usr/bin/perl
# Flushing to STDOUT after each write
$| = 1;
use warnings;
use strict;
use IO::Socket::INET;
use v5.10;
# Server side information
my $listen_port = '7070';
my $protocol    = 'tcp';

# Finds IP address of host machine
# Connects to example site on HTTP
my $ip_finder = IO::Socket::INET->new(
    PeerAddr => "www.google.com",
    PeerPort => 80,
    Proto    => "tcp",
) or die "The IP can not be resolved: $!\n";

# The found IP address of Host
my $ip_address = $ip_finder->sockhost;

# Creating socket for server
my $server = IO::Socket::INET->new(
    LocalPort => $listen_port,
    Proto     => $protocol,
    Listen    => 5,
    Reuse     => 1,
) or die "Socket could not be created, failed with error: $!\n"; # Prints error code
print "Socket created using IP: $ip_address\n";
print "Waiting for client connection on port $listen_port\n";
# Accept connection
my $client_socket = $server->accept();
open(my $fh, ">out")
    or die "File can not be opened: $!";

while ($client_socket) {

    # Retrieve client information
    my $client_address = $client_socket->peerhost();
    my $client_port    = $client_socket->peerport();
    print "Client accepted: $client_address, $client_port\n";

    my $data = <$client_socket>;
    print $fh "$data\n";
}
close $fh;
$server->close();
and the client:
#!/usr/bin/perl
# Flushing to STDOUT after each write
$| = 1;
use warnings;
use strict;
use IO::Socket::INET;
use v5.10;
# Client side information
# Works by setting $dest to server address, needs to be on same domain
# my $dest = '<IP goes here>';
# my $dest = '<IP goes here>';
my $dest     = '<host>.cselabs.umn.edu';
my $port     = '7070';
my $protocol = 'tcp';

my $client = IO::Socket::INET->new(
    PeerHost => $dest,
    PeerPort => $port,
    Proto    => $protocol,
) or die "Socket could not be created, failed with error: $!\n"; # Prints error code

print "TCP connection established!\n";

open(my $fh, "<test")
    or die "Can't open: $!";

while (my $line = <$fh>) {
    print $client $line;
}
close $fh;
# sleep(10);
$client->close();
You don't add a newline at the client. The client just reads a line from the file (including its newline) and prints it to the socket without adding another one. But at the server you read the line from the socket (newline included) and then add another newline when writing it out. Also, the loop condition in the server is not correct: you should not check whether the socket exists, because it will still exist even after the connection has been closed. Instead you should check for errors or end-of-file while reading (that's where your uninitialized-value warnings come from). And it may be better not to read line by line, but to read chunks independent of line boundaries, e.g.
while (read($client_socket, my $data, 8192)) {
    print $fh $data;
}
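Putting that together, the accept-and-save part of the server might look like this; a sketch based on the question's own variables, not a drop-in tested replacement:

my $client_socket = $server->accept();

# Retrieve client information once, after accepting.
my $client_address = $client_socket->peerhost();
my $client_port    = $client_socket->peerport();
print "Client accepted: $client_address, $client_port\n";

open(my $fh, '>', 'out')
    or die "File can not be opened: $!";

# Copy raw chunks until read() returns 0 (EOF) or undef (error).
while (read($client_socket, my $data, 8192)) {
    print $fh $data;
}

close $fh;
$client_socket->close();
$server->close();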

Perl WWW::Mechanize Comparing the response header content length of different urls

I have a question I'm hoping you can help with.
I have two text files containing the following:
FILE1.txt
http://www.dog.com/
http://www.cat.com/
http://www.antelope.com/
FILE2.txt
1
2
Barry
The output I correctly achieve is as follows:
http://www.dog.com/1
http://www.dog.com/2
http://www.dog.com/Barry
http://www.cat.com/1
http://www.cat.com/2
http://www.cat.com/Barry
http://www.antelope.com/1
http://www.antelope.com/2
http://www.antelope.com/Barry
Code to do the above:
open my $animalUrls, '<', 'FILE1.txt' or die "Can't open: $!";
open my $directory,  '<', 'FILE2.txt' or die "Can't open: $!";

my @directory = <$directory>;   # each line of the file into an array
close $directory or die "Can't close: $!";

while (my $line = <$animalUrls>) {
    chomp $line;
    print $line.$_ foreach (@directory);
    push(@newListOfUrls, $line.$_) foreach (@directory);   # put each new url into array
}
Now the problem I am having:
I need to get the Content Length of the original urls (File1.txt) and compare the Content-Length of each of the new urls with the corresponding original one to see if they are the same or different, for example:
I need to compare the Content-Length of http://www.dog.com/1 with the Content-Length of original url http://www.dog.com/ to see if there is a difference.
Then I need to compare the Content-Length of http://www.dog.com/2 with the Content-Length of the original url http://www.dog.com/ to see if there is a difference.
Then I need to compare the Content-Length of http://www.dog.com/Barry with the Content-Length of the original url http://www.dog.com/ to see if there is a difference.
Then I need to compare the Content-Length of http://www.cat.com/1 with the Content-Length of the original url http://www.cat.com/ to see if there is a difference.
Then I need to compare the Content-Length of http://www.cat.com/2 with the Content-Length of the original url http://www.cat.com/ to see if there is a difference.
And so on........
Code to get the Content-Length:
print $mech->response->header('Content-Length'); #returns the content length
What I am having trouble with is how to compare each new URL with the correct corresponding original one (i.e. not accidentally comparing the Content-Length of http://www.cat.com/Barry with the Content-Length of http://www.dog.com/). Should I maybe use a hash, and if so, how would I go about that?
Your help with this would be much appreciated. Many thanks.
You should use a hash for this. I'd change your input code to make a more complex data structure, as this makes the task easier.
open my $animalUrls, '<', 'FILE1.txt' or die "Can't open: $!";
open my $directory,  '<', 'FILE2.txt' or die "Can't open: $!";

my @directory = <$directory>;   # each line of the file into an array
close $directory or die "Can't close: $!";

my $newURLs;
while ( my $baseURL = <$animalUrls> ) {
    chomp $baseURL;
    SUBDIR: foreach my $subdir (@directory) {
        chomp $subdir;
        next SUBDIR if $subdir eq "";

        # put each new url into arrayref
        push( @{ $newURLs->{$baseURL} }, $baseURL . $subdir );
    }
}
We can now use this to our advantage. Assuming we have already set up Mechanize:
foreach my $url ( keys %{$newURLs} ) {

    # first get the base URL and save its content length
    $mech->get($url);
    my $content_length = $mech->response->header('Content-Length');

    # now iterate all the 'child' URLs
    foreach my $child_url ( @{ $newURLs->{$url} } ) {

        # get the content
        $mech->get($child_url);

        # compare
        if ( $mech->response->header('Content-Length') != $content_length ) {
            print "$child_url: different content length: $content_length vs "
                . $mech->response->header('Content-Length') . "!\n";
        }
    }
}
You could even do it without the second set of foreach loops by putting the code where you build up your data structure.
If you are unfamiliar with these references, take a look at perlreftut. What we have done here is make a hash with a key for each of the base URLs, and put an array of all the generated child URLs into that. If you use Data::Dumper to output the final $newURLs, it will look something like this:
$VAR1 = {
    'http://www.dog.com/' => [
        'http://www.dog.com/1',
        'http://www.dog.com/2',
    ],
    'http://www.cat.com/' => [
        'http://www.cat.com/1',
        'http://www.cat.com/2',
    ],
};
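As mentioned above, the second set of foreach loops can be avoided by fetching while building the structure. A sketch of that combined approach, for illustration only (it reuses $mech, $animalUrls and @directory from above; %base_length is a name introduced here):

my %base_length;

while ( my $baseURL = <$animalUrls> ) {
    chomp $baseURL;

    # Fetch the base URL once and remember its Content-Length.
    $mech->get($baseURL);
    $base_length{$baseURL} = $mech->response->header('Content-Length');

    foreach my $subdir (@directory) {
        chomp $subdir;
        next if $subdir eq "";

        my $child_url = $baseURL . $subdir;
        $mech->get($child_url);
        my $child_length = $mech->response->header('Content-Length');

        if ( !defined $child_length
            || !defined $base_length{$baseURL}
            || $child_length != $base_length{$baseURL} ) {
            print "$child_url: length differs from $baseURL\n";
        }
    }
}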
EDIT: I updated the code. I used these files to test it:
URLS:
http://www.stackoverflow.com/
http://www.superuser.com/
Dirs:
faq
questions
/
This code seems to do what you need. It stores all the URLs in @urls and prints the content lengths as it fetches each URL. I don't know what you need the length data for afterwards, but I have stored the lengths of each response in the hash %lengths to associate them with the URLs.
use 5.010;
use warnings;
use LWP::UserAgent;
STDOUT->autoflush;
my @urls;

open my $fh, '<', 'FILE1.txt' or die $!;
while (my $base = <$fh>) {
    chomp $base;
    push @urls, $base;

    open my $fh, '<', 'FILE2.txt' or die $!;
    while (my $path = <$fh>) {
        chomp $path;
        push @urls, $base.$path;
    }
}
my $ua = LWP::UserAgent->new;
my %lengths;

for my $url (@urls) {
    my $resp   = $ua->get($url);
    my $length = $resp->header('Content-Length');
    $lengths{$url} = $length;
    printf "%s -- %s\n", $url, $length // 'undef';
}
output
http://www.dog.com/ -- undef
http://www.dog.com/1 -- 56244
http://www.dog.com/2 -- 56244
http://www.dog.com/Barry -- 56249
http://www.cat.com/ -- 156
http://www.cat.com/1 -- 11088
http://www.cat.com/2 -- 11088
http://www.cat.com/Barry -- 11088
http://www.antelope.com/ -- undef
http://www.antelope.com/1 -- undef
http://www.antelope.com/2 -- undef
http://www.antelope.com/Barry -- undef
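If the goal is still to compare each child URL against its base (my addition, not part of the answer above), the %lengths hash already holds everything needed. For example:

# Compare each child URL's length with the length of its base URL.
open my $fh, '<', 'FILE1.txt' or die $!;
while (my $base = <$fh>) {
    chomp $base;
    open my $fh2, '<', 'FILE2.txt' or die $!;
    while (my $path = <$fh2>) {
        chomp $path;
        my $url  = $base.$path;
        my $same = defined $lengths{$url} && defined $lengths{$base}
                && $lengths{$url} == $lengths{$base};
        printf "%s -- %s\n", $url, $same ? 'same length as base' : 'length differs from base';
    }
}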

Send a command on multiple devices

I have this script to read a list of devices and send a command to each one, but currently it reads only the first device, sends it the command, and ignores the rest. What have I missed?
#!\usr\bin\Perl\bin\perl
use warnings;
use strict;
use NET::SSH2;
use MIME::Base64;
my $host = "C:/temp/devices.txt"; # input file
my $user = "XXX"; # your account
my $pass = "XXXXXX"; # your password 64 bit mime
my $ssh2 = Net::SSH2->new();
my $result = "C:/temp/result.txt"; # output file
$ssh2->debug(1); # debug on/off
open(List, '<', "$host") or die "$!";
while (<List>) {
    chomp $_;
    $ssh2->connect("$_") or die "Unable to connect host $@ \n";
    my $dp = decode_base64("$pass");
    $ssh2->auth_password("$user", "$dp");

    my $chan = $ssh2->channel();
    $chan->exec('sh run');

    my $buflen = 100000;
    my $buf = '0' x $buflen;
    my $read = $chan->read($buf, $buflen);
    warn 'More than ', $buflen, ' characters in listing' if $read >= $buflen;

    open OUTPUT, ">", "$result";
    print OUTPUT "HOST: $_\n\n";
    print OUTPUT "$buf\n";
    print OUTPUT "\n\n\n";
    print OUTPUT
    close (List);
    $chan->close();
}
You shouldn't be closing your List filehandle inside of your while loop. Move the close (List); line to after the close brace:
open(List, '<', "$host") or die "$!";
while(<List>) {
# ⋮
}
close (List);
close(List);
should be after the closing bracket.
You're closing your filehandle inside the while() loop. Move close(List) so it's outside of the while():
while(<List>) {
...
}
close(List);
Edit: I just noticed that you're also doing this within your while() loop:
open OUTPUT, ">", "$result";
This will cause your output file to be overwritten each time through the loop, so it will only have the last command's results in it. You could either move the open() / close() outside of the loop, or open the file in append mode:
open OUTPUT, '>>', $result;
You're also not checking to see if the open() succeeds; you should put or die $! at the end of your open() statement.
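Putting those fixes together, the main loop might look something like this; a sketch only, reusing the question's variables, and untested against real devices:

open my $list,   '<',  $host   or die "Can't open $host: $!";
open my $output, '>>', $result or die "Can't open $result: $!";

while (my $device = <$list>) {
    chomp $device;

    # One connection per device.
    my $ssh2 = Net::SSH2->new();
    $ssh2->connect($device) or die "Unable to connect host $@\n";
    $ssh2->auth_password($user, decode_base64($pass));

    my $chan = $ssh2->channel();
    $chan->exec('sh run');

    my $buflen = 100000;
    my $buf    = '';
    my $read   = $chan->read($buf, $buflen);
    warn "More than $buflen characters in listing" if $read >= $buflen;

    print $output "HOST: $device\n\n$buf\n\n\n";

    $chan->close();
    $ssh2->disconnect();
}

close $output;
close $list;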