Quoting in bash and perl in recursive ssh command

I am writing a Perl script to log in to a server with ssh and run some shell commands on it. The problem is that the server is only reachable by first logging in to another server.
(I am using password-less login with ssh keys).
The following bash script is working correctly, and illustrates the problem:
#! /bin/bash
server1="login.uib.no"
server2="cipr-cluster01"
ssh "$server1" "ssh $server2 \"echo \\\"\\\$HOSTNAME\\\"\""
It prints the correct host name to my screen: cipr-cluster01. However, when I try to do the same thing in Perl:
my $server1="login.uib.no";
my $server2="cipr-cluster01";
print qx/ssh "$server1" "ssh $server2 \"echo \\\"\\\$HOSTNAME\\\"\""/;
I get the following output: login.uib.no. So I guess there is some problem with the quoting in the Perl script.

qx works like double quotes. You have to backslash some more:
print qx/ssh "$server1" "ssh $server2 \\"echo \\\\\\"\\\\\\\$HOSTNAME\\\\\\"\\""/;
Using single quotes might simplify the command a lot:
print qx/ssh "$server1" 'ssh $server2 "echo \\\$HOSTNAME"'/;
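If you would rather not fight the local shell at all, the list form of a pipe open bypasses it entirely, so only the two remote shells still parse quotes. A sketch under the same host names (not from the original answer):
use strict;
use warnings;

my $server1 = "login.uib.no";
my $server2 = "cipr-cluster01";

# List-form open: no local shell, so one whole layer of escaping disappears.
open(my $out, '-|', 'ssh', $server1, "ssh $server2 'echo \$HOSTNAME'")
    or die "Cannot run ssh: $!";
print while <$out>;
close $out;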

You can simplify the quoting a bit by using the ProxyCommand option that tells ssh to connect to $server2 via $server1, rather than explicitly running ssh on $server1.
print qx/ssh -o ProxyCommand="ssh -W %h:%p $server1" "$server2" 'echo \$HOSTNAME'/;
(There is some residual output from the proxy command (Killed by signal 1) that I'm not sure how to get rid of.)
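One plausible way to silence that (an assumption, not tested against these hosts) is to pass -q to the proxy-level ssh, since that inner ssh is what prints the diagnostic:
# -q quiets the inner ssh, which is what prints "Killed by signal 1"
print qx/ssh -o ProxyCommand="ssh -q -W %h:%p $server1" "$server2" 'echo \$HOSTNAME'/;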

You can use Net::OpenSSH, which is able to do the quoting automatically:
use Net::OpenSSH;

my $gateway = 'login.uib.no';    # server1 from the question
my $host    = 'cipr-cluster01';  # server2 from the question

my $ssh_gw = Net::OpenSSH->new($gateway);
my $proxy_command = $ssh_gw->make_remote_command({tunnel => 1}, $host, 22);
my $ssh = Net::OpenSSH->new($host, proxy_command => $proxy_command);
$ssh->system('echo $HOSTNAME');
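If you want the hostname back in the Perl program instead of on standard output, Net::OpenSSH's capture method returns the command's output:
my $hostname = $ssh->capture('echo $HOSTNAME');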

setup new database in ubuntu using a script [duplicate]

I have a script where I need to start a command, then pass some additional commands as commands to that command. I tried
su
echo I should be root now:
who am I
exit
echo done.
... but it doesn't work: The su succeeds, but then the command prompt is just staring at me. If I type exit at the prompt, the echo and who am i etc start executing! And the echo done. doesn't get executed at all.
Similarly, I need for this to work over ssh:
ssh remotehost
# this should run under my account on remotehost
su
## this should run as root on remotehost
whoami
exit
## back
exit
# back
How do I solve this?
I am looking for answers which solve this in a general fashion, and which are not specific to su or ssh in particular. The intent is for this question to become a canonical for this particular pattern.
Adding to tripleee's answer:
It is important to remember that the section of the script formatted as a here-document for another shell is executed in a different shell with its own environment (and maybe even on a different machine).
If that block of your script contains parameter expansion, command substitution, and/or arithmetic expansion, then you must use the here-document facility of the shell slightly differently, depending on where you want those expansions to be performed.
1. All expansions must be performed within the scope of the parent shell.
Then the delimiter of the here document must be unquoted.
command <<DELIMITER
...
DELIMITER
Example:
#!/bin/bash
a=0
mylogin=$(whoami)
sudo sh <<END
a=1
mylogin=$(whoami)
echo a=$a
echo mylogin=$mylogin
END
echo a=$a
echo mylogin=$mylogin
Output:
a=0
mylogin=leon
a=0
mylogin=leon
2. All expansions must be performed within the scope of the child shell.
Then the delimiter of the here document must be quoted.
command <<'DELIMITER'
...
DELIMITER
Example:
#!/bin/bash
a=0
mylogin=$(whoami)
sudo sh <<'END'
a=1
mylogin=$(whoami)
echo a=$a
echo mylogin=$mylogin
END
echo a=$a
echo mylogin=$mylogin
Output:
a=1
mylogin=root
a=0
mylogin=leon
3. Some expansions must be performed in the child shell, and some in the parent.
Then the delimiter of the here document must be unquoted and you must escape those expansion expressions that must be performed in the child shell.
Example:
#!/bin/bash
a=0
mylogin=$(whoami)
sudo sh <<END
a=1
mylogin=\$(whoami)
echo a=$a
echo mylogin=\$mylogin
END
echo a=$a
echo mylogin=$mylogin
Output:
a=0
mylogin=root
a=0
mylogin=leon
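Incidentally, Perl's here-documents follow the same convention, which is worth knowing when a Perl script feeds commands to a shell this way: a double-quoted (or bare) delimiter interpolates Perl variables, while a single-quoted one passes the block verbatim. A minimal sketch:
# Double-quoted delimiter: Perl substitutes $login before sh ever runs.
my $login = getlogin();
system('sh', '-c', <<"END");
echo interpolated by perl: $login
END

# Single-quoted delimiter: sh receives the text verbatim and expands $USER itself.
system('sh', '-c', <<'END');
echo expanded by sh: $USER
END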
A shell script is a sequence of commands. The shell will read the script file, and execute those commands one after the other.
In the usual case, there are no surprises here; but a frequent beginner error is assuming that some commands will take over from the shell, and start executing the following commands in the script file instead of the shell which is currently running this script. But that's not how it works.
Basically, scripts work exactly like interactive commands, but how exactly they work needs to be properly understood. Interactively, the shell reads a command (from standard input), runs that command (with input from standard input), and when it's done, it reads another command (from standard input).
Now, when executing a script, standard input is still the terminal (unless you used a redirection) but the commands are read from the script file, not from standard input. (The opposite would be very cumbersome indeed - any read would consume the next line of the script, cat would slurp all the rest of the script, and there would be no way to interact with it!) The script file only contains commands for the shell instance which executes it (though you can of course still use a here document etc to embed inputs as command arguments).
In other words, these "misunderstood" commands (su, ssh, sh, sudo, bash etc) when run alone (without arguments) will start an interactive shell, and in an interactive session, that's obviously fine; but when run from a script, that's very often not what you want.
All of these commands can accept commands in ways other than an interactive terminal session. Typically, each command supports a way to pass it commands as options or arguments:
su root -c 'who am i'
ssh user@remote uname -a
sh -c 'who am i; echo success'
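(From Perl, the same pattern maps onto system's list form, which hands the remote command to ssh without involving the local shell: system('ssh', 'user@remote', 'uname -a');)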
Many of these commands will also accept commands on standard input:
printf 'uname -a; who am i; uptime' | su
printf 'uname -a; who am i; uptime' | ssh user@remote
printf 'uname -a; who am i; uptime' | sh
which also conveniently allows you to use here documents:
ssh user@remote <<'____HERE'
uname -a
who am i
uptime
____HERE
sh <<'____HERE'
uname -a
who am i
uptime
____HERE
For commands which accept a single command argument, that command can be sh or bash with multiple commands:
sudo sh -c 'uname -a; who am i; uptime'
As an aside, you generally don't need an explicit exit because the command will terminate anyway when it has executed the script (sequence of commands) you passed in for execution.
If you want a generic solution which will work for any kind of program, you can use the expect command.
Extract from the manual page:
Expect is a program that "talks" to other interactive programs according to a script. Following the script, Expect knows what can be expected from a program and what the correct response should be. An interpreted language provides branching and high-level control structures to direct the dialogue. In addition, the user can take control and interact directly when desired, afterward returning control to the script.
Here is a working example using expect:
set timeout 60
spawn sudo su -
expect "*?assword" { send "*secretpassword*\r" }
send_user "I should be root now:"
expect "#" { send "whoami\r" }
expect "#" { send "exit\r" }
send_user "Done.\n"
exit
The script can then be launched with a simple command:
$ expect -f custom.script
You can view a full example in the following page: http://www.journaldev.com/1405/expect-script-example-for-ssh-and-su-login-and-running-commands
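If you are driving this from Perl, CPAN's Expect module exposes the same facility; a rough sketch of its API (the patterns and password are placeholders):
use Expect;

my $exp = Expect->spawn('sudo', 'su', '-') or die "Cannot spawn su: $!";
$exp->expect(60,
    [ qr/assword/ => sub { shift->send("*secretpassword*\r"); exp_continue; } ],
    [ qr/#/       => sub { shift->send("whoami\nexit\n"); } ],
);
$exp->soft_close();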
Note: The answer proposed by @tripleee only works if standard input can be read once at the start of the command, or if a tty has been allocated; it won't work for interactive programs.
Examples of errors if you use a pipe:
echo "su whoami" | ssh remotehost
--> su: must be run from a terminal
echo "sudo whoami" | ssh remotehost
--> sudo: no tty present and no askpass program specified
With ssh, you can force a TTY allocation with multiple -t options, but when sudo asks for the password, it will still fail.
Without the use of a program like expect, any call to a function/program which might read from stdin will make the next command fail:
ssh user@host <<'____HERE'
echo "Enter your name:"
read name
echo "ok."
____HERE
--> The `echo "ok."` string will be passed to the "read" command

awk command in Perl's system does not work

I am writing a small Perl script that executes an awk command.
I am trying to swap two columns in a file. The file looks like this:
domain1,ip1
domain2,ip2
domain3,ip3
the result should be
ip1,domain1
ip2,domain2
ip3,domain3
The Perl command invoking awk is like this:
system("ssh -p 22 root\#$mainip 'awk -F, '{print $2,$1}' OFS=, /root/archive/ipdomain.txt > /root/ipdom.txt'");
This is the error I get :
awk: cmd. line:1: {print
awk: cmd. line:1: ^ unexpected newline or end of string
Any suggestions, please?
With the layered commands and all that multi-level quoting and escaping that needs to be done right,† no wonder it fails. A complex command like that will always be tricky, but libraries help a lot.
A properly quoted string to run through a shell can be formed with String::ShellQuote ‡
use warnings;
use strict;
use feature 'say';
use String::ShellQuote qw(shell_quote);
die "Usage: $0 file outfile\n" if #ARGV != 2;
my ($file, $out) = #ARGV;
my #cmd_words =
( 'ssh', 'hostname', 'awk', q('{print $2 $1}'), $file, '>', $out );
my $cmd = shell_quote #cmd_words;
system($cmd);
Note how the q() operator form of single quotes enables us to pass single quotes through nicely.
This swaps the first two words on each line of a file and prints them, using awk, and redirects the output to a file, on a remote host. It works as expected in my tests (with a real hostname). Please adjust as needed.
Another possible improvement would be to use a library for ssh, like Net::OpenSSH.
A complete command, like the one in the question, to use in the above program:
my @cmd_words = (
'ssh', '-p', '22', "root\@$mainip",
'awk', '-F,', q('{print $2,$1}'), 'OFS=,', $file, '>', $out );
Tested with a file from the question.
The makeVoiceBot answer is informative and gets halfway there, but I find that what is needed is
system("ssh hostname \"awk '{print \\\$2 \\\$1}' $path\"");
This works in my tests (on systems I ssh to). I try to avoid needing to deal with such quoting and escaping.
† This is a shell command which runs ssh, and then executes a command on the remote system which runs a shell (there) as well, in order to run awk and redirect its output to a file.
A bit more than an "awk command" as the title says.
‡ The library can prepare a command for bash (as of this writing), but one can look at the source for it and adjust it for their own shell, at least. There is also Win32::ShellQuote
I am using a shortened example here. (Strictly speaking, Perl would interpolate $2 and $1 itself inside this double-quoted string, but they come out empty either way, so the walkthrough below tracks the shell's view.)
system("ssh localhost 'awk '{print $2,$1}' file.txt'")
system() sees:
ssh localhost 'awk '{print $2,$1}' file.txt'
local shell expands:
ssh
localhost
awk
{print
$2,$1}
file.txt
local shell replaces $1 and $2 (positional args) with empty strings:
ssh
localhost
awk
{print
,}
file.txt
ssh executes:
ssh localhost awk {print ,} file.txt
remote shell gets:
awk
{print
,}
file.txt
So the remote shell runs awk with {print as its program argument, resulting in the described error. To prevent this, the invocation of system() can be changed to:
system("ssh localhost \"awk '{print \\\$2,\\\$1}' file.txt\"")
system() sees:
ssh localhost "awk '{print \$2,\$1}' file.txt"
local shell expands (its double quotes consume the \ escapes):
ssh
localhost
awk '{print $2,$1}' file.txt
ssh executes:
ssh localhost awk '{print $2,$1}' file.txt
remote shell parses the single quotes and gets:
awk
{print $2,$1}
file.txt
Remote awk now gets {print $2,$1} as its program argument, and executes successfully.
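A third option sidesteps the local shell entirely by giving system a list; then only the remote shell parses quotes, and no escaping is needed at all (same shortened example, a sketch):
# List form: ssh gets the remote command as one argument, untouched locally.
# The remote shell strips the single quotes and awk sees {print $2,$1}.
system('ssh', 'localhost', q(awk '{print $2,$1}' file.txt));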

Run a local script remotely via ssh while passing arguments

I am using this command fine:
ssh user@ip 'bash -s' -- < /usr/local/nagios/libexec/check_ssh_mem.sh
I don't like the results of that script, so I want to use a Perl script instead:
ssh user@ip 'perl -s' -- < /usr/local/nagios/libexec/check_mem.pl
check_mem.pl v1.0 - Nagios Plugin
usage:
check_mem.pl -<f|u> -w <warnlevel> -c <critlevel>
options:
-f Check FREE memory
-u Check USED memory
-C Count OS caches as FREE memory
-w PERCENT Percent free/used when to warn
-c PERCENT Percent free/used when critical
As you can see, I get proper feedback from the script. I want to pass it the -f, -w and -c options, but I get errors when trying to do that.
man perlrun says:
Upon startup, Perl looks for your program in one of the following places
...
3. Passed in implicitly via standard input. This works only if there are
no filename arguments--to pass arguments to a STDIN-read program you must
explicitly specify a "-" for the program name.
So, you can use this:
ssh user@ip 'perl - -f -u -C' -- < /usr/local/nagios/libexec/check_mem.pl
The arguments after the - are passed to the script you are running.
You don't need the -s argument; I assume you copied that from your original bash implementation, but -s means something different to perl (it enables rudimentary command-line switch parsing).
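So a full call with the plugin's warning and critical thresholds might look like this (the percentage values are placeholders):
ssh user@ip 'perl - -f -w 20 -c 10' < /usr/local/nagios/libexec/check_mem.pl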

Perl Syntax Error: bareword found where operator expected

This is my Perl script code. With it I'm getting an error like "Bareword found where operator expected" at the $cmd line:
my $path = $folder."/".$host."_".$database_name.".bak";
$cmd .= "-U $user_name -P $password -S $host -d master -Q "BACKUP DATABASE [$database_name] TO DISK = N'$path'" ";
Can anyone help me?
When a string has double quotes within it, you need to escape them with \.
$cmd .= "-U $user_name -P $password -S $host -d master -Q \"BACKUP DATABASE [$database_name] TO DISK = N'$path'\" ";
Also, Perl lets you use other characters for quote delimiters. qq followed by almost any character is the same as double quotes. So you could do things like this to avoid the need of backslashes:
$cmd .= qq(-U $user_name -P $password -S $host -d master -Q "BACKUP DATABASE [$database_name] TO DISK = N'$path'" );
$cmd .= qq|-U $user_name -P $password -S $host -d master -Q "BACKUP DATABASE [$database_name] TO DISK = N'$path'" |;
And so on...
Update: How to execute a system command in Perl. There are three basic ways:
system($cmd); # Goes through the shell if shell metacharacters are detected.
system(@command_and_args); # First element is the command, the rest are arguments.
system executes a command and waits for it to return. The return value is the exit status of the program.
my @results = `$cmd`; # Always goes through the shell.
Backticks execute a command and return its output. You should only use this if you actually need the output; otherwise, it is better to go with system.
exec $cmd;
exec @command_and_args;
exec is exactly like system, except that it never returns. It effectively ends your program by calling another program.
Use the one that is most appropriate to your situation. Or in this case, since you are executing SQL, consider using the DBI module. It's definitely a better approach for anything more than a couple of simple commands.
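For example, a minimal DBI sketch (the ODBC DSN is a placeholder, and it assumes $user_name, $password, $database_name and $path are set as in the question):
use DBI;

# Placeholder DSN; use whatever DBD driver reaches your SQL Server.
my $dbh = DBI->connect("dbi:ODBC:DSN=my_mssql_dsn", $user_name, $password,
                       { RaiseError => 1 });
# No shell, no quoting layers: the statement goes straight to the server.
$dbh->do("BACKUP DATABASE [$database_name] TO DISK = N'$path'");
$dbh->disconnect;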
Looks like you have your " characters in the wrong place. I'm not sure where they should be.
In your second line, the string literal:
"-U $user_name -P $password -S $host -d master -Q "
is immediately followed by the bareword
BACKUP

How to get input from cat over ssh in Perl

I'm trying to cat a remote file over ssh and process it in local script line by line. So far I've tried this
open(INPUT,"| ssh user@host cat /dir1/dir2/file.dat")
but obviously that only prints file.dat to STDOUT.
I know I can probably just scp the file and process it, but...
You're piping into ssh. I think you want to move the pipe to the other end so you can read the output from that cat command.
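That is, put the pipe at the reading end, or use the three-argument form, so the filehandle reads cat's output; something like:
# '-|' opens the handle for reading from the command's stdout,
# and the list form avoids the local shell entirely.
open(my $in, '-|', 'ssh', 'user@host', 'cat', '/dir1/dir2/file.dat')
    or die "Cannot run ssh: $!";
while (my $line = <$in>) {
    chomp $line;
    # ... process $line here
}
close $in;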
I would use
$file_contents = `ssh user@host cat /dir1/dir2/file.dat`;
@lines = split(/\n/, $file_contents);
# ... process the file contents
That captures the output of the command (i.e. the contents of the file).