Unix: ssh works from the command line but not from a script (sh)

The command below works from the command line:
expect -c 'spawn ssh username@Host ; expect "assword:" ; send "<password>\r" ; interact;'
The following does not work when I include it in a script:
while read server_from_file
do
expect -c 'spawn ssh username@${server_from_file}; expect "assword:" ; send "<password>\r" ; interact;'
done < serverlist.conf
Please also let me know how I could run specific commands on the servers using the above script.

expect inherits stdin from its environment. In the first case, expect uses the script's stdin as its stdin. In the second case, expect inherits stdin from the enclosing while loop, so it will be reading from the file. One typical solution would be to use a different file descriptor for the loop, e.g.:
while read server_from_file <&3
do
  expect -c "spawn ssh username@${server_from_file}; expect \"assword:\"; send \"<password>\r\"; interact"
done 3< serverlist.conf
(Note, some shells provide read -u for this, and there are other ways to do it, but this should get you pointed in the right direction.)
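The same separate-file-descriptor idea carries over if you drive this from Perl with the CPAN Expect module, since the server list is then read through a lexical filehandle rather than STDIN. This is only a rough sketch under the question's setup; the username, the password placeholder and serverlist.conf are taken from the question:

use strict;
use warnings;
use Expect;

my ($user, $password) = ('username', '<password>');   # placeholders from the question

open my $list, '<', 'serverlist.conf' or die "Cannot open serverlist.conf: $!";
while (my $server = <$list>) {
    chomp $server;
    next unless length $server;
    # spawn ssh on its own pty; the server-list handle is untouched
    my $exp = Expect->spawn('ssh', "$user\@$server")
        or die "Cannot spawn ssh: $!";
    $exp->expect(30, -re => 'assword:');   # wait up to 30 seconds for the prompt
    $exp->send("$password\r");
    $exp->interact();                      # hand the session over to your terminal
    $exp->soft_close();
}
close $list;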

Related

Unable to take user input in Perl

I am having a strange issue. I have written a script which basically runs a Perl script on a remote server using ssh.
This part works fine, but after the above operation completes it asks the user to choose the next operation.
It shows the options at the command prompt, but when I type any input it does not appear on the screen; even after hitting Enter it stays the same.
I am not sure what the exact issue is, but it seems there is some problem with the ssh command, because if I comment out the ssh command it works fine.
OPERATION:
print "1: run the script in remote server \n2: Exit\n\nEnter your choice:";
my $input=<STDIN>;
chomp($input);
..........
sub run_script()
{
my $com="sshg3.exe server -q --user=user --password=pass -exec script >/dev/null";
system("$com");
goto OPERATION;
}
After the ssh command completes, the script shows on screen:
1: run remote script
2: exit
Enter your choice:
but whatever input I type is not displayed on the screen until I exit with Ctrl-C.
Can anyone please help with what might be the issue here?
One of the classic gotchas with ssh is that it normally runs interactively, and as such will attach STDIN by default.
This can result in STDIN being consumed by ssh rather than your script.
Try it with ssh -n instead.
If the -n option is not available to you, you can get the same effect by redirecting the command's standard input from the null device. Try this; it might work for you:
system("$com </dev/null");   # or <NUL when running under Windows cmd
As per https://support.ssh.com/manuals/client-user/62/sshg3.html, there is an option for redirecting input: use --dev-null (*nix) or --null (Windows).
-n, --dev-null (Unix), -n, --null (Windows)
Redirects input from /dev/null (Unix) and from NUL (Windows).
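Applied to the run_script sub from the question, that would look roughly like the sketch below (untested; the server, user, password and script names are the question's placeholders):

sub run_script {
    # -n (--dev-null on Unix, --null on Windows) keeps sshg3 from reading STDIN,
    # so the <STDIN> prompt later in the script still gets the user's input.
    my $com = "sshg3.exe server -q -n --user=user --password=pass -exec script >/dev/null";
    system($com) == 0
        or warn "sshg3 exited with status $?\n";
}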

How to ssh as different user, change group, and run a script within Perl

I need to be able to run a script from within a script but first I need to ssh as a different user and then change my group.
I am currently doing the following inside my perl script:
`ssh <user>@<host> ; newgrp <group> ; /script/to/run.pl`
When running this command from the command line it doesn't seem to switch groups. I assume this is because it's changing to a new shell.
How do I get around this and get it to work?
Also, please note, I do not have sudo/root privileges.
The first semicolon is interpreted by the local shell, so all three commands are run from the local host: newgrp and the script only execute locally, after the interactive ssh session ends. I think you want this
ssh <user>\@<host> "newgrp <grp>; /bin/run.pl"
salva, in his reply, answered my question:
sg $group -c '$cmd'
The reason the following command:
newgrp <group>
doesn't work is that it creates a new shell. At least that is my best guess. The sg command gets around this.
I have found the following to work (with ksh on HP-UX):
ssh user@host "echo 'date;pwd;echo bozo;id' | newgrp nerds;"
which basically executes the commands as user:nerds.
I think the OP wants to construct a string to execute from Perl, notice the backticks. Not sure, but the OP might have to use:
$s='ssh <user>@<host> ; newgrp <group> ; /script/to/run.pl'; # Normal single quotes, not backticks
exec($s);
OP, there are different ways to execute shell commands from a Perl script. You used backticks. There are also exec($s) and system($s).
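Putting salva's sg suggestion together with the backticks from the question gives roughly the following sketch; the user, host, group and script path are placeholders:

my ($user, $host, $group) = ('user', 'host', 'group');   # placeholders

# Run the script on the remote host under the desired group via sg.
my $output = `ssh $user\@$host "sg $group -c '/script/to/run.pl'"`;
die "remote command failed (status $?)" if $? != 0;
print $output;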

How to send stderr by email from a shell script (ash)

I wrote a shell script that I use under ash, and I redirect stderr and stdout to a log file. I would like that log file to be emailed to me only if stderr is not empty.
I tried:
exec >mylog.log 2>&1
# Perform various find commands
if [TEST_IF_STDERR_NOT_EMPTY]; then
/usr/bin/mail -s "mylog" email@mydomain.com < mylog.log
fi
My question is twofold:
1- I get a "-sh: /usr/bin/mail: not found" error. It seems that the mail command doesn't exist under ash (or at least on my Linux box, which is a Synology NAS). What would be the alternative? Worst case, Perl is available, but I would prefer to use standard sh commands.
2- How do I test that stderr is not empty?
Thanks
See: How to check if file is empty in bash. Since your script sends stdout and stderr to the same mylog.log, you would need to redirect stderr to its own file (for example exec >mylog.log 2>errors.log) and then test that the file is non-empty, e.g. with [ -s errors.log ].
As for the first question, in your code you are calling mail but lower in the post you are calling email. Check your code and make sure it is mail.
Use which mail to get the full path. Maybe it is not installed in /usr/bin/.
Use find to locate mail.
If another shell is available to you, start it and run which mail there to get the full path of mail, in case the path is only set up in that shell.

Automating an FTP session

I have the following excerpt from a perl script to automate an FTP session, I'm hoping someone can explain how it works.
system("rsh some_server ftp -in ftp.something.com << !
user anonymous someone\@somewhere.org
some ftp commands
bye");
The background: this Perl script runs on a Linux machine and remotes into a Solaris machine. The FTP session must be executed from the Solaris machine because the FTP site performs IP address checking.
Formerly this script ran on the Solaris machine directly (i.e. it didn't use rsh). I hacked it around and came up with this, which seems to work. However, I have little idea how; in particular, I don't understand the << ! bit at the end of the first line. It looks a little like a here-document, but I'm not really sure.
Any explanations welcome.
You are right, << is a heredoc, which is made clear by the following warning (which I get when I take out the rsh command):
sh: line 2: warning: here-document at line 0 delimited by end-of-file (wanted `!')
The construct
<< HEREDOC
feeds everything from the following line up to a line containing only HEREDOC (or up to end-of-file) to the command's standard input. When you put this after a command, it is equivalent to
command < file
where file contains the text of the heredoc. In your case the delimiter is ! instead of HEREDOC, so the ! itself is not passed to ftp but everything after it is. This is equivalent to
$ cat file
user anonymous someone\@somewhere.org
some ftp commands
bye
$ ftp -in ftp.something.com < file
rsh takes that entire command and runs it on your remote host.
As illustrated by user1146334's answer, this command does not follow the principle of least surprise. At the very least, make it less confusing by changing it to
system("rsh some_server ftp -in ftp.something.com << HEREDOC
user anonymous someone\@somewhere.org
some ftp commands
bye
HEREDOC");
Or even better, as mpapec mentioned in the comments, use Net::FTP and Net::SSH2.
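For reference, a minimal Net::FTP sketch of the FTP half might look like the following; the host and anonymous login come from the question, the directory and file names are made-up examples, and because of the IP address checking it would still have to run on the Solaris machine (e.g. via Net::SSH2 or rsh):

use strict;
use warnings;
use Net::FTP;

my $ftp = Net::FTP->new('ftp.something.com', Passive => 1)
    or die "Cannot connect: $@";
$ftp->login('anonymous', 'someone@somewhere.org')
    or die "Cannot login: ", $ftp->message;
# stand-ins for whatever "some ftp commands" actually were
$ftp->cwd('/some/dir') or die "cwd failed: ", $ftp->message;
$ftp->get('some_file') or die "get failed: ", $ftp->message;
$ftp->quit;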
Did you look at the man page?
-i    Turns off interactive prompting during multiple file transfers.
-n    Restrains ftp from attempting “auto-login” upon initial connection. If auto-login is enabled, ftp will check the .netrc (see netrc(5)) file in the user's home directory for an entry describing an account on the remote machine. If no entry exists, ftp will prompt for the remote machine login name (default is the user identity on the local machine), and, if necessary, prompt for a password and an account with which to login.

The client host and an optional port number with which ftp is to communicate may be specified on the command line. If this is done, ftp will immediately attempt to establish a connection to an FTP server on that host; otherwise, ftp will enter its command interpreter and await instructions from the user. When ftp is awaiting commands from the user the prompt ‘ftp>’ is provided to the user. The following commands are recognized by ftp:

! [command [args]]
      Invoke an interactive shell on the local machine. If there are arguments, the first is taken to be a command to execute directly, with the rest of the arguments as its arguments.
So essentially you're ftp'ing in and providing a new command per line in-line instead of from a file.

Can I execute a multiline command in Perl's backticks?

In Unix, I have a process that I want to run using nohup. However this process will at some point wait at a prompt where I have to enter yes or no for it to continue. So far, in Unix I have been doing the following
nohup myprocess <<EOF
y
EOF
So I start the process 'myprocess' using nohup and feed it a here-document containing 'y'. The lines above are effectively three separate commands: I hit Enter on the first line, then at the prompt I type 'y' and press Enter, and finally I type 'EOF' and hit Return again.
I want to execute this from Perl, but I am not sure how I can run this command since it spans three lines. I don't know if the following will work:
my $startprocess = `nohup myprocess <<EOF &
y
EOF
`;
Please help - thank you!
I think your proposal will work as is. If not, try replacing the redirect with a pipe:
my $startprocess = `(echo "y" | nohup myprocess) &`;
Also, depending on WHY you are doing a nohup, please look at the following pure-Perl daemonizing approach using Proc::Daemon: How can I run a Perl script as a system daemon in linux?
The Expect module can be used for interactive programs as well.
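A Perl-native variant of the pipe idea, using a pipe open instead of backticks, would be roughly the following sketch; 'myprocess' and the log file name are placeholders:

# Feed "y" to myprocess on its standard input; nohup keeps it running if the
# terminal goes away, and its output is captured in a log file.
open(my $proc, '|-', 'nohup myprocess >myprocess.log 2>&1')
    or die "Cannot start myprocess: $!";
print {$proc} "y\n";   # answer the yes/no prompt
close($proc);          # waits for myprocess; its exit status ends up in $?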