Calling the coreftp application with Perl

I am attempting to loop through a set of files and upload to an FTP site using coreftp. I need to use core ftp and cannot use Net::SFTP or any other module.
When I run this command from CMD prompt, it works perfectly, but as you can see it is only for one file:
'"c:\program files\coreftp\coreftp.exe" -s -O -site My_Upload_Site -u //someserver/atextfile.txt -p /directory/'
I need help combining the foreach loop variable with the command line. How can I call the command and replace "//someserver/atextfile.txt" with $TheInputDir/$FileToUse from the foreach loop?
foreach $FileToUse (@FilesToUse)
{
    '"c:\program files\coreftp\coreftp.exe" -s -O -site My_Upload_Site -u //someserver/atextfile.txt -p /directory/'
    # once uploaded, move the file
    move ("$TheInputDir/$FileToUse", "$TheMoveDir/$FileToUse") or $MailMsg = $MailMsg . "ERROR: Moving files Failed! \n";
}
If there is a better way, I am always open to suggestions.

I have had a great deal of success with PuTTY, specifically PSCP. You can run PSCP on the command line first, to test it and make sure you are calling it correctly. Then, in your script, call it with system(), backticks, or however you want to. Whether it's better to use SCP or SFTP is a matter of debate, and I don't take sides here. For my purposes, PSCP worked very well.
Unfortunately, I don't currently have access to a remote server, or I would write a script to illustrate it. Still, it's easy to use and reliable.
Download: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
Usage example: pscp fred@example.com:/etc/hosts c:\temp\example-hosts.txt
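Putting that together with the loop from the question: a rough, untested sketch that calls the coreftp command from the question with system() and interpolates $TheInputDir/$FileToUse (pscp could be dropped in the same way). The quotes added around the source path and the success check on system()'s return value are my assumptions, not anything from Core FTP's documentation:
use File::Copy;   # for move(), if not already loaded
# @FilesToUse, $TheInputDir, $TheMoveDir and $MailMsg are assumed to be set
# earlier in the script, as in the question.
foreach my $FileToUse (@FilesToUse) {
    my $cmd = qq{"c:\\program files\\coreftp\\coreftp.exe" -s -O -site My_Upload_Site }
            . qq{-u "$TheInputDir/$FileToUse" -p /directory/};
    if (system($cmd) == 0) {
        # Only move the file after the upload command reports success.
        move("$TheInputDir/$FileToUse", "$TheMoveDir/$FileToUse")
            or $MailMsg .= "ERROR: Moving files Failed! \n";
    }
    else {
        $MailMsg .= "ERROR: Upload of $FileToUse failed! \n";
    }
}
system() returns 0 on success, so the move only happens when the upload command exits cleanly.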


How to send stderr in email shell script (ash)

I wrote a shell script that I use under ash, and I redirect stderr and stdout to a log file. I would like that log file to be emailed to me only if stderr is not empty.
I tried:
exec >mylog.log 2>&1
# Perform various find commands
if [TEST_IF_STDERR_NOT_EMPTY]; then
/usr/bin/mail -s "mylog" email@mydomain.com < mylog.log
fi
My question is twofold:
1- I get a "-sh: /usr/bin/mail: not found" error. It seems that the mail command doesn't exist under ash (or at least on my Linux box, which is a Synology NAS). What would be the alternative? Worst case, Perl is available, but I would prefer to use standard sh commands.
2- How do I test that stderr is not empty?
Thanks
For the second part, see: How to check if file is empty in bash
As for the first question: in your code you are calling mail, but lower in the post you are calling email. Check your code and make sure it is mail.
Use which mail to get the full path; maybe it is not installed in /usr/bin/.
Use find to locate mail.
If you can switch to another shell, run which mail there to get the full path, in case the path is only set up in that alternative shell.
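If falling back to Perl is acceptable (the question says it is available), here is a minimal, untested sketch of the whole idea: send stderr to its own file and only mail the log when that file is non-empty. The find command, file names, and the mail path are placeholders; use whatever which mail reports on your box.
#!/usr/bin/perl
# Sketch only: keep stderr in its own file so it can be tested separately.
my $log = 'mylog.log';
my $err = 'mylog.err';

# Placeholder for the various find commands from the question.
system("find /volume1 -name '*.tmp' > $log 2> $err");

# -s is true only when the file exists and has a size greater than zero.
if (-s $err) {
    # Mail both files; adjust /usr/bin/mail to whatever `which mail` reports.
    system("cat $log $err | /usr/bin/mail -s 'mylog' email\@mydomain.com");
}
The same emptiness test exists in plain sh as [ -s mylog.err ], which answers the second part directly.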

Using Netsh with PsExec

I'm trying to dump DHCP settings from an older server that's being decommissioned. I ran the command fine while on the actual server, but when trying to run it remotely using psexec, it keeps failing. The command is: psexec \\$source netsh dhcp server \\$source dump > $dhcpSettings
$source = the server being decommissioned
$dhcpSettings = the path to save the dumped settings
I've tried all sorts of combinations of encapsulating quotation marks, but still nothing. The errors I'm getting are "The system cannot find the file specified" and "The system cannot find the path specified".
EDIT: So I got rid of the path to save the dumped settings and now it works. But how should I format the command so that it'll save the settings to the remote computer's C:\USER.SET\LOG directory?
One solution might be to bundle the command you want to run and the stdout redirection into a one-line cmd file and use PsExec -c or -f to copy and execute that file on the remote system. As an example:
Create a one-line cmd file named DHCPSettings.cmd with the following in it and save it to C:\temp\:
netsh dhcp server \\localhost dump >c:\user.set\log\dhcpsetting.log
Then use
psexec \\$source -c c:\temp\DHCPSettings.cmd
You did not really provide any code to go by, and I am not sure I understand all the requirements and constraints you have, so consider this an idea, not the exact commands you need to run. Hope it helps.

Export PATH using Net::SSH::Perl

I am writing a script to automate some of the tedious tasks that we perform. I need to ssh to a server, change the PATH variable remotely, and have that variable persist for the next commands to be executed. Code below:
sub ab_tier {
    my $ssh = Net::SSH::Perl->new($host);
    $ssh->login($user2, $user2);
    my $PATH;
    my ($stdout, $stderr, $exit) = $ssh->cmd("export PATH=/usr/bin/sudo:/local/perl-5.6.1/bin:$PATH");
    my ($PATH, $stderr, $exit) = $ssh->cmd("echo $PATH");
    print $PATH; # Check the path for correctness: does not change
}
However the PATH does not change. Is there another way to implement this or am I doing something wrong. I need to automate tasks so dont think $ssh->shell would help here. Please suggest.
I made changes as per the suggestions and everything works fine. However, I am noticing another issue when trying to display environment variables.
my $cmd_ref_pri = {
    cmd0 => "echo $ENV{'HOME'}",
    cmd1 => "chmod 777 $ENV{'COMMON_TOP'}/temp"
};
Now I am connecting to a remote server using Net::SSH::Perl, and the value returned by $ENV{'HOME'} is my local home directory, not the remote server's. However, if I add a command as in:
my $cmd_ref_pri = {
    cmd0 => "whoami ; echo $ENV{'HOME'}",
    cmd1 => "chmod 777 $ENV{'COMMON_TOP'}/temp"
};
Then the user id displayed is that of the user I ssh to the remote server as. I do not have other modules installed; the only one available is Net::SSH::Perl, hence I am forced to use it.
The routine for executing commands:
sub ssh_cmd {
    # $cmd_sub contains the command, $ssh contains the object ref
    my ($cmd_sub, $ssh) = @_;
    my ($stdout, $stderr, $exit) = $ssh->cmd("bash", $cmd_sub);
    if ($exit != 0) {
        print $stdout;
        print "ERROR-> $stderr";
        exit 1;
    }
    return 0;
}
Any suggestions as to why this could happen?
cmd() is not passing your commands into one shell. It executes them in separate shells (or without any shell; the manual is not clear about it). As soon as your export PATH command finishes, that shell exits and the new PATH is lost.
It looks like it is possible to pass all the relevant commands to a single shell process as separate lines of $stdin:
my $stdin='export A=B
echo $A
';
$ssh->cmd("bash",$stdin);
This would work just like on interactive login (but without terminal control, so commands that talk directly to terminal would likely fail).
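Applied to the original PATH problem, that might look something like this (untested sketch; $ssh is the Net::SSH::Perl object from the question):
# The single-quoted heredoc stops Perl from interpolating $PATH locally,
# so the remote bash expands it instead.
my $remote_script = <<'END_CMDS';
export PATH=/usr/bin/sudo:/local/perl-5.6.1/bin:$PATH
echo $PATH
END_CMDS
my ($stdout, $stderr, $exit) = $ssh->cmd('bash', $remote_script);
print $stdout;    # should now show the modified remote PATH
Note that the question's original code interpolated $PATH inside Perl double quotes, so the local (empty) value was substituted before the command ever reached the remote shell; keeping the commands single-quoted avoids that as well.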
Anyway, Net::SSH::Perl does not look like the best tool for the job; I would rather use expect for automation.
Set PATH on every command call:
$ssh->cmd('PATH=/usr/bin/sudo:/local/perl-5.6.1/bin:$PATH echo $PATH');
And BTW, Net::SSH::Perl is not being maintained anymore; nowadays Net::SSH2 and Net::OpenSSH are better alternatives.
Write commands to a remote temp file, then execute that one. Or, skip the $PATH thing and use the full path for subsequent commands (assuming you know it).

Send data to PuTTY in PowerShell

I have a PowerShell script that opens putty.exe in a process, and I want to send data to this process. How can I do that? Please help!
The process:
$solExe = [diagnostics.process]::start("putty.exe", "-raw -P 2000 127.0.0.1")
The command-line interface for PuTTY is plink.exe. You can use plink to send commands over SSH.
For example:
PS C:\> c:\progra~2\putty\plink.exe -i C:\credentials\mykeyfile.ppk root@myserver.com "ls";
Things to remember:
The first time you connect to a server, you'll have to add it to your registry, so this won't work in a non-interactive mode for brand new servers. There isn't a way to disable this.
The key file has to be in ppk format for plink.exe to recognize it. If yours is in pem format, use puttygen.exe to create a ppk file.
The path to the key file cannot contain any spaces, or the command above won't work.
If you want to send multiple commands at once, write them to a file and use the -m switch with plink.exe.
If you need to transfer files, you can use pscp.exe in a similar fashion.

SSH problem - no such file or directory

I have a script on a remote host which I run as ./test /a/b/c/f, and it runs perfectly fine on that machine.
Now, from my host machine, I run the same script as ssh root@dst "./test /a/b/c/f" and this too runs fine.
But when I execute it from my Perl script using backticks as
$file = "/a/b/c/f";
`ssh root\@dst "./test $file"`;
or
system("ssh root\#dst \"./test $file\" ");
it says bash: ./test: No such file or directory.
I tried escaping $file with a single \ and with \\; even that does not work. Any idea how to solve this?
Thanks.
Have you tried using an absolute path instead of one based on ./? It will probably solve this problem, and it's safer in general (especially when connecting as root) than depending on whatever sets the cwd (probably bash) to set it the same way every time.
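For example, something like this in the Perl script (the /root/test path is only a guess at where the script really lives; substitute the actual absolute path):
my $file = "/a/b/c/f";
# Call the remote script by its absolute path instead of ./test
my $output = `ssh root\@dst "/root/test $file"`;
print $output;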