Pausing a perl script while SFTP transfers files - perl

FYI, I'm a complete newbie with Perl (as in I can spell it and not much more), so I'm trying to learn. What I'm trying to accomplish is using SFTP to transfer files from a Windows machine to a Linux machine.
I've noticed that Perl issues the SFTP get command but doesn't wait for the transfer to finish, so when the Perl script tries to use a file it can't find it. I know there is the sleep command, but the number and size of files will vary from week to week, so using sleep(600) seems a little silly.
Is there a standard way to pause a Perl script until SFTP finishes transferring all necessary files?
TIA.

Using Net::SFTP might have solved this dilemma, but my workplace won't allow me to download and install anything, especially in production. So rather than waiting on the usual bureaucracy, I did some more digging around and discovered this:
By calling SFTP in batch mode, with a separate file that contains the SFTP commands, the Perl script has to wait for SFTP to finish executing the commands in that "command" file. So with the batch mode option, the Perl script is paused for as long as it takes SFTP to finish transferring the files.
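For example, something along these lines works (the batch file name, user, and host here are made up; any command-line SFTP client that accepts a batch file, such as OpenSSH's sftp -b or PuTTY's psftp -b, behaves the same way):

# get_files.txt (placeholder name) holds the SFTP commands, e.g.:
#   get /outgoing/*.zip
#   bye

# system() does not return until sftp has worked through the whole
# batch file, so the script is paused for the duration of the transfer.
my $status = system('sftp', '-b', 'get_files.txt', 'user@remotehost');
die "sftp exited with status $?" if $status != 0;

# the files are local now, so it is safe to use them
foreach my $zip (glob '*.zip') {
    print "got $zip\n";
}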

Related

Run a script via FTP connection from PowerShell

I have made a script that does a really basic task: it connects to a remote FTP site, retrieves XML files, and deletes them afterward.
The only problem is that in the past we lost files, because they were added to the server while the delete statement was running.
open ftp.site.com
username
password
cd Out
lcd "E:\FTP\Site"
mget *.XML
mdel *.XML
bye
To prevent this from happening, we want to put a script on the FTP server (rename-files.ps1). The script will rename the *.xml files to *.xml.copy.
The only thing is I have no clue how to run the script through my FTP connection.
Some, but very few, FTP servers support the SITE EXEC command. In the very rare case that your FTP server does support it, you can use:
quote SITE EXEC powershell rename-files.ps1
Though in most cases you cannot execute anything on the FTP server if FTP is the only way you can access it. You would have to use another method to execute the script, like SSH, PowerShell Remoting, etc.
But your problem has other solutions:
rename files using FTP; or
delete only the files that were downloaded.
Both are doable, but you will need a better FTP client than Windows ftp.exe.
See for example A WinSCP script to download, rename, and move files.
Or you can do it like this:
Run ftp.exe once to retrieve a list of files;
Process the list in PowerShell to generate an ftp script with get and del commands for the specific files (no wildcards);
Run ftp.exe again with the generated script.
Actually, you do not need to use ftp.exe in PowerShell at all. The .NET Framework has its own FTP implementation: FtpWebRequest.
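If the tooling is flexible, the same "delete only the files that were downloaded" logic is also easy to express in Perl with Net::FTP. A rough sketch, using the host, folder, and local path from the question only as placeholders:

use strict;
use warnings;
use Net::FTP;

my $ftp = Net::FTP->new('ftp.site.com') or die "connect failed: $@";
$ftp->login('username', 'password')     or die "login failed: ", $ftp->message;
$ftp->cwd('Out')                        or die "cwd failed: ",   $ftp->message;
$ftp->binary;

# take a snapshot of what is on the server right now
my @files = grep { /\.xml$/i } $ftp->ls;

# download and delete only those exact names; files uploaded after
# this point are left untouched for the next run
for my $file (@files) {
    $ftp->get($file, "E:/FTP/Site/$file") or die "get $file: ",    $ftp->message;
    $ftp->delete($file)                   or warn "delete $file: ", $ftp->message;
}
$ftp->quit;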

Can I use AutoIT, running as a service on a server, using the send function

I'm trying to trigger Illustrator JavaScripts through the use of AutoIt and its Send function. AutoIt is invoked via a Perl script, and it works when I have Illustrator open and I run the Perl script from the command line. (The Perl script runs in a continuous loop, triggered by files arriving in a hot folder.) It also runs on the server while I have an active connection. However, when I disconnect (keeping the session alive), the AutoIt process no longer works. (I'm guessing it is because I'm using the Send function, which requires an active window.)
This is running on a Windows 2003 server.
Is this possible to do, or am I farting in the wind?
Thanks in advance.
CODE:
Run("C:\Program Files\Adobe\Adobe Illustrator CS6\Support Files\Contents\Windows\Illustrator.exe")
WinActivate("Adobe Illustrator CS6")
sleep (3000)
Send("!f")
Send("{DOWN 17}")
Send("{RIGHT 2}")
Send("{ENTER}")
Here is some documentation on the Send() function in AutoIt. If you look near the bottom of the page, after the key examples, it also recommends trying ControlSend() instead; ControlSend() targets a window or control directly, so it does not need the window to be active: http://www.autoitscript.com/autoit3/docs/functions/Send.htm

Should I turn a perl script that parses a /var/log/.* file into a daemon?

I am writing a perl script to parse, for example, /var/log/syslog.
The Perl script triggers subsequent tasks when particular events appear in the log. The log is parsed following the advice of this post:
Command line: monitor log file and add data to database
which, as I understand it, amounts to reading from a pipe.
Now I'd like this script to forever run in the background.
This sounds like a daemon to me, and the daemon program referenced in the following question seems ideal:
How can I run a Perl script as a system daemon in linux?
But from this post, it seems clear that daemons have no open file handles. So how can I have a daemon, or a Perl script that becomes a daemon, that monitors a log file?
It sounds like what you want is a daemon. In that case the advice given in the second post you reference is the best practice. However, you do have other options, like daemontools, which removes the fork complexity.
Daemons are allowed to have filehandles, but you should close STDIN, STDOUT, and STDERR because you shouldn't really use them anymore. A lot of this has to do with the way fork works on *nix systems. Just open the pipe filehandle after your second fork, and you shouldn't have any issues.
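A minimal sketch of what that looks like (the double fork and setsid are the usual daemonizing steps; the log path and the "ERROR" trigger are just placeholders):

#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# daemonize: detach from the terminal with the usual double fork
chdir '/'               or die "chdir: $!";
defined(my $pid = fork) or die "fork: $!";
exit if $pid;                     # original parent exits
setsid()                or die "setsid: $!";
defined($pid = fork)    or die "fork: $!";
exit if $pid;                     # session leader exits

# we no longer own a terminal, so point the std handles at /dev/null
open STDIN,  '<', '/dev/null' or die "STDIN: $!";
open STDOUT, '>', '/dev/null' or die "STDOUT: $!";
open STDERR, '>', '/dev/null' or die "STDERR: $!";

# only now open the pipe that follows the log
open my $log, '-|', 'tail', '-F', '/var/log/syslog'
    or die "cannot start tail: $!";

while (my $line = <$log>) {
    next unless $line =~ /ERROR/;   # placeholder trigger
    # ... kick off the follow-up task here ...
}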
This doesn't answer your question, but it is another route to consider, which may or may not be appropriate for you:
rsyslog can execute a program when a certain message is logged
See Filter Conditions for setting up the trigger, Templates for formatting the output that's passed to the script, and Actions > Shell Execute for specifying the executable.
Be sure to read the security implications, and note that rsyslog blocks while the external program runs. But if your script runs reliably quickly, it may be an option.

parallel SSH in perl

I am trying to create a script in Perl which can SSH to multiple hosts (500+), execute a desired command, and show the output on the screen. I have done this with the Net::OpenSSH module, as SSH keys are not configured and I am not allowed to configure them. So I have to use something that can supply the password when making the connection.
Due to the many connections, this takes a considerable amount of time. I searched for "parallel ssh in perl" and discovered that there is a module for opening parallel SSH connections (Net::OpenSSH::Parallel), but I read somewhere on some forums that I cannot capture output with this module the way I can with Net::OpenSSH ($ssh->capture('ls')).
So, how can I accomplish parallel SSH more quickly? I also welcome any other suggestions that could save time. Would using Net::OpenSSH in threads save time, or would it work just like the parallel module?
You can fork your program and manage the forks with something like Parallel::ForkManager, then do the SSH work and output capture with Net::OpenSSH in each child and display the results on the screen. You'll need to be careful with your IO though, since all those processes trying to write to STDOUT/STDERR at the same time will produce garbled results. You'll need to do something like the answer from this question (pipes between parent and child processes): fork() and STDOUT/STDERR to the console from child processes
Parallel programming is harder than serial, so be prepared for some fun :)
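A sketch of that combination (the host list, user, and command are placeholders, and password authentication with Net::OpenSSH needs IO::Pty installed). Instead of pipes it uses Parallel::ForkManager's data-return mechanism, so only the parent ever writes to STDOUT:

#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;
use Parallel::ForkManager;

my @hosts    = qw(host1 host2 host3);     # your 500+ hosts
my $user     = 'admin';                   # placeholder credentials
my $password = 'secret';
my $cmd      = 'uptime';                  # the command to run everywhere

my $pm = Parallel::ForkManager->new(20);  # 20 connections at a time

# collect each child's output in the parent so STDOUT is not garbled
$pm->run_on_finish(sub {
    my ($pid, $exit, $host, $signal, $core, $out_ref) = @_;
    print "==== $host ====\n", $$out_ref if $out_ref;
});

for my $host (@hosts) {
    $pm->start($host) and next;           # parent continues the loop
    my $ssh = Net::OpenSSH->new($host,
        user => $user, password => $password,
        master_opts => [-o => 'StrictHostKeyChecking=no']);
    my $out = $ssh->error
        ? "ssh failed: " . $ssh->error . "\n"
        : $ssh->capture($cmd);
    $pm->finish(0, \$out);                # child exits, sends output back
}
$pm->wait_all_children;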
One way is to use a shell script to execute your perl script:
#!/bin/bash
for host in $(cat myhosts)
do
    perl myperl.pl "$host" "$1" "$2" &
done
where myhosts is a file containing 500+ host names

How can I pause Perl processing without hard-coding the duration?

I have a Perl script that contains this code snippet, which calls the system shell to get some files by SFTP and unzip them with WinZip:
# Run script to get files from remote server
system "exec_SFTP.vbs";
# Unzip any files that were retrieved
foreach $zipFile (<*.zip>) {
system "wzunzip $zipFile";
}
Even if some files are retrieved, they are never unzipped, because by the time the files are retrieved and the SFTP connection is closed, the Perl script has already completed the unzip step, with the result that it doesn't find anything to unzip.
My short-term fix is to insert
sleep(60);
before the unzip step, but that assumes that the SFTP connection will finish within 60 seconds, which may sometimes be a gross over-estimate, and other times an under-estimate.
Is there a more sound way to cause Perl to pause until the SFTP connection is closed before proceeding with the unzip step?
Edit: Responders have questioned (and reasonably so) the use of a VB script rather than having Perl do the file transfer. It has to do with security -- the VB script is maintained by others and is authorized to do the SFTP.
Check the code in your *.vbs file. The system function waits for the child process to finish before execution continues. It appears that your *.vbs file is forking a background task to do the SFTP transfer and returning immediately.
In a perfect world your script would be rewritten to use Net::SFTP::Foreign and Archive::Extract.
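Roughly like this (host, login, and remote directory are placeholders; on Windows, password authentication with Net::SFTP::Foreign may need extra setup, so treat this only as a sketch):

use strict;
use warnings;
use Net::SFTP::Foreign;
use Archive::Extract;

# host, credentials, and remote directory are placeholders
my $sftp = Net::SFTP::Foreign->new('remotehost',
                                   user => 'user', password => 'secret');
$sftp->die_on_error("unable to connect");

# get() blocks until each file has been completely transferred
my $zips = $sftp->ls('/outgoing', names_only => 1, wanted => qr/\.zip$/i);
for my $file (@$zips) {
    $sftp->get("/outgoing/$file", $file)
        or die "download of $file failed: " . $sftp->error;
}

# unzip with Archive::Extract instead of shelling out to WinZip
for my $zip (glob '*.zip') {
    my $ae = Archive::Extract->new(archive => $zip);
    $ae->extract or die $ae->error;
}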
An ugly, quick-hackish way might be to create a touch file before your first system call, alter your SFTP-fetching script to delete the file once it is done, and then wait in a while loop like so:
while(-e 'touch.file') {
sleep 5;
}
# foreach [...]
Of course, you would need to take care of the case where your .vbs fails and leaves the touch file undeleted, and of many other bad side effects. This would be a quick solution (if none of the other suggestions work) until you get the time to rewrite without system() calls.
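Put together, the hack could look like this (the sentinel file name is made up, and exec_SFTP.vbs would have to be changed to delete it when the transfer finishes):

# create the sentinel before kicking off the transfer
my $sentinel = 'sftp.inprogress';
open my $fh, '>', $sentinel or die "cannot create $sentinel: $!";
close $fh;

system "exec_SFTP.vbs";

# wait until the SFTP script removes the sentinel
sleep 5 while -e $sentinel;

foreach my $zipFile (glob '*.zip') {
    system "wzunzip $zipFile";
}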
You need a way for Perl to wait until the SFTP transfer is done, but as your script is currently written, Perl has no way of knowing this. (It looks like you're combining at least two scripting languages and a (GUI?) SFTP client; this can work, but it's not exactly reliable or robust. Why use VBscript to start the SFTP transfer?)
I can think of four options:
Your Perl script could do the SFTP transfer itself, using something like CPAN's Net::SFTP module, rather than spawning an external job whose status it cannot track.
Your Perl script could spawn a command-line SFTP utility (like PSFTP) that doesn't return until the transfer is done.
Or you could change the exec_SFTP.vbs script so it does not return until the transfer is done.
If you're currently using a graphical SFTP client and can't switch for whatever reason, I'd recommend using a scripting language like AutoIt instead of Perl. AutoIt has features to wait for windows to change state and so on, so it could more easily monitor for an activity's completion.
Options 1 or 2 would be the most robust and reliable.
The best I can suggest is modifying exec_SFTP.vbs to exit only after the file transfer is complete. system waits for the program it called to complete, so that should solve your problem:
system LIST
system PROGRAM LIST
Does exactly the same thing as "exec LIST", except
that a fork is done first, and the parent process
waits for the child process to complete.
If you can't modify the VBS script to stay alive until the transfer finishes, you may be able to track subprocess creation. If you can get the subprocess IDs, you can monitor them and thereby know when the VBS script's various offspring terminate.
Win32::Process::Info lets you get the subprocess IDs of a running process.
Maybe this is a dumb question, but why not just use the Net::SFTP and Archive::Extract Perl modules to download and unzip the files?
system will not return until the shell it is running the command in has returned; this can go wrong when launching graphical programs or programs started via file associations, where the shell hands the job off and returns immediately.
See if any of the following help:
# Option 1: run the script through cscript; system() waits for cscript to exit
system('cscript exec_SFTP.vbs');

# Option 2: launch it with Win32::Process and wait for it explicitly
use Win32::Process;
use Win32;
Win32::Process::Create(my $proc, 'wscript.exe',
    'wscript exec_SFTP.vbs', 0, NORMAL_PRIORITY_CLASS, '.');
$proc->Wait(INFINITE);
Have a look at IPC::Open3
IPC::Open3 - open a process for reading, writing, and error handling using open3()
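For example, something like this (a sketch; the only assumption beyond the question is that the VBScript is run through cscript, so there is a console process to wait on):

use strict;
use warnings;
use IPC::Open3;
use Symbol 'gensym';

# start the script and keep handles to its input, output, and errors
my $err = gensym;
my $pid = open3(my $in, my $out, $err,
                'cscript', '//nologo', 'exec_SFTP.vbs');

while (my $line = <$out>) {   # read whatever the script prints
    print $line;
}
waitpid($pid, 0);             # block until the child has really exited
print "SFTP script finished with exit status ", $? >> 8, "\n";

As with the plain system() call, this only helps if the VBScript itself stays alive until the transfer is complete.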