Disk usage in Perl Core

I'm looking for a way to check the remaining free space on a disk from within Perl. I can't use CPAN modules, since I have to deploy the script on many servers running different versions of Perl, and my team leader has ruled out changing that.
Any ideas? I tried File::stat, but it can't tell me the free space on a drive such as D:\ (the script runs on Windows).
Thanks!

On Windows servers you can run this cmd command via Perl's built-in system() function:

fsutil volume diskfree C:
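To actually use the number in the script, capture the output with backticks rather than system(). A minimal sketch, assuming fsutil's English output; free_bytes is a hypothetical helper, and the exact label ("Total # of free bytes") can vary across Windows versions:

use strict;
use warnings;

# free_bytes() is a hypothetical helper: it scrapes fsutil's output and
# returns the first "free bytes" figure it finds, or undef on failure.
sub free_bytes {
    my ($drive) = @_;                          # e.g. 'D:'
    my @out = `fsutil volume diskfree $drive`;
    return if $?;                              # fsutil failed (bad drive, no rights)
    for my $line (@out) {
        if ($line =~ /free bytes\s*:\s*([\d,]+)/i) {
            (my $n = $1) =~ s/,//g;            # strip thousands separators
            return $n;
        }
    }
    return;                                    # expected label not found
}

my $free = free_bytes('D:');
print defined $free ? "D: has $free bytes free\n"
                    : "Could not determine free space\n";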

Related

Perl - Directory Management on Different Operating Systems

I am new to Perl. I am using the following command to remove a folder in Perl, under Windows:
system "del trash_folder";
But I intend to run the same script under Unix as well. I could detect the OS in the code and run a different command based on it, but is there no better way in Perl? I am thinking of an API or similar that is OS-agnostic.
The del command is never going to create a new directory, is it? :-)
If you want to create directories, use Perl's built-in mkdir function.
If you want to remove (empty) directories, use the built-in rmdir function.
Update: In general, if you have a choice between a Perl built-in function and an external command, the Perl function will be the better choice. Firstly, your code will be more portable; secondly, opening a sub-shell to run the external command will slow your program down.
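For example, a portable sketch using only a built-in and the core File::Path module (which ships with every Perl, so there is nothing to install); trash_folder is the directory from the question:

use strict;
use warnings;
use File::Path qw(rmtree);    # core module, no CPAN install required

my $dir = 'trash_folder';

mkdir $dir or die "Cannot create $dir: $!";   # same code on Windows and Unix

rmdir $dir or die "Cannot remove $dir: $!";   # removes empty directories only

# For a directory that still has contents, rmtree() from File::Path
# removes the whole tree portably:
# rmtree($dir);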

Using Perl modules vs. using system() calls

Quite recently, I wrote a few scripts in Perl for a cPanel plugin. Though most of the code was Perl, there were quite a lot of system() calls as well, which I used to execute shell commands directly.
I am pretty sure there are Perl modules I could have used instead. Given the time crunch, I thought using system() was easier (to complete the project on time). In retrospect, I think that was bad programming practice.
My question is: is there any tradeoff, memory-wise or otherwise, between using Perl's modules and using system() calls? For example, what would be the difference between:
my $directory = "temp";
mkdir $directory;
and
system ("mkdir temp");
Also, if I am to use Perl modules, wouldn't that involve installing a whole lot of modules in the beginning?
The most obvious economy is that, in the first case, your Perl process creates the directory itself, while in the second, Perl starts a new process that runs a command shell, which parses the command line and runs its mkdir command to create the directory; the child process is then destroyed. You create and destroy a process, and run the shell, for every call to system(): there is no caching of processes or similar economy.
The second thing that comes to mind is that, if the built-in mkdir fails, it is simple to handle the error in Perl, whereas shelling out to run a mkdir command puts your program at a distance from the error, and it is far more awkward to handle the many different problems that may arise.
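A sketch of that contrast (list-form system() is used to avoid the shell; this assumes a Unix-like system where mkdir exists as an external command):

use strict;
use warnings;

# Built-in: on failure, $! carries the precise reason ("File exists", ...).
mkdir 'temp' or die "Cannot create temp: $!";

# system(): all you get back is an exit status that you must decode yourself,
# and any error text goes to stderr rather than to your program.
my $status = system('mkdir', 'temp2');
die sprintf "external mkdir failed with exit code %d\n", $status >> 8
    if $status != 0;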
There is also the question of maintainability and portability, which will affect you even if you aren't expecting to run your program on more than one machine. Once you hand control to a system command, you have no control over what happens. Someone could have installed a mkdir that deletes your home directory or, less disastrously, your program may find itself on a system where mkdir doesn't exist or does something slightly different.
In the particular case of mkdir, this is a built-in Perl operator and is part of every Perl installation. There are also many core libraries that require you to put use Module in your program, but are already installed and need no further action.
I am sure others will come up with more reasons to prefer a Perl operator or module over a shell command. In general, you should prefer to keep everything you can within the language. There are only a few cases where you have to run a third-party program, and they usually involve custom software that allows you to act on proprietary data formats.

PowerShell for scripting large analysis runs

I'm completely new to PowerShell, and I know that a number of people use it to automate tasks much the way bash and C shell scripting are used on *NIX. I've successfully recompiled some ancient analysis software, written in FORTRAN, that takes individual input files. I now need to run just under 1000 cases with only slightly varied input files. The analysis software writes intermediate files, so for concurrent runs every run has to be in a different directory. Each case can take up to 40 minutes to solve, so running these individually would take a lot of time and be prone to error.
So now for the question: can PowerShell automate this, and is there some similar script out there that I can modify to do it?
The automation would need to do the following (as I see it):
Take in an input file listing the various runs to be performed
Create a subdirectory relative to the run name/number
Save a version of the input files, with the variables switched, in the subdirectory
Run the analysis software in the subdirectory
Look at standard/error output of analysis software to confirm it was successful
Append the success or failure of each run to a file
Ideally would be able to run up to some number of analyses concurrently (4-6 for my machine)
If IT reboots the machine (as they do whenever they choose), I'd like to be able to restart where it left off, though I expect to lose whatever the analysis software was running during the forced reboot.
I've tried recompiling the software with vectorization and automatic parallelization, and on the tested cases the convergence time was only minimally reduced, so it is safe to assume that this is effectively single-threaded.
PowerShell has lots of familiar aliases for Unix users. ls, cat, cp etc. are implemented as aliases for native PowerShell commands. The commands are not case-sensitive. What's more, you can search the help using an alias's name. That is,
man ls <=> get-help get-childitem
apropos <=> get-help <keyword>
get-help loop
about_Break
about_Continue
about_do
about_For
about_Foreach
about_Language_Keywords
...
This should help when converting an existing script. For the rest, I'll give some hints, as the description is somewhat vague; a sketch that puts them together follows the list.
Get-Content is used to read file contents into a variable: $myVar = cat c:\some\file.txt.
Directory creation is just md.
Capturing exe output is done by assigning to a variable: $exeOutput = c:\myApp.exe
Adding stuff to a file is Add-Content.
Background jobs are started with Start-Job.
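Putting those pieces together, here is a minimal sketch of the run loop. The paths, the runs.txt format, the CONVERGED success test, and the done.txt restart marker are all assumptions to adapt:

$runs = cat C:\analysis\runs.txt                  # one run name per line

foreach ($run in $runs) {
    $dir = "C:\analysis\$run"
    if (Test-Path "$dir\done.txt") { continue }   # finished earlier: skip on restart

    md $dir -Force | Out-Null
    cp C:\analysis\template.inp "$dir\input.inp"  # stand-in for writing the varied input

    # Throttle to 4 concurrent jobs.
    while (@(Get-Job -State Running).Count -ge 4) { Start-Sleep -Seconds 30 }

    Start-Job -ArgumentList $dir -ScriptBlock {
        param($dir)
        Set-Location $dir
        $exeOutput = C:\analysis\solver.exe input.inp 2>&1
        $result = if ($exeOutput -match 'CONVERGED') { 'success' } else { 'failure' }
        Add-Content C:\analysis\results.log "$dir : $result"
        Set-Content "$dir\done.txt" $result       # marker for the restart logic
    } | Out-Null
}

Get-Job | Wait-Job | Out-Null                     # let the last jobs finish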

Pausing a perl script while SFTP transfers files

FYI, I'm a complete newbie with Perl (as in, I can spell it and only a little more), so I'm trying to learn. What I'm trying to accomplish is using SFTP to transfer files from a Windows machine to a Linux machine.
I've noticed that Perl issues the SFTP get command but doesn't wait for the transfer to finish, so when the Perl script tries to use a file, it can't find it. I know there is the sleep command, but the number and size of the files will vary weekly, so using sleep(600) seems a little silly.
Is there a standard way to pause a Perl script until SFTP finishes transferring all necessary files?
TIA.
Using Net::SFTP might have solved this dilemma, but my workplace won't allow me to download and install anything, especially in production. So rather than waiting on the typical bureaucracy, I did some more digging around and discovered this:
Calling sftp in batch mode, with a separate file that contains the SFTP commands, makes the Perl script wait until sftp has finished executing every command in that file. The call doesn't return until the transfers are complete, so the script is paused exactly as long as sftp needs to do its work.
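A minimal sketch of that approach; the file names, the remote path, and user@linuxhost are examples, and this assumes the OpenSSH sftp client (its -b flag selects batch mode):

use strict;
use warnings;

# Write the SFTP commands to a batch file.
open my $fh, '>', 'transfer.batch' or die "Cannot write batch file: $!";
print $fh "get /remote/incoming/*.csv\n";
print $fh "bye\n";
close $fh or die "Cannot close batch file: $!";

# -b runs sftp in batch mode; system() blocks until sftp exits, so the
# script continues only after every command in the batch file has run.
my $status = system('sftp', '-b', 'transfer.batch', 'user@linuxhost');
die "sftp failed (exit status $status)\n" if $status != 0;

# The transferred files are safe to use from here on.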

Executing a command line from JConsole

I've recently discovered the joy of going through JConsole.exe instead of J.exe to run various scripts. There's generally a noticeable performance gain.
However, sometimes I need to use wd winexec (calling ad-hoc programs, for example), and 11!:0 (wd) support is not available in the console.
Is there a way to send a command from JConsole.exe to the regular Windows command line interpreter? Or maybe a workaround?
You might try the task script. See the script itself for documentation.
J6: ~system/packages/misc/task.ijs
J7: ~system/main/task.ijs
It contains utilities such as fork_jtask_, spawn_jtask_, and shell_jtask_.
You can load the script in both versions with: require 'task'
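Once loaded, something like the following should run a Windows command from the console and return its output (check the script's own documentation for the exact verb signatures; 'dir' is just a stand-in command):

require 'task'
shell_jtask_ 'dir'   NB. hand a command line to the shell and capture its output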