Non-blocking child process blocks file - perl

Consider this scenario:
We have three scripts:
script.pl
use strict;
use warnings;
print "\nStarting a blocking process";
print "\nRedirect the output of the blocking process to execution.log";
my $cmd = "perl d:\\blocking_script.pl >d:\\execution.log";
my $exitCode = system ($cmd);
print "\nAfter the execution of the blocking process";
print "\nNow I try to rename the log";
rename "d:\\execution.log", "d:\\execution.err" or print "\nCouldn't rename because : $!";
blocking_script.pl
use strict;
use warnings;
print "\nFrom the blocking_process I run a non-blocking process";
my $cmd = "start perl d:\\non_blocking_script.pl";
my $exitCode = system ($cmd);
print "\nAfter I started the non-blocking process";
non_blocking_script.pl
use strict;
use warnings;
print "\nI am an independent non-blocking process";
sleep 5;
print "\nStill here";
sleep 2;
print "\nYou can't rename the log because you didn't wait for me";
sleep 3;
print "\n.";
sleep 1;
What will result from this?
Couldn't rename because : Permission denied
Meanwhile, another command prompt window will be hanging around, ironically displaying:
I am an independent non-blocking process
Still here
You can't rename the log because you didn't wait for me
.
In my situation, from Perl I run an external application in a blocking way, but that application was starting some non-blocking processes which were holding my log.
How can I overcome this situation?

Here is the documentation for start (which you should also be able to read by running start /? on the command line). I do not have access to a Windows system right now, so I can't verify.
/b
Starts an application without opening a new Command Prompt window. CTRL+C handling is ignored unless the application enables CTRL+C processing. Use CTRL+BREAK to interrupt the application.
blocking_script.pl is waiting for the cmd window which start opened to run non_blocking_script.pl.
In the short run, using start /b might help.
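For example (only a sketch of the same start line from the question with /b added; as noted, I can't verify it on Windows right now), the relevant lines in blocking_script.pl would become:
# /b runs the script without opening a new Command Prompt window
my $cmd = "start /b perl d:\\non_blocking_script.pl";
my $exitCode = system ($cmd);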
Or, you could try
my @cmd = (start => qw(perl d:\\non_blocking_script.pl));
my $exitCode = system @cmd;
However, you should change your design.
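The answer doesn't spell out the redesign. Purely as a hedged sketch of one possible workaround (the retry count and delay are my own assumptions), script.pl could retry the rename until the grandchild releases the log:
my $renamed;
for my $try (1 .. 10) {
    $renamed = rename "d:\\execution.log", "d:\\execution.err";
    last if $renamed;
    sleep 1;    # give the non-blocking grandchild time to let go of the handle
}
print "\nCouldn't rename because : $!" unless $renamed;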

Related

How to use Perl to check when a Unix command has finished processing

I am working on a capstone project and am hoping for some insight.
This is the first time I've worked with Perl, and it's pretty much a basic Perl script to automate a few different Unix commands that need to be executed in a specific order. There are two lines in the script which execute a Unix command that needs to finish processing before it is acceptable for the rest of the script to run (the data will be incorrect otherwise).
How am I able to use Perl (or maybe this is a Unix question?) to print a simple string once the Unix command has finished processing? I am looking into ways to read in the Unix command name, but I am not sure how to check whether the process is still running and then print a string such as "X command has finished processing" upon its completion.
Example:
system("nohup scripts_pl/RunAll.pl &");
This runs a command in the background that takes time to process. I am asking how I can use Perl (or Unix?) to print a string once the process has finished.
I'm sorry if I've misunderstood the context of your question.
But couldn't you use Perl's fork function instead of & if you want to run processes in parallel?
use feature 'say';    # needed for say() below

# parent process
if (my $pid = fork) {
    # this block behaves as a normal (foreground) process
    system("nohup scripts_pl/RunAll2.pl"); # you can run another command here (like RunAll2.pl)
    wait;                                  # wait for the background child to finish
    say 'finished both';
}
# child process
else {
    # this block behaves as a background process
    system("nohup scripts_pl/RunAll.pl"); # same command as before, but without the trailing &
}
You could try to use IPC::Open3 instead of system:
use IPC::Open3;
my $pid = open3("<&STDIN", ">&STDOUT", ">&STDERR", 'nohup scripts_pl/RunAll.pl');
waitpid( $pid, 0 );
Or, if you need to run nohup through the shell:
my $pid = open3("<&STDIN", ">&STDOUT", ">&STDERR", 'bash','-c', 'nohup scripts_pl/RunAll.pl & wait');
Update: Thanks to @ikegami. A better approach if you would like STDIN to stay open after running the command:
open(local *CHILD_STDIN, "<&", '/dev/null') or die $!;
my $pid = open3("<&CHILD_STDIN", ">&STDOUT", ">&STDERR", 'nohup scripts_pl/RunAll.pl');

How to redirect output of Win32::Process command to text file?

I am running a command in a Perl script using Win32::Process, and I need to redirect the output of that command to a text file. After doing a bit of research, this is what I am trying:
use Win32::Process;
open (OLDOUT, ">&STDOUT");
open (OLDERR, ">&STDERR");
my $file = "output.txt";
open (STDOUT, ">$file");
open (STDERR, ">&STDOUT");
my $timeout = 1000 * 60; # 60 second timeout
my $proc;
my $exit;
my $exe = "C:/Windows/System32/cmd.exe";
Win32::Process::Create($proc, $exe, "echo hello from process", 1, DETACHED_PROCESS, ".");
$proc->Wait($timeout);
$proc->GetExitCode($exit);
system("echo hello from system"); # To verify that the redirect is working
close (STDOUT);
close (STDERR);
open (STDOUT, ">&OLDOUT");
open (STDERR, ">&OLDERR");
close (OLDOUT);
close (OLDERR);
Unfortunately this is not working. In the output.txt file, I only get "hello from system". Is there a way to accomplish what I want using Win32::Process?
The reason I am using Win32::Process rather than backticks is because my command sometimes crashes, and I need to provide a timeout in order to kill it if necessary. The ->Wait() function of Win32::Process allows me to do this.
I would rather have a solution using Win32::Process, as I am limited to which modules I have access to. However, if this really cannot be done, I would welcome an example solution using some other module.
Thank you.
You are specifying DETACHED_PROCESS when you start the process. The effect of this is the following:
DETACHED_PROCESS 0x00000008
For console processes, the new process does not inherit its parent's console (the default).
See Process Creation Flags.
The reason passing "echo hello from process" as the command line to Win32::Process doesn't work is that echo is a cmd.exe builtin. You need to use the command line 'cmd /c "echo hello from process"' instead, as shown below:
#!/usr/bin/env perl
use strict;
use warnings;
use File::Which qw(which);
use Win32;
use Win32::Process;
open OLDOUT, ">&STDOUT";
open OLDERR, ">&STDERR";
my $file = 'output.txt';
open STDOUT, ">$file";
open STDERR, ">&STDOUT";
my $timeout = 15 * 1_000;
my ($proc, $exit);
my $exe = which 'cmd.exe';
Win32::Process::Create($proc, $exe, 'cmd /c "echo hello from spawned process"', 1, 0, '.');
$proc->Wait($timeout);
$proc->GetExitCode($exit);
print "Doing work ...\n"; sleep 3;
print "Spawned process exited with $exit\n";
close STDERR;
close STDOUT;
open STDERR, ">&OLDERR";
open STDOUT, ">&OLDOUT";
close OLDERR;
close OLDOUT;
Running the script and then showing the contents of output.txt:
$ perl main.pl
$ type output.txt
hello from spawned process
Doing work ...
Spawned process exited with 0

Display output of the command over SSH on run time

I am running a perl script over ssh like the below:
[dipuh#local ~]$ ssh dipuh#myremote_001 'perl /home/dipuh/a.pl'
The content of a.pl is the below:
print "Sleeping \n";
sleep(60);
print "Waking Up";
Here my local terminal waits for the Perl script to finish executing and only then displays the complete output. Even the initial "Sleeping" text is printed only together with the final output.
Is there any way I can display, in my local terminal, the output of each command in the Perl script at run time, instead of waiting for the whole Perl script to finish?
You are suffering from buffering.
You may either set $| to 1 for the block.
{
    local $| = 1;
    print "Sleeping \n";
    sleep(60);
    print "Waking Up";
}
Or use IO::Handle
use IO::Handle;
STDOUT->autoflush(1);
You can try to turn the autoflush mode on. The old-fashioned way to do it is by adding the following at the top of your script:
$| = 1;
or you can do it the more modern way:
use IO::Handle;
STDOUT->autoflush(1);
Alternatively, you can flush the STDOUT on demand with:
use IO::Handle;
print "Sleeping \n";
STDOUT->flush;
sleep(60);
print "Waking Up";
STDOUT->flush;

Run external command and process its log file in parallel

I need to run an external tool from within my Perl code. The command runs for a pretty long time, prints almost nothing to STDOUT, but creates a log file.
I would like to run it and in parallel read and process its log file. How can I do it in Perl?
Thanks in advance.
If you use something like File::Tail to read the log file, then you can do a simple fork and exec to run the external command. Something like the following should work:
use strict;
use warnings;
use File::Tail;
my $pid = fork;
if ( $pid ) {
    # in the parent process; open the log file and wait for input
    my $tail = File::Tail->new( '/path/to/logfile.log' );
    while ( my $line = $tail->read ) {
        # do stuff with $line here
        last if $line eq 'done running'; # we need something to escape the loop
                                         # or it will wait forever for input.
    }
} else {
    # in the child process, run the external command
    exec 'some_command', 'arg1', 'arg2';
}
# wait for child process to exit and clean it up
my $exit_pid = wait;
If there are problems running the child process, the exit return code will be in the special variable $?; see the documentation for wait for more information.
Also, if the logging output does not provide a clue for when to stop tailing the file, you can install a handler in $SIG{CHLD} which will catch the child process's termination signal and allow you to break out of the loop.
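A rough sketch of that idea, building on the parent branch above (the $child_done flag is made up, and File::Tail's read can still block while the log is quiet, so treat this only as a starting point):
use POSIX ':sys_wait_h';

my $child_done = 0;
$SIG{CHLD} = sub {
    # reap finished children; after waitpid, $? holds the child's exit status
    while ( ( my $kid = waitpid( -1, WNOHANG ) ) > 0 ) {
        $child_done = 1 if $kid == $pid;
    }
};

while ( !$child_done ) {
    my $line = $tail->read;
    # do stuff with $line here
}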

How can I run a system command in Perl asynchronously?

I currently have a Perl script that runs an external command on the system, gathers the output, and performs some action based on what was returned. Right now, here is how I run this (where $cmd is a string with the command setup):
@output = `$cmd`;
I'd like to change this so if the command hangs and does not return a value after so much time then I kill the command. How would I go about running this asynchronously?
There are a LOT of ways to do this:
You can do this with a fork (perldoc -f fork)
or using threads (perldoc threads). Both of these make passing the returned information back to the main program difficult.
On systems that support it, you can set an alarm (perldoc -f alarm) and then clean up in the signal handler.
You can use an event loop like POE or Coro.
Instead of backticks, you can use open(), or open2/open3 (cf. IPC::Open2, IPC::Open3), to start the program while reading its STDOUT/STDERR through a file handle, and run non-blocking read operations on it (perldoc -f select, and probably google "perl nonblocking read").
As a more powerful variant of the openX()'s, check out IPC::Run/IPC::Cmd; there is a small sketch of that right after this list.
Probably tons I can't think of in the middle of the night.
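To illustrate the IPC::Run route from the list above with a small sketch (the command and the 30-second limit are placeholders), its timeout helper turns a hung command into an exception you can catch:
use IPC::Run qw(run timeout);

my @cmd = ('some_command', 'arg1', 'arg2');   # placeholder command
my ($in, $out, $err) = ('', '', '');
eval {
    # run() collects stdout/stderr and dies if the command takes longer than 30s
    run \@cmd, \$in, \$out, \$err, timeout(30);
};
if ( $@ =~ /timeout/ ) {
    warn "command timed out\n";   # IPC::Run aborts the run when the timer expires
}
else {
    print $out;                   # much like what the backticks would have returned
}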
If you really just need to put a timeout on a given system call, that is a much simpler problem than asynchronous programming.
All you need is alarm() inside of an eval() block.
Here is a sample code block that puts these into a subroutine that you could drop into your code. The example calls sleep, so it isn't exciting output-wise, but it does show the timeout functionality you were interested in.
Output of running it is:
/bin/sleep 2 failure: timeout at ./time-out line 15.
$ cat time-out
#!/usr/bin/perl
use warnings;
use strict;
my $timeout = 1;
my @cmd = qw(/bin/sleep 2);
my $response = timeout_command($timeout, @cmd);
print "$response\n" if (defined $response);
sub timeout_command {
    my $timeout = shift;
    my @command = @_;
    undef $@;
    my $return = eval {
        local $SIG{ALRM} = sub { die "timeout"; };
        alarm($timeout);
        my $response;
        open(CMD, '-|', @command) || die "couldn't run @command: $!\n";
        while (<CMD>) {
            $response .= $_;
        }
        close(CMD) || die "Couldn't close execution of @command: $!\n";
        $response;
    };
    alarm(0);
    if ($@) {
        warn "@cmd failure: $@\n";
    }
    return $return;
}
If your external program doesn't take any input, look for the following words in the perlipc manpage:
Here's a safe backtick or pipe open for read:
Use the example code and guard it with an alarm (which is also explained in perlipc).
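A minimal sketch of that combination, using the list form of the pipe open (which, like the perlipc example, avoids the shell); the command name and the 10-second limit are placeholders, and note that the alarm only aborts the read, it does not kill the child by itself:
my @output = eval {
    local $SIG{ALRM} = sub { die "timeout\n" };
    alarm 10;
    # list-form pipe open: reads the command's STDOUT without involving a shell
    open my $kid, '-|', 'some_command', 'arg1'
        or die "can't run command: $!\n";
    my @lines = <$kid>;
    close $kid;
    alarm 0;
    @lines;
};
alarm 0;    # make sure the alarm is cleared even if the eval died
warn "command timed out\n" if $@ && $@ eq "timeout\n";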
I wrote the code below to run rsync on 20 directories simultaneously (in parallel instead of sequentially, which would have required me to wait hours for it to complete):
use threads;
for my $user ( keys %users ) {
    my $host = $users{$user};
    async {
        system <<~ "SHELL";
            ssh $host \\
              rsync_user $user
            SHELL
    };
}
$ pgrep -lf rsync | wc -l
20
Not sure if it's best or even a good solution, but I was glad that it worked for my use case.
With this you get mixed output on screen (which I ignored anyway), but it does its job successfully.
The threads pragma exports the (very useful) async function by default.
rsync_user is my Perl script that wraps the rsync command with its options and the source and target directories already set.
Ran on FreeBSD 13.1 with Perl 5.32.1
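If you also want the parent script to wait for all twenty transfers before exiting, a small hedged addition using the standard threads API would be:
# after the for loop: block until every async thread has finished
$_->join for threads->list;
print "all rsync jobs done\n";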