Multiprocessing via Perl on Windows

I wrote this code, which should open several processes. The problem is that it works well on Linux, but when I execute it on Windows it creates just one process! Is it possible to create multiple processes on Windows with Perl?
$j = $ARGV[0];
for ($i = 1; $i <= $j; $i++) {
    system("perl example.pl word.txt.$i &");
}

& is a *nix thing. An explicit fork in Windows will do it.
Bear in mind that Windows implementations of Perl emulate forking using threads, so that may be another option.
my @pids;
for my $i (1 .. $j) {
    my $pid = fork;
    unless ($pid) {    # child
        system("perl example.pl word.txt.$i");
        exit 0;
    }
    push @pids, $pid;
}
waitpid $_, 0 foreach @pids;
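And since fork on Windows is emulated with threads anyway, a minimal sketch using threads directly might look like this (untested; assumes the same $j and word.txt.$i naming as the question):
use threads;

my @workers;
for my $i (1 .. $j) {
    # one thread per system() call; each thread blocks until its command exits
    push @workers, threads->create(sub { system("perl example.pl word.txt.$i") });
}
$_->join for @workers;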

It's better to fork from the enclosing Perl script and then call system in the child process without the trailing &. A wait will be needed in the parent as well.
Because the argument of system is parsed by the system shell, you will encounter different behaviour from the Windows shell than from Bash, for example.
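One way to sidestep the shell entirely is to pass system a list instead of a single string; Perl then runs the program directly, so quoting and parsing differences between cmd.exe and Bash never come into play. A sketch, assuming the same loop variable $i as in the question; this pairs naturally with the fork-then-system approach above, since there is no & to worry about:
# list form: no shell is involved, so no quoting/parsing differences
system('perl', 'example.pl', "word.txt.$i") == 0
    or warn "example.pl failed for file $i: $?";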

It is a lot easier to use the START command (a Windows batch command) than to fork processes. The downside is that it will open multiple DOS windows.
system "start perl example.pl word.txt.$i";

Related

How to use Perl to check when a Unix command has finished processing

I am working on a capstone project and am hoping for some insight.
This is the first time I've worked with Perl, and it's pretty much a basic Perl script to automate a few different Unix commands that need to be executed in a specific order. There are two lines in the script that each execute a Unix command which needs to finish processing before it is acceptable for the rest of the script to run (the data will be incorrect otherwise).
How am I able to use Perl (or maybe this is a Unix question?) to print a simple string once the Unix command has finished processing? I am looking into ways to read in the Unix command name, but am not sure how to check whether the process is still running and to print a string such as "X command has finished processing" upon its completion.
Example:
system("nohup scripts_pl/RunAll.pl &");
This runs a command in the background that takes time to process. I am asking how I can use Perl (or Unix?) to print a string once the process has finished.
I'm sorry if I've misunderstood the context of your question.
But couldn't you use Perl's fork function instead of & if you would like to do parallel processing?
use feature 'say';

# parent process
if (my $pid = fork) {
    # this block behaves as a normal (foreground) process
    system("nohup scripts_pl/RunAll2.pl");  # you can run another command here (like RunAll2.pl)
    wait;                                   # wait for the background child to finish
    say 'finished both';
}
# child process
else {
    # this block behaves as a background process
    system("nohup scripts_pl/RunAll.pl");   # note the trailing & is removed
}
You could try to use IPC::Open3 instead of system:
use IPC::Open3;
my $pid = open3("<&STDIN", ">&STDOUT", ">&STDERR", 'nohup scripts_pl/RunAll.pl');
waitpid( $pid, 0 );
Or, if you need to run nohup through the shell:
my $pid = open3("<&STDIN", ">&STDOUT", ">&STDERR", 'bash','-c', 'nohup scripts_pl/RunAll.pl & wait');
Update: Thanks to @ikegami. A better approach if you would like STDIN to stay open after running the command:
open(local *CHILD_STDIN, '<', '/dev/null') or die $!;
my $pid = open3("<&CHILD_STDIN", ">&STDOUT", ">&STDERR", 'nohup scripts_pl/RunAll.pl');

Perl: how to exit from a child script which does a telnet?

I have a script which executes a few commands and then telnets to a machine. Now I need to call this script from another Perl script.
$result = `some_script.pl`;
The script some_script.pl executes successfully but I am not able to exit from the main script as the script waits at the telnet prompt.
I also need to capture the exit status of the script in order to make sure that some_script.pl executed successfully.
I cannot modify some_script.pl.
Is there some way by which I can issue quit after the some_script.pl is executed successfully?
Try this out; this 'magic' closes standard in/out/err and may let your program finish.
$result = `some_script.pl >&- 2>&- <&-`;
Otherwise you could use open2 and Expect to watch for a specific string (like Done!) in your program's output and close it when done.
http://search.cpan.org/~rgiersig/Expect-1.15/Expect.pod
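For instance, a hedged sketch with the Expect module (the Done! marker and the 60-second timeout are assumptions; adapt them to whatever some_script.pl actually prints):
use Expect;

my $exp = Expect->spawn('some_script.pl') or die "Cannot spawn: $!";
# wait up to 60 seconds for the script to announce it is finished
$exp->expect(60, 'Done!');
$exp->send("quit\n");   # answer the telnet prompt so the child can exit
$exp->soft_close();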
Regards
I don't like the way you are actually executing your perl script with a "backtick" call to the system.
I suggest you actually fork (or something equivalent) and run the program in a more controlled manner.
use POSIX ":sys_wait_h";

my $pid = fork();
if ($pid) {                 # in the parent, $pid is the child's PID
    waitpid($pid, 0);       # wait for the child to finish
} else {                    # this is the child, where we want to run the telnet
    exec 'some_script.pl';  # this child now "becomes" some_script.pl
}
Since I don't know how some_script.pl actually works, I cannot really help you more here. But for example, if all you need to do is print "quit" on the command line of some_script.pl, you could use IPC::Open2 as suggested in another question. Doing something like:
use IPC::Open2;
$pid = open2(\*CHLD_OUT, \*CHLD_IN, 'some_script.pl');
print CHLD_IN "quit\n";
waitpid( $pid, 0 );
my $child_exit_status = $? >> 8;
You do need to tweak this a little, but the idea should solve your problem.

perl fork() & exec()

I'm trying to grasp the concept of fork() & exec() for my own learning purposes. I'm trying to use Perl's fork to create a second identical process, and then use that to exec a .sh script.
If I use fork() & exec(), can I get the .sh script to run in parallel to my Perl script? The Perl script shouldn't wait on the child process and should continue its execution. My Perl script doesn't care about the output of the child process, only that the command is valid and running. Sort of like calling the script to run in the background.
Is there some sort of safety I can implement to know that the child process exited correctly as well?
If I use fork() & exec() can I get the .sh script to run in parallel to my perl script? [...] Sort of like calling the script to run in the background.
Yes. Fork & exec is actually the way shells run commands in the background.
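A minimal sketch of that pattern (the script name myscript.sh is a placeholder):
my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {
    # child: replace this process image with the shell script
    exec '/bin/sh', 'myscript.sh' or die "exec failed: $!";
}
# parent: carries on immediately, without waiting for the child
print "launched myscript.sh as pid $pid\n";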
Is there some sort of safety I can implement to know that the child process exited correctly as well?
Yes, using waitpid() and looking at the return value stored in $?
Like @rohanpm mentioned, the perlipc man page has a lot of useful examples showing how to do this. Here is one of the most relevant, where a signal handler is set up for SIGCHLD (which will be sent to the parent when the child terminates):
use POSIX ":sys_wait_h";

$SIG{CHLD} = sub {
    while ((my $child = waitpid(-1, WNOHANG)) > 0) {
        $Kid_Status{$child} = $?;
    }
};
To get waitpid to not wait for the child:
use POSIX qw/ WNOHANG /;
my $st = waitpid $pid, WNOHANG;
$st is 0 if the process is still running and the pid if it's reaped.
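So one non-blocking pattern is to poll in a loop, roughly like this (the one-second sleep interval is arbitrary):
use POSIX qw/ WNOHANG /;

while (waitpid($pid, WNOHANG) == 0) {
    # child still running; do other work or just idle a bit
    sleep 1;
}
my $exit_status = $? >> 8;   # child has been reaped at this point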

Simultaneous Perl system calls on Windows

Trying to find a way to have a Perl script run 4 other Perl scripts on Windows and then, once all are done, kick off a 5th script. I have checked out a bunch of things but none seem straightforward. Suggestions welcome. The scripts are going to run on a Windows box; scripts 1-4 need to finish before starting script 5.
1. Accept answers to your other questions.
2. Use threads:

2.1. Kick off the 4 scripts:
my @scripts = qw(... commands ...);
my @jobs = ();
foreach my $script (@scripts) {
    my $job = threads->create(sub {
        system($script);
    });
    push @jobs, $job;
}
2.2. Wait for completion:
$_->join() foreach @jobs;
2.3. Kick off the last script.
Edit
As you indicated that my solution didn't work for you, I fired up my Windoze box, taught myself to use this horrible cmd.exe, and wrote the following test script. It is a bit simplified compared to the above solution, but it does meet your requirements about sequentiality etc.
#!/usr/bin/perl
use strict; use warnings; use threads;

my @scripts = (
    q(echo "script 1 reporting"),
    q(perl -e "sleep 2; print qq{hi there! This is script 2 reporting\n}"),
    q(echo "script 3 reporting"),
);
my @jobs = map {
    threads->create(sub {
        system($_);
    });
} @scripts;
$_->join foreach @jobs;
print "finished all my jobs\n";
system q(echo "This is the last job");
I used this command to execute the script (on Win7 with Strawberry Perl v5.12.2):
C:\...>perl stackoverflow.pl
And this is the output:
"script 1 reporting"
"script 3 reporting"
hi there! This is script 2 reporting
finished all my jobs
"This is the last job"
So how on earth does this not work? I would very much like to learn how to circumvent Perl's pitfalls the next time I write a script on a non-GNU system, so please enlighten me about what can go wrong.
From personal experience, using fork() in ActiveState Perl doesn't always run the processes in parallel. The threaded simulation of fork() used there seems to start all the processes, get each to a certain point, then run them one at a time. This applies even on multicore CPUs. I think Strawberry Perl is compiled the same way. Also, keep in mind that fork() is still being used for backticks and system(); it's just abstracted away.
If you use Cygwin Perl on Windows, it will run through Cygwin's own fork() call and things will parallelize properly. However, Cygwin is slower in other ways.
use Proc::Background;

my @commands = (
    ['./Files1.exe ALL'],
    ['./Files2.exe ALL'],
    ['./Files3.exe ALL'],
    ['./Files4.exe ALL'],
);
my @procs = map { Proc::Background->new(@$_) } @commands;
$_->wait for @procs;
system 'echo', 'CSCProc', '--pidsAndExitStatus', map { $_->pid, $_->wait } @procs;
`mergefiles.exe`;

How to run two child commands in parallel from a parent one?

I need to run two perl scripts from one in parallel. How can I accomplish this?
Currently, I have a file with
system("perl command1.pl command2.pl");
The commands are executed in sequence; command2.pl won't run until command1.pl is done.
I would like to run the two commands simultaneously.
PLEASE HELP!
`perl command1.pl &`;
`perl command2.pl &`;
..or use the perl fork() function
perldoc -f fork
..or use perl threading
perldoc threads
Or just use a shell script:
#!/bin/sh
./command1.pl &
./command2.pl &
Depends on the command interpreter. On Windows you use the start command to launch a process without waiting. In most *nix command interpreters, as I recall, the relevant notation is to add an ampersand & at the end of the command.
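A sketch combining both, switching on Perl's $^O (the script names are placeholders):
if ($^O eq 'MSWin32') {
    # start returns immediately; the empty "" is the window title
    system(qq(start "" perl command1.pl));
    system(qq(start "" perl command2.pl));
} else {
    # trailing & asks the *nix shell to background each job
    system('perl command1.pl &');
    system('perl command2.pl &');
}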
You could use a piped open to the processes, à la
use 5.013;
use warnings;
use autodie;
open my $cmd1_fh, '-|', 'dir';
open my $cmd2_fh, '-|', 'cls';
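Both commands start running as soon as the opens return; to gather their output you read from each handle and then close it (closing a piped handle waits for that child), for example:
my @out1 = <$cmd1_fh>;   # reading drains the first command's output
my @out2 = <$cmd2_fh>;
close $cmd1_fh;          # with autodie, close dies if the command failed
close $cmd2_fh;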
Or, if you don't care about the output, fork and then exec:
my @child_pids;
for my $cmd ('dir', 'cls') {
    defined(my $child_pid = fork()) or die "Couldn't fork: $!";
    if ($child_pid == 0) {
        exec $cmd;
    } else {
        push @child_pids, $child_pid;
    }
}
for my $pid (@child_pids) {
    waitpid($pid, 0);
}
(If you do care about the output, fork and then backtick?)
Or use threads (I'm not proud of this example, and I haven't even written it yet. Look up an example using Thread::Queue for something much less awful)
use threads;

my @threads;
for my $cmd ('dir', 'cls') {
    push @threads, threads->create(sub { system @_ }, $cmd);
}
$_->join for @threads;
There are also several modules that help you out with this one, such as Parallel::ForkManager and Win32::Job.
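A quick sketch of the Parallel::ForkManager approach (the commands are the same placeholders as above; the limit of 2 is arbitrary):
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(2);   # allow up to 2 concurrent children
for my $cmd ('dir', 'cls') {
    $pm->start and next;    # parent: move on to the next command
    system($cmd);           # child: run the command
    $pm->finish;            # child exits here
}
$pm->wait_all_children;     # block until every child is done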
Depending on your skill level and what you want to do, you might be interested in POE::Wheel::Run.