Need to pass variables to a batch script from a Perl script

I'm running a Perl script which in turn calls a batch script. I need to pass 3 parameters to the batch script. I'm passing the parameters because it is easier to read a file and capture the desired value in the Perl script. But my script is erroring out with 'The system cannot find the path specified.' I'm using the code below:
while (<FILE>)
{
($file, $rcc, $loc) = split(',');
my #lines = qx/"D:\\SiebelAdmin\\Commands\\WinFile_Move.bat $file $rcc $loc" /;
}

Remove the double quotes. With them, the system interprets the whole line as a command, not as a command with parameters.
my @lines = qx/D:\\SiebelAdmin\\Commands\\WinFile_Move.bat $file $rcc $loc/;
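If you don't need to capture the batch file's output, another option is the list form of system(), which passes each argument directly to the program and so avoids shell quoting issues altogether. A minimal sketch, assuming the same variables as above:
# list form: no shell re-quoting of the command line is involved
my $bat = 'D:\SiebelAdmin\Commands\WinFile_Move.bat';
system($bat, $file, $rcc, $loc) == 0
    or warn "WinFile_Move.bat failed with status $?";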

Please check if this works for you.
I have created a sample batch script which takes two arguments and prints them on the prompt.
This script is located on the desktop.
Code of batch script
@echo off
set arg1=%1
set arg2=%2
shift
shift
echo %arg1%
echo %arg2%
Output of batch script
C:\Users\Administrator\Desktop>a.bat perl5.8 perl5.18
perl5.8
perl5.18
C:\Users\Administrator\Desktop>
Now I have created the Perl script which calls this batch script. This Perl script is present in the C drive.
Code for perl script
my $bat_file = 'C:\Users\Administrator\Desktop\a.bat';
my $arg1 = 'perl5.8';
my $arg2 = 'perl5.18';
my @lines = `$bat_file $arg1 $arg2`;
print @lines;
Output of perl script
C:\>perl tmp.pl
perl5.8
perl5.18

You can do it like this:
Perl file:
my $arg = "hey";
my $bat_file_loc = "C:\\abc.bat";
system($bat_file_loc,$arg);
Batch file:
set arg1=%1
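Note that system() only returns the exit status and does not capture the batch file's output; if you need the output too, backticks/qx// (which run the command through the shell) are one option. A minimal sketch, reusing the same variables:
my $output = qx/$bat_file_loc $arg/;  # capture the batch file's STDOUT
print $output;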

Related

Can I pass a string from perl back to the calling c-shell?

RHEL6
I have a c-shell script that runs a Perl script. After dumping tons of stuff to stdout, it determines where (what dir) the parent shell should cd to when the Perl script finishes. But that's a string, not an int, which is all I can pass back with "exit()".
Storing the name of the dir in a file which the c-shell script can read is what I have now. It works, but is not elegant. Is there a better way to do this ? Maybe a little chunk of memory that I can share with the perl script ?
Short:
Redirect Perl's streams and restore them at the end to print that info, which the shell script takes
Or, print that last and the shell script can pass output to the console and take the last line
Or, use a named pipe (either shell) or specific file descriptors (not csh) for that print
When the Perl script prints out that name you can assign it to a variable
in the shell script
#!/bin/csh
set DIR = `perl -e'print "dir_name"'`
while in bash
#!/bin/bash
DIR="$(perl -e'print "dir_name"')"
where $(...) is preferred for the command substitution.
But those other prints to the console from the Perl script then need to be handled
One way is to redirect all output in the Perl script other than that one print, which can be controlled by a command-line option (a filename to which to redirect, which the shell script can print out)
Or, take all Perl's output and pass it to console, the last line being the needed "return." This puts the burden on the Perl script to print that last (perhaps in an END block). The program's output can be printed from the shell script after it completes or line by line as it is emitted.
Or, use a named pipe (both shells) or a specific file descriptor (bash only) to which the Perl script can print that information. In this case its streams go straight to the console.
The question explicitly mentions csh so it is given below. But I must repeat the old and worn fact that shell scripting is far better done in bash than in csh. I strongly recommend reconsidering.
bash
If you need the program's output on the console as it goes, take and print it line by line
#!/bin/bash
while read line; do
echo "$line"
DIR=$line
done < <(perl script.pl)
echo "$DIR"
Or, if you don't need output on the console before the script is finished
#!/bin/bash
mapfile -t lines < <(perl script.pl)
DIR="${lines[-1]}"
printf '%s\n' "${lines[@]}" # print script.pl's output
Or, use file descriptors for that particular print
F=$(mktemp) # safe filename
exec 3> "$F" # open fd 3 to write to it
exec 4< "$F" # open fd 4 to read from it
rm -f "$F" # remove file(name) for safety; opened fd's can still access
perl -E'$fd=shift; say "...normal prints to STDOUT...";
open(FH, ">&=$fd") or die $!;
say FH "dirname";
close FH
' 3
read dir_name <&4
exec 3>&- # close them
exec 4<&-
echo "$dir_name"
I couldn't get it to work with a single file descriptor for both reading and writing (exec 3<> ...), I think because the read can't rewind after the write, thus separate descriptors are used.
With a Perl script (and not the demo one-liner above) pass the fd number as a command-line option. The script can then do this only if it's invoked with that option.
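For instance, a minimal sketch of that option handling in the Perl script, assuming Getopt::Long and a hypothetical --fd option:
use Getopt::Long;
my $fd;
GetOptions('fd=i' => \$fd) or die "bad options";
print "...normal prints to STDOUT...\n";
if (defined $fd) {
    open my $out, '>&=', $fd or die "can't open fd $fd: $!";  # attach to the inherited descriptor
    print $out "dir_name\n";
    close $out;
}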
Or, use a named pipe very similarly to how it's done for csh below. This is probably best here, if the manipulation of the program's STDOUT isn't to your liking.
csh
Iterate over the program's (completed) output line by line
#!/bin/csh
foreach line ( "`perl script.pl`" )
echo "$line"
set dir_name = "$line"
end
echo "Directory name: $dir_name"
or extract the last line first and then print the whole output
#!/bin/csh
set lines = ( "`perl script.pl`" )
set dir_name = $lines[$#lines]
# Print program's output
while ( $#lines )
echo "$lines[1]"
shift lines
end
or use a named pipe
set fifo_name = "/tmp/fifo$$" # or use mktemp
mkfifo "$fifo_name"
( perl script.pl --fifo $fifo_name [other args] & )
set dir_name = `cat "$fifo_name"`
rm -f $fifo_name
echo "dir name from FIFO: $dir_name"
The Perl command is in the background since a FIFO blocks until it is written and read. So if the shell script were to wait for perl ... to complete, the Perl script would block writing to the FIFO (since that's not being read) and the shell would never get to read it; we would deadlock. It is also in a subshell, with ( ), to avoid the informational prints about the background job.
The --fifo NAME command-line option is needed so that the Perl script knows what special file to use (and not to do this if the option is not there).
For an in-line example replace ( perl script ...) with this one-liner, used above as well
( perl -E'$ff = shift; say qq(\t...normal prints to STDOUT...);
open FF, ">$ff" or die $!;
say FF "dir_name_$$";
close FF
' $fifo_name
& )
(broken over lines for readability)
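In a full script rather than a one-liner, the --fifo handling could look like this minimal sketch (assuming Getopt::Long; the option name matches the hypothetical --fifo NAME above):
use Getopt::Long;
my $fifo;
GetOptions('fifo=s' => \$fifo) or die "bad options";
print "\t...normal prints to STDOUT...\n";
if (defined $fifo) {
    open my $ff, '>', $fifo or die "can't open $fifo: $!";  # blocks until a reader opens the FIFO
    print $ff "dir_name_$$\n";
    close $ff;
}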

Trouble with line iteration using while or for

I am trying to process each line in a file through a Perl script, instead of sending the entire file to the Perl script and pulling so much data into memory at once.
In a shell script, I began what I thought to be line iteration as follows:
while read line
do
perl script.pl --script=options "$line"
done < input
When I do this, how do I save the data to an output file >> output?
while read line
do
perl script.pl --script=options "$line"
done < input
>> output
If it would take less memory to split the file first, I also had trouble with the for statement
for file in /dev/*
do
split -l 1000 $file prefix
done < input
## Where do I save the output?
for file in /dev/out/*
do
perl script.pl --script=options
etc...
Which is the most memory-efficient way to do this?
You can also process your very big file line by line within the Perl script, without loading the entire file into memory. For that you just need to enclose the text of your current Perl script (which I hope doesn't read the whole file into memory any more :) ) in a while loop. For example:
my $line;
while ($line = <>) {
# your script text here, referring to the $line variable instead of the parameter variable
}
And in this Perl script you can also write results to an output file. Say the result is stored in the variable $res; you can do it this way:
open (my $fh, ">>", "out") or die "ERROR: $!"; # opening a file descriptor
my $line;
while ($line = <>) {
# your script text here, referring to the $line variable instead of the parameter variable
print $fh $res, "\n"; # writing to file descriptor
}
close $fh; # closing file descriptor
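A note on invocation: the diamond operator <> reads from the files named on the command line, or from STDIN if none are given, so the script can be run as perl script.pl input or at the end of a pipeline, and in either case only one line is held in memory at a time.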
try this:
while read line
do
perl script.pl --script=options "$line" >> "out"
done < input
"out" is a name of your output file.
I fixed my issue with:
split -l 100000 input /dev/shm/split/split.input.txt.
find /dev/shm/split/ -type f -name '*.txt.*' -exec perl script.pl --script=options {} + > output
This made my script process the files faster.
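(For what it's worth, the -exec ... {} + form hands many filenames to each perl invocation, xargs-style, so the interpreter startup cost is paid once per batch of files rather than once per line, which is likely where the speedup comes from.)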

Perl script that invokes shell command doesn't work

I am writing a simple Perl program to test a shell script for changing directory. But it doesn't work.
This is my code :
$result = `cd /`;
print $result;
It works fine when I use
$result =`dir`;
If you need to change the current working directory in your script, then you should use Perl's built-in chdir function.
perldoc -f chdir
cd (by default) doesn't output anything, so you're assigning an empty string to your $result variable.
If you want to output the (full) path of the directory you changed to, simply append && pwd inside the backticks:
$result = `cd / && pwd`;
Note that `...` creates a child process for running the shell with the specified command, so whatever environment changes you perform there - including changing the directory - do NOT affect the Perl script itself.
In other words: you're NOT changing the Perl script's current directory with your shell command.
If your intent is:
to simply test whether the shell command you enclose in `...` succeeds or not, use the system() function instead; e.g.:
system('cd /') == 0 || die "Command failed";
to capture the output from the shell command, presume it to be a directory path and change the Perl script's working directory to it:
$result = `cd / && pwd` || die "Command failed.";
chomp $result; # trim trailing newline
# Change Perl script's working dir.
chdir $result || die "Could not change to $result.";
To affect the current working directory of the perl process, use the chdir() function.
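A minimal sketch of that, using the core Cwd module to confirm the change:
use Cwd qw(getcwd);
chdir '/' or die "chdir failed: $!";
print getcwd(), "\n";  # the Perl process itself is now in /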

How to call a Perl script from a Tcl script

I have a file with 4 Perl commands.
I want to open the file from Tcl and execute each Perl command.
TCL script
runcmds $file
proc runcmds {file} {
set fileid [open $file "r"]
set options [read $fileid]
close $fileid
set options [split $options "\n"] ;# separating each command with a newline
foreach line $options {
exec perl $line
}
}
when executing the above script
I am getting the error "can't open the perl script /../../ : No such file or directory. Use -S to search $PATH for it."
tl;dr: You were missing -e, causing your script to be interpreted as a filename.
To run a perl command from inside Tcl:
proc perl {script args} {
exec perl -e $script {*}$args
# or in 8.4: eval [list perl -e $script] $args
}
Then you can do:
puts [perl {
print "Hello "
print "World\n"
}]
That's right, an arbitrary Perl script inside Tcl. You can even pass in other arguments as necessary; access them from Perl via @ARGV. (You'll need to add other options like -p explicitly.)
Note that this can pass whole scripts; you don't need to split them up (and probably shouldn't; you can do lots with one-liners but they tend to be awful to maintain and there's no technical reason to require it).
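As a small illustration of the argument passing, this Perl fragment just echoes whatever extra arguments the Tcl proc forwards (e.g. puts [perl {print "got $_\n" for @ARGV;} a b] prints got a and got b):
# arguments after -e's script land in @ARGV as usual
print "got $_\n" for @ARGV;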

How can my Perl script invoke an exe and post-process it

I need my script to do a simple operation:
Use the Unix script command to log the on-screen activity to a file
Execute a shell script (this script writes multiple lines of output to STDOUT)
Stop the script command
Analyse the output of the script command
I am planning to use the system command to do this work, but I am not sure if I should fork the shell script and wait for its completion. Since the shell script writes multiple lines of output, I am not sure if capturing it will work. Let me know the best option.
This is one of the most interesting questions I've come across in a while.
Let's say you have a shell script myscript.sh. To get script to run it and capture the output (at least on my SUSE Linux; check your script(1) man page), I'd write:
script -c /path/to/myscript.sh myscript.log
So the Perl would look vaguely like:
# first, a bunch of code to initialize the program
# then run the shell script.
my $rv = system("script -c /path/to/myscript.sh myscript.log");
# then a bunch of code to process myscript.log
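The post-processing part is then ordinary file reading; a minimal sketch, assuming the log name used above:
open my $log, '<', 'myscript.log' or die "can't open myscript.log: $!";
while (my $line = <$log>) {
    # ... analyse each line of the recorded session here ...
}
close $log;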
But I'm wondering why you can't just:
system("/path/to/myscript.sh > myscript.log");
instead of involving script(1)?
Why do you need to use script at all? Is the shell script interactive? Does it need a valid TTY? If it is a non-interactive batch job that doesn't need a valid TTY, then you'd be best off opening it as a pipe and processing the output via a file handle.
For example:
open my $cmd_handle, "-|", $command, @args
or die "Could not run $command @args: $!";
foreach my $line ( <$cmd_handle> )
{
# ... process the command output here ...
}
close $cmd_handle;
This has the advantage that your Perl script will process the command's output as it happens. In case you really do need to defer processing until the end, you could slurp all the output into an array and then process it afterwards:
open my $cmd_handle, "-|", $command, @args
or die "Could not run $command @args: $!";
my @cmd_output = ( <$cmd_handle> );
close $cmd_handle;
foreach my $line ( @cmd_output )
{
# ... process the command output here ...
}
Either ought to be better than running the command via script if it meets those restrictions I gave above: non-interactive, and does not need a valid TTY. Most batch scripts meet those restrictions.