Using perl temporarily in a batch file

I would like to use Perl in a batch file, then exit Perl and continue with the batch code. A small example of what I would like to achieve:
perl do something
echo hello
pause

If Perl is installed, there should be no problem:
perl -e "print $_ for 1 .. 10"
perl script.pl
You might need to specify the full path to perl and the script.pl.

@echo off
perl script.pl
echo hello
pause
or
@echo off
perl -e" ...code..."
echo hello
pause

Maybe I'm just reading it wrong, but I interpreted the question as asking how to embed Perl scripts directly into batch files.
ActivePerl adds a bunch of these as .cmd files, and there is a tool on CPAN for creating them from your Perl script.
I also found two examples from a quick search that seem to extend the idea a little. They allow you to put the code into the same file and run DOS commands before and after the Perl call, which is what I thought you were asking.
Here's one of the examples:
@rem = 'Perl, ccperl will read this as an array assignment & skip this block
@CD /d "%~dp0"
@perl -s "%~nx0" %*
@FOR /L %%c in (4,-1,1) do @(TITLE %~nx0 - %%cs to close & ping -n 2 -w 1000 127.0.0.1 >NUL)
@TITLE Press any key to close the window&ECHO.&GOTO:EOF
@rem ';
# Perl script starts below here
print "Hi there! DOS rocks!\n";

You can also do it the other way around, i.e. call batch commands from Perl:
my $cmd = 'some command';
if (system $cmd) {
    print "Error: $? for command $cmd\n";
}
Save this as a Perl script; it should do the job as well.
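If you also want the command's output rather than just its exit status, here is a minimal sketch; dir is only a placeholder command, substitute your own:
my $cmd    = 'dir';        # placeholder command
my $output = qx($cmd);     # qx// (backticks) runs the command via the shell
if ($? != 0) {
    warn "Error: $? for command $cmd\n";
}
print $output;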

Related

Can I pass a string from perl back to the calling c-shell?

RHEL6
I have a c-shell script that runs a perl script. After dumping tons of stuff to stdout, it determines where (what dir) the parent shell should cd to when the perl script finishes. But that's a string, not an int, which is all I can pass back with "exit()".
Storing the name of the dir in a file which the c-shell script can read is what I have now. It works, but is not elegant. Is there a better way to do this? Maybe a little chunk of memory that I can share with the perl script?
Short:
Redirect Perl's output streams and restore them at the end to print that info, to be picked up by the shell script
Or, print that info last, and have the shell script pass the output to the console and take the last line
Or, use a named pipe (either shell) or specific file descriptors (not csh) for that one print
When the Perl script prints out that name you can assign it to a variable in the shell script
#!/bin/csh
set DIR = `perl -e'print "dir_name"'`
while in bash
#!/bin/bash
DIR="$(perl -e'print "dir_name"')"
where $(...) is preferred for the command substitution.
But those other prints to the console from the Perl script then need to be handled.
One way is to redirect all output in the Perl script other than that one print; this can be controlled by a command-line option (a filename to redirect to, which the shell script can then print out).
Or, take all of Perl's output and pass it through to the console, the last line being the needed "return." This puts the burden on the Perl script to print that last (perhaps in an END block, as sketched below). The program's output can be printed from the shell script after it completes, or line by line as it is emitted.
Or, use a named pipe (both shells) or a specific file descriptor (bash only) to which the Perl script can print that information. In this case its streams go straight to the console.
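Regarding that "print it last" option, here is a minimal sketch of how script.pl might guarantee it with an END block; the directory value is a placeholder:
#!/usr/bin/perl
use strict;
use warnings;

print "lots of normal output...\n";

my $dir = "/path/to/target/dir";   # placeholder for the computed directory

# An END block runs at the very end, so this is the last line printed
END { print "$dir\n" if defined $dir }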
The question explicitly mentions csh so it is covered below. But I must repeat the old and worn fact that shell scripting is far better done in bash than in csh. I strongly recommend reconsidering.
bash
If you need the program's output on the console as it is produced, read and print it line by line
#!/bin/bash
while read line; do
    echo "$line"
    DIR=$line
done < <(perl script.pl)
echo "$DIR"
Or, if you don't need output on the console before the script is finished
#!/bin/bash
mapfile -t lines < <(perl script.pl)
DIR="${lines[-1]}"
printf '%s\n' "${lines[@]}" # print script.pl's output
Or, use file descriptors for that particular print
F=$(mktemp) # safe filename
exec 3> "$F" # open fd 3 to write to it
exec 4< "$F" # open fd 4 to read from it
rm -f "$F" # remove file(name) for safety; opened fd's can still access
perl -E'$fd=shift; say "...normal prints to STDOUT...";
open(FH, ">&=$fd") or die $!;
say FH "dirname";
close FH
' 3
read dir_name <&4
exec 3>&- # close them
exec 4<&-
echo "$dir_name"
I couldn't get it to work with a single file descriptor for both reading and writing (exec 3<> ...), I think because the read can't rewind after the write, thus separate descriptors are used.
With a Perl script (and not the demo one-liner above) pass the fd number as a command-line option. The script can then do this only if it's invoked with that option.
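A minimal sketch of what that could look like in script.pl, with a hypothetical --fd option carrying the descriptor number:
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long;

GetOptions('fd=i' => \my $fd);      # hypothetical option name

print "normal prints go to STDOUT as usual\n";

my $dir = "/path/to/target/dir";    # placeholder for the computed directory

if (defined $fd) {
    # Attach a filehandle to the already-open descriptor set up by the shell
    open my $out, '>&=', $fd or die "Can't open fd $fd: $!";
    print $out "$dir\n";
    close $out;
}
The shell side stays the same, except it now runs perl script.pl --fd 3 instead of the one-liner.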
Or, use a named pipe very similarly to how it's done for csh below. This is probably best here, if the manipulation of the program's STDOUT isn't to your liking.
csh
Iterate over the program's (completed) output line by line
#!/bin/csh
foreach line ( "`perl script.pl`" )
    echo "$line"
    set dir_name = "$line"
end
echo "Directory name: $dir_name"
or extract the last line first and then print the whole output
#!/bin/csh
set lines = ( "`perl script.pl`" )
set dir_name = $lines[$#lines]
# Print program's output
while ( $#lines )
    echo "$lines[1]"
    shift lines
end
or use a named pipe
set fifo_name = "/tmp/fifo$$" # or use mktemp
mkfifo "$fifo_name"
( perl script.pl --fifo $fifo_name [other args] & )
set dir_name = `cat "$fifo_name"`
rm -f $fifo_name
echo "dir name from FIFO: $dir_name"
The Perl command is run in the background since a FIFO blocks until it is both written and read. If the shell script waited for perl ... to complete, the Perl script would block writing to the FIFO (since nothing is reading it), so the shell would never get to read it; we would deadlock. It is also run in a subshell, with ( ), to avoid the informational prints about the background job.
The --fifo NAME command-line option is needed so that the Perl script knows which special file to use (and so that it skips this when the option isn't there).
For an in-line example replace ( perl script ...) with this one-liner, used above as well
( perl -E'$ff = shift; say qq(\t...normal prints to STDOUT...);
open FF, ">$ff" or die $!;
say FF "dir_name_$$";
close FF
' $fifo_name
& )
(broken over lines for readability)
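And here is a minimal sketch of how script.pl itself might handle the --fifo option; the directory value is a placeholder:
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long;

GetOptions('fifo=s' => \my $fifo);   # name of the named pipe, if given

print "normal prints go to STDOUT as usual\n";

my $dir = "dir_name_$$";             # placeholder for the computed directory

if (defined $fifo) {
    # Writing blocks until the shell script reads the other end of the pipe
    open my $ff, '>', $fifo or die "Can't open fifo '$fifo': $!";
    print $ff "$dir\n";
    close $ff;
}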

How to check if perl is a Windows command?

I would like to write a batch file to silently install a Perl MSI. However, the server/PC may already have Perl installed, so the batch file's flow would be:
Check if Perl is installed.
If not installed, install it silently.
I know that the command perl -v reports the Perl version if Perl is installed, but I have no idea how to check whether the perl command is executable on the server/PC from a Windows batch file.
Perhaps How do I get the application exit code from a Windows command line? and Redirect Windows cmd stdout and stderr to a single file might help you.
Run
perl -e1 2>NUL
if errorlevel 1 (
echo Perl is not installed
)
perl -e1 simply executes the Perl expression 1 as a one-liner, which always succeeds if Perl is installed. It produces no output at all, except that it complains when Perl isn't found. That's why I redirected STDERR to NUL, so you will not see any output, not even the error messages.
The if errorlevel 1 checks whether the return code of the last command (perl -e1 in this case) was >=1. If Perl is installed and was executable, its return code will be 0 (meaning success) and the if won't trigger.
You could also use perl -v but that produces output on STDOUT. In that case you would have to redirect both STDOUT and STDERR to NUL, like so: perl -v >NUL 2>&1.
>nul 2>nul where perl || echo not installed
This checks for perl without actually trying to run it.
where prints to STDOUT if it finds the file/command in the folders listed in %PATH%, or prints to STDERR if it doesn't.

How to execute perl file from shell script

I have a question about how to execute a Perl file from within a shell script.
I have 2 files, "test.sh" and "test.pl"; here are examples of my scripts:
SHELL script
#!/bin/bash
perl FILEPATH/test.pl
......
PERL script
#!/usr/bin/perl
my $a = "hello";
sub saysomething
{
    print $a;
}
.....
The way I call the shell script is: from the directory containing the shell scripts, execute "./test.sh".
All of the above works under this environment:
GNU bash, version 4.2.24(1)-release (i686-pc-linux-gnu) + perl (v5.14.2)
But if I put those scripts on a server (where I can't change the bash/perl version) with
GNU bash, version 4.2.10(1)-release (x86_64-pc-linux-gnu) + perl (v5.12.4), I get the following message:
FILEPATH/test.pl: line 2: my: command not found
Does anybody know how I can solve this problem?
BTW, if I execute the perl script individually (perl FILEPATH/FILENAME.pl), it works perfectly.
In order to execute a Perl script from a .sh script you don't need the perl prefix; just:
#!/bin/sh
/somewhere/perlScript.pl
It will work without a problem.
This problem is at least two-fold. One, you have to have the location of Perl in your environment PATH. Two, the location of Perl may be different on different machines. One solution to both problems that I, and others, have used for years is to make use of a "magic header" of some sort at the top of Perl programs. The header identifies itself as a sh shell script and leverages the fact that /bin/sh exists in every version/flavor of Linux/UNIX. The header's job is to fortify the PATH with various possible Perl locations and then run the Perl script in place of itself (via exec). Here is a "Hello World" example:
1 #! /bin/sh --
2 eval '(exit $?0)' && eval 'PERL_BADLANG=x;PATH="/usr/bin:/bin:/usr/local/bin:$PATH";export PERL_BADLANG;: \
3 ;exec perl -x -S -- "$0" ${1+"$@"};#'if 0;
4 exec 'setenv PERL_BADLANG x;exec perl -x -S -- "$0" $argv:q;#'.q
5 #!/bin/perl -w
6 +($0=~/(.*)/s);do(index($1,"/")<0?"./$1":$1);die$@if$@;__END__+if 0;
7 # Above is magic header ... real Perl code begins here
8 use strict;
9 use warnings;
10 print "hello world!\n";
Note: I added line numbers just to make it clear where lines start and end.
First check where perl is installed on your system, e.g. with which perl, and use that location in the shebang line instead of /usr/bin/perl, if it is different.
If all other recommendations fail, check the first line of the script on the machine where it is not running properly by doing this: head -1 test.pl | xxd. Does the output show the last two bytes as 0d 0a? If so, you probably copied over the file via Windows and didn't do a dos2unix conversion.
"command not found" is an error emitted by the shell. You are trying to run your Perl script by the shell, not by Perl.

Executing perl code inside shell script using eval

I came across the following example. I tried to google but could not find much so I'm posting this question here.
What is the benefit of executing the perl script like this?
How can we make the shell script work like a "normal" shell script once we are through executing the perl code?
Here's the code:
#!/bin/ksh
#! -*- perl -*-
eval 'exec $PERLLOCATION/bin/perl -x $0 ${1+"$@"} ;'
if 0;
print "hello world\n";
# how can I make it behave like a "normal" shell script from this point onwards? What needs to be done?
# echo "hello world" ### this results in error
This idiom is described in the perlrun documentation.
The -x switch scans the whole file and ignores anything that appears before the first line that begins with #! and also contains the word perl.
It means that your system will run the script with the Perl interpreter whether you invoke the script with perl or with a shell command (sh/bash/ksh/etc.)
That is,
$ perl this_script
and
$ sh this_script
will both run the script with perl.
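To see -x on its own, here is a minimal sketch, assuming a file named demo.txt; everything before the first #! line that mentions perl is skipped:
This leading text is ignored because of -x.
#!/usr/bin/perl
print "Perl starts here\n";
__END__
Anything after __END__ is ignored as well.
Running perl -x demo.txt prints just "Perl starts here".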
To address your second question, this idiom has just about nothing to do with combining shell script and Perl script in the same file. There are a few different ways to approach that problem, but maybe the most readable way is to write in shell script, but use the shell's heredoc notation to invoke perl code.
#!/bin/bash
# this is a bash script, but there is some Perl in here too
echo this line is printed from the shell
echo now let\'s run some Perl
perl <<EOF
# this is now perl script until we get to the EOF
print "This line is printed from Perl\n";
EOF
echo now this is from the shell script again
1. If you start a Perl script in the usual way:
#!/usr/bin/perl
print "hello world\n";
the #! line will only work if the Perl interpreter is actually installed under /usr/bin. The perl/ksh bilingual script you show is a tricky kluge to make the script work even if perl is installed somewhere else. For more information, see e.g. this.
2. You can't. When the shell process encounters the exec command, it terminates and hands control over to perl. (Technically, it executes perl in place of the shell, without creating a new process.) The only way to run more shell commands after that would be to launch a new shell.
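If you still need some shell work after the Perl code, the usual workaround is to launch it from Perl itself; a minimal sketch:
print "hello world\n";

# From here on everything is Perl, so further shell commands have to be
# handed to a new shell, e.g. via system()
system('sh', '-c', 'echo "hello world from a shell command"') == 0
    or warn "shell command failed: $?\n";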
It's way simpler than what's already been posted.
#!$PERLLOCATION/bin/perl
doesn't work because the shebang (#!) line is interpreted by the kernel (not the shell), and the kernel doesn't do variable interpolation.
The code invokes ksh to expand the environment variable and to launch the specified installation of Perl.

Missing output when running system command in perl/cgi file

I need to write a CGI program and it will display the output of a system command:
script.sh
echo "++++++"
VAR=$(expect -c "
  spawn ssh -o StrictHostKeyChecking=no $USER@$HOST $CMD
  match_max 100000
  expect \"*?assword:*\"
  send -- \"$PASS\r\"
  send -- \"\r\"
  expect eof
")
echo $VAR
echo "++++++"
In CGI file:
my $command = "ksh ../cgi-bin/script.sh";
my @output  = `$command`;
print @output;
Finally, when I run the CGI file on Unix, $VAR is a very long string including \n and some delimiters. However, when I run it on the web server, the output is:
++++++
++++++
So $VAR is missing when the script is run through the web interface/browser.
Maybe the problem is that $VAR is a very long string.
Is there any way to solve this problem other than writing the output to a file and then retrieving it from the browser?
Thanks if you are interested in my question.
script.sh uses several environment variables: $USER, $HOST, $CMD and $PASS. The CGI environment will have different environment variables set than a login shell. You may need to set these variables from your CGI script before calling script.sh.
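For example, a minimal sketch of setting them from the CGI script before the call; all values are placeholders (and hard-coding a password is for illustration only):
# The CGI environment will not have these set for you
$ENV{USER} = 'someuser';
$ENV{HOST} = 'somehost.example.com';
$ENV{CMD}  = 'uptime';
$ENV{PASS} = 'secret';

my $command = "ksh ../cgi-bin/script.sh";
my @output  = `$command`;
print @output;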
Try finding where the commands you are calling, such as expect and ssh, are located on your system and add their directory paths to the PATH used by your script.
For example, if
which expect
returns /usr/bin/expect, then add the line:
PATH=$PATH:/usr/bin && export PATH
near the beginning of the ksh script. While debugging, you may also want to redirect stderr to a file by appending 2>/tmp/errors.txt to the end of your command, since stderr is not shown in the browser:
my $command= "ksh ../cgi-bin/script.sh 2>/tmp/errors.txt";