How can I compile a Perl script inside a running Perl session? - perl

I have a Perl script that takes user input and creates another script that will be run at a later date. I'm currently writing tests for these scripts, and one of the tests I would like to perform is checking whether the generated script compiles successfully (e.g., perl -c <script>). Is there a way that I can have Perl compile the generated script without having to spawn another Perl process? I've tried searching for answers, but searches just turn up information about compiling Perl scripts into executable programs.

Compiling a script has a lot of side effects. It results in subs being defined, modules being executed, and so on. If you simply want to test whether something compiles, you want a separate interpreter. It's the only way to be sure that testing one script doesn't cause later tests to give false positives or false negatives.

To execute dynamically generated code, use the eval function:
my $script = join '', <main::DATA>;
eval($script); # prints 3
__DATA__
my $a = 1;
my $b = 2;
print $a+$b, "\n";
However, if you just want to compile or check syntax, you will not be able to do it within the same Perl session.
The syntax_ok function from the Test::Strict module runs its syntax check by spawning perl -c in an external interpreter, so I assume there is no internal way.
The only work-around that may work for you would be:
my $script = join '', <main::DATA>;
eval('return;' . $script);
warn $@ if $@; # syntax error at (eval 1) line 3, near "1
               # my "
__DATA__
my $a = 1
my $b = 2;
print $a+$b, "\n";
In this case, you will be able to check for compilation errors using $@; however, because the first line of the code is return;, the rest of it will not actually execute.
Note: Thanks to user mob for the helpful chat and code correction.

Won't something like this work for you?
open(FILE, "perl -c generated_script.pl 2>&1 |") or die $!;
my @output = <FILE>;
if (join('', @output) =~ /syntax OK/) {
    print "No Problem\n";
}
close(FILE);
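Rather than scraping the output for "syntax OK", you can also rely on perl -c's exit status, which is zero exactly when the code compiles; a small sketch:

```shell
# perl -c exits 0 when the code compiles, non-zero otherwise;
# the "syntax OK" message itself goes to stderr, discarded here
perl -c -e 'print "hi"' 2>/dev/null && echo "compiles"
perl -c -e 'print "hi" print "ho"' 2>/dev/null || echo "does not compile"
```

Checking the status avoids depending on the exact wording of the diagnostic.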

See the Test::Compile module, particularly its pl_file_ok() function.

Related

Execute a perl script within a perl script with arguments

I have met a problem when I tried to execute a perl script within my perl script. This is a small part of a larger project that I'm working on.
Below is my perl script code:
use strict;
use warnings;
use FindBin qw($Bin);
# There are more options, but I just have one here as a short example
print "Please enter template file name: ";
my $template = <>;
chomp($template);
#Call another perl script which take in arguments
system($^X, "$Bin/GetResults.pl", "-templatefile $template");
GetResults.pl takes multiple arguments; I just provide one here for example. Basically, if I were to use the GetResults.pl script alone, on the command line I would type:
perl GetResults.pl -templatefile template.xml
I ran into two problems with the system call above. First of all, it seems to remove the dash in front of my argument when I run my Perl script, resulting in an invalid-argument error in GetResults.pl.
Then I tried this
system($^X, "$Bin/GetResults.pl", "/\-/templatefile $template");
It seems OK since it no longer complains about the earlier problem, but now it says it could not find template.xml, although I have that file in the same location as my Perl script as well as the GetResults.pl script. If I run GetResults.pl alone, it works fine.
I'm wondering if there is some issue with the string comparison when I use the variable $template versus the real file name located on my PC (I'm using Windows 7).
I'm new to Perl and hope that someone could help. Thank you in advance.
Pass the arguments as an array, just as you would with any other program (a Perl script is not special; that it is a Perl script is an implementation detail):
system($^X, "$Bin/GetResults.pl", "-templatefile", "$template");
You could line everything up in an array and use that, too:
my @args = ("$Bin/GetResults.pl", "-templatefile", "$template");
system($^X, @args);
Or even add $^X to @args. Etc.

Shell Programming inside Perl

I am writing a code in perl with embedded shell script in it:
#!/usr/bin/perl
use strict;
use warnings;
our sub main {
    my $n;
    my $n2 = 0;
    $n = chdir("/home/directory/");
    if ($n) {
        print "change directory successful $n \n";
        $n2 = system("cd", "section");
        system("ls");
        print "$n2 \n";
    }
    else {
        print "no success $n \n";
    }
    print "$n\n";
}
main();
But it doesn't work: when I do the ls, it doesn't show the new files. Does anyone know another way of doing it? I know I can use chdir(), but that is not the only problem, as I have other commands of my own that are simply shell commands strung together. So does anyone know how to use the CLI from Perl so that the shell commands stay attached to the same process, rather than a new process being spawned for each system call? I really don't know what to do.
The edits have been used to improve the question. Please don't mind the edits if the question is clear.
edits: good point made by mob that each system call is its own process, so its state dies every time. But what I am trying to do is create a Perl script that follows an algorithm which decides the flow of control of the shell script. So how do I make all these shell commands run in the same process?
system spawns a new process, and any changes made to the environment of the new process are lost when the process exits. So calling system("cd foo") will change the directory to foo inside of a very short-lived process, but won't have any effect on the current process or any future subprocesses.
To do what you want to do (*), combine your commands into a single system call.
$n2 = system("cd section; ls");
You can use Perl's rich quoting features to pass longer sequences of commands, if necessary.
$n2 = system q{cd section
    if ls foo ; then
        echo we found foo in section
        ./process foo
    else
        echo we did not find foo\!
        ./create_foo > foo
    fi};
$n2 = system <<"EOSH";
cd section
./process bar > /tmp/log
cd ../sekshun
./process foo >> /tmp/log
if grep -q Error /tmp/log ; then
echo there were errors ...
fi
EOSH
(*) of course there are easy ways to do this entirely in Perl, but let's assume that the OP eventually will need some function only available in an external program
system("cd", "section"); attempts to execute the program cd, but there is no such program on your machine.
There is no such program because each process has its own current working directory, and one process cannot change another process's current working directory. (Programs would malfunction left and right if it were possible.)
It looks like you are attempting to have a Perl program execute a shell script. That requires recreating the shell in Perl. (More specific goals might have simpler solutions.)
What I am trying to do is create a perl script which follows an algorithm which decides the flow of control of the shell script.
Minimal change:
Create a shell script that prompts for instructions/commands. Have your Perl script launch the shell script using Expect and feed it answers/commands.

Perl Shell Execution

I'm sure you have all used Metasploit.
In Metasploit, when the user presses the enter key or types any command, Metasploit executes it and returns to the msf > prompt.
I was wondering how I could do this in Perl (pretty much make a Perl shell, which executes commands and returns back with that little identifier).
while (1) {
    if (<STDIN> eq defined) {
        print ">>";
    }
    $command = <STDIN>;
    if ($command =~ m/help/) {
        print "Help is on its way";
    } elsif ($command =~ m/exit/) {
        exit(1);
    }
}
Take a look at the Term::* modules:
Term::ReadLine
Term::Shell
Following David's answer, it's time for me to promote Zoidberg. Zoidberg is another Perl shell (like Psh), but it is modular, embeddable, and extendable.
You can use Zoidberg::Shell to build a shell for your application, or
you can use the Zoidberg::Fish plugin system to build a plugin for your needs which would run inside Zoidberg itself. It would most likely define some commands, and possibly a syntax and operation mode. The canonical example of this is a SQL plugin which allows Zoidberg to recognize SQL statements, and then pass them to a waiting db handle and return results, directly from inside the shell!
As it happens, I am the new maintainer. Zoidberg just had its first release in several years which corrected several bugs that had popped up over the years. So while I am not an expert in it yet, I am probably the closest to being one that exists.
Start your reading about Zoidberg at the zoiduser man page, then read more about plugins at zoiddevel.
There really is something called Perl Shell (Psh), and it's available from CPAN.
I haven't tried it, but the documentation is all there:
$ cpan
cpan> install Psh
EDIT
I've played with it a bit. I had to change PS1 so it wouldn't interfere with Psh. Originally, my PS1 was set to:
PS1=$(print -n "`logname`#`hostname`:";if [[ "${PWD#$HOME}" != "$PWD" ]] then; print -n "~${PWD#$HOME}"; else; print -n "$PWD";fi;print "\n$ ")
But, Psh didn't like it. Instead, if I use the Bash settings, it works great:
PS1="\u#\h:\W: PSH> "
I also get the following warnings when starting:
Using an array as a reference is deprecated at /Library/Perl/5.12/Psh/StrategyBunch.pm line 260.
Using an array as a reference is deprecated at /Library/Perl/5.12/Psh/Strategy/Darwin_apps.pm line 47.
But it does start up. I haven't figured out shell history editing, but it does take Perl scripts:
david#DaveBook:david: PSH> foreach $foo (<*>) {
> print "$foo\n";
> }

Perl as a batch-script tool - fully piping child processes?

Apologies if some of the terminology may be slightly off here. Feel free to correct me if I use a wrong term for something.
Is it possible to use Perl as an "advanced shell" for running "batch" scripts? (on Windows)
The problem I face when replacing a .bat/.cmd script that's getting too complicated with a perl script is that I can't easily run sub processes as a shell does.
That is, I would like to do the same thing from my perl script as a shell does when invoking a child process, that is, fully "connecting" STDIN, STDOUT and STDERR.
Example:
foo.bat -
#echo off
echo Hello, this is a simple script.
set PARAM_X=really-simple
:: The next line will allow me to simply "use" the tool on the shell I have open,
:: that is STDOUT + STDERR of the tool are displayed on my STDOUT + STDERR and if
:: I enter something on the keyboard it is sent to the tools STDIN
interactive_commandline_tool.exe %PARAM_X%
echo The tool returned %ERRORLEVEL%
However, I have no clue how to fully implement this in perl (is it possible at all?):
foo.pl -
print "Hello, this is a not so simple script.\n";
my $param_x = get_more_complicated_parameter();
# Magic: This sub executes the process and hooks up my STDIN/OUT/ERR and
# returns the process error code when done
my $errlvl = run_executable("interactive_commandline_tool.exe", $param_x);
print "The tool returned $errlvl\n";
How can I achieve this in perl? I played around with IPC::Open3 but it seems this doesn't do the trick ...
Probably you'll find IPC::Run3 useful. It allows you to capture both STDOUT and STDERR (but not pipe them in real time). The command's exit status will be returned in $?.
Why not this way:
print "Hello, this is a not so simple script.\n";
my $param_x = get_more_complicated_parameter();
system('cmd.exe', $param_x);
my $errorlevel = $? >> 8;
print "The tool returned $errorlevel\n";
sub get_more_complicated_parameter { 42 }
I don't have your interactive program, but the shell it executed allowed me to enter commands, inherited the environment defined in Perl, etc.
I have been using Perl as a replacement for more complicated shell scripts on Windows for a long time, and so far everything I needed was possible.

How can I test that a Perl program compiles from my test suite?

I'm building a regression system (not unit testing) for some Perl scripts.
A core component of the system is
`perl script.pl @params 1>stdoutfile 2>stderrfile`;
However, in the course of actually working on the scripts, they sometimes don't compile (shock!), yet the perl process itself still runs to completion. And I don't know how to tell from stderr whether Perl failed to compile the script (and therefore wrote to stderr), or my script barfed on its input (and therefore wrote to stderr).
How do I detect whether a program compiled and executed, without exhaustively enumerating Perl's error messages and grepping the stderr file for them?
It might be easiest to do this in two steps:
system("$^X -c script.pl");
if ($? == 0) {
    # it compiled, now let's see if it runs
    system("$^X script.pl @params 1>stdoutfile 2>stderrfile");
    # check $?
}
else {
    warn "script.pl didn't compile";
}
Note the use of $^X instead of perl. This is more flexible and robust. It ensures that you're running from the same installation instead of whatever interpreter shows up first in your path. The system call will inherit your environment (including PERL5LIB), so spawning a different version of perl could result in hard-to-diagnose compatibility errors.
When I want to check that a program compiles, I check that it compiles :)
Here's what I put into t/compile.t to run with the rest of my test suite. It stops all testing with the "bail out" if the script does not compile:
use Test::More tests => 1;
my $file = '...';
print "Bail out! Script file is missing!\n" unless -e $file;
my $output = `$^X -c $file 2>&1`;
print "Bail out! Script file does not compile!\n"
    unless like( $output, qr/syntax OK$/, 'script compiles' );
Scripts are notoriously hard to test. You have to run them and then scrape their output. You can't unit test their guts... or can you?
#!/usr/bin/perl -w

# Only run if we're the file being executed by Perl
main() if $0 eq __FILE__;

sub main {
    ...your code here...
}

1;
Now you can load the script like any other library.
#!/usr/bin/perl -w
use Test::More;
require_ok("./script.pl");
You can even run and test main(). Test::Output is handy for capturing the output. You can say local @ARGV to control arguments, or you can change main() to take @ARGV as an argument (recommended).
Then you can start splitting main() up into smaller routines which you can easily unit test.
Take a look at the $? variable.
From perldoc perlvar:
The status returned by the last pipe close, backtick (``) command, successful call to wait() or waitpid(), or from the system() operator. This is just the 16-bit status word returned by the traditional Unix wait() system call (or else is made up to look like it). Thus, the exit value of the subprocess is really ($? >> 8), "$? & 127" gives which signal, if any, the process died from, and "$? & 128" reports whether there was a core dump.
It sounds like you need IPC::Open3.