Execute a perl script within a perl script with arguments - perl

I have run into a problem when trying to execute a Perl script from within my Perl script. This is a small part of a larger project that I'm working on.
Below is my Perl script code:
use strict;
use warnings;
use FindBin qw($Bin);
#There are more options, but I just have one here for short example
print "Please enter template file name: "
my $template = <>;
chomp($template);
#Call another perl script which take in arguments
system($^X, "$Bin/GetResults.pl", "-templatefile $template");
the "GetResults.pl" takes in multiple arguments, I just provide one here for example. Basically, if I was to use the GetResults.pl script alone, in the command line I would type:
perl GetResults.pl -templatefile template.xml
I ran into two problems with the system call above. First, it seems to remove the dash in front of my argument when I run my Perl script, resulting in an invalid-argument error in GetResults.pl.
Then I tried this
system($^X, "$Bin/GetResults.pl", "/\-/templatefile $template");
It seems OK since it does not complain about the earlier problem, but now it says it could not find template.xml, although I have that file in the same location as my Perl script as well as the GetResults.pl script. If I just run the GetResults.pl script alone, it works fine.
I'm wondering if there is some issue with the string comparison when I use the variable $template versus the real file name located on my PC (I'm using Windows 7).
I'm new to Perl and hope that someone could help. Thank you in advance.

Pass the arguments as an array, just as you would with any other program (a Perl script is not special; that it is a Perl script is an implementation detail):
system($^X, "$Bin/GetResults.pl", "-templatefile", "$template");
You could line everything up in an array and use that, too:
my @args = ("$Bin/GetResults.pl", "-templatefile", "$template");
system($^X, @args);
Or even add $^X to @args. Etc.
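Putting it together with the question's prompt, the fixed script might look like this (a sketch, reading the reply from STDIN; system returns 0 on success and $? holds the child's exit status):
use strict;
use warnings;
use FindBin qw($Bin);

print "Please enter template file name: ";
chomp(my $template = <STDIN>);

# Each argument is its own list element, so no shell is involved and
# the leading dash reaches GetResults.pl intact.
system($^X, "$Bin/GetResults.pl", "-templatefile", $template) == 0
    or die "GetResults.pl failed: $?";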

Related

From perl, why does a backticked call to a c-shell pass quoted string ok if called with tcsh but not source?

I need to write a Perl script that calls a C-shell script that calls yet another Perl script. I cannot change the C-shell script or the Perl script it calls. One of the args that needs to be passed is a quoted string with spaces. If I use backticks to call the C-shell script, and I run it with tcsh, the quoted string is respected as a single entity. However, if I run it with source, it is not.
I feel that I need to use 'source' because when the c-shell is called by users from the command line, it is called through an alias that sources the c-shell. E.g.
alias top "source top.csh"
Consider these...
topmost.pl
#!/usr/bin/env perl
use strict;
print "Try with tcsh...\n";
my $msg = `tcsh ./top.csh -arg1 "this line has spaces"`;
print "$msg\n";
print "Try with source...\n";
$msg = `source ./top.csh -arg1 "this line has spaces"`;
print "$msg\n";
exit;
top.csh is simply....
perl ./subperl.pl $*:q
exit
And subperl.pl is...
#!/usr/bin/env perl
use strict;
print "In subperl.pl\n";
foreach my $x (@ARGV) {
print "$x\n";
}
print "The End\n";
exit;
When I run the topmost.pl script, I get...
Try with tcsh...
In subperl.pl
-arg1
this line has spaces
The End
Try with source...
In subperl.pl
-arg1
this
line
has
spaces:q
The End
Why does the "sourced" call to the top.csh script fail to respect the quotes ?
@Kaz has the answer as to why your code isn't working. This answer is about how to avoid this class of problems entirely.
First, if you can, add a #!/bin/tcsh line to top.csh and make it executable (i.e., chmod +x). Now it can be executed as ./top.csh without needing to know which shell to use.
Then you'll want to avoid using `` for anything but very simple commands. This is because `` is interpreted by the shell and now you need to worry about shell special characters and escapes and spaces... it's a mess. What you need is a way to call external programs without invoking a shell.
You can do this by passing a list to system, but system cannot capture the output.
system "tcsh", "./top.csh", "-arg1", "this line has spaces";
While you can cobble something together with open and pipes, it's better to use a pre-existing library such as IPC::System::Simple.
use IPC::System::Simple qw(capturex);
# Or capturex("./top.csh", ...) if you add a #! to top.csh.
my $msg = capturex("tcsh", "./top.csh", "-arg1", "this line has spaces");
For more involved interactions with executables, look into System::Command or IPC::Run.
Needless to say, Perl scripts which call shell scripts which call Perl scripts is a bit of a nightmare to maintain. Rather than do that, it is better to scoop the guts of subperl.pl out into a Perl library and have both subperl.pl and your code use that library.
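As a sketch of that refactoring (the module name and sub name here are hypothetical), the shared logic could live in a module:
# SubPerl.pm (hypothetical name) -- the guts of subperl.pl, as a library
package SubPerl;
use strict;
use warnings;
use Exporter 'import';
our @EXPORT_OK = qw(print_args);

sub print_args {
    print "In subperl.pl\n";
    print "$_\n" for @_;
    print "The End\n";
}

1;
And subperl.pl becomes a thin wrapper that your own code no longer needs to shell out to:
#!/usr/bin/env perl
use strict;
use warnings;
use SubPerl qw(print_args);
print_args(@ARGV);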
The command in backticks is being interpreted by your system interpreter (invoked via /bin/sh), which I'm guessing might be GNU Bash. Or, in any case, it seems to be some shell which understands the source command, and almost certainly a POSIX-like shell. That source command quite probably tells that shell to read a script written in that shell's own language. So, for instance, if that shell happens to be Bash, it will treat that as a Bash script, not as a Tcsh script.
The only way both could work is if the script is a "polyglot": a program which can be interpreted by either tcsh or the system shell that is used by perl to implement backticks.
(An easy example of a C Shell + POSIX shell polyglot is a script that contains nothing but a sequence of trivial commands consisting of space separated words like cp from to.)
Your script isn't a polyglot; only Tcsh understands the :q syntax, not the other shell.
More precisely, if /bin/sh is Bash, the original source ... command as well as the contents of the sourced top.csh script will be treated as a POSIX-mode Bash script, since when Bash is invoked as /bin/sh, it turns off its POSIX-incompatible behaviors. So even if Bash's pathname expansion supported the Tcsh :q mechanism, it would almost certainly be turned off under POSIX mode because $*:q already has a firm meaning in POSIX.

Extract user name from perl script

I want to extract the name of the user executing a Perl script, from within the script itself.
I am executing whoami linux command from perl as follows and it works pretty well.
my $whoami = `whoami`;
chomp $whoami;
print $whoami;
My intention is to get away from calling system commands from the Perl script, so I am looking for a Perl-only solution. I was wondering if there is a CPAN module available that can extract system information.
Your suggestions in this regard will be appreciated.
perl -le 'print scalar getpwuid $<'
Perl has a direct mapping to the system getpw* functions.
These routines are the same as their counterparts in the system C
library. In list context, the return values from the various get
routines are as follows:
($name,$passwd,$uid,$gid,
$quota,$comment,$gcos,$dir,$shell,$expire) = getpw*
-- from perldoc -f getpwuid.
Use getpwuid with $< as the argument (which, according to perldoc perlvar, is "The real uid of this process", and is also available as $REAL_USER_ID and $UID) and take the first returned value.
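Inside a script rather than a one-liner, the same call looks like this (in scalar context getpwuid returns just the name):
use strict;
use warnings;

# $< is the real UID of this process; getpwuid maps it to the
# account name when called in scalar context.
my $username = getpwuid($<);
print "$username\n";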
You should probably take a look at the hash %ENV. It contains useful information about the environment where your script is run.
One example (in windows) to get the username would be:
perl -E "say $ENV{'USERNAME'}"
On Linux, substitute LOGNAME or USER for USERNAME.
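A sketch that tries the common variables in turn (the defined-or operator // needs Perl 5.10+):
use strict;
use warnings;

# Windows sets USERNAME; Unix shells typically set LOGNAME and/or USER.
my $user = $ENV{USERNAME} // $ENV{LOGNAME} // $ENV{USER}
    // die "could not determine user from environment";
print "$user\n";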

How can I compile a Perl script inside a running Perl session?

I have a Perl script that takes user input and creates another script that will be run at a later date. I'm currently going through and writing tests for these scripts, and one of the tests that I would like to perform is checking whether the generated script compiles successfully (e.g., perl -c <script>). Is there a way that I can have Perl perform a compile on the generated script without having to spawn another Perl process? I've tried searching for answers, but searches just turn up information about compiling Perl scripts into executable programs.
Compiling a script has a lot of side effects. It results in subs being defined, modules being executed, and so on. If you simply want to test whether something compiles, you want a separate interpreter. It's the only way to be sure that testing one script doesn't cause later tests to give false positives or false negatives.
To execute dynamically generated code, use eval function:
my $script = join '', <main::DATA>; # DATA lines keep their newlines
eval($script); # prints 3
__DATA__
my $a = 1;
my $b = 2;
print $a+$b, "\n";
However, if you want to just compile or check syntax, you will not be able to do it within the same Perl session.
The syntax_ok function from the Test::Strict library runs a syntax check by invoking perl -c with an external perl interpreter, so I assume there is no internal way.
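In a test suite, that check is a one-liner (per the Test::Strict documentation; it spawns perl -c behind the scenes):
use Test::Strict tests => 1;
syntax_ok('generated_script.pl');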
The only work-around that might work for you would be:
my $script = join '', <main::DATA>;
eval('return;' . $script);
warn $@ if $@; # syntax error at (eval 1) line 3, near "1
# my "
__DATA__
my $a = 1
my $b = 2;
print $a+$b, "\n";
In this case, you will be able to check for compilation error(s) using $@; however, because the first line of the code is return;, it will not execute.
Note: Thanks to user mob for the helpful chat and code correction.
Won't something like this work for you?
open(FILE, "perl -c generated_script.pl 2>&1 |");
my @output = <FILE>;
if (join('', @output) =~ /syntax OK/)
{
    printf("No Problem\n");
}
close(FILE);
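A variant of the same idea that checks the child's exit status instead of scraping the output (perl -c exits non-zero when compilation fails), and avoids the shell by passing a list:
# $^X is the path of the currently running perl binary.
system($^X, '-c', 'generated_script.pl');
print "No Problem\n" if $? == 0;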
See Test::Compile module, particularly pl_file_ok() function.

parsing first entry of a find call in perl?

I need to get an example file from a find command in a Perl script to create another system call afterwards. For some reason, the find command gets stuck when I call it from the script. Here is what I need to do:
my $search_dir = "/something/like/this/??/??/??";
# the triple '??' are needed here
my $cmd = "find $search_dir -name \"\*.$var1.token1.$var2.ext\" | head -n 1";
my $first_example_file = `$cmd`; chomp $first_example_file;
This gets stuck when I run it through Perl; it never finishes executing the command, whereas the constructed $cmd runs in no time if I copy+paste it and run it in my bash terminal. Any ideas?
Try using the File::Find Perl module for finding files. If you would like to use the shell's find in your Perl then you might have to use $(..) in your command.
I am not into Perl … just trying to help out.
Update:
As stated in the comments by Rohaq you can also use File::Find::Rule
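A sketch with File::Find (the $var1/$var2 values are hypothetical stand-ins for the question's variables; glob expands the ?? wildcards up front):
use strict;
use warnings;
use File::Find;

my ($var1, $var2) = ('foo', 'bar'); # hypothetical values
my @dirs = glob('/something/like/this/??/??/??');

my @matches;
find(sub {
    # \Q...\E makes the dots in the suffix match literally
    push @matches, $File::Find::name
        if /\Q.$var1.token1.$var2.ext\E$/;
}, @dirs) if @dirs;

my $first_example_file = $matches[0];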
I'd wager globbing (shell metacharacter expansion) is involved. But regardless, try chopping the command up. Does it work without the pipe? What about without the ?? in the pathname? What happens if you prepend echo ("echo find ...")? Still hanging? Then you can try it under perl -d, the debugger; perldoc perldebug is your friend.

How can I call a Perl function from a shell script?

I have written a library in Perl that contains a certain function, that returns information about a server as a character string. Can I call this function from a shell directly?
My boss asks "Can you call it from a shell directly for the time being?" Because he said that, I think I should be able to do it, but how do I do it?
perl -MServerlib=server_information -e 'print server_information()'
That is another way to do this, but it only works if Serverlib exports the server_information sub. If it doesn't, you would need to do the below instead:
perl -MServerlib -e 'print Serverlib::server_information()'
As perl's command line arguments are a bit inscrutable, I'd wrap it in a simpler perl script that calls the function. For example, create a script serverinfo which contains:
#!/usr/bin/perl
use feature 'say'; # say() is not available by default; this enables it
require 'library.pl';
say library::getServerInformation();
then make it executable and run it:
chmod u+x serverinfo
./serverinfo
The advantage of doing it this way is the output and arguments of the script can be corrected if the function itself changes. A command line script like this can be thought of as an API, which shouldn't change when the implementation changes.