Using backticks instead of system() in Perl

I have a Perl script that calls another Perl script using system(), like this:
my $returnval = system("perl", $path, $val1, $val2, @myarray);
Because system() returns only the exit status, but I want the script's output, I want to use backticks instead. I tried something like this:
my $returnval = `$path`;
How can I add the parameters the script should receive?
What should the other Perl script's return code look like? At the moment it's:
exit($myreturnedvalue);
(How) is it possible to return multiple values?

Going through the shell to move data from one Perl script to another is not the best solution. Note that backticks (or qx()) capture whatever is printed to STDOUT, so exit($var) in the other script will not be captured properly; you would need print $var instead. But don't do that.
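As a tiny illustration of that point (using perl -e as a stand-in for the second script; the printed text and status value are made up for the example), backticks capture what the child prints, while the exit status lands in $?:

```perl
use strict;
use warnings;

# The child prints to STDOUT and then exits with status 3.
# Backticks capture only the printed text; the exit status
# is available separately in $?, shifted right by 8 bits.
my $output = `$^X -e "print 'result'; exit 3"`;
my $status = $? >> 8;

print "captured: $output\n";   # captured: result
print "status:   $status\n";   # status:   3
```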
Instead, make the other script into a module and use the subroutine directly. This is a simplistic example:
In bar.pm:
use strict;
use warnings;

package bar;        # declare our package

sub fooz {          # <--- our target subroutine
    my $in = shift; # the input argument
    return $in * 5; # return value
}
1;                  # a module's last statement must be true
In main.pl:
use strict;
use warnings;
use bar;   # bar.pm must be in a directory listed in @INC

my $foo = bar::fooz(12);   # calling fooz from our other Perl script
print "Foo is $foo\n";
There is a lot more to learn, and I suggest you read up on the documentation.
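On the question of multiple values: a subroutine in the module can simply return a list, which the caller unpacks into several variables. A minimal sketch in the same spirit as the bar.pm example above (the multi name is made up):

```perl
use strict;
use warnings;

package bar;

# Returning a list gives the caller several values at once.
sub multi {
    my $in = shift;
    return ($in, $in * 5, 'done');
}

package main;

my ($orig, $scaled, $msg) = bar::multi(12);
print "$orig $scaled $msg\n";   # 12 60 done
```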

You want IPC::System::Simple's capturex.
use IPC::System::Simple qw( capturex );
my $output = capturex("perl", $path, $val1, $val2, @myarray);
It even handles errors for you.

The backticks simply work like a direct invocation one would make in a shell:
you@somewhere:~$ ./script.pl --key=value
is basically the same as
my $returnval = `./script.pl --key=value`;

For invoking other programs passing arguments and capturing output at the same time, I'm a big fan of IPC::Run:
use IPC::Run 'run';
my $success = run [ $command, @args ], ">", \my $output;
# $success is true if the command exited with status 0
# ($? holds the raw exit status), and $output contains
# the command's STDOUT data

Does this not do what you want?
my $returnval = `$path $val1 $val2 @myarray`;
@Quentin however adds this useful advice: if the value you want to pass is foo "bar, then in the shell you would have to write something like "foo \"bar". Using the multi-argument form of system takes care of that for you. Using backticks won't; you need to construct the shell command manually.
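If you want backtick-style capture without constructing a shell command by hand, one core-Perl alternative is the list form of a pipe open, which passes each argument directly to the child process, so no shell quoting is needed. A sketch (the child here is just perl -e echoing its arguments back):

```perl
use strict;
use warnings;

# List-form pipe open: no shell is involved, so an argument
# like 'foo "bar' needs no escaping at all.
my @args = ('foo "bar', 'plain');
open my $fh, '-|', $^X, '-e', 'print join "|", @ARGV', @args
    or die "Cannot fork: $!";
my $output = do { local $/; <$fh> };   # slurp the child's STDOUT
close $fh;

print "$output\n";   # foo "bar|plain
```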


Issues with getopts in perl

I'm using Getopt::Std to process my command-line args. My command-line args are strings. I have an issue with getopts(), as it works only for single-character-based opts.
As seen below, the "srcdir" and "targetdir" options are mandatory, and the script should error out if either of them is missing. "block" is not a mandatory option.
I don't see the %options hash being set with the code below; all my $options{key} values are undef. If I replace "srcdir" with "s" and "targetdir" with "t", then the code below works. It doesn't work with the "-srcdir" and "-targetdir" options.
What's the best way to address this issue?
Usage:
perl test.pl -srcdir foo1 -targetdir hello1
#!/usr/bin/perl -w
use strict;
use Getopt::Std;
# declare the perl command line flags/opt we want to allow
my %options=();
my $optstring = 'srcdir:targetdir:block';
getopts( "$optstring", %options);
# test for the existence of the opt on the command line.
print "-srcdir $options{srcdir}\n" if defined $options{srcdir};
print "-targetdir $options{targetdir}\n" if defined $options{targetdir};
print "-blocks $options{block}\n" if defined $options{block};
# other things found on the command line
print "loop:\n" if ($#ARGV > 0);
foreach (@ARGV)
{
print "$_\n";
}
You really want to use Getopt::Long to handle words like srcdir:
use warnings;
use strict;
use Data::Dumper;
use Getopt::Long;
$Data::Dumper::Sortkeys=1;
my %options;
GetOptions(\%options, qw(srcdir=s targetdir=s block));
print Dumper(\%options);
print Dumper(\@ARGV);
The reason your hash was empty was that you need to pass a reference to a hash, as shown in Getopt::Std:
getopts( "$optstring", \%options);
Also, since Getopt::Std only handles single letters, it would interpret srcdir as six separate options: s, r, c, d, i, r.
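For comparison, the closest working Getopt::Std version would shorten the names to single letters (-s, -t, and -b here are stand-ins chosen for the example) and pass the hash reference:

```perl
use strict;
use warnings;
use Getopt::Std;

# 's:' and 't:' take string values; 'b' is a boolean flag.
local @ARGV = ('-s', 'foo1', '-t', 'hello1', '-b');
my %options;
getopts('s:t:b', \%options);   # note the hash *reference*

print "srcdir=$options{s} targetdir=$options{t} block=$options{b}\n";
```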

How to execute a script from another so that it also sets variables for the caller script

I have reviewed many examples online about running another process (a Perl script, a shell command, or a program), but have not found anything useful for my needs.
(As the answers already received show that my request was not understood, I will try to state it briefly, leaving everything written earlier as an example of what I already tried...)
I need:
- in the caller script, to set parameters for the second script before calling it (thus I could not use do 'script2.pl', as it is executed before the first script starts to run);
- in the second script, to set some variables that will be used in the caller script (therefore it is not useful to process the second script via system() or backticks);
- and, since I need to use those variables in the first script, to come back to the first script after completing the second one.
(I hope it is now clearer what I need...)
(Reviewed and found not useful: system(), backticks, exec(), and open().)
I would like to run another Perl script from a first one: not exiting (as exec() does), not catching the called script's STDOUT (as backticks do) but having it printed as in the initial script (as system() does), while not needing the return status (which is all system() gives);
but I would like the called script to set some variables that will be accessible in the calling script (set, of course, via our @set_var;).
My attempt (which I am not able to make do what I need) is:
Script1 is something like:
...
if ($condition)
{   local $0    = 'script2.pl';
    local @ARGV = ('first-arg', 'second_arg');
    do 'script2.pl';
}
print "set array is: '@set_var'\n";
...
The script2 would have something like:
#!/usr/bin/perl
...
print "having input parameters: '@ARGV'\n";
... # all script activities
our @set_var = ($val1, $val2, $val3);
exit 0;
The problem in my code is that the do ... command is executed at the beginning of the first script's run, and not at the place where things are prepared for it (by setting some local ... vars!).
I did try eval "do script2.pl":
- now it is executed in the proper place, but it does not set @set_var in the first script's process!
Is there any way to do it as I would like?
(I understand that I can rewrite script2.pl, wrapping the whole processing in a function (say, main()), load it with require(), and call main(): that would do everything as I prefer; but I would like to leave the second script as-is, executable from the shell by itself, as it is now.
... and I do not like the idea of passing values through a flat file...)
Does anybody have an idea how to fulfil my whim?
This works just fine:
script2.pl
use strict;
our @set_var = ("foo","bar");
script1.pl
use strict;
our @set_var;
do './script2.pl';
print "@set_var\n";
$ perl script1.pl
foo bar
But it does not if you use:
script2.pl
use strict;
our @set_var = ("foo","bar");
exit 0;
There is only a single perl process in this example, so calling exit, even from the second script, exits your program.
If you don't want to remove the exit call in the second script, we can work around that with some CORE::GLOBAL namespace hacking. The gist is to redirect the exit function to your own custom function that you can manipulate when the second script runs.
script1.pl
BEGIN { *CORE::GLOBAL::exit = *my_exit };
use strict;
sub my_exit { goto &CORE::exit }
our @set_var;
{
local *my_exit = sub { warn "Not exiting" };
do './script2.pl';
}
print "@set_var\n";
script2.pl
use strict;
our @set_var = ("foo","bar");
exit 0;
$ perl script1.pl
Not exiting at script1.pl line 7.
foo bar
(OK, finally: asked by myself and answered by myself, too!)
(After additional review, I realized that the earlier solution does use this, but I did not understand the advice! I am sorry: my fault for stepping over the real solution!)
The solution to my question is simple! It is:
do EXPR;
That way:
- the second script is executed at the place where the call appears, so anything defined and set in the first script is usable in the second one;
- the second script prints everything it should print to STDOUT;
- any variables or objects defined in the second script's run are accessible in the first one after coming back; and
- control returns to the position immediately after the second-script execution, and processing of the first script's commands continues.
Simple! I am just amazed that I forgot about the do command; I have used it more than once before!
And I am much disappointed by this forum! It is badly designed for displaying a conversation, and participants, instead of reviewing the Perl issue, are much more concerned with moderating others and teaching them how to behave in such a nice forum!
I am not really sure what you are trying to do exactly, but along these lines it should be very close.
test.pl
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';
use IPC::System::Simple qw(system);
say $0;
system($^X, "sample.pl", @ARGV);
$ perl test.pl first-arg second-arg
test.pl
sample.pl
$VAR1 = [
'first-arg',
'second-arg'
];
sample.pl
#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;
use feature 'say';
say $0;
print Dumper \@ARGV;
I used the module IPC::System::Simple. You can also capture the output of the script (sample.pl) through IPC::System::Simple::capture.
Update: Maybe you can use Storable. This way you can pass new parameters that you can use from script 2 (sample.pl) to script 1 (test.pl).
test.pl
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
use Data::Dumper;
use feature 'say';
use IPC::System::Simple qw(system);
say $0;
system($^X, "sample.pl", @ARGV);
my $hashref = retrieve('sample');
print Dumper $hashref;
__END__
$ perl test.pl first-arg second-arg
test.pl
sample.pl
$VAR1 = [
'first-arg',
'second-arg'
];
$VAR1 = {
'arg1' => 'test1',
'arg2' => 'test2'
};
sample.pl
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
use Data::Dumper;
use feature 'say';
say $0;
print Dumper \@ARGV;
my %hashArgs = ( arg1 => 'test1',
arg2 => 'test2', );
store \%hashArgs, 'sample';

calling one perl file from the second perl file while taking variables of second file

I want to know if this is possible. The file a.pm has a few variables that will be required by b.pl, and b.pl is called by a.pm.
a.pm
{
$var1,$var2;
system("perl b.pl");
}
b.pl
{
print "$var1";
}
How do I make $var1 and $var2 available to b.pl? I do not want to pass them as command-line arguments to b.pl.
You cannot share the same variables if you do a system call to a new Perl interpreter. To do that, you could pass them as command line arguments.
use strict;
use warnings;
my $var1 = 'foo';
my $var2 = 'bar';
system( 'perl', 'b.pl', $var1, $var2 );
And then in b.pl:
use strict;
use warnings;
print $ARGV[0];
The arguments end up in #ARGV, which can be used as a regular array. If you want to share data structures, you'd have to serialize them first. You could use Storable or JSON to do that. You then need to deserialize them in your second program.
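A minimal sketch of that serialization route using the core JSON::PP module; here the encode (caller) and decode (callee) halves are shown in one script for illustration, with the actual system call commented out (the b.pl name is hypothetical):

```perl
use strict;
use warnings;
use JSON::PP qw(encode_json decode_json);

# Caller side: flatten a data structure into one string,
# which could be passed as a single command-line argument:
my %data = ( var1 => 'foo', var2 => 'bar' );
my $json = encode_json(\%data);
# system($^X, 'b.pl', $json);   # hypothetical call

# Callee side (this part would live in b.pl, reading $ARGV[0]):
my $decoded = decode_json($json);
print "$decoded->{var1} $decoded->{var2}\n";   # foo bar
```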
There are more complicated alternatives, but I think going into detail for those (like saving to a file, opening a pipe and reading from STDOUT in the second script) are a bit out of scope.
You can pass $var1 and $var2 from a.pm to b.pl along with the command executed using system. Please check the following:
a.pm
my ($var1,$var2);
$var1="Open";
$var2="Close";
system ("perl b.pl $var1 $var2");
b.pl
my $var1=$ARGV[0];
my $var2=$ARGV[1];
print "$var1";
The output will be:
Open
If you don't need to send value by command line please try the following.
The way to do it is by setting the values in the %ENV hash, as in the following program. Please be careful when you choose the keys, so that you don't replace existing %ENV entries.
a.pm
my ($var1,$var2);
$var1="Open";
$var2="Close";
$ENV{m_temp_x_var1}=$var1;
$ENV{m_temp_x_var2}=$var2;
system ("perl b.pl");
b.pl
my $var1=$ENV{m_temp_x_var1};
my $var2=$ENV{m_temp_x_var2};
print "$var1";
The output will be:
Open

In perl passing a parameter to a script using a required command

I have two files, tmp.pl and tmp2.pl. I want to call tmp2.pl with a require command but also pass a parameter. Is there a better way of doing this?
tmp.pl
require "tmp2.pl" "passed parameter";
tmp2.pl
print #_;
As far as I know, require cannot be used to pass a parameter. But that's a good thing, I think, because I cannot think of a reason why you would want to. It looks to me like your design is wrong.
tmp2.pl should be either:
an independent Perl program, which you run with system or qx()
a module, with optional exported tags etc.
a package which defines a class
but that's just my idea....
There's probably a better way to accomplish whatever it is you're trying to do, but you could achieve your current sub goal with something like
{
local @_ = ("passed parameter");
require "tmp2.pl";
}
I might consider this idiom in a place where I wanted to run a perl script from within a perl script. That is, I could say
{
local @ARGV = ("foo","bar");
require "my_script.pl";
}
instead of
system("perl","my_script.pl","foo","bar");
(There are plenty of subtle and not-so-subtle differences between these two calls, so a lot depends on what "features" of these calls you need)
Yes, I think this should work: use require to include the script, then pass the parameter by calling a subroutine defined there. The modified caller can be:
require "tmp2.pl";
subfunc($parameter);
with tmp2.pl defining the subroutine that does the print @_;
Supposing that you want to execute another program and capture its output, I'm partial to the IPC::Run module.
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Run qw( run );
run( [ './tmp2.pl', @ARGV ], \'', \my $out, \my $err );
print "out: $out\n";
print "err: $err\n";

Is there a way to encapsulate the common Perl functions into their own scripts?

I am maintaining several Perl scripts that all have similar code blocks for different functions. Each time a code block is updated, I have to go through each script and manually make the change.
Is there a way to encapsulate the common functions into their own scripts and call them?
Put the common functionality in a module. See perldoc perlmod for details.
There are other ways, but they all have severe issues. Modules are the way to go, and they don't have to be very complicated. Here is a basic template:
package Mod;
use strict;
use warnings;
use Exporter 'import';
#list of functions/package variables to automatically export
our @EXPORT = qw(
always_exported
);
#list of functions/package variables to export on request
our @EXPORT_OK = qw(
exported_on_request
also_exported_on_request
);
sub always_exported { print "Hi\n" }
sub exported_on_request { print "Hello\n" }
sub also_exported_on_request { print "hello world\n" }
1; #this 1; is required, see perldoc perlmod for details
Create a directory like /home/user/perllib. Put that code in a file named Mod.pm in that directory. You can use the module like this:
#!/usr/bin/perl
use strict;
use warnings;
#this line tells Perl where your custom modules are
use lib '/home/user/perllib';
use Mod qw/exported_on_request/;
always_exported();
exported_on_request();
Of course, you can name the file anything you want. It is good form to name the package the same as the file. If you want :: in the package name (like File::Find), you will need to create subdirectories in /home/user/perllib. Each :: is equivalent to a /, so My::Neat::Module would go in the file /home/user/perllib/My/Neat/Module.pm. You can read more about modules in perldoc perlmod and more about Exporter in perldoc Exporter.
About a third of Intermediate Perl is devoted to just this topic.
Using a module is the most robust way, and learning how to use modules would be helpful.
Less efficient is the do function. Extract your code to a separate file, say "mysub.pl", and
do 'mysub.pl';
This will read and then eval the contents of the file.
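A self-contained sketch of that do route (the included file is written on the fly here just so the example runs; in practice mysub.pl would already exist on disk, and the triple name is made up):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Stand-in for an existing mysub.pl containing shared subs.
my ($fh, $file) = tempfile(SUFFIX => '.pl');
print $fh 'sub triple { return 3 * shift } 1;';
close $fh;

# do reads and evals the file in the current program,
# so triple() becomes callable here.
do $file or die "do failed: $@ $!";
print triple(7), "\n";   # 21
```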
You can use the
require "some_lib_file.pl";
where you would put all your common functions and call them from other scripts which would contain the line above.
For example:
146$ cat tools.pl
# this is a common function we are going to call from other scripts
sub util
{
    my $v = shift;
    return "$v\n";
}
1; # note this 1; the 'required' script needs to end with a true value
147$ cat test.pl
#!/bin/perl5.8 -w
require 'tools.pl';
print "starting $0\n";
print util("asdfasfdas");
exit(0);
148$ cat test2.pl
#!/bin/perl5.8 -w
require "tools.pl";
print "starting $0\n";
print util(1);
exit(0);
Then executing test.pl and test2.pl will yield the following results:
149$ test.pl
starting test.pl
asdfasfdas
150$ test2.pl
starting test2.pl
1