Interpolate perl variable in shell command - perl

I want to take a date as input from the user and pass that input to the shell command below.
for e.g.
$date = $ARGV[0];
`cd xyz $date`
Will this variable be interpolated in Perl?

You have a couple of problems. First of all, cd only takes one argument; perhaps you meant something like cd xyz$date? Second, backticks start a shell that executes the command you give, which will do the change-directory command and then immediately exit, having no effect. (The parent Perl process's current directory is left unchanged.) You might be looking for chdir.
But yes, the interpolation will work; to disable interpolation in backticks, you either escape special characters (echo xyz \$date) or use qx with single quote delimiters (qx'echo xyz $date').
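For example, a minimal sketch of the chdir approach (assuming the target directory is simply xyz followed by the argument):
my $date = $ARGV[0];
chdir("xyz$date") or die "Cannot chdir to xyz$date: $!";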

This is trivial to test:
Input:
use v5.16;
use strict;
use warnings;
my $foo = 123;
say `echo $foo`;
Output:
123
So yes.
Beware of variables containing characters with special meaning in shell though.
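For instance, a small sketch of one mitigation using Perl's built-in quotemeta, which backslash-escapes non-word characters before they reach the shell (passing a list to system, which involves no shell at all, is safer where it applies):
my $name = "Tom's Toy";          # contains a single quote
my $escaped = quotemeta $name;   # backslash-escapes the quote and the spaces
print `echo $escaped`;           # prints: Tom's Toy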

Yes, it will interpolate the variable. However, I highly recommend two things:
Use qx/../ instead of backticks.
Set the command you're running first in a Perl variable, and then interpolate that variable.
The qx/../ is nice because it makes it more obvious what you're doing:
my $variable = qx(ls $directory);
You could use a wide variety of characters for qx:
my $variable = qx(ls $directory);
my $variable = qx/ls $directory/;
my $variable = qx^ls $directory^;
my $variable = qx#ls $directory#;
The character after the qx will be used as the quoting delimiter. If you use parentheses, square brackets, or curly braces, they come in pairs: the opening one starts the command string and the matching closing one ends it.
This means that you can avoid issues where a particular character in your command might confuse things.
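For example, with a paired delimiter the opening and closing characters differ:
my $variable = qx{ls $directory};
my $variable = qx[ls $directory];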
The other thing is to build your command into a Perl variable first and then execute it. This gives you more control over what you're doing:
my $HOME;
my $command;
$HOME = "bin";
$command = "ls $HOME";
print qx($command); #Lists the directory bin
$command = 'ls $HOME';
print qx($command); #List user's home directory
In both of these examples, I'm doing qx($command). However, in the first example, I allow Perl to substitute the value of $HOME. In the second example, I use single quotes, so Perl doesn't substitute the value of $HOME. Instead, the string $HOME is just part of the command, and I'm letting the shell interpolate it.
I'm usually leery of any program that uses qx/.../. In most cases, it's used to run a command that could be done in Perl itself. For example, in early Perl programs, you'd see things like this:
$date = `date +%M/%Y/%D`;
chop $date; #Yeah, I said EARLY Perl programs
Because it was simply a lot easier to run the Unix command rather than trying to do it in a pure Perl way. However, doing it the Perl (i.e. the correct) way means you're no longer dependent upon the OS's behavior, which is not entirely under your control.
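For example, a sketch of the pure-Perl equivalent using the core POSIX module (the exact format string here is just an illustration, not the one from the old example):
use POSIX qw(strftime);
my $date = strftime('%m/%d/%Y', localtime);   # no external `date` process, no chop needed
print "$date\n";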
If you need the output of the command for use in your Perl script, you should use open to execute your command, and treat the output of the command as a file.
my $command = "ls $HOME";
open my $command_fh, "-|", $command or die qq(Couldn't execute "$command");
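You can then read the command's output line by line, just as you would from an ordinary file:
while ( my $line = <$command_fh> ) {
    chomp $line;
    print "Got: $line\n";
}
close $command_fh;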
If you simply need to execute the command, use the system command:
my $command = "xyz $date";
my $error = system $command;
if ( $error ) {
#Something terrible happened...
}
Note that if you send only a single scalar argument to the system command, and it contains possible shell metacharacters, it will execute the command via the OS shell. If you send the system command a list to execute, or there are no shell metacharacters, Perl will execute the command directly without any shell interpolation:
my @command = qw(ls $HOME);
system @command; # Prints something like "ls: $HOME: No such file or directory"

Perldoc has the answers, as ever.
Specifically, http://perldoc.perl.org/perlop.html#Quote-and-Quote-like-Operators has "Yes" in the Interpolates? column (with a proviso you don't need to worry about here), so yep, it's safe to assume your variable will be interpolated.
You can usually get the same document by running 'perldoc perlop' at your local command line.

Related

Can I obtain values from a perl script using a system call from the middle of another perl script?

I'm trying to modify a script that someone else has written and I wanted to keep my script separate from his.
The script I wrote ends with a print line that outputs all relevant data separated by spaces.
Ex: print "$sap $stuff $more_stuff";
I want to use this data in the middle of another perl script and I'm not sure if it's possible using a system call to the script I wrote.
Ex: system("./sap_calc.pl $id"); #obtain printed data from sap_calc.pl here
Can this be done? If not, how should I go about this?
Somewhat related, but not using system():
How can I get one Perl script to see variables in another Perl script?
How can I pass arguments from one Perl script to another?
You're looking for the "backtick operator."
Have a look at perlop, Section "Quote-like operators".
Generally, capturing a program's output goes like this:
my $output = `/bin/cmd ...`;
Mind that the backtick operator captures STDOUT only. So in order to capture everything (STDERR too), the command needs to be appended with the usual shell redirection "2>&1".
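For example, a small sketch (the path here is purely illustrative):
my $output = `ls /no/such/dir 2>&1`;   # the error message from ls ends up in $output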
If you want to use the data printed to stdout from the other script, you'd need to use backticks or qx().
system will only return the return value of the shell command, not the actual output.
Although the proper way to do this would be to import the actual code into your other script, by building a module, or simply by using do.
As a general rule, it is better to use all perl solutions, than relying on system/shell as a way of "simplifying".
myfile.pl:
sub foo {
print "Foo";
}
1;
main.pl:
do 'myfile.pl';
foo();
perldoc perlipc
Backquotes, like in shell, will yield the standard output of the command as a string (or array, depending on context). They can more clearly be written as the quote-like qx operator.
@lines = `./sap_calc.pl $id`;
@lines = qx(./sap_calc.pl $id);
$all = `./sap_calc.pl $id`;
$all = qx(./sap_calc.pl $id);
open can also be used for streaming instead of reading into memory all at once (as qx does). This can also bypass the shell, which avoids all sorts of quoting issues.
open my $fh, '-|', './sap_calc.pl', $id;
while (readline $fh) {
print "read line: $_";
}

how to include single quotes from user input in perl

The user is going to enter input string such as Tom's Toy.
However the perl script complains saying "unmatched '."
This is my code.
my $commandline="";
while (@ARGV) {
    $_ = shift @ARGV;
    $commandline .= $_ . ' ';
}
print " Running $commandline\n";
system ($commandline);
Now if the user input is Tom's Toy, I just want to print back Tom's Toy.
However, Perl complains "unmatched '.".
If I don't use quotes it works fine (e.g. Tom Toy is good).
How do I fix this issue.
Any help is greatly appreciated.
Thanks in advance
If you switch things around a little to use the system $cmd, #args version of the function, no shell will be invoked, so no escaping will be necessary.
my $cmd = shift @ARGV;
my @args = @ARGV;
print " Running $cmd\n";
system $cmd, @args;
I tested with ./test.pl echo Tom\'s Toy and it gives the expected output:
Running echo
Tom's Toy
system(@ARGV) is probably all you need.
If you give system() a single argument, and if that argument contains any shell metacharacters (including spaces, quotation marks, etc), then the argument will be passed to the shell. jwodder is quite correct: the error message is from the shell, not from Perl.
If you pass system() multiple arguments, it's done without invoking a shell -- which is usually better. The approach you're using takes your program's command-line arguments, joins them together into a single string, then passes that string to the shell, which splits it back into multiple arguments for execution.
On the other hand, sometimes you might want to invoke the shell, for example if you're building up a complex command using pipes, I/O redirection, and so forth, and you don't want to set it all up in Perl. But you have to be careful about metacharacters, as you've seen.
"perldoc -f system" explains this more fully.
If all you want to do is print back the user input, use print, not system. system will try to pass the supplied string to the shell for execution as a command, and it's the shell that's complaining about the unmatched quote.
(Also, instead of manually concatenating @ARGV, may I direct your attention to the join function?)
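For example, a sketch of the same concatenation loop reduced to a single join:
my $commandline = join ' ', @ARGV;
print " Running $commandline\n";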

Executing bash script, read its output and create html with Perl

I have a bash script which produces different integer values. When I run that script, the output looks like this:
12
34
34
67
6
This script runs on a Solaris server. In order to provide other users in the network with these values, I decided to write a Perl script which can:
run the bash file
read its output
build a tiny html page with a table in which the bash values are stored
That's a hard job for me because I have almost no experience with Perl. I know I can use system to execute Unix commands (and bash files) but I cannot get the output. I also heard about qx, which sounds very useful for my case.
But I must admit I have no clue how to start... Could you give me a few hints on how to solve this?
With a question like this it's a little hard to know where to begin.
The qx to which you are referring is a feature of Perl. The "q*" or "Quote and Quote-like Operators" are documented in the Perl "operators" man page (normally you'd use man perlop to read that on systems with a conventional installation of Perl).
Specifically qx is the "quoted-execution of a command" ... which is essentially an alternative form of the ` (back tick or "command substitution") operator in Perl.
In other words if you execute a command like:
perl -e '$foo = qx{ls}; print "\n###\n$foo\n###\n";'
... on a system with Perl installed then it should run Perl, which should evaluate (-e) the expression you've provided (quoted). In other words we're writing a small program right on the command line. This program starts by creating a variable whose contents will be a "scalar" (which is Perl terminology for a string or number). We're assigning (the =, or assignment, operator) the output which is captured by executing the ls command back to this variable ($foo). After that we're printing the contents of our variable (whatever the ls command would have printed) with ### lines preceding and following those contents.
A quirk of Perl's qx operator (and the various other q* operators) is that it allows you to delimit the command with just about any characters you like. For example perl -e '$bar = qx/pwd/;' would capture the output of the pwd command. When you use any of the characters that are normally used as delimiters around text (parentheses, braces, brackets, etc.) then the qx command will look for the appropriate matching delimiter. If you use any other punctuation (or non-alphanumeric character) then that same character will be the terminating delimiter as well. This latter behavior is similar to, and was inspired by, a feature of the "substitution" command from the old sed utility and ed line editors, while the matching of parentheses, braces, etc. is a Perl novelty.
So that's the basics of how to capture your shell script's output. To print the numbers in an HTML table you'd have to split the captured output into separate lines (saving them into a list or array), then print your HTML prologue (the <table> and <th> (header) tags, and so on), then loop over a series of <tr> rows, interpolating your numbers into <td> (table data) containers, and then finally print your HTML epilogue (with the closing tags).
For that you'll want to read up on the Perl print function and about "interpolation" in Perl. That's a fairly complex topic.
This is all extremely crude, and there are tools around which allow you to approach the generation of HTML at a much higher level. It's also rather dubious that you want to wrap the execution of your shell script in a Perl script, since it seems likely that you could modify the shell script to output HTML directly (perhaps as an option controlled by a command-line switch or environment variable), or that you could rewrite the shell script in Perl. Either of those would eliminate the extra work of parsing the output (splitting it into lines and separating out the values into an array), because you could capture the data into the array, or print out your HTML rows, directly as you generate the values.
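If you do stick with capturing the output in Perl, here is a rough sketch of the split-and-print approach described above (the script path is hypothetical):
my $output = qx{/path/to/your_values.bash};
my @values = split /\n/, $output;

print "<table>\n";
print "  <tr><th>Value</th></tr>\n";
for my $value (@values) {
    print "  <tr><td>$value</td></tr>\n";
}
print "</table>\n";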
To capture the output of your bash file, you can use the backtick operator:
use strict;
my $e = `ls`;
print $e;
Many, many thanks to you! With your great help, I was able to build a Perl script which does a big part of the job.
This is what I have created so far:
#!/usr/bin/perl -w
use strict;
use CGI qw(:standard);
#some variables
my $message = "please wait, loading data...\n";
#First build the web page
print header;
print start_html('Hello World');
print "<H1>we need love, peace and harmony</H1>\n";
print "<p>$message</p>\n";
#Establish a pipeline between the bash and my script.
my $bash_command = '/love/peace/harmony/./lovepeace.bash';
open(my $pipe, '-|', $bash_command) or die $!;
while (my $line = <$pipe>) {
    # Do something with each line.
    print "<p>$line</p>\n";
}
#job done, now refresh page?
print end_html;
When I call that .pl script in my browser, everything works nice :-) But a few questions are still on my mind:
When I call this website, it is busy loading the values from the pipe. Since there are about 10 values it's rather quick (2-4 seconds), but with 100+ values the user has to wait a while. Since I cannot have a progress bar, I should give the user some information, like: "Loading data, please wait..." And when the job is done, this message should say "Job done" or something similar. But how do I tell when the process is finished?
Can I reload the page once the job is done?
Is there any chance of using my own stylesheet within this Perl CGI script?
Regards,
JJ
Why only Perl?
You can use awk for that inside your shell script itself.
I have done this earlier.
If you have the output values in a variable, then use the method below:
echo $SUBSCRIBERS|awk 'BEGIN {
print "<?xml version=\"1.0\" encoding=\"UTF-8\"?><GenTransactionHandler xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"><EntityToPublish>\n<Entitytype=\"C\" typeDesc=\"Subscriber level\"><TargetApplCode>UHUNLD</TargetApplCode><TrxName>GET_SUBSCR_DATA</TrxName>"
}
{for(i=1;i<NF+1;i++) printf("<value>%d</value>\n",$i)}
END{
print "</Entity>\n</EntityToPublish></GenTransactionHandler>"}' >XML_SUB_NUM`date +%Y%m%d%H%M%S`.xml
In $SUBSCRIBERS the values should be tab-separated.

What does the '`' character do in Perl?

I was using Perl to read through each line of a file. I used a command line tool to call a service, and I noticed some interesting functionality that I can't figure out how to search for. To the variable $cmd I assigned the command that invokes the service. If I refer to $cmd later in the code, it prints out the command-line argument, but if I refer to it as `$cmd`, it gives the output from running the service.
What is the explanation for this?
It works just like backquotes in the shell, which is why it is called that. See sh(1) for details. It captures the standard output alone, and nothing else. It sets the $? variable to the 16-bit wait status word.
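For example, a small sketch of inspecting that status after a backtick call:
my $output    = `ls /tmp`;
my $exit_code = $? >> 8;   # the high byte of the wait status is the exit code
print "exit code: $exit_code\n";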
This is all explained in the perlop(1) manpage:
qx/STRING/
`STRING`
A string which is (possibly) interpolated and then
executed as a system command with /bin/sh or its
equivalent. Shell wildcards, pipes, and redirections
will be honored. The collected standard output of the
command is returned; standard error is unaffected. In
scalar context, it comes back as a single (potentially
multi-line) string, or undef if the command failed.
In list context, returns a list of lines (however
you’ve defined lines with $/ or
$INPUT_RECORD_SEPARATOR), or the empty list if the
command failed.
Because backticks do not affect standard error: use
shell file descriptor syntax (assuming the shell
supports this) if you care to address this. To
capture a command’s STDERR and STDOUT merged together:
$output = `cmd 2>&1`;
To capture a command’s STDOUT but discard its STDERR:
$output = `cmd 2>/dev/null`;
To capture a command’s STDERR but discard its STDOUT
(ordering is important here):
$output = `cmd 2>&1 1>/dev/null`;
To exchange a command’s STDOUT and STDERR in order to
capture the STDERR but leave its STDOUT to come out
the old STDERR:
$output = `cmd 3>&1 1>&2 2>&3 3>&-`;
To read both a command’s STDOUT and its STDERR
separately, it’s easiest to redirect them separately
to files, and then read from those files when the
program is done:
system("program args 1>program.stdout 2>program.stderr");
The STDIN filehandle used by the command is inherited
from Perl’s STDIN. For example:
open(BLAM, "blam") || die "$0: can't open blam: $!";
open (STDIN, "<&BLAM") || die "$0: can't dup BLAM: $!";
print `sort`;
will print the sorted contents of the file blam.
Using single-quote as the delimiter protects the command
from Perl’s double-quote interpolation, passing the contents on
to the shell instead:
$perl_info = qx(ps $$); # that's Perl's $$
$shell_info = qx'ps $$'; # that's the new shell's $$
How that string gets evaluated is entirely subject to
the command interpreter on your system. On most
platforms, you will have to protect shell
metacharacters if you want them treated literally.
This is in practice difficult to do, as it’s unclear
which characters need escaping, or how. See perlsec for a
clean and safe example of a manual fork and exec
to emulate backticks safely.
On some platforms (notably DOS-like ones), the shell
may not be capable of dealing with multiline commands,
so putting newlines in the string may not get you what
you want. You may be able to evaluate multiple
commands in a single line by separating them with the
command separator character, if your shell supports
that (e.g. ; on many Unix shells; & on the Windows
NT CMD.COM shell).
Beginning with v5.6.0, Perl attempts to flush all
files opened for output before starting the child
process, but this may not be supported on some
platforms (see perlport(1)). To be safe, you may need to
set $| ($AUTOFLUSH in English) or call the
autoflush method of IO::Handle on any open
handles.
Beware that some command shells may place restrictions
on the length of the command line. You must ensure
your strings don’t exceed this limit after any
necessary interpolations. See the platform-specific
release notes for more details about your particular
environment.
Using this operator can lead to programs that are
difficult to port, because the shell commands called
vary between systems, and may in fact not be present
at all. As one example, the type command under the
POSIX shell is very different from the type command
under DOS. That doesn't mean you should go out of
your way to avoid backticks when they’re the right way
to get something done. Perl was made to be a glue
language, and one of the things it glues together is
commands. Just understand what you’re getting
yourself into.
See I/O Operators for more discussion.
Here’s a simple example of using backticks to get the exit status of the first element in a pipeline:
$device = q(/dev/rmt8);
$dd_noise = q(^[0-9]+\+[0-9]+ records (in|out)$);
$status = `exec 3>&1; ((dd if=$device ibs=64k 2>&1 1>&3 3>&- 4>&-; echo $? >&4) | egrep -v "$dd_noise" 1>&2 3>&- 4>&-) 4>&1`;
EDIT
Well ok then, so maybe that wasn’t that simple an example. :) But this one is.
I’d like to recommend the Capture::Tiny CPAN module as a simpler way to manage the output from external commands that you would normally run using backquotes. It has advantages and disadvantages, but I feel that for many people, the advantages outweigh any arguable disadvantages.
The advantage is that you get to do all this without requiring deep knowledge of arcane mysteries of file-descriptor redirection the way the previous example did.
The disadvantage is it’s yet another non-core dependency — something else you have to install from CPAN.
That’s really not bad for what you get.
Here’s an example of how easy it is:
NAME
Capture::Tiny - Capture STDOUT and STDERR from Perl, XS, or external programs
SYNOPSIS
use Capture::Tiny qw/capture tee capture_merged tee_merged/;
($stdout, $stderr) = capture {
# your code here
};
($stdout, $stderr) = tee {
# your code here
};
$merged = capture_merged {
# your code here
};
$merged = tee_merged {
# your code here
};
DESCRIPTION
Capture::Tiny provides a simple, portable way to capture anything sent to STDOUT or STDERR, regardless of whether it comes from Perl, from XS code
or from an external program. Optionally, output can be teed so that it is captured while being passed through to the original handles. Yes, it
even works on Windows. Stop guessing which of a dozen capturing modules to use in any particular situation and just use this one.
There, isn’t that a whole lot easier?
The back-quote in Perl does much the same as the back-quote in shell - it runs a command and captures the standard output.
See also qx//.
I think the backtick lets you run commands and store their output in a variable:
$listing=`ls -1 /tmp/`;

Perl command line problem

I'm writing a Perl program that will take a few command-line arguments (they'll actually be supplied by another program) and open a pdf to a specific page. I based it off of here (Look at page 5). I've already tested the command straight from the command line, and it does exactly what I want it to do. Now I'm trying to do it from Perl, and it doesn't appear to be working. The error I get is:
The process tried to write to a nonexistent pipe
Here's the code... can someone tell me what I'm doing wrong?
#!C:/perl/bin/perl
use strict;
use warnings;
use diagnostics;
my $c = `cmd \c "`.$ARGV[0].`" /A "page=`.$ARGV[1].`=OpenActions" "`.$ARGV[2].``;
print $c;
system "Pause";
All I get after this is a blank space in cmd. Once I hit Ctrl+C, it returns to a prompt, and if I hit enter there, it gives me the above error.
When Perl sees
my $c = `cmd \c "`.$ARGV[0].`" /A "page=`.$ARGV[1].`=OpenActions" "`.$ARGV[2].``;
It turns it into something like this:
my $c = qx{cmd \c "}.$ARGV[0].qx{" /A "page=}.
$ARGV[1].qx{=OpenActions" "}.$ARGV[2].qx{};
Each of those qx{...} portions is executed by the command shell as it is encountered; most of them are probably syntax errors. Your full command is never run.
What you probably wanted was:
my $c = qx{cmd \\c "$ARGV[0]" /A "page=$ARGV[1]=OpenActions" "$ARGV[2]"};
Which constructs the string, and then passes it to the shell.
I think you're a little confused about how backticks work. Something like this:
my $c = `/where_is/pancake_house`;
will run the /where_is/pancake_house command and put whatever it prints on its standard output into $c. Backticks also interpolate like double-quoted strings. You'll also have to escape backslashes.
So, you don't want multiple sets of backticks in command and you don't need to paste things together like that. Something like this:
my $c = `cmd \\c "$ARGV[0]" /A "page=$ARGV[1]=OpenActions" "$ARGV[2]"`;
should be okay. Of course you'll still have issues if the ARGV values have spaces or quotes or other funny things. You'd probably want to use IPC::Open2 or IPC::Open3 for more safety.
Backtick doesn't work like you seem to think.
$foo = "foo";
`cmd \c `.$foo
This does not execute the command cmd \c foo. It executes the command cmd \c, takes the output, and concatenates the value of $foo to that output. You need to construct the entire command, and only then feed it to the backtick operator.
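A minimal sketch of the build-then-execute pattern (keeping the illustrative cmd \c invocation from above):
my $foo     = "foo";
my $command = "cmd \\c $foo";   # construct the entire command string first
my $output  = `$command`;       # then run it with a single backtick call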