How to use both pipes and prevent shell expansion in Perl's system function? - perl

If multiple arguments are passed to Perl's system function, then shell expansion does not happen:
# COMMAND
$ perl -e 'my $s="*"; system("echo", "$s" )'
# RESULT
*
If the command is passed as one argument, then the expansion works:
# COMMAND
$ perl -e 'my $s="echo *"; system("$s")'
# RESULT
Desktop Documents Downloads
The system function also allows running multiple commands connected with pipes. This only works when the whole pipeline is passed as a single argument:
# COMMAND
$ perl -e 'my $s="echo * | cat -n"; system("$s")'
# RESULT
1 Desktop Documents Downloads
How can I combine the commands above, using pipes while preventing shell expansion?
I have tried:
# COMMAND
$ perl -e 'my $s="echo"; system("$s", "* | cat -n")'
# RESULT
* | cat -n
but this did not work, for the reasons described above (multiple arguments are not expanded). The result that I want is:
1 *
EDIT:
The problem that I'm actually facing is that when I use following command:
system("echo \"$email_message\" | mailx -s \"$email_subject\" $recipient");
Then $email_message is expanded by the shell, and it will break mailx if it contains characters that the shell treats specially.

system has three calling conventions:
system($SHELL_CMD)
system($PROG, @ARGS) # @ARGS > 0
system( { $PROG } $NAME, @ARGS ) # @ARGS >= 0
The first passes a command to the shell. It's equivalent to
system('/bin/sh', '-c', $SHELL_CMD)
The other two execute the program $PROG. system never prevents shell expansion or performs any escaping. There's simply no shell involved.
So your question is about building a shell command. If you were at the prompt, you might use
echo \* | cat -n
or
echo '*' | cat -n
to pass *. You need a function that performs the job of escaping * before interpolating it. Fortunately, one already exists: String::ShellQuote's shell_quote.
$ perl -e'
use String::ShellQuote qw( shell_quote );
my $s = "*";
my $cmd1 = shell_quote("printf", q{%s\n}, $s);
my $cmd2 = "cat -n";
my $cmd = "$cmd1 | $cmd2";
print("Executing <<$cmd>>\n");
system($cmd);
'
Executing <<printf '%s\n' '*' | cat -n>>
1 *
I used printf instead of echo since it's very hard to handle arguments starting with - in echo. Most programs accept -- to separate options from non-options, but not the echo on my system.
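The point is easy to demonstrate. Whether echo consumes a lone -n argument is implementation-dependent (many shells' built-in echo does), while printf has no such ambiguity once the value lands in an argument slot:

```shell
# Many echo implementations eat -n as an option and print nothing for it:
echo '-n'
# printf always treats values in argument slots as data:
printf '%s\n' '-n'
# prints: -n
```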
All these complications raise the question: why are you shelling out to send an email? It's usually much harder to handle errors from external programs than from libraries.

You can use open to pipe directly to mailx, without your content being interpreted by the shell:
open( my $mail, "|-", "mailx", "-s", $email_subject, $recipient )
    or die "Cannot fork: $!";
say $mail $email_message;   # requires: use feature 'say';
close $mail;
More details can be found in the open section of perlipc.

Related

awk command in Perl's system does not work

I am writing a small Perl script that executes an awk command. I try to swap two columns in a file; the file is like this:
domain1,ip1
domain2,ip2
domain3,ip3
the result should be
ip1,domain1
ip2,domain2
ip3,domain3
The Perl command invoking awk is like this:
system("ssh -p 22 root\@$mainip 'awk -F, '{print $2,$1}' OFS=, /root/archive/ipdomain.txt > /root/ipdom.txt'");
This is the error I get :
awk: cmd. line:1: {print
awk: cmd. line:1: ^ unexpected newline or end of string
any suggestions, please?
With the layered commands and all that multi-level quoting and escaping that needs to be done right,† no wonder it fails. A complex command like that will always be tricky, but libraries help a lot.
A properly quoted string to run through a shell can be formed with String::ShellQuote ‡
use warnings;
use strict;
use feature 'say';
use String::ShellQuote qw(shell_quote);
die "Usage: $0 file outfile\n" if @ARGV != 2;
my ($file, $out) = @ARGV;
my @cmd_words =
    ( 'ssh', 'hostname', 'awk', q('{print $2 $1}'), $file, '>', $out );
my $cmd = shell_quote @cmd_words;
system($cmd);
Note how the q() operator form of single quotes enables us to pass literal single quotes nicely.
This swaps the first two words on each line of a file and prints them, using awk, and redirects the output to a file, on a remote host. It works as expected in my tests (with a real hostname). Please adjust as needed.
Another possible improvement would be to use a library for ssh, like Net::OpenSSH.
A complete command, like the one in the question, to use in the above program:
my @cmd_words = (
    'ssh', '-p', '22', "root\@$mainip",
    'awk', '-F,', q('{print $2,$1}'), 'OFS=,', $file, '>', $out );
Tested with a file from the question.
The makeVoiceBot answer is informative and gets halfway there, but I find it needs
system("ssh hostname \"awk '{print \\\$2 \\\$1}' $path\"");
This works in my tests (on systems I ssh to). I try to avoid needing to deal with such quoting and escaping.
† This is a shell command which runs ssh, and then executes a command on the remote system which runs a shell (there) as well, in order to run awk and redirect its output to a file.
A bit more than an "awk command" as the title says.
‡ The library can prepare a command for bash (as of this writing), but one can look at the source for it and adjust it for their own shell, at least. There is also Win32::ShellQuote
I am using a shortened example here
system("ssh localhost 'awk '{print $2,$1}' file.txt'")
system() sees:
ssh localhost 'awk '{print $2,$1}' file.txt'
local shell expands:
ssh
localhost
awk
{print
$2,$1}
file.txt
local shell replaces $1 and $2 (positional args) with empty strings:
ssh
localhost
awk
{print
,}
file.txt
ssh executes:
ssh localhost awk {print ,} file.txt
remote shell gets:
awk
{print
,}
file.txt
So the remote shell runs awk with {print as its program argument, resulting in the described error. To prevent this, the invocation of system() can be changed to:
system("ssh localhost \"awk '{print \$2,\$1}' file.txt\"")
system() sees:
ssh localhost "awk '{print \$2,\$1}' file.txt"
local shell expands:
ssh
localhost
awk '{print \$2,\$1}' file.txt
ssh executes
ssh localhost awk '{print \$2,\$1}' file.txt
remote shell gets
awk
{print \$2,\$1}
file.txt
remote shell expands \ escapes
awk
{print $2,$1}
file.txt
Remote awk now gets {print $2,$1} as its program argument, and executes successfully.

xargs pass multiple arguments to perl subroutine?

I know how to pipe multiple arguments with xargs:
echo a b | xargs -l bash -c 'echo 1:$0 2:$1'
and I know how to pass the array of arguments to my perl module's subroutine from xargs:
echo a b | xargs --replace={} perl -I/home/me/module.pm -Mme -e 'me::someSub("{}")'
But I can't seem to get multiple individual arguments passed to perl using those dollar references (to satisfy the me::someSub signature):
echo a b | xargs -l perl -e 'print("$0 $1")'
Just prints:
-e
So how do I get the shell arguments: $0, $1 passed to my perl module's subroutine?
I know I could just delimit the input as a;b so that the xargs {} could be split by Perl to get individual arguments, but I could also just completely process all STDIN with Perl. Instead, my objective is to use perl -e so that I can explicitly call the subroutine I want (rather than having some pre-process in the script that figures out what subroutine to call and what arguments to use based on STDIN), to avoid script maintenance costs.
While bash's arguments are available as $0, $1, $2, etc. (and their count as $#), Perl's arguments are available via @ARGV. This means that the Perl equivalent of
echo a b | xargs -l bash -c 'echo "1:$0 2:$1"'
is
echo a b | xargs -l perl -e'CORE::say "1:$ARGV[0] 2:$ARGV[1]"'
That said, it doesn't make sense to use xargs in this way because there's no way to predict how many times it will call perl, and there's no way to predict how many arguments it will pass to perl each time. You have an XY Problem, and you haven't provided any information to help us. Maybe you're looking for
perl -e'CORE::say "1:$ARGV[0] 2:$ARGV[1]"' $( echo a b )
I am not sure about the details of your design, so I take it that you need a Perl one-liner to use the shell's variables that are visible in the scope in which it's called.
A perl -e'...' executes the Perl program given inside the quotes. For any variables from the environment where this program runs -- a pipeline, or a shell script -- to be available to the program, their values need to be passed to it. Ways to do this with a one-liner are spelled out in this post, and here is a summary.
A Perl program receives arguments passed to it on the command line in the @ARGV array. So you can invoke it in a pipeline as
... | perl -e'($v1, $v2) = @ARGV; ...' "$0" "$1"
or as
... | xargs -l perl -e'($v1, $v2) = @ARGV; ...'
if xargs is indeed used to feed the Perl program its input. In the first example the variables are quoted to protect possible interesting characters in them (spaces, *, etc) from being interpreted by the shell that sets up and runs the perl program.
If input contains multiple lines to process and the one-liner uses -n or -p for it then unpack arguments in a BEGIN block
... | perl -ne'BEGIN { ($v1, $v2) = splice(@ARGV,0,2) }; ...' "$0" "$1" ...
which runs at compile time, so before the loop over input lines provided by -n/-p. The arguments other than filenames are now removed from @ARGV, leaving only the filenames there for -n/-p, in case input comes from files.
There is also a rudimentary mechanism for command-line switches in a one-liner, via the -s switch. Please see the link above for details; I'd recommend #ARGV over this.
Finally, your calling code could set up environment variables, which are then available to the Perl program in %ENV. However, that doesn't seem to be suitable for what you seem to want.
Also see this post for another example.

running a shell command that is quoted

In my Perl program I get to a point where I have a variable that has the following:
echo -e \"use\nseveral\nlines\"
I would like to run this command through the shell (using exec) as
echo -e "use\nseveral\nlines"
I tried eval on the variable before I passed it to exec, but that interpreted the \n and changed them to newlines.
EDIT:
Note that I am given the variable and do not have control over how it is input. Thus, given that the variable was input as such, is there a way to "unquote" it?
In Perl, you should use q{} or qq{} (or qx{} for execution) to avoid complicated quote escaping.
This should work for you (using q{} to avoid interpolating \n):
my $str = q{echo -e "use\nseveral\nlines"};
Now, you can execute it using qx:
qx{$str}
When you pass
echo -e \"use\nseveral\nlines\"
to the shell, it passes the following three args to the exec systems call:
echo
-e
use\nseveral\nlines
How does one create that last string? Here are a few ways:
"use\\nseveral\\nlines" # Escape \W chars in double-quoted strings.
'use\\nseveral\\nlines' # Escape \ and delimiters in single-quoted strings
'use\nseveral\nlines' # ...although it's optional if unambiguous.
The corresponding Perl command would therefore be
exec('echo', '-e', 'use\nseveral\nlines');
system('echo', '-e', 'use\nseveral\nlines');
open(my $fh, '-|', 'echo', '-e', 'use\nseveral\nlines');
my #output = <$fh>;

How can I store the result of a system command in a Perl variable?

$ cat test.pl
my $pid = 5892;
my $not = system("top -H -p $pid -n 1 | grep myprocess | wc -l");
print "not = $not\n";
$ perl test.pl
11
not = 0
$
I want to capture the result i.e. 11 into a variable. How can I do that?
From perlfaq8:
You're confusing the purpose of system() and backticks (``). system() runs a command and returns exit status information (as a 16 bit value: the low 7 bits are the signal the process died from, if any, and the high 8 bits are the actual exit value). Backticks (``) run a command and return what it sent to STDOUT.
$exit_status = system("mail-users");
$output_string = `ls`;
There are many ways to execute external commands from Perl. The most common, with their meanings, are:
system(): you want to execute a command and don't want to capture its output
exec: you don't want to return to the calling Perl script
backticks: you want to capture the output of the command
open: you want to pipe the command (as input or output) to your script
Also see How can I capture STDERR from an external command?
The easiest way is to use the `` feature in Perl. This will execute what is inside and return what was printed to stdout:
my $pid = 5892;
my $var = `top -H -p $pid -n 1 | grep myprocess | wc -l`;
print "not = $var\n";
This should do it.
Try using qx{command} rather than backticks. To me it's a bit better: you can build SQL with it and not worry about escaping quotes and such; depending on the editor and screen, my old eyes tend to miss the tiny backticks; and it shouldn't ever have an issue with being overloaded, like angle brackets versus glob.
Using backticks or qx helps; thanks everybody for the answers. However, I found that the output contains a trailing newline that I need to remove, so I used chomp.
chomp($host = `hostname`);
chomp($domain = `domainname`);
$fqdn = $host.".".$domain;
More information here:
http://irouble.blogspot.in/2011/04/perl-chomp-backticks.html
Use backticks for system commands; that way you can store their results in Perl variables.
my $pid = 5892;
my $not = `top -H -p $pid -n 1 | grep myprocess | wc -l`;
print "not = $not\n";
You can also use, for example, IPC::Run:
use IPC::Run qw(run);
my $pid = 5892;
run [qw(top -H -n 1 -p), $pid],
    '|', sub { print grep { /myprocess/ } <STDIN> },
    '|', [qw(wc -l)],
    '>', \my $out;
print $out;
Advantages: the processes run without a bash subprocess, output can be piped to Perl subs, and the syntax is very similar to the shell's.

How can I grep for a value from a shell variable?

I've been trying to grep for an exact shell variable using word boundaries,
grep "\<$variable\>" file.txt
but haven't managed to; I've tried everything else but haven't succeeded.
Actually I'm invoking grep from a Perl script:
$attrval=`/usr/bin/grep "\<$_[0]\>" $upgradetmpdir/fullConfiguration.txt`
$_[0] and $upgradetmpdir/fullConfiguration.txt contain some matching "text".
But $attrval is empty after the operation.
@OP, you should do that 'grepping' in Perl. Don't call system commands unnecessarily unless there is no choice.
my $mysearch = "pattern";
while (<>) {
    chomp;
    my @s = split /\s+/;
    foreach my $line (@s) {
        if ($line eq $mysearch) {
            print "found: $line\n";
        }
    }
}
I'm not seeing the problem here:
file.txt:
hello
hi
anotherline
Now,
mala@human ~ $ export GREPVAR="hi"
mala@human ~ $ echo $GREPVAR
hi
mala@human ~ $ grep "\<$GREPVAR\>" file.txt
hi
What exactly isn't working for you?
Not every grep supports the ex(1) / vi(1) word boundary syntax.
I think I would just do:
grep -w "$variable" ...
Using single quotes works for me in tcsh:
grep '<$variable>' file.txt
I am assuming your input file contains the literal string: <$variable>
If variable=foo are you trying to grep for "foo"? If so, it works for me. If you're trying to grep for the variable named "$variable", then change the quotes to single quotes.
On a recent Linux it works as expected. You could try egrep instead.
Say you have
$ cat file.txt
This line has $variable
DO NOT PRINT ME! $variableNope
$variable also
Then with the following program
#! /usr/bin/perl -l
use warnings;
use strict;
system("grep", "-P", '\$variable\b', "file.txt") == 0
or warn "$0: grep exited " . ($? >> 8);
you'd get output of
This line has $variable
$variable also
It uses the -P switch to GNU grep that matches Perl regular expressions. The feature is still experimental, so proceed with care.
Also note the use of system LIST that bypasses shell quoting, allowing the program to specify arguments with Perl's quoting rules rather than the shell's.
You could use the -w (or --word-regexp) switch, as in
system("grep", "-w", '\$variable', "file.txt") == 0
or warn "$0: grep exited " . ($? >> 8);
to get the same result.
Using single quotes it won't work; you should use double quotes.
For example:
This won't work:
--------------
for i in 1
do
grep '$i' file
done
This will work:
--------------
for i in 1
do
grep "$i" file
done