I have a Perl script progA.pl which needs to run another Perl script progB.pl using the system command. However, progB.pl has been aliased in ~/.bashrc so I need to ensure that it is run after ~/.bashrc has been loaded. I can achieve this by using bash with the -lc option.
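For reference, the call has roughly this shape (a sketch; the option value here is just a placeholder):
system(q{bash -lc 'progB.pl --opt=value'});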
For this question, I simplify the problem as much as I think is needed, by considering the following version of progB.pl
use feature qw(say);
use strict;
use warnings;
use Data::Dump qw(dd dump);
say "Received \#ARGV: " . dump #ARGV;
and here is progA.pl:
use feature qw(say);
use strict;
use warnings;
use Data::Dump qw(dd dump);
my $cmd = qq(progB.pl --opt='This option contains '"'"'single'"'"' quotes');
say "cmd = " . dump($cmd);
system( "$cmd" );
say "-----";
system( 'bash -c ' . "$cmd" );
say "-----";
system( 'bash -c ' . "'$cmd'" );
say "-----";
system( "bash -c \"$cmd\"" );
Running
$ progA.pl
gives output:
cmd = "progB.pl --opt='This option contains '\"'\"'single'\"'\"' quotes'"
Received #ARGV: "--opt=This option contains 'single' quotes"
-----
Received #ARGV: ()
-----
Received #ARGV: "--opt=This"
-----
Received #ARGV: "--opt=This option contains single quotes"
We see that this works fine when progB.pl is run directly, without using bash -c. When I use bash -c to run the command, none of the three alternatives works correctly.
How can I run progB.pl with an argument containing single quotes while at the same time using bash -c?
You should avoid this quoting madness in the first place, but if you insist, you can at least remove one level of quoting by using the LIST form of system.
my $cmd = q{progB.pl --opt='This option contains '"'"'single'"'"' quotes'};
system( qw(bash -c), $cmd );
That leaves only one level of quoting madness:
my $option = q{This option contains 'single' quotes} =~ s/'/'"'"'/gr; # '
my $cmd = qq{progB.pl --opt='$option'};
system( qw(bash -c), $cmd );
From there you can write a simple helper:
sub sq ($) { "'" . $_[0] =~ s/'/'"'"'/gr . "'" } # "
my $option = q{This option contains 'single' quotes};
my $cmd = qq{progB.pl --opt=@{[sq $option]}};
system( qw(bash -c), $cmd );
After some trial and error, I arrived at:
use feature qw(say);
use strict;
use warnings;
my $cmd = qq(print_first_arg.pl --opt='This option contains '"'"'single'"'"' quotes');
$cmd =~ s/'/'"'"'/g;
system( 'bash -c ' . "'$cmd'" );
It seems to work, for this test case at least.
This also follows the approach suggested by @ysth in this answer:
https://stackoverflow.com/a/24869016/2173773
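For completeness, the same quoting can also be generated with the String::ShellQuote module (a sketch, assuming that module is installed; I have not used it in the scripts above):
use String::ShellQuote qw(shell_quote);
# shell_quote() quotes each word so that bash -c splits the string back
# into exactly these arguments, embedded single quotes included.
my $inner = shell_quote('progB.pl', q{--opt=This option contains 'single' quotes});
system('bash', '-c', $inner);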
Related
I use IPC::Run and I want to run a command, for example:
my @cmd = ("C:/test.cmd", "key=value");
IPC::Run::run \@cmd, '>', "C:\\log" or die "Failed running\n";
But what really runs is the following command: C:/test.cmd key value
Why does IPC::Run split the parameter containing "=" (key=value) into two parameters, key and value?
From help cmd:
The special characters that require quotes are:
<space>
&()[]{}^=;!'+,`~
Use quotes:
my #cmd = ("C:/test.cmd", "\"key=value\"");
I have this code:
$script = "console.log(\"It works!\");";
$output = qx/ssh user@123.123.123.123 $script | interpreter/;
It's supposed to run $script through interpreter and write it into $output. The problem is that it doesn't work. How do I escape the characters correctly?
Think about what you're trying to do just with ssh. Both of these produce the same output, but work differently:
ssh user#host 'echo "puts 2+2" | ruby'
echo "puts 2+2" | ssh user#host ruby
In the first, the remote shell is executing the pipeline. (If you don't have those single quotes, what happens?) In the second, it's piped through your local shell to the ssh command and the interpreter launched.
Rather than perform convoluted escapes of code to come out correctly when crammed through sh, I prefer to pipe the text in through stdin. It's just simpler.
Using IPC::Run to do all the heavy lifting:
#!/usr/bin/env perl
use strict;
use warnings;
use IPC::Run qw(run);
my $in = <<FFFF;
2 + 2
8 * (2 + 3)
FFFF
my $cmd = [qw(ssh user@host perl calc.pl)];
run $cmd, \$in, \my $out, \my $err or die "ssh: $?";
print "out: $out";
print "err: $err";
(calc.pl is a simple infix calculator program that I had lying around)
Here's the output I get, running that:
out: 4
40
err: (SSH banners and pointless error messages)
Seeing system or qx// calls in a Perl script is a sign of trouble. In general, I don't like having to think about shell syntax or shell quoting when I'm not writing shell; it has nothing to do with the problem I'm trying to solve, nor with the tool I'm solving it with. (And that's ignoring any security implications.)
Oh, and if you don't have to muck with standard input but still want to execute and capture output from other programs, there's always IPC::System::Simple.
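A minimal sketch of that, assuming IPC::System::Simple is installed (the ls invocation is just an example):
use IPC::System::Simple qw(capturex);
# capturex() never involves the shell and dies with a useful message on failure.
my @lines = capturex('ls', '-l', '/tmp');
print @lines;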
Since you're using Perl, you should do it in Perl, and not a call out to an external command.
Have you tried the Perl module Net::SSH::Perl?
I would also use qq instead of quotation marks when setting the value of $script. Using qq removes the whole how-do-I-quote-quotes mess. Whatever character comes after qq is your string delimiter. All of these are valid:
my $string = qq/This is a "string" with a quote/;
my $string = qq|This is a "string" with a quote|;
my $string = qq*This is a "string" with a quote*;
Special matching quote operators are ( and ), [ and ], and { and }:
my $string = qq(This (is (a "string" with) a) quote);
Note that I can use parentheses as my string delimiters even though my string has parentheses in it. This is okay as long as those parentheses are balanced. This one wouldn't work:
my $string = qq(This is an "unbalanced" parentheses ) that breaks this statement);
But, then I can switch to square brackets or curly braces:
my $string = qq[This is an "unbalanced" parentheses ) but still works.];
Here's a Perl version of your program:
use strict; #Always use!
use warnings; #Always use!
use Net::SSH::Perl;
#
# Use Constants to set things that are ...well... constant
#
use constant {
HOST => "123.123.123.123",
USER => "user",
};
my $script = qq[console.log("It works!");];
my $connection = Net::SSH::Perl->new(HOST);
$connection->login(USER);
my ($output, $stderr, $exit_code) = $connection->cmd($script);
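You can then check the exit code and use the captured output, continuing the script above:
# cmd() returned the remote stdout, stderr, and exit status above.
die "Remote command failed (exit $exit_code): $stderr" if $exit_code;
print $output;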
use Net::OpenSSH:
my $ssh = Net::OpenSSH->new('user@123.123.123.123');
my $output = $ssh->capture({stdin_data => 'console.log("It works!");'},
'interpreter');
When I use the following command on the command line, it gives the list of non-integrated changelists:
p4 interchanges -t $branch1@$date1,@$date2 $branch2 > changes.txt
But when I use this command in a Perl script as below, it gives no output:
$cmd = system ("p4 interchanges -t $branch1@$date1,@$date2 $branch2 > changes.txt");
The message on the command line is an error like the one below:
branch1, - all revision(s) already integrated.
Is the issue caused by the comma used between date1 and date2? How can I use this command in a Perl script?
This is why it is so important to turn on the strict and warnings pragmas. The string "@$date1" does not mean what you think it does. It is trying to dereference $date1 as an array. Because strict isn't on, it treats the contents of $date1 as a symbolic reference. If you had turned on strict, you would have seen an error message like:
Can't use string ("2010-08-30") as an ARRAY ref while "strict refs" in use at script.pl line 10.
You should probably say this instead:
system "p4 interchanges -t $branch1\#$date1,\#$date2 $branch2 > changes.txt";
if ($?) {
die "saw exit code: ", $? >> 8;
}
You may also have a problem if you expect $branch1, $date1, etc. to be shell variables instead of Perl variables. In that case you should say:
system "p4 interchanges -t $ENV{branch1}\#$ENV{date1},\#$ENV{date2} $ENV{branch2} > changes.txt";
if ($?) {
die "saw exit code: ", $? >> 8;
}
If you're going to be doing a lot of Perforce work with Perl, try P4Perl, which wraps Perforce in a Perl-native API.
Cribbing from the documentation, your system() call could be implemented as:
use P4;
my $p4 = new P4;
$p4->SetClient( $clientname );
$p4->SetPort ( $p4port );
$p4->SetPassword( $p4password );
$p4->Connect() or die( "Failed to connect to Perforce Server" );
my $c = $p4->Run( "interchanges", "-t", $branch1, "#".$date1, "#".$date2, $branch2 );
$c will contain an array reference with each of the unintegrated changelists.
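For example, to inspect what came back (a sketch; Data::Dumper is used here rather than assuming a particular field layout for each changelist):
use Data::Dumper;
# $c is the array reference returned by Run() above.
for my $changelist ( @$c ) {
    print Dumper($changelist);
}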
I noticed that when I use backticks in Perl the commands are executed using sh, not bash, which gives me some problems.
How can I change that behavior so perl will use bash?
PS. The command that I'm trying to run is:
paste filename <(cut -d \" \" -f 2 filename2 | grep -v mean) >> filename3
The "system shell" is not generally mutable. See perldoc -f exec:
If there is more than one argument in LIST, or if LIST is an array with more than one value, calls execvp(3) with the arguments in LIST. If there is only one scalar argument or an array with one element in it, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is "/bin/sh -c" on Unix platforms, but varies on other platforms).
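In other words (a sketch; echo is only a stand-in command):
system('echo $HOME');       # one string with metacharacters: runs via /bin/sh -c, prints your home directory
system('echo', '$HOME');    # list form: execvp() directly, no shell, prints the literal string $HOME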
If you really need bash to perform a particular task, consider calling it explicitly:
my $result = `/usr/bin/bash command arguments`;
or even:
open my $bash_handle, '| /usr/bin/bash' or die "Cannot open bash: $!";
print $bash_handle 'command arguments';
You could also put your bash commands into a .sh file and invoke that directly:
my $result = `/usr/bin/bash script.sh`;
Try
`bash -c \"your command with args\"`
I am fairly sure the argument of -c is interpreted the way bash interprets its command line. The trick is to protect it from sh - that's what quotes are for.
This example works for me:
$ perl -e 'print `/bin/bash -c "echo <(pwd)"`'
/dev/fd/63
To deal with running bash and nested quotes, this article provides the best solution: How can I use bash syntax in Perl's system()?
my #args = ( "bash", "-c", "diff <(ls -l) <(ls -al)" );
system(#args);
I thought Perl would honor the $SHELL variable, but then it occurred to me that its behavior might actually depend on your system's exec implementation. In mine, it seems that exec will execute the shell (/bin/sh) with the path of the file as its first argument.
You can always do qw/bash your-command/, no?
Create a perl subroutine:
sub bash { return `cat << 'EOF' | /bin/bash\n$_[0]\nEOF\n`; }
And use it like below:
my $bash_cmd = 'paste filename <(cut -d " " -f 2 filename2 | grep -v mean) >> filename3';
print &bash($bash_cmd);
Or use perl here-doc for multi-line commands:
$bash_cmd = <<'EOF';
for (( i = 0; i < 10; i++ )); do
echo "${i}"
done
EOF
print &bash($bash_cmd);
I like to define a couple of helper functions: btck (which integrates error checking) and bash_btck (which uses bash):
use Carp;
sub btck ($)
{
# Like backticks but the error check and chomp() are integrated
my $cmd = shift;
my $result = `$cmd`;
$? == 0 or confess "backtick command '$cmd' returned non-zero";
chomp($result);
return $result;
}
sub bash_btck ($)
{
# Like backticks but use bash and the error check and chomp() are
# integrated
my $cmd = shift;
my $sqpc = $cmd; # Single-Quote-Protected Command
$sqpc =~ s/'/'"'"'/g;
my $bc = "bash -c '$sqpc'";
return btck($bc);
}
One of the reasons I like to use bash is for safe pipe behavior:
sub safe_btck ($)
{
return bash_btck('set -o pipefail && '.shift);
}
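For example, with pipefail in effect a failure in any stage of the pipeline is reported, not just in the last one (the directory here is just a placeholder):
# If ls fails (say, the directory does not exist), safe_btck() now confesses,
# even though head at the end of the pipe exits successfully.
my $newest = safe_btck('ls -t /some/dir | head -1');
print "$newest\n";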
How can I use bash syntax in Perl's system() command?
I have a command that is bash-specific, e.g. the following, which uses bash's process substitution:
diff <(ls -l) <(ls -al)
I would like to call it from Perl, using
system("diff <(ls -l) <(ls -al)")
but it gives me an error because it's using sh instead of bash to execute the command:
sh: -c: line 0: syntax error near unexpected token `('
sh: -c: line 0: `sort <(ls)'
Tell Perl to invoke bash directly. Use the list variant of system() to reduce the complexity of your quoting:
my #args = ( "bash", "-c", "diff <(ls -l) <(ls -al)" );
system(#args);
You may even define a subroutine if you plan on doing this often enough:
sub system_bash {
my #args = ( "bash", "-c", shift );
system(#args);
}
system_bash('echo $SHELL');
system_bash('diff <(ls -l) <(ls -al)');
system("bash -c 'diff <(ls -l) <(ls -al)'")
should do it, in theory. Bash's -c option allows you to pass a shell command to execute, according to the man page.
The problem with vladr's answers is that system won't capture the output to STDOUT from the command (which you would usually want), and it also doesn't allow executing more than one command (given the use of shift rather than accessing the full contents of @_).
Something like the following might be more suited to the problem:
my @cmd = ( 'diff <(ls -l) <(ls -al)', 'grep fu' );
my @stdout = exec_cmd( @cmd );
print join( "\n", @stdout );
sub exec_cmd
{
my $cmd_str = join( ' | ', @_ );
my @result = qx( bash -c '$cmd_str' );
die "Failed to exec $cmd_str: $!" unless( $? == 0 && @result );
return @result;
}
Unfortunately this won't prevent you from invoking /bin/sh just to run bash, however I don't see a workaround for this issue.
I prefer to execute bash commands in Perl with backticks (`). This way I get a return value, e.g.:
my $value = `ls`;
Also, I don't have to use "bash -c" just to run a command. Works for me.
Inspired by the answer from @errant.info, I created something simpler that worked for me:
my $line="blastn -subject <(echo -e \"$seq1\") -query <(echo -e \"$seq2\") -outfmt 6";
my $result=qx(bash -c '$line');
print "$result\n";
The introduced $line variable allows the inputs ($seq1 and $seq2) to be changed each time.
Hope it helps!