help porting unix commands to perl script

I am getting some perl compile errors trying to convert these unix commands to perl.
The use of single quotes and double quotes is throwing me off (see below: my $curlcmd).
Here are the working Unix commands, executed in order:
export CERT=`/dev/bin/util --show dev.testaccnt | awk '{print $2}'`
/usr/bin/curl -c /home/foobar/cookee.txt --certify /dev/key.crt \
--header "FooBar-Util:'${CERT}'" \
https://devhost.foobar.com:4443/fs/workflow/data/source/productname?val=Summ
I want to do the same within Perl:
#Build cmd in perl
my $cookie='/home/foobar/cookee.txt';
my $certkey='/dev/key.crt';
my $fsProxyHostPort='devhost.foobar.com:4443';
my $fsPath='workflow/data/source/productname';
my $fsProxyOperation='Summ';
my $fsProxyURL="https://$fsProxyHostPort/fs/$fsPath?val=$fsProxyOperation";
#Get cert
my $cert=qx(/dev/bin/pass-util --show foobar.dev.testaccnt | awk '{print \$2}');
Here's where I am having trouble executing it:
my $curlcmd = qx(/usr/bin/curl -c $cookie --certify $certkey --header "FooBar-Util:'${" . $cert . "}'". $fsProxyURL);
Can someone show me how to set up these commands in Perl correctly?

In the shell script, you have (in part):
--header "FooBar-Util:'${CERT}'"
This generates something like:
--header FooBar-Util:'data-from-certificate'
where the curl command gets to see those single quotes. To get the same result in Perl, you will need:
my $header = "FooBar-Util:'$cert'";
my $out = qx(/usr/bin/curl -c $cookie --certify $certkey --header $header $fsProxyURL);
Changes:
Lost the ${ ... } notation.
Lost the concatenation operations.
In situations where you have problems seeing the argument list sent to a command, I recommend using a program analogous to the shell echo command, but which lists each argument on its own line, rather than as a space-separated set of arguments on a single line. I call my version of this al for 'argument list'. If you test your commands (for example, the shell version) by prefixing the whole command line with al, you get to see the arguments that curl would see. You can then do the same in Perl to compare the arguments curl sees at the shell with the ones given it by Perl. Then you can fix the problems, typically much more easily.
For debugging with al:
my @lines = qx(al /usr/bin/curl -c $cookie --certify $certkey --header $header $fsProxyURL);
foreach my $line (@lines) { print "$line"; }
If you want to write al in Perl:
#!/usr/bin/env perl
foreach my $arg (@ARGV) { print "$arg\n"; }
Adventures in Validating an Answer
Fortunately, I usually check what I write as answers - and what is written above is mostly accurate, except for one detail; Perl manages to invoke the shell on the command, and in doing so, the shell cleans out the single-quotes:
my $cert = 'certificate-info';
my $fsProxyURL = 'https://www.example.com/fsProxy';
my $cookie = 'cookie';
my $certkey = 'cert-key';
my $header = "FooBar-Util:'$cert'";
#my @out = qx(al /usr/bin/curl -c $cookie --certify $certkey --header $header $fsProxyURL);
my @cmdargs = ( 'al', '/usr/bin/curl', '-c', $cookie, '--certify', $certkey, '--header', $header, $fsProxyURL);
print "System:\n";
system @cmdargs;
print "\nQX command:\n";
my @lines = qx(@cmdargs);
foreach my $line (@lines) { print "$line"; }
This yields:
System:
/usr/bin/curl
-c
cookie
--certify
cert-key
--header
FooBar-Util:'certificate-info'
https://www.example.com/fsProxy
QX command:
/usr/bin/curl
-c
cookie
--certify
cert-key
--header
FooBar-Util:certificate-info
https://www.example.com/fsProxy
Note the difference in the FooBar-Util lines!
At this point, you start to wonder what's the least unclean way to work around this. If you want to use the qx// operator, then you probably do:
my $header = "FooBar-Util:\\'$cert\\'";
This yields the variant outputs (system then qx//):
FooBar-Util:\'certificate-info\'
FooBar-Util:'certificate-info'
So, the qx// notation now gives the single quotes to the executed command, just as in the shell script. This meets the goal, so I'm going to suggest that it is 'OK' for you; I'm just not sure that I'd actually adopt it in my own code, but I don't have a cleaner mechanism on hand. I'd like to be able to use the system plus 'array of arguments' mechanism, while still capturing the output, but I haven't checked whether there's a sensible (meaning relatively easy) way to do that.
One other passing comment; if any of your arguments contained spaces, you'd have to be excruciatingly careful with what gets passed through the shell. Having the al command available really pays off then. You can't identify which spaces in echo output are parts of one argument and which are separators provided by echo.

Is it on purpose that you have two different definitions of $cert?
Your translation of --header "FooBar-Util:'${CERT}'" is wrong. The ${...} tells the shell to insert the CERT variable, but since you're already doing this insertion from Perl, it is not needed and will just confuse things.
You're also missing a space before the $fsProxyURL.
As you're apparently not using the captured output from curl for anything, I would suggest that you use the system function instead, so you avoid an intermediate round of shell command-line parsing:
system "/usr/bin/curl","-c",$cookie,"--certify",$certTheFirst,
"--header","FooBar-Util:'$certTheSecond'", $fsProxyURL;
Finally, it's not very Perlish to use a subsidiary awk to split the pass-util value into fields when Perl does that kind of thing perfectly well. Once you solve the immediate error, I suggest:
my @passwordline = split / /, qx(/dev/bin/util --show dev.testaccnt);
my $certTheSecond = $passwordline[1];

This " . $cert . " seems to be a leftover from some other code, where string concatenation was used instead of interpolation. Removing the stray quotes and the concatenation dots (.) works on my machine.
So, to execute the command, do:
my $curlcmd = qx(/usr/bin/curl -c $cookie --certify $certkey --header "FooBar-Util:'$cert'" $fsProxyURL);
Following your comment, if you just want to print the command, you can do:
my $curlcmd = qq(/usr/bin/curl -c $cookie --certify $certkey --header "FooBar-Util:'$cert'" $fsProxyURL);
print $curlcmd;

Related

how to execute multiline shell command in perl?

I want to run the following shell command in Perl:
tar --exclude="*/node_modules" \
--exclude="*/vendor" \
--exclude='.git' \
-zvcf /tmp/robot.tgz .
But it seems Perl cannot execute this:
`tar --exclude="*/node_modules" \
--exclude="*/vendor" \
--exclude='.git' \
-zvcf /tmp/robot.tgz .`;
Here is the error:
tar: Must specify one of -c, -r, -t, -u, -x
sh: line 1: --exclude=*/vendor: No such file or directory
sh: line 2: --exclude=.git: command not found
sh: line 3: -zvcf: command not found
It seems Perl treats each line as a separate command.
Update
I apologise. My original diagnosis was wrong
It is hard to clearly express in a post like this the contents of strings that contain Perl escape characters. If anything below is unclear to you then please write a comment to say so. I hope I haven't made things unnecessarily complicated
My original solution below is still valid, and will give you better control over the contents of the command, but my reasons for why the OP's code doesn't work for them were wrong, and the truth offers other resolutions
The problem is that the contents of backticks (or qx/.../) are evaluated as a double-quoted string, which means that Perl variables and escape sequences like \t and \x20 are expanded before the string is executed. One of the consequences of this is that a backslash is deleted if it is followed by a literal newline, leaving just the newline
That means that a statement like this
my $output = `ls \
-l`;
will be preprocessed to "ls \n-l" and will no longer contain the backslash that is needed to signal to the shell that the newline should be removed (or indeed to get the command passed to the shell in the first place)
Apart from manipulating the command string directly as I described in my original post below, there are two solutions to this. The first is to escape the backslash itself by doubling it up, like this
my $output = `ls \\
-l`;
which will prevent it from being removed by Perl. That will pass the backslash-newline sequence to the shell, which will remove it as normal
The other is to use qx'...' instead of backticks together with single-quote delimiters, which will prevent the contents from being processed as a double-quoted string
my $output = qx'ls \
-l';
This will work fine unless you have used Perl variables in the string that you want to be interpolated
Original post
The problem is that the shell removes newlines preceded by backslashes from the command string before executing it. Without that step the command is invalid
So you must do the same thing yourself in Perl, and to do that you must put the command in a temporary variable
use strict;
use warnings 'all';
my $cmd = <<'END';
tar --exclude="*/node_modules" \
--exclude="*/vendor" \
--exclude='.git' \
-zvcf /tmp/robot.tgz .
END
$cmd =~ s/\\\n//g;
my $output = `$cmd`;
There is no need for the backslashes of course; you can simply use newlines and remove those before executing the command
Or you may prefer to wrap the operations in a subroutine, like this
use strict;
use warnings 'all';
my $output = do_command(<<'END');
tar --exclude="*/node_modules" \
--exclude="*/vendor" \
--exclude='.git' \
-zvcf /tmp/robot.tgz .
END
sub do_command {
my ($cmd) = @_;
$cmd =~ s/\\\n//g;
`$cmd`;
}

How to use both pipes and prevent shell expansion in perl system function?

If multiple arguments are passed to perl's system function then the shell expansion will not work:
# COMMAND
$ perl -e 'my $s="*"; system("echo", "$s" )'
# RESULT
*
If the command is passed as one argument, then the expansion will work:
# COMMAND
$ perl -e 'my $s="echo *"; system("$s")'
# RESULT
Desktop Documents Downloads
The system function also allows using multiple commands and connecting them with pipes. This only works when the argument is passed as one command:
# COMMAND
$ perl -e 'my $s="echo * | cat -n"; system("$s")'
# RESULT
1 Desktop Documents Downloads
How can I combine mentioned commands and use both pipes and prevent shell expansion?
I have tried:
# COMMAND
$ perl -e 'my $s="echo"; system("$s", "* | cat -n")'
# RESULT
* | cat -n
but this did not work because of reasons that I've described above (multiple arguments are not expanded). The result that I want is:
1 *
EDIT:
The problem that I'm actually facing is that when I use following command:
system("echo \"$email_message\" | mailx -s \"$email_subject\" $recipient");
Then the $email_message is expanded, and it will break mailx if it contains characters that are further expanded by the shell.
system has three calling conventions:
system($SHELL_CMD)
system($PROG, @ARGS)             # @ARGS>0
system( { $PROG } $NAME, @ARGS ) # @ARGS>=0
The first passes a command to the shell. It's equivalent to
system('/bin/sh', '-c', $SHELL_CMD)
The other two execute the program $PROG. system never prevents shell expansion or performs any escaping. There's simply no shell involved.
So your question is about building a shell command. If you were at the prompt, you might use
echo \* | cat -n
or
echo '*' | cat -n
to pass *. You need a function that performs the job of escaping * before interpolating it. Fortunately, one already exists: String::ShellQuote's shell_quote.
$ perl -e'
use String::ShellQuote qw( shell_quote );
my $s = "*";
my $cmd1 = shell_quote("printf", q{%s\n}, $s);
my $cmd2 = "cat -n";
my $cmd = "$cmd1 | $cmd2";
print("Executing <<$cmd>>\n");
system($cmd);
'
Executing <<printf '%s\n' '*' | cat -n>>
1 *
I used printf instead of echo since it's very hard to handle arguments starting with - in echo. Most programs accept -- to separate options from non-options, but not my echo.
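A small demonstration of the echo hazard mentioned above; printf with an explicit format string is immune to leading dashes:

```shell
# Asking echo to print the literal string "-n": it consumes it as an
# option instead and prints nothing at all.
bash -c 'echo -n'
# printf with an explicit format string passes it through untouched.
bash -c 'printf "%s\n" -n'        # prints: -n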
All these complications beg the question: Why are you shelling out to send an email? It's usually much harder to handle errors from external programs than from libraries.
You can use open to pipe directly to mailx, without your content being interpreted by the shell:
open( my $mail, "|-", "mailx", "-s", $email_subject, $recipient );
say $mail $email_message;
close $mail;
More details can be found in open section of perlipc.

What is a perl one liner to replace a substring in a string?

Suppose you've got this C shell script:
setenv MYVAR "-l os="\""redhat4.*"\"" -p -100"
setenv MYVAR `perl -pe "<perl>"`
Replace with code that will either replace "-p -100" with "-p -200" in MYVAR or add it if it doesn't exist, using a one liner if possible.
The topic doesn't correspond to the content, but I think it may be useful if someone posts an answer to the topic question. So here is the Perl one-liner:
echo "my_string" | perl -pe 's/my/your/g'
What you want will look something like
perl -e' \
use Getopt::Long qw( :config posix_default ); \
use String::ShellQuote; \
GetOptions(\my %opts, "l=s", "p=s") or die; \
my @opts; \
push @opts, "-l", $opts{l} if defined($opts{l}); \
push @opts, "-p", "-200"; \
print(shell_quote(@opts)); \
' -- $MYVAR
First, you need to parse the command line. That requires knowing the format of the arguments of the application for which they are destined.
For example, -n is an option in the following:
perl -n -e ...
Yet -n isn't an option in the following:
perl -e -n ...
Above, I used Getopt::Long in POSIX mode. You may need to adjust the settings or use an entirely different parser.
Second, you need to produce csh literals.
I've had bad experiences trying to work around csh's defective quoting, so I'll leave those details to you. Above, I used String::ShellQuote's shell_quote which produces sh (rather than csh) literals. You'll need to adjust.
Of course, once you got this far, you need to get the result back into the environment variable unmangled. I don't know if that's possible in csh. Again, I leave the csh quoting to you.

bash cgi won't return image

We have a monitoring system making RRD databases. I am looking for the lightest way of creating graphs from these RRD files for our HTML pages, so I don't want to store them in files. I am trying to create a simple Bash CGI script that will output image data, so I can do something like this:
<img src="/cgi-bin/graph.cgi?param1=abc"></img>
First of all, I am trying to create a simple CGI script that will send me a PNG image. This doesn't work:
#!/bin/bash
echo -e "Content-type: image/png\n\n"
cat image.png
But when I rewrite this in Perl, it does work:
#!/usr/bin/perl
print "Content-type: image/png\n\n";
open(IMG, "image.png");
print while <IMG>;
close(IMG);
exit 0;
What is the difference? I would really like to do this in BASH. Thank you.
Without the -n switch, echo outputs a third newline, so it should be:
echo -ne "Content-type: image/png\n\n"
or
echo -e "Content-type: image/png\n"
from man echo
-n do not output the trailing newline
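A quick byte count makes the diagnosis concrete (assuming bash's builtin echo, whose -e expands the escapes):

```shell
# The header is 23 characters. "echo -e" expands the two \n and then
# appends its own trailing newline, giving three newlines in total;
# adding -n (or dropping one \n) restores the required two.
bash -c 'echo -e  "Content-type: image/png\n\n"' | wc -c   # 26 bytes
bash -c 'echo -ne "Content-type: image/png\n\n"' | wc -c   # 25 bytes
```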

Perl Syntax Error: bareword found where operator expected

This is my Perl script code. In it I'm getting an error like "bareword found where operator expected" at $cmd:
my $path = $folder."/".$host."_".$database_name.".bak";
$cmd .= "-U $user_name -P $password -S $host -d master -Q "BACKUP DATABASE [$database_name] TO DISK = N'$path'" ";
Can anyone help me?
When a string has double quotes within it, you need to escape them with \.
$cmd .= "-U $user_name -P $password -S $host -d master -Q \"BACKUP DATABASE [$database_name] TO DISK = N'$path'\" ";
Also, Perl lets you use other characters for quote delimiters. qq followed by almost any character is the same as double quotes. So you could do things like this to avoid the need of backslashes:
$cmd .= qq(-U $user_name -P $password -S $host -d master -Q "BACKUP DATABASE [$database_name] TO DISK = N'$path'" );
$cmd .= qq|-U $user_name -P $password -S $host -d master -Q "BACKUP DATABASE [$database_name] TO DISK = N'$path'" |;
And so on...
Update: How to execute a system command in Perl. There are three basic ways:
system($cmd); #Goes through the shell if shell metacharacters are detected.
system(@command_and_args); #first element is the command, the rest are arguments
system executes a command and waits for it to return. The return value is the exit status of the program.
my @results = `$cmd`; #Always goes through the shell.
Backticks execute a command and return its output. You should only use this if you actually need the output; otherwise, it is better to go with system.
exec $cmd;
exec @command_and_args;
exec is exactly like system, except that it never returns. It effectively ends your program by calling another program.
Use the one that is most appropriate to your situation. Or in this case, since you are executing SQL, consider using the DBI module. It's definitely a better approach for anything more than a couple of simple commands.
Looks like you have your " characters in the wrong place. I'm not sure where they should be.
In your second line, the string literal:
"-U $user_name -P $password -S $host -d master -Q "
is immediately followed by the bareword
BACKUP