What is the Perl equivalent to the shell's "$@"? - perl

I'm working on a Perl script that I was hoping to capture a string entered on the command line without having to enter the quotes (similar to bash's "$@" ability). I'll be using this command quite a bit so I was hoping that this is possible. If I have:
if ($ARGV) {
I have to put the command line string in quotes. I'd rather do the command something like this:
htmlencode some & HTML <> entities
Without the quotes. Is there a way to do this in Perl?

The @ARGV array contains the arguments to the Perl script - no quotes needed.
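For the stated goal - treating everything typed after the script name as one string - interpolating the array is usually enough. A minimal sketch (with the caveat, expanded on below, that the shell still interprets unquoted & and <> before Perl ever sees them):
my $text = "@ARGV";    # the arguments, joined with single spaces
print "You typed: $text\n";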
That said, the question asks about:
I have to put the command line string in quotes. I'd rather do the command something like this:
htmlencode some & HTML <> entities
Without the quotes. Is there a way to do this in Perl?
Well, if the command shown is written at the shell command line, you have to obey shell conventions - which means escaping the '&' and '<>' to prevent the shell from interpreting them. Likewise, within a Perl script, that sequence would need to be protected from Perl. Maybe you'd write:
system "htmlencode", "some", "&", "HTML", "<>", "entities";
That is, everything would have to be in quotes - but that notation would avoid executing the shell and having the shell interpret the commands.
Alternatively again, if you put the arguments into an array (with quotes at the time the array is loaded), then you could pass that array to system and not use any quotes:
my @args = ( "htmlencode", "some", "&", "HTML", "<>", "entities" );
system @args;
But I think the question is confused.

You put quotes around $@ in bash so that it expands to have each element in the array quoted. The reason to do that is so that each element of the array continues to be treated as a single argument when you pass them all to the next command.
The analogue to that in Perl is when you want to pass those parameters to another external command. If you're running the external program with the backtick operators, then you'd need to quote each parameter, but if you use system, then Perl will take care of keeping all the parameters separate for you.
In fact, separate parameters are the way programs are executed on Unix anyway. The single-string command-line format is there because we need to be able to type things at the command prompt. Like all shells, bash has special rules about how to split that single string into multiple arguments. But if you already have them separated in a Perl array, don't put them back into a single string. Keep them separate.
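For example, here is a sketch that passes this script's own arguments straight through to the asker's (hypothetical) htmlencode program, with no re-quoting:
# List form of system: each element stays a separate argument; no shell is run.
system('htmlencode', @ARGV) == 0
    or die "htmlencode failed: $?";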

Related

Perl output shell-escaped string

I'm trying to use a perl one-liner to turn print0 output into quoted shell parameters, kind of like the trick that's something like .. | xargs -0 printf "%q" {} but I didn't want to require bash (whose printf implements %q). I was kind of amazed to, well, not find an easy way to do this in perl. For all of perl's quoting mechanisms, there's no way I saw for producing quoted strings. Surely I just haven't looked hard enough.
Hopefully the answer isn't a regular expression. Quoting an elaborate regular expression to put into a shell command-line is not my idea of fun (if only a simple perl program could quote it for me, oh back to the same problem).
You can roll your own quoting for POSIX-like shells fairly simply - no complicated regexes needed (just straightforward string substitution using literals):
$ echo "I'm \$HOME. 3\" of rain." | perl -lne "s/'/'\\\''/g; print q{'} . \$_ . q{'}"
'I'\''m $HOME. 3" of rain.'
The approach is modeled after AppleScript's quoted form of command:
The input string is broken into substrings by ', each substring is itself '-enclosed, with the original ' chars. spliced between the substrings as \' (an individually quoted ').
When passed to the shell, the shell rebuilds these parts into a single, literal string.
This multi-part string-concatenation approach is necessary, because POSIX-like shells categorically do not allow embedding ' itself inside single-quoted strings (there's not even an escape sequence).
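As a self-contained sketch of that rule (the sub name shell_quote is just an illustrative choice here, not a standard function):
sub shell_quote {
    my ($s) = @_;
    $s =~ s/'/'\\''/g;    # close the quote, emit an escaped ', reopen the quote
    return "'$s'";
}
print shell_quote(q{I'm $HOME. 3" of rain.}), "\n";
# prints: 'I'\''m $HOME. 3" of rain.'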
Alternatively, you can install a CPAN module such as String::ShellQuote.
Optional background information
While it would be handy for Perl itself to support such a quoting mechanism for piecing together shell commands stored in a single string to pass to qx// (`...`), such a mechanism would have to operate platform-specifically.
Notably, quoting rules for Windows are very different from rules for Unix platforms, and except for simple cases shell commands as a whole will be incompatible too.
From inside Perl, you may be able to bypass the need for quoting altogether by using the list forms of system() and open(), which let you pass the command arguments individually, as-is. Note, however, that this is only an option if your command doesn't use any shell features; for a "shell-less" qx// (`...`) alternative, see this answer of mine, which also covers shell-quoting on Windows.
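For instance, a sketch of the list form of open() on a Unix-like system (sort and /etc/hosts are just stand-ins for whatever program and input you actually have):
# '-|' opens a read pipe from the program; arguments are passed as-is, no shell.
open(my $fh, '-|', 'sort', '-u', '/etc/hosts') or die "cannot run sort: $!";
print while <$fh>;
close $fh;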

running bsub command from perl script

I am trying to run a bsub command from a perl script in the following way:
system ("bsub -select "testid::1" -q normal");
but I think perl is getting confused because of the double quotes in "testid::1". What is the proper way to implement this?
You can escape the inner quotes:
system ("bsub -select \"testid::1\" -q normal");
or replace the outer quotes with single quotes - or in fact use any delimiter at all, thanks to the qq generalized quoting operator in Perl, which exists precisely for this sort of scenario:
system (qq{bsub -select "testid::1" -q normal});
There is a companion generalized single quote operator q.
Rather than fitting the entire command into a single quoted string (although the generalized quoting operators make that fairly simple), you can use the multi-argument form of system to avoid quoting the command line at all:
system 'bsub', '-select', 'testid::1', '-q', 'normal';

Is it possible to pipe input to another script with '<' using the system() in perl?

I've looked at several similar questions but none of them seem to address this issue, or they use a form of piping that I'm unfamiliar with, or I'm using "piping" in place of the correct word.
First, I'm on Windows 7, and what I'm trying to do is get a Perl script to call another Perl script, feeding it input, multiple times.
The way I'm going about doing this is with the system() function.
When put directly into the command line this works, although a little sloppy:
Functionalscript.pl < InputFile > OutputFile
That takes stuff from the input file performs the function and writes it to the output file flawlessly. However, when using the "system()" function in my calling script the input is not registered, but the output file is created (it's just blank).
The problem is with:
system("Functionalscript.pl < InputFile > OutputFile")
For some reason when that is used the functionalscript does not receive the input as stdin. Is there a way to make this work?
According to perldoc -f system (http://perldoc.perl.org/functions/system.html):
If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument, it is split into words and passed directly to execvp, which is more efficient.
Which means that if your command has > or < in it, it should be passed to the shell, and the input and output redirection should work as expected.
system("x:/path/perl.exe Functionalscript.pl InputFile > OutputFile")
Supplied by mpapec, this works. The "x:/path/perl.exe" had to be included.
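A variation on the same idea, sketched rather than tested: $^X holds the path of the perl interpreter currently running the calling script, so you can avoid hard-coding x:/path/perl.exe (the redirection still goes through the shell, cmd.exe on Windows):
# $^X is the running perl binary; InputFile is passed as an argument as above.
system(qq{"$^X" Functionalscript.pl InputFile > OutputFile}) == 0
    or warn "Functionalscript.pl failed: $?";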

Calling perl script from a perl script

I want to call a Perl script from another Perl script, in a bash shell, with a big argument list. The arguments contain special characters such as \, *, (, ), etc., and each of these special characters is escaped with a single escape character \.
But when I call the 2nd Perl script (which then calls a shell script) from the 1st Perl script, the escape characters get evaluated and the special characters are exposed to the shell, so I get a syntax error.
So basically I want to prevent the escape characters from being evaluated when I call the 2nd Perl script from the 1st Perl script; they should only be evaluated when I call the shell script from my 2nd Perl script.
E.g. the input to the first Perl script 'MonitorAdmin' is:
MonitorAdmin -reversefilter -container="LogServerContainer" -filepath="/home/esg2/YogeshTemp/VSDEFAULT/logs" -filename="System.log" -pattern=".*\t.*\t(DEBUG)\t.*\t.*\t.*\t(SecurityService)\t.*\t.*\t.*\t.*\t.*" -linecount="5001" -targetfile="
Perl's exec and system commands won't invoke a shell if you pass them a list with more than one element, but each list element then becomes a separate argument, i.e. spaces don't separate arguments. I'd imagine this works well even when executing a shell script, since you aren't invoking the shell with a -c option.
There are two forms of system, one that executes a shell command (system($shell_cmd)), and one that launches a program (system($program, @args)). As best as we can tell from your rather light-on-detail post, you appear to be using the wrong one. All you need is
system('MonitorAdmin2', @ARGV)
There is no shell to "misinterpret" the characters.

How can I avoid escaping by accident in Perl using system()?

I want to run some commands using the system() command, I do this way:
execute_command_error("trash-put '/home/$filename'");
Where execute_command_error will report if there was an error with whatever system command it ran. I know I could just unlink the file using Perl commands, but I want to delete stuff using trash-put as it's a type of recycling program.
My problem is that $filename will sometimes have apostrophes, quotes, and other weird characters in it that mess up the system command or Perl itself.
Generate the command name and arguments as an array, and pass that to system:
my(@command) = ("trash-put", 'home/$filename');
system @command;
This means that Perl does not invoke the shell to do any metacharacter expansion (or I/O redirection, or command piping, or ...). It also means the command does exactly what you told it to do.
sub execute_command_error
{
    # With a list, system runs the program directly - no shell involved.
    system @_;
}
Borrowing information from the copious collection of comments:
Which is clearly documented in perldoc -f system or at perldoc.perl.org/functions/system.html (@Ether).
(See also the discussion of 'exec' below which is closely related.)
Did you mean to put $filename in single quotes? (@mobrule).
I did intend to use single quotes - I'm demonstrating that the $filename does not get expanded by Perl or the shell... In my test script, I used 'my.$file', and that gave me a file with a $ in the name - as I intended.
I think the desired quoting if you do want to invoke the shell (for example if you want some piping) is $command_line = "\"$command\" \"$arg1\" \"$arg2\"...". (@Jefromi).
Adding double quotes around the arguments won't help with embedded $, backticks¹, '$(...)' and related notations. You more nearly need single quotes around things, but then you need to rewrite embedded single quotes as "'\''", which generates a single quote to terminate the current single-quoted argument, a backslash-quote combination to represent a single quote, and another single quote to resume the single-quoted argument.
This would be a good solution if I used the system command directly; however I am using webmin's execute_command function, which is a bit over my head, so I wouldn't know how to edit it to allow for arrays. Could you expand on the rewrite of embedded single quotes as "'\''"... This is what I will use, for now. (@Brian)
Roughly speaking, the way the (Unix) shells treat single quotes is "everything from the first single quote up to the next is literal text, no metacharacters". So, to get the shell to treat something as literal text, enclose it in single quotes. That deals with everything except single quotes themselves. As my comment says, you have to use the 4-character replacement string to get a single quote embedded into the middle of a single-quoted argument.
There is probably a neater way to do it than this (using one or two map operations, perhaps), but this should work:
for (my $i = 0; $i < scalar(@command); $i++)
{
$command[$i] =~ s/'/'\\''/g; # Replace single quotes by the magic sequence
$command[$i] = "'$command[$i]'"; # Wrap value in single quotes
}
You can then join the array to make a single string for transmission to execute_command.
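The "neater way" with map alluded to above might look something like this sketch (same substitution, just condensed into one expression):
# Quote each element, then join the pieces into one shell-ready string.
my $cmd_string = join ' ', map { (my $t = $_) =~ s/'/'\\''/g; "'$t'" } @command;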
It's better to write that as system { $command[0] } @command to handle the case where @command has one element. This is one of the things I talk about in the "Secure Programming Techniques" chapter of Mastering Perl. (@briandfoy).
As a general rule, I'll accept this correction. I'm not sure it is crucial in this instance, though, where the command name is provided by the program and it is only the arguments that are possibly provided by the user. The command name 'trash-put' is safe from shell expansions (IFS is reset to the default by the shell when it starts, so that avenue of attack is not available).
This issue is discussed in the 'perldoc -f exec' man page:
If you don't really want to execute the first argument, but want to lie to the program you are executing about its own name, you can specify the program you actually want to run as an "indirect object" (without a comma) in front of the LIST. (This always forces interpretation of the LIST as a multivalued list, even if there is only a single scalar in the list.)
Example:
$shell = '/bin/csh';
exec $shell '-sh'; # pretend it's a login shell
or, more directly,
exec {'/bin/csh'} '-sh'; # pretend it's a login shell
When the arguments get executed via the system shell, results are subject to its quirks and capabilities. See "STRING" in perlop for details.
Using an indirect object with exec or system is also more secure. This usage (which also works fine with system()) forces interpretation of the arguments as a multivalued list, even if the list had just one argument. That way you're safe from the shell expanding wildcards or splitting up words with whitespace in them.
#args = ( "echo surprise" );
exec #args; # subject to shell escapes
# if #args == 1
exec { $args[0] } #args; # safe even with one-arg list
The first version, the one without the indirect object, ran the echo program, passing it "surprise" as an argument. The second version didn't; it tried to run a program named "echo surprise", didn't find it, and set $? to a non-zero value indicating failure.
¹ How do you get a back-tick to display in Markdown?
As stated in perldoc -f system:
If there is more than one argument in LIST, or if LIST is an array with more than one value, starts the program given by the first element of the list with arguments given by the rest of the list. If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is "/bin/sh -c" on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument, it is split into words and passed directly to "execvp", which is more efficient.
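In other words (a small illustrative sketch, assuming a Unix-like shell):
system("ls", "-l");              # list form: no shell, arguments passed as-is
system("ls -l");                 # no metacharacters: split into words, run via execvp
system("ls -l > listing.txt");   # '>' is a metacharacter: handed to /bin/sh -c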
I like to use IPC::System::Simple's version of system(), which gives more control, such as being able to capture various exceptions and handle certain error codes as "bad" and others as "good":
use IPC::System::Simple qw(system);
system("cat *.txt"); # will die on failure