The ClearCase command-line tool mkattr needs the variable $bug_num wrapped in single quote + double quote + $variable + double quote + single quote, like this:
cleartool mkattr -replace BUGNUM '"$bug_num"' clearcase_file
How do I call the cleartool mkattr command from a Perl script in a Unix environment?
The environment is AIX with ksh.
As mentioned in this recent answer:
If you want to execute a system command and don't have to use any shell syntax like redirects, it's usually better and safer to use the list form of system:
system(
'cleartool', 'mkattr', '-replace', 'BUGNUM',
qq{"$bug_num"}, qq{clearcase_file}
);
# or, if you really want to pass both types of quotes:
system(
'cleartool', 'mkattr', '-replace', 'BUGNUM',
qq{'"$bug_num"'}, qq{clearcase_file}
);
Or:
system(qq{cleartool mkattr -replace BUGNUM '"$bug_num"' clearcase_file});
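Either way, it helps to check system's return value, since cleartool can fail. A minimal sketch, assuming $bug_num is already set and clearcase_file stands in for the real element name:
my @cmd = ('cleartool', 'mkattr', '-replace', 'BUGNUM', qq{"$bug_num"}, 'clearcase_file');
system(@cmd) == 0
    or die "cleartool mkattr failed with exit status " . ($? >> 8) . "\n";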
Related
I am trying to put a system command like the one below into a Perl script, but
the sed expression contains several kinds of quotes and backslashes, and I am not sure how to escape all of them so that my system command executes exactly as I need.
Here is the example of the command:
mysql -u root -D porta-billing -e "..." | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g"
The answer to the question you're asking is to use the qx(...) operator. qx(...) is the "choose your own delimiter" version of backticks.
my $output = qx[ ... ];
Or
my $output = qx( ... );
Or
my $output = qx! ... !;
It's easy to find a delimiter that won't clash with the characters in your command string.
But the answer to the question that you should be asking has two parts:
Don't call mysql from your Perl program - use DBI instead.
Don't call sed from your Perl program - use Perl code to manipulate your text.
I feel slightly nervous about the first part of my answer as I'm worried you will just take my hacky workaround and end up with an unmaintainable mess. Please take note of the advice in the second half - even if you ignore it in this case.
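For the second half of that advice, a rough sketch of what the DBI route could look like (the DSN, credentials, and query are placeholders, and the quoting only loosely mimics what the sed expression appears to be doing):
use DBI;

# Placeholder connection details; adjust DSN, user, and password as needed
my $dbh = DBI->connect('DBI:mysql:database=porta-billing;host=localhost',
                       'root', 'password', { RaiseError => 1 });

my $sql = '...';    # the query from the original -e "..." part goes here
my $sth = $dbh->prepare($sql);
$sth->execute();

while (my @row = $sth->fetchrow_array) {
    # Wrap every field in double quotes and join with commas,
    # roughly what the sed pipeline produced
    print join(',', map { '"' . (defined $_ ? $_ : '') . '"' } @row), "\n";
}
$dbh->disconnect;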
Hi, I have a Unix shell script called Food.sh and a Perl script called Track.pl. Is there a way I can put Track.pl's code into Food.sh and still have it work? Track.pl requires one argument: the name of a folder.
Basically it will run like this:
Unix shell script code runs
steps into
Perl script code runs
(put in the name of the folder for the Perl script)
rest of the script runs
exits out.
You have a few options.
You can pass the code to Perl using -e/-E.
...
perl -e'
print "Hello, World!\n";
'
...
Con: Escaping can be a pain.[1]
You can pass the code via STDIN.
...
perl <<'END_OF_PERL'
print "Hello, World!\n";
END_OF_PERL
...
Con: The script can't use STDIN.
You can create a virtual file.
...
perl <(
cat <<'END_OF_PERL'
print "Hello, World!\n";
END_OF_PERL
)
...
Con: Wordy.
You can take advantage of perl's -x option.
...
perl -x -- "$0"
...
exit
#!perl
print "Hello, World!\n";
Con: Can only have one snippet.
$0 is the path to the shell script being executed. It's passed to perl as the program to run. The -x tells Perl to start executing at the #!perl line rather than the first line.
Ref: perlrun
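Since Track.pl takes a folder name as its one argument, note that anything placed after "$0" ends up in @ARGV of the embedded program. A sketch combining the pieces above (the echo lines and folder handling are illustrative only):
#!/bin/sh
echo "shell part"
perl -x -- "$0" "$1"      # forward the folder name given to the shell script
echo "shell again"
exit

#!perl
my ($folder) = @ARGV;
print "Perl part got folder: $folder\n";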
Instances of ' in the program need to be escaped using '\''.
You could also rewrite the program to avoid using '. For example, you could use double-quoted string literals instead of single-quoted ones, or change the delimiter of single-quoted string literals (e.g. q{...} instead of '...'). As for single quotes inside double-quoted string literals and regex literals, these can be replaced with \x27, which you might find nicer than '\''.
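For example, all of these print the same apostrophe-containing string; the last two avoid the ' character entirely, so they are easier to embed in a shell single-quoted -e block:
print 'it\'s fine', "\n";    # escaped quote: the thing to avoid inside sh single quotes
print q{it's fine}, "\n";    # alternate delimiter for a single-quoted literal
print "it\x27s fine", "\n";  # \x27 is the apostrophe character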
(I'm assuming your goal is just to have all of the code in a single file so that you don't have multiple files to install)
Sure, there's a way to do this, but it's cumbersome. You might want to consider converting the shell script entirely to Perl (or the Perl script entirely to shell).
So ... A way to do this might be:
#!/bin/sh
echo "shell"
perl -E '
say "perl with arg=$ARGV[0]"
' fred
echo "shell again"
Of course, you'd have to be careful with your quotes within the Perl part of the program.
You might also be able to use a heredoc for the Perl part to avoid quoting issues, but I'm not sure about that.
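One form of that heredoc idea which appears to work: a lone - tells perl to read the program from standard input, and anything after it lands in @ARGV (a sketch, not tested against the original scripts):
#!/bin/sh
echo "shell"
perl - fred <<'END_OF_PERL'
my ($name) = @ARGV;
print "perl with arg=$name\n";
END_OF_PERL
echo "shell again"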
I have been searching for this for a while, and can't find a satisfactory answer.
I have a perl script that needs to copy a file from one host to another, essentially
sub copy_file{
my($from_server, $from_path, $to_server, $to_path, $filename) = @_;
my $from_location = "$from_server:\"\\\"${from_path}${filename}\\\"\"";
my $to_location = $to_path . $filename;
$to_location =~ s/\s/\\\\ /g;
$to_location = "${to_server}:\"\\\"${to_location}\\\"\"";
return system("scp -p $from_location $to_location >/dev/null 2>&1"");
}
The problem is, some of my filenames look like this:
BLAH;BLAH;BLAH.TXT
Some really nicely named file( With spaces, parentheses, &, etc...).xlx
I am already handling whitespaces, and the code for that is quite ugly since on each side, the files could be local or remote, and the escaping is different for the from and to part of the scp call.
What I am really looking for is either a way to escape all possible special characters, or a way to bypass the shell expansion entirely by using POSIX system calls. I am OK with writing an XS module if need be.
I have the correct keys set up in the .ssh directory
Also I am not honestly sure which special characters do and don't cause problems. I would like to support all legal filename characters.
Say you want to copy a file named foo(s) using scp.
As shown below, scp treats the source and target as shell literals, so you pass the following arguments to scp:
scp
-p
--
host1.com:foo\(s\) or host1.com:'foo(s)'
host2.com:foo\(s\) or host2.com:'foo(s)'
You can do that using the multi-argument syntax of system plus an escaping function.
use String::ShellQuote qw( shell_quote );
my $source = $from_server . ":" . shell_quote("$from_path/$filename");
my $target = $to_server . ":" . shell_quote("$to_path/$filename");
system('scp', '-p', '--', $source, $target);
If you really wanted to build a shell command, use shell_quote as usual.
my $cmd = shell_quote('scp', '-p', '--', $source, $target);
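And if the only reason for building a single string is the >/dev/null 2>&1 redirection from the original sub, quote just the scp arguments and append the redirection verbatim:
my $cmd = shell_quote('scp', '-p', '--', $source, $target) . ' >/dev/null 2>&1';
return system($cmd);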
$ ssh ikegami@host.com 'mkdir foo ; touch foo/a foo/b foo/"*" ; ls -1 foo'
*
a
b
$ mkdir foo ; ls -1 foo
$ scp 'ikegami@host.com:foo/*' foo
* 100% 0 0.0KB/s 00:00
a 100% 0 0.0KB/s 00:00
b 100% 0 0.0KB/s 00:00
$ ls -1 foo
*
a
b
$ rm foo/* ; ls -1 foo
$ scp 'ikegami@host.com:foo/\*' foo
* 100% 0 0.0KB/s 00:00
$ ls -1 foo
*
There are three ways to handle this:
Use the multi-argument form of system which will completely avoid the shell:
system 'scp', '-p', $from_location, $to_location;
Disadvantage: you can't use shell features like redirection.
Use String::ShellQuote to escape the characters.
$from_location = shell_quote $from_location;
$to_location = shell_quote $to_location;
Disadvantage: certain strings can exist which can't be quoted safely. Furthermore, this solution is not portable as it assumes Bourne shell syntax.
Use IPC::Run which essentially is a supercharged system command that allows redirections.
use IPC::Run qw( run );   # exports run()

run ['scp', '-p', $from_location, $to_location],
    '>', '/dev/null',  # yes, I know /dev/null isn't portable
    '2>', '/dev/null'; # we could probably use IO::Null instead
Disadvantage: a complex module like this has certain limitations (e.g. Windows support is experimental), but I doubt you'll run into any issues here.
I strongly suggest you use IPC::Run.
A few suggestions:
They're not part of the standard Perl distribution, but Net::SSH2 and Net::SSH2::SFTP are two highly ranked CPAN modules that will do what you want. ssh, scp, and sftp all use the same protocol and the sshd daemon. If you can use scp, you can use sftp. This gives you a very nice Perlish way to copy files from one system to another.
The quotemeta function will take a string and backslash-quote every character other than letters, digits, and underscores for you. It's better than attempting to handle the situation yourself with an s/../../ substitution.
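For instance (illustrative filename):
my $name   = 'Some file (weird & name).xlx';
my $quoted = quotemeta($name);
# $quoted is now: Some\ file\ \(weird\ \&\ name\)\.xlx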
You can use qq(...) and q(...) instead of quotation marks. This will allow you to use quotation marks in your string without having to escape them over and over again. This is especially useful in the system command.
For example:
my $error = system qq(scp $user\@host:"$from_location" "$to_location");
One more little trick: if system is passed a single parameter and that parameter contains shell metacharacters, the command is handed to the default system shell. However, if you pass system a list of items, those items are passed to execvp() directly, without going through the shell.
Passing a _list_ of arguments to system via an array is a great way to avoid problems with file names. Spaces, shell metacharacters, and other shell issues are avoided.
my @command;
push @command, 'scp';
push @command, "$from_user\@$from_host:$from_location";
push @command, "$to_user\@$to_host:$to_location";
my $error = system @command;
use Net::OpenSSH:
my $ssh = Net::OpenSSH->new($host);
$ssh->scp_get($remote_path, $local_path);
The way arguments have to be quoted varies depending on the shell running on the remote side. The stable version of the module supports Bourne-compatible shells. The development version available from GitHub also supports csh and several Windows flavors (Windows quoting is, err, interesting).
For instance:
my $ssh = Net::OpenSSH->new($host, remote_shell => 'MSWin', ...);
Note that on Windows, there are strings that just cannot be properly quoted!
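A sketch of the original remote-to-remote copy done through the local machine with Net::OpenSSH, using the variables from the copy_file sub above (the temporary staging directory is my own addition):
use Net::OpenSSH;
use File::Temp qw( tempdir );

my $from = Net::OpenSSH->new($from_server);
my $to   = Net::OpenSSH->new($to_server);
$from->error and die "can't connect to $from_server: " . $from->error;
$to->error   and die "can't connect to $to_server: "   . $to->error;

# Stage the file locally, then push it to the target host;
# paths are concatenated as in the original sub
my $tmp = tempdir( CLEANUP => 1 );
$from->scp_get("$from_path$filename", "$tmp/")
    or die "scp_get failed: " . $from->error;
$to->scp_put("$tmp/$filename", "$to_path$filename")
    or die "scp_put failed: " . $to->error;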
I'm having an issue with some code and I'm wondering if anyone can assist.
Basically I'm trying to execute an isql query against a database and assign the result to a scalar variable. The isql command makes use of the column separator option, which is defined as the pipe symbol.
So I have it set-up as such:
my $command = "isql -S -U -s| -i";
my $isql_output = `$command`;
The isql command works in isolation, but when issued via backticks it stops at the pipe. I've tried concatenating the $command string from sub-strings, using single quotes, and backslash-escaping items such as -s\"\|\", to no avail. I've also tried using qx instead of backticks.
Unfortunately I'm currently using an older version of perl (v5.6.1) with limited scope for upgrade so I'm not sure if I can resolve this.
You have to quote the | in a way that the shell does not recognize it as a special character. Two ways:
Put the -s| into single quotes: '-s|'. Perl will leave single quotes inside double quoted strings alone and pass them to the shell unmodified.
Escape the | with two backslashes: -s\\|. Why two? The first one is seen by Perl, telling it to pass the next character through unmodified. The shell therefore sees -s\| and does something very similar: it sees the single \ and knows not to treat the next character, |, as special.
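Both variants as they would look in the script (server and file names are placeholders):
my $out1 = `isql -S srv -U usr '-s|' -i query.sql`;   # single quotes hide | from the shell
my $out2 = `isql -S srv -U usr -s\\| -i query.sql`;   # Perl turns \\ into \, so the shell sees -s\|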
The problem is that the command is being executed through a shell.
You can avoid this by passing the command and arguments in a list rather than a single string.
The backtick construct does not support that, so you would need to use the open() function instead.
I haven't tested the following code but it gives the idea:
my @command = (qw(isql -Sserver -Uuser -Ppassword -s| -w4096), '-i' . $file);
print join(' ', @command), "\n";
open(my $fh, '-|', @command)
    or die "failed to run isql command: $!\n";
my @isql_output = <$fh>;
close($fh);
my $isql_output = $isql_output[0];
chomp($isql_output);
If you're working with a 15 year old version of Perl (which Oracle users tend to do) I'm not sure this will all be supported. For instance, you may need to write chop instead of chomp.
UPDATE: the problem is not the Perl version, but this construct not being supported on Windows, according to the documentation. This must be qualified: I use Perl on Cygwin and it works fine there, but I don't know whether you can use Cygwin.
Single quotes should work. Try to run test perl script:
my $cmd = "./test.sh -S -U -s '|' -i";
print `$cmd`;
With test.sh:
#!/bin/sh
echo $@
Output should be -S -U -s | -i
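For completeness, with the list form of system the shell never sees the arguments, so no quoting is needed at all (though system, unlike backticks, does not capture the output):
system('./test.sh', '-S', '-U', '-s', '|', '-i') == 0
    or die "test.sh failed with exit status " . ($? >> 8) . "\n";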
When I run the ls command this runs fine. But echo $PATH does not give me any output from perl. When I run echo from the shell prompt it gives me output. Can you explain this behavior?
#!/usr/bin/perl
$\="\n";
$out = `ls`;
print $out;
$out=`echo $PATH`;
print $out;
Please note that while the technically correct answer to your question is the $ interpolation, you should also be aware that you should not treat Perl like a shell script, calling external commands via backticks instead of using the Perl built-ins or library functions designed for the purpose:
$out = join("\n", glob("*")); # same as `ls`
$out = $ENV{PATH}; # same as `echo $PATH`
This has several significant advantages:
Speed (no external command needs to be spawned)
Portability
More security (no shell attack vector)
Most built-ins handle error reporting for you better than your own system call implementation would
Nearly always better, cleaner, shorter, and easier to read/maintain code
Backticks interpolate like double quotes, so you need to escape the $.
$out=`echo \$PATH`;
$PATH is a shell variable; from Perl you should access it as the Perl variable $ENV{PATH}.
Still, do try to read some basic docs too, like perldoc perlintro. There is no need to execute echo at all.
Perl is interpolating $PATH in the backticks as a Perl variable, and you've not set a $PATH anywhere in your script, so the command is coming out as
$out = `echo `
which is basically a no-op. Try
$out = `echo \$PATH`
instead, which would force Perl to ignore the $ and pass it intact to the shell.
You need to escape $ in $PATH because the backticks operator interpolates any variables.
$out=`echo \$PATH`;
You could also use qx// with single quotes. From perldoc perlop:
Using single-quote as a delimiter protects the command from Perl's double-quote interpolation, passing it on to the shell instead:
$perl_info = qx(ps $$); # that's Perl's $$
$shell_info = qx'ps $$'; # that's the new shell's $$
Others have already explained the reason - variables inside backticks are interpolated, so your echo $PATH is actually becoming echo since there's no $PATH variable declared.
However, always put use strict; at the top of every Perl script you write.
Had you done so, Perl would have told you what was happening, e.g.:
Global symbol "$PATH" requires explicit package name at myscript.pl line 9
To stop variables being interpolated, either escape them (e.g. \$PATH), or, more cleanly, use e.g. qx'echo $PATH'.
Also, as others have pointed out, calling echo $PATH makes no real-world sense; if you're trying to get the contents of the PATH environment variable, just use $ENV{PATH} - however, you may have just been using it as a simple reduced demonstration case.