How can I run this command in my Jenkins file?
sh "perl -p -e 's/\$\{([^}]+)\}/defined $ENV{$1} ? $ENV{$1} : $&/eg; s/\$\{([^}]+)\}//eg' .env"
I tried everything.
Like so:
sh """
perl -p -e 's/\$\{([^}]+)\}/defined $ENV{$1} ? $ENV{$1} : $&/eg; s/\$\{([^}]+)\}//eg' .env
"""
Or escaping the backslashes.
But I keep getting the error:
WorkflowScript: 13: unexpected char: '\' # line 13, column 23.
Depending on how this command is run, the string interpolation issues can be awfully hard to predict. Is the double-quoted string interpolated by sh? Does the backslash in front of $ mean that it is escaped from sh, but not from Perl interpolation? When I ran a test string in a pastebin, it simply removed the $ENV{$1}.
I'm sure there's a way to do it the hard way (this way), but an easy way is to just write the Perl code in a file instead, and run the file.
I would write your regexes like this, in a separate file, say foo.pl:
s|\${([^}]+)}|$ENV{$1} // $&|eg;
s/\${([^}]+)}//g;
Using the logical defined-or operator // is slightly prettier than using the ternary operator. We change the delimiter on the substitution operator to facilitate that.
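For illustration, the replacement expression can be written either way (the // form needs Perl 5.10 or later):

defined $ENV{$1} ? $ENV{$1} : $&    # ternary
$ENV{$1} // $&                      # defined-or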
I also removed the unneeded /e modifier on the second substitution.
You should note that the second substitution removes from the input every string that matches ${....}. So the attempt to put them back in the first substitution with $& is quite pointless. Moreover, using $& carries a notable performance penalty. Assuming that is a mistake on your side, the code can be shortened to:
s/\${([^}]+)}/$ENV{$1}/g;
Note that now you can also skip the dangerous eval modifier /e.
If you run it without warnings, as you do in your original code, you will not notice undefined values in the %ENV hash; they simply become the empty string -- i.e. undefined values are removed.
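For example, with warnings switched on the missing values are reported instead of silently becoming empty strings (the exact wording of the warning varies between Perl versions):

perl -pw foo.pl .env
Use of uninitialized value in substitution (s///) at foo.pl line 1.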
This code can now be run by your other script without interpolation issues:
sh "perl -p foo.pl .env"
Just remove the -e switch since you are no longer providing command line code.
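In the Jenkinsfile you can then also use a single-quoted Groovy string, since nothing in the command needs Groovy interpolation any more (just a stylistic suggestion; the double-quoted form works here too):

sh 'perl -p foo.pl .env'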
I am trying to test the code snippet below for a bigger script that I am writing. However, I can't get the search working with parentheses and variables.
Appreciate any help someone can give me.
Code snippet:
#!/usr/bin/perl
$file="test4.html";
$Search="Help (Test)";
$Replace="Testing";
print "/usr/bin/sed -i cb 's/$Search/$Replace/g' $file\n";
`/usr/bin/sed -i cb 's/$Search/$Replace/g' $file`;
Thanks,
Ash
The syntax to run a command in a child process and wait for its termination in perl is system "cmd", "arg1", "arg2",...:
#!/usr/bin/perl
$file="test4.html";
$Search="Help (Test)";
$Replace="Testing";
print "/usr/bin/sed -icb -e 's/$Search/$Replace/g' -- $file\n";
system "/usr/bin/sed", "-icb", "-e", "s/$Search/$Replace/g", "--", $file;
(error checking left as an exercise, see perldoc -f system for details)
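A minimal error check could look like this (a sketch; system returns 0 on success, so anything else is treated as failure here):

system("/usr/bin/sed", "-icb", "-e", "s/$Search/$Replace/g", "--", $file) == 0
    or die "sed exited with status $?\n";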
Note that -i is not a standard sed option. The few implementations that support it (yours must be the FreeBSD one as you've separated the cb backup extension from -i) have actually copied it from perl! It does feel a bit silly to be calling sed from perl here.
Looking at your approach:
The `...` operator itself is reminiscent of the equivalent `...` shell operator. In Perl, what's inside is evaluated as if inside double quotes, in that $var, @var... Perl variables are expanded, and a shell is started with -c and the resulting string as arguments, with its stdout redirected to a pipe.
The shell interprets that argument as code in the shell syntax. Perl reads the output of that inline shell script from the other end of the pipe, and that makes up the expansion of `...`. Same as in shell command substitution, except that there is no stripping of zero bytes or of trailing newlines.
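For instance (a made-up command, only to show the mechanics):

my $listing = `ls -l $file`;    # Perl expands $file, then sh -c 'ls -l test4.html' runs
print "captured ", length($listing), " bytes\n";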
sed -i produces no output, so it's pointless to try and capture its output with `...` here.
Now in your case, the code that sh is asked to interpret is:
/usr/bin/sed -i cb 's/Help (Test)/Testing/g' test4.html
That should work fine on FreeBSD or macOS at least. If $file had been test$(reboot).html, that would have been worse though.
Here, because you have the contents of variables that end up interpreted as code in an interpreter (here sh), you have a potential arbitrary command injection vulnerability.
In the system approach, we remove sh, so that particular vulnerability is removed. However sed is also an interpreter of some language. That language is not as omnipotent as that of sh, but for instance sed can write to arbitrary files with its w command. The GNU implementation (which you don't seem to be using) can run arbitrary commands as well.
So you still potentially have a code injection vulnerability in the case of $Search or $Replace coming from an external source.
If that's the case, you'd need to make sure you properly sanitise those values before running sed. See for instance: How to ensure that string interpolated into `sed` substitution escapes all metachars
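If $Search and $Replace are only ever meant as literal text, one way to sidestep the whole issue (and the silliness of calling sed from perl) is to do the edit in Perl itself. A rough sketch, using \Q...\E to neutralise the metacharacters:

#!/usr/bin/perl
use strict;
use warnings;

my $file    = "test4.html";
my $Search  = "Help (Test)";
my $Replace = "Testing";

$^I   = "cb";                  # in-place edit; the backup gets "cb" appended, much like sed -i cb
@ARGV = ($file);
while (<>) {
    s/\Q$Search\E/$Replace/g;  # \Q...\E treats the search text literally, parentheses included
    print;
}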
I want to rename files with 'sr' in their names, replacing 'sr' with 'SR'. This one succeeded:
ls | perl -e 'while(<>){chomp;if(/(.*)sr(.*)/){rename $_,$1."SR".$2}}'
But this one failed:
ls | perl -e "while(<>){chomp;if(/sr/){rename $_,$\`.'SR'.($')}}"
with this error message:
Not enough arguments for rename at -e line 1, near "rename ,"
Execution of -e aborted due to compilation errors.
It seems that $_ has become an empty string, but I don't quite understand why. Thanks for any explanations.
Quoting has been an interesting problem here, and this is my test:
ls | perl -e "while(<>){chomp;if(/sr/){print $_;print\"\n\";print $\`,$&,($');print \"\n\";print $_,$\`,$&,($');print\"\n\";print $_;print\"\n\"}}"
outputs this:
3sr
3sr
3sr
3sr
sr1
sr1
sr1
sr1
sr2
sr2
sr2
sr2
It seems that when used alone, $_ is not empty; but it becomes empty when used together with $`, $& and $'. According to the last line printed for each file, I guess $_ is temporarily changed when it is not used alone?
Besides, following a1111exe's answer, I tested this:
ls | perl -e "while(<>){chomp;if(/sr/){print \$_,$\`,$&,($');print \"\n\"}}"
and got this:
3sr3sr
sr1sr1
sr2sr2
First, in Linux we should use single quotes instead of double quotes.
And instead of the ls command you can use Perl's built-in glob function.
And to capture the pre- and post-match you can use $PREMATCH and $POSTMATCH from the English module.
So your one-liner should be:
perl -MEnglish -e 'while(<*>){chomp;if(/sr/){rename $_,$PREMATCH."SR".$POSTMATCH}}'
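Spelled out as a small script, the same logic reads like this (glob does not append newlines, so the chomp from the one-liner is dropped; the warn is just extra safety):

#!/usr/bin/perl
use English;                       # provides the $PREMATCH and $POSTMATCH names

while (my $name = glob "*") {
    if ($name =~ /sr/) {
        rename $name, $PREMATCH . "SR" . $POSTMATCH
            or warn "could not rename $name: $!\n";
    }
}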
EDITED
Single quotes versus double quotes is not about Perl; this is about the shell.
Single quote
Enclosing characters in single quotes (') preserves the literal value of each character within the quotes. A single quote may not occur between single quotes, even when preceded by a backslash.
Double quote
Enclosing characters in double quotes (‘"’) preserves the literal value of all characters within the quotes, with the exception of ‘$’, ‘`’, ‘\’, and, when history expansion is enabled, ‘!’.
In a shell script we access shell variables with a $ prefix, so when $ is used inside double quotes the shell looks for a shell variable, not a Perl variable. For example, you can run the following line in your terminal:
m=4; perl -e "print $m;"
Here, m=4 assigns the shell variable m, and the $m inside the double quotes accesses that shell variable. The output is 4, because the shell substitutes its own m before Perl ever sees the code.
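Compare the single-quoted version, where the shell leaves $m alone and Perl sees its own (unset) $m:

m=4; perl -e 'print $m;'    # prints nothing: Perl's $m was never assigned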
And on Windows, we need to use double quotes instead of single quotes.
It seems that double quotes cause a mix-up between your shell environment and Perl. You can certainly do what @mkHun suggested. One other way:
ls | perl -e 'while(<>){chomp;($new=$_)=~s/sr/SR/g;rename $_,$new}'
Also, if you escape the '$' sigil in '$_', your one-liner will work too:
ls | perl -e "while(<>){chomp;if(/sr/){rename \$_,$\`.'SR'.$'}}"
I still don't get why, though. But it really seems like a bash/Perl interpolation issue.
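One way to see what is going on is to let echo show what the shell actually hands to perl -e before Perl runs anything (the result depends on your shell and on what $_, $` and $' happen to hold at that moment, which is exactly the problem):

echo "while(<>){chomp;if(/sr/){rename $_,$\`.'SR'.($')}}"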
When I try a hash on the command line as in the example below, I get a syntax error. I tried using the fat comma as well, but with the same result. Can someone help me?
perl -e "%hash_ex=(as,wdesadc,afcsdc,esvdfvzdfvfv,1,sd,34,34);print $hash_ex{'1'};"
syntax error at -e line 1, near "};"
Execution of -e aborted due to compilation errors.
perl -e "%hash_ex=('a' => 1 , 'b' => 2);print $hash_ex {a};"
syntax error at -e line 1, near "};"
Execution of -e aborted due to compilation errors.
The problem is that your shell also substitutes variables beginning with $:
# (on zsh and bash)
echo "%hash_ex=(as,wdesadc,afcsdc,esvdfvzdfvfv,1,sd,34,34);print $hash_ex{'1'};"
%hash_ex=(as,wdesadc,afcsdc,esvdfvzdfvfv,1,sd,34,34);print {'1'};
Because of this, you'd better use single quotes for your -e argument:
perl -e'%hash_ex=(as,wdesadc,afcsdc,esvdfvzdfvfv,1,sd,34,34);print $hash_ex{1};'
sd
If you really need single quotes inside the code (in this case you don't), you can use the q operator:
perl -E'say q~some non-interpolating string\t\n$_~'
some non-interpolating string\t\n$_
Or you can try to avoid the interpolation by your shell:
perl -e "%hash_ex=(as,wdesadc,afcsdc,esvdfvzdfvfv,1,sd,34,34);print \$hash_ex{'1'};"
You are using double quotes to pass your command to Perl. This means that the shell will first interpolate any variables in your string before it passes the command on to Perl. You can see this if you just run echo on the string, first with double quotes and then with single quotes. The output from echo will show what the shell is actually passing to Perl.
When the shell processes the text in the double quotes, it interpolates $hash_ex. Since this is not set in the shell, it gets interpolated as nothing, which means that your print statement, instead of being
print $hash_ex{a}
becomes
print {a}
So you need to wrap all your Perl in single quotes so that the shell does not interpolate any variables and passes the full string to Perl as a literal string.
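For example, with hash_ex unset in the shell:

echo "print $hash_ex{a};"    # the shell eats $hash_ex, so Perl would see: print {a};
echo 'print $hash_ex{a};'    # Perl receives the code exactly as written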
I am writing the following command in a makefile to extract some text:
@awk '/Exported Layer/,/Total Polygons/' out_compare.err | perl -lane '$el=$F[3] if(/Exported Layer/); print "$el: $f[3]" if (/Total Polygons/);' | cat
But it is giving the following error:
Can't modify constant item in scalar assignment at -e line 1, near "] if"
Execution of -e aborted due to compilation errors.
Would you guys like to suggest something? :-)
Make is oblivious to shell quoting in commands, so the $ characters in your Perl snippet are being interpreted as make variables $e and $F. These variables don't exist in your makefile and are being expanded as empty, leading to the Perl syntax errors you're seeing.
You need to escape the $ characters from make like this:
... perl -lane '$$el=$$F[3] if(/Exported Layer/); ...
See also the GNU Make manual.
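Put together, the recipe could look like this (the target name is invented, the recipe line must start with a literal tab, and I have assumed the lowercase $f[3] in your print was meant to be $F[3], the -a autosplit array):

extract:
	@awk '/Exported Layer/,/Total Polygons/' out_compare.err | perl -lane '$$el=$$F[3] if /Exported Layer/; print "$$el: $$F[3]" if /Total Polygons/'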
I have a variable in a shell script,
var=1234_number
I want to remove everything other than the integer part of $var. How can I do it using a Perl one-liner?
You might be looking for something to edit the shell script, in which case, this might be sufficient:
perl -i.bak -pe 's/\b(var=\d+).*/$1/' shellscript.sh
The '-i' overwrites the original file, saving a copy in shellscript.sh.bak; the '-p' loops over the file, printing each (possibly modified) line; the substitute command finds assignments to 'var' (and not any longer name ending in 'var'): an equals sign, some digits, and whatever follows on the line, and it leaves behind just the assignment of the digits.
In the example, it gives:
var=1234
Note that the Perl regex is not foolproof - it will mangle this (dropping the closing brace).
: ${var=1234_number}
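After the edit, that line would come out as:

: ${var=1234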
Dealing with all such possible variants gets fairly tricky:
echo $var=$other
OTOH, you might be looking to eliminate the non-digits from a variable within a shell script, in which case:
var=$(echo $var | perl -pe 's/\D//g')
You could also use 'sed' for the job:
var=$(echo $var | sed 's/[^0-9]//g')
No need to use anything but the shell for this:
var=1234_abcd
var=${var%_*}
echo $var # => 1234
See 'Parameter Expansion' in the bash manual.