Underscore not respected in one instance of terminal - sed

I have a strange problem when trying to define a variable, or even echo it, when its content has an underscore.
My original problem was to replace the spaces in a string with underscores. So far so good: I used sed to make the replacement and assign the result to a variable. I've been stuck on that for hours.
When the underscore should appear, the console does not show it, which makes me think one of two things: either my variable does not contain the underscore and the console output is correct, or my variable does contain the underscore and the console simply isn't displaying it.
eg:
"the string" ... some process ... "the string"
when i was expecting
"the string" ... some process ... "the_string"
Here are my "best" failed attempts.
Using tr and capturing its stdout directly into a variable:
FOO=$(tr -s ' ' '_' <<< "the string"); echo $FOO;
Creating a file and reading it into a variable
(proof that the variable gets the correct value, because the file does contain the underscore):
echo "the string" | sed "s/ /_/" 1>echo; FOO=$(cat echo); echo $FOO;
The stdout of the "original" commands on their own works well:
tr -s ' ' '_' <<< "the string"
echo "the string" | sed "s/ /_/"
If I swap the underscore for a dash, everything works. So, what's going on here?
EDIT:
The same command in a new terminal worked just fine. The root cause is a mystery.
[Screenshot: old console stdout]
[Screenshot: new console stdout]

The shell is treating the unquoted variable $FOO as a list of fields: word splitting with $IFS is applied, the string gets split, and the output comes out without the underscore.
Quoting your variable makes the shell treat it as a single string.
So you can either restore your $IFS to its default, or quote your variable.
Reassigning $IFS fixes the output; emptying it like this disables word splitting entirely (its default value is space, tab and newline).
$ IFS=''
Or quote the variable in your script:
FOO=$(tr -s ' ' '_' <<< "the string"); echo "$FOO";
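A minimal sketch of the effect, assuming the broken session had picked up an underscore in IFS (that value is only an assumption used to reproduce the symptom):
IFS='_'                                  # assumed: simulate a session where IFS contains an underscore
FOO=$(tr -s ' ' '_' <<< "the string")
echo $FOO                                # unquoted: split on '_', prints "the string"
echo "$FOO"                              # quoted: prints "the_string"
IFS=$' \t\n'                             # back to the default IFS (space, tab, newline)
echo $FOO                                # now prints "the_string" even unquoted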

Related

How to remove trailing newline from command arguments in fish?

As far as I know, in fish, as in many other shells, the return statement is reserved for success/failure codes, and the standard way to 'return a string' is to echo the result, like this:
function format-commit-message
echo e2e-new: $argv[1]
echo
echo "Jira: APP-1234"
end
However, this means that all such strings end with a newline character. I tried to remove it, because that is what the commit message policy requires:
git commit -m ( format-commit-message "new commit" | string split0 | string sub -e -1 )
But the result is:
fatal: empty string is not a valid pathspec. please use . instead if you meant to match all paths
Could you tell me what I am doing wrong?
Use echo's -n parameter to omit the newline.
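A minimal sketch of that fix, reusing the function from the question (string split0 is kept, as in the question, so that fish does not split the multi-line message on newlines during command substitution):
function format-commit-message
    echo e2e-new: $argv[1]
    echo
    echo -n "Jira: APP-1234"    # -n omits the trailing newline
end

git commit -m (format-commit-message "new commit" | string split0)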

Jenkins run line with backslashes

How can I run this command in my Jenkins file?
sh "perl -p -e 's/\$\{([^}]+)\}/defined $ENV{$1} ? $ENV{$1} : $&/eg; s/\$\{([^}]+)\}//eg' .env"
I tried everything.
Like so:
sh """
perl -p -e 's/\$\{([^}]+)\}/defined $ENV{$1} ? $ENV{$1} : $&/eg; s/\$\{([^}]+)\}//eg' .env
"""
Or escaping the backslashes.
But I keep getting the error:
WorkflowScript: 13: unexpected char: '\' # line 13, column 23.
Depending on how this command is run, the string interpolation issues can be awfully hard to predict. Is the double-quoted string interpolated by sh? Does the backslash in front of $ mean that it is escaped from sh, but not from Perl interpolation? When I ran a test string in a pastebin, it simply removed the $ENV{$1}.
I'm sure there's a way to do it the hard way (this way), but an easy way is to just write the Perl code in a file instead, and run the file.
I would write your regexes like this, in a separate file, say foo.pl:
s|\${([^}]+)}|$ENV{$1} // $&|eg;
s/\${([^}]+)}//g;
Using the logical defined-or operator // is slightly prettier than the ternary operator. We change the delimiter of the substitution operator so that the // in the replacement code does not clash with it.
I also removed the unused /e modifier from the second substitution.
Note that every string matching the regex ${....} will be removed from the input by the second substitution anyway, so trying to put them back with $& in the first substitution is quite meaningless. Moreover, using $& carries a notable performance penalty. Assuming that was a mistake on your side, the code can be shortened to:
s/\${([^}]+)}/$ENV{$1}/g;
Note that now you can also skip the dangerous eval modifier /e.
If you run it without warnings, as your original code does, you will not notice the undefined values in the %ENV hash; they will just become the empty string, i.e. undefined values are removed.
This code can now be run by your other script without interpolation issues:
sh "perl -p foo.pl .env"
Just remove the -e switch since you are no longer providing command line code.
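Putting the pieces together, a minimal sketch of what that could look like (the file name foo.pl comes from above; running the sh step with a single-quoted Groovy string is my own assumption, chosen so Groovy itself interpolates nothing):
# foo.pl
s/\${([^}]+)}/$ENV{$1}/g;

// Jenkinsfile step
sh 'perl -p foo.pl .env'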

Perl regex directly escaping special characters

A Perl beginner here. I have been working on some simple one-liners to find and replace text in a file. I read about escaping all special characters with \Q...\E or quotemeta(), but found this only works when interpolating a variable. For example, when I try to replace the part containing special characters directly, it fails. But when I store it in a scalar first, it works. Of course, if I escape all the special characters with backslashes, it also works.
$ echo 'One$~^Three' | perl -pe 's/\Q$~^\E/Two/'
One$~^Three
$ echo 'One$~^Three' | perl -pe '$Sub=q($~^); s/\Q$Sub\E/Two/'
OneTwoThree
$ echo 'One$~^Three' | perl -pe 's/\$\~\^/Two/'
OneTwoThree
Can anyone explain this behavior and also show if any alternative exists that can directly quote special characters without using backslashes?
Interpolation happens first, then \Q, \U, \u, \L and \l.
That means
"abc\Qdef$ghi!jkl\Emno"
is equivalent to
"abc" . quotemeta("def" . $ghi . "!jkl") . "mno"
So,
s/\Q$~^/Two/   # not ok: equivalent to quotemeta($~ . "^")
s/\Q$Sub/Two/  # ok
s/\$\~\^/Two/  # ok
s/\$\Q~^/Two/  # ok
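For example, the last form can be checked from the shell, and quotemeta can also be called explicitly to avoid backslashes in the pattern altogether (the BEGIN block and the $pat name are only illustrative):
$ echo 'One$~^Three' | perl -pe 's/\$\Q~^/Two/'
OneTwoThree
$ echo 'One$~^Three' | perl -pe 'BEGIN { $pat = quotemeta q($~^) } s/$pat/Two/'
OneTwoThree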

Why does changing from ' to " affect the behavior of this one-liner?

Why does simply enclosing my one-liner in ' instead of " change the behavior of the code? The first command produces what I expect, and the second gives (to me!) an unexpected result, printing out an array reference.
$ echo "puke|1|2|3|puke2" | perl -lne 'chomp;@a=split(/\|/,$_);print $a[4];'
puke2
$ echo "puke|1|2|3|puke2" | perl -lne "chomp;@a=split(/\|/,$_);print $a[4];"
ARRAY(0x1f79b98)
This is the Perl version:
$ perl -v
This is perl, v5.10.1 (*) built for x86_64-linux-thread-multi
With double quotes you are letting the shell interpolate the variables first.
As you can check, $_ and $a are unset in the subshell that the parent shell forks for the pipe. See the comment on $_ below.
So the double-quoted version is effectively
echo "puke|1|2|3|puke2" | perl -lne 'chomp;@a=split(/\|/);print [4];'
which prints the arrayref [4].
A comment on the effects of having $_ exposed to Bash. Thanks to Borodin for bringing this up.
The $_ is one of a handful of special shell parameters in Bash. It contains the last argument of the previous command, or the pathname of what invoked the shell or commands (via _ environment variable). See the link for a full description.
However, here it is being interpreted in the subshell forked to run the perl command, as that subshell's first command. Apparently it is not even set there, as seen with
echo hi; echo hi | echo $_
which prints an empty line (after the first hi). The reason may be that the _ environment variable just isn't set in a subshell forked for a pipe, but I don't see why that would be the case. For example,
echo hi; (echo $_)
prints two lines with hi even though ( ) starts a subshell.
In any case, $_ in the given pipeline isn't set.
The split part is then split(/\|/), which by default means split(/\|/, $_) -- with nothing to split. With -w added, this indeed prints a warning about the use of an uninitialized $_.
Note that this behavior depends on the shell. The tcsh won't run this with double quotes at all. In ksh and zsh the last part of a pipeline runs in the main shell, not a subshell, so $_ is set there.
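Given that, one way to keep the double-quoted version working (bash assumed) is to escape the dollar signs so that the shell never sees them:
$ echo "puke|1|2|3|puke2" | perl -lne "chomp;@a=split(/\|/,\$_);print \$a[4];"
puke2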
This is actually a shell topic, not a Perl topic.
In shell:
Single quotes preserve the literal value of all of the characters they contain, including the $ and backslash. However, with double quotes, the $, backtick, and backslash characters have special meaning.
For example:
'\$' evaluates to \$
whereas
"\$" evaluates to just $
because inside double quotes the backslash keeps its special meaning as an escape character (before $, `, ", \ and a newline).
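A quick check at the prompt (bash assumed) makes the difference visible:
$ echo '\$HOME'
\$HOME
$ echo "\$HOME"
$HOME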

KSH - echo Windows path and special characters

So I'm receiving a Windows path as an argument, but since the path contains special characters I'm having some trouble.
For example, if the argument path ($1) is \test\bla, the output of the script is "\SLE esla" (because of the \t and \b).
How can I print the correct path??
Thanks.
p.s.
maybe it's a stupid question but I'm new to ksh
As I said, I'm new to ksh, but here is the solution:
print -R $arg
Using print instead of echo, together with the "-R" option, makes it possible to avoid the expansion of the backslash escapes.
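A minimal sketch of the difference, using the path from the question (ksh assumed; exactly which escapes echo expands can vary between ksh builds):
arg='\test\bla'
echo $arg          # \t and \b may be expanded as escapes, mangling the path
print -R "$arg"    # raw output: prints \test\bla unchanged (quoting also protects paths with spaces)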