How to remove trailing newline from command arguments in fish?

As far as I know, in fish, as in many other shells, the return statement is reserved for success/failure codes, and the standard way to 'return a string' is to echo the result, like this:
function format-commit-message
    echo e2e-new: $argv[1]
    echo
    echo "Jira: APP-1234"
end
However, this means that all such strings end with a newline character. I tried to remove it, because the commit message policy forbids it:
git commit -m (format-commit-message "new commit" | string split0 | string sub -e -1)
But the result is:
fatal: empty string is not a valid pathspec. please use . instead if you meant to match all paths
Could you tell me what I am doing wrong?

Use echo's -n parameter to omit the newline.
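For example, a minimal sketch: -n on the final echo drops the trailing newline, and string collect (available in fish 3.1+) keeps the multi-line output as a single argument instead of letting the command substitution split it on newlines:
function format-commit-message
    echo e2e-new: $argv[1]
    echo
    echo -n "Jira: APP-1234"    # -n suppresses the trailing newline
end

git commit -m (format-commit-message "new commit" | string collect)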

Related

Underscore not respected in one instance of the terminal

I have a strange problem when trying to define a variable, or even echo it, when its content has an underscore.
My original problem was to replace the spaces in a string with underscores. I used sed to make the replacement and assign the result to a variable. I've been stuck on this for hours.
It turns out that when the output should contain the underscore, the console doesn't show it, which makes me think one of two things: either my variable doesn't contain the underscore and the console output is correct, or my variable does contain the underscore and the console just isn't displaying it.
e.g.:
"the string" ... some process ... "the string"
when I was expecting
"the string" ... some process ... "the_string"
Here are my "best" failed attempts.
Using tr and putting the stdout directly into a var:
FOO=$(tr -s ' ' '_' <<< "the string"); echo $FOO;
Creating a file and reading it into a var
(proof that the variable gets the correct value, because the file contains the underscore):
echo "the string" | sed "s/ /_/" 1>echo; FOO=$(cat echo); echo $FOO;
The "original" command stdout works well
tr -s ' ' '_' <<< "the string"
echo "the string" | sed "s/ /_/"
If I change the underscore to a dash, everything works. So what's going on here?
EDIT:
The same command in a new terminal worked just fine. The root cause is a mystery.
(screenshots: old console stdout vs. new console stdout)
The shell is trying to interpret the unquoted variable $FOO as a list of fields, so IFS-based word splitting is applied and the string is split, producing the output without the underscore.
Quoting your variable makes it be treated as a single string.
So you can either restore $IFS to its default or quote your variable.
Reassigning $IFS (here, to the empty string, which disables word splitting entirely) fixes the result:
IFS=''
Alternatively, quote the variable in your script:
FOO=$(tr -s ' ' '_' <<< "the string"); echo "$FOO";
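A minimal reproduction of the symptom, assuming something in the old session had set IFS to a value containing the underscore (which would also explain why a fresh terminal behaved correctly):
IFS='_'                                  # hypothetical leftover from the old session
FOO=$(tr -s ' ' '_' <<< "the string")
echo $FOO       # unquoted: split on '_', prints "the string"
echo "$FOO"     # quoted: no splitting, prints "the_string"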

why does changing from ' to " affect the behavior of this one-liner?

Why does simply enclosing my one-liner in ' instead of " affect the behavior of the code? The first line of code produces what is expected, and the second gives (to me!) an unexpected result, printing an unexpected array reference.
$ echo "puke|1|2|3|puke2" | perl -lne 'chomp;#a=split(/\|/,$_);print $a[4];'
puke2
$ echo "puke|1|2|3|puke2" | perl -lne "chomp;#a=split(/\|/,$_);print $a[4];"
This is the Perl version:
$ perl -v
This is perl, v5.10.1 (*) built for x86_64-linux-thread-multi
ARRAY(0x1f79b98)
With double quotes you are letting the shell interpolate variables first.
As you can check, $_ and $a are unset in the subshell that the parent shell forks for the pipe. See the comment on $_ below.
So the double-quoted version is effectively
echo "puke|1|2|3|puke2" | perl -lne 'chomp;#a=split(/\|/);print [4];'
what prints the arrayref [4].
A comment on the effects of having $_ exposed to Bash. Thanks to Borodin for bringing this up.
The $_ is one of a handful of special shell parameters in Bash. It contains the last argument of the previous command, or the pathname of what invoked the shell or commands (via _ environment variable). See the link for a full description.
However, here it is being interpreted in a subshell forked to run the perl command, which is that subshell's first command. Apparently it is not even set, as seen with
echo hi; echo hi | echo $_
which prints an empty line (after the first hi). The reason may be that the _ environment variable just isn't set for a subshell forked for a pipe, but I don't see why this would be the case. For example,
echo hi; (echo $_)
prints two lines with hi even though ( ) starts a subshell.
In any case, $_ in the given pipeline isn't set.
The split part is then split(/\|/), which by default means split(/\|/, $_) -- with nothing to split. With -w added this indeed prints a warning about the use of an uninitialized $_.
Note that this behavior depends on the shell. tcsh won't run this with double quotes at all. In ksh and zsh the last part of a pipeline runs in the main shell, not a subshell, so $_ is set there.
This is actually a shell topic, not a Perl topic.
In shell:
Single quotes preserve the literal value of all of the characters they contain, including the $ and backslash. However, with double quotes, the $, backtick, and backslash characters have special meaning.
For example:
'\"' evaluates to \"
whereas
"\'" evaluates to just '
because with double quotes, the backslash gets a special meaning as the escape character.
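A quick way to see exactly what the shell hands to perl is to print the argument instead of executing it; a minimal sketch in bash (in an interactive session $_ may expand to the previous command's last argument, in a script it is typically empty):
printf '%s\n' 'chomp;@a=split(/\|/,$_);print $a[4];'   # single quotes: passed through literally
printf '%s\n' "chomp;@a=split(/\|/,$_);print $a[4];"   # double quotes: $_ and $a expanded by the shell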

how to replace with sed when source contains $

I have a file that contains:
$conf['minified_version'] = 100;
I want to increment that 100 with sed, so I have this:
sed -r 's/(.*minified_version.*)([0-9]+)(.*)/echo "\1$((\2+1))\3"/ge'
The problem is that this strips the $conf from the original, along with any indentation spacing. What I have been able to figure out is that it's because it's trying to run:
echo " $conf['minified_version'] = $((100+1));"
so of course the shell expands $conf as a variable, which has no value.
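One sketch of a workaround is perl's s///e, which evaluates the replacement as Perl code instead of handing the whole line to a shell, so the $conf in the source is never touched:
perl -pe 's/(\d+)/$1 + 1/e if /minified_version/' file
Add -i to have perl rewrite the file in place.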
Here is an awk version:
$ awk '/minified_version/{$3+=1} 1' file
$conf['minified_version'] = 101
This looks for lines that contain minified_version. Any time such a line is found, the third field, $3, is incremented by one.
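To write the change back, a minimal sketch (POSIX awk has no in-place flag; GNU awk's -i inplace extension is an alternative if available):
awk '/minified_version/{$3+=1} 1' file > file.tmp && mv file.tmp file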
My suggested approach to this would be to have a file on-disk that contained nothing but the minified_version number. Then, incrementing that number would be as simple as:
minified_version=$(< minified_version)
printf '%s\n' "$(( minified_version + 1 ))" >minified_version
...and you could just put a sigil in your source file where that needs to be replaced. Let's say you have a file named foo.conf.in that contains:
$conf['minified_version'] = #MINIFIED_VERSION#
...then you could simply run, in your build process:
sed -e "s/#MINIFIED_VERSION#/$(<minified_version)/g" <foo.conf.in >foo.conf
This has the advantage that you never have code changing foo.conf.in, so you don't need to worry about bugs overwriting the file's contents. It also means that if you're checking your files into source control, so long as you only check in foo.conf.in and not foo.conf, you avoid potential merge conflicts due to context near the version number changing.
Now, if you did want to do the operation in-place, here's a somewhat overdesigned approach written in pure native bash (reading from infile and writing to outfile; just rename outfile back over infile when successful to make this an in-place replacement):
target='$conf['"'"'minified_version'"'"'] = '
suffix=';'
while IFS= read -r line; do
    if [[ $line = "$target"* ]]; then
        value=${line##*=}
        value=${value%$suffix}
        new_value=$(( value + 1 ))
        printf '%s\n' "${target}${new_value}${suffix}"
    else
        printf '%s\n' "$line"
    fi
done <infile >outfile
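A hypothetical driver for the loop above (increment.sh is an assumed name for a script holding it), doing the rename only if the loop exits successfully:
bash increment.sh && mv outfile infile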

How to pipe Bash Shell command's output line by line to Perl for Regex processing?

I have some output data from some Bash shell commands. The output is delimited line by line with "\n" or "\0". I would like to know whether there is any way to pipe the output into Perl and process the data line by line within Perl (just like piping the output to awk, but in my case in the Perl context). I suppose the command would be something like this:
Bash Shell command | perl -e 'some perl commands' | another Bash Shell command
Suppose I want to substitute every ":" character with a "#" character on a line-by-line basis (not a global substitution; I may use a condition, e.g. odd or even line number, to decide whether the current line should have the substitution), how could I achieve this?
See perlrun.
perl -lpe's/:/#/g' # assumes \n as input record separator
perl -0 -lpe's/:/#/g' # assumes \0 as input record separator
perl -lne'if (0 == $. % 2) { s/:/#/g; print; }' # modify and print even lines
Yes, Perl may appear at any place in a pipeline, just like awk.
The command line switch -p (if you want automatic printing) or -n (if you don't want it) will do what you want. The line contents are in $_ so:
perl -pe 's/:/#/g'
would be a solution. Generally, you want to read up on the '<>' (diamond) operator, which is the way to go for non-one-liners.
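As a sketch of the non-one-liner form, here is the same even-line filter as a standalone script built around the diamond operator, which reads the files named in @ARGV, or STDIN when there are none:
#!/usr/bin/perl
use strict;
use warnings;

while (<>) {                  # diamond operator: next line from files/STDIN
    s/:/#/g if $. % 2 == 0;   # substitute only on even-numbered lines
    print;                    # print every line, modified or not
}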

tee to 2 blocks of code?

I am trying to use the tee command on Solaris to route the output of one command to two different streams, each of which comprises multiple statements. Here is the snippet of what I coded, but it does not work: this iteration throws errors about unexpected end of file. If I change the > to | it throws an error, Syntax Error near unexpected token do.
todaydir=/some/path
baselen=${#todaydir}
grep sometext $todaydir/somefiles*
while read iline
tee
>(
# this is the first block
do ojob=${iline:$baselen+1:8}
echo 'some text here' $ojob
done > firstoutfile
)
>(
# this is the 2nd block
do ojob=${iline:$baselen+1:8}
echo 'ls -l '$todaydir'/'$ojob'*'
done > secondoutfile
)
Suggestions?
The "while" should begin (and end) inside each >( ... ) substitution, not outside. Thus, I believe what you want is:
todaydir=/some/path
baselen=${#todaydir}
grep sometext $todaydir/somefiles* | tee >(
    # this is the first block
    while read iline
    do ojob=${iline:$baselen+1:8}
       echo 'some text here' $ojob
    done > firstoutfile
) >(
    # this is the 2nd block
    while read iline
    do ojob=${iline:$baselen+1:8}
       echo 'ls -l '$todaydir'/'$ojob'*'
    done > secondoutfile
)
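For reference, a stripped-down sketch of the same pattern, showing tee feeding two process substitutions (a bash/ksh feature, not plain POSIX sh):
printf '%s\n' alpha beta gamma | tee >(wc -l > count.out) >(tr a-z A-Z > upper.out) > /dev/null
# count.out ends up containing 3; upper.out holds the uppercased lines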
I don't think the tee command will do that. The tee command writes stdin to one or more files as well as spitting it back out to stdout. Plus, I'm not sure the shell can fork off two sub-processes in the command pipeline like you are trying. You'd probably be better off using something like Perl to fork off a couple of sub-processes and write stdin to each.