Is it possible to have fishshell split variables that are in cmd line arguments?
Assume I have a variable $args set like so:
$ set args "-a args"
Now, given this python program (test.py):
import sys
print(sys.argv)
If I run the above in fishshell I get this output:
$ python test.py $args
['test.py', '-a args']
Notice that the arguments are passed as one argument. When I do the equivalent in bash I get this output:
$ python test.py $args
['test.py', '-a', 'args']
Is there some way to make fish behave like bash?
You do not want fish to behave like bash (technically any POSIX compatible shell) with respect to variable expansion. The POSIX behavior is the source of endless problems and is why you need to put double-quotes around almost everything. In fact, most experienced people will tell you to add IFS=$'\n' at the top of your scripts to stop that auto-splitting from happening.
One answer is to use fish's "every var is a list" feature: set args "-a" "args" (the quotes are just for clarity and aren't needed in this example). Each element of the list becomes a separate argument to the command. This will do the right thing even if the args value contains whitespace. The other answer is to explicitly split the string on whitespace using command substitution: a_cmd (string split ' ' $args). This will not do the right thing (in fish or bash) if the args value contains whitespace.
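As a concrete sketch (the outputs shown are what you should expect with the question's test.py, not a captured session), the two approaches look like this:

$ set args "-a" "args"
$ python test.py $args
['test.py', '-a', 'args']

$ set args "-a args"
$ python test.py (string split ' ' $args)
['test.py', '-a', 'args']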
I found a little hack with fish commandline tokenization:
function posix_expand_str --description "Expand a string the POSIX way."
    # save the current command line buffer
    set __posix_expand_str__oldline (commandline)
    # replace the buffer with the arguments we were given...
    commandline $argv
    # ...print that buffer tokenized, one token per line...
    commandline -o
    # ...and then restore the original buffer
    commandline $__posix_expand_str__oldline
    set -e __posix_expand_str__oldline
end
During testing, multiple argument strings behaved as if they had been concatenated into one command line before tokenization (see the third test below). It only does POSIX-style splitting when you ask for it, and it does not break quoted strings.
Test results:
> posix_expand_str "hello world"
hello
world
> posix_expand_str "hello 'posix haters' world"
hello
posix haters
world
> posix_expand_str "hello" 'high rep "stackoverflow staff"' "world"
hello
high
rep
stackoverflow staff
world
I'm building a CLI tasks utility (a cheap version of taskwarrior). I want to add some optional flags, such as -n:
else if [ $cmd = 'delete' ]
    argparse 'n/index'=? -- $argv
    sed -i "$_flag_index d" ~/.tasks/data/Tdo.csv
but this gives an error
~/.tasks/run.sh (line 14): No matches for wildcard “'n/index'=?”. See help expand.
argparse 'n/index'=? -- $argv
I'm unable to understand the correct usage of optional flags, and I've not been able to find enough resources; the fish shell documentation isn't sufficient for a novice in shell scripting, given the lack of examples.
How do I accept an optional argument n/index and execute some code if the argument has been given, and something else otherwise? And is it possible to add integer constraints on optional arguments?
Did you run help expand like fish told you to?
The unquoted ? is being handled as a globbing character. Use 'n/index=?'
$ set argv --index=10
$ argparse 'n/index'=? -- $argv
fish: No matches for wildcard ''n/index'=?'. See `help expand`.
argparse 'n/index'=? -- $argv
^
# quote the whole thing
# v v
$ argparse 'n/index=?' -- $argv
$ set -S _flag_index
$_flag_index: set in local scope, unexported, with 1 elements
$_flag_index[1]: |10|
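To cover the rest of the question (branching on whether the flag was given, and adding an integer constraint), here is a sketch rather than a tested drop-in: the function name is made up, the CSV path is the asker's, and the !_validate_int part uses argparse's documented validation hook, which I believe composes with =?.

function delete_task
    # -n/--index takes an optional value; !_validate_int --min 1 rejects non-integers and values below 1
    argparse 'n/index=?!_validate_int --min 1' -- $argv
    or return    # argparse has already printed the error
    if set -q _flag_index[1]
        # a value was given, e.g. --index=3: delete that line from the data file
        sed -i "$_flag_index[1] d" ~/.tasks/data/Tdo.csv
    else
        echo "no index given; nothing deleted"
    end
end

Because the value is optional (=?), it has to be attached to the flag, as in delete_task --index=3.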
I'm trying to switch to fish shell but I've run into one sticking point. If I alias vi to vim and edit a file, I can't see which file I'm editing in job control.
polis#josh1 ~> function testvi
vim $argv
end
polis#josh1 ~> functions testvi
function testvi
vim $argv
end
polis#josh1 ~> testvi foobar
polis#josh1 ~> jobs
Job Group CPU State Command
2 26087 0% stopped vim $argv
How do I make it so the output is:
2 26087 0% stopped vim foobar
This is a situation where a command abbreviation is preferable to an alias (i.e., a function):
abbr -a vi vim
Now when you type "vi" and press space it will be magically replaced by "vim". You can do more complicated expansions. For example I use these abbreviations quite a bit:
abbr -a gca git commit --amend
abbr -a gcm git checkout master
The advantage of abbreviations is that they are much simpler than a function, and the expansion shows up in your shell history, which I find more useful than an aliased function name.
The idea is we want to expand $argv first into a new command, and then execute that. We can do that with eval.
However, this has the wrinkle that any special characters in the new command will be interpreted, so we need to escape them first.
The overall function is:
function testvi
    eval "vim "(string escape $argv)
end
This uses the new string builtin, which is in the just-released fish 2.3.0.
Illustration of the result, here editing a file 'foobar $baz' to show that escaping works:
> testvi 'foobar $baz'
> jobs
Job Group State Command
5 60249 stopped vim 'foobar $baz'
I often see this command in node.js programs: NODE_ENV=test node app.js, which sets the NODE_ENV variable to test and works. I also read here https://en.wikipedia.org/wiki/Environment_variable that this should work for any shell command, but running some tests of my own, here is what I see:
$ HELLO="WORLD"
$ HELLO="MARS" echo "$HELLO"
WORLD
$
I would expect this to print MARS. Is there something I am missing here?
The syntax VAR=value command means that the command will be invoked with the environment variable VAR set to value, and this will apply only for the scope of that command.
However, when you are using the command line:
HELLO="MARS" echo "$HELLO"
The shell first interprets the "$HELLO" parameter, determines that it is WORLD, and then what it actually does is run:
HELLO="MARS" echo "WORLD"
So echo does have the HELLO variable set to MARS in its environment, but that doesn't affect what it prints - its argument was already expanded before it ran.
Doing
HELLO="MARS"; echo "$HELLO"
does something else entirely. First it sets HELLO to MARS in the current shell, and then it goes on to interpret the echo command. By this time HELLO contains MARS, not WORLD. But this is an entirely different effect - the variable HELLO keeps the value MARS afterwards, which is not the case in the command without the ;.
Your problem is that echo is just a poor choice for a demonstration of this. You can do other demonstrations to prove that HELLO is set properly:
HELLO="MARS" eval 'echo $HELLO'
In this case, the shell will not interpret the $HELLO because it is within a string in single quotes. It will first put MARS in HELLO, and then call eval 'echo $HELLO' with that variable set. The eval command will then run echo $HELLO, and you'll get the output you were expecting.
This syntax is best used for things that don't use the given variable as part of the command line, but rather use it internally.
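For instance (my addition, not part of the original answer), two commands that read HELLO from their own environment show the per-command assignment taking effect:

$ HELLO="MARS" printenv HELLO
MARS
$ HELLO="MARS" sh -c 'echo "$HELLO"'
MARS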
The other answers are correct, but here is a refinement:
In fact, when you write a space-separated list of variable assignments in bash, there are two cases, depending on whether or not the list ends with a command:
VAR1=value1 VAR2=value2 ... VARn=valuen command arg1 arg2 ... argn
and
VAR1=value1 VAR2=value2 ... VARn=valuen
These two forms do not set VAR1 ... VARn in the same way.
In the first case, VAR1 ... VARn are set only in the environment of command; the current shell is not affected.
In the second case, VAR1 ... VARn are set in the current shell.
So (note that ';' is the same as starting a new line):
HELLO=WORLD
HELLO=MARS echo "i don't export HELLO."
echo "HELLO=$HELLO"
will display
i don't export HELLO.
HELLO=WORLD
and
HELLO=WORLD
HELLO=MARS ; echo "i did export HELLO."
echo "HELLO=$HELLO"
will display
i did export HELLO.
HELLO=MARS
I want to create a script to facilitate producing local text file extracts from Hive.
This is basically to execute commands like the one below:
hive -e "SET hive.cli.print.header=true;SELECT * FROM dropme"|perl -pe 's/(?:\t|^)\KNULL(?=\t|$)//g'>extract/outbound/dropme.txt
While the above works like a charm, I find it quite problematic to implement through the following parametrized script (much simplified):
#!/bin/sh
TNAME=dropme
SQL="SELECT * FROM $TNAME"
echo $SQL
echo "SQL: $SQL"
EXTRACMD="hive -e \"SET hive.cli.print.header=true;$SQL\"|perl -pe 'BEGIN{if(defined(\$_=<ARGV>)){s/\b\w+\.//g;print}}s/(?:\t|^)\KNULL(?=\t|$)//g'>extract/outbound/$TNAME.txt"
echo "CMD: $EXTRACMD";
${EXTRACMD}
When run I get: Exception in thread "main" java.lang.NumberFormatException: For input string: "e"
I know there are many ways to print the text or execute the command. For instance, the line echo $SQL prints a list of files in the directory instead:
SELECT file1.txt file2.txt file3.txt file4.txt FROM dropme
while the next one, echo "SQL: $SQL", gives just what I want: SQL: SELECT * FROM dropme
echo "CMD: $EXTRACMD" prints the (almost) the command to be executed. Almost, as I see \t in perl code being expanded:
CMD: hive -e "SET hive.cli.print.header=true;SELECT * FROM dropme"|perl -pe 'BEGIN{if(defined($_=<ARGV>)){s\w+\.//g;print}}s/(?: |^)\KNULL(?= |$)//g'>extract/outbound/dropme.txt
Maybe that's still OK, but what I want is to be able to copy and paste this command into another terminal and execute it as the command I put at the top. Ideally I would like that command to be exactly the same (so with \t there).
The biggest problem comes when I try to execute it (the ${EXTRACMD} line). I'm getting the error:
Exception in thread "main" java.lang.NumberFormatException: For input string: "e" …and so on; the rest is irrelevant, as bash apparently treats every 'word' as a separate item here. I don't even know what it really tries to run (the prior print attempt obviously doesn't help).
I'm aware that I have multiple options, like:
escaping special characters in the command definition string (like I did with the double quotes)
experimenting with echo and $VAR, '$VAR' or "$VAR"
experimenting with "${EXTRACMD}" or evaluating through eval "${EXTRACMD}"
experimenting with shopt -s extglob or set -f
but as the number of combinations is quite large and my bash experience is limited, I feel it's better to ask about good practice here, so my question is:
Is there a way to print a (complex/compound) shell command first and subsequently execute it exactly as printed? In this case that would mean printing the exact command from the top, then executing it the same way as if I had manually copied that output into a terminal prompt and pressed Enter.
Do not construct commands as strings. See http://mywiki.wooledge.org/BashFAQ/050 for details.
That page also talks about a built-in way of getting the shell to tell you what it is running (section 6).
If that doesn't do what you want you can also, with bash, try using printf %q\\n "${arr[*]}".
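Applied to the command from the question, a sketch (the array name is made up; the perl filter and the redirection stay as literal shell syntax instead of going into a string):

TNAME=dropme
hive_cmd=(hive -e "SET hive.cli.print.header=true;SELECT * FROM $TNAME")
# print a copy-and-paste-able form of the command
printf '%q ' "${hive_cmd[@]}"; echo
# run it (set -x would also make bash echo what actually runs)
"${hive_cmd[@]}" | perl -pe 's/(?:\t|^)\KNULL(?=\t|$)//g' > "extract/outbound/$TNAME.txt"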
I am trying to run a program like this:
$CMD $ARGS
where $ARGS is a set of arguments with spaces. However, zsh appears to be handing off the contents of $ARGS as a single argument to the executable. Here is a specific example:
$ export ARGS="localhost -p22"
$ ssh $ARGS
ssh: Could not resolve hostname localhost -p22: Name or service not known
Is there a bash or zsh flag that controls this behavior?
Note that when I put this type of command in a #!/bin/sh script, it performs as expected.
In zsh it's easy:
Use cmd ${=ARGS} to split $ARGS into separate arguments by word (split on whitespace).
Use cmd ${(f)ARGS} to split $ARGS into separate arguments by line (split on newlines but not spaces/tabs).
As others have mentioned, setting SH_WORD_SPLIT makes word splitting happen by default, but before you do that in zsh, cf. http://zsh.sourceforge.net/FAQ/zshfaq03.html for an explanation as to why whitespace splitting is not enabled by default.
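Applied to the example from the question, a quick sketch:

$ export ARGS="localhost -p22"
$ ssh ${=ARGS}

Here ${=ARGS} makes zsh split $ARGS on whitespace, so ssh sees localhost and -p22 as two separate arguments.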
If you want to have a string variable (there are also arrays) split into words before being passed to a command, use $=VAR. There is also an option (shwordsplit, if I am not mistaken) that makes any command $VAR act like command $=VAR, but I suggest not setting it: I find it very inconvenient to type things like command "$VAR" (zsh: command $VAR) and command "${ARRAY[@]}" (zsh: command $ARRAY).
It will work if you use eval $CMD $ARGS.
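For the question's example that would look something like this sketch; keep in mind that eval re-parses the whole string, so any quotes or special characters inside $ARGS get interpreted again:

$ export ARGS="localhost -p22"
$ eval ssh $ARGS

which is re-parsed and run as ssh localhost -p22.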