Can fish shell handle if statements within if statements?

I'm trying to get my config.fish to work, but it's not working as expected, and I really can't understand why. The best I can do is guess that maybe Fish can't handle if statements within if statements? Here's my code:
echo "so far so good"
if status --is-interactive
# Chips: fish plugin manager
if [ -e ~/.config/chips/build.fish ] ; . ~/.config/chips/build.fish ; end
echo "def interactive"
# Don't use vi keybindings in unknown terminals,
# since weird things can happen.
set acceptable_terms xterm-256color screen-256color xterm-termite
echo "acceptable terms: $acceptable_terms"
echo "term: $TERM"
if contains $TERM acceptable_terms
echo "good to go!"
fish_vi_key_bindings
# Load pywal colors
cat ~/.cache/wal/sequences
else
echo "why?!?!?"
end
end
And what I'm getting is this:
so far so good
def interactive
acceptable terms: xterm-256color screen-256color xterm-termite
term: xterm-termite
why?!?!?
But what I expect to see is this:
so far so good
def interactive
acceptable terms: xterm-256color screen-256color xterm-termite
term: xterm-termite
good to go!
But when I run this in a shell, it works fine:
❯ echo "term: $TERM / acceptable: $acceptable_terms"
term: xterm-termite / acceptable: xterm-256color screen-256color xterm-termite
❯ if contains $TERM $acceptable_terms
echo "yay!"
end
yay!
What could be going on here?

Of course fish can handle nested if statements! The problem is that
if contains $TERM acceptable_terms
is missing the $ on the second variable, so contains compares $TERM against the literal string acceptable_terms instead of the elements of your list. It should be:
if contains $TERM $acceptable_terms

Related

How to check substring in Bourne Shell?

I want to test whether a string contains "substring". Most answers online are based on Bash. I tried
if [ $string == "*substring*" ]
which was not working. Currently
if echo ${string} | grep -q "substring"
works. Is there any better way?
Using POSIX-compliant parameter expansion with the classic test command:
#!/bin/sh
substring=ab
string=abc
if [ "$string" != "${string%"$substring"*}" ]; then
    echo "$substring present in $string"
fi
The idea: ${string%"$substring"*} strips the shortest suffix that begins with $substring, so the result differs from $string exactly when $substring occurs somewhere in it.
Or, using the test command explicitly:
if test "$string" != "${string%$substring*}" ; then
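If you use this test in more than one place, it can be wrapped in a function. A minimal sketch, assuming a POSIX sh; the name contains_substr is just for illustration:
#!/bin/sh
# Returns 0 (success) if $1 contains $2, using only parameter expansion.
contains_substr() {
    [ "$1" != "${1%"$2"*}" ]
}

contains_substr "hello world" "lo w" && echo "found"      # prints: found
contains_substr "hello world" "xyz"  || echo "not found"  # prints: not found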
In a POSIX-features only shell you won't be able to do general pattern or regex matching inside a conditional without the help of an external utility.
That said:
Kenster's helpful answer shows how to use the branches of a case ... esac statement for pattern matching.
Inian's helpful answer shows how to match indirectly inside a conditional, using patterns as part of parameter expansions.
Your own grep approach is certainly an option, though you should double-quote ${string}:
if echo "${string}" | grep -q "substring"; ...
A slightly more efficient way is to use the expr utility, but note that per POSIX it supports only BREs (basic regular expressions), which are comparatively limited:
string='This text contains the word "substring".'
if expr "$string" : ".*substring" >/dev/null; then echo "matched"; fi
Note that the regex - the 3rd operand - is implicitly anchored at the start of the input, hence the need for .*.
>/dev/null suppresses expr's default output, which is the length of the matched string in this case. (If nothing matches, the output is 0, and the exit code is set to 1).
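To see the implicit anchoring and the length output in action (a quick illustration; the strings are arbitrary):
$ expr "substring here" : "substring"
9
$ expr "here substring" : "substring"
0
$ expr "here substring" : ".*substring"
14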
If you're just testing for substrings (or anything that can be matched using filename wildcards) you can use case:
#!/bin/sh
while read line; do
    case "$line" in
        *foo*) echo "$line" contains foo ;;
        *bar*) echo "$line" contains bar ;;
        *)     echo "$line" isnt special ;;
    esac
done
$ ./testit.sh
food
food contains foo
ironbar
ironbar contains bar
bazic
bazic isnt special
foobar
foobar contains foo
This is basic Bourne shell functionality. It doesn't require any external programs, it's not bash-specific, and it predates POSIX. So it should be pretty portable.
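The same case technique also works outside a read loop, directly on a variable. A minimal sketch, with an arbitrary test string:
#!/bin/sh
string="This text contains a substring somewhere"
case "$string" in
    *substring*) echo "match" ;;
    *)           echo "no match" ;;
esac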
Short answer is no, not if you are trying to use vanilla sh, without Bash extensions. On many modern systems, /bin/sh is actually a link to /bin/bash, which provides a superset of sh's functionality (for the most part). Your original attempt would have worked with Bash's builtin [[ extended test command: http://mywiki.wooledge.org/BashFAQ/031
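For instance, under bash the [[ keyword makes the original attempt work, provided the pattern is left unquoted (a quick sketch):
string="this has a substring inside"
if [[ $string == *substring* ]]; then
    echo "matched"
fi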

Eval for multiple command execution in ksh93, Solaris

I would like to execute two or more commands back-to-back, but these commands are stored in a variable in my script. For example,
var="/usr/bin/ls ; pwd ; pooladm -d; pooladm -e"
The problem arises when I execute this variable via my script.
Suppose I go:
#!/bin/ksh -p
..
..
var="/usr/bin/ls ; pwd;pooladm -d; pooladm -e"
..
..
$var # DOES NOT WORK ..BUT WORKS WITH EVAL
It doesn't work.
But the moment I use eval :
eval $var
It works brilliantly.
I was just wondering if there is any other way to execute a bunch of commands stored in a variable without using eval.
Also, is eval usage considered bad programming practice? My coding standards appear to shun its use rather than embrace it. Please do let me know.
Remember that the shell parses the line only once. When you expand your $var, the semicolons come from the expansion rather than from the original command line, so they are treated as ordinary characters in the arguments, not as command separators. The shell ends up trying to run /usr/bin/ls with ';', 'pwd;pooladm', '-d;' and so on as arguments, which is not what you want.
On the other hand, eval takes its arguments and re-scans them as shell input; now you get '/usr/bin/ls', 'pwd', and so on as separate commands. It works.
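You can watch the single-parse behavior directly (a quick illustration):
$ var="echo hello ; echo world"
$ $var
hello ; echo world
$ eval $var
hello
world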
eval is a little chancy because it leaves a possible security hole -- consider if someone managed to get 'rm -rf /' into the string. But it's a useful tool.
Use backticks and echo. In your case
`echo $var`
(Caveat: the output of a command substitution is split into words but not re-parsed for operators, so the semicolons stay literal and this has the same problem as a plain $var.)
You could invoke another copy of the shell to run the command:
sh -c "$var"
This isn't necessarily better than using eval. The main practical difference is that eval will run the commands in the context of the current shell, while "sh -c" runs the commands in a separate shell instance. If var contains commands to set environment variables or change the current directory, you may or may not want those commands to affect the current shell.
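For example, a quick illustration of that difference using cd (the directory is arbitrary):
var="cd /tmp"
sh -c "$var"   # the child shell changes directory, then exits
pwd            # still prints the original directory
eval "$var"    # runs in the current shell
pwd            # now prints /tmp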

Handling Perl command line arguments with spaces from a bash script?

This has been driving me nuts for hours now.
Consider the following test script in perl:
(hello.pl)
#!/usr/bin/perl
print "----------------------------------\n";
$numArgs = $#ARGV + 1;
print "thanks, you gave me $numArgs command-line arguments:\n";
foreach $argnum (0 .. $#ARGV) {
    print "$ARGV[$argnum]\n";
}
Ok, it simply prints out the command line arguments given to the script.
For instance:
$ ./hello.pl apple pie
----------------------------------
thanks, you gave me 2 command-line arguments:
apple
pie
I can give the script a single argument with a space by surrounding the words with double quotes:
$ ./hello.pl "apple pie"
----------------------------------
thanks, you gave me 1 command-line arguments:
apple pie
Now I want to use this script in a shell script. I've set up the shell script like this:
#!/bin/bash
PARAM="apple pie"
COMMAND="./hello.pl \"$PARAM\""
echo "(command is $COMMAND)"
$COMMAND
I am calling the hello.pl with the same params and escaped quotes.
This script returns:
$ ./test.sh
(command is ./hello.pl "apple pie")
----------------------------------
thanks, you gave me 2 command-line arguments:
"apple
pie"
Even though echoing $COMMAND shows exactly the command I ran by hand the second time, this time it does not treat "apple pie" as a single argument.
Why not?
This looks like the problem described in the Bash FAQ as: I'm trying to put a command in a variable, but the complex cases always fail!
The answer to that FAQ suggests a number of possible solutions - I hope that's of use.
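The fix that FAQ usually recommends is to store the command as a bash array rather than a flat string. A minimal sketch of the OP's script rewritten that way:
#!/bin/bash
PARAM="apple pie"
COMMAND=(./hello.pl "$PARAM")    # each word or quoted group is one array element
echo "(command is ${COMMAND[*]})"
"${COMMAND[@]}"                  # expands to one argument per element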
The issue of the 2 command-line arguments
"apple
pie"
is due to word splitting on expansion, with the IFS shell variable having a space in its value.
printf '%q\n' "$IFS" # show value of IFS variable
You may use xargs and sh -c '...code...' to mimic / re-enable ordinary parameter parsing.
PARAM="'apple pie'"
printf '%s' "$PARAM" | xargs sh -c './hello.pl "$@"' argv0
Another option may be to write a few lines of C (like in shebang.c)!
http://www.semicomplete.com/blog/geekery/shebang-fix.html
You should try eval $COMMAND instead of simply $COMMAND.
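Applied to the OP's script, that would be (a quick check; same script as above):
#!/bin/bash
PARAM="apple pie"
COMMAND="./hello.pl \"$PARAM\""
eval $COMMAND   # re-parses the embedded quotes, so "apple pie" is one argument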

How do I test if a perl command embedded in a bash script returns true?

So, I have a bash script inside of which I'd like to have a conditional which depends on what a perl script returns. The idea behind my code is as follows:
for i in $(ls); do
    if $(perl -e "if (\$i =~ /^.*(bleh|blah|bluh)/) {print 'true';}"); then
        echo $i;
    fi;
done
Currently, this always returns true, and when I tried it with [[ ]] around the condition, I got errors. Any ideas, anyone?
P.S. I know I can do this with grep, but it's just an example. I'd like to know how to have Bash use Perl output in general.
P.P.S. I know I can do this in two lines, setting the perl output to a variable and then testing for that variable's value, but I'd rather avoid the extra variable if possible. Seems wasteful.
If you use exit, you can just use an if directly. E.g.
if perl -e "exit 0 if (successful); exit 1"; then
    echo $i;
fi;
0 is success, non-zero is failure, and 0 is the default if you don't call exit.
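Applied to the original loop, that might look like this (a sketch reusing the OP's regex; the glob replaces ls, per the other answers):
for i in *; do
    if perl -e 'exit(($ARGV[0] =~ /^.*(bleh|blah|bluh)/) ? 0 : 1)' "$i"; then
        echo "$i"
    fi
done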
To answer your question, you want perl to exit 1 for failure and exit 0 for success. That being said, you're doing this the wrong way. Really. Also, don't parse the output of ls. You'll cause yourself many headaches.
for file in *; do
    if [[ $file = *bl[eau]h ]]; then
        echo "$file matches"
    fi
done
for file in * ; do
    perl -e "shift =~ /^.*(bleh|blah|bluh)/ || exit 1" "$file" && echo $file: true
done
You should never parse the output of ls. You will have, at least, problems with file names containing spaces. Plus, why bother when your shell can glob on its own?
Quoting $file when passing to the perl script avoids problems with spaces in file names (and other special characters). Internally I avoided expanding the bash $file variable so as to not run afoul of quoting problems if the file name contained ", ' or \
Perl seems to (for some reason) always return 0 if you don't exit with an explicit value, which seems weird to me. Since this is the case I test for failure inside the script and return nonzero in that case.
The exit status of the previous command is stored in the bash variable $?. You can do something like:
perl someargs script.pl more args
if [ $? -eq 0 ] ; then
    echo true
else
    echo false
fi
It's a good question. My advice is: keep it simple and go POSIX (avoid Bashisms [1]) where possible.
so ross$ if perl -e 'exit 0'; then echo Good; else echo Bad; fi
Good
so ross$ if perl -e 'exit 1'; then echo Good; else echo Bad; fi
Bad
[1] Sure, the OP was tagged bash, but others may want to know the generic POSIX form.

Is it possible to make 'exec' use '$SHELL -c' instead of '/bin/sh -c' in Perl?

In Perl, is it possible to make 'exec', 'system', and 'qx' use a shell other than /bin/sh (without using a construct like 'exec "$SHELL -c ..."', and without recompiling perl)?
EDIT: The motivation for this question is a bash script that does 'export -f foo' and then uses perl in a subshell to invoke the function directly via 'system "foo"'. I am not sure that this technique will work with all sh, and although 'system "/bin/bash -c foo"' may work in that scenario, I wouldn't expect the exported function to propagate through all variants of /bin/sh. But mostly I was just curious, and am now curious about how to extend the solution to qx. Also, since I know nothing about non-unix platforms, I'd like to avoid hard coding the path to an alternate shell in the solution.
You can override exec and system. See perldoc perlsub for the details, but here is roughly what you want (modulo some quoting bugs I don't feel like trying to fix):
#!/usr/bin/perl
use strict;
use warnings;
use subs qw/system/;

sub system {
    # handle the one-argument version:
    if (@_ == 1) {
        return CORE::system "$ENV{SHELL} -c $_[0]";
    }
    # handle the multi-argument version
    return CORE::system @_;
}

print "normal system:\n";
system "perl", "-e", q{system q/ps -ef | grep $$/};
print "overloaded system:\n";
system 'ps -ef | grep $$';
exec and system will use the shell (which will likely not be /bin/sh on non-UNIX systems) if you pass them a single argument containing shell metacharacters. (Details are described in perlfunc.)
You may want to have a look at IPC::Run3 as an alternative to system
Why don't you want to use 'exec "$SHELL -c ..."'? If you don't want to see that code every time you call exec or system, just hide it in a subroutine. That's what they're there for. :)
sub my_exec {
    exec $ENV{SHELL}, '-c', @_;
}
If you want to do that, however, I suggest somehow sanitizing $ENV{SHELL} so that people don't do odd things to your script by setting weird values. You might want to ensure that the shell is listed in /etc/shells or whatever way your system lists approved login shells. You also need to do a bit more work to make this taint-clean, which you should probably do if you are going to send data to another process.
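A rough sketch of such a check from the calling shell's side, assuming /etc/shells exists on your system (grep -qx does a quiet, whole-line match):
#!/bin/sh
case "$SHELL" in
    /*) ;;                                           # must be an absolute path
    *)  echo "suspicious SHELL: $SHELL" >&2; exit 1 ;;
esac
grep -qx "$SHELL" /etc/shells || {
    echo "SHELL not listed in /etc/shells" >&2
    exit 1
}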
exec doesn't use /bin/sh when you pass it a list.
It just execs the program you specify. No shell involved. (With a single string argument containing shell metacharacters, Perl does hand it to the shell, as noted above.)
If you want it to go through a particular shell, you have to do that yourself.