Splitting a command over multiple lines in fish-shell - fish

I'm trying to split my list of additional paths on to multiple lines in my fish config:
# Path additions
for i in \
        ~/Library/Haskell/ghc-7.0.2/lib/gtk2hs-buildtools-0.12.0/bin \
        ~/Library/Haskell/bin \
        /Applications/MacVim.app/Contents/MacOS \
        /opt/local/bin \
        /usr/local/bin \
        /usr/local/git/bin \
        /Users/lyndon/.gem/ruby/1.8/bin
    if not contains $i $PATH
        set -x PATH $i $PATH
    end
end
However, this doesn't seem to work unless all the items are on one line.
Is this possible? I can't seem to find any information on doing this.
Alternatively, is there a way to use a list/array literals to do this?

I'm using fish 2.0.0 on OSX 10.8.5 and your example works as I would expect (for paths that exist on my machine).
Running this code
# Path additions
for i in \
        ~/bin \
        ~/.config/fish/functions
    if not contains $i $PATH
        echo $i
    end
end
where ~/bin is set on PATH and ~/.config/fish/functions is not, outputs:
~/bin
I appended the above to my fish config and then ran it like this:
. ~/.config/fish/config.fish
Here is a little more info on multiline editing: http://fishshell.com/docs/2.0/index.html#multiline
There are no array literals in fish. You can read more about fish arrays in the docs. http://fishshell.com/docs/2.0/index.html#variables-arrays
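Since fish lists are just multi-valued variables, you can also collect the paths in a list first and then loop over it, which some people find more readable than the backslash continuations. A minimal sketch (the variable name extra_paths is only illustrative):
set -l extra_paths \
        ~/Library/Haskell/bin \
        /opt/local/bin \
        /usr/local/bin
for i in $extra_paths
    if not contains $i $PATH
        set -x PATH $i $PATH
    end
end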

Related

How to list files and substitute pattern in makefile

I have scripts like this
#!/bin/sh
SHELL_CMD=busybox
and I want to use a Makefile to substitute the pattern so that the scripts use bash instead, like this:
#!/bin/bash
#SHELL_CMD=busybox
The following is the rule in my Makefile:
release:
	@rm -rf my_temp/
	@mkdir my_temp/
	@cp dir1/burn_*.sh dir1/dump_*.sh my_temp/
	@cd my_temp/; \
	for f in $(shell ls); do \
		sed 's:#!/bin/sh\nSHELL_CMD=busybox:#!/bin/bash\n#SHELL_CMD=busybox:1' $${f} > temp; mv temp $${f}; \
	done; \
	cd ..;
	@cd my_temp/; tar -jcv -f bash.tar.bz2 *.sh; cd ..;
My questions are:
1. In the for loop, it doesn't pick up the correct script names. How can I fix that?
2. Is there a better pattern for this sed substitution?
You are much better off doing the substitution without trying to match the newline in the source string. Unless you are ready to do some complex sed-fu (see "How can I replace a newline (\n) using sed?"), just apply a separate substitution to each line: one expression for the first line and one for the second. Also, the $(shell ls) part is not needed; a shell glob is enough to pick up the files ending in .sh:
	@for f in *.sh; do \
		sed -i -e '1 s:#!/bin/sh:#!/bin/bash:1' -e '2 s:^:#:1' $${f}; \
	done
If you don't want the -i in-place substitution, use the tmp file approach as you had originally shown.
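Putting it together, the whole release rule might look roughly like this (a sketch that reuses the directory names from the question; note that sed -i as used here is a GNU sed extension):
release:
	rm -rf my_temp/
	mkdir my_temp/
	cp dir1/burn_*.sh dir1/dump_*.sh my_temp/
	cd my_temp/ && \
	for f in *.sh; do \
		sed -i -e '1 s:#!/bin/sh:#!/bin/bash:' -e '2 s:^:#:' $$f; \
	done
	cd my_temp/ && tar -jcv -f bash.tar.bz2 *.sh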

Command Line Mass Rename Jpg Files

I have a folder full of jpg files which all end with "-x-large.jpg". I would like to rename them all from the command line so that the -x-large part is removed and they just end in .jpg.
So for example 123-x-large.jpg will become 123.jpg
Can someone tell me how I can do this with the ren command?
Thanks.
for img in *-x-large.jpg; do mv -i -v "$img" "${img%-x-large.jpg}.jpg"; done
This loops on all matching images and moves them into a new file with a truncated name (removing -x-large.jpg from the end) with the .jpg added back to the end of the file name. I'm invoking this interactively with mv -i so you are prompted before overwriting each file. To force overwriting (always say "yes"), change that to mv (remove the -i). To prevent overwriting (always say "no"), change that to mv -n.
Remove the -v (verbose) if you don't want to see each rename happen.
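As a quick illustration of the ${img%-x-large.jpg} parameter expansion that does the renaming, using the example name from the question:
$ img=123-x-large.jpg
$ echo "${img%-x-large.jpg}.jpg"
123.jpg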
If you have a very large number of these files, you may prefer not to rely on the shell expanding *-x-large.jpg into one huge word list. You can work around that with find and sed as follows:
sh <(find . -maxdepth 1 -name '*-x-large.jpg' \
|sed -r 's/(.*)(-x-large.jpg)$/mv -i "\1\2" "\1.jpg"/')
This creates a shell script using bash process substitution, using find to generate a list of all files we want to rename and then piping them through sed to create the mv commands.
(See above for the mv flags. I removed -v because presumably this will be a very long list.)
See the version below if you want to check the script before running it.
The above one-liner requires GNU bash or Korn shell (ksh) as well as GNU sed.
Here's how to do it with neither (in three commands):
find . -maxdepth 1 -name '*-x-large.jpg' \
|sed 's/.*/mv "&" "&/; s/-x-large.jpg$/.jpg"/' > temp.sh
sh temp.sh
rm temp.sh
POSIX sed is not guaranteed to support -r/ERE, and some old sed implementations are unreliable with capture groups and back-references, so this version simply writes most of the command and then fixes the ending (the absence of a trailing double quote in the first replacement is intentional; we add it in the second replacement). POSIX shell (/bin/sh proper) doesn't support process substitution, so we dump to a temporary file, evaluate it, and then remove it.
If we're referring to Windows command-line, then SET /? is your friend. Loads of good info in there.
setlocal ENABLEDELAYEDEXPANSION
set SEARCH_SUFFIX=-x-large.jpg
set REPLACE_SUFFIX=.jpg
for %%A in ("*%SEARCH_SUFFIX%") do (
    set OLD_NAME=%%~nxA
    set NEW_NAME=!OLD_NAME:%SEARCH_SUFFIX%=%REPLACE_SUFFIX%!
    ren "!OLD_NAME!" "!NEW_NAME!"
)
endlocal
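Note that the %%A syntax means this is intended to be saved as a .bat/.cmd file and run from the folder containing the images; if you paste the loop directly at the cmd.exe prompt, use %A instead of %%A.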

fish shell: Is it possible to conveniently strip extensions?

Is there any convenient way to strip an arbitrary extension from a file name, something à la bash ${i%%.*}? Do I stick to my friend sed?
If you know the extension (e.g. _bak, a common use case), this is possibly more convenient:
for f in (ls *_bak)
mv $f (basename $f _bak)
end
Nope. fish has a much smaller feature set than bash, relying on external commands:
$ set filename foo.bar.baz
$ set rootname (echo $filename | sed 's/\.[^.]*$//')
$ echo $rootname
foo.bar
You can strip off the extension from a filename using the string command:
echo (string split -r -m1 . $filename)[1]
This will split filename at the right-most dot and print the first element of the resulting list. If there is no dot, that list will contain a single element with filename.
If you also need to strip off leading directories, combine it with basename:
echo (basename $filename | string split -r -m1 .)[1]
In this example, string reads its input from stdin rather than being passed the filename as a command line argument.
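For example, with a made-up path:
$ set filename src/archive.tar.gz
$ echo (basename $filename | string split -r -m1 .)[1]
archive.tar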
--- Update 2022-08-02 ---
As of fish 3.5+, there is a path command (docs) which was designed to handle stripping extensions:
$ touch test.txt.bak
$ path change-extension '' ./test.txt.bak
test.txt
You can also strip a set number of extensions:
set --local file ./test.txt.1.2.3
for i in (seq 3)
set file (path change-extension '' $file)
end
echo $file
# ./test.txt
Or strip all extensions:
set --local file ./test.txt.1.2.3
while path extension $file
set file (path change-extension '' $file)
end
echo $file
# ./test
--- Original answer ---
The fish string command is still the canonical way to handle this. It has some really nice sub commands that haven't been shown in other answers yet.
split lets you split from the right with a max of 1, so that you just get the last extension.
for f in *
echo (string split -m1 -r '.' "$f")[1]
end
replace lets you use a regex to lop off the extension, defined as the final dot to the end of the string
for f in *
string replace -r '\.[^\.]*$' '' "$f"
end
man string for more info and some great examples.
Update:
If your system has proper basename and dirname utilities, you can use something like this:
function stripext --description "strip file extension"
    for arg in $argv
        echo (dirname $arg)/(string replace -r '\.[^\.]+$' '' (basename $arg))
    end
end
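Usage would look like this (the path is made up):
$ stripext /path/to/file.tar.gz
/path/to/file.tar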
With the string match function built into fish you can do
set rootname (string match -r "(.*)\.[^\.]*\$" $filename)[2]
string match returns a list of 2 items: the first is the whole string, and the second is the first regexp match (the stuff inside the parentheses in the regex). So we grab the second one with the [2].
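For example, with an illustrative filename:
$ set filename archive.tar.gz
$ string match -r "(.*)\.[^\.]*\$" $filename
archive.tar.gz
archive.tar
$ echo (string match -r "(.*)\.[^\.]*\$" $filename)[2]
archive.tar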
I too needed a function to split arbitrary file paths into root and extension. Rather than naively re-implementing the feature and risking the usual corner cases (e.g. a dot earlier in the path), I hand the task over to Python's built-in path-handling libraries and inherit their expertise.
Here is a humble example of what one may prefer:
function splitext --description "Print filepath(s) root, stem or extension"
    argparse 'e/ext' 's/stem' -- $argv
    for arg in $argv
        if set -q _flag_ext
            set cmd 'import os' \
                "_, ext = os.path.splitext('$arg')" \
                'print(ext)'
        else if set -q _flag_stem
            set cmd 'from pathlib import Path' \
                "p = Path('$arg')" \
                'print(p.stem)'
        else
            set cmd 'import os' \
                "root, _ = os.path.splitext('$arg')" \
                'print(root)'
        end
        python3 -c (string join ';' $cmd)
    end
end
Examples:
$ splitext /this/is.a/test.path
/this/is.a/test
$ splitext --ext /this/is.a/test.path
.path
$ splitext --stem /this/is.a/test.path
test
$ splitext /this/is.another/test
/this/is.another/test

comparing two directories with separate diff output per file

I need to see what has changed between two directories that contain different versions of a piece of software's source code. While I have found a way to get a single .diff file, how can I obtain a separate file for each changed file in the two directories? I need this because the single "main" diff is about 6 MB and I wanted something more manageable.
I ran into this problem too, so I ended up with a few lines of shell script. It takes three arguments: the source and destination directories (as used for diff) and a target folder (which should already exist) for the output.
It's a bit hacky, but maybe it would be useful for someone. So use with care, especially if your paths have special characters.
#!/bin/sh
DIFFARGS="-wb"
LANG=C
TARGET=$3
SRC=`echo $1 | sed -e 's/\//\\\\\\//g'`
DST=`echo $2 | sed -e 's/\//\\\\\\//g'`

if [ ! -d "$TARGET" ]; then
    echo "'$TARGET' is not a directory." >&2
    exit 1
fi

diff -rqN $DIFFARGS "$1" "$2" | sed "s/Files $SRC\/\(.*\?\) and $DST\/\(.*\?\) differ/\1/" | \
while read file
do
    if [ ! -d "$TARGET/`dirname \"$file\"`" ]; then
        mkdir -p "$TARGET/`dirname \"$file\"`"
    fi
    diff $DIFFARGS -N "$1/$file" "$2/$file" > "$TARGET"/"$file.diff"
done
If you want to compare source code, it is better to commit it to a version control system such as svn.
After you have done so, do a diff of your uploaded code and pipe it to file.diff:
svn diff --old svn:url1 --new svn:url2 > file.diff
A bash for loop will work for you. The following will diff two directories with C source code and produce a separate diff for each file.
for FILE in $(find <FIRST_DIR> -name '*.[ch]'); do
    DIFF=<DIFF_DIR>/$(echo $FILE | grep -o '[-_a-zA-Z0-9.]*$').diff
    diff -u $FILE <SECOND_DIR>/$FILE > $DIFF
done
Use the correct patch level (the -p option to patch) for the paths in the lines starting with +++.
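For example, applying one of the generated diffs back to a tree might look like this (directory and file names are placeholders; adjust -p so it matches the paths recorded after +++):
cd <SECOND_DIR>
patch -p1 -i <DIFF_DIR>/somefile.c.diff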

Dynamically building an exclude list in both rsync & egrep format

I wonder if anyone out there can assist me in trying to solve an issue I have.
I have written a set of shell scripts with the purpose of auditing remote file systems based on a GOLD build held on an audit server.
As part of this, I do the following:
1) Use rsync to work out any new files or directories, any modified or removed files
2) Use find ${source_filesystem} -ls on both local & remote to work out permissions differences
Now, as part of this, there are certain files and directories that I am excluding, e.g. logs, trace files, etc.
So in order to achieve this I use 2 methods:
1) RSYNC - I have an exclude-list that is added using --exclude-from flag
2) find -ls - I use an egrep -v statement to exclude the same entries as the rsync exclude-list:
e.g. find -L ${source_filesystem} -ls | egrep -v "$SEXCLUDE_supt"
So my issue is that I have to maintain 2 separate lists, and this is a bit of an admin nightmare.
I am looking for some assistance or advice on whether it is possible to dynamically build a list of exclusions that can be used for both the rsync and the find -ls.
Here is the format of what the exclude lists look like:
RSYNC:
*.log
*.out
*.csv
logs
shared
tracing
jdk*
8.6_Code
rpsupport
dbarchive
inarchive
comms
PR116PICL
**/lost+found*/
dlxwhsr*
regression
tmp
working
investigation
Investigation
dcsserver_weblogic_*.ear
dcswebrdtEAR_weblogic_*.ear
FIND:
SEXCLUDE_supt="\.log|\.out|\.csv|logs|shared|PR116PICL|tracing|lost\+found|jdk|8\.6\_Code|rpsupport|dbarchive|inarchive|comms|dlxwhsr|regression|tmp|working|investigation|Investigation|dcsserver_weblogic_|dcswebrdtEAR_weblogic_"
You don't need to create a second list for your find command. grep can handle a list of patterns using the -f flag. From the manual:
-f FILE, --file=FILE
Obtain patterns from FILE, one per line. The empty file contains zero
patterns, and therefore matches nothing. (-f is specified by POSIX.)
Here's what I'd do:
find -L ${source_filesystem} -ls | grep -Evf your_rsync_exclude_file_here
This should also work for filenames containing newlines and spaces. Please let me know how it goes.
In the end the grep -Evf approach was a bit of a nightmare, as rsync doesn't support regexes: its exclude patterns look similar, but they are not the same.
So I then pursued my other idea of dynamically building the exclude list for egrep by parsing the rsync exclude-list and building a variable on the fly to pass into egrep.
This is the method I used:
#!/bin/ksh
# Create Signature of current build
AFS=$1

# Create Signature File
crSig()
{
    find -L ${SRC} -ls | egrep -v "$SEXCLUDE" | awk '{fws = ""; for (i = 11; i <= NF; i++) fws = fws $i " "; print $3, $6, fws}' | sort >${BASE}/${SIFI}.${AFS}
}

# Setup SRC, TRG & SCROOT
LoadAuditReqs()
{
    export SRC=`grep ${AFS} ${CONF}/fileSystem.properties | awk {'print $2'}`
    export TRG=`grep ${AFS} ${CONF}/fileSystem.properties | awk {'print $3'}`
    export SCROOT=`grep ${AFS} ${CONF}/fileSystem.properties | awk {'print $4'}`
    export BEXCLUDE=$(sed -e 's/[*/]//g' -e 's/\([._+-]\)/\\\1/g' ${CONF}/exclude-list.${AFS} | tr "\n" "|")
    export SEXCLUDE=$(echo ${BEXCLUDE} | sed 's/\(.*\)|/\1/')
}

# Load Properties File
LoadProperties()
{
    . /users/rpapp/rpmonit/audit_tool/conf/environment.properties
}

# Functions
LoadProperties
LoadAuditReqs
crSig
So with these new variables:
export BEXCLUDE=$(sed -e 's/[*/]//g' -e 's/\([._+-]\)/\\\1/g' ${CONF}/exclude-list.${AFS} | tr "\n" "|")
export SEXCLUDE=$(echo ${BEXCLUDE} | sed 's/\(.*\)|/\1/')
I use them to remove "*" and "/", then match the special characters and prepend "\" to escape them.
Then, using tr, each newline is replaced with "|", and that output is run through sed once more to strip the trailing "|". The result is the $SEXCLUDE variable that egrep uses in the crSig function.
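To illustrate what those two lines produce, here is the transformation applied to a tiny sample exclude-list (the file name and entries are made up; run in ksh or bash):
$ printf '*.log\nlogs\n8.6_Code\n' > exclude-list.sample
$ BEXCLUDE=$(sed -e 's/[*/]//g' -e 's/\([._+-]\)/\\\1/g' exclude-list.sample | tr "\n" "|")
$ echo ${BEXCLUDE} | sed 's/\(.*\)|/\1/'
\.log|logs|8\.6\_Code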
What do you think?