bash script problem: find, mv tilde files created by gedit

I'm using Linux with gedit, which has the wonderful habit of creating a temp file with a tilde at the end for every file I edit.
I'm trying to move all of these files at once to a different folder using the following:
find . -iname "*.php~" -exec mv {} /mydir \;
However, it's now giving me syntax errors, as if it were searching through each file and trying to move pieces of text. I just want to move all of the files ending in .php~ to another directory. Any idea how I do that?
Cheers Ke

Try this one-liner:
for D in `find . -iname "*.php~"`; do mv "${D}" /mydir; done
For future reference, if you go into Edit > Preferences > Editor tab, there is a checkbox for "Create a backup copy of files before saving". That is the guy responsible for creating the tilde version.
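If any of the backup names contain spaces, a loop that reads find's output line by line should be a little safer than iterating over the backtick expansion (a minimal sketch, using the same /mydir destination as above):
find . -iname "*.php~" -print | while IFS= read -r f; do
    mv "$f" /mydir    # quoting keeps names with spaces intact
done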

GNU find
find . -iname "*.php~" -exec mv -t /mydir {} +
or
for file in *.php~
do
    echo mv "$file" /mydir    # drop the echo once the output looks right
done
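With GNU find and coreutils, the moves can also be batched through xargs using null-delimited names (a sketch; -print0, -0, and mv -t are GNU extensions):
find . -iname "*.php~" -print0 | xargs -0 mv -t /mydir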

Related

Copy files based on regex to another folder but keep folder structure

I want to copy files matching a regex to another folder while keeping part of the folder structure. All the file paths will contain src/main/java/, but the path before that is different for most files.
I know that I can use
find . -iregex ".*HeadersConstants\.java" -exec cp {} ./destination/ \;
to copy a file, but then I lose the file path in the destination dir.
Are you on Linux? Historically, cpio would have been the obvious choice, but these days rsync is likely to be better:
find . -iregex ".*HeadersConstants\.java" |\
rsync -v --files-from=- ./ ${destination}/
It's probably not a good idea for the destination to be inside . as your question code suggests, but we can stop find looking there with:
find . -path ./destination -prune \
-o -iregex ".*HeadersConstants\.java" -print |\
rsync -v --files-from=- ./ ./destination/
(You may want to investigate why the -print is required.)
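For concreteness, a complete invocation under the assumption that the destination lives inside the tree (the directory name here is only an example) might look like:
destination=./destination        # hypothetical target directory, inside . as in the answer above
mkdir -p "$destination"
find . -path "$destination" -prune \
    -o -iregex ".*HeadersConstants\.java" -print |
    rsync -v --files-from=- ./ "$destination"/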
In the meantime I got it to work (probably not the cleanest way):
javaRe='(.*)\/src\/main\/java\/(.*)\/'
find . -name "HeadersConstants\.java" | while read -r f
do
    if [[ ${f} =~ ${javaRe} ]]; then
        path=${BASH_REMATCH[2]}
        fullpath=${destination}${srcDir}${path}   # destination and srcDir are set earlier in the full script
        mkdir -p "$fullpath"
        cp "$f" "$fullpath"
    fi
done
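If GNU coreutils are available, cp --parents reproduces the source path under the destination and may be a shorter route (a sketch, not from the original answers; it assumes the destination is ./destination inside the tree):
mkdir -p ./destination
find . -path ./destination -prune \
    -o -name "HeadersConstants.java" -exec cp --parents {} ./destination \;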

Insert the same text file inside several text files

I would like to insert the content of a text file, let's call it Menu.incl inside several different files. I know how to do that on a single file, by using the sed command and by introducing a mark in the target file:
sed -e '/the Menu is Here/r Menu.incl' F.html > F_Menu.html
I want to keep the original file for possible future evolution of the Menu.incl file.
How could I do that for each member of a collection of HTML files? My guess: with the find command. I tried to obtain the expected result with the following command:
find . -name '*.html' -exec sed -e '/the Menu is Here/r Menu.incl' '{}' +
but I don't know how to save each modified .html content inside a copy, i.e. without spoiling the original HTML files.
Thank you for your help,
Romuald
Try with process substitution:
while read -r f; do
    d=$(dirname "$f")
    b=$(basename "$f")
    b="new.$b"
    sed -e '/the Menu is Here/ r Menu.incl' "$f" > "$d/$b"
done < <(find . -type f -iname "*.html" -print)
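The same thing can probably be done without the explicit while loop by letting find spawn a small shell for batches of files; this sketch reuses the new. prefix from the answer above:
find . -type f -iname '*.html' -exec sh -c '
    for f; do
        sed -e "/the Menu is Here/r Menu.incl" "$f" > "${f%/*}/new.${f##*/}"
    done' sh {} +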

How to create links to all subfolders containing specified text in their names

As specified in the title, I am looking for a way to create links to all subfolders containing specified text in their names. For example, for all subfolders of the root directory containing ".app" in their names, a link would be created in the "/AppLinks" directory. I would like to use it in a bash script (open source, free).
Does anyone know how to do that?
I searched Google with no luck.
find yourdir -type d -name '*.app' -exec ln -s {} /AppLinks \;
Find all directories named something.app in yourdir, and create a symlink to them in /AppLinks.
single line bash-fu
function FUNCsymlink() { echo "$1"; fileName=`basename "$1"`; ln -s "$1" "/AppLinks/$fileName"; }; export -f FUNCsymlink; find `pwd`/ -maxdepth 1 -type d -iname "*.app" -exec bash -c "FUNCsymlink '{}'" \;
for easier reading:
function FUNCsymlink() {
    echo "$1";
    fileName=`basename "$1"`;
    ln -s "$1" "/AppLinks/$fileName";
};
export -f FUNCsymlink;
find `pwd`/ -maxdepth 1 -type d -iname "*.app" -exec bash -c "FUNCsymlink '{}'" \;
You may have to adjust it a bit for your specific situation.
Wherever you run it, it will create the symlinks in /AppLinks.
It will only look at direct subfolders, not subfolders of subfolders, which is what I believe you need.
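If the exported function feels heavy, the same effect can probably be had with a single find, combining the first answer with -maxdepth 1 (a sketch; /AppLinks is the target from the question):
find "$PWD" -maxdepth 1 -type d -iname '*.app' -exec ln -s {} /AppLinks \;
Using "$PWD" makes the link targets absolute, so the links resolve no matter where /AppLinks lives.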

Create symbolic link from find

I'm trying to create a symbolic link (soft link) from the results of a find command. I'm using sed to remove the ./ that precedes the file name. I'm doing this so I can paste the file name to the end of the path where the link will be saved. I'm working on this with Ubuntu Server 8.04.
I learned from this post, which is kind of the solution to my problem but not quite:
How do I selectively create symbolic links to specific files in another directory in LINUX?
The resulting file name didn't work, though, so I started trying to learn awk and then decided on sed.
I'm using a one-line loop to accomplish this. The problem is that the structure of the loop is separating the filename, creating a link for each word in the filename. There are quite a few files and I would like to automate the process with each link taking the filename of the file it's linked to.
I'm comfortable with basic bash commands but I'm far from being a command line expert. I started this with ls and awk and moved to find and sed. My sed syntax could probably be better but I've learned this in two days and I'm kind of stuck now.
for t in `find -type f -name "*txt*" | sed -e 's/.//' -e 's$/$$'`; do echo ln -s $t ../folder2/$t; done
Any help or tips would be greatly appreciated. Thanks.
Easier:
Go to the folder where you want to have the links and do:
find /path/with/files -type f -name "*txt*" -exec ln -s {} . ';'
Execute your for loop like this:
(IFS=$'\n'; for t in `find -type f -name "*txt*" | sed 's|.*/||'`; do ln -s $t ../folder2/$t; done)
By setting the IFS to only a newline, you should be able to read the entire filename without it getting split at spaces.
The parentheses make sure the loop is executed in a sub-shell, so the IFS of the current shell does not get changed.
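A null-delimited loop avoids touching IFS at all; this sketch links by absolute path so the links resolve regardless of where ../folder2 sits (that detail is my assumption, not from the original answer):
find . -type f -name "*txt*" -print0 | while IFS= read -r -d '' f; do
    ln -s "$PWD/${f#./}" ../folder2/     # link name defaults to the file's basename
done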

unix - delete files only from directory

Say with a directory structure such as:
toplev/
    file2.txt
    file5.txt
    midlev/
        test.txt
        anotherdirec/
            other.dat
            myfile.txt
            furtherdown/
                morefiles.txt
        otherdirec/
            myfile4.txt
            file7.txt
How would you delete all files (not directories and not recursively) from the 'anotherdirec' directory? In this example it would delete two files (other.dat, myfile.txt).
I have tried the below command from within the 'midlev' directory but it gives this error (find: bad option -maxdepth find: [-H | -L] path-list predicate-list):
find anotherdirec/ -type f -maxdepth 1
I'm running SunOS 5.10.
rm anotherdirec/*
should work for you.
Rob's answer (rm anotherdirec/*) will probably work, but it is a bit verbose and generates a bunch of error messages. The problem is that you are using a version of find that does not support the -maxdepth option. If you want to avoid the error messages that 'rm anotherdirec/*' gives, you can just do:
for i in anotherdirec/*; do test -f $i && rm $i; done
Note that neither of these solutions will work if any of the files contain spaces or other special characters. You can put double quotes around $i if that is an issue.
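Following that note, the quoted form of the loop would be:
for i in anotherdirec/*; do test -f "$i" && rm "$i"; done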
Find is sensitive to options order. Try this:
find anotherdirec/ -maxdepth 1 -type f -exec rm {} \;
rm toplev/midlev/anotherdirec/* if you want to delete only files.
rm -rf toplev/midlev/anotherdirec/* if you want to delete files and lower directories.