How to use find to find a file in a specified subdirectory - find

I have multiple sites that use my own template. I have now updated the template and I want to push the new file out using find.
If I do
find . -name template.css -print -exec cp NEW_FILE {} \;
then every other template.css will be overwritten too.
I only want template.css to be overwritten if it sits in a mytemplate/css directory.
Anyone have an idea?
Example:
/site1/templates/mytemplate/css/template.css*
/site1/templates/othertemplate/css/template.css
/site1/templates/other2template/css/template.css
/site2/templates/mytemplate/css/template.css*
/site2/templates/othertemplate/css/template.css
/site2/templates/other2template/css/template.css
/site3/templates/mytemplate/css/template.css*
/site3/templates/othertemplate/css/template.css
/site3/templates/other2template/css/template.css
Only the files marked with an * should be overwritten.

You can use the -wholename test (a GNU find synonym for -path) with a pattern that matches the full path. This should work for your example:
find . -wholename "*/mytemplate/css/template.css" -print -exec cp NEW_FILE {} \;
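A hedged follow-up, assuming GNU or BSD find (where -path behaves the same as -wholename): you can dry-run the match first and use the more widely supported -path spelling:
# Dry run: list the files that would be overwritten
find . -path '*/mytemplate/css/template.css' -print
# Once the list looks right, add the copy
find . -path '*/mytemplate/css/template.css' -exec cp NEW_FILE {} \;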

Related

excluded directories in find command not properly piped to -exec cp

I am trying to copy a folder and its subfolder structure while excluding one specified subfolder, using find with -exec cp. Several exclusion options work when I run find on its own, but as soon as I add -exec cp, the exclusions no longer have any effect.
Imagine the directory of interest contains multiple files and subfolders, one subfolder being named "exclusion_string".
This find command works properly when used alone:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*"
... while this command negates the exclusion criterion:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*" -exec cp -r '{}' . \;
Likewise, when using other criteria or arguments, the exclusion of a subdirectory is lost, e.g.:
find ~/directory/of/interest/ -maxdepth 2 -name "*" -size -100k -exec cp -r '{}' . \;
find ~/directory/of/interest/ -maxdepth 2 -name "*exclusion_string*" | xargs cp -rt .
What am I missing here?
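A hedged explanation, since the thread leaves this open: ! -name only filters which paths find passes to -exec, but cp -r on a directory that does pass the test still copies everything inside it, including the excluded subfolder. A minimal sketch that skips the excluded subtree entirely, assuming it sits directly under the directory of interest and the goal is to copy everything else into the current directory:
# Prune the excluded directory so find never descends into it;
# every other entry at depth 1 falls through to the -exec branch
find ~/directory/of/interest/ -mindepth 1 -maxdepth 1 -name "*exclusion_string*" -prune -o -exec cp -r '{}' . \;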

delete directories with find and exclude other directories

I'm attempting to delete some directories and I want to be able to exclude a directory called 'logs' from being deleted.
This is my basic find operation (without the exclusion):
# find . -type d |tail -10
./d20160124-1120-df8mfb/deployments
./d20160124-1120-df8mfb/releases
./d20160131-16993-vazqg5
./d20160131-16993-vazqg5/metadata
./d20160131-16993-vazqg5/deployments
./d20160131-16993-vazqg5/releases
./logs
./d20160203-27735-1tqbjh6
./d20160125-1120-1yccr9p
./d20160131-16993-1yf9lnc
I'm just tailing the output so that you have an idea of what's going on without taking up the whole page. :)
If I try to exclude the logs directory with -prune, I get back no results.
root@ops-manager:/tmp/tmp# find . -type d -prune -o -name 'logs' -print
root@ops-manager:/tmp#
What am I doing wrong?
Once I get this right, I'll tack on an -exec rm -rf {} \; command so I can delete those directories.
Any help here would be appreciated!
In your command, -type d -prune matches the starting directory . itself and prunes it, so the traversal stops before anything else is visited; and because -prune evaluates to true, the -print on the other side of -o never runs. Put the -name test before -prune so that only logs is pruned:
find . -type d -name 'logs' -prune -o -print
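Since the plan is to tack on the delete step, here is a hedged sketch that builds on the corrected expression (it assumes everything you want to delete lives under the current directory; run the -print version first to be safe):
# Dry run: the directories that would be removed (logs and its contents are skipped)
find . -mindepth 1 -type d -name 'logs' -prune -o -type d -prune -print
# Delete: the second -prune stops find from descending into directories it has
# just handed to rm, which avoids "No such file or directory" noise
find . -mindepth 1 -type d -name 'logs' -prune -o -type d -prune -exec rm -rf {} +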

How to copy a file to several directories with names matching *Co* (where * = wildcard)

How do I copy a file to several directories whose names match *Co* or *52?
Apparently, just typing
cp fileA *Co*
won't work.
My other concern is that if a directory already contains fileA, I don't want it to be overwritten. That is, if a *Co* directory already contains fileA, do NOT copy. Is there a one-line solution for this? Writing a script with if-else seems like overkill.
Thanks!
If your version of cp supports -n, you can do:
find . -type d -name '*Co*' -exec cp -n fileA {} \;
If not:
find . -type d -name '*Co*' -exec sh -c 'test -f "$0"/fileA || cp fileA "$0"' {} \;
Note that these both descend recursively; if you don't want that, you can limit the scope of find (e.g. with -maxdepth). To match either *Co* or *52, you can do:
find . -type d \( -name '*Co*' -o -name '*52' \) -exec ...
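Putting the pieces together, a hedged sketch (assuming the target directories sit directly under the current directory and that your cp supports -n; fileA is the file name from the question):
# Only direct subdirectories, either pattern, and skip directories that
# already contain fileA thanks to cp -n (no-clobber)
find . -mindepth 1 -maxdepth 1 -type d \( -name '*Co*' -o -name '*52' \) -exec cp -n fileA {} \;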

Solaris: Find files matching a pattern but only display the directory name

Like it says above. I'm trying to find a simple way to look for a pattern in a file name and display only the directory in which it is found.
For example, given a tree structure that looks like this:
./projecta
./projecta/src/code1.p
./projecta/src/code2.p
./projecta/util.p
./projectb
I would want the command "whatever *.p" to return:
./projecta/src
./projecta
Hope that makes sense. Any further info, please signify in the usual manner.
TIA
N/
To display the directories of .c files:
find . -name \*.c -exec dirname {} \; | uniq
To display the directories of .html files:
find . -name \*.html -exec dirname {} \; | uniq
If you want to use that in a script,
#!/bin/bash
ext=$1
find . -name "*.${ext}" -exec dirname {} \; | uniq
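If GNU find happens to be available (not a given on stock Solaris, which is why this is only a hedged alternative), the per-file dirname processes can be avoided:
# %h prints the directory part of each match; sort -u removes duplicates
# even when matching files are not adjacent in the traversal order
find . -name '*.p' -printf '%h\n' | sort -u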

How to create links to all subfolders containing specified text in their names

As specified in the title, I am looking for a way to create links to all subfolders that contain specified text in their names. For example, for every subfolder of the root directory with ".app" in its name, a link should be created in the "/AppLinks" directory. I would like to use this in a bash script (open source, free).
Does anyone know how to do that?
I searched Google with no luck.
find yourdir -type d -name '*.app' -exec ln -s {} /AppLinks \;
Find all directories named something.app in yourdir, and create a symlink to them in /AppLinks.
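A small hedged addition, assuming /AppLinks might not exist yet and that re-running the command should not fail on links that are already there:
# Create the target directory first; -f replaces existing links so the
# command can be re-run safely
mkdir -p /AppLinks
find yourdir -type d -name '*.app' -exec ln -sf {} /AppLinks \;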
single line bash-fu
function FUNCsymlink() { echo "$1"; fileName=`basename "$1"`; ln -s "$1" "/AppLinks/$fileName"; }; export -f FUNCsymlink; find `pwd`/ -maxdepth 1 -type d -iname "*.app" -exec bash -c "FUNCsymlink '{}'" \;
For easier reading:
function FUNCsymlink() {
echo "$1";
fileName=`basename "$1"`;
ln -s "$1" "/AppLinks/$fileName";
};
export -f FUNCsymlink;
find `pwd`/ -maxdepth 1 -type d -iname "*.app" -exec bash -c "FUNCsymlink '{}'" \;
you may have to adjust it a bit for your specific solution.
Wherever you run it, it will create the symlinks in /AppLinks.
It only looks at direct subfolders, not subfolders of subfolders, which is what I believe you need.
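As a hedged variant (same assumptions about /AppLinks as above), passing the matched path to the shell as a positional argument avoids the quoting problems that come from embedding {} inside the bash -c string:
# The matched directory arrives as $1, so names with spaces or quotes are safe
find "$PWD" -maxdepth 1 -type d -iname '*.app' \
  -exec sh -c 'ln -s "$1" "/AppLinks/$(basename "$1")"' _ {} \;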