Copy directories and their subdirectories created a day ago

I tried to copy directories and their subdirectories created a day ago as follows:
find /application/work/ -type d -mtime -1 -exec cp -r {} /tmp/backup \;
But it is copying all directories, not only the ones created a day ago.
Would you please advise?

find is also matching the starting directory /application/work/ itself, and is copying it; see How to exclude this / current / dot folder from find "type d". Since you're executing cp -r on every match, that one match recursively copies everything under /application/work/ before the subset of directories selected by -mtime is also copied. You need to set -mindepth to exclude the starting directory from the paths on which find operates.
Modify your command to:
find /application/work -mindepth 1 -type d -mtime -1 -exec cp -r {} /tmp/backup \;
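Note that -mtime tests modification time, not creation time (most Unix filesystems do not record creation time at all), which is usually close enough for "created a day ago". To preview exactly which directories will be copied before committing to the cp, you can run the same find without the -exec:
find /application/work -mindepth 1 -type d -mtime -1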

excluded directories in find command not properly piped to -exec cp

I am trying to copy a folder containing a subfolder structure, while excluding a specified subfolder, by using find with -exec cp. I have managed to get multiple exclusion options working when using the find command alone, but once I add the -exec cp action, the exclusion terms no longer work.
Imagine the directory of interest contains multiple files and subfolders, with one subfolder named "exclusion_string".
This find command works properly when used alone:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*"
... while this command negates the exclusion criterion:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*" -exec cp -r '{}' . \;
Likewise, when using other criteria or arguments, the exclusion of a subdirectory is lost, e.g.:
find ~/directory/of/interest/ -maxdepth 2 -name "*" -size -100k -exec cp -r '{}' . \;
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*" | xargs cp -rt .
What am I missing here?
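No answer is quoted for this one, but the likely cause: the exclusion does work at the find level, yet find still prints the directory of interest itself (its own name does not contain exclusion_string), and cp -r of that parent copies everything inside it back in, excluded subfolder included. A sketch of one common fix, assuming the excluded subfolder sits directly under the directory of interest:
# -mindepth 1 stops find from emitting the parent directory itself,
# which is what let cp -r drag the excluded subfolder back in:
find ~/directory/of/interest/ -mindepth 1 -maxdepth 1 ! -name "*exclusion_string*" -exec cp -r {} . \;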

Copying the files and SUBDIRECTORIES based on modification date?

It may be a duplicate question, but I could not find a solution for this. I want to copy the last 3 months' files AND subdirectories from one disk to another, but I could only find how to list the files, using the following command. I really don't know how to copy the files by using -mtime. I'm new to Linux; please help me.
find . -mtime -90 -exec cp {} targetdir \;
But how do I copy directories with subdirectories and files too? (Please do not suggest rsync; I don't have it on this instance.) Regards, S.
cp needs a recursive option specified to handle the subdirectories:
$ find testroot # shows some dirs and files
testroot
testroot/sub1
testroot/sub1/subtestfile
testroot/sub2
testroot/testf
$ find target # empty at this stage
target
$ find ./testroot/ -exec cp -R {} target/ \;
$ find target
target
target/sub1
target/sub1/subtestfile
target/sub2
target/subtestfile
target/testf
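One side effect worth noting in the transcript above: find emits both sub1 and sub1/subtestfile, so cp -R copies subtestfile twice, once inside target/sub1 and once as target/subtestfile. To apply the asker's -mtime filter and avoid that duplication, one option is GNU cp's --parents flag, which recreates each file's directory path under the target (a sketch, assuming GNU coreutils is available since rsync is not):
# copy only files modified in the last 90 days, recreating their
# directory structure under targetdir (GNU cp only):
find . -type f -mtime -90 -exec cp --parents {} targetdir \;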

How to create links to all subfolders containing specified text in their names

As specified in the title, I am looking for a way to create links to all subfolders containing specified text in their names. So, for example, for all subfolders of the root directory containing ".app" in their names, a link would be created in the "/AppLinks" directory. I would like to use it in a bash script (open source, free).
Does anyone know how to do that?
I searched Google with no luck.
find yourdir -type d -name '*.app' -exec ln -s {} /AppLinks \;
Find all directories named something.app in yourdir, and create a symlink to them in /AppLinks.
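One gotcha with this, since ln -s stores the target string verbatim: if yourdir is a relative path, the links created in /AppLinks will resolve relative to /AppLinks and dangle. Giving find an absolute starting point avoids that (a sketch):
find "$(pwd)/yourdir" -type d -name '*.app' -exec ln -s {} /AppLinks \;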
single line bash-fu
function FUNCsymlink() { echo "$1"; fileName=$(basename "$1"); ln -s "$1" "/AppLinks/$fileName"; }; export -f FUNCsymlink; find "$(pwd)/" -maxdepth 1 -type d -iname "*.app" -exec bash -c 'FUNCsymlink "$1"' _ {} \;
for easier reading:
function FUNCsymlink() {
    echo "$1"
    fileName=$(basename "$1")          # strip the path, keep the folder name
    ln -s "$1" "/AppLinks/$fileName"   # create the link under /AppLinks
}
export -f FUNCsymlink
# pass each match as $1 instead of splicing {} into the command string,
# so folder names containing quotes or spaces survive:
find "$(pwd)/" -maxdepth 1 -type d -iname "*.app" -exec bash -c 'FUNCsymlink "$1"' _ {} \;
You may have to adjust it a bit for your specific setup.
Wherever you run it, it will create the symlinks in /AppLinks.
It will only look at direct subfolders, not subfolders of subfolders; that's what I believe you need.

unix - delete files only from directory

Say with a directory structure such as:
toplev/
file2.txt
file5.txt
midlev/
test.txt
anotherdirec/
other.dat
myfile.txt
furtherdown/
morefiles.txt
otherdirec/
myfile4.txt
file7.txt
How would you delete all files (not directories, and not recursively) from 'anotherdirec'? In this example it would delete 2 files (other.dat, myfile.txt).
I have tried the command below from within the 'midlev' directory, but it gives this error (find: bad option -maxdepth find: [-H | -L] path-list predicate-list):
find anotherdirec/ -type f -maxdepth 1
I'm running SunOS 5.10.
rm anotherdirec/*
should work for you.
Rob's answer (rm anotherdirec/*) will probably work, but it is a bit noisy: rm without -r refuses to remove the subdirectories it encounters and prints an error for each one. The underlying problem with your find command is that you are using a version of find that does not support the -maxdepth option. If you want to avoid the error messages that rm anotherdirec/* gives, you can just do:
for i in anotherdirec/*; do test -f $i && rm $i; done
Note that the loop will not work if any of the file names contain spaces or other special characters. You can put double quotes around $i ("$i") if that is an issue.
find is sensitive to option order. Try this:
find anotherdirec/ -maxdepth 1 -type f -exec rm {} \;
rm toplev/midlev/anotherdirec/* if you want to delete only files.
rm -rf toplev/midlev/anotherdirec/* if you want to delete files and subdirectories too.
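Since the asker's SunOS 5.10 /usr/bin/find has no -maxdepth at all, reordering the options will not help there. A portable way to get the same effect is to -prune at the first level so find never descends (a sketch, using only POSIX find features):
# for each entry under anotherdirec (but not '.' itself), prune so find
# does not descend into subdirectories, then remove the plain files:
find anotherdirec/. ! -name . -prune -type f -exec rm {} \;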

how to prevent "find" from diving deeper than the current directory

I have many directories with lots of files inside them.
I've just compressed those directories into filename.tar.gz, someothername.tar.gz, etc.
After compressing, I use this bash command to delete everything except file names containing .tar.gz:
find . ! -name '*.tar.gz*' | xargs rm -r
But the problem is that find dives too deep. By the time rm -r gets to the deeper paths find already listed, their parent directory has been deleted, so many messages are displayed, such as:
rm: cannot remove `./dirname/index.html': No such file or directory
So how do I prevent find from diving deeper than this level (the current directory)?
You can use ls instead of find for your problem:
ls | grep -v '\.tar\.gz' | xargs rm -rf
You can tell find the max depth to recurse:
find -maxdepth 1 ....
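Putting that together with the original filter (a sketch; -mindepth 1 keeps find from listing . itself, which would otherwise be handed to rm -r):
find . -mindepth 1 -maxdepth 1 ! -name '*.tar.gz*' -exec rm -r {} \;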