how to prevent "find" from dive deeper than current directory - find

I have many directories with lots of files inside them.
I've just compressed those directories into filename.tar.gz, someothername.tar.gz, etc., respectively.
After compressing, I use this bash command to delete everything whose file name does not contain .tar.gz:
find . ! -name '*.tar.gz*' | xargs rm -r
But the problem is that find dives too deep into each directory. Because a directory has already been deleted while find is still descending into it, many messages are displayed, such as:
rm: cannot remove `./dirname/index.html': No such file or directory
So how do I prevent find from diving deeper than this level (the current directory)?

You can use ls instead of find for your problem:
ls | grep -v '\.tar\.gz' | xargs rm -rf
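A shell-only alternative (a sketch, assuming bash with the extglob option available) avoids parsing the output of ls entirely:
shopt -s extglob
rm -rf !(*.tar.gz)
Here !(*.tar.gz) expands to everything in the current directory that does not end in .tar.gz.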

You can tell find the max depth to recurse:
find -maxdepth 1 ....
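Applied to the command from the question, a sketch (assuming GNU find, which supports -mindepth/-maxdepth) could look like:
find . -mindepth 1 -maxdepth 1 ! -name '*.tar.gz' -exec rm -r {} +
The -mindepth 1 keeps . itself out of the results, and -exec ... {} + avoids the problems with spaces in file names that the original xargs pipeline would have.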

Related

Unable to delete thousands of files within a folder in terminal

I'm trying to delete files inside a certain folder but it's throwing an error:
rm -rf /usr/html/sched/downloads/*
-bash: /bin/rm: Argument list too long
I searched online and found this solution but I'm afraid to try it being a production server and I don't know how to put the path correctly:
find . -name '*' | xargs rm -v
How can I delete thousands of files within the /downloads directory? FYI, there are no sub-directories.
I think you can check here how to handle it; for a large number of files you will need to delete them in batches rather than in a single shell expansion, for example:
find ./cache -mtime +0.5 -print0 | xargs -0 rm -f
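For the case in the question, where there are no sub-directories, a sketch using the question's path (assuming GNU find and xargs for -print0/-0):
find /usr/html/sched/downloads -maxdepth 1 -type f -print0 | xargs -0 rm -v
Because xargs splits the file list into appropriately sized batches itself, this avoids the "Argument list too long" error.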

Copying the files and SUBDIRECTORIES based on modification date?

It may be a duplicate question, but I could not find a solution for this. I want to copy the last 3 months' files AND subdirectories from one disk to another, but I could only find how to list the files using the following command. I really don't know how to copy the files using -mtime. I'm new to Linux, please help me.
find . -mtime -90 -exec cp {} targetdir \;
But how do I copy directories with subdirectories and files too? (Please do not suggest rsync, I don't have it on this instance.) Regards, S.
Copy needs a recursive option specified to handle the subdirectories
$ find testroot # shows some dirs and files
testroot
testroot/sub1
testroot/sub1/subtestfile
testroot/sub2
testroot/testf
$ find target # empty at this stage
target
$ find ./testroot/ -exec cp -R {} target/ \;
$ find target
target
target/sub1
target/sub1/subtestfile
target/sub2
target/subtestfile
target/testf
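To also apply the question's -mtime filter while keeping the directory structure, a sketch (assuming GNU cp, since --parents is a GNU extension; targetdir is the destination from the question):
find . -mtime -90 -type f -exec cp --parents {} targetdir \;
The --parents flag recreates each file's relative path under targetdir, so the subdirectory layout is preserved without copying whole directories wholesale.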

how to print the progress of the files being copied in bash [duplicate]

I suppose I could compare the number of files in the source directory to the number of files in the target directory as cp progresses, or perhaps do it with folder size instead? I tried to find examples, but all bash progress bars seem to be written for copying single files. I want to copy a bunch of files (or a directory, if the former is not possible).
You can also use rsync instead of cp like this:
rsync -Pa source destination
Which will give you a progress bar and estimated time of completion. Very handy.
To show a progress bar while doing a recursive copy of files & folders & subfolders (including links and file attributes), you can use gcp (easily installed in Ubuntu and Debian by running "sudo apt-get install gcp"):
gcp -rf SRC DEST
Here is the typical output while copying a large folder of files:
Copying 1.33 GiB 73% |##################### | 230.19 M/s ETA: 00:00:07
Notice that it shows just one progress bar for the whole operation, whereas if you want a single progress bar per file, you can use rsync:
rsync -ah --progress SRC DEST
You may have a look at the tool vcp. That's a simple copy tool with two progress bars: one for the current file, and one for the overall progress.
EDIT
Here is the link to the sources: http://members.iinet.net.au/~lynx/vcp/
Manpage can be found here: http://linux.die.net/man/1/vcp
Most distributions have a package for it.
Here is another solution: use the tool bar.
You could invoke it like this:
#!/bin/bash
filesize=$(du -sb "${1}" | awk '{ print $1 }')
tar -cf - -C "${1}" ./ | bar --size "${filesize}" | tar -xf - -C "${2}"
You have to go via tar, and it will be inaccurate for small files. Also you must take care that the target directory exists. But it is a way.
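Saved as a script (barcopy.sh is just a hypothetical name), an invocation might look like:
./barcopy.sh /path/to/source /path/to/destination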
My preferred option is Advanced Copy, as it uses the original cp source files.
$ wget http://ftp.gnu.org/gnu/coreutils/coreutils-8.32.tar.xz
$ tar xvJf coreutils-8.32.tar.xz
$ cd coreutils-8.32/
$ wget --no-check-certificate https://raw.githubusercontent.com/jarun/advcpmv/master/advcpmv-0.8-8.32.patch
$ patch -p1 -i advcpmv-0.8-8.32.patch
$ ./configure
$ make
The new programs are now located in src/cp and src/mv. You may choose to replace your existing commands:
$ sudo cp src/cp /usr/local/bin/cp
$ sudo cp src/mv /usr/local/bin/mv
Then you can use cp as usual, or specify -g to show the progress bar:
$ cp -g src dest
A simple Unix way is to go to the destination directory and run watch -n 5 du -sh . Perhaps make it prettier by showing it as a bar. This can help in environments where you have only the standard Unix utilities and no scope for installing additional tools. du -sh is the key; watch just reruns it every 5 seconds.
Pros: works on any Unix system. Cons: no progress bar.
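A concrete form of this idea (the destination path is a placeholder):
watch -n 5 du -sh /path/to/destination
Compare the reported size against the known size of the source to estimate how far the copy has got.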
To add another option, you can use cpv. It uses pv to imitate the usage of cp.
It works like pv but you can use it to recursively copy directories
You can get it here
There's a tool pv to do this exact thing: http://www.ivarch.com/programs/pv.shtml
There's an Ubuntu package for it in apt.
How about something like
find . -type f | pv -s $(find . -type f | wc -c) | xargs -i cp {} --parents /DEST/$(dirname {})
It finds all the files in the current directory, pipes that through pv while giving pv an estimated size so the progress meter works, and then pipes that to a cp command with the --parents flag so the DEST path matches the SRC path.
One problem I have yet to overcome is that if you issue this command
find /home/user/test -type f | pv -s $(find . -type f | wc -c) | xargs -i cp {} --parents /www/test/$(dirname {})
the destination path becomes /www/test/home/user/test/....FILES... and I am unsure how to tell the command to get rid of the '/home/user/test' part. That's why I have to run it from inside the SRC directory.
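One way around that (a sketch, keeping the same pv/xargs approach) is to cd into the source in a subshell, so the paths that reach cp --parents are already relative:
(cd /home/user/test && find . -type f | pv -s $(find . -type f | wc -c) | xargs -i cp --parents {} /www/test/)
The parentheses keep the directory change local to the subshell, so your current shell stays where it was.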
Check the source code for progress_bar in the below git repository of mine
https://github.com/Kiran-Bose/supreme
Also try the custom bash script package supreme to verify how the progress bar works with the cp and mv commands.
Functionality overview
(1)Open Apps
----Firefox
----Calculator
----Settings
(2)Manage Files
----Search
----Navigate
----Quick access
|----Select File(s)
|----Inverse Selection
|----Make directory
|----Make file
|----Open
|----Copy
|----Move
|----Delete
|----Rename
|----Send to Device
|----Properties
(3)Manage Phone
----Move/Copy from phone
----Move/Copy to phone
----Sync folders
(4)Manage USB
----Move/Copy from USB
----Move/Copy to USB
There is the command progress, https://github.com/Xfennec/progress, a coreutils progress viewer.
Just run progress in another terminal to see the copy/move progress. For continuous monitoring, use the -M flag.
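For example, start the copy as usual and then, in a second terminal:
progress -M
It reports the progress of recognised commands such as cp, mv or dd that are currently running, and keeps watching until you interrupt it.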

how to set folder path for gtags

I am new to gtags and have a question. I have a big project, such as the Android AOSP, and I want gtags to parse only some folders. How can I achieve that with gtags? I searched and got this suggestion:
use the -f option with gtags, but it doesn't seem to support folders
Is there a good way to set the folder paths so that gtags only processes those folders?
UPDATE: author of the question came up with a better solution in the comments. I'm adding it here so it's easier to find:
.. create tag file in the sub-directories I need, and add the directories
to GTAGSLIBPATH when loading the project,
My answer:
You can limit what gtags indexes by adding a list of files/directories to the skip keyword in ~/.globalrc or /etc/gtags.conf. Here's a sample gtags.conf file.
The problem is that the global/gtags packages often don't install gtags.conf (at least it's not there in global-5.7.1-2 on Ubuntu 12.04), so you'll need to either get it from the global source distribution, or use someone else's gtags.conf as a reference. For instance here.
Something like this should work. Note that a leading / means from the top of the tree; without it, gtags will skip matching entries anywhere in the tree:
common:\
:skip=/skip-this-dir/,/lib/and-this/,/include/and-this-one-too/:
The -f option is intended to be used with find(1). Please try the following:
$ find folder1 folder2 folder3 -type f -print | gtags -f -
or
$ find folder1 folder2 folder3 -type f -print >gtags.files
$ gtags
This is my bash function to build the tags while skipping files and paths that contain 'dummy' or 'win':
function gtagsupdate {
find . -name "*.c" -o -name "*.cpp" -o -name "*.h" -o -name "*.hpp" | grep -v dummy | grep -v win | gtags -f -
}

unix - delete files only from directory

Say with a directory structure such as:
toplev/
    file2.txt
    file5.txt
    midlev/
        test.txt
        anotherdirec/
            other.dat
            myfile.txt
            furtherdown/
                morefiles.txt
        otherdirec/
            myfile4.txt
            file7.txt
How would you delete all files (not directories and not recursively) from the 'anotherdirec'? In this example it would delete 2 files (other.dat, myfile.txt)
I have tried the below command from within the 'midlev' directory but it gives this error (find: bad option -maxdepth find: [-H | -L] path-list predicate-list):
find anotherdirec/ -type f -maxdepth 1
I'm running SunOS 5.10.
rm anotherdirec/*
should work for you.
Rob's answer (rm anotherdirec/*) will probably work, but it is a bit noisy: it prints an error message for each subdirectory it refuses to remove. The underlying problem with your find command is that you are using a version of find that does not support the -maxdepth option. If you want to avoid the error messages that rm anotherdirec/* gives, you can just do:
for i in anotherdirec/*; do test -f $i && rm $i; done
Note that neither of these solutions will work if any of the files contain spaces or other special characters. You can put double quotes around $i if that is an issue.
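If quoting is the issue mentioned above, the same loop with the quotes applied looks like:
for i in anotherdirec/*; do test -f "$i" && rm "$i"; done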
Find is sensitive to options order. Try this:
find anotherdirec/ -maxdepth 1 -type f -exec rm {} \;
rm toplev/midlev/anotherdirec/* if you want to delete only files.
rm -rf toplev/midlev/anotherdirec/* if you want to delete files and lower directories
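If the local find has no -maxdepth at all (as on older SunOS/Solaris systems), a portable sketch that limits the search to one level uses -prune instead:
find anotherdirec/. ! -name . -prune -type f -exec rm {} \;
Starting the search at anotherdirec/. makes the top directory's name '.', so ! -name . -prune cuts off every subdirectory while still letting the files at the first level reach the -type f test and rm.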