Say I have a directory structure such as:
toplev/
    file2.txt
    file5.txt
    midlev/
        test.txt
        anotherdirec/
            other.dat
            myfile.txt
            furtherdown/
                morefiles.txt
        otherdirec/
            myfile4.txt
            file7.txt
How would you delete all files (not directories and not recursively) from 'anotherdirec'? In this example it would delete two files (other.dat and myfile.txt).
I have tried the command below from within the 'midlev' directory, but it gives this error (find: bad option -maxdepth find: [-H | -L] path-list predicate-list):
find anotherdirec/ -type f -maxdepth 1
I'm running SunOS 5.10.
rm anotherdirec/*
should work for you.
Rob's answer (rm anotherdirec/*) will probably work, but it complains about each subdirectory it refuses to remove. The underlying problem is that you are using a version of find that does not support the -maxdepth option. If you want to avoid the error messages that 'rm anotherdirec/*' gives, you can just do:
for i in anotherdirec/*; do test -f $i && rm $i; done
Note that neither of these solutions will work if any of the files contain spaces or other special characters. You can put double quotes around $i if that is an issue.
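For reference, the quoted version of the same loop would be:
for i in anotherdirec/*; do test -f "$i" && rm "$i"; done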
Find is sensitive to options order. Try this:
find anotherdirec/ -maxdepth 1 -type f -exec rm {} \;
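If your find does not support -maxdepth at all (the SunOS 5.10 error above suggests this), a portable workaround is to prune at depth 1 instead; a sketch, untested on Solaris 10 itself:
find anotherdirec/. ! -name . -prune -type f -exec rm {} \;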
rm toplev/midlev/anotherdirec/* if you want to delete only files.
rm -rf toplev/midlev/anotherdirec/* if you want to delete files and lower directories
I tried to copy directories and their subdirectories created a day ago as follows:
find /application/work/ -type d -mtime -1 -exec cp -r {} /tmp/backup \;
But it is copying all directories (Not only the ones created a day ago).
Would you please advise?
find is also matching the starting directory /application/work/ itself and copying it; see How to exclude this / current / dot folder from find "type d". Since you're executing cp -r, that one match recursively copies everything under /application/work/ before also copying the subset of directories you've found via -mtime. You need to set -mindepth to exclude the starting directory from the paths on which find will operate.
Modify your command to:
find /application/work -mindepth 1 -type d -mtime -1 -exec cp -r {} /tmp/backup \;
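To preview which directories the command will actually copy, you can run the same find with -print (the default action) and no -exec first:
find /application/work -mindepth 1 -type d -mtime -1 -print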
I'm trying to delete files inside a certain folder but it's throwing an error:
rm -rf /usr/html/sched/downloads/*
-bash: /bin/rm: Argument list too long
I searched online and found this solution, but I'm afraid to try it since this is a production server, and I don't know how to put the path in correctly:
find . -name '*' | xargs rm -v
How can I delete the thousands of files within the /downloads directory? FYI, there are no sub-directories.
For that many files you cannot rely on the shell expanding the wildcard (that is what triggers the 'Argument list too long' error); you need to hand the names to rm in batches, which is what piping find into xargs does. For example:
find ./cache -mtime +0.5 -print0 | xargs -0 rm -f
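Adapted to the directory from your question (assuming GNU find and xargs for -print0/-0, and dropping the -mtime test since age does not matter here), a sketch would be:
find /usr/html/sched/downloads -type f -print0 | xargs -0 rm -f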
So I have a directory called testdir and I want to remove the underscores from the filenames of the files in that directory.
I tried to use this command
find testdir -type f -exec ls {} \; | sed 's/_*//'
It will output the filenames without the underscores but it won't delete the underscores permanently. Could anyone help me?
Thanks!
If you are just looping through the files in your dir, use a simple loop:
while IFS= read -r file
do
    echo mv "$file" "${file//_/}"   # once you are sure it works, remove the echo!
done < <(find . -type f -name "*_*")
This feeds the while loop with the output of the find command. It then uses ${var//_/} to remove every _ in the name.
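A quick illustration of that expansion (the file name is made up):
f="my_file_name.txt"; echo "${f//_/}"    # prints myfilename.txt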
Why wasn't your approach working?
Because you are saying
find ... -exec ls {} \; | sed '...'
That is, you are finding files and then changing only the listed output with sed; nothing is done to the files themselves.
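If you want to stay close to your original sed idea, the missing piece is an mv that uses sed's output as the new name. A minimal sketch, assuming names without newlines and no underscores in the testdir path itself:
find testdir -type f -name "*_*" | while IFS= read -r f; do
    new=$(printf '%s\n' "$f" | sed 's/_//g')
    echo mv "$f" "$new"    # drop the echo once the output looks right
done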
This may help you (it relies on the Perl rename, which reads file names from standard input):
find testdir -type f | rename 's/_//g'
Regards
How can I delete all the files ending with *0x0.jpg in CentOS? I need to delete multiple files nested in folders and subfolders.
I assume you have a shell - try
find /mydirectory -type f -print | grep '0x0.jpg$' | xargs -n1 rm -f
There is probably a more elegant solution, but that should work.
However, I would put an echo in before rm on the first run to ensure that the right files are going to be removed.
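That dry run would look like this (same pipeline, with echo in front of rm so nothing is removed yet):
find /mydirectory -type f -print | grep '0x0.jpg$' | xargs -n1 echo rm -f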
Ed Heal's answer works just fine, but neither the grep nor the xargs call is necessary. The following should work just as well and be a good bit more efficient for large numbers of files.
find /mydirectory -name '*0x0.jpg' -type f -exec rm -f {} +
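As above, it is worth checking what will match before deleting; replacing the -exec action with -print shows the candidates:
find /mydirectory -name '*0x0.jpg' -type f -print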
I am new to gtags and have a question. I have a big project, such as Android AOSP, and I want gtags to parse only some folders. How can I achieve this with gtags? I searched and found one suggestion:
use the -f option with gtags, but it doesn't seem to support folders
Is there a good way to set the folder paths so that gtags only processes those folders?
UPDATE: the author of the question came up with a better solution in the comments. I'm adding it here so it's easier to find:
.. create tag file in the sub-directories I need, and add the directories to GTAGSLIBPATH when loading the project.
My answer:
You can limit what gtags indexes by adding a list of files/directories to the skip keyword in ~/.globalrc or /etc/gtags.conf. Here's a sample gtags.conf file.
The problem is that the global/gtags packages often don't install a gtags.conf (at least it's not there in global-5.7.1-2 on Ubuntu 12.04), so you'll need to either get it from the global source distribution or use someone else's gtags.conf as a reference. For instance here.
Something like this should work. Note that a leading / means from the top of the tree; without it, gtags will skip matching entries anywhere in the tree:
common:\
:skip=/skip-this-dir/,/lib/and-this/,/include/and-this-one-too/:
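If your gtags does not pick up the edited file automatically, GNU GLOBAL also accepts an explicit config path via --gtagsconf (check gtags --help for your version; the path below is just an example):
gtags --gtagsconf /etc/gtags.conf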
The -f option is intended to be used together with find(1). Please try one of the following:
$ find folder1 folder2 folder3 -type f -print | gtags -f -
or
$ find folder1 folder2 folder3 -type f -print >gtags.files
$ gtags
This is my bash function that filters out files and paths whose names include 'dummy' or 'win':
function gtagsupdate {
    find . -name "*.c" -o -name "*.cpp" -o -name "*.h" -o -name "*.hpp" | grep -v dummy | grep -v win | gtags -f -
}
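Usage would look something like this (the project path and symbol are made up):
cd ~/src/myproject    # assumed project root
gtagsupdate           # builds GTAGS/GRTAGS/GPATH from the filtered file list
global -x main        # then query the tags, e.g. print where main is defined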