Unable to delete thousands of files within a folder in terminal - command-line

I'm trying to delete files inside a certain folder, but it's throwing an error:
rm -rf /usr/html/sched/downloads/*
-bash: /bin/rm: Argument list too long
I searched online and found this solution, but I'm afraid to try it since this is a production server and I don't know how to fill in the path correctly:
find . -name '*' | xargs rm -v
How can I delete thousands of files within the /downloads directory? FYI, there are no sub-directories.

I think you can check here how to handle it, because for a large number of files you will need to delete them in batches of a limited quantity rather than all at once:
find ./cache -mtime +0.5 -print0 | xargs -0 rm -f
Faster way to delete a large number of files [duplicate]
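For the asker's specific case: the "Argument list too long" error comes from the shell expanding the * glob into one huge argument list, not from rm itself, so letting find enumerate the files avoids the problem entirely. A minimal sketch, assuming GNU findutils (standard on Linux servers):
# enumerate the files and delete them in batches via xargs,
# never building a single oversized argument list
find /usr/html/sched/downloads -maxdepth 1 -type f -print0 | xargs -0 rm -f
# or let GNU find do the deletion itself:
find /usr/html/sched/downloads -maxdepth 1 -type f -delete
Since there are no sub-directories, -maxdepth 1 is only a safety belt here.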

Related

Copying the files and SUBDIRECTORIES based on modification date?

It may be a duplicate question, but I could not find the solution for this. I want to copy the last 3 months' files AND subdirectories from one disk to another, but I could only find how to list the files using the following command. I really don't know how to copy the files using -mtime. I'm new to Linux, please help me.
find . -mtime -90 -exec cp {} targetdir \;
But how do I copy directories with their subdirectories and files too? (Please do not suggest rsync; I don't have it on this instance.) Regards, S.
cp needs a recursive option specified to handle the subdirectories:
$ find testroot # shows some dirs and files
testroot
testroot/sub1
testroot/sub1/subtestfile
testroot/sub2
testroot/testf
$ find target # empty at this stage
target
$ find ./testroot/ -exec cp -R {} target/ \;
$ find target
target
target/sub1
target/sub1/subtestfile
target/sub2
target/subtestfile
target/testf
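To fold in the asker's 90-day filter, one option is to restrict find to the top-level entries and let cp -R do the recursion, which also avoids the duplicated subtestfile visible above (find handed cp every nested path as well). A sketch, assuming GNU find for -mindepth/-maxdepth:
# copy only top-level entries modified in the last 90 days;
# cp -R then recurses into any directory among them
find . -mindepth 1 -maxdepth 1 -mtime -90 -exec cp -R {} targetdir \;
One caveat: -mtime on a directory tests the directory's own modification time, not that of the files inside it, so a directory whose deeper contents changed recently can still be skipped.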

Delete multiple files in CentOS command

How can I delete all the files ending with *0x0.jpg in CentOS? I need to delete multiple files nested in folders and subfolders.
I assume you have a shell; try:
find /mydirectory -type f -print | grep '0x0\.jpg$' | xargs -n1 rm -f
There is probably a more elegant solution, but that should work.
However, I would put an echo in before the rm on the first run to ensure that the right files are going to be removed.
Ed Heal's answer works just fine, but neither the grep nor the xargs call is necessary. The following should work just as well and be a good bit more efficient for large numbers of files.
find /mydirectory -name '*0x0.jpg' -type f -exec rm -f {} +
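Following the echo suggestion above, a dry run with the same predicates prints the command lines instead of executing them:
# prints "rm -f file1 file2 ..." without deleting anything
find /mydirectory -name '*0x0.jpg' -type f -exec echo rm -f {} +
Once the output looks right, drop the echo.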

delete files in a folder older than x minutes via terminal

I want to repeatedly delete files from a folder once they are older than some number of minutes. The reason behind it is a webcam that constantly delivers JPEGs to a folder. That folder is "watched" by an FTP program which mirrors all changes made, including deletions.
So I tried
find Documents/GoProUpload/* -iname "*.JPG" -ctime +120s -print0 | xargs -0 -n1
as well as -atime and -mtime, but nothing is printed. I also checked the same command without the -ctime parameter, and then I get all the files.
I also tried it with -mmin:
find Documents/GoProUpload/* -iname "*.JPG" -maxdepth 1 -mmin +2
but again nothing. Why could that be?
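Two likely culprits in the commands above: GNU find's -ctime takes a plain number of 24-hour periods, so a suffixed value like +120s is not valid (use -mmin for minutes), and -maxdepth is a positional option that has to come before tests such as -iname (GNU find warns when it doesn't). Passing the directory itself instead of a glob also keeps the depth accounting simple. A sketch, assuming GNU find:
# top-level JPEGs last modified more than 2 minutes ago, deleted in batches
find Documents/GoProUpload -maxdepth 1 -iname '*.jpg' -mmin +2 -print0 | xargs -0 rm -f
If this still prints nothing, check one file's actual mtime with stat; the camera may be writing timestamps you don't expect.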

unix - delete files only from directory

Say with a directory structure such as:
toplev/
  file2.txt
  file5.txt
  midlev/
    test.txt
    anotherdirec/
      other.dat
      myfile.txt
      furtherdown/
        morefiles.txt
    otherdirec/
      myfile4.txt
      file7.txt
How would you delete all files (not directories, and not recursively) from 'anotherdirec'? In this example it would delete two files (other.dat and myfile.txt).
I have tried the command below from within the 'midlev' directory, but it gives this error (find: bad option -maxdepth find: [-H | -L] path-list predicate-list):
find anotherdirec/ -type f -maxdepth 1
I'm running SunOS 5.10.
rm anotherdirec/*
should work for you.
Rob's answer (rm anotherdirec/*) will probably work, but it is a bit verbose and generates a bunch of error messages. The problem is that you are using a version of find that does not support the -maxdepth option. If you want to avoid the error messages that 'rm anotherdirec/*' gives, you can just do:
for i in anotherdirec/*; do test -f $i && rm $i; done
Note that neither of these solutions will work if any of the files contain spaces or other special characters. You can put double quotes around $i if that is an issue.
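Concretely, the space-safe version of that loop just quotes both expansions:
# quoting "$i" keeps names with spaces intact as single arguments
for i in anotherdirec/*; do test -f "$i" && rm "$i"; done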
find is sensitive to the order of its options. Try this:
find anotherdirec/ -maxdepth 1 -type f -exec rm {} \;
rm toplev/midlev/anotherdirec/* if you want to delete only files.
rm -rf toplev/midlev/anotherdirec/* if you want to delete files and lower directories
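For completeness: SunOS 5.10's /usr/bin/find has no -maxdepth at all, which is exactly the error quoted in the question. The classic portable way to emulate -maxdepth 1 is the -prune idiom; a sketch using only POSIX find predicates, though untested on Solaris:
# visit anotherdirec's own entries, refuse to descend into sub-directories,
# and remove the plain files among them
find anotherdirec/. ! -name . -prune -type f -exec rm {} \;
The ! -name . test stops find from pruning the starting directory itself, and -prune keeps it out of furtherdown.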

how to prevent "find" from diving deeper than the current directory

I have many directories with lots of files inside them.
I've just compressed those directories into filename.tar.gz, someothername.tar.gz, etc.
After compressing, I use this bash pipeline to delete everything except files whose names contain .tar.gz:
find . ! -name '*.tar.gz*' | xargs rm -r
But the problem is that find dives too deep into each directory: by the time rm gets to the files inside a directory, the directory itself has already been deleted, so many messages are displayed, such as:
rm: cannot remove `./dirname/index.html': No such file or directory
So how do I prevent find from diving deeper than this level (the current directory)?
You can use ls instead of find for your problem:
ls | grep -v '\.tar\.gz$' | xargs rm -rf
You can tell find the maximum depth to recurse:
find -maxdepth 1 ...
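Putting that together with the original exclusion, and adding -mindepth 1 so find never offers . itself for deletion, a sketch assuming GNU find:
# delete everything at the top level except the .tar.gz archives,
# without descending into the directories being removed
find . -mindepth 1 -maxdepth 1 ! -name '*.tar.gz*' -exec rm -rf {} +
Because nothing below depth 1 is ever visited, the "No such file or directory" messages go away.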