I created a symlink, and I'm trying to remove it using rm /WebRoot/, but I get the error "Is a directory"; if I try rmdir /WebRoot instead, I get the error "Not a directory".
Does anyone know what's going on here?
You need to use rm foo with no trailing slash.
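To see the difference, here is a minimal sketch (the paths are made up for illustration, and the messages shown are GNU coreutils' behaviour):

# Create a directory and a symlink pointing at it
mkdir /tmp/webroot-target
ln -s /tmp/webroot-target /tmp/WebRoot

rm /tmp/WebRoot/      # fails: the trailing slash makes rm resolve the link -> "Is a directory"
rmdir /tmp/WebRoot    # fails: the link itself is not a directory -> "Not a directory"
rm /tmp/WebRoot       # works: removes only the symlink; the target directory is untouched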
I got
$ gsutil ls gs://ml_models_c/ref7/test/model/2/
gs://ml_models_c/ref7/test/model/2/ <= why this?
gs://ml_models_c/ref7/test/model/2/saved_model.pb
gs://ml_models_c/ref7/test/model/2/variables/
$ gsutil ls gs://seldon-models/tfserving/mnist-model/1/
gs://seldon-models/tfserving/mnist-model/1/saved_model.pb
gs://seldon-models/tfserving/mnist-model/1/variables/
Why is there a gs://ml_models_c/ref7/test/model/2/ entry in the first command's output?
Why doesn't the second command list itself like that?
It seems that I can rm it.
Thanks
At the API level, Cloud Storage doesn't have a concept of folders; everything is stored under long object names that may contain slashes.
In this case, you likely have an object literally named gs://ml_models_c/ref7/test/model/2/, but no object named gs://seldon-models/tfserving/mnist-model/1/.
If you don't need the gs://ml_models_c/ref7/test/model/2/ object, you can delete it and it will no longer show up in the gsutil ls results.
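For example, assuming you do want to drop that placeholder, something along these lines should work (double-check with gsutil ls afterwards):

# Deletes only the "folder" placeholder object itself; the objects underneath it are not touched
gsutil rm gs://ml_models_c/ref7/test/model/2/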
I'm unable to use the rm command to remove files in a directory other than the one I'm currently in. I'm a beginner, so I apologise for my inability to elaborate properly.
Here's what I'm trying to do:
I'm trying to delete all .srt files from a sub-directory. It works when I cd into that specific directory, like so:
Command 1:
cd /users/jakubdonovan/library/cloudstorage/iCloud\ drive/the-modern-python3-bootcamp/target_folder
Command 2:
rm *.srt
However, let's say I want to quickly delete a specific file type from a folder without first using the "cd" command, like so:
rm *.srt /users/jakubdonovan/library/cloudstorage/iCloud\ drive/the-modern-python3-bootcamp/target_folder
It returns with "No matches for wildcard '*.srt'. See help expand."
Which is strange, because I can use touch, cp and all the other commands on other directories without a problem.
Is there a way to make the command "rm *.filetype" remove all the files with that specific filetype from a folder and all its subfolders in one swoop?
If you would like to rm files in a sub-directory, you just have to specify that sub-directory's path in the command:
rm /path/to/folder/*.filetype
or, if you know that the folder is inside your current directory, you can try...
rm ./folder/*.filetype
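For the "folder and all its subfolders in one swoop" part of the question, one option is find (the path below is just an example); run it with -print first to preview, then with -delete:

# Preview which files would be removed
find /path/to/folder -type f -name '*.srt' -print

# Actually delete them
find /path/to/folder -type f -name '*.srt' -delete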
I am trying to move a file from Downloads to a folder on my desktop.
I keep getting an error from mv, with its usage message printed afterwards. Why does the usage message appear?
It looks like your file name has a space in it, so it needs to be escaped. Otherwise, mv looks for one file named "Tres" and another named "Beijos_C.pdf", and tries to move both into that directory.
# Either...
mv Tres\ Beijos_C.pdf ~/Desktop/choro/
# ...or...
mv "Tres Beijos_C.pdf" ~/Desktop/choro/
If your file name contains spaces, you should surround it with double quotes ("), like this:
mv "my file name.txt" /home/user/Desktop
I want to delete all files in a folder whose filenames contain the word TRAR. I have tried the following:
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm-rf *TRAR");
Remove all your config lines (are they even Perl?):
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
and
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm -rf *TRAR")
should work, but you should really be using Perl code (unlink, etc.).
I suspect you are confusing how Perl is used with how you would use awk in bash scripts.
As @Steffen Ullrich said, that isn't Perl or shell. But I'll try to make it a little more Perlish for you:
First, note that
variables in Perl start with a $
strings need "quotes around them"
statements end with a ;
spaces around = are ok and make it all easier to read
so
$CONFIG_DIR = `pwd`;
$VENDOR = "ericsson-msc";
$RELEASE = "v1";
$BASE_DIR = "/appl/virtuo/gways";
Next, see how you can combine these into a single string like this (I'm guessing that's what you want to do)
$DIR_FOR_CLEANING = "$BASE_DIR/config/$VENDOR/$RELEASE/spool/input_d";
Lastly, you should be really careful whenever using the -r option to rm together with a wildcard like *. Look up the man page for rm and see if -r is something you want to do. I don't think you need it here, unless you have directories named *TRAR that you want to recurse into and remove. I'll bet you only have files named *TRAR in that input_d directory.
Also, as you wrote it, the command could fail the cd if that directory doesn't exist, and would then proceed to recursively remove *TRAR from whatever directory you're running the script from. But you don't need to change directories at all. Try something like this:
system ("echo rm -f $DIR_FOR_CLEANING/*TRAR");
If the echo command lists the files you do in fact want it to remove, then remove the "echo" and the rm will start deleting stuff.
I'm using find-name-dired to find a bunch of files (all ending in .orig).
I would then like to mark all the files in the resulting *Find* buffer for deletion and then delete them.
Unfortunately they are root-owned, so the deletion fails due to lack of permissions.
Is there some workaround here, tramp or something like that?
You can presumably mark the files, then use ! sudo rm
You can do this using sudo through tramp. When find-name-dired prompts for the directory name, modify it and put /sudo:: at the start. E.g. change /foo/bar into /sudo::/foo/bar. (Take care of relative paths and ~ paths.) It will prompt for your sudo password, and then you should be able to delete files as usual.