So I've aliased the commands rm/cp/mv to use interactive (-i) mode by default to avoid accidentally deleting things, but sometimes this is pretty inconvenient.
I would like to be able to say 'y' to all the prompts of the form:
mv: overwrite `file_1'? y
mv: overwrite `file_2'? y
without typing 'y' many many times. Is there a way to do this?
Use command mv to circumvent the alias and use --force if that doesn't stop mv from bothering you with questions (e.g., because of permissions).
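For example, a quick sketch of that suggestion (the destination path is just a placeholder): because mv is not the first word of the command, the shell does not expand the alias, and --force suppresses any remaining prompts.
command mv --force file_1 file_2 /destination/dir/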
yes | aCommand is the standard way to supply lots of y's to aCommand, but in this case that seems unnecessary.
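For completeness, a sketch of that idiom using the file names from the question (/destination/dir/ is a placeholder); yes prints an endless stream of 'y' lines, and each interactive prompt reads one of them from stdin:
yes | mv file_1 file_2 /destination/dir/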
You can use a backslash before cp to bypass the alias:
\cp -rf /from/directory/* /to/directory
I am basically looking for a way to do this
list=$(command)
while read -r arg
do
...
done <<< "$list"
Using sh instead of bash. The code as it is doesn't run because of the last line:
syntax error: unexpected redirection
Any fixes?
Edit: I need to edit variables and access them outside the loop, so using | is not acceptable (as it creates a sub-shell with independent scope)
Edit 2: This question is NOT similar to Why does my Bash counter reset after while loop, as I am not using | (as I noted in the last edit). I am asking for another way of achieving it. (The answers to the linked question only explain why the problem happens but do not provide any solutions that work with sh (not bash).)
There's no purely syntactic way to do this in POSIX sh. You'll need to use either a temporary file for the output of the command, or a named pipe.
mkfifo output
command > output &
while read -r arg; do
...
done < output
rm output
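For reference, here is the temporary-file variant mentioned above. This is only a sketch, and it assumes mktemp is available (not strictly POSIX, but present almost everywhere); since the loop reads from a redirection rather than a pipe, variables assigned inside it stay visible afterwards:
tmp=$(mktemp) || exit 1
command > "$tmp"
while read -r arg; do
...
done < "$tmp"
rm -f "$tmp"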
Any reason you can't do this? It should work, unless you are assigning variables inside the loop that you want visible when it's done.
command |
while read -r arg
do
...
done
I want to delete all files in a folder which contain the word TRAR in their filename. I have tried the following:
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm-rf *TRAR");
Remove all your config lines (are they even Perl?)
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
and
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm -rf *TRAR")
should work, but you should really be using Perl code (unlink, etc.).
I suspect you are confusing Perl usage with how you would use awk in bash scripts.
As @Steffen Ullrich said, that isn't Perl or Shell. But I'll try to make it a little more Perlish for you:
First, note that
variables in Perl start with a $
strings need "quotes around them"
statements end with a ;
spaces around = are ok and make it all easier to read
so
$CONFIG_DIR = `pwd`;
$VENDOR = "ericsson-msc";
$RELEASE = "v1";
$BASE_DIR = "/appl/virtuo/gways";
Next, see how you can combine these into a single string like this (I'm guessing that's what you want to do)
$DIR_FOR_CLEANING = "$BASE_DIR/config/$VENDOR/$RELEASE/spool/input_d";
Lastly, you should be really careful whenever using the -r option to rm along with a wildcard like *. Look up the man page for rm and see if -r is something you want to do. I don't think you need it here, unless you have directories named *TRAR that you want to recurse into and remove. I'll bet you only have files named *TRAR in that input_d directory.
Also, the command as you wrote it could fail the cd if that directory doesn't exist, and would then proceed to recursively remove *TRAR from whatever directory you're running the script from. But you don't need to change directories at all. Try something like this:
system ("echo rm -f $DIR_FOR_CLEANING/*TRAR");
If the echo command lists the files you do in fact want it to remove, then remove the "echo" and the rm will start deleting stuff.
I'm using find-name-dired to find a bunch of files (all with .orig file ending)
I would then like to mark all the files in the resulting *Find* buffer for deletion, then delete them
Unfortunately they are root owned, so the delete fails due to lack of permissions
Is there some workaround here, tramp or something like that?
You can presumably mark the files, then use ! sudo rm
You can do this using sudo through tramp. When find-name-dired prompts for the directory name, modify it and put /sudo:: at the start. E.g. change /foo/bar into /sudo::/foo/bar. (Take care of relative paths and ~ paths.) It will prompt for your sudo password, and then you should be able to delete files as usual.
I am looking for a safe and reliable way to overwrite ONE line in a text file. I don't care if it's using sed, grep, perl whatever. It just needs to be portable and reliable. Specifically what I am trying to do is replace the value of a variable I have saved in a text file at runtime. Let's say I have a file named variables.txt which contains a line that reads userName=stephen. Let's say my program wants to change the userName to frank. Here's what I've come up with using sed:
sed -i '' 's/userName.*/userName=frank/' variables.txt
The concern I have with this is that I've read that on some versions of sed using the '-i' switch without specifying a backup file could cause the command to fail or risk possible file corruption. Not an option. What do you guys think?
Edit
For those asking where I read about command failure and file corruption: the manpage for my version of sed recommends against providing an empty value for the -i switch; also take a look at the comments on this page:
It seems that some versions of sed require the argument after -i and others do not. With GNU sed version 4.1.x, it seems that the -i does not require an argument and specifying an empty argument after it actually fails.
It sounds like the unanimous recommendation is to provide a backup file and then delete it after the command completes. However I'm still concerned about this solution since my version of sed doesn't even support the --version switch. My primary concern here is that the solution is both reliable and portable.
t=$(tempfile)
sed -e 's/userName.*/userName=frank/' variables.txt > "$t"
cp "$t" variables.txt
rm "$t"
You can also use mv, but that won't preserve the file's permissions.
If tempfile is not available, use some other method ($$.bak) to create the filename.
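A hedged sketch of that fallback, assuming mktemp as the usual replacement for tempfile and a $$-based name only as a last resort:
t=$(mktemp 2>/dev/null) || t=./variables.txt.$$.bak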
As long as you don't run on Windows, sed -i is as safe as anything. Even if the machine crashes mid-process, variables.txt will either have the old content or the new content -- it should never be missing or corrupt.
The concern I have with this is that I've read that on some versions of sed using the '-i' switch without specifying a backup file could cause the command to fail or risk possible file corruption.
Where have you read it?
It is not strictly true: this is not a usual problem with sed, but all programs (including sed) can fail and, in very unlikely situations, corrupt data. So you should not be too afraid of data corruption with sed.
Anyway, why not use -i with an extension (such as -i.bak) for extra safety? You can always erase the backup file with rm afterwards...
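A minimal sketch of that approach, using the substitution from the question; the && ensures the backup is only removed if sed exited successfully:
sed -i.bak 's/userName.*/userName=frank/' variables.txt && rm variables.txt.bak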
You could copy variables.txt to variables.txt.bak before running the command if you wanted to keep a backup. Or go the other way around:
sed 's/userName.*/userName=frank/' variables.txt > variables.txt.fixed
if [ $? -eq 0 ]
then
cp variables.txt.fixed variables.txt
fi
I am asked to diff two directories using Perl but I think something is wrong with my command,
$diff = system("sudo diff -r '/Volumes/$vol1' '/Volumes/$vol2\\ 1/' >> $diff.txt");
It doesn't display any output. Can someone help me with this? Thanks!
It seems that you want to store all differences in a string.
If this is the case, the command in the question is not going to work for a few reasons:
It's hard to tell whether it's intended or not, but the $diff variable is being used to set the filename storing the differences. Perhaps this should be diff.txt, not $diff.txt
The result of the diff command is saved in $diff.txt. It doesn't display anything on STDOUT. This can be remedied by omitting the >> $diff.txt part. If it also needs to be stored in a file, consider the tee command:
sudo diff -r dir1/ dir2/ | tee diff.txt
When the return value of system is assigned to a variable, it will be 0 upon success. To quote the documentation:
The return value is the exit status of the program as returned by the wait call.
This means that $diff won't store the differences, but rather the command's exit status. A more sensible approach would be to use backticks, which let $diff store whatever the command writes to STDOUT:
my $diff = `sudo diff -r dir1/ dir2/ | tee diff.txt`; # Not $diff.txt
Do you really need to use the sudo command? Avoid it if at all possible:
my $diff = `diff -r dir1/ dir2/ | tee diff.txt`; # Not $diff.txt
A final recommendation
Let a good CPAN module take care of this task, as backtick calls can only go so far. Some have already been suggested here; they may be well worth a look.
Is the sudo diff command prompting for a password?
If possible, take out the sudo from the invocation of diff, and run your script with sudo.
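For example (the script name here is only a placeholder):
sudo perl compare_dirs.pl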
"It doesn't display and output." -- this is becuase you are saving the differences to a file, and then (presumably) not doing anything with that resulting file.
However, I expect "diff two directories using Perl" does not mean "use system() to do it in the shell and then capture the results". Have you considered doing this in the language itself? For example, see Text::Diff. For more nuanced control over what constitutes a "difference", you can simply read in each file and craft your own algorithm to perform the comparisons and compile the similarities and differences.
You might want to check out Test::Differences for a more flexible diff implementation.