How to exclude symlink files from find command? - sed

I use the following command to do some processing on text files; however, it changes the symlink files as well. Is there a way to exclude symlink files from the find command and include only the actual files?
find . \( -name "*.txt" ! -name "release.txt" \) | xargs -t -n1 sed -i '' -e '/^#/d;/^nocrc=/d;/acaddr=/d;/^$/d;s/[ \t]*$//'
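One minimal tweak (a sketch, keeping the rest of your command as-is): find does not follow symlinks by default, so a symlink is reported as type l rather than f, and adding -type f leaves only regular files:
find . -type f \( -name "*.txt" ! -name "release.txt" \) | xargs -t -n1 sed -i '' -e '/^#/d;/^nocrc=/d;/acaddr=/d;/^$/d;s/[ \t]*$//'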

Related

How to build cscope database for AOSP source?

I am using the cscope -b -R command from the AOSP root directory to build its database. I kept cscope running for more than 9 hours, but its database is not created. There is no cscope.out file there. Is it stuck somewhere?
Check this blog post: https://nativeguru.wordpress.com/2015/02/10/aosp-code-navigation-with-cscope/
You can first create the cscope.files file containing the paths of all the files you want to navigate, then run cscope as below.
$ cd <aosp_root_dir>
$ find . -type f \( -name "*.java" -o -name "*.c" -o -name "*.cpp" -o -name "*.h" \) -and -not \( -path "./out/*" -o -path "./prebuilts/*" -o -path "./external/*" -o -path "./dalvik/*" -o -path "./ndk/*" \) > cscope.files
$ cscope -b -q -k
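Once cscope.out exists, a small usage note (an addition, not from the blog post): the -d flag makes cscope open the existing database without rebuilding it:
$ cscope -d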

How to find and replace a string in multiple files in unix?

I can find and replace a string in multiple files with the command below.
find . -name '*.py' | xargs sed -i 's/foo/faa/g'
Can we do the same by using Perl?
Try this command:
find . -name '*.py' | xargs perl -p -i -e 's/foo/faa/g'
N.B.: If you want to make a backup copy of your files before changing them, give the -i flag an extension, e.g. -i.bak.
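For example, a variant of the same command (shown for illustration, not part of the original answer) that keeps a .bak copy of every file before it is modified:
find . -name '*.py' | xargs perl -p -i.bak -e 's/foo/faa/g'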

Use grep / sed for filename search & replace?

I have a bunch of image files that were incorrectly named 'something#x2.png' and they need to be 'something#2x.png'. They're spread across multiple directories like so:
/images
something#x2.png
/icons
icon#x2.png
/backgrounds
background#x2.png
How can I use grep + sed to find/replace as needed?
Ruby (1.9+)
$ ruby -e 'Dir["**/*#x2.png"].each{|x| File.rename( x, x.sub(/#x2/,"#2x") ) }'
Look at qmv and rename
find -iname '*.png' -print0 | xargs -0 qmv -d
will launch your default editor and allow you to interactively edit the names
rename s/#x2/#2x/ *.png
The slashes look Linux/Unix-like to me. Do you have find and rename?
find -name "*#x2*" -execdir rename 's/#x2/#2x/' {} +
rename is worth installing; it ships in a Perl package on most distributions.
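As a small aside (assuming the Perl flavour of rename, which is what most distributions package): -n does a dry run and only prints what would be renamed:
rename -n 's/#x2/#2x/' *.png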
With bash 2.x/3.x
#!/bin/bash
while IFS= read -r -d $'\0' file; do
    # Escape the leading # in the pattern; unescaped, ${file/#x2/...} would anchor the match to the start of the string
    echo mv "$file" "${file/\#x2/#2x}"
done < <(find images/ -type f -name "something*#x2*.png" -print0)
With bash 4.x
#!/bin/bash
shopt -s globstar
for file in images/**; do
    # $file carries the images/ prefix, so match with a leading *; escape the # so it is matched literally in the substitution
    [[ "$file" == *something*#x2*.png ]] && echo mv "$file" "${file/\#x2/#2x}"
done
Note:
In each case I left an echo in place so you can do a dry run; remove the echo once the output looks correct.
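Since the question's files sit in several sibling directories (images/, icons/, backgrounds/), here is a hedged tweak of the same idea that sweeps all of them one level deep; the echo is kept for a dry run as above:
for file in */*#x2*.png; do
    echo mv "$file" "${file/\#x2/#2x}"
done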

Query regarding Solaris find command with -exec option

I want to create a tar file with all the files found by the find command.
I tried the following command:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar.gz {} \;
But it includes only the last file found in the test.tar file. How do I include all the files in test.tar?
Use command substitution:
tar cf test.tar $(find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7)
This runs the command inside $() and makes its output the command-line arguments of the outer command.
$() is the more modern notation. If you are not using bash, you can also use backticks, which work with most shells:
tar cf test.tar `find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7`
While backticks are more portable, the $() notation is easier if you need to nest command substitutions.
You want to pipe the file names found by find into tar.
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar.gz {} \;
But it includes only the last file found in the test.tar file.
That's because for every file it finds, it runs a new tar command that overwrites the tar file created by the previous one.
You can make find batch the files together by changing the \; to a +, but if more files are found than can be listed at once, find will still run multiple commands, each overwriting the tar file created by the previous one. You could pipe the output through xargs, but it has the same issue of possibly running the command multiple times. The command substitution recommended above is the safest way I know of to do it, since it ensures tar is called only once, though if too many files are found it may fail with an error about the command line being too long.
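If the list really is too long for one command line, here is a sketch of another route (not from the answers above, and assuming GNU find, xargs and tar rather than the stock Solaris tools): use tar's r (append) function through xargs, so each batch is added to test.tar instead of overwriting it; GNU tar creates the archive on the first append if it does not exist yet:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -print0 | xargs -0 tar rvf test.tar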
This one should work equally well:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar {} +
Note the "+" at the end vs. "\;".
For a reliable way when a very large number of files will match the search, use Solaris tar's -I option, which reads the list of files to archive from a file:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 > /tmp/find.out
tar cvf test.tar -I /tmp/find.out
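A side note in case the same trick is needed with GNU tar (where -I means something different): the GNU option for reading the file list from a file is -T (--files-from):
tar cvf test.tar -T /tmp/find.out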

using find command to search for all files having some text pattern

I use the following find command to find and show all files containing the input text pattern.
find . -type f -print|xargs grep -n "pattern"
I have many project folders, each of which has its own makefile named 'Makefile' (no file extension, just 'Makefile').
How do I use the above command to search for a certain pattern only in the files named Makefile present in all my project folders?
-print is not required (at least by the GNU find implementation). The -name argument lets you specify a filename pattern. Hence the command would be:
find . -name Makefile | xargs grep pattern
If you have spaces or odd characters in your directory paths, you'll need to use the null-terminated method:
find . -name Makefile -print0 | xargs -0 grep pattern
find . -type f -name 'Makefile' | xargs egrep -n "pattern"
use egrep if you have very long paths
You can avoid the use of xargs by using -exec:
find . -type f -name 'Makefile' -exec egrep -Hn "pattern" {} \;
-H on egrep to output the full path to the matching files.
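A hedged variant (assuming a find that supports the POSIX + terminator for -exec): ending with + instead of \; hands many Makefiles to each egrep invocation, which is noticeably faster on large trees:
find . -type f -name 'Makefile' -exec egrep -Hn "pattern" {} +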
grep -R "string" /path
See this link:
http://rulariteducation.blogspot.in/2016/03/how-to-check-particluar-string-in-linux.html
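If GNU grep is available, a sketch that answers the Makefile question without find at all: --include restricts the recursive search to files with a given name:
grep -Rn --include=Makefile "pattern" .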
You can use the ff command, i.e. ff -p <format>; for example, ff -p *.txt
Find big files occupying large disk space
We need to combine multiple commands:
find . -type f | xargs du -sk | sort -n | tail
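A small hedged refinement (assuming GNU find and xargs): null-terminated names keep the pipeline working when file names contain spaces:
find . -type f -print0 | xargs -0 du -sk | sort -n | tail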