I am using the cscope -b -R command from the AOSP root directory to build its database. I have kept cscope running for more than 9 hours, but the database is not created; there is no cscope.out file. Is it stuck somewhere?
Check this blog post: https://nativeguru.wordpress.com/2015/02/10/aosp-code-navigation-with-cscope/
You can first create a cscope.files file containing the paths of all the files you want to navigate, then run cscope as shown below.
$ cd <aosp_root_dir>
$ find . -type f \( -name "*.java" -o -name "*.c" -o -name "*.cpp" -o -name "*.h" \) -and -not \( -path "./out/*" -o -path "./prebuilts/*" -o -path "./external/*" -o -path "./dalvik/*" -o -path "./ndk/*" \) > cscope.files
$ cscope -b -q -k
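Once the database is built (cscope.out, plus cscope.in.out and cscope.po.out because of -q), it can be queried without rebuilding. A small sketch of what that might look like; the symbol name here is just a placeholder:
$ # Open the interactive browser against the existing database (-d: don't rebuild)
$ cscope -d
$ # Or run a one-shot line-oriented query; field 1 means "find this global definition"
$ cscope -dL1 SurfaceFlinger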
I use the following command to do some processing on text files; however, it changes the symlinked files as well. Is there a way to exclude symlinks from the find command and include only the actual files?
find . \( -name "*.txt" ! -name "release.txt" \) | xargs -t -n1 sed -i '' -e '/^#/d;/^nocrc=/d;/acaddr=/d;/^$/d;s/[ \t]*$//'
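One common way to handle this is to restrict find to regular files with -type f, which skips symlinks (find tests the link itself unless you pass -L). A minimal sketch, keeping the original sed expression (the empty -i argument assumes BSD/macOS sed):
# -type f matches only regular files, so symlinks are excluded
find . -type f \( -name "*.txt" ! -name "release.txt" \) | xargs -t -n1 sed -i '' -e '/^#/d;/^nocrc=/d;/acaddr=/d;/^$/d;s/[ \t]*$//'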
I tried to delete all the .swp files which vim created, using the following command:
find . -delete -name "*.swp"
Then my whole project was deleted...
Can anyone tell me why? And how can I recover the project?
If you change the command to this:
find . -name "*.swp" -delete
it will only delete the files it matches.
find . -delete -name "*.swp" is effectively the same as rm -rf *: the -delete action comes before the -name test, so it removes everything it visits.
Treat this as a warning: with find, put your tests before your actions.
find predicates and actions form a boolean expression that is evaluated left to right with short-circuiting.
Your expression:
find . -delete -name "*.swp"
is equivalent to the bash expression:
rm "$file" && [[ $file == *.swp ]]
This deletes the file, and if the deletion is successful it checks the name.
Compare this to:
# Like: find . -name "*.swp" -delete
[[ $file == *.swp ]] && rm "$file"
In this case, it checks the name. If the check passes, it deletes the file. This is what you intended.
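A simple habit that prevents this kind of accident is to run the expression with -print first, and only append -delete once the listed files are exactly what you expect. A quick sketch:
# Dry run: list what would be deleted, delete nothing
find . -name '*.swp' -print
# Once the list looks right, put -delete at the end of the expression
find . -name '*.swp' -delete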
This behavior is super useful, because it allows you to write more advanced control flow and branching:
find . -name '.git' -prune \
-o \( -name '*.xz' -exec xz -d {} \; \
-o -name '*.gz' -exec gzip -d {} \; \) \
-printf 'Successfully extracted %f\n'
This expression will skip any .git directory, run xz -d on .xz files only, gzip -d on .gz files only, and finally print a message for the files that were extracted.
As for your files, they're deleted. It's often not possible to get them back. You'll have to restore them from your backup or, if you're desperate, try to follow a file un-deletion guide for your OS and filesystem (but again, it's often not possible).
This is my script; it searches for all the log files, zips them, and deletes the older archives.
However, when I run this script I get the following error:
./file.sh: test: unknown operator .
Code:
#Directory of archives
archive_dr="/u01/apps/weblogic/weblogic10/user_projects/archive"
#Directory of log files
logdir="/u01/apps/weblogic/weblogic10/user_projects/domains/BPM/servers/BPM_MS1/logs"
cd $archive_dr
#Removing older archived files
if [ find . \( -name '*.log0*.gz' -o \
-name '*.out0*.gz' \) ]
then
rm *.out00*.gz *.log00*.gz
fi
cd $logdir
#Search,zip and move the new archive files
if [ find . \( -name '*.log0*' -o -name '*.out0*' \) \
-atime +30 ]
then
for log_files in `find . \( \
-name '*.log0*' -o -name '*.out0*' \
\) -atime +30`
do
gzip $log_files
mv $log_files.gz /u01/a*/w*/w*/us*/archive
done
if [$? = 0]; then
echo "Logs Archieved Successfully"|
mailx -s " Logs Archieved Successfully" \
-c 'x#abc.com' 'y#abc.com'
fi
fi
Please suggest where I am going wrong.
Change:
if [ find . \( -name '*.log0*.gz' -o -name '*.out0*.gz' \) ]; then
to:
if [ "$(find . \( -name '*.log0*.gz' -o -name '*.out0*.gz' \))" ]; then
You want to run the find command and test whether it produces any output. The test command (which is what [ is an abbreviation for) doesn't execute its contents; it expects them to form an expression to test, as in if [ "$foo" = 3 ].
Note also that find recurses into subdirectories, but you rm only in the current directory. If you don't want to recurse, add the -maxdepth 1 option.
There's no need for the second if. If that find doesn't find any files, the for loop will have nothing to operate on and will just terminate immediately.
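Putting those points together, the first test could be dropped entirely in favour of letting find do the removal itself, limited to the current directory. A sketch; -maxdepth 1 assumes GNU find, and $archive_dr is the variable from the question:
cd "$archive_dr"
# Remove old archives in this directory only; if nothing matches, nothing is removed
find . -maxdepth 1 \( -name '*.log0*.gz' -o -name '*.out0*.gz' \) -exec rm {} +
The second if can likewise be deleted, leaving the for loop to run (or not) on whatever find prints.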
Sorry, I was not able to edit the post properly, but I got it running. :)
Code:
#Directory of archives
archive_dr="/u01/apps/weblogic/weblogic10/user_projects/archive"
#Directory of log files
logdir="/u01/apps/weblogic/weblogic10/user_projects/domains"
cd $archive_dr
#Removing older archived files
find . \( -name '*.log00*.gz' -o -name '*.out00*.gz' \) -exec rm {} \;
cd $logdir
#Search,zip and move the new archive files
for log_files in `find . \( -name '*.log0*' -o -name '*.out0*' \) -ctime +5`
do
gzip $log_files
mv $log_files.gz /u01/a*/w*/w*/us*/archive
done
if [ $? = 0 ]; then
echo "text"|mailx -s "test" -c abc#def.com' mno#pqr.com'
fi
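One further note on the working version: the backtick for loop splits on whitespace, so a log file with a space in its name would break it. A more defensive variant, assuming GNU find and bash (and using the $archive_dr variable instead of the globbed path):
# NUL-delimited names survive spaces; quote every expansion
find . \( -name '*.log0*' -o -name '*.out0*' \) -ctime +5 -print0 |
while IFS= read -r -d '' log_file; do
    gzip "$log_file"
    mv "$log_file.gz" "$archive_dr"
done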
I get a list of files from a find command. Now I need to remove the existing copies of those files from an archive and append the new files from the find output.
find /u01/apps/ \( -name '*.log0*' -o -name '*.out0*' \) -atime +30
returns a list of .out and .log files. Now, the existing copies of those files need to be deleted from a tarball (created manually) and the new files appended.
How can it be done?
I googled but couldn't find anything appropriate for this requirement.
Do you mind running the find command twice? Once to remove the files from the archive, and then again to add them?
This could help (maybe you should make some adjustments):
find /u01/apps/ \( -name '*.log0*' -o -name '*.out0*' \) -atime +30 -printf '%P\n' | xargs tar -f foobar.tar --delete
And then:
find /u01/apps/ \( -name '*.log0*' -o -name '*.out0*' \) -atime +30 -printf '%P\n' | xargs tar -f foobar.tar -r
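For reference, the same two-step idea can reuse a single find listing. A sketch, assuming GNU find/tar, that the tarball's member names are stored relative to /u01/apps, and that foobar.tar and the list file name are just placeholders:
cd /u01/apps
# %P prints each path relative to the starting point, matching the member names
find . \( -name '*.log0*' -o -name '*.out0*' \) -atime +30 -printf '%P\n' > /tmp/old_logs.list
# Step 1: drop the stale copies (tar complains about names not already in the archive)
xargs tar --delete -f foobar.tar < /tmp/old_logs.list
# Step 2: append the current files under the same names
xargs tar -rf foobar.tar < /tmp/old_logs.list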
I want to create a tar file containing all the files found by a find command.
I tried the following command:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar.gz {} \;
But it is including only the last found file in the test.tar file. How to include all files in test.tar file?
Use command substitution:
tar cf test.tar $(find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7)
What this does is run the command inside $() and make its output the command-line arguments of the outer command.
This uses the modern $(...) notation, supported by bash and any other POSIX shell. You can also use backticks, which work in virtually every shell:
tar cf test.tar `find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7`
While backticks are more portable, the $() notation is easier if you need to nest command line substitution.
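A quick illustration of the nesting point (the paths are arbitrary examples):
# $() nests naturally
echo $(basename $(dirname /usr/local/bin))      # prints "local"
# Backticks need the inner pair escaped
echo `basename \`dirname /usr/local/bin\``      # also prints "local"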
You want to pipe the file names found by find into tar.
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar.gz {} \;
But it is including only the last found file in the test.tar file.
That's because for every file it finds it is running a new tar command that overwrites the tar file from the previous command.
You can make find batch the files together by changing the \; to a +, but if there are more files than can be listed at once, find will still run multiple commands, each overwriting the tar file from the previous one. You could pipe the output through xargs, but it has the same issue of possibly running the command multiple times. The command substitution recommended above is the safest way I know of to do it, ensuring that tar is called only once -- but if too many files are found, it may give an error about the command line being too long.
This one should work equally well:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar {} +
Note the "+" at the end vs "\;".
For a reliable way when a very large number of files will match the search:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 > /tmp/find.out
tar cvf test.tar -I /tmp/find.out
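Whether -I reads a file list depends on the tar implementation (it does on Solaris tar; GNU tar uses -T/--files-from for that, and treats -I as a compression-program option). With GNU find and GNU tar, a variant that also copes with unusual filenames could look like this:
# NUL-delimited names handle spaces and newlines in paths
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -print0 |
    tar -cvf test.tar --null --files-from=-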