Find and soft link without the parent path - find

So I have a find command, as below, which finds the libclntsh.so.* files in a directory named instantclient.
find instantclient -type f -name "*libclntsh\.so\.[0-9]*\.[0-9]*"
This results in, for example,
instantclient/libclntsh.so.11.1
How do I now run ln within the instantclient directory (ln -s libclntsh.so.11.1 libclntsh.so), all with a find command in combination with -exec?
I should mention here that I DO NOT want to cd into instantclient.
And this is for Alpine Linux.

Use the -execdir option. As per the manual:
-execdir command {} ;
Like -exec, but the specified command is run from the subdirectory containing the matched file, which is not normally the directory in which you started find. This is a much more secure method for invoking commands, as it avoids race conditions during resolution of the paths to the matched files.
So your command will be:
find instantclient -type f -name "*libclntsh\.so\.[0-9]*\.[0-9]*" -execdir ln -s {} libclntsh.so \;
EDIT:
Another solution:
find instantclient -type f -name "*libclntsh\.so\.[0-9]*\.[0-9]*" | xargs -I {} sh -c 'ln -s $(basename {}) instantclient/libclntsh.so'
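(With GNU find, {} under -execdir expands to ./filename, so the link created by the -execdir command above will typically point at ./libclntsh.so.11.1; it still resolves correctly.)
If -execdir is not available (on Alpine, the default BusyBox find may not support it), a similar effect can be had with plain -exec and a small shell wrapper. This is only a sketch of that idea, computing the link name and target with dirname/basename so that no cd is needed and the file name is never embedded in the shell string:
find instantclient -type f -name "*libclntsh\.so\.[0-9]*\.[0-9]*" -exec sh -c '
  for f; do
    # target is just the file name; the link is created next to the file
    ln -sf "$(basename "$f")" "$(dirname "$f")/libclntsh.so"
  done
' sh {} +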

Related

excluded directories in find command not properly piped to -exec cp

I am trying to copy a folder containing a subfolder structure, while excluding a specified subfolder, by using the find -exec cp command. I have managed to use multiple working exclusion options when using the find command alone, but once I add the '-exec cp' part, the exclusion terms no longer work.
Imagine the directory of interest containing multiple files and subfolders, with one subfolder named "exclusion_string"
This find command works properly when used alone:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*"
... while this command negates the exclusion criterion:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*" -exec cp -r '{}' . \;
Likewise, when using other criteria or arguments, the exclusion of a subdirectory is lost, e.g.:
find ~/directory/of/interest/ -maxdepth 2 -name "*" -size -100k -exec cp -r '{}' . \;
find ~/directory/of/interest/ -maxdepth 2 -name "*exclusion_string*" | xargs cp -rt .
What am I missing here?
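No answer was quoted with this one, but the likely culprit is that the starting directory itself (depth 0) still satisfies ! -name "*exclusion_string*", so -exec cp -r copies it wholesale, excluded subfolder included. A sketch of the usual fix, with -mindepth 1 to skip the top-level directory and -prune to keep find out of the excluded subfolder (these two additions are my assumption about the intended behaviour, not part of the original thread):
find ~/directory/of/interest/ -mindepth 1 -maxdepth 2 -name "*exclusion_string*" -prune -o -exec cp -r '{}' . \;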

find command behaves weird on AIX

I found something weird with the find command on AIX 7.1. I executed a find command to list files with specific permissions. It works when I execute it as below:
find . -type f -perm 600 -exec ls -la {} \;
But it fails to pull the files when I execute it as below:
find /user -type f -perm 600 -exec ls -la {} \;
Is anyone familiar with this kind of error?
Thanks,

How to copy a file to several directories with the name *Co* (where * = wildcard)

How do I copy a file to several directories of the form *Co* or *52?
Apparently, just typing
cp fileA *Co*
won't work.
My other concern is that if a directory already contains fileA, I don't want it to be overwritten. That is, if the directory *Co* contains fileA, do NOT copy it. Is there a one-line solution for this? I think writing a script with if-else is overkill.
Thanks!
If your version of cp supports -n, you can do:
find . -name '*Co*' -exec cp -n fileA {} \;
If not:
find . -name '*Co*' -exec sh -c 'test -f "$0"/fileA || cp fileA "$0"' {} \;
Note that these will each descend recursively; if you don't want that, you can limit the scope of find. To match either *Co* or *52, you can do:
find . \( -name '*Co*' -o -name '*52' \) -exec ...
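For illustration, filling in the -exec with the cp -n form from above and restricting the match to directories one level deep (the -type d and -maxdepth 1 parts are assumptions about the scope you want, not from the answer):
find . -maxdepth 1 -type d \( -name '*Co*' -o -name '*52' \) -exec cp -n fileA {} \;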

How to create links to all subfolders containing specified text in their names

As specified in the title, I am looking for a way to create links to all subfolders containing specified text in their names. For example, for all subfolders of the root directory containing ".app" in their names, a link would be created in the "/AppLinks" directory. I would like to use it in a bash script (open source, free).
Does anyone know how to do that?
I searched for it on Google with no luck.
find yourdir -type d -name '*.app' -exec ln -s {} /AppLinks \;
Find all directories named something.app in yourdir, and create a symlink to them in /AppLinks.
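One caveat: if yourdir is a relative path, the link targets stored in /AppLinks will be relative too and will dangle. A sketch that avoids that by handing find an absolute path, and that only links direct subfolders (yourdir is the placeholder from the command above; -maxdepth 1 is an assumption):
find "$(pwd)/yourdir" -maxdepth 1 -type d -name '*.app' -exec ln -s {} /AppLinks \;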
single line bash-fu
function FUNCsymlink() { echo "$1"; fileName=`basename "$1"`; ln -s "$1" "/AppLinks/$fileName"; }; export -f FUNCsymlink; find `pwd`/ -maxdepth 1 -type d -iname "*.app" -exec bash -c "FUNCsymlink '{}'" \;
For easier reading:
function FUNCsymlink() {
    echo "$1";
    fileName=`basename "$1"`;
    ln -s "$1" "/AppLinks/$fileName";
};
export -f FUNCsymlink;
find `pwd`/ -maxdepth 1 -type d -iname "*.app" -exec bash -c "FUNCsymlink '{}'" \;
you may have to adjust it a bit for your specific solution.
Wherever you run it, it will create the symlinks in /AppLinks.
It will only look for direct subfolders, not subfolders of subfolders, which is what I believe you need.
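For comparison, roughly the same thing can be sketched without the exported function by letting a single sh -c loop over the matches (the same /AppLinks destination is assumed):
find "$(pwd)" -maxdepth 1 -type d -iname '*.app' -exec sh -c '
  for d; do
    ln -s "$d" "/AppLinks/$(basename "$d")"
  done
' sh {} +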

Query regarding Solaris find command with -exec option

I want to create a tar file with all the files found by executing a find command.
I tried the following command:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar.gz {} \;
But it is including only the last found file in the test.tar file. How do I include all the files in test.tar?
Regards
Chaitanya
Use command substitution:
tar cf test.tar $(find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7)
What this does is run the command inside $() and make its output the command-line arguments of the outer command.
This uses the more modern $() notation. If you are not using bash, you can also use backticks, which should work with most shells:
tar cf test.tar `find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7`
While backticks are more portable, the $() notation is easier if you need to nest command substitutions.
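A tiny illustration of the nesting point: both lines below print the name of the current directory, but the backtick form needs the inner pair escaped, which gets awkward quickly:
echo $(basename $(pwd))
echo `basename \`pwd\``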
You want to pipe the file names found by find into tar.
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar.gz {} \;
But it is including only the last found file in the test.tar file.
That's because for every file it finds, it runs a new tar command that overwrites the tar file from the previous command.
You can make find batch the files together by changing the \; to a +, but if there are more files than can be listed at once, find will still run multiple commands, each overwriting the tar file created by the previous one. You could pipe the output through xargs, but it has the same issue of possibly running the command multiple times. The command substitution recommended above is the safest way I know of to do it, ensuring that tar is called only once; however, if too many files are found, it may give an error about the command line being too long.
This one should work equally well:
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -exec tar cvf test.tar {} +
Note the "+" at the end instead of "\;".
For a reliable way when a very large number of files will match the search:
find . -name "*.log" -o -name "*.log.*" -mtime +7 > /tmp/find.out
tar cvf test.tar -I /tmp/find.out
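For reference, if GNU find and GNU tar happen to be available rather than the Solaris tools the question is about, the file list can be piped straight into tar, which also sidesteps any command-line length limit (a sketch, not part of the original answer):
find . \( -name "*.log" -o -name "*.log.*" \) -mtime +7 -print0 | tar cvf test.tar --null -T -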