Standalone shell script works fine, but not when used in the srcs of sh_binary - sh

I have a project structure as follows:
PROJECT_STRUCTURE
Now my_shbin.sh is as below:
#!/bin/bash
find ../../ \( -name "*.java" -o -name "*.xml" -o -name "*.html" -o -name "*.js" -o -name "*.css" \) | grep -vE "/node_modules/|/target/|/dist/" >> temp-scan-files.txt
# scan project files for offensive terms
IFS=$'\n'
for file in $(cat temp-scan-files.txt); do
  grep -iF -f temp-scan-regex.txt "$file" >> its-scan-report.txt
done
This script works completely fine when invoked individually and gives the required results. But when I add the sh_binary below to my BUILD file, I do not see anything in temp-scan-files.txt, and thus nothing in its-scan-report.txt either:
sh_binary(
    name = "findFiles",
    srcs = ["src/test/resources/my_shbin.sh"],
    data = glob(["temp-scan-files.txt", "temp-scan-regex.txt", "its-scan-report.txt"]),
)
I ran the sh_binary from IntelliJ using the play icon and also tried running it from the terminal using bazel run :findFiles. No error is shown, but I cannot see any data in temp-scan-files.txt.
Any help on this issue would be appreciated. The Bazel documentation is very sparse here, with almost no information beyond the basic use case.

When a binary is run using bazel run, it's run from the "runfiles tree" for that binary. The runfiles tree is a directory tree that Bazel creates containing symlinks to the binary's declared inputs. Try putting pwd and tree at the beginning of the shell script to see what this looks like. The reason the runfiles tree doesn't contain any of the files in src/main is that they're not declared as inputs to the sh_binary (e.g. using the data attribute). See https://docs.bazel.build/versions/master/user-manual.html#run
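For example, a minimal debugging snippet at the top of my_shbin.sh could look like this (just a sketch; if tree isn't installed, ls -R shows roughly the same thing):
# Show where "bazel run" actually executes the script from,
# and what the runfiles tree around it contains.
pwd
ls -R .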
Another thing to note is that the glob in data = glob(["temp-scan-files.txt", "temp-scan-regex.txt", "its-scan-report.txt"]) won't match anything, because those files live in src/test/resources relative to the BUILD file. Also, the script tries to modify these files, and it's generally not possible to modify input files: if this sh_binary were run as a build action, its inputs would effectively be read-only. It only works at all here because bazel run is similar to building the binary and then running it by itself outside Bazel, e.g. bazel build //target && bazel-bin/target.
The most straightforward way to do this might be something like this:
genrule(
    name = "gen_report",
    srcs = [
        # This must be the first element of srcs so that
        # the regex file gets passed to the "-f" of grep in cmd below.
        "src/test/resources/temp-scan-regex.txt",
    ] + glob(
        [
            "src/main/**/*.java",
            "src/main/**/*.xml",
            "src/main/**/*.html",
            "src/main/**/*.js",
            "src/main/**/*.css",
        ],
        exclude = [
            "**/node_modules/**",
            "**/target/**",
            "**/dist/**",
        ],
    ),
    outs = ["its-scan-report.txt"],
    # The first element of $(SRCS) will be the regex file, passed to -f.
    cmd = "grep -iF -f $(SRCS) > $@",
)
$(SRCS) expands to the files in srcs delimited by spaces, and $@ means "the output file, if there's only one". $(SRCS) will contain the temp-scan-regex.txt file, which you probably don't want to include as part of the scan, but since it's the first element it becomes the parameter to -f. This is maybe a bit hacky and a little fragile, but it's also kind of annoying to try to separate the file out (e.g. using grep or sed or array slicing).
Then bazel build //project/root/myPackage:its-scan-report.txt
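Note that the generated report ends up in Bazel's output tree rather than in your source tree; depending on the Bazel version, genrule outputs land under bazel-genfiles or bazel-bin, so something like this (using the placeholder package path from above) should show it:
bazel build //project/root/myPackage:its-scan-report.txt
cat bazel-genfiles/project/root/myPackage/its-scan-report.txt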

Related

Autocomplete directories in a subfolder with the Fish shell

I'm having trouble getting the 'complete' function in the fish shell to behave as I would like and I've been searching for an answer for days now.
Summary
Essentially I need to provide tab directory auto-completion as if I were in a different directory from the one I am currently in. It should behave exactly as 'cd' and 'ls' do, but with the starting point in another directory. It seems like such a trivial thing to be able to do, but I can't find a way to make it work.
Explanation
Example folder structure below:
- root
  - foo
    - a
      - dir1
        - subdir1
      - dir2
        - subdir2
    - b
      - dir3
        - subdir3
      - dir4
        - subdir4
I am running these scripts whilst in the 'root' directory, but I need tab auto-completion to behave as if I were in the 'foo' directory, so that I can type
testfunc -d a/dir2/subdir2
Instead of
testfunc -d foo/a/dir2/subdir2
There are a lot of directories inside 'foo' and a lot of sub-directories within them, and this auto-complete behaviour is needed to speed up our process (this script is used extensively throughout the day).
Attempted Solution
I've tried using the 'complete' builtin to get this working by specifying the directory to use, but all this managed to do was auto-complete the first level of directories and append a space after the argument, instead of continuing to auto-complete like 'cd' would.
complete -x -c testfunc -a "(__fish_complete_directories ./foo/)"
Working bash version
I have already got this working in Bash and I am trying to port it over to fish. See below for the Bash version.
_testfunc()
{
    local cur prev words cword
    _init_completion || return
    compopt +o default
    case $prev in
        testfunc)
            COMPREPLY=( $( compgen -W '-d' -- "$cur" ) )
            compopt +o nospace
            return
            ;;
        -d)
            curdir=$(pwd)
            cd foo/ 2>/dev/null && _filedir -d
            COMPREPLY=( $( compgen -d -S / -- "$cur" ) )
            cd $curdir
            return
            ;;
    esac
} &&
complete -o nospace -F _testfunc testfunc
This is essentially stepping into the folder that I want, doing the autocompletion, then stepping back into the original folder that the script was run in. I was hoping this would be easier in Fish after getting it working in Bash (I need to support these two shells), but I'm just pulling my hair out.
Any help would be really appreciated!
I am not a bash completions expert, but it looks like the bash completions are implemented by changing directories, running completions, and then changing back. You can do the same in fish:
function complete_testfunc
    set prevdir $PWD
    cd foo
    __fish_complete_directories
    cd $prevdir
end
complete -x -c testfunc -a "(complete_testfunc)"
does that work for you?

Git Bash find exec recursively on folders and files containing spaces

Question: In Git Bash on Windows, how would you run the following in a way that it also searches folders with spaces in the name, and executes on files with spaces in the name?
$ find ./ -type f -name '*.png' -exec sh -c 'cwebp -q 75 $1 -o "${1%.png}.webp"' _ {} \;
Context: I'm running Git Bash on Windows, trying to execute a command on all found .png files to convert them to .webp format. It works for all files without spaces in the path, but it fails to find files with spaces in the filename or files within folders that have spaces in the folder name. A few considerations:
- I have many, many levels of folders to iterate through, and I can't run this command separately for each. I really need the recursion to work.
- I cannot change the folder names; it will break other dependencies (nor did I create the folder or filenames originally, so cut me some slack!)
- I arrived here by following the suggestions from this article: https://www.smashingmagazine.com/2018/07/converting-images-to-webp/
- The program, to my knowledge, doesn't ship with any built-in recursive command... golly that'd be handy
Any help you can provide will be appreciated. Thanks!
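For what it's worth, the usual culprit in this kind of command is the unquoted $1 inside the sh -c body; quoting it (a sketch, not tested against cwebp specifically) lets find hand paths with spaces over intact:
find ./ -type f -name '*.png' -exec sh -c 'cwebp -q 75 "$1" -o "${1%.png}.webp"' _ {} \;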

Eclipse, Add all source file paths to an external tool as an argument

I would like to add an External Tool to my Eclipse CDT Project.
This external tool, which is a program I have written myself, requires several arguments (the map file and a list of all *.c, *.cpp and *.h files). I already managed to pass the map file, but is there any way of getting a list of all the *.c and *.h files (maybe with an Eclipse variable) so that I can add it directly to the argument field?
I found one solution which can be used on a Linux system: just use a pipe with the following command and put it in a shell script.
First of all, how to find all source code files:
find <rootfolder> -name '*.c' -o -name '*.cpp' -o -name '*.h'
Complete command:
find <rootfolder> -name '*.c' -o -name '*.cpp' -o -name '*.h' | xargs <myTool>
The first command will find the paths of all the .c, .cpp and .h files under the root folder, and the second one will convert its input into a set of arguments. The result is the same as if every found file path had been passed as a separate argument to mytool.
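If any of the paths may contain spaces, a null-delimited variant of the same pipeline is more robust (a sketch; <rootfolder> and <myTool> are the same placeholders as above):
find <rootfolder> \( -name '*.c' -o -name '*.cpp' -o -name '*.h' \) -print0 | xargs -0 <myTool>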

Delete files in a folder using Perl

I want to delete all files in a folder which contain the word TRAR in their filename. I have tried the following:
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm-rf *TRAR");
Remove all your config lines (are they even Perl?):
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
and
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm -rf *TRAR")
should work, but you should really be using Perl code (unlink, etc.).
I suspect you are confusing the usage of Perl with how you would use awk in bash scripts.
As @Steffen Ullrich said, that isn't Perl or shell. But I'll try to make it a little more Perlish for you:
First, note that:
- variables in Perl start with a $
- strings need "quotes around them"
- statements end with a ;
- spaces around = are OK and make it all easier to read
so
$CONFIG_DIR = `pwd`;
$VENDOR = "ericsson-msc";
$RELEASE = "v1";
$BASE_DIR = "/appl/virtuo/gways";
Next, see how you can combine these into a single string like this (I'm guessing that's what you want to do)
$DIR_FOR_CLEANING = "$BASE_DIR/config/$VENDOR/$RELEASE/spool/input_d";
Lastly, you should be really careful whenever using the -r option of rm together with a wildcard like *. Look up the man page for rm and check whether -r is really something you want. I don't think you need it here, unless you have directories named *TRAR that you want to recurse into and remove. I'll bet you only have files named *TRAR in that input_d directory.
Also, the command as you wrote it could fail at the cd if that directory doesn't exist, and would then proceed to recursively remove *TRAR from whatever directory you're running the script from. But you don't need to change directories at all. Try something like this:
system ("echo rm -f $DIR_FOR_CLEANING/*TRAR");
If the echo command lists the files you do in fact want it to remove, then remove the "echo" and the rm will start deleting stuff.

Find unused resource files (.jsp, .xhtml, images) in Eclipse

I'm developing a large web application in Eclipse and some of the resources (I'm talking about files, NOT code) are becoming obsolete, but I don't know which ones they are, and I'm still including them in the final war file.
I know Eclipse recognizes the file paths within its directory, because I can follow the link to an image or another page while editing one of my xhtml pages (using Ctrl). But is there a way to locate the unused resources in order to remove them?
Following these 3 steps would work for sites with a relatively finite number of dynamic pages:
1. Install your site on a filesystem mounted with atime (access time).
2. Try harvesting the whole site with wget.
3. Use find to see which files were not accessed recently.
Done.
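For step 3, something along these lines could work (a sketch; the web root path, the file extensions and the 60-minute window are assumptions you would adjust):
# List resource files that were NOT accessed in the last 60 minutes,
# i.e. files the wget crawl in step 2 never touched.
find /var/www/mysite -type f \( -name '*.jsp' -o -name '*.xhtml' -o -name '*.png' \) -amin +60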
As far as I know, Eclipse doesn't have this (I need it too).
I'm using grep in conjunction with bash scripting: a shell script takes the files in my resource folder, puts the filenames in a list, greps through the source code for every record in the list, and removes a record whenever grep finds a reference to it.
At the end the list is printed to the console, so only the unused resources remain in it.
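A rough sketch of that approach (the folder names are assumptions; adjust them to your project layout):
#!/bin/bash
# For every file in the resources folder, search the source tree for its name;
# if nothing references it, report it as probably unused.
for f in src/main/webapp/resources/*; do
    name=$(basename "$f")
    if ! grep -rqF "$name" src/main/java src/main/webapp; then
        echo "possibly unused: $f"
    fi
done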
UCDetector might be your best bet, specifically, the custom marker aspects of this tool.
In Eclipse I have not found a way. I have used the following shell commands instead.
Find .ftl template files which are NOT referenced in .java files
cd myfolder
find . -name "*.ftl" -printf "%f\n" |while read fname; do grep --include \*.java -rl "$fname" . > /dev/null || echo "${fname} not referenced" ; done;
or
Find all .ftl template files which are NOT referenced in .java, .ftl, .inc files
cd myfolder
find . -name "*.ftl" -printf "%f\n" |while read fname; do grep --include \*.java --include \*.ftl --include \*.inc -rl "$fname" . > /dev/null || echo "${fname} not referenced" ; done;
Note: on macOS you can use gfind instead of find in case -printf is not supported.
Example output:
productIndex2.ftl not referenced
showTestpage.ftl not referenced