How to do Bash process substitution in Scala? - scala

How would something like diff <(echo aoeu) <(echo snth) be done in Scala?
I've tried using the sys.process interface as follows:
"diff <(echo aoeu) <(echo snth)".!
...however, this doesn't interpret the <() as process substitution.

import scala.sys.process._
def diff(one: String, two: String): String = Seq(
  "bash", "-c",
  """
  diff <(printf '%s\n' "$1") \
       <(printf '%s\n' "$2"); retval=$?
  (( retval == 1 )) || exit "$retval"
  """,
  "_", one, two).!!
This can be tested in practice:
scala> diff("hello", "world")
res1: String =
"1c1
< hello
---
> world
"
To break down the reasoning:
Invoking a sequence, rather than a string, allows data (in my examples hello and world; in yours, aoeu and snth) to be passed out-of-band from code. This is critical to avoiding injection attacks when such content is parameterized.
Invoking bash as your executable ensures that process substitution syntax is available.
Checking for an exit status of 1 (and coercing it to 0) keeps Scala from treating diff's "inputs differ" exit status as an error, while still letting any other nonzero exit status become an exception in Scala.
Using printf '%s\n' "$1" instead of echo "$1" avoids ambiguities in the POSIX definition of echo (see in particular the APPLICATION USAGE section).
Passing an explicit argument of _ fills in the argv[0] slot (aka $0).
Note that invoking a sequence rather than a string also avoids a shell entirely in many cases. Seq("hello", "world").! doesn't need to invoke any shell: it can directly start an executable named hello with world as its single argument. By contrast, "hello world".! is simply re-tokenized on whitespace by scala.sys.process, so no shell features (quoting, redirection, process substitution) are available at all; when you do need them, you have to spell the shell out yourself, as in the Seq("bash", "-c", ...) call above, and that extra shell invocation carries both a performance cost and potential security exposure. See Shellshock for an example of a (now near-universally patched) case where a shell invocation with no explicit user-controlled parameters could still be vulnerable in practice (when invoked from a web server following CGI conventions for exporting request parameters as environment variables); avoiding unnecessary shells is thus preferable where feasible. A short sketch of the difference follows below.
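A minimal sketch of that difference, using echo purely as a stand-in executable:

import scala.sys.process._

// Sequence form: echo is started directly, with no shell involved;
// "two words" is delivered to it as a single argument.
Seq("echo", "two words").!

// String form: scala.sys.process simply re-tokenizes the string on
// whitespace, so echo receives "two" and "words" as separate arguments,
// and shell syntax such as <( ... ) is never interpreted; this is why
// the one-liner in the question fails.
"echo two words".!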

Related

how to use optional flag in fish

I'm building a CLI tasks utility (a cheap version of taskwarrior).
I want to add some optional flags, such as -n
else if [ $cmd = 'delete' ]
    argparse 'n/index'=? -- $argv
    sed -i "$_flag_index d" ~/.tasks/data/Tdo.csv
but this gives an error
~/.tasks/run.sh (line 14): No matches for wildcard “'n/index'=?”. See help expand.
argparse 'n/index'=? -- $argv
I'm unable to understand the correct usage of optional flags, and I haven't been able to find enough resources; the fish shell documentation isn't sufficient for a novice in shell scripting, given its lack of examples.
How do I accept an optional argument n/index, and then execute some code if the argument has been given and something else otherwise? And is it possible to add integer constraints on optional arguments?
Did you run help expand, as fish told you to?
The unquoted ? is being handled as a globbing character. Use 'n/index=?'
$ set argv --index=10
$ argparse 'n/index'=? -- $argv
fish: No matches for wildcard ''n/index'=?'. See `help expand`.
argparse --ignore-unknown 'n/index'=? -- $argv
^
# quote the whole thing
# v v
$ argparse 'n/index=?' -- $argv
$ set -S _flag_index
$_flag_index: set in local scope, unexported, with 1 elements
$_flag_index[1]: |10|
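To cover the rest of the question, here is a hedged sketch of how the flag might then be used (the function name delete_task is hypothetical, the sed line mirrors the question, and the string match regex is just one way to constrain the value to an integer):

function delete_task
    # quote the whole specifier so ? is not treated as a glob;
    # =? makes the flag's value optional
    argparse 'n/index=?' -- $argv
    or return

    if set -q _flag_index
        # only accept a plain integer index
        if string match -qr '^[0-9]+$' -- $_flag_index
            sed -i "$_flag_index d" ~/.tasks/data/Tdo.csv
        else
            echo "index must be an integer" >&2
            return 1
        end
    else
        echo "no index given; doing something else"
    end
end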

Can you `split` fish shell variables as cmd line args

Is it possible to have fish shell split variables that are in command line arguments?
Assume I have a variable $args set like so:
$ set args "-a args"
Now, given this python program (test.py):
import sys
print(sys.argv)
If I run the above in fish shell I get this output:
$ python test.py $args
['test.py', '-a args']
Notice that the arguments are passed as one argument. When I do the equivalent in bash I get this output:
$ python test.py $args
['test.py', '-a', 'args']
Is there some way to make fish behave like bash?
You do not want fish to behave like bash (technically, like any POSIX-compatible shell) with respect to variable expansion. The POSIX behavior is the source of endless problems and is why you need to put double-quotes around almost everything. In fact, most experienced people will tell you to add IFS=$'\n' at the top of your scripts to stop that auto-splitting from happening.
One answer is to use fish's "every variable is a list" feature: set args "-a" "args" (the quotes are just for clarity and aren't needed in this example). Each element of the list becomes a separate argument to the command, and this does the right thing even if an individual argument contains whitespace. The other answer is to explicitly split the string on whitespace using command substitution: a_cmd (string split ' ' $args). This will not do the right thing (in fish or bash) if an individual argument value contains whitespace.
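A minimal session sketch of both approaches, reusing the test.py from the question (the output shown is what one would expect):

# approach 1: store the arguments as a list, one element per argument
$ set args "-a" "args"
$ python test.py $args
['test.py', '-a', 'args']

# approach 2: keep the single string and split it at the call site
$ set args "-a args"
$ python test.py (string split ' ' $args)
['test.py', '-a', 'args']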
I found a little hack with fish commandline tokenization:
function posix_expand_str --description "Expand a string the POSIX way."
    # save the current command line, replace it with the arguments,
    # print the resulting tokens, then restore the original line
    set __posix_expand_str__oldline (commandline)
    commandline $argv
    commandline -o
    commandline $__posix_expand_str__oldline
    set -e __posix_expand_str__oldline
end
In testing, the argument strings behave as if they had been concatenated and then re-tokenized.
When you realize this answered your question, please accept. It only applies POSIX-style splitting when you ask it to, and it does not break quoted strings.
Test results:
> posix_expand_str "hello world"
hello
world
> posix_expand_str "hello 'posix haters' world"
hello
posix haters
world
> posix_expand_str "hello" 'high rep "stackoverflow staff"' "world"
hello
high
rep
stackoverflow staff
world

How to pass arguments/parameters to a SCALA script

How can we pass arguments to a Scala script the way we pass arguments to a shell script?
Much like you have to set an environment variable to pass JVM arguments, you can set an environment variable for your arguments.
export MYVARS="arg1 arg2 arg3"
Then in your scala script:
val args = sys.env("MYVARS").split(" ").map(_.trim).toList
args.foreach { println }
Start your script with a shell header at the top, like this:
Test.scala
#!/bin/sh
exec scala "$0" "$@"
!#
object Test {
def main(args: Array[String]): Unit = {
println(s"args: ${args.mkString("[", ", ", "]")}")
}
}
It works
[Desktop] ./Test.scala "scala is awesome" "java8 has lambdas"
args: [scala is awesome, java8 has lambdas]
More info regarding $0 and $@:
0 Expands to the name of the shell or shell script.
This is set at shell initialization. If bash is
invoked with a file of commands, $0 is set to
the name of that file. If bash is started with
the -c option, then $0 is set to the first argument
after the string to be executed, if one is present.
Otherwise, it is set to the file name used to invoke
bash, as given by argument zero.
@ Expands to the positional parameters, starting from
one. When the expansion occurs within double quotes, each
parameter expands to a separate word. That is, "$@" is
equivalent to "$1", "$2" ... If the double-quoted
expansion occurs within a word, the expansion of the first
parameter is joined with the beginning part of the original
word, and the expansion of the last parameter is joined
with the last part of the original word. When there are no
positional parameters, "$@" and $@ expand to nothing
(i.e., they are removed).
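As a quick illustration of what $0 and "$@" expand to in that header, here is a hypothetical stand-in script (args.sh is not part of the original answer):

#!/bin/sh
# args.sh: print the script name and each positional parameter on its own line
printf 'script: %s\n' "$0"
for arg in "$@"; do
    printf 'arg: %s\n' "$arg"
done

Running ./args.sh "scala is awesome" "java8 has lambdas" prints the script name followed by each quoted phrase as a single argument, which is exactly what exec scala "$0" "$@" relies on.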
For more info, visit: Command line args for Scala scripts

How can I store output string to a variable AND display in console

I have a perl script that prints a message. This script is being called by GNU make. In my GNU make, I want to display the message printed out by the script AND store it in a variable also.
I'm doing it this way.
result=`$(PERL) parse.pl report.log` #parse the report
echo $(result) #echo the message here
ifneq ($(strip $$(result)),) #check if message is empty
#if not empty, search for filepath string pattern and exit
echo filepath
exit 1
endif
But it is not displaying the string message from parse.pl.
You are capturing into a shell variable, but then trying to echo a makefile variable (and even if you tried to echo the shell variable, that wouldn't work because make runs each line in a separate shell process).
Changing it to echo the shell variable, and running it all in one shell, should work:
foo:
	result=`$(PERL) parse.pl report.log`; \
	echo $$result
but whatever you later need to do to use the captured result would also need to be in the same shell execution.
Apparently you can capture into a makefile variable too, which may be more convenient:
foo:
	$(eval result := $(shell $(PERL) parse.pl report.log))
	echo $(result)
The critical thing to keep in mind with make is, first, that the entire makefile is parsed before any rules are run, and second, that a makefile contains two completely distinct syntaxes: makefile syntax for most of it, and shell syntax for the recipes. The shell syntax is run by the shell, not by make: make just starts a shell, hands over the recipe, and waits for the shell to exit to see if it worked or not.
As a result of this you CANNOT combine make constructs like ifeq with shell commands and their results: it cannot work because all the make constructs are parsed first, while the makefile is being read in, and the shell commands are not run until much later, when the target is to be built.
In your case you need to write the entire thing in shell syntax, because you want things to depend on the shell invocation.
So, like this:
foo:
	result=`$(PERL) parse.pl report.log`; \
	echo $$result; \
	if [ "$$result" != "" ]; then \
	    echo filepath; \
	    exit 1; \
	fi
Note how each line ends with a backslash, so it's appended to the previous line instead of being a separate line: make runs each separate line in a different shell.
Alternatively, if you have a new-enough GNU make, you can use the .ONESHELL feature:
.ONESHELL:
foo:
	result=`$(PERL) parse.pl report.log`
	echo $$result
	if [ "$$result" != "" ]; then
	    echo filepath
	    exit 1
	fi

Checking if a file is a text file without using -T?

Title is pretty self-explanatory: are there file testing functions in Perl, or is there a built-in module that allows file testing operations?
This is a non-issue, as -T, like all of the file test operators, is a Perl builtin.
They are documented here: perldoc -X
-X FILEHANDLE
-X EXPR
-X DIRHANDLE
-X
A file test, where X is one of the letters listed below. This unary operator takes one argument, either a filename, a filehandle, or a dirhandle, and tests the associated file to see if something is true about it. If the argument is omitted, tests $_, except for -t, which tests STDIN. Unless otherwise documented, it returns 1 for true and '' for false, or the undefined value if the file doesn't exist. Despite the funny names, precedence is the same as any other named unary operator.
...
-T File is an ASCII text file (heuristic guess).
-B File is a "binary" file (opposite of -T).
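So a short usage sketch needs nothing beyond the language itself (report.log is just a placeholder filename):

use strict;
use warnings;

my $file = 'report.log';    # hypothetical path

if (-T $file) {
    print "$file looks like a text file\n";
}
elsif (-B $file) {
    print "$file looks like a binary file\n";
}
else {
    print "$file does not exist or cannot be read\n";
}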
The "file test" functions available in Perl are part of the programming language itself. Based on what you're saying and from the comments on this page, it may be that you have been "asked not to use external commands" because someone thinks that the -T flag is relying on something that belongs to the underlying environment and not the Perl language.
-T is part of the -X file test unary operators which are inherent to Perl:
http://perldoc.perl.org/functions/-X.html
Underlying the -T operator (specifically) is the function pp_fttext, which lives in pp_sys.c. These are part of the underlying code that comprises Perl, and you can verify this by looking in the root directory of the Perl source distribution:
http://www.perl.org/get.html
It may be that the only way to do what you were originally asking (how to do this without -T) is to do what you were asked not to do (use something external to Perl to perform the test).