How to execute grep inside sbt - scala

I'm using sbt to build my Scala project and I was looking for a way to filter the output of any command (like compile) by a sub-string. In particular, I want to use grep in combination with sbt commands. For example, > compile | grep MyFile.scala should print only the lines where MyFile.scala is mentioned.
Is there any way to do that?
$ sbt --version
sbt launcher version 0.13.5

The best way to grep the output of SBT in interactive mode (tested with sbt 0.13.11) is to use the last-grep command.
It re-prints the output of the last command you ran, filtered by grep-like arguments: a pattern, optionally followed by the task whose log you want to search.
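For example, to filter the output of a compile down to the lines mentioning MyFile.scala (a sketch using the file name from the question):
> compile
> last-grep MyFile compile
Without the trailing key, last-grep MyFile simply filters the output of whichever command ran last.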

I'm not sure about doing it from within the sbt console, but how about from your shell?
$ sbt "tasks" | grep 'clean'
clean Deletes files produced by the build, such as generated sources, compiled classes, and task caches.
Or for substrings as you mention:
$ sbt "tasks" | grep 'class'
clean Deletes files produced by the build, such as generated sources, compiled classes, and task caches.
console Starts the Scala interpreter with the project classes on the classpath.
consoleProject Starts the Scala interpreter with the sbt and the build definition on the classpath and useful imports.
consoleQuick Starts the Scala interpreter with the project dependencies on the classpath.
run Runs a main class, passing along arguments provided on the command line.
runMain Runs the main class selected by the first argument, passing the remaining arguments to the main method.

Related

Compiling with scalac does not find sbt dependencies

I tried running my Scala code in the VSCode editor. I am able to run my script via the spark-submit command, but when I try to compile with scalac, I get:
.\src\main\scala\sample.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.{SQLContext,SparkSession}
I have already added respective library dependencies to build.sbt.
Have you tried running sbt compile?
Running scalac directly means you're compiling only one file, without the benefits of sbt and especially the dependencies that you have added in your build.sbt file.
In an sbt project, there's no reason to use scalac directly; doing so defeats the purpose of sbt.
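For reference, a minimal build.sbt sketch that lets sbt compile resolve the Spark import (the Scala and Spark versions here are assumptions; use the ones matching your environment):
scalaVersion := "2.12.15"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0" % "provided"
The "provided" scope matches the spark-submit workflow from the question: the dependency is on the compile classpath but is not bundled, since the Spark runtime supplies it.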

compile/package multiple configurations from command line sbt scala

Is there a way to build/compile all configurations at once? I have a project that has a Dev configuration in addition to the default Compile and Test configurations, and I am looking for a command or a setting in my build.sbt that would allow me to compile/package all three configurations at once.
Basically looking for a way to avoid having to do these 3 commands to build the entire source tree:
sbt compile
sbt dev:compile
sbt test:compile
When I use sbt from IntelliJ it is able to do this on building the project, but I am looking to do this from the command line.
First, you can run multiple tasks with a single sbt invocation:
sbt compile dev:compile test:compile
Second, you could define an alias in your build which does what you want:
addCommandAlias("compileAll", "; compile; dev:compile; test:compile")
Then, just run sbt compileAll.
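For completeness, here is a sketch of how the custom Dev configuration and the alias could sit together in build.sbt (the wiring of Dev is an assumption, since the question doesn't show how it was defined):
lazy val Dev = config("dev") extend(Compile)

lazy val root = (project in file("."))
  .configs(Dev)
  .settings(inConfig(Dev)(Defaults.compileSettings): _*)

addCommandAlias("compileAll", "; compile; dev:compile; test:compile")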

Build subproject in Spark with sbt

I want to build subproject in Spark with sbt. I found this example and it works
$ ./build/sbt -Phive -Phive-thriftserver (build)
sbt (spark)> project hive (switch to subproject)
sbt (hive)> testOnly *.HiveQuerySuite -- -t foo (run test case)
However, when I tried the following, it did not build; it just quit:
./build/sbt -mllib
I do not know how the author figured out -Phive -Phive-thriftserver. I cannot find this in the Spark source code.
I just want to do the exact same thing as the example but with a different subproject.
This is not asking how to use projects to print out all available projects.
Specify the project scope:
./build/sbt mllib/compile
refer to: http://www.scala-sbt.org/0.13/docs/Scopes.html
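Equivalently, you can mirror the interactive-mode session from the question (the test step here is just an illustration):
$ ./build/sbt
sbt (spark)> project mllib
sbt (mllib)> compile
sbt (mllib)> test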

Compile single scala file with TypeSafe Activator

I have Activator installed, which means I have a full SBT on my system. I don't want to create a brand new Activator project. All I want to do is compile a single Scala file, as we used to do with the scalac command. How can I do this please? Thanks.
You go into the directory containing your scala file and type "sbt compile" on the command line.
To run the program, you type "sbt run"
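As a minimal sketch, assuming a single file like the one below in the current directory (the object name is a placeholder), sbt will compile it with default settings even without a build.sbt:
// Hello.scala
object Hello extends App {
  println("Hello from sbt")
}
sbt run then finds the App's main method automatically.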
see also
http://www.scala-sbt.org/0.13/tutorial/Hello.html

How do I run an sbt main class from the shell as normal command-line program?

How can I run an sbt app from the shell, so that I can run my app as a normal command-line program (as if run directly via scala but without having to set up an enormous classpath)?
I know I can do:
echo hello | sbt 'run-main com.foo.MyMain3 arg1 arg2' > out.txt
But this (1) takes forever to start because it starts sbt, (2) causes all stdout and stderr to go to stdout, and (3) causes all output to be decorated with a logger [info] or [error].
I looked at https://github.com/harrah/xsbt/wiki/Launcher but it seems too heavyweight, since it downloads dependencies and sets up a new environment and whatnot. I just want to run this app within my existing development environment.
Thus far I've cobbled together my own script to build up a classpath, and you can also do some other things like modify your project file to get sbt to print the raw classpath, but I feel like there must be a better way.
Here's what I have in my SBT (version 0.10) project definition,
import java.io.PrintWriter

val Mklauncher = config("mklauncher") extend(Compile)
val mklauncher = TaskKey[Unit]("mklauncher")
val mklauncherTask = mklauncher <<= (target, fullClasspath in Runtime) map { (target, cp) =>
  // Helper to write a string to a file
  def writeFile(file: File, str: String) {
    val writer = new PrintWriter(file)
    writer.println(str)
    writer.close()
  }
  // Join the runtime classpath entries into a single ':'-separated string
  val cpString = cp.map(_.data).mkString(":")
  val launchString = """
CLASSPATH="%s"
scala -usejavacp -Djava.class.path="${CLASSPATH}" "$@"
""".format(cpString)
  // Write the launcher script to target/scala-sbt and make it executable
  val targetFile = (target / "scala-sbt").asFile
  writeFile(targetFile, launchString)
  targetFile.setExecutable(true)
}
... // remember to add mklauncherTask to Project Settings
The mklauncher task creates a script target/scala-sbt that executes scala with the project classpath already set. It would be nice to have mklauncher executed automatically whenever the classpath changes, but I haven't looked into doing this yet.
(I use the Java classpath, rather than Scala's, for ease of creating embedded interpreters.)
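Usage then looks roughly like this (the main class name is a placeholder):
$ sbt mklauncher
$ target/scala-sbt com.example.MyMain arg1 arg2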
The start-script SBT plugin is now at:
https://github.com/sbt/sbt-start-script
It requires a few steps to set up and generates scripts that do not work on OS X, but that can be easily fixed if you're on that platform (see below).
Setup
Install greadlink (OS X only):
a) brew install coreutils
b) map readlink to the new function (greadlink) by adding these lines to ~/.bashrc:
function readlink() { greadlink "$@"; }
export -f readlink
Add start-script plugin to ~/.sbt/plugins/build.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-start-script" % "0.8.0")
Add start-script task to current project:
$ sbt add-start-script-tasks # execute from directory where build.sbt resides
Add start-script support to current build.sbt:
import com.typesafe.sbt.SbtStartScript

seq(SbtStartScript.startScriptForClassesSettings: _*)
Note the blank line in between statements (de rigueur for SBT build files).
Generate Start Script
Then, whenever you want to create a script to start your app like sbt run-main, but without sbt, execute:
$ sbt start-script
Run
target/start mypackage.MyMainClass
Time flies and a lot has changed since the other answers were written. It's currently SBT 0.13.6 time.
I think what you may need is the sbt-onejar plugin or the SBT Native Packager plugin.
sbt-onejar "is a simple-build-tool plugin for building a single executable JAR containing all your code and dependencies as nested JARs."
SBT Native Packager's "goal is to be able to bundle up Scala software built with SBT for native packaging systems, like deb, rpm, homebrew, msi."
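As a rough sketch of the sbt-native-packager route (the plugin version is an assumption; check the plugin's documentation for the current one):
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.6")

// build.sbt
enablePlugins(JavaAppPackaging)
Running sbt stage then produces a start script under target/universal/stage/bin that launches the app with its full classpath, without sbt.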
Just discovered the sbt start script plugin: https://github.com/typesafehub/xsbt-start-script-plugin:
This plugin allows you to generate a script target/start for a project. The script will run the project "in-place" (without having to build a package first).
The target/start script is similar to sbt run, but it doesn't rely on SBT. sbt run is not recommended for production use because it keeps SBT itself in memory; target/start is intended to run an app in production.
The plugin adds a task start-script which generates target/start. It also adds a stage task, aliased to the start-script task.