I have a CLI tool written in Java which can modify source code based on the parameters passed to it. For example, it can rename an enum value across a whole project.
I want to write an sbt task that can run this tool from my project directory with the given parameters, like sbt 'enums -rename A B'. My tool can be added to the project through the sbt dependencies.
I skimmed through the book sbt in Action looking for an answer, but those examples are not this specific.
My build.sbt (far from working):
name := """toolTestWithActivator"""
version := "1.0-SNAPSHOT"
resolvers += "Local Repository" at "file://C:/Users/torcsi/.ivy2/local"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  "tool" % "tool_2.11" % "1.0",
  javaJdbc,
  javaEbean,
  cache,
  javaWs
)
val mytool = taskKey[String]("mytool")
mytool := {
  com.my.tool.Main
}
Can sbt handle this type of task/dependency structure, or do I need to do this another way?
sbt is recursive: it compiles the .sbt files and the .scala files under the project folder and uses those to execute your build (in fact, you can see sbt as a library that helps you produce builds).
So, since you need your library in order to define a task, it is a dependency of your build definition (build.sbt), and not a dependency of your project.
To declare that the build.sbt file depends on your library, just create a ".sbt" file in the project folder; example:
project/dependencies.sbt
libraryDependencies += "tool" %% "tool" % "1.0"
and in build.sbt add:
val mytool = taskKey[Unit]("mytool")
mytool := {
  com.my.tool.Main.main(Array())
}
Some comments:
Be careful with the Scala version used: sbt 0.13 is compiled with Scala 2.10, so your library should also be compiled for Scala 2.10 (the package should be tool_2.10). The newer sbt 1.0 is compiled with Scala 2.12.
I used the %% notation so that sbt appends the expected Scala version by itself.
I assumed your CLI tool defines a classic Java main method (or the Scala equivalent), so the argument is an Array of String (here an empty one) and it returns Unit (void in Java); see the sketch below for passing real arguments from the sbt command line.
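If you also want to forward the parameters typed after the task name (as in sbt 'mytool -rename A B'), an input task is one way to do it. This is only a sketch, assuming the entry point is the com.my.tool.Main class from the question with a standard main(String[]) method:
import complete.DefaultParsers._

// Input task: whatever follows the task name on the sbt command line is parsed into args
val mytool = inputKey[Unit]("Runs the CLI tool with the given arguments")

mytool := {
  val args: Seq[String] = spaceDelimited("<args>").parsed
  com.my.tool.Main.main(args.toArray)
}
With that, sbt "mytool -rename A B" should invoke the tool with those two arguments.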
Some reference to understand the solution:
http://www.scala-sbt.org/0.13/docs/Organizing-Build.html
Related
Is there a standard in place for setting up a Scala project where the build.sbt is contained in a subdirectory?
I've cloned https://github.com/lightbend/cloudflow and opened it in IntelliJ, here is the structure:
You can see that core contains a build.sbt.
If I open the project core in a new project window, IntelliJ will recognise the Scala project.
How can I compile the Scala project core while keeping the other folders available within the IntelliJ window?
EDIT:
If you do want to play around with the project, it should suffice to import it as an sbt project and select core as the root; IntelliJ should also detect the build.sbt if you open core as the root.
Here is the SBT Reference Manual
Traditionally, build.sbt will be at the root of the project.
If you are looking to use their libraries, you should import them in your sbt file; you shouldn't clone the repo unless you intend to modify or fork it.
To import the libraries into your project, take a look at the Maven Repository for Cloudflow, select the project(s), click on the version you want, and select the SBT tab. Just copy and paste those dependencies into your build.sbt. Once you build the project with sbt, you should have all those packages available to you.
So in [ProjectRoot]/build.sbt something along the lines of
val scalaVersion = "2.13.4"
lazy val root = (project in file("."))
.settings(
name := "Your_Project_Name",
scalaVersion := scalaVersion,
libraryDependencies += "com.lightbend.cloudflow" %% "cloudflow-streamlets" % "2.0.26-RC12",
// Either sequentially
libraryDependencies += "com.lightbend.cloudflow" %% "cloudflow-akka" % "2.0.26-RC12",
// Or as a sequence (note that you can have a trailing comma for these)
libraryDependencies ++= Seq(
"com.lightbend.cloudflow" %% "cloudflow-blueprint" % "2.0.26-RC12",
"com.lightbend.cloudflow" %% "cloudflow-flink" % "2.0.26-RC12", // No next elemenet
)
// Or combo using both like above, just don't forget the commas in between
)
I've been tasked with rewriting an old ant build script to SBT. As it happens, our suite is built up of 3 modules:
A Play 2.3 front-end webserver;
A back-end for retrieving data from various other systems;
A middle module containing some shared classes for database access and business logic.
Below is an excerpt of my Build.scala file:
val sharedSettings = Seq(
  organization := <organization here>,
  version := "1.2.5",
  scalaVersion := "2.11.1",
  libraryDependencies ++= libraries,
  unmanagedJars in Compile ++= baseDirectory.value / "lib",
  unmanagedJars in Compile ++= baseDirectory.value / "src",
  unmanagedJars in Compile ++= baseDirectory.value / "test"
)

lazy val middle = project.settings(sharedSettings: _*)
lazy val back = project.settings(sharedSettings: _*).dependsOn(middle)
However, when I try to compile the source, I get the following error:
bad symbolic reference to scala.reflect.runtime encountered in class file 'ValueConverter.class'. Cannot access term runtime in package scala.reflect. The current classpath may be missing a definition for scala.reflect.runtime, or ValueConverter.class may have been compiled against a version that's incompatible with the one found on the current classpath.
The source code is organized in the following structure:
back
  src
  test
  lib
middle
  src
  test
  lib
front
  src
  test
  lib
Here each lib folder contains some manually maintained libraries (which is why we want to move to sbt).
Any ideas on how to solve this?
In the end, I gave up on trying to get the compiler to understand the additional libraries. Instead, I added those dependencies that were available through sbt as managed dependencies. This works well.
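For reference, the "bad symbolic reference to scala.reflect.runtime" error usually means scala-reflect is missing from the compile classpath; if you take the managed-dependency route, one such entry might look like this (a sketch, not the exact dependency list used here):
// Sketch: pulls in the scala-reflect module matching the build's Scala version
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value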
I created a new project using Play Scala and Eclipse. I added the Squeryl dependency and can see that it's pulled in at compile time. I confirmed it's present in the .ivy2/cache/org.squeryl directory, but the Eclipse project is not able to pick it up, which causes a compilation error on the import.
build.sbt
name := """registration"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.1"
libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  ws,
  "org.squeryl" % "squeryl_2.10" % "0.9.6-RC2"
)
It looks like Squeryl doesn't have a binary readily available for Scala 2.11 yet, according to http://www.squeryl.org/getting-started.html
So if you want to use a pre-compiled version of this library, you must change your Scala version to 2.10.4.
All versions of squeryl available can be found at: http://mvnrepository.com/artifact/org.squeryl
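A minimal sketch of that change in build.sbt, using %% so sbt selects the artifact built for your Scala version (versions taken from the answer above):
// Sketch: pin Scala to a version Squeryl publishes binaries for
scalaVersion := "2.10.4"

// %% resolves to squeryl_2.10 automatically
libraryDependencies += "org.squeryl" %% "squeryl" % "0.9.6-RC2"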
I had a similar case using eclipse.
Select Project --> Clean to clean your workspace and build it again if you have not checked "Build Automatically".
If it's still not visible, please refresh the package explorer (or just the 'Referenced Libraries' node) with F5.
I need to build a single jar, including dependencies, for one of my sub-projects so that it can be used as a javaagent.
I have a multi-module sbt project and this particular module is the lowest level one (it's also pure Java).
Can I (e.g. with sbt-onejar, sbt-proguard or sbt assembly) override how the lowest level module is packaged?
It looks like these tools are really designed to be a post-publish step, but I really need a (replacement or additional) published artefact to include the dependencies (but only for this one module).
UPDATE: The Publishing instructions for sbt-assembly cover a single project and don't easily translate to a multi-project build.
People have been publishing fat JARs using sbt-assembly and sbt-release without issues. Here's a blog article from 2011: Publishing fat jar created by sbt-assembly. It boils down to adding addArtifact(Artifact(projectName, "assembly"), sbtassembly.AssemblyKeys.assembly) to your build.sbt (note that the blog is a little out of date: AssemblyKeys is now a member of sbtassembly directly).
For sbt 0.13 and above, I prefer to use build.sbt for multi-project builds too, so I'd write it like this:
import AssemblyKeys._

lazy val commonSettings = Seq(
  version := "0.1-SNAPSHOT",
  organization := "com.example",
  scalaVersion := "2.10.1"
)

val app = (project in file("app")).
  settings(commonSettings: _*).
  settings(assemblySettings: _*).
  settings(
    artifact in (Compile, assembly) ~= { art =>
      art.copy(`classifier` = Some("assembly"))
    }
  ).
  settings(addArtifact(artifact in (Compile, assembly), assembly).settings: _*)
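As a quick usage check (a sketch; "app" is the subproject id from the snippet above), publishing locally should now include the assembly-classified JAR next to the regular artifact:
$ sbt "project app" assembly publishLocal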
See Defining custom artifacts:
addArtifact returns a sequence of settings (wrapped in a SettingsDefinition). In a full build configuration, usage looks like:
...
lazy val proj = Project(...)
.settings( addArtifact(...).settings : _* )
...
SBT 0.12.2 always attempts to resolve plugins using Scala 2.9.2 when using the %% syntax on plugin imports.
I have tried setting older versions of Scala in build.sbt, newer versions, etc., even deleting the target folder each time... nothing seems to make a difference.
name := "Game"
version := "1.0"
scalaVersion := "2.9.1" // SBT is ignoring the scala version
sbt is recursive, so you need to specify the Scala version for the project that builds your project. In other words, you need to add the appropriate scalaVersion to the plugins.sbt file.
For all plugins in your project, you set scalaVersion in the project/plugins.sbt file, which configures the build definition for your project and is where you declare plugins.
$ cat project/plugins.sbt
scalaVersion := "2.9.3"
There is, however, a way to set a more specific version of sbt and Scala for a plugin.
Instead of using addSbtPlugin, which accepts a single ModuleID (constructed with % and %%), use addSbtPlugin(dependency: ModuleID, sbtVersion: String) or even addSbtPlugin(dependency: ModuleID, sbtVersion: String, scalaVersion: String), e.g.
$ cat project/plugins.sbt
// It doesn't exist and it's only for demo purposes
addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.1.0", "0.12.2", "2.5")