How to enable sbt-web in a non-Play project?

I have a single-project build, implemented in Build.scala file with the following settings:
lazy val root = Project(
  id = ProjectInfo.name,
  base = file("."),
  settings = Project.defaultSettings ++
    Revolver.settings ++
    Revolver.enableDebugging(port = 5050) ++
    Twirl.settings ++
    // more tasks omitted
    Seq(
      mainClass in Compile := Some(launcherClassName),
      mainClass in Revolver.reStart := Some(launcherClassName),
      javaOptions in Revolver.reStart ++= List(
        "-XX:PermSize=256M",
        "-XX:MaxPermSize=512M",
        "-Dlogback.debug=false",
        "-Dlogback.configurationFile=src/main/resources/logback.xml"
      ),
      resolvers ++= projectResolvers,
      libraryDependencies ++= Dependencies.all,
      parallelExecution in Test := false
    )
)
I would like to add sbt-web managed asset processing to the project, as I want to handle CoffeeScript, LESS and so on.
I added the sbt-coffeescript plugin straight to the plugins.sbt file in the project folder and actually got it working. So now, when I run web-assets:assets, a CoffeeScript sample file in /src/main/coffeescript/foo.coffee gets compiled to target/web/coffeescript/main/coffeescript/foo.js.
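For reference, that amounts to a single line in project/plugins.sbt (the version shown here is illustrative; check the plugin's README for a current one):
// project/plugins.sbt — pulls in sbt-coffeescript and, transitively, sbt-web
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")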
Unfortunately, nothing gets processed when I simply run the compile or run task. How do I enable processing of assets during compile in the development workflow?

The issue you're having is that the old style of defining projects, passing settings explicitly, does not work with AutoPlugins (which is what the WebPlugin is).
Specifically:
val foo = Project(
  id = "ok",
  base = file("ok"),
  settings = defaultSettings // BAD!
)
i.e. if you manually place settings on the Project, you're telling sbt: "I know EVERY setting I want on this project, and I want to completely override the defaults."
The load order of sbt settings is:
1. AutoPlugins (core settings now come from AutoPlugins)
2. Settings defined in Project instances
3. Settings defined in build.sbt files in the base directory of a project
The above code re-applies ALL of the sbt default settings from the 0.13.x series, which will overwrite anything that the AutoPlugins previously enabled. This is by design, as any other mechanism wouldn't be "correct".
If you're migrating to using AutoPlugins, simply modify your build to be:
lazy val root = Project(
  id = ProjectInfo.name,
  base = file("."),
  settings =
    // NOTICE we dropped the defaultSettings
    Revolver.settings ++
    Revolver.enableDebugging(port = 5050) ++
    Twirl.settings ++
    // more tasks omitted
    Seq(
      mainClass in Compile := Some(launcherClassName),
      mainClass in Revolver.reStart := Some(launcherClassName),
      javaOptions in Revolver.reStart ++= List(
        "-XX:PermSize=256M",
        "-XX:MaxPermSize=512M",
        "-Dlogback.debug=false",
        "-Dlogback.configurationFile=src/main/resources/logback.xml"
      ),
      resolvers ++= projectResolvers,
      libraryDependencies ++= Dependencies.all,
      parallelExecution in Test := false
    )
)
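If the sbt-web tasks still don't show up on the project after that, you can also enable the plugin explicitly (harmless if it is already auto-enabled). A minimal sketch, assuming sbt-web 1.x, where the AutoPlugin object is com.typesafe.sbt.web.SbtWeb:
import com.typesafe.sbt.web.SbtWeb

lazy val root = Project(
  id = ProjectInfo.name,
  base = file(".")
).enablePlugins(SbtWeb)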

To run asset generation on compilation, I did this:
settings = ... ++ Seq(
  pipelineStages := Seq(rjs),
  (compile in Compile) <<= compile in Compile dependsOn (stage in Assets),
  // ...
)
Then, when I run compile, the stage task is also executed, thus running sbt-web's pipeline.
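On later sbt versions, where <<= is deprecated, the same wiring can be written as follows (a sketch, not from the original answer):
compile in Compile := ((compile in Compile) dependsOn (stage in Assets)).value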
The remaining question for me is how to make the generated assets available as part of the managed resources (I'm trying to get sbt-web working with xsbt-web-plugin and liftweb).
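One possible direction (a sketch, not from this thread, and untested): register the staged assets as a resource generator so they land under resourceManaged and hence on the classpath. This assumes sbt-web 1.x, where stage in Assets yields the staging directory:
import com.typesafe.sbt.web.Import._

resourceGenerators in Compile += Def.task {
  val staged = (WebKeys.stage in Assets).value
  val out    = (resourceManaged in Compile).value / "public"
  // Mirror every staged file under resourceManaged so it is packaged
  // like any other managed resource.
  val mappings = (staged ** "*").filter(_.isFile).pair(Path.rebase(staged, out))
  IO.copy(mappings, overwrite = true)
  mappings.map(_._2)
}.taskValue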


Can I create a proto jar for scalaVersion 2.11/2.12 and use it within the same sbt build under different sub-projects?

I have a set of .proto files (protobuf) from which I generate code using ScalaPB. I also have two sub-projects in the same sbt build: one is scalaVersion 2.11 compatible (it can't be upgraded to 2.12 due to missing packages) and the other one is Scala 2.12.
I created a sub-project to hold my proto files. By default 2.12 is used, and my 2.12 sub-project can use it, but my 2.11 one can't.
I set crossScalaVersions to 2.11/2.12 and compiled my project with both, which passed, but even then I was unable to get the 2.11 sub-project to find that code.
I am wondering whether this is supported, or whether there is some way to keep my .proto files in a single location and still have both sub-projects in the same sbt file use them.
lazy val scala212 = "2.12.13"
lazy val scala211 = "2.11.12"
lazy val supportedScalaVersions = List(scala212, scala211)

ThisBuild / scalaVersion := scala212

lazy val root = (project in file("."))
  .aggregate(proto, subproject1, subproject2)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )

lazy val proto = project
  .settings(
    crossScalaVersions := supportedScalaVersions,
    name := "proto",
    libraryDependencies += "com.trueaccord.scalapb" %% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf",
    PB.targets in Compile := Seq(
      scalapb.gen(grpc = false) -> (sourceManaged in Compile).value / "protobuf"
    )
  )

lazy val subproject1 = project
  .dependsOn(proto)

lazy val subproject2 = project
  .settings(
    scalaVersion := scala211
  )
  .dependsOn(proto)
So, from the above, if I do sbt "+ proto" I can compile both versions. If I do sbt subproject1/compile, it works too. Running sbt subproject2/compile fails, indicating that it cannot find the 2.11 proto jar.
Either I would like the above to somehow work nicely, or I would appreciate any other trick that lets me generate the code from the same proto location but within subproject1/subproject2.
You could try the sbt-projectmatrix plugin:
https://github.com/sbt/sbt-projectmatrix
The idea is to have separate sbt subprojects for the different Scala versions, so you can simply reference the relevant subproject when calling dependsOn.
I think this plugin is going to end up in sbt itself some day, as it's a much better solution in general than the current built-in stateful cross-compilation support, and it's developed by Eugene Yokota, who is also an sbt developer.
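A minimal sketch of what that could look like here (the plugin version and API shapes are taken from the project's README, so treat the details as assumptions):
// project/plugins.sbt (version illustrative):
// addSbtPlugin("com.eed3si9n" % "sbt-projectmatrix" % "0.9.0")

// build.sbt
lazy val proto = (projectMatrix in file("proto"))
  // one real subproject per Scala version instead of stateful cross-building
  .jvmPlatform(scalaVersions = Seq("2.12.13", "2.11.12"))
  .settings(
    name := "proto"
    // ScalaPB settings as in the question
  )

lazy val subproject1 = (projectMatrix in file("subproject1"))
  .jvmPlatform(scalaVersions = Seq("2.12.13"))
  .dependsOn(proto) // resolves against the 2.12 row of the matrix

lazy val subproject2 = (projectMatrix in file("subproject2"))
  .jvmPlatform(scalaVersions = Seq("2.11.12"))
  .dependsOn(proto) // resolves against the 2.11 row of the matrix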

sbt-generated Docker container fails to package subproject

I have a multi-project build.sbt file, with projects like so:
lazy val utils = (project in file("utils"))
  .settings(
    Seq(
      publishArtifact := false
    )
  ).[...]

lazy val api = (project in file("api"))
  .dependsOn(utils)
  .settings(commonSettings: _*)
  .enablePlugins(JavaAppPackaging, DockerPlugin)
  .settings(publish := {})
  .settings(
    Seq(
      packageName in Docker := "my-api",
      dockerBaseImage := "java:8",
      mainClass in Compile := Some("com.path.to.Main"),
      publishArtifact := false,
      unmanagedJars in Compile += file("jars/somejars.jar")
    )
  )
The API is built on top of the Finch framework. I create a Docker image for the API using sbt api/docker:publishLocal and then run it locally. However, it seems that the utils subproject's classes are not packaged into the final container, and as a result I am getting multiple
java.lang.ClassNotFoundException
errors. For a similar project that doesn't have a subproject dependency, everything runs smoothly and I have no problems.
Am I missing something in the plugin configuration? I thought .dependsOn() should take care of providing the dependent classes in the project's Docker image.
Answering my own question: it turns out this is the default behaviour of sbt-native-packager, or rather sbt, when a dependent project has the publishArtifact := false setting.
A workaround that worked for me was changing the above to publish / skip := true.
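Concretely, the fix on the dependent project looks like this (a sketch based on the workaround above):
lazy val utils = (project in file("utils"))
  .settings(
    // publishArtifact := false caused utils' classes to be left out of the
    // Docker image; publish / skip := true does not have that side effect.
    publish / skip := true
  )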
More on this issue can be found here: https://github.com/sbt/sbt-native-packager/issues/1221

Intertwined dependencies between sbt plugin and projects within multi-project build that uses the plugin itself

I'm developing a library that includes an sbt plugin. Naturally, I'm using sbt to build this (multi-project) library. My (simplified) project looks as follows:
myProject/                  # Top level of library
  -> models                 # One project in the multi-project sbt build.
     -> src/main/scala/...  # Defines common models for both sbt-plugin and framework
  -> sbt-plugin             # The sbt plugin build
     -> src/main/scala/...
  -> framework              # The framework. Ideally, the sbt plugin is run as part of
     -> src/main/scala/...  # compiling this directory.
  -> project/               # Multi-project build configuration
Is there a way to have the sbt-plugin defined in myProject/sbt-plugin be hooked into the build for myProject/framework all in a unified build?
Note: similar (but simpler) question: How to develop sbt plugin in multi-project build with projects that use it?
Is there a way to have the sbt-plugin defined in myProject/sbt-plugin be hooked into the build for myProject/framework all in a unified build?
I have a working example on GitHub: eed3si9n/plugin-bootstrap. It's not super pretty, but it kind of works. We can take advantage of the fact that sbt is recursive.
The project directory is another build inside your build, which knows how to build your build. To distinguish the builds, we sometimes use the term "proper build" to refer to your build, and "meta-build" to refer to the build in project. The projects inside the metabuild can do anything any other project can do. Your build definition is an sbt project.
By extension, we can think of sbt plugins as library or inter-project dependencies of the root project of your metabuild.
meta build definition (project/plugins.sbt)
In this example, think of the metabuild as a parallel universe or shadow world that has the same multi-project structure as the proper build (root, model, sbt-plugin).
To reuse the source code from the model and sbt-plugin subprojects in the proper build, we can re-create the multi-project build in the metabuild. This way we don't run into a circular dependency.
addSbtPlugin("com.eed3si9n" % "sbt-doge" % "0.1.5")

lazy val metaroot = (project in file(".")).
  dependsOn(metaSbtSomething)

lazy val metaModel = (project in file("model")).
  settings(
    sbtPlugin := true,
    scalaVersion := "2.10.6",
    unmanagedSourceDirectories in Compile :=
      mirrorScalaSource((baseDirectory in ThisBuild).value.getParentFile / "model")
  )

lazy val metaSbtSomething = (project in file("sbt-plugin")).
  dependsOn(metaModel).
  settings(
    sbtPlugin := true,
    scalaVersion := "2.10.6",
    unmanagedSourceDirectories in Compile :=
      mirrorScalaSource((baseDirectory in ThisBuild).value.getParentFile / "sbt-plugin")
  )

def mirrorScalaSource(baseDirectory: File): Seq[File] = {
  val scalaSourceDir = baseDirectory / "src" / "main" / "scala"
  if (scalaSourceDir.exists) scalaSourceDir :: Nil
  else sys.error(s"Missing source directory: $scalaSourceDir")
}
When sbt loads up, it will build metaModel and metaSbtSomething first, and use metaSbtSomething as a plugin for your proper build.
If you need any other plugins, you can just add them to project/plugins.sbt normally, as I've done with sbt-doge.
proper build (build.sbt)
The proper build looks like a normal multi-project build.
As you can see, the framework subproject uses SomethingPlugin. The important thing is that the two builds share the source code, but their target directories are completely separate, so there is no interference once the proper build is loaded and you are changing code around.
import Dependencies._

lazy val root = (project in file(".")).
  aggregate(model, framework, sbtSomething).
  settings(inThisBuild(List(
      scalaVersion := scala210,
      organization := "com.example"
    )),
    name := "Something Root"
  )

// Defines common models for both sbt-plugin and framework
lazy val model = (project in file("model")).
  settings(
    name := "Something Model",
    crossScalaVersions := Seq(scala211, scala210)
  )

// The framework. Ideally, the sbt plugin is run as part of building this.
lazy val framework = (project in file("framework")).
  enablePlugins(SomethingPlugin).
  dependsOn(model).
  settings(
    name := "Something Framework",
    crossScalaVersions := Seq(scala211, scala210),
    // using sbt-something
    somethingX := "a"
  )

lazy val sbtSomething = (project in file("sbt-plugin")).
  dependsOn(model).
  settings(
    sbtPlugin := true,
    name := "sbt-something",
    crossScalaVersions := Seq(scala210)
  )
demo
In the SomethingPlugin example, I'm defining a something task that uses foo.Model.x.
package foo

import sbt._

object SomethingPlugin extends AutoPlugin {
  override def requires = sbt.plugins.JvmPlugin

  object autoImport {
    lazy val something  = taskKey[Unit]("")
    lazy val somethingX = settingKey[String]("")
  }

  import autoImport._

  override def projectSettings = Seq(
    something := { println(s"something! ${Model.x}") }
  )
}
Here's how we can invoke something task from the build:
Something Root> framework/something
something! 1
[success] Total time: 0 s, completed May 29, 2016 3:01:07 PM
The 1 comes from foo.Model.x, so this demonstrates that we are using the sbt-something plugin in the framework subproject, and that the plugin is using metaModel.

Multi-module Scala Play 2.3 conf location

I've been tasked with rewriting an old Ant build script in sbt. As it happens, our suite is built up of 3 modules:
A Play 2.3 front-end webserver;
A back-end for retrieving data from various other systems;
A middle module containing some shared classes for database access and business logic.
An excerpt of my Build.scala file can be found below:
val sharedSettings = Seq(
  organization := <organization here>,
  version := "1.2.5",
  scalaVersion := "2.11.1",
  libraryDependencies ++= libraries,
  unmanagedJars in Compile ++= baseDirectory.value / "lib",
  unmanagedJars in Compile ++= baseDirectory.value / "src",
  unmanagedJars in Compile ++= baseDirectory.value / "test"
)

lazy val middle = project.settings(sharedSettings: _*)
lazy val back = project.settings(sharedSettings: _*).dependsOn(middle)

lazy val front =
  project
    .enablePlugins(play.PlayScala)
    .settings(sharedSettings: _*)
    .settings(scalaSource in Compile := baseDirectory.value / "app")
    .settings(
      routesImport ++= Seq(
        "scala.language.reflectiveCalls", // Removes warnings when using multiple routes files
        "com.asml.cerberus.front.toolbox.Binders._")
    )
    .dependsOn(middle % "compile->compile;test->test")
I've got my application.conf in the ./front/conf/ directory. Unfortunately, if I now run sbt, it looks for a ./conf/application.conf file. (I've tested this by moving the conf directory.)
Is there any way I can tell sbt/Play to use the front module's conf directory instead?
In case it helps, we have a wrapper script, activatorWrapper, around activator (sbt):
#!/bin/bash
activator -Dconfig.file=front/conf/application.conf
Then you can start your application with:
$ ./activatorWrapper
You can use system properties to specify an alternative config file. See here for the details.
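If you would rather bake the property into the build than use a wrapper script, one option is to set it via javaOptions for a forked run (a sketch; note that -D flags only reach the application's JVM when the run is forked, so this may not apply to Play's dev-mode run):
fork in run := true
javaOptions in run += "-Dconfig.file=front/conf/application.conf"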

Run sbt project in debug mode with a custom configuration

I want to introduce a debug mode in my sbt 0.11 project using a special configuration.
I've tried to implement this using the following code but unfortunately it doesn't seem to work as expected. I'm launching debug:run, but the run doesn't suspend as expected.
object Test extends Build {
  lazy val root = Project("test", file("."))
    .configs(RunDebug)
    .settings(inConfig(RunDebug)(Defaults.configTasks): _*)
    .settings(
      name := "test debug",
      scalaVersion := "2.9.1",
      javaOptions in RunDebug += "-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=5005",
      fork in RunDebug := true
    )

  lazy val RunDebug = config("debug").extend(Runtime)
}
OK, that works with the following (the key difference is passing the debug options as separate strings rather than as one space-separated string):
object Test extends Build {
  lazy val root = Project("test", file("."))
    .configs(RunDebug)
    .settings(inConfig(RunDebug)(Defaults.configTasks): _*)
    .settings(
      name := "test debug",
      scalaVersion := "2.9.1",
      javaOptions in RunDebug ++= Seq("-Xdebug", "-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=5005"),
      fork in RunDebug := true
    )

  lazy val RunDebug = config("debug").extend(Runtime)
}
Now I can run my code in debug mode using debug:run.
For simply running an sbt project in debug mode, just do:
export JAVA_OPTS="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=5005"
and then
sbt run will run sbt in debug mode; you can create a remote debug configuration in Eclipse and connect to it.
This is rather lame, but useful when you have a multi-module Play project and want to run one of the modules in debug mode.
In IntelliJ IDEA, I just boot the program in Debug mode and it seems to work properly without further configuration.