Scala.js crossProject vs manual setup

To compile the same source code with Scala.js and Scala JVM the documentation says to use crossProject, for instance
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := "0.1-SNAPSHOT"
  )
lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
However, it looks like the same goal can be achieved by setting up the modules manually:
lazy val fooSettings = Seq(
  name := "foo",
  version := "0.1-SNAPSHOT",
  scalaSource in Compile := baseDirectory.value / ".." / "shared" / "src" / "main" / "scala"
)
lazy val fooJVM = project.in(file("jvm"))
  .settings(fooSettings: _*)
lazy val fooJS = project.in(file("js"))
  .settings(fooSettings: _*)
  .enablePlugins(ScalaJSPlugin)
Does crossProject do something important, or is it just a more convenient way to set things up?

It looks like, in your manual setup, you have separate jvm and js subdirectories under your "shared" area -- it doesn't look like they're the same code at all.
CrossProject is about letting you have a single directory structure, with the same code, that is compiled for both sides. Possibly with some small subdirectories that are specific to one side or another, but in general the focus is that most of the code is shared -- the jvm and js subdirectories, if present (it depends on the mode), are usually just shims to adapt the common code to those sides.
All that said, it is really just to make it more convenient -- I believe you could achieve the same results without CrossProject. But in the common case where you're sharing a lot of code, it's significantly more convenient...
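For illustration, here is a minimal sketch of the layout crossProject manages by default (CrossType.Full); the plugin wires the shared sources into both sides automatically, which is the same plumbing the manual setup does by hand with scalaSource:
// Directory convention picked up by crossProject (default CrossType.Full):
//   shared/src/main/scala/  code compiled for both JVM and JS
//   jvm/src/main/scala/     JVM-only shims
//   js/src/main/scala/      JS-only shims
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := "0.1-SNAPSHOT"
  ).
  jvmSettings(/* JVM-only settings */).
  jsSettings(/* JS-only settings */)
lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js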

Related

sbt: Dependent project does not compile first

In my build.sbt (nothing fancy):
val common: Project =
  project
    .in(file("common"))
    .enablePlugins(...)
    .settings(libraryDependencies ++= ...)

val root =
  project
    .in(file("."))
    .enablePlugins(...)
    .dependsOn(common)
    .aggregate(
      common,
      ....
    )
    .settings(...)
Problem
common does not compile before root, so compilation fails, complaining that it cannot find the classes defined in common.
FYI
I have tried a multitude of things that I came across while searching for information on this problem (Google, documentation, GitHub issues, etc.). No luck.
sbt v1.4.9 (Play project sbt-play v2.7.9)
The build.sbt is much bigger than what you see above (dependencies, tasks, etc.). Otherwise, I don't think there's anything special or tricky about it.
Help much appreciated!
To avoid initialisation problems, try declaring projects as lazy vals
lazy val common = ...
lazy val root = ...
instead of strict vals
val common = ...
val root = ...
As a side note, use dependsOn to establish ordering between subprojects, not aggregate, because aggregate does not modify classpaths.
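A minimal sketch of the distinction (using the project names from the question):
lazy val common = project.in(file("common"))
lazy val root = project.in(file("."))
  .dependsOn(common)  // classpath dependency: common compiles first and lands on root's classpath
  .aggregate(common)  // command aggregation only: running e.g. `test` on root also runs it on common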
I'd agree with Mario Galic on using lazy val. In fact, I'd recommend using lazy val at all times in build.sbt.
If there is a cycle, like common referring back to root, one technique you can use is LocalProject("project name"), for example LocalProject("root"):
lazy val common = (project in file("common"))
  .settings(
    Test / test := (LocalProject("root") / Test / test).value
  )
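LocalProject("root") is resolved by project ID when the build is loaded, rather than through the root val itself, which is what breaks the cyclic reference between the two definitions.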

What is the SBT := operator in build.sbt?

I'm new to Scala and SBT. I noticed an unfamiliar operator in the build.sbt of an open source project:
:=
Here are a couple examples of how it's used:
lazy val akkaApp = Project(id = "akka-app", base = file("akka-app"))
  .settings(description := "Common Akka application stack: metrics, tracing, logging, and more.")
and it's used a few times in this larger code snippet:
lazy val jobServer = Project(id = "job-server", base = file("job-server"))
  .settings(commonSettings)
  .settings(revolverSettings)
  .settings(assembly := null.asInstanceOf[File])
  .settings(
    description := "Spark as a Service: a RESTful job server for Apache Spark",
    libraryDependencies ++= sparkDeps ++ slickDeps ++ cassandraDeps ++ securityDeps ++ coreTestDeps,
    test in Test <<= (test in Test).dependsOn(packageBin in Compile in jobServerTestJar)
      .dependsOn(clean in Compile in jobServerTestJar)
      .dependsOn(buildPython in jobServerPython)
      .dependsOn(clean in Compile in jobServerPython),
    testOnly in Test <<= (testOnly in Test).dependsOn(packageBin in Compile in jobServerTestJar)
      .dependsOn(clean in Compile in jobServerTestJar)
      .dependsOn(buildPython in jobServerPython)
      .dependsOn(clean in Compile in jobServerPython),
    console in Compile <<= Defaults.consoleTask(fullClasspath in Compile, console in Compile),
    fullClasspath in Compile <<= (fullClasspath in Compile).map { classpath =>
      extraJarPaths ++ classpath
    },
    fork in Test := true
  )
  .settings(publishSettings)
  .dependsOn(akkaApp, jobServerApi)
  .disablePlugins(SbtScalariform)
My best guess is that it means "declare if not already declared".
The := has essentially nothing to do with the ordinary assignment operator =. It's not a built-in scala operator, but rather a family of methods/macros called :=. These methods (or macros) are members of classes such as SettingKey[T] (similarly for TaskKey[T] and InputKey[T]). They consume the right hand side of the key := value expression, and return instances of type Def.Setting[T] (or similarly, Tasks), where T is the type of the value represented by the key. They are usually written in infix notation. Without syntactic sugar, the invocations of these methods/macros would look as follows:
key.:=(value)
The constructed Settings and Tasks are in turn the basic building blocks of the build definition.
The important thing to understand here is that the keys on the left hand side are not some variables in a code block. Instead of merely representing a memory position in an active stack frame of a function call (as a simple variable would do), the keys on the left hand side are rather complex objects which can be inspected and passed around during the build process.
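To make that concrete, here is a small sketch (the project and values are made up) showing that a key := value expression is an ordinary value that can be stored and passed around before being added to a project:
// Each := call constructs a Def.Setting[T]; nothing is "assigned" until sbt
// loads these settings into the build.
val nameSetting: Def.Setting[String] = name := "demo"
val forkSetting: Def.Setting[Boolean] = fork in Test := true

lazy val demo = (project in file("."))
  .settings(nameSetting, forkSetting)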

SBT multi-project build without using lazy vals

I'm working with a huge project with lots of subprojects, some of them with subprojects of their own. On top of that, I'd like some of them to be dynamic - given a List somewhere in the project build, I'd like to create one project for each of the elements.
For those reasons, having to define a lazy val for each project in build.sbt is very cumbersome. Is there another way to declare projects, like an addProject-like method we could call anywhere? Is there some SBT plugin that helps with that?
sbt uses macros to turn top-level vals into projects, so I don't think you will be able to escape that part. However, you can define all of your build configuration in Project => Project functions (note that you also get composability "for free" via function composition):
def myConf: Project => Project =
  _.enablePlugins(ScalaJSPlugin)
    .settings(scalaVersion := "2.12.0")
Then simply use project.configure(myConf) for single-line project definitions:
lazy val subProject1 = project.configure(myConf)
lazy val subProject2 = project.configure(myConf)
lazy val subProject3 = project.configure(myConf)
lazy val subProject4 = project.configure(myConf)
...
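Since these are plain functions, further configuration can be layered on with andThen; for example (withScalaTest is a made-up configuration):
def withScalaTest: Project => Project =
  _.settings(libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test)

// Compose the two configuration functions into one
def fullConf: Project => Project = myConf andThen withScalaTest

lazy val subProject5 = project.configure(fullConf)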

sbt-unidoc -- how to exclude a sub-module from a RootProject

I am building compound API docs using the sbt-unidoc plugin. I am linking to various projects on GitHub using sbt's RootProject constructor with URIs.
One of these projects has mutually exclusive sub-projects:
val foo = RootProject(uri("git://github.com/Foo/Bar.git#v1.0.0"))
That is, in this project foo, there are two sub-modules foo-db1 and foo-db2, which contain the same types but are built against different dependency library versions.
Now when I try to run unidoc, it fails because a type is defined twice from unidoc's perspective. From the documentation, I can see there is a filter function:
unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject -- inProjects(app)
But how do I create an identifier for a sub-module from my RootProject?
In other words, how would I do this:
unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject --
  inProjects(foo).SELECT_SUB_PROJECT(foo-v1)
?
The answer is found in this entry: Instead of using RootProject, one creates a number of ProjectRef instances.
import UnidocKeys._

val fooURI = uri("git://github.com/Foo/Bar.git#v1.0.0")
val fooDb1 = ProjectRef(fooURI, "foo-db1") // sub-module I don't want
val fooDb2 = ProjectRef(fooURI, "foo-db2") // sub-module I want

val root = (project in file("."))
  .settings( ... )
  .settings(unidocSettings: _*)
  .settings(site.settings ++ ghpages.settings: _*)
  .settings(
    unidocProjectFilter in (ScalaUnidoc, unidoc) :=
      inAnyProject -- inProjects(fooDb1)
  )
  .aggregate(fooDb2)
Note that putting only fooDb2 in the aggregate and leaving fooDb1 out does not have any effect. It seems that unidoc always includes all sub-modules no matter what, so one has to use unidocProjectFilter to explicitly remove projects, even if they don't appear in the aggregate.

How to define project-specific settings/keys in sbt 0.13

I have a simple multi-project build, whereby the root aggregates projects a and b. The root project loads this plugin I'm writing, which is supposed to allow easy integration with our company's build system.
lazy val a = project in file("a")
lazy val b = project in file("b")
Now, I'd like to define some Settings in the plugin that don't make sense in the root project, and can have different values for each sub-project. However, if I just define them like
object P extends Plugin {
  val kind = settingKey[Kind]("App or Lib?")
  val ourResolver = settingKey[Resolver]("...")

  override def projectSettings = Seq(
    // I want this to only be defined in a and b, where `kind` is defined
    ourResolver <<= kind { k => new OurInternalResolver(k) }
  )
}
then sbt will complain that ourResolver cannot be defined because kind is not defined in the root project.
Is there a way to specify a scope for this setting (ourResolver), such that the setting would be defined in every aggregated project except the root project?
Or do I have to make it into a SettingKey[Option[_]] and set it to None by default?
Edit: I have quite a large set of settings which progressively depend on kind, then ourResolver, and so on, and these settings should only be defined where (read: "in the project scopes where") kind is defined. I've added ourResolver to the example code to reflect this.
However, if I just define them like....
There's nothing really magical about setting keys. A key is just a String entry tied to a type, so you can define your key the way you're doing now just fine.
Is there a way to specify a scope for this setting, such that the setting would be defined in every aggregated project except the root project?
A setting consists of the following four things:
scope (project, config, task)
a key
dependencies
an ability to evaluate itself (def evaluate(Settings[Scope]): T)
If you did not specify otherwise, your settings are already scoped to a particular project in the build definition.
lazy val a = (project in file("a")).
  settings(commonSettings: _*).
  settings(
    name := "a",
    kind := Kind.App,
    libraryDependencies ++= aDeps(scalaVersion.value)
  )

lazy val b = (project in file("b")).
  settings(commonSettings: _*).
  settings(
    name := "b",
    kind := Kind.Lib,
    libraryDependencies ++= bDeps(scalaVersion.value)
  )

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "foo",
    publishArtifact := false
  ).
  aggregate(a, b)
In the above, the kind := Kind.App setting is scoped to project a. So that would answer your question literally.
sbt will complain that kind is not defined in the root project.
This is the part I'm not clear about. Does this happen when you load the build, or when you type kind into the sbt shell? If you're seeing it at startup, that likely means you have a task or setting that's trying to depend on the kind key. Don't load that setting, or reimplement it as something that doesn't use kind.
Another way to avoid this issue may be to give up on root aggregation: switch into each subproject and run tasks there, or construct explicit aggregation using ScopeFilter.
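As a sketch of the ScopeFilter route (kindReport is a made-up key), a task in the root project can collect kind from the subprojects that actually define it:
// Collect `kind` from a and b only, skipping the root project
lazy val nonRoot = ScopeFilter(inProjects(a, b))
val kindReport = taskKey[Seq[Kind]]("kind of every non-root project")

kindReport := kind.all(nonRoot).value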
I've managed to ultimately do this by leveraging derive, a method that creates DerivedSettings. These settings will then be automatically expanded by sbt in every scope where their dependencies are defined.
Therefore I can now write my plugin as:
object P extends Plugin {
  val kind = settingKey[Kind]("App or Lib?")
  val ourResolver = settingKey[Resolver]("...")

  def derivedSettings = Seq(
    // I want this to only be defined in a and b, where `kind` is defined
    ourResolver <<= kind { k => new OurInternalResolver(k) }
  ) map (s => Def.derive(s, trigger = _ != streams.key))

  override def projectSettings = derivedSettings ++ Seq( /* other stuff */ )
}
And if I define kind in a's and b's build.sbt, then ourResolver will also end up being defined in those scopes.
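So, for instance, each subproject's build.sbt only needs the line that triggers the derivation (sketch):
// a/build.sbt: defining `kind` here makes the derived `ourResolver`
// setting appear in project a's scope automatically
kind := Kind.App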