How to "seq" plugin settings in a multi-project sbt build - scala

I am converting a single-project build.sbt to a multi-project build.sbt, which is always a PITA. There is this obscure syntax to make plugin settings available, e.g. before:
seq(appbundle.settings: _*)
How do I do this with sub-projects? E.g.
lazy val views = Project(
  id = "views",
  base = file("views"),
  dependencies = Seq(core),
  settings = commonSettings ++ Seq(
    seq(appbundle.settings: _*), // ???
    name := "views",
    description := ...
  )
)
This just gives me an error:
found   : Seq[sbt.Def.SettingsDefinition]
required: Seq[sbt.Def.Setting[_]]
    settings = commonSettings ++ Seq(
    ^

Add them using ++ to the overall settings:
lazy val views = Project(
  id = "views",
  base = file("views"),
  dependencies = Seq(core),
  settings = commonSettings ++ appbundle.settings ++ Seq(
    name := "views",
    description := ...
  )
)
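The seq(...) wrapper was only needed to splice a settings sequence into old-style .sbt files; appbundle.settings is already a Seq[Setting[_]], so plain ++ works. For reference, a sketch of the same sub-project in the newer .settings(...) style (same commonSettings, sbt-appbundle plugin assumed):
lazy val views = project.in(file("views")).
  dependsOn(core).
  settings(commonSettings: _*).
  settings(appbundle.settings: _*). // the plugin's settings splat in directly
  settings(
    name := "views",
    description := "..." // elided in the question
  )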

Related

Why does sbt try to pull my interproject dependency?

I have a multi-project build with a build.sbt that looks as follows:
import lmcoursier.CoursierConfiguration
import lmcoursier.definitions.Authentication

ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.12"

val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"

val adoMavenRepos = Vector(
  MavenRepository(adoRepoIdWithView, s"https://adoMavenHost/adoOrganization/adoProject/_packaging/${adoRepoIdWithView.replace("#", "%40")}/maven/v1")
)

val adoAuthentication =
  Authentication(user = adoMavenUsername, password = adoMavenPassword)
    .withOptional(false)
    .withHttpsOnly(true)
    .withPassOnRedirect(false)

val coursierConfiguration = {
  val initial =
    CoursierConfiguration()
      .withResolvers(adoMavenRepos)
      .withClassifiers(Vector("", "sources"))
      .withHasClassifiers(true)
  adoMavenRepos.foldLeft(initial) {
    case (conf, repo) =>
      conf.addRepositoryAuthentication(repo.name, adoAuthentication)
  }
}

lazy val mainSettings = Seq(
  organization := "org.some",
  csrConfiguration := coursierConfiguration,
  updateClassifiers / csrConfiguration := coursierConfiguration
)

lazy val root = (project in file("."))
  .settings(mainSettings: _*)
  .settings(
    name := "sbt-test",
  )
  .aggregate(core, util)

lazy val core = (project in file("core"))
  .settings(mainSettings: _*)
  .settings(
    name := "core",
  )
  .dependsOn(util)

lazy val util = (project in file("util"))
  .settings(mainSettings: _*)
  .settings(
    name := "util"
  )
For some reason, coursier attempts to download the util package externally during the core/update task. This is not what I want, as it should resolve it internally as part of the project. The package is not added to libraryDependencies, so I'm baffled why it would attempt the download.
The above example will fail because the Azure DevOps credentials and Maven repository are incorrect, but it shows the attempt to download util.
It seems somehow related to this Github issue.
The default CoursierConfiguration constructor sets the interProjectDependencies property to an empty Vector, so building a configuration from scratch discards the in-build dependencies sbt would otherwise pass to coursier. To fix this, add the extra resolvers on top of sbt's own csrConfiguration task value using .withResolvers, instead of starting from CoursierConfiguration().
This is what the solution looks like applied to my question, largely based on this Github comment:
val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"
val adoMavenHost = "pkgs.dev.azure.com"

val adoMavenRepos = Vector(
  MavenRepository(adoRepoIdWithView, s"https://$adoMavenHost/adoOrganization/adoProject/_packaging/$adoRepoIdWithView/maven/v1")
)

lazy val mainSettings = Seq(
  organization := "org.some",
  csrConfiguration := {
    val resolvers = csrResolvers.value ++ adoMavenRepos
    // start from sbt's own configuration so interProjectDependencies is preserved
    val conf = csrConfiguration.value.withResolvers(resolvers.toVector)
    val adoCredentialsOpt = credentials.value.collectFirst {
      case creds: DirectCredentials if creds.host == adoMavenHost => creds
    }
    val newConfOpt = adoCredentialsOpt.map { adoCredentials =>
      val auths =
        resolvers.collect {
          case repo: MavenRepository if repo.root.startsWith(s"https://$adoMavenHost/") =>
            repo.name -> Authentication(adoCredentials.userName, adoCredentials.passwd)
        }
      auths.foldLeft(conf) {
        case (conf, (repoId, auth)) => conf.addRepositoryAuthentication(repoId, auth)
      }
    }
    newConfOpt.getOrElse(conf)
  },
  // the standalone coursierConfiguration val is gone, so reuse the value built above
  updateClassifiers / csrConfiguration := csrConfiguration.value
)

Cross compiling Scala versions with differing project modules

I'm trying to use sbt-projectmatrix to cross compile scala versions. However, I want one of the final modules to use different internal dependencies, based on the target Scala version.
This is a toy example of where I am right now:
lazy val root: Project = (project in file("."))
  .settings(commonSettings: _*)
  .settings(publish := {})
  .aggregate(
    (a_spark21.projectRefs ++
      a_spark22.projectRefs ++
      a_spark23.projectRefs ++
      a_spark24.projectRefs ++
      top.projectRefs): _*
  )
lazy val top: ProjectMatrix = (projectMatrix in file("vulcan-hive"))
  .disablePlugins(SitePreviewPlugin, SitePlugin, ParadoxPlugin, ParadoxSitePlugin)
  .settings(name := "vulcan-hive")
  .settings(commonSettings: _*)
  .jvmPlatform(scalaVersions = Seq(versions.scala211, versions.scala212))
  .customRow(
    scalaVersions = Seq(versions.scala211),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      SubModuleDependencies.hiveDependencies24,
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
  .customRow(
    scalaVersions = Seq(versions.scala212),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      SubModuleDependencies.hiveDependencies24,
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
  .dependsOn(
    a_spark24 % "compile->compile;test->test",
    a_spark23 % "compile->compile;test->test",
    a_spark22 % "compile->compile;test->test",
    a_spark21 % "compile->compile;test->test"
  )
lazy val a_spark21: ProjectMatrix = (projectMatrix in file("a_spark21"))
  .jvmPlatform(scalaVersions = Seq(versions.scala211))
  .customRow(
    scalaVersions = Seq(versions.scala211),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      libraryDependencies ++= Dependencies.Spark21.Compile.all.map(_.force())
    )
  )
lazy val a_spark22: ProjectMatrix = ...
lazy val a_spark23: ProjectMatrix = ...
lazy val a_spark24: ProjectMatrix = (projectMatrix in file("a_spark24"))
  .jvmPlatform(scalaVersions = Seq(versions.scala211, versions.scala212))
  .customRow(
    scalaVersions = Seq(versions.scala211),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
  .customRow(
    scalaVersions = Seq(versions.scala212),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
This fails with this error:
no rows were found in a_spark23 matching ProjectRow(true, List(PlatformAxis(jvm,JVM,jvm), ScalaVersionAxis(2.12.12,2.12))): List(ProjectRow(true, List(PlatformAxis(jvm,JVM,jvm), ScalaVersionAxis(2.11.12,2.11))), ProjectRow(true, List(PlatformAxis(jvm,JVM,jvm), ScalaVersionAxis(2.11.12,2.11))))
This error makes sense, since top is trying to find 2.12 dependencies for a_spark23 even though a_spark23 doesn't have a 2.12 row defined. Is there a way to move the dependsOn definitions into the customRow arguments? Or to access the scalaBinaryVersion outside the settings closure, so that I can pass different lists to dependsOn?
The final product I want is a top_2.12 that depends on a_spark24_2.12 and a top_2.11 that depends on [a_spark21_2.11, a_spark22_2.11, a_spark23_2.11, a_spark24_2.11].
Any help or guidance would be greatly appreciated!
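One possible direction (an untested sketch; it assumes sbt-projectmatrix's row finder a_sparkXX.jvm(scalaVersion), and relies on customRow's third argument being a Project => Project function, so row-specific dependsOn edges can be declared inside it):
lazy val top: ProjectMatrix = (projectMatrix in file("vulcan-hive"))
  .settings(commonSettings: _*)
  .customRow(
    scalaVersions = Seq(versions.scala212),
    axisValues = Seq(VirtualAxis.jvm),
    // the 2.12 row depends only on the 2.12 row of a_spark24
    _.dependsOn(a_spark24.jvm(versions.scala212) % "compile->compile;test->test")
      .settings(libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force()))
  )
  .customRow(
    scalaVersions = Seq(versions.scala211),
    axisValues = Seq(VirtualAxis.jvm),
    // the 2.11 row depends on the 2.11 rows of all four modules
    row => Seq(a_spark21, a_spark22, a_spark23, a_spark24).foldLeft(row) { (p, m) =>
      p.dependsOn(m.jvm(versions.scala211) % "compile->compile;test->test")
    }
  )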

how to run scala sbt-native-packager for an appJS/appJVM cross-build project

The sbt-native-packager can make a zip file with all dependencies and a script to run:
$ sbt universal:packageBin
I have a scala web application, using cross-build (appJS for front-end and appJVM for back-end).
How do I run this packager for the appJVM?
I've tried as follows, but it does not accept the command:
$ sbt appJVM/universal:packageBin
Here it is the build.sbt project, from https://www.scala-js.org/doc/project/cross-build.html
...
lazy val foo = crossProject.in(file(".")).
settings(
name := "foo",
version := "0.1-SNAPSHOT"
).
jvmSettings(
// Add JVM-specific settings here
).
jsSettings(
// Add JS-specific settings here
)
lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
How do I run this packager for the appJVM?
And how do I include the file generated by sbt appJS/fullOptJS?
And some other static files?
Update with Ivan's response
build.sbt:
import sbtcrossproject.CrossPlugin.autoImport.{crossProject, CrossType}

val sharedSettings = Seq(
  scalaVersion := "2.12.8",
)

lazy val app =
  crossProject(JSPlatform, JVMPlatform)
    .in(file("."))
    .settings(sharedSettings)
    .jsSettings(
    )
    .jvmSettings(
      libraryDependencies ++= Seq(
        "com.typesafe.akka" %% "akka-http" % "10.1.9"
      ),
    )

lazy val backend = project
  .enablePlugins(UniversalPlugin)
  .enablePlugins(JavaAppPackaging)
  .dependsOn(app.jvm)
  .settings(
    mainClass in Compile := Some("com.example.EchoServer")
  )

lazy val frontend = project
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(app.js)

backend
  .settings(
    Seq(
      resourceGenerators in Compile += Def.task {
        Seq(
          (fullOptJS in Compile in frontend).value,
          (fastOptJS in Compile in frontend).value
        ).map { js =>
          val resource = (resourceManaged in Compile).value / "public" / "assets" / js.data.name
          IO.write(resource, IO.read(js.data))
          resource
        }
      }.taskValue
    )
  )
and run:
$ sbt backend/universal:packageBin

34: error: type mismatch;
 found   : Seq[sbt.Def.Setting[Seq[sbt.Task[Seq[java.io.File]]]]]
 required: Int
      Seq(
      ^
[error] Type error in expression
[error] Type error in expression
I used the following structure.
Define a shared project that needs to be cross-compiled for JS and the JVM.
lazy val shared = CrossPlugin.autoImport
  .crossProject(JSPlatform, JVMPlatform)
  .crossType(CrossType.Pure)
  .jvmSettings(???)
  .jsSettings(???)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js
Add a project that contains the Main class.
lazy val backend = project
  .enablePlugins(UniversalPlugin)
  .enablePlugins(JavaAppPackaging)
  .dependsOn(sharedJvm)
Add a web project containing the web-related code.
lazy val web = project
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(sharedJs)
And finally, attach the JS output compiled by web to backend as managed resources. Note that .settings returns a new Project rather than mutating the old one, and a bare backend.settings(...) expression is not a valid statement in a .sbt file; that is what the "Type error in expression" failure in the update above is complaining about. The generator therefore has to be part of the backend definition from the step above:
lazy val backend = project
  .enablePlugins(UniversalPlugin)
  .enablePlugins(JavaAppPackaging)
  .dependsOn(sharedJvm)
  .settings(
    resourceGenerators in Compile += Def.task {
      Seq(
        (fullOptJS in Compile in web).value,
        (fastOptJS in Compile in web).value
      ).map { js =>
        // copy each emitted .js file into managed resources under public/assets
        val resource = (resourceManaged in Compile).value / "public" / "assets" / js.data.name
        IO.write(resource, IO.read(js.data))
        resource
      }
    }.taskValue
  )
The Main class needs to serve the compiled JS from public/assets, as configured in sbt, and any other web resources from its classpath.
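For reference, a minimal sketch of such a Main class, assuming the akka-http 10.1.x dependency from the build above and the com.example.EchoServer main class it names (the actual server code was not shown in the thread):
package com.example

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer

object EchoServer extends App {
  implicit val system: ActorSystem = ActorSystem("backend")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  val route =
    pathEndOrSingleSlash {
      getFromResource("public/index.html") // assumed entry page
    } ~
    pathPrefix("assets") {
      // serves the fastOptJS/fullOptJS output copied into public/assets
      getFromResourceDirectory("public/assets")
    }

  Http().bindAndHandle(route, "0.0.0.0", 8080)
}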

sbt multi project undefined settings

I have a multi project setup in SBT. In our build process there's a file in the project that is automatically updated by our CI. It contains the app version.
However, whenever I try to load the app settings, I get an error similar to the following:
[error] References to undefined settings:
[error]
[error] module1/*:appProperties from module1/*:version (/Users/jespeno/workspace/multi-module/build.sbt:10)
[error]
[error] module2/*:appProperties from module2/*:version (/Users/jespeno/workspace/multi-module/build.sbt:10)
This is what my sbt file looks like:
import java.util.Properties

val appProperties = settingKey[Properties]("app version")

appProperties := {
  val prop = new Properties()
  IO.load(prop, new File("version.properties"))
  prop
}

val commonSettings = Seq(
  version := appProperties.value.getProperty("project.version"),
  scalaVersion := "2.11.7"
)
lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .aggregate(module1, module2)
  .settings(
    name := appProperties.value.getProperty("project.name")
  )

lazy val module1 = (project in file("./modules/module1"))
  .settings(commonSettings: _*)
  .settings(
    name := "module1"
  )

lazy val module2 = (project in file("./modules/module2"))
  .settings(commonSettings: _*)
  .settings(
    name := "module2"
  )
Here's my version.properties:
project.name="multi-module"
project.version="0.0.1"
The interesting thing is, the root project is able to load the settings correctly: if I remove the sub-modules, the build starts correctly. I'm using SBT version 0.13.8.
This is caused by appProperties not being visible to the submodules (module1, module2). You can change it to:
appProperties in Global := {
  val prop = new Properties()
  IO.load(prop, new File("version.properties"))
  prop
}
See sbt scopes.
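An equivalent alternative (a sketch; either scope works here): scope the setting to ThisBuild, since per-project lookups delegate first to ThisBuild and then to Global:
appProperties in ThisBuild := {
  // loaded once for the whole build; module1 and module2 resolve
  // appProperties.value through scope delegation
  val prop = new Properties()
  IO.load(prop, new File("version.properties"))
  prop
}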

SBT/Scala: macro implementation not found

I tried my hand on macros, and I keep running into the error
macro implementation not found: W
[error] (the most common reason for that is that you cannot use macro implementations in the same compilation run that defines them)
I believe I've set up a two-pass compilation, with the macro implementation being compiled first and the usage second.
Here is part of the /build.sbt:
lazy val root = (project in file(".")).
  settings(rootSettings: _*).
  settings(name := "Example").
  aggregate(macros, core).
  dependsOn(macros, core)

lazy val macros = (project in file("src/main/com/example/macros")).
  settings(macrosSettings: _*).
  settings(name := "Macros")

lazy val core = (project in file("src/main/com/example/core")).
  settings(coreSettings: _*).
  settings(name := "Core").
  dependsOn(macros)

lazy val commonSettings = Seq(
  organization := Organization,
  version := Version,
  scalaVersion := ScalaVersion
)

lazy val rootSettings = commonSettings ++ Seq(
  libraryDependencies ++= commonDeps ++ rootDeps ++ macrosDeps ++ coreDeps
)

lazy val macrosSettings = commonSettings ++ Seq(
  libraryDependencies ++= commonDeps ++ macrosDeps
)

lazy val coreSettings = commonSettings ++ Seq(
  libraryDependencies ++= commonDeps ++ coreDeps
)
The macro implementation looks like this:
/src/main/com/example/macros/Macros.scala
import scala.reflect.macros.blackbox.Context

object Macros {
  object Color {
    def ColorWhite(c: Context): c.Expr[ObjectColor] =
      c.Expr[ObjectColor](c.universe.reify(ObjectColor(White())).tree)
  }
}
The usage looks like this:
/src/main/com/example/core/Main.scala
import scala.language.experimental.macros

object Macros {
  import com.example.macros.Macros._
  def W: ObjectColor = macro Color.ColorWhite
}

object Main extends App {
  import Macros._
  println(W)
}
Scala 2.11.6. SBT 0.13.8.
What am I doing wrong?
Thanks for your advice!
Fawlty project: The Project on Github
Working project, rearranged to a more correct form: The cleaned-up working project
Your macros and core projects don't contain any files, so they don't cause the problem. The error happens when sbt compiles root, which contains both Main.scala and Macros.scala by virtue of the project in file(".") in the sbt build, so the macro definition and its usage end up in the same compilation run.
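A sketch of the usual fix (directory names assumed): give each sub-project its own directory with its own src/main/scala, so the macro implementation is compiled in a separate run before its usage:
lazy val macros = (project in file("macros")).
  settings(
    scalaVersion := "2.11.6",
    // macro implementations need scala-reflect
    libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
  )

lazy val core = (project in file("core")).
  settings(scalaVersion := "2.11.6").
  dependsOn(macros) // compiled after macros, so the implementation is available

lazy val root = (project in file(".")).
  aggregate(macros, core)
with the sources moved to macros/src/main/scala/com/example/macros/Macros.scala and core/src/main/scala/com/example/core/Main.scala.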