Cross compiling Scala versions with differing project modules - scala

I'm trying to use sbt-projectmatrix to cross compile scala versions. However, I want one of the final modules to use different internal dependencies, based on the target Scala version.
This is a toy example of where I am right now:
lazy val root: ProjectMatrix = (projectMatrix in file("."))
  .settings(commonSettings: _*)
  .settings(publish := {})
  .aggregate(
    a_spark21.projectRefs ++
      a_spark22.projectRefs ++
      a_spark23.projectRefs ++
      a_spark24.projectRefs ++
      top.projectRefs: _*
  )
lazy val top: ProjectMatrix = (projectMatrix in file("vulcan-hive"))
  .disablePlugins(SitePreviewPlugin, SitePlugin, ParadoxPlugin, ParadoxSitePlugin)
  .settings(name := "vulcan-hive")
  .settings(commonSettings: _*)
  .jvmPlatform(scalaVersions = Seq(versions.scala211, versions.scala212))
  .customRow(
    scalaVersions = Seq(versions.scala211),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      SubModuleDependencies.hiveDependencies24,
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
  .customRow(
    scalaVersions = Seq(versions.scala212),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      SubModuleDependencies.hiveDependencies24,
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
  .dependsOn(
    a_spark24 % "compile->compile;test->test",
    a_spark23 % "compile->compile;test->test",
    a_spark22 % "compile->compile;test->test",
    a_spark21 % "compile->compile;test->test"
  )
lazy val a_spark21: ProjectMatrix = (projectMatrix in file("a_spark21"))
  .jvmPlatform(scalaVersions = Seq(versions.scala211))
  .customRow(
    scalaVersions = Seq(versions.scala211),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      libraryDependencies ++= Dependencies.Spark21.Compile.all.map(_.force())
    )
  )
lazy val a_spark22: Project = ...
lazy val a_spark23: Project = ...
lazy val a_spark24: ProjectMatrix = (projectMatrix in file("a_spark24"))
  .jvmPlatform(scalaVersions = Seq(versions.scala211, versions.scala212))
  .customRow(
    scalaVersions = Seq(versions.scala211),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
  .customRow(
    scalaVersions = Seq(versions.scala212),
    axisValues = Seq(VirtualAxis.jvm),
    _.settings(
      libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
    )
  )
This fails with the following error:
no rows were found in a_spark23 matching ProjectRow(true, List(PlatformAxis(jvm,JVM,jvm), ScalaVersionAxis(2.12.12,2.12))): List(ProjectRow(true, List(PlatformAxis(jvm,JVM,jvm), ScalaVersionAxis(2.11.12,2.11))), ProjectRow(true, List(PlatformAxis(jvm,JVM,jvm), ScalaVersionAxis(2.11.12,2.11))))
This error makes sense, since top is trying to find 2.12 dependencies for a_spark23 even though a_spark23 doesn't have a 2.12 row defined. Is there a way to move the dependsOn definitions into the customRow arguments? Or to access scalaBinaryVersion outside the settings closure, so that I can pass different lists to dependsOn?
The final product I want is a top_2.12 that depends on a_spark24_2.12, and a top_2.11 that depends on [a_spark21_2.11, a_spark22_2.11, a_spark23_2.11, a_spark24_2.11].
Any help or guidance would be greatly appreciated!
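For reference, here is the shape such a per-row dependsOn could take. This is an untested sketch: customRow's third argument is a Project => Project function, so it can add dependsOn as well as settings; the .jvm(scalaVersion) row lookup it relies on is assumed from newer sbt-projectmatrix releases and is not confirmed by this thread.
// Untested sketch: give top's 2.12 row its own dependency list.
// a_spark24.jvm(...) is assumed to look up the concrete subproject
// generated for that Scala version.
lazy val top: ProjectMatrix = (projectMatrix in file("vulcan-hive"))
  .settings(commonSettings: _*)
  .customRow(
    scalaVersions = Seq(versions.scala212),
    axisValues = Seq(VirtualAxis.jvm),
    p => p
      .dependsOn(a_spark24.jvm(versions.scala212) % "compile->compile;test->test")
      .settings(
        SubModuleDependencies.hiveDependencies24,
        libraryDependencies ++= Dependencies.Spark24.Compile.all.map(_.force())
      )
  )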

Related

Use sbt settingKey in function call

We have a build.sbt file like this, which is working fine:
name := "Foo"
version := "0.1"
scalaVersion := "2.12.8"
def aws(module: String): ModuleID = "com.amazonaws" % module % "1.11.250"
lazy val Core = project
.settings(
libraryDependencies ++= Seq(
aws("aws-java-sdk-s3"),
aws("aws-java-sdk-dynamodb"),
)
)
Basically, the project has a few AWS SDK library dependencies, and we want to avoid typing the groupID (e.g. "com.amazonaws") and the revision (e.g. "1.11.250") multiple times; that's why we have this line:
def aws(module: String): ModuleID = "com.amazonaws" % module % "1.11.250"
However, we have many repos like this, so we want to move this definition into a custom sbt plugin. To begin with, we tried this:
name := "Foo"
version := "0.1"
scalaVersion := "2.12.8"
val awsVersion = settingKey[String]("The version of aws SDK used for building.") // line 5
def aws(module: String): ModuleID = "com.amazonaws" % module % awsVersion.value // line 6
awsVersion := "1.11.250"
lazy val Core = project
.settings(
libraryDependencies ++= Seq(
aws("aws-java-sdk-s3"),
aws("aws-java-sdk-dynamodb"),
)
)
However, line 6 is producing an error:
error: value can only be used within a task or setting macro, such as :=, +=, ++=, Def.task, or Def.setting.
The idea is that we'll eventually move lines 5 and 6 above into our plugin, so that we can use it like this:
name := "Foo"
version := "0.1"
scalaVersion := "2.12.8"
awsVersion := "1.11.250"
lazy val Core = project
.settings(
libraryDependencies ++= Seq(
aws("aws-java-sdk-s3"),
aws("aws-java-sdk-dynamodb"),
)
)
Any solution or workaround for the error above?
We've also tried this:
def aws(module: String, version: String): ModuleID = "com.amazonaws" % module % version
... which is then used like this:
awsVersion := "1.11.250"
lazy val Core = project
.settings(
libraryDependencies ++= Seq(
aws("aws-java-sdk-s3", awsVersion.value),
aws("aws-java-sdk-dynamodb", awsVersion.value),
)
)
That works, though it is a bit annoying to use, and it defeats the purpose of using a settingKey to begin with.
You cannot use setting or task values in locally defined methods like aws. They can be used only within other setting or task definitions, i.e. the macros the error message lists: :=, +=, ++=, Def.task, or Def.setting.
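To illustrate the rule: the very same .value call is fine inside a settings macro but not inside an ordinary method (a minimal sketch):
// OK: .value is expanded by the ++= settings macro
libraryDependencies ++= Seq(
  "com.amazonaws" % "aws-java-sdk-s3" % awsVersion.value
)

// Not OK: an ordinary method body is not a settings macro,
// so there is nothing for .value to expand into
def aws(module: String): ModuleID =
  "com.amazonaws" % module % awsVersion.value // compile error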
This is what you could do.
Create an AutoPlugin in the project folder:
import sbt._
import sbt.PluginTrigger.AllRequirements

object AwsPlugin extends AutoPlugin {
  override def trigger = AllRequirements

  type GetAWS = String => ModuleID

  object autoImport {
    val awsVersion =
      settingKey[String]("The version of aws SDK used for building.")
    val awsLibrary = settingKey[GetAWS]("Builds given AWS library")
  }
  import autoImport._

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    awsLibrary := { id =>
      "com.amazonaws" % id % awsVersion.value
    }
  )
}
Use it like this in build.sbt:
awsVersion in ThisBuild := "1.11.250"

lazy val Core = project
  .settings(
    libraryDependencies ++= Seq(
      awsLibrary.value("aws-java-sdk-s3"),
      awsLibrary.value("aws-java-sdk-dynamodb")
    )
  )

How to run main project in multi-project scala

I have the following Build code for a multi-project setup:
import sbt._
import Keys._
import com.typesafe.sbt.SbtScalariform._
import scalariform.formatter.preferences._
import sbtunidoc.Plugin._

object DirectorynameBuild extends Build {
  addCommandAlias("rebuild", ";clean; compile; package")

  lazy val BoundedContext = Project(
    id = "BoundedContext",
    base = file("."),
    settings = commonSettings
  ).aggregate(persistence, core, biz, protocol, transport)

  lazy val persistence = Project(
    id = "BoundedContext-persistence",
    settings = commonSettings,
    base = file("persistence")
  ).dependsOn(protocol)

  lazy val core = Project(
    id = "BoundedContext-core",
    settings = commonSettings,
    base = file("core")
  ).dependsOn(persistence, biz, protocol)

  lazy val biz = Project(
    id = "BoundedContext-biz",
    settings = commonSettings,
    base = file("biz")
  ).dependsOn(protocol)

  lazy val protocol = Project(
    id = "BoundedContext-protocol",
    settings = commonSettings,
    base = file("protocol")
  )

  lazy val transport = Project(
    id = "BoundedContext-transport",
    settings = commonSettings,
    base = file("transport")
  ).dependsOn(protocol)

  val ORGANIZATION = "my.first.ddd.app"
  val PROJECT_NAME = "directoryname"
  val PROJECT_VERSION = "0.1-SNAPSHOT"
  val SCALA_VERSION = "2.11.4"
  val TYPESAFE_CONFIG_VERSION = "1.2.1"
  val SCALATEST_VERSION = "2.2.2"
  val SLF4J_VERSION = "1.7.9"
  val LOGBACK_VERSION = "1.1.2"

  lazy val commonSettings = Project.defaultSettings ++
    basicSettings ++
    formatSettings ++
    net.virtualvoid.sbt.graph.Plugin.graphSettings

  lazy val basicSettings = Seq(
    version := PROJECT_VERSION,
    organization := ORGANIZATION,
    scalaVersion := SCALA_VERSION,
    libraryDependencies ++= Seq(
      "com.typesafe" % "config" % TYPESAFE_CONFIG_VERSION,
      "org.slf4j" % "slf4j-api" % SLF4J_VERSION,
      "ch.qos.logback" % "logback-classic" % LOGBACK_VERSION % "runtime",
      "org.scalatest" %% "scalatest" % SCALATEST_VERSION % "test"
    ),
    scalacOptions in Compile ++= Seq(
      "-unchecked",
      "-deprecation",
      "-feature"
    ),
    /*javaOptions += "-Djava.library.path=%s:%s".format(
      sys.props("java.library.path")
    ),*/
    fork in run := true,
    fork in Test := true,
    parallelExecution in Test := false
  )

  lazy val formatSettings = scalariformSettings ++ Seq(
    ScalariformKeys.preferences := FormattingPreferences()
      .setPreference(IndentWithTabs, false)
      .setPreference(IndentSpaces, 2)
      .setPreference(AlignParameters, false)
      .setPreference(DoubleIndentClassDeclaration, true)
      .setPreference(MultilineScaladocCommentsStartOnFirstLine, false)
      .setPreference(PlaceScaladocAsterisksBeneathSecondAsterisk, true)
      .setPreference(PreserveDanglingCloseParenthesis, true)
      .setPreference(CompactControlReadability, true)
      .setPreference(AlignSingleLineCaseStatements, true)
      .setPreference(PreserveSpaceBeforeArguments, true)
      .setPreference(SpaceBeforeColon, false)
      .setPreference(SpaceInsideBrackets, false)
      .setPreference(SpaceInsideParentheses, false)
      .setPreference(SpacesWithinPatternBinders, true)
      .setPreference(FormatXml, true)
  )

  //credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")
}
I have not defined src/main/scala in the root folder.
I can run an individual project with sbt "project ****" run.
If I just run sbt run, I get "No Main Class Detected".
I can see that the Lagom framework implements the kind of process I want, but I couldn't figure out how it does it.
What should I do to make the main project run without a src/main/scala/***App.scala?
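One common workaround, sketched here as an untested suggestion (it assumes core is the subproject that holds your entry point), is to delegate the root project's run to that subproject:
// Sketch: make `sbt run` at the root delegate to core's run task.
lazy val BoundedContext = Project(
  id = "BoundedContext",
  base = file("."),
  settings = commonSettings ++ Seq(
    run in Compile := (run in Compile in core).evaluated
  )
).aggregate(persistence, core, biz, protocol, transport)
Alternatively, a command alias achieves the same without touching settings: addCommandAlias("run", "BoundedContext-core/run").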

SBT/Scala: macro implementation not found

I tried my hand at macros, and I keep running into the error:
macro implementation not found: W
[error] (the most common reason for that is that you cannot use macro implementations in the same compilation run that defines them)
I believe I've set up a two-pass compilation, with the macro implementation compiled first and the usage second.
Here is part of the /build.sbt:
lazy val root = (project in file("."))
  .settings(rootSettings: _*)
  .settings(name := "Example")
  .aggregate(macros, core)
  .dependsOn(macros, core)

lazy val macros = (project in file("src/main/com/example/macros"))
  .settings(macrosSettings: _*)
  .settings(name := "Macros")

lazy val core = (project in file("src/main/com/example/core"))
  .settings(coreSettings: _*)
  .settings(name := "Core")
  .dependsOn(macros)

lazy val commonSettings = Seq(
  organization := Organization,
  version := Version,
  scalaVersion := ScalaVersion
)

lazy val rootSettings = commonSettings ++ Seq(
  libraryDependencies ++= commonDeps ++ rootDeps ++ macrosDeps ++ coreDeps
)

lazy val macrosSettings = commonSettings ++ Seq(
  libraryDependencies ++= commonDeps ++ macrosDeps
)

lazy val coreSettings = commonSettings ++ Seq(
  libraryDependencies ++= commonDeps ++ coreDeps
)
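As a side note: macrosDeps is not shown in the question, but for the macro subproject to compile at all, it has to pull in scala-reflect (that's where the Context type lives). A hypothetical minimal definition:
// Hypothetical: the question elides macrosDeps; a macro implementation
// using scala.reflect.macros.Context needs scala-reflect on the classpath.
val macrosDeps = Seq(
  "org.scala-lang" % "scala-reflect" % ScalaVersion
)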
The macro implementation looks like this:
/src/main/com/example/macros/Macros.scala
object Macros {
  object Color {
    def ColorWhite(c: Context): c.Expr[ObjectColor] =
      c.Expr[ObjectColor](c.universe.reify(ObjectColor(White())).tree)
  }
}
The usage looks like this:
/src/main/com/example/core/Main.scala
object Macros {
  import com.example.macros.Macros._
  def W: ObjectColor = macro Color.ColorWhite
}

object Main extends App {
  import Macros._
  println(W)
}
Scala 2.11.6. SBT 0.13.8.
What am I doing wrong?
Thanks for your advice!
Fawlty project: The Project on Github
Working project (the projects rearranged into a more correct form): The cleaned-up working project
Your macros and core projects don't contain any files, so they don't cause the problem. The error happens when sbt compiles root, which contains both Main.scala and Macros.scala by virtue of the project in file(".") in the sbt build.
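For comparison, the working layout gives each subproject its own base directory outside the root's sources, so the macro implementation is compiled in a separate, earlier run. A sketch of that shape (directory names assumed, not taken from the linked repo):
// Sketch: subprojects in sibling directories, each with its own
// src/main/scala, so `macros` is fully compiled before `core` expands the macro.
lazy val macros = (project in file("macros"))   // macros/src/main/scala/...
  .settings(macrosSettings: _*)

lazy val core = (project in file("core"))       // core/src/main/scala/...
  .settings(coreSettings: _*)
  .dependsOn(macros)

lazy val root = (project in file("."))
  .settings(rootSettings: _*)
  .aggregate(macros, core)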

How to "seq" plugin settings in a multi-project sbt build

I am converting a single-project build.sbt to a multi-project build.sbt, which is always a PITA. There is this obscure syntax to make plugin settings available, e.g. before:
seq(appbundle.settings: _*)
How do I do this with sub-projects? E.g.:
lazy val views = Project(
  id = "views",
  base = file("views"),
  dependencies = Seq(core),
  settings = commonSettings ++ Seq(
    seq(appbundle.settings: _*), // ???
    name := "views",
    description := ...
  )
)
This just gives me an error:
 found   : Seq[sbt.Def.SettingsDefinition]
 required: Seq[sbt.Def.Setting[_]]
    settings = commonSettings ++ Seq(
                                 ^
Add them to the overall settings using ++:
lazy val views = Project(
  id = "views",
  base = file("views"),
  dependencies = Seq(core),
  settings = commonSettings ++ appbundle.settings ++ Seq(
    name := "views",
    description := ...
  )
)
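For what it's worth, the same composition also reads naturally with the method-style project API (a sketch using the question's names):
lazy val views = Project(id = "views", base = file("views"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(appbundle.settings: _*)
  .settings(
    name := "views"
  )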

Compile with different settings in different commands

I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands, each of which compiles only some of the files inside this project. The testScalalib command added above, for instance, is supposed to compile only certain specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._
  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(
      _ filter (f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations, like compile and test, with settings values appropriate to your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands; I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
This is how you can do it using sbt 0.13.2 and Build.scala (you could of course do the same in build.sbt, or with an older sbt version and different syntax):
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val Api = config("api")

  val root = Project(id = "root", base = file("."))
    .configs(Api)
    .settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f =>
        !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile, everything gets compiled, but when you call api:compile, only the sources matching the filter predicate are compiled.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or an includeFilter, as sketched below.
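For example, the excludeFilter route could look like this (an untested sketch in sbt 0.13 syntax, scoped to the Api configuration defined above):
// Sketch: drop Typers.scala from unmanaged sources in the Api config,
// keeping the default HiddenFileFilter exclusion.
excludeFilter in (Api, unmanagedSources) := HiddenFileFilter || "Typers.scala"
Note that this filters by file name only; the scalalibrary/ path check from the original predicate would still need the unmanagedSources filter above.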