Why does sbt try to pull my interproject dependency?

I have a multi-project build with a build.sbt that looks as follows:
import lmcoursier.CoursierConfiguration
import lmcoursier.definitions.Authentication

ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.12"

val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"

val adoMavenRepos = Vector(
  MavenRepository(adoRepoIdWithView, s"https://adoMavenHost/adoOrganization/adoProject/_packaging/${adoRepoIdWithView.replace("#", "%40")}/maven/v1")
)

val adoAuthentication =
  Authentication(user = adoMavenUsername, password = adoMavenPassword)
    .withOptional(false)
    .withHttpsOnly(true)
    .withPassOnRedirect(false)

val coursierConfiguration = {
  val initial =
    CoursierConfiguration()
      .withResolvers(adoMavenRepos)
      .withClassifiers(Vector("", "sources"))
      .withHasClassifiers(true)
  adoMavenRepos.foldLeft(initial) {
    case (conf, repo) =>
      conf.addRepositoryAuthentication(repo.name, adoAuthentication)
  }
}

lazy val mainSettings = Seq(
  organization := "org.some",
  csrConfiguration := coursierConfiguration,
  updateClassifiers / csrConfiguration := coursierConfiguration
)

lazy val root = (project in file("."))
  .settings(mainSettings: _*)
  .settings(
    name := "sbt-test",
  )
  .aggregate(core, util)

lazy val core = (project in file("core"))
  .settings(mainSettings: _*)
  .settings(
    name := "core",
  )
  .dependsOn(util)

lazy val util = (project in file("util"))
  .settings(mainSettings: _*)
  .settings(
    name := "util"
  )
For some reason, coursier attempts to download the util package externally during the core/update task. This is not what I want, as it should resolve it internally as part of the project. The package is not added to libraryDependencies, so I'm baffled why it would attempt the download.
The above example will fail because the Azure DevOps credentials and Maven repository are incorrect, but it shows the attempt to download util.
It seems somehow related to this GitHub issue.

Calling the default CoursierConfiguration() constructor builds a fresh configuration whose interProjectDependencies property is an empty Vector, so coursier no longer knows that util is part of the build and tries to resolve it like an external dependency. To fix this, start from sbt's own csrConfiguration task and only add the extra resolvers on top of it using .withResolvers.
This is what the solution looks like when applied to my question, largely based on this GitHub comment:
import lmcoursier.definitions.Authentication

val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"
val adoMavenHost = "pkgs.dev.azure.com"

val adoMavenRepos = Vector(
  MavenRepository(adoRepoIdWithView, s"https://$adoMavenHost/adoOrganization/adoProject/_packaging/$adoRepoIdWithView/maven/v1")
)

lazy val mainSettings = Seq(
  organization := "org.some",
  csrConfiguration := {
    // start from sbt's own configuration so interProjectDependencies stays intact
    val resolvers = csrResolvers.value ++ adoMavenRepos
    val conf = csrConfiguration.value.withResolvers(resolvers.toVector)
    // reuse the credentials sbt already knows about for the ADO host
    val adoCredentialsOpt = credentials.value.collectFirst {
      case creds: DirectCredentials if creds.host == adoMavenHost => creds
    }
    val newConfOpt = adoCredentialsOpt.map { adoCredentials =>
      val auths =
        resolvers.collect {
          case repo: MavenRepository if repo.root.startsWith(s"https://$adoMavenHost/") =>
            repo.name -> Authentication(adoCredentials.userName, adoCredentials.passwd)
        }
      auths.foldLeft(conf) {
        case (conf, (repoId, auth)) => conf.addRepositoryAuthentication(repoId, auth)
      }
    }
    newConfOpt.getOrElse(conf)
  },
  updateClassifiers / csrConfiguration := csrConfiguration.value
)
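The csrConfiguration above only picks up whatever DirectCredentials sbt already has for the ADO host. As a minimal sketch (not part of the original answer; the realm string is arbitrary here, since only the host is matched), those credentials could be supplied like this:

ThisBuild / credentials += Credentials("ADO", adoMavenHost, adoMavenUsername, adoMavenPassword)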

Related

How can I define my own setting or variable in build.sbt?

I'm trying to define my own setting whose value is calculated from the value of the name setting in build.sbt:
// ...
val projectName_ = "project_name"
val projectName = projectName_.replace("_", "")

lazy val main_class = settingKey[String]("")
main_class := s"ru.company.${projectName}.${name.value}.Main"

lazy val commonSettings = Seq(
  // ...
  Compile / mainClass := Some(main_class.value),
  assembly / mainClass := Some(main_class.value)
  // ...
)

lazy val rollout = taskKey[File](s"rollout_${projectName_}") := {
  // Other uses of main_class.value
}

lazy val root = (project in file("."))
  .aggregate(stg, dm)
  .settings(
    name := "root"
  )

lazy val core = project
  .settings(
    name := "core",
    //...
  )

lazy val stg = project.dependsOn(core)
  .settings(
    name := "stg",
    commonSettings,
    rollout
  )

lazy val dm = project.dependsOn(core)
  .settings(
    name := "dm",
    commonSettings,
    rollout
  )
But I get an error when I try to get the value of my setting:
Some(main_class.value)
Reference to undefined settings
How can I define a variable that uses the name setting and that I can then use in my settings?
When you do
lazy val main_class = settingKey[String]("")
main_class := s"ru.company.${projectName}.${name.value}.Main"
you're defining a settingKey that is visible from any place that can access it, but you're setting its value only for the current project (which is root). For the subprojects the value is undefined, so you have to set it for all projects.
Do something like this:
Global / main_class := s"ru.company.${projectName}.${name.value}.Main"
or
ThisBuild / main_class := s"ru.company.${projectName}.${name.value}.Main"
and main_class.value should no longer complain.
See the differences between Global and ThisBuild here.
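As a rough sketch of how the delegation then plays out (key and package names taken from the question), any subproject that does not define main_class itself falls back to the build-wide value:

lazy val main_class = settingKey[String]("fully qualified main class")

ThisBuild / main_class := s"ru.company.${projectName}.${name.value}.Main"

lazy val stg = project.dependsOn(core)
  .settings(
    name := "stg",
    commonSettings   // Compile / mainClass := Some(main_class.value) now resolves via ThisBuild
  )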

build.sbt - iteration over sub projects for common settings in monorepo

I'm implementing a monorepo using SBT. I would like to iterate over my subprojects in order to initialize them (as they have the same configuration) and prevent code duplication.
In my build.sbt:
lazy val root = (project in file("."))
  .aggregate(projects: _*)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )

lazy val projects = Seq("projectA", "projectB", "projectC")
  .map((projectName: String) =>
    (project in file(projectName))
      .settings(
        name := projectName,
        commonSettings,
        libraryDependencies ++= ModulesDependencies.get(projectName))
      .project
  )
I'm getting the error:
error: project must be directly assigned to a val, such as `val x = project`. Alternatively, you can use `sbt.Project.apply`
Based on the error message, I also tried to use sbt.Project.apply(projectName, file(projectName)).settings(...) instead, but I'm also facing some funny errors.
From what I understand, it seems that SBT expects me to declare each project as lazy val projectA = (project in file("projectA")).settings(...), which works fine, but then I would have to duplicate this code for all my sub projects.
Is the iteration I'm trying to implement even possible?
A utility method might help with some of the duplication, for example:
def createProject(projectName: String) = {
  Project(projectName, file(projectName))
    .settings(
      name := projectName,
      commonSettings,
      libraryDependencies ++= ModulesDependencies.get(projectName)
    )
}

lazy val projectA = createProject("projectA")
lazy val projectB = createProject("projectB")
lazy val projectC = createProject("projectC")

lazy val root = (project in file("."))
  .aggregate(projectA, projectB, projectC)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )
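The directly assigned vals are still required: the error in the question comes from the project macro, and sbt discovers projects by looking at the vals of type Project in the build definition. If the goal is just to avoid repeating the list in aggregate, that part can be shared, as in this sketch (allProjects is a name introduced here for illustration):

lazy val allProjects = Seq(projectA, projectB, projectC)

lazy val root = (project in file("."))
  .aggregate(allProjects.map(p => p: ProjectReference): _*)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )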

sbt multi project undefined settings

I have a multi project setup in SBT. In our build process there's a file in the project that is automatically updated by our CI. It contains the app version.
However, whenever I try to load the app settings, I get an error similar to the following:
[error] References to undefined settings:
[error]
[error] module1/*:appProperties from module1/*:version (/Users/jespeno/workspace/multi-module/build.sbt:10)
[error]
[error] module2/*:appProperties from module2/*:version (/Users/jespeno/workspace/multi-module/build.sbt:10)
This is what my sbt file looks like:
val appProperties = settingKey[Properties]("app version")

appProperties := {
  val prop = new Properties()
  IO.load(prop, new File("version.properties"))
  prop
}

val commonSettings = Seq(
  version := appProperties.value.getProperty("project.version"),
  scalaVersion := "2.11.7"
)

lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .aggregate(module1, module2)
  .settings(
    name := appProperties.value.getProperty("project.name")
  )

lazy val module1 = (project in file("./modules/module1"))
  .settings(commonSettings: _*)
  .settings(
    name := "module1"
  )

lazy val module2 = (project in file("./modules/module2"))
  .settings(commonSettings: _*)
  .settings(
    name := "module2"
  )
Here's my version.properties:
project.name="multi-module"
project.version="0.0.1"
The interesting thing is, the root project is able to load the settings correctly: if I remove the sub-modules, the build starts correctly. I'm using SBT version 0.13.8.
This is caused by appProperties not being visible to the submodules (module1, module2). You can change it to:
appProperties in Global := {
  val prop = new Properties()
  IO.load(prop, new File("version.properties"))
  prop
}
sbt scopes
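In newer sbt versions the same fix in slash syntax would look roughly like this (a sketch; anchoring the file at the build's base directory is an addition of mine, so the path does not depend on the working directory):

Global / appProperties := {
  val prop = new Properties()
  IO.load(prop, (ThisBuild / baseDirectory).value / "version.properties")
  prop
}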

SBT triggering or detecting in a task if any sources have been recompiled

This snippet is wrong:
def bundleTo(dir: String) = Seq(
  mkBundles <<= (bundle, compile in Compile) map { (fl, anal) =>
    val flTarget = baseDirectory / s"app/$dir/${fl.getName}"
    if (!flTarget.exists()) {
      println("target did not exist copying over")
      IO.copyFile(fl, flTarget)
    } else if (anal.compilations.allCompilations.nonEmpty) {
      println("something was recompiled, copying over")
      IO.copyFile(fl, flTarget)
    }
  },
  mkBundles <<= mkBundles.triggeredBy(compile in Compile)
)
Specifically, the anal.compilations.allCompilations.nonEmpty check is wrong. I'd like to copy a plugin bundle into a directory only if something has actually changed, because the copy triggers a bundle reload.
This snippet for sbt 0.13.7 will trigger the inner closure upon source change. There is probably pre-rolled logic for this in the sbt code base. You will probably need invalidation logic for sbt setting key changes and dependency updates.
myTask := {
  val us = (unmanagedSources in Compile).value
  val cd = streams.value.cacheDirectory / "osgi-recompile-cache"
  println("bam")
  val func = FileFunction.cached(cd, FilesInfo.lastModified) { par: Set[File] =>
    println("boom")
    par
  }
  func(us.toSet)
}

myTask <<= myTask.triggeredBy(compile in Compile)
I fleshed out a script to do what I need. Here it is:
import sbt._
import sbt.Keys._
import com.typesafe.sbt.osgi.OsgiKeys._

object OsgiDistUtils {
  lazy val rootDirectory = SettingKey[File]("rootDirectory", "the root of the entire build")
  lazy val distDirectoryName = SettingKey[String]("distDirectoryName", "name for the dist directory")
  lazy val distdirectory = SettingKey[File]("distdirectory", "derived location where the OSGI dist will be constructed")
  lazy val bundleDirectory = SettingKey[File]("bundleDirectory", "location for the bundles")
  lazy val compileBundleAndMove = TaskKey[Unit]("compileBundleAndMove", "make bundles if needed")

  val osgiDistUtildefaults = Seq(
    distDirectoryName := "app",
    distdirectory := rootDirectory.value / distDirectoryName.value,
    compileBundleAndMove := {
      val targetDirectory = bundleDirectory.value
      val moduleName = name.value
      val bundleFile = bundle.value
      val s = streams.value
      val targetFile = targetDirectory / bundleFile.getName

      if (!targetDirectory.exists()) {
        IO.createDirectory(targetDirectory)
      } else if (!targetFile.exists()) {
        s.log.info(s"module $moduleName did not exist in dist, copying over.")
        IO.copyFile(bundleFile, targetFile)
      } else {
        // only copy the bundle again when sources or the managed classpath have changed
        val sources = (unmanagedSources in Compile).value
        val cp = (managedClasspath in Compile).value
        val cd = s.cacheDirectory / "osgi-recompile-cache"
        FileFunction.cached(cd, FilesInfo.lastModified) { sources: Set[File] =>
          s.log.info(s"Recompiling $moduleName as sources or classpath have changed.")
          IO.copyFile(bundleFile, targetFile)
          sources
        } (sources.toSet ++ cp.seq.map(_.data).toSet)
      }
    },
    compileBundleAndMove <<= compileBundleAndMove.triggeredBy(compile in Compile)
  )

  def createModuleGroup(base: File, name: String, aggregatorSettings: Seq[Def.Setting[_]], moduleSettings: Seq[Def.Setting[_]], projectDeps: Array[Project] = Array()) = {
    val moduleRoot = base / name
    // every directory under the group root (except target) becomes a module project
    val modules = for (x <- moduleRoot.listFiles if x.isDirectory && x.getName != "target") yield {
      Project(
        id = name + "-%s".format(x.getName).replace(".", "-"),
        base = x,
        settings = moduleSettings ++ osgiDistUtildefaults ++ Seq(
          bundleDirectory := distdirectory.value / name
        )
      ).dependsOn(projectDeps.map(x => ClasspathDependency(x, Some("compile"))): _*)
    }
    val moduleRefs = modules.map { x =>
      x: ProjectReference
    }
    // one aggregating project on top of the generated module projects
    val aggregationNode = Project(
      id = name,
      base = moduleRoot,
      settings = aggregatorSettings
    ).aggregate(moduleRefs: _*)

    (aggregationNode, modules)
  }
}

How to "seq" plugin settings in a multi-project sbt build

I am converting a single-project build.sbt to a multi-project build.sbt, which is always a PITA. There is this obscure syntax to make plugin settings available. E.g. before
seq(appbundle.settings: _*)
How do I do this with sub-projects? E.g.:
lazy val views = Project(
  id = "views",
  base = file("views"),
  dependencies = Seq(core),
  settings = commonSettings ++ Seq(
    seq(appbundle.settings: _*), // ???
    name := "views",
    description := ...
  )
)
This just gives me an error
found : Seq[sbt.Def.SettingsDefinition]
required: Seq[sbt.Def.Setting[_]]
settings = commonSettings ++ Seq(
^
Add them using ++ to the overall settings:
lazy val views = Project(
  id = "views",
  base = file("views"),
  dependencies = Seq(core),
  settings = commonSettings ++ appbundle.settings ++ Seq(
    name := "views",
    description := ...
  )
)
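With the newer build.sbt project syntax the same thing would look roughly like this (a sketch; core, commonSettings and appbundle.settings are assumed from the question, and the elided description is left out):

lazy val views = (project in file("views"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(appbundle.settings: _*)  // plugin settings are added the same way, via settings(...)
  .settings(
    name := "views"
  )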