Scala build.sbt: Solving dependency recursions

I just started with sbt's multi-project builds and ran into an interesting problem for which I have not seen a good example in the scala-sbt docs.
In my build.sbt, projects B and C depend on A, but B also depends on C (at least, B's test scope needs C's classes):
(Common refers to an object in root/project/Common.scala.)
root/build.sbt:
lazy val prjA: Project = project.in(file("Project-A")).
  settings(
    name := "Project-A",
    version := Common.prjVersion,
    scalaVersion := Common.scalaVersion,
    libraryDependencies ++= Common.Imports.compileDependencies,
    libraryDependencies ++= Common.Imports.testDependencies
  )

lazy val prjB: Project = project.in(file("Project-B")).
  settings(
    name := "Project-B",
    version := Common.prjVersion,
    scalaVersion := Common.scalaVersion,
    libraryDependencies ++= Common.Imports.compileDependencies,
    libraryDependencies ++= Common.Imports.testDependencies
  ).dependsOn(prjA)//.dependsOn(prjC % "test->compile")

lazy val prjC: Project = project.in(file("Project-C")).
  settings(
    name := "Project-C",
    version := Common.prjVersion,
    scalaVersion := Common.scalaVersion,
    libraryDependencies ++= Common.Imports.compileDependencies,
    libraryDependencies ++= Common.Imports.testDependencies
  ).dependsOn(prjA).dependsOn(prjB)
This build.sbt, as written here, runs successfully (via sbt clean update compile), but of course I cannot run the test cases in prjB. Once I add .dependsOn(prjC % "test->compile") to prjB in my build.sbt, the output is a StackOverflowError. This makes perfect sense to me, as the cross-dependency between prjB and prjC cannot be resolved.
However, is there a practical way to solve this endless recursion? I am thinking about one more step in the build process (steps 1 and 2 are already done by the current build.sbt, as you can see), but I don't know how to do that:
1. First compile prjB with the prjA dependency,
2. then compile prjC with the prjA and prjB dependencies,
3. and finally include the built prjC classes in prjB for testing purposes. <- Is this a valid approach?
Best regards and thanks in advance!

This isn't really an SBT problem, but a module dependency problem.
Ignoring A for a moment, since B needs C to compile and C needs B to compile, this cycle cannot be resolved by any build system.
The only way to solve this is to change the structure of the modules themselves. For example, if possible, you could create a D project that contains the common classes and have them both rely on it. Or, use the Dependency Inversion Principle.
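For illustration, here is a minimal sketch of that restructuring, assuming the classes that prjB's tests need from prjC can be moved into a new prjD (the project names and the Common object are carried over from the question):

// root/build.sbt - hypothetical: prjD holds the classes that B and C currently share
lazy val prjD: Project = project.in(file("Project-D"))
  .settings(
    name := "Project-D",
    version := Common.prjVersion,
    scalaVersion := Common.scalaVersion
  )
  .dependsOn(prjA)

lazy val prjB: Project = project.in(file("Project-B"))
  .settings(name := "Project-B" /* ...common settings as before... */)
  .dependsOn(prjA, prjD % "test->compile")  // no dependency on prjC any more

lazy val prjC: Project = project.in(file("Project-C"))
  .settings(name := "Project-C" /* ...common settings as before... */)
  .dependsOn(prjA, prjB, prjD)              // the B -> C edge is gone, so there is no cycle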

Related

Can I create a proto jar for scalaVersion 2.11/2.12 and use it within the same sbt build under different sub-projects?

I have a set of .proto files (protobuf) from which I generate Java using scalapb. In the same sbt build I also have 2 sub-projects: one is scalaVersion 2.11 compatible (I can't upgrade it to 2.12 due to missing packages) and the other one is Scala 2.12.
I created a sub-project to hold my proto files; by default 2.12 is used, and my 2.12 sub-project can use it, but my 2.11 one can't.
I set crossScalaVersions to 2.11/2.12 and compiled my project with both, which passed, but even then I was unable to get the 2.11 sub-project to find that code.
I am wondering if that is something supported, or if there is a way to keep my .proto files in a single location yet have my 2 sub-projects in the same sbt file use them.
lazy val scala212 = "2.12.13"
lazy val scala211 = "2.11.12"
lazy val supportedScalaVersion = List(scala212, scala211)
ThisBuild / scalaVersion := scala212
lazy val root = (project in file("."))
.aggregate(proto, subproject1, subproject2)
.settigns(
crossScalaVersions := Nil,
publish / skip := true
)
lazy val proto = project
.settings(
crossScalaVersions := supportedScalaVersions,
name := "proto",
libraryDependencies += "com.trueaccord.scalapb" %% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf",
PB.targets in Compile := Seq(
scalapb.gen(grpc = false) -> (sourceManaged in Compile).value / "protobuf"
)
)
lazy val subproject1 = project
.dependsOn(proto)
lazy val subproject2 = project
.settings(
scalaVersion := scala211
)
.dependsOn(proto)
So, from the above, if I run sbt "+ proto" I can compile both versions. If I run sbt subproject1/compile it works too. Running sbt subproject2/compile fails, indicating that it cannot find the 2.11 proto jar file.
Either I would like the above to somehow work nicely, or any other trick that would let me generate the code from the same proto location but within subproject1/subproject2 would be appreciated.
You could try the sbt-projectmatrix plugin:
https://github.com/sbt/sbt-projectmatrix
The idea is to have separate sbt subprojects for the different Scala versions, so you can simply reference the relevant subproject when calling dependsOn.
I think this plugin is going to end up in sbt some day as it's a much better solution in general than the current built-in stateful cross compilation support, and it's developed by Eugene Yokota, who is also an sbt developer.
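As a rough sketch of what that could look like for the build above (assuming the plugin's projectMatrix and jvmPlatform API; the plugin version below is an assumption, so check the plugin's README for a current one):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-projectmatrix" % "0.9.0")  // version is an assumption

// build.sbt - one generated proto subproject per Scala version
lazy val scala212 = "2.12.13"
lazy val scala211 = "2.11.12"

lazy val proto = (projectMatrix in file("proto"))
  .settings(
    name := "proto"
    // the scalapb settings from the question (PB.targets etc.) would move in here
  )
  .jvmPlatform(scalaVersions = Seq(scala212, scala211))

lazy val subproject1 = project
  .settings(scalaVersion := scala212)
  .dependsOn(proto.jvm(scala212))  // reference the 2.12 row of the matrix

lazy val subproject2 = project
  .settings(scalaVersion := scala211)
  .dependsOn(proto.jvm(scala211))  // reference the 2.11 row of the matrix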

Spark-Scala build.sbt libraryDependencies UnresolvedDependency

I'm trying to import a dependency in my build.sbt file from here:
https://github.com/dmarcous/spark-betweenness
When I hover over the error it says:
Expression type ModuleID must conform to Def.SettingsDefinition in SBT file
Unresolved Dependency
I am new to Scala, so my question may be silly. Thanks in advance.
It is still unclear what your build configuration looks like, but the following build.sbt works (in the sense that it compiles and does not show the error that you mentioned):
name := "test-sbt"
organization := "whatever"
version := "1.0.0"
scalaVersion := "2.10.7"
libraryDependencies += "com.centrality" %% "spark-betweenness" % "1.0.0"
Alternatively, if you have a multi-project build, it could look like this:
lazy val root = project
  .settings(
    name := "test-sbt",
    organization := "whatever",
    version := "1.0.0",
    scalaVersion := "2.10.7",
    libraryDependencies += "com.centrality" %% "spark-betweenness" % "1.0.0"
  )
However, you're probably going to find that it still does not work, because the dependency cannot be resolved. Indeed, this library does not seem to be available in either Maven Central or JCenter. It is also very old - it appears to only be published for Scala 2.10 and a very old Spark version (1.5), so most likely you won't be able to use it with recent Spark environments (2.x and Scala 2.11).

Explanation of SBT build file

Question
Is a .sbt file written in Scala or in an sbt-proprietary language? Please help me decipher the sbt build definition below.
lazy val root =            <--- An instance of the sbt Project object? Why "lazy"? Is "root" a reserved keyword for sbt to identify the project in the build.sbt?
  (project in file("."))   <--- Is this defining a Project object for the current directory, which has the sbt-expected project structure?
  .settings(               <--- Is this a Scala call of def settings on the Project object?
    name := "NQueen",
    version := "1.0",
    scalaVersion := "2.11.8",
    mainClass in Compile := Some("NQueen")
  )

libraryDependencies ++= Seq(   <--- Is libraryDependencies a reserved keyword of type scala.collection.Seq, which sbt uses to download dependencies and package them as part of the output jar?
  "org.apache.spark" %% "spark-core" % "2.3.0",   <--- Concatenating and creating the full library name including the version? I suppose I need to look into the respective documentation to find out what to specify.
  "org.apache.spark" %% "spark-mllib" % "2.3.0"
)

// <--- Please explain what this block does and where I can find the explanation.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
Resources
Please suggest good resources to understand the design, the mechanism, and how .sbt works. I looked into the sbt Getting Started guide and other documents, but like the Scala definition itself, they are difficult to understand. With make, Ant, or Maven, how things fit together and the design/mechanism are much clearer; I need to find good documentation or tutorials for sbt.
References
I looked into the references below trying to understand.
SBT: How to get started using the Build.scala file (instead of build.sbt)
What is the difference between build.sbt and build.scala?
SBT - Build definition
SBT Project object
scala.collection.Seq
SBT Library dependencies
Spark 2.3 Quick Start
sbt can be really difficult for first-time users, and it's ok not to fully understand all of the definitions. It will become clearer over time.
Let me first simplify your build.sbt. It contains some unnecessary parts and will be easier to explain without them:
name := "NQueen"
version := "1.0"
scalaVersion := "2.11.8"
mainClass in Compile := Some("NQueen")
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.3.0",
"org.apache.spark" %% "spark-mllib" % "2.3.0"
)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs # _*) => MergeStrategy.discard
case x => MergeStrategy.firs
}
And for your questions:
Is a .sbt file written in Scala or in an sbt-proprietary language?
Well, it's both. You can do most Scala operations in an .sbt file: import and use external dependencies, write custom code, etc. But some things you can't do (define classes, for example).
It might also look like a dedicated language, but in reality it's just a DSL written in Scala (:=, in, %%, and % are all functions written in Scala).
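To make that concrete, here is a small sketch using the name key from the build above: the infix setting syntax is ordinary Scala method application, so the infix form and the explicit method-call form mean the same thing.

// ":=" is a method defined on the setting key, so these two lines are equivalent
name := "NQueen"
name.:=("NQueen")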
Is libraryDependencies a reserved keyword of type scala.collection.Seq, which sbt uses to download dependencies and package them as part of the output jar?
libraryDependencies is not a reserved keyword; you can think of it as a way to configure your project.
By writing libraryDependencies := Seq(..) you are basically setting the value of libraryDependencies.
But you are right about the meaning: it is a list of dependencies that should be downloaded.
Concatenating and creating the full library name including the version? I suppose I need to look into the respective documentation to find out what to specify.
Keep in mind that %% and % are functions. You use those functions to specify which modules should be downloaded and added to the classpath.
You can find many dependencies (and their versions) on mvnrepository.
For example, for Spark: https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11/2.3.0
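To make %% concrete, here is a small sketch using the spark-core coordinates from the build above: with scalaVersion := "2.11.8", %% appends the Scala binary-version suffix to the artifact name, so these two declarations resolve to the same artifact.

// equivalent ways to declare the same dependency when scalaVersion is 2.11.x
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.0"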
Please explain what this block does and where I can find the explanation.
assemblyMergeStrategy is a setting coming from the sbt-assembly plugin.
That plugin allows you to package your application into a single jar together with all its dependencies.
You can read about merge strategies here: https://github.com/sbt/sbt-assembly#merge-strategy
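As a sketch of a common variant (following the pattern from the sbt-assembly README rather than anything in the question), you can discard META-INF entries and fall back to the plugin's default strategy for everything else instead of forcing MergeStrategy.first:

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x =>
    // delegate anything else to sbt-assembly's default merge strategy
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}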

SBT: How to set common scala version for multiproject

I have a multi-project sbt build in IntelliJ IDEA. My sbt file in the root dir looks like this:
name := "PlayRoot"
version := "1.0"
lazy val shapeless_learn = project.in(file("shapeless_learn")).dependsOn(common)
lazy val scalaz_learn = project.in(file("scalaz_learn")).dependsOn(common)
lazy val common = project.in(file("common"))
lazy val root = project.in(file(".")).aggregate(common, shapeless_learn, scalaz_learn)
scalaVersion := "2.11.7"
Then I have folders for each of the projects: ./common, ./shapeless_learn, ./scalaz_learn, and each has its own build.sbt. But for some reason I have to put the line scalaVersion := "2.11.7" in each of the subproject build.sbt files.
If I forget to do that, the build fails with the message:
Error:Unresolved dependencies: common#common_2.10;0.1-SNAPSHOT: not found
See complete log in ...
For some reason, if I do not specify that my Scala version is 2.11.7, sbt falls back to 2.10 and tries to find a common project built for 2.10, which I do not have.
I keep forgetting to add scalaVersion := "2.11.7" to newly created projects, and it keeps bugging me. I would also prefer sbt to generate build.sbt with some default data, but instead I have to remember to create it manually.
Is there any way to set a single Scala version for all projects and sub-projects in one place? I figured I could add a separate lazy val commonSettings = Seq(scalaVersion := "2.11.7") in the root definition and append .settings(commonSettings) to each lazy val project definition. This is nice, but it still doesn't look beautiful enough - I have to do this for every project definition. Is there a better way?
Is there any way to create a template for newly created projects, so that when I just add the line lazy val newProject = ..., an appropriate build.sbt file with the contents I want is generated there?
Use
scalaVersion in ThisBuild := "2.11.7"
in the root build.sbt.
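Applied to the build from the question, a minimal sketch of the root build.sbt would then be (the subprojects no longer need their own scalaVersion line):

name := "PlayRoot"
version := "1.0"

scalaVersion in ThisBuild := "2.11.7"  // inherited by common, shapeless_learn and scalaz_learn

lazy val shapeless_learn = project.in(file("shapeless_learn")).dependsOn(common)
lazy val scalaz_learn = project.in(file("scalaz_learn")).dependsOn(common)
lazy val common = project.in(file("common"))
lazy val root = project.in(file(".")).aggregate(common, shapeless_learn, scalaz_learn)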

hot swap in sbt project without play-plugin

When I am using the Play framework, every time I change the code it takes effect automatically through recompilation.
However, when I'm using sbt to run a project without the Play plugin, changes don't take effect.
I'm wondering if there is a way to make an sbt project hot swap the changed code.
My build.sbt is as below:
version in ThisBuild := "1.0-SNAPSHOT"
scalaVersion in ThisBuild := "2.11.6"
lazy val `frontend` = (project in file("frontend")).
enablePlugins(PlayScala).
enablePlugins(DockerPlugin).
settings(
name := "frontend",
libraryDependencies ++= Dependencies.frontend
).dependsOn(`api`).aggregate(`api`)
lazy val `backend` = (project in file("backend")).
enablePlugins(JavaAppPackaging).
enablePlugins(DockerPlugin).
settings(
name := "backend",
libraryDependencies ++= Dependencies.backend ++ Seq(cache, ws)
).dependsOn(`api`).aggregate(`api`)
lazy val `api` = (project in file("api")).
settings(
name := "api",
libraryDependencies += ws
)
And what I have configured in IntelliJ IDEA as an sbt task is shown below (I can't post images for now):
"project backend" ~run
However, every time I change the code in backend, it doesn't take effect when I call backend from the frontend.
I'm wondering how I can solve this problem. Thanks for your help.
You can have sbt automatically recompile any changes by invoking it like this:
sbt ~compile
If you use ~run, on every change the changed classes will be compiled and the project rerun.
If that does not work, you might explain more about your project and its structure.
Open two sbt windows.
In one, run ~compile; in the other, run ~run.
Hope it helps.