I have a build.sbt for a couple of packages in the same project.
When I do sbt compile, my protobuf files are under [packagename]/src/main/protobuf/*,
so sbt won't scan these files, since they have that additional package name in the path.
How do I change the default protobuf file path?
It is unclear from your question whether your sbt build is a single-project or a multi-project build. You can tell that it is a multi-project build if you see lines like val someSubProject = project.in(file(...)).
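For reference, the sub-project definitions in a multi-project build.sbt typically look something like this (the project names here are purely illustrative):
lazy val core = project.in(file("core"))
lazy val api  = project.in(file("api")).dependsOn(core)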
If your build.sbt is a single project build, you can add a line to customize where protos are being scanned:
Compile / PB.protoSources += file("pkgname/src/main/protobuf")
If you have a multi-project build with a project proj1, then it should already work in the way you expect:
val proj1 = project.in(file("proj1"))
  .settings(
    PB.targets in Compile := Seq(
      scalapb.gen(javaConversions = true) -> (sourceManaged in Compile).value / "scalapb"
    ),
    libraryDependencies ++= Seq(
      "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"
    )
  )
then sbt-protoc will generate sources for the protos under proj1/src/main/protobuf. The sources will get compiled as part of the proj1 project.
You can customize the path it looks in by setting Compile / PB.protoSources within that project's settings. For example, if you want it to generate sources for protos that live in another top-level directory, you can do:
val proj1 = project.in(file("proj1"))
  .settings(
    PB.targets in Compile := Seq(
      scalapb.gen(javaConversions = true) -> (sourceManaged in Compile).value / "scalapb"
    ),
    Compile / PB.protoSources :=
      Seq((ThisBuild / baseDirectory).value / "somewhere" / "protos"),
    libraryDependencies ++= Seq(
      "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"
    )
  )
I recommend using AkkaGrpcPlugin and a multi-project SBT configuration.
Put the .proto files in a separate project and make the others depend on it. This project just has a single directory <root>/grpc/src/main/protobuf containing the .proto files. When this project is compiled it will create all the stub files which can be picked up by other projects that depend on it.
Here is an outline build.sbt file:
lazy val top_level =
  (project in file("."))
    .aggregate(grpc, main)

lazy val grpc =
  project
    .in(file("grpc"))
    .settings(
      ???
    )
    .enablePlugins(AkkaGrpcPlugin)

lazy val main =
  project
    .in(file("main"))
    .settings(
      ???
    )
    .dependsOn(grpc)
And add this to plugins.sbt:
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.1.1")
I defined a multi-project SBT build. I declared all dependencies in my root project. When I run sbt assembly from the root directory everything is okay.
How can I run sbt assembly from a subproject directory? When I try to do this, SBT can't find the dependencies that are declared in the root build.sbt file.
For example, I do something like this in the root build.sbt:
ThisBuild / organization := "org.example"
ThisBuild / version := "1.0"
ThisBuild / scalaVersion := "2.11.12"
...

lazy val commonSettings = Seq(
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-hive" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion
    // other dependencies
  ).map(_ % Provided) ++ Seq(
    "org.postgresql" % "postgresql" % "42.2.24"
    // other dependencies
  )
)

lazy val root = (project in file("."))
  .aggregate(subproject)
  .settings(
    name := "root"
  )

lazy val subproject = (project in file("subproject"))
  .settings(
    commonSettings,
    name := "subproject"
    // ...other settings
  )

val allProjects = ScopeFilter(
  inProjects(
    subproject
  )
)
build.sbt from the subproject directory:
assembly / mainClass := Some("org.example.Main")
//other settings
When I run sbt assembly from the root directory, everything is okay.
When I run it from the subproject directory I get errors like this:
object apache is not a member of package org
import org.apache.spark.sql.expressions.UserDefinedFunction
Is it possible to build jar files from the subproject directories?
You have to build from the directory in which the main (root) project is defined.
However, you don't always have to build everything. You can simply do:
# in project root directory
sbt "subproject / assembly"
so there isn't even an issue.
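If you prefer working interactively, you can also switch to the subproject inside the sbt shell and run assembly from there (standard sbt shell commands, started from the build root):
sbt
> project subproject
> assembly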
I have a set of .proto files (protobuf) from which I generate Java using ScalaPB. In the same sbt build I also have 2 sub-projects: one is scalaVersion 2.11 compatible (it can't be upgraded to 2.12 due to missing packages) and the other one is Scala 2.12.
I created a sub-project to hold my protos. By default 2.12 is used, so my 2.12 sub-project can use it, but my 2.11 one can't.
I set crossScalaVersions to 2.11/2.12 and compiled the project with both, which passed, but even then I was unable to get the 2.11 sub-project to find that code.
I am wondering whether this is supported at all, or whether there is some way to keep a single location for my .proto files and still have both sub-projects in the same sbt build use them.
lazy val scala212 = "2.12.13"
lazy val scala211 = "2.11.12"
lazy val supportedScalaVersions = List(scala212, scala211)

ThisBuild / scalaVersion := scala212

lazy val root = (project in file("."))
  .aggregate(proto, subproject1, subproject2)
  .settings(
    crossScalaVersions := Nil,
    publish / skip := true
  )

lazy val proto = project
  .settings(
    crossScalaVersions := supportedScalaVersions,
    name := "proto",
    libraryDependencies += "com.trueaccord.scalapb" %% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf",
    PB.targets in Compile := Seq(
      scalapb.gen(grpc = false) -> (sourceManaged in Compile).value / "protobuf"
    )
  )

lazy val subproject1 = project
  .dependsOn(proto)

lazy val subproject2 = project
  .settings(
    scalaVersion := scala211
  )
  .dependsOn(proto)
So, from the above: if I do sbt "+proto" I can compile both versions. If I do sbt subproject1/compile it works too. But sbt subproject2/compile fails, indicating that it cannot find the 2.11 proto jar file.
I would like either the above to somehow work nicely, or any other trick that lets me generate the code from the same proto location but use it from both subproject1 and subproject2.
You could try the sbt-projectmatrix plugin:
https://github.com/sbt/sbt-projectmatrix
The idea is to have separate sbt subprojects for the different Scala versions, so you can simply reference the relevant subproject when calling dependsOn.
I think this plugin is going to end up in sbt some day as it's a much better solution in general than the current built-in stateful cross compilation support, and it's developed by Eugene Yokota, who is also an sbt developer.
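As a rough sketch of what that could look like for the proto project above (the plugin version and the .jvm(...) accessor are taken from the plugin's README and are assumptions here; check the README for the version you use):
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-projectmatrix" % "0.9.0")

// build.sbt
lazy val proto = (projectMatrix in file("proto"))
  .settings(
    name := "proto"
    // ScalaPB settings as in the question
  )
  .jvmPlatform(scalaVersions = Seq(scala212, scala211))

// Each sub-project depends on the proto row built for its own Scala version
lazy val subproject1 = project
  .dependsOn(proto.jvm(scala212))

lazy val subproject2 = project
  .settings(scalaVersion := scala211)
  .dependsOn(proto.jvm(scala211))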
I have many dependent sbt projects in one folder. They all have the same values in their Build.sbt, for example dependencies.
I want to move the shared values from all the sbt files into a separate file, but I don't want to use a multi-project build. I just need to include some other sbt file from the parent directory.
For example my directory structure can look like this:
MyRepository
|- Dependencies.sbt
|- MyProject1
|- src
|- Build.sbt
|- MyProject2
|- src
|- Build.sbt
In that example, how can I include Dependencies.sbt in Build.sbt?
Code is reused between .sbt files by creating a normal .scala file in project/. The code in project/ will be available for use in the .sbt files.
If I remember correctly, the definitions in one .sbt file are not visible to other .sbt files, at least in older versions.
Basically, the solution is to use a Dependencies.scala rather than a Dependencies.sbt, and define the common parts there.
Here is an illustration of how that can look.
You create project/Dependencies.scala to track dependencies in one place.
import sbt._

object Dependencies {
  // Versions
  lazy val akkaVersion = "2.3.8"

  // Libraries
  val akkaActor   = "com.typesafe.akka" %% "akka-actor"   % akkaVersion
  val akkaCluster = "com.typesafe.akka" %% "akka-cluster" % akkaVersion
  val specs2core  = "org.specs2"        %% "specs2-core"  % "2.4.17"

  // Projects
  val backendDeps = Seq(akkaActor, specs2core % Test)
}
The Dependencies object will then be available in build.sbt; you just need to import Dependencies._ in your build.sbt file:
import Dependencies._

ThisBuild / organization := "com.example"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.8"

lazy val backend = (project in file("backend"))
  .settings(
    name := "backend",
    libraryDependencies ++= backendDeps
  )
I've been tasked with rewriting an old ant build script to SBT. As it happens, our suite is built up of 3 modules:
A Play 2.3 front-end webserver;
A back-end for retrieving data from various other systems;
A middle module containing some shared classes for database access and business logic.
Below is an excerpt of my Build.scala file:
val sharedSettings = Seq(
  organization := <organization here>,
  version := "1.2.5",
  scalaVersion := "2.11.1",
  libraryDependencies ++= libraries,
  unmanagedJars in Compile ++= baseDirectory.value / "lib",
  unmanagedJars in Compile ++= baseDirectory.value / "src",
  unmanagedJars in Compile ++= baseDirectory.value / "test"
)

lazy val middle = project.settings(sharedSettings: _*)
lazy val back = project.settings(sharedSettings: _*).dependsOn(middle)
lazy val front =
  project
    .enablePlugins(play.PlayScala)
    .settings(sharedSettings: _*)
    .settings(scalaSource in Compile := baseDirectory.value / "app")
    .settings(
      routesImport ++= Seq(
        "scala.language.reflectiveCalls", // Removes warnings when using multiple routes files
        "com.asml.cerberus.front.toolbox.Binders._"
      )
    )
    .dependsOn(middle % "compile->compile;test->test")
I've got my application.conf in the ./front/conf/ directory. Unfortunately, if I now run sbt, it looks for a ./conf/application.conf file. (I've tested this by moving the conf directory.)
Is there any way I can tell SBT/Play to use the front module's conf directory instead?
In case it helps, we have a wrapper script around activator (sbt) called activatorWrapper:
#!/bin/bash
activator -Dconfig.file=front/conf/application.conf
Then you can start your application with:
$ ./activatorWrapper
You can use system properties to specify an alternative config file; see the Play documentation on configuration for the details.
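For example, when launching sbt directly (assuming the config lives at front/conf/application.conf, as in the question):
sbt -Dconfig.file=front/conf/application.conf run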
Here's an example build.sbt:
import AssemblyKeys._
assemblySettings
buildInfoSettings
net.virtualvoid.sbt.graph.Plugin.graphSettings
name := "scala-app-template"
version := "0.1"
scalaVersion := "2.9.3"
val FunnyRuntime = config("funnyruntime") extend(Compile)
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"
sourceGenerators in Compile <+= buildInfo
buildInfoPackage := "com.psnively"
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)
assembleArtifact in packageScala := false
val root = project.in(file(".")).
  configs(FunnyRuntime).
  settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
    libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
  )): _*)
The goal is to have spark-core "provided" so it and its dependencies are not included in the assembly artifact, but to reinclude them on the runtime classpath for the run- and test-related tasks.
It seems that using a custom scope will ultimately be helpful, but I'm stymied on how to actually cause the default/global run/test tasks to use the custom libraryDependencies and hopefully override the default. I've tried things including:
(run in Global) := (run in FunnyRuntime)
and the like to no avail.
To summarize: this feels essentially a generalization of the web case, where the servlet-api is in "provided" scope, and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.
For a similar case I used in assembly.sbt:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
Update:
#rob's solution seems to be the only one that works on the latest SBT version; just add this to the settings in build.sbt:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
Adding to #douglaz' answer,
runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
is the corresponding fix for the runMain task.
Another option is to create separate sbt projects for assembly vs run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run for running directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project in order to e.g. specify the spark master in code when running with sbt.
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion

val commonSettings = Seq(
  name := "Project",
  libraryDependencies ++= Seq(...) // Common deps
)

// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
  .settings(
    commonSettings,
    assembly / mainClass := Some("com.example.Main"),
    libraryDependencies += sparkDep % "provided"
  )

// Project for running via sbt with embedded Spark
lazy val runTestProj = (project in file("proj-dir"))
  .settings(
    // Projects' target dirs can't overlap
    target := target.value.toPath.resolveSibling("target-runtest").toFile,
    commonSettings,
    // If a separate main file is needed, e.g. for specifying the Spark master in code
    Compile / run / mainClass := Some("com.example.RunMain"),
    libraryDependencies += sparkDep
  )
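Usage then looks like this (with the project names defined above):
# build the fat jar for spark-submit
sbt assemblyProj/assembly
# run directly via sbt with Spark on the classpath
sbt runTestProj/run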
If you use the sbt-revolver plugin, here is a solution for its "reStart" task:
fullClasspath in Revolver.reStart <<= fullClasspath in Compile
UPD: for sbt-1.0 you may use the new assignment form:
fullClasspath in Revolver.reStart := (fullClasspath in Compile).value