I have two projects that use Scala.js, where the second project needs access to the sources of the first. I define the build.sbt of my first project roughly as follows:
val commonSettings = Seq(
  name := "project1",
  unmanagedSourceDirectories in Compile +=
    baseDirectory.value / ".." / "shared" / "src" / "main" / "scala"
)

val project1JS = project.in(file("js"))
  .settings(commonSettings: _*)
  .enablePlugins(ScalaJSPlugin)

val project1JVM = project.in(file("jvm"))
  .settings(commonSettings: _*)
In order for project 2 to have access to the sources of project 1, I define its build.sbt as follows:
val commonSettings = Seq(
  name := "project2",
  unmanagedSourceDirectories in Compile +=
    baseDirectory.value / ".." / "shared" / "src" / "main" / "scala"
)

val project2JS = project.in(file("js"))
  .settings(commonSettings: _*)
  .configure(_.dependsOn(ProjectRef(uri("../project1"), "project1JS")))
  .enablePlugins(ScalaJSPlugin)

val project2JVM = project.in(file("jvm"))
  .settings(commonSettings: _*)
  .configure(_.dependsOn(ProjectRef(uri("../project1"), "project1JVM")))
My second project compiles fine, but this raises a problem: I now have project1's classpath on project2's classpath, which can cause conflicts. For example, I get warnings that more than one logback.xml file has been found, and the wrong application.conf can also be picked up.
Is there a better way of depending on the sources for project1?
The best alternative I could come up with was to add an unmanagedSourceDirectories setting to my second project that points directly at project1's shared sources, for example:
val project2JVM = project.in(file("jvm"))
  .settings(commonSettings: _*)
  .settings(
    unmanagedSourceDirectories in Compile +=
      baseDirectory.value / ".." / ".." / "project1" / "shared" / "src" / "main" / "scala"
  )
Project2 now has access to the shared sources of project1 without having project1's resources on its classpath. The JS side can be set up the same way, as sketched below.
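For completeness, here is a minimal sketch of the matching JS side, assuming the same directory layout as above (this mirrors the JVM setting and is not from the original post):

val project2JS = project.in(file("js"))
  .settings(commonSettings: _*)
  .settings(
    // point at project1's shared sources directly instead of depending on project1JS
    unmanagedSourceDirectories in Compile +=
      baseDirectory.value / ".." / ".." / "project1" / "shared" / "src" / "main" / "scala"
  )
  .enablePlugins(ScalaJSPlugin)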
I have a Scala / sbt multi-module project where one module uses code generated from an OpenAPI specification. The module also has source code in src/main/scala.
What I need to do is this:
reference company templates as dependency
unpack company templates to a directory inside target
generate openapi code using said templates
compile generated and non-generated code together as a Play application
After much debugging, I got sbt to do all of that in the right order on sbt compile. However, while compilation succeeds, I then get an error:
[error] (rest / Compile / compile) sbt.internal.util.Init$RuntimeUndefined: References to undefined settings at runtime.
[error] ScopedKey(Scope(Select(ProjectRef(file:/path/to/my/project/,rest)), Select(ConfigKey(compile)), Zero, Zero),compile) referenced from setting(ScopedKey(Scope(Select(ProjectRef(file:/path/to/my/project/,rest)), Select(ConfigKey(compile)), Zero, Zero),compile)) at LinePosition(/path/to/my/project/build.sbt,33)
What can I do about that error?
Here's my build.sbt:
lazy val `rest` = Project(id = "rest", base = file("."))
  .enablePlugins(OpenApiGeneratorPlugin, PlayScala, UnpackPlugin)
  .settings(
    libraryDependencies ++= Seq(
      "com.mycompany" % "mycompany-templates" % "0.0.3"
      // other dependencies omitted
    ),
    dependencyFilter := { (file: File) => file.getName.startsWith("mycompany-templates") },
    dependenciesJarDirectory := (ThisProject / target).value,
    openApiInputSpec := ((ThisProject / baseDirectory).value / "../api/src/main/resources/openapi.yaml").absolutePath,
    openApiConfigFile := (baseDirectory.value / "codegen-config.yaml").absolutePath,
    openApiGeneratorName := "scala-play-server",
    openApiOutputDir := ((ThisProject / baseDirectory).value / "target" / "gen-src" / "openapi").absolutePath,
    openApiTemplateDir := ((ThisProject / baseDirectory).value / "target" / "templates" / "domino-play-server").absolutePath,
    Compile / sourceGenerators += Def.task {
      val generatedFiles = openApiGenerate.value
      generatedFiles.filter(f => f.getName.endsWith(".scala"))
    }.taskValue,
    // this is the line in the error message:
    Compile / compile := Def.sequential(unpackJars, openApiGenerate, Compile / compile).value,
    Compile / managedSourceDirectories += (ThisProject / target).value / "gen-src" / "openapi" / "app",
    Compile / managedResourceDirectories += (ThisProject / target).value / "gen-src" / "openapi" / "conf",
    Compile / unmanagedSourceDirectories += (ThisProject / baseDirectory).value / "src" / "main" / "scala"
  )
Solved. The problem is that Def.sequential is a dynamic task, so the reference to Compile / compile inside it is only resolved at runtime, where it points at the definition that is currently being redefined. Expressing the prerequisite tasks as a static dependsOn lets sbt wire them against the previous definition of compile instead. The Compile / compile definition needs to look like this:
Compile / compile :=
((Compile / compile) dependsOn Def.sequential(unpackJars, openApiGenerate)).value
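A related sketch, not from the original post: since openApiGenerate already runs through Compile / sourceGenerators above, an alternative (assuming unpackJars and openApiGenerate are the task keys shown in this build) is to leave Compile / compile alone and make the generator itself depend on the unpack step:

// Sketch: run unpackJars before code generation; compile then picks up the
// generated sources via the Compile / sourceGenerators hook defined earlier.
openApiGenerate := (openApiGenerate dependsOn unpackJars).value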
sbt-native-packager can make a zip file with all dependencies and a script to run the application:
$ sbt universal:packageBin
I have a Scala web application using a cross-build (appJS for the front-end and appJVM for the back-end).
How do I run this packager for the appJVM?
I've tried the following, but it does not accept the command:
$ sbt appJVM/universal:packageBin
Here is the build.sbt, based on https://www.scala-js.org/doc/project/cross-build.html:
...
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := "0.1-SNAPSHOT"
  ).
  jvmSettings(
    // Add JVM-specific settings here
  ).
  jsSettings(
    // Add JS-specific settings here
  )

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
How do I run this packager for the appJVM?
And how do I include the file generated by sbt appJS/fullOptJS, and some other static files?
Update with Ivan's response
build.sbt:
import sbtcrossproject.CrossPlugin.autoImport.{crossProject, CrossType}

val sharedSettings = Seq(
  scalaVersion := "2.12.8"
)

lazy val app =
  crossProject(JSPlatform, JVMPlatform)
    .in(file("."))
    .settings(sharedSettings)
    .jsSettings(
    )
    .jvmSettings(
      libraryDependencies ++= Seq(
        "com.typesafe.akka" %% "akka-http" % "10.1.9"
      )
    )

lazy val backend = project
  .enablePlugins(UniversalPlugin)
  .enablePlugins(JavaAppPackaging)
  .dependsOn(app.jvm)
  .settings(
    mainClass in Compile := Some("com.example.EchoServer")
  )

lazy val frontend = project
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(app.js)

backend
  .settings(
    Seq(
      resourceGenerators in Compile += Def.task {
        Seq(
          (fullOptJS in Compile in frontend).value,
          (fastOptJS in Compile in frontend).value
        ).map { js =>
          val resource = (resourceManaged in Compile).value / "public" / "assets" / js.data.name
          IO.write(resource, IO.read(js.data))
          resource
        }
      }.taskValue
    )
  )
and run:
$ sbt backend/universal:packageBin
34: error: type mismatch;
found : Seq[sbt.Def.Setting[Seq[sbt.Task[Seq[java.io.File]]]]]
required: Int
Seq(
^
[error] Type error in expression
I used the following structure.
Define a shared project that needs to be cross-compiled for JS and the JVM.
lazy val shared = CrossPlugin.autoImport
  .crossProject(JSPlatform, JVMPlatform)
  .crossType(CrossType.Pure)
  .jvmSettings(/* JVM-specific settings */)
  .jsSettings(/* JS-specific settings */)

lazy val sharedJvm = shared.jvm
lazy val sharedJs = shared.js
Add a project that contains the Main class.
lazy val backend = project
  .enablePlugins(UniversalPlugin)
  .enablePlugins(JavaAppPackaging)
  .dependsOn(sharedJvm)
Add a web project containing the web-related code.
lazy val web = project
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(sharedJs)
And finally, attach the JS output compiled from web as resources of backend.
backend
  .settings(
    Seq(
      resourceGenerators in Compile += Def.task {
        Seq(
          (fullOptJS in Compile in web).value,
          (fastOptJS in Compile in web).value
        ).map { js =>
          val resource = (resourceManaged in Compile).value / "public" / "assets" / js.data.name
          IO.write(resource, IO.read(js.data))
          resource
        }
      }.taskValue
    )
  )
The Main class then needs to serve the compiled JS from public/assets, as configured in sbt, and any other web resources from its classpath.
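One note on the type error reported above (this is my reading, not part of the original answer): whatever its exact cause, a separate top-level backend.settings(...) expression would not modify backend anyway, because Project is immutable and the returned value is discarded. It is safer to attach the resource generator directly to the backend project definition. A minimal sketch under that assumption, reusing the names from the build above:

lazy val backend = project
  .enablePlugins(UniversalPlugin, JavaAppPackaging)
  .dependsOn(app.jvm)
  .settings(
    mainClass in Compile := Some("com.example.EchoServer"),
    // copy the fastOptJS/fullOptJS output into managed resources under public/assets
    resourceGenerators in Compile += Def.task {
      Seq(
        (fullOptJS in Compile in frontend).value,
        (fastOptJS in Compile in frontend).value
      ).map { js =>
        val resource = (resourceManaged in Compile).value / "public" / "assets" / js.data.name
        IO.write(resource, IO.read(js.data))
        resource
      }
    }.taskValue
  )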
This is the current content root configuration in my project:
However, I want the "scala" directory to be the actual test content root, and not the directory named "test". If I modify it, I get a warning that "Module is imported from Sbt. Any changes in its configuration may be lost after re-importing" (and, indeed, they are).
Unfortunately, I couldn't find where in my Build.scala file (or any other file) this configuration is declared. What can I do to convince IntelliJ once and for all that "scala" is the correct test content root?
This is my Build.scala file (this is a Play 2.5.4 project if it matters):
import play.routes.compiler.StaticRoutesGenerator
import play.sbt.PlayScala
import play.sbt.routes.RoutesKeys._
import sbt.Keys._
import sbt._

object Build extends Build {
  val main = Project("Mp3Streamer", file(".")).enablePlugins(PlayScala).settings(
    scalaVersion := "2.11.8",
    version := "1.0-SNAPSHOT",
    addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full),
    libraryDependencies ++= Seq(
      // a bunch of dependencies
    ),
    resolvers += Resolver.mavenLocal,
    javaOptions ++= Seq("-Xmx4000M", "-Xms2024M", "-XX:MaxPermSize=2000M"),
    routesGenerator := StaticRoutesGenerator
  )
}
By adding scalaSource in Test := baseDirectory.value / "test" / "scala" to my Build.scala file, I was able to make the "scala" folder a test source, but the parent "test" folder was still also a test source:
As far as I can tell, this setting is inherited from Play: if I remove the .enablePlugins(PlayScala) call, the "test" folder stops being a test source. Following the instructions in https://www.playframework.com/documentation/2.5.x/Anatomy#Default-SBT-layout, I disabled the Play layout and then manually added the source and resource directories, copying them from https://github.com/playframework/playframework/blob/master/framework/src/sbt-plugin/src/main/scala/play/sbt/PlayLayoutPlugin.scala#L9, modifying only the test source and adding my own resource folders. My modified Build.scala file is now:
// requires: import play.sbt.PlayLayoutPlugin
val main = Project("Mp3Streamer", file("."))
  .enablePlugins(PlayScala)
  .disablePlugins(PlayLayoutPlugin)
  .settings(
    target := baseDirectory.value / "target",
    sourceDirectory in Compile := baseDirectory.value / "app",
    // My change
    sourceDirectory in Test := baseDirectory.value / "test" / "scala",
    resourceDirectory in Compile := baseDirectory.value / "conf",
    scalaSource in Compile := baseDirectory.value / "app",
    // My change
    scalaSource in Test := baseDirectory.value / "test" / "scala",
    // I've added this resource
    resourceDirectory in Test := baseDirectory.value / "test" / "resources",
    javaSource in Compile := baseDirectory.value / "app",
    sourceDirectories in (Compile, TwirlKeys.compileTemplates) := Seq((sourceDirectory in Compile).value),
    sourceDirectories in (Test, TwirlKeys.compileTemplates) := Seq((sourceDirectory in Test).value),
    // sbt-web
    sourceDirectory in Assets := (sourceDirectory in Compile).value / "assets",
    sourceDirectory in TestAssets := (sourceDirectory in Test).value / "assets",
    resourceDirectory in Assets := baseDirectory.value / "public",
    // Native packager
    sourceDirectory in Universal := baseDirectory.value / "dist",
    // Everything else is the same as the original Build.scala file
Honestly, this feels so hacky that I'll probably end up modifying my directory structure to match Play's default... But it's the principle that counts!
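As a side note (my addition, not from the original post): to double-check what sbt itself considers the test source directories, independently of what IntelliJ imports, the relevant keys can be inspected from the sbt shell:

> show test:scalaSource
> show test:sourceDirectory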
I have a root project containing three subprojects plus sbt config files and nothing else. The two main subprojects are called server and backend; the third, common, is a dependency of both. server is a Play Framework project. The backend project is configured to generate its assembly jar into the resources directory of server.
The jar is generated correctly and server is able to see it, but I don't know how to run the assembly task of backend when server is compiled (i.e. I want server to depend on the assembly of backend.jar).
/* [...] */
lazy val commonSettings = Seq(
  version := "0.1",
  organization := "org.example",
  scalaVersion := "2.11.7"
)

lazy val server = (project in file("server")).enablePlugins(PlayJava).settings(commonSettings: _*).settings(
  name := """example""",
  libraryDependencies ++= Seq(
    /* [...] */
  ),
  /* [...] */
  unmanagedResourceDirectories in Compile += { baseDirectory.value / "resources" }
).dependsOn(common)

lazy val backend = (project in file("backend")).settings(commonSettings: _*).settings(
  assemblyJarName in assembly := "backend.jar",
  assemblyOutputPath in assembly := server.base / "resources/backend.jar",
  libraryDependencies := Seq(
  )
).dependsOn(common)

lazy val common = (project in file("common")).settings(commonSettings: _*)

onLoad in Global := (Command.process("project server", _: State)) compose (onLoad in Global).value
Thanks to a comment by @pfn I got it working. I needed to insert the following line into the server subproject's settings, with the scope changed from server to Compile, so it is now:
(compile in Compile) <<= (compile in Compile) dependsOn (assembly in backend)
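Note (my addition): the <<= operator is the old sbt 0.13 syntax and is deprecated in later versions; the equivalent in the := / .value style would be:

(compile in Compile) := ((compile in Compile) dependsOn (assembly in backend)).value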
I have an sbt (0.13.1) project with a bunch of subprojects. I am generating Eclipse project configurations using sbteclipse. My projects only have Scala source files, so I want to remove the generated src/java folders.
I can achieve that by (redundantly) adding the following to the build.sbt of each subproject:
unmanagedSourceDirectories in Compile := (scalaSource in Compile).value :: Nil
unmanagedSourceDirectories in Test := (scalaSource in Test).value :: Nil
I tried just adding the above configuration to the root build.sbt but the eclipse command still generated the java source folders.
Is there any way to specify a configuration like this once (in the root build.sbt) and have it flow down to each subproject?
You could define the settings once, in a standalone val, and then reuse them:
val onlyScalaSources = Seq(
  unmanagedSourceDirectories in Compile := Seq((scalaSource in Compile).value),
  unmanagedSourceDirectories in Test := Seq((scalaSource in Test).value)
)

val project1 = project.in(file("project1"))
  .settings(onlyScalaSources: _*)

val project2 = project.in(file("project2"))
  .settings(onlyScalaSources: _*)
You could also create a simple plugin (untested code)
object OnlyScalaSources extends AutoPlugin {
  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    unmanagedSourceDirectories in Compile := Seq((scalaSource in Compile).value),
    unmanagedSourceDirectories in Test := Seq((scalaSource in Test).value)
  )
}
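One usage note (my addition): if this object is placed in a file under the project/ directory, for example project/OnlyScalaSources.scala (a hypothetical file name), the allRequirements trigger means the plugin is activated automatically for every subproject, with no explicit enablePlugins call needed.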
More details about creating plugins can be found in the plugins documentation.