Using dependsOn between two Scala.js SBT projects

(Long question ahead. Simplified tl;dr at the bottom).
I have two Scala.js projects built with SBT - "myapp" and "mylib" - in the following directory structure:
root/build.sbt
root/myapp/build.sbt
root/myapp/jvm/
root/myapp/js/
root/myapp/shared/
root/mylib/build.sbt
root/mylib/jvm/
root/mylib/js/
root/mylib/shared/
mylib exports an artifact named "com.example:mylib:0.1", which is used as a libraryDependency for myapp.
myapp and mylib live in separate repositories, contain their own build files, and must be buildable completely separately (i.e. each must contain its own individual build config).
In production they will be built separately, with mylib first published as a Maven artifact before myapp is built.
In development, however, I want to be able to merge these into a parent SBT project so that both can be developed in parallel without needing to run publishLocal after each change.
In a traditional (non-Scala.js) project this would be quite easy:
$ROOT/build.sbt:
lazy val mylib = project
lazy val myapp = project.dependsOn(mylib)
However, in Scala.js we actually have two projects inside each module - appJVM, appJS, libJVM and libJS. As such, the above configuration only finds the aggregate root project and does not correctly apply the dependsOn configuration to the actual JVM and JS projects.
(i.e. myapp's and mylib's build.sbt each contain two projects, plus an aggregate root project)
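(For context, each child build.sbt presumably looks something like the following - a sketch using the Scala.js 0.6.x crossProject; the exact settings here are my assumption:)
// mylib/build.sbt (hypothetical sketch)
lazy val mylib = crossProject.in(file(".")).settings(
  organization := "com.example",
  name := "mylib",
  version := "0.1"
)
lazy val mylibJVM = mylib.jvm
lazy val mylibJS = mylib.js
lazy val root = project.in(file(".")).aggregate(mylibJVM, mylibJS)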
Ideally I'd like to be able to do something like the following:
lazy val mylibJVM = project
lazy val myappJVM = project.dependsOn(mylibJVM)
lazy val mylibJS = project
lazy val myappJS = project.dependsOn(mylibJS)
Unfortunately this just creates new projects within the root instead of importing the subprojects themselves.
I've also tried various combinations of paths, such as:
lazy val mylibJVM = project.in(file("mylib/jvm"))
But this doesn't pick up the configuration in mylib's build.sbt file.
Ultimately I keep running up against the same problem: when importing an existing multi-project SBT build into a parent sbt file, it imports the root project, but it does not seem to provide a way to import a subproject from an existing multi-module SBT build in a way that lets me add a dependsOn configuration to it.
tl;dr
If I have
root/mylib/build.sbt with multiple projects defined and
root/myapp/build.sbt with multiple projects defined
Is it possible to import individual subprojects into root/build.sbt instead of the root project from the submodule?
i.e. Can I have two layers of multi-project builds?

After spending a lot of time digging through SBT source code, I managed to figure out a solution. This isn't clean, but it works. (For bonus points, it imports correctly into IntelliJ).
// Add this function to your root build.sbt file.
// It can be used to define a dependency between any
// `ProjectRef` without needing a full project definition.
def addDep(from: String, to: String) = {
  buildDependencies in Global <<= (
    buildDependencies in Global,
    thisProjectRef in from,
    thisProjectRef in to) {
    (deps, fromref, toref) =>
      deps.addClasspath(fromref, ResolvedClasspathDependency(toref, None))
  }
}
// `project` will import the `build.sbt` file
// in the subdirectory of the same name as the `lazy val`
// (performed by an SBT macro). i.e. `./mylib/build.sbt`
//
// This won't reference the actual subprojects directly,
// but will import them into the namespace such that they
// can be referenced as "ProjectRefs", which strings are
// implicitly converted to.
//
// We then aggregate the JVM and JS ScalaJS projects
// into the new root project we've defined. (Which unfortunately
// won't inherit anything from the child build.sbt)
lazy val mylib = project.aggregate("mylibJVM", "mylibJS")
lazy val myapp = project.aggregate("myappJVM", "myappJS")
// Define a root project to aggregate everything
lazy val root = project.in(file(".")).aggregate(mylib, myapp)
// We now call our custom function to define a ClassPath dependency
// between `myapp` -> `mylib` for both JVM and JS subprojects.
// In particular, this will correctly find exported artifacts
// so that `myapp` can refer to `mylib` in libraryDependencies
// without needing to use `publishLocal`.
addDep("myappJVM", "mylibJVM")
addDep("myappJS","mylibJS")

Related

Reuse Scalafmt Config File Across Projects

I have a set of Scala projects, and for all of those projects I would like to introduce some Scala source code formatting, for which I'm using the scalafmt sbt plugin. I have put together the config file, and it lives in a separate repo. I would now like to reuse it in all of the other Scala projects. I see two possibilities:
Use the repo where the conf file is located as a git submodule in all the other 10 projects where I want to run the Scala formatter
Do nothing special, and just add README documentation stating that every user working on the codebase should download the scalafmt conf file into the project (I will pre-add a .gitignore to all projects to ignore the local conf file)
Is there any other approach? I definitely do not want the conf file to diverge if I leave it as is in all the projects.
As per the documentation, one option is to build (and publish within your org) an SBT plugin with your configuration:
https://scalameta.org/scalafmt/docs/installation.html#share-configuration-between-builds
To share configuration across different sbt builds, create a custom sbt plugin that generates .scalafmt-common.conf on build reload, then include the generated file from .scalafmt.conf
// project/MyScalafmtPlugin.scala
import sbt._

object MyScalafmtPlugin extends AutoPlugin {
  override def trigger = allRequirements
  override def requires = plugins.JvmPlugin
  override def buildSettings: Seq[Def.Setting[_]] = {
    SettingKey[Unit]("scalafmtGenerateConfig") :=
      IO.write(
        // writes to file once when build is loaded
        file(".scalafmt-common.conf"),
        "maxColumn = 100".stripMargin.getBytes("UTF-8")
      )
  }
}
// .scalafmt.conf
include ".scalafmt-common.conf"

how to get the sub project path in sbt multi project build

I am trying to get the location of a sub-project in a multi-project sbt build, but I am only able to get the root project directory.
lazy val copyToResources = taskKey[Unit]("copies the assembly jar.")

private val rootLocation: File = file(".").getAbsoluteFile
private val subProjectLocation: File = file("sub_project").getAbsoluteFile.getParentFile

lazy val settings = Seq(copyToResources := {
  val absPath = subProjectLocation.getAbsolutePath
  println(s"rootLocation:$subProjectLocation $absPath, sub-proj-location: ${rootLocation.getAbsolutePath}")
})
Output:
rootLocation:/home/user/projects/workarea/repo /home/vdinakaran/projects/workarea/repo, sub-proj-location: /home/vdinakaran/projects/workarea/repo
rootLocation:/home/user/projects/workarea/repo /home/vdinakaran/projects/workarea/repo, sub-proj-location: /home/vdinakaran/projects/workarea/repo
directory structure:
repo
|-- sub_project
As a workaround, I have added the sub_project folder using the rootLocation. But why is file("sub_project") not returning the path?
If you define your subproject like this
lazy val subProject = project in file("sub_project") // ...
then you can get its path using the scoped baseDirectory setting:
(outdated syntax, pre sbt 1)
baseDirectory.in(subProject).value.getAbsolutePath
(new unified syntax)
(subProject / baseDirectory).value.getAbsolutePath
And in the sbt console:
> show subProject/baseDirectory
The problem with your code (besides the fact that you mixed up root and sub-project in the output) is the use of relative paths. The sbt documentation on Paths explicitly says:
Relative files should only be used when defining the base directory of a Project, where they will be resolved properly.
Elsewhere, files should be absolute or be built up from an absolute base File. The baseDirectory setting defines the base directory of the build or project depending on the scope.
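Putting that together, here is a sketch of the asker's task using properly scoped settings (keys and names are taken from the question; untested):
lazy val copyToResources = taskKey[Unit]("copies the assembly jar.")

lazy val subProject = (project in file("sub_project"))
  .settings(
    copyToResources := {
      // both paths are absolute here, because sbt resolves baseDirectory per scope
      val subProjectLocation = baseDirectory.value
      val rootLocation = (baseDirectory in ThisBuild).value
      println(s"rootLocation: $rootLocation, sub-proj-location: $subProjectLocation")
    }
  )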

Scala.js compilation destination

I'm working on a Scala.js cross project where the jvm folder represents my server application and js represents my Scala.js code.
Whenever I compile my Scala.js code via sbt crossJS/fastOptJS, the compiled JS ends up in ./js/target/scala-2.11/web-fastopt.js.
I need to have this compiled JS file accessible in the resources of the server project in the jvm folder, so I can serve it through my web application. I think I have to do something with artifactPath, but I can't seem to get any results from my experiments thus far.
You can simply set the artifactPath of the fastOptJS task (or the fullOptJS task) to the (managed) resources directory of your JVM project:
// In the JS project's settings
artifactPath in fastOptJS in Compile :=
  (resourceManaged in jvm in Compile).value /
    ((moduleName in fastOptJS).value + "-fastopt.js")
This will put it in that directory whenever you run the fastOptJS task. However, it will not be included in sbt's resources task, and it will not automatically be triggered when you launch your server. Therefore:
// In the JVM project's settings
resources in Compile += (fastOptJS in js).value.data
A couple of notes:
The first step is only necessary if your web server serves only specific directories. Otherwise the second one is enough, as it already adds the file to the resources; where it lies is secondary.
Setting the crossTarget, as in #ochrons' answer, will also output all the .class and .sjsir files in the resource directory.
Have a look at Vincent Munier's sbt-play-scalajs for out-of-the-box sbt-web / Scala.js integration (it follows a slightly different approach: It copies the file from the js project, rather than directly placing it in the JVM project. Useful if you have multiple JVM projects).
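For readers on sbt 1+, the same two settings in the unified slash syntax would read roughly as follows (a sketch of the syntax translation only; it assumes the project ids js and jvm as above):
// In the JS project's settings
Compile / fastOptJS / artifactPath :=
  (jvm / Compile / resourceManaged).value /
    ((fastOptJS / moduleName).value + "-fastopt.js")

// In the JVM project's settings
Compile / resources += (js / Compile / fastOptJS).value.data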
You can configure the Scala.js SBT plugin to output the JavaScript file in a folder of your choosing. For example like this:
// configure a specific directory for scalajs output
val scalajsOutputDir = Def.settingKey[File]("directory for javascript files output by scalajs")

// make all JS builds use the output dir defined later
lazy val js2jvmSettings = Seq(fastOptJS, fullOptJS, packageJSDependencies) map { packageJSKey =>
  crossTarget in (js, Compile, packageJSKey) := scalajsOutputDir.value
}

// instantiate the JVM project for SBT with some additional settings
lazy val jvm: Project = sharedProject.jvm
  .settings(js2jvmSettings: _*)
  .settings(
    // scala.js output is directed under "web/js" dir in the jvm project
    scalajsOutputDir := (classDirectory in Compile).value / "web" / "js"
    // (remaining settings elided in the original answer)
  )
This will also store -jsdeps.js and .js.map files in the same folder, in case you want to use those in your web app.
For a more complete example, check out this tutorial which addresses many other issues of creating a more complex Scala.js application.

How to create a custom package task to jar a subset of classes in SBT

I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only a subset of classes conforming to an API which we need to be able to share with other teams so they can write plugins for our application. The end result will be two jars: one with the full application and a second one with the subset of classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so that it does not change the original definition of packageBin. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
This generates the complete jar in the same way as compile:packageBin would. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is because the call to pluginApi:packageBin is delegated to compile:packageBin rather than it being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
  // ... would have to reimplement the whole of packageBin here
}
However, I would then have to rewrite all the packageBin functionality instead of reusing existing code. Also, in case rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You can do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

inConfig(PluginApi)(Defaults.compileSettings) // you have to have the standard compile settings in the new configuration

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala you'll have to override unmanagedSourceDirectories in the newly created configuration, as above.
Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
I have had similar problems (with more than one jar per project). Our project uses ant - there you can do this, you just end up repeating yourself a lot.
However, I have come to the conclusion that this scenario (two JARs for one project) can actually be simplified by splitting the project, i.e. making two modules out of it.
This way, I don't have to "fight" tools which assume project == artifact (like sbt, maybe maven?, IDEA's default settings, ...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and the classes are only split apart at the JAR step, you run the risk of an invalid dependency in your setup which you would only notice when testing, because at compile time everything is compiled together.
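To illustrate that split, a minimal sketch (the module names api and app are invented for the example):
// The API that plugin authors compile against
lazy val api = (project in file("api"))
  .settings(name := "myapp-api")

// The application proper; it may depend on api, but never vice versa
lazy val app = (project in file("app"))
  .dependsOn(api)
  .settings(name := "myapp")
api/packageBin then produces the API jar without any mappings surgery, and the compiler guarantees that api never depends on app.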

Defining sbt task that invokes method from project code?

I'm using SBT to build a Scala project. I want to define a very simple task, so that when I enter generate in sbt:
sbt> generate
it will invoke my my.App.main(..) method to generate something.
There is an App.scala file in myproject/src/main/scala/my, and the simplified code is like this:
object App {
  def main(args: Array[String]) {
    val source = readContentOfFile("mysource.txt")
    val result = convert(source)
    writeToFile(result, "mytarget.txt")
  }
  // ignore some methods here
}
I tried to add the following code to myproject/build.sbt:
lazy val generate = taskKey[Unit]("Generate my file")

generate := {
  my.App.main(Array())
}
But this doesn't compile, since it can't find my.App.
Then I tried to add it to myproject/project/build.scala:
import sbt._
import my._

object HelloBuild extends Build {
  lazy val generate = taskKey[Unit]("Generate my file")

  generate := {
    App.main(Array())
  }
}
But it still doesn't compile; it can't find package my.
How to define such a task in SBT?
In .sbt format, do:
lazy val generate = taskKey[Unit]("Generate my file")
fullRunTask(generate, Compile, "my.App")
This is documented at http://www.scala-sbt.org/0.13.2/docs/faq.html, “How can I create a custom run task, in addition to run?”
Another approach would be:
lazy val generate = taskKey[Unit]("Generate my file")
generate := (runMain in Compile).toTask(" my.App").value
which works fine in simple cases but isn't as customizable.
Update: Jacek's advice to use resourceGenerators or sourceGenerators instead is good, if it fits your use case — can't tell from your description whether it does.
The other answers fit the question very well, but I think the OP might benefit from mine, too :)
The OP asked for "a very simple task, that when I input generate in sbt" it "will invoke my my.App.main(..) method to generate something" - a requirement that might ultimately complicate the build.
sbt already offers a way to generate files at build time - sourceGenerators and resourceGenerators - and, having read the question, I see no need to define a separate task for this.
In Generating files (see the future version of the document in the commit) you can read:
sbt provides standard hooks for adding source or resource generation tasks.
With that knowledge, one could think of the following solution:
sourceGenerators in Compile += Def.task {
  my.App.main(Array()) // it's not going to work without one change, though
  Seq[File]()          // a workaround before the above change is in effect
}.taskValue
To make this work you should return a Seq[File] that contains the generated files (not the empty Seq[File]()).
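For illustration, a sketch of the generator once App exposes a method that returns the files it writes (the generate method and its signature are my assumption; see the refactoring suggested at the end of this answer):
sourceGenerators in Compile += Def.task {
  // assumed: App now lives under project/ and offers
  // def generate(outDir: File): Seq[File]
  my.App.generate((sourceManaged in Compile).value)
}.taskValue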
The main change needed for the code to work is to move the my.App class to the project folder. It then becomes part of the build definition. This also reflects what the class does, as it's really part of the build, not of the artifact that is the build's product. When the same code is part of both the build and the artifact itself, you don't keep the different concerns separate. If the my.App class participates in the build, it should belong to it - hence the move to the project folder.
The project's layout would then be as follows:
$ tree
.
├── build.sbt
└── project
├── App.scala
└── build.properties
Separation of concerns (aka #joescii in da haus)
There's a point in #joescii's answer (which I extend in the answer) - "to make it a separate project that other projects can use. To do this, you will need to put your App object into a separate project and include it as a dependency in project/project", i.e.
Let's assume you've got a separate project build-utils with App.scala under src/main/scala. It's a regular sbt project with just the Scala code.
jacek:~/sandbox/so/generate-project-code
$ tree build-utils/
build-utils/
└── src
└── main
└── scala
└── App.scala
You could test it out as a regular Scala application without messing with sbt. No additional setup is required (and it frees your mind from sbt, which might be beneficial at times - less setup is always of help).
In another project - project-code - that uses App.scala as the base for its build, build.sbt is as follows:
project-code/build.sbt
lazy val generate = taskKey[Unit]("Generate my file")

generate := {
  my.App.main(Array())
}
Now the most important part - the wiring between the projects, so that the App code is visible to the build of project-code:
project-code/project/build.sbt
lazy val buildUtils = RootProject(
  uri("file:/Users/jacek/sandbox/so/generate-project-code/build-utils")
)

lazy val plugins = project in file(".") dependsOn buildUtils
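As an aside, RootProject also accepts a File, so the hard-coded absolute URI could presumably be replaced with a relative path (a sketch; the exact relative path depends on where the two repositories sit on disk):
lazy val buildUtils = RootProject(file("../../build-utils"))

lazy val plugins = project in file(".") dependsOn buildUtils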
With the build definition(s) in place, executing generate gives you the following:
jacek:~/sandbox/so/generate-project-code/project-code
$ sbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/sandbox/so/generate-project-code/project-code/project
[info] Updating {file:/Users/jacek/sandbox/so/generate-project-code/build-utils/}build-utils...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Updating {file:/Users/jacek/sandbox/so/generate-project-code/project-code/project/}plugins...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/jacek/sandbox/so/generate-project-code/build-utils/target/scala-2.10/classes...
[info] Set current project to project-code (in build file:/Users/jacek/sandbox/so/generate-project-code/project-code/)
> generate
Hello from App.main
[success] Total time: 0 s, completed May 2, 2014 2:54:29 PM
I've changed the code of App to be:
> eval "cat ../build-utils/src/main/scala/App.scala"!
package my
object App {
  def main(args: Array[String]) {
    println("Hello from App.main")
  }
}
The project structure is as follows:
jacek:~/sandbox/so/generate-project-code/project-code
$ tree
.
├── build.sbt
└── project
├── build.properties
└── build.sbt
Other changes aka goodies
I'd also propose some other changes to the code of the source generator:
Move the code out of the main method to a separate method that returns the generated files, and have main call it. It'll make reusing the code in sourceGenerators easier (no unnecessary Array() to call it with, and the files are returned explicitly).
Use filter or map functions for convert (to add a more functional flavour) - see the sketch below.
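For the second point, a sketch of what a more functional convert might look like (the original convert's behaviour is unknown; this is purely illustrative):
// illustrative only: trims lines and drops blank ones
def convert(source: String): String =
  source.split("\n").map(_.trim).filter(_.nonEmpty).mkString("\n")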
The solution that #SethTisue proposes will work. Another approach is to make it a separate project that other projects can use. To do this, you will need to put your App object into a separate project and include it as a dependency in project/project, OR package it as an sbt plugin ideally with this task definition included.
For an example of how to create a lib that is packaged as a plugin, take a look at snmp4s. The gen directory contains the code that does some code generation (analogous to your App code) and the sbt directory contains the sbt plugin wrapper for gen.