I use a case class from a sub-project and get the error "Referring to non-existent at runtime".
In build.sbt, I have:
lazy val scalaJsProject = (project in file("scala-project/"))
  .dependsOn(modelProject)
I guess dependsOn does not work for Scala.js?
I followed the instructions at https://github.com/scala-js/scalajs-cross-compile-example to create a crossProject, put all the models into the shared folder of the crossProject, and now both the JVM and the JS projects recognize them.
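For reference, a minimal sketch of that setup (project names such as model, modelJS and scalaJsProject are illustrative, the plugin version numbers are placeholders, and the exact builder syntax can differ slightly between Scala.js / sbt-crossproject versions):

// project/plugins.sbt -- versions are placeholders
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "1.x")
addSbtPlugin("org.portable-scala" % "sbt-scalajs-crossproject" % "1.x")

// build.sbt
lazy val model = crossProject(JSPlatform, JVMPlatform)
  .in(file("model"))                   // shared sources live in model/shared/src/main/scala
  .settings(name := "model")

lazy val modelJVM = model.jvm          // JVM projects depend on this side
lazy val modelJS  = model.js           // Scala.js projects depend on this side

lazy val scalaJsProject = (project in file("scala-project"))
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(modelJS)                  // depend on the JS side, not on a JVM-only model project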
I am new to Scala, so I hope this question is not too naive.
Suppose I have a multi-module sbt project and there is a dependency between the projects.
lazy val core = (project in file("core"))
  .settings( ... )

lazy val utils = (project in file("utils"))
  .settings( ... )
  .dependsOn(core)
The question: does .dependsOn(core) mean that if I run project utils; compile, it is going to compile core beforehand (and use its latest version)?
I am asking this, since in practice I don't see this behavior (and I want it).
You are looking for the aggregate method. Like this:
lazy val utils = (project in file("utils"))
  .settings( ... )
  .dependsOn(core)
  .aggregate(core)
The aggregate method here causes all tasks run on utils to also be run on core (update, etc.). If you want to disable a task from running on an aggregated project, you can check out the documentation here.
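For example (a sketch; update is just one task you might pick), disabling aggregation for the update task looks like this:

lazy val utils = (project in file("utils"))
  .settings(
    // don't run `update` on the aggregated core project when it is invoked on utils
    aggregate in update := false
  )
  .dependsOn(core)
  .aggregate(core)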
Yes, you should see this behavior (and I do see it in practice). As the linked documentation says (note that the roles of util and core are opposite there: core depends on util):
This also creates an ordering between the projects when compiling them; util must be updated and compiled before core can be compiled
After switching sbt-assembly from 0.11.2 to 0.13.0, I suddenly find myself in a situation where calling sbt assembly does not just invoke the task in the sub-project that explicitly added assemblySettings, but tries to run it for each and every sub-project.
So, if I have
lazy val root = project(...).aggregate(core, app)
lazy val core = project(...)
lazy val app = project(...).dependsOn(core)
How can I disable the assembly task for all but the root project? With other plugins such as sbt-buildinfo this problem doesn't occur because you have to explicitly enable the plugin per sub-project.
The goal is to be able to run sbt assembly so it will do that just for the root project.
Found the answer in a closed issue. You have to add the following line to your common settings:
aggregate in assembly := false
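A minimal sketch of where that line can live, assuming a shared commonSettings sequence applied to the sub-projects (the name commonSettings is illustrative, and sbt-assembly is already added in project/plugins.sbt); with sbt 1.x slash syntax the same setting is written assembly / aggregate := false:

lazy val commonSettings = Seq(
  // run `assembly` only on the project it is invoked on, not on aggregated sub-projects
  aggregate in assembly := false
)

lazy val root = (project in file(".")).aggregate(core, app).settings(commonSettings: _*)
lazy val core = (project in file("core")).settings(commonSettings: _*)
lazy val app  = (project in file("app")).dependsOn(core).settings(commonSettings: _*)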
I'm trying to build a Scala project that aggregates multiple subprojects, one of which is an sbt plugin. I would like to use this plugin in another subproject under the same parent project, but I don't quite understand how to do this.
My "build.sbt" in the project root is like this:
lazy val plugin = project
  .in(file("sbt-Something"))
  .dependsOn(lib)
  .settings(common: _*)
  .settings(
    name := "My plugin",
    sbtPlugin := true
  )

lazy val examples = project
  .in(file("examples"))
  .dependsOn(lib, plugin)
  .settings(common: _*)
  .settings(name := "Examples")
How do I add the plugin as an sbt plugin to the examples project?
I don't think you can have a plugin at the same "level" as the project that uses it.
If you think about it, the plugin must be available before the project that uses it is compiled. This is because it may, for example, modify the build settings, which would influence the way the project is built.
If you want to keep your plugin within your project, you can do so by declaring it in the meta-build, i.e. in a build definition under the project directory.
$YOUR_PROJECT_ROOT/project/build.sbt
lazy val plugin = project
  .in(file("sbt-plugin"))
  .dependsOn(lib)
  .settings(name := "My plugin", sbtPlugin := true)

lazy val lib = project.in(file("lib"))

lazy val root = project.in(file(".")).dependsOn(plugin)
Then you can put your plugin code in the sbt-plugin directory (that is, project/sbt-plugin, since paths in project/build.sbt resolve relative to project/) and your shared library code in the lib folder.
In your normal build you can reference the shared library and the plugin.
$YOUR_PROJECT_ROOT/build.sbt
val lib = ProjectRef(file("project/lib"), "lib")
val root = project.in(file(".")).dependsOn(lib).enablePlugins(MyPlugin)
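For completeness, the MyPlugin referenced by enablePlugins above would be an sbt AutoPlugin defined in the plugin's sources; a hypothetical minimal sketch (the key and its value are made up purely for illustration):

// project/sbt-plugin/src/main/scala/MyPlugin.scala
import sbt._

object MyPlugin extends AutoPlugin {
  object autoImport {
    // a made-up key, just to show the shape of a plugin
    val greeting = settingKey[String]("A greeting provided by the plugin")
  }
  import autoImport._

  // noTrigger by default, so projects opt in via enablePlugins(MyPlugin)
  override lazy val projectSettings = Seq(
    greeting := "hello from MyPlugin"
  )
}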
Please note that it might be better to keep the shared library as a separate project, because this setup can be a bit tricky. For example, if you change something in the shared library, the main project will recompile and use the new code, but the plugin will only pick up the new code after you issue the reload command on the project.
If you want to share settings between the projects, you can check the answers to How to share version values between project/plugins.sbt and project/Build.scala?
I'm trying to use classes from a dependent project in my views, but it seems the Scala compiler isn't able to pick them up.
The project is a sibling of the play project:
workspace/lib
workspace/play-project
But I get an error when compiling the project:
@import lib.TheClass
Error:
[error] scala-2.9.1/src_managed/main/views/html/index.template.scala:28: not found: value lib
[error] _display_ {import lib.TheClass
How can I set up a project dependency for the Scala compiler?
I found the following related SO questions, but they seem to talk about projects stored in central repositories:
Play 2.0 Framework external Model in Template
Dependency Management with Play 2.0 Applications
You have to declare the dependency on the lib project in your sbt configuration. There is a guide in the sbt wiki. First, you declare your lib project:
lazy val lib = Project(id = "lib",
  base = file("../lib/"))
Then you define the main project and let it depend on the lib project:
lazy val play = Project(id = "play-app",
  base = file(".")).dependsOn(lib)
The project I'm working on at work is a webapp on the Lift Framework. We're using the xsbt web plugin as well. There's a "core" project, which contains the vast majority of the functionality; my current goal is to create two "distribution" projects that add a different set of classpath resources to the "core" project. The problem is that I either 1) can't get the "distribution" projects to run, or 2) get them to run, but the required resource doesn't seem to be there.
What I've tried
Here's an abridged version of my project/Build.scala:
lazy val core = Project("Core", file("core"))
  .settings( /* some dependencies, resolvers, webSettings */ )

lazy val app1 = Project("App1", file("app1"))
  .aggregate(core)
  .settings( /* the same settings as core */ )

lazy val app2 = Project("App2", file("app2"))
  .aggregate(core)
  .settings( /* the same settings as core */ )
Then in the directory structure for both app1 and app2, I have a file at src/main/resources/aFileINeed. The core application is using the class.getResource approach to load the file from the classpath.
Problems
If I try to run one of the distribution projects, using container:start, it does not detect the required file in the classpath. Also, it claims that src/main/webapp is not an existing directory (that folder is included in the core project, as it is required by the xsbt web plugin).
How can I get these projects to "merge" their resources? I expected that using aggregate or dependsOn in the Build.scala project definition would handle that for me, but it apparently doesn't.
This is what I'm doing: I create a global 'resources' directory in the root of the project and then define the following variable:
lazy val globalResources = file("resources")
And then in every project's settings:
unmanagedResourceDirectories in Runtime += globalResources
For projects that use the xsbt-web-plugin, or some other library that imports it, you'll have to use the stronger:
unmanagedResourceDirectories in Compile += globalResources
Beware that with this solution your projects will potentially have multiple resource directories, and AFAIK if you define the same file twice, compile:copy-resources is going to be angry at you.
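Putting it together, a sketch in the Build.scala style used by the question (project names mirror the question, the sequence name resourceSettings is illustrative, and the dependencies/resolvers/webSettings from the original settings are omitted):

import sbt._
import Keys._

object ApplicationBuild extends Build {
  lazy val globalResources = file("resources")

  // shared by all projects; add your dependencies, resolvers and webSettings here too
  lazy val resourceSettings = Seq(
    unmanagedResourceDirectories in Compile += globalResources,
    unmanagedResourceDirectories in Runtime += globalResources
  )

  lazy val core = Project("Core", file("core"))
    .settings(resourceSettings: _*)

  lazy val app1 = Project("App1", file("app1"))
    .aggregate(core)
    .settings(resourceSettings: _*)

  lazy val app2 = Project("App2", file("app2"))
    .aggregate(core)
    .settings(resourceSettings: _*)
}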