Play Framework evolutions in different subprojects - scala

I have a Play Framework 2.2 project that has different subprojects. Everything worked fine while only one of those subprojects had SQL evolution scripts.
Now, I'm trying to introduce another subproject with a SQL evolution script and I see no way of defining dependencies between them, or even to execute them both, while keeping them in their subprojects (where logically they belong).
So, how can I have evolution scripts in different submodules and have them all execute, respecting dependencies between them?
Thanks!

Add to application.conf file the list of models packages:
ebean.default = ["models.common.*", "models.sub1.*", "models.sub2.*", ...]
And use dependsOn in build.sbt like this to get what you need:
lazy val sub1 = project.in(file("modules/sub1"))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(common)
Attention: this syntax is for Play Framework 2.4.
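Putting the two pieces together, a minimal sketch of the root build could look like this (the module names common, sub1 and sub2 and the modules/ layout are only illustrative, mirroring the snippets above):
lazy val common = (project in file("modules/common"))
  .enablePlugins(PlayJava, PlayEbean)

lazy val sub1 = (project in file("modules/sub1"))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(common)

lazy val sub2 = (project in file("modules/sub2"))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(common)

lazy val root = (project in file("."))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(common, sub1, sub2)   // root gets the submodules on its classpath
  .aggregate(common, sub1, sub2)   // tasks on root also run on the submodules
The ebean.default list in application.conf then points at the model packages of all three modules, as shown above.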

Related

sbt-assembly multimodule project?

My project is separated into multiple parts:
* core project
* utils (as example) project
Both of them have some unit tests, and the core project relies on code in the utils project via sbt's dependsOn mechanism.
I am using the sbt-assembly plugin to build an "uber-jar"/"fat-jar", but the sbt assembly task doesn't run the tests of the utils project, which is what I am trying to achieve (I cannot see any tests from the utils project in the logs).
Changing dependsOn to aggregate introduces a new problem: sbt assembly now fails with deduplicate errors, something similar to this issue-on-github.
So my question is: how do I organize a multi-module project that can be assembled by sbt-assembly, with all of the tests executed during the assembly task?
My guess is that you should have both dependsOn and aggregate relationships between your subprojects. They are not mutually exclusive; they just serve different purposes.
dependsOn introduces a code dependency: if core depends on utils, you can reference types from utils in core.
aggregate introduces a task dependency: if you execute compile or test on core and it aggregates utils, the task is executed on both subprojects.
Problems with deduplicate are another beast: they mean that there are duplicate resources or classes when attempting to create a single jar. The reasons for this can vary, and you can tackle them by reviewing the library dependencies in the build or by defining a MergeStrategy - https://github.com/sbt/sbt-assembly#merge-strategy
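Putting that into a build, a sketch of how the two relationships plus a merge strategy could look (sbt 0.13 / sbt-assembly style of that era; it assumes sbt-assembly is already added in project/plugins.sbt, and the concrete merge rules are only an example):
lazy val utils = (project in file("utils"))

lazy val core = (project in file("core"))
  .dependsOn(utils)   // code dependency: core can reference types from utils
  .aggregate(utils)   // task dependency: compile/test on core also run on utils
  .settings(
    // Example rules only - adjust them to the duplicates reported by the deduplicate errors.
    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case _                             => MergeStrategy.first
    }
  )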

sbt multi-module project: dependence between projects

I am new to Scala, so I hope this question is not too naive.
Suppose I have a multi-module sbt project with a dependency between the projects.
lazy val core = (project in file("core")).
  settings( ... )

lazy val utils = (project in file("utils")).
  settings( ... ).dependsOn(core)
The question: does .dependsOn(core) mean that if I do projects utils; compile it is going to compile core beforehand (and use its latest version)?
I am asking this, since in practice I don't see this behavior (and I want it).
You are looking for the aggregate method. Like this:
lazy val utils = (project in file("utils")).
  settings( ... ).dependsOn(core).aggregate(core)
The aggregate method here causes all tasks run on utils to also be run on core (update, etc.). If you want to disable a task from running on an aggregated project, you can check out the documentation here.
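For example, following the pattern from that documentation, you can keep the aggregation but stop an individual task from being forwarded (the choice of update here is only illustrative):
lazy val utils = (project in file("utils")).
  settings(
    aggregate in update := false   // run update only on utils, not on the aggregated core
  ).
  dependsOn(core).aggregate(core)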
Yes, you should see this behavior (and I do see it in practice). As the linked documentation says (note that the roles of util and core are opposite there: core depends on util):
This also creates an ordering between the projects when compiling them; util must be updated and compiled before core can be compiled

scalajs: multiple apps in one project?

If I understood the Scala.js docs correctly, it allows only one JavaScript output per project.
Is there a way to avoid this limitation?
I currently created a Scala.js subproject for a Play Framework application. In this subproject I planned to create all the Scala.js apps for the service I am working on. Now I have found this limitation, and it's really confusing and annoying, because the only two solutions I can think of are:
* create one "mega-script", which isn't acceptable for me
* create a separate subproject for every Scala.js app
Neither is acceptable for a big project.
Put every Scala.js app in its own subproject and manage everything via an sbt multi-project build.
Here is a somewhat complex example of a Play project that has 6+ subprojects compiling to one single file: scala-js-binding
Check the Build.scala:
lazy val preview = (project in file("."))
  .enablePlugins(PlayScala)
  .settings(previewSettings: _*)
  .dependsOn(shared, bindingPlay)
  .aggregate(frontend) // aggregate scalaJs

lazy val frontend = Project(id = "frontend", base = file("frontend"))
  .dependsOn(shared, binding)

...

scalajsOutputDir := baseDirectory.value / "public" / "javascripts" / "scalajs",

// fastOptJS - not optimized (3Mb)
compile in Compile <<= (compile in Compile) dependsOn (fastOptJS in (frontend, Compile)),

// fullOptJS - fully optimized (330k)
dist <<= dist dependsOn (fullOptJS in (frontend, Compile)),
Scala.js is indeed designed to generate one JavaScript output per project. There is virtually no way to change this.
However, this is not a restriction of Scala.js. Scala/Java themselves behave the same way: you cannot generate two .jars from a single sbt project for a Scala/Java application. So I do not see why Scala.js should behave any differently.
Use multiple projects in your sbt build for this, as suggested by #user3430609.
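For completeness, a minimal sketch of what that can look like with the Scala.js sbt plugin (the module names appA and appB are made up; with Scala.js 0.6+ you enable ScalaJSPlugin per subproject):
lazy val appA = (project in file("app-a"))
  .enablePlugins(ScalaJSPlugin)   // produces its own JavaScript file

lazy val appB = (project in file("app-b"))
  .enablePlugins(ScalaJSPlugin)   // produces a second, independent JavaScript file

lazy val root = (project in file("."))
  .aggregate(appA, appB)          // fastOptJS, test, ... on root run on both apps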

How do I use shared configurations across SBT (Play) multi-projects?

I have several SBT 0.13 / Play 2.2 projects (websites). They are all multi-module as they share some common functionality. This makes their project configuration files both complex and almost identical, but not quite.
I would like to be able to share as much as possible of these configuration files across the projects (frequent Play updates make keeping 5+ websites up to date a royal pain, not to mention all the almost-identical-but-evolving dependency lists across the projects).
build.properties and plugins.sbt are identical across projects and can be overwritten by a simple script. Great.
Build.scala is trickier - I would like to introduce a shared base class like so:
abstract class MyBuildBase extends Build { ... }
so that in Build.scala I can do:
object ApplicationBuild extends MyBuildBase { ... }
In order for this to make any sense at all, MyBuildBase.scala needs to be shared across projects. This can be done with svn:externals, which operates on directories. Which means I need to somehow make this shared directory accessible when Build.scala is compiled (otherwise sbt complains loudly).
Reading http://www.scala-sbt.org/0.13.0/docs/Detailed-Topics/Classpaths.html and http://www.scala-sbt.org/0.13.0/docs/Getting-Started/Full-Def.html it seems like this should be possible.
However, it is exceptionally unclear to me what to put in the project/project/Build.scala file to actually achieve this - I can't find an example of "an sbt build file that's intended to build an sbt build file and include some extra source files in the build".
Any suggestions?
What you probably want to do is create a plugin, or shared library.
You can make an sbt project with a build as follows:
build.sbt
sbtPlugin := true
organization := "you"
name := "common-build"
version := "1.0"
Then create your abstract class MyBuildBase in src/main/scala. Release this project as an sbt plugin.
Then in your other projects, you can use this as a library/plugin. In project/plugins.sbt add:
addSbtPlugin("you" % "common-build" % "1.0")
And this will resolve your common build library when building your build.
If you need more information, look up more about sbt plugins and ignore the part about making something that extends a Plugin. Plugins are just libraries versioned with sbt's version number and your own. You should be able to put whatever code you want in there to share between builds.
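To make that concrete, the shared code inside such a common-build plugin could be as simple as this sketch (sbt 0.13 style with the since-deprecated Build trait; the settings and values are placeholders, not from the answer):
import sbt._
import Keys._

// Shared base class shipped with the "common-build" plugin (illustrative only).
abstract class MyBuildBase extends Build {
  // Settings every website build pulls in; the values are placeholders.
  val commonSettings: Seq[Setting[_]] = Seq(
    organization := "com.example",
    scalaVersion := "2.10.4"
  )
}
Each website's project/Build.scala can then just contain object ApplicationBuild extends MyBuildBase { ... }, as sketched in the question.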
Note: in 2016, Build.scala was deprecated in favor of build.sbt.
Here is the new (Dec. 2016) "multi-module with App" Scala sbt template by Michael Lewis.
Usage
sbt new lewismj/sbt-template.g8
You can then run:
sbt compile
sbt publish-local
sbt assembly
It is based on Scala SBT template (Library)
This giter8 template will write SBT build files for a Scala library.

Run "clean" all dependent SBT subprojects

I have an SBT project, specifically a Play Framework 2.1 project, that has a number of subprojects specified in the configuration. The dependencies seem to work fine when compiling, but "clean" only seems to clean the currently selected project, not including its dependencies. Is there any way to clean both the selected project and its dependent subprojects?
If your main project aggregates subprojects, like this:
lazy val root = Project("name", file("."))
.aggregate(module1, module2, macros)
then any command called on this root project will be executed for all subprojects. If you run the inspect clean command in your sbt session, you'll see, under the Related section, all subprojects to which this clean relates.
As a side note to the comment:
aggregate and dependsOn are different commands for different purposes. The purpose of aggregation is to run commands called on the root project on the aggregated subprojects as well. In my example, calling the test command on the root project executes it for module1, module2 and macros too. You can turn off such behaviour with the following setting:
aggregate in test := false
Aggregated projects are independent of each other's code. Aggregation is usually used on the root project, for example so that you don't have to call test on each project but can call it once on the root. Remember that in the case of aggregation, commands are executed in parallel.
dependsOn, on the other hand, means that your project depends on the code of another project. In this case sbt executes the commands sequentially: to compile your root project, which dependsOn some modules, it compiles those modules first and then the root project.
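So for the original question, a sketch of a root project whose clean (and compile, test, ...) also reaches its subprojects combines both relationships (the module names are illustrative):
lazy val module1 = Project("module1", file("module1"))
lazy val module2 = Project("module2", file("module2"))

lazy val root = Project("root", file("."))
  .dependsOn(module1, module2)   // code dependency: the modules are compiled before root
  .aggregate(module1, module2)   // task dependency: clean/compile/test on root run on the modules too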