If I understood the Scala.js docs correctly, it allows only one JavaScript output per project.
Is there a way around this limitation?
Currently I have created a Scala.js subproject for the Play framework. In this subproject I planned to create all the Scala.js apps for the service I am working on. Now I have found this limitation, and it's really confusing and annoying, because the only two solutions I can think of are:
create one "mega-script", which isn't acceptable to me
create a separate subproject for every Scala.js app
Neither is really acceptable for a big project.
Put every Scala.js app in its own subproject and manage everything via an sbt multi-project build.
Here is a somewhat complex example of a Play project with 6+ subprojects that compile to a single file: scala-js-binding. Check its Build.scala:
lazy val preview = (project in file("."))
  .enablePlugins(PlayScala)
  .settings(previewSettings: _*)
  .dependsOn(shared, bindingPlay)
  .aggregate(frontend)
//aggregate scalaJs

lazy val frontend = Project(id = "frontend", base = file("frontend"))
  .dependsOn(shared, binding)

...

scalajsOutputDir := baseDirectory.value / "public" / "javascripts" / "scalajs",

// fastOptJS - not optimized (~3 MB)
compile in Compile <<= (compile in Compile) dependsOn (fastOptJS in (frontend, Compile)),

// fullOptJS - fully optimized (~330 kB)
dist <<= dist dependsOn (fullOptJS in (frontend, Compile)),
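With this wiring, running compile on the Play project first regenerates the frontend's fastOptJS output under public/javascripts/scalajs, while dist depends on the fully optimized fullOptJS output instead.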
Scala.js is indeed designed to generate one JavaScript output per project. There is virtually no way to change this.
However, this is not a restriction specific to Scala.js: Scala and Java behave the same way, in that you cannot generate two .jars from a single sbt project for a Scala/Java application. So I do not see why Scala.js should behave any differently.
Use multiple projects in your sbt build for this, as suggested by @user3430609.
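As a minimal sketch of that approach (project and file names here are purely illustrative, assuming sbt-scalajs 0.6+ where the plugin is enabled per project):

lazy val appA = (project in file("app-a"))
  .enablePlugins(ScalaJSPlugin) // this subproject emits its own .js file
  .settings(name := "app-a")

lazy val appB = (project in file("app-b"))
  .enablePlugins(ScalaJSPlugin) // this one emits a second, independent .js file
  .settings(name := "app-b")

// the root project only aggregates, so `sbt fastOptJS` runs on both apps,
// producing app-a-fastopt.js and app-b-fastopt.js in their respective targets
lazy val root = (project in file("."))
  .aggregate(appA, appB)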
Related
I'm primarily a Scala developer and I have a project in Scala/sbt. Recently I've started using Kotlin and am trying to port some parts of my code to Kotlin, so I need help understanding the Gradle build system. My modules:
db-service
queue-service
business-logic
processor-code, which depends on db-service, queue-service and business-logic
another project "X", which depends on queue-service and some other service
In sbt this is usually very straightforward: you can use ProjectRef to include these projects as dependencies.
How do I achieve the same with Gradle? Thanks in advance.
//Update
A sample build.sbt:
lazy val buildSettings = Seq(
  scalaVersion := "2.12.8", // sbt needs a full Scala version, not just "2.12"
  fork in Test := true,
  fork in IntegrationTest := true,
  ...
)

// note: hyphens are not legal in Scala identifiers, so the vals use camelCase
lazy val root = Project("processor-code", file("."))
  .settings(buildSettings: _*)
  .settings(
    libraryDependencies ++= //Deps
  )
  .dependsOn(dbService, queueService, utilities)
  .aggregate(dbService, queueService, utilities)

lazy val dbService = ...
lazy val queueService = Project("queue-service", file("queue-service")).settings()... // base dir must differ from the root's
lazy val utilities = ProjectRef(file("../utilities"), "utilities")
I have tried including the project in settings.gradle:
include 'project'
// projectDir expects a File, not a String
project(':project').projectDir = new File(settingsDir, '../myProject')
and added the dependency to my build.gradle:
implementation(project(':project'))
However, it doesn't seem to work; it fails with an error saying the plugin is already on the classpath. I've also tried creating a submodule, and that seemed to work, but when I tried to run it, it threw an initialization exception.
I understand that we can configure a project in the following scenario:
Root project (contains a common build.gradle):
| - subproject A
| - subproject B
| - myapp
Now myapp can easily depend on the subprojects.
My scenario (no common build.gradle; each project has its own build.gradle):
| Independent project A
| Independent project B
| my app
While it's true that you can only declare dependencies on sub-projects of the same build, what you want is technically still possible with Gradle.
Since project A is a Gradle project, you could pack it with the gradle jar command. Then you could move the produced JAR wherever you want, most likely into your my app project.
It's also possible to launch an arbitrary command from Gradle using project.exec {}.
What's more, you can even write your own Kotlin/Groovy function inside Gradle to do that for you.
That's what SBT is doing for you, actually.
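For what it's worth, newer Gradle versions also support composite builds: an includeBuild('../utilities') line in settings.gradle pulls in an independent build as a dependency source, which is probably the closest analog to sbt's ProjectRef.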
I am new to Scala, so I hope this question is not too naive.
Suppose I have a multi-module sbt project and there is a dependency between the projects.
lazy val core = (project in file("core"))
  .settings( ... )

lazy val utils = (project in file("utils"))
  .settings( ... )
  .dependsOn(core)
The question: does .dependsOn(core) mean that if I run project utils; compile, it is going to compile core beforehand (and use its latest version)?
I am asking because in practice I don't see this behavior (and I want it).
You are looking for the aggregate method. Like this:
lazy val utils = (project in file("utils"))
  .settings( ... )
  .dependsOn(core)
  .aggregate(core)
The aggregate method here causes all tasks run on utils to also be run on core (update, etc.). If you want to stop a task from running on an aggregated project, check out the documentation here.
Yes, you should see this behavior (and I do see it in practice). As the linked documentation says (note that the roles of util and core are opposite there: core depends on util):
This also creates an ordering between the projects when compiling them; util must be updated and compiled before core can be compiled
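Putting the two answers together, a minimal self-contained sketch (project names taken from the question); with this, utils/compile compiles core first because of the classpath dependency, and any task invoked on utils is additionally re-run on core because of the aggregation:

lazy val core = (project in file("core"))
  .settings(name := "core")

lazy val utils = (project in file("utils"))
  .dependsOn(core) // puts core on utils' classpath, forcing core to compile first
  .aggregate(core) // additionally re-runs every task invoked on utils on core too
  .settings(name := "utils")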
I'm trying to use twirl as part of my integration tests.
My integration tests include an HTTP simplicator that mimics a real-world service. To implement this simplicator in the tests, I'm using the embedded spray-can HTTP server, which needs to produce responses based on Twirl templates.
My tests are located at /src/it/scala, so naturally I want to place these Twirl templates in /src/it/twirl. That doesn't work, since the Twirl compiler ignores this directory.
If I place the Twirl templates in /src/main/twirl, everything works fine, but I'm trying to avoid this because I don't want these templates to clutter the production package (the templates only come into play in the tests, so they should only be compiled when the tests are run).
How can I tell the Twirl compiler to look for templates in the new directory?
What you want to change is sourceDirectory in twirlCompile, which by default is set to:
sourceDirectory in twirlCompile <<= (sourceDirectory in Compile) / "twirl"
Redefine the sourceDirectory setting of the twirlCompile task for the IntegrationTest configuration. The following should work (it has yet to be verified):
sourceDirectory in twirlCompile in IntegrationTest := (sourceDirectory in IntegrationTest).value / "twirl"
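Note that this presupposes the IntegrationTest configuration is already wired into the build; here is a rough sketch of the surrounding setup. The twirlCompile key follows the old twirl-sbt plugin quoted above; newer sbt-twirl releases renamed the task to TwirlKeys.compileTemplates, so check the plugin version you actually use:

lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*) // standard src/it/scala layout
  .settings(
    // pick up templates from src/it/twirl instead of src/main/twirl
    sourceDirectory in twirlCompile in IntegrationTest :=
      (sourceDirectory in IntegrationTest).value / "twirl"
  )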
I have a Play Framework 2.2 project that has different subprojects. Everything worked fine while only one of those subprojects had SQL evolution scripts.
Now I'm trying to introduce another subproject with a SQL evolution script, and I see no way of defining dependencies between the scripts, or even of executing them both, while keeping them in their own subprojects (where they logically belong).
So, how can I have evolution scripts in different submodules and have them all execute, respecting dependencies between them?
Thanks!
Add the list of model packages to the application.conf file:
ebean.default = ["models.common.*", "models.sub1.*", "models.sub2.*", ...]
And use dependsOn in build.sbt like this to get what you need:
lazy val sub1 = project.in(file("modules/sub1"))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(common)
Note: this syntax is for Play Framework 2.4.
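For reference, a sketch of how the full build.sbt might look with a shared common module (module names follow the answer; the root project is an otherwise standard Play multi-project setup):

lazy val common = project.in(file("modules/common"))
  .enablePlugins(PlayJava, PlayEbean)

lazy val sub1 = project.in(file("modules/sub1"))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(common)

lazy val sub2 = project.in(file("modules/sub2"))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(common)

// the root Play app sees all models and evolution scripts via its dependencies
lazy val root = project.in(file("."))
  .enablePlugins(PlayJava, PlayEbean)
  .dependsOn(sub1, sub2)
  .aggregate(sub1, sub2)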
The project I'm working on at work is a webapp on the Lift Framework, and we're using the xsbt web plugin as well. There's a "core" project which contains the vast majority of the functionality; my current goal is to create two "distribution" projects that add different sets of classpath resources to the "core" project. The problem is that I either (1) can't get the "distribution" projects to run at all, or (2) get them to run, but the required resource doesn't seem to be there.
What I've tried
Here's an abridged version of my project/Build.scala:
lazy val core = Project("Core", file("core"))
  .settings( /* some dependencies, resolvers, webSettings */ )

lazy val app1 = Project("App1", file("app1"))
  .aggregate(core)
  .settings( /* the same settings as core */ )

lazy val app2 = Project("App2", file("app2"))
  .aggregate(core)
  .settings( /* the same settings as core */ )
Then, in the directory structure of both app1 and app2, I have a file at src/main/resources/aFileINeed. The core application uses the class.getResource approach to load the file from the classpath.
Problems
If I try to run one of the distribution projects using container:start, it does not find the required file on the classpath. It also complains that src/main/webapp is not an existing directory (that folder lives in the core project, as required by the xsbt web plugin).
How can I get these projects to "merge" their resources? I expected that using aggregate or dependsOn in the Build.scala project definitions would handle that for me, but apparently it doesn't.
This is what I'm doing: I create a global 'resources' directory in the root of the project, and then define the following variable:
lazy val globalResources = file("resources")
And then in every project's settings:
unmanagedResourceDirectories in Runtime += globalResources
For projects that use the xsbt-web-plugin, or some other library that imports it, you'll have to use the stronger:
unmanagedResourceDirectories in Compile += globalResources
Beware that with this solution your projects will potentially have multiple resource directories, and AFAIK if you define the same file twice, compile:copy-resources is going to be angry at you.
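Putting it together, a sketch of how this could look in the question's Build.scala (the shared settings Seq is just one way to avoid repeating the line in every project):

lazy val globalResources = file("resources")

lazy val sharedResourceSettings = Seq(
  // the stronger Compile scoping, needed when xsbt-web-plugin is involved
  unmanagedResourceDirectories in Compile += globalResources
)

lazy val core = Project("Core", file("core"))
  .settings(sharedResourceSettings: _*)

lazy val app1 = Project("App1", file("app1"))
  .aggregate(core)
  .settings(sharedResourceSettings: _*)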