I'm working on a Scala.js cross project where the jvm folder represents my server application and js represents my Scala.js code.
Whenever I compile my Scala.js code via sbt crossJS/fastOptJS, the compiled JS ends up in ./js/target/scala-2.11/web-fastopt.js.
I need this compiled JS file to be accessible in the resources of the server project in the jvm folder, so I can serve it through my web application. I think I have to do something with artifactPath, but I can't seem to get any results from my experiments so far.
You can simply set the artifactPath of the fastOptJS task (or the fullOptJS task) to the (managed) resources directory of your JVM project:
// In the JS project's settings
artifactPath in fastOptJS in Compile :=
  (resourceManaged in jvm in Compile).value /
    ((moduleName in fastOptJS).value + "-fastopt.js")
This will put it in that directory when you run the fastOptJS task. However, it will not be included in sbt's resources task, and it will not automatically be triggered when you launch your server. Therefore:
// In the JVM project's settings
resources in Compile += (fastOptJS in js).value.data
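For a production build, a minimal sketch of the same wiring applied to fullOptJS (mirroring the settings above; the "-opt.js" suffix is Scala.js' default name for the fully optimized output):
// Sketch: same pattern for the full optimization, in the JS project's settings
artifactPath in fullOptJS in Compile :=
  (resourceManaged in jvm in Compile).value /
    ((moduleName in fullOptJS).value + "-opt.js")
// In the JVM project's settings
resources in Compile += (fullOptJS in js).value.data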
A couple of notes:
The first step is only necessary if your web server only serves specific directories. Otherwise the second one is enough, since it already adds the file to the resources; where the file lies is secondary.
Setting crossTarget, as in @ochrons' answer, will also output all the .class and .sjsir files to the resource directory.
Have a look at Vincent Munier's sbt-play-scalajs for out-of-the-box sbt-web / Scala.js integration (it follows a slightly different approach: it copies the file from the js project rather than placing it directly in the JVM project, which is useful if you have multiple JVM projects).
You can configure the Scala.js SBT plugin to output the JavaScript file to a folder of your choosing. For example:
// configure a specific directory for scalajs output
val scalajsOutputDir = Def.settingKey[File]("directory for javascript files output by scalajs")

// make all JS builds use the output dir defined later
lazy val js2jvmSettings = Seq(fastOptJS, fullOptJS, packageJSDependencies) map { packageJSKey =>
  crossTarget in (js, Compile, packageJSKey) := scalajsOutputDir.value
}

// instantiate the JVM project for SBT with some additional settings
lazy val jvm: Project = sharedProject.jvm.settings(js2jvmSettings: _*).settings(
  // scala.js output is directed under "web/js" dir in the jvm project
  scalajsOutputDir := (classDirectory in Compile).value / "web" / "js",
  // ... remaining JVM settings
)
This will also store -jsdeps.js and .js.map files in the same folder, in case you want to use those in your web app.
For a more complete example, check out this tutorial which addresses many other issues of creating a more complex Scala.js application.
(Long question ahead. Simplified tl;dr at the bottom).
I have two ScalaJS projects built with SBT - "myapp" and "mylib", in the following directory structure
root/build.sbt
root/myapp/build.sbt
root/myapp/jvm/
root/myapp/js/
root/myapp/shared/
root/mylib/build.sbt
root/mylib/jvm
root/mylib/js
root/mylib/shared
mylib exports an artifact named "com.example:mylib:0.1", which is used as a libraryDependency for myapp.
myapp and mylib are in separate repositories, contain their own build files, and should be able to be built completely separately (i.e. they must contain their own individual build config).
In production, they will be built separately with mylib being first published as a maven artifact before building myapp separately.
In development however, I want to be able to merge these into a parent SBT project so that both can be developed in parallel without needing to use publishLocal after each change.
In a traditional (non-Scala.js) project this would be quite easy:
$ROOT/build.sbt:
lazy val mylib = project
lazy val myapp = project.dependsOn(mylib)
However in ScalaJS, we actually have two projects inside each module - appJVM, appJS, libJVM and libJS. As such, the above configuration only finds the aggregate root project and does not correctly apply the dependsOn configuration to the actual JVM and JS projects.
(i.e. myapp and mylib build.sbt each contains two projects, and an aggregate root project)
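For reference, a sketch of what each module's own build.sbt looks like (illustrative names and settings only, following the standard Scala.js crossProject pattern, not the actual build files):
// Illustrative sketch of mylib/build.sbt; myapp/build.sbt is analogous.
// crossProject generates the two real sub-projects, mylibJVM and mylibJS.
lazy val mylib = crossProject.in(file("."))
  .settings(
    organization := "com.example",
    name := "mylib",
    version := "0.1"
  )

lazy val mylibJVM = mylib.jvm
lazy val mylibJS = mylib.js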
Ideally I'd like to be able to do something like the following
lazy val mylibJVM = project
lazy val myappJVM = project.dependsOn(mylibJVM)
lazy val mylibJS = project
lazy val myappJS = project.dependsOn(mylibJS)
Unfortunately this just creates new projects within the root instead of importing the subprojects themselves.
I've also tried various combinations of paths, such as:
lazy val mylibJVM = project.in(file("mylib/jvm"))
But this doesn't pick up the configuration in the build.sbt file in mylib.
Ultimately I keep running up against the same problem: when importing an existing multi-project SBT build into a parent sbt file, it imports the root project, but it does not seem to provide a way to import a subproject from an existing multi-module SBT build in a way that lets me add a dependsOn configuration to it.
tl;dr
If I have
root/mylib/build.sbt with multiple projects defined and
root/myapp/build.sbt with multiple projects defined
Is it possible to import individual subprojects into root/build.sbt instead of the root project from the submodule?
i.e. Can I have two layers of multi-project builds?
After spending a lot of time digging through SBT source code, I managed to figure out a solution. This isn't clean, but it works. (For bonus points, it imports correctly into IntelliJ).
// Add this function to your root build.sbt file.
// It can be used to define a dependency between any
// `ProjectRef` without needing a full project definition.
def addDep(from: String, to: String) = {
  buildDependencies in Global <<= (
    buildDependencies in Global,
    thisProjectRef in from,
    thisProjectRef in to) {
      (deps, fromref, toref) =>
        deps.addClasspath(fromref, ResolvedClasspathDependency(toref, None))
  }
}
// `project` will import the `build.sbt` file
// in the subdirectory of the same name as the `lazy val`
// (performed by an SBT macro). i.e. `./mylib/build.sbt`
//
// This won't reference the actual subprojects directly,
// but will import them into the namespace such that they
// can be referenced as "ProjectRefs", which are implicitly
// converted to from strings.
//
// We then aggregate the JVM and JS ScalaJS projects
// into the new root project we've defined. (Which unfortunately
// won't inherit anything from the child build.sbt)
lazy val mylib = project.aggregate("mylibJVM","mylibJS")
lazy val myapp = project.aggregate("myappJVM","myappJS")
// Define a root project to aggregate everything
lazy val root = project.in(file(".")).aggregate(mylib,myapp)
// We now call our custom function to define a ClassPath dependency
// between `myapp` -> `mylib` for both JVM and JS subprojects.
// In particular, this will correctly find exported artifacts
// so that `myapp` can refer to `mylib` in libraryDependencies
// without needing to use `publishLocal`.
addDep("myappJVM", "mylibJVM")
addDep("myappJS","mylibJS")
I am using Play 2.2.3.
I am new to Play and SBT. I want to make Play not watch a folder during development; in this case, a node_modules folder under the public folder.
I tried the approaches below but they did not seem to work. Also, I don't know what the difference is between watchSources and playMonitoredFiles.
.settings(
  playMonitoredFiles <<= playMonitoredFiles map { (files: Seq[String]) =>
    files.filterNot(file => file.contains("node_modules"))
  }
)

.settings(
  watchSources <<= watchSources map { (sources: Seq[java.io.File]) =>
    sources
      .filterNot(source => source.isFile && source.getPath.contains("node_modules"))
      .filterNot(source => source.isDirectory && source.getPath.contains("node_modules"))
  }
)

.settings(
  watchSources := watchSources.value.filter { !_.getPath.contains("node_modules") }
)
Note: the original question was about a legacy release (2.2.3); however, I found the same question useful for modern versions.
So basically watchSources is an SBT key used for the triggered-execution functionality.
playMonitoredFiles, on the other hand, is a TaskKey defined by the Play Framework sbt plugin, which controls the set of directories watched by the development server (sbt run) for the auto-reloading functionality.
When you launch sbt run, on EVERY REQUEST the directories defined by playMonitoredFiles are checked, and if any change is found a silent re-compilation is triggered. This means shutting down the server, recompiling and starting it again.
The default values for playMonitoredFiles are calculated in playMonitoredFilesTask.
For example, we had a version file added to our resources that was regenerated on each compilation. We needed Play to stop checking this file for changes in development, because on every request the auto-reloading regenerated the file and Play marked the sources as needing recompilation again; however, no Java files were actually recompiled and nothing was printed out, so we had to debug the application to find the issue.
This setting only applies to the development server (sbt run).
Within a project's settings in build.sbt, the task can be overridden to exclude a directory (in this example, the target/version directory):
.settings(
  ...,
  playMonitoredFiles := playMonitoredFiles.value.filterNot {
    _.getPath() endsWith String.format("target%sversion", File.separator)
  },
  ...,
)
The values are of type File, and only directories are included.
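Applied to the original node_modules case, a minimal sketch might look like the following (whether node_modules shows up as its own entry depends on how Play computes the monitored directories, so treat the path check as an assumption to adapt; File here is java.io.File):
.settings(
  // Sketch: drop any monitored directory whose path contains a
  // public/node_modules segment (the exact layout is an assumption).
  playMonitoredFiles := playMonitoredFiles.value.filterNot {
    _.getPath.contains(String.format("public%snode_modules", File.separator))
  }
)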
I'm having trouble concatenating and fingerprinting all the CoffeeScript files in a Play application. Everything works fine for JavaScript files with a build.sbt like this one:
pipelineStages := Seq(concat, digest)

Concat.groups := Seq(
  "javascripts/app.js" -> group(((sourceDirectory in Assets).value / "javascripts") * "*.js")
)
But when sourceDirectory is changed to resourceManaged, which supposedly contains the compiled CoffeeScript files, sbt-concat doesn't pick them up.
sbt-coffeescript, and all other official source task plugins, don't put their files in resourceManaged in Assets, but in their own sub-directory under target/web/<taskname>. They scope the resourceManaged setting to their main task; in this case that means resourceManaged in (Assets, coffeescript) and resourceManaged in (TestAssets, coffeescript).
When you run sbt coffeescript you can see the files are output to target/web/coffeescript/main. You can verify this by running show web-assets:coffeescript::resourceManaged from the sbt console.
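Building on that, a hedged sketch of how the Concat.groups setting from the question might point at that output instead of sourceDirectory (the javascripts sub-path is an assumption about where the compiled files land, and depending on the plugin version the task key may need to be written as CoffeeScriptKeys.coffeescript):
// Sketch: concatenate the compiled CoffeeScript output rather than sources
Concat.groups := Seq(
  "javascripts/app.js" ->
    group(((resourceManaged in (Assets, coffeescript)).value / "javascripts") * "*.js")
)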
I've been writing an SBT plugin that generates resources into resource_managed. I'm now looking to include these generated resources in the generated jar as the SBT docs detail:
Generating resources:
By default, generated resources are not included in the packaged source artifact. To do so, add them as you would other mappings. See Adding files to a package
I've read the docs but honestly can't figure out how to do this. Can anyone explain it, or point me to another project that does this so I can see how they do it?
First just to clarify, they are included in jars containing compiled classes. They are not included in jars containing sources.
By default, generated resources are not included in the packaged source artifact.
For packageBin the generated files should already be included - just make sure you return all generated files from the generator method. Assuming you want to package them in the sources artifact, this is what you have to do.
Let's assume you have a generator that generates a property file.
lazy val generatePropertiesTask = Def.task {
  val file = (Compile / resourceManaged).value / "stack-overflow" / "res.properties"
  val contents = s"name=${name.value}\nversion=${version.value}"
  IO.write(file, contents)
  Seq(file)
}
resourceGenerators in Compile += generatePropertiesTask.taskValue
To include that in the generated sources you have to tell sbt where res.properties must be copied in the packaged sources artifact. The task that generates the packaged sources is called packageSrc, so you have to set mappings scoped to that task.
mappings in (Compile, packageSrc) += {
  ((resourceManaged in Compile).value / "stack-overflow" / "res.properties") -> "path/in/jar/res.properties"
}
Because your generator can generate many files, and mapping each one by hand would be tedious, sbt gives you a utility to map multiple paths at once.
mappings in (Compile, packageSrc) ++= {
  val allGeneratedFiles = ((resourceManaged in Compile).value ** "*") filter { _.isFile }
  allGeneratedFiles.get pair relativeTo((resourceManaged in Compile).value)
}
The first line finds all generated files using path finders, and the second line maps them to their paths in the target jar.
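To sanity-check the result, something like the following (old-style sbt shell syntax, as used elsewhere in this thread) should list the file-to-path pairs that will end up in the sources jar:
// From the sbt console: show the mappings computed for packageSrc
> show compile:packageSrc::mappings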
I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only a subset of classes conforming to an API which we need to share with other teams so they can write plugins for our application. The end result will be two jars: one with the full application and a second one with a subset of the classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so that it does not change the original definition of packageBin. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
which generates the complete jar in the same way as compile:packageBin would do. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However I would have to rewrite all the packageBin functionality instead of reusing existing code. Also, if rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You could do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

inConfig(PluginApi)(Defaults.compileSettings) // you have to have the standard compile settings in the new configuration

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala, you'll have to override unmanagedSourceDirectories in the newly created configuration.
Normally the unmanagedSourceDirectories contains the configuration name. E.g. src/pluginApi/scala or src/pluginApi/java.
I have had similar problems (with more than one jar per project). Our project uses Ant; there you can do it, you just end up repeating yourself a lot.
However, I have come to the conclusion that this scenario (2 JARs for one project) actually can be simplified by splitting the project - i.e. making 2 modules out of it.
This way, I don't have to "fight" tools which assume project==artifact (like sbt, maybe maven?, IDEA's default setting,...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and the classes are only split apart in the JAR step, you run the risk of an invalid dependency in your setup that you would only notice when testing, because at compile time everything is still compiled together.
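A minimal sketch of that split in sbt, with purely illustrative module names (not taken from the question):
// Hypothetical layout: an api module and an impl module, each producing its
// own jar. impl may depend on api; the compiler now rejects the reverse.
lazy val api = project.in(file("api"))
  .settings(name := "myapp-api")

lazy val impl = project.in(file("impl"))
  .dependsOn(api)
  .settings(name := "myapp")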