In Play, how to copy js files from ScalaJS client to server? - scala.js

I'm trying to write an app that's part ScalaJS and part Play framework. I'm using the ScalaJS bundler. It's bundling my JavaScript fine and I can see the resulting files where they are supposed to go.
But I noticed that only client-jsdeps.js and client-fastopt.js are available to the app. The reason for that is that they are the only files copied to the path server/target/web/public/main. I have looked everywhere I could think of, sbt build files, config files, everywhere, and I could not find why those files are copied over and no others. I'd like the -bundle files to be copied instead. Where is that setting?
It is worth noting that the two files that are packaged with the app do not appear in the user-editable path, server/public/js; they are copied directly into the WAR file and are therefore only visible in the target directory.

From the documentation:
For sbt-web integration use the sbt-web-scalajs-bundler plugin instead of sbt-scalajs-bundler:
addSbtPlugin("ch.epfl.scala" % "sbt-web-scalajs-bundler" % "0.13.1")
Then, enable the WebScalaJSBundlerPlugin on the project that uses sbt-web:
lazy val server = project
  .settings(
    scalaJSProjects := Seq(client),
    pipelineStages in Assets := Seq(scalaJSPipeline)
  )
  .enablePlugins(WebScalaJSBundlerPlugin)
lazy val client = project.enablePlugins(ScalaJSBundlerPlugin, ScalaJSWeb)
You also need to setup the ScalaJSBundlerPlugin on the Scala.js project, as described in the preceding section, and
the sbt-web-scalajs plugins as described in their documentation.
The WebScalaJSBundlerPlugin plugin automatically configures the scalaJSPipeline task to use
the bundles rather than the output of the Scala.js compilation.
You can see a complete example here.
Did you follow that guide?
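For reference, the project/plugins.sbt of such a build usually pulls in the Scala.js, sbt-web-scalajs and bundler plugins together. A sketch (the version numbers here are illustrative, not prescriptive):
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.22")
addSbtPlugin("com.vmunier" % "sbt-web-scalajs" % "1.0.6")
addSbtPlugin("ch.epfl.scala" % "sbt-web-scalajs-bundler" % "0.13.1")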

Related

How to share .proto (protobuf) files using a shared scala library using sbt

I have a few apps using shared .proto files. Each app's repo currently contains a copy of the files, which is not ideal and has recently created a problem when they accidentally diverged.
I would like to store the .proto files in a shared library which is already a common dependency for these apps. We're using sbt-protoc which has documentation for including .proto files from external libraries, but I can't find any information on how to package libraries that include them.
The .proto files are located in src/main/protobuf, but do not appear in the generated JAR, which is presumably standard behaviour. I know you can tell sbt to include specific resource files, but I don't know whether I've missed how to do it using sbt-protoc.
To get protos included in the JAR, you can rely on standard sbt functionality, by adding a setting such as:
Compile / unmanagedResourceDirectories += sourceDirectory.value / "protobuf"
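On the consuming side, sbt-protoc can then make the .proto files inside that published JAR importable by adding the dependency in the protobuf configuration. A minimal sketch, assuming the shared library is published under the hypothetical coordinates com.example %% shared-protos:
// the % "protobuf" scope tells sbt-protoc to unpack this artifact's .proto files
// and put them on protoc's include path, so your own protos can import them
libraryDependencies += "com.example" %% "shared-protos" % "1.2.3" % "protobuf"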

What is the correct way of using static assets in a Scala SBT project?

What is the best way of using static resources in an SBT-based Scala project with a packaging plugin such as sbt-assembly or sbt-native-packager?
We know that by using Typesafe Config with sbt-native-packager's universal plugin, we can just put the configuration file in the resources directory under sources. However, what if I wanted my application to have other static resources, such as JSON files containing mappings and models?
I understand that I can just reference the resources directory and read from the file, but would that still work after packaging the application with plugins (assuming the universal or docker plugin in this case)?
If not, what is the correct way to achieve this?
You could use
unmanagedResourceDirectories += (baseDirectory in <project>).value / some / path
to add more directories that are later mapped into the jar as static resources.
Put it into the resources directory, but don't "reference the resources directory and read from the file": use ClassLoader.getResourceAsStream() (or getResources, depending on your requirements) instead. This is the same technique Typesafe Config and innumerable other libraries use. For this it doesn't matter whether you use sbt-native-packager or not.
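As a small illustration of that (the resource name mappings.json is just a placeholder), a classpath lookup works identically from sbt, from a fat JAR, or from a native-packager build, because it never assumes a real directory on disk:
// mappings.json is assumed to live under src/main/resources
val in = getClass.getClassLoader.getResourceAsStream("mappings.json")
require(in != null, "mappings.json not found on the classpath")
val json = scala.io.Source.fromInputStream(in).mkString  // read it; parse with your JSON library of choice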
This approach runs into problems if you need to make these resources available specifically as files (e.g. to feed them to an external process). In this case add them to mappings as shown here:
mappings in Universal in packageBin += file("README") -> "README"
(obviously replacing "README" with the file(s) you need).
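If you need a whole directory of such files in the package rather than a single one, sbt-native-packager's MappingsHelper can map it in one go. A sketch, assuming the files sit in a data directory at the project root:
import com.typesafe.sbt.packager.MappingsHelper._
// maps every file under <project>/data into the package, preserving the data/ prefix
mappings in Universal ++= directory(baseDirectory.value / "data")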

How to use ForkOptions in SBT to change working directory for test within subproject?

I currently have a multiproject SBT build with 2 projects under it, one of which has a dependency on the other. The dependent one has a test in which it needs to load a file from a certain directory structure underneath its working directory. It uses a relative path in a configuration file to designate this directory structure.
The issue is that depending on whether I am running this test through my IDE (with working directory at the subproject level) or at the SBT umbrella project level (with working directory at the umbrella level) makes a difference in the ability of my test to load this file via its relative path and succeed.
I need to use a relative path so that other developers working on this project may use the checked-in code out of the box, and duplicating the directory structure and contained files at two levels in the project is out of the question. What I really need to do is direct SBT to move the working directory into the subproject when doing tests, so that the directory structure can remain the same regardless of where the test is initiated from.
SBT offers a ForkOptions class (http://www.scala-sbt.org/0.13.0/api/index.html#sbt.ForkOptions), described further at the bottom of this page: http://www.scala-sbt.org/0.13/docs/Forking.html. It seems one can supply a working directory for the forked JVM to be started in, but there are no good examples of how to set up the configuration in a root build.sbt or supply a ForkOptions instance to a test task.
Does anyone have any experience using this class, and/or can anyone offer some guidance on getting this functionality out of a multiproject build in SBT?
The solution is to provide the following settings to the project definition in the root build.sbt.
lazy val yourProject = project.settings(
  fork := true,
  baseDirectory in test := file("yourProject")
)
These cause the JVM to fork for the tests in that subproject and change the working directory to the base of that subproject.

Play Framework Project: How to include plugin from source

Background:
I am in the process of integrating TypeScript into a Play Framework (2.2.6) project and I am trying to use mumoshu's plugin to do so. The problem is that the plugin has issues when running "play dist" on a Windows machine.
I've forked the code from GitHub in order to make some modifications to the source so I can continue using the plugin.
Question:
I have a Play Framework plugin in the traditional source structure:
project/build.properties
project/Build.scala
project/plugins.sbt
src/main/scala/TypeScriptPlugin
src/main/scala/TypeScriptKeys.scala
...<other code>
I'd like to include this plugin into another project but I don't really know where to start and how to hookup the settings.
From previous suggestions, I've been able to add the module to my project as follows:
// In project/Build.scala...
object ApplicationBuild extends Build {
  lazy val typeScriptModule = ProjectRef(file("../../../play2-typescript"), "play2-typescript")
  lazy val main = play.Project(<appName>, <appVersion>, <appDependencies>)
    .settings(typescriptSettings: _*)
    .dependsOn(typeScriptModule)
    .aggregate(typeScriptModule)
}
Where typescriptSettings is defined in the other project... I think. I'm still not 100% sure what typescriptSettings IS, other than that adding this settings call made the plugin work. This worked fine originally, when I had included the plugin in the plugins.sbt file and imported the package com.github.mumoshu.play2.typescript.TypeScriptPlugin._, but now that I'm including the source and explicitly including the module, I can't just import the package... or at least not the way I used to.
I am still new to scala/sbt and I am having difficulty finding helpful resources online. Any help would be appreciated.
Assuming in the same parent directory you have two directories:
play2-typescript: which is a clone of mumoshu's play2-typescript
play2-typescript-testapp: which is the play app in which you're testing your changes
You need to create, inside play2-typescript-testapp's project directory, a file like so:
play2-typescript.sbt
val metaBuild = (project in file("."))
  .dependsOn(ProjectRef(file("../../play2-typescript"), "play2-typescript"))
Note:
The relative path is to the play2-typescript plugin project, and is relative to the project directory inside play2-typescript-testapp.
Change it to whatever is correct in your setup, and note that you can also define it as an absolute path.
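With that source dependency in place, the plugin's classes are on the build classpath again, so the old import should resolve in project/Build.scala. A sketch, assuming the fork still exposes typescriptSettings in the same package (the app name and version are placeholders):
import sbt._
import com.github.mumoshu.play2.typescript.TypeScriptPlugin._  // now resolved from the source dependency
object ApplicationBuild extends Build {
  lazy val main = play.Project("my-app", "1.0-SNAPSHOT")
    .settings(typescriptSettings: _*)
}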
There are lots of Activator template examples of this. I have a project where we followed the https://typesafe.com/activator/template/play-multidomain-auth path. Specifically, to address your question: the plugins in the root project play-multidomain-auth/project/ are accessible in the modules (play-multidomain-auth/modules/admin/, .../common, and .../web).
This is the cleanest example of multi-project design I've seen, though that opinion is very subjective.
I hope this helps.

SBT: Simplest way to separate plugins.sbt

This is a very simple question, but surprisingly I haven't gotten an answer for it yet.
Simply put, in most non-trivial SBT projects you will have a plugins.sbt file that contains the plugins required to run your project (like a web container plugin if your SBT project is a website). However, plugins that have nothing to do with actually running your project (such as ENSIME/IntelliJ/Eclipse project generators) are also typically placed in that same plugins.sbt file.
I have seen this behavior in many SBT projects hosted on GitHub.
Ideally speaking, this is not the correct way to do things: plugins that have nothing to do with actually running/compiling your project should live in a separate file that is added to .gitignore.
What is the idiomatic SBT way of dealing with this? (I assume it involves having two separate plugins files, one with the actual project plugins and the other with IDE generators and whatnot.)
You can install plugins globally by placing them in ~/.sbt/0.13/plugins/. Any .sbt or .scala files located there are loaded for every project that you have.
You can also use addSbtPlugin() in a .sbt file to add other plugins.
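For example, a global file such as ~/.sbt/0.13/plugins/ide.sbt (the file name is arbitrary) can hold the IDE-only plugins so they never appear in the project's own plugins.sbt; the versions below are illustrative:
// ~/.sbt/0.13/plugins/ide.sbt -- loaded for every build on this machine
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")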
Check out http://www.scala-sbt.org/release/docs/Getting-Started/Using-Plugins.html