How can scala-js integrate with sbt-web?

I would like to use scala-js with sbt-web in such a way that it can be compiled to produce javascript assets that are added to the asset pipeline (e.g. gzip, digest). I am aware of lihaoyi's workbench project but I do not believe this affects the asset pipeline. How can these two projects be integrated as an sbt-web plugin?

Scala.js produces JavaScript files from Scala files. The sbt-web documentation calls this a source file task.
The result would look something like this:
val compileWithScalaJs = taskKey[Seq[File]]("Compiles with Scala.js")

compileWithScalaJs := {
  // call the correct compilation function / task on the Scala.js project
  // copy the resulting javascript files to webTarget.value / "scalajs-plugin"
  // return a list of those files
}

sourceGenerators in Assets <+= compileWithScalaJs
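For illustration, here is one way the body might be filled in. This is a minimal sketch, not a tested implementation: it assumes a sibling Scala.js project named js, and the Scala.js 0.6-style fastOptJS task, which returns an Attributed[File] (hence the .data):

compileWithScalaJs := {
  // run the Scala.js fast optimizer on the js project (hypothetical project name)
  val jsFile = (fastOptJS in (js, Compile)).value.data
  // copy the generated javascript under sbt-web's target directory
  val out = webTarget.value / "scalajs-plugin" / jsFile.getName
  IO.copyFile(jsFile, out)
  Seq(out)
}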
Edit
Creating a plugin involves a bit more work. Scala.js is not yet an AutoPlugin, so you might have some problems with dependencies.
The first part is adding the Scala.js sbt plugin as a dependency of your plugin. You can do that with code like this:
libraryDependencies += Defaults.sbtPluginExtra(
  "org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % "0.5.5",
  (sbtBinaryVersion in update).value,
  (scalaBinaryVersion in update).value
)
Your plugin would look something like this:
object MyScalaJsPlugin extends AutoPlugin {
  /*
    Other settings like autoImport and requires (for the sbt-web dependency),
    see the link above for more information
  */
  override def projectSettings = scalaJSSettings ++ Seq(
    // here you add the sourceGenerators in Assets implementation
    // these settings are scoped to the project, which also gives you
    // access to directories within the project
  )
}
Then in a project that uses this plugin you can do:
lazy val root = project.in(file(".")).enablePlugins(MyScalaJsPlugin)

Have a look at sbt-play-scalajs. It is an additional sbt plugin that helps with integration between Play / sbt-web and Scala.js.
Your build file will look something like this (copied from the README):
import sbt.Project.projectToRef

lazy val jsProjects = Seq(js)

lazy val jvm = project.settings(
  scalaJSProjects := jsProjects,
  pipelineStages := Seq(scalaJSProd)
).enablePlugins(PlayScala)
 .aggregate(jsProjects.map(projectToRef): _*)

lazy val js = project.enablePlugins(ScalaJSPlugin, ScalaJSPlay)

Related

How do I create an sbt task to generate code, then include these generated managed sources in my root project?

I would like to have an sbt task that I can run to generate some code. I don't want to generate this with each run, just manually run this task once in a while. I created a skeleton project to explain (https://github.com/jinyk/sbtmanagedsrc).
build.sbt:
lazy val root = (project in file("."))
  .settings(scalaVersion := "2.11.8")
  .settings(gensomecode := genSomeCodeTask.value)
  /////////////////////////////////////////////////////////////
  // fugly way to get managed sources compiled along with main
  .settings(unmanagedSourceDirectories in Compile += baseDirectory.value / "target/scala-2.11/src_managed/")
  /////////////////////////////////////////////////////////////

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

lazy val genSomeCodeTask = Def.task {
  val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
  println("file: " + file)
  IO.write(file, """object SomeGenCode {
    |  def doSomething() {
    |    println("Hi!")
    |  }
    |}""".stripMargin)
  Seq(file)
}
So with the build.sbt above I can run sbt gensomecode, which creates target/scala-2.11/src_managed/main/SomeGenCode.scala, the default place where sbt puts "managed sources."
I would like to make this SomeGenCode available to the root project.
src/main/scala/Main.scala:
object Main extends App {
SomeGenCode.doSomething()
}
The only thing I can figure out to do is to include the default sourceManaged directory in the root project's unmanagedSourceDirectories (see the line below the "fugly way..." comment in build.sbt above). This is ugly as hell and doesn't seem like it's how managed sources are supposed to be handled.
I'm probably not understanding something basic about sbt's managed sources concept or how to handle the situation of creating an sbt task to generate sources.
What am I missing?
There are three options that I am familiar with:

1. Generate into the unmanaged source directories.
2. Generate on every run, by adding sourceGenerators in Compile <+= gensomecode.
3. Similar to (2), but use caching so it doesn't regenerate the file on every compile. Full example below.
In this example, the cache is based on the content of build.sbt, so whenever that file is changed it will regenerate the file.
lazy val root = (project in file("."))
  .settings(scalaVersion := "2.11.8")
  .settings(gensomecode <<= genSomeCodeTask)

sourceGenerators in Compile <+= genSomeCodeTask

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

def generateFile(sourceManaged: java.io.File) = {
  val file = sourceManaged / "main" / "SomeGenCode.scala"
  println("file: " + file)
  IO.write(file, """object SomeGenCode {
    |  def doSomething() {
    |    println("Hi!")
    |  }
    |}""".stripMargin)
  Set(file)
}

def genSomeCodeTask = (sourceManaged in Compile, streams).map {
  (sourceManaged, streams) =>
    val cachedCompile = FileFunction.cached(
      streams.cacheDirectory / "mything",
      inStyle = FilesInfo.lastModified,
      outStyle = FilesInfo.exists) {
      (in: Set[java.io.File]) =>
        generateFile(sourceManaged)
    }
    cachedCompile(Set(file("build.sbt"))).toSeq
}
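For reference, the same cached generator can also be written in the newer Def.task style. This is a sketch with equivalent logic; note that the .value calls are hoisted out of the lambda, since sbt does not allow them inside nested functions:

def genSomeCodeTask = Def.task {
  val managedDir = (sourceManaged in Compile).value
  val cacheDir = streams.value.cacheDirectory / "mything"
  val cachedCompile = FileFunction.cached(
    cacheDir,
    inStyle = FilesInfo.lastModified,
    outStyle = FilesInfo.exists) { (in: Set[java.io.File]) =>
    generateFile(managedDir)
  }
  cachedCompile(Set(file("build.sbt"))).toSeq
}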
I hope I'm not too late with an answer, but let's look at this section about Unmanaged vs Managed files:
Classpaths, sources, and resources are separated into two main categories: unmanaged and managed. Unmanaged files are manually created files that are outside of the control of the build. They are the inputs to the build. Managed files are under the control of the build. These include generated sources and resources as well as resolved and retrieved dependencies and compiled classes.
It seems that the key difference between "Unmanaged vs Managed" is "Manually vs Automatically". Now, if we look at the documentation for "generating files", we will notice immediately that it means "generating files automatically", since the generation happens at sbt compile:
Compile / sourceGenerators += <task of type Seq[File]>.taskValue
That makes sense, since anything generated during sbt compile should be removed during sbt clean.
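As a concrete example, a minimal generator wired this way might look like the following (a sketch; the file and object names are made up):

Compile / sourceGenerators += Def.task {
  val file = (Compile / sourceManaged).value / "Hello.scala"
  // regenerated on compile, removed on clean -- both managed by the build
  IO.write(file, """object Hello { def greet(): Unit = println("Hi!") }""")
  Seq(file)
}.taskValue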
Now, from your code below, it seems that you were trying to generate an unmanaged source file (you were not using sourceGenerators, were you?) into the managed source directory. The most obvious problem with this is that your source file will be removed every time you call sbt clean, so you have to run the task again to get the file back (worse, you have to run it manually, as opposed to having sbt compile do it for you), thus defeating your purpose of doing it manually once in a while.

val file = (sourceManaged in Compile).value / "SomeGenCode.scala"

To fix this, you have to manually generate the files into an unmanaged source directory, which is basically your source code directory (it depends -- mine is "/app"). Still, you should mark somehow that these files are generated. My solution is something like:
val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
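Putting it together, the task from the question could simply write into that directory. A sketch under this approach (untested):

lazy val gensomecode = taskKey[Seq[File]]("gen-code")

gensomecode := {
  // write into the regular (unmanaged) source tree, under a "generated" folder
  val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
  IO.write(file, """object SomeGenCode {
    |  def doSomething() {
    |    println("Hi!")
    |  }
    |}""".stripMargin)
  Seq(file)
}

Running sbt gensomecode once then leaves the file in place across sbt clean.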
Hope this helps!

How to create a custom package task to jar a subset of classes in SBT

I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only the subset of classes conforming to an API which we need to be able to share with other teams so they can write plugins for our application. So the end result will be two jars: one with the full application and a second one with a subset of the classes.
I approached this problem by creating a different configuration which I called pluginApi and would redefine the packageBin task within this new configuration so it does not change the original definition of packageBin. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
Which generates the complete jar in the same way as compile:packageBin would do. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However, I would have to rewrite all the packageBin functionality instead of reusing existing code. Also, if rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You can do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

inConfig(PluginApi)(Defaults.compileSettings) // you have to have the standard compile settings in the new configuration

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala you'll have to override unmanagedSourceDirectories in the newly created configuration, as the last line above does. Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
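One detail worth knowing: referencing mappings in (PluginApi, packageBin) on the right-hand side of its own := definition refers to the previously defined value (sbt rewrites such self-references), so the snippet filters the default mappings rather than recursing.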
I have had similar problems (with more than one jar per project). Our project uses ant; there you can do it, you just end up repeating yourself a lot.
However, I have come to the conclusion that this scenario (2 JARs for one project) actually can be simplified by splitting the project - i.e. making 2 modules out of it.
This way, I don't have to "fight" tools which assume project==artifact (like sbt, maybe maven?, IDEA's default setting,...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and the classes are only split apart in the JAR step, you run the risk of an invalid dependency in your setup that you would only notice when testing.
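In sbt, such a split might look like this (a sketch; the module names are illustrative):

// api holds only the classes shared with plugin authors
lazy val api = (project in file("api"))

// the application proper depends on api, never the other way around,
// so the compiler enforces the dependency direction at build time
lazy val app = (project in file("app"))
  .dependsOn(api)

Each module then gets its own packageBin jar for free, with no custom packaging task required.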

How can I configure Scalatra (sbt) to find the webapp folder in a different source folder than "main"

By default, Scalatra expects the "webapp" directory to be at src/main/webapp. How could that be changed to, e.g., content/doc-root?
sbt allows for customizing its default directories using something like the following:
scalaSource <<= (baseDirectory)(_ / "src")
So I assume it's just a matter of knowing the right "configuration key" to use...
@Kelsey Gilmore-Innis has the right answer, but since it's not accepted let's break it, break it, break it down.
First, I'm assuming you're following Getting started guide to install Scalatra using g8. Hopefully the same version I just got.
g8 scalatra/scalatra-sbt
What that g8 template did was set up an sbt 0.13 build that uses the scalatra-sbt 0.3.2 plugin:
addSbtPlugin("org.scalatra.sbt" % "scalatra-sbt" % "0.3.2")
This plugin internally uses JamesEarlDouglas/xsbt-web-plugin 0.4.0 to provide the webapp-related settings.
xsbt-web-plugin 0.4.0
This is why xsbt-web-plugin becomes relevant even though you just want to change Scalatra's setting. The setting you need to rewire is called webappResources in Compile. How does that work?
rewiring webappResources
To rewire the setting, open project/build.scala. Add
import com.earldouglas.xsbtwebplugin.PluginKeys.webappResources
to the import clauses. Then change settings as follows:
lazy val project = Project(
  "foo",
  file("."),
  settings = Defaults.defaultSettings ++ ScalatraPlugin.scalatraWithJRebel ++ scalateSettings ++ Seq(
    organization := Organization,
    name := Name,
    version := Version,
    scalaVersion := ScalaVersion,
    resolvers += Classpaths.typesafeReleases,
    webappResources in Compile := Seq(baseDirectory.value / "content" / "doc-root"),
    ...
  )
)
Now move src/main/webapp to content/doc-root, reload sbt, and that should be it.
The resource folder is a Jetty property. If you're running embedded Jetty, it's specified here. You can edit it manually or override it by setting the PUBLIC environment variable.
You can also override it in your SBT build file. Scalatra uses the xsbt-web-plugin to run, and you can override that plugin's settings.
For newer versions of xsbt-web-plugin (1.0.0 as of writing), the way of changing the source path is different. First of all, the corresponding settings were moved to XwpPlugin.webappSettings, and you need these two:
webappSrc in webapp <<= (baseDirectory in Compile) map { _ / "content" / "doc-root" },
webappDest in webapp <<= (baseDirectory in Compile) map { _ / "content" / "doc-root" },
If you don't want to change the sbt settings, you can also do it programmatically by overriding serveStaticResource and using forward:
override protected def serveStaticResource(): Option[Any] = {
  // check to see if we need to alter the path to find the true on-disk URL
  val incUrl = request.getRequestURI
  if (incUrl.startsWith("/otherDir")) {
    servletContext.resource(request) map { _ =>
      servletContext.getNamedDispatcher("default").forward(request, response)
    }
  } else {
    val trueUrl = "/otherDir" + incUrl
    Option(servletContext.getRequestDispatcher(trueUrl).forward(request, response))
  }
}
Disclaimer: You should also check that it doesn't go into an infinite loop.

Unable to get a type from different package in Scala

I have a project that consists of several subprojects. Let's say, I have three of them:
service
core
common
In my build.scala, I have the following definitions:
lazy val root = Project("root", file("."), settings = Info.settings) aggregate(common, core, service)

lazy val common = Project("common", file("common"), settings = Info.settings)

lazy val core = Project("core", file("appcore"), settings = Info.settings ++ Seq(libraryDependencies ++= dependencies)) dependsOn common

lazy val service = Project("Service", file("service"), settings = Info.gatewaySettings ++ Seq(resolvers := packageResolvers, libraryDependencies ++= gatewayDeps)) dependsOn(common, core)
I use IDEA for development and therefore sbt-idea 1.4.0 for the generation of IDEA-specific files.
I have created a class in 'common' (User, in package com.project.common.domain) and I would like to use it from my 'Service' module. I can't; it simply doesn't see it. I have checked the iml file, and it contains the dependencies.
Has anyone seen this issue?
The problem was that all the classes, for some reason, were created in /test/scala instead of /main/scala.

Making stand-alone jar with Simple Build Tool

Is there a way to tell sbt to package all needed libraries (scala-library.jar) into the main package, so it is stand-alone? (static?)
Edit 2011:
Since then, retronym (who posted an answer on this page back in 2010) made the sbt plugin "sbt-onejar", now at its new address on GitHub, with docs updated for SBT 0.12.
Packages your project using One-JAR™
onejar-sbt is a simple-build-tool plugin for building a single executable JAR containing all your code and dependencies as nested JARs.
Currently One-JAR version 0.9.7 is used. This is included with the plugin, and need not be separately downloaded.
Original answer:
Directly, this is not possible without extending sbt (a custom action after the model of the "package" sbt action).
GitHub mentions an assembly task, custom made for jetty deployment. You could adapt it for your need though.
The code is pretty generic (from this post, and user Rio):
project/build/AssemblyProject.scala:
import sbt._

trait AssemblyProject extends BasicScalaProject {
  def assemblyExclude(base: PathFinder) = base / "META-INF" ** "*"
  def assemblyOutputPath = outputPath / assemblyJarName
  def assemblyJarName = artifactID + "-assembly-" + version + ".jar"
  def assemblyTemporaryPath = outputPath / "assembly-libs"
  def assemblyClasspath = runClasspath
  def assemblyExtraJars = mainDependencies.scalaJars

  def assemblyPaths(tempDir: Path, classpath: PathFinder, extraJars: PathFinder, exclude: PathFinder => PathFinder) = {
    val (libs, directories) = classpath.get.toList.partition(ClasspathUtilities.isArchive)
    for (jar <- extraJars.get ++ libs) FileUtilities.unzip(jar, tempDir, log).left.foreach(error)
    val base = (Path.lazyPathFinder(tempDir :: directories) ##)
    (descendents(base, "*") --- exclude(base)).get
  }

  lazy val assembly = assemblyTask(assemblyTemporaryPath, assemblyClasspath, assemblyExtraJars, assemblyExclude) dependsOn (compile)

  def assemblyTask(tempDir: Path, classpath: PathFinder, extraJars: PathFinder, exclude: PathFinder => PathFinder) =
    packageTask(Path.lazyPathFinder(assemblyPaths(tempDir, classpath, extraJars, exclude)), assemblyOutputPath, packageOptions)
}
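To use it, mix the trait into an sbt 0.7-style project definition, something like this (a sketch; the class name is illustrative):

// project/build/MyProject.scala
import sbt._

class MyProject(info: ProjectInfo) extends DefaultProject(info) with AssemblyProject

Running the assembly action then writes <artifactID>-assembly-<version>.jar to the output path defined by assemblyOutputPath above.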
It takes a bit of work, but you can also use Proguard from within SBT to create a standalone JAR.
I did this recently in the SBT build for Scalala.
Working off of what @retronym offered above, I built a simple example that builds a stand-alone jar which includes the Scala libraries (i.e. scala-library.jar) using Proguard with sbt. Thanks, retronym.
The simplest example using sbt-assembly
Create a project directory in your project's base directory, with a file assembly.sbt containing:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
In build.sbt:

import AssemblyKeys._ // put this at the top of the file

assemblySettings

jarName in assembly := "myjarnameall.jar"

libraryDependencies ++= Seq("exampleofmydependency" % "mydep" % "0.1")

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case s if s.endsWith(".class") => MergeStrategy.last
    case s if s.endsWith("pom.xml") => MergeStrategy.last
    case s if s.endsWith("pom.properties") => MergeStrategy.last
    case x => old(x)
  }
}
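With that in place, running sbt assembly builds the single jar (here myjarnameall.jar) under the target directory.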
The simplest method is just to create the jar from the command line. If you don't know how to do this, I would highly recommend that you learn: automation is useful, but it's much better if you know what the automation is doing.
The easiest way to automate the creation of a runnable jar is to use a bash script, or a batch script on Windows.
The simplest way in sbt is just to add the Scala libraries you need to the resource directories:
unmanagedResourceDirectories in Compile := Seq(file("/sdat/bins/ScalaCurrent/lib/scalaClasses"))
So in my environment, ScalaCurrent is a link to the current Scala library (2.11.4 at the time of writing). The key point is that I extract the Scala library but place its contents inside the scalaClasses directory. Each extracted library needs to go into its own top-level directory.