Groping in the dark, I just resorted to a pathetic hack (note the path backtracking):
(resourceManaged in (Compile, CoffeeKeys.coffee)) <<=
(crossTarget in Compile)(_ / "../../../apache/static" / "js")
Is there any way to specify the absolute target write path with coffeescripted-sbt? The intro/overview states
You can override this behavior by overriding the resourceManaged
setting scoped to your configuration and the CoffeeKeys.coffee task.
Below is an example you can append to your build definition which will
copy generated javascript to target/:scala-version/your_preference/js
That's great, but I'd like to write directly to the Apache static directory, not four levels deep inside my sbt-eclipse project.
I should note that I'm getting the "angry unicorn" error page on GitHub quite often these days, so the issue tracker isn't much help.
Thanks for any clues. What I have works, but I'd like to know how to set an absolute target path properly.
(resourceManaged in (Compile, CoffeeKeys.coffee)) <<=
(crossTarget in Compile)(_ / "pref" / "js")
This sets the compile target relative to the default, which is "project_root/target/scala-version/".
The solution is refreshingly simple:
resourceManaged in (Compile, CoffeeKeys.coffee) :=
file("/absolute/path/to/apache/static/js")
SBT user group thread
Related
I'm working on a Scala.js cross project where the jvm folder represents my server application and js represents my Scala.js code.
Whenever I compile my Scala.js code via sbt crossJS/fastOptJS, the compiled JS ends up in ./js/target/scala-2.11/web-fastopt.js.
I need this compiled JS file to be accessible in the resources of the server project in the jvm folder, so I can serve it through my web application. I think I have to do something with artifactPath, but I can't seem to get any results from my experiments so far.
You can simply set the artifactPath of the fastOptJS task (or the fullOptJS task) to the (managed) resources directory of your JVM project:
// In the JS project's settings
artifactPath in fastOptJS in Compile :=
(resourceManaged in jvm in Compile).value /
((moduleName in fastOptJS).value + "-fastopt.js")
This will put the file in that directory when you run the fastOptJS task. However, it will not be included in sbt's resources task, and it will not be triggered automatically when you launch your server. Therefore:
// In the JVM project's settings
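// fastOptJS yields an Attributed[File]; .data extracts the plain File so it can be added to resources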
resources in Compile += (fastOptJS in js).value.data
A couple of notes:
The first step is only necessary if your web server serves only specific directories. Otherwise the second one is enough, as it already adds the file to the resources; where the file physically lives is secondary.
Setting the crossTarget, as in @ochrons' answer, will also output all the .class and .sjsir files into the resource directory.
Have a look at Vincent Munier's sbt-play-scalajs for out-of-the-box sbt-web / Scala.js integration (it follows a slightly different approach: it copies the file from the js project rather than placing it directly in the JVM project, which is useful if you have multiple JVM projects).
You can configure the Scala.js SBT plugin to output the JavaScript file in a folder of your choosing. For example like this:
// configure a specific directory for scalajs output
val scalajsOutputDir = Def.settingKey[File]("directory for javascript files output by scalajs")
// make all JS builds use the output dir defined later
lazy val js2jvmSettings = Seq(fastOptJS, fullOptJS, packageJSDependencies) map { packageJSKey =>
  crossTarget in (js, Compile, packageJSKey) := scalajsOutputDir.value
}
// instantiate the JVM project for SBT with some additional settings
lazy val jvm: Project = sharedProject.jvm.settings(js2jvmSettings: _*).settings(
  // scala.js output is directed under the "web/js" dir in the jvm project
  scalajsOutputDir := (classDirectory in Compile).value / "web" / "js",
  // ... other JVM settings ...
)
This will also store -jsdeps.js and .js.map files in the same folder, in case you want to use those in your web app.
For a more complete example, check out this tutorial which addresses many other issues of creating a more complex Scala.js application.
I'm having trouble concatenating and fingerprinting all the CoffeeScript files in a Play application. Everything works fine for JavaScript files with a build.sbt like this one:
pipelineStages := Seq(concat, digest)
Concat.groups := Seq(
"javascripts/app.js" -> group(((sourceDirectory in Assets).value / "javascripts") * "*.js")
)
But when sourceDirectory is changed to resourceManaged, which supposedly contains the compiled CoffeeScript files, sbt-concat doesn't pick them up.
sbt-coffeescript, like all other official source task plugins, doesn't put its files in resourceManaged in Assets, but in its own sub-directory under target/web/<taskname>. The plugins scope the resourceManaged setting to their main task; in this case that means resourceManaged in (Assets, coffeescript) and resourceManaged in (TestAssets, coffeescript).
When you run sbt coffeescript you can see the files are output to target/web/coffeescript/main. You can verify this by running show web-assets:coffeescript::resourceManaged from the sbt console.
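One possible direction, not from the answer above and only a sketch: point the concat group at sbt-coffeescript's output directory in addition to the hand-written sources. It assumes the CoffeeScriptKeys auto-import from sbt-coffeescript and the scoping described above.
// sketch: CoffeeScriptKeys.coffeescript is assumed to come from the sbt-coffeescript auto-import
Concat.groups := Seq(
  "javascripts/app.js" -> group(
    ((sourceDirectory in Assets).value / "javascripts") * "*.js" +++
      (resourceManaged in (Assets, CoffeeScriptKeys.coffeescript)).value ** "*.js"
  )
)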
I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only a subset of the classes, the ones making up an API that we need to share with other teams so they can write plugins for our application. So the end result will be two jars: one with the full application and a second one with a subset of the classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so it does not change the original definition of packageBin. The idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
which generates the complete jar in the same way compile:packageBin would. I then try to modify the mappings of the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
ms filter { case (file, toPath) =>
toPath.startsWith("some/path/defining/api")
}
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However, I would have to rewrite all of the packageBin functionality instead of reusing existing code. Also, in case rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You could do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)
inConfig(PluginApi)(Defaults.compileSettings) // you need the standard compile settings in the new configuration
mappings in (PluginApi, packageBin) := {
val original = (mappings in (PluginApi, packageBin)).value
original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}
unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala, you'll have to override unmanagedSourceDirectories in the newly created configuration, as done above.
Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
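With these settings in place, the same command from the question should now produce a jar containing only the filtered API classes:
sbt pluginApi:packageBin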
I have had similar problems (more than one jar per project). Our project uses Ant, where you can do this, but you will repeat yourself a lot.
However, I have come to the conclusion that this scenario (two JARs for one project) can actually be simplified by splitting the project, i.e. making two modules out of it.
This way, I don't have to "fight" tools which assume project == artifact (like sbt, maybe Maven, IDEA's default settings, ...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When you compile everything together and only split the classes apart at the JAR step, you run the risk of an invalid dependency in your setup that you would only notice when testing, because at compile time everything is compiled together.
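A minimal sketch of that split, with illustrative project names and paths (not taken from the answer):
// "api" holds only the public plugin API; "core" holds the full application
lazy val api = (project in file("api"))
  .settings(name := "myapp-api")

lazy val core = (project in file("."))
  .settings(name := "myapp")
  .dependsOn(api)
Each module then produces its own jar via the ordinary packageBin, and an accidental api -> implementation dependency already fails at compile time.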
By default, Scalatra expects the "webapp" directory to be at src/main/webapp. How could that be changed to, e.g., content/doc-root?
sbt allows for customizing its default directories using something like the following:
scalaSource <<= (baseDirectory)(_ / "src")
So I assume it's just a matter of knowing the right "configuration key" to use...
@Kelsey Gilmore-Innis has the right answer, but since it's not accepted, let's break it, break it, break it down.
First, I'm assuming you're following the Getting Started guide to install Scalatra using g8, hopefully the same version I just got:
g8 scalatra/scalatra-sbt
What that g8 template does is set up an sbt 0.13 build that uses the scalatra-sbt 0.3.2 plugin:
addSbtPlugin("org.scalatra.sbt" % "scalatra-sbt" % "0.3.2")
This plugin internally uses JamesEarlDouglas/xsbt-web-plugin 0.4.0 for the webapp-related settings.
xsbt-web-plugin 0.4.0
This is why xsbt-web-plugin becomes relevant even though you just want to change Scalatra's setting. The setting you need to rewire is called webappResources in Compile. How does that work?
rewiring webappResources
To rewire the setting, open project/build.scala. Add
import com.earldouglas.xsbtwebplugin.PluginKeys.webappResources
to the import clauses. Then change settings as follows:
lazy val project = Project (
"foo",
file("."),
settings = Defaults.defaultSettings ++ ScalatraPlugin.scalatraWithJRebel ++ scalateSettings ++ Seq(
organization := Organization,
name := Name,
version := Version,
scalaVersion := ScalaVersion,
resolvers += Classpaths.typesafeReleases,
webappResources in Compile := Seq(baseDirectory.value / "content" / "doc-root"),
...
)
)
Now move src/main/webapp to content/doc-root, reload sbt, and that should be it.
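To confirm the rewiring, you can query the setting from the sbt console, for example:
show compile:webappResources
It should now list content/doc-root instead of src/main/webapp.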
The resource folder is a Jetty property. If you're running embedded Jetty, it's specified here. You can edit it manually or override it by setting the PUBLIC environment variable.
You can also override it in your sbt build file. The app runs via the xsbt-web-plugin, and you can override that plugin's settings.
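For the embedded case, here is a sketch of the usual JettyLauncher pattern; the object name and the fallback path are illustrative, not from the answer:
import org.eclipse.jetty.server.Server
import org.eclipse.jetty.webapp.WebAppContext
import org.scalatra.servlet.ScalatraListener

object JettyLauncher {
  def main(args: Array[String]): Unit = {
    val port = sys.env.get("PORT").map(_.toInt).getOrElse(8080)
    val server = new Server(port)
    val context = new WebAppContext()
    context.setContextPath("/")
    // static resource base: the PUBLIC env var wins, otherwise a custom doc-root (illustrative default)
    context.setResourceBase(sys.env.getOrElse("PUBLIC", "content/doc-root"))
    context.addEventListener(new ScalatraListener)
    server.setHandler(context)
    server.start()
    server.join()
  }
}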
For newer versions of xsbt-web-plugin (1.0.0 as of this writing) the way of changing the source path is different.
First of all, the corresponding settings were moved to XwpPlugin.webappSettings, and you need these two:
webappSrc in webapp <<= (baseDirectory in Compile) map { _ / "content" / "doc-root" },
webappDest in webapp <<= (baseDirectory in Compile) map { _ / "content" / "doc-root" },
If you don't want to change the sbt settings, you can also do it programmatically by overriding serveStaticResource and using forward:
override protected def serveStaticResource(): Option[Any] = {
// check to see if we need to alter the path to find the TRUE disk url
val incUrl = request.getRequestURI
if(incUrl.startsWith("/otherDir")) {
servletContext.resource(request) map { _ =>
servletContext.getNamedDispatcher("default").forward(request, response)
}
} else {
val trueUrl = "/otherdir" + incUrl
Option(servletContext.getRequestDispatcher(trueUrl).forward(request, response))
}
}
Disclaimer: You should also check that it doesn't go into an infinite loop.
Although SBT is called a simple build tool, it's far from simple. I still don't get the syntax used in an sbt session, like compile:compile. What's the difference between that and just compile?
The main trick here is in scopes. If you really want to understand how SBT works, then always use these commands:
show <setting> - Displays the value of the specified setting.
show <task> - Evaluates the specified task and display the value returned by the task.
inspect <key> - shows detailed information about the specified key (setting or task).
inspect tree <key> - displays key and its dependencies in a tree structure.
There are many other useful commands, but these will help you most in understanding the basics of SBT.
As for the syntax: each build consists of settings, tasks, projects and scopes. There is too much to tell about them here; a good explanation is given on the official site. The syntax you gave is all about these terms. For example, let's take a look at:
compile:scalaSource::sourceDirectory
   1         2              3
1 - the Compile scope (configuration)
2 - the dependent setting
3 - the dependency setting
If you type inspect scalaSource you'll see that typing just scalaSource in the SBT session calls scalaSource in the compile scope (compile:scalaSource). This explains the difference between compile:compile and compile: they are the same (call inspect on compile to confirm). The second thing you should look at in inspect scalaSource is the Dependencies: part, which lists compile:sourceDirectory, so scalaSource depends on the sourceDirectory setting in the compile scope. If you've seen builds on GitHub, in *.sbt or *.scala build files this is written like:
sourceDirectory in (Compile, scalaSource) := ....
Just for the exercise, call:
show compile:scalaSource::sourceDirectory
and you'll see output like this: <project-dir>/src/main. Then call:
set sourceDirectory in (Compile, scalaSource) <<= baseDirectory(_ / "src" / "sc")
and then again:
show compile:scalaSource::sourceDirectory
This time the output should point at <project-dir>/src/sc, reflecting the setting you just changed.