I have written a plugin to process some SQL files and generate new ones as managed resources. When I run 'sbt compile' the files are generated in to the target/resource_managed/main/sql folder. When I run 'sbt run' or 'sbt test' they are not copied into the target/classes directory like I expect, so the code that is looking for them on the classpath cannot find them.
Here is the code for the plugin:
object SqlProcessorPlugin extends AutoPlugin {
  import autoImport._

  override def requires = plugins.JvmPlugin
  override def trigger = noTrigger

  object autoImport {
    lazy val processorSettings = taskKey[File]("Settings for sql processing")
    lazy val processSqlTask = taskKey[Seq[File]]("Process Sql")

    def configProcessor(cfg: Configuration) = {
      inConfig(cfg) {
        Seq(
          target in processorSettings := resourceManaged.value / "sql",
          sourceDirectory in processorSettings := sourceDirectory.value / "sql",
          processSqlTask / fileInputs += (sourceDirectory in processorSettings).value.toGlob / ** / "*.sql",
          processSqlTask := {
            SqlProcessor.process(
              processSqlTask.inputFileChanges,
              (target in processorSettings).value
            )
          },
          resourceGenerators += processSqlTask.taskValue
        )
      }
    }
  }

  override val projectSettings = configProcessor(Compile)
}
I've tried lots of variations on this, based on examples from other questions and from other plugins, but nothing has resulted in the generated files being copied to the classpath.
What am I missing/doing wrong here?
I figured out the issue. I was using resourceManaged.value instead of (Compile / resourceManaged).value as the target directory. I think that, as a result, it was also messing up the relative paths of the output files, so they were copied to the wrong place.
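For reference, a minimal sketch of the corrected setting described above (names follow the plugin code; this mirrors the stated fix rather than a tested build):

```scala
// Scope resourceManaged explicitly to Compile instead of using the unscoped
// resourceManaged.value, so generated files land under the directory whose
// contents the Compile resource pipeline copies into target/classes.
target in processorSettings := (Compile / resourceManaged).value / "sql"
```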
Here is my plugin code. It defines a master lint task that is triggered from the CLI like so: sbt api/lint jobs/lint. It calls out to some project-specific linters and some build-wide linters. The build-wide linter runs scalafix, but if I call lint multiple times from the CLI, as above (for multiple projects), then scalafix is run multiple times.
How do I make scalafix (and scalafixLinter) run only once for a given sbt invocation? I thought sbt caches task results, but that doesn't seem to be happening here.
object LinterPlugin extends AutoPlugin {
  object autoImport {
    lazy val scalafixLinter = taskKey[Unit]("Run scalafix on all scala code")
    lazy val lint = taskKey[Unit]("Run all linters")
  }
  import autoImport._

  override val buildSettings = Seq(
    scalafixLinter := {
      Def.taskDyn {
        if (...) {
          Def.task {
            // run scalafix
            (Compile / scalafix).toTask("").value
            (Test / scalafix).toTask("").value
          }
        } else {
          Def.task {}
        }
      }.all(ScopeFilter(inAnyProject)).value // run on all projects
    }
  )

  override val projectSettings = Seq(
    lint := {
      // run all the linters
      otherLinters.value
      scalafixLinter.value
    }
  )
}
You could use the FileFunction.cached method, though that only caches based on files.
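A rough sketch of what that could look like for the scalafixLinter task above (hypothetical wiring; FileFunction.cached persists a file-hash cache on disk, so the wrapped work is skipped whenever the input files are unchanged, even across separate sbt command invocations):

```scala
scalafixLinter := {
  val cacheDir = streams.value.cacheDirectory / "scalafix-linter"
  // Re-run the body only when the hashes of the input files change.
  val runIfChanged = FileFunction.cached(cacheDir, inStyle = FilesInfo.hash) { in =>
    // invoke the linter on the files in `in` here
    Set.empty[File] // no generated output files to track
  }
  runIfChanged((unmanagedSources in Compile).value.toSet)
}
```

Note that a task's .value cannot be evaluated inside the cached function body, so the actual scalafix call would need to be reachable as plain code (or restructured via Def.taskDyn around the cache check).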
I would like to generate sources from files which are part of the project (I have currently placed them in a resource directory, but this is not a requirement).
This is my attempt on it:
sourceGenerators in Test += (sourceManaged in Test map { src =>
  (unmanagedResourceDirectories in Test).value map { dir =>
    val file = dir / "demo" / src.name
    IO.write(file, "Prefix---" + IO.read(src) + "---Postfix")
    file
  }
}).taskValue
This gives me an error:
error: Illegal dynamic dependency
(unmanagedResourceDirectories in Test).value map { dir =>
What is a correct way to do this?
What eventually worked is this (inspired by code referenced in a comment on the question SBT sourceGenerators task - execute only if a file changes):
sourceGenerators in Test += Def.task {
  val sources = (unmanagedResources in Test).value filter (_.isFile)
  val dir = (sourceManaged in Test).value
  sources map { src =>
    val out = dir / src.name
    IO.write(out, "Prefix---" + IO.read(src) + "---Postfix")
    out
  }
}.taskValue
The important part was reading the settings inside of the task.
I think dynamic tasks are the correct way to do it:
http://www.scala-sbt.org/0.13/docs/Tasks.html#Dynamic+Computations+with
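For illustration, a hedged sketch of how a dynamic computation might look here, reusing the Prefix/Postfix example from above (Def.taskDyn lets the generator inspect another task's value at run time and then decide which task to return):

```scala
sourceGenerators in Test += Def.taskDyn {
  // First evaluate the inputs, then choose what task to run based on them.
  val sources = (unmanagedResources in Test).value.filter(_.isFile)
  if (sources.isEmpty) Def.task(Seq.empty[File]) // nothing to generate
  else Def.task {
    val dir = (sourceManaged in Test).value
    sources.map { src =>
      val out = dir / src.name
      IO.write(out, "Prefix---" + IO.read(src) + "---Postfix")
      out
    }
  }
}.taskValue
```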
I want to create an sbt task which creates a database schema with Slick. For that, I have an object like the following in my project:
import scala.concurrent.Await
import scala.concurrent.duration.Duration
// plus your Slick profile's api, e.g. import slick.driver.H2Driver.api._

object CreateSchema {
  val instance = Database.forConfig("localDb")
  def main(args: Array[String]) {
    val createFuture = instance.run(createActions)
    ...
    Await.ready(createFuture, Duration.Inf)
  }
}
and in my build.sbt I define a task:
lazy val createSchema = taskKey[Unit]("CREATE database schema")
fullRunTask(createSchema, Runtime, "sbt.CreateSchema")
which gets executed as expected when I run sbt createSchema from the command line.
However, the problem is that application.conf doesn't seem to get taken into account (I've also tried different scopes like Compile or Test). As a result, the task fails due to com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'localDb'.
How can I fix this so the configuration is available?
I found a lot of questions here that deal with using the application.conf inside the build.sbt itself, but that is not what I need.
I have set up a little demo using SBT 0.13.8 and Slick 3.0.0, which is working as expected (even without modifying "-Dconfig.resource").
Files
./build.sbt
name := "SO_20150915"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"com.typesafe" % "config" % "1.3.0" withSources() withJavadoc(),
"com.typesafe.slick" %% "slick" % "3.0.0",
"org.slf4j" % "slf4j-nop" % "1.6.4",
"com.h2database" % "h2" % "1.3.175"
)
lazy val createSchema = taskKey[Unit]("CREATE database schema")
fullRunTask(createSchema, Runtime, "somefun.CallMe")
./project/build.properties
sbt.version = 0.13.8
./src/main/resources/reference.conf
hello {
  world = "buuh."
}

h2mem1 = {
  url = "jdbc:h2:mem:test1"
  driver = org.h2.Driver
  connectionPool = disabled
  keepAliveConnection = true
}
./src/main/scala/somefun/CallMe.scala
package somefun
import com.typesafe.config.Config
import com.typesafe.config.ConfigFactory
import slick.driver.H2Driver.api._
/**
* SO_20150915
* Created by martin on 15.09.15.
*/
object CallMe {
  def main(args: Array[String]): Unit = {
    println("Hello")
    val settings = new Settings()
    println(s"Settings read from hello.world: ${settings.hw}")
    val db = Database.forConfig("h2mem1")
    try {
      // ...
      println("Do something with your database.")
    } finally db.close
  }
}
class Settings(val config: Config) {
  // This verifies that the Config is sane and has our
  // reference config. Importantly, we specify the "hello"
  // path so we only validate settings that belong to this
  // library. Otherwise, we might throw mistaken errors about
  // settings we know nothing about.
  config.checkValid(ConfigFactory.defaultReference(), "hello")

  // This uses the standard default Config if none is provided,
  // which simplifies apps willing to use the defaults
  def this() {
    this(ConfigFactory.load())
  }

  val hw = config.getString("hello.world")
}
Result
If I run sbt createSchema from the console, I obtain the output
[info] Loading project definition from /home/.../SO_20150915/project
[info] Set current project to SO_20150915 (in build file:/home/.../SO_20150915/)
[info] Running somefun.CallMe
Hello
Settings read from hello.world: buuh.
Do something with your database.
[success] Total time: 1 s, completed 15.09.2015 10:42:20
Ideas
Please verify that this unmodified demo project also works for you.
Then try changing the SBT version in the demo project and see if that changes anything.
Alternatively, recheck your project setup and try using a newer version of SBT.
Answer
So, even though your code resides in your src folder, it is called from within SBT. That means you are trying to load your application.conf from within the classpath context of SBT.
Slick uses Typesafe Config internally. (So the approach described under "Background information" below is not applicable, as you cannot modify the Config loading mechanism itself.)
Instead, try setting the path to your application.conf explicitly via config.resource; see the Typesafe Config documentation (search for config.resource).
Option 1
Either set config.resource (via -Dconfig.resource=...) before starting sbt
Option 2
Or from within build.sbt as Scala code
sys.props("config.resource") = "./src/main/resources/application.conf"
Option 3
Or create a Task in SBT via
lazy val configPath = TaskKey[Unit]("configPath", "Set path for application.conf")
and add
configPath := {
  sys.props("config.resource") = "./src/main/resources/application.conf"
}
to your sequence of settings.
Please let me know if that worked.
Background information
Recently, I was writing a custom plugin for SBT, where I also tried to access a reference.conf. Unfortunately, I was not able to access any .conf file placed within the project subfolder using the default ClassLoader.
In the end I created a testenvironment.conf in the project folder and used the following code to load the (Typesafe) config:
def getConfig: Config = {
  val classLoader = new java.net.URLClassLoader(Array(new File("./project/").toURI.toURL))
  ConfigFactory.load(classLoader, "testenvironment")
}
or, for loading a general application.conf from ./src/main/resources:
def getConfig: Config = {
  val classLoader = new java.net.URLClassLoader(Array(new File("./src/main/resources/").toURI.toURL))
  // no .conf basename given, so look for reference.conf and application.conf
  // using the specific classLoader
  ConfigFactory.load(classLoader)
}
I am using Sass as my CSS preprocesser, and I'm trying to have it run via the asset pipeline. I've tried implementing this sassTask as a source file task and as a web asset task, but I'm running into problems both ways.
If I run Sass as a source task (see below), it gets triggered during activator run when a page is requested and updated files are found upon page reloads. The problem I'm running into is that the resulting CSS files are all getting dumped directly into target/web/public/main/lib, instead of into the subdirectories reflecting the ones they are getting built into under the resources-managed directory. I can't figure out how to make this happen.
Instead, I tried implementing Sass compilation as a web asset task (see below). Working this way, as far as I can tell, resources-managed does not come into play, and so I compile my files directly into target/web/public/main/lib. I'm sure I'm not doing this dynamically enough, but I don't know how to do it any better. But the biggest problem here is that the pipeline apparently does not run when working through activator run. I can get it to run using activator stage, but I really need this to work in the regular development workflow so that I can change style files as the dev server is running, same as with Scala files.
I have tried combing through these forums, through the sbt-web docs, and through some of the existing plugins, but I am finding this process to be highly frustrating, due to the complexity of SBT and the opaqueness of what is actually happening in the build process.
Sass compilation as a source file task:
lazy val sassTask = TaskKey[Seq[java.io.File]]("sassTask", "Compiles Sass files")

sassTask := {
  import sys.process._
  val x = (WebKeys.nodeModules in Assets).value
  val sourceDir = (sourceDirectory in Assets).value
  val targetDir = (resourceManaged in Assets).value
  Seq("sass", "-I", "target/web/web-modules/main/webjars/lib/susy/sass", "--update", s"$sourceDir:$targetDir").!
  val sources = sourceDir ** "*.scss"
  val mappings = sources pair relativeTo(sourceDir)
  val renamed = mappings map { case (file, path) => file -> path.replaceAll("scss", "css") }
  val copies = renamed map { case (file, path) => file -> targetDir / path }
  copies map (_._2)
}

sourceGenerators in Assets <+= sassTask
sourceGenerators in Assets <+= sassTask
Sass compilation as web asset task:
lazy val sassTask = taskKey[Pipeline.Stage]("Compiles Sass files")

sassTask := { (mappings: Seq[PathMapping]) =>
  import sys.process._
  val sourceDir = (sourceDirectory in Assets).value
  val targetDir = target.value / "web" / "public" / "main"
  val libDir = (target.value / "web" / "web-modules" / "main" / "webjars" / "lib" / "susy" / "sass").toString
  Seq("sass", "-I", libDir, "--update", s"$sourceDir:$targetDir").!
  val sources = sourceDir ** "*.scss"
  val mappings = sources pair relativeTo(sourceDir)
  val renamed = mappings map { case (file, path) => file -> path.replaceAll("scss", "css") }
  renamed
}

pipelineStages := Seq(sassTask)
pipelineStages := Seq(sassTask)
I think that, according to the documentation on the Asset Pipeline, a source file task is the way to go:
Examples of source file tasks as plugins are CoffeeScript, LESS and
JSHint. Some of these take a source file and produce a target web
asset e.g. CoffeeScript produces JS files. Plugins in this category
are mutually exclusive to each other in terms of their function i.e.
only one CoffeeScript plugin will take CoffeeScript sources and
produce target JS files. In summary, source file plugins produce web
assets.
I think what you try to achieve falls into this category.
TL;DR; - build.sbt
val sassTask = taskKey[Seq[File]]("Compiles Sass files")
val sassOutputDir = settingKey[File]("Output directory for Sass generated files")
sassOutputDir := target.value / "web" / "sass" / "main"
resourceDirectories in Assets += sassOutputDir.value
sassTask := {
  val sourceDir = (sourceDirectory in Assets).value
  val outputDir = sassOutputDir.value
  val sourceFiles = (sourceDir ** "*.scss").get
  Seq("sass", "--update", s"$sourceDir:$outputDir").!
  (outputDir ** "*.css").get
}
sourceGenerators in Assets += sassTask.taskValue
Explanation
Assuming you have Sass files in an app/assets/<whatever> directory, and that you want to create CSS files in a web/public/main/<whatever> directory, this is what you could do.
Create a task which reads the files in the app/assets/<whatever> directory and its subdirectories, and outputs them to our defined sassOutputDir:
val sassTask = taskKey[Seq[File]]("Compiles Sass files")
val sassOutputDir = settingKey[File]("Output directory for Sass generated files")
sassOutputDir := target.value / "web" / "sass" / "main"
resourceDirectories in Assets += sassOutputDir.value
sassTask := {
  val sourceDir = (sourceDirectory in Assets).value
  val outputDir = sassOutputDir.value
  val sourceFiles = (sourceDir ** "*.scss").get
  Seq("sass", "--update", s"$sourceDir:$outputDir").!
  (outputDir ** "*.css").get
}
This is not enough, though. If you want to keep the directory structure, you have to add your sassOutputDir to resourceDirectories in Assets. This is because mappings in sbt-web are declared like this:
mappings := {
  val files = (sources.value ++ resources.value ++ webModules.value) ---
    (sourceDirectories.value ++ resourceDirectories.value ++ webModuleDirectories.value)
  files pair relativeTo(sourceDirectories.value ++ resourceDirectories.value ++ webModuleDirectories.value) | flat
}
which means that all unmapped files are mapped using the alternative flat strategy. However, the fix is simple; just add this to your build.sbt:
resourceDirectories in Assets += sassOutputDir.value
This will make sure the directory structure is preserved.
I'm trying to create a multi-module application and run one of its modules separately from the others (from another machine).
Project structure looks like this:
main
/ \
module1 module2
I want to run module1 as a separate jar file (or is there a better way of doing this?), which I will run from another machine (I want to connect it to the main app using Akka remoting).
What I'm doing:
Running "play dist" command
Unzipping module1.zip from universal folder
Setting +x mode to bin/module1 executable.
Setting my main class (will paste it below): instead of play.core.server.NettyServer, I'm putting my main class: declare -r app_mainclass="module1.foo.Launcher"
Running with external application.conf file.
Here is my main class:
class LauncherActor extends Actor {
  def receive = {
    case a => println(s"Received msg: $a ")
  }
}

object Launcher extends App {
  val system = ActorSystem("testsystem")
  val listener = system.actorOf(Props[LauncherActor], name = "listener")
  println(listener.path)
  listener ! "hi!"
  println("Server ready")
}
Here is the console output:
#pavel bin$ ./module1 -Dconfig.file=/Users/pavel/projects/foobar/conf/application.conf
[WARN] [10/18/2013 18:56:03.036] [main] [EventStream(akka://testsystem)] [akka.event-handlers] config is deprecated, use [akka.loggers]
akka://testsystem/user/listener
Server ready
Received msg: hi!
#pavel bin$
So the system shuts down as soon as it reaches the last line of the main method. If I run this code without Play, it works as expected: the object is loaded and it waits for messages.
Maybe I'm doing something wrong? Or should I set some options in module1 executable? Other ideas?
Thanks in advance!
Update:
Versions:
Scala - 2.10.3
Play! - 2.2.0
SBT - 0.13.0
Akka - 2.2.1
Java 1.7 and 1.6 (tried both)
Build properties:
lazy val projectSettings = buildSettings ++ play.Project.playScalaSettings ++ Seq(
  resolvers := buildResolvers,
  libraryDependencies ++= dependencies
) ++ Seq(
  scalacOptions += "-language:postfixOps",
  javaOptions in run ++= Seq(
    "-XX:MaxPermSize=1024m",
    "-Xmx4048m"
  ),
  Keys.fork in run := true
)
lazy val common = play.Project("common", buildVersion, dependencies, path = file("modules/common"))
lazy val root = play.Project(appName, buildVersion, settings = projectSettings).settings(
resolvers ++= buildResolvers
).dependsOn(common, module1, module2).aggregate(common, module1, module2)
lazy val module1 = play.Project("module1", buildVersion, path = file("modules/module1")).dependsOn(common).aggregate(common)
lazy val module2: Project = play.Project("module2", buildVersion, path = file("modules/module2")).dependsOn(common).aggregate(common)
So I found a dirty workaround, which I will use until I find a better solution. In case someone is interested, I've added this code at the bottom of the Server object:
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

val shutdown = Future {
  readLine("Press 'ENTER' key to shutdown")
}.map { q =>
  println("**** Shutting down ****")
  System.exit(0)
}
Await.result(shutdown, 100.days)
Now the system works until I hit the ENTER key in the console. Dirty, I agree, but I didn't find a better solution.
If something better comes up, of course I will mark it as the answer.
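For what it's worth, a sketch of a less interactive variant, assuming the standard ActorSystem API of the Akka 2.2 line (not tested against Play's launcher script): instead of blocking on stdin, block on the actor system itself, so the JVM stays alive until system.shutdown() is called, e.g. from an actor reacting to a shutdown message.

```scala
// At the bottom of the App body, after "Server ready":
// blocks the main thread until the ActorSystem is shut down.
system.awaitTermination()
```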