Does anyone know how to configure an SBT project to run an annotation processor (APT)? I'm experimenting with a web project that uses Java tools such as QueryDSL, and I need to generate the QueryDSL query classes for my JPA model classes, similar to what the QueryDSL Maven plugin does.
Thanks in advance.
You could run the annotation processor manually (see the javac command built inside the task below) or implement an SBT task similar to the following:
lazy val processAnnotations = taskKey[Unit]("Process annotations")

processAnnotations := {
  val log = streams.value.log
  log.info("Processing annotations ...")

  val classpath = ((products in Compile).value ++ (dependencyClasspath in Compile).value.files)
    .mkString(java.io.File.pathSeparator) // ":" on Unix, ";" on Windows

  val destinationDirectory = (classDirectory in Compile).value
  val processor = "com.package.PluginProcessor"
  val classesToProcess = Seq("com.package.Class1", "com.package.Class2") mkString " "

  val command = s"javac -cp $classpath -proc:only -processor $processor -XprintRounds -d $destinationDirectory $classesToProcess"

  failIfNonZeroExitStatus(command, "Failed to process annotations.", log)
  log.info("Done processing annotations.")
}
def failIfNonZeroExitStatus(command: String, message: => String, log: Logger): Unit = {
  val result = command.!
  if (result != 0) {
    log.error(message)
    sys.error("Failed running command: " + command)
  }
}
packageBin in Compile <<= (packageBin in Compile) dependsOn (processAnnotations in Compile)
Update destinationDirectory, processor, and classesToProcess as necessary.
You might also change the "-d" flag to "-s" depending on the type of annotation processor you have (see options for javac).
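For the QueryDSL case from the question, the same task can be pointed at QueryDSL's JPA annotation processor and at the entity sources instead of a fixed class list. The sketch below is an assumption-heavy adaptation rather than an official recipe: the artifact coordinates and the processor class name (com.querydsl.apt.jpa.JPAAnnotationProcessor) are for QueryDSL 4.x, while older 3.x releases used com.mysema.query.apt.jpa.JPAAnnotationProcessor, so verify them against the querydsl-apt jar you actually depend on.

// Hedged sketch: run the QueryDSL JPA processor over all Java entity sources
// and emit the generated Q-classes as sources (hence -s instead of -d).
libraryDependencies += "com.querydsl" % "querydsl-apt" % "4.1.4"

processAnnotations := {
  val log = streams.value.log
  val classpath = ((products in Compile).value ++ (dependencyClasspath in Compile).value.files)
    .mkString(java.io.File.pathSeparator)
  val outputDir = (sourceManaged in Compile).value / "querydsl"
  IO.createDirectory(outputDir)
  val entitySources = ((javaSource in Compile).value ** "*.java").get
    .map(_.getAbsolutePath).mkString(" ")
  val command = s"javac -cp $classpath -proc:only " +
    s"-processor com.querydsl.apt.jpa.JPAAnnotationProcessor -s $outputDir $entitySources"
  failIfNonZeroExitStatus(command, "QueryDSL annotation processing failed.", log)
}

Note that files written under sourceManaged are not compiled automatically; you would either register the task in sourceGenerators in Compile or generate into a directory that is already on the compile path.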
Related
I'm using ScalaPB to synthesize Scala classes for converting my data to and from Protobuf representation. By default, the SBT setup hooks into sbt compile to generate the files under the target folder.
Because I expect my .proto files to change very infrequently, I would rather manually invoke the ScalaPB process when they do, and keep the generated files under version control. This is the same approach I use for Slick's code generation functionality.
I can do something like:
lazy val genProto = TaskKey[Unit]("gen-proto", "Generate Scala classes from a proto file")

genProto := {
  val protoSources = ...
  val outputDirectory = ...
  // ? run the same process
}
But I'm not sure how to invoke the process from SBT with custom inputs and outputs.
My latest attempt:
ScalaPbPlugin.runProtoc in ScalaPbPlugin.protobufConfig := (args =>
  com.github.os72.protocjar.Protoc.runProtoc("-v261" +: args.toArray))

lazy val genProto = TaskKey[Unit]("gen-proto", "Generate Scala classes from a proto file")

genProto := {
  val protoSourceDirectory = sourceDirectory.value / "main" / "protobuf"
  val outputDirectory = (scalaSource in Compile).value / outputProtoDirectory
  val schemas = (protoSourceDirectory ** "*.proto").get.map(_.getAbsoluteFile)
  val includeOption = Seq(s"-I$protoSourceDirectory")
  val outputOption = Seq(s"--scala_out=${outputDirectory.absolutePath}")
  val options = schemas.map(_.absolutePath) ++ includeOption ++ outputOption
  (ScalaPbPlugin.runProtoc in ScalaPbPlugin.protobufConfig).value(options)
  (outputDirectory ** "*.scala").get.toSet
}
I get the following error:
> genProto
protoc-jar: protoc version: 261, detected platform: mac os x/x86_64
protoc-jar: executing: [/var/folders/lj/_85rbyf5525d3ktt666yjztr0000gn/T/protoc2879794465962204787.exe, /Users/alan/projects/causality/src/main/protobuf/lotEventStoreModel.proto, -I/Users/alan/projects/causality/src/main/protobuf, --scala_out=/Users/alan/projects/causality/src/main/scala/net/artsy/auction/protobuf]
protoc-gen-scala: program not found or is not executable
--scala_out: protoc-gen-scala: Plugin failed with status code 1.
[success] Total time: 0 s, completed Apr 25, 2016 9:39:09 AM
One workaround is to shell out to the standalone scalapbc launcher from the task; scalapbc takes care of invoking the ScalaPB code generator itself, so the protoc-gen-scala lookup is no longer an issue:

import sbt._
import Keys._

lazy val genProto = TaskKey[Unit]("gen-proto", "Generate Scala classes from a proto file")

genProto := {
  Seq("/path/to/scalapbc-0.5.24/bin/scalapbc",
    "src/main/protobuf/test.proto",
    "--scala_out=src/main/scala/") !
}
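If hard-coding paths bothers you, the same idea can be parameterized with sbt's own settings. A rough sketch, assuming the scalapbc launcher is available on the PATH as "scalapbc" (that command name and layout are assumptions, not something from the question):

// Hedged sketch: derive proto sources and output directory from sbt settings
// and shell out to scalapbc.
genProto := {
  val protoDir = (sourceDirectory in Compile).value / "protobuf"   // src/main/protobuf
  val outDir   = (scalaSource in Compile).value                    // src/main/scala
  val protos   = (protoDir ** "*.proto").get.map(_.getAbsolutePath)
  val cmd = Seq("scalapbc") ++ protos ++
    Seq(s"-I${protoDir.getAbsolutePath}", s"--scala_out=${outDir.getAbsolutePath}")
  val exit = Process(cmd).!
  if (exit != 0) sys.error(s"scalapbc failed with exit code $exit")
}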
I have added this to build.sbt
libraryDependencies += "com.typesafe.slick" %% "slick-codegen" % "2.1.0"
lazy val slickGenerate = TaskKey[Seq[File]]("slick code generation")
slickGenerate <<= slickGenerateTask
lazy val slickGenerateTask = {
(sourceManaged in Compile, dependencyClasspath in Compile, runner in Compile, streams) map { (dir, cp, r, s) =>
val dbName = "dbname"
val userName = "user"
val password = "password"
val url = s"jdbc:mysql://server:port/$dbName"
val jdbcDriver = "com.mysql.jdbc.Driver"
val slickDriver = "scala.slick.driver.MySQLDriver"
val targetPackageName = "models"
val outputDir = (dir / dbName).getPath // place generated files in sbt's managed sources folder
val fname = outputDir + s"/$targetPackageName/Tables.scala"
println(s"\nauto-generating slick source for database schema at $url...")
println(s"output source file file: file://$fname\n")
r.run("scala.slick.codegen.SourceCodeGenerator", cp.files, Array(slickDriver, jdbcDriver, url, outputDir, targetPackageName, userName, password), s.log)
Seq(file(fname))
}
}
The task's code itself isn't very exciting; it just needs to create an auto-generated Scala source file. The problem is that while sbt starts fine, this new task is evidently not recognized and cannot be run from the sbt prompt. I have also had very little luck with the := syntax for task definitions, and the existing documentation has been confounding.
How can this new task be made available in the sbt prompt?
This works:

libraryDependencies += "com.typesafe.slick" %% "slick-codegen" % "2.1.0"

lazy val slickGenerate = taskKey[Seq[File]]("slick code generation")

slickGenerate := {
  val dbName = "dbname"
  val userName = "user"
  val password = "password"
  val url = s"jdbc:mysql://server:port/$dbName"
  val jdbcDriver = "com.mysql.jdbc.Driver"
  val slickDriver = "scala.slick.driver.MySQLDriver"
  val targetPackageName = "models"
  val outputDir = ((sourceManaged in Compile).value / dbName).getPath // place generated files in sbt's managed sources folder
  val fname = outputDir + s"/$targetPackageName/Tables.scala"
  println(s"\nauto-generating slick source for database schema at $url...")
  println(s"output source file file: file://$fname\n")
  (runner in Compile).value.run("scala.slick.codegen.SourceCodeGenerator", (dependencyClasspath in Compile).value.files, Array(slickDriver, jdbcDriver, url, outputDir, targetPackageName, userName, password), streams.value.log)
  Seq(file(fname))
}
In sbt 0.13.x you don't need all that tuple-and-map boilerplate. Just access the value directly, as in (runner in Compile).value - the task macro does everything else for you.
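To make the difference concrete, here is the same dependency access in both styles (an illustrative key, not taken from the build above; the two definitions are alternatives, not meant to coexist):

lazy val myTask = TaskKey[Unit]("my-task")

// sbt 0.12-style: declare the dependencies as a tuple and map over them
myTask <<= (sourceManaged in Compile, streams) map { (dir, s) =>
  s.log.info(s"generating into $dir")
}

// sbt 0.13-style: just call .value inside the task body
myTask := {
  val dir = (sourceManaged in Compile).value
  streams.value.log.info(s"generating into $dir")
}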
> slickGenerate
[info] Updating {file:/Users/user/slick/}slick...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
auto-generating slick source for database schema at jdbc:mysql://server:port/dbname...
output source file file: file:///Users/user/slick/target/scala-2.10/src_managed/main/dbname/models/Tables.scala
> help slickGenerate
slick code generation
Talking about <<= - your TaskKey is incorrect; see the definition:

def apply[T](label: String, description: String = "", ...): TaskKey[T] // the first String is the label, not the description

So the capitalized TaskKey("slick code generation") uses that string as the task's label, while the new lowercase taskKey macro takes the command name from the val it is assigned to and uses your string as the description. That looks strange and inconsistent, but that's a fact, and it's partially explained by backward compatibility.
So, the correct old-style version is:

import sbt.Keys._

lazy val slickGenerate = TaskKey[Seq[File]]("slick-generate", "generate slick code")

slickGenerate <<= slickGenerateTask

def slickGenerateTask =
  (sourceManaged in Compile, dependencyClasspath in Compile, runner in Compile, streams) map { (dir, cp, r, s) =>
    ...
  }
It works the same way as before. Note that you have to use "slickGenerate", not "slick-generate" - the latter doesn't work with the "help" command.
By the way, you're using a bare build definition right now - you may want to switch to the multi-project .sbt definition recommended by the sbt docs; a minimal sketch follows.
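A hedged sketch of what that could look like on sbt 0.13.x (the project name is a placeholder, and the slickGenerate body is the one shown above):

lazy val slickGenerate = taskKey[Seq[File]]("slick code generation")

lazy val root = (project in file("."))
  .settings(
    name := "slick-codegen-build",   // placeholder name
    libraryDependencies += "com.typesafe.slick" %% "slick-codegen" % "2.1.0",
    slickGenerate := {
      // ... same task body as shown above ...
      Seq.empty[File]
    }
  )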
I have an existing database, and I would like to connect to it with Scala/Slick.
I'd rather not have to write all of the Slick classes that wrap my tables by hand.
Is there a quick way to just read the definitions from the database with Slick? Or is there another component in the Scala standard library or standard toolset that will do this work for me?
Use the Slick schema code generator; you simply need to add this to your Build.scala:
lazy val slick = TaskKey[Seq[File]]("gen-tables")

lazy val slickCodeGenTask = (sourceManaged, dependencyClasspath in Compile, runner in Compile, streams) map { (dir, cp, r, s) =>
  val outputDir = (dir / "slick").getPath
  val url = "your db url"
  val jdbcDriver = "dbms drivers"
  val slickDriver = "slick drivers"
  val pkg = "schema"
  toError(r.run("scala.slick.model.codegen.SourceCodeGenerator", cp.files, Array(slickDriver, jdbcDriver, url, outputDir, pkg), s.log))
  val fname = outputDir + "/path/to/Tables.scala"
  Seq(file(fname))
}
Add the task to the settings:
val main = play.Project(appName, appVersion, Seq()).settings(
  Keys.fork in Test := false,
  libraryDependencies := Seq(
    ...
  ),
  slick <<= slickCodeGenTask // register manual sbt command
)
And then call gen-tables from SBT; this will create a Scala file called Tables.scala at the specified path, containing the whole schema from the database.
This is based on the GitHub example I looked at originally.
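If you would rather have the generator run on every compile instead of invoking it by hand, the same task can be appended to the source generators - a one-line sketch (assuming the output stays under sourceManaged, as it does above):

// Hedged: regenerate Tables.scala as part of compile by registering the task
// as a source generator; <+= appends it to the existing generators.
sourceGenerators in Compile <+= slickCodeGenTask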
I'm trying to create a multi-module application and run one of its modules separately from the others (from another machine).
Project structure looks like this:
      main
     /    \
module1  module2
I want to run module1 as a separate jar file (or is there a better way of doing this?), which I will run from another machine (I want to connect it to the main app using Akka remoting).
What I'm doing:
Running "play dist" command
Unzipping module1.zip from universal folder
Setting +x mode to bin/module1 executable.
Setting my main class (will paste it below): instead of play.core.server.NettyServer im putting my main class: declare -r app_mainclass="module1.foo.Launcher"
Running with external application.conf file.
Here is my main class:
import akka.actor.{Actor, ActorSystem, Props}

class LauncherActor extends Actor {
  def receive = {
    case a => println(s"Received msg: $a ")
  }
}

object Launcher extends App {
  val system = ActorSystem("testsystem")
  val listener = system.actorOf(Props[LauncherActor], name = "listener")
  println(listener.path)
  listener ! "hi!"
  println("Server ready")
}
Here is the console output:
#pavel bin$ ./module1 -Dconfig.file=/Users/pavel/projects/foobar/conf/application.conf
[WARN] [10/18/2013 18:56:03.036] [main] [EventStream(akka://testsystem)] [akka.event-handlers] config is deprecated, use [akka.loggers]
akka://testsystem/user/listener
Server ready
Received msg: hi!
#pavel bin$
So the system shuts down as soon as it reaches the last line of the main method. If I run this code without Play, it works as expected: the object is loaded and it waits for messages.
Maybe I'm doing something wrong? Or should I set some options in the module1 executable? Any other ideas?
Thanks in advance!
Update:
Versions:
Scala - 2.10.3
Play! - 2.2.0
SBT - 0.13.0
Akka - 2.2.1
Java 1.7 and 1.6 (tried both)
Build properties:
lazy val projectSettings = buildSettings ++ play.Project.playScalaSettings ++ Seq(
  resolvers := buildResolvers,
  libraryDependencies ++= dependencies
) ++ Seq(
  scalacOptions += "-language:postfixOps",
  javaOptions in run ++= Seq(
    "-XX:MaxPermSize=1024m",
    "-Xmx4048m"
  ),
  Keys.fork in run := true
)

lazy val common = play.Project("common", buildVersion, dependencies, path = file("modules/common"))

lazy val root = play.Project(appName, buildVersion, settings = projectSettings).settings(
  resolvers ++= buildResolvers
).dependsOn(common, module1, module2).aggregate(common, module1, module2)

lazy val module1 = play.Project("module1", buildVersion, path = file("modules/module1")).dependsOn(common).aggregate(common)

lazy val module2: Project = play.Project("module2", buildVersion, path = file("modules/module2")).dependsOn(common).aggregate(common)
So I found a dirty workaround and I will use it until I find a better solution. In case someone is interested, I've added this code at the bottom of the Launcher object:
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

val shutdown = Future {
  readLine("Press 'ENTER' key to shutdown")
}.map { q =>
  println("**** Shutting down ****")
  System.exit(0)
}

Await.result(shutdown, 100.days)
And now the system runs until I hit the ENTER key in the console. Dirty, I agree, but I didn't find a better solution.
If something better comes up, I will of course mark it as the answer.
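A possibly cleaner alternative (assuming Akka 2.2.x, as listed in the versions above, and reusing the LauncherActor from the question) is to block the main thread on the actor system itself rather than on stdin, so the process stays up until the system is explicitly shut down:

import akka.actor.{ActorSystem, Props}

object Launcher extends App {
  val system = ActorSystem("testsystem")
  val listener = system.actorOf(Props[LauncherActor], name = "listener")
  listener ! "hi!"
  println("Server ready")
  // Blocks until system.shutdown() is called, keeping the JVM alive without
  // relying on console input.
  system.awaitTermination()
}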
My projects are still using sbt 0.7.7 and I find it very convenient to have utility classes that I can run from the sbt prompt. I can also combine this with properties that are maintained separately - typically for environment-related values that change from host to host. This is an example of my project definition under the project/build directory:
class MyProject(info: ProjectInfo) extends DefaultProject(info) {
  //...
  lazy val extraProps = new BasicEnvironment {
    // use the project's Logger for any properties-related logging
    def log = MyProject.this.log
    def envBackingPath = path("paths.properties")

    // define some properties that will go in paths.properties
    lazy val inputFile = property[String]
  }

  lazy val myTask = task { args =>
    runTask(Some("foo.bar.MyTask"),
      runClasspath, extraProps.inputFile.value :: args.toList)
      .dependsOn(compile)
      .describedAs("my-task [options]")
  }
}
I can then use my task as my-task option1 option2 under the sbt shell.
I've read the new sbt 0.11 documentation at https://github.com/harrah/xsbt/wiki, including the sections on Tasks and TaskInputs, and frankly I'm still struggling with how to accomplish what I did on 0.7.7.
It seems the extra properties could simply be replaced by a separate environment.sbt, and that tasks have to be defined in project/build.scala before being set in build.sbt. It also looks like there is completion support, which looks very interesting.
Beyond that I'm somewhat overwhelmed. How do I accomplish what I did with the new sbt?
You can define a task like this:
val myTask = InputKey[Unit]("my-task")
And your setting:
val inputFile = SettingKey[String]("input-file", "input file description")
You can also define a new configuration like this:
lazy val ExtraProps = config("extra-props") extend(Compile)
Add this config to your project and use it to scope the settings for this configuration:
lazy val root = Project("root", file(".")).configs(ExtraProps).settings(
  inputFile in ExtraProps := ...,
  ...
  myTask in ExtraProps <<= inputTask { (argTask: TaskKey[Seq[String]]) =>
    (argTask, inputFile in ExtraProps) map { (args: Seq[String], iFile: String) =>
      ...
    } dependsOn (compile in Compile)
  }
)
Then launch your task with extra-props:my-task.
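For what it's worth, on later sbt versions (0.13+) the same idea is usually written with the parser API, which also gives you tab completion for the arguments. A sketch (key names are illustrative; the body just prints what it would run):

import complete.DefaultParsers._

val inputFile = settingKey[String]("input file description")
val myTask = inputKey[Unit]("my-task")

myTask := {
  // space-separated arguments typed after the task name at the sbt prompt
  val args: Seq[String] = spaceDelimited("<arg>").parsed
  val iFile = inputFile.value
  println(s"would run foo.bar.MyTask with arguments ${iFile +: args}")
}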