adding a task to sbt 0.13.x build.sbt - scala

I have added this to build.sbt
libraryDependencies += "com.typesafe.slick" %% "slick-codegen" % "2.1.0"
lazy val slickGenerate = TaskKey[Seq[File]]("slick code generation")
slickGenerate <<= slickGenerateTask
lazy val slickGenerateTask = {
(sourceManaged in Compile, dependencyClasspath in Compile, runner in Compile, streams) map { (dir, cp, r, s) =>
val dbName = "dbname"
val userName = "user"
val password = "password"
val url = s"jdbc:mysql://server:port/$dbName"
val jdbcDriver = "com.mysql.jdbc.Driver"
val slickDriver = "scala.slick.driver.MySQLDriver"
val targetPackageName = "models"
val outputDir = (dir / dbName).getPath // place generated files in sbt's managed sources folder
val fname = outputDir + s"/$targetPackageName/Tables.scala"
println(s"\nauto-generating slick source for database schema at $url...")
println(s"output source file: file://$fname\n")
r.run("scala.slick.codegen.SourceCodeGenerator", cp.files, Array(slickDriver, jdbcDriver, url, outputDir, targetPackageName, userName, password), s.log)
Seq(file(fname))
}
}
The task's code itself isn't very exciting. It just needs to create an auto-generated Scala source file. The problem is that although sbt starts fine, this new task is evidently not recognized and cannot be run from the sbt prompt. I have also had very little luck with the := syntax for task definition, and the existing documentation has been confounding.
How can this new task be made available in the sbt prompt?

This works
libraryDependencies += "com.typesafe.slick" %% "slick-codegen" % "2.1.0"
lazy val slickGenerate = taskKey[Seq[File]]("slick code generation")
slickGenerate := {
val dbName = "dbname"
val userName = "user"
val password = "password"
val url = s"jdbc:mysql://server:port/$dbName"
val jdbcDriver = "com.mysql.jdbc.Driver"
val slickDriver = "scala.slick.driver.MySQLDriver"
val targetPackageName = "models"
val outputDir = ((sourceManaged in Compile).value / dbName).getPath // place generated files in sbt's managed sources folder
val fname = outputDir + s"/$targetPackageName/Tables.scala"
println(s"\nauto-generating slick source for database schema at $url...")
println(s"output source file: file://$fname\n")
(runner in Compile).value.run("scala.slick.codegen.SourceCodeGenerator", (dependencyClasspath in Compile).value.files, Array(slickDriver, jdbcDriver, url, outputDir, targetPackageName, userName, password), streams.value.log)
Seq(file(fname))
}
In sbt 0.13.x you don't need all that map boilerplate. Just access the value directly, as in (runner in Compile).value - the macro will do everything else for you.
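To make the contrast concrete, here is a minimal sketch of the two styles for a hypothetical task (the printCp name is illustrative, not from the original build):

```scala
// new style (sbt 0.13.x): the := macro expands each .value call for you
lazy val printCp = taskKey[Unit]("print the compile classpath")

printCp := {
  val cp = (dependencyClasspath in Compile).value
  streams.value.log.info(cp.files.mkString(java.io.File.pathSeparator))
}

// old style (sbt 0.12.x): the same task with explicit map plumbing
// printCp <<= (dependencyClasspath in Compile, streams) map { (cp, s) =>
//   s.log.info(cp.files.mkString(java.io.File.pathSeparator))
// }
```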
> slickGenerate
[info] Updating {file:/Users/user/slick/}slick...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
auto-generating slick source for database schema at jdbc:mysql://server:port/dbname...
output source file file: file:///Users/user/slick/target/scala-2.10/src_managed/main/dbname/models/Tables.scala
> help slickGenerate
slick code generation
Talking about <<= - your TaskKey is incorrect, see the definition:
def apply[T](label: String, description: String = "", ...): TaskKey[T] // the first argument is the label, not the description
So the old <<= definition used your string "slick code generation" as the label, while the new := style takes the command name from the val (slickGenerate) and uses your string as the description. That looks strange and inconsistent, but that's a fact, partly explained by backward compatibility.
So, correct old-style version is:
import sbt.Keys._
lazy val slickGenerate = TaskKey[Seq[File]]("slick-generate", "generate slick code")
slickGenerate <<= slickGenerateTask
def slickGenerateTask =
(sourceManaged in Compile, dependencyClasspath in Compile, runner in Compile, streams) map { (dir, cp, r, s) =>
...
}
It works the same way as before. Note that you have to type "slickGenerate", not "slick-generate"; the latter doesn't work with the "help" command.
By the way, you're using a bare build definition now - you may want to switch to a multi-project .sbt definition, as recommended by the sbt docs.
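A multi-project .sbt definition for this build could look roughly like the following sketch (the project name is an assumption; the task body is the one from the answer above):

```scala
// build.sbt, multi-project style recommended by the sbt docs
lazy val slickGenerate = taskKey[Seq[File]]("slick code generation")

lazy val root = (project in file("."))
  .settings(
    name := "slick-demo",
    libraryDependencies += "com.typesafe.slick" %% "slick-codegen" % "2.1.0",
    slickGenerate := { /* same task body as shown above */ Seq.empty[File] }
  )
```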

Related

How to load javascript in the JVM cross project

I have run into some difficulties and was wondering if someone can help me. I have the following Build.scala and I am trying to access the compiled JavaScript from the JVM project.
lazy val webProject = CrossProject(base = file("./main/web"), crossType = CrossType.Full, jvmId = "api-gateway", jsId = "web-js")
.settings(
name := "web",
unmanagedSourceDirectories in Compile += baseDirectory.value / "shared" / "main" / "scala",
libraryDependencies ++= Dependencies.Client.sharedDeps.value)
.jvmSettings(
persistLauncher := true,
persistLauncher in Test := false,
libraryDependencies ++= Dependencies.Client.jvmDeps.value)
.jsSettings(libraryDependencies ++= Dependencies.Client.jsDeps.value)
lazy val webJS = webProject.js.enablePlugins(ScalaJSPlugin)
lazy val webJVM = webProject.jvm
.settings((resources in Compile) += (fastOptJS in(webJS, Compile)).value.data)
.dependsOn(dominos)
The compiled JavaScript is generated:
[info] Fast optimizing /.../main/web/js/target/scala-2.11/web-fastopt.js
But when I run the server and request the compiled JavaScript, it can't be found.
object Main extends App {
implicit val system = ActorSystem("my-system")
implicit val materializer = ActorMaterializer()
implicit val executionContext = system.dispatcher
val routes = pathEndOrSingleSlash(getFromResource("web-fastopt.js"))
Http().bindAndHandle(routes, "localhost", 8080)
}
Isn't this line supposed to add the JavaScript to the JVM's resources folder when it runs?
(resources in Compile) += (fastOptJS in(webJS, Compile)).value.data
Any help would be greatly appreciated.
It seems this line doesn't work for me for some reason:
(resources in Compile) += (fastOptJS in(webJS, Compile)).value.data
Instead I ended up having to move the fastOptJS output:
lazy val webJVM = webProject.jvm
.settings(Seq(fastOptJS, fullOptJS, packageJSDependencies)
.map(pkg ⇒ crossTarget in(webJS, Compile, pkg) := scalaJSOutput.value))
I also needed to add
getFromResourceDirectory("")
to the Akka HTTP routes.
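Putting the two changes together, the route definition from the question might end up looking like this (a sketch using the standard akka-http directives; the ~ route combinator comes from Directives):

```scala
// serve the generated JS (and any other bundled resources) from the classpath
val routes =
  pathEndOrSingleSlash(getFromResource("web-fastopt.js")) ~
    getFromResourceDirectory("")
```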

Slick code generator with Postgres in Play Framework

I'm new to Scala and Slick.
I'm trying to use the code generator example from https://github.com/slick/slick-codegen-example and change it to the Postgres driver.
Here is my code in my Build.scala file:
import sbt._
import Keys._
import Tests._
object myBuild extends Build {
val slickVersion = "3.0.2"
lazy val mainProject = Project(
id="main",
base=file("."),
settings = Project.defaultSettings ++ Seq(
scalaVersion := "2.11.6",
libraryDependencies ++= List(
"com.typesafe.slick" %% "slick" % slickVersion,
"com.typesafe.slick" %% "slick-codegen" % slickVersion,
"org.slf4j" % "slf4j-nop" % "1.7.12",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41"
),
slick <<= slickCodeGenTask, // register manual sbt command
sourceGenerators in Compile <+= slickCodeGenTask // register automatic code generation on every compile, remove for only manual use
)
)
// code generation task
lazy val slick = TaskKey[Seq[File]]("gen-tables")
lazy val slickCodeGenTask =
(sourceManaged, dependencyClasspath in Compile, runner in Compile, streams) map { (dir, cp, r, s) =>
val outputDir = (dir / "slick").getPath // place generated files in sbt's managed sources folder
val url = "jdbc:postgresql://localhost:5432/db" // connection info
val jdbcDriver = "org.postgresql.Driver"
val slickDriver = "slick.driver.PostgresDriver"
val pkg = "dao"
val user="postgres"
val password="pass"
toError(r.run("slick.codegen.SourceCodeGenerator", cp.files, Array(slickDriver, jdbcDriver, url, outputDir, pkg, user, password), s.log))
val fname = outputDir + "/dao/Tables.scala"
Seq(file(fname))
}
}
The only things I changed were the drivers and the login and password. As I understand it, it should generate code each time I run
activator run
but it doesn't generate any code. Also, how can it be run manually?
You can add this to your build.sbt file:
import com.typesafe.config.ConfigFactory
val conf = ConfigFactory.parseFile(new File("conf/application.conf")).resolve()
slick <<= slickCodeGenTask
lazy val slick = TaskKey[Seq[File]]("gen-tables")
lazy val slickCodeGenTask = (sourceManaged, dependencyClasspath in Compile, runner in Compile, streams) map { (dir, cp, r, s) =>
val outputDir = (dir / "slick").getPath
val url = conf.getString("slick.dbs.default.db.url")
val jdbcDriver = conf.getString("slick.dbs.default.db.driver")
val slickDriver = conf.getString("slick.dbs.default.driver").dropRight(1)
val pkg = "test"
val user = conf.getString("slick.dbs.default.db.user")
val password = conf.getString("slick.dbs.default.db.password")
toError(r.run("slick.codegen.SourceCodeGenerator", cp.files, Array(slickDriver, jdbcDriver, url, outputDir, pkg, user, password), s.log))
val fname = outputDir + s"/$pkg/Tables.scala"
Seq(file(fname))
}
And run it with activator gen-tables

how do i quickly generate scala classes from a set of sql table definitions?

I have an existing database, and I would like to connect to it with Scala/Slick.
I'd rather not have to manually write all of the Slick classes to wrap around my tables.
Is there a quick way to just read the definitions from the database with Slick? Or is there another component in the Scala standard library or standard toolset that will do this work for me?
Use the Slick schema generator; you simply need to add this to your Build.scala:
lazy val slick = TaskKey[Seq[File]]("gen-tables")
lazy val slickCodeGenTask = (sourceManaged, dependencyClasspath in Compile, runner in Compile, streams) map {
(dir, cp, r, s) => {
val outputDir = (dir / "slick").getPath
val url = "your db url"
val jdbcDriver = "dbms drivers"
val slickDriver = "slick drivers"
val pkg = "schema"
toError(r.run("scala.slick.model.codegen.SourceCodeGenerator", cp.files, Array(slickDriver, jdbcDriver, url, outputDir, pkg), s.log))
val fname = outputDir + "/path/to/Tables.scala"
Seq(file(fname))
}
}
Add the task to the settings:
val main = play.Project(appName, appVersion, Seq()).settings(
Keys.fork in (Test) := false,
libraryDependencies := Seq(
...
),
slick <<= slickCodeGenTask // register manual sbt command
)
And then call gen-tables from sbt; this will create a Scala file called Tables.scala at the specified path with the whole schema from the database.
This was the GitHub example I looked up the first time.

Can I access my Scala app's name and version (as set in SBT) from code?

I am building an app with SBT (0.11.0) using a Scala build definition like so:
object MyAppBuild extends Build {
import Dependencies._
lazy val basicSettings = Seq[Setting[_]](
organization := "com.my",
version := "0.1",
description := "Blah",
scalaVersion := "2.9.1",
scalacOptions := Seq("-deprecation", "-encoding", "utf8"),
resolvers ++= Dependencies.resolutionRepos
)
lazy val myAppProject = Project("my-app-name", file("."))
.settings(basicSettings: _*)
[...]
I'm packaging a .jar at the end of the process.
My question is a simple one: is there a way of accessing the application's name ("my-app-name") and version ("0.1") programmatically from my Scala code? I don't want to repeat them in two places if I can help it.
Any guidance greatly appreciated!
sbt-buildinfo
I just wrote sbt-buildinfo.
After installing the plugin:
lazy val root = (project in file(".")).
enablePlugins(BuildInfoPlugin).
settings(
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
buildInfoPackage := "foo"
)
Edit: The above snippet has been updated to reflect a more recent version of sbt-buildinfo.
It generates a foo.BuildInfo object with any settings you want by customizing buildInfoKeys.
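Application code can then read the generated object directly; a sketch, assuming the default key names and the buildInfoPackage shown above:

```scala
// foo.BuildInfo is generated by sbt-buildinfo at compile time
object ShowBuildInfo extends App {
  println(s"${foo.BuildInfo.name} ${foo.BuildInfo.version}")
  println(s"Scala ${foo.BuildInfo.scalaVersion}, sbt ${foo.BuildInfo.sbtVersion}")
}
```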
Ad-hoc approach
I had been meaning to make a plugin for this (I have since written it; see sbt-buildinfo above), but here's a quick script to generate a file:
sourceGenerators in Compile <+= (sourceManaged in Compile, version, name) map { (d, v, n) =>
val file = d / "info.scala"
IO.write(file, """package foo
|object Info {
| val version = "%s"
| val name = "%s"
|}
|""".stripMargin.format(v, n))
Seq(file)
}
You can get your version as foo.Info.version.
The name and version are inserted into the manifest. You can access them using Java reflection via the Package class.
val p = getClass.getPackage
val name = p.getImplementationTitle
val version = p.getImplementationVersion
You can also generate a dynamic config file and read it from Scala.
// generate config (to pass values from build.sbt to scala)
Compile / resourceGenerators += Def.task {
val file = baseDirectory.value / "conf" / "generated.conf"
val contents = "app.version=%s".format(version.value)
IO.write(file, contents)
Seq(file)
}.taskValue
When you run sbt universal:packageBin the file will be there.
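Reading the generated file back at runtime could then look like this (a sketch assuming the Typesafe Config library is on the runtime classpath):

```scala
import com.typesafe.config.ConfigFactory

// parse the file written by the resource generator above
val conf = ConfigFactory.parseFile(new java.io.File("conf/generated.conf"))
val appVersion = conf.getString("app.version")
```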

How to get list of dependency jars from an sbt 0.10.0 project

I have a sbt 0.10.0 project that declares a few dependencies somewhat like:
object MyBuild extends Build {
val commonDeps = Seq("commons-httpclient" % "commons-httpclient" % "3.1",
"commons-lang" % "commons-lang" % "2.6")
val buildSettings = Defaults.defaultSettings ++ Seq ( organization := "org" )
lazy val proj = Project("proj", file("src"),
settings = buildSettings ++ Seq(
name := "projname",
libraryDependencies := commonDeps, ...)
...
}
I wish to create a build rule to gather all the jar dependencies of "proj" so that I can symlink them to a single directory.
Thanks.
Example SBT task to print full runtime classpath
Below is roughly what I'm using. The "get-jars" task is executable from the SBT prompt.
import sbt._
import Keys._
object MyBuild extends Build {
// ...
val getJars = TaskKey[Unit]("get-jars")
val getJarsTask = getJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
println("Target path is: "+target)
println("Full classpath is: "+cp.map(_.data).mkString(":"))
}
lazy val project = Project (
"project",
file ("."),
settings = Defaults.defaultSettings ++ Seq(getJarsTask)
)
}
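Since the goal was to symlink the jars into a single directory, the task above could be extended along these lines (a sketch in the same old sbt style; the link-jars name and deps directory are assumptions, and Files.createSymbolicLink requires Java 7+):

```scala
import java.nio.file.Files

val linkJars = TaskKey[Unit]("link-jars")
val linkJarsTask = linkJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
  // create target/deps and symlink every jar on the runtime classpath into it
  val linkDir = target / "deps"
  IO.createDirectory(linkDir)
  cp.map(_.data).filter(_.getName.endsWith(".jar")).foreach { jar =>
    val link = (linkDir / jar.getName).toPath
    if (!Files.exists(link)) Files.createSymbolicLink(link, jar.toPath)
  }
}
```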
Other resources
Unofficial guide to sbt 0.10.
Keys.scala defines predefined keys. For example, you might want to replace fullClasspath with managedClasspath.
This plugin defines a simple command to generate an .ensime file, and may be a useful reference.