sbt Task classpath - scala

I'm working on an sbt task and I would like to have access to some of the application classes and dependencies.
(Specifically, I'd like to generate the database DDL using ScalaQuery.)
Is there any way to add those dependencies to the task, or do I need to create a plugin for this?
object ApplicationBuild extends Build {
  val appName = "test"
  val appVersion = "1.0-SNAPSHOT"

  val appDependencies = Seq(
    "org.scalaquery" % "scalaquery_2.9.0-1" % "0.9.5")

  val ddl = TaskKey[Unit]("ddl", "Generates the ddl in the evolutions folder")

  val ddlTask = ddl <<= (baseDirectory, fullClasspath in Runtime) map { (bs, cp) =>
    val f = bs / "conf/evolutions/default"
    // Figures out the last sql number used
    def nextFileNumber = { ... }
    // writes to file
    def printToFile(f: java.io.File)(op: java.io.PrintWriter => Unit) { ... }
    def createDdl = {
      import org.scalaquery.session._
      import org.scalaquery.ql._
      import org.scalaquery.ql.TypeMapper._
      import org.scalaquery.ql.extended.H2Driver.Implicit._
      import org.scalaquery.ql.extended.{ ExtendedTable => Table }
      import models._
      printToFile(new java.io.File(nextFileNumber, f))(p => {
        models.Table.ddl.createStatements.foreach(p.println)
      });
    }
    createDdl
    None
  }

  val main = PlayProject(appName, appVersion, appDependencies, mainLang = SCALA).settings(
    ddlTask)
}
The error I get is
[test] $ reload
[info] Loading global plugins from /home/asal/.sbt/plugins
[info] Loading project definition from /home/asal/myapps/test/project
[error] /home/asal/myapps/test/project/Build.scala:36: object scalaquery is not a member of package org
[error] import org.scalaquery.session._
[error] ^
[error] one error found
Thanks in advance

You have to add ScalaQuery and everything else your build depends on as a build dependency. That basically means you have to add it "as an sbt plugin".
This is described in some detail in the Using Plugins section of the sbt wiki. It all boils down to a very simple thing, though: just add a line declaring the dependency in project/plugins.sbt, like this:
libraryDependencies += "org.scalaquery" % "scalaquery_2.9.0-1" % "0.9.5"
Now, the problem with using application classes in the build is that you can't really add build products as build dependencies. So you would probably have to create a separate project that builds your DDL module, and add that as a dependency of this project's build.
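A minimal sketch of that setup (my own, not from the original answer; the "com.example" %% "ddl-model" coordinates are hypothetical): keep the table definitions in their own small project, publish it locally, and add it to project/plugins.sbt next to ScalaQuery so the ddl task can import models._:
// project/plugins.sbt -- sketch; "ddl-model" is a hypothetical module that
// holds the table definitions and is published from its own build
// (e.g. with publish-local)
libraryDependencies += "org.scalaquery" % "scalaquery_2.9.0-1" % "0.9.5"

libraryDependencies += "com.example" %% "ddl-model" % "1.0-SNAPSHOT"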

Related

How to get the dependencies of all sub-projects from a SBT project, in a SBT task?

I'm writing an SBT task that outputs dependency information grouped by project (say the SBT build contains multiple sub-projects).
I know there is the sbt-dependency-graph plugin, but I can't use it directly: I want to generate a JSON file, and that plugin just outputs the dependency tree to the console without returning a data object, so I can't easily get the data I want.
I found that the update task returns an UpdateReport which contains a lot of the information I want, but it only covers the current project. On the command line, if I want the information for all projects, I can list them with the projects command and view them one by one with someproject/update.
But how do I do the same in an SBT task? I tried:
val reports = projects.toList.map(prj => (update in prj).value)
It reports:
[error] /Users/me/workspace/sbt-test/project/Build.scala:51: Illegal dynamic reference: prj
[error] val reports = projects.toList.map(prj => (update in prj).value)
[error] ^
[error] one error found
How to fix it?
More code:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val allUpdate = taskKey[Unit]("show update reports of all projects")

  lazy val core = project
  lazy val web = project

  lazy val allUpdateDef = allUpdate := {
    val reports = projects.toList.map(prj => (update in prj).value)
    println(reports)
  }

  lazy val root = (project in file("."))
    .settings(
      allUpdateDef
    )
}
After checking the documentation (http://www.scala-sbt.org/0.13/docs/Tasks.html), I found the solution:
import sbt._
import sbt.Keys._

object DemoBuild extends Build {
  lazy val groupByProject: Def.Initialize[Task[(String, UpdateReport)]] =
    Def.task {
      (thisProject.value.id, (update in thisProject).value)
    }

  lazy val filter = ScopeFilter(inAnyProject, inAnyConfiguration)

  updateByProject := {
    val subProjects = groupByProject.all(filter).value.map { case (projectName, updateReport) =>
      ...
    }
  }
}
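For reference, here is a minimal sketch of how the pieces fit together (my own completion, not part of the original answer): placed inside the same Build object, it reuses the allUpdate key from the question and prints the resolved module IDs per project.
// Sketch: wire groupByProject into the allUpdate task and print the
// resolved modules of every project (sbt 0.13 UpdateReport API).
lazy val allUpdateDef = allUpdate := {
  val perProject: Seq[(String, UpdateReport)] = groupByProject.all(filter).value
  perProject.foreach { case (projectName, updateReport) =>
    val modules = updateReport.configurations.flatMap(_.modules).map(_.module)
    println(projectName + " -> " + modules.mkString(", "))
  }
}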

How to use SBT IntegrationTest configuration from Scala objects

To make our multi-project build more manageable, we split up our Build.scala file into several files; e.g. Dependencies.scala contains all dependencies:
import sbt._

object Dependencies {
  val slf4j_api = "org.slf4j" % "slf4j-api" % "1.7.7"
  ...
}
We want to add integration tests to our build. Following the SBT documentation we added
object Build extends sbt.Build {
  import Dependencies._
  import BuildSettings._
  import Version._
  import MergeStrategies.custom

  lazy val root = Project(
    id = "root",
    base = file("."),
    settings = buildSettings ++ Seq(Git.checkNoLocalChanges, TestReport.testReport)
  ).configs(IntegrationTest).settings(Defaults.itSettings: _*)
  ...
}
where Dependencies, BuildSettings, Version and MergeStrategies are custom Scala objects defined in their own files.
Following the documentation we want to add some dependencies for the IntegrationTest configuration in Dependencies.scala:
import sbt._

object Dependencies {
  val slf4j_api = "org.slf4j" % "slf4j-api" % "1.7.7"
  val junit = "junit" % "junit" % "4.11" % "test,it"
  ...
}
Unfortunately this breaks the build:
java.lang.IllegalArgumentException: Cannot add dependency
'junit#junit;4.11' to configuration 'it' of module ... because this configuration doesn't exist!
I guess I need to import the IntegrationTest configuration, so I tried importing it in Dependencies.scala:
import sbt.Configurations.IntegrationTest
IntegrationTest is a lazy val defined in the Configurations object:
object Configurations {
  ...
  lazy val IntegrationTest = config("it") extend (Runtime)
  ...
}
But that did not solve the problem.
Does anyone have an idea how to solve this?
You need to add the config to the Project object before you add the dependency to it.
Your quoted code shows you doing the former, but it doesn't show where you are doing the latter.
Could you please post the full config, or try reordering those two steps?
Here is where the config is added to the Project object in the SBT docs you linked to:
lazy val root = (project in file(".")).
  configs(IntegrationTest).
Your quoted code above, which declares a lazy val but does not use it, is not sufficient to bring the "it" config into use:
lazy val IntegrationTest = config("it") extend (Runtime)
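For illustration, here is a minimal Build.scala-style sketch of the order that works: add the it config and its settings to the project first, and only then reference the it scope from a dependency (the junit line mirrors the one in the question):
import sbt._
import Keys._

object Build extends sbt.Build {
  lazy val root = Project(id = "root", base = file("."))
    .configs(IntegrationTest)               // 1. add the config to the project
    .settings(Defaults.itSettings: _*)      // 2. add the default it settings
    .settings(libraryDependencies +=
      "junit" % "junit" % "4.11" % "test,it") // 3. "it" now exists and can be referenced
}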

Why does sbt report "not found: value PlayScala" with Build.scala while build.sbt works?

I am creating a multi-module sbt project with the following structure:
<root>
----build.sbt
----project
    ----Build.scala
    ----plugins.sbt
----common
----LoggingModule
LoggingModule is a Play Framework project, while common is a simple Scala project.
In plugins.sbt:
resolvers += "Typesafe repo" at "http://repo.typesafe.com/typesafe/releases/"
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.3")
While I have this in build.sbt, all works fine and it recognises PlayScala:
name := "Multi-Build"
lazy val root = project.in(file(".")).aggregate(common, LoggingModule).dependsOn(common, LoggingModule)
lazy val common = project in file("common")
lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)
However, as soon as I put this in project/Build.scala instead of build.sbt, as follows:
object RootBuild extends Build {
  lazy val root = project.in(file("."))
    .aggregate(common, LoggingModule)
    .dependsOn(common, LoggingModule)

  lazy val common = project in file("common")
  lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)

  ... // other settings
}
it generates an error:
not found: value PlayScala
lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)
^
How to solve the issue?
It's just a missing import.
In .sbt files, some things are automatically imported by default: the contents of objects extending Plugin and, in sbt >= 0.13.5, the autoImport fields of AutoPlugins. This is the case for PlayScala.
In a Build.scala file, normal Scala import rules apply, so you have to import things a bit more explicitly. In this case, you need to import play.PlayScala (or use .enablePlugins(play.PlayScala) directly).
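For example, a trimmed-down version of the asker's Build.scala with the import added (only the relevant lines shown):
import sbt._
import Keys._
import play.PlayScala // not auto-imported in Build.scala, unlike in build.sbt

object RootBuild extends Build {
  lazy val common = project in file("common")
  lazy val LoggingModule = (project in file("LoggingModule")).enablePlugins(PlayScala)

  lazy val root = project.in(file("."))
    .aggregate(common, LoggingModule)
    .dependsOn(common, LoggingModule)
}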

Play Framework and scala.tools.nsc

I have to use the Scala parser inside a Play Framework application.
import scala.tools.nsc._

trait Foo

class Parser {
  def parse(code: String) = {
    val settings = new Settings
    settings.embeddedDefaults[Foo]
    val interpreter = new Interpreter(settings)
    interpreter.parse(code)
  }
}
I have the following dependency in Build.scala:
"org.scala-lang" % "scala-compiler" % "2.9.1"
This code works when built with SBT. In Play it ends with a NullPointerException and:
Failed to initialize compiler: object scala not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
Build.scala
import sbt._
import Keys._
import PlayProject._

object ApplicationBuild extends Build {
  val appName = "com.qwerty.utils"
  val appVersion = "1.0-SNAPSHOT"
  val scalaVersion = "2.9.1"

  val appDependencies = Seq(
    "org.scala-lang" % "scala-compiler" % "2.9.1"
  )

  val main = PlayProject(appName, appVersion, appDependencies, mainLang = SCALA).settings(
    // Add your own project settings here
  )
}
For background on embeddedDefaults, see the original proposal.
The container (Play) must define the 'app.class.path' and 'boot.class.path' resources and then embeddedDefaults will use them to configure the interpreter properly for the environment. So, this is an enhancement for Play.
If you can pass the necessary classpaths into your application, you can configure classpaths and classloaders explicitly yourself with something like:
val settings = new Settings
settings.classpath.value = "<classpath>"
settings.bootclasspath.value =
  settings.bootclasspath.value + File.pathSeparator + "<extra-bootclasspath>"

val interpreter = new Interpreter(settings) {
  override def parentClassLoader = classOf[Foo].getClassLoader
}
interpreter.parse(code)
The bootclasspath should generally contain scala-library.jar and the classpath should contain the application jars.
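If the classpath is not known statically, one option is to derive it from the application's classloader at runtime. A sketch, under the assumption that the classloader is a URLClassLoader (which is not guaranteed in every Play setup):
import java.io.File
import java.net.URLClassLoader
import scala.tools.nsc.Settings

// Sketch: build the classpath string from the classloader that loaded Foo,
// falling back to the JVM's java.class.path otherwise.
val classpath = classOf[Foo].getClassLoader match {
  case ucl: URLClassLoader =>
    ucl.getURLs.map(u => new File(u.toURI).getAbsolutePath).mkString(File.pathSeparator)
  case _ =>
    sys.props("java.class.path")
}

val settings = new Settings
settings.classpath.value = classpath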

How to get list of dependency jars from an sbt 0.10.0 project

I have an sbt 0.10.0 project that declares a few dependencies somewhat like:
object MyBuild extends Build {
  val commonDeps = Seq(
    "commons-httpclient" % "commons-httpclient" % "3.1",
    "commons-lang" % "commons-lang" % "2.6")

  val buildSettings = Defaults.defaultSettings ++ Seq(organization := "org")

  lazy val proj = Project("proj", file("src"),
    settings = buildSettings ++ Seq(
      name := "projname",
      libraryDependencies := commonDeps, ...)
  ...
}
I wish to create a build rule to gather all the jar dependencies of "proj", so that I can symlink them into a single directory.
Thanks.
Example SBT task to print full runtime classpath
Below is roughly what I'm using. The "get-jars" task is executable from the SBT prompt.
import sbt._
import Keys._

object MyBuild extends Build {
  // ...

  val getJars = TaskKey[Unit]("get-jars")

  val getJarsTask = getJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
    println("Target path is: " + target)
    println("Full classpath is: " + cp.map(_.data).mkString(":"))
  }

  lazy val project = Project(
    "project",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(getJarsTask)
  )
}
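If the goal is to gather the jars into one directory (as in the question), the same classpath can be copied rather than printed. A sketch along the same lines (the copy-jars key and the target/deps location are my own choices, and it copies instead of symlinking):
// Sketch: copy every jar on the runtime classpath into target/deps.
val copyJars = TaskKey[Unit]("copy-jars")

val copyJarsTask = copyJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
  val dir = target / "deps"
  IO.createDirectory(dir)
  cp.map(_.data).filter(_.getName.endsWith(".jar")).foreach { jar =>
    IO.copyFile(jar, dir / jar.getName)
  }
}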
Other resources
Unofficial guide to sbt 0.10.
Keys.scala defines predefined keys. For example, you might want to replace fullClasspath with managedClasspath.
This plugin defines a simple command to generate an .ensime file, and may be a useful reference.