sbt - basic local plugin setup? - scala

I have a particular task I'd like to automate as part of a build process, but I'm stuck at the grammar stage with sbt. I'm trying to do a helloworld-ish task using two local projects, one the plugin and one a test using that plugin, but I can't get the new task in the plugin (sampleIntTask) to be available when using sbt on the test project.
I have the following in the filesystem:
/plugin/
    Plugin.scala
    build.sbt
/test-using-plugin/
    build.sbt
    project/plugins.sbt
For my helloworld-ish plugin, in Plugin.scala:
import sbt._
import Keys._

object MyPlugin extends Plugin {
  val sampleIntTask = taskKey[Int]("sum 1 and 2")

  sampleIntTask := {
    val sum = 1 + 2
    println("sum: " + sum)
    sum
  }
}
in plugin/build.sbt:
sbtPlugin := true
name := "myPlugin"
version := "0.1"
scalaVersion := "2.10.3"
and for testing it: in test-using-plugin/build.sbt:
name := "test-test-test"
version := "0.1"
scalaVersion := "2.10.3"
and in test-using-plugin/project/plugins.sbt:
lazy val root = project.in( file(".") ).dependsOn( testPlugin )
lazy val testPlugin = file("/Users/cap10/gitprojects/processing")
When I run /test-using-plugin$ sbt sampleIntTask, I get:
[info] Set current project to test-using-plugin (in build ...)
> sampleIntTask
[error] Not a valid command: sampleIntTask
[error] Not a valid project ID: sampleIntTask
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: sampleIntTask (similar: compileInputs)
[error] sampleIntTask
[error] ^
I feel like this is about the right level of complexity for this test (define plugin project config, define plugin project behavior, define test project config, add a dependency on the plugin project), but I'd be unsurprised if I'm totally off base on the grammar, as I can't make heads or tails of the sbt intro.

build.sbt
If you do not need to share the settings across multiple builds, you can just add your settings to test-using-plugin/custom.sbt:
val sampleIntTask = taskKey[Int]("sum 1 and 2")

sampleIntTask := {
  val sum = 1 + 2
  println("sum: " + sum)
  sum
}
and forget about the plugin.
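With that in place, running sbt sampleIntTask inside test-using-plugin should print sum: 3 and return 3.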
Local plugin way
I haven't tested the other parts, but your Plugin.scala is wrong.
The setting expression needs to be in a setting sequence:
import sbt._
import Keys._

object MyPlugin extends Plugin {
  val sampleIntTask = taskKey[Int]("sum 1 and 2")

  lazy val baseMyPluginSettings: Seq[sbt.Def.Setting[_]] = Seq(
    sampleIntTask := {
      val sum = 1 + 2
      println("sum: " + sum)
      sum
    }
  )

  lazy val myPluginSettings: Seq[sbt.Def.Setting[_]] = baseMyPluginSettings
}
And in your test-using-plugin/build.sbt add:
myPluginSettings
If you have to share settings across builds, you can make a plugin like this or put them in a global sbt file. The use of the global sbt file should be limited to user-specific settings and commands, so that's out. Personally, I would publish the plugin locally using publishLocal so that it doesn't depend on a specific file path. You can then use the locally published plugin like any other plugin:
addSbtPlugin("com.example" % "myPlugin" % "0.1" changing())
By using "-SNAPSHOT" version or by calling changing(), sbt will check for the latest.

Related

get SBT settings from ModuleID

How can I use a moduleID: ModuleID for a "sibling" project to access settings keys?
I'm writing an SBT plugin for multi-module builds.
I have project A (which dependsOn B) and project B.
Both projects have my own generate and mybuild tasks as setting keys.
The mybuild task consumes the value from generate - this works fine.
B doesn't depend upon anything, so B's mybuild only needs the key for B:generate and all is well.
I want A's mybuild to consume both A:generate and B:generate based on the fact that A dependsOn B in the build.sbt file.
The only promising keys I've found return the projects as ModuleID instances, so is there some way to get a list of setting keys from a ModuleID?
... or should I be doing this another way?
Solution (Kind of)
With @himos's help this ...
(myTaskKey in myConfig) := {
  loadedBuild.value.allProjectRefs.find(_._1 == thisProjectRef.value).map(_._2) match {
    case Some(myCurrentProject) =>
      if (myCurrentProject.dependencies.nonEmpty)
        sys.error {
          myCurrentProject.dependencies
            .map { myDependsOnProject: ClasspathDep[ProjectRef] =>
              (myDependsOnProject.project / myConfig / myTaskKey).value
              // https://www.scala-sbt.org/0.13/docs/Tasks.html#Dynamic+Computations+with
            }
            .foldLeft("mine.dependencies:")(_ + "\n\t" + _)
        }
  }
}
... sort of works.
It causes an error that implies I've accessed the correct object, even if the SBT macros don't like it.
I think the ModuleID that you mention relates to dependency management, not subprojects.
To access a subproject's setting/task keys, project scope can be used:
(generate in A).value
(generate in B).value
More comprehensive example:
name := "A"
version := "1.0"
scalaVersion := "2.12.5"
val generate = TaskKey[String]("generate")
val myBuild = TaskKey[String]("myBuild")
val a = (project in file(".")).settings(Seq(
generate := "A_generate"
))
val b = (project in file("proj_b")).settings(Seq(
generate := "B_generate",
myBuild := (generate in a).value + "_" + generate.value
)).dependsOn(a)
Sbt console output:
sbt:A> show b/myBuild
[info] A_generate_B_generate
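If the upstream projects shouldn't be hard-coded, a hedged alternative (not part of the answer above) is to aggregate the task over the current project's dependsOn graph with a ScopeFilter, e.g. in b's settings:
// sketch only: collect generate from every project this one dependsOn,
// then append this project's own generate value
myBuild := {
  val upstream = generate.all(ScopeFilter(inDependencies(ThisProject, includeRoot = false))).value
  (upstream :+ generate.value).mkString("_")
}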

"Reference to undefined setting" error with custom task using a custom configuration in SBT?

I'm trying to create a task in sbt that will output the full classpath of a custom Configuration, but I get an undefined setting error when sbt tries to load the project definition. I can't figure out which setting has to be defined:
import sbt.Keys._
import sbt._

object FoobarBuild extends Build {
  lazy val ZK = config("zk")

  lazy val fcp = TaskKey[String]("fcp", "create formatted classpath")

  lazy val fcpTask = fcp <<= (fullClasspath in ZK) map { cp =>
    println(cp.files.absString)
    cp.files.absString
  }

  lazy val project = Project("foobar", file(".")).
    configs(ZK).
    settings(
      name := "foobar",
      version := "1.0",
      scalaVersion := "2.11.7"
    ).
    settings(fcpTask)
}
Error:
[info] Loading project definition from foobar/project
Reference to undefined setting:
zk:fullClasspath from *:fcp (/Users/gaston/mesosphere/foobar/project/Build.scala:7)
zk:fullClasspath on the 7th line of this file is, of course, fullClasspath in ZK. It's undefined because it isn't set or inherited from any other config, I believe.
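One common way to give the zk configuration its own classpath settings is to pull in sbt's per-configuration defaults with inConfig (a hedged sketch; extending Compile is an assumption about what zk should inherit):
lazy val ZK = config("zk") extend Compile

lazy val project = Project("foobar", file(".")).
  configs(ZK).
  settings(inConfig(ZK)(Defaults.configSettings): _*).
  settings(
    name := "foobar",
    version := "1.0",
    scalaVersion := "2.11.7"
  ).
  settings(fcpTask)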

Error while using aspectj with Scala

I have an application in Scala, and I need to use AOP for one piece of functionality, so I used the sbt-aspectj plugin. Everything works fine when I run from the sbt console; however, I am not able to make it work when using the executable jar. I tried the sample code provided on the sbt-aspectj GitHub page, but I am getting errors such as:
[warn] warning incorrect classpath: D:\source\jvm\modules\scala\frameworks\aspectjTracer\target\scala-2.11\classes
[warn] Missing message: configure.invalidClasspathSection in: org.aspectj.ajdt.ajc.messages
[error] error no sources specified
[trace] Stack trace suppressed: run 'last aspectjTracer/aspectj:ajc' for the full output.
[error] (aspectjTracer/aspectj:ajc) org.aspectj.bridge.AbortException: ABORT
[error] Expected project ID
[error] Expected configuration
[error] Expected ':' (if selecting a configuration)
[error] Expected key
[error] Not a valid key: aspectjTracker (similar: aspectjSource, aspectj-source, aspectjDirectory)
[error] last aspectjTracker/aspectj:ajc
[error]
My Build.scala is given below:
object frameworkBuild extends Build {
  import Dependencies._

  val akkaV = "2.3.6"
  val sprayV = "1.3.1"
  val musterV = "0.3.0"

  val common_settings = Defaults.defaultSettings ++
    Seq(version := "1.3-SNAPSHOT",
      organization := "com.reactore",
      scalaVersion in ThisBuild := "2.11.2",
      scalacOptions ++= Seq("-unchecked", "-feature", "-deprecation"),
      libraryDependencies := frameworkDependencies ++ testLibraryDependencies,
      publishMavenStyle := true)

  connectInput in run := true

  lazy val aspectJTracer = Project(
    "aspectjTracer",
    file("aspectjTracer"),
    settings = common_settings ++ aspectjSettings ++ Seq(
      // input compiled scala classes
      inputs in Aspectj <+= compiledClasses,
      // ignore warnings
      lintProperties in Aspectj += "invalidAbsoluteTypeName = ignore",
      lintProperties in Aspectj += "adviceDidNotMatch = ignore",
      // replace regular products with compiled aspects
      products in Compile <<= products in Aspectj
    )
  )

  // test that the instrumentation works
  lazy val instrumented = Project(
    "instrumented",
    file("instrumented"),
    dependencies = Seq(aspectJTracer),
    settings = common_settings ++ aspectjSettings ++ Seq(
      // add the compiled aspects from tracer
      binaries in Aspectj <++= products in Compile in aspectJTracer,
      // weave this project's classes
      inputs in Aspectj <+= compiledClasses,
      products in Compile <<= products in Aspectj,
      products in Runtime <<= products in Compile
    )
  )

  lazy val frameworks = Project(id = "frameworks", base = file("."), settings = common_settings).aggregate(core, baseDomain, aspectJTracer, instrumented)

  lazy val core = Project(id = "framework-core", base = file("framework-core"), settings = common_settings)

  lazy val baseDomain = Project(id = "framework-base-domain", base = file("framework-base-domain"), settings = common_settings).dependsOn(core, aspectJTracer, instrumented)
}
Does anyone know how to fix this? I posted this on the sbt-aspectj GitHub page and am waiting for a response there as well, but I am in a bit of a hurry to fix this. Your help will be really appreciated.
Finally the problem is resolved. I had added the javaagent in Build.scala, but while running with sbt-one-jar it was not picking up that jar. So I manually provided the aspectjweaver jar as the javaagent and it worked. However, it takes almost 3-4 minutes to start the jar file with the aspect, and sometimes it even takes 15 minutes due to the aspectjweaver. I am not sure if this is a problem with aspectj or sbt-one-jar; I guess it's with one-jar.
Has anyone else faced the same issue? I don't see any activity in sbt-one-jar, so I'm asking it here.
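For reference, AspectJ load-time weaving needs the weaver jar passed with -javaagent to whatever JVM actually runs the jar, e.g. java -javaagent:path/to/aspectjweaver.jar -jar app-one-jar.jar (paths and jar names here are hypothetical).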

Why does sbt console not see packages from subproject in multi-module project?

This is my project/Build.scala:
package sutils

import sbt._
import Keys._

object SutilsBuild extends Build {
  scalaVersion in ThisBuild := "2.10.0"

  val scalazVersion = "7.0.6"

  lazy val sutils = Project(
    id = "sutils",
    base = file(".")
  ).settings(
    test := { },
    publish := { }, // skip publishing for this root project.
    publishLocal := { }
  ).aggregate(
    core
  )

  lazy val core = Project(
    id = "sutils-core",
    base = file("sutils-core")
  ).settings(
    libraryDependencies += "org.scalaz" % "scalaz-core_2.10" % scalazVersion
  )
}
This seems to be compiling my project just fine, but when I go into the console, I can't import any of the code that just got compiled?!
$ sbt console
scala> import com.github.dcapwell.sutils.validate.Validation._
<console>:7: error: object github is not a member of package com
import com.github.dcapwell.sutils.validate.Validation._
What am I doing wrong here? Looking at the usage, I don't see a way to say which subproject to load while in the console.
$ sbt about
[info] Loading project definition from /src/sutils/project
[info] Set current project to sutils (in build file:/src/sutils/)
[info] This is sbt 0.13.1
[info] The current project is {file:/src/sutils/}sutils 0.1-SNAPSHOT
[info] The current project is built against Scala 2.10.3
[info] Available Plugins: org.sbtidea.SbtIdeaPlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.3
There's the solution from @Alexey Romanov to start the console task in the project that the classes to import are in:
sbt sutils/console
There is, however, another solution that makes the root sutils project depend on the core project. Use the following snippet to set up the project - note the dependsOn(core), which will bring the classes from the core project into sutils's namespace.
lazy val sutils = Project(
  id = "sutils",
  base = file(".")
).settings(
  test := { },
  publish := { }, // skip publishing for this root project.
  publishLocal := { }
).aggregate(
  core
).dependsOn(core)
BTW, you should really use a simpler build.sbt for your use case as follows:
scalaVersion in ThisBuild := "2.10.0"

val scalazVersion = "7.0.6"

lazy val sutils = project.in(file(".")).settings(
  test := {},
  publish := {}, // skip publishing for this root project.
  publishLocal := {}
).aggregate(core).dependsOn(core)

lazy val core = Project(
  id = "sutils-core",
  base = file("sutils-core")
).settings(
  libraryDependencies += "org.scalaz" %% "scalaz-core" % scalazVersion
)
You could make it even simpler by splitting the build into two build.sbt files, one per project.
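A hedged sketch of that split, with the scalaz version inlined so nothing needs to be shared between the two files (layout assumed from the build above):
// ./build.sbt
scalaVersion in ThisBuild := "2.10.0"

lazy val core = project in file("sutils-core")

lazy val sutils = project.in(file(".")).settings(
  test := {},
  publish := {},
  publishLocal := {}
).aggregate(core).dependsOn(core)

// ./sutils-core/build.sbt
libraryDependencies += "org.scalaz" %% "scalaz-core" % "7.0.6"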

How to get list of dependency jars from an sbt 0.10.0 project

I have an sbt 0.10.0 project that declares a few dependencies, somewhat like:
object MyBuild extends Build {
  val commonDeps = Seq("commons-httpclient" % "commons-httpclient" % "3.1",
    "commons-lang" % "commons-lang" % "2.6")

  val buildSettings = Defaults.defaultSettings ++ Seq(organization := "org")

  lazy val proj = Project("proj", file("src"),
    settings = buildSettings ++ Seq(
      name := "projname",
      libraryDependencies := commonDeps, ...)
  ...
}
I wish to create a build rule to gather all the jar dependencies of "proj", so that I can symlink them to a single directory.
Thanks.
Example SBT task to print full runtime classpath
Below is roughly what I'm using. The "get-jars" task is executable from the SBT prompt.
import sbt._
import Keys._

object MyBuild extends Build {
  // ...

  val getJars = TaskKey[Unit]("get-jars")

  val getJarsTask = getJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
    println("Target path is: " + target)
    println("Full classpath is: " + cp.map(_.data).mkString(":"))
  }

  lazy val project = Project(
    "project",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(getJarsTask)
  )
}
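Since the original goal was to gather the jars into one directory, the same recipe can copy them instead of just printing them. A hedged sketch (copy-jars and the target/jars directory are made-up names; this goes inside the same Build object, with copyJarsTask added to the project settings alongside getJarsTask):
val copyJars = TaskKey[Unit]("copy-jars")

val copyJarsTask = copyJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
  // copy every plain-file entry on the runtime classpath (i.e. the jars) into target/jars
  val jarDir = target / "jars"
  IO.createDirectory(jarDir)
  val jars = cp.files.filter(_.isFile)
  jars.foreach(jar => IO.copyFile(jar, jarDir / jar.getName))
  println("Copied " + jars.size + " jars to " + jarDir)
}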
Other resources
Unofficial guide to sbt 0.10.
Keys.scala defines predefined keys. For example, you might want to replace fullClasspath with managedClasspath.
This plugin defines a simple command to generate an .ensime file, and may be a useful reference.