I need to add the base directory path to a system property:
lazy val bas = baseDirectory.value.getPath
initialize ~= { _ =>
System.setProperty( "report.path", bas+"/reports/l2report")
System.setProperty( "build-version", buildVersion)
}
build.sbt:111: error: `value` can only be used within a task or setting macro, such as :=, +=, ++=, Def.task, or Def.setting.
This error is coming from build.sbt.
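For reference, a minimal sketch of one way around this, following the hint in the error message: every .value call has to live inside a setting macro such as :=, so the path is read inside the initialize body instead of in a top-level lazy val (buildVersion is assumed to be defined elsewhere in the build, as in the snippet above).

initialize := {
  val _ = initialize.value // keep whatever the previous initialize definition did
  val base = baseDirectory.value.getPath
  System.setProperty("report.path", base + "/reports/l2report")
  System.setProperty("build-version", buildVersion)
}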
I've defined a case class to be used as a schema for a Dataset in Spark.
I want to be able to refer to individual columns from that schema by referencing them programmatically (vs. hardcoding their string value somewhere)
For example, for the following case class
final case class MySchema(id: Int, name: String, timestamp: Long)
I would like to auto-generate the following object
object MySchema {
val id = "id"
val name = "name"
val timestamp = "timestamp"
}
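For illustration, the goal is to be able to reference columns roughly like this (a sketch using the standard Spark SQL col function; only the names from the case class above are assumed):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val ds = spark.createDataset(Seq(MySchema(1, "a", 0L)))
// Column names come from the generated object instead of string literals
ds.select(col(MySchema.id), col(MySchema.name)).show()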
The Macro approach outlined here appears to be what I want, but it won't compile under Scala 2.12. It gives the following errors which are completely baffling to me and show up in a total of 2 Google results with 0 fixes.
[error] pattern var qq$macro$2 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$2@_`
[error] case (c @ q"$_ class $tpname[..$_] $_(...$params) extends { ..$_ } with ..$_ { $_ => ..$_ }") :: Nil =>
[error] ^
[error] pattern var qq$macro$19 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$19@_`
[error] case (c @ q"$_ class $_[..$_] $_(...$params) extends { ..$_ } with ..$_ { $_ => ..$_ }") ::
[error] ^
[error] pattern var qq$macro$27 in method unapply is never used: use a wildcard `_` or suppress this warning with `qq$macro$27@_`
[error] q"$mods object $tname extends { ..$earlydefns } with ..$parents { $self => ..$body }" :: Nil =>
[error] ^
Suppressing the warning as outlined won't work because the macro numbers change every time I compile.
It's also worth noting that the similar SO answer here runs into the same compiler errors as shown above.
IntelliJ also complains about several parts of the macro that the compiler doesn't complain about, but that's not really an issue if I can get it to compile.
Is there a way to fix that Macro approach to work in Scala 2.12 or is there a better Scala 2.12 way to do this? (I can't use Scala 2.13 or higher due to compute environment constraints)
I just checked that the macro is still working in both Scala 2.13.10 and 2.12.17.
Most probably, you didn't set up your project for macro annotations:
build.sbt
//ThisBuild / scalaVersion := "2.13.10"
ThisBuild / scalaVersion := "2.12.17"
lazy val macroAnnotationSettings = Seq(
scalacOptions ++= (CrossVersion.partialVersion(scalaVersion.value) match {
case Some((2, v)) if v >= 13 => Seq("-Ymacro-annotations") // for Scala 2.13
case _ => Nil
}),
libraryDependencies ++= (CrossVersion.partialVersion(scalaVersion.value) match {
case Some((2, v)) if v <= 12 => // for Scala 2.12
Seq(compilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full))
case _ => Nil
})
)
lazy val core = project
.settings(
macroAnnotationSettings,
scalacOptions ++= Seq(
"-Ymacro-debug-lite", // optional, convenient to see how macros are expanded
),
)
.dependsOn(macros) // you must split your project into subprojects because macros must be compiled before core
lazy val macros = project
.settings(
macroAnnotationSettings,
libraryDependencies ++= Seq(
scalaOrganization.value % "scala-reflect" % scalaVersion.value, // necessary for macros
),
)
project structure:
core
  src
    main
      scala
        Main.scala
macros
  src
    main
      scala
        Macros.scala
Then just do sbt clean compile.
The whole project: https://gist.github.com/DmytroMitin/2d9dbd6441ebf167aa127b80fb516afd
sbt documentation:
https://www.scala-sbt.org/1.x/docs/Macro-Projects.html
Scala documentation: https://docs.scala-lang.org/overviews/macros/annotations.html
Examples of build.sbt:
https://github.com/typelevel/simulacrum/blob/master/build.sbt
https://github.com/DmytroMitin/AUXify/blob/master/build.sbt
I was trying this piece of code:
// Get the Docker Configurations
def dockerConfForModule(moduleName: String): Dockerfile = {
val jarFile: File = (Compile / packageBin / sbt.Keys.`package`).value
val classpath = (Compile / managedClasspath).value
val mainclass = (Compile / packageBin / mainClass).value.getOrElse(sys.error("Expected exactly one main class"))
val jarTarget = s"/app/${jarFile.getName}"
// Make a colon separated classpath with the JAR file
val classpathString = classpath.files.map("/app/" + _.getName)
.mkString(":") + ":" + jarTarget
new Dockerfile {
// Base image
from("openjdk:8-jre")
// Add all files on the classpath
add(classpath.files, "/app/")
// Add the JAR file
add(jarFile, jarTarget)
// On launch run Java with the classpath and the main class
entryPoint("java", "-cp", classpathString, mainclass)
}
}
Which I'm calling as a function in one of my modules like below:
// Define Sub Modules and its settings
lazy val cleanse = (project in file(MODULE_NAME_CLEANSE)).dependsOn(core)
.settings(
commonSettings,
enablingCoverageSettings,
name := MODULE_NAME_CLEANSE,
description := "Clean the incoming data for training purposes",
docker / dockerfile := dockerConfForModule(MODULE_NAME_CLEANSE)
)
.enablePlugins(DockerPlugin)
But this runs into an error:
/home/joesan/Projects/Private/ml-projects/housing-price-prediction-data-preparation/build.sbt:22: error: `value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.
val jarFile: File = (Compile / packageBin / sbt.Keys.`package`).value
^
/home/joesan/Projects/Private/ml-projects/housing-price-prediction-data-preparation/build.sbt:23: error: `value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.
val classpath = (Compile / managedClasspath).value
^
/home/joesan/Projects/Private/ml-projects/housing-price-prediction-data-preparation/build.sbt:24: error: `value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.
val mainclass = (Compile / packageBin / mainClass).value.getOrElse(sys.error("Expected exactly one main class"))
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
[warn] Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? (default: r)
What is wrong with what I'm trying?
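One way out, sketched under the assumption that sbt-docker's dockerfile is a task key (as the usage above suggests): wrap the body in Def.task so that the .value calls live inside a task definition macro, as the error message itself recommends, and hook the resulting task in with .value at the call site.

// The helper now returns a task instead of a plain Dockerfile
def dockerConfForModule(moduleName: String): Def.Initialize[Task[Dockerfile]] = Def.task {
  val jarFile: File = (Compile / packageBin / sbt.Keys.`package`).value
  val classpath = (Compile / managedClasspath).value
  val mainclass = (Compile / packageBin / mainClass).value
    .getOrElse(sys.error("Expected exactly one main class"))
  val jarTarget = s"/app/${jarFile.getName}"
  // Make a colon separated classpath with the JAR file
  val classpathString = classpath.files.map("/app/" + _.getName).mkString(":") + ":" + jarTarget
  new Dockerfile {
    from("openjdk:8-jre")
    add(classpath.files, "/app/")
    add(jarFile, jarTarget)
    entryPoint("java", "-cp", classpathString, mainclass)
  }
}

// ...and in the module definition the task is wired in with .value:
docker / dockerfile := dockerConfForModule(MODULE_NAME_CLEANSE).value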
I'm trying to create a task in sbt that will output the full classpath of a custom Configuration, but I get an undefined setting error when sbt tries to load the project definition. I can't figure out which setting has to be defined:
import sbt.Keys._
import sbt._
object FoobarBuild extends Build {
lazy val ZK = config("zk")
lazy val fcp = TaskKey[String]("fcp", "create formatted classpath")
lazy val fcpTask = fcp <<= (fullClasspath in ZK) map { cp =>
println(cp.files.absString)
cp.files.absString
}
lazy val project = Project("foobar", file(".")).
configs(ZK).
settings(
name := "foobar",
version := "1.0",
scalaVersion := "2.11.7"
).
settings(fcpTask)
}
Error:
[info] Loading project definition from foobar/project
Reference to undefined setting:
zk:fullClasspath from *:fcp (/Users/gaston/mesosphere/foobar/project/Build.scala:7)
zk:fullClasspath on the 7th line of this file is, of course, fullClasspath in ZK. It's undefined because it isn't set or inherited from any other config, I believe.
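A sketch of one likely fix (assuming the intent is for zk to behave like a regular compile-style configuration): add the standard per-configuration settings to ZK with inConfig, which is what defines fullClasspath (and friends) in that scope.

lazy val ZK = config("zk") extend Compile

lazy val project = Project("foobar", file(".")).
  configs(ZK).
  settings(inConfig(ZK)(Defaults.configSettings): _*). // defines zk:fullClasspath
  settings(
    name := "foobar",
    version := "1.0",
    scalaVersion := "2.11.7"
  ).
  settings(fcpTask)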
I am new to SBT and I have been trying to build a custom task for this build.
I have a simple build project:
import sbt._
import Keys._
object JsonBuild extends Build{
lazy val barTask = taskKey[Unit]("some simple task")
val afterTestTask1 = barTask := { println("tests ran!") }
val afterTestTask2 = barTask <<= barTask.dependsOn(test in Test)
lazy val myBarTask = taskKey[Unit]("some simple task")
//val afterMyBarTask1 = myBarTask := { println("tests ran!") }
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
//settings ++ Seq(afterMyBarTask2)
override lazy val settings = super.settings ++ Seq(afterMyBarTask2)
}
I keep getting the error:
References to undefined settings:
{.}/*:myBarTask from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
Did you mean test:test ?
I have googled around and I cannot find a solution.
Can you explain why it is not working?
lazy val myBarTask = taskKey[Unit]("some simple task")
override lazy val settings = super.settings ++ Seq(myBarTask := { (test in Test).value; println("tests ran!") } )
myBarTask is undefined when you call dependsOn; you should define it before using dependsOn. Also, calling value on a key (task/setting) is now the preferred way to depend on other keys. You can still use your version, but define myBarTask first.
This has been bothering me.
I did a bit more reading.
I think I know why the above code does not work.
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
When I write (myBarTask).dependsOn(test in Test), the project scope for test is chosen by SBT as ThisBuild.
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
ThisBuild project scope does not have the setting test in configuration Test.
Only projects have the setting test.
I think that setting is added to each project's settings by some default SBT plugin.
You can check which scopes a setting exists in by using the inspect command.
If you type in the SBT REPL:
inspect {.}/test:test
the output is:
[info] No entry for key.
SBT correctly suggests test:test, which is:
{file:/C:/Users/haques/Documents/workspace/SBT/jsonParser/}jsonparser/test:test
If the project is not specified in the project scope axis, SBT chooses the current project by default.
Every SBT project has its own project-scoped settings.
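To make that concrete, a sketch of the same build with the task moved into a project's settings (the project name jsonParser is assumed); there test:test resolves because the task now lives in a project scope:

import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val myBarTask = taskKey[Unit]("some simple task")

  lazy val jsonParser = Project("jsonParser", file(".")).settings(
    // test:test exists in the project scope, so the dependency resolves
    myBarTask := { (test in Test).value; println("tests ran!") }
  )
}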
I want to override the value of a SettingKey b only when computing SettingKey a1.
import sbt._
import sbt.Keys._
object Build extends Build {
val a1Key = SettingKey[String]("a1", "")
val a2Key = SettingKey[String]("a2", "")
val bKey = SettingKey[String]("b", "")
lazy val rootProject = Project("P", file(".")).settings(
bKey := "XXX",
a1Key <<= bKey((x) => ">>>"+x+"<<<"),
a2Key <<= bKey((x) => ">>>"+x+"<<<")
) .settings(
bKey in a1Key := "YYY" //providing custom value in setting scope
)
}
The current result is:
> a1
[info] >>>XXX<<<
> a2
[info] >>>XXX<<<
> b
[info] XXX
...but I'm aiming at seeing YYY as the value of a1:
> a1
[info] >>>YYY<<<
> a2
[info] >>>XXX<<<
> b
[info] XXX
A better real-world example than the above is when you want to add some resources to your build only in the runtime configuration, and some other resources when the application is packaged. For example, when building a GWT app, the public resources served by the server during development mode and in production are different. It would be nice, for example, to customize the resource-directories setting separately for the run and package tasks.
You need to set a1Key and a2Key to allow for bKey to be overridden in the first place:
lazy val rootProject = Project("P", file(".")).settings(
bKey := "Fnord",
a1Key <<= (bKey in a1Key)(x => ">>>" + x + "<<<"),
a2Key <<= (bKey in a2Key)(x => ">>>" + x + "<<<")
).settings(
bKey in a1Key := "Meep"
)
That way, computing a1Key will use the more specific value Meep, while computing a2Key makes sbt "look for" the definition of bKey in a2Key and, because it doesn't "find" it, fall back to the more general bKey (in the default scope), which is defined and therefore used.
Edit: this unfortunately means that unless whoever provides the definitions of the a1Key and a2Key settings also explicitly provides the required extension points (in the form of setting-scoped dependencies), you cannot override the dependencies. That is at least how I understand it.
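For completeness, a sketch of the same pattern without the (since deprecated) <<= operator, using := and .value; the fallback behaviour should be the same, since the dependency on bKey scoped to a1Key/a2Key still delegates to the plain bKey when the scoped value isn't defined:

lazy val rootProject = Project("P", file(".")).settings(
  bKey := "XXX",
  a1Key := ">>>" + (bKey in a1Key).value + "<<<", // picks up the a1-scoped value if defined
  a2Key := ">>>" + (bKey in a2Key).value + "<<<", // falls back to plain bKey
  bKey in a1Key := "YYY"                          // only a1 sees this value
)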