SBT version 1.3.13 - According to SBT documentation:
Like allInputFiles, there is an allOutputFiles task of return type Seq[Path] that is automatically generated for a task, foo, if the return type of foo is one of Seq[Path], Path, Seq[File] or File.
This seems to work with Seq[Path] and Path as expected:
val outputTask = Def.taskKey[Seq[java.nio.file.Path]]("")
outputTask := Seq[java.nio.file.Path]()
val printOutputs = Def.taskKey[Unit]("")
printOutputs := println((outputTask / allOutputFiles).value) // result: Vector()
However, if I change java.nio.file.Path to java.io.File, it fails at load time:
[error] Reference to undefined setting:
[error]
[error] outputTask / allOutputFiles from printOutputs
I'm looking into the sbt source code, but I haven't got a clue yet. Any insight into why it works with Path/Seq[Path] but not with File/Seq[File]?
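For reference, the failing variant is just the snippet above with File substituted for Path (a repro sketch, not new behaviour):

```scala
val outputTask = Def.taskKey[Seq[java.io.File]]("")
outputTask := Seq[java.io.File]()
val printOutputs = Def.taskKey[Unit]("")
printOutputs := println((outputTask / allOutputFiles).value)
// fails at project load with:
// Reference to undefined setting: outputTask / allOutputFiles from printOutputs
```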
I was trying this piece of code:
// Get the Docker configuration
def dockerConfForModule(moduleName: String): Dockerfile = {
  val jarFile: File = (Compile / packageBin / sbt.Keys.`package`).value
  val classpath = (Compile / managedClasspath).value
  val mainclass = (Compile / packageBin / mainClass).value.getOrElse(sys.error("Expected exactly one main class"))
  val jarTarget = s"/app/${jarFile.getName}"
  // Make a colon-separated classpath with the JAR file
  val classpathString = classpath.files.map("/app/" + _.getName)
    .mkString(":") + ":" + jarTarget
  new Dockerfile {
    // Base image
    from("openjdk:8-jre")
    // Add all files on the classpath
    add(classpath.files, "/app/")
    // Add the JAR file
    add(jarFile, jarTarget)
    // On launch run Java with the classpath and the main class
    entryPoint("java", "-cp", classpathString, mainclass)
  }
}
Which I'm calling as a function in one of my modules like below:
// Define sub-modules and their settings
lazy val cleanse = (project in file(MODULE_NAME_CLEANSE)).dependsOn(core)
  .settings(
    commonSettings,
    enablingCoverageSettings,
    name := MODULE_NAME_CLEANSE,
    description := "Clean the incoming data for training purposes",
    docker / dockerfile := dockerConfForModule(MODULE_NAME_CLEANSE)
  )
  .enablePlugins(DockerPlugin)
But this runs into an error:
/home/joesan/Projects/Private/ml-projects/housing-price-prediction-data-preparation/build.sbt:22: error: `value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.
val jarFile: File = (Compile / packageBin / sbt.Keys.`package`).value
^
/home/joesan/Projects/Private/ml-projects/housing-price-prediction-data-preparation/build.sbt:23: error: `value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.
val classpath = (Compile / managedClasspath).value
^
/home/joesan/Projects/Private/ml-projects/housing-price-prediction-data-preparation/build.sbt:24: error: `value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.
val mainclass = (Compile / packageBin / mainClass).value.getOrElse(sys.error("Expected exactly one main class"))
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
[warn] Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? (default: r)
What is wrong in what I'm trying?
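As the error message says, .value is only legal inside a task-definition macro. One way to restructure (a sketch, assuming the sbt-docker Dockerfile type and docker / dockerfile key from the question) is to return a task initialization and wrap the body in Def.task, so every .value call happens inside the macro:

```scala
// Sketch: return Def.Initialize[Task[Dockerfile]] instead of a plain Dockerfile,
// so that .value is evaluated inside a Def.task block.
def dockerConfForModule(moduleName: String): Def.Initialize[Task[Dockerfile]] = Def.task {
  val jarFile: File = (Compile / packageBin / sbt.Keys.`package`).value
  val classpath = (Compile / managedClasspath).value
  val mainclass = (Compile / packageBin / mainClass).value.getOrElse(sys.error("Expected exactly one main class"))
  val jarTarget = s"/app/${jarFile.getName}"
  val classpathString = classpath.files.map("/app/" + _.getName).mkString(":") + ":" + jarTarget
  new Dockerfile {
    from("openjdk:8-jre")
    add(classpath.files, "/app/")
    add(jarFile, jarTarget)
    entryPoint("java", "-cp", classpathString, mainclass)
  }
}

// and in the module definition:
// docker / dockerfile := dockerConfForModule(MODULE_NAME_CLEANSE).value
```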
I upgraded to sbt 1.3.0 and the related plugins.sbt. When I try to start sbt for my project, it fails to initialize with the error:
java.lang.IllegalArgumentException: Could not find proxy for val base: sbt.SettingKey in List(value base, method sbtdef$1, method $sbtdef, object $bd1712fb73ddc970045f, package <empty>, package <root>) (currentOwner= method $sbtdef )
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:316)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.proxy(LambdaLift.scala:330)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.proxyRef(LambdaLift.scala:370)
I did find this Stack Overflow question, Could not find proxy for ... in Macro, but I don't think it helps with my error.
I think the offending code is:
// Ensure that version.sbt is included with each package.
mappings in Universal ++= {
  val h = (packageBin in Compile, baseDirectory)
  val base = h._2
  val versionFile = base.value / "version.sbt"
  versionFile.get.map(file => file -> file.name)
}
and for some reason base is not storing (packageBin in Compile, baseDirectory) properly?
Edit:
I'm not 100% sure, but I think I fixed it by removing the intermediate variables and one-lining it, so something like this:
mappings in Universal ++= {
  ((packageBin in Compile, baseDirectory)._2.value / "version.sbt").get.map(file => file -> file.name)
}
I don't know why it fixed it though...
I think the OP has confused the example with the ineffectual tuple use; perhaps there is some misunderstanding of the sbt API/DSL, that is, packageBin in Compile is never used or resolved (for its side effect).
I believe the error, however, has more to do with expressing the mappings in Universal task value in a way the macro cannot process; it gets confused, for instance, when expected to find the task key in a variable base, which is the _2 of a Tuple2.
The example could be rewritten as
mappings in Universal ++= {
  (baseDirectory.value / "version.sbt").get.map(file => file -> file.name)
}
or
mappings in Universal ++= {
  ((baseDirectory in (Compile, packageBin)).value / "version.sbt").get.map(file => file -> file.name)
}
Depending on what the intention was (probably the latter).
Of course the new syntax would be
mappings in Universal ++= {
  ((Compile / packageBin / baseDirectory).value / "version.sbt").get.map(file => file -> file.name)
}
I'm getting the following set of errors, which I believe is caused by the sbt-assembly plugin that is used.
In fact, the object declaration object Build extends Build { fails to compile (here Build is unresolved).
The error is as follows,
Error: Error while importing SBT project:
[info] Loading settings from assembly.sbt,plugins.sbt ...
[info] Loading project definition from C:\Users\ShakthiW\IdeaProjects\TestProject\project
[error] <path>\Build.scala:4:22: not found: type Build
[error] object Build extends Build{
[error] ^
[error] <path>\Build.scala:8:80: value map is not a member of (sbt.TaskKey[sbt.librarymanagement.UpdateReport], sbt.SettingKey[java.io.File], sbt.SettingKey[String])
[error] def copyDepTask = copyDependencies <<= (update, crossTarget, scalaVersion) map {
[error] ^
[error] <path>\Build.scala:19:16: too many arguments (3) for method apply: (id: String, base: java.io.File)sbt.Project in object Project
[error] Note that 'settings' is not a parameter name of the invoked method.
[error] settings = Defaults.defaultSettings ++ Seq(
[error] ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
A quick resolution is highly appreciated.
My Build.scala looks like this.
import sbt.Keys._
import sbt._

object MyBuild extends Build {

  lazy val copyDependencies = TaskKey[Unit]("copy-dependencies")

  def copyDepTask = copyDependencies <<= (update, crossTarget, scalaVersion) map {
    (updateReport, out, scalaVer) =>
      updateReport.allFiles foreach { srcPath =>
        val destPath = out / "lib" / srcPath.getName
        IO.copyFile(srcPath, destPath, preserveLastModified = true)
      }
  }

  lazy val root = Project(
    "root",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(
      copyDepTask
    )
  )
}
Also, I do reckon there is an issue with sbt-assembly upgrades as well, which I'm not entirely aware of.
In sbt 1.0.x, some key dependency operators (such as <<=) were removed. See the migration docs: https://www.scala-sbt.org/0.13/docs/Migrating-from-sbt-012x.html
Here is a short tutorial on writing Build.scala for sbt 1.0.x: https://alvinalexander.com/scala/sbt-how-to-use-build.scala-instead-of-build.sbt.
You can also refer to the build.scala of an existing project for more reference, e.g. scalaz.
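As a sketch of the migration (assuming the same keys as in the Build.scala above; the Build trait itself was removed in sbt 1.x, so the definitions move to build.sbt), the <<= / map style becomes := with .value:

```scala
// build.sbt-style sketch of the old copyDependencies task for sbt 1.x.
lazy val copyDependencies = taskKey[Unit]("copy-dependencies")

lazy val root = (project in file("."))
  .settings(
    copyDependencies := {
      val updateReport = update.value
      val out = crossTarget.value
      // Copy every resolved dependency JAR into target/.../lib
      updateReport.allFiles.foreach { srcPath =>
        IO.copyFile(srcPath, out / "lib" / srcPath.getName, preserveLastModified = true)
      }
    }
  )
```

Note that Defaults.defaultSettings is also gone in sbt 1.x; the defaults are applied automatically.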
I hit a MissingRequirementError when I try to invoke scaladoc from within an sbt task.
Using any version of sbt 0.13.x, start with this build.sbt:
val scaladoc = taskKey[Unit]("run scaladoc")

scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  settings.usejavacp.value = true
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}
Then run sbt scaladoc, and behold (during makeUniverse):
[info] Set current project to test (in build file:...)
scala.reflect.internal.MissingRequirementError: object scala.annotation.Annotation in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
What is wrong here? I've already tried fork := true and different combinations of sbt/scala versions to no avail.
It seems you need to provide scala-library (and indeed, any other dependencies) directly to the DocFactory.
scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  val dependencyPaths = (update in Compile).value
    .select().map(_.absolutePath).mkString(java.io.File.pathSeparator)
  settings.classpath.append(dependencyPaths)
  settings.bootclasspath.append(dependencyPaths)
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}
Play doesn't convert my Java form object to the Scala world.
[error] /home/myproject/split/frontend/app/controllers/frontend/Configuration.java:46: error: method render in class settings cannot be applied to given types;
[error] return ok(settings.render(settingsForm.fill(userSettings)));
[error] ^
[error] required: play.api.data.Form<Settings>
[error] found: play.data.Form<Settings>
[error] reason: actual argument play.data.Form<Settings> cannot be converted to play.api.data.Form<Settings> by method invocation conversion
the view-template looks like this:
@(settingsForm: Form[Settings])

@import play.i18n._
@import helper._
@import helper.twitterBootstrap._

@main {
  @helper.form(action = controllers.frontend.routes.Configuration.setSettings) {
Any idea?
I should also mention that we use a project split: main->frontend->common and main->backend->common. We moved this page (view and controller) from common to frontend. It worked fine in common; now in frontend I get this error.
I actually had a similar problem with a java.util.List and I had to add templatesImport ++= Seq("java.util._", ... to the settings:
val frontend = play.Project(
  appName + "-frontend", appVersion, path = file("main/frontend")
).settings(
  templatesImport ++= Seq("java.util._", "models.frontend._")
).dependsOn(common).aggregate(common)
I tried with play.data._ already, didn't help.
Your frontend project is a Scala project, not a Java project. Add a dependency on javaCore to it, and it will be a Java project. Then do a play clean compile, and everything should work. E.g.:
val frontend = play.Project(
  appName + "-frontend", appVersion, Seq(javaCore), path = file("main/frontend")
).settings(
  templatesImport ++= Seq("java.util._", "models.frontend._")
).dependsOn(common).aggregate(common)