I have a build.sbt file with the following snippet:
scalastyleConfig in Compile := baseDirectory.value / "project" / "scalastyle_config.xml"
scalastyleConfig in Test := baseDirectory.value / "project" / "scalastyle_config.xml"
When I use IntelliJ to extract variable, I get:
val scalaStyleConfig: File = baseDirectory.value / "project" / "scalastyle_config.xml"
scalastyleConfig in Compile := scalaStyleConfig
scalastyleConfig in Test := scalaStyleConfig
which does not evaluate.
I tweaked things to get 2 possible alternatives that evaluate:
val scalastyleConfigFile = SettingKey[File]("scalaStyleConfig")
scalastyleConfigFile := baseDirectory.value / "project" / "scalastyle_config.xml"
scalastyleConfig in Compile := scalastyleConfigFile.value
scalastyleConfig in Test := scalastyleConfigFile.value
or:
def scalastyleConfigFile(baseDir: File) = baseDir / "project" / "scalastyle_config.xml"
scalastyleConfig in Compile := scalastyleConfigFile(baseDirectory.value)
scalastyleConfig in Test := scalastyleConfigFile(baseDirectory.value)
I'm not happy with either of my alternatives. I'm using the second alternative at the moment because it's shorter. It's annoying to have to pass the baseDirectory.value as a parameter to the function.
I tried various versions using lazy val—none of which worked :(. There must be a better way to abstract with SBT!
Can you help?
Use Def.setting { } around your original example (note the type must be Def.Initialize[File], not File):
val scalaStyleConfig: Def.Initialize[File] = Def.setting { baseDirectory.value / "project" / "scalastyle_config.xml" }
scalastyleConfig in Compile := scalaStyleConfig.value
scalastyleConfig in Test := scalaStyleConfig.value
The reason is that := and .value are compile-time macros that only work in the right context, such as inside the body of := or Def.setting { }.
See http://www.scala-sbt.org/0.13/docs/ChangeSummary_0.13.0.html#New+task%2Fsetting+syntax for more explanations.
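For completeness, the same pattern works for tasks: factor the shared computation into a Def.Initialize[Task[...]] with Def.task and reference it with .value. A minimal sketch, with hypothetical key names (reportA, reportB) chosen for illustration:

```scala
// A reusable task fragment; Def.task returns Def.Initialize[Task[File]].
val sharedReport: Def.Initialize[Task[File]] = Def.task {
  val out = target.value / "report.txt"
  IO.write(out, s"built from ${baseDirectory.value}")
  out
}

// Hypothetical task keys reusing the same fragment in two places.
val reportA = taskKey[File]("report for config A")
val reportB = taskKey[File]("report for config B")

reportA := sharedReport.value
reportB := sharedReport.value
```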
Related
I'm wondering how to change the following, which gives me a deprecation warning for the in syntax:
lazy val enablingCoverageSettings = Seq(
  coverageEnabled in (Test, compile) := true,
  coverageEnabled in (Compile, compile) := false
)
I guess I have to use the slash syntax instead, but how do I change it in my case here?
You should be able to transform it to:
lazy val enablingCoverageSettings = Seq(
  Test / compile / coverageEnabled := true,
  Compile / compile / coverageEnabled := false
)
The idea is to replace x in (y, z) with y / z / x.
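The same rule covers other keys; here are a few more translations for illustration (the keys below are just examples, not from the question):

```scala
// x in y        becomes  y / x
// scalacOptions in Compile           ->  Compile / scalacOptions
// testOptions in Test                ->  Test / testOptions

// x in (y, z)   becomes  y / z / x
// coverageEnabled in (Test, compile) ->  Test / compile / coverageEnabled

// The new forms are usable directly in build.sbt, e.g.:
Compile / scalacOptions += "-deprecation"
Test / testOptions += Tests.Argument("-oD")
```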
I want multiple sbt projects with the exact same root so I can build the same code with different settings. I've tried something similar to what I have below, but sbt only recognizes the first project (root).
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
lazy val root = (project in file("."))
  .settings(
    name := "Scala Seed Project",
    scalaVersion := "2.13.6"
  )

lazy val root2 = (project in file("."))
  .settings(
    name := "Scala Seed Project",
    scalaVersion := "2.12.12"
  )
This isn't a perfect answer, but I found the suggestion below on this page which does seem to work for this simple example.
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
lazy val root = (project in file("target/root"))
  .settings(
    name := "Scala Seed Project",
    scalaVersion := "2.13.6",
    Compile / scalaSource := baseDirectory.value / ".." / ".." / "src" / "main" / "scala",
  )

lazy val root2 = (project in file("target/root2"))
  .settings(
    name := "Scala Seed Project",
    scalaVersion := "2.12.12",
    Compile / scalaSource := baseDirectory.value / ".." / ".." / "src" / "main" / "scala",
  )
I don't love this solution because it requires dummy directories and an otherwise unnecessary redefinition of scalaSource for multiple tasks (although I've only included compilation in the example above).
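One way to reduce that repetition, sketched under the same assumptions as the answer above, is to factor the shared source redirection into a Seq of settings and reuse it in both projects:

```scala
// Shared settings pointing both projects at the same sources (a sketch;
// add further keys like Test / scalaSource here as needed).
lazy val sharedSources = Seq(
  Compile / scalaSource := baseDirectory.value / ".." / ".." / "src" / "main" / "scala",
  Test / scalaSource := baseDirectory.value / ".." / ".." / "src" / "test" / "scala"
)

lazy val root = (project in file("target/root"))
  .settings(
    name := "Scala Seed Project",
    scalaVersion := "2.13.6",
    sharedSources
  )

lazy val root2 = (project in file("target/root2"))
  .settings(
    name := "Scala Seed Project",
    scalaVersion := "2.12.12",
    sharedSources
  )
```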
My sbt file has an added custom configuration called "dev". I want the sources from the dev configuration to end up the same way as sources from the Compile configuration, i.e., in the srcs folder of the local Ivy cache when using sbt publishLocal. Is there something wrong with the authoring of the sbt file below?
lazy val Dev = config("dev") extend(Compile) describedAs("Dependencies required for development environments")
lazy val dpframework = project
  .in(file("datapipeline-framework"))
  .configs(Dev, Compile, Test)
  .settings(
    name := "datapipeline-framework",
    settings,
    inConfig(Dev)(Defaults.compileSettings),
    addArtifact(artifact in (Dev, packageBin), packageBin in Dev),
    addArtifact(artifact in (Dev, packageDoc), packageDoc in Dev),
    addArtifact(artifact in (Dev, packageSrc), packageSrc in Dev),
    ivyConfigurations := overrideConfigs(Dev, Test, Compile)(ivyConfigurations.value),
    defaultConfiguration := Some(Compile),
    libraryDependencies ++= commonDependencies,
    dependencyOverrides ++= commonDependencyOverrides,
    publishArtifact in Dev := true,
    (dependencyClasspath in Test) := (dependencyClasspath in Test).value ++ Seq(Attributed.blank((classDirectory in Dev).value))
  )
The problem seems to be that Dev / packageSrc / artifact has the wrong Artifact type, which you can fix as follows:
Dev / packageSrc / artifact ~= { _.withType("src") },
Here's the full example:
ThisBuild / scalaVersion := "2.12.10"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
lazy val Dev = config("dev").extend(Compile)
  .describedAs("Dependencies required for development environments")

lazy val root = (project in file("."))
  .configs(Dev)
  .settings(
    name := "datapipeline-framework",
    inConfig(Dev)(Defaults.compileSettings),
    addArtifact(Dev / packageBin / artifact, Dev / packageBin),
    addArtifact(Dev / packageDoc / artifact, Dev / packageDoc),
    Dev / packageDoc / artifact ~= { _.withType("doc") },
    addArtifact(Dev / packageSrc / artifact, Dev / packageSrc),
    Dev / packageSrc / artifact ~= { _.withType("src") },
    Dev / publishArtifact := true,
    Test / dependencyClasspath ++= Seq(Attributed.blank((Test / classDirectory).value))
  )
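For intuition, ~= applies a function to the setting's previous value; here it only rewrites the type field of the Artifact. A rough sketch of that transformation using sbt's Artifact API (the artifact name is just an example):

```scala
// Inside build.sbt, sbt._ is already in scope.
val a = Artifact("datapipeline-framework")  // type defaults to "jar"
val asSrc = a.withType("src")               // same artifact, type now "src"

// which is what this setting does to the previously defined artifact:
Dev / packageSrc / artifact ~= { _.withType("src") }
```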
If I define an sbt scala project that depends on two external source folders the analysis does not work correctly. So say I define the following build.sbt:
lazy val root = project.in(file("."))
  .settings(
    name := "repro",
    version := "1.0",
    scalaVersion := "2.11.8",
    unmanagedSourceDirectories in Compile +=
      baseDirectory.value / ".." / "ext1" / "src" / "main" / "scala",
    unmanagedSourceDirectories in Compile +=
      baseDirectory.value / ".." / "ext2" / "src" / "main" / "scala"
  )
such that sources in ext2 depend on sources in ext1. In this example I've defined a trait T1 in ext1 and a trait T2 in ext2 that depends on T1, and I have a class in my project that depends on T2. This all compiles fine in sbt. When I import the sbt project into IntelliJ it also compiles, but when I open my trait T2 in the editor, the reference to T1 is flagged with the error "Cannot resolve ext1". Why am I getting this error?
The reproduction of this issue can be found on github with the following links:
https://github.com/hughgearse/repro
https://github.com/hughgearse/ext1
https://github.com/hughgearse/ext2
Create build definition in ext1/build.sbt
lazy val root = project.in(file("."))
  .settings(
    name := "ext1",
    version := "1.0",
    scalaVersion := "2.11.8"
  )
and then reference ext1 as an external build in ext2/build.sbt via RootProject
val ext1 = RootProject(file("../ext1"))

lazy val root = project.in(file(".")).dependsOn(ext1)
  .settings(
    name := "ext2",
    version := "1.0",
    scalaVersion := "2.11.8"
  )
and then similarly reference both as external builds in repro/build.sbt
val ext1 = RootProject(file("../ext1"))
val ext2 = RootProject(file("../ext2"))

lazy val root = project.in(file(".")).dependsOn(ext1, ext2)
  .settings(
    name := "repro",
    version := "1.0",
    scalaVersion := "2.11.8",
    unmanagedSourceDirectories in Compile +=
      baseDirectory.value / ".." / "ext1" / "src" / "main" / "scala",
    unmanagedSourceDirectories in Compile +=
      baseDirectory.value / ".." / "ext2" / "src" / "main" / "scala"
  )
Re-import repro project and IntelliJ should be able to analyse all the sources.
Using Scala.js' sbt task fastOptJS, I would simply like to redirect myproject/target/scala-2.11/web-fastopt.js to myproject/js. Is that possible?
Same for web-jsdeps.js: I'd like to redirect it to /myproject/libs.
I've read Scala.js compilation destination, but that seems too complicated. I have only one project, not two or three, and there is no Play framework involved; I want just a plain file-to-folder copy.
UPDATE:
My settings, project/BuildProject.scala:
lazy val chromePluginProject = Project(id = "chromePlugin", base = file("."))
  .enablePlugins(ScalaJSPlugin)
  .settings(
    version := "0.1",
    scalaVersion := Versions.scala,
    artifactPath in (Compile, fastOptJS) := baseDirectory.value / "plugin" / "src" / "content" / "fastOpt.js",
    ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }, // TODO:
    //mainClass := Some("branch.ScalaJsSample"),
    libraryDependencies ++= scalaJsDependencies,
    libraryDependencies += "be.doeraene" %%% "scalajs-jquery" % "0.9.0",
    libraryDependencies += "com.lihaoyi" %%% "upickle" % Versions.upickle,
    libraryDependencies += "com.lihaoyi" %%% "scalatags" % Versions.scalaTags,
    // we will not use the DOM directly, so this could be commented out
    libraryDependencies += "org.scala-js" %%% "scalajs-dom" % Versions.dom,
    jsDependencies += "org.webjars" % "jquery" % Versions.jquery / "jquery.js",
    jsDependencies += "org.webjars.bower" % "webcomponents.js" % Versions.webcomponents / "webcomponents-lite.js",
    // After reloading and rerunning fastOptJS, this will create scala-js-jsdeps.js
    skip in packageJSDependencies := false,
    // makes the DOM available from a console run (avoids "ReferenceError: "window" is not defined.")
    jsDependencies += RuntimeDOM, // it will use PhantomJS, basically
    scalaJSUseRhino in Global := false // will use Node.js to run the thing
  )
My file structure is:
<root>/plugin/src/content is where I want to copy the fastOpt.js.
As I said, it creates *-site-jsdeps.js in /target/scala-2.11/.
Yes, you can do it like this:
artifactPath in (Compile, packageScalaJSLauncher) := baseDirectory.value / ".." / "jvm" / "webapp" / "js" / "launcher.js",
artifactPath in (Compile, fastOptJS) := baseDirectory.value / ".." / "jvm" / "webapp" / "js" / "fastOpt.js",
artifactPath in (Compile, fullOptJS) := baseDirectory.value / ".." / "jvm" / "webapp" / "js" / "fullOpt.js",
artifactPath in (Compile, packageJSDependencies) := baseDirectory.value / ".." / "jvm" / "webapp" / "js" / "dependency.js"
For more, you can refer to https://github.com/yuanqingfei/gdbscan-akka-d3js/blob/master/build.sbt
Simply with this sbt setting:
crossTarget in fastOptJS := baseDirectory.value / "js"
In your BuildProject.scala (inside object BuildProject extends Build) or build.sbt, add the following:
lazy val copyJsTask = TaskKey[Unit]("copyJsTask", "Copy javascript files to target directory")

lazy val myPluginProject = Project(id = "my-plugin", base = file("."))
  .settings(
    copyJsTask := {
      val outDir = baseDirectory.value / "plugin/src/content"
      val inDir = baseDirectory.value / "target/scala-2.11"
      val files = Seq("my-plugin-fastopt.js", "my-plugin-fastopt.js.map") map { p => (inDir / p, outDir / p) }
      IO.copy(files, overwrite = true)
    }, ..
add new file in the very root of your project
.sbtrc
with the content:
alias jsCompile=;fastOptJS;copyJsTask
That's what satisfies me, "complexity"-wise: it works much like npm/grunt or a Linux batch script.
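If you prefer not to rely on an alias, a variant (assuming the same key and file names as above) is to make copyJsTask depend on fastOptJS directly, so a single command compiles and then copies:

```scala
lazy val copyJsTask = TaskKey[Unit]("copyJsTask", "Copy javascript files to target directory")

copyJsTask := {
  // Referencing fastOptJS here makes sbt run it before the copy happens.
  val _ = (fastOptJS in Compile).value
  val outDir = baseDirectory.value / "plugin" / "src" / "content"
  val inDir = baseDirectory.value / "target" / "scala-2.11"
  val files = Seq("my-plugin-fastopt.js", "my-plugin-fastopt.js.map") map { p => (inDir / p, outDir / p) }
  IO.copy(files, overwrite = true)
}
```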
With sbt v1.6.1 you can add these settings to change the output directory of Scala.js.
settings(
  Compile / fastOptJS / artifactPath := baseDirectory.value / "../where-you-like/main.js",
  Compile / fullOptJS / artifactPath := baseDirectory.value / "../where-you-like/main.js",
)