Scala integration test not picking up config properties

I have a Scala project with integration tests and the following folder structure:
My Project
- app
- it
  - com.anjib.my.pkg
  - resources
    - application.regression.devl.conf
I want to override one of the properties by putting it in the application.regression.devl.conf file, but when running the integration tests it still picks up the top-level property.
For example, somewhere in the project there is
someKey=someValue
and I put
someKey=otherValue
in application.regression.devl.conf, but the integration tests still pick up someValue.
My build.sbt looks like
lazy val `my-project` = (project in file("."))
  // ....
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  // ....
The config is loaded as follows:
def apply(): Config = {
  val environment = determineEnvironment
  val defaultConfig = ConfigFactory.load()
  val envConfig: Config = ConfigFactory.load(s"application.$environment.conf")
  val regressionSuiteConfig = ConfigFactory.load(s"application.regression.$environment.conf")
  regressionSuiteConfig.withFallback(envConfig).withFallback(defaultConfig)
}
Update: If I do Ctrl + Shift + F10 in IntelliJ, it does pick up otherValue, so the problem only occurs with sbt it:test.

Try setting javaOptions in sbt like this:
javaOptions in IntegrationTest += "-Dconfig.resource=" + System.getProperty("config.resource", "application.regression.devl.conf")
With this, sbt will set the config.resource parameter to "application.regression.devl.conf" if nothing is passed explicitly.
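Note that javaOptions only take effect when the tests run in a forked JVM, so forking may also need to be enabled for the IntegrationTest configuration. A minimal sketch in the same sbt 0.13-style syntax as above (the project name is taken from the question; treat the rest as illustrative):
lazy val `my-project` = (project in file("."))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .settings(
    fork in IntegrationTest := true, // javaOptions are only passed to forked JVMs
    javaOptions in IntegrationTest += "-Dconfig.resource=" + System.getProperty("config.resource", "application.regression.devl.conf")
  )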

Related

Different project on the same base-directory (or share a single file between projects)

My initial setup had two separate projects (sbt 1.2.6):
a web app (huge codebase, lots of dependencies, slow compile)
a command-line app (basically one file with 3-4 separate dependencies)
A feature request came in: we should show the "valid" values in the command-line app. The valid values live in an enum in the web app. So I fired up the sbt documentation and came up with an idea that looked like this:
// main webapp
lazy val core = project
  .in(file("."))
  .withId("core") // I tried this just in case; it didn't help...
  // ... here come all the plugins and deps

// my hack to get a single-file compile
lazy val `feature-signer-helper` = project
  .in(file("."))
  .withId("feature-signer-helper")
  .settings(
    target := { baseDirectory.value / "target" / "features" },
    sources in Compile := {
      ((scalaSource in Compile).value ** "Features.scala").get
    }
  )

// the command line app
lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .dependsOn(`feature-signer-helper`)
  .settings(
    libraryDependencies ++= signerDeps
  )
The problem is that, apparently, whichever lazy val xxx = project.in(file(y)) is defined last becomes the only project for the directory y.
Also, I don't want to move that one file to a separate directory structure... And logically the command-line app and the web app don't depend on each other; they have different dependencies (and really different build times).
My questions are:
Is there any quick win in this situation? (Worst case, I will copy the file...)
Why do we have these rich project and source settings if I can't bind them to the same directory?
EDIT:
The code below copies the needed file (if you have the same directory structure). I'm not super happy with it, but it works. I'm still interested in other approaches.
import sbt._
import Keys._

object FeaturesCopyTask {
  val featuresCopyTask = {
    sourceGenerators in Compile += Def.task {
      val outFile = (sourceManaged in Compile).value / "Features.scala"
      val rootDirSrc = (Compile / baseDirectory).value / ".." / "src"
      val inFile: File = (rootDirSrc ** "Features.scala").get().head
      IO.copyFile(inFile, outFile, preserveLastModified = true)
      Seq(outFile)
    }.taskValue
  }
}

lazy val `feature-signer` = project
  .in(file("feature-signer"))
  .settings(
    libraryDependencies ++= signerDeps,
    FeaturesCopyTask.featuresCopyTask
  )
I would have the tree be something more like
+- core/
+- webapp/
+- cli/
core is the small amount of code (mostly model-type things) that webapp and cli have in common
webapp depends on core (among many other things)
cli depends on core (and not much else)
So the build.sbt would be something like
lazy val core = (project in file("core"))
  // yadda yadda yadda

lazy val webapp = (project in file("webapp"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val cli = (project in file("cli"))
  .dependsOn(core)
  // yadda yadda yadda

lazy val root = (project in file("."))
  .aggregate(
    core,
    webapp,
    cli
  )

sbt: generating shared sources in cross-platform project

Building my project in Scala with sbt, I want to have a task that will run prior to the actual Scala compilation and generate a Version.scala file with project version information. Here's the task I've come up with:
lazy val generateVersionTask = Def.task {
  // Generate contents of Version.scala
  val contents = s"""package io.kaitai.struct
                    |
                    |object Version {
                    | val name = "${name.value}"
                    | val version = "${version.value}"
                    |}
                    |""".stripMargin

  // Update Version.scala file, if needed
  val file = (sourceManaged in Compile).value / "version" / "Version.scala"
  println(s"Version file generated: $file")
  IO.write(file, contents)
  Seq(file)
}
This task seems to work, but the problem is how to plug it in, given that it's a cross project, targeting Scala/JVM, Scala/JS, etc.
This is how build.sbt looked before I started touching it:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
and, on the filesystem, I have:
shared/ — cross-platform code shared between JS/JVM builds
jvm/ — JVM-specific code
js/ — JS-specific code
The best I've come up with so far is adding this task to the foo crossProject:
lazy val foo = crossProject.in(file(".")).
  settings(
    name := "foo",
    version := sys.env.getOrElse("CI_VERSION", "0.1"),
    sourceGenerators in Compile += generateVersionTask.taskValue, // <== !
    // ...
  ).
  jvmSettings(/* JVM-specific settings */).
  jsSettings(/* JS-specific settings */)
This works, but in a very awkward way that isn't really compatible with the "shared" codebase: it generates two distinct Version.scala files, one for JS and one for JVM:
sbt:root> compile
Version file generated: /foo/js/target/scala-2.12/src_managed/main/version/Version.scala
Version file generated: /foo/jvm/target/scala-2.12/src_managed/main/version/Version.scala
Naturally, it's impossible to access the contents of these files from shared, which is exactly where I want to access them.
So far, I've come up with a very sloppy workaround (a rough sketch follows this list):
there is a var declared in a singleton object in shared
in both the JVM and JS main entry points, the very first thing I do is assign that variable to match the constants defined in Version.scala
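A rough illustration of that workaround, with hypothetical object and entry-point names:
// in shared: a mutable placeholder, filled in at startup by each platform
object RuntimeVersion {
  var name: String = ""
  var version: String = ""
}

// in the JVM (and, analogously, the JS) main entry point
object Main {
  def main(args: Array[String]): Unit = {
    RuntimeVersion.name = io.kaitai.struct.Version.name       // per-platform generated file
    RuntimeVersion.version = io.kaitai.struct.Version.version
    // ... rest of the program
  }
}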
Also, I've tried the same trick with the sbt-buildinfo plugin; the result is exactly the same: it generates a per-platform BuildInfo.scala, which I can't use directly from shared sources.
Are there any better solutions available?
Consider pointing sourceManaged to the shared/src/main/scala/src_managed directory and scoping generateVersionTask to the root project, like so:
val sharedSourceManaged = Def.setting(
  baseDirectory.value / "shared" / "src" / "main" / "scala" / "src_managed"
)

lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {},
    sourceManaged := sharedSourceManaged.value,
    sourceGenerators in Compile += generateVersionTask.taskValue,
    cleanFiles += sharedSourceManaged.value
  )
Now sbt compile should output something like
Version file generated: /Users/mario/IdeaProjects/scalajs-cross-compile-example/shared/src/main/scala/src_managed/version/Version.scala
...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/js/target/scala-2.12/classes ...
[info] Compiling 1 Scala source to /Users/mario/IdeaProjects/scalajs-cross-compile-example/target/scala-2.12/classes ...
[info] Compiling 3 Scala sources to /Users/mario/IdeaProjects/scalajs-cross-compile-example/jvm/target/scala-2.12/classes ...
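As a quick usage check (assuming the generated file keeps the package and object produced by generateVersionTask above), code anywhere under shared/ can then reference the generated object directly, for example:
// somewhere in shared/src/main/scala
object Banner {
  val text: String = s"${io.kaitai.struct.Version.name} ${io.kaitai.struct.Version.version}"
}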

Per-project tasks in SBT

My .sbt file looks something like this:
lazy val common = (project in file("./common"))
  .settings(
    // other settings
  )

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    // other settings
    addCommandAlias("build", ";clean;assembly;foo")
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    // other settings
    addCommandAlias("build", ";clean;compile;bar")
  )
Additionally, I have two tasks, foo and bar, which are only valid in their respective projects.
My tests show that upon calling build, no matter which project I am in, both aliases are invoked.
And for tasks, the keys can only be defined at the top level of the .sbt file (e.g. val foo = taskKey[Unit]("Does foo")).
I want to know how to correctly implement tasks and command aliases on project level.
Is that possible?
The problem you are having is with how aliases work in sbt. When an alias is defined, it is attached to the GlobalScope in the form of a command and is therefore available to all sub-projects. When you define aliases multiple times with addCommandAlias, the last execution wins, as every execution removes any previously created alias with the same name.
You can see the defined aliases by running sbt alias; it will show that there is only one build alias.
You could achieve separation of build by introducing it as a taskKey:
lazy val build = taskKey[Unit]("Builds")

lazy val root = (project in file("."))
  .aggregate(one, two) // this is needed to make the build task available in root

lazy val common = (project in file("./common"))
  .settings(
    // some keys
  )

lazy val one = (project in file("./one"))
  .dependsOn(common)
  .settings(
    build := {
      Def.sequential(clean, Compile / compile).value
    }
  )

lazy val two = (project in file("./two"))
  .dependsOn(common)
  .settings(
    build := {
      Def.sequential(clean, assembly).value
    }
  )
EDIT: Added Def.sequential as suggested by @laughedelic in the comments.

SBT `dependsOn` Per-configuration dependency

I have a Build with two projects in it.
I want to make the root project's classpath depend on the subproject, but only in a certain configuration. Simplified project config:
Subproject:
object HttpBuild {
  import Dependencies._

  lazy val http: Project = Project(
    "http",
    file("http"),
    settings =
      CommonSettings.settings ++
        Seq(
          version := "0.2-SNAPSHOT",
          crossPaths := false,
          libraryDependencies ++= akkaActor +: spray) ++
        Packaging.defaultPackageSettings
  )
}
Root:
object RootBuild extends Build {
  import HttpBuild._
  lazy val http = HttpBuild.http

  lazy val MyConfig = config("myconfig") extend Compile

  private val defaultSettings = Defaults.coreDefaultSettings

  lazy val api = Project("root", file("."))
    .configs(MyConfig)
    .settings(defaultSettings: _*)
    .dependsOn(HttpBuild.http % MyConfig)
}
Now, if I type myconfig:compile, I want my root project to be compiled together with the subproject, but that doesn't seem to happen.
If I leave the dependency as dependsOn(HttpBuild.http), it compiles, but it happens every time, no matter which configuration I use.
Have you looked at this example? I'm not an expert here, but comparing it with your code above, the differences seem to be:
that a CustomCompile configuration is defined and used as the classpath configuration via classpathConfiguration in Common := CustomCompile
that the dependency is indirect: http % "compile->myconfig"
Perhaps try to get closer to that example.
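For reference, sbt's general per-configuration dependency syntax looks like the sketch below, reusing the names from the question; a mapping of the form a % "x->y" puts a's y configuration on the depending project's x classpath (whether this alone solves the original problem I can't confirm):
lazy val api = Project("root", file("."))
  .configs(MyConfig)
  .settings(defaultSettings: _*)
  // root's myconfig configuration depends on http's compile configuration
  .dependsOn(HttpBuild.http % "myconfig->compile")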

SBT 0.13 Build.scala References to undefined settings

I am new to SBT and I have been trying to build a custom task for this build.
I have a simple build project:
import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val barTask = taskKey[Unit]("some simple task")

  val afterTestTask1 = barTask := { println("tests ran!") }
  val afterTestTask2 = barTask <<= barTask.dependsOn(test in Test)

  lazy val myBarTask = taskKey[Unit]("some simple task")

  //val afterMyBarTask1 = myBarTask := { println("tests ran!") }
  lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }

  //settings ++ Seq(afterMyBarTask2)
  override lazy val settings = super.settings ++ Seq(afterMyBarTask2)
}
I keep getting the error:
References to undefined settings:
{.}/*:myBarTask from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
Did you mean test:test ?
I have googled around and I cannot find a solution.
Can you explain why it is not working?
lazy val myBarTask = taskKey[Unit]("some simple task")
override lazy val settings = super.settings ++ Seq(myBarTask := { (test in Test).value; println("tests ran!") } )
myBarTask is undefined when you call dependsOn; you should define it before using dependsOn. Also, calling .value on a key (task/setting) is now the preferred way to depend on other keys. You can still use your version, but define myBarTask first.
This has been bothering me.
I did a bit more reading.
I think I know why the above code does not work.
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
When I write (myBarTask).dependsOn(test in Test), SBT chooses ThisBuild as the project scope for test.
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
The ThisBuild project scope does not have the test setting in the Test configuration.
Only projects have the test setting.
I think that setting is added to each project's settings by some default SBT plugin.
You can check in which scopes a setting exists by using the inspect command.
If you type the following in the SBT shell:
inspect {.}/test:test
the output is:
[info] No entry for key.
SBT correctly suggests test:test, which is:
{file:/C:/Users/haques/Documents/workspace/SBT/jsonParser/}jsonparser/test:test
If the project is not specified in the project scope axis, SBT chooses the current project by default.
Every SBT project, unless specified otherwise, has its own project settings.
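Put differently, defining the task inside a concrete project's settings (rather than in the build-level settings) lets test in Test resolve against that project's scope. A rough sketch in the same SBT 0.13 style, with an illustrative project name:
lazy val jsonParser = Project("jsonParser", file("."))
  .settings(
    myBarTask := {
      (test in Test).value // now resolves to this project's test:test
      println("tests ran!")
    }
  )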