I have a very simple project.
build.sbt:

scalaVersion := "2.13.5"

lazy val testSettings = Seq(
  Test / javaOptions += "-Dconfig.resource=/test.conf"
)

libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.2.3" % Test,
  "com.typesafe" % "config" % "1.4.1"
)
Two configuration files under the resources folder:
application.conf
some-value = "value from application.conf file"
test.conf
some-value = "value from test.conf file"
And a single spec test class, SomeTestSpec:

import com.typesafe.config.{Config, ConfigFactory}
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class SomeTestSpec extends AnyFlatSpec with Matchers {
  val config: Config = ConfigFactory.load()

  "some test" should "read and print from test.conf" in {
    val value = config.getString("some-value")
    println(s"value of some-value = $value")
    value shouldBe "value from test.conf file"
  }
}
When I run the test, it fails:
"value from [application].conf file" was not equal to "value from [test].conf file"
ScalaTestFailureLocation: SomeTestSpec at (SomeTestSpec.scala:12)
Expected :"value from [test].conf file"
Actual :"value from [application].conf file"
Why is the spec reading application.conf instead of test.conf? Is something wrong in the build.sbt?

lazy val testSettings = Seq(
  Test / javaOptions += "-Dconfig.resource=/test.conf"
)
The best practice is to place an application.conf into src/main/resources/ for regular use and another one into src/test/resources/ for testing. You'll then have two conf files with different values. If you don't need to change the config for tests, you can simply keep the one conf file in main.
You don't have to override the file explicitly with -Dconfig.resource or -Dconfig.file for your tests, because loading resources from the classpath works as you expect out of the box, and your test config will be used. The -D option is mostly used at runtime to provide external configuration, or in some complex build scenarios.
If you do use -Dconfig.resource or -Dconfig.file, pay attention to the path: config.resource is just a file name, e.g. application.conf, which will be loaded as a classpath resource, while config.file takes an absolute or relative file path.
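For example (the file names and paths here are hypothetical, just to illustrate the difference):

# config.resource: a resource name, looked up on the classpath
sbt -Dconfig.resource=test.conf run

# config.file: an absolute or relative path on the filesystem
sbt -Dconfig.file=/etc/myapp/test.conf run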
If you are making a library, you can get fancy with a reference.conf file holding default values and application.conf overriding them. This could also work in your scenario, but it would be more confusing, because that's not the purpose of a reference file.
For the purposes of testing, just use two application.conf files.
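For illustration, the layout suggested above would look like this (using the values from the question):

src/
  main/
    resources/
      application.conf   # some-value = "value from application.conf file"
  test/
    resources/
      application.conf   # some-value = "value from test.conf file"

With this layout, ConfigFactory.load() in a test picks up the test copy, because test resources precede main resources on the test classpath.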
Additionally, here are some options to override config at runtime:
Overriding multiple config values in Typesafe config when using an uberjar to deploy.
Overriding configuration with environment variables in typesafe config
Scala environment ${USER} in application.conf
More info here: https://github.com/lightbend/config#standard-behavior
I have been trying to set an environment variable directly from the build.sbt file, as I need to use an assembly jar name which is defined in that file. I've been trying to output the result of the defined variable using echo, and from the Scala application code via sys.props.get("SPARK_APP_JAR_PATH"), but the result is blank or None, respectively.
What can be wrong with the environment variable configuration?
This is how I defined the variable in build.sbt:
assemblyJarName in assembly := "spark_app_example_test.jar"
mainClass in assembly := Some("example.SparkAppExample")
test in assembly := {}
fork := true
val envVars = sys.props("SPARK_APP_JAR_PATH") = "/path/to/jar/file"
That's how it works for me:
lazy val dockerRepo: String = sys.props.getOrElse("DOCKER_REPO", "bpf.docker.repo")
Or with your example:
lazy val envVars = sys.props.getOrElse("SPARK_APP_JAR_PATH", "/path/to/jar/file")
According to the documentation:
System properties can be provided either as JVM options, or as SBT arguments, in both cases as -Dprop=value. The following properties influence SBT execution.
e.g. sbt -DSPARK_APP_JAR_PATH=/path/to/jar/file
See https://www.scala-sbt.org/release/docs/Command-Line-Reference.html#Command+Line+Options
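Putting both together, a minimal sketch for build.sbt (SPARK_APP_JAR_PATH and the default path come from the question; the fork and javaOptions lines are an assumption about what you may also need, since a property read in build.sbt lives in the sbt JVM, not in the application's JVM):

// read the property passed as sbt -DSPARK_APP_JAR_PATH=... (with a default)
lazy val envVars = sys.props.getOrElse("SPARK_APP_JAR_PATH", "/path/to/jar/file")

// forward it to the forked application JVM so that
// sys.props.get("SPARK_APP_JAR_PATH") is defined there as well
fork := true
javaOptions += s"-DSPARK_APP_JAR_PATH=$envVars"

Then sbt -DSPARK_APP_JAR_PATH=/path/to/jar/file run should make the value visible in both places.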
I have defined a minimal build.sbt with two custom profiles ‘dev’ and ‘staging’ (what SBT seems to call Configurations). However, when I run SBT with the Configuration that was defined first in the file (dev), both Configuration blocks are executed - and if both modify the same setting, the last one wins (staging).
This seems to break any notion of conditional activation, so what am I doing wrong with SBT?
For reference, I want to emulate the conditionally activated Profiles concept of Maven e.g. mvn test -P staging.
SBT version: 1.2.1
build.sbt:
name := "example-project"
scalaVersion := "2.12.6"
...
fork := true
// Environment-independent JVM property (always works)
javaOptions += "-Da=b"
// Environment-specific JVM property (doesn’t work)
lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test
val root = (project in file("."))
  .configs(Dev, Staging)
  .settings(inConfig(Dev)(Seq(javaOptions in Test += "-Dfoo=bar")))
  .settings(inConfig(Staging)(Seq(javaOptions in Test += "-Dfoo=qux")))
Command:
# Bad
sbt test
=> foo=qux
a=b
# Bad
sbt clean dev:test
=> foo=qux
a=b
# Good
sbt clean staging:test
=> foo=qux
a=b
Notice that despite the inConfig usage, you're still setting javaOptions in Test, i.e. in the Test config. If you remove in Test, it works as expected:
...
  .settings(inConfig(Dev)(javaOptions += "-Dfoo=bar"))
  .settings(inConfig(Staging)(javaOptions += "-Dfoo=qux"))
(also, the Seq(...) wrapping is unnecessary)
Now in sbt:
> show Test/javaOptions
[info] *
> show Dev/javaOptions
[info] * -Dfoo=bar
> show Staging/javaOptions
[info] * -Dfoo=qux
You can achieve the same result by scoping each setting explicitly (without inConfig wrapping):
.settings(
  Dev / javaOptions += "-Dfoo=bar",
  Staging / javaOptions += "-Dfoo=qux",
  ...
)
(here Conf/javaOptions is the same as javaOptions in Conf)
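Putting it all together, a fuller sketch (the Defaults.testTasks and fork lines are additions not shown in the question: inConfig(Conf)(Defaults.testTasks) is what makes dev:test actually run the test tasks in that configuration, and javaOptions only take effect in a forked JVM):

lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test

val root = (project in file("."))
  .configs(Dev, Staging)
  .settings(inConfig(Dev)(Defaults.testTasks))
  .settings(inConfig(Staging)(Defaults.testTasks))
  .settings(
    fork := true,
    Dev / javaOptions += "-Dfoo=bar",
    Staging / javaOptions += "-Dfoo=qux"
  )

With this, sbt dev:test and sbt staging:test behave like Maven's conditionally activated profiles.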
I am using Grizzled-SLF4J (a wrapper around SLF4J) for my Spark/Scala/SBT project. The property file simplelogger.properties has been placed in src/main/resources, but it is not being picked up when I run the application using spark-submit. Whatever change I make to the property file is not reflected, and it seems some default values for the properties are used (in my case only WARN/ERROR messages are displayed).
Here is my build.sbt
lazy val root = (project in file(".")).
  settings(
    name := "myprojectname",
    ...,
    libraryDependencies ++= Seq(
      "org.clapper" %% "grizzled-slf4j" % "1.3.0",
      "org.slf4j" % "slf4j-simple" % "1.7.22",
      "org.slf4j" % "slf4j-api" % "1.7.22"
    )
  )
simplelogger.properties
org.slf4j.simpleLogger.logFile = System.err
org.slf4j.simpleLogger.defaultLogLevel = debug
org.slf4j.simpleLogger.showDateTime = false
org.slf4j.simpleLogger.dateTimeFormat = yyyy'/'MM'/'dd' 'HH':'mm':'ss'-'S
org.slf4j.simpleLogger.showThreadName = true
org.slf4j.simpleLogger.showLogName = true
org.slf4j.simpleLogger.showShortLogName = false
org.slf4j.simpleLogger.levelInBrackets = true
Am I missing something here?
PS: I did check the jar, and simplelogger.properties is present in its root directory.
Instead of adding it to the jar root, add it under resources, next to the scala code folder:
/
  src
    main
      resources
        simplelogger.properties
      scala
Obviously this is not only valid for simplelogger.properties but for any other kind of file that you want in the classpath at runtime.
I had the same struggle until I tried to set the properties manually using

System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "DEBUG")

and I noticed that these properties were not present in slf4j-simple-1.6.1.jar.
Make sure to get at least slf4j-simple-1.7.25.jar. The prior versions don't support the properties org.slf4j.simpleLogger.logFile or org.slf4j.simpleLogger.defaultLogLevel.
Upgrading to 1.7.25 picked up the simple logger config from simplelogger.properties
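In build.sbt terms, the upgrade would look something like this (versions as per the answer above; bumping slf4j-api in lockstep is my assumption, but keeping the two artifacts in sync is standard practice):

libraryDependencies ++= Seq(
  "org.clapper" %% "grizzled-slf4j" % "1.3.0",
  "org.slf4j" % "slf4j-simple" % "1.7.25",
  "org.slf4j" % "slf4j-api" % "1.7.25"
)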
Where should an application running in Docker create its logs so that I can map them to the host system later?
The documentation doesn't make it clear to me whether I should create the logs in some directory that is accessible without root permissions (and if so, which one?), or whether I can somehow chown the directory I need.
I tried using /var/log/case and /opt/case/logs with no success. Here is my minified SBT script
object build extends Build {
  lazy val root = Project(
    id = "case-server",
    base = file("."),
    settings = Defaults.coreDefaultSettings ++ graphSettings ++ Revolver.settings ++ Seq(
      version := "1.15",
      scalaVersion := "2.11.7",
      libraryDependencies ++= {
        val akkaV = "2.4.1"
        val akkaStreamV = "2.0.1"
        val scalaTestV = "2.2.5"
        Seq(
          "com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2",
          "ch.qos.logback" % "logback-classic" % "1.1.+",
          "org.scalatest" %% "scalatest" % scalaTestV % "test"
        )
      },
      dockerBaseImage := "develar/java",
      dockerCommands := dockerCommands.value.flatMap {
        case cmd @ Cmd("FROM", _) => List(cmd, Cmd("RUN", "apk update && apk add bash"))
        case other => List(other)
      },
      dockerExposedVolumes := Seq("/opt/case/logs"),
      version in Docker := version.value
    )
  ).enablePlugins(AssemblyPlugin).enablePlugins(JavaAppPackaging).enablePlugins(DockerPlugin)
}
What is the correct approach to do this?
From my point of view, you should not create any log file, but use (unbuffered) stdout for logging in containerized environments.
Please see http://12factor.net/logs for why this is a good idea.
You can then use docker logs to get the logs.
You don't want your application log to be written to files on the container filesystem. The other solutions are:
write log files to a docker volume
write log entries to stdout (so that they will be forwarded to the Docker engine and available with the docker logs command)
Furthermore, if your application is writing its log entries to stdout, you can rely on Docker logging drivers to send those logs to syslog, journald, gelf, fluentd, awslogs, json files, or any log collecting system that provides a docker logging driver.
A trick to make an application write to stdout instead of to a file is to configure it to write to the special file /proc/self/fd/1. Anything written to this special file will be sent to stdout.
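Since the build in the question already pulls in logback-classic, here is a minimal sketch of a stdout-only logback.xml (the pattern layout is an arbitrary example, not taken from the question):

<configuration>
  <!-- log to stdout so that docker logs and logging drivers can pick it up -->
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>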
I am trying to call sbt assembly from the command line, passing it a scalac compiler flag to elide code (-Xelide-below 1).
I have managed to get the flag working by adding this line to the build.sbt:
scalacOptions ++= Seq("-Xelide-below", "1")
It also works fine when I start sbt and run the following:
$> sbt
$> set scalacOptions in ThisBuild ++= Seq("-Xelide-below", "0")
But I would like to know how to pass this in when starting sbt, so that my CI jobs can use it while building different assembly targets (i.e. dev/test/prod).
One way to pass the elide level as a command line option is to use system properties
scalacOptions ++= Seq("-Xelide-below", sys.props.getOrElse("elide.below", "0"))
and run sbt -Delide.below=20 assembly. Quick, dirty and easy.
Another more verbose way to accomplish the same thing is to define different commands for producing test/prod artifacts.
lazy val elideLevel = settingKey[Int]("elide code below this level.")
elideLevel in Global := 0
scalacOptions ++= Seq("-Xelide-below", elideLevel.value.toString)
def assemblyCommand(name: String, level: Int) =
  Command.command(s"${name}Assembly") { s =>
    s"set elideLevel in Global := $level" ::
    "assembly" ::
    "set elideLevel in Global := 0" ::
    s
  }
commands += assemblyCommand("test", 10)
commands += assemblyCommand("prod", 1000)
and you can run sbt testAssembly prodAssembly. This buys you a cleaner command name, plus the fact that you don't have to exit an active sbt shell session to call, for example, testAssembly. My sbt shell sessions tend to live for a long time, so I personally prefer this second option.
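As a side note, -Xelide-below acts on methods annotated with @elidable. A tiny hypothetical illustration (not from the question):

import scala.annotation.elidable
import scala.annotation.elidable._

object Log {
  // calls to this method are removed entirely when its level
  // (FINE = 500) is below the -Xelide-below threshold
  @elidable(FINE) def debug(msg: String): Unit = println(msg)
}

So prodAssembly above, which sets the threshold to 1000, would strip all Log.debug calls from the artifact, while testAssembly (threshold 10) would keep them.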