I am using Grizzled-SLF4J (a wrapper around SLF4J) for my Spark/Scala/SBT project. The property file simplelogger.properties is placed in src/main/resources, but it is not detected when I run the application with spark-submit. Whatever change I make to the property file is not reflected; it seems some default values are used (in my case only WARN/ERROR messages are displayed).
Here is my build.sbt
lazy val root = (project in file(".")).
  settings(
    name := "myprojectname",
    ...,
    libraryDependencies ++= Seq(
      "org.clapper" %% "grizzled-slf4j" % "1.3.0",
      "org.slf4j" % "slf4j-simple" % "1.7.22",
      "org.slf4j" % "slf4j-api" % "1.7.22"
    )
  )
simplelogger.properties
org.slf4j.simpleLogger.logFile = System.err
org.slf4j.simpleLogger.defaultLogLevel = debug
org.slf4j.simpleLogger.showDateTime = false
org.slf4j.simpleLogger.dateTimeFormat = yyyy'/'MM'/'dd' 'HH':'mm':'ss'-'S
org.slf4j.simpleLogger.showThreadName = true
org.slf4j.simpleLogger.showLogName = true
org.slf4j.simpleLogger.showShortLogName = false
org.slf4j.simpleLogger.levelInBrackets = true
Am I missing something here?
PS: I checked the jar, and simplelogger.properties is present in its root directory.
Instead of adding it to the project root, add it under resources, next to the scala source folder:
/
  src
    main
      resources
        simplelogger.properties
      scala
Obviously this is valid not only for simplelogger.properties but for any other file that you want on the classpath at runtime.
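If in doubt, classpath visibility can be checked from code. A minimal sketch (the JDK-class lookup is only there as a sanity check that the mechanism itself works):

```scala
// Minimal sketch: SLF4J's SimpleLogger resolves "simplelogger.properties"
// via the classpath, so this lookup must succeed for the file to be used.
val props = getClass.getClassLoader.getResource("simplelogger.properties")

// Sanity check of the lookup mechanism: .class files are always visible
// as resources, even for JDK classes.
val jdkClass = getClass.getClassLoader.getResource("java/lang/String.class")

println(s"simplelogger.properties -> $props") // null means SimpleLogger falls back to its defaults
```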
I had the same struggle until I tried to set the properties manually using System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "DEBUG")
and noticed that these properties were not present in slf4j-simple-1.6.1.jar.
Make sure to get at least slf4j-simple-1.7.25.jar; the prior versions don't support the properties
org.slf4j.simpleLogger.logFile
or
org.slf4j.simpleLogger.defaultLogLevel
Upgrading to 1.7.25 picked up the simple logger config from simplelogger.properties
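As an alternative to the properties file, the same keys can be set programmatically; a sketch, with the caveat (worth verifying against your slf4j-simple version) that SimpleLogger reads its configuration once, when the first logger is created, so these calls must run before any logging happens:

```scala
// Set SimpleLogger configuration via plain JVM system properties.
// They must be in place before the first logger is instantiated.
System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "debug")
System.setProperty("org.slf4j.simpleLogger.showDateTime", "false")

// The values are now visible process-wide:
val level = System.getProperty("org.slf4j.simpleLogger.defaultLogLevel")
```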
Related
I have a very simple project.
build.sbt:
scalaVersion := "2.13.5"

lazy val testSettings = Seq(
  Test / javaOptions += "-Dconfig.resource=/test.conf"
)

libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.2.3" % Test,
  "com.typesafe" % "config" % "1.4.1"
)
Two configuration files under the resources folder:
application.conf
some-value = "value from application.conf file"
test.conf
some-value = "value from test.conf file"
And only one test spec class:
SomeTestSpec:
import com.typesafe.config.{Config, ConfigFactory}
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class SomeTestSpec extends AnyFlatSpec with Matchers {
  val config: Config = ConfigFactory.load()

  "some test" should "read and print from test.conf" in {
    val value = config.getString("some-value")
    println(s"value of some-value = $value")
    value shouldBe "value from test.conf file"
  }
}
When I run the test, it fails:
"value from [application].conf file" was not equal to "value from [test].conf file"
ScalaTestFailureLocation: SomeTestSpec at (SomeTestSpec.scala:12)
Expected :"value from [test].conf file"
Actual :"value from [application].conf file"
Why is the spec reading application.conf instead of test.conf? Is something wrong in build.sbt?:
lazy val testSettings = Seq(
  Test / javaOptions += "-Dconfig.resource=/test.conf"
)
The best practice is to place application.conf into src/main/resources/ for regular use and into src/test/resources/ for testing. You'll have 2 conf files with different values in test. If you don't need to change config for test you can simply keep one conf file in main.
You don't have to override the file explicitly with -Dconfig.resource or -Dconfig.file for your tests, because loading resources from the classpath works as you'd expect out of the box, and your test config will be used. The -D option is mostly used at runtime to provide external configuration, or in some complex build scenarios.
If you use -Dconfig.resource or -Dconfig.file, pay attention to the path: config.resource is just a filename, e.g. application.conf, which will be loaded as a classpath resource, while config.file is an absolute or relative file path.
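If you do want the -D route for tests, two things in the question's build.sbt would need to change; a sketch, assuming sbt 1.x slash syntax: the settings must actually be attached to a project, and javaOptions only reach the test JVM when tests are forked (note also that config.resource takes a bare resource name, without a leading slash):

```scala
lazy val root = (project in file("."))
  .settings(
    Test / fork := true,                                 // javaOptions only apply to forked JVMs
    Test / javaOptions += "-Dconfig.resource=test.conf"  // resource name, no leading slash
  )
```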
If you are making a library, you can get fancy with a reference.conf file holding default values and application.conf overriding them. This could also work in your scenario, but it would be more confusing because that's not the purpose of a reference file.
For the purposes of testing just use 2 application.conf files.
Additionally, here are some options to override config at runtime:
Overriding multiple config values in Typesafe config when using an uberjar to deploy.
Overriding configuration with environment variables in typesafe config
Scala environment ${USER} in application.conf
More info here: https://github.com/lightbend/config#standard-behavior
I have been trying to set an environment variable directly from the build.sbt file, as I need to use an assembly jar name which is defined in that file. I tried to output the value of the defined variable using echo, and from the Scala application code via sys.props.get("SPARK_APP_JAR_PATH"), but the result is blank or None, respectively.
What can be wrong with the environment variable configuration?
This is how I defined the variable in build.sbt:
assemblyJarName in assembly := "spark_app_example_test.jar"
mainClass in assembly := Some("example.SparkAppExample")
test in assembly := {}
fork := true
val envVars = sys.props("SPARK_APP_JAR_PATH") = "/path/to/jar/file"
That's how it works for me:
lazy val dockerRepo: String = sys.props.getOrElse("DOCKER_REPO", s"bpf.docker.repo")
Or with your example:
lazy val envVars = sys.props.getOrElse("SPARK_APP_JAR_PATH", "/path/to/jar/file")
According to the documentation:
System properties can be provided either as JVM options, or as SBT arguments, in both cases as -Dprop=value. The following properties influence SBT execution.
e.g. sbt -DSPARK_APP_JAR_PATH=/path/to/jar/file
See https://www.scala-sbt.org/release/docs/Command-Line-Reference.html#Command+Line+Options
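To illustrate the distinction this relies on (a sketch; SPARK_APP_JAR_PATH is just the question's example name): sys.props holds JVM system properties passed as -Dkey=value, while sys.env holds the shell's environment variables. Assigning into sys.props inside build.sbt only affects the sbt JVM itself, not a separately launched Spark application.

```scala
// sys.props: JVM system properties (-Dkey=value on the command line).
// sys.env:   OS environment variables (exported in the shell).
// They are separate maps; setting one never populates the other.
val fromProps = sys.props.getOrElse("SPARK_APP_JAR_PATH", "/default/path.jar")
val fromEnv   = sys.env.get("SPARK_APP_JAR_PATH")
```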
I have defined a minimal build.sbt with two custom profiles, 'dev' and 'staging' (what SBT seems to call Configurations). However, when I run SBT with the Configuration that was defined first in the file (dev), both Configuration blocks are executed; and if both modify the same setting, the last one wins (staging).
This seems to break any notion of conditional activation, so what am I doing wrong with SBT?
For reference, I want to emulate the conditionally activated Profiles concept of Maven e.g. mvn test -P staging.
SBT version: 1.2.1
build.sbt:
name := "example-project"
scalaVersion := "2.12.6"
...
fork := true
// Environment-independent JVM property (always works)
javaOptions += "-Da=b"
// Environment-specific JVM property (doesn’t work)
lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test

val root = (project in file("."))
  .configs(Dev, Staging)
  .settings(inConfig(Dev)(Seq(javaOptions in Test += "-Dfoo=bar")))
  .settings(inConfig(Staging)(Seq(javaOptions in Test += "-Dfoo=qux")))
Command:
# Bad
sbt test
=> foo=qux
a=b
# Bad
sbt clean dev:test
=> foo=qux
a=b
# Good
sbt clean staging:test
=> foo=qux
a=b
Notice that despite the inConfig usage you're still setting javaOptions in Test, i.e. in the Test config. If you remove in Test, it works as expected:
...
.settings(inConfig(Dev)(javaOptions += "-Dfoo=bar"))
.settings(inConfig(Staging)(javaOptions += "-Dfoo=qux"))
(also Seq(...) wrapping is unnecessary)
Now in sbt:
> show Test/javaOptions
[info] *
> show Dev/javaOptions
[info] * -Dfoo=bar
> show Staging/javaOptions
[info] * -Dfoo=qux
You can achieve the same result by scoping each setting explicitly (without inConfig wrapping):
.settings(
  Dev/javaOptions += "-Dfoo=bar",
  Staging/javaOptions += "-Dfoo=qux",
  ...
)
(here Conf/javaOptions is the same as javaOptions in Conf)
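One caveat worth adding (an assumption to verify against your sbt version): for dev:test and staging:test to actually run the test tasks inside the custom configurations, the test tasks usually have to be defined there as well, e.g. via Defaults.testTasks. A sketch:

```scala
lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test

val root = (project in file("."))
  .configs(Dev, Staging)
  .settings(inConfig(Dev)(Defaults.testTasks))      // makes dev:test runnable
  .settings(inConfig(Staging)(Defaults.testTasks))  // makes staging:test runnable
  .settings(
    Dev / javaOptions += "-Dfoo=bar",
    Staging / javaOptions += "-Dfoo=qux"
  )
```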
Where should an application running in Docker create its logs, so that I can map them to the host system later?
It's unclear to me from the documentation whether I should create the logs in some directory accessible without root permissions (and if so, which one?), or whether I can somehow chown the directory I need.
I tried using /var/log/case and /opt/case/logs with no success. Here is my minified SBT build:
object build extends Build {
  lazy val root = Project(
    id = "case-server",
    base = file("."),
    settings = Defaults.coreDefaultSettings ++ graphSettings ++ Revolver.settings ++ Seq(
      version := "1.15",
      scalaVersion := "2.11.7",
      libraryDependencies ++= {
        val akkaV = "2.4.1"
        val akkaStreamV = "2.0.1"
        val scalaTestV = "2.2.5"
        Seq(
          "com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2",
          "ch.qos.logback" % "logback-classic" % "1.1.+",
          "org.scalatest" %% "scalatest" % scalaTestV % "test"
        )
      },
      dockerBaseImage := "develar/java",
      dockerCommands := dockerCommands.value.flatMap {
        case cmd @ Cmd("FROM", _) => List(cmd, Cmd("RUN", "apk update && apk add bash"))
        case other => List(other)
      },
      dockerExposedVolumes := Seq("/opt/case/logs"),
      version in Docker := version.value
    )
  ).enablePlugins(AssemblyPlugin).enablePlugins(JavaAppPackaging).enablePlugins(DockerPlugin)
}
What is the correct approach to do this ?
From my point of view, you should not create any log file at all, but log (unbuffered) to stdout in containerized environments.
See here for why this is a good idea: http://12factor.net/logs
You can then use docker logs to get the logs, as described here.
You don't want your application log to be written to files on the container filesystem. The alternatives are:
write log files to a docker volume
write log entries to stdout (so that they will be forwarded to the Docker engine and available with the docker logs command)
Furthermore, if your application is writing its log entries to stdout, you can rely on Docker logging drivers to send those logs to syslog, journald, gelf, fluentd, awslogs, json files, or any log collecting system that provides a docker logging driver.
A trick to make an application write to stdout instead of to a file is to configure it to write to the special file /proc/self/fd/1. Anything written to this special file will be sent to stdout.
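Since the build in the question pulls in logback-classic, here is a minimal logback.xml sketch that sends everything to stdout, where docker logs (and any logging driver) can pick it up; the pattern is only an example:

```xml
<configuration>
  <!-- ConsoleAppender writes to stdout by default -->
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```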
I have a little problem when I'm running the command sbt run :
$ sbt run
java.lang.NoSuchMethodError: com.typesafe.config.ConfigFactory.defaultApplication(Lcom/typesafe/config/ConfigParseOptions;)Lcom/typesafe/config/Config;
at play.api.Configuration$$anonfun$3.apply(Configuration.scala:75)
at play.api.Configuration$$anonfun$3.apply(Configuration.scala:71)
at scala.Option.getOrElse(Option.scala:121)
at play.api.Configuration$.load(Configuration.scala:71)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:203)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:61)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:60)
at play.core.server.DevServerStart$.mainDevHttpMode(DevServerStart.scala:50)
at play.core.server.DevServerStart.mainDevHttpMode(DevServerStart.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at play.runsupport.Reloader$.startDevMode(Reloader.scala:207)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.devModeServer$lzycompute$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.play$sbt$run$PlayRun$$anonfun$$anonfun$$anonfun$$devModeServer$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:99)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:52)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
Most sbt errors I've met turn up an existing issue when I google them, but for this one I have no idea how to fix it.
My build.sbt file:
import play.routes.compiler.InjectedRoutesGenerator
import play.sbt.PlayScala
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  cache,
  ws,
  filters,
  "com.typesafe" % "config" % "1.0.0",
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.7.play24",
  "com.amazonaws" % "aws-java-sdk" % "1.10.12",
  "org.webjars" %% "webjars-play" % "2.4.0-1",
  "org.webjars" % "bootstrap" % "3.3.5",
  "org.webjars" % "angularjs" % "1.4.7",
  "org.webjars" % "angular-ui-bootstrap" % "0.14.3",
  "org.webjars" % "angular-ui-router" % "0.2.15"
)
resolvers += "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases/"
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
Does anybody see what the issue might be?
Update with application.conf :
# This is the main configuration file for the application.
# ~~~~~
# Secret key
# ~~~~~
# The secret key is used to secure cryptographics functions.
#
# This must be changed for production, but we recommend not changing it in this file.
#
# See http://www.playframework.com/documentation/latest/ApplicationSecret for more details.
play.crypto.secret = "changeme"
# The application languages
# ~~~~~
play.i18n.langs = [ "en" ]
# Router
# ~~~~~
# Define the Router object to use for this application.
# This router will be looked up first when the application is starting up,
# so make sure this is the entry point.
# Furthermore, it's assumed your route file is named properly.
# So for an application router like `my.application.Router`,
# you may need to define a router file `conf/my.application.routes`.
# Default to Routes in the root package (and conf/routes)
# play.http.router = my.application.Routes
# Database configuration
# ~~~~~
# You can declare as many datasources as you want.
# By convention, the default datasource is named `default`
#
# db.default.driver=org.h2.Driver
# db.default.url="jdbc:h2:mem:play"
# db.default.username=sa
# db.default.password=""
# Evolutions
# ~~~~~
# You can disable evolutions if needed
# play.evolutions.enabled=false
# You can disable evolutions for a specific datasource if necessary
# play.evolutions.db.default.enabled=false
play.modules.enabled += "play.modules.reactivemongo.ReactiveMongoModule"
project/plugins.sbt:
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0")
Update "com.typesafe" % "config" % "1.0.0" to 1.3.0.
According to GitHub, the missing method appeared in version 1.3.0 of ConfigFactory:
https://github.com/typesafehub/config/blob/master/config/src/main/java/com/typesafe/config/ConfigFactory.java
/**
 * @since 1.3.0
 *
 * @param options the options
 * @return the default application configuration
 */
public static Config defaultApplication(ConfigParseOptions options) {
    return parseApplicationConfig(ensureClassLoader(options, "defaultApplication"));
}
For people running into this problem: the way to solve it was to put config-1.3.0.jar into the lib folder.
The same error can happen if you put some Play libraries into a provided environment classpath (as with an assembled fat jar). For example, Flink 1.3.2 ships an older com.typesafe.config library that is not compatible with Play 2.5 or later.
The way to solve this is by shading the library, as explained in the sbt-assembly documentation.
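For the shading route, a sketch of the sbt-assembly rule (assuming the sbt-assembly plugin is on the build, and using the pre-1.0 `in` syntax the rest of this thread uses; the `shaded.` prefix is an arbitrary choice):

```scala
assemblyShadeRules in assembly := Seq(
  // Rewrite the conflicting package inside the fat jar so the provided
  // environment's older com.typesafe.config no longer wins at runtime.
  ShadeRule.rename("com.typesafe.config.**" -> "shaded.typesafe.config.@1").inAll
)
```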