sbt run fails when I launch it - scala

I have a little problem when running the command sbt run:
$ sbt run
java.lang.NoSuchMethodError: com.typesafe.config.ConfigFactory.defaultApplication(Lcom/typesafe/config/ConfigParseOptions;)Lcom/typesafe/config/Config;
at play.api.Configuration$$anonfun$3.apply(Configuration.scala:75)
at play.api.Configuration$$anonfun$3.apply(Configuration.scala:71)
at scala.Option.getOrElse(Option.scala:121)
at play.api.Configuration$.load(Configuration.scala:71)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:203)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:61)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:60)
at play.core.server.DevServerStart$.mainDevHttpMode(DevServerStart.scala:50)
at play.core.server.DevServerStart.mainDevHttpMode(DevServerStart.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at play.runsupport.Reloader$.startDevMode(Reloader.scala:207)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.devModeServer$lzycompute$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.play$sbt$run$PlayRun$$anonfun$$anonfun$$anonfun$$devModeServer$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:99)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:52)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
Most sbt errors I run into turn up an existing issue when I google them, but for this kind of error I have no idea how to fix it.
My file build.sbt:
import play.routes.compiler.InjectedRoutesGenerator
import play.sbt.PlayScala
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
cache,
ws,
filters,
"com.typesafe" % "config" % "1.0.0",
"org.reactivemongo" %% "play2-reactivemongo" % "0.11.7.play24",
"com.amazonaws" % "aws-java-sdk" % "1.10.12",
"org.webjars" %% "webjars-play" % "2.4.0-1",
"org.webjars" % "bootstrap" % "3.3.5",
"org.webjars" % "angularjs" % "1.4.7",
"org.webjars" % "angular-ui-bootstrap" % "0.14.3",
"org.webjars" % "angular-ui-router" % "0.2.15"
)
resolvers += "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases/"
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
Does anybody see what the issue is?
Update with application.conf:
# This is the main configuration file for the application.
# ~~~~~
# Secret key
# ~~~~~
# The secret key is used to secure cryptographics functions.
#
# This must be changed for production, but we recommend not changing it in this file.
#
# See http://www.playframework.com/documentation/latest/ApplicationSecret for more details.
play.crypto.secret = "changeme"
# The application languages
# ~~~~~
play.i18n.langs = [ "en" ]
# Router
# ~~~~~
# Define the Router object to use for this application.
# This router will be looked up first when the application is starting up,
# so make sure this is the entry point.
# Furthermore, it's assumed your route file is named properly.
# So for an application router like `my.application.Router`,
# you may need to define a router file `conf/my.application.routes`.
# Default to Routes in the root package (and conf/routes)
# play.http.router = my.application.Routes
# Database configuration
# ~~~~~
# You can declare as many datasources as you want.
# By convention, the default datasource is named `default`
#
# db.default.driver=org.h2.Driver
# db.default.url="jdbc:h2:mem:play"
# db.default.username=sa
# db.default.password=""
# Evolutions
# ~~~~~
# You can disable evolutions if needed
# play.evolutions.enabled=false
# You can disable evolutions for a specific datasource if necessary
# play.evolutions.db.default.enabled=false
play.modules.enabled += "play.modules.reactivemongo.ReactiveMongoModule"
project/plugins.sbt:
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0")

Update "com.typesafe" % "config" % "1.0.0", to 1.3.0
According to github, missed method appears in 1.3.0 version of ConfigFactory
https://github.com/typesafehub/config/blob/master/config/src/main/java/com/typesafe/config/ConfigFactory.java
/**
 * @since 1.3.0
 *
 * @param options the options
 * @return the default application configuration
 */
public static Config defaultApplication(ConfigParseOptions options) {
    return parseApplicationConfig(ensureClassLoader(options, "defaultApplication"));
}

For anyone hitting this problem: the way to solve it was to put config-1.3.0.jar into the lib folder (sbt's unmanaged dependencies directory).
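Equivalently, you can bump the managed dependency instead of dropping a jar into lib; a minimal build.sbt sketch (version taken from the explanation above):
libraryDependencies ++= Seq(
  // ...other dependencies unchanged...
  "com.typesafe" % "config" % "1.3.0" // was 1.0.0; defaultApplication(ConfigParseOptions) only exists from 1.3.0
)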

The same error can happen if you put some Play libraries into a provided environment classpath (as with an assembled fat jar). For example, Flink 1.3.2 ships an older com.typesafe.config library that is not compatible with Play 2.5 or later.
The way to solve this is by shading the libraries, as explained in sbt-assembly.
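As a sketch, assuming sbt-assembly is already on the build, a shade rule for the config library could look like this (the shaded package name is an arbitrary choice):
assemblyShadeRules in assembly := Seq(
  // rename the bundled com.typesafe.config classes so they cannot clash with
  // the older copy provided by the runtime environment (e.g. Flink)
  ShadeRule.rename("com.typesafe.config.**" -> "shaded.typesafe.config.@1").inAll
)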


Reading configuration files from unit tests

I have a very simple project.
build.sbt:
scalaVersion := "2.13.5"
lazy val testSettings = Seq(
Test / javaOptions += "-Dconfig.resource=/test.conf"
)
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "3.2.3" % Test,
"com.typesafe" % "config" % "1.4.1"
)
Two configuration files under resources folder:
application.conf
some-value = "value from application.conf file"
test.conf
some-value = "value from test.conf file"
And only one spec test class, SomeTestSpec:
import com.typesafe.config.{Config, ConfigFactory}
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class SomeTestSpec extends AnyFlatSpec with Matchers {
val config: Config = ConfigFactory.load()
"some test" should "read and print from test.conf" in {
val value = config.getString("some-value")
println(s"value of some-value = $value")
value shouldBe "value from test.conf file"
}
}
When I run the test it fails:
"value from [application].conf file" was not equal to "value from [test].conf file"
ScalaTestFailureLocation: SomeTestSpec at (SomeTestSpec.scala:12)
Expected :"value from [test].conf file"
Actual :"value from [application].conf file"
Why is the spec reading the file application.conf instead of test.conf? Is something wrong in the build.sbt?
lazy val testSettings = Seq(
Test / javaOptions += "-Dconfig.resource=/test.conf"
)
The best practice is to place application.conf into src/main/resources/ for regular use and into src/test/resources/ for testing. You'll have 2 conf files with different values in test. If you don't need to change config for test you can simply keep one conf file in main.
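For example, with the layout below (a sketch of that convention), the copy in src/test/resources shadows the main one on the test classpath:
src/
  main/
    resources/
      application.conf   # some-value = "value from application.conf file"
  test/
    resources/
      application.conf   # some-value = "value from test.conf file"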
You don't have to override the file explicitly with -Dconfig.resource or -Dconfig.file for your tests, because loading resources from the classpath works as you'd expect out of the box, and your test config will be used. The -D option is mostly used at runtime to provide external configuration, or in some complex build scenarios.
If you use -Dconfig.resource or -Dconfig.file, pay attention to the path: config.resource expects just a resource name, e.g. test.conf (no leading slash), which is loaded from the classpath, while config.file takes an absolute or relative file path.
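For example (a sketch of the distinction; note there is no leading slash with config.resource):
# classpath resource name
-Dconfig.resource=test.conf
# absolute or relative file path
-Dconfig.file=/etc/myapp/production.conf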
If you are making a library, you can get fancy with a reference.conf file holding default values and application.conf overriding them. This could also work in your scenario, but it would be more confusing, because that's not the purpose of a reference file.
For the purposes of testing just use 2 application.conf files.
Additionally, here are some options to override config at runtime:
Overriding multiple config values in Typesafe config when using an uberjar to deploy.
Overriding configuration with environment variables in typesafe config
Scala environment ${USER} in application.conf
More info here: https://github.com/lightbend/config#standard-behavior

Multiple SBT Configurations should be exclusive, but they all activate at the same time - why?

I have defined a minimal build.sbt with two custom profiles ‘dev’ and ‘staging’ (what SBT seems to call Configurations). However, when I run SBT with the Configuration that was defined first in the file (dev), both Configuration blocks are executed - and if both modify the same setting, the last one wins (staging).
This seems to break any notion of conditional activation, so what am I doing wrong with SBT?
For reference, I want to emulate the conditionally activated Profiles concept of Maven e.g. mvn test -P staging.
SBT version: 1.2.1
build.sbt:
name := "example-project"
scalaVersion := "2.12.6"
...
fork := true
// Environment-independent JVM property (always works)
javaOptions += "-Da=b"
// Environment-specific JVM property (doesn’t work)
lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test
val root = (project in file("."))
.configs(Dev, Staging)
.settings(inConfig(Dev)(Seq(javaOptions in Test += "-Dfoo=bar")))
.settings(inConfig(Staging)(Seq(javaOptions in Test += "-Dfoo=qux")))
Command:
# Bad
sbt test
=> foo=qux
a=b
# Bad
sbt clean dev:test
=> foo=qux
a=b
# Good
sbt clean staging:test
=> foo=qux
a=b
Notice that despite the inConfig usage you're still setting javaOptions in Test, i.e. in the Test config. If you remove in Test, it works as expected:
...
.settings(inConfig(Dev)(javaOptions += "-Dfoo=bar"))
.settings(inConfig(Staging)(javaOptions += "-Dfoo=qux"))
(also, the Seq(...) wrapping is unnecessary)
Now in sbt:
> show Test/javaOptions
[info] *
> show Dev/javaOptions
[info] * -Dfoo=bar
> show Staging/javaOptions
[info] * -Dfoo=qux
You can achieve the same result by scoping each setting explicitly (without inConfig wrapping):
.settings(
Dev/javaOptions += "-Dfoo=bar",
Staging/javaOptions += "-Dfoo=qux",
...
)
(here Conf/javaOptions is the same as javaOptions in Conf)
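Putting it together, a consolidated build.sbt sketch (sbt 1.x slash syntax; the inConfig(...)(Defaults.testTasks) lines are my assumption for wiring the test task into the custom configs, and forking is required for javaOptions to take effect):
lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test

lazy val root = (project in file("."))
  .configs(Dev, Staging)
  .settings(
    fork := true,                          // javaOptions only apply to forked JVMs
    inConfig(Dev)(Defaults.testTasks),     // assumption: make dev:test run the test task
    inConfig(Staging)(Defaults.testTasks),
    Dev / javaOptions += "-Dfoo=bar",
    Staging / javaOptions += "-Dfoo=qux"
  )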

SLF4J : simplelogger.properties in the project not detected

I am using Grizzled-SLF4J (a wrapper around SLF4J) for my Spark/Scala/sbt project. The property file simplelogger.properties has been placed in src/main/resources, but it is not detected when I run the application using spark-submit. Whatever change I make to the property file is not reflected, and it seems some default values are used (in my case only WARN/ERROR messages are displayed).
Here is my build.sbt
lazy val root = (project in file(".")).
settings(
name := "myprojectname",
...,
libraryDependencies ++= Seq(
"org.clapper" %% "grizzled-slf4j" % "1.3.0",
"org.slf4j" % "slf4j-simple" % "1.7.22",
"org.slf4j" % "slf4j-api" % "1.7.22",
)
)
simplelogger.properties
org.slf4j.simpleLogger.logFile = System.err
org.slf4j.simpleLogger.defaultLogLevel = debug
org.slf4j.simpleLogger.showDateTime = false
org.slf4j.simpleLogger.dateTimeFormat = yyyy'/'MM'/'dd' 'HH':'mm':'ss'-'S
org.slf4j.simpleLogger.showThreadName = true
org.slf4j.simpleLogger.showLogName = true
org.slf4j.simpleLogger.showShortLogName= false
org.slf4j.simpleLogger.levelInBrackets = true
Am I missing something here?
PS: I did check the jar, and simplelogger.properties is available in the root directory.
Instead of adding it to the root of the project, add it under resources, next to the scala sources folder:
/
  src
    main
      resources
        simplelogger.properties
      scala
Obviously this is not only valid for simplelogger.properties but for any other kind of file that you want in the classpath at runtime.
I had the same struggle until I tried to set the properties manually using System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "DEBUG");
and I noticed that these properties were not present in the slf4j-simple-1.6.1.jar
Make sure to get at least slf4j-simple-1.7.25.jar. Prior versions don't support properties such as
org.slf4j.simpleLogger.logFile
or
org.slf4j.simpleLogger.defaultLogLevel
Upgrading to 1.7.25 picked up the simple logger config from simplelogger.properties
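In build.sbt terms that amounts to something like this (a sketch; versions per the advice above):
libraryDependencies ++= Seq(
  "org.clapper" %% "grizzled-slf4j" % "1.3.0",
  "org.slf4j" % "slf4j-simple" % "1.7.25", // 1.7.25+; older versions ignore several simplelogger.properties keys
  "org.slf4j" % "slf4j-api" % "1.7.25"
)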

Connecting a Play! application to a postgres (with Postgis) database with Docker-compose

I'm trying to start my Play! 2.4 application using a Postgres database with Docker-compose.
I manage to start my Play! application alone (but it doesn't work since it can't connect to the database). And I also manage to start my postgis database using the image mdillon/postgis:9.4.
My Dockerfile is:
FROM mdillon/postgis:9.4
ADD init.sql /docker-entrypoint-initdb.d/
Here is my init.sql file:
CREATE USER simon WITH PASSWORD 'mySecretPassword';
ALTER USER simon WITH SUPERUSER;
CREATE DATABASE ticketapp;
GRANT ALL PRIVILEGES ON DATABASE ticketapp TO simon;
\connect ticketapp simon
CREATE EXTENSION postgis;
CREATE DATABASE tests;
GRANT ALL PRIVILEGES ON DATABASE tests TO simon;
\connect tests simon
CREATE EXTENSION postgis;
(I think that it is not necessary to create the extension as it seems to be already done.)
If I run my docker database and manually run the init.sql script, I can add a table with a Geometry type as a column.
Now comes my problem: if I try to link my two services with Docker-compose and the following docker-compose.yml file:
5.run:
  image: 5.run
  ports:
    - "88:88"
  links:
    - dbHost
dbHost:
  image: my_postgres
  ports:
    - "5433:5433"
  expose:
    - "5433"
I get the following errors:
dbHost_1 | LOG: database system is ready to accept connections
dbHost_1 | ERROR: relation "play_evolutions" does not exist at character 72
dbHost_1 | STATEMENT: select id, hash, apply_script, revert_script, state, last_problem from play_evolutions where state like 'applying_%'
dbHost_1 | ERROR: type "geometry" does not exist at character 150
dbHost_1 | STATEMENT: CREATE TABLE frenchCities (
dbHost_1 | cityId SERIAL PRIMARY KEY,
dbHost_1 | city VARCHAR(255) NOT NULL,
dbHost_1 | geographicPoint GEOMETRY NOT NULL
dbHost_1 | )
5.run_1 | [error] p.a.d.e.DefaultEvolutionsApi - ERROR: type "geometry" does not exist
5.run_1 | Position: 150 [ERROR:0, SQLSTATE:42704]
Please note that my Play! application correctly waits for the database to be ready.
Now I don't have any idea of what should be done in order to make it work, any clue would be great!
First I will show you how I configure my database (editing the file used for connections), and then how I configure it in Play.
sudo su
apt-get install scala //is important for play
apt-get -y install postgresql
sudo -u postgres psql
\password postgres
CREATE USER andi WITH PASSWORD 'pw';
CREATE DATABASE play OWNER andi;
\q
Optional GUI for postgres:
apt-get -y install pgadmin3
You have to edit /etc/postgresql/9.4/main/pg_hba.conf (root access required). This file defines who can log in and how. Set the METHOD column to md5 on the entries shown below:
# Database administrative login by Unix domain socket
local all postgres md5
# TYPE DATABASE USER ADDRESS METHOD
# "local" is for Unix domain socket connections only
local all all md5
# IPv4 local connections:
host all all 127.0.0.1/32 md5
host all andi 127.0.0.1/32 md5
# IPv6 local connections:
host all all ::1/128 md5
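After editing pg_hba.conf, reload PostgreSQL so the change takes effect (the service name may differ per distribution):
sudo service postgresql reload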
play
build.sbt
import _root_.sbt.Keys._
import _root_.sbt._
name := """has"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava, PlayEbean)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
cache,
javaWs,
javaCore,
javaJdbc,
"com.github.nscala-time" %% "nscala-time" % "1.6.0",
"org.postgresql" % "postgresql" % "9.3-1102-jdbc41",
"jp.t2v" %% "play2-auth" % "0.13.0",
"jp.t2v" %% "play2-auth-test" % "0.13.0" % "test",
"org.webjars" % "bootstrap" % "3.3.5"
)
libraryDependencies += "mimerender" %% "mimerender" % "0.1.2"
conf/application.conf
play.crypto.secret = "aKr4Mfn!vKzDjfhfdJRsakgbPS35!!HVDldkosGHRT"
# The application languages
play.i18n.langs = [ "en" ]
db.default.user=andi
db.default.password="pw"
db.default.driver=org.postgresql.Driver
db.default.url="jdbc:postgresql://localhost:5432/play"
# The following line defines the models package. You have to add a folder called models in the app folder.
ebean.default = "models.*"
# Root logger:
logger.root=ERROR
play.evolutions.enabled=true
# Logger used by the framework:
logger.play=INFO
# Logger provided to your application:
logger.application=DEBUG
project/plugins.sbt
Normally you only have to uncomment the last line.
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.4")
// Web plugins
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.3")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.7")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.1.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.1.0")
// Play enhancer - this automatically generates getters/setters for public fields
// and rewrites accessors of these fields to use the getters/setters. Remove this
// plugin if you prefer not to have this feature, or disable on a per project
// basis using disablePlugins(PlayEnhancer) in your build.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-play-enhancer" % "1.1.0")
// Play Ebean support, to enable, uncomment this line, and enable in your build.sbt using
// enablePlugins(SbtEbean). Note, uncommenting this line will automatically bring in
// Play enhancer, regardless of whether the line above is commented out or not.
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "1.0.0")
I hope that this is what you need.

Where do you create application logs of app running in Docker?

Where do you create application logs of an app running in Docker, so that I can map them later to the host system?
From the documentation it's unclear to me whether I should create logs in some directory accessible without root permissions (and if so, which one?), or whether I can somehow chown the directory I need.
I tried using /var/log/case and /opt/case/logs with no success. Here is my minified SBT script
object build extends Build {
lazy val root = Project(
id = "case-server",
base = file("."),
settings = Defaults.coreDefaultSettings ++ graphSettings ++ Revolver.settings ++ Seq(
version := "1.15",
scalaVersion := "2.11.7",
libraryDependencies ++= {
val akkaV = "2.4.1"
val akkaStreamV = "2.0.1"
val scalaTestV = "2.2.5"
Seq(
"com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2",
"ch.qos.logback" % "logback-classic" % "1.1.+",
"org.scalatest" %% "scalatest" % scalaTestV % "test"
)
},
dockerBaseImage := "develar/java",
dockerCommands := dockerCommands.value.flatMap {
case cmd @ Cmd("FROM", _) => List(cmd, Cmd("RUN", "apk update && apk add bash"))
case other => List(other)
},
dockerExposedVolumes := Seq("/opt/case/logs"),
version in Docker := version.value
)
).enablePlugins(AssemblyPlugin).enablePlugins(JavaAppPackaging).enablePlugins(DockerPlugin)
}
What is the correct approach to do this?
From my point of view, you should not create log files at all, but write (unbuffered) to stdout for logging in containerized environments.
See http://12factor.net/logs for why this is a good idea.
You can then use docker logs to retrieve the logs, as described here.
You don't want your application log to be written on files on the container filesystem. The other solutions are:
write log files to a docker volume
write log entries to stdout (so that they will be forwarded to the Docker engine and available with the docker logs command)
Furthermore, if your application is writing its log entries to stdout, you can rely on Docker logging drivers to send those logs to syslog, journald, gelf, fluentd, awslogs, json files, or any log collecting system that provides a docker logging driver.
A trick to make an application write to stdout instead of to a file is to configure it to write to the special file /proc/self/fd/1. Anything written to this special file will be sent to stdout.
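Since the build above already pulls in logback-classic, a minimal logback.xml sketch in src/main/resources that sends everything to stdout could look like this (pattern and level are arbitrary choices):
<configuration>
  <!-- console appender: everything goes to stdout, where docker logs picks it up -->
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>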