Play logging system - Scala

I'm a bit confused about the logging system in Play.
Without importing any logging library, I added this to my code:
Logger.debug("Data is: " + data)
It didn't cause a compilation error, but it also didn't print anything in the terminal window where I started the activator (where I typed activator run).
After looking here https://www.playframework.com/documentation/2.5.x/ScalaLogging, I also tried:
val logger = Logger(this.getClass)
logger.debug("Data is: " + data)
However, again nothing is printed.
Why is this happening?

There are a few log levels you can set in application.conf, according to the documentation:
# Root logger:
logger.root=ERROR
# Logger used by the framework:
logger.play=INFO
# Logger provided to your application:
logger.application=DEBUG
# Logger for a third party library
logger.org.springframework=INFO
Try setting the log level to DEBUG in your application.conf.

There is currently an issue in the default logger configuration in DEV mode: https://github.com/playframework/playframework/issues/5842
The default level for the application logger is INFO, so debug messages are not shown.
Until that issue is fixed, the workaround is to override logback.xml, following the example in https://www.playframework.com/documentation/2.5.x/SettingsLogger, which sets the log level for application to DEBUG.
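A minimal conf/logback.xml in the spirit of that example might look like this (the appender and pattern here are illustrative; the key line is the application logger set to DEBUG):
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date %level %logger - %message%n</pattern>
    </encoder>
  </appender>
  <!-- "application" is the logger your app code logs to by default -->
  <logger name="application" level="DEBUG" />
  <root level="WARN">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>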

1. Add the following to build.sbt:
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0"
2. Import the following in your controller:
import com.typesafe.scalalogging.Logger
3. Use:
private val logger = Logger(this.getClass)
logger.warn("your messages in here.")

Related

Error running spark in a Scala REPL - access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )

I have been using IntelliJ to get up to speed with developing Spark applications in Scala using sbt. I understand the basics, although IntelliJ hides a lot of the scaffolding, so I'd like to try getting something up and running from the command line (i.e. using a REPL). I am using macOS.
Here's what I've done:
mkdir -p ~/tmp/scalasparkrepl
cd !$
echo 'scalaVersion := "2.11.12"' > build.sbt
echo 'libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"' >> build.sbt
echo 'libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"' >> build.sbt
echo 'libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.0"' >> build.sbt
sbt console
That opens a Scala REPL (after downloading all the dependencies), in which I run:
import org.apache.spark.SparkConf
import org.apache.spark.sql.{SparkSession, DataFrame}
val conf = new SparkConf().setMaster("local[*]")
val spark = SparkSession.builder().appName("spark repl").config(conf).config("spark.sql.warehouse.dir", "~/tmp/scalasparkreplhive").enableHiveSupport().getOrCreate()
spark.range(0, 1000).toDF()
which fails with the error access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" ):
scala> spark.range(0, 1000).toDF()
18/05/08 11:51:11 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('~/tmp/scalasparkreplhive').
18/05/08 11:51:11 INFO SharedState: Warehouse path is '/tmp/scalasparkreplhive'.
18/05/08 11:51:12 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/05/08 11:51:12 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
18/05/08 11:51:12 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
18/05/08 11:51:12 INFO ObjectStore: ObjectStore, initialize called
18/05/08 11:51:13 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/05/08 11:51:13 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
java.security.AccessControlException: access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )
I've googled around and there is some information on this error, but nothing I've been able to use to solve it. I find it strange that a scala/sbt project on the command line would have this problem, whereas an sbt project in IntelliJ works fine (I pretty much copied/pasted the code from an IntelliJ project). I guess IntelliJ is doing something on my behalf, but I don't know what; that's why I'm undertaking this exercise.
Can anyone advise how to solve this problem?
Not going to take full credit for this, but it looks similar to SBT test does not work for spark test
The solution is to issue this line before running the Scala code:
System.setSecurityManager(null)
So in full:
System.setSecurityManager(null)
import org.apache.spark.SparkConf
import org.apache.spark.sql.{SparkSession, DataFrame}
val conf = new SparkConf().setMaster("local[*]")
val spark = SparkSession.builder().appName("spark repl").config(conf).config("spark.sql.warehouse.dir", "~/tmp/scalasparkreplhive").enableHiveSupport().getOrCreate()
spark.range(0, 1000).toDF()
Alternatively, you can grant the permission explicitly. Add this to your pre-init script:
export SBT_OPTS="-Djava.security.policy=runtime.policy"
Create a runtime.policy file:
grant codeBase "file:/home/user/.ivy2/cache/org.apache.derby/derby/jars/*" {
    permission org.apache.derby.security.SystemPermission "engine", "usederbyinternals";
};
This assumes that your runtime.policy file resides in the current working directory and you're pulling Derby from your locally cached Ivy repository. Change the path to reflect the actual parent folder of the Derby Jar if necessary. The placement of the asterisk is significant, and this is not a traditional shell glob.
See also: https://docs.oracle.com/javase/7/docs/technotes/guides/security/PolicyFiles.html

Can we use akka.event.Logging to write logs in file?

I have tried using log4j and slf4j with Akka in Scala, and I am able to get log files. Can I achieve the same thing without using any external API other than the Akka APIs? Using akka.event.Logging I am able to print logs to the console, but I want to print them to a file.
I have already tried putting a log4j.properties file for my project on the classpath, and it's not working when I am using akka.event.Logging.
Please suggest.
According to this: http://doc.akka.io/docs/akka/current/java/logging.html
you have 3 options:
Use akka.event.Logging$DefaultLogger (to stdout, not for production)
Use akka.event.slf4j.Slf4jLogger (logger by akka for SLF4J)
Use the SLF4J API directly (with async appender)
Your case is 2 or 3 (you use log4j.properties).
Therefore you should configure log4j.properties properly to output to a file.
And:
in case 2 (your desired case), you should use akka.event.Logging, for example: Logging.getLogger(system.eventStream(), "my.string")
in case 3, you should use the SLF4J API directly, for example: org.slf4j.LoggerFactory.getLogger(...)
You are in case 2 if your Akka config contains something like this:
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
}
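For case 2, a minimal sketch (assuming the config above, the akka-slf4j module on the classpath, and a logback configuration that defines a file appender; the object name is hypothetical):
import akka.actor.ActorSystem
import akka.event.Logging

object FileLoggingDemo extends App {
  val system = ActorSystem("example")
  // Logging.getLogger routes events through the loggers configured above,
  // i.e. Slf4jLogger, which hands them to the SLF4J backend's file appender
  val log = Logging.getLogger(system.eventStream, "my.string")
  log.info("this line should end up in the log file")
  system.terminate() // shut the actor system down so the JVM can exit
}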

ReactiveMongo 0.12 application.conf issue and logging issue

I've read everything I could on SO and the ReactiveMongo community list, and I am stumped. I am using ReactiveMongo 0.12 and am just trying to test it out, since I have some other problems.
The code in my Scala worksheet is:
import reactivemongo.api.{DefaultDB, MongoConnection, MongoDriver}
import reactivemongo.bson.{
  BSONDocumentWriter, BSONDocumentReader, Macros, document
}
import com.typesafe.config.{Config, ConfigFactory}
lazy val conf = ConfigFactory.load()
val driver1 = new reactivemongo.api.MongoDriver
val connection3 = driver1.connection(List("localhost"))
and the error I get is
[NGSession 3: 127.0.0.1: compile-server] INFO reactivemongo.api.MongoDriver - No mongo-async-driver configuration found
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka'
at com.typesafe.config.impl.SimpleConfig.findKey(testMongo.sc:120)
at com.typesafe.config.impl.SimpleConfig.find(testMongo.sc:143)
at com.typesafe.config.impl.SimpleConfig.find(testMongo.sc:155)
at com.typesafe.config.impl.SimpleConfig.find(testMongo.sc:160)
at com.typesafe.config.impl.SimpleConfig.getString(testMongo.sc:202)
at akka.actor.ActorSystem$Settings.<init>(testMongo.sc:165)
at akka.actor.ActorSystemImpl.<init>(testMongo.sc:501)
at akka.actor.ActorSystem$.apply(testMongo.sc:138)
at reactivemongo.api.MongoDriver.<init>(testMongo.sc:879)
at #worksheet#.driver1$lzycompute(testMongo.sc:9)
at #worksheet#.driver1(testMongo.sc:9)
at #worksheet#.get$$instance$$driver1(testMongo.sc:9)
at #worksheet#.#worksheet#(testMongo.sc:30)
My application.conf is in src/main/resources of the sub-project which this worksheet is found and contains this:
mongo-async-driver {
  akka {
    loglevel = WARNING
  }
}
I added the ConfigFactory call precisely because I got this error and thought it might help. I looked at the code, and that's what ReactiveMongo is doing at this point, so I thought perhaps a call here would force it to load. I have moved the application.conf file into every conceivable place, including a conf directory (thinking it might require Play conventions) and the src/main/resources of the top-level directory. Nothing works. So my first question is: what am I doing wrong? Where should the application.conf file go?
After this info message my program crashes and the driver doesn't get created, so I can't move on from here.
Also, I added an akka key to reference.conf just in case; that didn't help either.
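One way to narrow this down (a diagnostic sketch, not a fix) is to check whether ConfigFactory can see the key at all; if this prints false, application.conf is not on the classpath the worksheet runs with:
import com.typesafe.config.ConfigFactory

val conf = ConfigFactory.load()
// hasPath returns true only if application.conf (or reference.conf)
// was found on the classpath and contains the key
println(conf.hasPath("mongo-async-driver"))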

Console scala app doesn't stop when using reactive mongo driver

I'm playing with a Mongo database through the ReactiveMongo driver:
import org.slf4j.LoggerFactory
import reactivemongo.api.MongoDriver
import reactivemongo.api.collections.default.BSONCollection
import reactivemongo.bson.BSONDocument
import scala.concurrent.Future
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
object Main {
  val log = LoggerFactory.getLogger("Main")
  def main(args: Array[String]): Unit = {
    log.info("Start")
    val conn = new MongoDriver().connection(List("localhost"))
    val db = conn("test")
    log.info("Done")
  }
}
My build.sbt file:
lazy val root = (project in file(".")).
  settings(
    name := "simpleapp",
    version := "1.0.0",
    scalaVersion := "2.11.4",
    libraryDependencies ++= Seq(
      "org.reactivemongo" %% "reactivemongo" % "0.10.5.0.akka23",
      "ch.qos.logback" % "logback-classic" % "1.1.2"
    )
  )
When I run: sbt compile run
I get this output:
$ sbt compile run
[success] Total time: 0 s, completed Apr 25, 2015 5:36:51 PM
[info] Running Main
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
17:36:52.328 [run-main-0] INFO Main - Start
17:36:52.333 [run-main-0] INFO Main - Done
And the application doesn't stop... :/
I have to press Ctrl + C to kill it
I've read that MongoDriver() creates an ActorSystem, so I tried to close the connection manually with conn.close(), but I get this:
[info] Running Main
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
17:42:23.252 [run-main-0] INFO Main - Start
17:42:23.258 [run-main-0] INFO Main - Done
17:42:23.403 [reactivemongo-akka.actor.default-dispatcher-2] ERROR reactivemongo.core.actors.MongoDBSystem - (State: Closing) UNHANDLED MESSAGE: ChannelConnected(-973180998)
[INFO] [04/25/2015 17:42:23.413] [reactivemongo-akka.actor.default-dispatcher-3] [akka://reactivemongo/deadLetters] Message [reactivemongo.core.actors.Closed$] from Actor[akka://reactivemongo/user/$b#-1700211063] to Actor[akka://reactivemongo/deadLetters] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
[INFO] [04/25/2015 17:42:23.414] [reactivemongo-akka.actor.default-dispatcher-3] [akka://reactivemongo/user/$a] Message [reactivemongo.core.actors.Close$] from Actor[akka://reactivemongo/user/$b#-1700211063] to Actor[akka://reactivemongo/user/$a#-1418324178] was not delivered. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
And the app doesn't exit either.
So, what am I doing wrong? I can't find an answer...
It also seems to me that the official docs don't explain whether I should care about graceful shutdown at all.
I don't have much experience with console apps; I use the Play framework in my projects, but I want to create a sub-project that works with MongoDB.
I see many templates (in Activator) such as Play + Reactive Mongo and Play + Akka + Mongo, but there's no Scala + Reactive Mongo template that would explain how to work with this properly :/
I was having the same problem. The solution I found was invoking close on both objects, the driver and the connection:
val driver = new MongoDriver
val connection = driver.connection(List("localhost"))
...
connection.close()
driver.close()
If you close only the connection, the Akka system remains alive.
Tested with ReactiveMongo 0.12
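Putting the two calls into the Main from the question, a minimal sketch (same API as in the question; the key change is keeping a reference to the driver so it can be closed):
import org.slf4j.LoggerFactory
import reactivemongo.api.MongoDriver

object Main {
  val log = LoggerFactory.getLogger("Main")
  def main(args: Array[String]): Unit = {
    log.info("Start")
    val driver = new MongoDriver()
    val conn = driver.connection(List("localhost"))
    val db = conn("test")
    log.info("Done")
    conn.close()   // close the connection first...
    driver.close() // ...then the driver, which stops its underlying ActorSystem
  }
}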
This looks like a known issue with ReactiveMongo; see the relevant thread on GitHub.
A fix for this was introduced in pull request #241 by reid-spencer, merged on 3 February 2015.
You should be able to fix it by using a newer version. If no release has been made since February, you could try checking out a version that includes this fix and building the code yourself.
As far as I can see, there's no mention of this bugfix in the release notes for version 0.10.5:
Bugfixes:
BSON library: fix BSONDateTimeNumberLike typeclass
Cursor: fix exception propagation
Commands: fix ok deserialization for some cases
Commands: fix CollStatsResult
Commands: fix AddToSet in aggregation
Core: fix connection leak in some cases
GenericCollection: do not ignore WriteConcern in save()
GenericCollection: do not ignore WriteConcern in bulk inserts
GridFS: fix uploadDate deserialization field
Indexes: fix parsing for Ascending and Descending
Macros: fix type aliases
Macros: allow custom annotations
The name of the committer does not appear either:
Here is the list of the commits included in this release (since 0.9, the top commit is the most recent one):
$ git shortlog -s -n refs/tags/v0.10.0..0.10.5.x.akka23
39 Stephane Godbillon
5 Andrey Neverov
4 lucasrpb
3 Faissal Boutaounte
2 杨博 (Yang Bo)
2 Nikolay Sokolov
1 David Liman
1 Maksim Gurtovenko
1 Age Mooij
1 Paulo "JCranky" Siqueira
1 Daniel Armak
1 Viktor Taranenko
1 Vincent Debergue
1 Andrea Lattuada
1 pavel.glushchenko
1 Jacek Laskowski
Looking at the commit history for 0.10.5.0.akka23 (the one you reference in build.sbt), it seems the fix was not merged into it.

Play Framework 2 -- custom loggers in production?

I have a class with a custom logger. Here's a trivial example:
package models
import play.Logger
object AModel {
  val log = Logger.of("amodel")
  def aMethod() {
    if (!log.isInfoEnabled) log.error("Can't log info...")
    log.info("Logging aMethod in AModel")
  }
}
and then we'll enable this logger in application.conf:
logger.amodel=DEBUG
and in development (Play console, using run) this logger does indeed log. But in production, once we hit the message
[info] play - Application started (Prod)
loggers defined like the above logger fail to log any further and instead we go through the error branch. It seems their log level has been changed to ERROR.
Is there any way to correct this undesirable state of affairs? Is there special configuration for production logs?
Edit:
Play's handling of logs in production is a source of difficulty to more than a few people... https://github.com/playframework/playframework/issues/1186
For some reason it ships its own logger.xml, which overrides application.conf.
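One hedged workaround, following Play's documented logger.file option (the file name and path here are illustrative): ship a custom logback configuration that re-declares your logger, and point the production process at it.
<!-- conf/prod-logger.xml -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date %level %logger - %message%n</pattern>
    </encoder>
  </appender>
  <!-- re-declare the custom logger so production matches application.conf -->
  <logger name="amodel" level="DEBUG" />
  <root level="INFO">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
Then start the application with -Dlogger.file=conf/prod-logger.xml so this file takes precedence over the bundled one.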