Cannot format log messages in Play Framework (Scala)

I cannot get log messages formatted correctly in play 2.2.1 scala.
I am using the standard application-logger.xml file in the conf directory, as described here: http://www.playframework.com/documentation/2.2.1/SettingsLogger
I have also commented out all logging-specific settings in application.conf.
Yet, when trying to log something with this code from within one of my controllers:
import play.api.Logger
...
play.api.Logger.info("hello")
my logs in logs/application.log look like this:
2014-01-09 19:06:25,536 - [INFO] - from application in play-akka.actor.default-dispatcher-5
hello
So, apparently the formatting is ignored for my "hello" log entry.
I would have expected something like this:
2014-01-09 19:06:25,536 - [INFO] - from application in play-akka.actor.default-dispatcher-5
2014-01-09 19:06:25,536 - [INFO] - from application in play-akka.actor.default-dispatcher-5 hello
What am I missing?

If you're using the sample configuration from the documentation, then it seems to me that it's outputting exactly what the pattern specifies.
%date - [%level] - from %logger in %thread %n%message%n%xException%n
The first bit:
%date - [%level] - from %logger in %thread
> 2014-01-09 19:06:25,536 - [INFO] - from application in play-akka.actor.default-dispatcher-5
Followed by a new line:
%n
Followed by the message, exception, and another new line:
%message%n%xException%n
> hello
>
To get what you're expecting you would need to repeat that first bit of the pattern between the first %n and %message.
%date - [%level] - from %logger in %thread %n%date - [%level] - from %logger in %thread %message%n%xException%n
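For reference, here is roughly how that doubled pattern might sit in conf/application-logger.xml (a sketch based on the file appender shown in the Play docs; adjust the appender name and file path to your setup):
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <file>logs/application.log</file>
  <encoder>
    <pattern>%date - [%level] - from %logger in %thread %n%date - [%level] - from %logger in %thread %message%n%xException%n</pattern>
  </encoder>
</appender>
That said, if the real goal is one self-contained line per entry, a single-line pattern like %date - [%level] - from %logger in %thread - %message%n%xException%n is probably simpler.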

Related

Spark custom log4j integration for Scala application

I am new to Spark and Scala as well.
The problem is that I am not able to debug my application.
I have developed a Spark application in Scala using Maven, but I am not able to find the logs: per the log4j properties a log file should be generated, but no file shows up at the given path.
What specific changes do I need to make to get that log file?
I am testing my application on Hortonworks.
Command for submitting the app:
bin/spark-submit --master yarn-cluster --class com.examples.MainExample lib/Test.jar
The log4j.properties file is kept in the src/resources folder. Please find it below:
log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingAppender.File=/var/log/spark.log
log4j.appender.RollingAppender.DatePattern='.'yyyy-MM-dd
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n
# By default, everything goes to console and file
log4j.rootLogger=INFO, myConsoleAppender, RollingAppender
# The noisier spark logs go to file only
log4j.logger.spark.storage=INFO, RollingAppender
log4j.additivity.spark.storage=false
log4j.logger.spark.scheduler=INFO, RollingAppender
log4j.additivity.spark.scheduler=false
log4j.logger.spark.CacheTracker=INFO, RollingAppender
log4j.additivity.spark.CacheTracker=false
log4j.logger.spark.CacheTrackerActor=INFO, RollingAppender
log4j.additivity.spark.CacheTrackerActor=false
log4j.logger.spark.MapOutputTrackerActor=INFO, RollingAppender
log4j.additivity.spark.MapOutputTrackerActor=false
log4j.logger.spark.MapOutputTracker=INFO, RollingAppender
log4j.additivity.spark.MapOutputTracker=false
I was not able to solve this via the application itself, but if you change log4j.properties in the conf folder as shown below, Spark will write the logs to the given file.
Make sure the path has write access.
# Send everything to the FILE appender
log4j.rootLogger=INFO, FILE
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=/tmp/sparkLog/SparkOut.log
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
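If editing the cluster's conf directory isn't an option, an alternative is to ship your own log4j.properties at submit time so that both the driver and the executors pick it up. This is a sketch (untested on Hortonworks), but --files and the extraJavaOptions settings are standard spark-submit options:
bin/spark-submit --master yarn-cluster \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --class com.examples.MainExample lib/Test.jar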
Try placing the log4j.properties inside 'src/main/resources', so that Maven puts it on the classpath.

Turn off logging of Solr in "sbt test"

I am using Solr in Scala. I have a test case that adds some documents into Solr core.
When running sbt test, the following information is shown repeatedly:
15/12/03 01:17:50 INFO LogUpdateProcessor: [test] webapp=null path=/update params={} {add=[(null)]} 0 2
In an attempt to suppress it, I put a log4j.properties with content:
.level=WARNING
org.apache.solr.core.level=WARNING
org.apache.solr.update.processor.level=WARNING
under both ${project_dir}/src/main/resources and ${project_dir}/src/test/resources
However, the log message is still there.
I am using:
Scala 2.11.5
solr-solrj 5.3.1
solr-core 5.3.1
sbt 0.1.0
The log4j.properties file is malformed: that is java.util.logging syntax, not log4j syntax.
The following content works:
log4j.rootLogger=WARN, stdout
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
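If you'd rather keep the root logger at INFO and silence only Solr, you can lower just its loggers instead; the package name here is taken from the log output above:
log4j.logger.org.apache.solr=WARN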

Filter log4j appender to only include messages from class

I am trying to create a log4j file appender that includes only messages for a specific class. Here is an example message:
2015-08-06 16:41:43,773 [pool-3-thread-8] INFO ACME.log- new test message
I want all log messages for ACME.log to go to a specific appender. Here is the configuration I currently have in my properties file, which is not working:
log4j.appender.myLog = org.apache.log4j.DailyRollingFileAppender
log4j.appender.myLog.File = /opt/netiq/idm/apps/tomcat/logs/acme.log
log4j.appender.myLog.layout = org.apache.log4j.PatternLayout
log4j.appender.myLog.layout.ConversionPattern = %d{yyyy-MM-dd HH:mm:ss} %-5p %m%n
log4j.appender.myLog.filter.filter.ID=ACME.log
log4j.appender.myLog.filter.ID.levelMin=INFO
log4j.appender.myLog.filter.ID.levelMax=ERROR
log4j.appender.myLog.filter.2 = org.apache.log4j.varia.DenyAllFilter
If you're using log4j 1.x, it is strongly recommended to use org.apache.log4j.rolling.RollingFileAppender instead of org.apache.log4j.DailyRollingFileAppender, which may lose messages (see Bug 43374).
So the configuration of your appender can be:
log4j.appender.myLog=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.myLog.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.myLog.rollingPolicy.fileNamePattern=/path/acme%d{yyyy-MM-dd}.log
log4j.appender.myLog.layout=org.apache.log4j.PatternLayout
log4j.appender.myLog.layout.ConversionPattern=%d %-5p (%c.java:%L).%M - %m%n
log4j.appender.myLog.filter.A=org.apache.log4j.varia.LevelRangeFilter
log4j.appender.myLog.filter.A.LevelMin=INFO
log4j.appender.myLog.filter.A.LevelMax=ERROR
log4j.appender.myLog.filter.A.AcceptOnMatch=true
If you only want the log of a class (e.g. com.company.MyClass), you specify it as follows:
log4j.logger.com.company.MyClass=TRACE, myLog
Notes
To use the org.apache.log4j.rolling package, you need to add the companion jar (apache-log4j-extras-1.2.17.jar).
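Optionally, if com.company.MyClass should write only to myLog and not also bubble up to the root logger's appenders, disable additivity for it:
log4j.additivity.com.company.MyClass=false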

How do I suppress the bloat of useless information when using the DUMP command while using grunt via 'pig -x local'?

I'm working with Pig Latin, using grunt, and every time I dump stuff, my console gets clobbered with blah blah blah non-info. Is there a way to suppress all that?
grunt> A = LOAD 'testingData' USING PigStorage(':'); dump A;
2013-05-06 19:42:04,146 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2013-05-06 19:42:04,147 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
...
...
--- another like 50 lines of useless context clobbering junk here... till ---
...
...
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Success!
and now, the like 4 lines of info I'm actually looking for:
(daemon,*,1,1,System Services,/var/root,/usr/bin/false)
(uucp,*,,,/var/spool/uucp,/usr/sbin/uucico)
(taskgated,*,13,13,Task Gate Daemon,/var/empty,/usr/bin/false)
(networkd,*,24,24,Network Services,/var/empty,/usr/bin/false)
(installassistant,*,25,25,/usr/bin/false)
grunt>
---> Obviously if it errors, fine, lots of info is helpful, but not when it basically works great.
You need to set the log4j properties.
For example, in $PIG_HOME/conf/pig.properties, enable the log4j configuration file by uncommenting:
log4jconf=./conf/log4j.properties
Then rename log4j.properties.template to log4j.properties, and in that file change the Pig logger's level from info to error:
log4j.logger.org.apache.pig=error, A
You may also set the Hadoop-related logging level the same way:
log4j.logger.org.apache.hadoop=error, A
An easy way to do this seems to be to redirect standard error as below.
But it will suppress all errors.
pig -x local 2> /dev/null
I also found that if you remove or rename your Hadoop install directory, making it inaccessible to Pig, then all those INFO messages go away. Changing logging levels in Hadoop didn't help, just so you know.
When you start pig, pass it a log4j.properties file with pig -4 <filename>.
In my case there was a log4j.properties in the conf directory, and setting the level of the logger named org.apache.pig to ERROR was sufficient to make the output less verbose:
log4j.logger.org.apache.pig=ERROR, A
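For reference, a minimal file to pass with -4 might look like the following (a sketch; the appender name A mirrors the lines above):
log4j.rootLogger=ERROR, A
log4j.appender.A=org.apache.log4j.ConsoleAppender
log4j.appender.A.layout=org.apache.log4j.PatternLayout
log4j.appender.A.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
log4j.logger.org.apache.pig=ERROR, A
log4j.additivity.org.apache.pig=false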
Pig's own logging level is set via the debug entry in the pig.properties file:
# Logging level. debug=OFF|ERROR|WARN|INFO|DEBUG (default: INFO)
#
# debug=INFO
That default of INFO is the reason you get large logs on the console; change it to ERROR, i.e. debug=ERROR.

How to use Plack::Middleware::CSRFBlock with Dancer?

I want to protect all forms from CSRF with Dancer.
I tried using Plack::Middleware::CSRFBlock, but it failed with the error "CSRFBlock needs Session." Even when I use Plack::Session, the forms don't get a hidden input field containing the one-time token.
Is there a good practice for doing this? Any advice is much appreciated.
my environment/development.yml is:
# configuration file for development environment
# the logger engine to use
# console: log messages to STDOUT (your console where you started the
# application server)
# file: log message to a file in log/
logger: "console"
# the log level for this environment
# core is the lowest, it shows Dancer's core log messages as well as yours
# (debug, info, warning and error)
log: "core"
# should Dancer consider warnings as critical errors?
warnings: 1
# should Dancer show a stacktrace when an error is caught?
show_errors: 1
# auto_reload is a development and experimental feature
# you should enable it by yourself if you want it
# Module::Refresh is needed
#
# Be aware it's unstable and may cause a memory leak.
# DO NOT EVER USE THIS FEATURE IN PRODUCTION
# OR TINY KITTENS SHALL DIE WITH LOTS OF SUFFERING
auto_reload: 0
session: Simple
#session: YAML
plack_middlewares:
  -
    #- Session
    - CSRFBlock
    - Debug
    - panels
    -
      - Parameters
      - Dancer::Version
      - Dancer::Settings
      - Memory
and the route is:
get '/test' => sub {
    return <<EOM;
<!DOCTYPE html>
<html>
<head><title>test route</title></head>
<body>
<form action="./foobar" method="post">
<input type="text"/>
<input type="submit"/>
</form>
</body>
</html>
EOM
};
Well, I noticed the Debug panel isn't shown, meaning Plack::Middleware::Debug isn't loaded.
With help from How to use Dancer with Plack middlewares | PerlDancer Advent Calendar and Plack::Middleware::Debug::Dancer::Version, I managed to turn it on:
session: PSGI
## Dancer::Session::PSGI
plack_middlewares:
  -
    - Session
  -
    - CSRFBlock
  -
    - Debug
    ## panels is an argument for Debug, as in
    ## enable 'Debug', panels => [ qw( Parameters Response Environment Session Timer Dancer::Logger Dancer::Settings Dancer::Version ) ];
    - panels
    -
      - Parameters
      - Response
      - Environment
      - Session
      - Timer
      - Dancer::Logger
      - Dancer::Settings
      - Dancer::Version
      #Plack::Middleware::Debug::Dancer::Version
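As far as I can tell, the changes that matter for CSRFBlock itself are that the Session middleware is now enabled and listed before CSRFBlock (satisfying its "CSRFBlock needs Session" requirement), that session: PSGI lets Dancer and the Plack middlewares share the same session, and that each middleware now sits in its own sub-list. With that in place, CSRFBlock should be able to inject its hidden token field into the forms.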