How to substitute method name and object name in logger statements in Scala

import org.slf4j.LoggerFactory

object add {
  private val LOGGER = LoggerFactory.getLogger(this.getClass)

  def addAll(): Unit = {
    LOGGER.info("Start addAll for add Object")
  }
}
In the logger statement I don't want to hard-code the addAll method and add object names. How can I substitute these in the logger statement while taking care of the performance overhead?

You will have to configure your logback file with something like the below:
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>INFO</level>
</filter>
<encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %class{36}.%M %L - %msg%n</pattern>
</encoder>
</appender>
Here %class outputs the caller's class name, %M the method name and %L the line number. You can have a look at the Logback documentation on pattern layouts.
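With that pattern in place, the class and method names come from the event's caller data rather than the message, so the Scala code needs no hard-coded names. One caveat on the performance question: the Logback docs warn that generating caller data for %class, %M and %L is slow because it inspects the call stack. A minimal sketch of the logging side (the isInfoEnabled guard is an optional extra that only skips building the message when INFO is disabled):

import org.slf4j.LoggerFactory

object add {
  private val LOGGER = LoggerFactory.getLogger(this.getClass)

  def addAll(): Unit = {
    // with the pattern above this renders as, e.g.
    // "12:00:00.000 [main] INFO  add$.addAll 9 - Start"
    if (LOGGER.isInfoEnabled) {
      LOGGER.info("Start")
    }
  }
}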

Related

Override typesafe config logger for using in my logback

I have a problem with my Typesafe Config + logback + refined types + PureConfig setup: when my config fails validation, I get a raw exception without the logback appender formatting. How can I solve this?
For example, my logback config looks like:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <if condition='property("ENVIRONMENT").equalsIgnoreCase("production")'>
      <then>
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
          <providers>
            <timestamp/>
            <version/>
            <mdc/>
            <pattern>
              <pattern>
                {
                  "logger": "%logger",
                  "level": "%level",
                  "severity": "%level",
                  "thread": "%thread",
                  "application": "argos-scheduler",
                  "environment": "${ENVIRONMENT}",
                  "message": "%.-5000message"
                }
              </pattern>
            </pattern>
            <stackTrace>
              <throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
                <maxLength>2048</maxLength>
                <exclude>sun\.reflect\..*\.invoke.*</exclude>
                <exclude>net\.sf\.cglib\.proxy\.MethodProxy\.invoke</exclude>
              </throwableConverter>
            </stackTrace>
          </providers>
        </encoder>
      </then>
      <else>
        <encoder>
          <pattern>%d{HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg - mdc[ %X ]%n%ex%n</pattern>
        </encoder>
      </else>
    </if>
  </appender>
  <appender name="ASYNC_STDOUT" class="ch.qos.logback.classic.AsyncAppender">
    <appender-ref ref="STDOUT"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="ASYNC_STDOUT"/>
  </root>
</configuration>
and when I have an invalid config I get an unformatted error like:
[info] 18:39:29.220 INFO [main] kamon.prometheus.PrometheusReporter - Started the embedded HTTP server on http://0.0.0.0:9099 - mdc[ ]
[info] 18:39:29.548 INFO [Prometheus Reporter] kamon.prometheus.PrometheusReporter - Started the embedded HTTP server on http://0.0.0.0:9099 - mdc[ ]
[error] pureconfig.error.ConfigReaderException: Cannot convert configuration to a com.scheduler.SchedulerConfig. Failures are:
[error] at 'postgres.max-pool-size':
[error] - (jar:file:/tmp/sbt_9a1143b0/job-8/target/e4302064/14e353f3/scheduler_2.12-1.0.jar!/reference.conf:63) Expected type NUMBER. Found STRING instead.
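The exception is unformatted because it escapes main before any logger sees it, so logback never receives the event. A hedged sketch of one workaround, assuming PureConfig's non-throwing ConfigSource API (0.12+) and the question's own SchedulerConfig type:

import org.slf4j.LoggerFactory
import pureconfig._
import pureconfig.generic.auto._

object Main {
  private val logger = LoggerFactory.getLogger(this.getClass)

  def main(args: Array[String]): Unit =
    // load returns an Either instead of throwing, so validation failures
    // can be routed through the configured logback appenders
    ConfigSource.default.load[SchedulerConfig] match {
      case Right(config) => run(config)
      case Left(failures) =>
        logger.error(s"Invalid configuration:\n${failures.prettyPrint()}")
        sys.exit(1)
    }

  private def run(config: SchedulerConfig): Unit = ??? // start the scheduler here
}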

spray.io debugging directives

I am playing around with spray.io and I am not able to make the spray debugging directive logRequestResponse work; I don't see any output in the log.
val route: Route = {
  pathPrefix("city") {
    pathPrefix("v1") {
      path("transaction" / Segment / Segment) { (siteId: String, transactionId: String) =>
        post {
          authenticate(BasicAuth(UserPasswordAuthenticator _, realm = "bd cinema import api")) { user =>
            DebuggingDirectives.logRequestResponse("city-trans", Logging.InfoLevel) {
              val resp = "Hello"
              complete {
                resp
              }
            }
          }
        }
      }
    }
  }
}
Am I missing something here? Do I need to enable debugging globally somewhere in the spray configuration? I tried different places and none of them worked as expected.
Check that you have sensible values in your application.conf and logback.xml, as in the sample files in the Spray project.
Pay attention to akka.loglevel = INFO in application.conf:
akka {
  log-config-on-start = on
  loglevel = "INFO"
  actor.timeoutsecs = 2
  loggers = ["akka.event.slf4j.Slf4jLogger"]
}
A minimal logback.xml to display logs on stdout:
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <target>System.out</target>
    <encoder>
      <pattern>[%d{dd/MM/yyyy HH:mm:ss.SSS}] [%level] [%thread] %logger{36} - %msg %n</pattern>
      <!--<pattern>%X{akkaTimestamp} %-5level[%thread] %logger{0} - %msg%n</pattern>-->
    </encoder>
  </appender>
  <!-- <logger name="com.vegatic" level="DEBUG"/> -->
  <root level="DEBUG">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>
The usual suspects are logger name attributes that don't match the Scala namespaces, or levels that aren't verbose enough; examples of both are commented out in the config above for clarity.
Docs link for LoggingContext:
A LoggingAdapter that can always be supplied implicitly. If an implicit ActorSystem is in scope, the created LoggingContext forwards to the log of the system. If an implicit ActorContext is in scope, the created LoggingContext uses the context's ActorRef as a log source. Otherwise, i.e. if neither an ActorSystem nor an ActorContext is implicitly available, the created LoggingContext will forward to NoLogging, i.e. "/dev/null".

What's wrong with my logback.xml with Akka?

Why is my logback logging choking with Akka? If I leave the Akka config alone but delete my logback.xml file, it works without problems with whatever the defaults are. Is there a config error in my logback.xml file?
logback.xml
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="ERROR">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
application.conf
akka {
  loglevel = "ERROR"
  stdout-loglevel = "ERROR"
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  actor {
    provider = akka.remote.RemoteActorRefProvider
  }
  remote {
    enabled-transports = ["akka.remote.netty.tcp"]
  }
}
Build.scala (clip):
lazy val root = project.in(file("."))
  .settings(basicSettings: _*)
  .settings(libraryDependencies ++=
    dep_compile(typesafe_config, logback, akka_actor, akka_remote, akka_slf4j) ++
      dep_test(scalatest)
  )
When something attempts to log an error I get:
error while starting up loggers
akka.ConfigurationException: Logger specified in config can't be loaded [akka.event.slf4j.Slf4jLogger] due to [akka.event.Logging$LoggerInitializationException: Logger log1-Slf4jLogger did not respond with LoggerInitialized, sent instead [TIMEOUT]]
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:116)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:115)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
at scala.util.Try$.apply(Try.scala:161)
...
Caused by: akka.event.Logging$LoggerInitializationException: Logger log1-Slf4jLogger did not respond with LoggerInitialized, sent instead [TIMEOUT]
at akka.event.LoggingBus$class.akka$event$LoggingBus$$addLogger(Logging.scala:185)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$4.apply(Logging.scala:114)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$4.apply(Logging.scala:113)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
at scala.util.Try$.apply(Try.scala:161)
at scala.util.Success.map(Try.scala:206)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:113)
... 36 more
You are using the old logger class in your config. Here is how to fix it:
Akka (2.3.0) fails to load Slf4jEventHandler class with java.lang.ClassNotFoundException
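For context, the linked answer boils down to a rename between Akka versions (a sketch; check the migration guide for your exact version):

# Akka 2.2 and earlier (old; fails with ClassNotFoundException on 2.3+):
akka {
  event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]
}

# Akka 2.3+ (new):
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
}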

Apache Thrift server logs

I have a server which is implemented in Scala.
import org.apache.thrift.server.TServer
import org.apache.thrift.server.TThreadPoolServer
import org.apache.thrift.transport.TServerSocket
import org.apache.thrift.transport.TTransportException
import hms.config.demo.AppStoreConfig

object Server {
  def main(args: Array[String]): Unit = {
    start()
  }

  private def start(): Unit = {
    try {
      val serverTransport: TServerSocket = new TServerSocket(7911)
      val processor = new AppStoreConfig.Processor(new AppStoreConfigImpl)
      val server: TServer = new TThreadPoolServer(new TThreadPoolServer.Args(serverTransport).processor(processor))
      println("Starting server on port 7911 ...")
      server.serve()
    } catch {
      case e: TTransportException =>
        e.printStackTrace()
    }
  }
}
I need to get logs from the server, that is, how it handles requests and responds to clients, the way a Tomcat server does. Is there a way to achieve this with a Thrift server?
Thrift depends on slf4j, as you can see here under "depends on". Here is the documentation on how to configure slf4j: http://slf4j.org/faq.html. Basically, you start by creating a config file src/main/resources/logback.xml and putting something like this into it:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>/var/log/mysuperapp/supername.log</file>
    <append>true</append>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="org.apache.thrift.server" level="info" />
  <root level="info">
    <appender-ref ref="FILE" />
  </root>
</configuration>
Pick the appropriate level for the desired package, and add multiple <logger> lines for multiple packages. This requires the logback dependency/jar to be present on the classpath.
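With sbt, for example, the dependency could be added like this (the version is illustrative; pick a current one):

// build.sbt -- logback-classic provides the implementation and pulls in slf4j-api transitively
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.11"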

howto use the logback discriminator value to filter mongodb inserts

I have multiple threads generating log entries, and I use the logback SiftingAppender to know who did what. Everything works fine, and now I'm trying to save the log to MongoDB.
In MongoDB the log needs to be saved into an embedded document array: every user document has one embedded document that holds an array of embedded documents containing the log lines.
Since I've only just started learning logback, this involves some trial and error for now.
In the test logback.xml below I have the file, console and a custom appender. My idea was that I could catch the SiftingAppender discriminator value in the custom appender's append() method. getMDCPropertyMap() on the ILoggingEvent gives me the MDC values, but the question is whether this is an efficient technique for what I want to do.
I cannot see that logback has any native MDC-MongoDB appender, or maybe I have missed something.
<configuration>
  <appender name="SIFT-FILE" class="ch.qos.logback.classic.sift.SiftingAppender">
    <!-- in the absence of the class attribute, it is assumed that the
         desired discriminator type is
         ch.qos.logback.classic.sift.MDCBasedDiscriminator -->
    <discriminator>
      <key>userid</key>
      <defaultValue>unknown</defaultValue>
    </discriminator>
    <sift>
      <appender name="FILE-${userid}" class="ch.qos.logback.core.FileAppender">
        <file>${userid}.log</file>
        <append>true</append>
        <layout class="ch.qos.logback.classic.PatternLayout">
          <pattern>%d [%thread] %level %mdc %logger{35} - %msg%n</pattern>
        </layout>
      </appender>
    </sift>
  </appender>
  <appender name="SIFT-STDOUT" class="ch.qos.logback.classic.sift.SiftingAppender">
    <discriminator>
      <key>userid</key>
      <defaultValue>unknown</defaultValue>
    </discriminator>
    <sift>
      <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <!-- encoders are assigned the type
             ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
        <encoder>
          <pattern>%-4relative [%thread] %-5level %logger{35} ${userid}- %msg %n</pattern>
        </encoder>
      </appender>
    </sift>
  </appender>
  <appender name="SIFT-MONGO" class="ch.qos.logback.classic.sift.SiftingAppender">
    <discriminator>
      <key>userid</key>
      <defaultValue>unknown</defaultValue>
    </discriminator>
    <sift>
      <appender name="MONGO" class="com.carlsberg.MongoAppender"/>
    </sift>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="SIFT-FILE" />
    <appender-ref ref="SIFT-STDOUT" />
    <appender-ref ref="SIFT-MONGO" />
  </root>
</configuration>
I used the SiftingAppender discriminator value in my custom appender like this:
public void append(ILoggingEvent event) {
    Map<String, String> m = event.getMDCPropertyMap();
    String id = m.get("userid");
    if (id != null) {
        Query<UserLog> query1 = mongo.createQuery(UserLog.class);
        query1.field("lowerCaseUserName").equal(id.toLowerCase());
        UpdateOperations<UserLog> up2 = mongo.createUpdateOperations(UserLog.class)
            .add("log", event.getLevel().levelStr + " " + ft.format(event.getTimeStamp())
                + " " + event.getFormattedMessage(), true);
        UpdateResults<UserLog> udr1 = mongo.update(query1, up2);
        if (udr1.getError() != null) {
            System.out.print("ERROR CANNOT SAVE to UserLog ip address for " + id);
        }
    }
}
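As for efficiency: getMDCPropertyMap() is a plain map lookup, so reading the discriminator value is cheap; the expensive part is the per-event MongoDB update, which blocks the logging thread. One way to take that cost off the callers' threads is to wrap the sifting appender in logback's stock AsyncAppender, the same way ASYNC_STDOUT is used in an earlier question on this page (a sketch, not from the thread):

<appender name="ASYNC-MONGO" class="ch.qos.logback.classic.AsyncAppender">
  <!-- events are buffered in memory and a background worker performs the Mongo writes -->
  <queueSize>512</queueSize>
  <appender-ref ref="SIFT-MONGO"/>
</appender>
<root level="DEBUG">
  <appender-ref ref="SIFT-FILE" />
  <appender-ref ref="SIFT-STDOUT" />
  <appender-ref ref="ASYNC-MONGO" />
</root>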