Logging Configuration Not Working for Akka - scala

I am having problems with my logging configuration for Akka. The DEBUG messages are not being hidden from STDOUT. In my console I see this:
12:45:27.790 [example-akka.kafka.default-dispatcher-18] DEBUG org.apache.kafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-1, groupId=group1] Resuming partitions [test-topic-0]
12:45:27.823 [example-akka.kafka.default-dispatcher-18] DEBUG org.apache.kafka.clients.FetchSessionHandler - [Consumer clientId=consumer-1, groupId=group1] Node 1001 sent an incremental fetch response for session 1829476633 with 0 response partition(s), 1 implied partition(s)
I need to stop seeing the DEBUG messages. My logging configuration looks like this:
akka {
  # Loggers to register at boot time (akka.event.Logging$DefaultLogger logs
  # to STDOUT)
  loggers = ["akka.event.slf4j.Slf4jLogger"]

  # Log level used by the configured loggers (see "loggers") as soon
  # as they have been started; before that, see "stdout-loglevel"
  # Options: OFF, ERROR, WARNING, INFO, DEBUG
  loglevel = "INFO"

  # Log level for the very basic logger activated during ActorSystem startup.
  # This logger prints the log messages to stdout (System.out).
  # Options: OFF, ERROR, WARNING, INFO, DEBUG
  stdout-loglevel = "INFO"
}
And in my logback.xml I have this:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date{ISO8601} level=[%level] marker=[%marker] logger=[%logger] akkaSource=[%X{akkaSource}]
        sourceActorSystem=[%X{sourceActorSystem}] sourceThread=[%X{sourceThread}] mdc=[ticket-#%X{ticketNumber}:
        %X{ticketDesc}] - msg=[%msg]%n----%n
      </pattern>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
Why are the DEBUG messages not being suppressed?
My project structure:
src -> main -> scala (my Scala source files are here)
src -> main -> scala -> resources (my logback.xml and application.conf are here)

Your sbt project structure is incorrect: resources must sit directly under src/main, not under src/main/scala. This is the correct structure:
src/
  main/
    resources/
      <files to include in main jar here>
    scala/
      <main Scala sources>
    java/
      <main Java sources>
  test/
    resources/
      <files to include in test jar here>
    scala/
      <test Scala sources>
    java/
      <test Java sources>
You also need to follow @Mario's advice and adjust the logging level at the package level.

The log level can be configured at the package level. For example, adding the following to logback.xml
<logger name="org.apache.kafka" level="INFO"/>
sets the log level to INFO for all components inside the org.apache.kafka package, which should stop the DEBUG messages.
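For context, here is a minimal sketch of how that logger element slots into the logback.xml from the question (the pattern is shortened for readability; everything else mirrors the original config):
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date{ISO8601} level=[%level] logger=[%logger] - msg=[%msg]%n</pattern>
    </encoder>
  </appender>
  <!-- Package-level override: Kafka client internals only log at INFO and above -->
  <logger name="org.apache.kafka" level="INFO"/>
  <root level="info">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>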

Related

Timeout with KafkaAppender in log4j?

When I use the KafkaAppender of log4j, I have a problem when I point it at a single broker that is stopped: the KafkaAppender waits for a very long time before failing. I use syncSend=false. I want to set a timeout so the appender doesn't wait for such a long time.
Could you tell me how to configure the KafkaAppender to prevent this wait?
There is no timeout setting on the KafkaAppender itself, but there are a few timeout options that can be configured on the KafkaProducer. The options are described in the Kafka documentation.
Here is an example Kafka appender configuration with two Kafka producer timeout settings shown at their default values:
<Appenders>
  <Kafka name="Kafka" topic="log-test">
    <PatternLayout pattern="%date %message"/>
    <Property name="bootstrap.servers">localhost:9092</Property>
    <Property name="request.timeout.ms">30000</Property><!-- 30 seconds -->
    <Property name="transaction.timeout.ms">60000</Property><!-- 1 minute -->
  </Kafka>
</Appenders>
You may need to experiment with those to get the expected behaviour.
Also, note that the syncSend option was only added in log4j 2.8; on older versions it has no effect.
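For reference, here is a sketch of where syncSend goes; in log4j 2.8+ it is an attribute on the Kafka element itself (topic and broker address reused from the example above):
<!-- syncSend="false" makes the appender return immediately instead of blocking on the send -->
<Kafka name="Kafka" topic="log-test" syncSend="false">
  <PatternLayout pattern="%date %message"/>
  <Property name="bootstrap.servers">localhost:9092</Property>
</Kafka>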

Recursive logging from KafkaAppender on level INFO

The documentation says that, to avoid recursive logging, you should not let org.apache.kafka log at DEBUG level:
<?xml version="1.0" encoding="UTF-8"?>
...
<Loggers>
  <Root level="DEBUG">
    <AppenderRef ref="Kafka"/>
  </Root>
  <Logger name="org.apache.kafka" level="INFO" /> <!-- avoid recursive logging -->
</Loggers>
However, in this case org.apache.kafka.clients.Metadata logs the cluster ID at level INFO, which leads to recursive logging and this WARN message:
kafka-producer-network-thread | producer-1 WARN Recursive logging from [org.apache.kafka.clients.Metadata] for appender [KafkaAppender1].
Does this mean I need to set the org.apache.kafka level to WARN?
What the warning means is that you should configure your org.apache.kafka logger to send its log events to some other appender. All the logging level does is prevent log events from being logged at all. If you want to suppress ALL Kafka log events, set its logger level to OFF.
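As an illustration, here is a minimal sketch that routes the Kafka client's own log events to a console appender, with additivity disabled so they never reach the Kafka appender (the Console appender and broker address are assumptions; the rest mirrors the question's config):
<Appenders>
  <Console name="Console" target="SYSTEM_OUT">
    <PatternLayout pattern="%d %p %c - %m%n"/>
  </Console>
  <Kafka name="KafkaAppender1" topic="log-test">
    <PatternLayout pattern="%date %message"/>
    <Property name="bootstrap.servers">localhost:9092</Property>
  </Kafka>
</Appenders>
<Loggers>
  <!-- additivity="false" stops these events from also flowing to the Root logger's Kafka appender -->
  <Logger name="org.apache.kafka" level="INFO" additivity="false">
    <AppenderRef ref="Console"/>
  </Logger>
  <Root level="DEBUG">
    <AppenderRef ref="KafkaAppender1"/>
  </Root>
</Loggers>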

Akka SLF4J and logback in Scala

I am trying to set up some basic logging for my Akka actor system, but so far I am only getting the standard logs: none of my added log lines and no output file. I have followed along with the Akka docs for logging and have set up the following:
I added these dependencies to the build.sbt file
"com.typesafe.akka" %% "akka-slf4j" % "2.3.14"
"ch.qos.logback" % "logback-classic" % "1.0.9"
I added this to the application.conf file
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
}
logback.xml is in src/main/resources
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <File>./logs/akka.log</File>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%-5level] %msg%n</pattern>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="FILE" />
  </root>
</configuration>
This is what I'm hoping will do the logging:
import akka.event.Logging
val log = Logging(context.system, classOf[TickActor])
log.info("Good Luck!")
I do not receive any failure messages from the standard logging, and I haven't been able to find solutions much different from what I already have. I tried the suggestions in this question; it seemed to be the same issue I'm having, but they did not work. Have I missed a step or configured something wrong?
Everything looks correct except for the missing akka.logging-filter setting.
Here is how it should look:
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
}
Here is a project with the same setup that has logging working: application.conf and logback.xml.
Explanation from the docs:
You need to enable the Slf4jLogger in the loggers element in the
Configuration. Here you can also define the log level of the event
bus. More fine grained log levels can be defined in the configuration
of the SLF4J backend (e.g. logback.xml). You should also define
akka.event.slf4j.Slf4jLoggingFilter in the logging-filter
configuration property. It will filter the log events using the
backend configuration (e.g. logback.xml) before they are published to
the event bus.
and
Warning! If you set the loglevel to a higher level than "DEBUG", any
DEBUG events will be filtered out already at the source and will never
reach the logging backend, regardless of how the backend is
configured.
which you took care of already.
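As a sketch of what those "more fine grained log levels" can look like once Slf4jLoggingFilter is in place, logback.xml can keep Akka's own chatter at INFO while your application package stays at DEBUG (the package name below is an assumption):
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <File>./logs/akka.log</File>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%-5level] %msg%n</pattern>
    </encoder>
  </appender>
  <!-- Akka internals at INFO; your own actors at DEBUG (package name is hypothetical) -->
  <logger name="akka" level="INFO"/>
  <logger name="com.example.myapp" level="DEBUG"/>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>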

Akka actor logging not writing to file

I'm attempting to log to a file rather than stdout.
My application.conf (in src/main/resources/):
akka {
  event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]
  loglevel = "DEBUG"
}
logback.xml (in src/main/resources/):
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>log/app.log</file>
    <append>true</append>
    <encoder>
      <pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="FILE"/>
  </root>
</configuration>
Creating the actor system:
val conf: Config = ConfigFactory.load()
val system = ActorSystem("Process", conf)
Finally, the actual logging:
class Processor() extends Actor with ActorLogging {
  def receive = {
    case Start =>
      log.info("Started")
  }
}
However, when running the app, I get the logging on stdout:
[info] Running com.imgzine.analytics.apps.ProcessEvents
[DEBUG] [06/02/2014 09:28:53.356] [run-main] [EventStream(akka://Process)] logger log1-Logging$DefaultLogger started
[DEBUG] [06/02/2014 09:28:53.358] [run-main] [EventStream(akka://Process)] Default Loggers started
[INFO] [06/02/2014 09:28:53.389] [Process-akka.actor.default-dispatcher-4] [akka://Process/user/processor] Started
[DEBUG] [06/02/2014 09:28:54.887] [Process-akka.actor.default-dispatcher-4] [EventStream] shutting down: StandardOutLogger started
And inside log/app.log, I find:
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Registering Scala Conversions.
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Deserializers for Scala Conversions registering
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Serializers for Scala Conversions registering
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up OptionSerializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaProductSerializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaCollectionSerializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaRegexSerializers
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Hooking up scala.util.matching.Regex serializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Reached base registration method on MongoConversionHelper.
Any ideas?
If you are using Akka 2.3.3 as stated in response to my comment, then your logging config is out of date (it uses the old event-handlers setup). Try replacing it with:
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
}
If that does not work let me know and I'll advise on another solution.

How to redirect logging in akka?

I am implementing a distributed database with Scala 2.9 and Akka 2.0. My current problem is that I want to redirect the standard logging to a file instead of stdout. I don't really want to use SLF4J or SLF4S. Is there a simple way to redirect the logging output?
The Akka documentation for logging says that you can register handlers in the config like this:
akka {
  # Event handlers to register at boot time (Logging$DefaultLogger logs to STDOUT)
  event-handlers = ["akka.event.Logging$DefaultLogger"]
  # Options: ERROR, WARNING, INFO, DEBUG
  loglevel = "DEBUG"
}
There is also an SLF4J handler
akka.event.slf4j.Slf4jEventHandler
Using this you can add any SLF4J compliant library like logback to write your logs to wherever you want.
edit:
To use logback to log to a file, you have to add logback as a dependency and add the Slf4jEventHandler to your config:
akka {
  # Event handlers to register at boot time (Logging$DefaultLogger logs to STDOUT)
  event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]
  # Options: ERROR, WARNING, INFO, DEBUG
  loglevel = "DEBUG"
}
and add a logback config to your project that looks something like this (taken from the logback docs):
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>testFile.log</file>
    <append>true</append>
    <!-- encoders are assigned the type
         ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
    <encoder>
      <pattern>%-4relative [%thread] %-5level %logger{35} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="FILE" />
  </root>
</configuration>
Due to the async logging in Akka you cannot use the %thread variable in your log pattern; instead, use the sourceThread variable from the MDC, as in the sketch below. You can read about that at the bottom of this page: http://doc.akka.io/docs/akka/2.0/scala/logging.html
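As a minimal sketch, the pattern above would change like this to report the originating thread via the MDC:
<!-- %X{sourceThread} pulls the calling thread name from the MDC that Akka populates -->
<pattern>%-4relative [%X{sourceThread}] %-5level %logger{35} - %msg%n</pattern>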
You don't have to explicitly use SLF4J or logback in your code; just use the normal Akka logging, and the handler will take care of everything else.