How to redirect logging in akka? - scala

I am implementing a distributed database with Scala 2.9 and Akka 2.0. My current problem is that I want to redirect the standard logging to a file instead of stdout. I don't really want to use SLF4J or SLF4S. Is there a simple way to redirect the logging output?

The Akka documentation for logging says that you can register handlers in the config like this:
akka {
  # Event handlers to register at boot time (Logging$DefaultLogger logs to STDOUT)
  event-handlers = ["akka.event.Logging$DefaultLogger"]
  # Options: ERROR, WARNING, INFO, DEBUG
  loglevel = "DEBUG"
}
There is also an SLF4J handler
akka.event.slf4j.Slf4jEventHandler
Using this, you can add any SLF4J-compliant library, like logback, to write your logs wherever you want.
edit:
To use logback to log to a file, you have to add logback as a dependency and add the Slf4jEventHandler to your config:
akka {
  # Event handlers to register at boot time (Logging$DefaultLogger logs to STDOUT)
  event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]
  # Options: ERROR, WARNING, INFO, DEBUG
  loglevel = "DEBUG"
}
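The dependency side of that, as a minimal sbt sketch (version numbers are illustrative for the Akka 2.0 / Scala 2.9 setup in the question):

libraryDependencies ++= Seq(
  "com.typesafe.akka" % "akka-slf4j"      % "2.0.5",  // provides Slf4jEventHandler; version illustrative
  "ch.qos.logback"    % "logback-classic" % "1.0.13"  // SLF4J backend that does the file writing
)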
Then add a logback config to your project that looks something like this (taken from the logback docs):
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>testFile.log</file>
    <append>true</append>
    <!-- encoders are assigned the type
         ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
    <encoder>
      <pattern>%-4relative [%thread] %-5level %logger{35} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="FILE" />
  </root>
</configuration>
Due to Akka's asynchronous logging you cannot use the %thread variable in your log pattern; instead, use the sourceThread variable from the MDC. You can read about that at the bottom of this page: http://doc.akka.io/docs/akka/2.0/scala/logging.html
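For example, the pattern from the config above could be adapted like this (only the thread part changes):

<pattern>%-4relative [%X{sourceThread}] %-5level %logger{35} - %msg%n</pattern>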
You don't have to explicitly use slf4j or logback in your code; just use the normal Akka logging, and the handler will take care of everything else.
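In an actor that means something like this minimal sketch (MyActor is a hypothetical example; the registered handler decides where the output ends up):

import akka.actor.{Actor, ActorLogging}

// Hypothetical actor: log calls go to the event bus, and the registered
// Slf4jEventHandler hands them to logback, which writes the file.
class MyActor extends Actor with ActorLogging {
  def receive = {
    case msg => log.info("received: {}", msg)
  }
}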

Related

Timeout with KafkaAppender in log4j?

When I use the KafkaAppender of log4j, I have a problem when I configure a single broker and that broker is stopped: the KafkaAppender waits for a very long time before failing. I use syncSend=false. I want to set some timeout so the appender wouldn't wait for such a long time.
Could you tell me how I need to configure the KafkaAppender in order to prevent this wait?
There is no timeout setting on the KafkaAppender itself, but there are a few timeout options that can be configured on the KafkaProducer. The options are described in the Kafka documentation.
Here is an example Kafka appender configuration with two Kafka producer timeout settings at their default values:
<Appenders>
  <Kafka name="Kafka" topic="log-test">
    <PatternLayout pattern="%date %message"/>
    <Property name="bootstrap.servers">localhost:9092</Property>
    <Property name="request.timeout.ms">30000</Property><!-- 30 seconds -->
    <Property name="transaction.timeout.ms">60000</Property><!-- 1 minute -->
  </Kafka>
</Appenders>
You might want to play with those to get the expected behaviour.
Also, remember that the syncSend option was only added in log4j 2.8; if you use an older version, it will have no effect.
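For reference, on log4j 2.8+ syncSend is set as an attribute on the appender itself; a sketch based on the config above:

<Kafka name="Kafka" topic="log-test" syncSend="false">
  <PatternLayout pattern="%date %message"/>
  <Property name="bootstrap.servers">localhost:9092</Property>
</Kafka>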

Recursive logging from KafkaAppender on level INFO

The documentation says that, to avoid recursive logging, the org.apache.kafka logger must not be allowed to log at DEBUG level:
<?xml version="1.0" encoding="UTF-8"?>
...
<Loggers>
  <Root level="DEBUG">
    <AppenderRef ref="Kafka"/>
  </Root>
  <Logger name="org.apache.kafka" level="INFO" /> <!-- avoid recursive logging -->
</Loggers>
However, in this case org.apache.kafka.clients.Metadata logs the ClusterID at level INFO, which leads to recursive logging and this WARN log:
kafka-producer-network-thread | producer-1 WARN Recursive logging from [org.apache.kafka.clients.Metadata] for appender [KafkaAppender1].
Does this mean that I need to set the org.apache.kafka logger level to WARN?
What the error means is that you should configure your org.apache.kafka logger to send log events to some other appender. All the logging level does is prevent log events from being logged at all. If you want to just prevent ALL Kafka log events, then set its logger level to OFF.
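A sketch of what that could look like, assuming a Console appender is available to receive Kafka's own events (appender names are illustrative):

<Appenders>
  <Console name="Console">
    <PatternLayout pattern="%date %level %logger - %message%n"/>
  </Console>
  <Kafka name="Kafka" topic="log-test">
    <PatternLayout pattern="%date %message"/>
    <Property name="bootstrap.servers">localhost:9092</Property>
  </Kafka>
</Appenders>
<Loggers>
  <!-- Kafka's internal events go to the console only; additivity="false"
       keeps them away from the root logger's KafkaAppender -->
  <Logger name="org.apache.kafka" level="INFO" additivity="false">
    <AppenderRef ref="Console"/>
  </Logger>
  <Root level="DEBUG">
    <AppenderRef ref="Kafka"/>
  </Root>
</Loggers>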

Logging Configuration Not Working for Akka

I am having problems with my logging configuration for Akka: the DEBUG messages are not being hidden from STDOUT. In my console I see this:
12:45:27.790 [example-akka.kafka.default-dispatcher-18] DEBUG org.apache.kafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-1, groupId=group1] Resuming partitions [test-topic-0]
12:45:27.823 [example-akka.kafka.default-dispatcher-18] DEBUG org.apache.kafka.clients.FetchSessionHandler - [Consumer clientId=consumer-1, groupId=group1] Node 1001 sent an incremental fetch response for session 1829476633 with 0 response partition(s), 1 implied partition(s)
I need to stop seeing the DEBUG messages. My logging configuration looks like this:
akka {
  # Loggers to register at boot time (akka.event.Logging$DefaultLogger logs
  # to STDOUT)
  loggers = ["akka.event.slf4j.Slf4jLogger"]

  # Log level used by the configured loggers (see "loggers") as soon
  # as they have been started; before that, see "stdout-loglevel"
  # Options: OFF, ERROR, WARNING, INFO, DEBUG
  loglevel = "INFO"

  # Log level for the very basic logger activated during ActorSystem startup.
  # This logger prints the log messages to stdout (System.out).
  # Options: OFF, ERROR, WARNING, INFO, DEBUG
  stdout-loglevel = "INFO"
}
And in my logback.xml I have this:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date{ISO8601} level=[%level] marker=[%marker] logger=[%logger] akkaSource=[%X{akkaSource}]
        sourceActorSystem=[%X{sourceActorSystem}] sourceThread=[%X{sourceThread}] mdc=[ticket-#%X{ticketNumber}:
        %X{ticketDesc}] - msg=[%msg]%n----%n
      </pattern>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
Why are the DEBUG messages not being stopped?
My project structure:
src -> main -> scala (my Scala source files are here)
src -> main -> scala -> resources (my logback.xml and application.conf are here)
Your sbt project structure is incorrect. This should be the correct structure:
src/
  main/
    resources/
      <files to include in main jar here>
    scala/
      <main Scala sources>
    java/
      <main Java sources>
  test/
    resources/
      <files to include in test jar here>
    scala/
      <test Scala sources>
    java/
      <test Java sources>
Then you also need to follow @Mario's advice of adjusting the logging at the package level.
Log level can be configured at package level, for example, adding the following to logback.xml
<logger name="org.apache.kafka" level="INFO"/>
sets the log level to INFO for all the components inside org.apache.kafka package, which should stop DEBUG messages.
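Putting the two fixes together, the logback.xml from the question would gain one line (the pattern is shortened here for brevity):

<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date{ISO8601} [%level] logger=[%logger] - msg=[%msg]%n</pattern>
    </encoder>
  </appender>
  <!-- package-level override: Kafka clients log at INFO and above only -->
  <logger name="org.apache.kafka" level="INFO"/>
  <root level="info">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>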

Akka SLF4J and logback in Scala

I am trying to set up some basic logging for my Akka actor system, but so far I am only getting the standard logs and none of my added logs or an output file. I have followed along with the Akka docs for logging and have set up the following:
I added these dependencies to the build.sbt file
"com.typesafe.akka" %% "akka-slf4j" % "2.3.14"
"ch.qos.logback" % "logback-classic" % "1.0.9"
I added this to the application.conf file
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
}
logback.xml is in src/main/resources
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <File>./logs/akka.log</File>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%-5level] %msg%n</pattern>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="FILE" />
  </root>
</configuration>
This is what I'm hoping is supposed to do the logging:
import akka.event.Logging
val log = Logging(context.system, classOf[TickActor])
log.info("Good Luck!")
I do not receive any messages about the failure from the standard logging, and I haven't been able to find additional solutions much different from what I already have. I have tried the suggestions in this question; it seemed to be the same issue I'm having, but the suggestions did not work. Have I missed a step or configured something wrong?
Everything looks correct except for the missing akka.logging-filter setting.
Here is how it should look:
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
}
Here is a project with the same setup that has logging working: application.conf and logback.xml.
Explanation from the docs:
You need to enable the Slf4jLogger in the loggers element in the Configuration. Here you can also define the log level of the event bus. More fine grained log levels can be defined in the configuration of the SLF4J backend (e.g. logback.xml). You should also define akka.event.slf4j.Slf4jLoggingFilter in the logging-filter configuration property. It will filter the log events using the backend configuration (e.g. logback.xml) before they are published to the event bus.
and
Warning! If you set the loglevel to a higher level than "DEBUG", any DEBUG events will be filtered out already at the source and will never reach the logging backend, regardless of how the backend is configured.
which you took care of already.

Changing appenders programmatically

I want to get an appender and apply it to different loggers. I have an appender defined in my logback.xml. Is there a way to get this appender, change the file location, and apply it to a logger?
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <file>logg.log</file>
  <encoder>
    <pattern>%msg%n</pattern>
  </encoder>
</appender>
This is how I am adding a new FileAppender and applying it to a specific logger. I need a way to do this for an existing appender.
import ch.qos.logback.classic.spi.ILoggingEvent
import ch.qos.logback.core.FileAppender
import org.slf4j.LoggerFactory
val fileAppender = new FileAppender[ILoggingEvent]()
fileAppender.setFile("/location/logg.log")
// addAppender is Logback-specific, so the SLF4J logger has to be cast
val logger = LoggerFactory.getLogger("FOO.Class").asInstanceOf[ch.qos.logback.classic.Logger]
logger.addAppender(fileAppender)
Can you not just do this (note - untested):
val logger = LoggerFactory.getLogger("FOO.Class").asInstanceOf[ch.qos.logback.classic.Logger]
val appender = logger.getAppender("APPENDER_NAME_YOU_WANT_TO_GET")
logger.addAppender(appender)
See: Using getAppender() in Logback
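For a fuller picture, here is an untested sketch that assumes the FILE appender from logback.xml is attached to the root logger (which is where appenders declared in the config are usually referenced):

import ch.qos.logback.classic.Logger
import ch.qos.logback.classic.spi.ILoggingEvent
import ch.qos.logback.core.FileAppender
import org.slf4j.LoggerFactory

// Fetch the appender from the logger it is attached to (assumed: root)
val root = LoggerFactory.getLogger(org.slf4j.Logger.ROOT_LOGGER_NAME)
  .asInstanceOf[Logger]
val file = root.getAppender("FILE").asInstanceOf[FileAppender[ILoggingEvent]]

// FileAppender only picks up a new location across a stop/start cycle
file.stop()
file.setFile("/new/location/logg.log")
file.start()

// Attach the same appender instance to another logger
LoggerFactory.getLogger("FOO.Class").asInstanceOf[Logger].addAppender(file)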