Duplicate events in Splunk with Kafka and log4j2 - apache-kafka

We are using log4j2 with the KafkaAppender to send logs to a topic that is consumed by Splunk.
The Kafka topic has 5 partitions and 24-hour retention. All of our services (~14) send log events to the same topic.
Below is our log4j2 configuration:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="debug">
<Appenders>
<Console name="consolelog" target="SYSTEM_OUT">
<PatternLayout pattern="" chatset="UTF-8"/>
</Console>
<Kafka name="kafka" topic="topic1" ignoreExceptions="false" syncSend="false">
<PatternLayout pattern="" chatset="UTF-8"/>
<Property name="bootstarp.servers">hostname:port</Property>
<Property name="security.protocol">SSL</Property>
<Property name="ssl.truststore.location">abc.jks</Property>
<Property name="ssl.truststore.password">abcd</Property>
<Property name="ssl.keystore.location">def.jks</Property>
<Property name="ssl.keystore.password">defg</Property>
<Property name="ssl.key.password">defg</Property>
</Kafka>
<Async name="Async">
<AppenderRef ref="kafka"/>
</Async>
<Async name="console-log">
<AppenderRef ref="consolelog"/>
</Async>
</Appenders>
<Loggers>
<AsyncLogger name="com.abc" level="debug" addivity="false">
<AppenderRef ref="Async"/>
</AsyncLogger>
<AsyncLogger name="org.springframework" level=debug"" addivity="false">
<AppenderRef ref="Async" level="ERROR"/>
<AppenderRef ref="console-log" level="ERROR"/>
</AsyncLogger>
<AsyncLogger name="com.def" level="warn" addivity="false">
<AppenderRef ref="Async"/>
</AsyncLogger>
<Root level="info">
<AppenderRef ref="Async"/>
</Root>
<Logger name="org.apache.kafka" level="WARN"/>
</Loggers>
</Configuration>
We are seeing duplicate events in Splunk.
The Splunk team has told us that they see the same event across different partitions, with different offsets.
Is it possible that the log4j configuration above is causing this?
How should I troubleshoot further? Has anyone faced such an issue with a similar setup, and what were the root cause and resolution in your case?
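One way to narrow this down (a sketch, not part of the original setup): stamp each event with the producing process id and Log4j2's per-event sequence number in the Kafka appender's layout. If two Splunk events share the same pid and sequence number, the message was produced once and duplicated downstream; if the sequence numbers differ, the application logged the event twice, e.g. because a named logger and the Root logger both routed it to the Kafka appender (two independent sends can land in different partitions). %pid (Log4j2 2.9+) and %sequenceNumber are built-in pattern converters; the surrounding pattern is illustrative.
<Kafka name="kafka" topic="topic1" ignoreExceptions="false" syncSend="false">
  <!-- %pid and %sequenceNumber identify the producing process and the event's
       position in its log stream; the rest of the pattern is an assumption -->
  <PatternLayout pattern="%d %pid %sequenceNumber [%t] %-5level %logger - %msg%n" charset="UTF-8"/>
  <!-- bootstrap.servers and SSL properties as in the configuration above -->
</Kafka>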

Related

Debug logs are not printing for application level logs in root logback.xml configuration

I'm facing an issue with logback on slf4j-api 2.0+ versions; it used to work earlier when I set the root logger level to "DEBUG".
After upgrading to 2.0+ (with logback-classic 1.4.x), I'm unable to print debug statements via the root logger.
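For context, slf4j-api 2.x discovers its backend via java.util.ServiceLoader instead of the old StaticLoggerBinder, so a mismatched api/backend pair on the classpath can silently change which provider (and hence which configuration) is picked up. A sketch of an aligned Maven pair; the exact versions are illustrative assumptions:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>2.0.7</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.4.8</version>
</dependency>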
logback.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="true" scan="true" scanPeriod="60 seconds">
<property scope="system" name="ff_logs" value="${firefly.logging.directory}" />
<include optional="true" file="/etc/firefly.temp/service-logback.xml"/>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<layout class="ch.qos.logback.classic.PatternLayout">
<Pattern>[%d{yyyy-MM-dd'T'HH:mm:ss.SSSZ}] [%thread] %-5level %logger - %msg%n</Pattern>
</layout>
</appender>
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${ff_logs}/${app.name}.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<fileNamePattern>${ff_logs}/${app.name}.log.%i.gz</fileNamePattern>
<minIndex>1</minIndex>
<maxIndex>20</maxIndex>
</rollingPolicy>
<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<maxFileSize>20MB</maxFileSize>
</triggeringPolicy>
<encoder>
<pattern>[%d{yyyy-MM-dd'T'HH:mm:ss.SSSZ}] [%thread] %-5level %logger - %msg%n</pattern>
</encoder>
</appender>
<logger name="com.tran.firefly" level="INFO"/>
<logger name="org.apache.activemq.transport" level="WARN"/>
<logger name="com.org.firefly.sessionmanager" level="INFO"/>
<logger name="com.org.apps" level="INFO"/>
<logger name="com.datastax.driver.core.Cluster" level="INFO"/>
<logger name="com.datastax.driver.core.Connection" level="INFO"/>
<logger name="com.datastax.driver.core.Session" level="INFO"/>
<logger name="com.datastax.driver.core.RequestHandler" level="INFO"/>
<logger name="org.apache.kafka.clients.consumer.ConsumerConfig" level="WARN"/>
<logger name="org.apache.kafka.clients.consumer.internals.Fetcher" level="WARN"/>
<logger name="org.apache.kafka.clients.NetworkClient" level="WARN"/>
<logger name="org.apache.kafka.clients.FetchSessionHandler" level="WARN"/>
<logger name="com.org.firefly.crud.initialization.module.PrepareModuleTask" level="WARN"/>
<logger name="io.netty" level="WARN"/>
<root level="DEBUG">
<!-- <appender-ref ref="STDOUT" /> Remove comment markers to log to STDOUT inside the container -->
<appender-ref ref="FILE" />
</root>
</configuration>
Surprisingly, getLogger with an explicit package name works, but not with getClass. I found an answer to a similar problem, but for Play logs; someone had a similar issue there.
val log: Logger = LoggerFactory.getLogger(getClass)
My question is: is there any workaround to get application-level logs when the level is set to DEBUG? Or is there an official deprecation announcement?
val log: Logger = LoggerFactory.getLogger("com.package.SomeQualified.CanonicalNameHere")
works, and I can see the output, but it is not a viable solution for me. Please advise.
I'm testing under the com.org.firefly.aaa.authorization package.
I have tried upgrading/downgrading slf4j versions, and tried static strings in place of getClass.
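A diagnostic worth trying (a sketch; the logger name is taken from the package mentioned above): declare the package under test explicitly in logback.xml. If the debug lines then appear, some ancestor logger, possibly one declared in the included /etc/firefly.temp/service-logback.xml, is capping the effective level below the root's DEBUG:
<!-- hypothetical troubleshooting entry, remove once diagnosed -->
<logger name="com.org.firefly.aaa.authorization" level="DEBUG"/>
Also note that in Scala, getClass inside a companion object returns a class name ending in $, so the logger name produced by getLogger(getClass) may not be the one you expect it to match.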

Gatling log files not getting generated for my logback.xml

<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
    </encoder>
    <immediateFlush>false</immediateFlush>
  </appender>
  <timestamp key="timestamp" datePattern="yyyy-MM-dd'T'HH:mm:ss"/>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/test_${timestamp}.log</file>
    <append>true</append>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
    </encoder>
  </appender>
  <!-- TRACE logs ALL HTTP requests and responses -->
  <logger name="io.gatling.http.ahc" level="TRACE" />
  <logger name="io.gatling.http.response" level="TRACE" />
  <!-- DEBUG logs ONLY FAILED HTTP requests and responses; uncomment to use -->
  <!-- <logger name="io.gatling.http.ahc" level="DEBUG" />-->
  <!-- <logger name="io.gatling.http.response" level="DEBUG" />-->
  <!-- <logger name="io.gatling.http.engine.response" level="DEBUG" />-->
  <root level="TRACE">
    <appender-ref ref="FILE"/>
    <!-- <appender-ref ref="CONSOLE"/>-->
  </root>
</configuration>
Only simulation.log and the Gatling report get generated, but per the logback config it should also create logs/test_${timestamp}.log. Can anyone help with this?
Are you sure the log file is generated where you expect it? Have you tried setting an absolute path?
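For example (a sketch; /tmp/gatling-logs is an assumed path, not from the original post):
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <!-- an absolute path rules out the process working directory being
       somewhere other than where you are looking for logs/ -->
  <file>/tmp/gatling-logs/test_${timestamp}.log</file>
  <append>true</append>
  <encoder>
    <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
  </encoder>
</appender>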

Kubernetes log location inside the pod

I have a Docker image for a Spring Boot app with the log configuration passed as --logging.config=/conf/logs/logback.xml; the logback file is shown below.
I am able to get the logs with
kubectl logs POD_NAME
but I am unable to find the log file when I log in to the pod. Is there a default location where the log file is placed, given that I haven't specified a logging location in the logback.xml file?
Logback file:
<?xml version="1.0" ?>
<configuration>
<property name="server.encoder.pattern"
value="%d{yyyy-MM-dd'T'HH:mm:ss.SSSZ} %-5level : loggerName="%logger{36}" threadName="%thread" txnId="%X{txnId}" %msg%n" />
<property name="metrics.encoder.pattern"
value="%d{yyyy-MM-dd'T'HH:mm:ss.SSSZ} %-5level : %msg%n" />
<!-- Enable LevelChangePropagator for jul-to-slf4j optimization -->
<contextListener class="ch.qos.logback.classic.jul.LevelChangePropagator" />
<appender name="METRICS" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>${metrics.encoder.pattern}</pattern>
</encoder>
</appender>
<logger name="appengAluminumMetricsLogger" additivity="false">
<appender-ref ref="METRICS" />
</logger>
<appender name="SERVER" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>${server.encoder.pattern}</pattern>
</encoder>
</appender>
<root level="INFO">
<appender-ref ref="SERVER" />
</root>
</configuration>
What you see from kubectl logs is the console output of your service. Only console output can be seen that way, and it comes through Docker's logging support.
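If you want a log file inside the pod, you have to configure one explicitly; there is no default file location when only console appenders are configured. A minimal sketch, assuming /var/log/app exists and is writable in the container (mount an emptyDir volume there if the image's filesystem is read-only):
<appender name="SERVER_FILE" class="ch.qos.logback.core.FileAppender">
  <!-- assumed path; align it with whatever volume the pod mounts -->
  <file>/var/log/app/server.log</file>
  <encoder>
    <pattern>${server.encoder.pattern}</pattern>
  </encoder>
</appender>
<root level="INFO">
  <appender-ref ref="SERVER" />
  <appender-ref ref="SERVER_FILE" />
</root>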

JBoss EAP 6.2 and Log4j2 stops writing logs after some time

I am using Log4j2 (RollingFile with routes) in my web application to write application-specific logs to a few separate log files. The log4j2.xml file is bundled within the WAR file.
Log files are created and logging works fine at first. After some time, it stops writing to the existing files and also fails to create new folders/files.
On restart everything resumes working, but again only for a while.
I have tried monitoring, but couldn't find any specific pattern or steps to reproduce it.
<Configuration status="error" name="logger">
<Properties>
<Property name="logpath">path_to_log_file</Property>
</Properties>
<Appenders>
<Routing name="RoutingUserLogFile">
<Routes pattern="$${ctx:user}/">
<Route>
<RollingFile name="UserLogFile" fileName="${logpath}/${ctx:user}/MyLogFile.log" filePattern="${logpath}/${ctx:user}/%d{dd-MM-yyyy}-MyLogFile-%i.log.gz">
<PatternLayout>
<Pattern>%d %p %-40C{1.} %m%n</Pattern>
</PatternLayout>
<Policies>
<TimeBasedTriggeringPolicy interval="1" modulate="true" />
<SizeBasedTriggeringPolicy size="4 MB" />
</Policies>
</RollingFile>
</Route>
</Routes>
</Routing>
</Appenders>
<Loggers>
<Root>
<level value="debug" />
<AppenderRef ref="RoutingUserLogFile" level="debug" />
</Root>
</Loggers>
</Configuration>
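One thing worth checking (an assumption based on the symptoms, not a confirmed diagnosis): by default the Routing appender keeps one RollingFile appender, and one open file handle, per distinct ${ctx:user} and never closes them. With enough distinct users the JVM can exhaust its file descriptors, which would match both "fails creating new folders/files" and "restart fixes it temporarily". Log4j2's IdlePurgePolicy closes routes that have gone idle:
<Routing name="RoutingUserLogFile">
  <Routes pattern="$${ctx:user}/">
    <!-- Route and RollingFile exactly as in the configuration above -->
  </Routes>
  <!-- close a user's appender after 30 idle minutes; the values are illustrative -->
  <IdlePurgePolicy timeToLive="30" timeUnit="minutes"/>
</Routing>
Watching the process's open-descriptor count over time (for example with lsof) would confirm or rule this out.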

Chainsaw v2 SocketReceiver not working with log4j2 SocketAppender

I'm trying to use Chainsaw v2 from http://people.apache.org/~sdeboy.
I don't want to use zero configuration, just a simple SocketAppender/SocketReceiver combo.
I'm using log4j2 with the following configuration:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" >
<Appenders>
<Console name="CONSOLE" target="SYSTEM_OUT">
<PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n" />
</Console>
<Socket name="SharathZeroConf" host="localhost" port="4445">
</Socket>
</Appenders>
<Loggers>
<Root level="debug">
<AppenderRef ref="SharathZeroConf" />
<AppenderRef ref="CONSOLE" />
</Root>
</Loggers>
</Configuration>
In Chainsaw, I'm selecting the option "Receive events from network" with port 4445.
However, Chainsaw doesn't log anything.
I've verified that the appender configuration is correct on the log4j side by using the built-in socket server (note that TcpSocketServer is in log4j-core, not log4j-api, so both jars belong on the classpath):
java -cp ~/.m2/repository/org/apache/logging/log4j/log4j-api/2.0.2/log4j-api-2.0.2.jar:~/.m2/repository/org/apache/logging/log4j/log4j-core/2.0.2/log4j-core-2.0.2.jar org.apache.logging.log4j.core.net.server.TcpSocketServer 4445
So the bug must be on the Chainsaw side. Any pointers, @Scott?
You're right, I got the same issue. I just tried LogMX instead, and it works like a charm:
I only had to copy the Log4j JARs into LogMX's lib/ directory (i.e. log4j-api-2.xx.jar and log4j-core-2.xx.jar).