logback not reloading logback.xml in WildFly 10

When I test my application locally (without WildFly), it works: I can edit logback.xml, change an appender, and see the changes in Lilith.
When I deploy the application on WildFly, it doesn't work. This is the logback.xml file:
<configuration debug="true" scan="true" scanPeriod="10 seconds">
    <appender name="lilith-commty" class="ch.qos.logback.classic.net.SocketAppender">
        <RemoteHost>192.168.56.1</RemoteHost>
        <Port>4560</Port>
        <ReconnectionDelay>170</ReconnectionDelay>
        <IncludeCallerData>true</IncludeCallerData>
    </appender>
    <root level="trace">
        <appender-ref ref="lilith-commty"/>
    </root>
    <logger name="org.mongodb" additivity="false"> </logger>
    <logger name="io.swagger" additivity="false"> </logger>
    <logger name="org.reflections" additivity="false"> </logger>
    <logger name="ma.glasnost" additivity="false"> </logger>
</configuration>
In WildFly's server.log I see these messages:
2017-05-08 08:50:58,929 INFO [stdout] (ServerService Thread Pool -- 100) 08:50:58,866 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [vfs:/content/commty.war/WEB-INF/classes/logback.xml]
2017-05-08 08:50:58,931 INFO [stdout] (ServerService Thread Pool -- 100) 08:50:58,931 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - Will scan for changes in [vfs:/content/commty.war/WEB-INF/classes/logback.xml]
2017-05-08 08:50:58,931 INFO [stdout] (ServerService Thread Pool -- 100) 08:50:58,931 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - Setting ReconfigureOnChangeTask scanning period to 10 seconds
But then logback prints this message in server.log:
2017-05-08 08:51:02,021 INFO [stdout] (logback-5) 08:51:02,021 |-INFO in ReconfigureOnChangeTask(born:1494228032019) - Empty watch file list. Disabling
It seems that logback isn't finding the logback.xml file, so it can't see the changes.
How can I fix that?
Thanks.
EDIT: It seems that the main problem is that logback doesn't know how to resolve the vfs: URL of logback.xml in order to reload it:
ch.qos.logback.core.joran.spi.ConfigurationWatchList#50d2b290 - URL [vfs:/content/commty.war/WEB-INF/classes/logback.xml] is not of type file
Any idea?
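A possible workaround (a sketch of my own, not from this thread): keep logback.xml on a plain filesystem path outside the deployment and point logback at it with the logback.configurationFile system property, so the watched URL is a regular file path instead of vfs:. For example, in standalone.xml (the path below is only an example):
<system-properties>
    <!-- assumption: logback.xml has been copied to a real filesystem location that logback can watch -->
    <system-property name="logback.configurationFile" value="/opt/wildfly/standalone/configuration/logback.xml"/>
</system-properties>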

Related

Logback XML not being picked up

I'm trying to implement Logback for an existing EAP 7.2 app:
JBoss EAP 7.2.8.GA (WildFly Core 6.0.27.Final-redhat-00001)
When I run gradle clean build, the expected log file is created in the right location and all of the test results are logged. But when I deploy the app, it doesn't use logback.xml at all and the logs aren't created; only server.log is active, because that's the default JBoss setup.
How do I set up logback so that the app uses it when deployed? I've checked the WAR, and logback.xml ends up under the proper WEB-INF/classes/.
EAR build.gradle
dependencies {
    implementation 'org.slf4j:slf4j-api:1.7.30'
    implementation 'ch.qos.logback:logback-classic:1.2.3'
    implementation 'ch.qos.logback:logback-core:1.2.3'
}
jboss exclusions
<exclusions>
    <!-- don't want to integrate with server logging yet -->
    <module name="org.jboss.logging"/>
    <module name="org.slf4j"/>
    <module name="org.slf4j.impl"/>
</exclusions>
server.log
2020-09-25 16:28:05,188 INFO [stdout] (QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager) 16:28:05.187 [QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager] DEBUG org.quartz.impl.jdbcjobstore.StdRowLockSemaphore - Lock 'STATE_ACCESS' given to: QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager
2020-09-25 16:28:05,188 INFO [stdout] (QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager) 16:28:05.188 [QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager] DEBUG org.quartz.impl.jdbcjobstore.StdRowLockSemaphore - Lock 'TRIGGER_ACCESS' is desired by: QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager
2020-09-25 16:28:05,189 INFO [stdout] (QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager) 16:28:05.188 [QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager] DEBUG org.quartz.impl.jdbcjobstore.StdRowLockSemaphore - Lock 'TRIGGER_ACCESS' is being obtained: QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager
2020-09-25 16:28:05,189 INFO [stdout] (QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager) 16:28:05.189 [QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager] DEBUG org.quartz.impl.jdbcjobstore.StdRowLockSemaphore - Lock 'TRIGGER_ACCESS' given to: QuartzScheduler_AppScheduler-<server>11601051265073_ClusterManager
logback.xml
<configuration debug="true" scan="true">
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${jboss.server.log.dir}/logs.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <FileNamePattern>${jboss.server.log.dir}/logs.log.%d{yyyy-MM-dd}.gz</FileNamePattern>
            <maxHistory>10</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>%date %level [username:%X{username}][%thread] %logger [%file:%line] %msg%n</pattern>
        </encoder>
    </appender>
    <logger name="org.quartz" level="INFO"/>
    <root level="ALL">
        <appender-ref ref="FILE"/>
        <appender-ref ref="STDOUT"/>
    </root>
</configuration>
You should exclude the logging subsystem in jboss-deployment-structure.xml and also set the org.jboss.logging.provider system property.
You can follow this Red Hat article on the steps to configure Logback with JBoss EAP 7.
Also have a look at the various options for configuring external logging.
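For illustration, a minimal sketch of such a jboss-deployment-structure.xml (placed in the WAR's WEB-INF/ or the EAR's META-INF/; the module list is an assumption based on the exclusions shown in the question):
<jboss-deployment-structure>
    <deployment>
        <!-- stop the server's logging subsystem from configuring this deployment's logging -->
        <exclude-subsystems>
            <subsystem name="logging"/>
        </exclude-subsystems>
        <!-- mirror the module exclusions from the question -->
        <exclusions>
            <module name="org.jboss.logging"/>
            <module name="org.slf4j"/>
            <module name="org.slf4j.impl"/>
        </exclusions>
    </deployment>
</jboss-deployment-structure>
Setting the provider property (e.g. -Dorg.jboss.logging.provider=slf4j in JAVA_OPTS) then routes JBoss Logging through SLF4J and on to Logback.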

cannot disable DEBUG level log from org.apache.kafka.common

I am using log4j 1.2.17 with the following configuration:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
log4j.logger.org.apache.kafka=OFF
With the configuration above, I expected that we would not see DEBUG-level logs from the Kafka 2.4.0 libraries we are using. However, I still see logs like the ones below. I have also tried log4j2 with the same properties file in my application, and the result is the same. How can we disable DEBUG-level logging from the Kafka client libraries?
06:59:40.995 [main] DEBUG org.apache.kafka.common.metrics.Metrics - Added sensor with name join-latency
06:59:40.995 [main] DEBUG org.apache.kafka.common.metrics.Metrics - Added sensor with name sync-latency
Create a logback.xml with the contents below (within the resources folder):
<configuration>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>[%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="info">
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
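The DEBUG lines in the question don't use the log4j ConversionPattern from the properties file, which suggests that another SLF4J binding (presumably logback, hence the logback.xml above) is actually handling those loggers. If you also want the Kafka clients quieter than the rest of the application, you could add a dedicated logger element inside the same configuration (my addition, not part of the original answer):
    <!-- raise only the Kafka client packages to WARN; root stays at info -->
    <logger name="org.apache.kafka" level="WARN"/>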

How to 'censor' some data in libraries self-generated logs?

I am using the library scalikejdbc in my application, and when I read the logs that scalikejdbc generates itself, I see something like this:
06:02:16.891 [main] DEBUG scalikejdbc.ConnectionPool$ - Registered connection pool : ConnectionPool(url:jdbc:sqlserver://foo.bar:8080;databaseName=foobar;user=user;password=**PASSWORD**;...
So scalikejdbc itself writes my database user's password into the logs, which is unacceptable.
I was thinking about filtering those log entries with something like https://logback.qos.ch/manual/filters.html, but I do need the rest of the information in those scalikejdbc logs, so I can't filter them out entirely.
What I have now:
06:02:16.891 [main] DEBUG scalikejdbc.ConnectionPool$ - Registered connection pool : ConnectionPool(url:jdbc:sqlserver://foo.bar:8080;databaseName=foobar;user=user;password=somepassword;
What I am trying to get:
06:02:16.891 [main] DEBUG scalikejdbc.ConnectionPool$ - Registered connection pool : ConnectionPool(url:jdbc:sqlserver://foo.bar:8080;databaseName=foobar;user=user;password=CENSORED;
Consider using the replace conversion word in your logback.xml configuration, for example:
%replace(%msg){"password=.*", "password=CENSORED"}
which given
logger.debug("password=123456")
should output something like
[debug] application - password=CENSORED
The docs state:
replace(p){r, t}: Replaces occurrences of 'r', a regex, with its replacement 't' in the string produced by the sub-pattern 'p'. For example, "%replace(%msg){'\s', ''}" will remove all spaces contained in the event message.
Here is an example of my logback.xml:
<configuration>
    <conversionRule conversionWord="coloredLevel" converterClass="play.api.libs.logback.ColoredLevel" />
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%coloredLevel %logger{15} - %replace(%msg){"password=.*", "password=CENSORED"}%n %xException{10}</pattern>
        </encoder>
    </appender>
    <logger name="play" level="INFO" />
    <logger name="application" level="DEBUG" />
    <logger name="com.gargoylesoftware.htmlunit.javascript" level="OFF" />
    <root level="WARN">
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
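Note that password=.* is greedy and replaces everything from password= to the end of the message. If you want to keep the text that follows the password (as in the desired output above), a narrower pattern such as %replace(%msg){'password=[^;]*', 'password=CENSORED'} stops at the next semicolon; this variant is my suggestion, not part of the original answer.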

WELD-000119 Not generating any bean definitions from "Clazz" because of underlying class loading error

I'm trying to run Arquillian tests inside my JBoss EAP server container.
When I run them, I get the following info messages:
16:49:48,648 INFO [org.jboss.weld.ClassLoading] (MSC service thread 1-1) WELD-000119 Not generating any bean definitions from package.ChartofaccountDAO because of underlying class loading error
16:49:48,768 INFO [org.jboss.weld.ClassLoading] (MSC service thread 1-1) WELD-000119 Not generating any bean definitions from package.ChartofaccountDAOImpl because of underlying class loading error
I think this is why none of my CDI injections are being processed.
I tried enabling DEBUG for this class to get more information; I changed standalone.xml to:
<root-logger>
    <level name="DEBUG"/>
    <handlers>
        <handler name="CONSOLE"/>
        <handler name="FILE"/>
    </handlers>
</root-logger>
But I still only see INFO log records.
Found the solution: I had to enable a logger for the class and raise the level on the console-handler:
<console-handler name="CONSOLE">
    <level name="DEBUG"/>
    <formatter>
        <named-formatter name="COLOR-PATTERN"/>
    </formatter>
</console-handler>
<logger category="org.jboss.weld.ClassLoading">
    <level name="DEBUG"/>
</logger>

including a custom appender class for jboss logging jboss-log4j.xml

I am relatively new to JBoss. I have to use a custom appender, for which I have a jar file available.
For example:
<appender name="MYLOGGER" class="org.log4j.appender.MyLogAppender">
    <param name="File" value="/logs/abc.log"/>
    <param name="Threshold" value="DEBUG"/>
    ...more parameters...
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%-5p %-23d{} [%t] %x: %c{1} - %m%n"/>
    </layout>
</appender>
But doing so I get the error
log4j:ERROR Could not create an Appender. Reported error follows.
java.lang.ClassNotFoundException: org.log4j.appender.MyLogAppender
Which file other than jboss-log4j.xml must be configured?
Where must the jar file be placed in the JBoss hierarchy, and how must jboss-log4j.xml be configured to use the appender?
Thanks.
You don't say which version of JBoss you're using, but for JBoss 5.1.0, Log4j lives in $JBOSS_HOME/common/lib, so I'd suggest putting your jar file there.
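To actually send log events to the custom appender, it also has to be referenced from the root logger or a category in jboss-log4j.xml. A minimal sketch (the category name com.example.myapp is just a placeholder):
<category name="com.example.myapp">
    <!-- route this package's events to the custom appender defined above -->
    <priority value="DEBUG"/>
    <appender-ref ref="MYLOGGER"/>
</category>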