How can I set the logger level with Quartz Scheduler and Cocoon? - quartz-scheduler

I have a project with an old version of Cocoon. There are two cron jobs.
The project has the following log4j config:
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.conversionPattern=%d %-5p - %-26.26c{1} - %m\n
log4j.rootLogger=WARN,CONSOLE
In the logs folder there is a file cron.log, but it contains some INFO entries. How can I set the log level for this?

You can try adding the following line to set the log level of the org.quartz package:
log4j.logger.org.quartz=WARN,CONSOLE
BTW, you probably have something that configures this file appender (cron.log), because by default Quartz (2.x) does not provide such configuration.
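For example, if cron.log is fed by a log4j file appender, a minimal sketch could look like this (the CRON appender name and file path are assumptions; adjust them to whatever already writes cron.log in your setup):
# hypothetical appender for the cron log; match the path to your existing cron.log
log4j.appender.CRON=org.apache.log4j.FileAppender
log4j.appender.CRON.File=logs/cron.log
log4j.appender.CRON.layout=org.apache.log4j.PatternLayout
log4j.appender.CRON.layout.ConversionPattern=%d %-5p - %-26.26c{1} - %m%n
# only WARN and above from Quartz, and don't duplicate the messages on CONSOLE
log4j.logger.org.quartz=WARN,CRON
log4j.additivity.org.quartz=false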
Hope it helps.

Related

Can't turn audit logging off in Artemis

We're getting spammed in our artemis.log with log messages from org.apache.activemq.audit.message and org.apache.activemq.audit.base, like the following:
2020-06-04 12:02:26,151 INFO [org.apache.activemq.audit.message] AMQ601500: User xxx is sending a core message on target resource: ...
and
2020-06-04 12:02:26,081 INFO [org.apache.activemq.audit.base] AMQ601019: User amq|xxx| is getting mbean info on target resource: org.apache.activemq.artemis.core.management.impl.AddressControlImpl@60975100 []
We've added the following lines to our logging.properties with no luck:
logger.org.apache.activemq.audit.base.level=ERROR
logger.org.apache.activemq.audit.message.level=ERROR
What's going on here? How do we turn these off?
It looks like you haven't configured your logging.properties appropriately to ignore messages from those loggers. You've added lines to set the level for those loggers, but have you added those loggers to the loggers list?
For example, this is the default logging.properties shipped with ActiveMQ Artemis 2.13.0:
loggers=org.eclipse.jetty,org.jboss.logging,org.apache.activemq.artemis.core.server,org.apache.activemq.artemis.utils,org.apache.activemq.artemis.journal,org.apache.activemq.artemis.jms.server,org.apache.activemq.artemis.integration.bootstrap,org.apache.activemq.audit.base,org.apache.activemq.audit.message,org.apache.activemq.audit.resource
# Root logger level
logger.level=INFO
# ActiveMQ Artemis logger levels
logger.org.apache.activemq.artemis.core.server.level=INFO
logger.org.apache.activemq.artemis.journal.level=INFO
logger.org.apache.activemq.artemis.utils.level=INFO
logger.org.apache.activemq.artemis.jms.level=INFO
logger.org.apache.activemq.artemis.integration.bootstrap.level=INFO
logger.org.eclipse.jetty.level=WARN
# Root logger handlers
logger.handlers=FILE,CONSOLE
# to enable audit change the level to INFO
logger.org.apache.activemq.audit.base.level=ERROR
logger.org.apache.activemq.audit.base.handlers=AUDIT_FILE
logger.org.apache.activemq.audit.base.useParentHandlers=false
logger.org.apache.activemq.audit.resource.level=ERROR
logger.org.apache.activemq.audit.resource.handlers=AUDIT_FILE
logger.org.apache.activemq.audit.resource.useParentHandlers=false
logger.org.apache.activemq.audit.message.level=ERROR
logger.org.apache.activemq.audit.message.handlers=AUDIT_FILE
logger.org.apache.activemq.audit.message.useParentHandlers=false
# Console handler configuration
handler.CONSOLE=org.jboss.logmanager.handlers.ConsoleHandler
handler.CONSOLE.properties=autoFlush
handler.CONSOLE.level=DEBUG
handler.CONSOLE.autoFlush=true
handler.CONSOLE.formatter=PATTERN
# File handler configuration
handler.FILE=org.jboss.logmanager.handlers.PeriodicRotatingFileHandler
handler.FILE.level=DEBUG
handler.FILE.properties=suffix,append,autoFlush,fileName
handler.FILE.suffix=.yyyy-MM-dd
handler.FILE.append=true
handler.FILE.autoFlush=true
handler.FILE.fileName=${artemis.instance}/log/artemis.log
handler.FILE.formatter=PATTERN
# Formatter pattern configuration
formatter.PATTERN=org.jboss.logmanager.formatters.PatternFormatter
formatter.PATTERN.properties=pattern
formatter.PATTERN.pattern=%d %-5p [%c] %s%E%n
#Audit logger
handler.AUDIT_FILE=org.jboss.logmanager.handlers.PeriodicRotatingFileHandler
handler.AUDIT_FILE.level=INFO
handler.AUDIT_FILE.properties=suffix,append,autoFlush,fileName
handler.AUDIT_FILE.suffix=.yyyy-MM-dd
handler.AUDIT_FILE.append=true
handler.AUDIT_FILE.autoFlush=true
handler.AUDIT_FILE.fileName=${artemis.instance}/log/audit.log
handler.AUDIT_FILE.formatter=AUDIT_PATTERN
formatter.AUDIT_PATTERN=org.jboss.logmanager.formatters.PatternFormatter
formatter.AUDIT_PATTERN.properties=pattern
formatter.AUDIT_PATTERN.pattern=%d [AUDIT](%t) %s%E%n
Notice that the first line defines the loggers list and includes org.apache.activemq.audit.base, org.apache.activemq.audit.message, & org.apache.activemq.audit.resource.
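Applied to your logging.properties, the minimal change is to add the audit loggers to your own loggers line and keep the level entries you already added, e.g. (a sketch; keep whatever other loggers you currently list):
# add the audit loggers to the existing comma-separated list
loggers=org.eclipse.jetty,org.jboss.logging,org.apache.activemq.artemis.core.server,org.apache.activemq.audit.base,org.apache.activemq.audit.message,org.apache.activemq.audit.resource
# these lines only take effect once the loggers above are registered
logger.org.apache.activemq.audit.base.level=ERROR
logger.org.apache.activemq.audit.message.level=ERROR
logger.org.apache.activemq.audit.resource.level=ERROR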

IntelliJ IDEA - Disable INFO messages when running a Spark application

I'm getting so many messages when running an application that uses Apache Spark and the HBase/Hadoop libraries. For example:
0 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
How do I disable it, so I just get straight-to-the-point output like println(varABC) only?
What you are seeing is logging produced by Spark through log4j, which by default prints quite a lot of output to stderr. You can configure it the way you usually configure log4j behavior, e.g. through a log4j.properties configuration file. Refer to http://spark.apache.org/docs/latest/configuration.html#configuring-logging
In the /spark-2.0.0-bin-hadoop2.6/conf folder you have a file log4j.properties.template.
Rename log4j.properties.template to log4j.properties
and make the following change in log4j.properties:
from: log4j.rootCategory=INFO, console
to: log4j.rootCategory=ERROR, console
Hope this helps!
Under the $SPARK_HOME/conf dir, modify the log4j.properties file and change the INFO values to ERROR as below:
log4j.rootLogger=${root.logger}
root.logger=ERROR,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
log4j.logger.org.apache.spark.repl.Main=WARN
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
This will disable all the INFO log messages and will only print ERROR or FATAL log messages. You can change these values according to your requirements.
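If you would rather not touch log4j.properties at all, Spark can also lower its own log verbosity at runtime through SparkContext.setLogLevel. A minimal Java sketch (the class and app name are just placeholders):
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class QuietSparkApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("QuietSparkApp").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // suppress Spark's INFO/DEBUG chatter; valid levels include ERROR, WARN, INFO, DEBUG
        sc.setLogLevel("ERROR");

        System.out.println("varABC");   // your own output still prints normally
        sc.stop();
    }
}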

Running junit tests in maven ignores programmatic log4j2 setup

I have a particular JUnit test which processes a big data file and when the logging level is left on TRACE, it kills Eclipse - something to do with the console handling, which is not relevant to this question.
I often switch between running all my tests using m2e, in which case I don't need any debugging log output, and running individual tests, where I often want to see the TRACE output.
To avoid having to edit my log4j2.xml config every time, I coded the log4j config to raise the logging level to INFO in this particular test, like this (from programmatically-change-log-level-in-log4j2):
@Before
public void beforeTest() {
    LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
    Configuration config = ctx.getConfiguration();
    LoggerConfig loggerConfig = config.getLoggerConfig(LogManager.ROOT_LOGGER_NAME);
    initialLogLevel = loggerConfig.getLevel();
    loggerConfig.setLevel(Level.INFO);
}
But it has no effect.
If the "ROOT_LOGGER" that I am manipulating here represents the same logger as the <root> in my log4j2.xml, then this is not going to work, is it? I need to override all the other loggers, or shut it down completely, but how?
Could it be influenced by my use of slf4j as the log4j2 wrapper in all of my other classes?
I have tried getting hold of the Appenders and using appender.stop() but that doesn't work.
You can put a log4j2 config file in the src/test/resources/ directory. During unit tests that file will be used.
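For example, a minimal log4j2-test.xml dropped into src/test/resources (the pattern and level here are just an illustration) is picked up ahead of the regular log4j2.xml when tests run, so the TRACE output never reaches the console:
<?xml version="1.0" encoding="UTF-8"?>
<!-- test-only configuration: caps everything at INFO -->
<Configuration status="WARN">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d %-5p [%c{1}] %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="INFO">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>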

Documentation Generation is disabled

I did all that is specified in the tutorial - Doxygen Plugin.
Here are the doxygen entries from the sonarqube-4.5.1/conf/sonar.properties file:
# Doxygen
sonar.doxygen.generateDocumentation=enable
sonar.doxygen.deploymentPath=D:\Downloads\sonarqube-4.5.1\web
sonar.doxygen.deploymentUrl=http://localhost:9000/sonar/documentation
The output of the sonarqube runner:
16:07:16.265 INFO - ANALYSIS SUCCESSFUL
16:07:16.266 DEBUG - Post-jobs : org.sonar.plugins.doxygen.DoxygenPostJob@28bda649
16:07:16.266 INFO - Executing post-job class org.sonar.plugins.doxygen.DoxygenPostJob
16:07:16.271 INFO - === SUPPRESS PREVIOUS GENERATION ===
16:07:16.395 INFO - === DOXYGEN EXECUTION ===
16:07:16.396 INFO - ### Generating configuration ###
16:07:16.427 INFO - ### Generating documentation ###
Also, in the specified \web folder there is a documentation folder which seems to contain the correct doxygen documentation output.
Yet I keep getting this "Documentation Generation is disabled." message in the SonarQube web interface.
UPDATE
This is what my sonar-project.properties file contains now, using a unix-style path:
#Doxygen
sonar.doxygen.generateDocumentation=enable
sonar.doxygen.deploymentPath=/Downloads/sonarqube-4.5.1/web
sonar.doxygen.deploymentUrl=http://localhost:9000/sonar/documentation
The output remains the same, same issue.
What do I need to do in order to see the documentation in the web server interface?
This seems to be a server linkage problem, because the documentation is being generated correctly, at this location: /Downloads/sonarqube-4.5.1/web/documentation.
I have also found this content:
core,true,sonar-core-plugin-4.5.1.jar|9289fc1067c31372c0b020aa01163087
emailnotifications,true,sonar-email-notifications-plugin-4.5.1.jar|bb35818e4a655a3ba2cff2afc65a296b
findbugs,false,sonar-findbugs-plugin-2.4.jar|bb0bf263ef1e0d56f569878732060cc9
java,false,sonar-java-plugin-2.4.jar|a105d018165ddeb2c0f5074100768660
cpd,true,sonar-cpd-plugin-4.5.1.jar|e11ff5066c9e2308036838510d87a6fe
dbcleaner,true,sonar-dbcleaner-plugin-4.5.1.jar|a444b3b4571791e1cde146ffa5132ee4
design,true,sonar-design-plugin-4.5.1.jar|0c6476994a44904307cfa8b8a08bbddd
doxygen,false,sonar-doxygen-plugin-0.1.jar|d86e1ab81c3ac34e6b31aa1da28d7f72
l10nen,true,sonar-l10n-en-plugin-4.5.1.jar|c21d53f67901cf6df3da1b4dd48a441b
in sonarqube-4.5.1\web\deploy\plugins\index.txt.
It looks like doxygen has a false associated with it. If I try to edit it (to true) and restart the server, nothing changes. The file is overwritten by the sonar-runner.
sonar.doxygen.generateDocumentation is a project property, not a server property. You have to set it in your "sonar-project.properties" file if you run your analysis with the SonarQube Runner or in your pom.xml file if you run the analysis with Maven.
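If the analysis runs through Maven instead, the same project-level properties can be declared in the pom.xml <properties> section (a sketch, reusing the paths from the question):
<properties>
  <sonar.doxygen.generateDocumentation>enable</sonar.doxygen.generateDocumentation>
  <sonar.doxygen.deploymentPath>D:\Downloads\sonarqube-4.5.1\web</sonar.doxygen.deploymentPath>
  <sonar.doxygen.deploymentUrl>http://localhost:9000/sonar/documentation</sonar.doxygen.deploymentUrl>
</properties>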
Here is how I solved this:
Stopped the SonarQube server.
Replaced the old sonar-doxygen-plugin-0.1.jar in /Downloads/sonarqube-4.5.1/extensions/plugins with the updated doxygen plugin from https://github.com/SonarCommunity/sonar-doxygen/releases/download/1.0/sonar-doxygen-plugin-1.0-SNAPSHOT.jar.
Commented out the old doxygen configuration entries in the project's sonar-project.properties file and replaced them with the following entries:
sonar.doxygen.url=http://localhost:8000/
sonar.doxygen.enable=true
Used a simple Python script to serve the documentation HTML at that address (http://localhost:8000/).
Restarted the SonarQube server.
Ran sonar-runner.bat again.
The documentation is in its place now.

hornetq restart overwrites the log files

A HornetQ restart overwrites the log files, although the log file rotation is working fine. I am using the following config; I am running HornetQ in standalone clustered mode:
# File handler configuration
handler.FILE=org.jboss.logmanager.handlers.PeriodicRotatingFileHandler
handler.FILE.level=DEBUG
handler.FILE.properties=autoFlush,fileName,suffix,append
handler.FILE.autoFlush=true
handler.FILE.fileName=../logs/hornetq.log
handler.FILE.suffix=.yyyy-MM-dd
handler.FILE.append=true
handler.FILE.formatter=PATTERN
Found out the issue: the order of the properties matters! Listing append before fileName apparently keeps the handler from truncating the existing file when it opens it:
# File handler configuration
handler.FILE=org.jboss.logmanager.handlers.PeriodicRotatingFileHandler
handler.FILE.level=DEBUG
handler.FILE.properties=autoFlush,append,fileName,suffix
handler.FILE.autoFlush=true
handler.FILE.append=true
handler.FILE.fileName=../logs/hornetq.log
handler.FILE.suffix=.yyyy-MM-dd
handler.FILE.formatter=PATTERN
https://community.jboss.org/message/742699