Problem configuring log4j2.properties in Spring Boot (using Gradle) in Eclipse

I added a log4j2.properties file in src/main/resources, but it is not being picked up. Shouldn't log4j2.properties be detected on its own? How can I check whether it is being detected?
My log4j2.properties file:
status = error
name = PropertiesConfig
filters = threshold
filter.threshold.type = ThresholdFilter
filter.threshold.level = debug
appenders = console
appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
rootLogger.level = debug
rootLogger.appenderRefs = stdout
rootLogger.appenderRef.stdout.ref = STDOUT

Spring Boot uses Logback as its default logging framework.
If you want to use Log4j2, you have to do some configuration.
Exclude the default logging starter and add the log4j2 starter dependency:
dependencies {
    compile 'org.springframework.boot:spring-boot-starter-web'
    compile 'org.springframework.boot:spring-boot-starter-log4j2'
}
configurations {
    all {
        exclude group: 'org.springframework.boot', module: 'spring-boot-starter-logging'
    }
}
And as far as I know, Log4j2 is most commonly configured using an XML file; the .properties format is only supported from Log4j 2.4 onwards, and it uses a different syntax than the old Log4j 1.x properties files.
Please find all the information in the official Spring Boot Reference Documentation:
https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#howto-configure-log4j-for-logging
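To answer the "How can I check" part: Log4j2's internal status logger reports which configuration file it finds. A minimal sketch, assuming the properties file shown above, is to raise the status level at the top of the file:

status = trace

Alternatively, start the JVM with -Dlog4j2.debug. Log4j2 then prints its initialization steps to the console, including the location of the configuration file it loaded; if no line mentions log4j2.properties, the file is not on the runtime classpath.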

Related

How to append Spark ApplicationID in filename of log4j log file - Scala

I am trying to append the Spark applicationId to the filename of the log4j log file. Below is my log4j.properties file:
log4j.rootLogger=info,file
# Redirect log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L -%m%n
# Redirect log messages to a log file, with file rolling
log4j.appender.file=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.file.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.file.rollingPolicy.FileNamePattern=log4j/Data_Quality.%d{yyyy-MM-dd}.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L -%m%n
# Flush immediately
log4j.appender.file.ImmediateFlush=true
# Set the threshold to INFO
log4j.appender.file.Threshold=INFO
# Append to the existing file instead of overwriting
log4j.appender.file.Append=true
Spark-submit command:
spark2-submit --conf "spark.driver.extraJavaOptions=-Dconfig.file=./input.conf -Dlog4j.configuration=log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --files "input.conf,log4j.properties" --master yarn --class "DataCheckImplementation" Data_quality.jar
Log files are created with the name Data_Quality.2020-07-21.log, which works correctly.
I want to add the Spark ApplicationID to the filename.
Expected filename: Data_Quality.(ApplicationID).2020-07-21.log
Example: Data_Quality.(application_1595144411765_20000).2020-07-21.log
Is it possible? Need help!
I don't think this can be done at the configuration level (e.g. log4j.properties), but there are ways to achieve it. Here is one approach:
You will need a logger class/trait where you handle all your logger management, something like:
import java.text.SimpleDateFormat
import java.util.Date

import org.apache.log4j.{PatternLayout, RollingFileAppender}
import org.apache.spark.sql.SparkSession

trait SparkContextProvider {
  def spark: SparkSession
}

trait Logger extends SparkContextProvider {
  // Fully qualified to avoid a clash with this trait's own name;
  // getClass is an assumption here, the original left the logger name unspecified
  lazy val log = org.apache.log4j.Logger.getLogger(getClass)
  lazy val applicationId = spark.sparkContext.applicationId

  private val dateFormat = new SimpleDateFormat("yyyy-MM-dd")

  // Build an appender whose file name embeds the Spark application id
  val appender = new RollingFileAppender()
  appender.setAppend(true)
  appender.setMaxFileSize("1MB")
  appender.setMaxBackupIndex(1)
  appender.setFile("Data_Quality." + applicationId + "." + dateFormat.format(new Date()) + ".log")

  val layOut = new PatternLayout()
  layOut.setConversionPattern("%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n")
  appender.setLayout(layOut)
  appender.activateOptions()

  log.addAppender(appender)
}
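A hypothetical usage sketch (the app name is an assumption; the object name matches the --class from the spark-submit command above), mixing the trait into the job's entry point so every log call goes to the application-id file:

object DataCheckImplementation extends Logger {
  lazy val spark = SparkSession.builder().appName("DataQuality").getOrCreate()

  def main(args: Array[String]): Unit = {
    log.info("Logging to the Data_Quality file named after " + applicationId)
  }
}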

Stackdriver logging with SLF4J not logging properly in Scala

I am using Logback with SLF4J for Stackdriver logging, following the example from Google Cloud. My application is written in Scala and runs on a Dataproc cluster on GCP.
logback.xml has the following contents:
<configuration>
  <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
    <!-- Optional: filter logs at or above a level -->
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <level>INFO</level>
    </filter>
    <log>application.log</log> <!-- Optional: default java.log -->
    <resourceType>gae_app</resourceType> <!-- Optional: default auto-detected, fallback: global -->
    <enhancer>com.company.customer.utils.MyLoggingEnhancer</enhancer> <!-- Optional -->
    <flushLevel>WARN</flushLevel> <!-- Optional: default ERROR -->
  </appender>
  <root level="info">
    <appender-ref ref="CLOUD" />
  </root>
</configuration>
and MyLoggingEnhancer is
package com.company.customer.utils

import com.google.cloud.logging.LogEntry.Builder
import com.google.cloud.logging.LoggingEnhancer

class MyLoggingEnhancer extends LoggingEnhancer {
  override def enhanceLogEntry(builder: Builder): Unit = {
    builder.addLabel("test-label-1", "test-value-1") // also not showing on Stackdriver
  }
}
And I construct the logger object in my classes as:
private val LOGGER: Logger = LoggerFactory.getLogger(this.getClass)
and log messages from the logger object like:
LOGGER.info("Spark Streaming Started for MyApplication")
The issue is that resourceType and logName are not set properly on the Stackdriver logs. logName is automatically set to yarn-userlogs and resourceType is set to cloud_dataproc_cluster. I want to set resourceType = global and logName = MyApp so that I can filter MyApp logs under the global hierarchy.
I tried removing this line from logback.xml:
<resourceType>gae_app</resourceType>
and adding this line instead:
<resourceType>global</resourceType>
but no luck. The label test-label-1 that I set in the LoggingEnhancer is not showing on Stackdriver either. Any help is highly appreciated.
My JSON payload on Stackdriver is:
{
  insertId: "gn8clokwpjqptwo5h"
  jsonPayload: {
    application: "application_1568634817510_0189"
    class: "com.MyClass"
    container: "container_e01_1568634817510_0189_01_000001"
    container_logname: "stderr"
    filename: "application_1568634817510_0189.container_e01_1568634817510_0189_01_000001.stderr"
    message: "Spark Streaming Started for MyApplication"
  }
  labels: {
    compute.googleapis.com/resource_id: "1437319101399877659"
    compute.googleapis.com/resource_name: "e-spark-w-2"
    compute.googleapis.com/zone: "us"
  }
  logName: "projects/myProject/logs/yarn-userlogs"
  receiveTimestamp: "2019-10-01T05:25:16.044579001Z"
  resource: {
    labels: {
      cluster_name: "shuttle-spark"
      cluster_uuid: "d1557db6-72ee-4873-a276-4bd4ea0e89bb"
      project_id: "MyProjectId"
      region: "us"
    }
    type: "cloud_dataproc_cluster"
  }
  severity: "INFO"
  timestamp: "2019-10-01T05:25:10Z"
}

Scala junit picking up wrong log4j.properties file

I have a test written in Scala, using JUnit. The test is in one module of a multi-module Maven project with many other modules.
Here is the code of the test:
import org.apache.log4j.Logger
import org.apache.logging.log4j.scala.Logging
import org.junit._

class MyTest extends Logging {
  @Test
  def mainTest() = {
    //val logger = Logger.getLogger("MyTest")
    logger.fatal("fatal")
    logger.error("error")
    logger.warn("warn")
    logger.info("info")
    logger.debug("debug")
    logger.trace("trace")
  }
}
And here is the log4j.properties file, which is in the resources folder:
log4j.rootCategory=ALL, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
The maven dependencies are:
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-api-scala_2.10</artifactId>
  <version>2.8.2</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.8.2</version>
</dependency>
When I run the test, the debug and trace levels are not printed.
It seems to me that the logger might be picking up a file from one of the other modules. Why?
If I uncomment the first line of the test, all the levels get printed.
I tried adding -Dlog4j.debug to the run command, but log4j seems to ignore it.
Any idea what I'm missing?
You are using Log4j2: the Logging trait you mix in comes from log4j-api-scala, and its logger is backed by Log4j2. Log4j2 does not read a Log4j 1.x-style log4j.properties file; it falls back to its default configuration (ERROR-level output to the console), which is why debug and trace are not printed. The commented-out Logger.getLogger line creates a Log4j 1.x logger, which does read log4j.properties, and that is why uncommenting it makes all levels print.
Your file name should be log4j2.properties.
Also, the syntax of the .properties file has changed. The following example, taken from here, will get you started:
name=PropertiesConfig
property.filename = logs
appenders = console, file
appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n
appender.file.type = File
appender.file.name = LOGFILE
appender.file.fileName=${filename}/propertieslogs.log
appender.file.layout.type=PatternLayout
appender.file.layout.pattern=[%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n
loggers=file
logger.file.name=guru.springframework.blog.log4j2properties
logger.file.level = debug
logger.file.appenderRefs = file
logger.file.appenderRef.file.ref = LOGFILE
rootLogger.level = debug
rootLogger.appenderRefs = stdout
rootLogger.appenderRef.stdout.ref = STDOUT
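If you want to verify which configuration file Log4j2 actually loaded (for example, to rule out a file coming from another module), one way is to ask the logger context directly. A minimal sketch, assuming log4j-core is on the classpath, e.g. placed inside the test:

import org.apache.logging.log4j.LogManager
import org.apache.logging.log4j.core.LoggerContext

// Prints the source of the active configuration, e.g. the URI of the winning log4j2.properties
val ctx = LogManager.getContext(false).asInstanceOf[LoggerContext]
println(ctx.getConfiguration.getConfigurationSource)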

akka.actor.ActorLogging does not log the stack trace of an exception via Logback

I am using Logback + SLF4J to do logging for actors with the akka.actor.ActorLogging trait. However, when I call log.error("Error occur!", e), the stack trace of the exception e is not logged; only the line Error occur! WARNING arguments left: 1 is printed. I wonder why, and how I can get the stack trace into the log file. Thank you. The following is my logback.groovy configuration:
appender("FILE", RollingFileAppender) {
file = "./logs/logd.txt"
append = true
rollingPolicy(TimeBasedRollingPolicy) {
fileNamePattern = "./logs/logd.%d{yyyy-MM-dd}.log"
maxHistory = 30
}
encoder(PatternLayoutEncoder) {
pattern = "%date{ISO8601} [%thread] %-5level %logger{36} %X{sourceThread} - %msg%n"
}
}
root(DEBUG, ["FILE"])
Akka has its own logging system, which is configured in Akka's application.conf. If you want to bridge to SLF4J/Logback, use these settings:
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "DEBUG"
}
See: http://doc.akka.io/docs/akka/2.0/scala/logging.html
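For the bridge to work, the akka-slf4j module must also be on the classpath. A minimal sketch, assuming sbt and an Akka 2.4.x build (adjust the version to match your Akka release):

// build.sbt
libraryDependencies += "com.typesafe.akka" %% "akka-slf4j" % "2.4.16"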
As far as I can see here, the reason (a Throwable) should be the first argument of log.error:
def error(cause: Throwable, message: String)
That's why you see "WARNING arguments left": your Throwable argument was simply ignored.
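A minimal sketch (the actor and its message handling are assumptions) showing the corrected argument order:

import akka.actor.{Actor, ActorLogging}

class MyActor extends Actor with ActorLogging {
  def receive = {
    case msg =>
      try {
        // ... do the actual work ...
      } catch {
        case e: Exception =>
          // Akka's LoggingAdapter takes the Throwable first,
          // so the stack trace is included in the output
          log.error(e, "Error occur!")
      }
  }
}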
The 'cause' exception should be the first argument to error, not the second (as correctly mentioned by JasonG in a comment on another answer).
Using the Akka log system instead of 'bare' scala-logging has some advantages around automatically added metadata and easier testing/filtering.
See also:
http://doc.akka.io/docs/akka/2.4.16/scala/logging.html
http://doc.akka.io/api/akka/2.4/akka/event/LoggingAdapter.html#error(cause:Throwable,message:String):Unit

How to connect to JBoss 7.1.1 remoting-jmx via Java code?

I have a JBoss 7.1.1 server for which I want to write a JMX client. As far as I understand, JBoss 7.1.1 does not use typical RMI-based JMX; instead it provides a remoting-jmx layer over the native management interface. I am using the following code:
JMXServiceURL address = new JMXServiceURL("service:jmx:remoting-jmx://localhost:9999");
Map env = JMXConnectorConfig.getEnvironment(paramtbl);
JMXConnector connector = JMXConnectorFactory.connect(address, env);
But it gives the following exception:
java.net.MalformedURLException: Unsupported protocol: remoting-jmx
I googled it and the following thread seems relevant:
https://community.jboss.org/thread/204653?tstart=0
It suggests adding JBoss's libraries to my classpath. I tried that as well, but I still get the same exception.
I got the same exception when trying to get a JMXServiceURL.
Make sure that in your standalone.xml you have the following:
<subsystem xmlns="urn:jboss:domain:jmx:1.1">
<show-model value="true"/>
<remoting-connector use-management-endpoint="true" />
</subsystem>
You should also include in your project's classpath the jar named jboss-client.jar; it can be found in JBOSS_DIRECTORY/bin/client. In fact, the JMX client must include that jar in its classpath.
This tip fixed the problem for me. Hope it will be helpful for you.
I tried to do the same from an Arquillian test on JBoss AS7 and finally had to use:
import org.jboss.remotingjmx.RemotingConnectorProvider;

RemotingConnectorProvider s = new RemotingConnectorProvider();
JMXConnector connector = s.newJMXConnector(url, credentials);
connector.connect();
I could not get <module name="org.jboss.remoting-jmx" services="import"/> working.
It also works with:
environment.put("jmx.remote.protocol.provider.pkgs", "org.jboss.remotingjmx");
JMXConnector connector = JMXConnectorFactory.connect(url, environment);
connector.connect();
I used this code to connect to JBoss on a remote server:
ModelControllerClient client = null;
try {
    client = createClient(InetAddress.getByName("172.16.73.12"), 9999,
            "admin", "pass", "ManagementRealm");
} catch (UnknownHostException e) {
    e.printStackTrace();
}
where createClient is a method I wrote:
private ModelControllerClient createClient(final InetAddress host,
        final int port, final String username, final String password,
        final String securityRealmName) {
    final CallbackHandler callbackHandler = new CallbackHandler() {
        public void handle(Callback[] callbacks) throws IOException,
                UnsupportedCallbackException {
            for (Callback current : callbacks) {
                if (current instanceof NameCallback) {
                    NameCallback ncb = (NameCallback) current;
                    ncb.setName(username);
                } else if (current instanceof PasswordCallback) {
                    PasswordCallback pcb = (PasswordCallback) current;
                    pcb.setPassword(password.toCharArray());
                } else if (current instanceof RealmCallback) {
                    RealmCallback rcb = (RealmCallback) current;
                    rcb.setText(rcb.getDefaultText());
                } else {
                    throw new UnsupportedCallbackException(current);
                }
            }
        }
    };
    return ModelControllerClient.Factory.create(host, port, callbackHandler);
}
For more information on how to read the data obtained from the server, or for the complete project using the Java/Google Visualizer API (showing the statistics in a graph every 10 seconds), please refer to this tutorial:
http://javacodingtutorial.blogspot.com/2014/05/reading-jboss-memory-usage-using-java.html
Add the following to your jboss-deployment-structure.xml:
<dependencies>
  <module name="org.jboss.remoting3.remoting-jmx" services="import"/>
</dependencies>
Activate the JMX remoting subsystem by adding the following entry in standalone.xml:
<subsystem xmlns="urn:jboss:domain:ee:1.1">
<!-- Activate JMX remoting -->
<global-modules>
<module name="org.jboss.remoting-jmx" slot="main"/>
</global-modules>
...
</subsystem>
It seems like jboss-client.jar is not available at runtime for the JMX connection, so make sure that you have added jboss-client.jar to the classpath.
Also, you are using the deprecated protocol "remoting-jmx" instead of "remote", i.e. "service:jmx:remote://localhost:9999".
Hope it helps.
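As an illustration of the corrected URL in use, a minimal connection sketch, written in Scala to match the other examples here and assuming jboss-client.jar is on the classpath and the same localhost:9999 endpoint:

import javax.management.remote.{JMXConnectorFactory, JMXServiceURL}

// Uses the newer "remote" protocol instead of the deprecated "remoting-jmx"
val url = new JMXServiceURL("service:jmx:remote://localhost:9999")
val connector = JMXConnectorFactory.connect(url)
try {
  val connection = connector.getMBeanServerConnection
  println("MBean count: " + connection.getMBeanCount)
} finally {
  connector.close()
}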