How to change the log level in Spark? - scala

I tried all these methods and nothing works:
In the log4j properties file:
log4j.logger.org=OFF
log4j.rootCategory=ERROR, console
log4j.rootCategory=OFF, console
In code:
// option 1
Logger.getLogger("org.apache.spark").setLevel(Level.OFF)
// option 2
sparkContext.setLogLevel("OFF")
// option 3
val rootLogger: Logger = Logger.getRootLogger()
rootLogger.setLevel(Level.OFF)
And yes, I also tried putting it both before and after the SparkContext object is created. Nothing seems to work.
What am I missing?
Or is there another way to set the log levels?

If you can find these lines at the very start of the output, it means logback (not log4j) is the active SLF4J binding, so the log configuration needs to be set via logback instead of log4j:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/linzi/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/linzi/.m2/repository/org/slf4j/slf4j-log4j12/1.7.26/slf4j-log4j12-1.7.26.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
Add a logback.xml with settings like the following:
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<layout class="ch.qos.logback.classic.PatternLayout">
<Pattern>
%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n
</Pattern>
</layout>
</appender>
<logger name="com.mkyong" level="debug" additivity="false">
<appender-ref ref="CONSOLE"/>
</logger>
<root level="error">
<appender-ref ref="CONSOLE"/>
</root>
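If logback stays the active binding, the Spark loggers can also be silenced programmatically through logback's own API rather than log4j's. A minimal sketch, assuming logback-classic is on the classpath:
import ch.qos.logback.classic.{Level, Logger => LogbackLogger}
import org.slf4j.LoggerFactory

// Because logback is the active SLF4J binding here, the logback Level has to be set;
// that is why the log4j calls from the question have no visible effect.
val sparkLogger = LoggerFactory.getLogger("org.apache.spark").asInstanceOf[LogbackLogger]
sparkLogger.setLevel(Level.OFF)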

You should be able to do it with something like this:
val spark = SparkSession.builder.getOrCreate()
spark.sparkContext.setLogLevel("OFF")
https://spark.apache.org/docs/2.3.0/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-
Can you share the rest of the code and where you're running it?

This should change your log level to OFF if you declare it before the SparkSession object is created:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

Logger.getLogger("org").setLevel(Level.OFF)
val spark = SparkSession.builder().appName("test").master("local[*]").getOrCreate()
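On older Spark versions (before 2.x) that embed Akka, some startup output also goes through the "akka" logger prefix. Silencing it as well may help; this is an assumption about where any remaining output comes from, not something the question confirms:
// Assumption: remaining startup output is routed through the "akka" logger prefix (Spark < 2.x)
Logger.getLogger("akka").setLevel(Level.OFF)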

Related

Disable logging from SDKs/jars in Scala

I am using logback.xml for logging. I want to disable the logs from third-party jars/SDKs. For this I set level="OFF" for that jar's logger, but the logs still get logged. Next I tried the same log level for one of the files in my own codebase, and that did disable the logs for my file.
Below is my logback config:
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
  <encoder>
    <pattern> Some pattern </pattern>
  </encoder>
</appender>
<logger name="<sdk file path>" level="OFF"/> <!-- This doesn't work -->
<logger name="<file from my codebase>" level="OFF"/> <!-- This works -->
<root level="INFO">
  <appender-ref ref="STDOUT"/>
</root>
A library can use any name it likes for a logger, so the name will not necessarily match the path of the library.
The %logger field in the pattern gives the name of the logger, so you will see the actual name in the logging output. If you see output that you want to suppress, use the name from the log (or a prefix of it) in the logger element.
I would also recommend setting the root logger to a restrictive level and then enabling more verbose levels only for the specific loggers you are interested in:
<logger name="myloggername" level="DEBUG"/>
<root level="ERROR">
<appender-ref ref="STDOUT"/>
</root>
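If the SDK's logger name is not obvious from its package, you can also list the logger names logback currently knows about and pick the prefix to silence from there. A minimal sketch, assuming logback-classic is the SLF4J binding and Scala 2.13 (for scala.jdk.CollectionConverters):
import ch.qos.logback.classic.LoggerContext
import org.slf4j.LoggerFactory
import scala.jdk.CollectionConverters._

// Print every logger name logback has seen so far, so the right <logger name="..."> prefix can be chosen
val ctx = LoggerFactory.getILoggerFactory.asInstanceOf[LoggerContext]
ctx.getLoggerList.asScala.map(_.getName).sorted.foreach(println)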

Filter markers in logback

I have a project in Scala.
I use this library for logging:
https://github.com/typesafehub/scala-logging
I create a logger:
import com.typesafe.scalalogging.Logger
val log = Logger(getClass)
and two markers:
import org.slf4j.{Marker, MarkerFactory}
private val marker: Marker = MarkerFactory.getMarker("DP")
private val marker2: Marker = MarkerFactory.getMarker("ST")
I use the logger in my controller:
log.debug(marker, "----")
log.debug(marker2, "++++")
This is my logback config:
<appender name="STDOUTTime" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%coloredLevel %logger{30} - %marker - %d{yyyy/MM/dd/HH:mm:ss.SSS/Z} - %message%n%xException{3}</pattern>
</encoder>
<turboFilter class="ch.qos.logback.classic.turbo.MarkerFilter">
<Marker>DP</Marker>
<OnMatch>DENY</OnMatch>
<OnMismatch>DENY</OnMismatch>
</turboFilter>
<turboFilter class="ch.qos.logback.classic.turbo.MarkerFilter">
<Marker>ST</Marker>
<onMatch>DENY</onMatch>
<onMismatch>DENY</onMismatch>
</turboFilter>
</appender>
<logger name="ds.forwarding" level="DEBUG">
<appender-ref ref="STDOUTTime"/>
</logger>
<root level="ERROR">
</root>
Now when I run my controller I get this output in the console:
[debug] d.f.c.a.s.InputStatisticController - DP - 2017/09/25/11:55:58.603/+0300 - ----
[debug] d.f.c.a.s.InputStatisticController - ST - 2017/09/25/11:55:58.603/+0300 - ++++
Now I have some questions:
Why are marker and marker2 visible; why does DENY not work?
How can I exclude both markers?
How can I exclude only one marker?
Here is a logback.xml which denies both DP and ST markers.
<configuration>
  <turboFilter class="ch.qos.logback.classic.turbo.MarkerFilter">
    <Marker>DP</Marker>
    <OnMatch>DENY</OnMatch>
  </turboFilter>
  <turboFilter class="ch.qos.logback.classic.turbo.MarkerFilter">
    <Marker>ST</Marker>
    <onMatch>DENY</onMatch>
  </turboFilter>
  <appender name="STDOUTTime" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%coloredLevel %logger{30} - %marker - %d{yyyy/MM/dd/HH:mm:ss.SSS/Z} - %message%n%xException{3}</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="STDOUTTime"/>
  </root>
</configuration>
The mistakes in your file:
Your file doesn't start with <configuration> and end with </configuration>.
The filters are not at the correct level. They should sit directly under <configuration>, at the same level as the appenders.
There is no need for the <onMismatch> flags in your case; they only mix things up.
Since your logger is named ds.forwarding, you have to be sure that your class actually uses that logger. In your case, you obtain the logger with the getClass method. In my logback.xml I added the appender to the root logger, so it is sufficient to obtain it via Logger(getClass).
Always be careful about levels. I set the level to DEBUG.
Once you set the configuration properly, simply set the <onMatch> property to ACCEPT if you want the logger to print a marker, or DENY if you don't. Setting both to ACCEPT will result in printing all the markers; on the other hand, if you set both of them to DENY, those markers won't be printed.
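For the last question, the same single-marker filter can also be registered programmatically instead of in XML. A minimal sketch, assuming logback-classic is on the classpath (equivalent to keeping only the DP turboFilter above):
import ch.qos.logback.classic.LoggerContext
import ch.qos.logback.classic.turbo.MarkerFilter
import org.slf4j.LoggerFactory

// Deny only the "DP" marker; "ST" and unmarked events stay at the default NEUTRAL and are still printed
val ctx = LoggerFactory.getILoggerFactory.asInstanceOf[LoggerContext]
val dpFilter = new MarkerFilter()
dpFilter.setMarker("DP")
dpFilter.setOnMatch("DENY")
dpFilter.start() // a turbo filter is only consulted once it has been started
ctx.addTurboFilter(dpFilter)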

How to stop SiftingAppender programmatically

I'm new to logback and I'm trying to stop a SiftingAppender programmatically.
Here is my appender:
<appender name="FILE-APPENDER" class="ch.qos.logback.classic.sift.SiftingAppender">
<!-- MDC value -->
<discriminator>
<key>fileName</key>
<defaultValue>log_file</defaultValue>
</discriminator>
<sift>
<appender name="ROLLING-FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<append>true</append>
<file>{fileName}.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
<!-- daily rollover. Make sure the path matches the one in the file element or else
the rollover logs are placed in the working directory. -->
<fileNamePattern>${fileName}-%d{yyyy-MM-dd}.%i.log</fileNamePattern>
<maxFileSize>1MB</maxFileSize>
<maxHistory>30</maxHistory>
<totalSizeCap>1GB</totalSizeCap>
</rollingPolicy>
<encoder>
<charset>UTF-8</charset>
<pattern>[%p] [%d{yy/MM/dd HH:mm:ss}] %c [%X{akkaSource}] : %msg%n</pattern>
</encoder>
</appender>
</sift>
</appender>
Root Logger:
<root level="INFO">
<appender-ref ref="FILE-APPENDER"/>
<appender-ref ref="ANOTHER-APPENDER"/>
</root>
At some point in the application I need to stop logging to the file; here is my Scala code:
import ch.qos.logback.classic.{Logger, LoggerContext}
import org.slf4j.LoggerFactory

val context: LoggerContext = LoggerFactory.getILoggerFactory.asInstanceOf[LoggerContext]
val root = LoggerFactory.getLogger(org.slf4j.Logger.ROOT_LOGGER_NAME).asInstanceOf[Logger]
root.getAppender("FILE-APPENDER").stop()
The code executes with no problem, but I can still see logs in the file.
If I don't use a SiftingAppender and instead use only a RollingFileAppender, it works perfectly.
Is there anything that I'm missing here?
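One thing that may be worth trying, as a sketch under the assumption that stopping the SiftingAppender alone leaves its nested file appenders running: detach the appender from the root logger as well, so no further events are routed to it.
// Hedged sketch: detach FILE-APPENDER from the root logger in addition to stopping it
val fileAppender = root.getAppender("FILE-APPENDER")
if (fileAppender != null) {
  root.detachAppender(fileAppender)
  fileAppender.stop()
}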

How can I view the SQL generated by Anorm?

I am trying out Anorm to execute a number of insert statements and then return the value of LAST_INSERT_ID(), but I am getting an error saying my SQL syntax is invalid.
Can anyone tell me how to check what the final generated SQL that is sent to MySQL looks like?
Anorm doesn't really generate SQL, you do. But there is a way to log the exact queries that are sent over the wire to the console (after the statements are prepared, assuming you're using Anorm within Play).
Assuming you're using a single database called default (the default configuration), add the following to your application.conf:
db.default.logStatements=true
Then you can save the following to conf/logger.xml:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%-5level - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="com.jolbox.bonecp" level="DEBUG">
    <appender-ref ref="STDOUT" />
  </logger>
  <logger name="play" level="INFO">
    <appender-ref ref="STDOUT" />
  </logger>
  <logger name="application" level="OFF">
    <appender-ref ref="STDOUT" />
  </logger>
</configuration>
The key line in the file is related to BoneCP logging, but we want to put in lines for the application and play loggers as well, so we don't mess up the default logging.
Lack of any logging in Anorm sucks a lot.
If your generated SQL is not valid and you are getting mysterious exceptions from your driver, then I suggest setting a breakpoint somewhere here (anorm.SimpleSql):
def preparedStatement(connection: Connection, getGeneratedKeys: Boolean = false) = {
  implicit val res = StatementResource
  resource.managed {
    val (psql, vs): (String, Seq[(Int, ParameterValue)]) = Sql.prepareQuery(sql.stmt.tokens, sql.paramsInitialOrder, params, 0, new StringBuilder(), List.empty[(Int, ParameterValue)]).get
    val stmt = if (getGeneratedKeys) connection.prepareStatement(psql, java.sql.Statement.RETURN_GENERATED_KEYS) else connection.prepareStatement(psql)
    sql.timeout.foreach(stmt.setQueryTimeout(_))
    vs foreach { case (i, v) => v.set(stmt, i + 1) }
    stmt
  }
}
and log psql
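As a lighter-weight alternative to a debugger, a small wrapper can log the statement text before it is handed to Anorm. This is only a sketch (the SqlDebug object and the "sql" logger name are made up for illustration), and it shows the query as written, before JDBC placeholder substitution, not the final wire-level statement:
import anorm._
import org.slf4j.LoggerFactory

object SqlDebug {
  private val sqlLog = LoggerFactory.getLogger("sql")

  // Log the raw statement text, then build the Anorm query as usual
  def loggedSQL(query: String): SimpleSql[Row] = {
    sqlLog.debug(query)
    SQL(query)
  }
}

// e.g. SqlDebug.loggedSQL("INSERT INTO users(name) VALUES({name})").on("name" -> "abc").executeInsert()
// inside a DB.withConnection { implicit c => ... } block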

Using logback with Akka

I have some problems using Logback with my Akka (2.3.9) application. In order to log to stdout and to a logfile, I specified logback.xml with all the appenders:
<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true" scanPeriod="5 seconds" debug="true">
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<target>System.out</target>
<encoder>
<pattern>%X{akkaTimestamp} %-5level %logger{36} %X{sourceThread} - %msg%n</pattern>
</encoder>
</appender>
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>./akka.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>./akka.log.%d{yyyy-MM-dd-HH}</fileNamePattern>
</rollingPolicy>
<encoder>
<pattern>%X{akkaTimestamp} %-5level %logger{36} %X{sourceThread} - %msg%n</pattern>
</encoder>
</appender>
<logger name="proc" level="INFO">
<appender-ref ref="FILE"/>
<appender-ref ref="STDOUT"/>
</logger>
<logger name="akka.actor" level="INFO">
<appender-ref ref="FILE"/>
<appender-ref ref="STDOUT"/>
</logger>
<root level="INFO">
<appender-ref ref="FILE"/>
<appender-ref ref="STDOUT"/>
</root>
</configuration>
After that I use logging in my Actor:
import akka.event.Logging

val log = Logging(context.system, classOf[MyActor])

override def receive: Receive = {
  case MyEvent(event) =>
    log.info("Message received for processing.")
}
The problem is, in SBT everything is fine: I can see all log events in the created logfile.
When I build a JAR file with sbt-assembly and start it (java -jar event-assembly-0.1.1-SNAPSHOT.jar), the app writes log entries to STDOUT but not to the file!
I have no idea how to fix that. I checked with "lsof"; the java process has no open log files.
I don't know what's wrong but you could do the following:
Check if your logback.xml is in the jar-file
Try to start your application with java -Dlogback.configurationFile=/path/to/logback.xml -jar event-assembly-0.1.1-SNAPSHOT.jar
As I found out, sbt-assembly removed some logback classes from the resulting jar (it was a setting in my Build.scala). After I changed the MergeStrategy to keep the first logback occurrence in the resulting build, everything works correctly.
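For reference, a minimal sketch of the kind of sbt-assembly setting described above; the matched paths are assumptions and should be adjusted to whatever conflicts your assembly task actually reports:
// build.sbt (sbt-assembly) -- hedged sketch, not the exact setting from the answer above.
// Keep the first copy of the SLF4J binding classes and of logback.xml instead of discarding them.
assemblyMergeStrategy in assembly := {
  case PathList("org", "slf4j", "impl", xs @ _*) => MergeStrategy.first
  case "logback.xml"                             => MergeStrategy.first
  case other =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(other)
}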