Link between MongoDB Casbah and Logback

I have an sbt project and I work with MongoDB (the Casbah driver).
I want logging in my application, so I tried the Logback framework.
It works, but I don't understand exactly what my code is doing.
Here is my logging code:
import org.slf4j.LoggerFactory
import ch.qos.logback.classic.LoggerContext
import ch.qos.logback.core.util.StatusPrinter

def logger = LoggerFactory.getLogger("Test log")
StatusPrinter.print(LoggerFactory.getILoggerFactory.asInstanceOf[LoggerContext])
logger.info("Azuken")
And here are my logs:
14:36:16.616 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Registering Scala Conversions.
14:36:16.633 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Deserializers for Scala Conversions registering
14:36:16.641 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Serializers for Scala Conversions registering
14:36:16.647 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up OptionSerializer
14:36:16.658 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaCollectionSerializer
14:36:16.669 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaRegexSerializers
14:36:16.677 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Hooking up scala.util.matching.Regex serializer
14:36:16.683 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Reached base registration method on MongoConversionHelper.
14:36:17.056 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Registering Scala Conversions.
14:36:17.059 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Deserializers for Scala Conversions registering
14:36:17.063 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Serializers for Scala Conversions registering
14:36:17.067 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up OptionSerializer
14:36:17.071 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaCollectionSerializer
14:36:17.079 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaRegexSerializers
14:36:17.083 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Hooking up scala.util.matching.Regex serializer
14:36:17.087 [run-main] DEBUG c.m.c.c.c.s.RegisterConversionHelpers$ - Reached base registration method on MongoConversionHelper.
I can see these are MongoDB actions, but I don't understand which line of my code does what.
Any explanation? I've searched the web but haven't found a good one.

There's lots of debug logging in Casbah. None of those DEBUG lines come from your three lines of code: getLogger only creates a logger named "Test log", StatusPrinter.print dumps Logback's internal status messages, and logger.info writes the single "Azuken" INFO line. The DEBUG output is emitted by Casbah's own loggers (RegisterConversionHelpers$) as part of the automatic handling that ensures Scala types are registered for BSON encoding/decoding.
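If you just want to silence those driver messages without touching your own logging, one option is to raise the level of Casbah's loggers. A minimal sketch using Logback's programmatic API (the com.mongodb.casbah logger name is an assumption based on the abbreviated logger names in your output; the same effect can be achieved declaratively with a <logger> element in logback.xml):
import ch.qos.logback.classic.{Level, LoggerContext}
import org.slf4j.LoggerFactory

// Raise Casbah's loggers from DEBUG to INFO so the conversion-registration
// messages are suppressed while your own logging is left alone.
val ctx = LoggerFactory.getILoggerFactory.asInstanceOf[LoggerContext]
ctx.getLogger("com.mongodb.casbah").setLevel(Level.INFO)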

Related

Error while starting Kafka cluster: java.lang.NoSuchMethodError

I am trying to start a Kafka cluster on my local machine running Ubuntu 18.04 with IntelliJ 2019. I have Kafka 2.3, and I already started ZooKeeper. I am trying to run a shell script with the following command:
kafka-server-start.sh $KAFKA_HOME/config/server-0.properties.
I am getting the error below:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/vagrant/app/apache-hive-3.0.0-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/vagrant/app/kafka23/libs/slf4j-log4j12-1.7.26.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2020-06-08T13:36:09,329 INFO [main] kafka.utils.Log4jControllerRegistration$ - Registered kafka:type=kafka.Log4jController MBean
2020-06-08T13:36:09,548 ERROR [main] kafka.Kafka$ - Exiting Kafka due to fatal exception
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
at kafka.Kafka$.getPropsFromArgs(Kafka.scala:43) [kafka_2.12-2.3.0.jar:?]
at kafka.Kafka$.main(Kafka.scala:67) [kafka_2.12-2.3.0.jar:?]
at kafka.Kafka.main(Kafka.scala) [kafka_2.12-2.3.0.jar:?]
Can somebody please help resolve this issue?
The issue turned out to be multiple SLF4J bindings coming from my .bashrc file: both the Hive and Kafka SLF4J bindings were on the classpath and caused the conflict. I commented out the relevant Hive lines in my .bashrc and was then able to create the Kafka cluster.

Logging Configuration Not Working for Akka

I am having problems configuring logging for Akka. The DEBUG messages are not being hidden from STDOUT. In my console I see this:
12:45:27.790 [example-akka.kafka.default-dispatcher-18] DEBUG org.apache.kafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-1, groupId=group1] Resuming partitions [test-topic-0]
12:45:27.823 [example-akka.kafka.default-dispatcher-18] DEBUG org.apache.kafka.clients.FetchSessionHandler - [Consumer clientId=consumer-1, groupId=group1] Node 1001 sent an incremental fetch response for session 1829476633 with 0 response partition(s), 1 implied partition(s)
So I need to stop seeing DEBUG messages. My logging configuration looks like this:
akka {
# Loggers to register at boot time (akka.event.Logging$DefaultLogger logs
# to STDOUT)
loggers = ["akka.event.slf4j.Slf4jLogger"]
# Log level used by the configured loggers (see "loggers") as soon
# as they have been started; before that, see "stdout-loglevel"
# Options: OFF, ERROR, WARNING, INFO, DEBUG
loglevel = "INFO"
# Log level for the very basic logger activated during ActorSystem startup.
# This logger prints the log messages to stdout (System.out).
# Options: OFF, ERROR, WARNING, INFO, DEBUG
stdout-loglevel = "INFO"
}
And in my logback.xml I have this:
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%date{ISO8601} level=[%level] marker=[%marker] logger=[%logger] akkaSource=[%X{akkaSource}]
sourceActorSystem=[%X{sourceActorSystem}] sourceThread=[%X{sourceThread}] mdc=[ticket-#%X{ticketNumber}:
%X{ticketDesc}] - msg=[%msg]%n----%n
</pattern>
</encoder>
</appender>
<root level="info">
<appender-ref ref="STDOUT"/>
</root>
</configuration>
Why are the DEBUG messages not being stopped?
My project structure:
src -> main -> scala (my Scala source files)
src -> main -> scala -> resources (my logback.xml and application.conf)
Your sbt project structure is incorrect: resources must sit under src/main, not under src/main/scala, otherwise logback.xml and application.conf never end up on the classpath. This is the correct structure:
src/
  main/
    resources/
      <files to include in main jar here>
    scala/
      <main Scala sources>
    java/
      <main Java sources>
  test/
    resources/
      <files to include in test jar here>
    scala/
      <test Scala sources>
    java/
      <test Java sources>
Then you also need to follow Mario's advice below about adjusting the logging at the package level.
Log level can be configured at package level, for example, adding the following to logback.xml
<logger name="org.apache.kafka" level="INFO"/>
sets the log level to INFO for all the components inside the org.apache.kafka package, which should stop the DEBUG messages.

Spring Cloud can't work with Kafka

We use Spring Cloud Stream with Kafka. It runs well, but we can't reach the Spring MVC URLs; everything works fine if I remove @EnableBinding(Sink.class).
It seems that @EnableBinding(Sink.class) affects the Spring MVC functionality.
2018-01-11 15:54:19.218 [restartedMain] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'value.serializer' was supplied but isn't a known config.
2018-01-11 15:54:19.218 [restartedMain] WARN org.apache.kafka.clients.consumer.ConsumerConfig - The configuration 'key.serializer' was supplied but isn't a known config.
2018-01-11 15:54:19.219 [restartedMain] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 0.10.1.1
2018-01-11 15:54:19.219 [restartedMain] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : f10ef2720b03b247
I had the same problem.
For your application to communicate with Kafka, you need to define an outbound stream to write messages to a Kafka topic and an inbound stream to read messages from a Kafka topic. Spring Cloud Stream provides a convenient way to do this: you create an interface that defines a separate method for each stream.
It seems that your binding is not in the Spring context; you should register the listener with the @Component annotation, for example as sketched below.
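A minimal sketch, written in Scala to match the rest of this thread (class and method names are illustrative; in Java the same shape applies with @EnableBinding(Sink.class)):
import org.springframework.cloud.stream.annotation.{EnableBinding, StreamListener}
import org.springframework.cloud.stream.messaging.Sink
import org.springframework.stereotype.Component

// Registering the listener as a Spring @Component puts the binding in the
// application context; the provided Sink interface exposes the "input"
// channel that Spring Cloud Stream wires to the configured Kafka topic.
@Component
@EnableBinding(Array(classOf[Sink]))
class EventListener {
  @StreamListener(Sink.INPUT)
  def handle(payload: String): Unit =
    println(s"received: $payload")
}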

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$

I imported Spark code to run in Eclipse and I am getting build errors; the same code works fine from the terminal.
My code:
/*SampleApp.scala:
This application simply counts the number of lines that contain "val"
*/
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
object SimpleApp {
  def main(args: Array[String]) {
    val txtFile = "file:///home/edureka/Desktop/readme.txt"
    val conf = new SparkConf().setMaster("local[2]").setAppName("Sample Application")
    val sc = new SparkContext(conf)
    val txtFileLines = sc.textFile(txtFile, 2).cache()
    val numAs = txtFileLines.filter(line => line.contains("bash")).count()
    println("Lines with bash: %s".format(numAs))
  }
}
The error output:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/edureka/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/edureka/spark-1.1.1/assembly/target/scala-2.10/spark-assembly-1.1.1-hadoop2.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/08/16 17:00:16 WARN util.Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.211.130 instead (on interface eth2)
15/08/16 17:00:16 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/08/16 17:00:16 INFO spark.SecurityManager: Changing view acls to: edureka
15/08/16 17:00:16 INFO spark.SecurityManager: Changing modify acls to: edureka
15/08/16 17:00:16 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(edureka); users with modify permissions: Set(edureka)
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
at akka.actor.ActorCell$.<init>(ActorCell.scala:305)
at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
at akka.actor.RootActorPath.$div(ActorPath.scala:152)
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:191)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at scala.util.Success.flatMap(Try.scala:230)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:550)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1504)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1495)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
at SimpleApp$.main(SampleApp.scala:14)
at SimpleApp.main(SampleApp.scala)
Be careful, this kind of problem happens quite often with Spark. If you don't want other surprises, you can build Spark yourself against the right versions of the dependencies you may be using (Guava, log4j, Scala, Jackson). Also, consider using the spark.driver.userClassPathFirst and spark.executor.userClassPathFirst properties to make your classpath take priority over Spark's bundled dependencies. Personally, it only worked for me when passing them as parameters to spark-submit; it did not work when setting them in SparkConf (which makes sense).
Even with these properties set to true, you may still have problems because Spark uses a separate classloader, which can lead to issues even if your dependencies have the same version number. In that case, only building Spark manually will fix it (to my knowledge).
I actually did install Spark with all its dependencies and ran the code, and it worked. The main point was to set the directory structure correctly: create a project, create the src/main/scala structure inside it, and put the actual program file (code.scala) there. The .sbt build file should be in the main project folder (see the sketch below). Thanks Dici.
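For reference, a minimal build.sbt along those lines might look like this (version numbers are illustrative; the Scala version must match the Spark assembly you run against, which in the log above is Scala 2.10):
// build.sbt in the project root
name := "SampleApp"

version := "1.0"

// Spark 1.1.x assemblies are built against Scala 2.10, so the project must
// compile with a 2.10.x Scala to avoid binary-incompatibility errors such as
// the NoSuchMethodError on scala.collection.immutable.HashSet$ shown above.
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"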

Akka actor logging not writing to file

I'm attempting to log to a file rather than stdout.
My application.conf (in src/main/resources/):
akka {
event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]
loglevel = "DEBUG"
}
logback.xml (in src/main/resources/):
<configuration>
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
<file>log/app.log</file>
<append>true</append>
<encoder>
<pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
</encoder>
</appender>
<root level="DEBUG">
<appender-ref ref="FILE"/>
</root>
</configuration>
Creating the actor system:
val conf: Config = ConfigFactory.load()
val system = ActorSystem("Process", conf)
Finally, the actual logging:
class Processor() extends Actor with ActorLogging {
def receive = {
case Start =>
log.info("Started")
}
}
However, when running the app, I get the logging in stdout:
[info] Running com.imgzine.analytics.apps.ProcessEvents
[DEBUG] [06/02/2014 09:28:53.356] [run-main] [EventStream(akka://Process)] logger log1-Logging$DefaultLogger started
[DEBUG] [06/02/2014 09:28:53.358] [run-main] [EventStream(akka://Process)] Default Loggers started
[INFO] [06/02/2014 09:28:53.389] [Process-akka.actor.default-dispatcher-4] [akka://Process/user/processor] Started
[DEBUG] [06/02/2014 09:28:54.887] [Process-akka.actor.default-dispatcher-4] [EventStream] shutting down: StandardOutLogger started
And inside log/app.log, I find:
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Registering Scala Conversions.
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Deserializers for Scala Conversions registering
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Serializers for Scala Conversions registering
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up OptionSerializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaProductSerializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaCollectionSerializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Setting up ScalaRegexSerializers
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Hooking up scala.util.matching.Regex serializer
2014-06-02 DEBUG[ProcessEvents-akka.actor.default-dispatcher-2] c.m.c.c.c.s.RegisterConversionHelpers$ - Reached base registration method on MongoConversionHelper.
Any ideas?
If you are using Akka 2.3.3, as stated in response to my comment, then your logging config is out of date (it uses the old event-handlers setup). Try replacing your logging config with:
akka {
loggers = ["akka.event.slf4j.Slf4jLogger"]
loglevel = "DEBUG"
}
If that does not work, let me know and I'll advise on another solution.
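Also check that the akka-slf4j module is on your classpath, since akka.event.slf4j.Slf4jLogger lives there (Logback itself is clearly already present, given that Casbah's output reaches log/app.log). A minimal sbt sketch, with the version matching your Akka release:
// build.sbt addition
libraryDependencies += "com.typesafe.akka" %% "akka-slf4j" % "2.3.3"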