Slick logging with slf4j-simple - scala

I am using slf4j-simple in my project. I would like to change the logging level for Slick to INFO. After reading Logging options for Slick
and the Class SimpleLogger docs, I have tried adding the following options to my VM arguments:
-Dorg.slf4j.simpleLogger.defaultLogLevel=INFO
-Dlogger.scala.slick=INFO
-Dlogger.scala.slick.jdbc.JdbcBackend.statement=INFO
-Dorg.slf4j.simpleLogger.log.scala.slick=INFO
I see a few INFO level logs coming from Jetty, so basic logging seems to be working. I am also able to change the level of logs shown by using -Dorg.slf4j.simpleLogger.defaultLogLevel=TRACE, but even that only shows more Jetty logs; no Slick logs are shown to me.
How can I configure slf4j-simple to show Slick logs to me?

According to http://www.slf4j.org/api/org/slf4j/impl/SimpleLogger.html the correct system property should be
-Dorg.slf4j.simpleLogger.log.scala.slick=debug
instead of
-Dlogger.scala.slick=INFO
For reference, the list of all loggers used by Slick can be found in logback.xml. We only use the debug level with all of them.
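For example, with slf4j-simple the following VM options should surface Slick's statement logging (a sketch using SimpleLogger's log.<logger-name> property pattern; note that Slick 3.x loggers live under the slick package prefix rather than scala.slick, so adjust the names to your version):
-Dorg.slf4j.simpleLogger.defaultLogLevel=info
-Dorg.slf4j.simpleLogger.log.scala.slick=debug
-Dorg.slf4j.simpleLogger.log.scala.slick.jdbc.JdbcBackend.statement=debug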

Related

How to run a Scio pipeline on Dataflow from SBT (local)

I am trying to run my first Scio pipeline on Dataflow.
The code in question can be found here. However, I do not think that is too important.
My first experiment was to read some local CSV files and write another local CSV file, using the DirectRunner. That worked as expected.
Now, I am trying to read the files from GCS, write the output to BigQuery, and run the pipeline using the DataflowRunner. I have already made all the necessary changes (or so I believe). But I am unable to make it run.
I have already run gcloud auth application-default login, and when I do
sbt run --runner=DataflowRunner --project=project-id --input-path=gs://path/to/data --output-table=dataset.table
I can see the job is submitted to Dataflow. However, after one hour the job fails with the following error message.
Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h.
(Note, the job did nothing in all that time, and since this is an experiment the data is simply too small to take more than a couple of minutes.)
Checking Stackdriver, I can find the following error:
java.lang.ClassNotFoundException: scala.collection.Seq
Related to some Jackson thing:
java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.scala.DefaultScalaModule could not be instantiated
And that is what is killing each executor right at the start. I really do not understand why it cannot find the Scala standard library.
I also tried to first create a template and run it later with:
sbt run --runner=DataflowRunner --project=project-id --input-path=gs://path/to/data --output-table=dataset.table --stagingLocation=gs://path/to/staging --templateLocation=gs://path/to/templates/template-1
But, after running the template, I get the same error.
Also, I noticed that in the staging folder there are a lot of jars, but the scala-library.jar is not in there.
Am I missing something obvious?
It's a known issue with sbt 1.3.0, which introduced some breaking changes w.r.t. class loaders. Try 1.2.8?
Also the Jackson issue is probably related to Java 11 or above. Stay with Java 8 for now.
Fix it by setting the sbt classLoaderLayeringStrategy in build.sbt:
run / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat
sbt uses a new class loader for the application that is run with run. This allows classes already loaded by the JVM (Predef, for instance) to be reused, reducing startup time. See in-process classloaders for details.
This doesn't play well with the Beam DataflowRunner because it explicitly does not stage classes from parent classloaders, see PipelineResources.java#L51:
Attempts to detect all the resources the class loader has access to. This does not recurse to class loader parents stopping it from pulling in resources from the system class loader.
So the fix is to force all classes used by your application to be loaded in the same classloader so that DataflowRunner stages everything.
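To see why the Flat strategy matters, here is a hypothetical sketch (not Beam's actual code) of the kind of detection PipelineResources performs: only the URLs of the class loader it is handed are collected, so jars loaded by a parent layer, such as scala-library under sbt's default layering, are never staged.
import java.net.URLClassLoader

// Hypothetical illustration: collect the jar paths visible to a class
// loader without recursing into its parents, mirroring Beam's behaviour.
def detectClassPathResources(loader: ClassLoader): Seq[String] = loader match {
  case url: URLClassLoader => url.getURLs.toSeq.map(_.getFile)
  case _                   => Seq.empty // parent loaders deliberately ignored
}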
Hope that helps

Slick is failing to find config values in reference.conf (i.e., reference.conf is being ignored), in test setup

When attempting to execute a query via Slick (v3.0.3), I am receiving a com.typesafe.config.ConfigException$Missing exception (wrapped in an ExceptionInInitializerError), exclaiming:
No configuration setting found for key 'slick'
Apparently Slick requires a config value for slick.dumpPaths to be present when debug logging is enabled. Ordinarily, a default value will be provided by the reference.conf file that comes stock in Slick's jar-file, but for some reason that file (or that particular key) is not getting picked up, in this case.
In addition, adding an application.conf (which includes the requested config value, slick.dumpPaths) to my application's resource directory (src/main/resources/, by default) and/or to the test resource directory does not fix the problem; the exception still occurs.
It turns out this was (apparently) happening because I was attempting to run the Slick query via SBT's Tests.Setup hook. My hook, in build.sbt, looks something like this:
testOptions in Test += Tests.Setup(loader =>
  loader.loadClass("TestSetup").newInstance)
My guess is that SBT has not fully set up the classpath at the time this TestSetup class gets instantiated (and when my Slick query tries to execute). Perhaps someone who knows more about SBT's internals can edit this answer to provide more insight, though.
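One workaround worth trying (an untested sketch; the fix is an assumption, not something I have verified): Typesafe Config resolves reference.conf through the thread's context class loader, so pointing that at the loader sbt hands to Tests.Setup before instantiating the class may let Slick's reference.conf be found:
testOptions in Test += Tests.Setup { loader =>
  // Hypothetical workaround: make the test class loader the context class
  // loader so ConfigFactory.load() can see Slick's reference.conf.
  val previous = Thread.currentThread.getContextClassLoader
  Thread.currentThread.setContextClassLoader(loader)
  try loader.loadClass("TestSetup").newInstance()
  finally Thread.currentThread.setContextClassLoader(previous)
}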

Getting PostSharp 6.0.32 and Log4Net 2.0.0 to log at a custom level

We are using PostSharp with Log4Net as a back end. I am trying to get PostSharp to log at a different level from our manually added log statements, which are at DEBUG level.
I've tried setting the postsharp.config file option below:
<LoggingProfile Name="default" IncludeSourceLineInfo="True">
  <DefaultOptions>
    <LoggingOptions Level="Trace"/>
  </DefaultOptions>
</LoggingProfile>
But that doesn't seem to work.
I've overridden the Log4NetLoggingBackend to try to intercept the Trace level, but it seems that when setting the PostSharp level to Trace, it doesn't hit any of the custom backend code.
Have I missed out a crucial step?
The configuration snippet shown in the question looks correct. Please note, however, that in the configuration file you set the PostSharp logging level, which is then mapped to the logging level of the target logging back-end (log4net in this case). Both LogLevel.Trace and LogLevel.Debug map to the Debug level in log4net, because the log4net library doesn't offer a Trace level.
You can try setting the level to Info to test that your configuration file works as expected.
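For example, the same profile as in the question with only the Level attribute changed:
<LoggingProfile Name="default" IncludeSourceLineInfo="True">
  <DefaultOptions>
    <LoggingOptions Level="Info"/>
  </DefaultOptions>
</LoggingProfile>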

Output of liquibase on console

Is there a way to make Liquibase show the statements it is running (for example when using update) or generating (for example when using updateSql) on the command-line console, while actually running them against the database / generating them to a SQL file?
I am not sure if this would work or not, but you could try setting up Liquibase to use the SLF4J logging framework, then configure Logback (or anything else that SLF4J supports) so that debug-level logging is directed to the console. If I remember correctly, Liquibase logs the statements being executed at debug level.
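Something along these lines in logback.xml might do it (a sketch under those assumptions; the liquibase logger name and the debug-level SQL output are from memory, so verify against your Liquibase version):
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <!-- send Liquibase's debug output (including executed statements) to the console -->
  <logger name="liquibase" level="DEBUG"/>
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>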

GWT logging to file

I am writing my first GWT application and I confess I have no idea how to set up loggers.
I am deploying the application to Tomcat and want to be able to set up a logger so that I can log to a file in $catalina.home. GWT came with logging.properties for a java.util.logging-style log and log4j.properties; I have looked at the documentation for the GWT java.util logger and it seems to just write to the console, so it must be log4j I need?
In the past I've seen org.apache.log4j.Logger used; is this what I want?
Could somebody please point me to somewhere where this is documented?
Thanks.
The documentation is here. You can't use file appenders directly, because the GWT code runs as JavaScript in the browser (when not in development mode). If you want to log to a file, you need to enable remote logging.
If there is a server-side part, logging works as normal. But then it has not much to do with GWT, except for being in the same project and providing services (via a custom protocol).
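Roughly, enabling remote logging means turning on the remote handler in your module's .gwt.xml and mapping the logging servlet in web.xml; a sketch based on the GWT logging documentation (the module path is a placeholder):
<!-- in YourModule.gwt.xml -->
<inherits name="com.google.gwt.logging.Logging"/>
<set-property name="gwt.logging.simpleRemoteHandler" value="ENABLED"/>

<!-- in web.xml -->
<servlet>
  <servlet-name>remoteLogging</servlet-name>
  <servlet-class>com.google.gwt.logging.server.RemoteLoggingServiceImpl</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>remoteLogging</servlet-name>
  <url-pattern>/yourmodule/remote_logging</url-pattern>
</servlet-mapping>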
What do you want to log: RPC service servlets or client logic?
Log4j is Java-only, not JavaScript, so it is meant for logging the classes in your /server/ package that will be deployed on your server.
Your /client/ package classes will be translated to JavaScript and will run in the client browser. So, no Java at all!
You can use log4j "emulated" in JavaScript with http://code.google.com/p/gwt-log/, which will send your client logs to the server via a RemoteLogger over RPC, and then you can log them to a file.
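For the server-side classes, a log4j.properties along these lines would write to a file under $catalina.home (a sketch; it assumes Tomcat sets the catalina.home system property, as a standard install does, and the log file name is a placeholder):
# log4j.properties on the server classpath
log4j.rootLogger=INFO, FILE
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=${catalina.home}/logs/myapp.log
log4j.appender.FILE.MaxFileSize=10MB
log4j.appender.FILE.MaxBackupIndex=5
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d %-5p %c - %m%n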