log4j-over-slf4j missing class org.apache.log4j.varia.NullAppender

We are using a 3rd-party jar that has a dependency on the org.apache.log4j.varia.NullAppender class. We are transitioning to logback, but we are unable to change the 3rd-party jar, so we plan to use the log4j-over-slf4j bridge for now, with the rest of our code still using the existing log4j calls.
Since the log4j-over-slf4j jar does not contain the NullAppender class, what is the best way to spoof it? Should we write our own and put it on the class path? Should we augment the log4j-over-slf4j bridge jar with this class?
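For concreteness, the stub we are considering would be little more than a no-op appender compiled into the org.apache.log4j.varia package. A minimal sketch, assuming the bridge's org.apache.log4j.AppenderSkeleton and org.apache.log4j.spi.LoggingEvent classes are on the classpath (the static getNullAppender accessor mirrors the original log4j class and may not even be needed):

```java
package org.apache.log4j.varia;

import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

// Drop-in stand-in for the NullAppender class that log4j-over-slf4j does not ship.
public class NullAppender extends AppenderSkeleton {

    // The real log4j 1.x class exposes a shared instance; kept here only in case
    // the 3rd-party jar happens to call it (an assumption on our part).
    private static final NullAppender INSTANCE = new NullAppender();

    public static NullAppender getNullAppender() {
        return INSTANCE;
    }

    @Override
    protected void append(LoggingEvent event) {
        // discard everything
    }

    @Override
    public void close() {
        // nothing to release
    }

    @Override
    public boolean requiresLayout() {
        return false;
    }
}
```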
What have others who have run into this done, and does the logback project plan to support backwards compatibility? This is a real hassle for those of us who want to migrate when existing jar files use log4j classes that are not in the bridge.
Any suggestions or best practices?

Related

Upgrade log4j to log4j2 in OSGi environment (Eclipse plugin)

I have an OSGi application (Eclipse plugin) that contains several bundles.
I have a com.domain.dependencies bundle that, as the name suggests, contains dependencies. There is NO code in this bundle. The concept is that all 3rd-party dependencies used by one or more other bundles are contained in this bundle and made available to other bundles within the plugin. This has always worked for the past decade or so that this plugin has evolved.
The above bundle 'pulls in' log4j - an older log4j version 1.x. So, log4j has always been exposed as an available library to other bundles that use com.domain.dependencies.
Due to the recent security issues with log4j2, a company security directive/edict has stated that all use of log4j or log4j2 must be upgraded to log4j2 v2.16.0.
Initially I thought I'd just change the declaration in the build.gradle file for com.domain.dependencies to pull in the newer log4j2, but discovered that log4j2 is split into 'core' and 'api' jars. OK, so I tried to use those instead. I then followed the Apache migration steps for moving from log4j 1.x to 2.x, updated all the code, etc.
After the above, compilation fails. None of the other bundles 'see' log4j2 as they saw log4j. A bit of Googling and I see people talk about creating OSGi Fragments. What's a Fragment? I've read a bit about them and feel none the wiser when it comes to my issue.
I should point out that my plugin also has a dedicated bundle com.domain.log, which depends on com.domain.dependencies, and it's the com.domain.log bundle that contains the log4j.properties file (which also needs tweaking for log4j2). This logging bundle wraps log4j (and soon log4j2) to expose logging features to the other bundles within the plugin.
So when it comes to using fragments, I am confused. I see some articles on the internet suggesting that at least two bundles are required. I don't know if these have to be new, or if I can re-use my existing arrangement of bundles. I struggle to relate those articles to how things are currently set up in my plugin, but I wish to keep the idea that com.domain.dependencies supplies dependencies to other bundles and has no code of its own, while also having com.domain.log continue to expose the same logging functionality to the other bundles that need it.
My instinct is that com.domain.log, which exposes logging functionality to my other bundles, should use log4j-api, while com.domain.dependencies should obtain log4j-core (the implementation) and expose it to com.domain.log. However, I can imagine too many different ways to try to set this up, and all of them will fail unless I do it the right way. Basically, I need help from somebody who knows how to do this in an OSGi environment.
So, how should I wire in log4j2 to mimic the traditional behaviour/functionality in my OSGi environment?
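To make that concrete, the wiring I am imagining (untested; the version ranges and jar paths are guesses) would embed and re-export the log4j2 jars from com.domain.dependencies:

```
Bundle-SymbolicName: com.domain.dependencies
Bundle-ClassPath: .,
 lib/log4j-api-2.16.0.jar,
 lib/log4j-core-2.16.0.jar
Export-Package: org.apache.logging.log4j;version="2.16.0",
 org.apache.logging.log4j.core;version="2.16.0"
```

while com.domain.log (and anything else that logs directly) only imports the API packages:

```
Bundle-SymbolicName: com.domain.log
Import-Package: org.apache.logging.log4j;version="[2.16,3)"
```

I am also aware that the log4j2 jars ship with OSGi metadata of their own, so perhaps they could be installed as first-class bundles instead of being embedded, which is part of what I am unsure about.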

Can SBT scopes be used for custom libraryDependencies for specific code blocks?

I have a simple SBT project in which one code block reads from HDFS (and needs a certain version of Hadoop in libraryDependencies) and another code block writes the filtered result to Cassandra (and needs another version of Hadoop in libraryDependencies).
Can SBT scopes be used to assign a different libraryDependencies to the two code blocks?
You can do this, but you have to split your code over one of the scope axes: project, configuration, task. The only axis that can be used for your purpose is the project axis, so you have to create a multi-project sbt build and split your code across its sub-projects.
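For illustration, the split might look like this in build.sbt (the project names, Hadoop artifact and version numbers are just placeholders):

```scala
// One sub-project per Hadoop version, plus an aggregating root.
lazy val hdfsReader = (project in file("hdfs-reader"))
  .settings(libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.7")

lazy val cassandraWriter = (project in file("cassandra-writer"))
  .settings(libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "3.3.4")

lazy val root = (project in file("."))
  .aggregate(hdfsReader, cassandraWriter)
```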
But this will not solve your problem, because you will not be able to run the resulting application. The Java class loader has no way to decide when to use one version of Hadoop and when to use the other. It will load one version of the classes in question and then use it in all cases.
For this task you have to use a context-aware class loader. An example of this is an OSGi container, like Apache Felix. OSGi is version-aware and can load different versions of the same library in the same Java process. It will then resolve references against the correct version of the library, depending on the context in which the library is used.
To be more precise: you must convert the different versions of your Hadoop library into OSGi bundles. Then you must split your code into multiple OSGi bundles, each with a dependency on the correct version of the Hadoop bundle in its metadata (manifest file). When you want to start your application, you must run it in an OSGi container.
This can be done, but it is quite complex. It is better to clean up your code so that you only depend on one version of the Hadoop library.

Verify OSGi bundles dependencies (import-package) programmatically

I need to validate whether the imported packages of a bundle are fulfilled by a set of other bundles' exported packages. This should not be very hard to implement, but I know that all the OSGi containers, plus Eclipse (when you do "validate bundles" in PDE), already do this; I just don't know how to find that code. Does anyone know what classes/libraries I could use that already implement all this logic?
My goal is to take a list of files (bundles) in the file system and analyse whether the set of bundles is self-contained, and if not, to show all the missing external imports/requires. All this without actually having to run the bundles in a real container.
You should look at the Resolver API in the OSGi spec. Apache Felix has a resolver implementation that is also used by the Equinox framework.
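Short of wiring up the full Resolver, a rough first-pass check can be scripted against the bundle manifests directly. The sketch below ignores version ranges, attributes/directives, optional imports, Require-Bundle and fragments, so treat it as an approximation rather than a real resolver:

```java
import java.io.IOException;
import java.util.*;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Naive self-containment check: collect the Export-Package / Import-Package headers
// of a set of bundle jars and report imports that no bundle in the set exports.
public class NaiveBundleCheck {

    public static void main(String[] args) throws IOException {
        Set<String> exports = new HashSet<>();
        Map<String, Set<String>> importsByBundle = new LinkedHashMap<>();

        for (String path : args) {
            try (JarFile jar = new JarFile(path)) {
                Manifest mf = jar.getManifest();
                if (mf == null) continue;
                exports.addAll(packages(mf.getMainAttributes().getValue("Export-Package")));
                importsByBundle.put(path, packages(mf.getMainAttributes().getValue("Import-Package")));
            }
        }

        importsByBundle.forEach((bundle, imports) -> {
            imports.removeAll(exports);
            imports.removeIf(p -> p.startsWith("java."));   // always supplied by the JRE
            if (!imports.isEmpty()) {
                System.out.println(bundle + " has unresolved imports: " + imports);
            }
        });
    }

    // Rough header parsing: blank out quoted values (version ranges contain commas),
    // split the clauses on ',' and keep only the package name before the first ';'.
    private static Set<String> packages(String header) {
        Set<String> result = new HashSet<>();
        if (header == null) return result;
        String noQuotes = header.replaceAll("\"[^\"]*\"", "\"\"");
        for (String clause : noQuotes.split(",")) {
            result.add(clause.split(";")[0].trim());
        }
        return result;
    }
}
```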

Using JavaCompiler with Classpath referencing jars within ear

I am working on a project in which an enterprise archive (ear) deployed on a JBoss server needs to compile (and run) a class dynamically. I am using the JavaCompiler class to do this - the complication comes from the fact that the class being compiled has references to some of the classes contained within the ejb jar within the ear.
This is not a problem when the deployed ear is 'exploded' on deployment, so it is just a directory rather than an archive - in this case I am able to specify the required jar in the -classpath option of the compiler, and compilation works fine. Unfortunately due to constraints of the systems I am working with, it is not an acceptable solution to deploy these ears 'exploded', and the compiler seems not to be able to 'see' the required jar when it's wrapped up in an archive.
Given that the dynamic compilation is taking place from the ear in question, and therefore the system's class loader has access to the contents of the required jar, is there any way I can tell the compiler to just use the classes as loaded by the system class loader?
I appreciate this is something of a wordy question, but any help would be appreciated.
Thanks
It seems that there is no simple way to have the JavaCompiler load dependencies of the compiled code from a ClassLoader. However, one could implement JavaFileManager directly and redirect the operations for StandardLocation.CLASS_PATH to resource lookups on the context ClassLoader (getResource(&lt;class/resource name&gt;)). This would remove the limitation of the StandardJavaFileManager operating directly on Files.
Someone already seems to have prototyped that approach:
http://atamur.blogspot.de/2009/10/using-built-in-javacompiler-with-custom.html
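In case that link goes stale, here is a rough, self-contained sketch of the same idea. It only handles classpath entries that the ClassLoader reports as plain file: or jar: URLs, so the nested-ear case from the question would still need container-specific handling (e.g. JBoss vfs: URLs); the class names below are made up:

```java
import javax.tools.*;
import java.io.*;
import java.net.*;
import java.util.*;
import java.util.jar.*;

// A .class "file" whose bytes come from a ClassLoader resource rather than the file system.
class ClassLoaderClassObject extends SimpleJavaFileObject {
    private final ClassLoader loader;
    private final String binaryName;

    ClassLoaderClassObject(ClassLoader loader, String binaryName) {
        super(URI.create("classloader:///" + binaryName.replace('.', '/') + ".class"), Kind.CLASS);
        this.loader = loader;
        this.binaryName = binaryName;
    }

    String binaryName() { return binaryName; }

    @Override
    public InputStream openInputStream() {
        return loader.getResourceAsStream(binaryName.replace('.', '/') + ".class");
    }
}

// Redirects CLASS_PATH lookups to a ClassLoader; everything else (sources, the JDK's own
// classes, output locations) is left to the standard file manager.
class ClassLoaderFileManager extends ForwardingJavaFileManager<StandardJavaFileManager> {
    private final ClassLoader loader;

    ClassLoaderFileManager(StandardJavaFileManager delegate, ClassLoader loader) {
        super(delegate);
        this.loader = loader;
    }

    @Override
    public String inferBinaryName(Location location, JavaFileObject file) {
        return file instanceof ClassLoaderClassObject
                ? ((ClassLoaderClassObject) file).binaryName()
                : super.inferBinaryName(location, file);
    }

    @Override
    public Iterable<JavaFileObject> list(Location location, String packageName,
            Set<JavaFileObject.Kind> kinds, boolean recurse) throws IOException {
        if (location != StandardLocation.CLASS_PATH || !kinds.contains(JavaFileObject.Kind.CLASS)) {
            return super.list(location, packageName, kinds, recurse);
        }
        List<JavaFileObject> result = new ArrayList<>();
        String dir = packageName.replace('.', '/');
        for (URL url : Collections.list(loader.getResources(dir))) {
            if ("jar".equals(url.getProtocol())) {
                JarURLConnection conn = (JarURLConnection) url.openConnection();
                conn.setUseCaches(false);                       // so closing the JarFile is safe
                try (JarFile jar = conn.getJarFile()) {
                    for (JarEntry entry : Collections.list(jar.entries())) {
                        String name = entry.getName();
                        if (name.startsWith(dir + "/") && name.endsWith(".class")
                                && (recurse || name.lastIndexOf('/') == dir.length())) {
                            result.add(new ClassLoaderClassObject(loader,
                                    name.substring(0, name.length() - 6).replace('/', '.')));
                        }
                    }
                }
            } else if ("file".equals(url.getProtocol())) {      // exploded directory on disk
                File pkgDir;
                try {
                    pkgDir = new File(url.toURI());
                } catch (URISyntaxException e) {
                    throw new IOException(e);
                }
                File[] classFiles = pkgDir.listFiles((d, n) -> n.endsWith(".class"));
                for (File f : classFiles == null ? new File[0] : classFiles) {
                    result.add(new ClassLoaderClassObject(loader,
                            packageName + "." + f.getName().substring(0, f.getName().length() - 6)));
                }
            }
        }
        return result;
    }
}
```

Compilation would then be kicked off with something like this (the source file path is a placeholder):

```java
JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
StandardJavaFileManager std = compiler.getStandardFileManager(null, null, null);
JavaFileManager fm = new ClassLoaderFileManager(std, Thread.currentThread().getContextClassLoader());
boolean ok = compiler.getTask(null, fm, null, null, null,
        std.getJavaFileObjects(new File("/tmp/Generated.java"))).call();
```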

Selectively include dependencies in JAR

I have a library that I wrote in Scala that uses Bouncy Castle and has a whole bunch of dependencies. When I roll a jar, I can either roll a "fat" jar that has all the dependencies (including Scala), which weighs in at around 19 MB, or I can roll a skinny jar, which doesn't have the dependencies but is only a few hundred KB.
The problem is that I need to include the Bouncy Castle classes/jar with my library, because if it's not on the classpath at runtime, all kinds of exceptions get thrown.
So, I think the ideal situation would be some way to get either Maven or SBT to include some, but not all, dependencies in the jar that gets rolled. Some dependencies are needed at compile time but not at runtime, such as the Scala standard library. Is there some way to get that to happen?
Thanks!
I would try out the sbt proguard plugin from https://github.com/nuttycom/sbt-proguard-plugin . It should be able to weed out the classes that are not in use.
If it is sufficient to explicitly define which dependencies should be added (on the artifact level, i.e., single JARs), you can define an assembly (in the case of a single project) or an additional assembly project (in the case of a multi-module project). Assembly descriptors can explicitly exclude/include artifacts from the dependencies.
Here is some good documentation on this topic (section 8.5.4), and here is the official documentation.
Note that you can include all artifacts that belong to one group by using the wildcard notation in dependencySets, e.g. hibernate:*:jar would include all JAR files belonging to the hibernate group.
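As a concrete (untested) sketch, a descriptor along those lines could look like this; the id and the exclude pattern are just examples:

```xml
<!-- src/main/assembly/runtime-deps.xml -->
<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.1.0">
  <id>runtime-deps</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <unpack>true</unpack>
      <scope>runtime</scope>
      <!-- keep Bouncy Castle and friends, drop the Scala standard library -->
      <excludes>
        <exclude>org.scala-lang:scala-library</exclude>
      </excludes>
      <!-- or whitelist instead, e.g. <include>org.bouncycastle:*:jar</include> -->
    </dependencySet>
  </dependencySets>
</assembly>
```

If I remember correctly, useProjectArtifact defaults to true in a dependencySet, so your own classes end up in the same jar, but that is worth double-checking against the assembly plugin documentation.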
Covering Maven...
Because you declare your project to be dependent upon Bouncy Castle in your Maven POM, anybody using Maven to depend upon your library will by default pull in Bouncy Castle as a transitive dependency.
You should set the appropriate scope on your dependencies, e.g. compile for stuff needed at compile time and runtime, test for dependencies only needed in testing, and provided for stuff you expect to be provided by the environment.
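For example (the artifact ids and versions below are just placeholders):

```xml
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcprov-jdk15on</artifactId>
  <version>1.70</version>
  <scope>compile</scope>    <!-- needed at compile time and runtime -->
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.12.15</version>
  <scope>provided</scope>   <!-- expected to be supplied by the environment -->
</dependency>
```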
Whether your library's dependencies are packaged into dependent projects when they are built is a question of how those projects are configured, and setting the scopes will influence the default behaviour.
For example, jar packaging by default does not include dependencies, whereas war will include those in compile scope (but not test or provided). The design aim here was to have packaging plugins behave in the most commonly required way without needing configuration, but of course packaging plugins in Maven can be configured to have different behaviour if needed. The plugins which do the packaging are well documented at the Apache Maven site.
If users of your library are unlikely to be using Maven to build their projects, an option is to use the shade plugin, which will allow you to produce an "uber-jar" containing all the dependencies you wish. You can configure particular includes or excludes.
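A minimal shade configuration along those lines might look like this (the plugin version and exclude pattern are examples only):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.5.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <artifactSet>
          <!-- bundle everything except the Scala standard library -->
          <excludes>
            <exclude>org.scala-lang:scala-library</exclude>
          </excludes>
        </artifactSet>
      </configuration>
    </execution>
  </executions>
</plugin>
```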
This can be a problematic way to deliver your library, for example where it bundles dependencies that clash with the direct dependencies of the projects using it, i.e. they use a different version of the same libraries than yours does.
However, if you can, it is best to leave this to Maven to manage, so that projects using your library can decide whether they want your dependencies or want to specify particular versions, giving them more flexibility. This is the idiomatic approach.
For more information on dependencies and scopes in Maven, see the reference guide published by Sonatype.
I'm not a Scala guy, but I have played around with assembling stuff in Java + Maven.
Have you tried looking into creating your own assembly descriptor for the assembly plugin? https://maven.apache.org/plugins/maven-assembly-plugin/assembly.html
You can copy/paste the jar-with-dependencies descriptor and then just add some excludes to your <dependencySet>. I'm not a Maven expert, but you should be able to configure it so that different profiles kick off different assembly builds.
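Once you have a descriptor of your own, wiring it into the build looks roughly like this (the descriptor path and plugin version are examples):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.6.0</version>
  <configuration>
    <descriptors>
      <descriptor>src/main/assembly/my-descriptor.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>single</goal></goals>
    </execution>
  </executions>
</plugin>
```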