Dozer 5.3.2 issue NoClassDefFoundError - dozer

I am trying to upgrade from Dozer 5.2.2 to 5.3.2 and I am getting the following error:
java.lang.NoClassDefFoundError: Could not initialize class org.dozer.DozerBeanMapper
The jar is on the classpath and in the build.xml files; I simply changed all references in my workspace from 5.2.2 to 5.3.2. I did the same thing when upgrading from 5.1 to 5.2.2, and it worked.
Any help is appreciated.
Thanks.

I had the same problem, and the only solution I found was to use an older version of Dozer, as you seem to do. We attempted a bigger upgrade (4.2 -> 5.3.2) than you did.
To be clear, I only changed the dependency in my pom.xml to make it work on my application server (WASCE). To sum up:
Working:
<dependency>
<groupId>net.sf.dozer</groupId>
<artifactId>dozer</artifactId>
<version>5.2.2</version>
</dependency>
Not working:
<dependency>
<groupId>net.sf.dozer</groupId>
<artifactId>dozer</artifactId>
<version>5.3.2</version>
</dependency>
The error occurs even when we go through DozerBeanMapperSingletonWrapper:
Caused by: java.lang.NoClassDefFoundError: Could not initialize class
org.dozer.DozerBeanMapper
at org.dozer.DozerBeanMapperSingletonWrapper.getInstance(DozerBeanMapperSingletonWrapper.java:43)
The constructor called here is DozerBeanMapper(List mappingFiles), so it could have been a bug in that code path, but calling the plain constructor DozerBeanMapper() from our own classes produces the same result.
Maybe a dependency is missing between the two versions...
Note that there is no problem with version 5.3.2 when running from Eclipse, so it could also be a classloader problem...
Hope this helps to pinpoint the source of the problem.
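For reference, here is a minimal sketch of the two calls that fail the same way for us (the mapping file name is only an example):
import java.util.Arrays;
import org.dozer.DozerBeanMapper;

public class MapperInitCheck {
    public static void main(String[] args) {
        // Both constructors trigger class initialization of DozerBeanMapper,
        // which is where the NoClassDefFoundError surfaces if one of Dozer's
        // own runtime dependencies is missing:
        DozerBeanMapper withFiles =
                new DozerBeanMapper(Arrays.asList("dozer-mapping.xml"));
        DozerBeanMapper plain = new DozerBeanMapper();
    }
}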

5.3.2 uses org.slf4j.Logger:
http://grepcode.com/file/repo1.maven.org/maven2/net.sf.dozer/dozer/5.3.2/org/dozer/DozerBeanMapper.java/
You are probably missing this library; it was not used in 5.2.2, which used commons-logging instead: http://grepcode.com/file/repo1.maven.org/maven2/net.sf.dozer/dozer/5.2.2/org/dozer/DozerBeanMapper.java/
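If that is the cause, adding the SLF4J API plus a binding of your choice should fix the initialization. A sketch (versions are only examples; align them with what Dozer 5.3.2 pulls in):
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.6.1</version>
</dependency>
<!-- plus exactly one binding, e.g. slf4j-simple or slf4j-log4j12 -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.6.1</version>
</dependency>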

My dozer & dozer-spring version is 5.5.1.
I had the following error: "NoClassDefFoundError: org.dozer.stats.GlobalStatistics (initialization failure)". I only had commons-lang version 2.6 in my dependencies.
I solved the problem by also adding the commons-lang3 dependency. Clearly Dozer has a hard dependency on version 3 of the commons-lang library.
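For example (the version is only an illustration; use whatever your dependency tree resolves):
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.2.1</version>
</dependency>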

Related

How to fix the issue with Java 9 Modularity after adding dependency on mongock?

We are developing a Spring Boot application that uses MongoDB as storage, and we want to add a DB migration tool to the project: mongock.
In pom.xml I added a new dependency:
<dependency>
<groupId>com.github.cloudyrock.mongock</groupId>
<artifactId>mongock-spring</artifactId>
<version>3.3.2</version>
</dependency>
And IntelliJ IDEA advised me to add the following lines to module-info.java:
requires mongock.spring;
requires mongock.core;
After that I am no longer able to build the project; I am getting the following error:
Module 'com.acme.project-name' reads package 'com.github.cloudyrock.mongock' from both 'mongock.spring' and 'mongock.core'
I do not know much about Java 9 modularity, which is why I am stuck on this issue. Please advise.
If upgrading is an option, this issue can be solved by moving to the latest release of the artifact:
<dependency>
<groupId>com.github.cloudyrock.mongock</groupId>
<artifactId>mongock-spring</artifactId>
<version>4.0.1.alpha</version>
</dependency>
You can understand what the issue means from existing Q&A on the topic.
If you analyze the issue, you will quickly notice that with version 3.3.2, two artifacts are brought in as dependencies under external libraries: mongock-spring and mongock-core. Further, if you look at the JARs, you will see that their package structure is the same (i.e. both have classes within com.github.cloudyrock.mongock), and that is the reason for the conflict you see when both are introduced on the module path.
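To illustrate, the failing configuration boils down to a module declaration like the following (the module name is just a placeholder), which the module system rejects because both automatic modules contain the same package:
// module-info.java -- minimal sketch of the failing setup
module com.acme.projectname {
// mongock-spring-3.3.2.jar and mongock-core-3.3.2.jar both contain
// classes in com.github.cloudyrock.mongock, so requiring both creates
// a split package across the two automatic modules:
requires mongock.spring;
requires mongock.core;
}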
Edit: Extended discussions to be moved over to #mongock/issues/212.

java.lang.ClassCastException: __redirected.__XMLInputFactory cannot be cast to org.codehaus.stax2.XMLInputFactory2

I am migrating a Java web application from JBoss 6.0 to WildFly 11. I am getting
"java.lang.ClassCastException: __redirected.__XMLInputFactory cannot
be cast to org.codehaus.stax2.XMLInputFactory2" while running the code
on WildFly 11.0.0.Final. JUnit tests run without error.
It looks like there is some dependency issue in WildFly, but I am unable to find any solution. I'd appreciate any help resolving this issue.
I have included the following Woodstox dependencies in the pom:
woodstox-core-asl 4.4.1
stax2-api 3.1.4
Thanks
Sanjay
This is caused by duplicate classes in the classpath.
WildFly ships stax2-api as part of the woodstox module; see modules/system/layers/base/org/codehaus/woodstox/main/ in the WildFly dist folder.
If you also have it in your application's lib folder, this will cause issues.
The solution is to either set the dependency to <scope>provided</scope> in your pom.xml, or, if you really need a specific version, exclude WildFly's module via jboss-deployment-structure.xml.
See https://docs.jboss.org/author/display/WFLY10/Class+Loading+in+WildFly for more information on classloading in Wildfly.
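For example, the provided scope would look like this (version as in the question):
<dependency>
<groupId>org.codehaus.woodstox</groupId>
<artifactId>stax2-api</artifactId>
<version>3.1.4</version>
<scope>provided</scope>
</dependency>
And a sketch of the module exclusion, placed as jboss-deployment-structure.xml in META-INF (or WEB-INF for a war):
<jboss-deployment-structure>
<deployment>
<exclusions>
<module name="org.codehaus.woodstox" />
</exclusions>
</deployment>
</jboss-deployment-structure>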

Scala and persistence framework version incompatible

I am trying to use the Slick and Squeryl frameworks for data persistence with Scala. I don't want to use the Play framework, just the persistence framework, but when I import the slick (or squeryl) jar file, I encounter the issue below:
slick_2.10.1-2.0.0-M1.jar of <project_name> build path is cross-compiled with an incompatible version of Scala (2.10.1). In case this report is mistaken, this check can be disabled in the compiler preference page.
I use the Scala jar (2.11.6) under the Scala plugin in Eclipse, and I can run a simple Scala application. I can also access a MySQL DBMS via JDBC. The problem only appears when I import the slick (or squeryl) jar files. Is it because the framework does not support Scala 2.11? Is downgrading the Scala version the solution? If so, can anyone point me in a direction for how to downgrade the Scala version under the Eclipse Scala plugin? Thank you very much.
If you are using Scala 2.11, you need to use this dependency for Slick:
<dependency>
<groupId>com.typesafe.slick</groupId>
<artifactId>slick_2.11</artifactId>
<version>3.0.0</version>
</dependency>
The previous answer should resolve your issue with Slick. If you'd like to use Squeryl, the dependency should be:
<dependency>
<groupId>org.squeryl</groupId>
<artifactId>squeryl_2.11</artifactId>
<version>0.9.6-RC3</version>
</dependency>
Or, if you want to use 0.9.5
<dependency>
<groupId>org.squeryl</groupId>
<artifactId>squeryl_2.11</artifactId>
<version>0.9.5-7</version>
</dependency>
Libraries in Scala are only binary compatible with the minor version of Scala they were compiled against. You'll see that in these examples the correct scala version is appended to the artifact ID with an underscore.
If you have the ability to use SBT instead of Maven, I would recommend it. With the %% operator, SBT appends the proper Scala version suffix for you when you reference a dependency like the following:
libraryDependencies += "org.squeryl" %% "squeryl" % "0.9.6-RC3"
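A minimal build.sbt along these lines (versions taken from the examples above) covers both libraries:
// build.sbt -- sketch using the versions discussed above
scalaVersion := "2.11.6"

libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "3.0.0",
  "org.squeryl" %% "squeryl" % "0.9.6-RC3"
)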

scalac: Error: object CharRef in IntelliJ 14

I can successfully compile the code from the terminal with the mvn compile command, but when I compile the code with IntelliJ 14 I get the following error:
Error:scalac: Error: object CharRef does not have a member create
scala.reflect.internal.FatalError: object CharRef does not have a member create
at scala.reflect.internal.Definitions$DefinitionsClass.scala$reflect$internal$Definitions$DefinitionsClass$$fatalMissingSymbol(Definitions.scala:1179)
at ...
What could be a reason?
I fixed this by running File -> Invalidate Caches/Restart.
I had a similar problem while trying to work with some Scala examples for Spark.
Apparently the problem I was experiencing was caused by an incompatibility between the IntelliJ project's Scala version (2.11.7) and the Spark Maven dependency (2.10):
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.4.1</version>
<scope>provided</scope>
</dependency>
Once I changed spark-core_2.10 to spark-core_2.11 and rebuilt the project, everything started to work as expected.
My project depends on Scala 2.10, but the Scala SDK in IDEA was 2.11. I also changed the SDK to 2.10, and the error disappeared.
My version is IntelliJ IDEA 2016.1.3 and it has the same issue. My solution was to change the Scala SDK from 2.12.1 to 2.10.4; that fixed it. Hope it helps, guys.

Jasper Report Image plainWidth()F error

I am getting this error on my Tomcat 6 server, using iReport 1.3.0, iText 2.1.0, and the JasperReports library jasperreports-1.2.8-javaflow.jar.
Some forum posts suggested using the latest iText jar; I replaced it with the latest, cleaned the project, and rebuilt, but I am still getting the same problem.
Error :
SEVERE: Servlet.service() for servlet default threw exception
java.lang.NoSuchMethodError: com.lowagie.text.Image.plainWidth()F
at net.sf.jasperreports.engine.export.JRPdfExporter.exportImage(JRPdfExporter.java:1046)
at net.sf.jasperreports.engine.export.JRPdfExporter.exportElements(JRPdfExporter.java:581)
at net.sf.jasperreports.engine.export.JRPdfExporter.exportPage(JRPdfExporter.java:549)
Hey, no worries, I have solved it; it was an iText jar version incompatibility.
I had two jars, iText.2.1.0.jar and iText.1.3.1.jar, and I removed iText.2.1.0.jar.
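If you are not sure which jar a class is actually being loaded from at runtime, a quick standard-Java check (nothing iText-specific) is:
// Prints the location of the jar that supplied the Image class
System.out.println(com.lowagie.text.Image.class
        .getProtectionDomain().getCodeSource().getLocation());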
For those using the amazing Flying Saucer with Maven and getting this same error, just change the version of the library from an old one (in my case R8pre2) to a new one (in my case, R8):
<dependency>
<groupId>org.xhtmlrenderer</groupId>
<artifactId>core-renderer</artifactId>
<version>R8</version>
</dependency>
Brandizzi did this the correct way. I was previously on iText 2.0.8; moving my org.xhtmlrenderer:core-renderer version to R8 resolved the issue.