Any type of call to Mongo from my Scala application throws this NullPointerException. Can somebody please help?
I am using Mongo 3.0.1 and my Scala version is 2.9.0. The other dependencies are as follows:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>casbah_2.9.1</artifactId>
    <type>pom</type>
    <version>2.6.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>2.11.1</version>
</dependency>
<dependency>
    <groupId>com.novus</groupId>
    <artifactId>salat-core_2.9.1</artifactId>
    <version>1.9.1</version>
</dependency>
<dependency>
    <groupId>com.google.code.morphia</groupId>
    <artifactId>morphia</artifactId>
    <version>0.99</version>
</dependency>
Error:
Caused by: java.lang.NullPointerException
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2$$anonfun$apply$1.apply(PrettyPrinters.scala:74)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2$$anonfun$apply$1.apply(PrettyPrinters.scala:74)
at scala.Option.map(Option.scala:134)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2.apply(PrettyPrinters.scala:74)
at com.novus.salat.util.GraterPrettyPrinter$$anonfun$safeDefault$2.apply(PrettyPrinters.scala:74)
at scala.Option.flatMap(Option.scala:147)
at com.novus.salat.util.GraterPrettyPrinter$class.safeDefault(PrettyPrinters.scala:74)
at com.novus.salat.util.ConstructorInputPrettyPrinter$.safeDefault(PrettyPrinters.scala:108)
at com.novus.salat.util.ConstructorInputPrettyPrinter$$anonfun$apply$3.apply(PrettyPrinters.scala:134)
at com.novus.salat.util.ConstructorInputPrettyPrinter$$anonfun$apply$3.apply(PrettyPrinters.scala:128)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
at scala.collection.mutable.ArrayOps.foreach(ArrayOps.scala:38)
at com.novus.salat.util.ConstructorInputPrettyPrinter$.apply(PrettyPrinters.scala:128)
at com.novus.salat.util.ToObjectGlitch.<init>(ToObjectGlitch.scala:44)
at com.novus.salat.ConcreteGrater.feedArgsToConstructor(Grater.scala:294)
at com.novus.salat.ConcreteGrater.asObject(Grater.scala:263)
at com.novus.salat.ConcreteGrater.asObject(Grater.scala:105)
at com.novus.salat.dao.SalatMongoCursorBase$class.next(SalatMongoCursor.scala:47)
at com.novus.salat.dao.SalatMongoCursor.next(SalatMongoCursor.scala:149)
at scala.collection.Iterator$class.foreach(Iterator.scala:652)
at com.novus.salat.dao.SalatMongoCursor.foreach(SalatMongoCursor.scala:149)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:128)
at scala.collection.TraversableOnce$class.toList(TraversableOnce.scala:242)
at com.novus.salat.dao.SalatMongoCursor.toList(SalatMongoCursor.scala:149)
This issue was due to corrupt data in the database. After clearing that data, it worked.
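In case someone else hits the same thing and cannot simply wipe the collection: the raw mongo-java-driver (2.11.x, already among the dependencies above) can be used to locate the offending documents first. This is only a sketch under assumed names; the database, collection, and field name ("name") are placeholders, not values from the original project.
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.MongoClient;

public class FindBrokenDocuments {
    public static void main(String[] args) throws Exception {
        MongoClient mongo = new MongoClient("localhost", 27017);
        DB db = mongo.getDB("mydb");                     // placeholder database name
        DBCollection coll = db.getCollection("things");  // placeholder collection name
        // Salat typically fails in feedArgsToConstructor when a document lacks a field
        // the case class constructor requires, so query for documents missing it ("name" here).
        DBCursor cursor = coll.find(new BasicDBObject("name", new BasicDBObject("$exists", false)));
        try {
            while (cursor.hasNext()) {
                DBObject doc = cursor.next();
                System.out.println("Suspect document: " + doc.get("_id"));
            }
        } finally {
            cursor.close();
        }
        mongo.close();
    }
}
Deleting or fixing only the documents reported this way avoids dropping the whole collection.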
When I use this dependency:
<dependency>
    <groupId>net.manub</groupId>
    <artifactId>scalatest-embedded-kafka_2.11</artifactId>
    <version>2.0.0</version>
    <scope>test</scope>
</dependency>
With
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
I run into this error:
Cause: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.reader.SupportsScanUnsafeRow
I am trying to figure out which version of scalatest-embedded-kafka will work with Spark 2.3.
Any ideas?
Driven by a dependency-check warning, we tried to bump org.apache.kafka:kafka-clients to version 2.2.1 in our setup, which uses spring-kafka:2.2.7.
As a result, the tests using EmbeddedKafkaRule fail during broker startup with an IOException claiming "Failed to load /some/path..":
java.io.IOException: Failed to load /Users/[..]/target/embedded-kafka during broker startup
at kafka.log.LogManager$$anonfun$createAndValidateLogDirs$1.apply(LogManager.scala:152)
at kafka.log.LogManager$$anonfun$createAndValidateLogDirs$1.apply(LogManager.scala:149)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at kafka.log.LogManager.createAndValidateLogDirs(LogManager.scala:149)
at kafka.log.LogManager.<init>(LogManager.scala:80)
at kafka.log.LogManager$.apply(LogManager.scala:953)
at kafka.server.KafkaServer.startup(KafkaServer.scala:237)
at kafka.utils.TestUtils$.createServer(TestUtils.scala:132)
at kafka.utils.TestUtils.createServer(TestUtils.scala)
at org.springframework.kafka.test.EmbeddedKafkaBroker.afterPropertiesSet(EmbeddedKafkaBroker.java:223)
at org.springframework.kafka.test.rule.EmbeddedKafkaRule.before(EmbeddedKafkaRule.java:109)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:190)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
We made sure to have no conflicting version of kafka-clients on the classpath and also tried to point the logs.dir of the EmbeddedKafkaRule to a folder under Maven's target folder ("target/embedded-kafka" in the above example).
Both without success.
Did anybody have the same issue and manage to resolve it?
I just tested it without any problems.
Did you follow the instructions in the documentation about overriding the Kafka client versions?
When you use spring-kafka-test (version 2.2.x) with the 2.1.x kafka-clients jar, you need to override certain transitive dependencies, as follows:
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>${spring.kafka.version}</version>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka-test</artifactId>
    <version>${spring.kafka.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.11</artifactId>
        </exclusion>
    </exclusions>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.1.1</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.1.1</version>
    <classifier>test</classifier>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.12</artifactId>
    <version>2.1.1</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.12</artifactId>
    <version>2.1.1</version>
    <classifier>test</classifier>
    <scope>test</scope>
</dependency>
Note that when switching to Scala 2.12 (recommended for Kafka 2.1.x and higher), the kafka_2.11 artifact must be excluded from spring-kafka-test, as shown above.
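For reference, a minimal JUnit 4 smoke test along these lines is enough to exercise the embedded broker startup once the versions above are aligned (the class name, topic name, and assertion are placeholders, not taken from the original setup):
import static org.junit.Assert.assertTrue;

import org.junit.ClassRule;
import org.junit.Test;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.rule.EmbeddedKafkaRule;

public class EmbeddedKafkaSmokeTest {

    // One broker, controlled shutdown, one pre-created topic (placeholder name).
    // Broker startup happens in the rule's before(), which is exactly where the
    // IOException in the question was thrown.
    @ClassRule
    public static EmbeddedKafkaRule embeddedKafka = new EmbeddedKafkaRule(1, true, "smoke.test.topic");

    @Test
    public void brokerStartsUp() {
        EmbeddedKafkaBroker broker = embeddedKafka.getEmbeddedKafka();
        // If the broker had failed to start, the rule would already have failed the test class.
        assertTrue(broker.getBrokersAsString().contains(":"));
    }
}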
I am getting a "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)" error while using DataFrames in a Scala app and running it on Spark. However, if I work only with RDDs and not DataFrames, no such error comes up with the same POM and settings. While going through other posts with the same error, it is mentioned that the Scala version has to be 2.10 because Spark is not compatible with Scala 2.11, and I am using Scala 2.10 with Spark 2.0.0.
Below is the snippet from the POM:
<properties>
    <spark-assembly>/usr/lib/spark/lib/spark-assembly.jar</spark-assembly>
    <encoding>UTF-8</encoding>
    <hadoop.version>2.7.1</hadoop.version>
    <hbase.version>1.1.1</hbase.version>
    <scala.version>2.10.5</scala.version>
    <scala.tools.version>2.10</scala.tools.version>
    <spark.version>2.0.0</spark.version>
    <phoenix.version>4.7.0-HBase-1.1</phoenix.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-server</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
Error:
16/10/19 02:57:26 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
at com.abc.xyz.Compare$.main(Compare.scala:64)
at com.abc.xyz.Compare.main(Compare.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
16/10/19 02:57:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;)
16/10/19 02:57:26 INFO spark.SparkContext: Invoking stop() from shutdown hook
Change the Scala version so that it matches the Scala version your Spark build was compiled against:
<scala.version>2.11.8</scala.version>
<scala.tools.version>2.11</scala.tools.version>
and add:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>${scala.version}</version>
</dependency>
I also faced this error, and it is purely a version issue. Either your Scala version is not compatible, or you are using the correct version but the IntelliJ libraries still contain the old one.
Quick fix:
I was using Spark 2.2.0 and Scala 2.10.4, which I then changed to Scala 2.11.8. After that, do the following:
1) Right-click the IntelliJ module.
2) Open the module settings.
3) Go to Libraries and clear all of them.
4) Rebuild.
After doing the above, the issue was resolved for me.
While extending a previously working project, I seem to have muffed a Maven dependency.
JUnit snippet:
Client interimClient = ClientBuilder.newClient();
WebTarget interim = interimClient.target(REST_TARGET_URL);
// POST the test payload as JSON and unmarshal the response entity into a Result
Result persistedResult = interim.request()
        .post(Entity.entity(testResult, MediaType.APPLICATION_JSON), Result.class);
Assert.assertEquals("A result should be persisted", "TEST", persistedResult.getId());
Error:
java.lang.NoClassDefFoundError: org/apache/http/conn/ssl/X509HostnameVerifier
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.newInstance(Class.java:412)
at javax.ws.rs.client.FactoryFinder.newInstance(FactoryFinder.java:116)
at javax.ws.rs.client.FactoryFinder.find(FactoryFinder.java:164)
at javax.ws.rs.client.ClientBuilder.newBuilder(ClientBuilder.java:86)
I tried adding the dependency
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.0-alpha4</version>
</dependency>
...but then got
java.lang.NoClassDefFoundError: org/apache/http/conn/scheme/SchemeSocketFactory
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
Given that this was working just a couple of hours earlier, chasing each successive dependency error feels like sliding down the rabbit hole. Hopefully this is a known jumping-off point that someone can direct me from. Thank you.
We were getting a similar error message when running JUnit tests. Our dependency versions come from org.wildfly:wildfly-parent:10.0.0.Final.
The initial error was:
java.lang.RuntimeException: java.lang.ClassNotFoundException: org.glassfish.jersey.client.JerseyClientBuilder
Adding the following dependency resolved the initial error:
<dependency>
    <groupId>org.jboss.resteasy</groupId>
    <artifactId>resteasy-client</artifactId>
    <scope>provided</scope>
</dependency>
We then received this second error:
java.lang.NoClassDefFoundError: org/apache/http/conn/ssl/X509HostnameVerifier
Adding the following dependency resolved the second (X509HostnameVerifier) error:
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <scope>provided</scope>
</dependency>
Once these two dependencies were added, the problem was resolved. With org.wildfly:wildfly-parent:10.0.0.Final, resteasy-client resolves to version 3.0.14.Final and httpclient resolves to version 4.5.
It appears that my smattering of JAX-RS related dependencies somehow ended up causing this error. I was able to get back into good standing after whittling them down to just the following:
<dependency>
    <groupId>javax.ws.rs</groupId>
    <artifactId>javax.ws.rs-api</artifactId>
</dependency>
<dependency>
    <groupId>javax.ejb</groupId>
    <artifactId>javax.ejb-api</artifactId>
</dependency>
<dependency>
    <groupId>javax.annotation</groupId>
    <artifactId>javax.annotation-api</artifactId>
</dependency>
<dependency>
    <groupId>org.jboss.resteasy</groupId>
    <artifactId>resteasy-client</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.resteasy</groupId>
    <artifactId>resteasy-jackson-provider</artifactId>
    <scope>test</scope>
</dependency>
I am trying to run a very simple Spring application with Java configuration. I am getting the following exception and I don't understand why; as far as I can tell, I have all the required dependencies.
public static void main(String[] args)
{
    System.out.println("Hello World from main!");
    ApplicationContext ctx = new AnnotationConfigApplicationContext(AppConfig.class);
    HelloWorld helloWorld = ctx.getBean(HelloWorld.class);
    System.out.println(helloWorld.getMessage());
}
The exception occurs at the AnnotationConfigApplicationContext constructor call.
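For context, the setup is plain Java config along these lines (a simplified sketch; only the names AppConfig, HelloWorld, and getMessage() appear in the actual code, the bodies here are assumed). Judging by the stack trace below, it is the CGLIB enhancement of this @Configuration class that triggers the ASM lookup.
// AppConfig.java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AppConfig {

    // The bean that main() looks up via ctx.getBean(HelloWorld.class)
    @Bean
    public HelloWorld helloWorld() {
        return new HelloWorld("Hello World from Spring!");
    }
}

// HelloWorld.java (separate file)
public class HelloWorld {

    private final String message;

    public HelloWorld(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }
}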
Worth mentioning: I have the following dependency in my pom.xml file:
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-asm</artifactId>
    <version>${org.springframework.version}</version>
</dependency>
And the exception:
Exception in thread "main" java.lang.IllegalStateException: Cannot load configuration class: spring.play.springStart.AppConfig
at org.springframework.context.annotation.ConfigurationClassPostProcessor.enhanceConfigurationClasses(ConfigurationClassPostProcessor.java:313)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanFactory(ConfigurationClassPostProcessor.java:197)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:681)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:620)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:446)
at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
at spring.play.springStart.App.main(App.java:14)
Caused by: java.lang.NoClassDefFoundError: org/objectweb/asm/util/TraceClassVisitor
at net.sf.cglib.core.DebuggingClassWriter.toByteArray(DebuggingClassWriter.java:73)
at net.sf.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:26)
at net.sf.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:216)
at net.sf.cglib.core.KeyFactory$Generator.create(KeyFactory.java:144)
at net.sf.cglib.core.KeyFactory.create(KeyFactory.java:116)
at net.sf.cglib.core.KeyFactory.create(KeyFactory.java:108)
at net.sf.cglib.core.KeyFactory.create(KeyFactory.java:104)
at net.sf.cglib.proxy.Enhancer.<clinit>(Enhancer.java:69)
at org.springframework.context.annotation.ConfigurationClassEnhancer.newEnhancer(ConfigurationClassEnhancer.java:136)
at org.springframework.context.annotation.ConfigurationClassEnhancer.enhance(ConfigurationClassEnhancer.java:109)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.enhanceConfigurationClasses(ConfigurationClassPostProcessor.java:303)
... 6 more
Caused by: java.lang.ClassNotFoundException: org.objectweb.asm.util.TraceClassVisitor
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 17 more
I actually don't really know what solved the problem, but I went to a tutorial website that does the same thing and just copied their POM file dependencies.
I still don't understand what was missing.
In any case, Balint Bako might be right with his answer, but I am not sure; I had it solved by the time I got back to Stack Overflow.
Here is the POM file:
<!-- Spring 3 dependencies -->
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-core</artifactId>
    <version>${spring.version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context</artifactId>
    <version>${spring.version}</version>
</dependency>
<!-- JavaConfig needs this library -->
<dependency>
    <groupId>cglib</groupId>
    <artifactId>cglib</artifactId>
    <version>2.2.2</version>
</dependency>
Good luck.
As per the exception, it seems that an ASM-related jar is missing. You can add asm-all:
<dependency>
    <groupId>asm</groupId>
    <artifactId>asm-all</artifactId>
    <version>2.1</version>
</dependency>
It worked for me with the following dependencies. Thanks, Rubens Mariuzzo, you cracked the cglib version problem. It never worked with cglib 3.0 but worked with 2.2.2.
<dependencies>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-core</artifactId>
        <version>3.0.0.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-expression</artifactId>
        <version>3.0.0.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-beans</artifactId>
        <version>3.0.0.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context</artifactId>
        <version>3.0.0.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>cglib</groupId>
        <artifactId>cglib</artifactId>
        <version>2.2.2</version>
    </dependency>
</dependencies>
You need spring-core too.
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-core</artifactId>
    <version>${org.springframework.version}</version>
</dependency>
I had the same issue when using the Spock unit test framework with a Spy in a test.
I overcame the issue by adding this to my Maven dependencies:
<dependency>
    <groupId>cglib</groupId>
    <artifactId>cglib</artifactId>
    <version>3.3.0</version>
</dependency>
<dependency>
    <groupId>cglib</groupId>
    <artifactId>cglib-nodep</artifactId>
    <version>3.3.0</version>
</dependency>