I have a Kafka Streams program. It runs on a Windows 64-bit machine with a standalone Kafka server running on the same machine, using Java 8.
The pom declares the Kafka client and streams API dependencies at the latest version, i.e. 0.10.2.
Whenever I run the streams app, it looks for rocksdb's dll file under my user home and fails.
Has anyone faced the same issue? The same code runs fine on a different Windows 64-bit machine.
pom
<dependencies>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.10.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-streams</artifactId>
        <version>0.10.2.1</version>
    </dependency>
</dependencies>
Stacktrace
Exception in thread "StreamThread-1" java.lang.UnsatisfiedLinkError: C:\Users\abcd\AppData\Local\Temp\librocksdbjni8989756873626302713.dll: Can't find dependent libraries
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:64)
at org.rocksdb.RocksDB.<clinit>(RocksDB.java:35)
at org.rocksdb.Options.<clinit>(Options.java:22)
at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:117)
at org.apache.kafka.streams.state.internals.Segment.openDB(Segment.java:38)
at org.apache.kafka.streams.state.internals.Segments.getOrCreateSegment(Segments.java:76)
at org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStore.put(RocksDBSegmentedBytesStore.java:73)
at org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStore.put(ChangeLoggingSegmentedBytesStore.java:55)
at org.apache.kafka.streams.state.internals.MeteredSegmentedBytesStore.put(MeteredSegmentedBytesStore.java:101)
at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:110)
at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:102)
at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:65)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:70)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:197)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:627)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:361)
Related
Any attempt to write a file in Avro format fails with the stack trace below.
We are using Spark 2.4.3 (with user-provided Hadoop) and Scala 2.12, and we load the Avro package at runtime with either spark-shell:
spark-shell --packages org.apache.spark:spark-avro_2.12:2.4.3
or spark-submit:
spark-submit --packages org.apache.spark:spark-avro_2.12:2.4.3 ...
The Spark session reports loading the Avro package successfully.
In either case, the moment we attempt to write any data in Avro format, like:
df.write.format("avro").save("hdfs:///path/to/outputfile.avro")
or with a select:
df.select("recordidstring").write.format("avro").save("hdfs:///path/to/outputfile.avro")
... it produces the same stack trace (this copy is from spark-shell):
java.lang.NoSuchMethodError: org.apache.avro.Schema.createUnion([Lorg/apache/avro/Schema;)Lorg/apache/avro/Schema;
at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:185)
at org.apache.spark.sql.avro.SchemaConverters$.$anonfun$toAvroType$1(SchemaConverters.scala:176)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
at scala.collection.IterableLike.foreach(IterableLike.scala:74)
at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:174)
at org.apache.spark.sql.avro.AvroFileFormat.$anonfun$prepareWrite$2(AvroFileFormat.scala:119)
at scala.Option.getOrElse(Option.scala:138)
at org.apache.spark.sql.avro.AvroFileFormat.prepareWrite(AvroFileFormat.scala:118)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:103)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:170)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:676)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:290)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
We are able to write other formats (text-delimited, json, ORC, parquet) without any trouble.
We are using HDFS (Hadoop v3.1.2) as the filestore.
I have experimented with different package versions of Avro (e.g. 2.11, lower), which either raise the same error or fail to load entirely due to incompatibility. This error occurs with all of Python, Scala (using the shell or spark-submit) and Java (using spark-submit).
There appears to be an open issue on the apache.org JIRA for this, but it is a year old now without any resolution. I've bumped that issue, but I'm also wondering whether the community has a fix. Any help is much appreciated.
I had the same exception on the latest Spark. When I added the following dependencies to the pom, it disappeared.
<properties>
    ....
    <spark.version>3.1.2</spark.version>
    <avro.version>1.10.2</avro.version>
</properties>
<dependencies>
    ....
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-avro_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro</artifactId>
        <version>${avro.version}</version>
    </dependency>
</dependencies>
It seems you are missing required dependencies on the classpath where you are launching your application.
Based on a comment on the linked issue, you should specify Avro at version 1.8.0 or higher, something like this:
spark-submit --packages org.apache.spark:spark-avro_2.12:2.4.3,org.apache.avro:avro:1.9.2 ...
(You might want to try the packages in the other order too.)
I met the same error as yours, but after I updated my Spark version to 2.4.4 (Scala 2.11) the problem disappeared.
This issue appears to be specific to the configuration of our local cluster: single-node builds of HDFS (locally on Windows, on other Linux machines, etc.) allow Avro to write fine. We will rebuild the problem cluster, but I'm confident the issue is a bad config on that cluster only. Solution: rebuild.
I noticed that since Kafka 0.8.2.0, Kafka has shipped with a new maven module:
http://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.0</version>
</dependency>
But it still ships with the older Maven module:
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.8.2.0</version>
</dependency>
What's the difference or relationship between these two modules? I noticed that the SimpleConsumer I have used before is in the kafka_2.11 module but not in kafka-clients. Does that mean that if I want to use SimpleConsumer, I still have to include the kafka_2.11 module?
SimpleConsumer was an old implementation of the consumer in Kafka. It is now deprecated in favor of the new Consumer API. In Kafka 0.8.1 the team started to re-implement the Producer/Consumer APIs, and that work went into the kafka-clients Maven artifact. You can trace the changes between versions: 0.8.1, 0.9.0, 1.0.0, ...
You need to use the new Consumer API if you're using Kafka >= 0.10.
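For reference, here is a minimal sketch of the new Consumer API from the kafka-clients artifact (the broker address, group id and topic name below are placeholders, not taken from your setup):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class NewConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("group.id", "example-group");           // placeholder consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
            while (true) {
                // poll(long) is the 0.10.x-era signature; newer clients also offer poll(Duration)
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}

Unlike SimpleConsumer, this consumer handles broker discovery, offset management and group coordination for you, which is why the old API was deprecated.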
In my Java web app I'm sending messages to Kafka.
I would like to compress my messages before sending them, so I'm setting this in my producer properties:
props.put("compression.codec", "2");
As I understand it, "2" stands for snappy, but when sending a message I'm getting:
java.lang.UnsatisfiedLinkError: org.xerial.snappy.SnappyNative.maxCompressedLength(I)I
at org.xerial.snappy.SnappyNative.maxCompressedLength(Native Method)
at org.xerial.snappy.Snappy.maxCompressedLength(Snappy.java:316)
at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:66)
at kafka.message.SnappyCompression.<init>(CompressionUtils.scala:61)
at kafka.message.CompressionFactory$.apply(CompressionUtils.scala:82)
at kafka.message.CompressionUtils$.compress(CompressionUtils.scala:109)
at kafka.message.MessageSet$.createByteBuffer(MessageSet.scala:71)
at kafka.message.ByteBufferMessageSet.<init>(ByteBufferMessageSet.scala:44)
at kafka.producer.async.DefaultEventHandler$$anonfun$3.apply(DefaultEventHandler.scala:94)
at kafka.producer.async.DefaultEventHandler$$anonfun$3.apply(DefaultEventHandler.scala:82)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
at scala.collection.Iterator$class.foreach(Iterator.scala:772)
at scala.collection.mutable.HashTable$$anon$1.foreach(HashTable.scala:157)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:190)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:45)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:95)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
at scala.collection.mutable.HashMap.map(HashMap.scala:45)
at kafka.producer.async.DefaultEventHandler.serialize(DefaultEventHandler.scala:82)
at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:44)
at kafka.producer.async.ProducerSendThread.tryToHandle(ProducerSendThread.scala:116)
at kafka.producer.async.ProducerSendThread$$anonfun$processEvents$3.apply(ProducerSendThread.scala:95)
at kafka.producer.async.ProducerSendThread$$anonfun$processEvents$3.apply(ProducerSendThread.scala:71)
at scala.collection.immutable.Stream.foreach(Stream.scala:526)
at kafka.producer.async.ProducerSendThread.processEvents(ProducerSendThread.scala:70)
at kafka.producer.async.ProducerSendThread.run(ProducerSendThread.scala:41)
To resolve it I tried adding the snappy dependency to my pom:
<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>${snappy-version}</version>
    <scope>provided</scope>
</dependency>
and added the jar to my Jetty server under /lib/ext, but I'm still getting this error.
If I set "0" instead of "2" in the "compression.codec" property I do not get the exception, as expected.
What should I do in order to be able to use snappy compression?
This is my snappy version (should I use a different one?):
1.1.0.1
I'm deploying my app on Jetty 8.1.9, which runs on Ubuntu 12.10.
<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.1.1.3</version>
</dependency>
I had the same issue, and the code above solved my problem. The jar contains native libraries for all operating systems. Below is my development environment:
JDK version: 1.7.0_76
Kafka version: 2.10-0.8.2.1
Zookeeper version: 3.4.6
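As an aside, if you move to the newer producer API from kafka-clients (rather than the old kafka.producer.* classes shown in the stack trace), snappy is selected by name via compression.type. A minimal sketch, with the broker address and topic name as placeholders:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SnappyProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("compression.type", "snappy"); // named codec instead of the numeric "2"

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", "key", "value")); // placeholder topic
        }
    }
}

The snappy-java jar (with its bundled native libraries) still needs to be on the runtime classpath, as in the dependency above.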
I'm experiencing a very strange problem and I haven't found any help to resolve it. I'm developing an app that uses Jersey to provide RESTful services, which are accessed by a GWT website. The services produce JSON content that is deserialized by RestyGWT. Everything worked fine until I wanted to serialize polymorphic types. I also use Shiro to protect returned values from unauthorized users.
The serialization process runs well on Mac OS X Lion using Eclipse Indigo and its included Jetty as the web server. It also works well using Tomcat 6. On Linux (an Ubuntu 12.04 distro), the app runs well using Tomcat 6, but when I use Eclipse's included Jetty as the web server, the exception below is always thrown.
I tried updating each lib to the latest stable version, as suggested in some posts. Has anyone experienced this problem?
Thank you in advance.
java.lang.NoSuchFieldError: EXTERNAL_PROPERTY
at org.codehaus.jackson.map.introspect.JacksonAnnotationIntrospector._findTypeResolver(JacksonAnnotationIntrospector.java:781)
at org.codehaus.jackson.map.introspect.JacksonAnnotationIntrospector.findTypeResolver(JacksonAnnotationIntrospector.java:199)
at org.codehaus.jackson.map.AnnotationIntrospector$Pair.findTypeResolver(AnnotationIntrospector.java:1032)
at org.codehaus.jackson.map.ser.BasicSerializerFactory.createTypeSerializer(BasicSerializerFactory.java:200)
at org.codehaus.jackson.map.ser.BasicSerializerFactory.buildContainerSerializer(BasicSerializerFactory.java:406)
at org.codehaus.jackson.map.ser.BeanSerializerFactory.createSerializer(BeanSerializerFactory.java:268)
at org.codehaus.jackson.map.ser.StdSerializerProvider._createUntypedSerializer(StdSerializerProvider.java:782)
at org.codehaus.jackson.map.ser.StdSerializerProvider._createAndCacheUntypedSerializer(StdSerializerProvider.java:735)
at org.codehaus.jackson.map.ser.StdSerializerProvider.findValueSerializer(StdSerializerProvider.java:344)
at org.codehaus.jackson.map.ser.StdSerializerProvider.findTypedValueSerializer(StdSerializerProvider.java:420)
at org.codehaus.jackson.map.ser.StdSerializerProvider._serializeValue(StdSerializerProvider.java:601)
at org.codehaus.jackson.map.ser.StdSerializerProvider.serializeValue(StdSerializerProvider.java:256)
at org.codehaus.jackson.map.ObjectMapper.writeValue(ObjectMapper.java:1613)
at org.codehaus.jackson.jaxrs.JacksonJsonProvider.writeTo(JacksonJsonProvider.java:558)
at com.sun.jersey.spi.container.ContainerResponse.write(ContainerResponse.java:306)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1451)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1363)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1353)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:414)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:708)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:487)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1097)
at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:61)
at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108)
at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66)
at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449)
at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365)
at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:380)
at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362)
at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1088)
at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:360)
at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:729)
at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.handler.RequestLogHandler.handle(RequestLogHandler.java:49)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:324)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:505)
at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:829)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:513)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:211)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:380)
at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:395)
at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:488)
In my case the same exception was caused by having the RestyGWT library on the classpath.
RestyGWT includes the Jackson annotation classes in its own source tree. This leads to version conflicts between the Jackson library and RestyGWT starting with Jackson 1.9, where a new enumeration value (EXTERNAL_PROPERTY) was introduced in the As enum of the JsonTypeInfo class; it is resolved at runtime and thus leads to the above exception.
My solution was to go back to version 1.7.1 of Jackson. As an aside: from my point of view, redundantly including the source code is not the way to resolve dependencies.
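For context, this is the kind of polymorphic mapping that sends Jackson's introspector through the JsonTypeInfo.As enum at runtime; the class names here are made up for illustration:

import org.codehaus.jackson.annotate.JsonSubTypes;
import org.codehaus.jackson.annotate.JsonTypeInfo;

// Serializing any @JsonTypeInfo-annotated type makes
// JacksonAnnotationIntrospector._findTypeResolver() read the As enum, which
// fails with NoSuchFieldError when an older copy of the annotation classes
// (e.g. the one bundled by RestyGWT) is loaded instead of Jackson 1.9's.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type")
@JsonSubTypes({
        @JsonSubTypes.Type(value = Dog.class, name = "dog"),
        @JsonSubTypes.Type(value = Cat.class, name = "cat")
})
abstract class Animal { }

class Dog extends Animal { public String breed; }

class Cat extends Animal { public boolean indoor; }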
Check that your Jackson version is 1.9 or higher, because this enum value was added in 1.9.
I've experienced the same issue recently with a GWT 2.4 application, using Jackson 1.9.7 and RestyGWT 1.3.
We're using one of Thomas Broyer's archetypes, which produces three different Maven projects for the GWT app. We have RestyGWT included as a dependency in the client and shared projects. Because the server project depends on the shared one, the server project received a transitive dependency on RestyGWT. So, in order to make it work, I excluded the RestyGWT dependency when declaring the shared project dependency in the server project:
<dependency>
    <groupId>io.pst</groupId>
    <artifactId>accounts-ui-shared</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.fusesource.restygwt</groupId>
            <artifactId>restygwt</artifactId>
        </exclusion>
    </exclusions>
</dependency>
A simple workaround is to exclude the old Jackson lib from your RestyGWT dependency, while ensuring the most recent one will supersede it:
<dependency>
    <groupId>org.fusesource.restygwt</groupId>
    <artifactId>restygwt</artifactId>
    <version>1.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-core-asl</artifactId>
    <version>1.9.9</version>
</dependency>
I am using GWT in my project. Recently I tried converting a manual build of GWT + Java + Tomcat into a Maven project. I am almost able to package it successfully into a war, but when I deployed it on Tomcat I got the following error:
SEVERE: Exception while dispatching incoming RPC call
java.lang.NoClassDefFoundError: javax/validation/Path
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2818)
at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1159)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1647)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1526)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at com.google.gwt.user.server.rpc.SerializationPolicyLoader.loadFromStream(SerializationPolicyLoader.java:196)
at com.google.gwt.user.server.rpc.RemoteServiceServlet.loadSerializationPolicy(RemoteServiceServlet.java:90)
at com.google.gwt.user.server.rpc.RemoteServiceServlet.doGetSerializationPolicy(RemoteServiceServlet.java:293)
at com.google.gwt.user.server.rpc.RemoteServiceServlet.getSerializationPolicy(RemoteServiceServlet.java:157)
at com.google.gwt.user.server.rpc.impl.ServerSerializationStreamReader.prepareToRead(ServerSerializationStreamReader.java:455)
Here is what I added in my pom.xml:
<dependency>
    <groupId>com.google.gwt</groupId>
    <artifactId>gwt-servlet</artifactId>
    <version>${gwt.version}</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>com.google.gwt</groupId>
    <artifactId>gwt-user</artifactId>
    <version>${gwt.version}</version>
    <scope>provided</scope>
</dependency>
as dependencies.
Kindly help me. What more do I need to add to resolve the issue?
It is the dependency scope that is causing the problem. compile is actually the default scope, so it could be omitted from the first dependency if you want.
The problem is in the second artifact, which is declared as provided. This means the application expects the web container to provide that library and its classes. It looks like the container is not providing the required classes, which results in the NoClassDefFoundError.
Removing <scope>provided</scope> will instruct Maven to package that library with the application, and Tomcat should be able to get past that error; a sketch of the change follows below.
There are no compile-time errors since gwt-user is available at compile time; it is simply not available at run time for Tomcat.
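Concretely, the change would look like this in the pom (a sketch based on the dependency block above, reusing the same version property):

<dependency>
    <groupId>com.google.gwt</groupId>
    <artifactId>gwt-user</artifactId>
    <version>${gwt.version}</version>
    <!-- no <scope>provided</scope>: Maven now packages this library into the war,
         so its classes are on Tomcat's runtime classpath -->
</dependency>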