Storm topology deployment failing with IllegalArgumentException when storm client upgraded from 0.10 to 1.1.0

Recently we upgraded our Storm version from 0.10 to 1.1.0, but deploying a topology with the upgraded client failed with:
Exception in thread "main" java.lang.IllegalArgumentException
at org.apache.storm.hack.shade.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.storm.hack.shade.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.storm.hack.shade.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.storm.hack.DefaultShader.addRemappedClass(DefaultShader.java:182)
at org.apache.storm.hack.DefaultShader.shadeJarStream(DefaultShader.java:103)
at org.apache.storm.hack.StormShadeTransformer.transform(StormShadeTransformer.java:35)
at org.apache.storm.daemon.ClientJarTransformerRunner.main(ClientJarTransformerRunner.java:37)

The issue turned out to be dependency conflicts between asm and guice on Java 8. Excluding all the guice dependencies, which were coming in transitively from a dependent jar but weren't actually used in the storm project, resolved it.
To be on the safe side, I also cross-checked that my project has no asm dependency other than the one coming from storm-core.
I spent more than a day reaching this resolution; I hope it helps someone :)
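For illustration, here is a minimal sketch of what such an exclusion looks like in pom.xml; the dependency shown is a placeholder, since the jar that pulls in guice will differ per project:

<dependency>
    <groupId>com.example</groupId> <!-- placeholder for the jar that pulls in guice -->
    <artifactId>dependent-jar</artifactId>
    <version>1.0</version>
    <exclusions>
        <!-- keep guice off the classpath so it cannot clash with storm's shaded asm -->
        <exclusion>
            <groupId>com.google.inject</groupId>
            <artifactId>guice</artifactId>
        </exclusion>
    </exclusions>
</dependency>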

For us, the issue was that we had
client.jartransformer.class : org.apache.storm.hack.StormShadeTransformer
in our storm.yaml. Removing this line fixed the issue.

Related

Connection failed to Kafka with IBM InfoSphere

While trying to read a Kafka topic in an InfoSphere job, I got the error
Kafka_Customer: java.lang.NoClassDefFoundError: org.apache.kafka.clients.consumer.ConsumerRebalanceListener
at com.ibm.is.cc.kafka.runtime.KafkaProcessor.validateConfiguration (KafkaProcessor.java: 145)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor (CC_JavaAdapter.java: 1008)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData (CC_JavaAdapter.java: 705)
Kafka_Customer: Java runtime exception occurred: java.lang.NoClassDefFoundError: org.apache.kafka.clients.consumer.ConsumerRebalanceListener (com.ibm.is.cc.kafka.runtime.KafkaProcessor::validateConfiguration, file KafkaProcessor.java, line 145)
I should add the missing jar file, but where, and how can I tell which version is needed? I couldn't find anything after a lot of googling.
kafka-clients JAR versions are backwards compatible down to 0.10.2; however, I would assume the Kafka processor should already ship with this class, so you may want to reach out to IBM support.
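As a quick sanity check (the jar path here is a placeholder), javap from the JDK can tell you whether a given kafka-clients jar actually contains the missing class:

javap -classpath /path/to/kafka-clients.jar org.apache.kafka.clients.consumer.ConsumerRebalanceListener

If the class is present, javap prints its declaration; otherwise it reports that the class was not found.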

Error upgrading to Mongo java driver 3.2.2

We have migrated to MongoDB 3.2.6. What would be compatible versions of the jars for the dependencies below?
mongo java driver version (org.mongodb)
spring data mongo version (org.springframework.data)
spring data commons version (spring-data-commons)
I have tried upgrading to 3.2.2 for the Java driver and 1.9.4.RELEASE for Spring Data and Spring Commons, but I am facing Maven compatibility issues. The two issues below are the ones I am unable to resolve so far.
Kindly suggest what could be the problem.
Issue 1:
The type org.springframework.data.repository.query.QueryByExampleExecutor cannot be resolved. It is indirectly referenced from required .class files
Issue 2:
Error occured processing XML 'Invalid default: public abstract java.lang.Class
org.springframework.data.mongodb.repository.config.EnableMongoRepositories.repositoryBaseClass()'. See Error Log for more details
mvn clean dependency:tree runs successfully, but mvn clean compile fails with the Issue 1 error mentioned above.
Answer:
I was able to resolve both issues by upgrading spring-data-commons to 1.12.1, which fixes the compile-time errors mentioned above.
Below are my current settings.
mongo-java-driver to 3.2.2, spring-data-mongodb to 1.9.4.RELEASE, spring-data-commons to 1.12.1.
As per the Spring Data Commons documentation, I also upgraded my Spring Framework version to 4.2.8.RELEASE.
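For reference, the Maven coordinates for that combination would look roughly like this (a sketch; adjust to your own build):

<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>3.2.2</version>
</dependency>
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-mongodb</artifactId>
    <version>1.9.4.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-commons</artifactId>
    <version>1.12.1.RELEASE</version>
</dependency>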
I am now facing the issue below, which I couldn't resolve; any ideas would be appreciated.
Issue 1:
06:37:33.943 [localhost-startStop-1] DEBUG o.s.c.t.c.AnnotationAttributesReadingVisitor - Failed to class-load type while reading annotation metadata. This is a non-fatal error, but certain annotation metadata may be unavailable.
java.lang.ClassNotFoundException: javax.annotation.Nullable
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1305) ~[catalina.jar:na]
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1139) ~[catalina.jar:na]
at org.springframework.core.type.classreading.RecursiveAnnotationAttributesVisitor.visitEnd(RecursiveAnnotationAttributesVisitor.java:47) ~[spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.asm.ClassReader.readAnnotationValues(ClassReader.java:1802) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.asm.ClassReader.readMethod(ClassReader.java:976) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.asm.ClassReader.accept(ClassReader.java:695) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.asm.ClassReader.accept(ClassReader.java:508) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.core.type.classreading.SimpleMetadataReader.<init>(SimpleMetadataReader.java:64) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.core.type.classreading.SimpleMetadataReaderFactory.getMetadataReader(SimpleMetadataReaderFactory.java:98) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.core.type.classreading.CachingMetadataReaderFactory.getMetadataReader(CachingMetadataReaderFactory.java:102) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.core.type.classreading.SimpleMetadataReaderFactory.getMetadataReader(SimpleMetadataReaderFactory.java:93) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.core.type.filter.AbstractTypeHierarchyTraversingFilter.match(AbstractTypeHierarchyTraversingFilter.java:121) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.core.type.filter.AbstractTypeHierarchyTraversingFilter.match(AbstractTypeHierarchyTraversingFilter.java:105) [spring-core-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.data.repository.config.RepositoryConfigurationDelegate$LenientAssignableTypeFilter.match(RepositoryConfigurationDelegate.java:202) [spring-data-commons-1.12.1.RELEASE.jar:na]
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.isCandidateComponent(ClassPathScanningCandidateComponentProvider.java:346) [spring-context-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.findCandidateComponents(ClassPathScanningCandidateComponentProvider.java:280) [spring-context-4.2.8.RELEASE.jar:4.2.8.RELEASE]
at org.springframework.data.repository.config.RepositoryConfigurationDelegate.multipleStoresDetected(RepositoryConfigurationDelegate.java:167) [spring-data-commons-1.12.1.RELEASE.jar:na]
at org.springframework.data.repository.config.RepositoryConfigurationDelegate.<init>(RepositoryConfigurationDelegate.java:88) [spring-data-commons-1.12.1.RELEASE.jar:na]
at org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport.registerBeanDefinitions(RepositoryBeanDefinitionRegistrarSupport.java:80) [spring-data-commons-1.12.1.RELEASE.jar:na]
I tried adding the jsr305 dependency from com.google.code.findbugs, but I am still seeing the same exceptions.
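For reference, this is the dependency I added (the version is an assumption; any release that provides javax.annotation.Nullable should do):

<dependency>
    <groupId>com.google.code.findbugs</groupId>
    <artifactId>jsr305</artifactId>
    <version>3.0.2</version> <!-- assumed version -->
</dependency>

Note that the log entry itself says this is a non-fatal DEBUG-level error, so it may be safe to ignore.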

maven-scr-plugin fails with SCRDescriptorException "unable to scan files ... Class file format probably not supported by ASM ?"

Using the following dependencies (amongst others; the bundle is supposed to be installed to AEM 6.1):
runtime is java8
maven-scr-plugin 1.15.0
org.apache.felix.scr.annotations: 1.9.8
org.apache.felix.scr.ds-annotations: 1.2.8
I get this exception
Caused by: org.apache.felix.scrplugin.SCRDescriptorException: Unable to scan class files: ...
(Class file format probably not supported by ASM ?)
at org.apache.felix.scrplugin.helper.ClassScanner.processClass(ClassScanner.java:219)
at org.apache.felix.scrplugin.helper.ClassScanner.process(ClassScanner.java:161)
at org.apache.felix.scrplugin.helper.ClassScanner.scanSources(ClassScanner.java:146)
at org.apache.felix.scrplugin.SCRDescriptorGenerator.execute(SCRDescriptorGenerator.java:146)
at org.apache.felix.scrplugin.mojo.SCRDescriptorMojo.execute(SCRDescriptorMojo.java:221)
... 22 more
Caused by: java.lang.IllegalArgumentException
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.felix.scrplugin.helper.ClassScanner.processClass(ClassScanner.java:201)
The class in question does not contain any OSGi annotations at all; it is merely imported in some other @Component-annotated classes.
Has anyone encountered this and found a solution?
I ran into this issue today. It happens when you run maven-scr-plugin with the scanClasses=true option. Older versions of maven-scr-plugin cannot scan class files generated by Java 8, so you will have to either switch to a newer version of the scr plugin (I upgraded to 1.22) or set your maven-compiler-plugin target config to 1.7.
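A sketch of the two options in pom.xml (versions as mentioned above; pick one):

<!-- option 1: a newer scr plugin that can read Java 8 class files -->
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-scr-plugin</artifactId>
    <version>1.22.0</version>
</plugin>

<!-- option 2: keep the old plugin but compile to Java 7 byte code -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <source>1.7</source>
        <target>1.7</target>
    </configuration>
</plugin>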
I found out that the Felix SCR Annotation Processor plugin I had installed in IntelliJ, as a prerequisite to using aem-ide-tooling-4-intellij from headwirecom, was causing the issue. It had been working for quite a while until today, when it suddenly started breaking my unit test runs (needless to say, there were no changes to my Java or Maven versions, or to my IDE).
This prompted me to update my Java and IntelliJ versions, which didn't fix the issue, but disabling the SCR annotation plugin did.
As you can see, this plugin is really old (2014). I hope they will release a newer version soon.

NoSuchMethod exception in Flink when using dataset with custom object array

I have a problem with Flink:
java.lang.NoSuchMethodError: org.apache.flink.api.java.typeutils.ObjectArrayTypeInfo.getInfoFor(Lorg/apache/flink/api/common/typeinfo/TypeInformation;)Lorg/apache/flink/api/java/typeutils/ObjectArrayTypeInfo;
at LowLevel.FlinkImplementation.FlinkImplementation$$anon$6.<init>(FlinkImplementation.scala:28)
at LowLevel.FlinkImplementation.FlinkImplementation.<init>(FlinkImplementation.scala:28)
at IRLogic.GmqlServer.<init>(GmqlServer.scala:15)
at it.polimi.App$.main(App.scala:20)
at it.polimi.App.main(App.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
The line with the problem is this one:
implicit val regionTypeInformation =
api.scala.createTypeInformation[FlinkDataTypes.FlinkRegionType]
In FlinkRegionType I have an Array of custom objects.
I developed the app with the Maven plugin in the IDE and everything works fine, but when I move to the Flink version I downloaded from the website I get the error above.
I am using Flink 0.9.
I thought some library might be missing, but I am using Maven to handle everything. Moreover, reading through the code of ObjectArrayTypeInfo.java, it doesn't seem to be the problem.
A NoSuchMethodError commonly indicates a version mismatch between the libraries a Flink program was compiled against and the system the program is executed on, especially if the same code works in an IDE setup where the compile and execution libraries are the same.
In that case, you should check the version of the Flink dependencies, for example in the Maven POM file, and make sure it matches the version of the Flink system that runs the program.
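As a sketch, that means pinning the Flink dependencies in the POM to the same version as the downloaded distribution (0.9.0 here is an assumption based on "Flink 0.9" above):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-scala</artifactId>
    <!-- must match the version of the Flink installation that runs the job -->
    <version>0.9.0</version>
</dependency>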

How to fix akka version compatibility issues?

I was thinking of using Spark and Redis together with SBT.
It runs fine if I comment out the Spark dependency; if I include it, I get:
Exception in thread "main" java.lang.NoSuchMethodError: akka.actor.ActorSystem.dispatcher()Lscala/concurrent/ExecutionContextExecutor;
at redis.RedisClientActorLike.<init>(Redis.scala:31)
at redis.RedisClient.<init>(Redis.scala:69)
I have no issues when I do not include rediscala; when I do include it, I get weird errors about Akka.
How do I get around this?
It appears that those versions of Spark and rediscala are using incompatible versions of Akka. Spark 1.1.0 uses Akka 2.2.3, and rediscala 1.3.1 uses Akka 2.3.4. There are changes between Akka 2.2.x and 2.3.x that cause issues, and your project currently has both as transitive dependencies.
You either need to downgrade rediscala to 1.2 (which uses Akka 2.2.x), or upgrade Spark to 1.2-snapshot (which uses Akka 2.3.x).
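In build.sbt the downgrade option would look roughly like this (the rediscala group ID shown is the one used by its older releases, so verify it against your build):

// keep Spark 1.1.0 and pin rediscala to the 1.2 line, which still uses Akka 2.2.x
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "com.etaty.rediscala" %% "rediscala" % "1.2"
)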