I am facing an issue with Mule 3.7.0.
I have been trying to deploy an existing, working application to Mule 3.7.0. It fails with an error I can't debug from, because the printed error doesn't point to anything in the application code.
The same application works fine on Mule 3.6.1 and Mule 3.3, but on 3.7.0 it fails with the following error:
ERROR 2015-08-05 12:24:37,863 [WrapperListener_start_runner] org.mule.module.launcher.DefaultArchiveDeployer:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ Failed to deploy artifact 'myapp', see below +
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
org.mule.module.launcher.DeploymentInitException: NullPointerException:
at org.mule.module.launcher.application.DefaultMuleApplication.init(DefaultMuleApplication.java:197) ~[?:?]
at org.mule.module.launcher.artifact.ArtifactWrapper$2.execute(ArtifactWrapper.java:62) ~[?:?]
at org.mule.module.launcher.artifact.ArtifactWrapper.executeWithinArtifactClassLoader(ArtifactWrapper.java:129) ~[?:?]
at org.mule.module.launcher.artifact.ArtifactWrapper.init(ArtifactWrapper.java:57) ~[?:?]
at org.mule.module.launcher.DefaultArtifactDeployer.deploy(DefaultArtifactDeployer.java:25) ~[?:?]
at org.mule.module.launcher.DefaultArchiveDeployer.guardedDeploy(DefaultArchiveDeployer.java:310) ~[?:?]
at org.mule.module.launcher.DefaultArchiveDeployer.deployArtifact(DefaultArchiveDeployer.java:330) ~[?:?]
at org.mule.module.launcher.DefaultArchiveDeployer.deployExplodedApp(DefaultArchiveDeployer.java:297) ~[?:?]
at org.mule.module.launcher.DefaultArchiveDeployer.deployExplodedArtifact(DefaultArchiveDeployer.java:108) ~[?:?]
at org.mule.module.launcher.DeploymentDirectoryWatcher.deployExplodedApps(DeploymentDirectoryWatcher.java:290) ~[?:?]
at org.mule.module.launcher.DeploymentDirectoryWatcher.start(DeploymentDirectoryWatcher.java:151) ~[?:?]
at org.mule.module.launcher.MuleDeploymentService.start(MuleDeploymentService.java:100) ~[?:?]
at org.mule.module.launcher.MuleContainer.start(MuleContainer.java:170) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_75]
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:1.7.0_75]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:1.7.0_75]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:1.7.0_75]
at org.mule.module.reboot.MuleContainerWrapper.start(MuleContainerWrapper.java:52) ~[mule-module-boot-ee-3.7.0.jar:3.7.0]
at org.tanukisoftware.wrapper.WrapperManager$11.run(WrapperManager.java:4163) ~[wrapper-3.5.26.jar:3.5.26]
Caused by: org.mule.api.config.ConfigurationException: null (java.lang.NullPointerException) (org.mule.api.config.ConfigurationException)
at org.mule.config.builders.AbstractConfigurationBuilder.configure(AbstractConfigurationBuilder.java:49) ~[?:?]
at org.mule.config.builders.AbstractResourceConfigurationBuilder.configure(AbstractResourceConfigurationBuilder.java:69) ~[?:?]
at org.mule.context.DefaultMuleContextFactory$1.configure(DefaultMuleContextFactory.java:89) ~[?:?]
at org.mule.context.DefaultMuleContextFactory.doCreateMuleContext(DefaultMuleContextFactory.java:222) ~[?:?]
at org.mule.context.DefaultMuleContextFactory.createMuleContext(DefaultMuleContextFactory.java:81) ~[?:?]
at org.mule.module.launcher.application.DefaultMuleApplication.init(DefaultMuleApplication.java:188) ~[?:?]
... 18 more
Caused by: org.mule.api.config.ConfigurationException: null (java.lang.NullPointerException)
at org.mule.config.builders.AbstractConfigurationBuilder.configure(AbstractConfigurationBuilder.java:49) ~[?:?]
at org.mule.config.builders.AbstractResourceConfigurationBuilder.configure(AbstractResourceConfigurationBuilder.java:69) ~[?:?]
at org.mule.config.builders.AutoConfigurationBuilder.autoConfigure(AutoConfigurationBuilder.java:101) ~[?:?]
at org.mule.config.builders.AutoConfigurationBuilder.doConfigure(AutoConfigurationBuilder.java:52) ~[?:?]
at org.mule.config.builders.AbstractConfigurationBuilder.configure(AbstractConfigurationBuilder.java:43) ~[?:?]
at org.mule.config.builders.AbstractResourceConfigurationBuilder.configure(AbstractResourceConfigurationBuilder.java:69) ~[?:?]
at org.mule.context.DefaultMuleContextFactory$1.configure(DefaultMuleContextFactory.java:89) ~[?:?]
at org.mule.context.DefaultMuleContextFactory.doCreateMuleContext(DefaultMuleContextFactory.java:222) ~[?:?]
at org.mule.context.DefaultMuleContextFactory.createMuleContext(DefaultMuleContextFactory.java:81) ~[?:?]
at org.mule.module.launcher.application.DefaultMuleApplication.init(DefaultMuleApplication.java:188) ~[?:?]
... 18 more
Caused by: java.lang.NullPointerException
at com.sun.proxy.$Proxy82.hashCode(Unknown Source) ~[?:?]
at java.util.HashMap.hash(Unknown Source) ~[?:1.7.0_75]
at java.util.HashMap.getEntry(Unknown Source) ~[?:1.7.0_75]
at java.util.HashMap.containsKey(Unknown Source) ~[?:1.7.0_75]
at java.util.HashSet.contains(Unknown Source) ~[?:1.7.0_75]
at org.mule.lifecycle.RegistryLifecycleCallback.doApplyLifecycle(RegistryLifecycleCallback.java:81) ~[?:?]
at org.mule.lifecycle.RegistryLifecycleCallback.onTransition(RegistryLifecycleCallback.java:67) ~[?:?]
at org.mule.lifecycle.RegistryLifecycleManager.invokePhase(RegistryLifecycleManager.java:140) ~[?:?]
at org.mule.lifecycle.RegistryLifecycleManager.fireLifecycle(RegistryLifecycleManager.java:111) ~[?:?]
at org.mule.registry.AbstractRegistry.fireLifecycle(AbstractRegistry.java:146) ~[?:?]
at org.mule.registry.AbstractRegistry.initialise(AbstractRegistry.java:116) ~[?:?]
at org.mule.config.spring.SpringXmlConfigurationBuilder.createSpringRegistry(SpringXmlConfigurationBuilder.java:172) ~[?:?]
at org.mule.config.spring.SpringXmlConfigurationBuilder.doConfigure(SpringXmlConfigurationBuilder.java:95) ~[?:?]
at org.mule.config.builders.AbstractConfigurationBuilder.configure(AbstractConfigurationBuilder.java:43) ~[?:?]
at org.mule.config.builders.AbstractResourceConfigurationBuilder.configure(AbstractResourceConfigurationBuilder.java:69) ~[?:?]
at org.mule.config.builders.AutoConfigurationBuilder.autoConfigure(AutoConfigurationBuilder.java:101) ~[?:?]
at org.mule.config.builders.AutoConfigurationBuilder.doConfigure(AutoConfigurationBuilder.java:52) ~[?:?]
at org.mule.config.builders.AbstractConfigurationBuilder.configure(AbstractConfigurationBuilder.java:43) ~[?:?]
at org.mule.config.builders.AbstractResourceConfigurationBuilder.configure(AbstractResourceConfigurationBuilder.java:69) ~[?:?]
at org.mule.context.DefaultMuleContextFactory$1.configure(DefaultMuleContextFactory.java:89) ~[?:?]
at org.mule.context.DefaultMuleContextFactory.doCreateMuleContext(DefaultMuleContextFactory.java:222) ~[?:?]
at org.mule.context.DefaultMuleContextFactory.createMuleContext(DefaultMuleContextFactory.java:81) ~[?:?]
at org.mule.module.launcher.application.DefaultMuleApplication.init(DefaultMuleApplication.java:188) ~[?:?]
... 18 more
INFO 2015-08-05 12:24:37,870 [WrapperListener_start_runner] org.mule.module.launcher.DeploymentDirectoryWatcher:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ Mule is up and kicking (every 5000ms) +
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
INFO 2015-08-05 12:24:37,915 [WrapperListener_start_runner] org.mule.module.launcher.StartupSummaryDeploymentListener:
**********************************************************************
* - - + DOMAIN + - - * - - + STATUS + - - *
**********************************************************************
* default * DEPLOYED *
**********************************************************************
*******************************************************************************************************
* - - + APPLICATION + - - * - - + DOMAIN + - - * - - + STATUS + - - *
*******************************************************************************************************
* default * default * DEPLOYED *
* myapp * default * FAILED *
*******************************************************************************************************
Make sure the following are satisfied:
Your JDK and Anypoint Studio/Mule ESB have the same bit architecture (either 32-bit or 64-bit); a quick check is sketched below.
Mule 3.7's JDK prerequisite is JDK 1.7. If your current JDK is older than that, try upgrading to JDK 1.7.
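A quick way to check the JVM's bitness, as a sketch (the exact output wording varies by vendor):

java -version

On a 64-bit HotSpot JVM the last line mentions "64-Bit Server VM"; a 32-bit JVM does not.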
This solved my problem.
My problem turned out to be an old version of APIkit (1.6.1), which needed to be upgraded manually to 1.7.1 by editing the pom.xml of the projects concerned.
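For reference, the change is just a version bump of the APIkit dependency in pom.xml; a sketch (the coordinates here are from memory, so verify them against your own pom):

<!-- Upgrade APIkit from 1.6.1 to 1.7.1 (coordinates assumed; check your pom) -->
<dependency>
    <groupId>org.mule.modules</groupId>
    <artifactId>mule-module-apikit</artifactId>
    <version>1.7.1</version>
</dependency>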
Related
I deployed ELK stack version 7.17.5 with basic authentication, and it was working fine until we wanted to add another node and secure the cluster. We generated self-signed certificates without any password and added the TLS and HTTP .p12 entries in the yml. Now when I restart Elasticsearch it gives me the following error, and I'm not able to understand what I'm missing. I have Kibana on the same host.
Caused by: java.io.IOException: keystore password was incorrect
at sun.security.pkcs12.PKCS12KeyStore.engineLoad(PKCS12KeyStore.java:2158) ~[?:?]
at sun.security.util.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:226) ~[?:?]
at java.security.KeyStore.load(KeyStore.java:1503) ~[?:?]
at org.elasticsearch.xpack.core.ssl.TrustConfig.getStore(TrustConfig.java:99) ~[?:?]
at org.elasticsearch.xpack.core.ssl.StoreTrustConfig.createTrustManager(StoreTrustConfig.java:66) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.createSslContext(SSLService.java:455) ~[?:?]
at java.util.HashMap.computeIfAbsent(HashMap.java:1220) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.lambda$loadSSLConfigurations$5(SSLService.java:548) ~[?:?]
at java.util.HashMap.forEach(HashMap.java:1421) ~[?:?]
at java.util.Collections$UnmodifiableMap.forEach(Collections.java:1553) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.loadSSLConfigurations(SSLService.java:546) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.<init>(SSLService.java:147) ~[?:?]
at org.elasticsearch.xpack.core.XPackPlugin.createSSLService(XPackPlugin.java:525) ~[?:?]
at org.elasticsearch.xpack.core.XPackPlugin.createComponents(XPackPlugin.java:338) ~[?:?]
at org.elasticsearch.node.Node.lambda$new$17(Node.java:731) ~
Caused by: java.security.UnrecoverableKeyException: failed to decrypt safe contents entry: javax.crypto.BadPaddingException: Given final block not properly padded. Such issues can arise if a bad key is used during decryption.
at sun.security.pkcs12.PKCS12KeyStore.engineLoad(PKCS12KeyStore.java:2158) ~[?:?]
at sun.security.util.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:226) ~[?:?]
at java.security.KeyStore.load(KeyStore.java:1503) ~[?:?]
at org.elasticsearch.xpack.core.ssl.TrustConfig.getStore(TrustConfig.java:99) ~[?:?]
at org.elasticsearch.xpack.core.ssl.StoreTrustConfig.createTrustManager(StoreTrustConfig.java:66) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.createSslContext(SSLService.java:455) ~[?:?]
at java.util.HashMap.computeIfAbsent(HashMap.java:1220) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.lambda$loadSSLConfigurations$5(SSLService.java:548) ~[?:?]
at java.util.HashMap.forEach(HashMap.java:1421) ~[?:?]
at java.util.Collections$UnmodifiableMap.forEach(Collections.java:1553) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.loadSSLConfigurations(SSLService.java:546) ~[?:?]
at org.elasticsearch.xpack.core.ssl.SSLService.<init>(SSLService.java:147) ~[?:?]
at org.elasticsearch.xpack.core.XPackPlugin.createSSLService(XPackPlugin.java:525) ~[?:?]
at org.elasticsearch.xpack.core.XPackPlugin.createComponents(XPackPlugin.java:338) ~[?:?]
at org.elasticsearch.node.Node.lambda$new$17(Node.java:731) ~[elasticsearch-7.17.5.jar:7.17.5]
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
at org.elasticsearch.node.Node.<init>(Node.java:745) ~[elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.node.Node.<init>(Node.java:309) ~[elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.bootstrap.Bootstrap$5.<init>(Bootstrap.java:234) ~[elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:234) ~[elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:434) ~[elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.bootstrap.Elasticsearch.init.............
Please suggest where I need to put the decryption password, and what the "bad key" refers to. A snippet of my elasticsearch.yml:
xpack.security.enabled: true
xpack.security.authc.api_key.enabled: true
xpack.security.transport.ssl.enabled: true
#xpack.ssl.keystore.password:
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.client_authentication: required
xpack.security.transport.ssl.keystore.path: /etc/elasticsearch/certs/elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: /etc/elasticsearch/certs/elastic-certificates.p12
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: /etc/elasticsearch/certs/http.p12
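As an aside, if the .p12 files do carry passwords, those are supplied through the Elasticsearch keystore rather than elasticsearch.yml; a sketch of the relevant commands (run from the Elasticsearch home directory):

bin/elasticsearch-keystore add xpack.security.transport.ssl.keystore.secure_password
bin/elasticsearch-keystore add xpack.security.transport.ssl.truststore.secure_password
bin/elasticsearch-keystore add xpack.security.http.ssl.keystore.secure_password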
I am trying to build my REST API in Liferay 7.4, but it is not working properly; my rest-api build.gradle and the console error are included below.
I get "Can't find the request for http://localhost:8080/o/marketplace/username/'s Observer". I have tried adjusting the CXF settings and starting/stopping the module from the Gogo shell, but it didn't work. Here's the error log for reference:
2021-06-29 07:21:30.488 ERROR [pipe-start 1406][ROOT:93] bundle com.market.rest.api:1.0.0 (1406)BundleComponentActivator : Unexpected failure enabling component holder com.market.rest.api.application.EmailNotificationListener
java.lang.NoClassDefFoundError: com/liferay/portal/kernel/messaging/BaseMessageListener
at java.lang.ClassLoader.defineClass1(Native Method) ~[?:1.8.0_261]
at java.lang.ClassLoader.defineClass(ClassLoader.java:756) ~[?:1.8.0_261]
at org.eclipse.osgi.internal.loader.ModuleClassLoader.defineClass(ModuleClassLoader.java:276) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.defineClass(ClasspathManager.java:634) ~[org.eclipse.osgi.jar:?]
at BundleComponentActivator : Unexpected failure enabling component holder
at org.eclipse.osgi.container.Module.publishEvent(Module.java:476) [org.eclipse.osgi.jar:?]
at org.eclipse.osgi.container.Module.start(Module.java:467) [org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:428) [org.eclipse.osgi.jar:?]
at org.apache.felix.gogo.command.Basic.start(Basic.java:739) [bundleFile:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
at org.apache.felix.gogo.runtime.Reflective.invoke(Reflective.java:139) [bundleFile:?]
at org.apache.felix.gogo.runtime.CommandProxy.execute(CommandProxy.java:91) [bundleFile:?]
at org.apache.felix.gogo.runtime.Closure.executeCmd(Closure.java:599) [bundleFile:?]
at org.apache.felix.gogo.runtime.Closure.executeStatement(Closure.java:526) [bundleFile:?]
at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:415) [bundleFile:?]
at org.apache.felix.gogo.runtime.Pipe.doCall(Pipe.java:416) [bundleFile:?]
at org.apache.felix.gogo.runtime.Pipe.call(Pipe.java:229) [bundleFile:?]
at org.apache.felix.gogo.runtime.Pipe.call(Pipe.java:59) [bundleFile:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_261]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_261]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_261]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_261]
Caused by: java.lang.ClassNotFoundException: com.liferay.portal.kernel.scheduler.SchedulerEntry cannot be found by com.market.rest.api_1.0.0
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:508) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:419) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:411) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:151) ~[org.eclipse.osgi.jar:?]
at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_261]
... 50 more
2021-06-29 07:21:30.817 WARN [pipe-start 1406][AriesJaxrsServiceRuntime:298] Application from reference CachingServiceReference {__cachedProperties={osgi.jaxrs.application.select=null (cached), osgi.jaxrs.application.base=/marketplace, osgi.jaxrs.name=MarketPlace.Rest, osgi.jaxrs.extension.select=null (cached), osgi.http.whiteboard.context.select=null (cached), osgi.jaxrs.whiteboard.target=null (cached)}__serviceReference={javax.ws.rs.core.Application}={osgi.jaxrs.application.base=/marketplace, service.id=17706, service.bundleid=1406, service.scope=bundle, oauth2.scopechecker.type=none, osgi.jaxrs.name=MarketPlace.Rest, liferay.access.control.disable=true, component.name=com.market.rest.api.application.MarketplaceAPIrestApplication, component.id=8438}__} can't be got [Sanitized]
2021-06-29 07:21:30.832 ERROR [Framework Event Dispatcher: Equinox Container: e7d68436-b9af-4ee0-a2f1-a45bc009ca95][Framework:93] FrameworkEvent ERROR
org.osgi.framework.ServiceException: Exception in org.apache.felix.scr.impl.manager.SingleComponentManager.getService()
at org.eclipse.osgi.internal.serviceregistry.ServiceFactoryUse.factoryGetService(ServiceFactoryUse.java:222) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceFactoryUse.getService(ServiceFactoryUse.java:111) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceUse.newServiceObject(ServiceUse.java:96) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceConsumer$1.getService(ServiceConsumer.java:30) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.getService(ServiceRegistrationImpl.java:524) ~[org.eclipse.osgi.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceObjectsImpl.getService(ServiceObjectsImpl.java:89) ~[org.eclipse.osgi.jar:?]
at org.apache.aries.jax.rs.whiteboard.internal.utils.Utils.lambda$null$7(Utils.java:206) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$68(OSGi.java:674) ~[?:?]
at org.apache.aries.component.dsl.Publisher.apply(Publisher.java:28) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$75(OSGi.java:715) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$64(OSGi.java:602) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$64(OSGi.java:602) ~[?:?]
at org.apache.aries.component.dsl.internal.JustOSGiImpl.lambda$new$2(JustOSGiImpl.java:47) ~[?:?]
at org.apache.aries.component.dsl.internal.OSGiImpl.run(OSGiImpl.java:50) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$effects$65(OSGi.java:596) ~[?:?]
at org.apache.aries.component.dsl.internal.OSGiImpl.lambda$create$0(OSGiImpl.java:39) ~[?:?]
at org.apache.aries.component.dsl.internal.OSGiImpl.run(OSGiImpl.java:50) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$68(OSGi.java:674) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$64(OSGi.java:602) ~[?:?]
at org.apache.aries.component.dsl.Publisher.apply(Publisher.java:28) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$29(OSGi.java:296) ~[?:?]
at org.apache.aries.component.dsl.internal.UpdateSupport.deferPublication(UpdateSupport.java:40) ~[?:?]
at org.apache.aries.component.dsl.OSGi.lambda$null$32(OSGi.java:295) ~[?:?]
at org.apache.aries.component.dsl.Publisher.apply(Publisher.java:28) ~[?:?]
Caused by: java.lang.ClassNotFoundException: com.liferay.portal.kernel.messaging.BaseMessageListener cannot be found by com.market.rest.api_1.0.0
Mostly the errors in the logs are ClassNotFoundExceptions.
Here is the rest-api build.gradle I am trying to use for Liferay 7.4:
dependencies {
    compileOnly group: "com.liferay.portal", name: "release.portal.api"
    // https://mvnrepository.com/artifact/javax.ws.rs/javax.ws.rs-api
    compileOnly group: 'javax.ws.rs', name: 'javax.ws.rs-api', version: '2.1.1'
    compileOnly group: "org.osgi", name: "org.osgi.service.component.annotations"
    compileOnly group: "org.osgi", name: "org.osgi.service.jaxrs"
    compileOnly group: "org.json", name: "json"
    compileOnly group: "com.sun.mail", name: "javax.mail", version: "1.6.2"
    compileOnly group: 'commons-httpclient', name: 'commons-httpclient', version: '3.1'
    //changes
    compileOnly project(":modules:marketplace-service:marketplace-service-api")
    compileOnly project(":modules:marketplace-service:marketplace-service-service")
    // https://mvnrepository.com/artifact/org.osgi/org.osgi.service.log
    //compileOnly group: 'org.osgi', name: 'org.osgi.service.log'
    compileOnly group: 'com.liferay.portal', name: 'com.liferay.portal.kernel', version: '11.8.0'
    compileInclude group: 'org.apache.poi', name: 'poi', version: '4.0.1', transitive: false
    compileInclude group: 'org.apache.poi', name: 'poi-ooxml', version: '4.0.1', transitive: false
    compileInclude group: 'org.apache.poi', name: 'poi-ooxml-schemas', version: '4.0.1', transitive: false
    compileInclude group: 'org.apache.xmlbeans', name: 'xmlbeans', version: '3.0.2', transitive: false
    compileInclude group: 'org.apache.commons', name: 'commons-collections4', version: '4.4', transitive: false
    compileInclude group: 'dom4j', name: 'dom4j', version: '1.6.1', transitive: false
    compileInclude group: 'org.apache.commons', name: 'commons-compress', version: '1.18', transitive: false
}
I am getting this error
osgi.jaxrs.application.select=null (cached), osgi.jaxrs.application.base=/marketplace, osgi.jaxrs.name=MarketPlace.Rest, osgi.jaxrs.extension.select=null (cached), osgi.http.whiteboard.context.select=null (cached),
I'm publishing Avro-serialized data to a Kafka topic and then trying to create a Flink table over the topic via the SQL CLI interface. I'm able to create the table, but not able to view the topic data after executing a SQL SELECT statement. However, I'm able to deserialize and print the published data using a simple Kafka consumer. I'm getting this error on the SQL CLI:
Flink SQL> SELECT * FROM test_flink2;
[ERROR] Could not execute SQL statement. Reason:
java.lang.ArrayIndexOutOfBoundsException: Index -3 out of bounds for length 2
Table Creation
Flink SQL> CREATE TABLE test_flink2 (
> `name` STRING,
> `address` STRING)
> WITH (
> 'connector' = 'kafka',
> 'topic' = 'test_flink2',
> 'scan.startup.mode' = 'earliest-offset',
> 'properties.bootstrap.servers' = 'krypton04.psc:9092',
> 'format' = 'avro');
[INFO] Table has been created.
Table Definition
Flink SQL> DESC test_flink2;
+---------+--------+------+-----+--------+-----------+
| name | type | null | key | extras | watermark |
+---------+--------+------+-----+--------+-----------+
| name | STRING | true | | | |
| address | STRING | true | | | |
+---------+--------+------+-----+--------+-----------+
2 rows in set
Avro schema
{
  "name": "MyClass",
  "type": "record",
  "namespace": "myns",
  "fields": [
    {
      "name": "name",
      "type": "string"
    },
    {
      "name": "address",
      "type": "string"
    }
  ]
}
Message value (Message key is None)
value = {'name' : 'vikram',
'address' : 'hyd'}
I'm sending this same message value continuously to topic test_flink2 using a simple Kafka producer.
Kafka topic Description
Topic: reddyvel_test_flink2 PartitionCount: 1 ReplicationFactor: 3 Configs:
Topic: reddyvel_test_flink2 Partition: 0 Leader: 1 Replicas: 1,4,5 Isr: 1,4,5
Full Error log
From flink-*-sql-client-*.log
2021-02-05 09:10:01,351 WARN org.apache.flink.table.client.cli.CliClient [] - Could not execute SQL statement.
org.apache.flink.table.client.gateway.SqlExecutionException: Error while retrieving result.
at org.apache.flink.table.client.gateway.local.result.CollectStreamResult.lambda$startRetrieval$0(CollectStreamResult.java:96) ~[flink-sql-client_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:610) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:840) ~[?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:479) ~[?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290) ~[?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020) ~[?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656) ~[?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594) ~[?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183) ~[?:?]
Caused by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 583e8c2eb20eb8d8bdedba04673bb297)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073) ~[?:?]
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073) ~[?:?]
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:610) ~[?:?]
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1085) ~[?:?]
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:834) ~[?:?]
Caused by: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 583e8c2eb20eb8d8bdedba04673bb297)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073) ~[?:?]
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073) ~[?:?]
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:610) ~[?:?]
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1085) ~[?:?]
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:834) ~[?:?]
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073) ~[?:?]
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073) ~[?:?]
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837) ~[?:?]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:610) ~[?:?]
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1085) ~[?:?]
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:834) ~[?:?]
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:665) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:447) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at jdk.internal.reflect.GeneratedMethodAccessor42.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:306) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:159) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.actor.Actor.aroundReceive(Actor.scala:517) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.actor.Actor.aroundReceive$(Actor.scala:515) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.actor.ActorCell.invoke(ActorCell.scala:561) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.dispatch.Mailbox.run(Mailbox.scala:225) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.dispatch.Mailbox.exec(Mailbox.scala:235) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
Caused by: java.io.IOException: Failed to deserialize Avro record.
at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.deserialize(AvroRowDataDeserializationSchema.java:104) ~[?:?]
at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.deserialize(AvroRowDataDeserializationSchema.java:44) ~[?:?]
at org.apache.flink.api.common.serialization.DeserializationSchema.deserialize(DeserializationSchema.java:82) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.streaming.connectors.kafka.table.DynamicKafkaDeserializationSchema.deserialize(DynamicKafkaDeserializationSchema.java:113) ~[?:?]
at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:177) ~[?:?]
at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.runFetchLoop(KafkaFetcher.java:137) ~[?:?]
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:761) ~[?:?]
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:241) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
Caused by: java.lang.ArrayIndexOutOfBoundsException: Index -3 out of bounds for length 2
at org.apache.flink.avro.shaded.org.apache.avro.io.parsing.Symbol$Alternative.getSymbol(Symbol.java:460) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:283) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:187) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:259) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160) ~[?:?]
at org.apache.flink.avro.shaded.org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153) ~[?:?]
at org.apache.flink.formats.avro.AvroDeserializationSchema.deserialize(AvroDeserializationSchema.java:137) ~[?:?]
at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.deserialize(AvroRowDataDeserializationSchema.java:101) ~[?:?]
at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.deserialize(AvroRowDataDeserializationSchema.java:44) ~[?:?]
at org.apache.flink.api.common.serialization.DeserializationSchema.deserialize(DeserializationSchema.java:82) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.streaming.connectors.kafka.table.DynamicKafkaDeserializationSchema.deserialize(DynamicKafkaDeserializationSchema.java:113) ~[?:?]
at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:177) ~[?:?]
at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.runFetchLoop(KafkaFetcher.java:137) ~[?:?]
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:761) ~[?:?]
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:241) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
I'm not able to figure out the cause of the issue here.
Thanks.
I am using the confluent-kafka Python API to send the messages.
Then you must use Flink's Confluent Avro deserializer.
Your error is because you're telling Flink to consume plain Avro while the producer wrote the Confluent wire format: the decoder expects raw Avro bytes to start immediately, can't make sense of the schema-id prefix, and throws the array-out-of-bounds.
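A sketch of the table definition with the Confluent Avro format (the Schema Registry URL here is an assumption, and it presumes the matching avro-confluent format jar is on the SQL client's classpath):

Flink SQL> CREATE TABLE test_flink2 (
> `name` STRING,
> `address` STRING)
> WITH (
> 'connector' = 'kafka',
> 'topic' = 'test_flink2',
> 'scan.startup.mode' = 'earliest-offset',
> 'properties.bootstrap.servers' = 'krypton04.psc:9092',
> 'format' = 'avro-confluent',
> 'avro-confluent.schema-registry.url' = 'http://localhost:8081');

With plain 'format' = 'avro', the five-byte Confluent prefix (one magic byte plus a four-byte schema id) gets decoded as if it were Avro data, which is where the nonsensical index comes from.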
I have a Flink session cluster (Job Manager + Task Manager), version 1.11.1, with log4j-console.properties configured to include a Kafka appender. In addition, in both the Job Manager and the Task Manager I'm enabling the flink-s3-fs-hadoop built-in plugin.
I've added the kafka-clients jar to the flink/lib directory, which is necessary for the container to run. But I'm still getting the class loading error below when the S3 plugin is instantiated (and initializes the logger).
Caused by: org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.ByteArraySerializer for configuration key.serializer: Class org.apache.kafka.common.serialization.ByteArraySerializer could not be found.
(full stack trace at the bottom)
As I understood, there is dedicated dynamic class loading for plugins, which is separate from the system class loading. Therefore, I added the following configuration entries to the flink-conf.yaml file:
classloader.parent-first-patterns.additional: org.apache.kafka
classloader.resolve-order: parent-first
but the error still appears.
While debugging, I don't see the additional pattern being added to the "allowedFlinkPackages" of the plugin class loader.
What am I missing here?
java.util.ServiceConfigurationError: org.apache.flink.core.fs.FileSystemFactory: Provider org.apache.flink.fs.s3hadoop.S3FileSystemFactory could not be instantiated
at java.util.ServiceLoader.fail(Unknown Source) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.newInstance(Unknown Source) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.get(Unknown Source) ~[?:?]
at java.util.ServiceLoader$3.next(Unknown Source) ~[?:?]
at org.apache.flink.core.plugin.PluginLoader$ContextClassLoaderSettingIterator.next(PluginLoader.java:103) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.shaded.guava18.com.google.common.collect.Iterators$5.next(Iterators.java:558) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.shaded.guava18.com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.core.fs.FileSystem.addAllFactoriesToList(FileSystem.java:1068) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.core.fs.FileSystem.loadFileSystemFactories(FileSystem.java:1050) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.core.fs.FileSystem.initialize(FileSystem.java:325) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManagerSecurely(TaskManagerRunner.java:315) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.main(TaskManagerRunner.java:297) [flink-dist_2.11-1.11.1.jar:1.11.1]
Caused by: java.lang.ExceptionInInitializerError
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
... 11 more
Caused by: org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.ByteArraySerializer for configuration key.serializer: Class org.apache.kafka.common.serialization.ByteArraySerializer could not be found.
at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:728) ~[kafka-clients-2.5.0.jar:?]
at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:474) ~[kafka-clients-2.5.0.jar:?]
at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:467) ~[kafka-clients-2.5.0.jar:?]
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:108) ~[kafka-clients-2.5.0.jar:?]
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:129) ~[kafka-clients-2.5.0.jar:?]
at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:481) ~[kafka-clients-2.5.0.jar:?]
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:326) ~[kafka-clients-2.5.0.jar:?]
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:298) ~[kafka-clients-2.5.0.jar:?]
at org.apache.logging.log4j.core.appender.mom.kafka.DefaultKafkaProducerFactory.newKafkaProducer(DefaultKafkaProducerFactory.java:40) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.appender.mom.kafka.KafkaManager.startup(KafkaManager.java:136) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.appender.mom.kafka.KafkaAppender.start(KafkaAppender.java:164) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:304) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:579) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:651) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:668) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:253) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45) ~[log4j-core-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194) ~[log4j-api-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:138) ~[log4j-api-2.12.1.jar:2.12.1]
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:45) ~[log4j-slf4j-impl-2.12.1.jar:2.12.1]
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:48) ~[log4j-api-2.12.1.jar:2.12.1]
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:30) ~[log4j-slf4j-impl-2.12.1.jar:2.12.1]
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:329) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:349) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.<clinit>(AbstractS3FileSystemFactory.java:88) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
... 11 more
[2020-12-06T09:15:45,892][Error] {} [o.a.f.c.f.FileSystem]: Failed to load a file system via services
java.util.ServiceConfigurationError: org.apache.flink.core.fs.FileSystemFactory: Provider org.apache.flink.fs.s3hadoop.S3AFileSystemFactory could not be instantiated
at java.util.ServiceLoader.fail(Unknown Source) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.newInstance(Unknown Source) ~[?:?]
at java.util.ServiceLoader$ProviderImpl.get(Unknown Source) ~[?:?]
at java.util.ServiceLoader$3.next(Unknown Source) ~[?:?]
at org.apache.flink.core.plugin.PluginLoader$ContextClassLoaderSettingIterator.next(PluginLoader.java:103) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.shaded.guava18.com.google.common.collect.Iterators$5.next(Iterators.java:558) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.shaded.guava18.com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.core.fs.FileSystem.addAllFactoriesToList(FileSystem.java:1068) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.core.fs.FileSystem.loadFileSystemFactories(FileSystem.java:1050) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.core.fs.FileSystem.initialize(FileSystem.java:325) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManagerSecurely(TaskManagerRunner.java:315) [flink-dist_2.11-1.11.1.jar:1.11.1]
at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.main(TaskManagerRunner.java:297) [flink-dist_2.11-1.11.1.jar:1.11.1]
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.flink.fs.s3hadoop.S3FileSystemFactory
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
... 11 more
As you stated, Flink plugins are loaded through their own class loader and are completely isolated from any other plugin.
If we delve into the source code, there is another key which is used while the cluster boots up (unfortunately it's not documented anywhere):
plugin.classloader.parent-first-patterns.additional
This makes the PluginClassLoader resolve the listed packages parent-first, so external jars placed on the main classpath (such as flink/lib) become visible to the plugins.
Declaration + Usage : https://github.com/apache/flink/blob/53a4b4407816c2780fed2f8995affbebc1f58c3c/flink-core/src/main/java/org/apache/flink/configuration/CoreOptions.java#L156-L174
Add the following to flink-conf.yaml.
plugin.classloader.parent-first-patterns.additional: org.apache.kafka
That should do the trick.
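For context, this is the layout the configuration assumes (paths are illustrative): the built-in plugin jar sits in its own subdirectory under plugins/, while kafka-clients stays in lib/ so the parent class loader can serve the org.apache.kafka packages to the plugin:

flink/
  lib/
    kafka-clients-2.5.0.jar
  plugins/
    s3-fs-hadoop/
      flink-s3-fs-hadoop-1.11.1.jar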
Currently I'm encountering some problems when I try to run Spark with Cassandra in standalone mode.
Initially, I ran successfully with the parameter master="local[4]" in the SparkContext.
Then I tried to move to standalone mode. What I used are:
Ubuntu: 12.04
Cassandra: 1.2.11
Spark: 0.8.0
Scala: 2.9.3
JDK: Oracle 1.6.0_35
Kryo: 2.21
At first, I got an "unread block" error. As suggested in another topic, I changed to the Kryo serializer and added Twitter Chill. Then I got "Failed to register spark.kryo.registrator" in my console, and the exception below:
13/10/28 12:12:36 INFO cluster.ClusterTaskSetManager: Lost TID 0 (task 0.0:0)
13/10/28 12:12:36 INFO cluster.ClusterTaskSetManager: Loss was due to java.io.EOFException
java.io.EOFException
at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:109)
at org.apache.spark.broadcast.HttpBroadcast$.read(HttpBroadcast.scala:150)
at org.apache.spark.broadcast.HttpBroadcast.readObject(HttpBroadcast.scala:56)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at java.io.ObjectStreamClass.invokeReadObject(Unknown Source)
at java.io.ObjectInputStream.readSerialData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.defaultReadFields(Unknown Source)
at java.io.ObjectInputStream.readSerialData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.defaultReadFields(Unknown Source)
at java.io.ObjectInputStream.readSerialData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.readObject(Unknown Source)
at scala.collection.immutable.$colon$colon.readObject(List.scala:435)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at java.io.ObjectStreamClass.invokeReadObject(Unknown Source)
at java.io.ObjectInputStream.readSerialData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.defaultReadFields(Unknown Source)
at java.io.ObjectInputStream.readSerialData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.readObject(Unknown Source)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:39)
at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:61)
at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:129)
at java.io.ObjectInputStream.readExternalData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.readObject(Unknown Source)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:39)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:61)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:153)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Someone else encountered the EOFException in Spark before; the answer there was that the registrator was not registered correctly. I registered the registrator following the Spark guide. My registrator is below:
class MyRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo) {
    // Register the RDD element type shipped between nodes, plus the
    // key/value map types, the latter two with explicit Kryo ids.
    kryo.register(classOf[org.apache.spark.rdd.RDD[(Map[String, ByteBuffer], Map[String, ByteBuffer])]])
    kryo.register(classOf[String], 1)
    kryo.register(classOf[Map[String, ByteBuffer]], 2)
  }
}
And I also set the properties just as the guide does:
System.setProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
System.setProperty("spark.kryo.registrator", "main.scala.MyRegistrator")
Can anyone give me a hint about what I did wrong?
Thanks.
Based on my experience, the reasons for the "EOFException" and the "unread block" errors are the same: some libraries are missing when running on the cluster. The weirdest thing is that I had bundled the libraries with "sbt assembly" in Spark, and the libraries actually exist in the jars folder, but Spark still cannot find and load them. Once I added the libraries in the Spark context, it worked. That means I need to ship the libraries to each node by specifying them in the code.
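For illustration, a minimal sketch of what "adding the libraries in the Spark context" looks like (the master URL and jar path are placeholders):

import org.apache.spark.SparkContext

// Ship the assembly jar (carrying the Cassandra/Kryo dependencies) to
// every worker explicitly, instead of relying on the cluster's classpath.
val sc = new SparkContext("spark://master:7077", "MyApp")
sc.addJar("/path/to/my-app-assembly.jar")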