java.lang.RuntimeException: Failed to resolve Oracle database version - apache-kafka

I am using the Debezium Oracle connector in Kafka Connect. While starting the connector, I am getting the error below:
java.lang.RuntimeException: Failed to resolve Oracle database version
at io.debezium.connector.oracle.OracleConnection.resolveOracleDatabaseVersion(OracleConnection.java:159)
at io.debezium.connector.oracle.OracleConnection.<init>(OracleConnection.java:71)
at io.debezium.connector.oracle.OracleConnector.validateConnection(OracleConnector.java:74)
at io.debezium.connector.common.RelationalBaseSourceConnector.validate(RelationalBaseSourceConnector.java:52)
at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:400)
at org.apache.kafka.connect.runtime.AbstractHerder.lambda$validateConnectorConfig$2(AbstractHerder.java:351)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: No suitable driver found for jdbc:oracle:thin:#**.**.*.**:1521/CDB
at java.sql.DriverManager.getConnection(DriverManager.java:689)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at io.debezium.jdbc.JdbcConnection.lambda$patternBasedFactory$0(JdbcConnection.java:184)
at io.debezium.jdbc.JdbcConnection$ConnectionFactoryDecorator.connect(JdbcConnection.java:121)
at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:890)
at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:885)
at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:643)
at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:517)
at io.debezium.connector.oracle.OracleConnection.resolveOracleDatabaseVersion(OracleConnection.java:129)
... 10 more
I am referring to this link for the Oracle setup and connector configuration:
https://debezium.io/documentation/reference/connectors/oracle.html#setting-up-oracle
connector-configuration.properties
name=debeziumoraclesource
connector.class=io.debezium.connector.oracle.OracleConnector
database.hostname=**.*.**.**
database.port=1521
database.user=username
database.password=password
database.dbname=CDBNAME
database.server.name=**.*.**.**
tasks.max=1
database.pdb.name=PDBNAME
database.history.kafka.bootstrap.servers=kafka:9092
database.history.kafka.topic=history.ENTITY_GROUP_PARAMETER_VALUES
database.connection.adaptor=logminer
snapshot.mode=initial
table.include.list=schema.ENTITY_GROUP_PARAMETER_VALUES
Also, I have downloaded ojdbc8.jar and placed it inside the kafka/libs folder. I have tried different versions of the jars, like ojdbc10 and different versions of ojdbc8, but nothing helped. Also worth noting: I am using Oracle 19c. Please help me resolve this issue. Thanks in advance.
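For what it's worth, the "No suitable driver found" message comes straight from java.sql.DriverManager, which means the Oracle JDBC driver class is not visible to the classloader Debezium runs under. A minimal standalone sketch to verify the driver is resolvable at all (the host, service name, and credentials below are placeholders, not the real values):

import java.sql.Connection;
import java.sql.DriverManager;

public class OracleDriverCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if ojdbc8.jar is not on the classpath
        Class.forName("oracle.jdbc.OracleDriver");
        // Same thin-driver URL shape the connector builds: jdbc:oracle:thin:@<host>:<port>/<service>
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@db-host:1521/CDB", "username", "password")) {
            System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}

If this works on its own but the connector still fails, the jar is simply not on the classpath that Kafka Connect gives the Debezium plugin (see the answers below).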

Using OJDBC6.jar with all its dependencies helped me resolve the issue. And, most importantly, I placed the jars in the connector's lib folder.
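In other words, the ojdbc jar has to sit next to the Debezium connector jars that Kafka Connect loads as a plugin, not only in kafka/libs. A rough sketch of such a layout, assuming plugin.path points at /kafka/connect (directory and file names here are illustrative, not taken from the question):

/kafka/connect/debezium-connector-oracle/
    debezium-connector-oracle-<version>.jar
    debezium-core-<version>.jar
    ojdbc8.jar   (or whichever ojdbc version you use, placed alongside the connector jars)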

Using OJDBC8.jar helped me solve the problem; it turned out I hadn't placed the jar on all the servers that were running Kafka Connect.

Related

ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectDistributed) java.lang.NoClassDefFoundError: io/debezium/util/IoUtil

Objective
I am trying to connect to my Oracle Database (12c) from Kafka Connect (ideally in distributed mode) using the Debezium connector (1.2.4.Final). The Kafka version I am using is 2.13-2.6.0.
Command used
As mentioned here, I am running this command:
C:\Users\username\Downloads\kafka>bin\windows\connect-distributed.bat config\connect-distributed.properties
Error
The error I am getting is:
ERROR Stopping due to error
(org.apache.kafka.connect.cli.ConnectDistributed)
java.lang.NoClassDefFoundError: io/debezium/util/IoUtil
at io.debezium.connector.oracle.Module.<clinit>(Module.java:19)
at io.debezium.connector.oracle.OracleConnector.version(OracleConnector.java:23)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.versionFor(DelegatingClassLoader.java:390)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.versionFor(DelegatingClassLoader.java:395)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.getPluginDesc(DelegatingClassLoader.java:365)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:337)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:268)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.registerPlugin(DelegatingClassLoader.java:260)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initPluginLoader(DelegatingClassLoader.java:229)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:206)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:61)
at org.apache.kafka.connect.cli.ConnectDistributed.startConnect(ConnectDistributed.java:91)
at org.apache.kafka.connect.cli.ConnectDistributed.main(ConnectDistributed.java:78)
Caused by: java.lang.ClassNotFoundException: io.debezium.util.IoUtil
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:104)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 13 more
Settings
In my connect-distributed.properties, I have this:
plugin.path=C:/Users/username/Downloads/kafka/libs/debezium
And inside the debezium folder (following Gunnar's recommendation from the comment in this question), I have these jars:
I also added the plugin path in %PATH% as follows:
echo %PATH% | findstr debezium
XXX;C:\Users\username\Downloads\kafka\libs\debezium;
Help
Any help would be greatly appreciated, as I hope to replace my database polling with this Debezium connector, which seems like a better approach. Thanks!
The solution from Gunnar here works! (His explanation is there too if you want to check it out.)
plugin.path=C:\\Users\\username\\Downloads\\kafka\\libs
and these also work:
plugin.path=C:/Users/username/Downloads/kafka/libs
plugin.path=C:\Users\username\Downloads\kafka\libs
plugin.path=/Users/username/Downloads/kafka/libs
The mistake was that plugin.path should only go up to libs, not libs/debezium.
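Concretely, plugin.path must point at the directory that contains one subdirectory per plugin; Connect then scans one level down for the jars. A sketch of the layout this answer implies (jar names are illustrative, the version is the one mentioned in the question):

plugin.path=C:/Users/username/Downloads/kafka/libs

C:/Users/username/Downloads/kafka/libs/
    debezium/
        debezium-connector-oracle-1.2.4.Final.jar
        debezium-core-1.2.4.Final.jar
        (remaining connector jars)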

WSO2 Integration Studio v6.5.0's built-in Kafka template throws NoClassDefFoundError

I've installed WSO2 Integration Studio version 6.5.0 on my Windows workstation and created a project using the built-in Kafka Consumer and Producer template.
Then I configured the project with my own Kafka server settings (topic name "myTopic").
I then right-clicked the composite application and chose Export Project Artifacts and Run.
The Console window displayed at the very top the following messages:
[2019-06-25 09:23:45,499] [micro-integrator] INFO - LibraryArtifactDeployer Synapse Library named '{org.wso2.carbon.connector}kafkaTransport' has been deployed from file : C:\IntegrationStudio\runtime\microesb\tmp\carbonapps\-1234\1561465425230TestCompositeApplication_1.0.0.car\kafkaTransport-connector_2.0.6\kafkaTransport-connector-2.0.6.zip
[2019-06-25 09:23:45,517] [micro-integrator] INFO - SynapseImportFactory Successfully created Synapse Import: kafkaTransport
[2019-06-25 09:23:45,533] [micro-integrator] ERROR - ClassMediatorFactory
Error in instantiating class :
org.wso2.carbon.connector.KafkaProduceConnector
java.lang.NoClassDefFoundError: org/apache/kafka/common/header/Headers
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
[snipped rest for clarity]
I've tried uninstalling Integration Studio and running it with elevated rights, to no avail.
I expected the project to be deployed normally.
EDIT: after copying:
kafka_2.11-2.2.1.jar
metrics-core-2.2.0.jar
zkclient-0.11.jar
kafka-clients-2.2.1.jar
scala-library-2.11.12.jar
zookeeper-3.4.13.jar
to the EI_HOME/lib directory, the exception changed to:
org.apache.axis2.deployment.DeploymentException: kafka/consumer/ConsumerTimeoutException
at org.apache.synapse.deployers.AbstractSynapseArtifactDeployer.deploy(AbstractSynapseArtifactDeployer.java:219)
at org.wso2.carbon.application.deployer.synapse.SynapseAppDeployer.deployArtifactType(SynapseAppDeployer.java:1099)
at org.wso2.carbon.application.deployer.synapse.SynapseAppDeployer.deployArtifacts(SynapseAppDeployer.java:114)
at org.wso2.carbon.application.deployer.internal.ApplicationManager.deployCarbonApp(ApplicationManager.java:272)
at org.wso2.carbon.application.deployer.CappAxis2Deployer.deploy(CappAxis2Deployer.java:72)
at org.apache.axis2.deployment.repository.util.DeploymentFileData.deploy(DeploymentFileData.java:136)
at org.apache.axis2.deployment.DeploymentEngine.doDeploy(DeploymentEngine.java:807)
at org.apache.axis2.deployment.repository.util.WSInfoList.update(WSInfoList.java:144)
[snipped for clarity]
Caused by: org.apache.axis2.deployment.DeploymentException: kafka/consumer/ConsumerTimeoutException
at org.apache.synapse.deployers.AbstractSynapseArtifactDeployer.deploy(AbstractSynapseArtifactDeployer.java:207)
... 87 more
Caused by: java.lang.NoClassDefFoundError: kafka/consumer/ConsumerTimeoutException
at org.wso2.carbon.inbound.endpoint.protocol.kafka.KAFKAPollingConsumer.startsMessageListener(KAFKAPollingConsumer.java:90)
at org.wso2.carbon.inbound.endpoint.protocol.kafka.KAFKAProcessor.init(KAFKAProcessor.java:96)
at org.apache.synapse.inbound.InboundEndpoint.init(InboundEndpoint.java:79)
at org.apache.synapse.deployers.InboundEndpointDeployer.deploySynapseArtifact(InboundEndpointDeployer.java:57)
at org.apache.synapse.deployers.AbstractSynapseArtifactDeployer.deploy(AbstractSynapseArtifactDeployer.java:197)
... 87 more
Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerTimeoutException cannot be found by synapse-core_2.1.7.wso2v111
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:475)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 92 more
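For context, the class in the first error, org.apache.kafka.common.header.Headers, lives in kafka-clients and only exists from 0.11 onwards, so code compiled against it fails with NoClassDefFoundError when an older (or missing) kafka-clients jar is picked up at runtime. A minimal sketch of that API, assuming kafka-clients 2.x is on the classpath (the topic name is taken from the question, the header values are illustrative):

import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.Headers;

public class HeadersCheck {
    public static void main(String[] args) {
        // Resolving these classes is exactly what fails when the right kafka-clients jar is absent
        ProducerRecord<String, String> record =
                new ProducerRecord<>("myTopic", "key", "value");
        Headers headers = record.headers();
        headers.add("origin", "wso2-ei".getBytes());
        System.out.println(headers);
    }
}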
Have you copied the required jars from the kafka_home/libs folder to EI_home/lib? If yes, please share your code so the issue can be looked at in detail.
According to this documentation, https://docs.wso2.com/display/EI650/Kafka+Inbound+Protocol, the recommended Kafka version is kafka_2.9.2-0.8.1.1. You can download it from http://kafka.apache.org/downloads.html. Please use those jars and copy them to EI_HOME/lib. There is a GitHub issue for this as well: https://github.com/wso2/product-ei/issues/2239
I may be a bit too late, but we have been using the custom inbound endpoint for Kafka. We also faced exactly the same issue, and that was the only way to fix it.
You can use https://github.com/wso2-extensions/esb-inbound-kafka/blob/master/docs/config.md to configure it.

Connection failure. You must change the jdbc7.jar at org.talend

Can you tell me the exact location where the jar file needs to be copied? A jar will be required for the issue below.
Connection failure. You must change the Database Settings.
org.talend.utils.exceptions.MissingDriverException: can not find class :oracle.jdbc.OracleDriver
missing JDBC driver :
ojdbc7.jar
at org.talend.core.model.metadata.builder.database.ExtractMetaDataUtils.connect(ExtractMetaDataUtils.java:1128)
at org.talend.core.model.metadata.builder.database.ExtractMetaDataFromDataBase.testConnection(ExtractMetaDataFromDataBase.java:315)
at org.talend.metadata.managment.repository.ManagerConnection.check(ManagerConnection.java:289)
at org.talend.repository.ui.wizards.metadata.connection.database.DatabaseForm$59.runWithCancel(DatabaseForm.java:3812)
at org.talend.repository.ui.wizards.metadata.connection.database.DatabaseForm$59.runWithCancel(DatabaseForm.java:1)
at org.talend.repository.ui.dialog.AProgressMonitorDialogWithCancel$1.runnableWithCancel(AProgressMonitorDialogWithCancel.java:77)
at org.talend.repository.ui.dialog.AProgressMonitorDialogWithCancel$ARunnableWithProgressCancel$1.call(AProgressMonitorDialogWithCancel.java:161)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:745)
In Talend, try going to Window > Show View... > Modules. There you can import or download external jar files for your components.

Wildfly 9.0.2 Access is denied while setup server lock file

I am using WildFly 9.0.2; whenever I deploy my application, I get the following exception:
at org.jboss.as.messaging.jms.JMSService.doStart(JMSService.java:174)
at org.jboss.as.messaging.jms.JMSService.access$000(JMSService.java:62)
at org.jboss.as.messaging.jms.JMSService$1.run(JMSService.java:96)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at org.jboss.threads.JBossThread.run(JBossThread.java:320)
Caused by: java.io.IOException: Access is denied
at java.io.WinNTFileSystem.createFileExclusively(Native Method)
at java.io.File.createNewFile(File.java:1012)
at org.hornetq.core.server.NodeManager.setUpServerLockFile(NodeManager.java:185)
at org.hornetq.core.server.impl.FileLockNodeManager.start(FileLockNodeManager.java:66)
at org.hornetq.core.server.impl.HornetQServerImpl.start(HornetQServerImpl.java:429)
at org.hornetq.jms.server.impl.JMSServerManagerImpl.start(JMSServerManagerImpl.java:488)
at org.jboss.as.messaging.jms.JMSService.doStart(JMSService.java:170)
... 8 more
I have checked the following points:
There is no other process using that file
If I delete ~\WildFly\standalone\data\messagingjournal\server.lock, it gets created again even though no other process is running
If I restart my computer it works, but after some time I get the same issue again.
Does anyone know about it?
I have also attached a screenshot. Thanks
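For context, the failing call in the trace is a plain java.io.File#createNewFile on the messaging journal's server.lock, invoked by HornetQ's NodeManager. A minimal sketch of that step (path shape taken from the question); an "Access is denied" IOException here points to the JVM user lacking write permission on the file or folder, or another process still holding a handle on it:

import java.io.File;
import java.io.IOException;

public class ServerLockCheck {
    public static void main(String[] args) {
        // Same file the stack trace points at (adjust the WildFly path as needed)
        File lock = new File("standalone/data/messagingjournal/server.lock");
        try {
            lock.getParentFile().mkdirs();
            boolean created = lock.createNewFile(); // false if the file already exists
            System.out.println(created ? "lock file created" : "lock file already present");
        } catch (IOException e) {
            // This is where "Access is denied" surfaces on Windows
            e.printStackTrace();
        }
    }
}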
I deleted this file/directory using Unlocker and now it is fixed.
Thanks

Error when starting JBoss

I get the error below when starting JBoss. I tried re-uploading the jars, wars, and modules, but it wasn't of much help.
Starting jboss-as: Exception in thread "main" java.lang.ExceptionInInitializerError
at java.lang.Class.initializeClass(libgcj.so.10)
at __redirected.__JAXPRedirected.initAll(__JAXPRedirected.java:87)
at org.jboss.modules.Module$1.run(Module.java:85)
at org.jboss.modules.Module$1.run(Module.java:72)
at java.security.AccessController.doPrivileged(libgcj.so.10)
at org.jboss.modules.Module.<clinit>(Module.java:72)
at java.lang.Class.initializeClass(libgcj.so.10)
at org.jboss.modules.Main.main(Main.java:255)
Caused by: java.lang.IllegalArgumentException: Problem configuring DatatypeFactory
at __redirected.__DatatypeFactory.<clinit>(__DatatypeFactory.java:70)
at java.lang.Class.initializeClass(libgcj.so.10)
...7 more
Caused by: javax.xml.datatype.DatatypeConfigurationException: java.lang.ClassNotFoundException: gnu.xml.datatype.JAXPDatatypeFactory
at javax.xml.datatype.DatatypeFactory.newInstance(libgcj.so.10)
at __redirected.__DatatypeFactory.<clinit>(__DatatypeFactory.java:62)
...8 more
Caused by: java.lang.ClassNotFoundException: gnu.xml.datatype.JAXPDatatypeFactory
at java.lang.Class.forName(libgcj.so.10)
at javax.xml.datatype.DatatypeFactory.newInstance(libgcj.so.10)
...9 more
I solved the above issue by updating Java: I was using Java 1.5 and updated to Java 1.7.xx. I am not getting the error any more, and my application works fine.
I am curious to know what causes this issue; if anyone could answer this, it would be helpful.