Can you tell me the exact location where the jar file needs to be copied? A jar is required for the issue below.
Connection failure. You must change the Database Settings.
org.talend.utils.exceptions.MissingDriverException: can not find class :oracle.jdbc.OracleDriver
missing JDBC driver :
ojdbc7.jar
at org.talend.core.model.metadata.builder.database.ExtractMetaDataUtils.connect(ExtractMetaDataUtils.java:1128)
at org.talend.core.model.metadata.builder.database.ExtractMetaDataFromDataBase.testConnection(ExtractMetaDataFromDataBase.java:315)
at org.talend.metadata.managment.repository.ManagerConnection.check(ManagerConnection.java:289)
at org.talend.repository.ui.wizards.metadata.connection.database.DatabaseForm$59.runWithCancel(DatabaseForm.java:3812)
at org.talend.repository.ui.wizards.metadata.connection.database.DatabaseForm$59.runWithCancel(DatabaseForm.java:1)
at org.talend.repository.ui.dialog.AProgressMonitorDialogWithCancel$1.runnableWithCancel(AProgressMonitorDialogWithCancel.java:77)
at org.talend.repository.ui.dialog.AProgressMonitorDialogWithCancel$ARunnableWithProgressCancel$1.call(AProgressMonitorDialogWithCancel.java:161)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:745)
In Talend, go to Window > Show View... > Modules. There you can import or download external jar files for your components.
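A quick way to verify that the driver actually ended up on the classpath is to try loading the class Talend complains about. A minimal standalone sketch in Java, using the class name from the MissingDriverException above:

// Sketch: confirm oracle.jdbc.OracleDriver is visible on the classpath.
public class DriverCheck {
    public static void main(String[] args) {
        try {
            Class.forName("oracle.jdbc.OracleDriver");
            System.out.println("Oracle JDBC driver found");
        } catch (ClassNotFoundException e) {
            System.out.println("Driver still missing: " + e.getMessage());
        }
    }
}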
I am using the Debezium Oracle connector in Kafka Connect. While starting the connector, I am getting the error below:
java.lang.RuntimeException: Failed to resolve Oracle database version
at io.debezium.connector.oracle.OracleConnection.resolveOracleDatabaseVersion(OracleConnection.java:159)
at io.debezium.connector.oracle.OracleConnection.<init>(OracleConnection.java:71)
at io.debezium.connector.oracle.OracleConnector.validateConnection(OracleConnector.java:74)
at io.debezium.connector.common.RelationalBaseSourceConnector.validate(RelationalBaseSourceConnector.java:52)
at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:400)
at org.apache.kafka.connect.runtime.AbstractHerder.lambda$validateConnectorConfig$2(AbstractHerder.java:351)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: No suitable driver found for jdbc:oracle:thin:#**.**.*.**:1521/CDB
at java.sql.DriverManager.getConnection(DriverManager.java:689)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at io.debezium.jdbc.JdbcConnection.lambda$patternBasedFactory$0(JdbcConnection.java:184)
at io.debezium.jdbc.JdbcConnection$ConnectionFactoryDecorator.connect(JdbcConnection.java:121)
at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:890)
at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:885)
at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:643)
at io.debezium.jdbc.JdbcConnection.queryAndMap(JdbcConnection.java:517)
at io.debezium.connector.oracle.OracleConnection.resolveOracleDatabaseVersion(OracleConnection.java:129)
... 10 more
I am referring to this link for the Oracle setup and connector configuration:
https://debezium.io/documentation/reference/connectors/oracle.html#setting-up-oracle
connector-configuration.properties
name=debeziumoraclesource
connector.class=io.debezium.connector.oracle.OracleConnector
database.hostname=**.*.**.**
database.port=1521
database.user=username
database.password=password
database.dbname=CDBNAME
database.server.name=**.*.**.**
tasks.max=1
database.pdb.name=PDBNAME
database.history.kafka.bootstrap.servers=kafka:9092
database.history.kafka.topic=history.ENTITY_GROUP_PARAMETER_VALUES
database.connection.adaptor=logminer
snapshot.mode=initial
table.include.list=schema.ENTITY_GROUP_PARAMETER_VALUES
Also, I have downloaded ojdbc8.jar and placed it inside the kafka/libs folder. I have tried different versions of the jar, such as ojdbc10 and several versions of ojdbc8, but nothing helped. Note that I am using Oracle 19c. Please help me resolve this issue. Thanks in advance.
Using OJDBC6.jar with all its dependencies helped me resolve the issue. Most importantly, I placed the jars in the connector's lib folder.
Using OJDBC8.jar helped me solve the problem; it appeared because I hadn't placed the jar on all the servers that were running Kafka Connect.
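For context, a Kafka Connect worker typically resolves a connector's dependencies from the plugin's own directory under the worker's plugin.path setting, which is why the driver has to sit next to the connector jars on every worker rather than only in kafka/libs. A minimal sketch of the layout (the paths are hypothetical, adjust to your installation):

# connect worker config (path is hypothetical)
plugin.path=/opt/kafka/connect
# then place the driver next to the Debezium jars on EVERY worker, e.g.:
#   /opt/kafka/connect/debezium-connector-oracle/ojdbc8.jar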
I've installed WSO2 Integration Studio version 6.5.0 on my Windows workstation and created a project using the built-in Kafka Consumer and Producer template.
Then I configured the project with my own Kafka server settings (topic name "myTopic").
I then right-clicked the composite application and chose Export Project Artifacts and Run.
The Console window displayed at the very top the following messages:
[2019-06-25 09:23:45,499] [micro-integrator] INFO - LibraryArtifactDeployer Synapse Library named '{org.wso2.carbon.connector}kafkaTransport' has been deployed from file : C:\IntegrationStudio\runtime\microesb\tmp\carbonapps\-1234\1561465425230TestCompositeApplication_1.0.0.car\kafkaTransport-connector_2.0.6\kafkaTransport-connector-2.0.6.zip
[2019-06-25 09:23:45,517] [micro-integrator] INFO - SynapseImportFactory Successfully created Synapse Import: kafkaTransport
[2019-06-25 09:23:45,533] [micro-integrator] ERROR - ClassMediatorFactory
Error in instantiating class :
org.wso2.carbon.connector.KafkaProduceConnector
java.lang.NoClassDefFoundError: org/apache/kafka/common/header/Headers
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
[snipped rest for clarity]
I've tried reinstalling Integration Studio and running it with elevated rights, to no avail.
I expected the project to be deployed normally.
EDIT: after copying:
kafka_2.11-2.2.1.jar
metrics-core-2.2.0.jar
zkclient-0.11.jar
kafka-clients-2.2.1.jar
scala-library-2.11.12.jar
zookeeper-3.4.13.jar
to the EI_HOME/lib directory, the exception changed to:
org.apache.axis2.deployment.DeploymentException: kafka/consumer/ConsumerTimeoutException
at org.apache.synapse.deployers.AbstractSynapseArtifactDeployer.deploy(AbstractSynapseArtifactDeployer.java:219)
at org.wso2.carbon.application.deployer.synapse.SynapseAppDeployer.deployArtifactType(SynapseAppDeployer.java:1099)
at org.wso2.carbon.application.deployer.synapse.SynapseAppDeployer.deployArtifacts(SynapseAppDeployer.java:114)
at org.wso2.carbon.application.deployer.internal.ApplicationManager.deployCarbonApp(ApplicationManager.java:272)
at org.wso2.carbon.application.deployer.CappAxis2Deployer.deploy(CappAxis2Deployer.java:72)
at org.apache.axis2.deployment.repository.util.DeploymentFileData.deploy(DeploymentFileData.java:136)
at org.apache.axis2.deployment.DeploymentEngine.doDeploy(DeploymentEngine.java:807)
at org.apache.axis2.deployment.repository.util.WSInfoList.update(WSInfoList.java:144)
[snipped for clarity]
Caused by: org.apache.axis2.deployment.DeploymentException: kafka/consumer/ConsumerTimeoutException
at org.apache.synapse.deployers.AbstractSynapseArtifactDeployer.deploy(AbstractSynapseArtifactDeployer.java:207)
... 87 more
Caused by: java.lang.NoClassDefFoundError: kafka/consumer/ConsumerTimeoutException
at org.wso2.carbon.inbound.endpoint.protocol.kafka.KAFKAPollingConsumer.startsMessageListener(KAFKAPollingConsumer.java:90)
at org.wso2.carbon.inbound.endpoint.protocol.kafka.KAFKAProcessor.init(KAFKAProcessor.java:96)
at org.apache.synapse.inbound.InboundEndpoint.init(InboundEndpoint.java:79)
at org.apache.synapse.deployers.InboundEndpointDeployer.deploySynapseArtifact(InboundEndpointDeployer.java:57)
at org.apache.synapse.deployers.AbstractSynapseArtifactDeployer.deploy(AbstractSynapseArtifactDeployer.java:197)
... 87 more
Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerTimeoutException cannot be found by synapse-core_2.1.7.wso2v111
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:475)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 92 more
Have you copied the required jars from the kafka_home/libs folder to EI_home/lib? If yes, then please share your code so the issue can be looked at in detail.
According to this documentation, https://docs.wso2.com/display/EI650/Kafka+Inbound+Protocol, the recommended Kafka version is kafka_2.9.2-0.8.1.1. You can download it from http://kafka.apache.org/downloads.html. Please use those jars and copy them to EI_HOME/lib. There is a GitHub issue for this as well: https://github.com/wso2/product-ei/issues/2239
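For example, assuming the downloaded distribution was extracted next to EI_HOME (paths are illustrative):

$ cp kafka_2.9.2-0.8.1.1/libs/*.jar /path/to/EI_HOME/lib/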
I may be a bit too late, but we have been using a custom inbound endpoint for Kafka. We faced exactly the same issue, and that was the only way to fix it.
You could use https://github.com/wso2-extensions/esb-inbound-kafka/blob/master/docs/config.md to configure it.
I am trying to run a Scala Play app and got this exception:
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'memo'
at com.typesafe.config.impl.SimpleConfig.findKeyOrNull(SimpleConfig.java:152) ~[config-1.3.1.jar:na]
at com.typesafe.config.impl.SimpleConfig.findOrNull(SimpleConfig.java:170) ~[config-1.3.1.jar:na]
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:184) ~[config-1.3.1.jar:na]
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:189) ~[config-1.3.1.jar:na]
at com.typesafe.config.impl.SimpleConfig.getObject(SimpleConfig.java:264) ~[config-1.3.1.jar:na]
at com.typesafe.config.impl.SimpleConfig.getConfig(SimpleConfig.java:270) ~[config-1.3.1.jar:na]
at com.typesafe.config.impl.SimpleConfig.getConfig(SimpleConfig.java:37) ~[config-1.3.1.jar:na]
at lila.common.PlayApp$.loadConfig(PlayApp.scala:24) ~[na:na]
at lila.memo.Env$.lila$memo$Env$$$anonfun$1(Env.scala:28) ~[na:na]
at lila.common.Chronometer$.sync(Chronometer.scala:56) ~[na:na]
I passed an external reference to the configuration file, such as:
run -Dconfig.resource=conf/base.conf
I am new to Scala. I tried to find config-1.3.1.jar but was unable to. Can you please suggest how to overcome this situation?
In base.conf, the configuration for the memo key is present.
Try it without conf/: run -Dconfig.resource=base.conf
From Play 2.6 docs
Using -Dconfig.resource
This will search for an alternative configuration file in the application classpath (you usually provide these alternative configuration files into your application conf/ directory before packaging). Play will look into conf/ so you don’t have to add conf/.
$ /path/to/bin/ -Dconfig.resource=prod.conf
Using -Dconfig.file
You can also specify another local configuration file not packaged into the application artifacts:
$ /path/to/bin/ -Dconfig.file=/opt/conf/prod.conf
If you are configuring an sbt task from IntelliJ, you must quote the entire command.
"run -Dconfig.resource=prod.conf"
I am trying to import a table from my Oracle database using Spark, and I am using Scala to import the table.
My JDBC driver is ojdbc7.jar, and it is added to both the spark.driver.extraClassPath and spark.executor.extraClassPath parameters in the configuration file:
spark.driver.extraClassPath :/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*:/home/hadoop/ojdbc7.jar
spark.driver.extraLibraryPath /usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native
spark.executor.extraClassPath :/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*:/home/hadoop/ojdbc7.jar
spark.executor.extraLibraryPath /usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native
I can successfully import the table and print its schema. But when performing any action such as count() or show(), it throws the error below:
Caused by: java.lang.ClassNotFoundException: oracle.jdbc.OracleDriver
at java.lang.ClassLoader.findClass(ClassLoader.java:530)
at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:34)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:77)
... 21 more
This error occurred because Spark was not able to locate ojdbc7.jar on every core node. Placing the jar in a shared location such as /usr/lib/spark/jars resolves the issue.
You can also do a few other things, including adding the jar file's full path as an artifact dependency under the spark section of the interpreter settings.
If you just want %jdbc to work, update the jdbc section under the interpreter: add the jar file's full path as an artifact under the dependencies, and update default.driver, default.url, default.user, and default.password accordingly.
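If you are submitting the job yourself, another option is to ship the jar with --jars (which distributes it to the executors) and name the driver class explicitly on the read. A minimal sketch in Java; the URL, table, and credentials are placeholders:

// submit with: spark-submit --jars /home/hadoop/ojdbc7.jar ...
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OracleRead {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("oracle-import").getOrCreate();
        Dataset<Row> df = spark.read()
                .format("jdbc")
                .option("url", "jdbc:oracle:thin:@//dbhost:1521/SERVICE") // placeholder
                .option("dbtable", "SCHEMA.TABLE")                        // placeholder
                .option("user", "username")
                .option("password", "password")
                .option("driver", "oracle.jdbc.OracleDriver") // load the driver class explicitly
                .load();
        // Actions such as count()/show() run on executors, so the jar must be there too
        System.out.println(df.count());
        spark.stop();
    }
}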
When I try to 'Activate' my newly deployed .WAR file on WebLogic, I get an error in the AdminServer log file.
weblogic.management.DeploymentException: Exception occured while downloading files
at weblogic.deploy.internal.targetserver.datamanagement.AppDataUpdate.doDownload(AppDataUpdate.java:49)
at weblogic.deploy.internal.targetserver.datamanagement.DataUpdate.download(DataUpdate.java:57)
at weblogic.deploy.internal.targetserver.datamanagement.Data.prepareDataUpdate(Data.java:117)
at weblogic.deploy.internal.targetserver.BasicDeployment.prepareDataUpdate(BasicDeployment.java:750)
at weblogic.deploy.internal.targetserver.operations.AbstractOperation.prepareDataUpdate(AbstractOperation.java:918)
Caused By:
java.io.IOException: There is not enough space on the disk
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:326)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:121)
There is plenty of space on the E drive, where WebLogic and the application reside. I tried moving the log files out of tmp and restarted one of the instances (there are 2 instances, load balanced), but it didn't work.
Any suggestions? Thanks in advance.
I should have checked both servers on which my application runs. The A10P002 server's disk was out of space. I deleted the old .log files, the server is back to normal, and I am able to deploy. No more 'java.io.IOException'.
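For anyone hitting this later: the deployment is staged to every target server, so the space check has to happen on each managed server host, not just the admin host. A minimal Java sketch for checking usable space on a given drive (the path is illustrative):

import java.io.File;

public class DiskCheck {
    public static void main(String[] args) {
        // Point this at the drive or staging directory each managed server writes to
        File target = new File("E:\\");
        long freeGb = target.getUsableSpace() / (1024L * 1024 * 1024);
        System.out.println("Usable space on " + target + ": " + freeGb + " GB");
    }
}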