PyFlink on YARN, Kafka, NoClassDefFoundError - apache-kafka

I deployed my PyFlink job on YARN; it includes a Kafka consumer.
My run command:
/opt/flink/bin/flink run -m yarn-cluster -yid application_1634021687380_0009 --jarfile=/opt/flink/lib/flink-sql-connector-kafka_2.11-1.13.2.jar -pyarch venv.zip -pyexec venv.zip/venv/bin/python -py demo.py
When I submit my job, I get these errors:
<pre>SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flink/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2021-10-13 14:41:14,387 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli [] - Found Yarn properties file under /tmp/.yarn-properties-flink.
2021-10-13 14:41:14,387 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli [] - Found Yarn properties file under /tmp/.yarn-properties-flink.
PID 32298: initializing main ...
Traceback (most recent call last):
File "/opt/demo.py", line 92, in <module>
GenUniqueIPAssest()
File "/opt/demo.py", line 84, in GenUniqueIPAssest
source_table.select(
File "/opt/flink/opt/python/pyflink.zip/pyflink/table/table.py", line 1056, in execute _insert
File "/opt/flink/opt/python/py4j-0.10.8.1-src.zip/py4j/java_gateway.py", line 1285, in __call__
File "/opt/flink/opt/python/pyflink.zip/pyflink/util/exceptions.py", line 146, in deco
File "/opt/flink/opt/python/py4j-0.10.8.1-src.zip/py4j/protocol.py", line 326, in get_ return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o64.executeInsert.
: java.lang.NoClassDefFoundError: org/apache/kafka/common/errors/InvalidTxnStateExceptio n
at org.apache.flink.streaming.connectors.kafka.table.KafkaDynamicSink.createKafk aProducer(KafkaDynamicSink.java:329)
at org.apache.flink.streaming.connectors.kafka.table.KafkaDynamicSink.getSinkRun timeProvider(KafkaDynamicSink.java:175)
at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecSink.createSi nkTransformation(CommonExecSink.java:118)
at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translat eToPlanInternal(StreamExecSink.java:130)
at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(E xecNodeBase.java:134)
at org.apache.flink.table.planner.delegation.StreamPlanner.$anonfun$translateToP lan$1(StreamPlanner.scala:70)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
at scala.collection.Iterator.foreach(Iterator.scala:937)
at scala.collection.Iterator.foreach$(Iterator.scala:937)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
at scala.collection.IterableLike.foreach(IterableLike.scala:70)
at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike.map(TraversableLike.scala:233)
at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:69)
at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:165)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1518)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:740)
at org.apache.flink.table.api.internal.TableImpl.executeInsert(TableImpl.java:572)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.errors.InvalidTxnStateException
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 32 more
org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1
at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:134)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl. java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
Caused by: java.lang.RuntimeException: Python process exits with code: 1
at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124)
... 16 more
</pre>
My Environment:
jdk1.8.0_211
flink 1.13.2
centos 7.6
My jar classpath: /opt/flink/lib
flink-connector-kafka_2.12-1.13.2.jar
flink-sql-connector-kafka_2.11-1.13.2.jar
I suspect it can't find the Kafka jar library in my environment, even though I declare it with --jarfile and venv.zip. How can I solve this? Many thanks.
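One hedged thing to try, assuming the fat SQL connector jar is readable at the path below: register it from inside the job via pipeline.jars, so the client ships it with the job rather than relying on --jarfile alone. A minimal sketch for Flink 1.13, not a verified fix; the jar path is the one from this environment:
<pre>from pyflink.table import EnvironmentSettings, TableEnvironment

# Blink streaming planner, the recommended setup on Flink 1.13.
settings = EnvironmentSettings.new_instance() \
    .in_streaming_mode() \
    .use_blink_planner() \
    .build()
t_env = TableEnvironment.create(settings)

# flink-sql-connector-kafka is the self-contained ("fat") jar; it bundles
# kafka-clients, which is where InvalidTxnStateException lives.
t_env.get_config().get_configuration().set_string(
    "pipeline.jars",
    "file:///opt/flink/lib/flink-sql-connector-kafka_2.11-1.13.2.jar")
</pre>
Note that flink-connector-kafka (the thin jar) does not bundle kafka-clients, and mixing _2.11 and _2.12 Scala suffixes under /opt/flink/lib can itself cause class-loading trouble, so keeping only the fat jar that matches the distribution's Scala version seems safer.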

Related

Drools Workbench not configuration in Wildfly 10

I am using Drools Workbench 6.5.0 and trying to configure it in Wildfly 10. I am getting the error below.
Cannot upload deployment: {"WFLYCTL0080: Failed services" => {"jboss.deployment.unit."kie-drools-wb-6.5.0.Final-wildfly10.war".WeldStartService" => "org.jboss.msc.service.StartException in service jboss.deployment.unit."kie-drools-wb-6.5.0.Final-wildfly10.war".WeldStartService: Failed to start service
Caused by: org.jboss.weld.exceptions.DeploymentException: Exception List with 1 exceptions:
Exception 0 : org.jboss.weld.exceptions.WeldException: WELD-000049: Unable to invoke public void org.kie.workbench.drools.backend.server.AppSetup.assertPlayground() on org.kie.workbench.drools.backend.server.AppSetup#26107fd0
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.invokeMethods(DefaultLifecycleCallbackInvoker.java:100)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.postConstruct(DefaultLifecycleCallbackInvoker.java:81)
at org.jboss.weld.injection.producer.BasicInjectionTarget.postConstruct(BasicInjectionTarget.java:126)
at org.jboss.weld.bean.ManagedBean.create(ManagedBean.java:171)
at org.jboss.weld.context.AbstractContext.get(AbstractContext.java:96)
at org.jboss.weld.bean.ContextualInstanceStrategy$DefaultContextualInstanceStrategy.get(ContextualInstanceStrategy.java:101)
at org.jboss.weld.bean.ContextualInstanceStrategy$ApplicationScopedContextualInstanceStrategy.get(ContextualInstanceStrategy.java:141)
at org.jboss.weld.bean.ContextualInstance.get(ContextualInstance.java:50)
at org.jboss.weld.bean.proxy.ContextBeanInstance.getInstance(ContextBeanInstance.java:99)
at org.jboss.weld.bean.proxy.ProxyMethodHandler.getInstance(ProxyMethodHandler.java:125)
at org.kie.workbench.drools.backend.server.AppSetup$Proxy$_$$WeldClientProxy.toString(Unknown Source)
at org.uberfire.backend.server.cdi.SystemConfigProducer.runPostConstruct(SystemConfigProducer.java:162)
at org.uberfire.backend.server.cdi.SystemConfigProducer.afterDeploymentValidation(SystemConfigProducer.java:143)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jboss.weld.injection.StaticMethodInjectionPoint.invoke(StaticMethodInjectionPoint.java:88)
at org.jboss.weld.injection.MethodInvocationStrategy$SpecialParamPlusBeanManagerStrategy.invoke(MethodInvocationStrategy.java:144)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:309)
at org.jboss.weld.event.ExtensionObserverMethodImpl.sendEvent(ExtensionObserverMethodImpl.java:124)
at org.jboss.weld.event.ObserverMethodImpl.sendEvent(ObserverMethodImpl.java:287)
at org.jboss.weld.event.ObserverMethodImpl.notify(ObserverMethodImpl.java:265)
at org.jboss.weld.event.ObserverNotifier.notifySyncObservers(ObserverNotifier.java:271)
at org.jboss.weld.event.ObserverNotifier.notify(ObserverNotifier.java:260)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:154)
at org.jboss.weld.event.ObserverNotifier.fireEvent(ObserverNotifier.java:148)
at org.jboss.weld.bootstrap.events.AbstractContainerEvent.fire(AbstractContainerEvent.java:53)
at org.jboss.weld.bootstrap.events.AbstractDeploymentContainerEvent.fire(AbstractDeploymentContainerEvent.java:35)
at org.jboss.weld.bootstrap.events.AfterDeploymentValidationImpl.fire(AfterDeploymentValidationImpl.java:28)
at org.jboss.weld.bootstrap.WeldStartup.validateBeans(WeldStartup.java:450)
at org.jboss.weld.bootstrap.WeldBootstrap.validateBeans(WeldBootstrap.java:90)
at org.jboss.as.weld.WeldStartService.start(WeldStartService.java:96)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.startService(ServiceControllerImpl.java:1948)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.run(ServiceControllerImpl.java:1881)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jboss.weld.injection.producer.DefaultLifecycleCallbackInvoker.invokeMethods(DefaultLifecycleCallbackInvoker.java:98)
... 37 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.util.NoSuchElementException
at org.kie.workbench.drools.backend.server.AppSetup.assertPlayground(AppSetup.java:195)
... 42 more
Caused by: java.lang.RuntimeException: java.util.NoSuchElementException
at org.guvnor.structure.backend.repositories.RepositoryServiceImpl.createRepository(RepositoryServiceImpl.java:283)
at org.guvnor.structure.backend.repositories.RepositoryServiceImpl$Proxy$$$WeldClientProxy.createRepository(Unknown Source)
at org.kie.workbench.drools.backend.server.AppSetup.createRepository(AppSetup.java:330)
at org.kie.workbench.drools.backend.server.AppSetup.assertPlayground(AppSetup.java:119)
... 42 more
Caused by: java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:862)
at org.uberfire.java.nio.fs.jgit.JGitFileSystem$1$1.next(JGitFileSystem.java:194)
at org.uberfire.java.nio.fs.jgit.JGitFileSystem$1$1.next(JGitFileSystem.java:173)
at org.guvnor.structure.backend.repositories.git.GitRepositoryBuilder.getDefaultRoot(GitRepositoryBuilder.java:120)
at org.guvnor.structure.backend.repositories.git.GitRepositoryBuilder.setBranches(GitRepositoryBuilder.java:98)
at org.guvnor.structure.backend.repositories.git.GitRepositoryBuilder.build(GitRepositoryBuilder.java:63)
at org.guvnor.structure.backend.repositories.git.GitRepositoryFactoryHelper.newRepository(GitRepositoryFactoryHelper.java:64)
at org.guvnor.structure.backend.repositories.git.GitRepositoryFactoryHelper$Proxy$$$WeldClientProxy.newRepository(Unknown Source)
at org.guvnor.structure.backend.repositories.RepositoryFactoryImpl.newRepository(RepositoryFactoryImpl.java:61)
at org.guvnor.structure.backend.repositories.RepositoryFactoryImpl$Proxy$$$_WeldClientProxy.newRepository(Unknown Source)
at org.guvnor.structure.backend.repositories.RepositoryServiceImpl.createRepository(RepositoryServiceImpl.java:96)
at org.guvnor.structure.backend.repositories.RepositoryServiceImpl.createRepository(RepositoryServiceImpl.java:279)
... 45 more "}}
Machine: Windows 10
Wildfly Server: 10.0
Kie Server 6.5.0 - working fine
The only problem is Drools Workbench.
Drools Workbench 6.5.0 targets Wildfly 8; see the official documentation.
If you have to use Wildfly 10, please consider Drools Workbench 7.0.0.

Writing Spark Sql Dataframe to disk as parquet/csv

I'm reading a dataset as a Spark SQL dataframe of shape (70000, 38) using read.csv and then writing it to disk as parquet/csv using dataframe.write. It doesn't throw any error; all fine!
When I read the dataset as a Spark SQL dataframe of shape (70000, 38), do some ETL operations, and try to write the resulting Spark SQL dataframe of shape (80000, 40) to disk as parquet/csv using dataframe.write, it throws an error.
Could this be the reason? My resources got exhausted during the ETL operations, leaving Spark unable to write the dataframe to disk.
My PySpark session configuration is:
[('spark.rdd.compress', 'True'),
('spark.driver.port', '54782'),
('spark.app.id', 'local-1585668573104'),
('spark.serializer.objectStreamReset', '100'),
('spark.master', 'local[*]'),
('spark.submit.pyFiles', ''),
('spark.executor.id', 'driver'),
('spark.submit.deployMode', 'client'),
('spark.ui.showConsoleProgress', 'true'),
('spark.rpc.message.maxSize', '1024'),
('spark.app.name', 'pyspark-shell'),
('spark.driver.host', 'XXXXXXXXXXX.net')]
System configuration: 8 cores/CPUs and 8 GB memory/RAM.
I will be grateful for any solution.
PySpark commands:
filepath = directory path of the file to be saved
dataframe.write.format('csv').option("header", "true").save(filepath)
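If resource exhaustion is the suspicion, a hedged experiment is to give the local driver more memory and fewer shuffle partitions when building the session. A minimal sketch with illustrative values for an 8 GB / 8 core machine (the app name is an assumption; these are not verified settings):
<pre>from pyspark.sql import SparkSession

# spark.driver.memory must be set before the JVM starts, so this only
# takes effect when the script itself creates the session (not inside an
# already-running pyspark shell).
spark = (
    SparkSession.builder
    .appName("pyspark-etl")
    .master("local[*]")
    .config("spark.driver.memory", "6g")          # default is only 1g
    .config("spark.sql.shuffle.partitions", "8")  # match the 8 cores
    .getOrCreate()
)
</pre>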
Traceback of Error
Py4JJavaError: An error occurred while calling o568.save.
: org.apache.spark.SparkException: Job aborted.
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:226)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:178)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:123)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:173)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:211)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:208)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:169)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:110)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:109)
at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:828)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$4(SQLExecution.scala:100)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:87)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:828)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:309)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:236)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 1 times, most recent failure: Lost task 2.0 in stage 6.0 (TID 235, IN2391797W1.ey.net, executor driver): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "C:\spark\spark-3.0.0-preview2-bin-hadoop2.7\python\lib\pyspark.zip\pyspark\worker.py", line 577, in main
File "C:\spark\spark-3.0.0-preview2-bin-hadoop2.7\python\lib\pyspark.zip\pyspark\serializers.py", line 835, in read_int
length = stream.read(4)
File "C:\Users\Sravan.Tallozu\AppData\Local\Continuum\anaconda3\lib\socket.py", line 589, in readinto
return self._sock.recv_into(b)
socket.timeout: timed out
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:484)
at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$2.read(PythonUDFRunner.scala:81)
at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$2.read(PythonUDFRunner.scala:64)
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:437)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:489)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage6.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:726)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage8.agg_doAggregateWithKeys_0$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage8.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:726)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:132)
at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:441)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:444)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1989)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1977)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1976)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1976)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:956)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:956)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:956)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2206)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2155)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2144)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:758)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2116)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:195)
... 30 more
Caused by: org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "C:\spark\spark-3.0.0-preview2-bin-hadoop2.7\python\lib\pyspark.zip\pyspark\worker.py", line 577, in main
File "C:\spark\spark-3.0.0-preview2-bin-hadoop2.7\python\lib\pyspark.zip\pyspark\serializers.py", line 835, in read_int
length = stream.read(4)
File "C:\Users\Sravan.Tallozu\AppData\Local\Continuum\anaconda3\lib\socket.py", line 589, in readinto
return self._sock.recv_into(b)
socket.timeout: timed out
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:484)
at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$2.read(PythonUDFRunner.scala:81)
at org.apache.spark.sql.execution.python.PythonUDFRunner$$anon$2.read(PythonUDFRunner.scala:64)
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:437)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:489)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage6.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:726)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage8.agg_doAggregateWithKeys_0$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage8.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:726)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:132)
at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:441)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:444)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
... 1 more

pyspark checkpoint fails on local machine

I've just started learning PySpark, running standalone on a local machine. I can't get checkpointing to work. I boiled the script down to this...
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("PyTest").master("local[*]").getOrCreate()
spark.sparkContext.setCheckpointDir("/RddCheckPoint")
df = spark.createDataFrame(["10","11","13"], "string").toDF("age")
df.checkpoint()
and I get this output...
>>> spark = SparkSession.builder.appName("PyTest").master("local[*]").getOrCreate()
>>>
>>> spark.sparkContext.setCheckpointDir("/RddCheckPoint")
>>> df = spark.createDataFrame(["10","11","13"], "string").toDF("age")
>>> df.checkpoint()
20/01/24 15:26:45 WARN SizeEstimator: Failed to check whether UseCompressedOops is set; assuming yes
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "N:\spark\python\pyspark\sql\dataframe.py", line 463, in checkpoint
jdf = self._jdf.checkpoint(eager)
File "N:\spark\python\lib\py4j-0.10.8.1-src.zip\py4j\java_gateway.py", line 1286, in __call__
File "N:\spark\python\pyspark\sql\utils.py", line 98, in deco
return f(*a, **kw)
File "N:\spark\python\lib\py4j-0.10.8.1-src.zip\py4j\protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o71.checkpoint.
: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:645)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1230)
at org.apache.hadoop.fs.FileUtil.list(FileUtil.java:1435)
at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:493)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1868)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1910)
at org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:678)
at org.apache.spark.rdd.ReliableCheckpointRDD.getPartitions(ReliableCheckpointRDD.scala:74)
at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:276)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:272)
at org.apache.spark.rdd.ReliableCheckpointRDD$.writeRDDToCheckpointDirectory(ReliableCheckpointRDD.scala:179)
at org.apache.spark.rdd.ReliableRDDCheckpointData.doCheckpoint(ReliableRDDCheckpointData.scala:59)
at org.apache.spark.rdd.RDDCheckpointData.checkpoint(RDDCheckpointData.scala:75)
at org.apache.spark.rdd.RDD.$anonfun$doCheckpoint$1(RDD.scala:1801)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDD.doCheckpoint(RDD.scala:1791)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2118)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2137)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2156)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2181)
at org.apache.spark.rdd.RDD.count(RDD.scala:1227)
at org.apache.spark.sql.Dataset.$anonfun$checkpoint$1(Dataset.scala:689)
at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3472)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$4(SQLExecution.scala:100)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:87)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3468)
at org.apache.spark.sql.Dataset.checkpoint(Dataset.scala:680)
at org.apache.spark.sql.Dataset.checkpoint(Dataset.scala:643)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Unknown Source)
The error doesn't give any specifics about why it failed. I suspect I have missed some Spark config, but I'm not sure what...
You get this error either because the checkpoint directory was not created or because you don't have permission to write to it (the checkpoint directory is under the root directory "/").
import os
from pyspark.sql import SparkSession

# Create the checkpoint directory in a writable location first.
os.makedirs("RddCheckPoint", exist_ok=True)
spark = SparkSession.builder.appName("PyTest").master("local[*]").getOrCreate()
# Point the checkpoint dir at the relative, writable path instead of "/".
spark.sparkContext.setCheckpointDir("RddCheckPoint")
df = spark.createDataFrame(["10","11","13"], "string").toDF("age")
df.checkpoint()

CloudFoundry quickstart problems: none of the proposed fixes work

I followed this guide: https://github.com/cloudfoundry/uaa#quick-start
First problem: after running "./gradlew run", it gets stuck at cargoRunLocal.
<============-> 96% EXECUTING [7m 57s]
> :cargoRunLocal
Then I found out that changing this build.gradle line worked for some people. I found some variations and tried all of them, with no luck.
-DLOGIN_CONFIG_URL=file://"+new
File(".").absolutePath+"/uaa/src/main/resources/required_configuration.yml
with
-DLOGIN_CONFIG_URL=file:///"+new
File(".").absolutePath.replace("\\","/")+"/uaa/src/main/resources/required_configuration.yml
Then I noticed the default branch is "develop", so I switched to the "master" branch, but got the same results.
Here are some messages from the logs that might help in understanding the problem.
1
INFO --- UaaMetricsEmitter: Could not find UaaMetrics object on MBean server. Please deploy UAA in the same JVM.
2
cloudfoundry-identity-server/uaa - 4596 [main] .... ERROR --- RecognizeFailureDispatcherServlet: Unable to start UAA application.
org.springframework.beans.factory.BeanDefinitionStoreException: Invalid bean definition with name 'activeKeyService' defined in class path resource [spring/login-ui.xml]: Could not resolve placeholder 'encryption.active_key_label' in value "${encryption.active_key_label}"; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'encryption.active_key_label' in value "${encryption.active_key_label}"
at org.springframework.beans.factory.config.PlaceholderConfigurerSupport.doProcessProperties(PlaceholderConfigurerSupport.java:223)
at org.springframework.context.support.PropertySourcesPlaceholderConfigurer.processProperties(PropertySourcesPlaceholderConfigurer.java:180)
at org.springframework.context.support.PropertySourcesPlaceholderConfigurer.postProcessBeanFactory(PropertySourcesPlaceholderConfigurer.java:152)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:283)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:163)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:687)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:525)
at org.springframework.web.servlet.FrameworkServlet.configureAndRefreshWebApplicationContext(FrameworkServlet.java:668)
at org.springframework.web.servlet.FrameworkServlet.createWebApplicationContext(FrameworkServlet.java:634)
at org.springframework.web.servlet.FrameworkServlet.createWebApplicationContext(FrameworkServlet.java:682)
at org.springframework.web.servlet.FrameworkServlet.initWebApplicationContext(FrameworkServlet.java:553)
at org.springframework.web.servlet.FrameworkServlet.initServletBean(FrameworkServlet.java:494)
at org.springframework.web.servlet.HttpServletBean.init(HttpServletBean.java:171)
at javax.servlet.GenericServlet.init(GenericServlet.java:158)
at org.cloudfoundry.identity.uaa.web.RecognizeFailureDispatcherServlet.init(RecognizeFailureDispatcherServlet.java:56)
at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:1124)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1079)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:971)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4829)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5143)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:743)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:719)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:695)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:986)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1858)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:772)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:426)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1585)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:308)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
at org.apache.catalina.util.LifecycleBase.setStateInternal(LifecycleBase.java:424)
at org.apache.catalina.util.LifecycleBase.setState(LifecycleBase.java:367)
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:972)
at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:831)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1432)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1422)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:944)
at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:261)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.StandardService.startInternal(StandardService.java:422)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:801)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.startup.Catalina.start(Catalina.java:695)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:350)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:492)
Caused by: java.lang.IllegalArgumentException: Could not resolve placeholder 'encryption.active_key_label' in value "${encryption.active_key_label}"
at org.springframework.util.PropertyPlaceholderHelper.parseStringValue(PropertyPlaceholderHelper.java:174)
at org.springframework.util.PropertyPlaceholderHelper.replacePlaceholders(PropertyPlaceholderHelper.java:126)
at org.springframework.core.env.AbstractPropertyResolver.doResolvePlaceholders(AbstractPropertyResolver.java:236)
at org.springframework.core.env.AbstractPropertyResolver.resolveRequiredPlaceholders(AbstractPropertyResolver.java:210)
at org.springframework.context.support.PropertySourcesPlaceholderConfigurer$2.resolveStringValue(PropertySourcesPlaceholderConfigurer.java:172)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.resolveStringValue(BeanDefinitionVisitor.java:282)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.resolveValue(BeanDefinitionVisitor.java:204)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.visitIndexedArgumentValues(BeanDefinitionVisitor.java:150)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.visitBeanDefinition(BeanDefinitionVisitor.java:84)
at org.springframework.beans.factory.config.PlaceholderConfigurerSupport.doProcessProperties(PlaceholderConfigurerSupport.java:220)
3
Caused by: java.lang.IllegalArgumentException: Could not resolve placeholder 'encryption.active_key_label' in value "${encryption.active_key_label}"
at org.springframework.util.PropertyPlaceholderHelper.parseStringValue(PropertyPlaceholderHelper.java:174)
at org.springframework.util.PropertyPlaceholderHelper.replacePlaceholders(PropertyPlaceholderHelper.java:126)
at org.springframework.core.env.AbstractPropertyResolver.doResolvePlaceholders(AbstractPropertyResolver.java:236)
at org.springframework.core.env.AbstractPropertyResolver.resolveRequiredPlaceholders(AbstractPropertyResolver.java:210)
at org.springframework.context.support.PropertySourcesPlaceholderConfigurer$2.resolveStringValue(PropertySourcesPlaceholderConfigurer.java:172)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.resolveStringValue(BeanDefinitionVisitor.java:282)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.resolveValue(BeanDefinitionVisitor.java:204)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.visitIndexedArgumentValues(BeanDefinitionVisitor.java:150)
at org.springframework.beans.factory.config.BeanDefinitionVisitor.visitBeanDefinition(BeanDefinitionVisitor.java:84)
at org.springframework.beans.factory.config.PlaceholderConfigurerSupport.doProcessProperties(PlaceholderConfigurerSupport.java:220)
... 58 more
Extra facts: the Tomcat server starts, but going to "http://localhost:8080/uaa/" says "FAILURE".
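The unresolved placeholder points at missing UAA configuration rather than at Tomcat itself: whatever YAML file LOGIN_CONFIG_URL ends up loading has to define encryption.active_key_label. As a hedged sketch of the kind of stanza required_configuration.yml is expected to supply (the label and passphrase values below are placeholders, and the exact schema should be checked against the UAA docs):
<pre>encryption:
  active_key_label: key-1       # must match a label below
  encryption_keys:
    - label: key-1
      passphrase: CHANGE-ME-TO-A-LONG-PASSPHRASE
</pre>
If the file URL built in build.gradle is malformed on Windows (hence the file:// vs file:/// variants tried above), UAA never sees this stanza and fails with exactly this placeholder error.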

Cannot connect Pentaho Report Designer to a MySQL database

I would also like to ask whether Pentaho supports multiple sources in one report (e.g., a chart that takes its data from two different databases).
I am trying to use Pentaho to create some report charts. When I set up the configuration and click on Test, it generates this error:
Error connecting to database [sqltest] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database
Driver class 'org.gjt.mm.mysql.Driver' could not be found, make sure the 'MySQL' driver (jar file) is installed.
org.gjt.mm.mysql.Driver
org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database
Driver class 'org.gjt.mm.mysql.Driver' could not be found, make sure the 'MySQL' driver (jar file) is installed.
org.gjt.mm.mysql.Driver
at org.pentaho.di.core.database.Database.normalConnect(Database.java:415)
at org.pentaho.di.core.database.Database.connect(Database.java:353)
at org.pentaho.di.core.database.Database.connect(Database.java:306)
at org.pentaho.di.core.database.Database.connect(Database.java:294)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:84)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2459)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:541)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329)
at org.pentaho.ui.xul.swing.tags.SwingButton$OnClickRunnable.run(SwingButton.java:58)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:251)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:705)
at java.awt.EventQueue.access$000(EventQueue.java:101)
at java.awt.EventQueue$3.run(EventQueue.java:666)
at java.awt.EventQueue$3.run(EventQueue.java:664)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:675)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:211)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:128)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:121)
at java.awt.WaitDispatchSupport$2.run(WaitDispatchSupport.java:182)
at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:221)
at java.security.AccessController.doPrivileged(Native Method)
at java.awt.WaitDispatchSupport.enter(WaitDispatchSupport.java:219)
at java.awt.Dialog.show(Dialog.java:1072)
at java.awt.Component.show(Component.java:1650)
at java.awt.Component.setVisible(Component.java:1602)
at java.awt.Window.setVisible(Window.java:1013)
at java.awt.Dialog.setVisible(Dialog.java:1003)
at org.pentaho.ui.xul.swing.tags.SwingDialog.show(SwingDialog.java:238)
at org.pentaho.reporting.ui.datasources.jdbc.ui.XulDatabaseDialog.open(XulDatabaseDialog.java:254)
at org.pentaho.reporting.ui.datasources.jdbc.ui.ConnectionPanel$AddDataSourceAction.actionPerformed(ConnectionPanel.java:252)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2018)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2341)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252)
at java.awt.AWTEventMulticaster.mouseReleased(AWTEventMulticaster.java:289)
at java.awt.AWTEventMulticaster.mouseReleased(AWTEventMulticaster.java:289)
at java.awt.Component.processMouseEvent(Component.java:6504)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3321)
at java.awt.Component.processEvent(Component.java:6269)
at java.awt.Container.processEvent(Container.java:2229)
at java.awt.Component.dispatchEventImpl(Component.java:4860)
at java.awt.Container.dispatchEventImpl(Container.java:2287)
at java.awt.Component.dispatchEvent(Component.java:4686)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4832)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4492)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4422)
at java.awt.Container.dispatchEventImpl(Container.java:2273)
at java.awt.Window.dispatchEventImpl(Window.java:2713)
at java.awt.Component.dispatchEvent(Component.java:4686)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:707)
at java.awt.EventQueue.access$000(EventQueue.java:101)
at java.awt.EventQueue$3.run(EventQueue.java:666)
at java.awt.EventQueue$3.run(EventQueue.java:664)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:87)
at java.awt.EventQueue$4.run(EventQueue.java:680)
at java.awt.EventQueue$4.run(EventQueue.java:678)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:677)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:211)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:128)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:121)
at java.awt.WaitDispatchSupport$2.run(WaitDispatchSupport.java:182)
at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:221)
at java.security.AccessController.doPrivileged(Native Method)
at java.awt.WaitDispatchSupport.enter(WaitDispatchSupport.java:219)
at java.awt.Dialog.show(Dialog.java:1072)
at java.awt.Component.show(Component.java:1650)
at java.awt.Component.setVisible(Component.java:1602)
at java.awt.Window.setVisible(Window.java:1013)
at java.awt.Dialog.setVisible(Dialog.java:1003)
at org.pentaho.reporting.libraries.designtime.swing.CommonDialog.setVisible(CommonDialog.java:281)
at org.pentaho.reporting.libraries.designtime.swing.CommonDialog.performEdit(CommonDialog.java:193)
at org.pentaho.reporting.ui.datasources.jdbc.ui.JdbcDataSourceDialog.performConfiguration(JdbcDataSourceDialog.java:788)
at org.pentaho.reporting.ui.datasources.jdbc.JdbcDataSourcePlugin.performEdit(JdbcDataSourcePlugin.java:71)
at org.pentaho.reporting.designer.core.actions.report.AddDataFactoryAction.actionPerformed(AddDataFactoryAction.java:78)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2018)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2341)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.AbstractButton.doClick(AbstractButton.java:376)
at javax.swing.plaf.basic.BasicMenuItemUI.doClick(BasicMenuItemUI.java:833)
at javax.swing.plaf.basic.BasicMenuItemUI$Handler.mouseReleased(BasicMenuItemUI.java:877)
at java.awt.AWTEventMulticaster.mouseReleased(AWTEventMulticaster.java:289)
at java.awt.Component.processMouseEvent(Component.java:6504)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3321)
at java.awt.Component.processEvent(Component.java:6269)
at java.awt.Container.processEvent(Container.java:2229)
at java.awt.Component.dispatchEventImpl(Component.java:4860)
at java.awt.Container.dispatchEventImpl(Container.java:2287)
at java.awt.Component.dispatchEvent(Component.java:4686)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4832)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4492)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4422)
at java.awt.Container.dispatchEventImpl(Container.java:2273)
at java.awt.Window.dispatchEventImpl(Window.java:2713)
at java.awt.Component.dispatchEvent(Component.java:4686)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:707)
at java.awt.EventQueue.access$000(EventQueue.java:101)
at java.awt.EventQueue$3.run(EventQueue.java:666)
at java.awt.EventQueue$3.run(EventQueue.java:664)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:87)
at java.awt.EventQueue$4.run(EventQueue.java:680)
at java.awt.EventQueue$4.run(EventQueue.java:678)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:677)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:211)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:128)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:117)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:113)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:90)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Driver class 'org.gjt.mm.mysql.Driver' could not be found, make sure the 'MySQL' driver (jar file) is installed.
org.gjt.mm.mysql.Driver
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:474)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:399)
... 123 more
Caused by: java.lang.ClassNotFoundException: org.gjt.mm.mysql.Driver
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:467)
... 124 more
Hostname : localhost
Port : 3306
Database name : alger_fa
No, Pentaho does not support multiple sources in one report.
Have you placed the jar file mysql-connector-java-5.1.10 under the location below?
\pentaho_report_designer_working\report-designer\lib\jdbc\
If not, download the jar and put it in the location mentioned above; that will solve your problem.