Exception when using UIMA's Annotation Browser View - Eclipse

When using the UIMA workbench in Eclipse, I occasionally run into a NullPointerException when trying to open the Annotation Browser View.
It doesn't happen every time; it only occurs when I run a pipeline that overwrites an existing (and potentially open) XMI file.
I have no idea how to fix this and haven't found anything online.
Any suggestions on what I could do?
Thanks!
I get the following message (multiple times) in my Error Log:
Problems occurred when invoking code from plug-in: "org.apache.uima.caseditor".
java.lang.NullPointerException
at org.apache.uima.ruta.caseditor.view.tree.AnnotationTreeViewPage.dispose(AnnotationTreeViewPage.java:384)
at org.apache.uima.caseditor.editor.CasEditorViewPage.setCASViewPage(CasEditorViewPage.java:202)
at org.apache.uima.caseditor.editor.CasEditorView.createViewPage(CasEditorView.java:123)
at org.apache.uima.caseditor.editor.CasEditorView.access$000(CasEditorView.java:49)
at org.apache.uima.caseditor.editor.CasEditorView$2.casDocumentChanged(CasEditorView.java:162)
at org.apache.uima.caseditor.editor.AnnotationEditor$6.run(AnnotationEditor.java:1012)
at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:45)
at org.apache.uima.caseditor.editor.AnnotationEditor.doSetInput(AnnotationEditor.java:1009)
at org.eclipse.ui.texteditor.AbstractTextEditor.setInputWithNotify(AbstractTextEditor.java:4248)
at org.eclipse.ui.texteditor.AbstractTextEditor.setInput(AbstractTextEditor.java:4268)
at org.apache.uima.caseditor.editor.AnnotationEditor.handleElementContentReplaced(AnnotationEditor.java:905)
at org.eclipse.ui.texteditor.AbstractTextEditor$ElementStateListener.lambda$3(AbstractTextEditor.java:466)
at org.eclipse.swt.widgets.RunnableLock.run(RunnableLock.java:40)
at org.eclipse.swt.widgets.Synchronizer.runAsyncMessages(Synchronizer.java:185)
at org.eclipse.swt.widgets.Display.runAsyncMessages(Display.java:3919)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3550)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine$5.run(PartRenderingEngine.java:1173)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:338)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine.run(PartRenderingEngine.java:1062)
at org.eclipse.e4.ui.internal.workbench.E4Workbench.createAndRunUI(E4Workbench.java:155)
at org.eclipse.ui.internal.Workbench.lambda$3(Workbench.java:635)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:338)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:559)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:150)
at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:155)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:203)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:137)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:107)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:400)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:255)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:660)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:597)
at org.eclipse.equinox.launcher.Main.run(Main.java:1468)
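One workaround I am considering (untested, just an idea based on the symptom): close any open CAS editors before the pipeline overwrites their XMI files, so the view never has to dispose a page for a replaced document. A minimal sketch using the standard Eclipse workbench API:

import org.eclipse.ui.IWorkbenchPage;
import org.eclipse.ui.PlatformUI;

// Untested idea: close all open editors (true = prompt to save dirty ones)
// before re-running the pipeline that rewrites the XMI files. Must run on
// the UI thread with an active workbench window.
IWorkbenchPage page = PlatformUI.getWorkbench()
        .getActiveWorkbenchWindow().getActivePage();
page.closeAllEditors(true);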

Related

Getting Null Pointer Exception when mapping SQL Server Database to MySQL Database with MapReduce

I am new to Cloud Data Fusion and am trying to map tables from a SQL Server database to a MySQL database.
I have already faced many issues, which I managed to solve, namely:
Fixed permissions for the service account so it could access all the resources it required;
Added my IP to the allowed connections in my SQL Server;
Set system.profile.properties.dataproc:dataproc.conscrypt.provider.enable = false to prevent the SSL bug reported in another question.
After this last fix, I am now trying to deal with a NullPointerException on the MapReduce job at io.cdap.cdap.internal.app.runtime.ProgramControllerServiceAdapter#97-MapReduceRunner-phase-1.
The stacktrace provided by Data Fusion is as follows:
java.lang.NullPointerException: null
at io.cdap.plugin.db.batch.source.AbstractDBSource.loadSchemaFromDB(AbstractDBSource.java:138) ~[na:na]
at io.cdap.plugin.db.batch.source.AbstractDBSource.loadSchemaFromDB(AbstractDBSource.java:155) ~[na:na]
at io.cdap.plugin.db.batch.source.AbstractDBSource.prepareRun(AbstractDBSource.java:241) ~[na:na]
at io.cdap.plugin.db.batch.source.AbstractDBSource.prepareRun(AbstractDBSource.java:68) ~[na:na]
at io.cdap.cdap.etl.common.plugin.WrappedBatchSource.lambda$prepareRun$0(WrappedBatchSource.java:51) ~[na:na]
at io.cdap.cdap.etl.common.plugin.Caller$1.call(Caller.java:30) ~[na:na]
at io.cdap.cdap.etl.common.plugin.StageLoggingCaller.call(StageLoggingCaller.java:40) ~[na:na]
at io.cdap.cdap.etl.common.plugin.WrappedBatchSource.prepareRun(WrappedBatchSource.java:50) ~[na:na]
at io.cdap.cdap.etl.common.plugin.WrappedBatchSource.prepareRun(WrappedBatchSource.java:36) ~[na:na]
at io.cdap.cdap.etl.common.submit.SubmitterPlugin.lambda$prepareRun$2(SubmitterPlugin.java:71) ~[na:na]
at io.cdap.cdap.internal.app.runtime.AbstractContext$2.run(AbstractContext.java:551) ~[na:na]
at io.cdap.cdap.data2.transaction.Transactions$CacheBasedTransactional.finishExecute(Transactions.java:224) ~[na:na]
at io.cdap.cdap.data2.transaction.Transactions$CacheBasedTransactional.execute(Transactions.java:211) ~[na:na]
at io.cdap.cdap.internal.app.runtime.AbstractContext.execute(AbstractContext.java:546) ~[na:na]
at io.cdap.cdap.internal.app.runtime.AbstractContext.execute(AbstractContext.java:534) ~[na:na]
at io.cdap.cdap.etl.common.submit.SubmitterPlugin.prepareRun(SubmitterPlugin.java:69) ~[na:na]
at io.cdap.cdap.etl.batch.PipelinePhasePreparer.prepare(PipelinePhasePreparer.java:111) ~[na:na]
at io.cdap.cdap.etl.batch.mapreduce.MapReducePreparer.prepare(MapReducePreparer.java:97) ~[na:na]
at io.cdap.cdap.etl.batch.mapreduce.ETLMapReduce.initialize(ETLMapReduce.java:192) ~[na:na]
at io.cdap.cdap.api.mapreduce.AbstractMapReduce.initialize(AbstractMapReduce.java:109) ~[na:na]
at io.cdap.cdap.api.mapreduce.AbstractMapReduce.initialize(AbstractMapReduce.java:32) ~[na:na]
at io.cdap.cdap.internal.app.runtime.batch.MapReduceRuntimeService$1.initialize(MapReduceRuntimeService.java:182) ~[na:na]
at io.cdap.cdap.internal.app.runtime.batch.MapReduceRuntimeService$1.initialize(MapReduceRuntimeService.java:177) ~[na:na]
at io.cdap.cdap.internal.app.runtime.AbstractContext.lambda$initializeProgram$1(AbstractContext.java:640) ~[na:na]
at io.cdap.cdap.internal.app.runtime.AbstractContext.execute(AbstractContext.java:600) ~[na:na]
at io.cdap.cdap.internal.app.runtime.AbstractContext.initializeProgram(AbstractContext.java:637) ~[na:na]
at io.cdap.cdap.internal.app.runtime.batch.MapReduceRuntimeService.beforeSubmit(MapReduceRuntimeService.java:547) ~[na:na]
at io.cdap.cdap.internal.app.runtime.batch.MapReduceRuntimeService.startUp(MapReduceRuntimeService.java:226) ~[na:na]
at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:47) ~[com.google.guava.guava-13.0.1.jar:na]
at io.cdap.cdap.internal.app.runtime.batch.MapReduceRuntimeService$2$1.run(MapReduceRuntimeService.java:450) [na:na]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_212]
Any help would be deeply appreciated.
Thanks.
P.S.:
After solving this issue, I am now running into this:
java.lang.ClassCastException: io.cdap.plugin.db.DBRecord cannot be cast to io.cdap.plugin.db.DBRecord
at io.cdap.plugin.db.batch.source.AbstractDBSource.transform(AbstractDBSource.java:267) ~[database-commons-1.2.0.jar:na]
at io.cdap.cdap.etl.common.plugin.WrappedBatchSource.lambda$transform$2(WrappedBatchSource.java:69) ~[cdap-etl-core-6.0.1.jar:na]
at io.cdap.cdap.etl.common.plugin.Caller$1.call(Caller.java:30) ~[cdap-etl-core-6.0.1.jar:na]
at io.cdap.cdap.etl.common.plugin.StageLoggingCaller.call(StageLoggingCaller.java:40) ~[cdap-etl-core-6.0.1.jar:na]
at io.cdap.cdap.etl.common.plugin.WrappedBatchSource.transform(WrappedBatchSource.java:68) ~[cdap-etl-core-6.0.1.jar:na]
at io.cdap.cdap.etl.common.plugin.WrappedBatchSource.transform(WrappedBatchSource.java:36) ~[cdap-etl-core-6.0.1.jar:na]
at io.cdap.cdap.etl.common.TrackedTransform.transform(TrackedTransform.java:74) ~[cdap-etl-core-6.0.1.jar:na]
at io.cdap.cdap.etl.batch.UnwrapPipeStage.consumeInput(UnwrapPipeStage.java:44) ~[cdap-etl-batch-6.0.1.jar:na]
at io.cdap.cdap.etl.batch.UnwrapPipeStage.consumeInput(UnwrapPipeStage.java:32) ~[cdap-etl-batch-6.0.1.jar:na]
at io.cdap.cdap.etl.batch.PipeStage.consume(PipeStage.java:44) ~[cdap-etl-batch-6.0.1.jar:na]
at io.cdap.cdap.etl.batch.PipeTransformExecutor.runOneIteration(PipeTransformExecutor.java:43) ~[cdap-etl-batch-6.0.1.jar:na]
at io.cdap.cdap.etl.batch.mapreduce.TransformRunner.transform(TransformRunner.java:142) ~[cdap-etl-batch-6.0.1.jar:na]
at io.cdap.cdap.etl.batch.mapreduce.ETLMapReduce$ETLMapper.map(ETLMapReduce.java:230) ~[cdap-etl-batch-6.0.1.jar:na]
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146) [hadoop-mapreduce-client-core-2.8.5.jar:na]
at io.cdap.cdap.internal.app.runtime.batch.MapperWrapper.run(MapperWrapper.java:135) [na:na]
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787) [hadoop-mapreduce-client-core-2.8.5.jar:na]
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) [hadoop-mapreduce-client-core-2.8.5.jar:na]
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175) [hadoop-mapreduce-client-app-2.8.5.jar:na]
at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_212]
at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_212]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844) [hadoop-common-2.8.5.jar:na]
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169) [hadoop-mapreduce-client-app-2.8.5.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_212]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_212]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_212]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_212]
at io.cdap.cdap.internal.app.runtime.batch.distributed.MapReduceContainerLauncher.launch(MapReduceContainerLauncher.java:114) [io.cdap.cdap.cdap-app-fabric-6.0.1.jar:na]
at org.apache.hadoop.mapred.YarnChild.main(Unknown Source) [hadoop-mapreduce-client-app-2.8.5.jar:na]
P.S. 2:
After solving the aforementioned issues, I am now able to migrate the tables. However, I sometimes get the following stack trace as a warning, which then forces the job to end. Before actually failing, the job retries itself (I don't know whether this is default behaviour or not). Also, it seems that either it cannot write that many rows to the destination database or the connection is being lost. This is holding me back from migrating specific tables. Any idea why?
com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: No operations allowed after connection closed.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
at com.mysql.jdbc.Util.getInstance(Util.java:387)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:917)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:896)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:885)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:860)
at com.mysql.jdbc.ConnectionImpl.throwConnectionClosedException(ConnectionImpl.java:1246)
at com.mysql.jdbc.ConnectionImpl.checkClosed(ConnectionImpl.java:1241)
at com.mysql.jdbc.ConnectionImpl.rollback(ConnectionImpl.java:4564)
at io.cdap.plugin.db.batch.sink.ETLDBOutputFormat$1.close(ETLDBOutputFormat.java:90)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:670)
at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:2021)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:797)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.cdap.cdap.internal.app.runtime.batch.distributed.MapReduceContainerLauncher.launch(MapReduceContainerLauncher.java:114)
at org.apache.hadoop.mapred.YarnChild.main(Unknown Source)
Thanks!
Are you using version 1.2.0 of the SQL Server Database plugin from Hub?
Are you specifying the Import Query in the SQL Server properties? If not, please try specifying the Import Query:
SELECT * FROM <table name> WHERE $CONDITION
Note: only include the WHERE $CONDITION clause if the number of splits to generate is greater than 1.
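For example, with a hypothetical source table dbo.Orders and more than one split, the Import Query would read:

SELECT * FROM dbo.Orders WHERE $CONDITION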
For the second error, you are encountering a classloading bug: https://issues.cask.co/browse/CDAP-15636
As a workaround, please try using the generic Database source and sink instead of the product-specific database plugins. The configurations should be mostly the same.

Talend BigData Error on Run

I created my first Big Data project. My connection works properly, and I configured a tMongoDbInput component with a simple query and connected it to a tLogRow to dump the output.
Talend BigData Version 7.0.1
JAVA_HOME = C:\Program Files\Java\jdk1.8.0_181\bin
Why does Talend throw this error on run?
org.talend.designer.runprocess.ProcessorException: Job compile errors
At least job "GetNewClaims" has a compile errors, please fix and export again.
Error Line: 1
Detail Message: The type java.lang.Object cannot be resolved. It is indirectly referenced from required .class files
There may be some other errors caused by JVM compatibility. Make sure your JVM setup is similar to the studio.
at org.talend.designer.runprocess.JobErrorsChecker.checkLastGenerationHasCompilationError(JobErrorsChecker.java:338)
at org.talend.designer.runprocess.DefaultRunProcessService.checkLastGenerationHasCompilationError(DefaultRunProcessService.java:464)
at org.talend.designer.runprocess.RunProcessService.checkLastGenerationHasCompilationError(RunProcessService.java:316)
at org.talend.designer.runprocess.ProcessorUtilities.generateBuildInfo(ProcessorUtilities.java:812)
at org.talend.designer.runprocess.ProcessorUtilities.generateCode(ProcessorUtilities.java:586)
at org.talend.designer.runprocess.ProcessorUtilities.generateCode(ProcessorUtilities.java:1736)
at org.talend.designer.runprocess.RunProcessContext$1.run(RunProcessContext.java:582)
at org.eclipse.jface.operation.ModalContext.runInCurrentThread(ModalContext.java:466)
at org.eclipse.jface.operation.ModalContext.run(ModalContext.java:374)
at org.eclipse.jface.dialogs.ProgressMonitorDialog.run(ProgressMonitorDialog.java:527)
at org.eclipse.ui.internal.progress.ProgressMonitorJobsDialog.run(ProgressMonitorJobsDialog.java:284)
at org.eclipse.ui.internal.progress.ProgressManager.run(ProgressManager.java:1190)
at org.talend.designer.runprocess.RunProcessContext.exec(RunProcessContext.java:534)
at org.talend.designer.runprocess.ui.ProcessComposite.exec(ProcessComposite.java:1401)
at org.talend.designer.runprocess.ui.views.ProcessView$RunAction.run(ProcessView.java:701)
at org.talend.designer.runprocess.ui.actions.RunProcessAction.run(RunProcessAction.java:58)
at org.talend.designer.core.debug.JobLaunchConfigurationDelegate$1.run(JobLaunchConfigurationDelegate.java:84)
at org.eclipse.swt.widgets.RunnableLock.run(RunnableLock.java:35)
at org.eclipse.swt.widgets.Synchronizer.runAsyncMessages(Synchronizer.java:136)
at org.eclipse.swt.widgets.Display.runAsyncMessages(Display.java:4147)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3764)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine$9.run(PartRenderingEngine.java:1151)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine.run(PartRenderingEngine.java:1032)
at org.eclipse.e4.ui.internal.workbench.E4Workbench.createAndRunUI(E4Workbench.java:148)
at org.eclipse.ui.internal.Workbench$5.run(Workbench.java:636)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:579)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:150)
at org.talend.rcp.intro.Application.start(Application.java:265)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:134)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:380)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:648)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:603)
at org.eclipse.equinox.launcher.Main.run(Main.java:1465)
My environment variables weren't fully configured.
I re-pointed my JAVA_HOME at the JRE, got my Path set properly, and Talend started working.
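For reference, the JAVA_HOME in the question ends in \bin, which is a likely culprit: the usual convention (an assumption about what "set properly" means here) is to point JAVA_HOME at the Java installation root and put its bin folder on the Path, e.g.:

JAVA_HOME = C:\Program Files\Java\jdk1.8.0_181
Path = %JAVA_HOME%\bin;...(existing entries)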

Spring Data QueryDslJpaRepository CrudMethodMetaData is null

I am experiencing a strange error when running in WebSphere Liberty which doesn't occur in unit tests.
Unit testing uses a JpaTransactionManager, and WebSphere uses a WebSphereUowTransactionManager.
The application code invokes findAll on QueryDslJpaRepository.
When I step through the code, I can see that it somehow ends up with metadata = null after CrudMethodMetadataPostProcessor runs, and the proxy throws a NullPointerException:
java.lang.NullPointerException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy148.getLockModeType(Unknown Source)
at org.springframework.data.jpa.repository.support.QueryDslJpaRepository.createQuery(QueryDslJpaRepository.java:180)
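For context, the repository is roughly of this shape (entity and repository names are placeholders, not the real ones):

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;

// Extending QueryDslPredicateExecutor is what routes findAll(Predicate)
// through QueryDslJpaRepository.createQuery(...), where the NPE surfaces.
public interface CustomerRepository
        extends JpaRepository<Customer, Long>, QueryDslPredicateExecutor<Customer> {
}

// Invocation that triggers the failure:
// Iterable<Customer> result = repository.findAll(QCustomer.customer.active.isTrue());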

Custom data source in spark-sql with Mist

I have a custom data source which I use via sqlContext.read.format(....sparkjobs.fileloader.DataSource). This works well via spark-submit in both local and yarn mode.
But when I invoke the same job via Mist, it throws an exception:
Failed to find data source. Please find packages at http://spark-packages.org
My fat JAR contains the DataSource class. Also, at the beginning of the job, I logged all the JARs available on the classpath, and I can see my JAR among them.
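For reference, the classpath logging I mentioned is roughly this (a minimal sketch; the real job logs through its own logger):

import java.net.URL;
import java.net.URLClassLoader;

// Print every JAR/URL visible to the context class loader, which is
// typically the loader Spark's data source lookup consults first.
ClassLoader cl = Thread.currentThread().getContextClassLoader();
if (cl instanceof URLClassLoader) {
    for (URL url : ((URLClassLoader) cl).getURLs()) {
        System.out.println(url);
    }
}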
Error Log:
java.lang.ClassNotFoundException: Failed to find data source: c.p.b.f.sparkjobs.fileloader.DataSource. Please find packages at http://spark-packages.org
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:102)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:109)
at c.p.b.f.sparkjobs.fileloader.m.McParser.processFile(McParser.scala:44)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesSingle$1.apply(FileProcessor.scala:73)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesSingle$1.apply(FileProcessor.scala:70)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at c.p.b.f.sparkjobs.fileloader.FileProcessor.processFilesSingle(FileProcessor.scala:70)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesBulk$1.apply(FileProcessor.scala:58)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesBulk$1.apply(FileProcessor.scala:49)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.MapLike$DefaultKeySet.foreach(MapLike.scala:174)
at c.p.b.f.sparkjobs.fileloader.FileProcessor.processFilesBulk(FileProcessor.scala:49)
at c.p.b.f.sparkjobs.fileloader.FileProcessor.init(FileProcessor.scala:33)
at c.p.b.f.sparkjobs.fileloader.Loader.execute(Loader.scala:162)
at c.p.b.f.util.LoaderApp$.execute(LoaderApp.scala:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.hydrosphere.mist.jobs.jar.JobInstance.io$hydrosphere$mist$jobs$jar$JobInstance$$invokeMethod(JobInstance.scala:40)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3$$anonfun$apply$4.apply(JobInstance.scala:24)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3$$anonfun$apply$4.apply(JobInstance.scala:24)
at cats.syntax.CatchOnlyPartiallyApplied.apply(either.scala:294)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3.apply(JobInstance.scala:24)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3.apply(JobInstance.scala:23)
at cats.syntax.EitherOps$.flatMap$extension(either.scala:129)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1.apply(JobInstance.scala:23)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1.apply(JobInstance.scala:22)
at cats.syntax.EitherOps$.flatMap$extension(either.scala:129)
at io.hydrosphere.mist.jobs.jar.JobInstance.run(JobInstance.scala:22)
at io.hydrosphere.mist.worker.runners.scala.ScalaRunner$$anonfun$run$1.apply(ScalaRunner.scala:26)
at io.hydrosphere.mist.worker.runners.scala.ScalaRunner$$anonfun$run$1.apply(ScalaRunner.scala:25)
at cats.syntax.EitherOps$.flatMap$extension(either.scala:129)
at io.hydrosphere.mist.worker.runners.scala.ScalaRunner.run(ScalaRunner.scala:25)
at io.hydrosphere.mist.worker.runners.MistJobRunner$.run(MistJobRunner.scala:18)
at io.hydrosphere.mist.worker.WorkerActor$$anonfun$2.apply(WorkerActor.scala:78)
at io.hydrosphere.mist.worker.WorkerActor$$anonfun$2.apply(WorkerActor.scala:76)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: c.p.b.f.sparkjobs.fileloader.DataSource.DefaultSource
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
at scala.util.Try.orElse(Try.scala:82)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:62)
... 47 more

Couchbase Java SDK: concurrent timeout exception

I am using Couchbase 3.0, and I am facing this problem: when I try to open two buckets simultaneously, I get the error java.util.concurrent.TimeoutException.
I have a case where I have to open two buckets simultaneously. When I comment out one of the bucket-opening lines the code works just fine, but as soon as I open both buckets I get the exception.
The Java SDK version is 2.1.3.
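For reference, a stripped-down sketch of what I am doing (host and bucket names are placeholders):

import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.Cluster;
import com.couchbase.client.java.CouchbaseCluster;

// One shared Cluster; opening the second bucket is what times out.
Cluster cluster = CouchbaseCluster.create("127.0.0.1");
Bucket first = cluster.openBucket("bucket-one");
Bucket second = cluster.openBucket("bucket-two"); // <- TimeoutException here

(If I remember the API correctly, the blocking openBucket also has an overload taking an explicit timeout, e.g. cluster.openBucket("bucket-one", 30, TimeUnit.SECONDS).)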
java.lang.RuntimeException: java.util.concurrent.TimeoutException
at com.couchbase.client.java.util.Blocking.blockForSingle(Blocking.java:93) ~[java-client-2.1.3.jar:2.1.3]
at com.couchbase.client.java.CouchbaseCluster.openBucket(CouchbaseCluster.java:108) ~[java-client-2.1.3.jar:2.1.3]
at com.couchbase.client.java.CouchbaseCluster.openBucket(CouchbaseCluster.java:94) ~[java-client-2.1.3.jar:2.1.3]
at com.couchbase.client.java.CouchbaseCluster.openBucket(CouchbaseCluster.java:84) ~[java-client-2.1.3.jar:2.1.3]
at com.trakinvest.core.db.DB$CouchbaseService.bucket(DB.scala:155) ~[classes/:na]
at com.trakinvest.core.service.SearchServiceSpec$.<init>(SearchServiceSpec.scala:26) [test-classes/:na]
at com.trakinvest.core.service.SearchServiceSpec$.<clinit>(SearchServiceSpec.scala) [test-classes/:na]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_51]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_51]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_51]
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) [na:1.8.0_51]
at org.specs2.reflect.Classes$class.createInstanceFor(Classes.scala:154) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$.createInstanceFor(Classes.scala:207) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$$anonfun$createInstanceOfEither$1.apply(Classes.scala:145) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$$anonfun$createInstanceOfEither$1.apply(Classes.scala:145) [specs2-common_2.11-2.3.12.jar:2.3.12]
at scala.Option.map(Option.scala:146) [scala-library-2.11.6.jar:na]
at org.specs2.reflect.Classes$class.createInstanceOfEither(Classes.scala:145) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$.createInstanceOfEither(Classes.scala:207) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$class.org$specs2$reflect$Classes$$createInstanceForConstructor(Classes.scala:118) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$$anonfun$4.apply(Classes.scala:98) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$$anonfun$4.apply(Classes.scala:98) [specs2-common_2.11-2.3.12.jar:2.3.12]
at scala.collection.immutable.List.map(List.scala:273) [scala-library-2.11.6.jar:na]
at org.specs2.reflect.Classes$class.tryToCreateObjectEither(Classes.scala:98) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$.tryToCreateObjectEither(Classes.scala:207) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$class.tryToCreateObject(Classes.scala:70) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.reflect.Classes$.tryToCreateObject(Classes.scala:207) [specs2-common_2.11-2.3.12.jar:2.3.12]
at org.specs2.specification.SpecificationStructure$$anonfun$createSpecificationFromClassOrObject$1.apply(BaseSpecification.scala:132) [specs2-core_2.11-2.3.12.jar:2.3.12]
at org.specs2.specification.SpecificationStructure$$anonfun$createSpecificationFromClassOrObject$1.apply(BaseSpecification.scala:132) [specs2-core_2.11-2.3.12.jar:2.3.12]
at scala.Option.orElse(Option.scala:289) [scala-library-2.11.6.jar:na]
at org.specs2.specification.SpecificationStructure$.createSpecificationFromClassOrObject(BaseSpecification.scala:132) [specs2-core_2.11-2.3.12.jar:2.3.12]
at org.specs2.specification.SpecificationStructure$.createSpecificationEither(BaseSpecification.scala:117) [specs2-core_2.11-2.3.12.jar:2.3.12]
at org.specs2.runner.SbtRunner.org$specs2$runner$SbtRunner$$specificationRun(SbtRunner.scala:73) [specs2-core_2.11-2.3.12.jar:2.3.12]
at org.specs2.runner.SbtRunner$$anonfun$newTask$1$$anon$5.execute(SbtRunner.scala:59) [specs2-core_2.11-2.3.12.jar:2.3.12]
at sbt.ForkMain$Run$2.call(ForkMain.java:294) [test-agent-0.13.8.jar:0.13.8]
at sbt.ForkMain$Run$2.call(ForkMain.java:284) [test-agent-0.13.8.jar:0.13.8]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_51]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_51]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_51]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_51]
Caused by: java.util.concurrent.TimeoutException: null
I upgraded the Couchbase SDK version and the problem no longer occurs! I think it was a versioning problem. If anyone else runs into this issue, try upgrading the Couchbase SDK to resolve the error.
P.S. It was quite a long chat, hence I am posting the answer here.