Build failed while running build.xml - Eclipse

I am running my build.xml in Eclipse via Run As > Ant Build.
I have JDK and JRE 1.7.0_80 installed on the machine; this is the console log:
Buildfile: C:\Users\Alessandro\workspace\cqwebapp\build.xml
clean:
[delete] Deleting directory C:\Users\Alessandro\workspace\cqwebapp\build
init:
[mkdir] Created dir: C:\Users\Alessandro\workspace\cqwebapp\build
[mkdir] Created dir: C:\Users\Alessandro\workspace\cqwebapp\build\classes
compile:
[javac] Compiling 85 source files to C:\Users\Alessandro\workspace\cqwebapp\build\classes
[javac] An exception has occurred in the compiler (1.7.0_80). Please file a bug at the Java Developer Connection (http://java.sun.com/webapps/bugreport) after checking the Bug Parade for duplicates. Include your program and the following diagnostic in your report. Thank you.
[javac] java.lang.NullPointerException
[javac] at com.sun.tools.javac.file.Paths.getPathEntries(Paths.java:215)
[javac] at com.sun.tools.javac.file.Paths.getPathEntries(Paths.java:200)
[javac] at com.sun.tools.javac.file.Paths.computeBootClassPath(Paths.java:389)
[javac] at com.sun.tools.javac.file.Paths.lazy(Paths.java:166)
[javac] at com.sun.tools.javac.file.JavacFileManager.getLocation(JavacFileManager.java:857)
[javac] at com.sun.tools.javac.file.JavacFileManager.hasLocation(JavacFileManager.java:674)
[javac] at com.sun.tools.javac.processing.JavacProcessingEnvironment.initProcessorIterator(JavacProcessingEnvironment.java:224)
[javac] at com.sun.tools.javac.processing.JavacProcessingEnvironment.<init>(JavacProcessingEnvironment.java:188)
[javac] at com.sun.tools.javac.main.JavaCompiler.initProcessAnnotations(JavaCompiler.java:992)
[javac] at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:821)
[javac] at com.sun.tools.javac.main.Main.compile(Main.java:439)
[javac] at com.sun.tools.javac.main.Main.compile(Main.java:353)
[javac] at com.sun.tools.javac.main.Main.compile(Main.java:342)
[javac] at com.sun.tools.javac.main.Main.compile(Main.java:333)
[javac] at com.sun.tools.javac.Main.compile(Main.java:76)
[javac] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[javac] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
[javac] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
[javac] at java.base/java.lang.reflect.Method.invoke(Unknown Source)
[javac] at org.apache.tools.ant.taskdefs.compilers.Javac13.execute(Javac13.java:57)
[javac] at org.apache.tools.ant.taskdefs.Javac.compile(Javac.java:1160)
[javac] at org.apache.tools.ant.taskdefs.Javac.execute(Javac.java:936)
[javac] at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
[javac] at jdk.internal.reflect.GeneratedMethodAccessor85.invoke(Unknown Source)
[javac] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
[javac] at java.base/java.lang.reflect.Method.invoke(Unknown Source)
[javac] at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
[javac] at org.apache.tools.ant.Task.perform(Task.java:348)
[javac] at org.apache.tools.ant.Target.execute(Target.java:435)
[javac] at org.apache.tools.ant.Target.performTasks(Target.java:456)
[javac] at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405)
[javac] at org.apache.tools.ant.Project.executeTarget(Project.java:1376)
[javac] at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
[javac] at org.eclipse.ant.internal.core.ant.EclipseDefaultExecutor.executeTargets(EclipseDefaultExecutor.java:36)
[javac] at org.apache.tools.ant.Project.executeTargets(Project.java:1260)
[javac] at org.eclipse.ant.internal.core.ant.InternalAntRunner.run(InternalAntRunner.java:705)
[javac] at org.eclipse.ant.internal.core.ant.InternalAntRunner.run(InternalAntRunner.java:527)
[javac] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[javac] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
[javac] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
[javac] at java.base/java.lang.reflect.Method.invoke(Unknown Source)
[javac] at org.eclipse.ant.core.AntRunner.run(AntRunner.java:371)
[javac] at org.eclipse.ant.internal.launching.launchConfigurations.AntLaunchDelegate$1.run(AntLaunchDelegate.java:269)
[javac] at java.base/java.lang.Thread.run(Unknown Source)
BUILD FAILED
C:\Users\Alessandro\workspace\cqwebapp\build.xml:36: Compile failed; see the compiler error output for details.
Total time: 546 milliseconds
The attached screenshots show:
the jdk1.7.0_80 folder (obtained by installing the corresponding jdk.exe from the Oracle website)
the JRE
the currently selected compiler
Build Path pt. 1
Build Path pt. 2

Related

AOP is failing while building with sbt assembly

I am trying to build my sbt project. I am using sbt assembly, but it is breaking at the AOP merge. I used the following MergeStrategy for AOP.
The following is the error log:
<!DOCTYPE aspectj PUBLIC "-//AspectJ//DTD//EN" "http://www.eclipse.org/aspectj/dtd/aspectj.dtd">
/home/puneet/repo/target/streams/_global/assemblyOption/_global/streams/assembly/sbtMergeTarget-e2e021ed2f7893685f6d16c35a11a6d2dcda6205.tmp
[error] org.xml.sax.SAXParseException; publicId: -//AspectJ//DTD//EN; systemId: http://www.eclipse.org/aspectj/dtd/aspectj.dtd; lineNumber: 1; columnNumber: 2; The markup declarations contained or pointed to by the document type declaration must be well-formed.
[error] at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:203)
[error] at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(ErrorHandlerWrapper.java:177)
[error] at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:400)
[error] at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:327)
[error] at com.sun.org.apache.xerces.internal.impl.XMLScanner.reportFatalError(XMLScanner.java:1473)
[error] at com.sun.org.apache.xerces.internal.impl.XMLDTDScannerImpl.scanDecls(XMLDTDScannerImpl.java:2044)
[error] at com.sun.org.apache.xerces.internal.impl.XMLDTDScannerImpl.scanDTDExternalSubset(XMLDTDScannerImpl.java:307)
[error] at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$DTDDriver.dispatch(XMLDocumentScannerImpl.java:1174)
[error] at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$DTDDriver.next(XMLDocumentScannerImpl.java:1045)
[error] at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:959)
[error] at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:602)
[error] at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:505)
[error] at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:842)
[error] at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:771)
[error] at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
[error] at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
[error] at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
[error] at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl.parse(SAXParserImpl.java:327)
[error] at scala.xml.factory.XMLLoader.loadXML(XMLLoader.scala:41)
[error] at scala.xml.factory.XMLLoader.loadXML$(XMLLoader.scala:37)
[error] at scala.xml.XML$.loadXML(XML.scala:60)
[error] at scala.xml.factory.XMLLoader.loadFile(XMLLoader.scala:48)
[error] at scala.xml.factory.XMLLoader.loadFile$(XMLLoader.scala:48)
[error] at scala.xml.XML$.loadFile(XML.scala:60)
[error] at AopMerge$.$anonfun$apply$1(AopMerge.scala:17)
[error] at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
[error] at scala.collection.Iterator.foreach(Iterator.scala:941)
[error] at scala.collection.Iterator.foreach$(Iterator.scala:941)
[error] at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
[error] at scala.collection.IterableLike.foreach(IterableLike.scala:74)
[error] at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
[error] at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
[error] at scala.collection.TraversableLike.map(TraversableLike.scala:238)
[error] at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
[error] at scala.collection.AbstractTraversable.map(Traversable.scala:108)
[error] at AopMerge$.apply(AopMerge.scala:17)
[error] at sbtassembly.MergeStrategy.apply(MergeStrategy.scala:20)
[error] at sbtassembly.Assembly$.applyStrategy$1(Assembly.scala:110)
[error] at sbtassembly.Assembly$.$anonfun$applyStrategies$11(Assembly.scala:135)
[error] at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
[error] at scala.collection.Iterator.foreach(Iterator.scala:941)
[error] at scala.collection.Iterator.foreach$(Iterator.scala:941)
[error] at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
[error] at scala.collection.IterableLike.foreach(IterableLike.scala:74)
[error] at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
[error] at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
[error] at scala.collection.TraversableLike.map(TraversableLike.scala:238)
[error] at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
[error] at scala.collection.AbstractTraversable.map(Traversable.scala:108)
[error] at sbtassembly.Assembly$.applyStrategies(Assembly.scala:132)
[error] at sbtassembly.Assembly$.x$1$lzycompute$1(Assembly.scala:25)
[error] at sbtassembly.Assembly$.x$1$1(Assembly.scala:23)
[error] at sbtassembly.Assembly$.stratMapping$lzycompute$1(Assembly.scala:23)
[error] at sbtassembly.Assembly$.stratMapping$1(Assembly.scala:23)
[error] at sbtassembly.Assembly$.inputs$lzycompute$1(Assembly.scala:68)
[error] at sbtassembly.Assembly$.inputs$1(Assembly.scala:58)
[error] at sbtassembly.Assembly$.apply(Assembly.scala:85)
[error] at sbtassembly.Assembly$.$anonfun$assemblyTask$1(Assembly.scala:244)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (repo / assembly) org.xml.sax.SAXParseException; publicId: -//AspectJ//DTD//EN; systemId: http://www.eclipse.org/aspectj/dtd/aspectj.dtd; lineNumber: 1; columnNumber: 2; The markup declarations contained or pointed to by the document type declaration must be well-formed.
Last week the same changes were working, but now it is throwing this error. What could be the reason?
sbt version: 1.3.10
I have created a new MergeStrategy for aop.xml files (part of the Kamon dependencies).
If replacing http with https does not work, it might be worth trying to disable DTD validation, as follows:
import java.io.FileInputStream
import org.xml.sax.InputSource
import scala.xml.{Elem, XML}

// Build a SAX parser that does not load or validate the external AspectJ DTD
val parser = {
  val factory = javax.xml.parsers.SAXParserFactory.newInstance()
  factory.setFeature("http://apache.org/xml/features/nonvalidating/load-external-dtd", false)
  factory.newSAXParser()
}

// `files` is the Seq[File] handed to the custom MergeStrategy
val xmls: Seq[Elem] = files.map(f => XML.loadXML(new InputSource(new FileInputStream(f)), parser))
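For reference, here is a sketch of how such a non-validating parser could be wired into a full aop.xml merge strategy, adapted from the commonly shared Kamon/sbt-assembly recipe. The AopMerge object, its placement under project/, and the merging rules are illustrative assumptions, not the asker's actual code:

// Hypothetical project/AopMerge.scala (requires the sbt-assembly plugin on the build classpath) -- a sketch, not a drop-in fix
import java.io.FileInputStream

import org.xml.sax.InputSource
import sbt._
import sbtassembly.MergeStrategy

import scala.xml._
import scala.xml.dtd.{DocType, PublicID}

object AopMerge extends MergeStrategy {
  val name = "aopMerge"

  // SAX parser that does not fetch or validate the external AspectJ DTD
  private val parser = {
    val factory = javax.xml.parsers.SAXParserFactory.newInstance()
    factory.setFeature("http://apache.org/xml/features/nonvalidating/load-external-dtd", false)
    factory.newSAXParser()
  }

  def apply(tempDir: File, path: String, files: Seq[File]): Either[String, Seq[(File, String)]] = {
    // Write the merged file with an https DOCTYPE, as suggested above
    val doctype = DocType("aspectj", PublicID("-//AspectJ//DTD//EN", "https://www.eclipse.org/aspectj/dtd/aspectj.dtd"), Nil)
    val target  = MergeStrategy.createMergeTarget(tempDir, path)
    // Load every aop.xml with the non-validating parser instead of XML.loadFile
    val xmls: Seq[Elem] = files.map(f => XML.loadXML(new InputSource(new FileInputStream(f)), parser))
    // Concatenate the <aspects> and <weaver> children of all files into one document
    val aspects = new Elem(null, "aspects", Null, TopScope, false, xmls.flatMap(_ \\ "aspectj" \ "aspects" \ "_"): _*)
    val weaver  = new Elem(null, "weaver", Null, TopScope, false, xmls.flatMap(_ \\ "aspectj" \ "weaver" \ "_"): _*)
    val merged  = new Elem(null, "aspectj", Null, TopScope, false, aspects, weaver)
    XML.save(target.toString, merged, "UTF-8", false, doctype)
    Right(Seq(target -> path))
  }
}

In build.sbt such a strategy would then be registered for aop.xml entries, e.g. assemblyMergeStrategy in assembly := { case PathList("META-INF", "aop.xml") => AopMerge; case other => (assemblyMergeStrategy in assembly).value(other) }.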

java.lang.NoClassDefFoundError: org/vafer/jdeb/Console when starting an IntelliJ sbt project

Whenever I create a new, simple project in IntelliJ, it always gets stuck at this stage, when the project loads the build.sbt file.
Here's the error:
/usr/lib/jvm/java-1.8.0-openjdk-amd64/bin/java -Djline.terminal=jline.UnsupportedTerminal -Dsbt.log.noformat=true -Dfile.encoding=UTF-8 -Didea.managed=true -Dfile.encoding=UTF-8 -jar /home/giangvdq/.local/share/JetBrains/IdeaIC2020.2/Scala/launcher/sbt-launch.jar
[info] welcome to sbt 1.3.13 (Private Build Java 1.8.0_265)
[error] java.lang.NoClassDefFoundError: org/vafer/jdeb/Console
[error] Use 'last' for the full log.
Here's the build error message
Here's the full log file:
giangvdq@L0109-GiangVDQ:~/workspaces/fpt/untitled$ sbt compile
[info] welcome to sbt 1.3.13 (Ubuntu Java 11.0.8)
[error] java.lang.NoClassDefFoundError: org/vafer/jdeb/Console
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? r
[info] welcome to sbt 1.3.13 (Ubuntu Java 11.0.8)
[error] java.lang.NoClassDefFoundError: org/vafer/jdeb/Console
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? last
[debug] > Exec(reload, None, None)
[debug] > Exec(sbtStashOnFailure, None, None)
[debug] > Exec(onFailure loadFailed, None, None)
[debug] > Exec(loadp, None, None)
[info] welcome to sbt 1.3.13 (Ubuntu Java 11.0.8)
[error] java.lang.NoClassDefFoundError: org/vafer/jdeb/Console
[error] at java.base/java.lang.Class.forName0(Native Method)
[error] at java.base/java.lang.Class.forName(Class.java:398)
[error] at sbt.internal.inc.ModuleUtilities$.getObject(ModuleUtilities.scala:24)
[error] at sbt.internal.inc.ModuleUtilities$.getCheckedObject(ModuleUtilities.scala:32)
[error] at sbt.internal.inc.ModuleUtilities$.$anonfun$getCheckedObjects$1(ModuleUtilities.scala:37)
[error] at scala.collection.immutable.Stream.$anonfun$map$1(Stream.scala:418)
[error] at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1171)
[error] at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1161)
[error] at scala.collection.generic.Growable.loop$1(Growable.scala:57)
[error] at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:61)
[error] at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
[error] at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:184)
[error] at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:47)
[error] at scala.collection.TraversableLike.$plus$plus(TraversableLike.scala:151)
[error] at scala.collection.TraversableLike.$plus$plus$(TraversableLike.scala:147)
[error] at scala.collection.immutable.List.$plus$plus(List.scala:210)
[error] at sbt.internal.PluginDiscovery$.discoverAll(PluginDiscovery.scala:57)
[error] at sbt.internal.Load$.loadPlugins(Load.scala:1299)
[error] at sbt.internal.Load$.loadPluginDefinition(Load.scala:1244)
[error] at sbt.internal.Load$.noPlugins(Load.scala:1219)
[error] at sbt.internal.Load$.plugins(Load.scala:1204)
[error] at sbt.internal.Load$.$anonfun$loadUnit$2(Load.scala:688)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.$anonfun$loadUnit$1(Load.scala:688)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.loadUnit(Load.scala:682)
[error] at sbt.internal.Load$.$anonfun$builtinLoader$4(Load.scala:480)
[error] at sbt.internal.BuildLoader$.$anonfun$componentLoader$5(BuildLoader.scala:180)
[error] at sbt.internal.BuildLoader.apply(BuildLoader.scala:245)
[error] at sbt.internal.Load$.loadURI$1(Load.scala:542)
[error] at sbt.internal.Load$.loadAll(Load.scala:558)
[error] at sbt.internal.Load$.loadURI(Load.scala:488)
[error] at sbt.internal.Load$.load(Load.scala:467)
[error] at sbt.internal.Load$.$anonfun$apply$1(Load.scala:243)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.apply(Load.scala:243)
[error] at sbt.internal.GlobalPlugin$.build(GlobalPlugin.scala:59)
[error] at sbt.internal.GlobalPlugin$.load(GlobalPlugin.scala:64)
[error] at sbt.internal.Load$.loadGlobal(Load.scala:185)
[error] at sbt.internal.Load$.defaultWithGlobal(Load.scala:143)
[error] at sbt.internal.Load$.$anonfun$defaultLoad$1(Load.scala:50)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.defaultLoad(Load.scala:46)
[error] at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:847)
[error] at sbt.BuiltinCommands$.doLoadProject(Main.scala:847)
[error] at sbt.BuiltinCommands$.$anonfun$loadProjectImpl$2(Main.scala:801)
[error] at sbt.Command$.$anonfun$applyEffect$4(Command.scala:149)
[error] at sbt.Command$.$anonfun$applyEffect$2(Command.scala:144)
[error] at sbt.Command$.process(Command.scala:187)
[error] at sbt.MainLoop$.process$1(MainLoop.scala:199)
[error] at sbt.MainLoop$.processCommand(MainLoop.scala:235)
[error] at sbt.MainLoop$.$anonfun$next$2(MainLoop.scala:147)
[error] at sbt.State$StateOpsImpl$.runCmd$1(State.scala:273)
[error] at sbt.State$StateOpsImpl$.process$extension(State.scala:277)
[error] at sbt.MainLoop$.$anonfun$next$1(MainLoop.scala:147)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.MainLoop$.next(MainLoop.scala:147)
[error] at sbt.MainLoop$.run(MainLoop.scala:138)
[error] at sbt.MainLoop$.$anonfun$runWithNewLog$1(MainLoop.scala:116)
[error] at sbt.io.Using.apply(Using.scala:27)
[error] at sbt.MainLoop$.runWithNewLog(MainLoop.scala:110)
[error] at sbt.MainLoop$.runAndClearLast(MainLoop.scala:65)
[error] at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:50)
[error] at sbt.MainLoop$.runLogged(MainLoop.scala:41)
[error] at sbt.StandardMain$.runManaged(Main.scala:132)
[error] at sbt.xMain$.run(Main.scala:67)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[error] at sbt.internal.XMainConfiguration.run(XMainConfiguration.scala:45)
[error] at sbt.xMain.run(Main.scala:39)
[error] at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
[error] at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
[error] at xsbt.boot.Launch$.run(Launch.scala:109)
[error] at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
[error] at xsbt.boot.Launch$.launch(Launch.scala:117)
[error] at xsbt.boot.Launch$.apply(Launch.scala:18)
[error] at xsbt.boot.Boot$.runImpl(Boot.scala:56)
[error] at xsbt.boot.Boot$.main(Boot.scala:18)
[error] at xsbt.boot.Boot.main(Boot.scala)
[error] Caused by: java.lang.ClassNotFoundException: org.vafer.jdeb.Console
[error] at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
[error] at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
[error] at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
[error] at java.base/java.lang.Class.forName0(Native Method)
[error] at java.base/java.lang.Class.forName(Class.java:398)
[error] at sbt.internal.inc.ModuleUtilities$.getObject(ModuleUtilities.scala:24)
[error] at sbt.internal.inc.ModuleUtilities$.getCheckedObject(ModuleUtilities.scala:32)
[error] at sbt.internal.inc.ModuleUtilities$.$anonfun$getCheckedObjects$1(ModuleUtilities.scala:37)
[error] at scala.collection.immutable.Stream.$anonfun$map$1(Stream.scala:418)
[error] at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1171)
[error] at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1161)
[error] at scala.collection.generic.Growable.loop$1(Growable.scala:57)
[error] at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:61)
[error] at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
[error] at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:184)
[error] at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:47)
[error] at scala.collection.TraversableLike.$plus$plus(TraversableLike.scala:151)
[error] at scala.collection.TraversableLike.$plus$plus$(TraversableLike.scala:147)
[error] at scala.collection.immutable.List.$plus$plus(List.scala:210)
[error] at sbt.internal.PluginDiscovery$.discoverAll(PluginDiscovery.scala:57)
[error] at sbt.internal.Load$.loadPlugins(Load.scala:1299)
[error] at sbt.internal.Load$.loadPluginDefinition(Load.scala:1244)
[error] at sbt.internal.Load$.noPlugins(Load.scala:1219)
[error] at sbt.internal.Load$.plugins(Load.scala:1204)
[error] at sbt.internal.Load$.$anonfun$loadUnit$2(Load.scala:688)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.$anonfun$loadUnit$1(Load.scala:688)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.loadUnit(Load.scala:682)
[error] at sbt.internal.Load$.$anonfun$builtinLoader$4(Load.scala:480)
[error] at sbt.internal.BuildLoader$.$anonfun$componentLoader$5(BuildLoader.scala:180)
[error] at sbt.internal.BuildLoader.apply(BuildLoader.scala:245)
[error] at sbt.internal.Load$.loadURI$1(Load.scala:542)
[error] at sbt.internal.Load$.loadAll(Load.scala:558)
[error] at sbt.internal.Load$.loadURI(Load.scala:488)
[error] at sbt.internal.Load$.load(Load.scala:467)
[error] at sbt.internal.Load$.$anonfun$apply$1(Load.scala:243)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.apply(Load.scala:243)
[error] at sbt.internal.GlobalPlugin$.build(GlobalPlugin.scala:59)
[error] at sbt.internal.GlobalPlugin$.load(GlobalPlugin.scala:64)
[error] at sbt.internal.Load$.loadGlobal(Load.scala:185)
[error] at sbt.internal.Load$.defaultWithGlobal(Load.scala:143)
[error] at sbt.internal.Load$.$anonfun$defaultLoad$1(Load.scala:50)
[error] at sbt.internal.Load$.timed(Load.scala:1376)
[error] at sbt.internal.Load$.defaultLoad(Load.scala:46)
[error] at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:847)
[error] at sbt.BuiltinCommands$.doLoadProject(Main.scala:847)
[error] at sbt.BuiltinCommands$.$anonfun$loadProjectImpl$2(Main.scala:801)
[error] at sbt.Command$.$anonfun$applyEffect$4(Command.scala:149)
[error] at sbt.Command$.$anonfun$applyEffect$2(Command.scala:144)
[error] at sbt.Command$.process(Command.scala:187)
[error] at sbt.MainLoop$.process$1(MainLoop.scala:199)
[error] at sbt.MainLoop$.processCommand(MainLoop.scala:235)
[error] at sbt.MainLoop$.$anonfun$next$2(MainLoop.scala:147)
[error] at sbt.State$StateOpsImpl$.runCmd$1(State.scala:273)
[error] at sbt.State$StateOpsImpl$.process$extension(State.scala:277)
[error] at sbt.MainLoop$.$anonfun$next$1(MainLoop.scala:147)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.MainLoop$.next(MainLoop.scala:147)
[error] at sbt.MainLoop$.run(MainLoop.scala:138)
[error] at sbt.MainLoop$.$anonfun$runWithNewLog$1(MainLoop.scala:116)
[error] at sbt.io.Using.apply(Using.scala:27)
[error] at sbt.MainLoop$.runWithNewLog(MainLoop.scala:110)
[error] at sbt.MainLoop$.runAndClearLast(MainLoop.scala:65)
[error] at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:50)
[error] at sbt.MainLoop$.runLogged(MainLoop.scala:41)
[error] at sbt.StandardMain$.runManaged(Main.scala:132)
[error] at sbt.xMain$.run(Main.scala:67)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[error] at sbt.internal.XMainConfiguration.run(XMainConfiguration.scala:45)
[error] at sbt.xMain.run(Main.scala:39)
[error] at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
[error] at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
[error] at xsbt.boot.Launch$.run(Launch.scala:109)
[error] at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
[error] at xsbt.boot.Launch$.launch(Launch.scala:117)
[error] at xsbt.boot.Launch$.apply(Launch.scala:18)
[error] at xsbt.boot.Boot$.runImpl(Boot.scala:56)
[error] at xsbt.boot.Boot$.main(Boot.scala:18)
[error] at xsbt.boot.Boot.main(Boot.scala)
[error] java.lang.NoClassDefFoundError: org/vafer/jdeb/Console
[error] Use 'last' for the full log.
[debug] > Exec(loadFailed, None, None)
[debug] > Exec(last, None, None)
Before this, I might have copied some .jar files from one project to another, and the issue arose from then on.
I've tried resetting IntelliJ to defaults, and when that didn't work, I tried removing the IntelliJ installation and its related folders, excluding the .ivy and .sbt folders. But nothing worked. I didn't try to delete any jar files, though, because I didn't want to mess things up further.
Does anyone know how to solve this error?
Never mind, I found the solution.
I went into /home/username/ and deleted the .sbt folder.
I also reinstalled IntelliJ, and everything worked again.

Unable to run the main method

I am currently working on a system where staff members in an organization are informed, via their devices, about pending tasks assigned to them by their managers. This is modeled with the following classes (a hypothetical skeleton sketch follows the list):
Staff
Manager (Extends Staff)
Device (Receives tasks and removes them once completed)
Task
System (Assigns tasks)
Main
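For orientation, minimal skeletons consistent with the Main method shown below could look like this; the class bodies aren't shown here, so every signature is inferred from the calls and should be treated as an assumption:

// Hypothetical skeletons inferred from the Main method further down -- not the asker's actual classes
class Task(val description: String)

class Device {
  private var tasks: List[Task] = Nil
  def receive(task: Task): Unit = tasks ::= task                        // task assigned to this device
  def complete(task: Task): Unit = tasks = tasks.filterNot(_ == task)   // remove once completed
}

class Staff(val id: Int, val name: String, val email: String, val country: String) {
  val device = new Device
}

class Manager(id: Int, name: String, email: String, country: String)
    extends Staff(id, name, email, country) {
  def assignTask(system: System, staffId: Int, description: String): Unit =
    system.assignTask(staffId, description)
}

class System {
  private var staff: Map[Int, Staff] = Map.empty
  def addStaff(s: Staff): Unit = staff += (s.id -> s)
  def assignTask(staffId: Int, description: String): Unit =
    staff.get(staffId).foreach(_.device.receive(new Task(description)))
}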
I am currently stuck: I cannot determine why I am unable to run my classes, as I receive a non-zero exit code of 1.
The problem currently seems to lie within the Main.scala file:
object Main {
  def main(args: Array[String]): Unit = {
    var system = new System
    var s1 = new Staff(1, "John", "johndoe#outlook.com", "Brazil")
    system.addStaff(s1)
    var s2 = new Manager(2, "Reese", "reesecups#gmail.com", "Japan")
    system.addStaff(s2)
    s2.assignTask(system, 1, "PLEASE WORK")
  }
}
The code used to run as intended; however, when I added the following line to the Main.scala file:
s2.assignTask(system, 1, "PLEASE WORK")
I received the following error:
[error] java.lang.RuntimeException: No main class detected.
[error] at scala.sys.package$.error(package.scala:30)
[error] at sbt.Defaults$.$anonfun$bgRunTask$4(Defaults.scala:1477)
[error] at scala.Option.getOrElse(Option.scala:189)
[error] at sbt.Defaults$.$anonfun$bgRunTask$3(Defaults.scala:1477)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
[error] at java.lang.Thread.run(Unknown Source)
[error] (Compile / bgRun) No main class detected.
Is there any indication of what the problem may be?

Unable to run the simplest Spark Streaming application ever

With the following ultra-simple streaming app:
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object Streaming {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("Simple Application").setMaster("local[*]")
    val streamingContext = new StreamingContext(sparkConf, Seconds(10))
    val lines = streamingContext.socketTextStream("localhost", 8888)
    lines.print()
    streamingContext.start()
    streamingContext.awaitTermination()
  }
}
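For context, the question does not show the build definition; a minimal build.sbt that an app like this assumes might look as follows (the Spark and Scala versions are assumptions, not taken from the linked project):

// Hypothetical build.sbt for an app like this -- versions are assumptions
name := "streaming-sketch"

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "2.3.1",
  "org.apache.spark" %% "spark-streaming" % "2.3.1"
)

Text can then be fed to the socket with a tool such as netcat (nc -lk 8888) on the same machine.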
Running sbt run and sending some text to that socket, I'm getting this amazing error:
java.lang.IllegalArgumentException: null
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1358)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.take(RDD.scala:1331)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:844)
[error] (run-main-0) java.lang.IllegalArgumentException
[error] java.lang.IllegalArgumentException
[error] at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
[error] at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
[error] at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
[error] at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
[error] at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
[error] at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
[error] at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
[error] at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
[error] at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
[error] at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
[error] at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
[error] at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
[error] at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
[error] at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
[error] at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
[error] at scala.collection.immutable.List.foreach(List.scala:392)
[error] at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
[error] at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
[error] at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
[error] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
[error] at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1358)
[error] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[error] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[error] at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
[error] at org.apache.spark.rdd.RDD.take(RDD.scala:1331)
[error] at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
[error] at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
[error] at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
[error] at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
[error] at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
[error] at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
[error] at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
[error] at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
[error] at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
[error] at scala.util.Try$.apply(Try.scala:192)
[error] at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
[error] at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
[error] at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
[error] at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
[error] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[error] at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[error] at java.base/java.lang.Thread.run(Thread.java:844)
15:25:30.859 [spark-listener-group-executorManagement] INFO org.apache.spark.scheduler.AsyncEventQueue - Stopping listener queue executorManagement.
java.lang.InterruptedException: null
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2050)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2084)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:435)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:94)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:83)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:78)
15:25:30.859 [spark-listener-group-shared] INFO org.apache.spark.scheduler.AsyncEventQueue - Stopping listener queue shared.
java.lang.InterruptedException: null
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2050)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2084)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:435)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:94)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:83)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:78)
15:25:30.860 [spark-listener-group-appStatus] INFO org.apache.spark.scheduler.AsyncEventQueue - Stopping listener queue appStatus.
java.lang.InterruptedException: null
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2050)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2084)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:435)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:94)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:83)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:78)
15:25:30.860 [org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner] WARN org.apache.hadoop.fs.FileSystem - exception in the cleaner thread but it will continue to run
java.lang.InterruptedException: null
at java.base/java.lang.Object.wait(Native Method)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:151)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:172)
at org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner.run(FileSystem.java:3063)
at java.base/java.lang.Thread.run(Thread.java:844)
15:25:30.862 [Spark Context Cleaner] ERROR org.apache.spark.ContextCleaner - Error in cleaning thread
java.lang.InterruptedException: null
at java.base/java.lang.Object.wait(Native Method)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:151)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:181)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:178)
at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:73)
[error] java.lang.RuntimeException: Nonzero exit code: 1
[error] at sbt.Run$.executeTrapExit(Run.scala:124)
[error] at sbt.Run.run(Run.scala:77)
[error] at sbt.Defaults$.$anonfun$bgRunTask$5(Defaults.scala:1185)
[error] at sbt.Defaults$.$anonfun$bgRunTask$5$adapted(Defaults.scala:1180)
[error] at sbt.internal.BackgroundThreadPool.$anonfun$run$1(DefaultBackgroundJobService.scala:366)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] at scala.util.Try$.apply(Try.scala:209)
[error] at sbt.internal.BackgroundThreadPool$BackgroundRunnable.run(DefaultBackgroundJobService.scala:289)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[error] at java.base/java.lang.Thread.run(Thread.java:844)
What am I doing wrong here?
Here is the full project: https://github.com/joan38/spark-issue
I managed to run it in a Docker container, so I guess it's environmental, as gemelen commented.

Cassandra: Get metadata using keyspace and table name

I'm creating a custom trigger for Cassandra.
To get the metadata, I use the statement from the example trigger file that ships with Cassandra, but it fails:
TableMetadata metadata = Schema.instance.getTableMetadata(auditKeyspace, auditTable);
I get this error during the build:
AuditTrigger.java:27: error: package org.apaceh.cassandra.schema does not exist
[javac] import org.apaceh.cassandra.schema.TableMetadata;
[javac] ^
[javac] /home/bkoganti/cassandra/examples/triggers/src/org/apache/cassandra/triggers/AuditTrigger.java:28: error: cannot find symbol
[javac] import org.apache.cassandra.schema.Schema;
[javac] ^
[javac] symbol: class Schema
[javac] location: package org.apache.cassandra.schema
[javac] /home/bkoganti/cassandra/examples/triggers/src/org/apache/cassandra/triggers/AuditTrigger.java:50: error: cannot find symbol
[javac] TableMetadata metadata = Schema.instance.getTableMetadata(auditKeyspace, auditTable);
[javac] ^
[javac] symbol: class TableMetadata
[javac] location: class AuditTrigger
[javac] /home/bkoganti/cassandra/examples/triggers/src/org/apache/cassandra/triggers/AuditTrigger.java:50: error: package Schema does not exist
[javac] TableMetadata metadata = Schema.instance.getTableMetadata(auditKeyspace, auditTable);
[javac] ^
[javac] 4 errors
The TableMetadata and Schema classes do not seem to be available.
So how do I get the metadata using the keyspace name and table name?
This is a typo that was missed during some refactoring. I created a JIRA ticket and provided a patch with the fix in CASSANDRA-13796; you can see the change on GitHub:
  audit.row()
       .add("keyspace_name", update.metadata().keyspace)
-      .add("table_name", update.metadata().table)
+      .add("table_name", update.metadata().name)
       .add("primary_key", update.metadata().partitionKeyType.getString(update.partitionKey().getKey()));