I haven't been using IntelliJ 15 for long, but I have never had an issue like this before. When I do New Project -> Scala, everything works fine, but when I do New Project -> SBT, I can't even run a main method, because it gives me this:
Exception in thread "main" java.lang.ClassNotFoundException: testing
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
I tried deleting Make from Edit Configurations, and I also tried adding a Scala script in Edit Configurations, but I still have this problem (it says the Scala script couldn't be found, even though I linked it properly). Also, I read this topic:
How to run a Scala script within IntelliJ IDEA?
but I haven't found a solution there. Thank you for your suggestions.
It took me some time, but I fixed it, and the problem was pretty obvious. It was enough to go to File -> Project Structure -> Modules and add the folders where you create your Scala files to Source Folders, or simply create the Scala file under main -> scala. The standard sbt layout is sketched below.
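For reference, this is the source layout sbt and IntelliJ expect to find (a sketch; the project and file names are placeholders):
Project
 >src
  >main
   >scala
    - Main.scala (compiled sources and scripts go here)
  >test
   >scala
    - MainSpec.scala (test sources go here)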
I'm writing a basic program implementing Drools. The program runs fine with an application run configuration, but when I try to run the JAR, I get an error.
The error I get on the terminal:
Suhita-MacBookPro:Drool-CreditScore-Sample sgoswami$ spark-submit --class main.scala.suhita.Sample --master local[*] target/DroolsMaven-1.0-SNAPSHOT.jar
java.lang.ClassNotFoundException: main.scala.suhita.Sample
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:712)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Drools-Project
 >src
  >main
   >scala
    >suhita
     - Sample
     - Applicant
   >META-INF
    - kmodule.xml
    - manifest.MF
   >resources.rules
    - rules
This happens sometimes when your classes are not loaded properly; I have seen it a few times recently. There are two ways to fix this issue:
Rebuild from a clean state, which may include running sbt clean compile,
or you can try reloading the IDEA classes from the top menu. It sounds vague, but sometimes restarting IntelliJ also works, since it reloads all the classes again.
I am sure one of these methods will work. Let me know if the problem persists.
Looking at the tree structure of your project:
Drools-Project
 >src
  >main
   >scala
    >suhita
     - Sample
     - Applicant
You don't need to qualify the class name as main.scala.suhita.Sample; main and scala are source directories, not packages.
Simply use suhita.Sample as the class name:
spark-submit --class suhita.Sample --master local[*] target/DroolsMaven-1.0-SNAPSHOT.jar
and it should work
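The fully qualified class name comes from the package declaration inside the source file, not from the src/main/scala directory prefix. A minimal sketch, assuming Sample.scala declares its package like this (the object body here is hypothetical):

package suhita

object Sample {
  def main(args: Array[String]): Unit = {
    // Drools/Spark setup would go here
    println("running suhita.Sample")
  }
}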
So I am trying to run a Spark job in yarn-cluster mode, kicked off via an Oozie workflow, but I have been encountering the following error (relevant stack trace below):
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:388)
at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:296)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:179)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1917)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1896)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1896)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
...
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:414)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:323)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:294)
... 28 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 33 more
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2317)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:688)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
... 38 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
... 42 more
Some background information:
The job runs on Spark 1.4.1 (I specified the correct spark.yarn.jar field in the spark.conf file).
oozie.libpath is set to the hdfs directory in which the jar of my program resides.
org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory, the class that is not found, exists in phoenix-4.5.1-HBase-1.0-client.jar. I've specified this jar in spark.driver.extraClassPath and spark.executor.extraClassPath in my spark.conf file. I've also added the phoenix-core dependency to my pom file, so the class exists in my shaded project jar as well.
Observations so far:
Adding an extra field spark.driver.userClassPathFirst to my spark.conf file and setting it to true gets rid of the ClassNotFoundException (see the sketch after this list). However, it also prevents me from initializing a SparkContext (NullPointerException). From googling around, it seems that this field messes up classpaths, so it may not be the way to go, since I cannot even initialize a SparkContext this way.
I noticed that in the oozie stdout log, I do not see the classpath of the phoenix jar. So maybe for some reason spark.driver.extraClassPath and spark.executor.extraClassPath aren't actually picking up the jar as an extraClassPath? I do know that I'm specifying the correct jar file path, as other jobs have spark.conf files with the same parameters.
I found a hacky way to make the Phoenix jar show up in the classpath (in the oozie stdout log) by copying it into the same directory as my program jar. This works whether or not spark.executor.extraClassPath is changed to point to the new jar location. However, the ClassNotFoundException persists, even though I clearly see the ClientRpcControllerFactory class when I unzip the jar.
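For reference, the spark.conf entries discussed above would look roughly like this (a sketch in spark-defaults style; all paths are placeholders, not my actual ones):

spark.yarn.jar                   hdfs:///path/to/spark-assembly-1.4.1.jar
spark.driver.extraClassPath      /path/to/phoenix-4.5.1-HBase-1.0-client.jar
spark.executor.extraClassPath    /path/to/phoenix-4.5.1-HBase-1.0-client.jar
spark.driver.userClassPathFirst  true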
Other things I've tried:
I tried using the sparkConf.setJars() and sparkContext.addJar() methods (roughly as in the sketch after this list), but still encountered the same error,
and I added the jar to the spark.driver.extraClassPath field in my job properties file, but it didn't seem to help (the Spark docs indicate that this field is necessary when running in client mode, so it may not be relevant in my case).
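The setJars attempt looked roughly like this (a sketch; the app name and jar path are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("phoenix-oozie-job") // hypothetical app name
  .setJars(Seq("hdfs:///path/to/phoenix-4.5.1-HBase-1.0-client.jar")) // ship the jar with the job

val sc = new SparkContext(conf)
// sc.addJar("/path/to/phoenix-4.5.1-HBase-1.0-client.jar") is the equivalent call after creation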
Any help/ideas/suggestions would be greatly appreciated.
I use CDH 5.5.1 + Phoenix 4.5.2 (both installed from parcels) and faced the same problem. I think the problem disappeared after I switched to client mode, but I can't verify this because I am now getting a different error in cluster mode.
I tried to trace the Phoenix source code and found some interesting things. Hopefully a Java/Scala expert can identify the root cause.
The PhoenixDriver class was loaded, which shows the jar was found initially. After layers of class loader / context switches (?), the jar was lost from the classpath.
If I call Class.forName() on a non-existent class in my program, sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) does not appear in the stack. The stack looks like this:
java.lang.ClassNotFoundException: NONEXISTINGCLASS
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
I copied Phoenix code into my program for testing. I still get the ClassNotFoundException if I call ConnectionQueryServicesImpl.init (ConnectionQueryServicesImpl.java:1896). However, a call to ConnectionQueryServicesImpl.openConnection (ConnectionQueryServicesImpl.java:296) returned a usable HBase connection. So it seems PhoenixContextExecutor was causing the loss of the jar, but I don't know how.
Source code of Cloudera Phoenix 4.5.2 : https://github.com/cloudera-labs/phoenix/blob/phoenix1-4.5.2_1.2.0/phoenix-core/src/main/java/org/apache/
(Not sure whether I should post a comment... but I have no reputation anyway)
So I managed to fix my issue and get my job to run. My solution is very hacky, but I'll post it here in case it helps others in the future.
Basically, the problem as I understand it was that the org.apache.hadoop.hbase.util.ReflectionUtils class, which is responsible for finding the ClientRpcControllerFactory class, was being loaded from some Cloudera directory on the cluster instead of from my own jar. When I set spark.driver.userClassPathFirst to true, it prioritized loading the ReflectionUtils class from my jar, and so it was able to locate the ClientRpcControllerFactory class. But that messed up some other classpaths and kept giving me a NullPointerException when I tried to initialize a SparkContext, so I looked for another solution.
I tried to figure out whether it was possible to exclude all default CDH jars from my classpath, but found that the value in spark.yarn.jar was pulling in all of these CDH jars, and I definitely needed to specify that jar.
So the solution was to include all classes under org.apache.hadoop.hbase from the Phoenix jar in the spark-assembly jar (the jar that spark.yarn.jar pointed to), which got rid of the original exception and did not give me an NPE when trying to initialize a SparkContext. I found that the ReflectionUtils class was now being loaded from the spark-assembly jar, and since ClientRpcControllerFactory was also included in that jar, it was able to find it. After this, I encountered a few more ClassNotFoundExceptions for Phoenix classes, so I put those classes into the spark-assembly jar as well (a sketch of this jar surgery follows).
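Merging classes from one jar into another can be done with the jar tool, roughly like this (a sketch; the paths and jar names are placeholders, not the exact ones I used):

mkdir merged && cd merged
jar xf /path/to/phoenix-4.5.1-HBase-1.0-client.jar org/apache/hadoop/hbase
jar uf /path/to/spark-assembly.jar org/apache/hadoop/hbase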
Finally, I had a java.lang.RuntimeException: hbase-default.xml File Seems to be for and old Version of HBase problem. I found that my application jar contained such a file, but changing hbase.defaults.for.version.skip to true there didn't do anything. So I included another hbase-default.xml file, with the skip flag set to true, in the spark-assembly jar, and it finally worked.
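For reference, the skip flag is just a property inside hbase-default.xml (a sketch showing only the relevant entry):

<property>
  <name>hbase.defaults.for.version.skip</name>
  <value>true</value>
</property>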
Some observations:
I noticed that my spark-assembly jar was completely missing an org.apache.hadoop.hbase directory. A coworker told me that usually I should expect to find an hbase directory in my spark-assembly jar, so maybe I was working with a bad spark-assembly jar. Edit: I checked a freshly downloaded spark-assembly jar (v1.5.2) and it doesn't have one either, so maybe the org.apache.hadoop.hbase package is simply not included in it.
ClassNotFoundExceptions and classloader problems are hard to debug.
I tried many different run configurations, but whatever I do, I get this exception when running specs2 tests in IntelliJ for Scala.
It always fails to find a class that ends with a $ sign. I checked, and there really is no such class file. There's AppControllerIT.class and lots of classes like AppControllerIT$innerFunctionOrClass.class, but no AppControllerIT$.class.
Any ideas?
Thanks!
com.haha.market.api.e2e.controllers.AppControllerIT$
java.lang.ClassNotFoundException: com.haha.market.api.e2e.controllers.AppControllerIT$
STACKTRACE
java.net.URLClassLoader.findClass(URLClassLoader.java:381)
java.lang.ClassLoader.loadClass(ClassLoader.java:424)
sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
java.lang.ClassLoader.loadClass(ClassLoader.java:357)
org.specs2.reflect.Classes$$anonfun$loadClassEither$1.apply(Classes.scala:140)
org.specs2.reflect.Classes$$anonfun$loadClassEither$1.apply(Classes.scala:140)
org.specs2.control.ActionT$$anonfun$safe$1.apply(ActionT.scala:89)
org.specs2.control.ActionT$$anonfun$reader$1$$anonfun$apply$6.apply(ActionT.scala:80)
org.specs2.control.Status$.safe(Status.scala:100)
Classes with $ signs at the end are generated from compiled Scala objects. This means you may have an object defined similar to this:
package com.haha.market.api.e2e.controllers
object AppControllerIT {
}
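When compiled, an object like this produces two class files: AppControllerIT.class (holding static forwarder methods) and AppControllerIT$.class (the singleton instance itself), and the specs2 runner looks up the latter. If AppControllerIT$.class is missing from your output directory, the compiled output is stale or incomplete.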
From your error, it seems that an older compiled artifact or a library (?) is polluting your classpath. First, try cleaning the project (mvn clean or sbt clean). Next, try cleaning up any libraries in your project inside IntelliJ. IntelliJ sometimes caches multiple versions of the same libraries, which can cause confusion at runtime. To clean those up, go to File -> Project Structure in IntelliJ and manually delete any duplicated libraries you find.
I'm trying to run tests for an sbt-based Scala application in IntelliJ, but I get the following error, which I'm not sure how to fix:
Testing started at 21:07 ...
java.lang.IncompatibleClassChangeError: Found class scala.collection.mutable.ArrayOps, but interface was expected
at org.scalatest.tools.Runner$.checkArgsForValidity(Runner.scala:895)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:729)
at org.scalatest.tools.Runner$.run(Runner.scala:711)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:144)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
When I decorate my test classes with the JUnit runner annotation, they work fine (and the run is actually much better visually):
@RunWith(classOf[JUnitRunner])
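For context, this is roughly how the annotation is applied, assuming ScalaTest's JUnitRunner (the suite name and body are placeholders):

import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner

@RunWith(classOf[JUnitRunner])
class ExampleSuite extends FunSuite {
  test("addition works") {
    assert(1 + 1 == 2)
  }
}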
I got the same error when trying to run tests for my Play Framework app. Drilling down into the stack trace, I found that the problem class was FakeRequest, which is in the play-test library. I had two different versions of the library, one for Play 2.4 and one for 2.3. I was able to resolve this issue by removing the play-test version for Play 2.3 (open Module Settings -> Libraries -> find and delete the bad dependency).
Your issue is probably with some other problematic dependency, but following the same steps as above may help fix it.
This seems to be a problem with the ScalaTest runner framework. I came across the same problem and, like you suggested, eventually ended up using the JUnit test runner to make it work. In my case, though, the problem was a conflicting transitive dependency being pulled in, which surfaced as a no-such-class error.
Make sure the libraries you are using with the JUnitRunner are the same everywhere. Most of the time, an IncompatibleClassChangeError occurs because of a binary-incompatible dependency on the classpath. Also check that the scala-library jar is the same at compile time and at run time. One way to pin a conflicting dependency in sbt is sketched below.
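If sbt is resolving two versions of the same library, you can force a single version in build.sbt (a sketch; the coordinates are placeholders, not taken from the question):

// build.sbt: force one version of a conflicting dependency
dependencyOverrides += "org.scala-lang" % "scala-library" % "2.11.8"

// or exclude the transitive copy at the point where it is pulled in
libraryDependencies += ("com.example" %% "some-lib" % "1.0")
  .exclude("org.scalatest", "scalatest_2.11")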
I'm having some trouble getting the Scala plugin to work with IntelliJ IDEA 10.5.1 Community Edition on Mac OS X 10.6.8. I'm following these instructions, but whenever I try to run the simple HelloWorld application, I get this error:
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -Didea.launcher.port=7533 -Didea.launcher.bin.path=/Applications/IntelliJ IDEA 10 CE.app/bin -Dfile.encoding=UTF-8 -classpath /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/deploy.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/dt.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/javaws.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/jce.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/jconsole.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/management-agent.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/plugin.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/sa-jdi.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/alt-rt.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/alt-string.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/charsets.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/classes.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/jsse.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/ui.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/apple_provider.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/dnsns.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/localedata.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/sunjce_provider.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/sunpkcs11.jar:/Users/A482930/IdeaProjects/ScalaPractice/out/production/ScalaPractice:/Users/A482930/scala/lib/scala-library.jar:/Users/A482930/scala/lib/scala-swing.jar:/Users/A482930/scala/lib/scala-dbc.jar:/Applications/IntelliJ IDEA 10 CE.app/lib/idea_rt.jar com.intellij.rt.execution.application.AppMain HelloWorld
Exception in thread "main" java.lang.ClassNotFoundException: HelloWorld
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:113)
Process finished with exit code 1
I checked the module settings and the compiler library seems to be set up correctly. The version of Scala I'm using is 2.9.0.1, installed with the IzPack installer. I've tried both the IDEA plugin listed under available plugins and the July 5 2011 nightly here.
Rather than helping me troubleshoot my specific issue, does anyone know of a step-by-step tutorial that actually works without issues for a configuration similar to mine? I'm fine with using older versions of Scala and even IDEA, as long as they work.
I'm not sure whether you still need help with this, but I just ran a simple example with Scala 2.9.0.1 and it worked. I've been having tons of issues with the plugin though, so it would help to know the exact steps you followed.
In my case, I did this:
Create the project
Add the Scala facet in the Project Wizard and add the Scala libs as a global library (there are a few issues with this, but it should work here)
Create your HelloWorld example (see the sketch after this list)
Create a new Scala Compilation Server runner and add the classpath of your project
Run the project
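A minimal HelloWorld for reference (any equivalent main object will do):

object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
  }
}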
A colleague of mine created this some months ago, after having some issues getting started:
https://github.com/runeflobakk/sbt-idea-scalatest
I guess it is more SBT-focused, but maybe you can find some use in it anyway?