I am trying to run the Scalding sample word-count example. I have followed the steps in this GitHub link:
https://github.com/twitter/scalding/wiki/Getting-Started
But I am getting a ClassNotFoundException. Below is my stack trace:
[cloudera@localhost scalding-develop]$ sudo scripts/scald.rb --local WordCount --input input.txt --output ./someOutputFile.tsv
can not find /root/.sbt/boot/scala-2.9.3/lib/scala-library.jar appending SBT_VERSION [0.12.0] to SBT_HOME
scripts/scald.rb:139: warning: already initialized constant SBT_HOME
scripts/scald.rb:140: warning: already initialized constant SCALA_LIB_DIR
Exception in thread "main" java.lang.Throwable: If you know what exactly caused this error, please consider contributing to GitHub via following link.
https://github.com/twitter/scalding/wiki/Common-Exceptions-and-possible-reasons#javalangclassnotfoundexception
at com.twitter.scalding.Tool$.main(Tool.scala:146)
at com.twitter.scalding.Tool.main(Tool.scala)
Caused by: java.lang.ClassNotFoundException: WordCount
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:188)
at com.twitter.scalding.Job$.apply(Job.scala:39)
at com.twitter.scalding.Tool.getJob(Tool.scala:49)
at com.twitter.scalding.Tool.run(Tool.scala:69)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.twitter.scalding.Tool$.main(Tool.scala:132)
... 1 more
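For reference, the WordCount job from that guide looks roughly like this (a minimal sketch of the field-based API example from the wiki):
import com.twitter.scalding._

// Minimal word-count job in the style of the Getting Started tutorial.
class WordCount(args: Args) extends Job(args) {
  TextLine(args("input"))                                                // read lines from --input
    .flatMap('line -> 'word) { line: String => line.split("\\s+") }     // split each line into words
    .groupBy('word) { _.size }                                           // count occurrences per word
    .write(Tsv(args("output")))                                          // write (word, count) rows to --output
}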
Please let me know where exactly the issue is.
Thanks.
Check whether your JDK's bin directory is in your PATH. I had a similar issue: I had used update-alternatives to install Java but had not included /usr/lib/jvm/jdk-1.7.0/bin in my PATH.
Check the "Running your Scalding jobs in Eclipse" article at
http://hokiesuns.blogspot.co.uk/2012/07/running-your-scalding-jobs-in-eclipse.html
Using Maven for Scalding should be your first step into this brave new world.
After a few months, give sbt a try and see if you like it.
Don't go directly to sbt, as its usage is not that trivial.
I suggest not using scald.rb.
Instead, use sbt; check this GitHub repo: https://github.com/deanwampler/activator-scalding. It includes all the necessary basic setup.
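For illustration, a minimal build.sbt for a Scalding job might look something like the sketch below (the version numbers are placeholders, not taken from that repo):
// build.sbt -- minimal sketch; adjust versions to match your cluster

name := "scalding-wordcount"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "com.twitter" %% "scalding-core" % "0.13.1",
  "org.apache.hadoop" % "hadoop-client" % "2.5.0" % "provided"
)
You can then build a fat jar (for example with sbt-assembly) and run the job via hadoop jar your-assembly.jar com.twitter.scalding.Tool WordCount --hdfs --input ... --output ...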
I have a Gradle project in IntelliJ IDEA 2016.2. Every time I run the Scala tests in the project, I get the following exception:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.CommandLineWrapper.main(CommandLineWrapper.java:48)
Caused by: java.lang.ClassNotFoundException: org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:123)
... 5 more
I checked the versions of the dependencies and added the Scala SDK to the project module as well. I also added the Scala plugin to the Gradle file and installed the Scala plugin in IntelliJ IDEA. The tests also run without an error on my colleague's computer, so we have no idea what the cause could be.
Found the cause: I have an accented letter in my user directory's name, and IDEA always tries to use some file from AppData under that directory. I have already changed the idea.properties file, but it has no effect on that file.
A possible workaround is using Gradle (or Maven/sbt/etc.). In my case, I use Gradle: I just add @RunWith(classOf[JUnitRunner]) to the Scala class I want to test, then execute Gradle's test task.
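A minimal sketch of such a test class (the suite name is just an example, and this assumes a ScalaTest version where JUnitRunner lives in org.scalatest.junit):
import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner

// Running the suite through JUnit lets Gradle's standard test task execute it,
// bypassing IDEA's ScalaTestRunner entirely.
@RunWith(classOf[JUnitRunner])
class ExampleSpec extends FunSuite {
  test("addition works") {
    assert(1 + 1 === 2)
  }
}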
For me, the solution to the command-line length limitation was crucial. IDEA offers about three ways to overcome a command line that is too long; choose a different one and check.
The setting is in the run configurations.
So I am trying to run a Spark job in yarn-cluster mode, kicked off via an Oozie workflow, but I have been encountering the following error (relevant stack trace below):
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:388)
at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:296)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:179)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1917)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1896)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1896)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
...
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:414)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:323)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:294)
... 28 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 33 more
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2317)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:688)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
... 38 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
... 42 more
Some background information:
The job runs on Spark 1.4.1 (the correct spark.yarn.jar field is specified in the spark.conf file).
oozie.libpath is set to the HDFS directory in which my program's jar resides.
org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory, the class that is not found, exists in phoenix-4.5.1-HBase-1.0-client.jar. I've specified this jar in spark.driver.extraClassPath and spark.executor.extraClassPath in my spark.conf file. I've also added the phoenix-core dependency to my pom file, so the class exists in my shaded project jar as well.
Observations so far:
Adding an extra field, spark.driver.userClassPathFirst, to my spark.conf file and setting it to true gets rid of the ClassNotFoundException. However, it also prevents me from initializing a Spark context (NullPointerException). From googling around, it seems that this field messes up classpaths, so it may not be the way to go, since I cannot even initialize a Spark context with it set.
I noticed that in the Oozie stdout log I do not see the Phoenix jar on the classpath. So maybe, for some reason, spark.driver.extraClassPath and spark.executor.extraClassPath aren't actually picking up the jar as an extra classpath entry? I do know that I'm specifying the correct jar file path, as other jobs have spark.conf files with the same parameters.
I found a hacky way to make the Phoenix jar show up in the classpath (in the Oozie stdout log) by copying the jar to the same directory as my program's jar. This works whether or not spark.executor.extraClassPath is changed to point to the new jar location. However, the ClassNotFoundException persists, even though I clearly see the ClientRpcControllerFactory class when I unzip the jar.
Other things I've tried:
I tried using the sparkConf.setJars() and sparkContext.addJar() methods, but still encountered the same error (see the sketch after this list).
I added the jar to the spark.driver.extraClassPath field in my job properties file, but it didn't seem to help (the Spark docs indicate that this field is needed when running in client mode, so it may not be relevant for my case).
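For reference, a sketch of those two approaches (the jar path is only a placeholder):
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder path; in my case this is the phoenix-4.5.1-HBase-1.0-client.jar.
val phoenixJar = "/path/to/phoenix-client.jar"

val conf = new SparkConf()
  .setAppName("phoenix-on-yarn")
  // programmatic equivalent of the spark.conf entries mentioned above
  .set("spark.driver.extraClassPath", phoenixJar)
  .set("spark.executor.extraClassPath", phoenixJar)
  .setJars(Seq(phoenixJar))   // ship the jar with the application

val sc = new SparkContext(conf)
sc.addJar(phoenixJar)         // alternatively, add it after the context is created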
Any help/ideas/suggestions would be greatly appreciated.
I use CDH 5.5.1 + Phoenix 4.5.2 (both installed with parcels) and faced the same problem. I think the problem disappeared after I switched to client mode, but I can't verify this because I am getting a different error with cluster mode now.
I tried to trace the Phoenix source code and found some interesting things. I hope a Java/Scala expert can identify the root cause.
The PhoenixDriver class was loaded. This shows the jar was found initially; after some layers of class loader / context switches (?), the jar was lost from the classpath.
If I Class.forName() a non-existent class in my program, there is no call to sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331). The stack looks like:
java.lang.ClassNotFoundException: NONEXISTINGCLASS
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
I copied Phoenix code into my program for testing. I still get the ClassNotFoundException if I call ConnectionQueryServicesImpl.init (ConnectionQueryServicesImpl.java:1896). However, a call to ConnectionQueryServicesImpl.openConnection (ConnectionQueryServicesImpl.java:296) returned a usable HBase connection. So it seems PhoenixContextExecutor was causing the loss of the jar, but I don't know how.
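For comparison, this is the ordinary JDBC entry point that ends up in ConnectionQueryServicesImpl.init (the ZooKeeper quorum here is just a placeholder):
import java.sql.DriverManager

// Standard Phoenix JDBC connection; going through DriverManager exercises the same
// PhoenixContextExecutor / ConnectionQueryServicesImpl.init path as the stack trace above.
val conn = DriverManager.getConnection("jdbc:phoenix:zookeeper-host:2181")
try {
  // use the connection here
} finally {
  conn.close()
}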
Source code of Cloudera Phoenix 4.5.2 : https://github.com/cloudera-labs/phoenix/blob/phoenix1-4.5.2_1.2.0/phoenix-core/src/main/java/org/apache/
(Not sure whether I should post a comment... but I have no reputation anyway)
So I managed to fix my issue and get my job to run. My solution is very hacky, but I will post it here in case it helps others in the future.
Basically, the problem as I understand it was that the org.apache.hadoop.hbase.util.ReflectionUtils class, which is responsible for finding the ClientRpcControllerFactory class, was being loaded from some Cloudera directory on the cluster instead of from my own jar. Setting spark.driver.userClassPathFirst to true prioritized loading ReflectionUtils from my jar, so it was able to locate the ClientRpcControllerFactory class. But that messed up some other classpaths and kept giving me a NullPointerException when I tried to initialize a SparkContext, so I looked for another solution.
I tried to figure out whether it was possible to exclude all the default CDH jars from my classpath, but found that the value in spark.yarn.jar was pulling in all of these CDH jars, and I definitely needed to specify that jar.
So the solution was to include all classes under org.apache.hadoop.hbase from the Phoenix jar in the spark-assembly jar (the jar that spark.yarn.jar pointed to), which got rid of the original exception and did not give me an NPE when trying to initialize a SparkContext. I found that the ReflectionUtils class was now being loaded from the spark-assembly jar, and since ClientRpcControllerFactory was also included in that jar, it was able to find it. After this, I encountered a few more ClassNotFoundExceptions for Phoenix classes, so I put those classes into the spark-assembly jar as well.
Finally, I had a java.lang.RuntimeException: hbase-default.xml File Seems to be for and old Version of HBase problem. I found that my application jar contained such a file, but changing hbase.defaults.for.version.skip to true didn't do anything. So I included another hbase-default.xml file in the spark-assembly jar with the skip flag to true, and it finally worked.
Some observations:
I noticed that my spark-assembly jar was completely missing an org.apache.hadoop.hbase directory. A coworker told me that I should usually expect to find an hbase directory in a spark-assembly jar, so maybe I was working with a bad one. Edit: I checked a newly downloaded spark-assembly jar (v1.5.2) and it doesn't have it either, so maybe the org.apache.hadoop.hbase package simply isn't included in it.
ClassNotFoundExceptions and classloader problems are hard to debug.
All I did was download wildfly-8.1.0.CR2 and extract it. standalone.bat and add-user.bat work but jboss-cli.bat does not.
F:\wildfly-8.1.0.CR2\bin>jboss-cli
java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no jansi64-1.9 in
java.library.path, no jansi-1.9 in java.library.path, no jansi in java.library.path,
D:\pgarner\AppData\Local\Temp\jansi-64-1.9.dll: The application has failed to start
because its side-by-side configuration is incorrect. Please see the application event
log or use the command-line sxstrace.exe tool for more detail]
at org.fusesource.hawtjni.runtime.Library.doLoad(Library.java:184)
at org.fusesource.hawtjni.runtime.Library.load(Library.java:142)
at org.fusesource.jansi.internal.Kernel32.<clinit>(Kernel32.java:37)
at org.fusesource.jansi.WindowsAnsiOutputStream.<clinit>(WindowsAnsiOutputStream.java:52)
at org.jboss.aesh.terminal.WindowsTerminal.init(WindowsTerminal.java:53)
at org.jboss.aesh.console.Console.setTerminal(Console.java:193)
at org.jboss.aesh.console.Console.reset(Console.java:154)
at org.jboss.aesh.console.Console.<init>(Console.java:105)
at org.jboss.aesh.console.Console.<init>(Console.java:101)
at org.jboss.as.cli.impl.Console$Factory.getConsole(Console.java:85)
at org.jboss.as.cli.impl.Console$Factory.getConsole(Console.java:78)
at org.jboss.as.cli.impl.CommandContextImpl.initBasicConsole(CommandContextImpl.java:349)
at org.jboss.as.cli.impl.CommandContextImpl.<init>(CommandContextImpl.java:296)
at org.jboss.as.cli.impl.CommandContextFactoryImpl.newCommandContext(CommandContextFactoryImpl.java:76)
at org.jboss.as.cli.impl.CliLauncher.initCommandContext(CliLauncher.java:273)
at org.jboss.as.cli.impl.CliLauncher.main(CliLauncher.java:253)
at org.jboss.as.cli.CommandLineMain.main(CommandLineMain.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.jboss.modules.Module.run(Module.java:312)
at org.jboss.modules.Main.main(Main.java:460)
Press any key to continue . . .
When I start up Wildfly using standalone.bat I see the following entry for java.library.path in server.log:
java.library.path = F:\Java\jdk1.7.0_45\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;F:\WANdisco\uberSVN\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;F:\GnuPG\pub;F:\7-Zip;"E:\WebTest\build\bin";F:\WANdisco\uberSVN\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;F:\GnuPG\pub;F:\7-Zip;.
The following file does indeed appear on my file system when I attempt to run jboss-cli:
D:\pgarner\AppData\Local\Temp\jansi-64-1.9.dll
I also tried using wildfly-8.0.0.Final instead of wildfly-8.1.0.CR2 and the exact same problem happened.
How do I resolve this? I assumed the CLI should just work right out of the box after extracting the files from the zip.
We faced the same error, and the problem is due to the jansi DLL's dependencies. You need to install the Microsoft Visual C++ 2008 Redistributable Package corresponding to your platform. For x64, you can follow this link:
http://www.microsoft.com/en-US/download/details.aspx?id=2092
I have a similar problem. I can start Wildfly and deploy my application without errors, but every time I redeploy my application, I get the following error:
Caused by: java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no jansi64-1.9 in java.library.path, no jansi-1.9 in java.library.path, no jansi in java.library.path, Native Library C:\Users\zb\AppData\Local\Temp\jansi-64-1.9.dll already loaded in another classloader]
It seems this jansi library stays loaded after a redeploy.
Restarting the server helps temporarily.
The instructions for running Kafka from the quick start page are not working for me.
http://kafka.apache.org/07/quickstart.html
Kafka builds fine
05:55:01/kafka-0.8.1-src:58 $sbt package
[info] Set current project to kafka-0-8-1-src (in build file:/shared/kafka-0.8.1-src/)
[info] Packaging /shared/kafka-0.8.1-src/target/scala-2.10/kafka-0-8-1-src_2.10-0.1-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 0 s, completed Apr 17, 2014 5:55:07 AM
But it does not run:
05:55:07/kafka-0.8.1-src:59 $bin/zookeeper-server-start.sh config/zookeeper.properties
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/zookeeper/server/quorum/QuorumPeerMain
Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.server.quorum.QuorumPeerMain
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Similar errors occur for kafka-server-start.sh and all the other scripts inside bin
You downloaded kafka-0.8.1-src.tgz from the download page. The instructions on the quickstart page are meant for the binary download. Download one from the "Binary downloads" section of the http://kafka.apache.org/downloads.html page, then try again; it should work. Alternatively, if you want to build from the src.tgz package you downloaded, run ./gradlew jar; it will download all the required dependencies.
I was facing the same problem on Windows 10. What I did is:
Don't download/install ZooKeeper separately; download only kafka_2.12-1.1.0 (or a higher version).
Create a temp folder (like E:\DevApplications\kafka\temp).
Open zookeeper.properties (I have it at E:\DevApplications\kafka\kafka_2.12-1.1.0\config).
Update dataDir (for me: dataDir=E:/DevApplications/kafka/temp); note the forward slashes.
Open CMD and run zookeeper-server-start.bat with zookeeper.properties as the second parameter, like:
.\zookeeper-server-start.bat ..\..\config\zookeeper.properties
Once ZooKeeper is started, start the Kafka server by typing:
.\kafka-server-start.bat ..\..\config\server.properties
Hope this helps.
To add to Chandra Kant's solution: if you have a proxy connection in your network, then please use the command below.
./gradlew -Dhttp.proxyHost=<PROXY-HOST> -Dhttp.proxyPort=<PROXY-PORT> jar
Thanks @Chandra kant, it helped me a lot.
You can also hit this exception if you try to start Kafka 0.9.0.0 with a Java version lower than 1.7. Set your $JAVA_HOME to a 1.7 (or higher) JDK and make sure $JAVA_HOME/bin is on your PATH.
I'm having some trouble getting the Scala plugin to work with IntelliJ IDEA 10.5.1 Community Edition on Mac OS X 10.6.8. I'm following these instructions, but whenever I try to run the simple HelloWorld application, I get this error:
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -Didea.launcher.port=7533 -Didea.launcher.bin.path=/Applications/IntelliJ IDEA 10 CE.app/bin -Dfile.encoding=UTF-8 -classpath /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/deploy.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/dt.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/javaws.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/jce.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/jconsole.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/management-agent.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/plugin.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/sa-jdi.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/alt-rt.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/alt-string.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/charsets.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/classes.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/jsse.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../Classes/ui.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/apple_provider.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/dnsns.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/localedata.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/sunjce_provider.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/ext/sunpkcs11.jar:/Users/A482930/IdeaProjects/ScalaPractice/out/production/ScalaPractice:/Users/A482930/scala/lib/scala-library.jar:/Users/A482930/scala/lib/scala-swing.jar:/Users/A482930/scala/lib/scala-dbc.jar:/Applications/IntelliJ IDEA 10 CE.app/lib/idea_rt.jar com.intellij.rt.execution.application.AppMain HelloWorld
Exception in thread "main" java.lang.ClassNotFoundException: HelloWorld
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:113)
Process finished with exit code 1
I checked the module settings and the compiler library seems to be set up correctly. The version of Scala I'm using is 2.9.0.1 with the IzPack installer. I've tried both the IDEA plugin listed under available plugins and the July 5, 2011 nightly here.
Rather than helping me troubleshoot my specific issue, does anyone know of a step-by-step tutorial that actually works without issues for a configuration similar to mine? I'm OK with using older versions of Scala and even IDEA, as long as they work.
I'm not sure you still need help with this, but I just ran a simple example with Scala 2.9.0.1 and it worked. I've been having tons of issues with the plugin though, so I guess it would help to know the exact steps you followed.
In my case, I did this:
Created the project
Added the Scala facet in the Project Wizard and added the Scala libs as a global library (there are a few issues with this, but it should work here)
Created the HelloWorld example (a minimal sketch follows these steps)
Created a new Scala Compilation Server runner and added the classpath of the project
Ran the project
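The HelloWorld example itself is just the usual minimal object:
// HelloWorld.scala -- the minimal example referred to in the steps above
object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
  }
}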
A colleague of mine created this some months ago, after having some issues getting started:
https://github.com/runeflobakk/sbt-idea-scalatest
I guess it is more sbt-focused, but maybe you can find some use in it anyway.