Can't use Akka in IDEA plugin development - scala

While developing an IDEA plugin, I want to use Akka, but I have run into some problems.
I created a demo project here: https://github.com/freewind/idea-plugin-akka-demo
You can clone it and reproduce the problems on your own machine (note the Setup section).
I've copied the problems here:
Problems
1. Can't use the default Akka configuration
If I remove:
src/main/resources/application.conf
src/main/scala/freewind/MyAkkaConfig
and run the plugin, it reports this error at startup:
com.intellij.ide.plugins.PluginManager$StartupAbortedException:
com.intellij.diagnostic.PluginException: No configuration setting found for key 'akka'
[Plugin: com.yourcompany.unique.plugin.id]
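For reference, the plugin presumably creates the actor system with the defaults, along these lines (a sketch; the demo's exact code is in the repo):

import akka.actor.ActorSystem

// No explicit config or classloader: inside an IDEA plugin, Akka then
// likely resolves reference.conf through the wrong classloader, which
// produces the "No configuration setting found for key 'akka'" error above.
val system = ActorSystem("my-actor")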
2. Can't load the configuration from a file
Then I copied the reference.conf from the Akka jar to src/main/resources/application.conf, but it still reports the same error. It seems Akka inside an IDEA plugin can't find this file automatically.
3. ClassNotFoundException: akka.actor.LightArrayRevolverScheduler
So I had to hardcode the configuration in Scala code in MyAkkaConfig.scala, but this time it reports a different error:
com.intellij.ide.plugins.PluginManager$StartupAbortedException:
com.intellij.diagnostic.PluginException: ClassNotFoundException: akka.actor.LightArrayRevolverScheduler
[Plugin: com.yourcompany.unique.plugin.id]
akka.actor.LightArrayRevolverScheduler is used in MyAkkaConfig.scala and is included in akka-actor_2.11:2.3.12. So why can't IDEA load it?

The third problem can be fixed by passing the classloader explicitly:
val system = ActorSystem("my-actor", MyAkkaConfig.config, this.getClass.getClassLoader)
We can also remove MyAkkaConfig.config entirely and use the application.conf file under resources instead.
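A minimal sketch combining both fixes (assuming this code runs from a class loaded by the plugin's classloader):

import akka.actor.ActorSystem
import com.typesafe.config.ConfigFactory

// Use the plugin's classloader both to locate application.conf on the
// plugin's classpath and to let Akka resolve its own classes
// (e.g. akka.actor.LightArrayRevolverScheduler).
val classLoader = this.getClass.getClassLoader
val config = ConfigFactory.load(classLoader)
val system = ActorSystem("my-actor", config, classLoader)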

Related

Scala Play REST service cannot find controllers in routes file [Play 2.6, Scala]

I have recently revisited a project I had not worked on in over a year. Yesterday I was able to run the REST service successfully with no problems. Today, while refactoring the location of certain controllers in this project, I started encountering errors saying that controllers could not be found within a given package.
My routes file looks like this:
UserController is defined as such:
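(The actual files are omitted from the question; a hypothetical Play 2.6 reconstruction, consistent with the fully qualified action name in the error below, might look like this:)

# conf/routes
POST    /users    com.jkdev.controllers.UserController.createUser

// app/com/jkdev/controllers/UserController.scala (hypothetical)
package com.jkdev.controllers

import javax.inject.Inject
import play.api.mvc._

class UserController @Inject()(cc: ControllerComponents) extends AbstractController(cc) {
  // Handles POST /users; body parsing and persistence omitted.
  def createUser: Action[AnyContent] = Action { _ =>
    Ok("user created")
  }
}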
However, when trying to compile this project, I receive a list of errors like the following (most redacted; only one controller included as a sample):
type UserController is not a member of package com.jkdev.controllers
[error] POST /users com.jkdev.controllers.UserController.createUser
Additionally, my Binders are no longer being detected by the routes file, so I am seeing errors like: [error] /Users/...../Developer/cashflows/metadata/conf/routes: object binders is not a member of package com.jkdev.
Like I mentioned, yesterday this was working, so I tried reverting to that commit and rebuilding, but this issue persists.
I have attempted to delete all target directories and recompile, and I ran sbt clean; cleanFiles; none of this provided anything of value.
Overall this feels like a build error, but I changed nothing about the build file, so I have no idea where to look next.
The problem was:
I had updated the IntelliJ sbt preferences to use the sbt shell for all builds, tests, runs, etc.
After deleting all of the ./target directories, running sbt compile and run did not recreate the target directories for my projects.
After updating the IJ sbt preferences to not use the sbt shell for builds, I was able to recompile with IJ and recreate the target directories.
Doing sbt run afterwards successfully launches the server.

Databricks - java.lang.NoClassDefFoundError: org/json/JSONException

We can't figure out the following issue: we are trying to use Apache Hudi to save data to storage. The problem is that when we upload a fat jar that includes the org.json package in its dependencies, the df.save() call fails with:
java.lang.NoClassDefFoundError: org/json/JSONException
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:10847)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10047)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10128)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:209)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:424)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
at org.apache.hudi.hive.HoodieHiveClient.updateHiveSQLs(HoodieHiveClient.java:384)
at org.apache.hudi.hive.HoodieHiveClient.updateHiveSQLUsingHiveDriver(HoodieHiveClient.java:367)
at org.apache.hudi.hive.HoodieHiveClient.updateHiveSQL(HoodieHiveClient.java:357)
at org.apache.hudi.hive.HoodieHiveClient.createTable(HoodieHiveClient.java:262)
at org.apache.hudi.hive.HiveSyncTool.syncSchema(HiveSyncTool.java:176)
at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:130)
at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:94)
at org.apache.hudi.HoodieSparkSqlWriter$.org$apache$hudi$HoodieSparkSqlWriter$$syncHive(HoodieSparkSqlWriter.scala:321)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$metaSync$2.apply(HoodieSparkSqlWriter.scala:363)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$metaSync$2.apply(HoodieSparkSqlWriter.scala:359)
Even if I go to the cluster libraries and explicitly add this dependency, it still fails on save. On the other hand, when I just create a new JSONException("hello") in my notebook, everything seems to work fine. What could cause this behaviour? Thanks.
This is probably because the jar is not making its way to the executor nodes; try addJar (https://spark.apache.org/docs/latest/api/java/org/apache/spark/SparkContext.html#addJar-java.lang.String-).
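A minimal sketch of that suggestion (the jar path is a hypothetical example; in a Databricks notebook, spark is the predefined SparkSession):

// Ship the fat jar to the executors so org.json is on their classpath.
spark.sparkContext.addJar("dbfs:/FileStore/jars/my-fat-assembly.jar")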
What version of Hudi are you using? There is a problem with JSON in version 0.6.0 and there is an open issue. I suggest you use version 0.5.2 for now.
It turns out that the problem was a classpath difference between the metastore service and the Spark process, because they run in separate JVMs. The problem was fixed with an init script that downloads the jar to the classpath folder.

Eclipse Kotlin plugin not compiling Kotlin classes

Using Eclipse 2018-12, I've installed the Eclipse Kotlin plugin (0.8.20.20200316-1305), created a Kotlin project (using the "New..." options under "Kotlin"), and wrote a "Hello World!" program:
Test.kt:
fun main() = System.out.println("Hello World!")
However, when I try to run it I get the message
Error: Could not find or load main class TestKt
Caused by: java.lang.ClassNotFoundException: TestKt
Upon further inspection, it seems the class was never compiled. There is no corresponding .class file anywhere. Under the project directory I see the Eclipse configuration directory .settings and the files .project and .classpath. Besides that, I have an empty bin directory and a src directory containing only Test.kt.
I found two similar questions about this on SO:
In Unable to Run Kotlin Application in Eclipse, the accepted answer simply suggests using newer versions of Eclipse and the Kotlin plugin, but both are older than what I have now. Besides, the problem in that case may have been solved by the fresh installation rather than the version.
Kotlin - Error: Could not find or load main class _DefaultPackage is quite old and the accepted answer does not apply anymore. It was about not naming the main class properly, but in my case there is not even a byte code file to be found.
Running "Project -> Compile Kotlin classes" had no effect.
How can I get this simple example to run?
Update: I've updated to 2020-06 (not that it should matter, since the Eclipse Kotlin plugin lists 2018-12 in its requirements) and replaced Zulu with AdoptOpenJDK HotSpot JDK 11. The error persists. The Eclipse log does not show any related messages.
Update 2: re-created the project in a brand-new workspace but the problem persists.

Can't compile playn showcase html

I installed playN; however, I get this error:
[INFO] --- gwt-maven-plugin:2.4.0:compile (default) @ playn-showcase-html ---
[ERROR] Error: Could not find or load main class com.google.gwt.dev.Compiler
I checked the .m2 repository, and the GWT jars for 2.4 and 2.5 seem to be there.
If I try to use GWT 2.5 in the project, then I get this:
The GWT SDK 'C:\Users\user.m2\repository\com\google\gwt' on the project's build path is not valid (Version is not supported, must be 2.0.0 or later)
playn-showcase-html
Unknown Google Web Toolkit Problem
Does this make sense at all?
How to fix it?
Thanks
It seems that some GWT jars in the Maven repository were corrupted. I deleted them, ran the playN sample again, the jars were downloaded correctly, and the whole thing worked.
C:\Users\user.m2 <-- this is very strange. It looks like your Maven repository path is somehow munged up.
Check your environment variables for MAVEN_REPOSITORY and see if it's set to C:\Users\user. Also, find your Maven installation directory, look under the conf directory for a settings.xml, and see if you have <localRepository>${env.MAVEN_REPOSITORY}/.m2/repository</localRepository>.

NoClassDefFoundError using Scala plugin for Eclipse

I successfully implemented and ran several Scala tutorials in Eclipse using the Scala plugin. Then suddenly I tried to compile and run an example, and this error came up:
Exception in thread "main" java.lang.NoClassDefFoundError: hello/HelloWorld
Caused by: java.lang.ClassNotFoundException: hello.HelloWorld
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:315)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:330)
at java.lang.ClassLoader.loadClass(ClassLoader.java:250)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:398)
After this point I could no longer run any Scala programs in Eclipse. I tried cleaning and rebuilding my project, closing and reopening my project, and closing and reopening Eclipse.
Eclipse version 3.5.2, Scala plugin 2.8.0.
Here is the original code:
package hello

object HelloWorld {
  def main(args: Array[String]) {
    println("hello world")
  }
}
If you see this when you attempt to run as a Scala application then the most likely explanation is that your project didn't compile and no class files were generated. Please check whether or not that's the case: look in your project's output folder for hello/HelloWorld.class.
If your project didn't compile that could either be because there's an error which you've missed (and if this error isn't being reported in the Problems view that could be a bug, in which case please open a ticket on Trac) or because you've turned off automatic builds and not done a manual build of your project.
I had the same problem: the project doesn't compile, but there are no errors highlighted, and AFAIK the code is OK. It seems to be a problem with the Run Configurations.
Solution 1: Delete the existing Run Configuration for your object and create a new one
Solution 2: Create a new object and cut / paste all your code into that file
When running "clean" does not un-hose Eclipse, I next try saving my work, exiting Eclipse, and restarting. That usually gets things going again, but not always. A few times I've had to update the Scala plugin with a more recent version (I'm using the latest nightly) to get things working again. I doubt that this worked because the new plugin happened to fix the bug; rather, I expect that loading the new plugin gives the whole Eclipse-Scala system a "total reset" that gets it unhosed.
I was getting this problem in a project that combined .java & .scala files.
The solution for me was:
1. Remove all .java files.
2. Edit the Scala code as needed so it compiles without them.
3. Add the .java files back in.
4. Edit the Scala code back.
The other solutions given here didn't work for me. I tried: clean project, restarting Eclipse, closing-&-opening the project, creating a new .scala file. No joy.
I'm using Eclipse 3.7 (latest stable), Scala IDE 2.0.0 and Scala 2.9 on Ubuntu Linux 11.10.
The symptoms in my case were:
My project was working, but then it stopped compiling for no apparent reason. The IDE didn't show any compilation errors for .scala files, but there were no .class files in the output directory, and I got a NoClassDefFoundError if I tried to run anything.
If I created a deliberate error in a .scala file, that did get picked up as a compilation error.
The .java files were registering errors due to the missing scala classes.
I suppose there's probably a boot-strapping bug somewhere in the IDE plugin for .java/.scala mixes. I've done hybrid projects with this setup without problems, so it's only triggered in some situations. I don't know what the trigger is, but once triggered, there's no nice solution.
I had moved my one and only class/object/application to a package, but had not added the package declaration.
sbt compiled and ran fine; Eclipse would not.
Adding the package declaration at the top of the file fixed it.
Scala 2.8.3 plugin; no compile error
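For illustration, the fix described above might look like this (the package name is hypothetical):

// src/main/scala/com/example/Main.scala
// This package declaration was the missing piece: the file had been moved
// into a package directory without it, and sbt still compiled and ran the
// class while Eclipse could no longer find it.
package com.example

object Main extends App {
  println("hello world")
}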
I encountered this error too, but after trying the suggestions here (cleaning, deleting the Run Configuration, etc.), I realized that I had set the workspace wrongly; that is why the class was not being found.
An indication of this problem is that the same error occurs when you try to compile a Java project.
I encountered this error (compilation worked in sbt but failed in Eclipse) when I created a new package object called "common". Deleting the package object in Eclipse made the compile error go away. There was nothing in it.
I was using sbt-eclipse to generate the Eclipse project. I'm using Scala IDE 3.0.0-vfinal-20130326-1146-Typesafe.
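For reference, the kind of package object described is tiny (the enclosing package here is hypothetical):

// src/main/scala/com/example/common/package.scala
package com.example

// An empty package object named "common"; per the answer above, even this
// was enough to trip up the Eclipse builder while sbt compiled fine.
package object common {
}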