Scala Akka actor system - RuntimeException when loading extension

I am new to Scala and I am facing the following problem.
import akka.actor.Actor
import akka.actor.Props
import akka.event.Logging
import akka.actor.ActorSystem

object test extends App {
  val system = ActorSystem("hello-world")
  val myActor = system.actorOf(Props[MyActor], "myactor2")
}

class MyActor extends Actor {
  val log = Logging(context.system, this)
  val props1 = Props[MyActor]

  def receive = {
    case "test" => log.info("received test")
    case _      => log.info("received unknown message")
  }

  val child = context.actorOf(Props[MyActor], name = "myChild")
}
So when I run it, I receive
[error] (run-main-1) java.lang.RuntimeException: While trying to load extension [akka.actor.InstanceCountingExtension]
java.lang.RuntimeException: While trying to load extension [akka.actor.InstanceCountingExtension]
at akka.actor.ActorSystemImpl.$anonfun$loadExtensions$1(ActorSystem.scala:906)
at scala.collection.Iterator.foreach(Iterator.scala:929)
at scala.collection.Iterator.foreach$(Iterator.scala:929)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1406)
at scala.collection.IterableLike.foreach(IterableLike.scala:71)
...
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
Caused by: java.lang.ClassNotFoundException: akka.actor.InstanceCountingExtension
at java.lang.ClassLoader.findClass(ClassLoader.java:530)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
The problem is the ActorSystem, but I can't understand why. When I delete the system and myActor, everything works fine. It even works when I write just val system = ActorSystem, but then system can't create actors with actorOf.

You probably have this setting in your application.conf
akka.library-extensions += "akka.actor.InstanceCountingExtension"
This is an actor system extension used by akka-actor-tests to make assertions about the number of actor instances currently running in a system.
I don't think this is intended as an extension to be used by applications; it is more of a test utility.
However, depending on what you need:
If you just want to run your program, delete the configuration line above and all should be fine.
If you need the extension for some reason, add the akka-actor-tests test jar as a dependency, or otherwise make sure this class is available on the runtime classpath.
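For the first option, this is the relevant part of application.conf; simply removing (or commenting out) the line stops the actor system from trying to load the test-only extension (a minimal sketch, assuming this is the only library-extensions entry you added):
# application.conf
# Remove or comment out this line so the ActorSystem no longer tries to load the extension:
# akka.library-extensions += "akka.actor.InstanceCountingExtension"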

The problem was with the test folder. When I removed the test, it started working.

Related

OpenJFX application fails to launch using Scala

Launching the following application fails with java.lang.NoSuchMethodException: dev.buildingdragons.dragon.Dragon$.<init>() (find the full stack trace below).
To me it looks like there is a constructor missing, but which one? And why?
I do know there are projects like ScalaFX, but before using them I want to fully understand what is going on, so I really want to create a walking skeleton.
Environment:
Windows 10 Professional
IntelliJ IDEA Ultimate 2019.1.2
OpenJDK 64-Bit Server VM Zulu11.2+3 (build 11.0.1+13-LTS, mixed mode)
Dragon.scala:
package dev.buildingdragons.dragon

import javafx.application.{Application, Platform}
import javafx.scene.Scene
import javafx.scene.control.Button
import javafx.stage.Stage

object Dragon extends Application {
  def main(args: Array[String]) = Application.launch(args: _*)

  override def start(stage: Stage): Unit = {
    val scene = new Scene(new Button("Test"))
    stage.setTitle("Hello, Dragon!")
    stage.setScene(scene)
    stage.showAndWait()
    Platform.exit()
  }
}
build.gradle:
plugins {
    id 'scala'
    id 'org.openjfx.javafxplugin' version '0.0.7'
}

compileScala.targetCompatibility = 1.8

// In this section you declare where to find the dependencies of your project
repositories {
    mavenCentral()
}

dependencies {
    compile 'org.scala-lang:scala-library:2.12.8'
    compile 'org.scalafx:scalafx_2.12:11-R16'
}

javafx {
    version = "11.0.2"
    modules = ['javafx.controls']
}
Full stack trace:
Exception in Application constructor
Exception in thread "main" java.lang.RuntimeException: Unable to construct Application instance: class dev.buildingdragons.dragon.Dragon$
at com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:890)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication$2(LauncherImpl.java:195)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.NoSuchMethodException: dev.buildingdragons.dragon.Dragon$.<init>()
at java.base/java.lang.Class.getConstructor0(Class.java:3350)
at java.base/java.lang.Class.getConstructor(Class.java:2152)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication1$8(LauncherImpl.java:801)
at com.sun.javafx.application.PlatformImpl.lambda$runAndWait$12(PlatformImpl.java:455)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$10(PlatformImpl.java:428)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$11(PlatformImpl.java:427)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:96)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.lambda$runLoop$3(WinApplication.java:174)
... 1 more
EDIT: As suggested by eugene-ryzhikov I modified Dragon.scala:
package dev.buildingdragons.dragon

import javafx.application.{Application, Platform}
import javafx.scene.Scene
import javafx.scene.control.Button
import javafx.stage.Stage

class Dragon extends Application {
  override def start(stage: Stage): Unit = {
    val scene = new Scene(new Button("Test"))
    stage.setTitle("Hello, Dragon!")
    stage.setScene(scene)
    stage.showAndWait()
    Platform.exit()
  }
}

object Dragon {
  def main(args: Array[String]) = Application.launch(classOf[Dragon], args: _*)
}
That solved the original problem but now I ran into the problem I was afraid of: Java's Project Jigsaw:
Exception in Application start method
java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at javafx.graphics/com.sun.javafx.application.LauncherImpl.launchApplicationWithArgs(LauncherImpl.java:464)
at javafx.graphics/com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:363)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at java.base/sun.launcher.LauncherHelper$FXHelper.main(LauncherHelper.java:1051)
Caused by: java.lang.RuntimeException: Exception in Application start method
at javafx.graphics/com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:900)
at javafx.graphics/com.sun.javafx.application.LauncherImpl.lambda$launchApplication$2(LauncherImpl.java:195)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.IllegalAccessError: superclass access check failed: class com.sun.javafx.scene.control.ControlHelper (in unnamed module #0x4e08e183) cannot access class com.sun.javafx.scene.layout.RegionHelper (in module javafx.graphics) because module javafx.graphics does not export com.sun.javafx.scene.layout to unnamed module #0x4e08e183
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1016)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:174)
at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:802)
at java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:700)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:623)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at javafx.scene.control.Control.<clinit>(Control.java:86)
at dev.buildingdragons.dragon.Dragon.start(Dragon.scala:10)
at javafx.graphics/com.sun.javafx.application.LauncherImpl.lambda$launchApplication1$9(LauncherImpl.java:846)
at javafx.graphics/com.sun.javafx.application.PlatformImpl.lambda$runAndWait$12(PlatformImpl.java:455)
at javafx.graphics/com.sun.javafx.application.PlatformImpl.lambda$runLater$10(PlatformImpl.java:428)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at javafx.graphics/com.sun.javafx.application.PlatformImpl.lambda$runLater$11(PlatformImpl.java:427)
at javafx.graphics/com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:96)
at javafx.graphics/com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at javafx.graphics/com.sun.glass.ui.win.WinApplication.lambda$runLoop$3(WinApplication.java:174)
... 1 more
Exception running application dev.buildingdragons.dragon.Dragon
I believe that is due to the fact that Scala does not enforce module restrictions at compile time but the JVM does at runtime.
That error is the consequence of having your start method inside the object: a Scala object compiles to a class (Dragon$) whose only constructor is private, so JavaFX cannot find the public no-argument constructor it needs. Move start into a Dragon class, keep main in the companion Dragon object, and use the launch overload where you can pass the application class.
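For context, a short sketch of why the reflective lookup fails on the object (this is the standard Scala object encoding, not project-specific code), mirroring the Class.getConstructor call in the stack trace:
// Sketch of what JavaFX's launcher effectively does with the application class:
val cls = Class.forName("dev.buildingdragons.dragon.Dragon$")
cls.getConstructor() // throws NoSuchMethodException: Dragon$.<init>() is private, not public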

java.lang.ClassNotFoundException in Scala program

I am trying to connect to HBase from Spark and I want to run a Scala jar file with spark-submit. I'm not sure how to write classes in Scala; can anyone help?
package com.jeevan.sparkhbase

import org.apache.spark._
import org.apache.spark.rdd.NewHadoopRDD
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HColumnDescriptor
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.client.HTable

class InsertData {
  def main(arg: Array[String]) {
    val conf = HBaseConfiguration.create()
    val tableName = "emp"
    conf.set(TableInputFormat.INPUT_TABLE, tableName)

    val myTable = new HTable(conf, tableName)
    var p = new Put(new String("row999").getBytes())
    p.add("cf".getBytes(), "column_name".getBytes(), new String("value999").getBytes())
    myTable.put(p)
    myTable.flushCommits()
  }
}
I used Maven to build the jar and want to execute it with spark-submit. The following is the spark-submit command I used to run the jar:
spark-submit --class com.jeevan.sparkhbase.InsertData --master local[*] SHIntegration-0.0.1-SNAPSHOT-jar-with-dependencies.jar
I am getting this error
java.lang.ClassNotFoundException: com.jeevan.sparkhbase.InsertData
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:732)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
Can someone show how to write the above code with a class and an object? I appreciate your help.
A couple of things could be wrong here, including how you packaged your jar.
First, InsertData should be an object, not a class.
object InsertData {
  def main(arg: Array[String]) {
    // stuff
  }
}
Second, you aren't actually connecting to Spark anywhere. You'll need to add something like this in your app:
val spark = SparkSession.builder().appName(jobName).master("local[1]").getOrCreate()
Check out my spark-hello-world for a complete example project.
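Putting both points together, a minimal sketch of the restructured job (the HBase calls are copied from the question unchanged; the SparkSession lines are only needed once the job actually uses Spark and assume a Spark 2.x dependency):
package com.jeevan.sparkhbase

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HTable, Put}
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.sql.SparkSession

object InsertData {
  def main(args: Array[String]): Unit = {
    // Only needed once the job actually uses Spark (assumes Spark 2.x for SparkSession).
    val spark = SparkSession.builder().appName("InsertData").master("local[*]").getOrCreate()

    // HBase calls copied from the question.
    val conf = HBaseConfiguration.create()
    val tableName = "emp"
    conf.set(TableInputFormat.INPUT_TABLE, tableName)

    val myTable = new HTable(conf, tableName)
    val p = new Put("row999".getBytes())
    p.add("cf".getBytes(), "column_name".getBytes(), "value999".getBytes())
    myTable.put(p)
    myTable.flushCommits()

    spark.stop()
  }
}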

Eclipse will not recognize my Scala main

What I've tried:
Ensuring the Scala perspective is set, disabling it, and setting it again
Ensuring that I right-click on the object that extends App
I never get 'Run as Scala application...' at this point
Defining an explicit 'main' in the object after removing 'extends App'
What I get is 'Run Configurations...' when I right-click on the object with main(). No matter what I do there, e.g. entering the name of the object that extends App or has an explicit main, main() is never found and I get classloader stack trace dumps indicating that there is no main.
While I get no Scala compile errors, no matter what I try, I never get 'Run As...Scala Application'.
Code:
object fatfinger extends App {
  import com.mongodb.casbah.Imports._
  import Common._
  import MongoFactory._

  object Insert {
    def main(args: Array[String]) {
      val apple = Stock("AAPL", 600)
      val google = Stock("GOOG", 650)
      val netflix = Stock("NFLX", 60)
      saveStock(apple)
      saveStock(google)
      saveStock(netflix)
    }

    def saveStock(stock: Stock) {
      val mongoObj = buildMongoDbObject(stock)
      MongoFactory.collection.save(mongoObj)
    }
  }
}
Stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: Insert
Caused by: java.lang.ClassNotFoundException: Insert
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

Writing ByteString to File Causing NullPointerException

I've got an Akka actor that reads the contents of a file in chunks of around 1500 bytes. When the actor receives a NextBlock message, it replies with the next block of data wrapped in a ByteString. A couple of very simple tests and manually eyeballing the data indicate the actor is working correctly. I'm using Scala 2.11.5, Akka 2.3.9, ScalaTest 2.2.1 and SBT 0.13.5.
I'm having an issue setting up a larger test. I want to write around 10kB or so of test pattern data into a file and then verify the actor's output is what I expect. I'm creating the test pattern via ByteStringBuilder. When I go to write the test data to the file, I get NullPointerExceptions. Here is the code for a stripped-down version of the test that exhibits the issue:
import java.nio.ByteOrder
import java.nio.file.StandardOpenOption._
import java.nio.file.{Files, Paths}

import akka.actor.ActorSystem
import akka.testkit.TestKit
import akka.util.{ByteString, ByteStringBuilder}
import org.scalatest.{BeforeAndAfterAll, Matchers, WordSpecLike}

class ByteBufferTest extends TestKit(ActorSystem("ByteBufferTest"))
  with WordSpecLike with Matchers with BeforeAndAfterAll {

  implicit val byteOrder = ByteOrder.BIG_ENDIAN

  val file = Paths.get("test.data")
  Files.deleteIfExists(file)
  createTestFile()

  "A ByteBufferTest" must { "work" in { assert(true) } }

  def createTestFile(): Unit = {
    val out = Files.newByteChannel(file, CREATE, WRITE)
    out.write(contents.toByteBuffer) // Here is where the NPE occurs
    out.close()
  }

  val contents: ByteString = {
    val builder = new ByteStringBuilder
    (0 to 255).foreach(builder.putInt)
    builder.result()
  }

  override protected def afterAll(): Unit = {
    Files.delete(file)
    system.shutdown()
  }
}
I've tried getting around this a bunch of different ways:
converting the ByteString to a ByteBuffer and writing it via a ByteChannel (shown above)
writing individual bytes of the ByteString to a BufferedOutputStream
converting the ByteString to an Array[Byte] and writing that via a BufferedOutputStream
No matter what I try, I keep ending up with something along the lines of:
[debug] Running TaskDef(demo.ByteBufferTest, org.scalatest.tools.Framework$$anon$1#40fbf2c, false, [SuiteSelector])
java.lang.NullPointerException at demo.ByteBufferTest.createTestFile(ByteBufferTest.scala:32)
at demo.ByteBufferTest.<init>(ByteBufferTest.scala:21)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:374)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:641)
at sbt.TestRunner.runTest$1(TestFramework.scala:84)
at sbt.TestRunner.run(TestFramework.scala:94)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:219)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:219)
at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:207)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:219)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:219)
at sbt.TestFunction.apply(TestFramework.scala:224)
at sbt.Tests$$anonfun$7.apply(Tests.scala:196)
at sbt.Tests$$anonfun$7.apply(Tests.scala:196)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$4.work(System.scala:64)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:244)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Anyone have any ideas what I'm doing wrong?
I've put an example project up on GitHub.
I didn't find the exact root cause, but it seems to be related to initialization order when the test class is instantiated by ScalaTest: createTestFile() runs in the constructor before the contents val declared below it has been initialized, so contents is still null at that point.
I suggest putting the createTestFile() call in beforeAll(). Tested, and it works.
override def beforeAll(): Unit = {
  createTestFile()
}
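Alternatively, a sketch based on the initialization-order explanation above (not part of the original answer): because vals in a class body are initialized top to bottom, declaring contents before the constructor code that writes the file also avoids the NPE.
class ByteBufferTest extends TestKit(ActorSystem("ByteBufferTest"))
  with WordSpecLike with Matchers with BeforeAndAfterAll {

  implicit val byteOrder = ByteOrder.BIG_ENDIAN

  // Moved above createTestFile() so it is already initialized when the file is written.
  val contents: ByteString = {
    val builder = new ByteStringBuilder
    (0 to 255).foreach(builder.putInt)
    builder.result()
  }

  val file = Paths.get("test.data")
  Files.deleteIfExists(file)
  createTestFile()

  // ... rest of the test unchanged
}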

Scala code throws exception in Spark

I am new to Scala and Spark. Today I tried to write some code and run it on Spark, but I got an exception.
This code works in local Scala:
import org.apache.commons.lang.time.StopWatch
import org.apache.spark.{SparkConf, SparkContext}

import scala.collection.mutable.ListBuffer
import scala.util.Random

def test(): List[Int] = {
  val size = 100
  val range = 100
  var listBuffer = new ListBuffer[Int] // here is where the exception is thrown
  val random = new Random()
  for (i <- 1 to size)
    listBuffer += random.nextInt(range)
  listBuffer.foreach(x => println(x))
  listBuffer.toList
}
But when I run this code on Spark, it throws an exception:
15/01/01 14:06:17 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
at com.tudou.sortedspark.Sort$.test(Sort.scala:35)
at com.tudou.sortedspark.Sort$.sort(Sort.scala:23)
at com.tudou.sortedspark.Sort$.main(Sort.scala:14)
at com.tudou.sortedspark.Sort.main(Sort.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
If I comment out the code below, it works in Spark:
for (i <- 1 to size)
Can someone explain why, please?
Thanks @Imm, I have solved this issue. The root cause is that my local Scala is 2.11.4, but my Spark cluster is running version 1.2.0, and Spark 1.2 was compiled against Scala 2.10.
So the solution is to compile the local code with Scala 2.10 and submit the compiled jar to Spark. Everything works fine.
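For example, with sbt the fix is just a matter of matching scalaVersion to the cluster (a sketch; the exact artifact and versions come from the question's Spark 1.2.0 cluster and may differ for other setups):
// build.sbt (sketch): compile against the Scala version the cluster's Spark was built with
scalaVersion := "2.10.4"

// Spark 1.2.0 artifacts are published for Scala 2.10 (e.g. spark-core_2.10)
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"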