Why does this not give a compilation error? - scala

I have the below piece of code:
object SubClass extends MyTrait {
  private[this] val a = 10

  def main(args: Array[String]) {
    println(a)
  }
}

trait MyTrait {
  protected val a = 5
}
It gives the following runtime error. Can somebody explain why this isn't caught at compile time?
Exception in thread "main" java.lang.ClassFormatError: Duplicate field name&signature in class file SubClass$
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    at SubClass.main(TraitTest.scala)

Because software has bugs?
https://issues.scala-lang.org/browse/SI-7475
That would be my guess.
The related ticket has received recent attention:
https://issues.scala-lang.org/browse/SI-2568
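
Until the compiler catches this, the collision itself is easy to avoid. A minimal sketch of a workaround (my suggestion, not taken from the tickets): rename the local val, or override the inherited one explicitly so only a single field is generated:

object SubClass extends MyTrait {
  // Override the inherited val instead of shadowing it with a
  // private[this] field of the same name.
  override protected val a = 10

  def main(args: Array[String]) {
    println(a)
  }
}

trait MyTrait {
  protected val a = 5
}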

Related

java.lang.NoClassDefFoundError issue while running a scala code for a UDF

I am trying to write a UDF in Scala in order to get all the months between two dates that are passed in. This is what I have written.
package com.company.datediff

import org.apache.hadoop.hive.ql.exec.UDF
import java.time._

class hive_udf extends UDF {
  def evaluate(date1: String, date2: String): String = {
    val s1 = LocalDate.parse(date1)
    val s2 = LocalDate.parse(date2)
    val p = Period.between(s1, s2)
    val l = p.getMonths()
    val min1 = s1.getMonthValue()
    val max1 = s2.getMonthValue()
    var arr1 = ""
    for (i <- min1 to max1) {
      arr1 = arr1.concat("," + i)
    }
    /* var i = min1
    while (i <= max1) {
      arr1 = arr1.concat("," + i)
    } */
    return arr1
  }
}
When running this code without the for loop, it runs perfectly fine. After including the for loop, I am getting 'java.lang.NoClassDefFoundError' and
Execution Error, return code -101 from 'org.apache.hadoop.hive.ql.exec.FunctionTask. scala/Function1'
Please find below the details of the full error:
java.lang.NoClassDefFoundError: scala/Function1
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.hive.ql.exec.Registry.registerToSessionRegistry(Registry.java:518)
at org.apache.hadoop.hive.ql.exec.Registry.registerPermanentFunction(Registry.java:207)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerPermanentFunction(FunctionRegistry.java:1536)
at org.apache.hadoop.hive.ql.exec.FunctionTask.createPermanentFunction(FunctionTask.java:136)
at org.apache.hadoop.hive.ql.exec.FunctionTask.execute(FunctionTask.java:75)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1748)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1494)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1291)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1158)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1148)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:217)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:169)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:380)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:740)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:685)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: scala.Function1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 26 more
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.FunctionTask. scala/Function1
I have limited exposure to Java and Scala. Not sure where I am going wrong. Any help is appreciated. Thanks.
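
For what it's worth, the stack trace points at a likely cause: the for comprehension compiles to a closure implementing scala.Function1, and Hive's classpath evidently lacks the Scala runtime (scala-library), so the class fails to load as soon as such a reference appears. The usual fix is to make scala-library available to Hive, for example by bundling it into the UDF jar or running ADD JAR on it before creating the function. Alternatively, a plain while loop avoids generating the closure; a minimal sketch of a drop-in replacement for the loop inside evaluate, which also adds the increment the commented-out attempt above was missing:

var arr1 = ""
var i = min1
while (i <= max1) {
  arr1 = arr1.concat("," + i)
  i += 1 // the commented-out version never incremented i, so it would loop forever
}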

WebSockets with Embedded Jetty

I implemented the example explained in this link, but rewrote it in Scala. However, I am getting a java.lang.NoClassDefFoundError: org/eclipse/jetty/io/nio/AsyncConnection error.
Here is my RoomWebSocketHandler class:
import java.io.IOException
import javax.servlet.http.HttpServletRequest
import org.eclipse.jetty.websocket.WebSocket.Connection
import org.eclipse.jetty.websocket.{WebSocket, WebSocketHandler}
import scala.collection.mutable

class RoomWebSocketHandler extends WebSocketHandler {
  private val webSockets = new mutable.ArrayBuffer[StateWebSocket]()

  override def doWebSocketConnect(request: HttpServletRequest, protocol: String): WebSocket = {
    new StateWebSocket()
  }

  private class StateWebSocket extends WebSocket.OnTextMessage {
    var connection: Connection = _

    def onOpen(connection: Connection) {
      this.connection = connection
      webSockets += this
    }

    def onMessage(data: String) {
      try {
        for (webSocket <- webSockets) {
          webSocket.connection.sendMessage(data)
        }
      } catch {
        case x: IOException => this.connection.close()
      }
    }

    def onClose(closeCode: Int, message: String) {
      webSockets -= this
    }
  }
}
and this is my Main class:
import java.net.InetSocketAddress
import grizzled.slf4j.Logger
import org.eclipse.jetty.server.{Handler, Server}
import org.eclipse.jetty.server.handler.{DefaultHandler, HandlerList, ResourceHandler}
import org.eclipse.jetty.servlet.ServletContextHandler

object Main {
  // `logger` was used below but never defined in the original snippet;
  // grizzled-slf4j's Logger(getClass) is one way to create it.
  private val logger = Logger(getClass)

  var jettyServer: Option[Server] = None

  def startServer(): Unit = {
    val LOCAL_PORT = 4041
    logger.debug("startServer begin")
    jettyServer match {
      case Some(s) =>
        logger.info("Server is already running")
        logger.debug("startServer end")
        return
      case None =>
        logger.info("Server is not running")
    }
    val server = new Server(new InetSocketAddress("127.0.0.1", LOCAL_PORT))
    server.setStopAtShutdown(true)
    val handlers = new HandlerList()
    val roomWebSocketHandler = new RoomWebSocketHandler()
    roomWebSocketHandler.setHandler(new DefaultHandler())
    handlers.setHandlers(Array[Handler](roomWebSocketHandler, new DefaultHandler()))
    server.setHandler(handlers)
    logger.debug("Starting jetty-server")
    jettyServer = Some(server)
    server.start()
    logger.info("Server started on port: " + LOCAL_PORT)
    logger.debug("startServer end")
  }

  def main(args: Array[String]): Unit = {
    startServer()
  }
}
and here are my dependencies:
dependencies {
compile 'org.slf4j:slf4j-api:1.7.12'
compile 'ch.qos.logback:logback-classic:1.1.3'
compile 'com.h2database:h2:1.4.188'
compile 'org.clapper:grizzled-slf4j_2.11:1.0.2'
compile 'org.eclipse.jetty:jetty-webapp:9.3.3.v20150827'
compile 'org.eclipse.jetty:jetty-websocket:8.1.17.v20150415'
compile 'org.eclipse.jetty:jetty-http:8.1.17.v20150415'
compile 'org.eclipse.jetty:jetty-io:8.1.17.v20150415'
compile 'org.eclipse.jetty:jetty-util:8.1.17.v20150415'
compile 'org.eclipse.jetty:jetty-continuation:8.1.17.v20150415'
compile 'org.eclipse.jetty:jetty-server:8.1.17.v20150415'
compile 'org.eclipse.jetty:jetty-jmx:8.1.17.v20150415'
compile 'commons-cli:commons-cli:1.3.1'
compile 'org.scala-lang:scala-library:2.11.7'
runtime 'javax.servlet:javax.servlet-api:3.1.0'
testCompile 'org.scalacheck:scalacheck_2.11:1.12.4'
}
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/eclipse/jetty/io/nio/AsyncConnection
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.eclipse.jetty.websocket.WebSocketHandler.<init>(WebSocketHandler.java:32)
at mypackage.RoomWebSocketHandler.<init>(RoomStateWebSocketHandler.scala:15)
at mypackage.Main$.startServer(Main.scala:48)
at mypackage.Main$.main(Main.scala:109)
at mypackage.Main.main(Main.scala)
Caused by: java.lang.ClassNotFoundException: org.eclipse.jetty.io.nio.AsyncConnection
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 29 more
I updated all my dependencies to Jetty 9.x and used the examples shown in this link. (The failure appears to come from mixing Jetty versions: jetty-webapp 9.3.3 brings in a Jetty 9 jetty-io, which no longer contains org.eclipse.jetty.io.nio.AsyncConnection, while the 8.1.17 jetty-websocket still expects it.)
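
For anyone following along, here is a minimal sketch of what the handler looks like on the Jetty 9 API. This assumes the 9.x websocket-server and websocket-api artifacts; the class and annotation names below come from those modules:

import org.eclipse.jetty.websocket.api.Session
import org.eclipse.jetty.websocket.api.annotations.{OnWebSocketClose, OnWebSocketConnect, OnWebSocketMessage, WebSocket}
import org.eclipse.jetty.websocket.server.WebSocketHandler
import org.eclipse.jetty.websocket.servlet.WebSocketServletFactory

// Jetty 9 replaces doWebSocketConnect with a factory registration.
class RoomWebSocketHandler extends WebSocketHandler {
  override def configure(factory: WebSocketServletFactory): Unit = {
    factory.register(classOf[RoomSocket])
  }
}

// Annotated endpoint instead of extending WebSocket.OnTextMessage.
@WebSocket
class RoomSocket {
  @OnWebSocketConnect
  def onConnect(session: Session): Unit =
    println("connected: " + session.getRemoteAddress)

  @OnWebSocketMessage
  def onMessage(session: Session, message: String): Unit =
    session.getRemote.sendString(message) // echo the message back

  @OnWebSocketClose
  def onClose(statusCode: Int, reason: String): Unit =
    println("closed: " + reason)
}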

Scala 2.8 HelloWorld Program ClassNotFoundException

I'm trying to run a Scala program:
class HelloWorld {
  var myField = 0;
  def getMyField(): Int = {
    return this.myField;
  }
}
I keep getting a ClassNotFoundException even though the program is named HelloWorld.scala
Full error below:
Exception in thread "main" java.lang.ClassNotFoundException: HelloWorld
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:122)
Okay, looking at the Scala documentation (http://www.scala-lang.org/documentation/getting-started.html),
I changed the code to this:
object HelloWorld {
  def main(args: Array[String]) {
    println("Hello, world!")
  }
}
This is the error now:
Information:24/7/15 10:55 AM - Compilation completed with 1 error and 7 warnings in 2s 32ms
Error:scalac: Error: org.jetbrains.jps.incremental.scala.remote.ServerException
Error compiling sbt component 'compiler-interface-2.8.0.final-51.0'
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:145)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:142)
at sbt.IO$.withTemporaryDirectory(IO.scala:285)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:142)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:139)
at sbt.IO$.withTemporaryDirectory(IO.scala:285)
at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:139)
at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:33)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$.org$jetbrains$jps$incremental$scala$local$CompilerFactoryImpl$$getOrCompileInterfaceJar(CompilerFactoryImpl.scala:87)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$$anonfun$getScalac$1.apply(CompilerFactoryImpl.scala:44)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$$anonfun$getScalac$1.apply(CompilerFactoryImpl.scala:43)
at scala.Option.map(Option.scala:145)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.getScalac(CompilerFactoryImpl.scala:43)
at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.createCompiler(CompilerFactoryImpl.scala:22)
at org.jetbrains.jps.incremental.scala.local.CachingFactory$$anonfun$createCompiler$1.apply(CachingFactory.scala:24)
at org.jetbrains.jps.incremental.scala.local.CachingFactory$$anonfun$createCompiler$1.apply(CachingFactory.scala:24)
at org.jetbrains.jps.incremental.scala.local.Cache$$anonfun$getOrUpdate$2.apply(Cache.scala:20)
at scala.Option.getOrElse(Option.scala:120)
at org.jetbrains.jps.incremental.scala.local.Cache.getOrUpdate(Cache.scala:19)
at org.jetbrains.jps.incremental.scala.local.CachingFactory.createCompiler(CachingFactory.scala:23)
at org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:22)
at org.jetbrains.jps.incremental.scala.remote.Main$.make(Main.scala:62)
at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:20)
at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319)
Warning:scalac: /tmp/sbt_20e2573f/compiler-interface-sources/CompilerInterface.scala:161: error: object creation impossible, since method registerTopLevelSym in trait GlobalCompat of type (sym: this.Symbol)Unit is not defined
Warning:scalac: new Compiler() with RangePositions // unnecessary in 2.11
Warning:scalac: ^
Warning:scalac: /tmp/sbt_20e2573f/compiler-interface-sources/CompilerInterface.scala:165: error: class Compiler needs to be abstract, since method registerTopLevelSym in trait GlobalCompat of type (sym: Compiler.this.Symbol)Unit is not defined
Warning:scalac: class Compiler extends CallbackGlobal(command.settings, dreporter, output)
Warning:scalac: ^
Warning:scalac: two errors found
Your errors are not in the code but in the way you're trying to run the code.
I loaded the code into the REPL and got no error. I loaded the code into IntelliJ and got lots of style warnings but no errors. I compiled the code and ran it from the command line:
%> java HelloWorld
Error: Main method not found in class HelloWorld, please define the main method as:
   public static void main(String[] args)
Which is what one should expect.
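With the object HelloWorld version that defines main, compiling and running from the command line should succeed:

%> scalac HelloWorld.scala
%> scala HelloWorld
Hello, world!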
I don't see how you are trying to run your program, since I am not getting any errors. Let me share a simple way to run your code snippet.
class HelloWorld {
  var myField = 0;
  def getMyField(): Int = {
    return this.myField;
  }
}

object Test extends App {
  println(new HelloWorld().getMyField())
}
Try running the Test object by right-clicking inside Test. Hope this was helpful.
Assuming you are running IntelliJ, try to build the project (CTRL+F9) and then run it. I had the same issue and it worked for me.

Standalone HBase with Spark, HBaseTest.scala is giving error

Hi, I am using standalone HBase and I want to test Spark on it. There is no Hadoop on my machine.
When I try to get the count of a table using HBaseTest.scala (from the Scala examples),
I get the following error:
ERROR TableInputFormat: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:156)
at org.apache.hadoop.hbase.mapreduce.TableInputFormat.setConf(TableInputFormat.java:101)
at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:91)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1632)
at org.apache.spark.rdd.RDD.count(RDD.scala:1012)
at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:59)
at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:607)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
... 23 more
Caused by: java.lang.VerifyError: class org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:176)
at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
... 28 more
Exception in thread "main" java.io.IOException: No table was provided.
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:154)
at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:95)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1632)
at org.apache.spark.rdd.RDD.count(RDD.scala:1012)
at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:59)
at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:607)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I am not able to figure out what's the issue here.
HBaseTest.scala:
// Imports added for completeness; the snippet in the question omitted them
// (HBase 0.96+ client API assumed).
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor, TableName}
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object HBaseTest {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("HBaseTest").setMaster("local")
    val sc = new SparkContext(sparkConf)
    val conf = HBaseConfiguration.create()
    // Other options for configuring scan behavior are available. More information available at
    // http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableInputFormat.html
    conf.set("zookeeper.znode.parent", "/hbase-unsecure")
    conf.set("hbase.zookeeper.quorum", "localhost")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.addResource(new Path("/usr/lib/hbase/hbase-0.94.8/conf/hbase-site.xml"))
    conf.set(TableInputFormat.INPUT_TABLE, "test")
    // Initialize the HBase table if necessary
    val admin = new HBaseAdmin(conf)
    if (!admin.isTableAvailable("test")) {
      print("inside if statement")
      val tableDesc = new HTableDescriptor(TableName.valueOf("test"))
      admin.createTable(tableDesc)
    }
    val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
      classOf[org.apache.hadoop.hbase.client.Result])
    hBaseRDD.count()
    sc.stop()
  }
}
You are using the TableInputFormat class as the input format. The TableInputFormat class belongs to the Hadoop MapReduce API, so you need to install Hadoop in order to use TableInputFormat.
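If Hadoop and HBase are installed but the classes are still not visible to Spark, one common approach is to hand the HBase jars to spark-submit explicitly. A sketch with placeholder paths (the jar names and locations are illustrative, not from the question):

spark-submit --class org.apache.spark.examples.HBaseTest \
  --driver-class-path "/path/to/hbase/lib/*" \
  --jars /path/to/hbase/lib/hbase-client.jar,/path/to/hbase/lib/hbase-common.jar,/path/to/hbase/lib/hbase-protocol.jar \
  spark-examples.jar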

Dynamically loading a Scala object

I have a number of objects (not classes) that manipulate databases, and I want to make a smaller helper class so I can do something like java my.helper.class my.database.class and execute the run method.
For example, this compiles:
trait A extends Runnable

class B extends A { def run() = println("run") }

object Test extends App {
  Class.forName(args(0)).newInstance().asInstanceOf[A].run()
}
And then does what I expect.
$ scala Test B
run
This also compiles:
trait A extends Runnable

object B extends A { def run() = println("run") }

object Test extends App {
  Class.forName(args(0)).newInstance().asInstanceOf[A].run()
}
But this happens:
$ scala Test B
java.lang.InstantiationException: B
at java.lang.Class.newInstance(Class.java:418)
at Test$.delayedEndpoint$Test$1(Test.scala:9)
at Test$delayedInit$body.apply(Test.scala:8)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:383)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at Test$.main(Test.scala:8)
at Test.main(Test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:99)
at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:99)
at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:72)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:94)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Caused by: java.lang.NoSuchMethodException: B.<init>()
at java.lang.Class.getConstructor0(Class.java:2971)
at java.lang.Class.newInstance(Class.java:403)
... 28 more
Which makes sense, and I figured this would work:
$ scala Test B$
java.lang.IllegalAccessException: Class Test$ can not access a member of class B$ with modifiers "private"
at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:101)
at java.lang.Class.newInstance(Class.java:427)
at Test$.delayedEndpoint$Test$1(Test.scala:9)
at Test$delayedInit$body.apply(Test.scala:8)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:383)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at Test$.main(Test.scala:8)
at Test.main(Test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:99)
at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:99)
at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:72)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:94)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
But it also fails. I know I could just make all these static objects into classes, but that doesn't make sense in this application, so I'm specifically looking for an elegant way to do this.
I personally think the most elegant way is to not dynamically load things like this. Is it really that difficult to specify the valid input? This allows much greater flexibility with respect to where your instances of A come from.
object Test extends App {
  val runnable: A = args(0) match {
    case "B" => B
    case "C" =>
      val someOtherConfig = args(1)
      new C(someOtherConfig) // was `someOtherParam`, which is not defined
    case other => throw new Exception("invalid input: " + other)
  }
  runnable.run()
}
I would use scopt to parse the parameters.
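
That said, if dynamic loading really is required: a Scala object compiles to a class named with a trailing $ whose single instance is held in a public static MODULE$ field, so it can be loaded reflectively without calling a constructor. A minimal sketch:

object Test extends App {
  // An object `B` compiles to class `B$`; its singleton instance
  // lives in the static field MODULE$, so read that instead of
  // calling newInstance() (objects have no public constructor).
  val clazz = Class.forName(args(0) + "$")
  val module = clazz.getField("MODULE$").get(null).asInstanceOf[A]
  module.run()
}

With this, $ scala Test B prints run; a fallback to newInstance() when no MODULE$ field exists would also cover plain classes.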