I have a number of objects (not classes) that manipulate databases, and I want to write a small helper class so I can do something like java my.helper.class my.database.class and have it execute the run method.
For example, this compiles
trait A extends Runnable
class B extends A { def run() = println("run") }
object Test extends App {
  Class.forName(args(0)).newInstance().asInstanceOf[A].run()
}
And it does what I expect:
$scala Test B
run
This also compiles
trait A extends Runnable
object B extends A { def run() = println("run") }
object Test extends App {
  Class.forName(args(0)).newInstance().asInstanceOf[A].run()
}
But this happens:
$scala Test B
java.lang.InstantiationException: B
at java.lang.Class.newInstance(Class.java:418)
at Test$.delayedEndpoint$Test$1(Test.scala:9)
at Test$delayedInit$body.apply(Test.scala:8)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:383)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at Test$.main(Test.scala:8)
at Test.main(Test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:99)
at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:99)
at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:72)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:94)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Caused by: java.lang.NoSuchMethodException: B.<init>()
at java.lang.Class.getConstructor0(Class.java:2971)
at java.lang.Class.newInstance(Class.java:403)
... 28 more
Which makes sense, and I figured this would work:
$scala Test B$
java.lang.IllegalAccessException: Class Test$ can not access a member of class B$ with modifiers "private"
at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:101)
at java.lang.Class.newInstance(Class.java:427)
at Test$.delayedEndpoint$Test$1(Test.scala:9)
at Test$delayedInit$body.apply(Test.scala:8)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:383)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at Test$.main(Test.scala:8)
at Test.main(Test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:99)
at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:68)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:99)
at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:72)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:94)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
But it also fails. I know I could just turn all these singleton objects into classes, but that doesn't make sense in this application, so I'm specifically looking for an elegant way to do this.
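(For reference: a Scala object compiles to a class named B$ whose single instance lives in a public static MODULE$ field, which is why newInstance trips over the private constructor. A minimal sketch of reading that field reflectively, assuming B is in the default package and the helper is passed "B$":)
object Test extends App {
  // A Scala object B compiles to class B$ whose singleton is stored in a
  // public static MODULE$ field, so read that instead of calling a constructor.
  val clazz = Class.forName(args(0))                  // e.g. "B$" for object B
  val instance = clazz.getField("MODULE$").get(null)
  instance.asInstanceOf[A].run()
}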
I personally think the most elegant way is to not dynamically load things like this. Is it really that difficult to specify the valid input? This allows much greater flexibility with respect to where your instances of A come from.
object Test extends App {
  val runnable = args(0) match {
    case "B" => B
    case "C" =>
      val someOtherConfig = args(1)
      new C(someOtherConfig)
    case other => throw new Exception("invalid input: " + other)
  }
  runnable.run()
}
I would use scopt to parse the parameters.
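A minimal sketch with scopt 3.x (the Config class and argument name are my own illustration, assumed to live inside the App body above):
case class Config(target: String = "")

val parser = new scopt.OptionParser[Config]("helper") {
  arg[String]("<target>").required()
    .action((t, c) => c.copy(target = t))
    .text("which helper object to run")
}

parser.parse(args, Config()) match {
  case Some(cfg) => println(s"would dispatch on ${cfg.target} as in the match above")
  case None      => sys.exit(1)
}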
Related
I'm running a Gatling simulation that uses numeric input from a YAML file to feed its scenario. Everything works when my numeric inputs are large enough that they cannot be parsed as instances of java.lang.Integer, but small numeric values are apparently parsed to Integers and result in a ClassCastException.
import java.io.FileInputStream
import io.gatling.core.Predef.{Feeder, Simulation}
import org.yaml.snakeyaml.Yaml
import org.yaml.snakeyaml.constructor.Constructor
import io.gatling.core.Predef.{scenario, _}
import scala.collection.JavaConversions
class TestClass extends Simulation {
  val yaml = new Yaml(new Constructor(classOf[Holder]))
  val holder = yaml.load(new FileInputStream("src/test/resources/data.yml")).asInstanceOf[Holder]

  scenario("sim").feed(getUserEmulationFeeder(holder))

  def getUserEmulationFeeder(holder: Holder): Feeder[Long] = {
    val iterable = JavaConversions.iterableAsScalaIterable(holder.numbers)
    iterable.map(l => Map("userToEmulate" -> l)).iterator
  }
}
data.yml has the following data:
numbers:
- 30687965369
- 31415388869
- 2
and is being deserialized into:
import scala.beans.BeanProperty
class Holder {
  @BeanProperty var numbers = new java.util.ArrayList[Long]()
}
Removing the 2 fixes the ClassCastException.
The full stacktrace is:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.gatling.mojo.MainWithArgsInFile.runMain(MainWithArgsInFile.java:50)
at io.gatling.mojo.MainWithArgsInFile.main(MainWithArgsInFile.java:33)
Caused by: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:105)
at com.mercurygate.TestClass$$anonfun$getUserEmulationFeeder$1.apply(TestClass.scala:25)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at com.mercurygate.TestClass.getUserEmulationFeeder(TestClass.scala:25)
at com.mercurygate.TestClass.<init>(TestClass.scala:21)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at io.gatling.app.Gatling$.io$gatling$app$Gatling$$$anonfun$1(Gatling.scala:41)
at io.gatling.app.Gatling$lambda$1.apply(Gatling.scala:41)
at io.gatling.app.Gatling$lambda$1.apply(Gatling.scala:41)
at io.gatling.app.Gatling.run(Gatling.scala:92)
at io.gatling.app.Gatling.runIfNecessary(Gatling.scala:75)
at io.gatling.app.Gatling.start(Gatling.scala:65)
at io.gatling.app.Gatling$.start(Gatling.scala:57)
at io.gatling.app.Gatling$.fromArgs(Gatling.scala:49)
at io.gatling.app.Gatling$.main(Gatling.scala:43)
at io.gatling.app.Gatling.main(Gatling.scala)
... 6 more
P.S. Sorry for the complexity of the example; it's only when I combine SnakeYAML, Gatling, and the small input that I get the error.
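One workaround that should sidestep the boxing mismatch (a sketch of my own, not a confirmed fix): because of erasure the ArrayList really holds whatever SnakeYAML produced (Integer for small values, Long for large ones), so convert through java.lang.Number instead of unboxing to Long directly:
def getUserEmulationFeeder(holder: Holder): Feeder[Long] = {
  // Treat each element as a Number so it does not matter whether SnakeYAML
  // produced an Integer or a Long, then convert explicitly.
  val numbers = JavaConversions.iterableAsScalaIterable(
    holder.numbers.asInstanceOf[java.util.List[Number]])
  numbers.map(n => Map("userToEmulate" -> n.longValue)).iterator
}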
I have a tool that is trying to build instances of subclasses of various Scala collections, for example scala.collection.Seq. I don't know in advance which specific class should be built, so I am trying to use reflection to get the apply method on the companion object, as follows (similar to writing List[Int](1, 2, 3)).
import scala.reflect.runtime.{universe => ru}
import scala.reflect.runtime.universe._
def makeNewInstance[T <: scala.collection.Seq[_]](clazz: Class[T], args: List[_]): T = {
  val clazzMirror: ru.Mirror = ru.runtimeMirror(clazz.getClassLoader)
  val clazzSymbol = clazzMirror.classSymbol(clazz)
  val companionObject = clazzSymbol.companion.asModule
  val instanceMirror = clazzMirror reflect (clazzMirror reflectModule companionObject).instance
  val typeSignature = instanceMirror.symbol.typeSignature
  val name = "apply"
  val ctor = typeSignature.member(TermName(name)).asMethod
  instanceMirror.reflectMethod(ctor)(args:_*).asInstanceOf[T]
}
makeNewInstance(clazz = classOf[scala.collection.mutable.ListBuffer[Int]], args = List[Int](1,2,3))
Nonetheless, I am getting the following exception. I am unable to figure out what I should be passing into the apply method.
java.lang.IllegalArgumentException: argument type mismatch
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at scala.reflect.runtime.JavaMirrors$JavaMirror$JavaVanillaMethodMirror1.jinvokeraw(JavaMirrors.scala:373)
at scala.reflect.runtime.JavaMirrors$JavaMirror$JavaMethodMirror.jinvoke(JavaMirrors.scala:339)
at scala.reflect.runtime.JavaMirrors$JavaMirror$JavaVanillaMethodMirror.apply(JavaMirrors.scala:355)
at Main$$anon$1.makeNewInstance(test.scala:12)
at Main$$anon$1.<init>(test.scala:15)
at Main$.main(test.scala:1)
at Main.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:70)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:101)
at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:70)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
at scala.tools.nsc.ScriptRunner.scala$tools$nsc$ScriptRunner$$runCompiled(ScriptRunner.scala:175)
at scala.tools.nsc.ScriptRunner$$anonfun$runScript$1.apply(ScriptRunner.scala:192)
at scala.tools.nsc.ScriptRunner$$anonfun$runScript$1.apply(ScriptRunner.scala:192)
at scala.tools.nsc.ScriptRunner$$anonfun$withCompiledScript$1$$anonfun$apply$mcZ$sp$1.apply(ScriptRunner.scala:161)
at scala.tools.nsc.ScriptRunner$$anonfun$withCompiledScript$1.apply$mcZ$sp(ScriptRunner.scala:161)
at scala.tools.nsc.ScriptRunner$$anonfun$withCompiledScript$1.apply(ScriptRunner.scala:129)
at scala.tools.nsc.ScriptRunner$$anonfun$withCompiledScript$1.apply(ScriptRunner.scala:129)
at scala.tools.nsc.util.package$.trackingThreads(package.scala:43)
at scala.tools.nsc.util.package$.waitingForThreads(package.scala:27)
at scala.tools.nsc.ScriptRunner.withCompiledScript(ScriptRunner.scala:128)
at scala.tools.nsc.ScriptRunner.runScript(ScriptRunner.scala:192)
at scala.tools.nsc.ScriptRunner.runScriptAndCatch(ScriptRunner.scala:205)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:67)
at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:87)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:98)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Thank you in advance for any help you can offer.
I will answer my own question: check whether the apply method accepts varargs or not:
if (ctor.isVarargs) instanceMirror.reflectMethod(ctor)(args.toList).asInstanceOf[T]
else instanceMirror.reflectMethod(ctor)(args:_*).asInstanceOf[T]
Thanks to a friend who pointed me in the right direction.
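For reference, folding that check back into the helper (using the same imports as above) gives something like this sketch:
def makeNewInstance[T <: scala.collection.Seq[_]](clazz: Class[T], args: List[_]): T = {
  val mirror = ru.runtimeMirror(clazz.getClassLoader)
  val companion = mirror.classSymbol(clazz).companion.asModule
  val instanceMirror = mirror.reflect(mirror.reflectModule(companion).instance)
  val apply = instanceMirror.symbol.typeSignature.member(TermName("apply")).asMethod
  // A varargs apply expects the whole sequence as a single argument;
  // otherwise spread the arguments individually.
  val result =
    if (apply.isVarargs) instanceMirror.reflectMethod(apply)(args)
    else instanceMirror.reflectMethod(apply)(args: _*)
  result.asInstanceOf[T]
}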
Hi, I am using standalone HBase and I want to test Spark on it. There is no Hadoop on my machine.
When I try to get the count of a table using HBaseTest.scala (from the Scala examples),
I get the following error:
ERROR TableInputFormat: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:156)
at org.apache.hadoop.hbase.mapreduce.TableInputFormat.setConf(TableInputFormat.java:101)
at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:91)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1632)
at org.apache.spark.rdd.RDD.count(RDD.scala:1012)
at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:59)
at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:607)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
... 23 more
Caused by: java.lang.VerifyError: class org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:176)
at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
... 28 more
Exception in thread "main" java.io.IOException: No table was provided.
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:154)
at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:95)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1632)
at org.apache.spark.rdd.RDD.count(RDD.scala:1012)
at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:59)
at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:607)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I am not able to figure out what the issue is here.
HBaseTest.scala:
object HBaseTest {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("HBaseTest").setMaster("local")
    val sc = new SparkContext(sparkConf)
    val conf = HBaseConfiguration.create()
    // Other options for configuring scan behavior are available. More information available at
    // http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableInputFormat.html
    conf.set("zookeeper.znode.parent", "/hbase-unsecure")
    conf.set("hbase.zookeeper.quorum", "localhost")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.addResource(new Path("/usr/lib/hbase/hbase-0.94.8/conf/hbase-site.xml"))
    conf.set(TableInputFormat.INPUT_TABLE, "test")

    // Initialize hBase table if necessary
    val admin = new HBaseAdmin(conf)
    if (!admin.isTableAvailable("test")) {
      print("inside if statement")
      val tableDesc = new HTableDescriptor(TableName.valueOf("test"))
      admin.createTable(tableDesc)
    }

    val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
      classOf[org.apache.hadoop.hbase.client.Result])
    hBaseRDD.count()
    sc.stop()
  }
}
You are using the TableInputFormat class as the input format. TableInputFormat belongs to the Hadoop MapReduce API, so you need to install Hadoop to use it.
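If you are building with sbt, a hypothetical build.sbt fragment might look like this (the versions are assumptions on my part and must match the HBase/Hadoop jars you actually run against):
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-client" % "1.2.1",  // assumed; pick the version matching your cluster
  "org.apache.hbase"  % "hbase"         % "0.94.8"  // matches the hbase-0.94.8 path in the config above
)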
I get an exception when trying to serialize/deserialize a case class with an optional field using lift-json.
scala> import net.liftweb.json._
import net.liftweb.json._
scala> import net.liftweb.json.Serialization.{read, write}
import net.liftweb.json.Serialization.{read, write}
scala> implicit val formats = DefaultFormats
formats: net.liftweb.json.DefaultFormats.type = net.liftweb.json.DefaultFormats$@707a7686
scala> case class Person(Name:String,Age:Option[Int])
defined class Person
scala> val friends=List(Person("Dan",Some(21)),Person("Ben",None))
friends: List[Person] = List(Person(Dan,Some(21)), Person(Ben,None))
scala> read[List[Person]](write(friends))
java.lang.InternalError: Malformed class name
at java.lang.Class.getSimpleName(Class.java:1169)
at net.liftweb.json.ScalaSigReader$$anonfun$findClass$3.apply(ScalaSig.scala:45)
at net.liftweb.json.ScalaSigReader$$anonfun$findClass$3.apply(ScalaSig.scala:45)
at scala.collection.LinearSeqOptimized$class.find(LinearSeqOptimized.scala:100)
at scala.collection.immutable.List.find(List.scala:76)
at net.liftweb.json.ScalaSigReader$.findClass(ScalaSig.scala:45)
at net.liftweb.json.ScalaSigReader$.findClass(ScalaSig.scala:41)
at net.liftweb.json.ScalaSigReader$.readConstructor(ScalaSig.scala:24)
at net.liftweb.json.Meta$Reflection$.term$1(Meta.scala:275)
at net.liftweb.json.Meta$Reflection$.typeParameters(Meta.scala:292)
at net.liftweb.json.Meta$.mkContainer$1(Meta.scala:107)
at net.liftweb.json.Meta$.fieldMapping$1(Meta.scala:134)
at net.liftweb.json.Meta$.toArg$1(Meta.scala:154)
at net.liftweb.json.Meta$$anonfun$constructors$1$1$$anonfun$apply$1.apply(Meta.scala:98)
at net.liftweb.json.Meta$$anonfun$constructors$1$1$$anonfun$apply$1.apply(Meta.scala:97)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
at scala.collection.immutable.List.foreach(List.scala:76)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
at scala.collection.immutable.List.map(List.scala:76)
at net.liftweb.json.Meta$$anonfun$constructors$1$1.apply(Meta.scala:97)
at net.liftweb.json.Meta$$anonfun$constructors$1$1.apply(Meta.scala:96)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
at scala.collection.immutable.List.foreach(List.scala:76)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
at scala.collection.immutable.List.map(List.scala:76)
at net.liftweb.json.Meta$.constructors$1(Meta.scala:96)
at net.liftweb.json.Meta$$anonfun$mappingOf$1.apply(Meta.scala:168)
at net.liftweb.json.Meta$$anonfun$mappingOf$1.apply(Meta.scala:160)
at net.liftweb.json.Meta$Memo.memoize(Meta.scala:197)
at net.liftweb.json.Meta$.mappingOf(Meta.scala:160)
at net.liftweb.json.Extraction$.mkMapping$1(Extraction.scala:193)
at net.liftweb.json.Extraction$.mkMapping$1(Extraction.scala:190)
at net.liftweb.json.Extraction$.net$liftweb$json$Extraction$$extract0(Extraction.scala:198)
at net.liftweb.json.Extraction$.extract(Extraction.scala:42)
at net.liftweb.json.JsonAST$JValue.extract(JsonAST.scala:300)
at net.liftweb.json.Serialization$.read(Serialization.scala:58)
at .<init>(<console>:16)
at .<clinit>(<console>)
at .<init>(<console>:11)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:704)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:914)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:546)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:577)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:543)
at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:694)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:745)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:651)
at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:542)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:550)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:822)
at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:851)
at xsbt.ConsoleInterface.run(ConsoleInterface.scala:57)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:73)
at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:64)
at sbt.Console.console0$1(Console.scala:23)
at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
at sbt.TrapExit$.executeMain$1(TrapExit.scala:33)
at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)
Notes:
If I use String, as in case class Person(Name:String, Age:Option[String]), instead of Int, it returns the correct result.
If I use java.lang.Integer, as in case class Person(Name:String, Age:Option[java.lang.Integer]), it also returns the correct result.
My question is: why do I have to use the Java type here? Is there a better/cleaner way to express this?
Please see the FAQ at the end of the lift-json README: extraction does not work properly for classes that are defined in the REPL.
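A minimal sketch of the same round-trip with Person defined in a compiled source file rather than in the REPL (the file layout is just for illustration):
// Person.scala
case class Person(Name: String, Age: Option[Int])

// Main.scala
import net.liftweb.json._
import net.liftweb.json.Serialization.{read, write}

object Main extends App {
  implicit val formats = DefaultFormats
  val friends = List(Person("Dan", Some(21)), Person("Ben", None))
  println(read[List[Person]](write(friends)))
}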
I have the below piece of code:
object SubClass extends MyTrait {
  private[this] val a = 10

  def main(args: Array[String]) {
    println(a)
  }
}

trait MyTrait {
  protected val a = 5
}
And it gives the following runtime error. Can somebody explain why this isn't caught at compile time?
Exception in thread "main" java.lang.ClassFormatError: Duplicate field name&signature in class file SubClass$
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at SubClass.main(TraitTest.scala)
Because software has bugs?
https://issues.scala-lang.org/browse/SI-7475
That would be my guess.
The related ticket has received recent attention:
https://issues.scala-lang.org/browse/SI-2568
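A possible workaround until the bug is fixed (my suggestion, not from the ticket) is to override the trait member explicitly instead of shadowing it with a same-named private[this] val:
object SubClass extends MyTrait {
  override protected val a = 10  // overrides MyTrait.a instead of declaring a second field named a

  def main(args: Array[String]): Unit = {
    println(a)
  }
}

trait MyTrait {
  protected val a = 5
}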