Java List to Scala Conversion Error - scala

I have a Java code base that returns a java.util.List, which I consume in my Scala layer as below:
import scala.collection.JavaConverters._
val myList = myServiceClient.getMyList.asScala.toList // fails here!
println(myList)
I then hit the following error:
Exception in thread "main" javax.xml.ws.soap.SOAPFaultException: scala.collection.immutable.$colon$colon cannot be cast to java.util.List
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:161)
at com.sun.proxy.$Proxy49.getSlaveList(Unknown Source)
at Test$.main(Test.scala:35)
at Test.main(Test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.ClassCastException: scala.collection.immutable.$colon$colon cannot be cast to java.util.List
at org.apache.cxf.binding.soap.SoapMessage.getHeaders(SoapMessage.java:56)
at org.apache.cxf.binding.soap.interceptor.SoapHeaderOutFilterInterceptor.handleMessage(SoapHeaderOutFilterInterceptor.java:37)
at org.apache.cxf.binding.soap.interceptor.SoapHeaderOutFilterInterceptor.handleMessage(SoapHeaderOutFilterInterceptor.java:29)
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:308)
at org.apache.cxf.endpoint.ClientImpl.doInvoke(ClientImpl.java:514)
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:423)
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:324)
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:277)
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:96)
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:139)
... 8 more

So the original problem was a couple of lines above the code I posted in my original question:
I had to do the following when passing the List to the Apache CXF library:
val headerList = Seq(
  new Header(new QName("http://www.myService.com/MyServices/", "UserName"), "", new JAXBDataBinding(classOf[String])),
  new Header(new QName("http://www.myService.com/MyServices//", "Password"), "", new JAXBDataBinding(classOf[String]))
)
import scala.collection.JavaConverters._
proxy.getRequestContext.put(Header.HEADER_LIST, headerList.asJava)
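For context on why the conversion fixes the cast error: CXF later pulls the HEADER_LIST entry back out of the request context and casts it to java.util.List inside SoapMessage.getHeaders, so storing a Scala Seq (whose runtime class is scala.collection.immutable.$colon$colon) is what blows up in the interceptor chain. A minimal before/after sketch, reusing the proxy and headerList names from the snippet above:

// Broken: stores a Scala Seq, which SoapMessage.getHeaders cannot cast to java.util.List
// proxy.getRequestContext.put(Header.HEADER_LIST, headerList)

// Working: convert to a java.util.List first
import scala.collection.JavaConverters._
proxy.getRequestContext.put(Header.HEADER_LIST, headerList.asJava)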

Related

Small number causes java.lang.ClassCastException when snakeyaml deserialized object is passed to Gatling feeder

I'm running a Gatling simulation that uses numeric input from a YAML file to feed its scenario. Everything works when my numeric inputs are too large to fit in a java.lang.Integer, but small values are apparently parsed to Integers and result in a ClassCastException.
import java.io.FileInputStream

import io.gatling.core.Predef.{Feeder, Simulation}
import org.yaml.snakeyaml.Yaml
import org.yaml.snakeyaml.constructor.Constructor
import io.gatling.core.Predef.{scenario, _}

import scala.collection.JavaConversions

class TestClass extends Simulation {

  val yaml = new Yaml(new Constructor(classOf[Holder]))
  val holder = yaml.load(new FileInputStream("src/test/resources/data.yml")).asInstanceOf[Holder]

  scenario("sim").feed(getUserEmulationFeeder(holder))

  def getUserEmulationFeeder(holder: Holder): Feeder[Long] = {
    val iterable = JavaConversions.iterableAsScalaIterable(holder.numbers)
    iterable.map(l => Map("userToEmulate" -> l)).iterator
  }
}
data.yml has the following data:
numbers:
- 30687965369
- 31415388869
- 2
and is being deserialized into:
import scala.beans.BeanProperty

class Holder {
  @BeanProperty var numbers = new java.util.ArrayList[Long]()
}
Removing the 2 fixes the ClassCastException.
The full stacktrace is:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.gatling.mojo.MainWithArgsInFile.runMain(MainWithArgsInFile.java:50)
at io.gatling.mojo.MainWithArgsInFile.main(MainWithArgsInFile.java:33)
Caused by: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:105)
at com.mercurygate.TestClass$$anonfun$getUserEmulationFeeder$1.apply(TestClass.scala:25)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at com.mercurygate.TestClass.getUserEmulationFeeder(TestClass.scala:25)
at com.mercurygate.TestClass.<init>(TestClass.scala:21)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at io.gatling.app.Gatling$.io$gatling$app$Gatling$$$anonfun$1(Gatling.scala:41)
at io.gatling.app.Gatling$lambda$1.apply(Gatling.scala:41)
at io.gatling.app.Gatling$lambda$1.apply(Gatling.scala:41)
at io.gatling.app.Gatling.run(Gatling.scala:92)
at io.gatling.app.Gatling.runIfNecessary(Gatling.scala:75)
at io.gatling.app.Gatling.start(Gatling.scala:65)
at io.gatling.app.Gatling$.start(Gatling.scala:57)
at io.gatling.app.Gatling$.fromArgs(Gatling.scala:49)
at io.gatling.app.Gatling$.main(Gatling.scala:43)
at io.gatling.app.Gatling.main(Gatling.scala)
... 6 more
P.S. Sorry for the complexity of the example. It's only when I combine SnakeYAML, Gatling, and the small input that I get the error.
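One way around it (a sketch, not from the original thread): because the element type of java.util.ArrayList[Long] is erased at runtime, SnakeYAML is free to put java.lang.Integer objects into the list for small values, and the BoxesRunTime.unboxToLong call inside the map is what throws. Viewing the elements as java.lang.Number and converting explicitly accepts either boxed type:

def getUserEmulationFeeder(holder: Holder): Feeder[Long] = {
  // Erased generics: the runtime list may hold Integer or Long boxes,
  // so view it as Number instead of trusting the declared Long.
  val numbers = holder.numbers.asInstanceOf[java.util.List[Number]]
  JavaConversions.iterableAsScalaIterable(numbers)
    .map(n => Map("userToEmulate" -> n.longValue()))
    .iterator
}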

java.lang.NoClassDefFoundError issue while running a scala code for a UDF

I am trying to write a UDF in Scala that returns all the months between the two dates passed in. This is what I have written:
package com.company.datediff

import org.apache.hadoop.hive.ql.exec.UDF
import java.time._

class hive_udf extends UDF {
  def evaluate(date1: String, date2: String): String = {
    val s1 = LocalDate.parse(date1)
    val s2 = LocalDate.parse(date2)
    val p = Period.between(s1, s2)
    val l = p.getMonths()
    val min1 = s1.getMonthValue()
    val max1 = s2.getMonthValue()
    var arr1 = ""
    for (i <- min1 to max1) {
      arr1 = arr1.concat("," + i)
    }
    /*var i = min1
    while (i <= max1) {
      arr1 = arr1.concat("," + i)
    }*/
    return arr1
  }
}
When I run this code without the for loop, it works fine. After including the for loop, I get 'java.lang.NoClassDefFoundError' and
Execution Error, return code -101 from 'org.apache.hadoop.hive.ql.exec.FunctionTask. scala/Function1'
PFB the details of full error:
java.lang.NoClassDefFoundError: scala/Function1
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.hive.ql.exec.Registry.registerToSessionRegistry(Registry.java:518)
at org.apache.hadoop.hive.ql.exec.Registry.registerPermanentFunction(Registry.java:207)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerPermanentFunction(FunctionRegistry.java:1536)
at org.apache.hadoop.hive.ql.exec.FunctionTask.createPermanentFunction(FunctionTask.java:136)
at org.apache.hadoop.hive.ql.exec.FunctionTask.execute(FunctionTask.java:75)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1748)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1494)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1291)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1158)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1148)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:217)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:169)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:380)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:740)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:685)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: scala.Function1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 26 more
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.FunctionTask. scala/Function1
I have limited exposure to Java and Scala, so I'm not sure where I'm going wrong. Any help is appreciated. Thanks.
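A note on the likely cause (not from the original thread): the for loop compiles to a Range.foreach call taking a scala.Function1 closure, so once it is added the UDF class needs the Scala standard library on Hive's classpath, and registerToSessionRegistry fails with NoClassDefFoundError: scala/Function1 when it is not there. Two common remedies are adding the scala-library jar to the Hive session (ADD JAR /path/to/scala-library-<version>.jar before CREATE FUNCTION) or bundling it into the UDF jar with sbt-assembly. A minimal build.sbt sketch for the latter, with illustrative version numbers:

// build.sbt
scalaVersion := "2.11.8"

// Hive provides hive-exec at runtime, so keep it out of the fat jar
libraryDependencies += "org.apache.hive" % "hive-exec" % "1.2.1" % "provided"

// project/plugins.sbt needs: addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
// `sbt assembly` then builds a jar that bundles scala-library alongside the UDF classes.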

How to use phantom in scala main object

I'm trying to use phantom to handle my Cassandra database.
This is the example I'm learning phantom from (thanks to Thiago):
https://github.com/iamthiago/cassandra-phantom
I can run 'SongTest' successfully, but I have trouble running database.create in my Scala main object.
The same code runs fine under ScalaTest, but it fails in my main object.
This is my source code:
package com.cassandra.phantom.modeling

import com.cassandra.phantom.modeling.database._
import com.cassandra.phantom.modeling.connector._

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

/**
 * Created by karamko on 2017. 3. 28..
 */
class TestCass extends EmbeddedDatabase with Connector.connector.Connector {
  def create() {
    database.create(5.seconds)
  }
}

object Main extends App {
  val test = new TestCass
  test.create()
}
and this is the error I get:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/runtime/package$
at com.outworkers.phantom.column.AbstractColumn$class.com$outworkers$phantom$column$AbstractColumn$$_name(AbstractColumn.scala:55)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name$lzycompute(Column.scala:24)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name(Column.scala:24)
at com.outworkers.phantom.column.AbstractColumn$class.name(AbstractColumn.scala:58)
at com.outworkers.phantom.column.Column.name(Column.scala:24)
at com.outworkers.phantom.column.PrimitiveColumn.qb(PrimitiveColumn.scala:38)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.Set$Set4.foreach(Set.scala:200)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:47)
at scala.collection.SetLike$class.map(SetLike.scala:92)
at scala.collection.AbstractSet.map(Set.scala:47)
at com.outworkers.phantom.builder.query.RootCreateQuery.lightweight(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery.ifNotExists(CreateQuery.scala:71)
at com.outworkers.phantom.CassandraTable.autocreate(CassandraTable.scala:92)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.outworkers.phantom.database.ExecutableCreateStatementsList.future(Database.scala:173)
at com.outworkers.phantom.database.Database.createAsync(Database.scala:85)
at com.outworkers.phantom.database.Database.create(Database.scala:75)
at com.cassandra.phantom.modeling.SongsStreaming$.main(Main.scala:29)
at com.cassandra.phantom.modeling.SongsStreaming.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: scala.reflect.runtime.package$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 30 more
I solved my problem by adding the dependency below. scala-reflect is published as a separate artifact from the standard library, so it has to be declared explicitly:
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
Thanks a lot.
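For completeness, a sketch of how that line sits in a build.sbt next to phantom (the phantom version below is a placeholder; only the scala-reflect line is the actual fix):

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "com.outworkers" %% "phantom-dsl"   % "2.x.x",            // placeholder version
  "org.scala-lang"  % "scala-reflect" % scalaVersion.value  // the piece that was missing at runtime
)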

Spark isn't loading, what's up with that?

So the problem I'm having is that I don't seem to be able to create a SparkContext, and I have no idea why not.
Here is my code:
import org.apache.spark.{SparkConf, SparkContext}

object spark_test {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Datasets Test").setMaster("local")
    val sc = new SparkContext(conf)
    println(sc)
  }
}
And here is the result that I am getting:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.SparkConf.getAkkaConf(SparkConf.scala:203)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:68)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
at spark_test$.main(test.scala:6)
at spark_test.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Any thoughts?
Your Scala version is too new and your spark-core version is too old. I am using Scala 2.11.8 and spark-core_2.11:2.0.1; you can try that!
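A sketch of a build.sbt that keeps the two in sync (the versions are simply the ones mentioned above; the key point is that the _2.xx suffix of the spark-core artifact must match scalaVersion, which the %% operator takes care of):

scalaVersion := "2.11.8"

// %% appends the Scala binary version, so this resolves to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"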

Can deserialize avros to Scala case-classes from in-memory, but why not from files? Record can't be cast to case class?

I'm trying to use Salat-Avro to serialize and deserialize Scala case classes.
I can serialize and deserialize fine in memory, but with files I can only serialize; I can't deserialize from a file yet.
Why won't my DatumReader succeed when reading from a file like it did when reading from a stream?
[error] (run-main) java.lang.ExceptionInInitializerError
java.lang.ExceptionInInitializerError
at Main.main(salat-avro-example.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
Caused by: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to models.Record
at Main$.<init>(salat-avro-example.scala:55)
at Main$.<clinit>(salat-avro-example.scala)
at Main.main(salat-avro-example.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[error] {file:/home/julianpeeters/salat-avro-example/}default-7321ab/compile:run: Nonzero exit code: 1
[error] Total time: 18 s, completed Aug 30, 2012 12:04:01 AM
Here's the code:
val obj2 = grater[Record].asObjectFromDataFile(infile)
calls:
lazy val asDatumReader: AvroDatumReader[X] = asGenericDatumReader
lazy val asGenericDatumReader: AvroGenericDatumReader[X] = new AvroGenericDatumReader[X](asAvroSchema)

def asObjectFromDataFile(infile: File): X = {
  val asDataFileReader: DataFileReader[X] = new DataFileReader[X](infile, asDatumReader)
  asDataFileReader.next()
}
The code can also be seen at Github.com: Salat-Avro-Example.scala and
Salat-Avro.avrograter.scala
How do I fix this? Thanks!
Now I see that dataFileReader.next returned a record, but the values of the fields were still UTF-8, and I needed to unmarshal the values back into a Scala object with applyValues. Something like the hackish thing below worked for me:
val objIterator = asDataFileReader.asScala
  .iterator
  .map(i => asGenericDatumReader.applyValues(i.asInstanceOf[GenericData.Record]).asInstanceOf[X])
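For anyone wondering where the GenericData.Record in the ClassCastException comes from: without a schema-to-class mapping, Avro's generic reader always yields GenericData.Record instances, and string fields inside them are org.apache.avro.util.Utf8 values. A sketch using the plain Avro API rather than Salat-Avro (the Record case class, field names, and file path below are hypothetical, just to illustrate the manual mapping that something like applyValues automates):

import java.io.File
import org.apache.avro.file.DataFileReader
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import scala.collection.JavaConverters._

// Hypothetical stand-in for models.Record from the question.
case class Record(id: Long, name: String)

val reader = new DataFileReader[GenericRecord](
  new File("records.avro"),                    // hypothetical path
  new GenericDatumReader[GenericRecord]())

// Up-cast to java.util.Iterator so asScala picks a single conversion.
val javaIt: java.util.Iterator[GenericRecord] = reader
val records: Iterator[Record] = javaIt.asScala.map { r =>
  // String fields come back as org.apache.avro.util.Utf8, hence toString;
  // numeric fields come back boxed, hence the cast.
  Record(r.get("id").asInstanceOf[Long], r.get("name").toString)
}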