Unable to call method from Scala protobuf jar

I have defined a Scala object (say, MyObject) which extends the following:
trait GeneratedMessageCompanion[A <: GeneratedMessage with Message[A]]
When I call the parseFrom method on the object, I get the following error:
Caused by: java.lang.NoSuchMethodError:....MyObject$.parseFrom([B)Lscalapb/GeneratedMessage;
I tried both scalapb-runtime_2.11 and scalapb-runtime_2.12.
Edit: The issue is solved. It was a case of dependency mismatch.
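Since the fix ended up being version alignment, here is a minimal build.sbt sketch of the usual way to keep the generated code and the runtime library in sync, assuming the standard sbt-protoc/ScalaPB setup (with the compilerplugin dependency in project/plugins.sbt). Using %% also makes the _2.11 vs _2.12 suffix follow your scalaVersion, so the two artifacts can't get mixed.

// build.sbt (sketch): let the plugin dictate the runtime version so the code
// generated by the ScalaPB compiler plugin and scalapb-runtime never drift apart.
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" %
  scalapb.compiler.Version.scalapbVersion % "protobuf"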

Related

java.lang.ClassCastException: java.lang.Class cannot be cast to myClass

I'm working on a project where I have the following structure:
(1) An API that holds some case classes, traits, and abstract classes.
(2) A main jar that uses the API (1) as a dependency (the API is published and referenced as a dependency in build.sbt).
(3) Sometimes we need to create jars and load them into the main jar (2) at runtime using class-loading utilities; these jars also use the API (1) as a dependency.
Now, everything was working fine with the third point, i.e. loading classes from the jar at runtime and making sure that they extend the API class correctly:
val clazz = classLoader.loadClass("name")
return clazz.asInstanceOf[MyClass]
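For context, the loading step boils down to something like the sketch below (the jar path and class name are placeholders). Note that loadClass only returns a java.lang.Class, so an instance has to be created before anything can be treated as a MyClass:

import java.net.URLClassLoader

def loadImplementation(jarPath: String, className: String): MyClass = {
  // Share the API's class loader so the MyClass seen inside the plugin jar is
  // the same class as the one used for the cast below.
  val loader = new URLClassLoader(
    Array(new java.io.File(jarPath).toURI.toURL),
    classOf[MyClass].getClassLoader)
  val clazz = loader.loadClass(className)
  clazz.getDeclaredConstructor().newInstance().asInstanceOf[MyClass]
}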
But after I made a small change to MyClass in the API, mainly adding an implicit parameter to the signature of my abstract class, I can no longer identify the external jars being loaded at runtime, and I keep getting the following error:
[error] (run-main-0) java.lang.ClassCastException: java.lang.Class cannot be cast to MyClass
[error] java.lang.ClassCastException: java.lang.Class cannot be cast to MyClass
MyClass signature from the API looks something like this:
abstract class MyClass (implicit val something: Type) extends AnotherClass {
Any help to get around this is appreciated. I can give more details if needed.
Thanks

java.lang.NoSuchMethodError When Running A Code From External .jar

I have a piece of code that throws java.lang.NoSuchMethodError at runtime, and I have not been able to resolve it:
private def saveActivationEvent(event: ActivationEvent) = activationEventService.createIfFirst(event)
Implementations:
case class ActivationEvent() extends Event
class ActivationEventService extends AbstractEventService[ActivationEvent]
abstract class AbstractEventService[E <: Event] {
def createIfFirst(event: E)(implicit reader: BSONDocumentReader[E], writer: BSONDocumentWriter[E]): Future[Option[BSONObjectID]] = ...
}
I thought it was happening because of type erasure... Could someone help me understand the problem?
As I mention in a comment above, any time you see a NoSuchMethodError, the first thing you should check is that your compile-time and runtime dependency versions match. For what it's worth, I can't think of a way that type erasure could have anything to do with a NoSuchMethodError. You may see a ClassCastException if someone has a bad type test that matches because of erasure, but even in that case the problem isn't really the erasure; it's the fact that someone is trying to work around it (and ignored the compiler's warnings).
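To make that concrete, one common way to keep compile-time and runtime versions aligned in an sbt build is to pin the shared dependency explicitly; the coordinates and version below are placeholders, not taken from the question. Running sbt evicted also shows which versions were replaced during dependency resolution.

// build.sbt (sketch): pin the library so every module resolves to the version
// the code was actually compiled against, even if something else pulls in a
// different one transitively.
libraryDependencies += "org.reactivemongo" %% "reactivemongo" % "0.16.0"
dependencyOverrides += "org.reactivemongo" %% "reactivemongo" % "0.16.0"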

Using Mockito & Guice to test interfaces with generics in Scala

I am new to Scala, and I'm running into this problem when I'm trying to unit test some of my interfaces.
I have an InputService trait with a method
def poll(parameters: HashMap[String, String]): Option[T]
where T is generic, so InputService has a type parameter [T].
In my module, I have
val inputService: InputService[String] = mock(classOf[InputService[String]])
bind[InputService[String]].toInstance(inputService)
and in my InputServiceTest, I have
var inputService: InputService[String] = _
before {
inputService = Guice.createInjector(new MockWatcherModule).getInstance(classOf[InputService[String]])
}
But when I run it, it gives me this error:
Exception encountered when invoking run on a nested suite - Guice configuration errors:
1) No implementation for services.InputService was bound.
while locating services.InputService
I think it's because it's looking for services.InputService to be bound, but only services.InputService[String] is bound. However, when I just use InputService instead of InputService[String], I get the error Trait missing Type Parameter.
Any suggestions?
EDIT:
Turns out that I can use typeLiteral from scala-guice and KeyExtensions to solve my issue. Thanks Tavian!
Due to type erasure, in the getInstance(classOf[InputService[String]]) call, you're just passing InputService.class. You need to pass a TypeLiteral instead to encode the generic type information. From a quick Google it looks like
import net.codingwell.scalaguice._
import net.codingwell.scalaguice.InjectorExtensions._
Guice.createInjector(new MockWatcherModule).instance[InputService[String]]
will work.
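For completeness, the typeLiteral/KeyExtensions route mentioned in the question's edit looks roughly like this; it is a sketch assuming scala-guice's typeLiteral helper and the toKey extension from KeyExtensions:

import com.google.inject.Guice
import net.codingwell.scalaguice._
import net.codingwell.scalaguice.KeyExtensions._

// Build a Key that carries the full generic type, so Guice looks up the
// InputService[String] binding instead of the raw InputService class.
val injector = Guice.createInjector(new MockWatcherModule)
val inputService = injector.getInstance(typeLiteral[InputService[String]].toKey)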

Reference a java nested class in Spark Scala

I'm trying to read some data from Hadoop into an RDD in Spark using the interactive Scala shell, but I'm having trouble accessing some of the classes I need to deserialise the data.
I start by importing the necessary class
import com.example.ClassA
This works fine. ClassA is located in a jar on the 'jars' path and has ClassB as a public static nested class.
I'm then trying to use ClassB like so:
val rawData = sc.newAPIHadoopFile(dataPath, classOf[com.example.mapreduce.input.Format[com.example.ClassA$ClassB]], classOf[org.apache.hadoop.io.LongWritable], classOf[com.example.ClassA$ClassB])
This is slightly complicated by one of the other classes taking ClassB as a type, but I think that should be fine.
When I execute this line, I get the following error:
<console>:17: error: type ClassA$ClassB is not a member of package com.example
I have also tried using the import statement
import com.example.ClassA$ClassB
and the shell seems fine with that.
Any advice as to how I could proceed to debug this would be appreciated
Thanks for reading.
Update:
Changing the '$' to a '.' to reference the nested class seems to get past this problem, although I then got the following error:
<console>:17: error: inferred type arguments [org.apache.hadoop.io.LongWritable,com.example.ClassA.ClassB,com.example.mapreduce.input.Format[com.example.ClassA.ClassB]] do not conform to method newAPIHadoopFile's type parameter bounds [K,V,F <: org.apache.hadoop.mapreduce.InputFormat[K,V]]
Notice the types that the newAPIHadoopFile expects:
K,V,F <: org.apache.hadoop.mapreduce.InputFormat[K,V]
The important part here is that the generic type InputFormat expects the types K and V, i.e. exactly the first two type parameters of the method.
In your case, the third parameter should be of type
F <: org.apache.hadoop.mapreduce.InputFormat[LongWritable, ClassA.ClassB]
Does your class extend FileInputFormat<LongWritable, V>?
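In other words, the bound only holds if the input format's own type arguments line up with K and V. A minimal Scala sketch of a conforming declaration (placeholder shape; the real Format lives in the com.example jar and is presumably written in Java):

import org.apache.hadoop.io.LongWritable
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat

// With a declaration like this, Format[ClassA.ClassB] is a subtype of
// InputFormat[LongWritable, ClassA.ClassB], which is exactly what the bound
// F <: InputFormat[K, V] requires for K = LongWritable and V = ClassA.ClassB.
abstract class Format[V] extends FileInputFormat[LongWritable, V]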

Creating a Hibernate Method Validator in Scala

I'm trying to convert this Java code into Scala, and I am failing:
Java (which compiles without error):
Validation.byProvider(HibernateValidator.class).configure().
buildValidatorFactory().getValidator().unwrap(MethodValidator.class);
Scala:
Validation.byProvider(classOf[HibernateValidator]).configure.
buildValidatorFactory.getValidator.unwrap( classOf[MethodValidator] )
Scala error:
inferred type arguments [Nothing,org.hibernate.validator.HibernateValidator] do
not conform to method byProvider's type parameter bounds [T <:
javax.validation.Configuration[T],U <:
javax.validation.spi.ValidationProvider[T]]
What am I doing wrong?
I am using Scala 2.10 and have JBoss 7.1.0 on the classpath.
It looks like Scala is having a little trouble inferring some types. This should work:
Validation.byProvider[HibernateValidatorConfiguration, HibernateValidator](classOf[HibernateValidator])
.configure.buildValidatorFactory.getValidator.unwrap(classOf[MethodValidator])
If you look at the source of byProvider you'll find this:
public static <T extends javax.validation.Configuration<T>,
U extends javax.validation.spi.ValidationProvider<T>>
javax.validation.bootstrap.ProviderSpecificBootstrap<T>
byProvider(java.lang.Class<U> providerType)
So Scala should pick up that HibernateValidator implements ValidationProvider<HibernateValidatorConfiguration>, but it doesn't.
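A stripped-down illustration of why the inference falls over, using hypothetical types rather than the real javax.validation ones: T only appears in U's bound, never in the parameter list, so Scala infers Nothing for T and the bound check on U then fails unless both type arguments are given explicitly.

object ByProviderSketch {
  trait Conf[T <: Conf[T]]
  trait Provider[T <: Conf[T]]

  // Same shape as byProvider: the argument only mentions U, not T.
  def byProvider[T <: Conf[T], U <: Provider[T]](providerType: Class[U]): T =
    null.asInstanceOf[T] // stub body, just to illustrate the signature

  class MyConf extends Conf[MyConf]
  class MyProvider extends Provider[MyConf]

  // byProvider(classOf[MyProvider])                 // error: inferred [Nothing, MyProvider] do not conform
  val ok = byProvider[MyConf, MyProvider](classOf[MyProvider]) // compiles with explicit type arguments
}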