I am getting an error while passing arguments to methods. I have an object builddeequ_rules and I am calling its methods using reflection.
def build(rules: List[Map[String, Any]]): Check = {
  for (constraint <- rules) {
    val name = constraint("name")
    val args = constraint("args")
    val hiObj = builddeequ_rules
    val mtd = hiObj.getClass.getMethod(name.toString, args.getClass)
    mtd.invoke(hiObj, args)
  }
  builddeequ_rules.checks
}
import com.amazon.deequ.checks.{Check, CheckLevel}

object builddeequ_rules {
  var checks = Check(CheckLevel.Warning, "Data unit test")

  def isComplete(args: Any): Unit = {
    val arg = args.asInstanceOf[Map[String, Any]]
    val columnName = arg("column").toString
    checks = checks.isComplete(columnName)
  }

  def isUnique(args: Any): Unit = {
    val arg = args.asInstanceOf[Map[String, Any]]
    val columnName = arg("column").toString
    checks = checks.isUnique(columnName)
  }

  def isPositive(args: Any): Unit = {
    val arg = args.asInstanceOf[Map[String, Any]]
    val columnName = arg("column").toString
    checks = checks.isPositive(columnName)
  }
}
I am getting the error below. Need help!
Error: type mismatch;
found : Any
required: Object
mtd.invoke(hiObj,args)
java.lang.Object is more or less scala.AnyRef. scala.Any is (simplifying) a superset of objects and primitives. So the compiler is warning you that you are trying to pass something that could potentially be a primitive (Any) as java.lang.Object.
At the bytecode level Any will quite often be just Object, sure, but Scala's type system makes the distinction between things that are "natively" Objects and things that could require autoboxing to become Objects, and that's the error you see.
So the solution here is to annotate this value as AnyRef or, even better, as java.lang.Object, to clearly show that you want to use it for something Java/JVM-specific.
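A minimal sketch of the fix, on a made-up Greeter object rather than the deequ code: upcast the Any to Object at the invoke call site.

```scala
// Hypothetical object, just to demonstrate the Any-vs-Object issue
object Greeter {
  def greet(name: Object): String = s"Hello, $name"
}

val args: Any = "world" // statically typed as Any

val mtd = Greeter.getClass.getMethod("greet", classOf[Object])

// mtd.invoke(Greeter, args)                  // does not compile: found Any, required Object
val res = mtd.invoke(Greeter, args.asInstanceOf[Object]) // compiles: cast to AnyRef/Object first
println(res) // Hello, world
```

Alternatively, declare the variable as AnyRef (or Object) in the first place, so no cast is needed at the call site.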
I want to process various generic data sources consumed from Kafka, so I wrote the following code:
def accessKafkaSource[T: ClassTag](sEnv: StreamExecutionEnvironment): DataStream[T] = {
  val kafkaSource: KafkaSource[T] = KafkaSource.builder()
    .setBootstrapServers("")
    .setGroupId("")
    .setTopics("test")
    .setStartingOffsets(OffsetsInitializer.committedOffsets(OffsetResetStrategy.LATEST))
    .setValueOnlyDeserializer(new AbstractDeserializationSchema[T]() {
      override def deserialize(msg: Array[Byte]): T = {
        // JSONUtil.toBean(StrUtil.str(msg, StandardCharsets.UTF_8), classOf[T])
        JSONUtil.toBean(StrUtil.str(msg, StandardCharsets.UTF_8), classTag[T].runtimeClass)
      }
    })
    .build()
Since the commented-out code produces the error "class type required but T found", I modified the code, but that caused a new problem: "type mismatch; found: _$1 where type _$1; required: T". How can I achieve what I need?
As AminMal notes, runtimeClass is not guaranteed to return the class object of T, just what it erases to at runtime. AnyVals in particular will break this.
If everything you wish to deserialize is an AnyRef (this is likely the case), you can often safely cast the result of runtimeClass:
def kindaSafeClass[T <: AnyRef : ClassTag]: Class[T] = classTag[T].runtimeClass.asInstanceOf[Class[T]]
The situation where this would be unsafe is when generics are involved (erasure...), as can be seen by
val clazz = kindaSafeClass[List[String]]
val lst = List(1)
val cast =
  if (clazz.isInstance(lst)) {
    println(s"$lst is an instance of $clazz")
    clazz.cast(lst)
  } else ???
println(cast)
println(cast.head.isEmpty)
which will print List(1) is an instance of class scala.collection.immutable.List, then List(1), and then blow up with a ClassCastException when we try to cast 1 to a String.
But if your T will always be an AnyRef and you can be sure that it's not generic, you can
// Note: T must not be generic (e.g. List[String])
def accessKafkaSource[T <: AnyRef : ClassTag](sEnv: StreamExecutionEnvironment): DataStream[T] =
// as before until...
JSONUtil.toBean(StrUtil.str(msg, StandardCharsets.UTF_8), kindaSafeClass[T])
// as before...
That's because runtimeClass returns a Class[_], not a Class[T]. This kind of approach would make perfect sense in Java, like:
JSONUtil.toBean(whateverString, MyClass.class); // and so on
In Scala (of course there are unsafe approaches to make this work), if you're using a JSON library (like Play JSON, circe, etc.), you can do this:
// the method signature would look something like this
def accessKafkaSource[T: Reads](sEnv: StreamExecutionEnvironment): DataStream[T] = {
  // just going to write the deserialization part:
  override def deserialize(msg: Array[Byte]): T = {
    Json.parse(msg).as[T] // asOpt is safer; not going to get into exception handling and other stuff
  }
}
The same approach applies to other JSON libraries in Scala. Or, if you have other kinds of documents like XML, expect an implicit function Array[Byte] => DocumentType (like JsValue, String, Xml, anything), and another one DocumentType => T, because accessKafkaSource should not be responsible for figuring out how data is serialized/deserialized.
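That idea of decoupling the source accessor from the concrete deserialization can be sketched without any particular JSON library, using a hypothetical Decoder type class (all names here are illustrative, not Flink's or Play's API):

```scala
import java.nio.charset.StandardCharsets

// Hypothetical type class: how to turn raw Kafka bytes into a T
trait Decoder[T] {
  def decode(bytes: Array[Byte]): T
}

object Decoder {
  // One instance per target type; here just String for demonstration
  implicit val stringDecoder: Decoder[String] =
    (bytes: Array[Byte]) => new String(bytes, StandardCharsets.UTF_8)
}

// The accessor only demands "a decoder for T exists"; it never inspects T itself
def deserialize[T](msg: Array[Byte])(implicit d: Decoder[T]): T =
  d.decode(msg)
```

A real accessKafkaSource[T: Decoder] would then call d.decode inside its DeserializationSchema, and no ClassTag or runtimeClass is needed at all.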
I have this code:
trait ModelDataService[F[_]] {
  def getModelVersion(modelVersionId: Long): F[ModelVersion]
}

class ModelDataServiceIdInterpreter[F[_] : Monad] extends ModelDataService[F] {
  override def getModelVersion(modelVersionId: Long): F[ModelVersion] = {
    val mv = ModelVersion(1, 1, "ModelType", "Status", None, None, Some(ModelContract("ModelName", Some(ModelSignature("infer", Seq(ModelField(name = "blah", profile = DataProfileType.NUMERICAL)), Seq.empty[ModelField])))), None, "", None)
    Monad[F].pure(mv)
  }
}
I am trying to do this:
val model = modelDataService.getModelVersion(modelVersionId)
val batchSize = model.monitoringConfiguration
I get a compile error
value monitoringConfiguration is not a member of type parameter F[a.grpc.entities.ModelVersion]
However, a.grpc.entities.ModelVersion has the monitoringConfiguration field. I guess it has something to do with F. Is there a way I can access the batchSize inside model?
It depends on the constraints you have on your higher-kinded type F. If your F is a Functor, then you can access the value inside using map. If F has a FlatMap or Monad instance, you can also use flatMap.
import cats.syntax.functor._ // provides .map for any F[_] with a Functor instance

val batchSize: F[Long] =
  modelDataService
    .getModelVersion(modelVersionId)
    .map(_.monitoringConfiguration)
The way the program is structured, your value stays inside F at all times, until you really need to get it out. That is done by instantiating F with some concrete type, e.g. IO from cats-effect, or Future from the standard library. Or, if you don't perform any side effects, it can be as simple as Option. Once the type is concrete, you have different ways of getting the value out, depending on the type: for Future it can be .onComplete, for Option it can be .getOrElse, etc.
val service: ModelDataService[Option] = new ModelDataServiceIdInterpreter[Option]

val maybeBatchSize: Option[Long] =
  service
    .getModelVersion(modelVersionId)
    .map(_.monitoringConfiguration)

val batchSizeDefaultValue = 10L
val batchSize = maybeBatchSize.getOrElse(batchSizeDefaultValue)
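A self-contained sketch of the same pattern, with ModelVersion reduced to a single field and Option standing in for F (no cats dependency; names simplified for illustration):

```scala
final case class ModelVersion(monitoringConfiguration: Long)

trait ModelDataService[F[_]] {
  def getModelVersion(modelVersionId: Long): F[ModelVersion]
}

// Interpreter fixed to Option: values come back wrapped in Some
object OptionInterpreter extends ModelDataService[Option] {
  def getModelVersion(modelVersionId: Long): Option[ModelVersion] =
    Some(ModelVersion(monitoringConfiguration = 42L))
}

val batchSize: Long =
  OptionInterpreter
    .getModelVersion(1L)
    .map(_.monitoringConfiguration) // still inside Option here
    .getOrElse(10L)                 // concrete type, so we can extract
```

With cats, the same .map line works for any F[_]: Functor; only the final extraction step depends on the concrete type you pick.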
I am following through Scala Tour provided on docs.scala-lang.org. I am stuck at the extractor objects tutorial: https://docs.scala-lang.org/tour/extractor-objects.html
Here is the code I am trying to compile:
import java.util.concurrent.atomic.AtomicInteger

object IdGenerator {
  private val id: AtomicInteger = new AtomicInteger
  def apply(name: String): String = id.incrementAndGet + "--" + name
  def unapply(genID: String): Option[String] = {
    val idParts = genID.split("--")
    if (idParts.head.nonEmpty && idParts.tail.nonEmpty)
      Some(idParts(0))
    else
      None
  }
}
println(IdGenerator("ABC"))
println(IdGenerator("DEF"))
println(IdGenerator("XYZ"))
IdGenerator(idName) = IdGenerator("ABC")
println(idName)
println(IdGenerator.unapply(IdGenerator("ABC")))
Here is the error:
D:\MyApps\ScalaPrac\helloworld\hello\src\main\scala\example\Hello.scala:68:5: value update is not a member of object example.IdGenerator
IdGenerator(idName) = IdGenerator("ABC")
It says that value update is not a member of the object. Sure, it isn't. But I am not asking it to look for an update method; I want it to look for unapply instead.
IdGenerator(idName) = x looks like an assignment, but it is actually syntactic sugar for IdGenerator.update(idName, x). That explains the error message you get.
You need to use the val keyword to extract idName:
val IdGenerator(idName) = IdGenerator("ABC")
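For reference, a self-contained version of the fixed code (same IdGenerator as above), also showing the extractor in a match expression:

```scala
import java.util.concurrent.atomic.AtomicInteger

object IdGenerator {
  private val id: AtomicInteger = new AtomicInteger
  def apply(name: String): String = id.incrementAndGet + "--" + name
  def unapply(genID: String): Option[String] = {
    val idParts = genID.split("--")
    if (idParts.head.nonEmpty && idParts.tail.nonEmpty) Some(idParts(0)) else None
  }
}

// val + extractor pattern: desugars to IdGenerator.unapply(...)
val IdGenerator(idName) = IdGenerator("ABC") // idName = "1" (first id generated)

// Extractors also drive pattern matching
IdGenerator("DEF") match {
  case IdGenerator(n) => println(s"numeric part: $n") // prints "numeric part: 2"
  case _              => println("no match")
}
```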
Consider the following Scala program:
import scala.reflect.runtime.universe._

val arr: Seq[String] = Seq("abc", "def")
val cls = arr.head.getClass
println(cls)

val ttg: TypeTag[Seq[String]] = typeTag[Seq[String]]
val fn = ttg.tpe
  .members
  .filter(_.name.toString == "head")
  .head // Unsafely access it for now; use Option and map under normal conditions
  .asMethod // Coerce to a method symbol

val fnTp = fn.returnType
println(fnTp)

val fnCls = ttg.mirror.runtimeClass(fnTp)
assert(fnCls == cls)
Since the TypeTag carries both the Seq and the String information, I would expect fn.returnType to give the correct result, "String", but in this case I got the following program output:
cls = class java.lang.String
fnTp = A
And subsequently throw this exception:
A needed class was not found. This could be due to an error in your runpath. Missing class: no Java class corresponding to A found
java.lang.NoClassDefFoundError: no Java class corresponding to A found
at scala.reflect.runtime.JavaMirrors$JavaMirror.typeToJavaClass(JavaMirrors.scala:1258)
at scala.reflect.runtime.JavaMirrors$JavaMirror.runtimeClass(JavaMirrors.scala:202)
at scala.reflect.runtime.JavaMirrors$JavaMirror.runtimeClass(JavaMirrors.scala:65)
Apparently the type String was erased, leaving only the abstract type 'A'.
Why is TypeTag unable to yield the concrete type as intended?
Seq.head is defined as def head: A. And fn is just the method symbol of head from the generic class Seq[A]; it doesn't know anything about the concrete type, so its returnType is exactly that A, just as defined in Seq.
If you want to know what that A would be in some concrete Type, you'd have to specify that explicitly. For instance, you can use infoIn on the method symbol:
scala> val fnTp = fn.infoIn(ttg.tpe)
fnTp: reflect.runtime.universe.Type = => String
scala> val fnRetTp = fnTp.resultType
fnRetTp: reflect.runtime.universe.Type = String
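Put together, a minimal Scala 2 sketch of the infoIn approach (here using member to look up head directly instead of filtering members):

```scala
import scala.reflect.runtime.universe._

val ttg = typeTag[Seq[String]]

// The raw symbol still reports the generic return type A ...
val fn = ttg.tpe.member(TermName("head")).asMethod
println(fn.returnType)

// ... but infoIn substitutes the type parameter as seen from Seq[String]
val fnRetTp = fn.infoIn(ttg.tpe).resultType
println(fnRetTp) // String
```

Compare types with =:= rather than ==; fnRetTp =:= typeOf[String] holds here, and ttg.mirror.runtimeClass(fnRetTp) now resolves to classOf[String] without the NoClassDefFoundError.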
So here's what I have:
type CompType = Manifest[_ <: Component]
type EntityType = HashSet[CompType]
type CompSet = HashSet[Component]

val type_map = new HashMap[String, EntityType]
val entity_map = new HashMap[EntityID, CompSet]

def createEntityType(type_name: String) = {
  val ent_id = new EntityID
  val ent_type: CompSet =
    type_map(type_name) map (c_type => c_type.erasure.newInstance())
  entity_map += (ent_id -> ent_type)
  ent_id
}
But as you can see, the map function doesn't create a CompSet, it creates a HashSet[Any].
Is there any way of solving this problem?
The whole point is to save object types for instantiation at a later time within the program, but I can't get this to work, and all the reflection examples I've found expect some kind of type parameter to cast to via _.asInstanceOf[SomeClassType].
type_map(type_name) map (c_type => c_type.erasure.newInstance().asInstanceOf[Component])
?
BTW, Manifest is deprecated since Scala 2.10; you should use ClassTag and TypeTag instead.
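A sketch of the same idea with ClassTag in place of Manifest (Component and Position are made-up stand-ins; runtimeClass is ClassTag's analogue of Manifest.erasure):

```scala
import scala.collection.immutable.HashSet
import scala.collection.mutable.HashMap
import scala.reflect.{ClassTag, classTag}

trait Component
class Position extends Component // needs a no-arg constructor for newInstance

object EntityDemo {
  type CompType = ClassTag[_ <: Component]
  type EntityType = HashSet[CompType]
  type CompSet = HashSet[Component]

  val type_map = new HashMap[String, EntityType]
  type_map += ("player" -> HashSet[CompType](classTag[Position]))

  def createComponents(type_name: String): CompSet =
    type_map(type_name).map { c_type =>
      // runtimeClass replaces erasure; the cast restores the Component element type
      c_type.runtimeClass.getDeclaredConstructor().newInstance().asInstanceOf[Component]
    }
}
```

The asInstanceOf[Component] cast is the same one suggested above: it is what keeps the mapped set typed as HashSet[Component] instead of HashSet[Any].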