Basically I want to read untyped JSON into a type that's specified by a string. Pseudocode.
def getObject(json: Json, typeString: String): typeOf(typeString) = extract[typeOf(typeString)](json)
typeOf is just some random thing that gives a type from a string.
I would say that it is impossible without runtime reflection. With runtime reflection I would try the following:
get the Class[_] by its name from the ClassLoader - this only works if you specify the full name (with packages and everything) in the String,
then use something like Jackson, which uses runtime reflection to (de)serialize things: new ObjectMapper().readValue(json, obtainedClass),
obviously the return type would be Any - theoretically you could use path-dependent types here, but personally I see little benefit. (A rough sketch follows below.)
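For illustration, a minimal sketch of that reflective approach, assuming Jackson is on the classpath (plus jackson-module-scala or similar, if your classes need it) and taking the JSON as a raw String; the names here are only placeholders:

import com.fasterxml.jackson.databind.ObjectMapper

// look up the class by its fully qualified name, then let Jackson deserialize into it;
// the static return type can only be Any, and any JSON/class mismatch throws at runtime
def getObject(json: String, typeString: String): Any = {
  val clazz = Class.forName(typeString)
  new ObjectMapper().readValue(json, clazz)
}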
However:
it would be pretty fragile - any mismatch between the JSON and the class and it fails,
you would have to pass the full name of a class (again, fragile) - if you pass that data in from outside, it sounds like a potential security issue; if you pass it internally... why not use a type class?
if you need it for persistence... then you could persist the value together with a discriminator, which you can use to pick the right type class. (After all, the set of classes that you serialize is finite and can easily be traced.) Then a type-safe approach with e.g. Circe would be possible.
Well, I guess you could also change the signature into:
def getObject[T: ClassTag](json: Json): T
or
def getObject[T](json: Json, clazz: Class[T]): T
and be sure that the function returns what you want. Getting the Class[_] by its name and passing it in reduces us to the original solution.
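For completeness, a hedged sketch of the Class[T] variant, again assuming Jackson (and reusing the ObjectMapper import from the sketch above); User and userJson are hypothetical names, just to show the call site:

// the Class[T] overload keeps the call site typed; the moment you obtain the
// Class[_] from a String instead of writing classOf[...], you are back to Any
def getObject[T](json: String, clazz: Class[T]): T =
  new ObjectMapper().readValue(json, clazz)

val user: User = getObject(userJson, classOf[User])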
EDIT:
To show an example of how to extract the type from a discriminator (pseudocode):
// store discriminator in DB
// use it to deserialize and dispatch with predefined function
def deserializeAndHandle(discriminator: String, json: String): Unit = discriminator match {
  case "my.package.A" => decode[my.package.A](json).map(handlerForA)
  case "my.package.B" => decode[my.package.B](json).map(handlerForB)
  case "my.package.C" => decode[my.package.C](json).map(handlerForC)
  case _              => () // unknown discriminator - ignore, log or fail
}
deserializeAndHandle(discriminator, json)
// store discriminator in DB
// use it to deserialize to Object which can be pattern-matched later
def deserializeToObject(discriminator: String, json: String): Option[Any] = discriminator match {
  case "my.package.A" => decode[my.package.A](json).toOption
  case "my.package.B" => decode[my.package.B](json).toOption
  case "my.package.C" => decode[my.package.C](json).toOption
  case _              => None
}
deserializeToObject(discriminator, json) map {
  case a: A => ...
  case b: B => ...
  case c: C => ...
} getOrElse ???
// wrap unrelated types with sealed trait to make it sum type
// use sum type support of Circe
sealed trait Envelope
final case class EnvelopeA(a: A) extends Envelope
final case class EnvelopeB(b: B) extends Envelope
final case class EnvelopeC(c: C) extends Envelope

def deserializeEnveloped(json: String): Option[Envelope] = decode[Envelope](json).toOption
deserializeEnveloped(json) map {
  case EnvelopeA(a) => ...
  case EnvelopeB(b) => ...
  case EnvelopeC(c) => ...
} getOrElse ???
I'm fairly new to Scala in general, and Scala 3 in particular, and I'm trying to write some code that deals with transparently encoding + decoding values before they are passed to another library.
Basically, I need to map a set of types like Int to a counterpart in the underlying library. The code I've written is too verbose to replicate here in full, but here's a minimal example demonstrating the kind of thing, using an Encoder type with an abstract result type member, so that the encoded type depends on the value's original type:
trait Encoder[T] {
  type U
  def encode(v: T): U
}

object Encoder {
  given Encoder[Int] with {
    override type U = String
    override def encode(v: Int): String = v.toString
  }
}
case class Value[T : Encoder](v: T) {
  val encoder: Encoder[T] = summon[Encoder[T]]
}
I also need to be able to write functions that deal with specific types of Value and which have 'concrete' return types. Like this:
def doStuff(v1: Value[Int]): String = {
  v1.encoder.encode(v1.v)
}
However, even though in this case v1.encoder.encode does indeed return a String, I get an error:
-- [E007] Type Mismatch Error: -------------------------------------------------
2 |    v1.encoder.encode(v1.v)
  |    ^^^^^^^^^^^^^^^^^^^^^^^
  |    Found:    v1.encoder.U
  |    Required: String
What can I do differently to solve this error? Really appreciate any pointers to help a newbie out!
Answering the question in the comments
Is there any sensible way I can tell the compiler that I'm only interested in Values with Encoders that encode to String?
You can force Value to remember its encoder's result type with an extra type argument.
case class Value[T, R](val v: T)(
  using val encoder: Encoder[T],
        val eqv: encoder.U =:= R,
)
The encoder is the same as your encoder, just moved to the using list so we can use it in implicit resolution.
eqv is a proof that R (our type parameter) is equivalent to the encoder's U type.
Then doStuff can take a Value[Int, String]
def doStuff(v1: Value[Int, String]): String = {
  v1.eqv(v1.encoder.encode(v1.v))
}
Let's be clear about what's happening here. v1.encoder.encode(v1.v) returns an encoder.U. Scala isn't smart enough to know what that is. However, we also have a proof that encoder.U is equal to String, and that proof can be used to convert an encoder.U to a String. And that's exactly what =:=.apply does.
We have to do this back in the case class because you've already lost the type information by the time we hit doStuff. Only the case class (which instantiates the implicit encoder) knows what the result type is, so we need to expose it there.
If you have other places in your codebase where you don't care about the result type, you can fill in a type parameter R for it, or use a wildcard Value[Int, ?].
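For instance, a caller that does not care about the encoded type could look like this (a hypothetical sketch, not part of the original question):

// the wildcard reads as "some Value[Int, R], for an R we don't care about"
def describe(v: Value[Int, ?]): String = s"Value(${v.v})"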
I would also suggest giving Match Types a try if we are only talking about Scala 3 here.
import scala.util.Try

type Encoder[T] = T match
  case Int    => String
  case String => Either[Throwable, Int]

case class Value[T](v: T):
  def encode: Encoder[T] = v match
    case u: Int    => u.toString
    case u: String => Try(u.toInt).toEither

object Main extends App:
  val (v1, v2) = (Value(1), Value(2))

  def doStuff(v: Value[Int]): String =
    v.encode

  println(doStuff(v1) + doStuff(v2)) // 12
  println(Value(v1.encode).encode)   // Right(1)
I have a simple method to retrieve a nested key from a hashmap. I need to pattern match on Map[String,Any] so that I can keep iterating into the nested data until I get to a flat value:
import scala.collection.mutable

def get(map: Map[String, Any], key: String): Any = {
  var fields: mutable.Seq[String] = key.split('.')
  var currentKey: String = fields.head
  var currentValue: Any = map
  while (fields.nonEmpty && currentValue.isInstanceOf[Map[String, Any]]) {
    currentKey = fields.head
    fields = fields.drop(1)
    currentValue match {
      case m: Map[String, Any] => currentValue = m.getOrElse(currentKey, None)
      case _ =>
    }
  }
  if (fields.nonEmpty) None else currentValue
}
It works when I use it only within Scala, but if it gets called from Java, I get the error "non-variable type argument String in type scala.collection.immutable.Map[String,Any]".
I've seen some other solutions that require you to refactor code and wrap the map in a case class, but that would be very disruptive to all the code that relies on this method. Is there any simpler fix?
You cannot pattern match on Map[String,Any] because of type erasure. The compiler will have warned of this. This code is just matching on Map[_,_] so it will succeed with any key type, not just String.
So the method is inherently buggy and it appears that calling from Java is exposing bugs that did not emerge when using Scala.
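To illustrate the erasure point with a self-contained snippet (not taken from the codebase in the question):

// the type arguments are erased at runtime, so the pattern only checks "is it a Map?"
val m: Any = Map(1 -> "one")
val result = m match {
  case _: Map[String, Any] => "matched"  // matches even though the keys are Ints (unchecked warning)
  case _                   => "no match"
}
// result == "matched"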
Since you are not using this from Java yet, I would switch to a typesafe implementation for the Java code and then migrate the legacy code to this version as quickly as possible. While this may be disruptive, it would be fixing a design error that introduced buggy code, so it should be done sooner rather than later. (Whenever you see Any used as a value type it is likely that the design went wrong at some point)
The typesafe version is not that difficult; here is an outline implementation:
class MyMap[T] {

  sealed trait MapType
  case class Value(value: T) extends MapType
  case class NestedMap(map: Map[String, MapType]) extends MapType

  def get(map: Map[String, MapType], key: String): Option[T] = {
    def loop(fields: List[String], map: Map[String, MapType]): Option[T] =
      fields match {
        case Nil =>
          None
        case field :: rest =>
          map.get(field).flatMap {
            case Value(res) if rest.isEmpty => Some(res) // leaf value and the path is exhausted
            case NestedMap(m)               => loop(rest, m)
            case _                          => None      // leaf value but the path continues
          }
      }
    loop(key.split('.').toList, map)
  }
}
In reality MyMap should actually hold the Map data rather than passing it in to get, and there would be methods for safely building nested maps.
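A quick illustration of how the outline would be used (hypothetical data, mainly to show the path-dependent member types):

val m = new MyMap[Int]

val data: Map[String, m.MapType] = Map(
  "flat"   -> m.Value(0),
  "nested" -> m.NestedMap(Map("inner" -> m.Value(1)))
)

m.get(data, "flat")           // Some(0)
m.get(data, "nested.inner")   // Some(1)
m.get(data, "nested.missing") // None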
I have the following case class.
case class CustomAttributeInfo[T, Y](
  attribute: MyAttribute[_],
  fieldName: String,
  valueParser: T => Y) {}
The case class takes three values.
The last argument is a function that will parse an input of any type and return the part of the input we wish to keep.
(Imagine, for just one example, that I pass in a JSON string, convert it to a JSON object, and extract an Int.)
The companion object will supply a range of functions that we can pass to the case class. The one shown here simply takes the input as a string and returns it as a string (the simplest possible example).
object CustomAttributeInfo {
  val simpleString = (s: String) => s
}
I create the case class as follows:
CustomAttributeInfo(MyAttribute(var1, var2), name, CustomAttributeInfo.simpleString)
Later, I call the function 'valueParser'
customAttributeInfo.valueParser(k)
Compilation error
Error:(366, 69) type mismatch;
 found   : k.type (with underlying type String)
 required: _$13
    case Some(info) => Some((info.attribute, info.valueParser(k)))
I am not a generics expert (obviously). I have done some reading, but I have not seen a discussion of a case like this. Any advice and explanation would be most welcome.
You haven't provided enough information to answer your question.
The following code compiles.
If you still get a compile error, provide an MCVE.
case class MyAttribute[_](var1: Any, var2: Any)

case class CustomAttributeInfo[T, Y](attribute: MyAttribute[_], fieldName: String, valueParser: T => Y) {}

object CustomAttributeInfo {
  val simpleString = (s: String) => s
}

val var1: Any = ???
val var2: Any = ???
val name: String = ???
val customAttributeInfo = CustomAttributeInfo(MyAttribute(var1, var2), name, CustomAttributeInfo.simpleString)

val k = "abc"
customAttributeInfo.valueParser(k)
@Dmytro was right that a simple example compiled. In my actual codebase, however, we needed to be specific about the type.
This worked:
object CustomAttributeInfo {
  type T = Any
  val simpleString = (s: T) => s.toString
}
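As a quick sanity check (a hypothetical snippet reusing var1, var2, name and k from the example above): with type T = Any the parser is an Any => String, so passing a String key compiles.

// simpleString is now Any => String, so the inferred type is CustomAttributeInfo[Any, String]
val info = CustomAttributeInfo(MyAttribute(var1, var2), name, CustomAttributeInfo.simpleString)
info.valueParser(k) // "abc"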
I'm trying to implement something like a clever parameter converter function in Scala.
Basically, in my program I need to read parameters from a properties file, so obviously they are all strings, and I would then like to convert each parameter to a specific type that I pass as a parameter.
This is the implementation I started coding:
def getParam[T](key: String, value: String, paramClass: T): Any = {
  value match {
    paramClass match {
      case i if i == Int => value.trim.toInt
      case b if b == Boolean => value.trim.toBoolean
      case _ => value.trim
    }
  }
  /* Exception handling is missing at the moment */
}
Usage:
val convertedInt = getParam("some.int.property.key", "10", Int)
val convertedBoolean = getParam("some.boolean.property.key", "true", Boolean)
val plainString = getParam("some.string.property.key", "value", String)
Points to note:
For my program I need just three main types for now: String, Int and Boolean;
if possible I would like to extend this to more object types.
This is not clever, because I need to write an explicit match against every possible type to convert; I would like a more reflection-like approach.
This code doesn't work; it gives me the compile error "object java.lang.String is not a value" when I try to convert (actually no conversion happens, because the property values come in as Strings).
Can anyone help me? I'm quite a newbie in Scala and maybe I'm missing something.
The Scala approach to the problem you are trying to solve is context bounds. Given a type T you can require an object like ParamMeta[T], which will do all conversions for you. So you can rewrite your code to something like this:
trait ParamMeta[T] {
  def apply(v: String): T
}

def getParam[T](key: String, value: String)(implicit meta: ParamMeta[T]): T =
  meta(value.trim)

implicit case object IntMeta extends ParamMeta[Int] {
  def apply(v: String): Int = v.toInt
}
// and so on
getParam[Int](/* ... */, "127") // = 127
There is no need to throw exceptions at all! If you supply an unsupported type as getParam's type argument, the code will not even compile. You can rewrite the signature of getParam using the syntax sugar for context bounds, T: Bound, which requires an implicit value Bound[T]; you then use implicitly[Bound[T]] to access that value (because there is no parameter name for it).
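For example, the context-bound form of the same getParam could look like this (same behavior, just sugar):

// T: ParamMeta desugars to an extra implicit parameter list (implicit ev: ParamMeta[T])
def getParam[T: ParamMeta](key: String, value: String): T =
  implicitly[ParamMeta[T]].apply(value.trim)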
Also, this code does not use reflection at all: the compiler searches for an implicit value of type ParamMeta[Int], finds it in object IntMeta, and rewrites the function call to getParam[Int](..., "127")(IntMeta), so it gets all required values at compile time.
If you feel that writing those case objects is too much boilerplate, and you are sure that you will not need another method on these objects in the future (for example, to convert T back to a String), you can simplify the declarations like this:
case class ParamMeta[T](f: String => T) {
  def apply(s: String): T = f(s)
}

implicit val stringMeta = ParamMeta(identity)
implicit val intMeta = ParamMeta(_.toInt)
To avoid importing them every time you use getParam you can declare these implicits in the companion object of the ParamMeta trait/case class, and Scala will pick them up automatically.
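A sketch of that companion-object arrangement, reusing getParam from above (the Boolean instance is added only to cover the three types from the question):

case class ParamMeta[T](f: String => T) {
  def apply(s: String): T = f(s)
}

object ParamMeta {
  // found during implicit search, no import needed at the call site
  implicit val stringMeta: ParamMeta[String]   = ParamMeta(identity)
  implicit val intMeta: ParamMeta[Int]         = ParamMeta(_.toInt)
  implicit val booleanMeta: ParamMeta[Boolean] = ParamMeta(_.toBoolean)
}

getParam[Boolean]("some.boolean.property.key", "true") // true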
As for the original match approach, you can pass an implicit ClassTag[T] to your function, so you will be able to match on classes. You do not need to create any values for the ClassTag, as the compiler will pass it automatically. Here is a simple example of how to do class matching:
import scala.reflect.ClassTag
import scala.reflect._
def test[T: ClassTag] = classTag[T].runtimeClass match {
  case x if x == classOf[Int]    => "I'm an int!"
  case x if x == classOf[String] => "I'm a string!"
}
println(test[Int])
println(test[String])
However, this approach is less flexible than the ParamMeta one, and ParamMeta should be preferred.
How do I define a generic parameter "on the fly"?
Example:
I have some method def get[T](name: String)
Simple case class
case class User(name: String, password: String, age: Option[Int])
Then I get all my case accessors:
import scala.reflect.runtime.universe._

def getMethods[T: TypeTag] = typeOf[T].decls.sorted.collect {
  case m: MethodSymbol if m.isCaseAccessor => m
}.toList
val caseAccessors = getMethods[User]
And I need to call the get method with every accessor method, parameterizing it by the accessor method's return type.
For example:
caseAccessors.map { accessorMethod => get[accessorMethod.returnType](accessorMethod.name) }
Is there any way to do it?
As was said in the comments to your question, it's not possible to extract a compile-time type from the runtime reflection of an object, so you lose all type checks (and will possibly have to do an asInstanceOf somewhere).
But, depending on your needs, you could choose a typesafe alternative, like Shapeless records, which also let you access fields by name, but in a typesafe way. So there would be just an HList of type String :: String :: Option[Int] (which keeps all the field types) instead of a List[MethodSymbol].
You can also easily convert a record into the case class: User(userRecord("name"), userRecord("password"), userRecord("age")). But all of this requires that the possible field names ("name", "password", etc.) be known at compile time.
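For reference, a small sketch of what such a record could look like with shapeless 2.x on Scala 2 (the field values are made up):

import shapeless._
import syntax.singleton._
import record._

// the field names and value types are all tracked in the record's type
val userRecord =
  ("name"     ->> "john") ::
  ("password" ->> "secret") ::
  ("age"      ->> Option(42)) ::
  HNil

userRecord("name") // "john", statically typed as String
User(userRecord("name"), userRecord("password"), userRecord("age"))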