How do I get an object's type and pass it along to asInstanceOf in Scala?

I have a Scala class that reads formatting information from a JSON template file, and data from a different file. The goal is to format the data as a JSON object as specified by the template file. I have the layout working, but now I want to set the type of each output value to the type in my template (i.e. if a field's value is a String in the template, it should be a string in the output, even if it's an integer in the raw data).
Basically, I'm looking for a quick and easy way of doing something like:
output = dataValue.asInstanceOf[templateValue.getClass]
That line gives me an error: type getClass is not a member of Any. But I haven't been able to find any other member or method that gives me a variable's type at runtime. Is this possible, and if so, how?
Clarification
I should add, by this point in my code, I know I'm dealing with just a key/value pair. What I'd like is the value's type.
Specifically, given the JSON template below, I want the name to be cast to a String, age to be cast to an integer, and salary to be cast to a decimal on output, regardless of how they appear in the raw data file (it could be all strings, age and salary could both be ints, etc.). What I was hoping for is a simple cast that doesn't require me to pattern match on each data type specifically.
Example template:
people: [{
  name: "value",
  age: 0,
  salary: 0.00
}]

Type parameters must be known at compile time (as type symbols), and templateValue.getClass is just a plain value (of type Class), so it cannot be used as a type parameter.
What to do instead depends on your goal, which isn't yet clear to me... but it may look like
output = someMethod(dataValue, templateValue.getClass),
and inside that method you may do different computations depending on the second argument, of type Class.
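For instance, a minimal sketch of such a method (the name someMethod and the set of supported target types are assumptions for illustration):
// Hypothetical sketch: dispatch on the runtime Class of the template value.
def someMethod(dataValue: Any, templateClass: Class[_]): Any =
  templateClass match {
    case c if c == classOf[String]            => dataValue.toString
    case c if c == classOf[java.lang.Integer] => dataValue.toString.trim.toInt
    case c if c == classOf[java.lang.Double]  => dataValue.toString.trim.toDouble
    case _                                    => dataValue
  }

val templateValue: Any = 0 // e.g. the template's age field, boxed as java.lang.Integer
val output = someMethod("42", templateValue.getClass) // 42: Int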

How do I get an object's type and pass it along to asInstanceOf in Scala?
The method scala.reflect.api.JavaUniverse.typeOf[T] requires its type argument to be hard-coded by the caller or type-inferred. To type-infer, create a utility method like the following (it works for all types, even generics; it counteracts the JVM's runtime erasure of type arguments by augmenting T during compilation with type tag metadata):
// http://www.scala-lang.org/api/current/index.html#scala.reflect.runtime.package
import scala.reflect.runtime.universe._
def getType[T: TypeTag](a: T): Type = typeOf[T]
There are 3 requirements here:
the type argument has a TypeTag context bound (the previous implementation via Manifest is still available...)
one or more input arguments are typed T
the return type is Type (if you want the result to be usable outside the method)
You can invoke without specifying T (it's type-inferred):
import scala.reflect.runtime.universe._
def getType[T: TypeTag](a: T): Type = typeOf[T]
val ls = List[Int](1,2,3)
println(getType(ls)) // prints List[Int]
However, asInstanceOf will only cast the value to a (binary-compatible) type in the hierarchy, with no conversion of data or format, i.e. the data must already be in the correct binary format - so that won't solve your problem.
Data Conversion
A few methods convert between Integers and Strings:
// defined in scala.Any:
123.toString // gives "123"
// implicitly defined for Int via scala.runtime.RichInt:
123.toHexString // gives "7b"
123.toOctalString // gives "173"
// implicitly defined for java.lang.String via scala.collection.immutable.StringOps:
"%d".format(123) // also gives "123"
"%5d".format(123) // gives "  123"
"%05d".format(123) // gives "00123"
"%01.2f".format(123.456789) // gives "123.46"
"%01.2f".format(0.456789) // gives "0.46"
// implicitly defined for java.lang.String via scala.collection.immutable.StringOps:
" 123".toInt // gives 123
"00123".toInt // gives 123
"00123.4600".toDouble // gives 123.46
".46".toDouble // gives 0.46
Parsing directly from file to target type (no cast or convert):
Unfortunately, Scala doesn't have a method to read the next token in a stream as an integer/float/short/boolean/etc. But if your data is in a binary format, you can obtain a java.io.FileInputStream, wrap it in a DataInputStream and then call readInt, readFloat, readShort, readBoolean, etc.
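A minimal sketch of that approach (the file name data.bin and its binary layout are assumptions for illustration):
import java.io.{DataInputStream, DataOutputStream, FileInputStream, FileOutputStream}

// Write a few binary values so the read below has something to parse.
val out = new DataOutputStream(new FileOutputStream("data.bin"))
out.writeInt(42); out.writeDouble(123.46); out.writeBoolean(true)
out.close()

// Read them back directly as their target types: no cast, no conversion.
val in = new DataInputStream(new FileInputStream("data.bin"))
val i = in.readInt()     // 42
val d = in.readDouble()  // 123.46
val b = in.readBoolean() // true
in.close()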

In a type-level context, value-level terms still have a few accessors. The first one, and the one you asked for, is the singleton type of the value itself (.type):
output = dataValue.asInstanceOf[templateValue.type]
If the type of the value has inner members, those become available as well:
class A {
  class B {}
}
val a: A = new A
val b: a.B = new a.B
Notice b: a.B.
I must also mention how to access such members without a value-level term:
val b: A#B = new a.B
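To see the difference (acceptsAnyB is a hypothetical helper, reusing A and B from above): a.B is tied to one specific outer instance, while the type projection A#B accepts a B from any outer A:
// A#B means "B from any instance of A", so no particular
// outer value needs to be in scope to mention the type.
def acceptsAnyB(x: A#B): Unit = println(x)

val a1 = new A
val a2 = new A
acceptsAnyB(new a1.B) // fine
acceptsAnyB(new a2.B) // also fine: both a1.B and a2.B conform to A#B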

Related

Implicit object works inline but not when it is imported

I am using avro4s to help with avro serialization and deserialization.
I have a case class that includes Timestamps and need those Timestamps to be converted to nicely formatted strings before I publish the records to Kafka; the default encoder is converting my Timestamps to Longs. I read that I needed to write a decoder and encoder (from the avro4s readme).
Here is my case class:
case class MembershipRecordEvent(id: String,
                                 userHandle: String,
                                 planId: String,
                                 teamId: Option[String] = None,
                                 note: Option[String] = None,
                                 startDate: Timestamp,
                                 endDate: Option[Timestamp] = None,
                                 eventName: Option[String] = None,
                                 eventDate: Timestamp)
I have written the following encoder:
Test.scala
def test() = {
  implicit object MembershipRecordEventEncoder extends Encoder[MembershipRecordEvent] {
    override def encode(t: MembershipRecordEvent, schema: Schema) = {
      val dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss")
      val record = new GenericData.Record(schema)
      record.put("id", t.id)
      record.put("userHandle", t.userHandle)
      record.put("teamId", t.teamId.orNull)
      record.put("note", t.note.orNull)
      record.put("startDate", dateFormat.format(t.startDate))
      record.put("endDate", if (t.endDate.isDefined) dateFormat.format(t.endDate.get) else null)
      record.put("eventName", t.eventName.orNull)
      record.put("eventDate", dateFormat.format(t.eventDate))
      record
    }
  }
  val recordInAvro2 = Encoder[MembershipRecordEvent].encode(testRecord, AvroSchema[MembershipRecordEvent]).asInstanceOf[GenericRecord]
  println(recordInAvro2)
}
If I declare my implicit object inline, like I did above, it creates the GenericRecord that I am looking for just fine. I tried to abstract the implicit object to a separate file, wrapped in an object, and I import Implicits._ to use my custom encoder.
Implicits.scala
object Implicits {
  implicit object MembershipRecordEventEncoder extends Encoder[MembershipRecordEvent] {
    override def encode(t: MembershipRecordEvent, schema: Schema) = {
      val dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss")
      val record = new GenericData.Record(schema)
      record.put("id", t.id)
      record.put("userHandle", t.userHandle)
      record.put("teamId", t.teamId.orNull)
      record.put("note", t.note.orNull)
      record.put("startDate", dateFormat.format(t.startDate))
      record.put("endDate", if (t.endDate.isDefined) dateFormat.format(t.endDate.get) else null)
      record.put("eventName", t.eventName.orNull)
      record.put("eventDate", dateFormat.format(t.eventDate))
      record
    }
  }
}
Test.scala
import Implicits._
val recordInAvro2 = Encoder[MembershipRecordEvent].encode(testRecord, AvroSchema[MembershipRecordEvent]).asInstanceOf[GenericRecord]
println(recordInAvro2)
It fails to use my encoder (doesn't hit my breakpoints). I have tried a myriad of things to try and see why it fails to no avail.
How can I correctly import an implicit object?
Is there a simpler solution to encode my case class's Timestamps to Strings without writing an encoder for the entire case class?
TL;DR
As suggested in one of the comments above, you can place it in the companion object.
The longer version:
Probably you have another encoder that is used instead of the encoder you defined in Implicits.
I'll quote some phrases from WHERE DOES SCALA LOOK FOR IMPLICITS?
When a value of a certain name is required, lexical scope is searched for a value with that name. Similarly, when an implicit value of a certain type is required, lexical scope is searched for a value with that type.
Any such value which can be referenced with its “simple” name, without selecting from another value using dotted syntax, is an eligible implicit value.
There may be more than one such value because they have different names.
In that case, overload resolution is used to pick one of them. The algorithm for overload resolution is the same used to choose the reference for a given name, when more than one term in scope has that name. For example, println is overloaded, and each overload takes a different parameter type. An invocation of println requires selecting the correct overloaded method.
In implicit search, overload resolution chooses a value among more than one that have the same required type. Usually this entails selecting a narrower type or a value defined in a subclass relative to other eligible values.
The rule that the value must be accessible using its simple name means that the normal rules for name binding apply.
In summary, a definition for x shadows a definition in an enclosing scope. But a binding for x can also be introduced by local imports. Imported symbols can’t override definitions of the same name in an enclosing scope. Similarly, wildcard imports can’t override an import of a specific name, and names in the current package that are visible from other source files can’t override imports or local definitions.
These are the normal rules for deciding what x means in a given context, and also determine which value x is accessible by its simple name and is eligible as an implicit.
This means that an implicit in scope can be disabled by shadowing it with a term of the same name.
Now I'll state the companion object logic:
Implicit syntax can avoid the import tax, which of course is a “sin tax,” by leveraging “implicit scope”, which depends on the type of the implicit instead of imports in lexical scope.
When an implicit of type T is required, implicit scope includes the companion object of T:
When an F[T] is required, implicit scope includes both the companion of F and the companion of the type argument, e.g., object C for F[C].
In addition, implicit scope includes the companions of the base classes of F and C, including package objects, such as p for p.F.
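Put together, a minimal sketch of the companion-object placement (reusing the encoder body from the question, trimmed to two fields; the imports are assumptions based on typical avro4s usage):
import java.sql.Timestamp
import java.text.SimpleDateFormat
import com.sksamuel.avro4s.Encoder
import org.apache.avro.Schema
import org.apache.avro.generic.GenericData

case class MembershipRecordEvent(id: String, startDate: Timestamp /* ... other fields ... */)

object MembershipRecordEvent {
  // Lives in the companion object, so implicit search finds it through
  // implicit scope: no import is needed at the call site, and it cannot
  // be shadowed the way a lexically imported encoder can.
  implicit object MembershipRecordEventEncoder extends Encoder[MembershipRecordEvent] {
    override def encode(t: MembershipRecordEvent, schema: Schema) = {
      val dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss")
      val record = new GenericData.Record(schema)
      record.put("id", t.id)
      record.put("startDate", dateFormat.format(t.startDate))
      // ... remaining fields as in the question ...
      record
    }
  }
}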

Scala - calling a method with generic type parameter given a string value that determines the correct type

I am designing an API interface in a 2-tier architecture. It takes a string parameter fieldName from URL and returns a JSON string. fieldName refers to a field in my database table. You can think of its signature as:
def controller(fieldName: String): String
In the controller, I would like to call a method in my data access layer to perform the following query:
SELECT fieldName, SUM(salary) FROM Employee GROUP BY fieldName
Because the type of the field varies, the type of the query result will be different. This method is parametrized by a generic type parameter T which corresponds to the type of the field with name fieldName.
def getTotalSalaryByField[T](fieldName: String): Map[T, Long]
if fieldName is "age", T should be Int.
if fieldName is "name", T should be String.
and so on.
Given a particular fieldName at runtime, how do I call this method with the correct type?
I don't want to write a lot of if-else or pattern matching statements to select the type. It would look like this:
fieldName match {
  case "age" => serializeToJson(getTotalSalaryByField[Int]("age"))
  case "name" => serializeToJson(getTotalSalaryByField[String]("name"))
  ...
  // 100 more for 100 more fields
}
This is ugly. If I were to write this in Python, it would take only one line:
json.dumps(getTotalSalaryByField(fieldName))
Is Scala somehow not suitable for REST backend programming? Because this seems to be a common pattern people will encounter, and static typing gets in the way. I would like to see some suggestions as to how I should approach the whole problem in a Scala-ish way, even if it means remodeling and rewriting the DAL and controllers.
EDIT:
#drexin, the actual signature of the DAL method is
def myDAOMethod[T](tf: Option[TimeFilter], cf: CrossFilter)
                  (implicit attr: XFAttribute[T]): Map[T, Long]
T is the type of fieldName. Long is the type of y. As you can see, I need to select a type to be T based on fieldName from url parameter.
EDIT:
added some code and made the usecase clear
There is a difference between knowing the type of a variable at compile time and at runtime.
If the fieldName is not known at compile time (i.e. it's a parameter), and if the type of the column varies by fieldName, then you are not going to be able to specify the return type of the method at compile time.
You will need to use a DAO method that returns AnyRef, rather than one which returns T for a compile-time-specified type T.
Old answer:
The database access library can return the values without needing to know their type, and your code needs to do the same.
You are looking to use a DAO method which takes a type param T:
def myDAOMethod[T](tf: Option[TimeFilter], cf: CrossFilter)
                  (implicit attr: XFAttribute[T]): Map[T, Long]
... but as you state, you don't know the type T in advance, so this method is inapplicable. (T is used to convert the database column data into a String or an Int).
Your DAO should offer an untyped version of this method, something more like:
def doSelect(tf: Option[TimeFilter], cf: CrossFilter): Map[AnyRef, Long]
What database access library are you using?
If you are using Play framework then I would recommend you to use Play JSON APIs.
https://www.playframework.com/documentation/2.1.1/ScalaJson
Instead of
def getTotalSalaryByField[T](fieldName: String): Map[T, Long] = ???
you can write
def getTotalSalaryByField(fieldName: String): Map[JsValue, Long] = ???
since all types like JsNumber, JsString, JsNull inherit from the generic JSON trait, JsValue.
Wrap your queryResult with Json.toJson().
OR
def getTotalSalaryByField(fieldName: String): JsObject = ???
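For example, a rough sketch of the JsValue approach (rawQuery is a hypothetical stand-in for the untyped GROUP BY query result):
import play.api.libs.json._

def rawQuery(fieldName: String): Map[Any, Long] = ???

// Every key becomes a JsValue, so one signature covers every field type.
def getTotalSalaryByField(fieldName: String): Map[JsValue, Long] =
  rawQuery(fieldName).map {
    case (k: Int, total)    => Json.toJson(k) -> total
    case (k: String, total) => Json.toJson(k) -> total
    case (k, total)         => JsString(k.toString) -> total
  }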
I think you have three options here:
use pattern matching (which is something you wanted to avoid)
use AnyRef and then try to guess proper type at runtime; maybe serializeToJson can do it itself?
use Type Providers: http://docs.scala-lang.org/overviews/macros/typeproviders.html
The third option is probably what you are looking for, but it is based on experimental Scala features (i.e. macros). What I imagine it would do is connect to your database during the compilation phase and inspect the schema. If you are generating the schema using some separate SQL file(s) then it should be even easier. The macro will then generate all the boilerplate with the proper types hard-coded.
Probably you won't find a working example for exactly what you need, but there is one for RDF vocabularies which you can use as inspiration: https://github.com/travisbrown/type-provider-examples
I'm not 100% sure I understand your question, so apologies if I'm answering something else entirely.
You need Scala to be able to guess the type of your field based on the expected return type. That is, in the following code:
val result: Map[String, Long] = myDAOMethod(tf, cf)
You expect Scala to correctly infer that, since you want a Map[String, Long], the type parameter T is String.
It seems to me that you already have everything you need for that. I do not know what XFAttribute[T] is, but I suspect it allows you to transform entries in a result set to instances of type T.
What's more, you've already declared it as an implicit parameter.
Provided you have an implicit XFAttribute[String] in scope, the previous code should compile, run, and be type-safe.
As a small improvement, I'd change the signature of your method to use context bounds, but that's primarily a matter of taste:
// Declare the implicit "parsers"
implicit val IntAttribute: XFAttribute[Int] = ???
implicit val StringAttribute: XFAttribute[String] = ???
// Empty implementation, fill it with your actual code.
def myDAOMethod[T: XFAttribute](tf: Option[TimeFilter], cf: CrossFilter): Map[T, Long] = ???
// Scala will look for an implicit XFAttribute[Int] in scope, and find IntAttribute.
val ages: Map[Int, Long] = myDAOMethod(tf, cf)
// Scala will look for an implicit XFAttribute[String] in scope, and find StringAttribute.
val names: Map[String, Long] = myDAOMethod(tf, cf)
I'm not sure whether your question implies that you'd also like to strongly tie the String "age" to the type Int. That's also possible, of course, but it's another answer entirely and I don't want to pollute this question with unnecessary rambling.
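For the record, a rough sketch of what that tie-in could look like (Field, Age, Name and totalSalaryBy are hypothetical; XFAttribute, TimeFilter and CrossFilter are taken from the question):
// Each field constant carries its key type as a type parameter.
sealed abstract class Field[T](val name: String)
case object Age  extends Field[Int]("age")
case object Name extends Field[String]("name")

// The field value now picks T, so "age" can never be paired with String.
def totalSalaryBy[T: XFAttribute](field: Field[T],
                                  tf: Option[TimeFilter],
                                  cf: CrossFilter): Map[T, Long] =
  myDAOMethod[T](tf, cf)

val ages: Map[Int, Long] = totalSalaryBy(Age, tf, cf)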

avoid type conversion in Scala

I have this weird requirement where data comes in as name -> value pairs from a service, and the value is always typed as a string (which it really isn't, but that's how the data is stored).
This is a simplified illustration.
case class EntityObject(`type`: String, value: String) // `type` is a reserved word, hence the backticks
EntityObject("boolean","true")
Now, when getting that EntityObject, if type is "boolean" then I have to make sure the value is nothing else but a boolean: first get the type out, then check the value and cast the value to that type. E.g. in this case, check that the value is a boolean, so I have to cast the string value to a boolean to validate it. If it were anything else besides a boolean then it should fail.
E.g. if the data came in as below, the cast will fail and it should report this error back to the caller.
EntityObject("boolean","1")
This weird requirement forces type conversion in the validation code, which doesn't look elegant and goes against type-safe programming. Is there an elegant way to handle this in Scala (maybe in a more type-safe manner)?
Here is where I'm going to channel an idea taken from a tweet by Miles Sabin in regard to heterogeneous mappings (see this gist on GitHub). If you know the mapping of names to types ahead of time, you can use a nifty little trick which involves dependent types. Hold on, 'cause it's a wild ride:
import scala.util.Try

trait AssocConv[K] { type V; def convert: String => V }

def makeConv[V0](name: String, con: String => V0) = new AssocConv[name.type] {
  type V = V0
  val convert = con
}

implicit val boolConv = makeConv("boolean", yourMappingFunc)

def convEntity(name: String, value: String)(implicit conv: AssocConv[name.type]): Try[conv.V] =
  Try { conv.convert(value) }
I haven't tested this but it "should" work. I've also enclosed it in a Scala Try so that it catches exceptions thrown by your conversion function (in case you're doing things like _.toInt as the converter.)
You're really talking about conversion, not casting. Casting would be if the value really were an instance of Boolean at runtime, whereas what you have is a String representation of a Boolean.
If you're already working with a case class, I think a pattern matching expression would work pretty well here.
For example,
def convert(entity: EntityObject): Any = entity match {
  case EntityObject("boolean", "true") => true
  case EntityObject("boolean", "false") => false
  case EntityObject("string", s) => s
  // TODO: add Regex-based matchers for numeric types
}
Anything that doesn't match one of the specified patterns would cause a MatchError, or you could put a catchall expression at the end to throw your own exception.
In this particular example, since the function returns Any, the calling code would need to do an actual type cast to get the specific type, but at least by that point all validation/conversion would have already been performed. Alternatively, you could just put the code that uses the values directly into the above function and avoid casting. I don't know what your specific needs are, so I can't offer anything more detailed.
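One possible way to fill in that TODO with regex-based matchers and a catch-all that reports bad input (the "int" and "decimal" type tags are assumptions for illustration):
val IntPattern     = """-?\d+""".r
val DecimalPattern = """-?\d+(\.\d+)?""".r

def convert(entity: EntityObject): Any = entity match {
  case EntityObject("boolean", "true")  => true
  case EntityObject("boolean", "false") => false
  case EntityObject("int", s @ IntPattern())          => s.toInt
  case EntityObject("decimal", s @ DecimalPattern(_)) => s.toDouble
  case EntityObject("string", s)                      => s
  case other => throw new IllegalArgumentException(s"cannot convert $other")
}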

Scala priority of method call on implicit object

Let's say I have the following scala code:
case class Term(c: Char) {
  def unary_+ = Plus(this)
}
case class Plus(t: Term)
object Term {
  implicit def fromChar(c: Char) = Term(c)
}
Now I get this from the scala console:
scala> val p = +'a'
p: Int = 97
scala> val q:Plus = +'a'
<console>:16: error: type mismatch;
found : Int
required: Plus
val q:Plus = +'a'
^
Because '+' is already present on the Char type, the implicit conversion does not take place, I think. Is there a way to override the default behaviour and apply '+' on the converted Term before applying on the Char type?
(BTW, the example is artificial and I'm not looking for alternative designs. The example is just here to illustrate the problem)
No, there is no way to override the default + operator, not even with an implicit conversion. When it encounters an operator (actually a method, as operators are just plain methods) that is not defined on the receiving object, the compiler will look for an implicit conversion to an object that does provide this operator. But if the operator is already defined on the target object, it will never look for any conversion; the original operator will always be called.
You should thus define a separate operator whose name will not conflict with any preexisting operator.
UPDATE:
The precise rules that govern implicit conversions are defined in the Scala Language Specification:
Views are applied in three situations.
If an expression e is of type T, and T does not conform to the expression's expected type pt. In this case an implicit v is searched which is applicable to e and whose result type conforms to pt. The search proceeds as in the case of implicit parameters, where the implicit scope is the one of T => pt. If such a view is found, the expression e is converted to v(e).
In a selection e.m with e of type T, if the selector m does not denote a member of T. In this case, a view v is searched which is applicable to e and whose result contains a member named m. The search proceeds as in the case of implicit parameters, where the implicit scope is the one of T. If such a view is found, the selection e.m is converted to v(e).m.
In a selection e.m(args) with e of type T, if the selector m denotes some member(s) of T, but none of these members is applicable to the arguments args. In this case a view v is searched which is applicable to e and whose result contains a method m which is applicable to args. The search proceeds as in the case of implicit parameters, where the implicit scope is the one of T. If such a view is found, the selection e.m(args) is converted to v(e).m(args).
In other words, an implicit conversion occurs in 3 situations:
when an expression is of type T but is used in a context where the unrelated type T' is expected, an implicit conversion from T to T' (if any such conversion is in scope) is applied.
when trying to access an object's member that does not exist on said object, an implicit conversion from the object into another object that does have this member (if any such conversion is in scope) is applied.
when trying to call a method on an object with a parameter list that does not match any of the corresponding overloads, the compiler applies an implicit conversion from the object into another object that does have a method of this name with a compatible parameter list (if any such conversion is in scope).
Note for completeness that this actually applies to more than just methods (inner objects/vals with an apply method are eligible too). Note also that this is the case that Randall Schulz was talking about in his comment below.
So in your case, points (2) and (3) are relevant. Given that you want to define a method named unary_+, which already exists for type Char, case (2) won't kick in. And given that your version has the same parameter list as the built-in Char.unary_+ method (they are both parameterless), point (3) won't kick in either. So you definitely cannot define an implicit that will redefine unary_+.
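To illustrate the suggested workaround (reusing Term, Plus and the implicit fromChar from the question; the name plus is an arbitrary non-clashing choice): a plain function's argument position does trigger the implicit conversion, so this compiles:
// No name clash with Char here, so the Char => Term conversion
// is applied to the argument as usual.
def plus(t: Term): Plus = Plus(t)

val q: Plus = plus('a') // Plus(Term(a)), via the implicit Char => Term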

Scala - mapping a list of integers to a method that receives java.lang.Object

Working in Scala-IDE, I have a Java library, in which one of the methods receives java.lang.Object. And I want to map a list of Int values to it. The only solution that works is:
val listOfInts = groupOfObjects.map(_.getNeededInt)
for(int <- listOfInts) libraryObject.libraryMethod(int)
while the following one:
groupOfObjects.map(_.getNeededInt).map(libraryObject.libraryMethod(_))
and even
val listOfInts = groupOfObjects.map(_.getNeededInt)
val result = listOfInts.map(libraryObject.libraryMethod(_))
say
type mismatch;
found   : Int
required: java.lang.Object
Note: an implicit exists from scala.Int => java.lang.Integer, but methods inherited from Object are rendered ambiguous. This is to avoid a blanket implicit which would convert any scala.Int to any AnyRef. You may wish to use a type ascription: x: java.lang.Integer.
and something like
val result = listOfInts.map(libraryObject.libraryMethod(x => x.toInt))
or
val result = listOfInts.map(libraryObject.libraryMethod(_.toInt))
does not work either.
1) Why is it happening? As far as I know, the for and map routines do not differ that much!
2) Also: what does You may wish to use a type ascription: x: java.lang.Integer mean? How would I do that? I tried designating the type explicitly, like x: Int => x.toInt, but that was also erroneous. So what is a "type ascription"?
UPDATE:
The solution proposed by T.Grottker adds to it. The error that I am getting with it is this:
missing parameter type for expanded function ((x$3) => x$3.asInstanceOf[java.lang.Object])
missing parameter type for expanded function ((x$3) => x$3.asInstanceOf<null>[java.lang.Object]<null>) <null>
and I'm like, OMG, it just grows! Who can explain what all these <null> things mean here? I just want to know the truth.
The type mismatch tells you exactly the problem: you can convert to java.lang.Integer but not to java.lang.Object. So tell it you want to ask for an Integer somewhere along the way. For example:
groupOfObjects.map(_.getNeededInt: java.lang.Integer).map(libraryObject.libraryMethod(_))
(The notation value: Type, when used outside of the declaration of a val, var, or method parameter, means to view value as that type, if possible; value either needs to be a subclass of Type, or there needs to be an implicit conversion that can convert value into something of the appropriate type.)
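A self-contained illustration of the fix (libraryMethod here is a hypothetical stand-in for the Java API):
// Stand-in for the Java method that takes a java.lang.Object.
def libraryMethod(o: Object): String = o.toString

val listOfInts = List(1, 2, 3)
// listOfInts.map(libraryMethod(_))  // does not compile: Int is not an Object
val result = listOfInts.map(i => libraryMethod(i: java.lang.Integer))
// result: List[String] = List("1", "2", "3")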