Scala - calling a method with a generic type parameter given a string value that determines the correct type

I am designing an API interface in a 2-tier architecture. It takes a string parameter fieldName from the URL and returns a JSON string. fieldName refers to a field in my database table. You can think of its signature as:
def controller(fieldName: String): String
In the controller, I would like to call a method in my data access layer to perform the following query:
SELECT fieldName, SUM(salary) FROM Employee GROUP BY fieldName
Because the type of the field varies, the type of the query result will differ as well. This method is parameterized by a generic type parameter T, which corresponds to the type of the field named fieldName.
def getTotalSalaryByField[T](fieldName: String): Map[T, Long]
If fieldName is "age", T should be Int.
If fieldName is "name", T should be String.
And so on.
Given a particular fieldName at runtime, how do I call this method with the correct type?
I don't want to write a lot of if-else or pattern matching statements to select the type. It would look like this:
fieldName match {
  case "age"  => serializeToJson(getTotalSalaryByField[Int]("age"))
  case "name" => serializeToJson(getTotalSalaryByField[String]("name"))
  ...
  // 100 more for 100 more fields
}
This is ugly. If I were to write this in Python, it would take only one line:
json.dumps(getTotalSalaryByField(fieldName))
Is Scala somehow not suitable for REST backend programming? Because this seems to be a common pattern people will encounter, and static typing gets in the way. I would like to see some suggestions on how I should approach the whole problem in a Scala-ish way, even if it means remodeling and rewriting the DAL and controllers.
EDIT:
@drexin, the actual signature of the DAL method is
def myDAOMethod[T](tf: Option[TimeFilter], cf: CrossFilter)
(implicit attr: XFAttribute[T]): Map[T, Long]
T is the type of the field fieldName. Long is the type of y. As you can see, I need to select the type to use as T based on the fieldName URL parameter.
EDIT:
Added some code and made the use case clear.

There is a difference between knowing the type of a variable at compile time and at runtime.
If the fieldName is not known at compile time (i.e. it's a parameter), and if the type of the column varies by fieldName, then you are not going to be able to specify the return type of the method at compile time.
You will need to use a DAO method that returns AnyRef, rather than one which returns T for a compile-time-specified type T.
Old answer:
The database access library can return the values without needing to know their type, and your code needs to do the same.
You are looking to use a DAO method which takes a type param T:
def myDAOMethod[T](tf: Option[TimeFilter], cf: CrossFilter)
(implicit attr: XFAttribute[T]): Map[T, Long]
... but as you state, you don't know the type T in advance, so this method is inapplicable. (T is used to convert the database column data into a String or an Int).
Your DAO should offer an untyped version of this method, something more like:
def doSelect(tf: Option[TimeFilter], cf: CrossFilter): Map[AnyRef, Long]
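For instance, the controller could then stay completely untyped. A minimal sketch, assuming the serializeToJson helper from the question; dao, tf, and cf are stand-ins for however the controller obtains the DAL and its filters:
def controller(fieldName: String): String = {
  val result: Map[AnyRef, Long] = dao.doSelect(tf, cf)
  serializeToJson(result) // serialization needs only the runtime values, not T
}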
What database access library are you using?

If you are using the Play framework, then I would recommend using the Play JSON APIs.
https://www.playframework.com/documentation/2.1.1/ScalaJson
Instead of
def getTotalSalaryByField[T](fieldName: String): Map[T, Long] = ???
you can write
def getTotalSalaryByField(fieldName: String): Map[JsValue, Long] = ???
since all types like JsNumber, JsString, and JsNull inherit from the generic JSON trait, JsValue.
Wrap your query result with Json.toJson().
OR
def getTotalSalaryByField(fieldName: String): JsObject = ???
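A minimal sketch of what the JsValue-keyed version could look like; runGroupByQuery and the handled key types are illustrative assumptions, not part of the original answer:
import play.api.libs.json._

// Hypothetical DAL call returning the raw (key, sum) rows.
def runGroupByQuery(fieldName: String): Seq[(Any, Long)] = ???

def getTotalSalaryByField(fieldName: String): Map[JsValue, Long] =
  runGroupByQuery(fieldName).map {
    case (k: Int, total)    => (JsNumber(BigDecimal(k)): JsValue) -> total
    case (k: String, total) => (JsString(k): JsValue) -> total
    case (_, total)         => (JsNull: JsValue) -> total
  }.toMap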

I think you have three options here:
use pattern matching (which is something you wanted to avoid)
use AnyRef and then try to guess the proper type at runtime; maybe serializeToJson can do it itself?
use Type Providers: http://docs.scala-lang.org/overviews/macros/typeproviders.html
The third option is probably what you are looking for, but it is based on experimental Scala features (i.e. macros). What I imagine it would do is connect to your database during the compilation phase and inspect the schema. If you are generating the schema using some separate SQL file(s), it should be even easier. The macro would then generate all the boilerplate with the proper types hard-coded.
You probably won't find a working example for exactly what you need, but there is one for RDF vocabularies which you can use as inspiration: https://github.com/travisbrown/type-provider-examples

I'm not 100% sure I understand your question, so apologies if I'm answering something else entirely.
You need Scala to be able to guess the type of your field based on the expected return type. That is, in the following code:
val result : Map[String, Long] = myDAOMethod(tf, cf)
You expect Scala to correctly infer that, since you want a Map[String, Long], the type parameter T must be String.
It seems to me that you already have everything you need for that. I do not know what XFAttribute[T] is, but I suspect it allows you to transform entries in a result set to instances of type T.
What's more, you've already declared it as an implicit parameter.
Provided you have an implicit XFAttribute[String] in scope, then, the previous code should compile, run, and be type-safe.
As a small improvement, I'd change the signature of your method to use context bounds, but that's primarily a matter of taste:
// Declare the implicit "parsers"
implicit val IntAttribute: XFAttribute[Int] = ???
implicit val StringAttribute: XFAttribute[String] = ???
// Empty implementation, fill it with your actual code.
def myDAOMethod[T: XFAttribute](tf: Option[TimeFilter], cf: CrossFilter): Map[T, Long] = ???
// Scala will look for an implicit XFAttribute[Int] in scope, and find IntAttribute.
val ages: Map[Int, Long] = myDAOMethod(tf, cf)
// Scala will look for an implicit XFAttribute[String] in scope, and find StringAttribute.
val names: Map[String, Long] = myDAOMethod(tf, cf)
I'm not sure whether your question implies that you'd also like to strongly tie the String "age" to the type Int. That's also possible, of course, but it's another answer entirely and I don't want to pollute this question with unnecessary rambling.
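For completeness, a minimal sketch of one possible shape of that tie; Field, Age, Name, and the registry are invented names, while XFAttribute, TimeFilter, CrossFilter, myDAOMethod, and serializeToJson come from the question:
sealed abstract class Field[T](val name: String)(implicit attr: XFAttribute[T]) {
  // Runs the query with the right type parameter and serializes the result.
  def run(tf: Option[TimeFilter], cf: CrossFilter): String =
    serializeToJson(myDAOMethod[T](tf, cf)(attr))
}
// Each case object picks up the implicit XFAttribute declared above.
case object Age  extends Field[Int]("age")
case object Name extends Field[String]("name")

val fields: Map[String, Field[_]] = Seq[Field[_]](Age, Name).map(f => f.name -> f).toMap

// The controller does a single lookup instead of a 100-case match.
def controller(fieldName: String, tf: Option[TimeFilter], cf: CrossFilter): String =
  fields(fieldName).run(tf, cf)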

Related

Doobie cannot find or construct a Read instance for type T

I'm using doobie to query some data and everything works fine, like this:
case class Usuario(var documento: String, var nombre: String, var contrasena: String)

def getUsuario(doc: String) =
  sql"""SELECT documento, nombre, contrasena FROM "Usuario" WHERE "documento" = $doc"""
    .query[Usuario]
    .option
    .transact(xa)
    .unsafeRunSync()
But if I declare a function with type restriction like this:
def getOption[T](f: Fragment): Option[T] = {
  f.query[T]
    .option
    .transact(xa)
    .unsafeRunSync()
}
I got these errors:
Error:(42, 12) Cannot find or construct a Read instance for type:
T
This can happen for a few reasons, but the most common case is that a data
member somewhere within this type doesn't have a Get instance in scope. Here are
some debugging hints:
- For Option types, ensure that a Read instance is in scope for the non-Option
version.
- For types you expect to map to a single column ensure that a Get instance is
in scope.
- For case classes, HLists, and shapeless records ensure that each element
has a Read instance in scope.
- Lather, rinse, repeat, recursively until you find the problematic bit.
You can check that an instance exists for Read in the REPL or in your code:
scala> Read[Foo]
and similarly with Get:
scala> Get[Foo]
And find the missing instance and construct it as needed. Refer to Chapter 12
of the book of doobie for more information.
f.query[T].option.transact(xa).unsafeRunSync()
Error:(42, 12) not enough arguments for method query: (implicit evidence$1: doobie.util.Read[T], implicit h: doobie.LogHandler)doobie.Query0[T].
Unspecified value parameter evidence$1.
f.query[T].option.transact(xa).unsafeRunSync()
Does anyone know how to do what I want? I think it's something to do with implicits, but I don't know how to fix it.
In order for doobie to be able to transform the result of an SQL query into your case class, it needs an instance of the Read typeclass in scope.
For example, for Usuario it needs an instance of Read[Usuario]. Fortunately, doobie is able to derive typeclasses for types built out of types it already knows, like String, so in most cases we don't need to create these explicitly.
In your case, you want to create a method getOption with a type parameter T, which means the compiler doesn't know which type's typeclass instance to look for.
You can fix it very easily by adding a context bound for Read to your type parameter (T: Read), or by adding an implicit parameter. This defers typeclass resolution to the call site, where the concrete type of T is already known at compile time.
So your fixed method will be:
def getOption[T: Read](f: Fragment): Option[T] = {
  f.query[T]
    .option
    .transact(xa)
    .unsafeRunSync()
}
or with an implicit parameter:
def getOption[T](f: Fragment)(implicit read: Read[T]): Option[T] = {
  f.query[T]
    .option
    .transact(xa)
    .unsafeRunSync()
}
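For example, at a hypothetical call site where T is fixed to Usuario, the compiler can now find the derived Read[Usuario] and supply it:
val usuario: Option[Usuario] =
  getOption[Usuario](sql"""SELECT documento, nombre, contrasena FROM "Usuario" WHERE "documento" = 'abc'""")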

Scala: Why use implicit on function argument?

I have the following function:
def getIntValue(x: Int)(implicit y: Int): Int = { x + y }
I see the above declaration everywhere. I understand what the above function is doing: it is a curried function that takes two argument lists. If you omit the second argument, the compiler looks for an implicit Int value in scope and passes it for you. So I think it is something very similar to defining a default value for the argument.
scala> implicit val temp = 3
temp: Int = 3

scala> getIntValue(3)
res8: Int = 6
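You can also fill in the second argument list explicitly, which bypasses the implicit lookup entirely (continuing the same hypothetical session):
scala> getIntValue(3)(10)
res9: Int = 13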
I was wondering, what are the benefits of the above declaration?
Here's my "pragmatic" answer: you typically use currying as more of a "convention" than anything else meaningful. It comes in really handy when your last parameter happens to be a "call by name" parameter (for example, => Boolean):
def transaction(conn: Connection)(codeToExecuteInTransaction: => Boolean) = {
  conn.startTransaction // start the transaction
  val booleanResult = codeToExecuteInTransaction // invoke the code block they passed in
  // deal with errors and roll back if necessary, or commit
  // return the connection to the connection pool
}
What this is saying is "I have a function called transaction, its first parameter is a Connection and its second parameter will be a code-block".
This allows us to use this method like so (using the "I can use curly braces instead of parentheses" rule):
transaction(myConn) {
  // code to execute in a transaction
  // the code block's last executable statement must be a Boolean as per the second
  // parameter of the transaction method
}
If you didn't curry that transaction method, it would look pretty unnatural doing this:
transaction(myConn, {
  // code block
})
How about implicit? Yes, it can seem like a very ambiguous construct, but you get used to it after a while, and the nice thing about implicits is that they have scoping rules. So this means for production, you might define an implicit function for getting that database connection from the PROD database, but in your integration test you'll define an implicit function that will supersede the PROD version, and it will be used to get a connection from a DEV database instead for use in your test.
As an example, how about we make the Connection an implicit parameter of the transaction method? (In Scala an implicit parameter list must come last, so the Connection moves to the second list.)
def transaction(codeToExecuteInTransaction: => Boolean)(implicit conn: Connection) = {
}
Now, assuming I have an implicit function somewhere in my code base that returns a Connection, like so:
implicit def getConnectionFromPool: Connection = { ... }
I can execute the transaction method like so:
transaction {
  // code to execute in transaction
}
and Scala will translate that to:
transaction {
  // code to execute in transaction
}(getConnectionFromPool)
In summary, implicits are a pretty nice way to not have to make the developer provide a value for a required parameter when that parameter is, 99% of the time, going to be the same everywhere you use the function. In that 1% of the time you need a different Connection, you can provide your own connection by passing in a value instead of letting Scala figure out which implicit provides the value.
In your specific example there are no practical benefits. In fact using implicits for this task will only obfuscate your code.
The standard use case of implicits is the Type Class Pattern. I'd say that it is the only use case that is practically useful. In all other cases it's better to have things explicit.
Here is an example of a typeclass:
// A typeclass
trait Show[a] {
  def show(a: a): String
}

// Some data type
case class Artist(name: String)

// An instance of the `Show` typeclass for that data type
implicit val artistShowInstance =
  new Show[Artist] {
    def show(a: Artist) = a.name
  }

// A function that works for any type `a` which has an instance of the `Show` typeclass
def showAListOfShowables[a](list: List[a])(implicit showInstance: Show[a]): String =
  list.view.map(showInstance.show).mkString(", ")

// The following code outputs `Beatles, Michael Jackson, Rolling Stones`
val list = List(Artist("Beatles"), Artist("Michael Jackson"), Artist("Rolling Stones"))
println(showAListOfShowables(list))
This pattern originates from a functional programming language named Haskell and turned out to be more practical than the standard OO practices for writing modular and decoupled software. The main benefit of it is that it allows you to extend already existing types with new functionality without changing them.
There are plenty of details left unmentioned, like syntactic sugar, def instances, and so on. It is a huge subject, and fortunately it has great coverage throughout the web; just google for "scala type class".
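As one example of that syntactic sugar, the same function can be written with a context bound, where implicitly summons the instance that was previously a named parameter (a sketch equivalent to the version above):
def showAListOfShowables[a: Show](list: List[a]): String =
  list.view.map(implicitly[Show[a]].show).mkString(", ")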
There are many benefits, outside of your example.
I'll give just one; at the same time, this is also a trick that you can use on certain occasions.
Imagine you create a trait that is a generic container for other values, like a list, a set, a tree or something like that.
trait MyContainer[A] {
  def containedValue: A
}
Now, at some point, you find it useful to iterate over all elements of the contained value.
Of course, this only makes sense if the contained value is of an iterable type.
But because you want your class to be useful for all types, you don't want to restrict A to be of a Seq type, or Traversable, or anything like that.
Basically, you want a method that says: "I can only be called if A is of a Seq type."
And if someone calls it on, say, MyContainer[Int], that should result in a compile error.
That's possible.
What you need is some evidence that A is of a sequence type.
And you can do that with Scala and implicit arguments:
trait MyContainer[A] {
  def containedValue: A

  // reduce combines two Bs at a time, so f must have type (B, B) => B
  def aggregate[B](f: (B, B) => B)(implicit ev: A => Seq[B]): B =
    ev(containedValue) reduce f
}
So, if you call this method on a MyContainer[Seq[Int]], the compiler will look for an implicit Seq[Int] => Seq[B].
That's really simple for the compiler to resolve.
Because Predef provides an implicit conversion from a type to itself (conforms, of type A <:< A, which extends A => A), and it is always in scope.
It behaves like identity: it simply returns whatever argument is passed to it, which here fixes B to Int.
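A hypothetical use of the sketch above:
val c = new MyContainer[Seq[Int]] { def containedValue = Seq(1, 2, 3) }
c.aggregate[Int](_ + _) // compiles: an implicit Seq[Int] => Seq[Int] is found

// This, by contrast, would not compile: there is no implicit Int => Seq[Int]:
// new MyContainer[Int] { def containedValue = 1 }.aggregate[Int](_ + _)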
I don't know what this pattern is called. (Can anyone help out?)
But I think it's a neat trick that comes in handy sometimes.
You can see a good example of that in the Scala library if you look at the method signature of Seq.sum.
In the case of sum, another implicit parameter type is used; in that case, the implicit parameter is evidence that the contained type is numeric, and therefore, a sum can be built out of all contained values.
That's not the only use of implicits, and certainly not the most prominent, but I'd say it's an honorable mention. :-)

Varargs with different type parameters in scala

I'm new to Scala...
Anyway, I want to do something like:
val bar = new Foo("a" -> List[Int](1), "b" -> List[String]("2"), ...)
bar("a") // gives List[Int] containing 1
bar("b") // gives List[String] containing "2"
The problem when I do:
class Foo(pairs: (String, List[_])*) {
  def apply(name: String): List[_] = pairs.toMap(name)
}
pairs is gonna be Array[(String, List[Any])] (or something like that) and apply() is wrong anyway, since List[_] is one type instead of "different types". Even if the varargs * returned a tuple, I'm still not sure how I'd go about getting bar("a") to return a List[OriginalTypePassedIn]. So is there actually a way of doing this? Scala seems pretty flexible, so it feels like there should be some advanced way of doing this.
No.
That's just the nature of static type systems: a method has a fixed return type. It cannot depend on the values of the method's parameters, because the parameters are not known at compile time. Suppose you have bar, which is an instance of Foo, and you don't know anything about how it was instantiated. You call bar("a"). You will get back an instance of the correct type, but since that type isn't determined until runtime, there's no way for a compiler to know it.
Scala does, however, give you a convenient syntax for subtyping Foo:
object bar extends Foo {
  val a = List[Int](1)
  val b = List[String]("2")
}
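With this encoding each field keeps its own static type, so no casting is needed:
bar.a // statically typed as List[Int]
bar.b // statically typed as List[String]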
This can't be done. Consider this:
val key = readStringFromUser()
val value = bar(key)
What would be the type of value? It would depend on what the user has input. But types are static; they're determined and used at compile time.
So you'll either have to use a fixed number of arguments for which you know their types at compile time, or use a generic vararg and do type casts during runtime.
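A minimal sketch of that second option; the cast is unchecked, so correctness rests entirely on the caller:
class Foo(pairs: (String, List[_])*) {
  private val m = pairs.toMap
  def apply[T](name: String): List[T] = m(name).asInstanceOf[List[T]]
}

val bar = new Foo("a" -> List(1), "b" -> List("2"))
val ints: List[Int] = bar.apply[Int]("a") // the caller asserts the element type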

Scala Implicit Conversion Gotchas

EDIT
OK, @Drexin brings up a good point re: loss of type safety/surprising results when using implicit converters.
How about a less common conversion, where conflicts with Predef implicits would not occur? For example, I'm working with JodaTime (great project!) in Scala. In the same controller package object where my implicits are defined, I have a type alias:
type JodaTime = org.joda.time.DateTime
and an implicit that converts JodaTime to Long (for a DAL built on top of ScalaQuery where dates are stored as Long)
implicit def joda2Long(d: JodaTime) = d.getMillis
Here no ambiguity can exist between Predef and my controller package implicits, and the controller implicits will not leak into the DAL, as that is in a different package scope. So when I do
dao.getHeadlines(articleType, Some(jodaDate))
the implicit conversion to Long is done for me, IMO, safely, and given that date-based queries are used heavily, I save some boilerplate.
Similarly, for str2Int conversions, the controller layer receives servlet URI params as String -> String. There are many cases where the URI then contains numeric strings, so when I filter a route to determine if the String is an Int, I do not want to call stringVal.toInt every time; instead, if the regex passes, let the implicit convert the string value to an Int for me. Altogether it would look like:
implicit def str2Int(s: String) = s.toInt

get("""/([0-9]+)""".r) {
  show(captures(0)) // captures(0) is a String
}

def show(id: Int) = { ... }
In the above contexts, are these valid use cases for implicit conversions, or is it more, always be explicit? If the latter, then what are valid implicit conversion use cases?
ORIGINAL
In a package object I have some implicit conversions defined, one of them a simple String to Int:
implicit def str2Int(s: String) = s.toInt
Generally this works fine: methods that take an Int param but receive a String make the conversion to Int, as do methods where the return type is set to Int but the actual returned value is a String.
Great, now in some cases the compiler errors with the dreaded ambiguous implicit:
both method augmentString in object Predef of type (x: String)scala.collection.immutable.StringOps
and method str2Int(s: String)Int
are possible conversion functions from java.lang.String to ?{val toInt: ?}
The case where I know this is happening is when attempting to do manual inline String-to-Int conversions. For example, val i = "10".toInt
My workaround/hack has been to create an asInt helper alongside the implicits in the package object, def asInt(i: Int) = i, used as asInt("10").
So, is implicit best practice implicit (i.e. learned by getting burned), or are there some guidelines to follow so as to not get caught in a trap of one's own making? In other words, should one avoid simple, common implicit conversions and only use them where the type to convert is unique (i.e. will never hit the ambiguity trap)?
Thanks for the feedback, implicits are awesome...when they work as intended ;-)
I think you're mixing two different use cases here.
In the first case, you're using implicit conversions to hide the arbitrary distinction (or arbitrary-to-you, anyway) between different classes in cases where the functionality is identical. The JodaTime-to-Long implicit conversion fits in that category; it's probably safe, and very likely a good idea. I would probably use the enrich-my-library pattern instead, and write
class JodaGivesMS(jt: JodaTime) { def ms = jt.getMillis }
implicit def joda_can_give_ms(jt: JodaTime) = new JodaGivesMS(jt)
and use .ms on every call, just to be explicit. The reason is that units matter here (milliseconds are not microseconds are not seconds are not millimeters, but all can be represented as ints), and I'd rather leave some record of what the units are at the interface, in most cases. getMillis is rather a mouthful to type every time, but ms is not too bad. Still, the conversion is reasonable (if well-documented for people who may modify the code in years to come (including you)).
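Hypothetical usage, with jodaDate being the value from the question:
dao.getHeadlines(articleType, Some(jodaDate.ms)) // explicit about units, still terse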
In the second case, however, you're performing an unreliable transformation between one very common type and another. True, you're doing it in only a limited context, but that transformation is still liable to escape and cause problems (either exceptions or types that aren't what you meant). Instead, you should write those handy routines that you need that correctly handle the conversion, and use those everywhere. For example, suppose you have a field that you expect to be "yes", "no", or an integer. You might have something like
val Rint = """(\d+)""".r
s match {
  case "yes"   => println("Joy!")
  case "no"    => println("Woe!")
  case Rint(i) => println("The magic number is " + i.toInt)
  case _       => println("I cannot begin to describe how calamitous this is")
}
But this code is wrong, because "12414321431243".toInt throws an exception, when what you really want is to say that the situation is calamitous. Instead, you should write code that matches properly:
case object Rint {
  val Reg = """([-]?\d+)""".r // an optional minus sign, then digits
  def unapply(s: String): Option[Int] = s match {
    case Reg(si) =>
      try { Some(si.toInt) }
      catch { case nfe: NumberFormatException => None }
    case _ => None
  }
}
and use this instead. Now instead of performing a risky and implicit conversion from String to Int, when you perform a match it will all be handled properly, both the regex match (to avoid throwing and catching piles of exceptions on bad parses) and the exception handling even if the regex passes.
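For instance, revisiting the problematic input from above (a sketch):
"12414321431243" match {
  case Rint(i) => println("The magic number is " + i)
  case _       => println("I cannot begin to describe how calamitous this is")
}
// Prints the calamity message: the value overflows Int, so si.toInt throws
// NumberFormatException inside unapply, which returns None instead of crashing.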
If you have something that has both a String and an Int representation, create a new class and provide implicit conversions to each, so that using the object (which you know can safely be either) doesn't keep repeating a method call that doesn't really provide any illumination.
I try not to use implicit conversions just to convert one type to another, but only for the pimp-my-library pattern. It can be a bit confusing when you pass a String to a function that takes an Int. Also, there is a huge loss of type safety: if you passed a string to a function that takes an Int by mistake, the compiler could not detect it, as it assumes you want to do it. So always do type conversions explicitly and only use implicit conversions to extend classes.
edit:
To answer your updated question: for the sake of readability, please use the explicit getMillis. In my eyes, valid use cases for implicits are "pimp my library", view/context bounds, type classes, manifests, builders... but not being too lazy to write an explicit call to a method.

What is the best way to create and pass around dictionaries containing multiple types in scala?

By dictionary I mean a lightweight map from names to values that can be used as the return value of a method.
Options that I'm aware of include making case classes, creating anon objects, and making maps from Strings -> Any.
Case classes require mental overhead to create (names), but are strongly typed.
Anon objects don't seem that well documented and it's unclear to me how to use them as arguments since there is no named type.
Maps from String -> Any require casting for retrieval.
Is there anything better?
Ideally these could be built from JSON and transformed back into it when appropriate.
I don't need static typing (though it would be nice; I can see how it would be impossible), but I do want to avoid explicit casting.
Here's the fundamental problem with what you want:
def get(key: String): Option[T] = ...
val r = map.get("key")
The type of r will be defined from the return type of get -- so, what should that type be? From where could it be defined? If you make it a type parameter, then it's relatively easy:
import scala.collection.mutable.{Map => MMap}

val map: MMap[String, (Manifest[_], Any)] = MMap.empty

def get[T: Manifest](key: String): Option[T] =
  map.get(key).filter(_._1 <:< manifest[T]).map(_._2.asInstanceOf[T])

def put[T: Manifest](key: String, obj: T) = map(key) = manifest[T] -> obj
Example:
scala> put("abc", 2)
scala> put("def", true)
scala> get[Boolean]("abc")
res2: Option[Boolean] = None
scala> get[Int]("abc")
res3: Option[Int] = Some(2)
The problem, of course, is that you have to tell the compiler what type you expect to be stored on the map under that key. Unfortunately, there is simply no way around that: the compiler cannot know what type will be stored under that key at compile time.
Whatever solution you take, you'll end up with this same problem: somehow or other, you'll have to tell the compiler what type should be returned.
Now, this shouldn't be a burden in a Scala program. Take that r above... you'll then use that r for something, right? That something you are using it for will have methods appropriate to some type, and since you know what the methods are, then you must also know what the type of r must be.
If this isn't the case, then there's something fundamentally wrong with the code -- or, perhaps, you haven't progressed from wanting the map to knowing what you'll do with it.
So you want to parse JSON and turn it into objects that resemble the JavaScript objects described in the JSON input? If you want static typing, case classes are pretty much your only option, and there are already libraries handling this, for example lift-json.
Another option is to use Scala 2.9's experimental support for dynamic typing. That will give you elegant syntax at the expense of type safety.
You can use the approach I've seen in the casbah library, where you explicitly pass a type parameter into the get method and cast the actual value inside it. Here is a quick example:
case class MultiTypeDictionary(m: Map[String, Any]) {
  def getAs[T <: Any](k: String)(implicit mf: Manifest[T]): T =
    cast(m.get(k).getOrElse { throw new IllegalArgumentException })(mf)

  private def cast[T <: Any: Manifest](a: Any): T =
    a.asInstanceOf[T]
}

implicit def map2multiTypeDictionary(m: Map[String, Any]) =
  MultiTypeDictionary(m)

val dict: MultiTypeDictionary = Map("1" -> 1, "2" -> 2.0, "3" -> "3")

val a: Int = dict.getAs("1")
val b: Int = dict.getAs("2") // ClassCastException
val c: Int = dict.getAs("4") // IllegalArgumentException
You should note that there are no real compile-time checks, so you have to deal with the drawbacks of runtime exceptions.
UPD Working MultiTypeDictionary class
If you have only a limited number of types which can occur as values, you can use some kind of union type (a.k.a. disjoint type), having e.g. a Map[Foo, Bar | Baz | Buz | Blargh]. If you have only two possibilities, you can use Either[A, B], giving you a Map[Foo, Either[Bar, Baz]] (a short sketch of this follows the links below). For three types you might cheat and use Map[Foo, Either[Bar, Either[Baz, Buz]]], but this syntax obviously doesn't scale well. If you have more types you can use things like...
http://cleverlytitled.blogspot.com/2009/03/disjoint-bounded-views-redux.html
http://svn.assembla.com/svn/metascala/src/metascala/OneOfs.scala
http://www.chuusai.com/2011/06/09/scala-union-types-curry-howard/
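Sticking with the two-type case for concreteness, a minimal Either sketch (the names are illustrative):
val m: Map[String, Either[Int, String]] =
  Map("age" -> Left(42), "name" -> Right("Ada"))

m("age") match {
  case Left(i)  => println("int: " + i)
  case Right(s) => println("string: " + s)
}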