Anorm - generic insert - Scala

Is there a way to use Anorm like a regular ORM? I'd like to have a method that just inserts an element provided.
def insert[T](element: T)(implicit connection: Connection) = {
  element.insert(connection)
}
I can definitely implement it myself, but it feels like I'd be re-implementing an ORM... Older versions of Anorm had Magic[T], but I can't find it anymore.

The documentation clearly states that Anorm is not an ORM (and will never be).
As indicated, to insert or update a T value, an instance of the ToParameterList typeclass must be provided.
Some macros are provided to automatically materialize such an instance.
import anorm.{ Macro, SQL, ToParameterList }
import anorm.NamedParameter
case class Bar(v: Int)
val bar1 = Bar(1)
// Convert all supported properties as parameters
val toParams1: ToParameterList[Bar] = Macro.toParameters[Bar]
val params1: List[NamedParameter] = toParams1(bar1)
// --> List(NamedParameter(v,ParameterValue(1)))
val names1: List[String] = params1.map(_.name)
// --> List(v)
val placeholders = names1.map { n => s"{$n}" }.mkString(", ")
// --> "{v}"
val generatedStmt = s"""INSERT INTO bar(${names1.mkString(", ")}) VALUES ($placeholders)"""
val generatedSql1 = SQL(generatedStmt).on(params1: _*)
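From there you can sketch a generic insert yourself. Note this is only a sketch, not an Anorm feature: the table name below is naively derived from the class name, which is an assumption you would likely replace.

import java.sql.Connection
import anorm.{ NamedParameter, SQL, ToParameterList }

// Sketch only: derives the table name from the class name
// (an assumption, not something Anorm does for you).
def insert[T](element: T)(implicit connection: Connection, toParams: ToParameterList[T]): Boolean = {
  val params: List[NamedParameter] = toParams(element)
  val names = params.map(_.name)
  val placeholders = names.map(n => s"{$n}").mkString(", ")
  val table = element.getClass.getSimpleName.toLowerCase
  SQL(s"INSERT INTO $table(${names.mkString(", ")}) VALUES ($placeholders)")
    .on(params: _*)
    .execute()
}

// Usage: implicit val barParams: ToParameterList[Bar] = Macro.toParameters[Bar]
//        insert(Bar(1))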

Related

scala - how to avoid using mutable list and casting

I have written the following code, which works fine. However, I wanted to check whether there is a way to avoid using a mutable list and casting the datatype from Any to String.
import scala.collection.mutable.ListBuffer
val databases = spark.catalog.listDatabases.select($"name").collect().map(_(0)).toList
var tables = new ListBuffer[String]()
databases.foreach { database =>
  val t = spark.catalog.listTables(database.asInstanceOf[String]).filter($"isTemporary" === false).filter($"tableType" =!= "VIEW").select($"name").collect.map(database + "." + _(0).asInstanceOf[String]).toList
  tables = tables ++ t
}
tables.foreach(println)
You can use the Row extractor to get rid of the casting:
import org.apache.spark.sql.Row

val databases: List[String] = spark.catalog.listDatabases.select($"name").collect().map {
  case Row(name: String) => name
}.toList
As for mutability, just use flatMap instead of foreach:
val tables = databases.flatMap { db => listTables(db).filter .... }
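Untested, but putting both suggestions together should look roughly like this (assuming the same spark session and implicits as in the question):

import org.apache.spark.sql.Row

val databases: List[String] = spark.catalog.listDatabases.select($"name").collect().map {
  case Row(name: String) => name
}.toList

val tables: List[String] = databases.flatMap { db =>
  spark.catalog.listTables(db)
    .filter($"isTemporary" === false)
    .filter($"tableType" =!= "VIEW")
    .select($"name")
    .collect()
    .map { case Row(table: String) => s"$db.$table" }
    .toList
}
tables.foreach(println)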

Any better, more idiomatic way to convert SQL ResultSet to a Scala List or other collection type?

I'm using the following naive code to convert a ResultSet to a Scala List:
val rs = pstmt.executeQuery()
var nids = List[String]()
while (rs.next()) {
  nids = nids :+ rs.getString(1)
}
rs.close()
Is there a better approach, something more idiomatic to Scala, that doesn't require using a mutable object?
Why don't you try this:
new Iterator[String] {
  // Caveat: hasNext advances the cursor, so it is not idempotent;
  // consume this iterator only once (toStream memoizes the elements).
  def hasNext = resultSet.next()
  def next() = resultSet.getString(1)
}.toStream
Taken from this answer here
I have a similar problem and my solution is:
Iterator.from(0).takeWhile(_ => rs.next()).map(_ => rs.getString(1)).toList
Hope that will help.
Using an ORM-style library like Slick or Quill, as mentioned in the comment section, is considered a better approach.
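For example, with Slick 3 the query from the question could look roughly like this (the profile choice, the table and column names, and the config key are all assumptions):

import slick.jdbc.H2Profile.api._
import scala.concurrent.Await
import scala.concurrent.duration._

// Hypothetical table definition for a single varchar column.
class Nodes(tag: Tag) extends Table[String](tag, "nodes") {
  def nid = column[String]("nid")
  def * = nid
}

val db = Database.forConfig("mydb") // "mydb" is an assumed config key
val nids: Seq[String] = Await.result(db.run(TableQuery[Nodes].map(_.nid).result), 10.seconds)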
If you want to process the ResultSet with plain Scala code, you can use tail recursion.
@scala.annotation.tailrec
def getResult(resultSet: ResultSet, list: List[String] = Nil): List[String] = {
  if (resultSet.next()) {
    val value = resultSet.getString(1) // JDBC columns are 1-based
    getResult(resultSet, value :: list) // prepends, so the result is in reverse row order
  } else {
    list
  }
}
This method returns a list of strings containing the value of the first column (remember that JDBC column indices start at 1). The process is purely immutable, so you don't need to worry about shared state. On the plus side, the method is tail recursive, so the Scala compiler will optimize it into a loop.
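For example, a call site using the pstmt from the question might look like:

val rs = pstmt.executeQuery()
val nids: List[String] =
  try getResult(rs) // reverse row order; add .reverse if order matters
  finally rs.close()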
Thanks

Write a class that supports the yield keyword in Scala

How can I make a class support the yield keyword in Scala?
e.g:
class A(data: String) {
  ...
}
val a = A("I'm A")
for {
  data <- a
} yield {
  data
}
Thanks
The compiler rewrites every for comprehension into its constituent parts: map(), flatMap(), withFilter(), and foreach(). That's why many Scala syntax rules are suspended inside a for comprehension: e.g. you can't declare values in the standard fashion (val x = 2), and you can't throw in println() statements.
In your example, this will work.
class A(data: String) {
  def map[B](f: (String) => B) = f(data)
}

val a = new A("I'm A")
for {
  data <- a
} yield {
  data
} // res0: String = I'm A
But note that if you have multiple generators (the <- is a generator), only the final one is turned into a map() call; the preceding generators all become flatMap() calls.
If your for comprehension includes an if condition, you'll need a withFilter() as well.
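To illustrate, here is a hypothetical wrapper (not part of any library) that supports multiple generators and guards:

// Hypothetical wrapper over List, showing which methods the compiler needs.
class Wrap[A](val items: List[A]) {
  def map[B](f: A => B): Wrap[B] = new Wrap(items.map(f))
  def flatMap[B](f: A => Wrap[B]): Wrap[B] = new Wrap(items.flatMap(f(_).items))
  def withFilter(p: A => Boolean): Wrap[A] = new Wrap(items.filter(p))
}

val r = for {
  x <- new Wrap(List(1, 2, 3))  // not the last generator: becomes flatMap
  if x % 2 == 1                 // becomes withFilter
  y <- new Wrap(List(10, 20))   // last generator: becomes map
} yield x + y
// r.items == List(11, 21, 13, 23)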
I recommend avoiding for comprehensions until you have a good feel for how they work.

How can I use Gson in Scala to serialize a List?

I was hoping to use Scala and Gson together. It seems to mostly work, but when I do something like this, it treats the list as an object, not an array:
case class MyType(x: String, y: List[SomeOtherType]) {
  def toJson() = new Gson().toJson(this)
}
And my JSON turns out something like this:
{
  "x": "whatever",
  "y": {}
}
Normally Gson converts lists to arrays. I'm sure this is all because Gson doesn't know about Scala's collection classes, but any ideas on what I can do to make this work? Or other suggestions using Scala-native JSON libraries?
You may try lift-json, a native Scala library: http://www.assembla.com/spaces/liftweb/wiki/JSON_Support
"Or other suggestions"
spray-json is a lightweight, clean and efficient JSON implementation in Scala.
It sports the following features:
Simple immutable model of the JSON language elements
An efficient JSON PEG parser (implemented with parboiled)
Choice of either compact or pretty JSON-to-string printing
Type-class based (de)serialization of custom objects (no reflection, no intrusion)
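A minimal sketch of spray-json handling a Scala List out of the box:

import spray.json._
import DefaultJsonProtocol._

val json: JsValue = List(1, 2, 3).toJson
println(json.compactPrint) // [1,2,3]
val back: List[Int] = json.convertTo[List[Int]]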
You can use Java converters in a type adapter, but it's a bit finicky:
import java.lang.reflect.{ ParameterizedType, Type }
import com.google.gson._

case class GsonListAdapter() extends JsonSerializer[List[_]] with JsonDeserializer[List[_]] {
  import scala.collection.JavaConverters._
  import sun.reflect.generics.reflectiveObjects.ParameterizedTypeImpl

  @throws(classOf[JsonParseException])
  def deserialize(jsonElement: JsonElement, t: Type, jdc: JsonDeserializationContext): List[_] = {
    val p = scalaListTypeToJava(t.asInstanceOf[ParameterizedType]) // Safe cast because List is a ParameterizedType.
    val javaList: java.util.List[_ <: Any] = jdc.deserialize(jsonElement, p)
    javaList.asScala.toList
  }

  override def serialize(obj: List[_], t: Type, jdc: JsonSerializationContext): JsonElement = {
    val p = scalaListTypeToJava(t.asInstanceOf[ParameterizedType]) // Safe cast because List is a ParameterizedType.
    jdc.serialize(obj.asInstanceOf[List[Any]].asJava, p)
  }

  private def scalaListTypeToJava(t: ParameterizedType): ParameterizedType = {
    ParameterizedTypeImpl.make(classOf[java.util.List[_]], t.getActualTypeArguments, null)
  }
}
import com.google.gson.reflect.TypeToken

val gson = new GsonBuilder().registerTypeHierarchyAdapter(classOf[List[_]], new GsonListAdapter()).create()

val l1 = List("a", "c")
val stringListType = new TypeToken[List[String]] {}.getType
val json1 = gson.toJson(l1, stringListType)
println(json1) // ["a","c"]
val newL1: List[String] = gson.fromJson(json1, stringListType)
assert(l1 == newL1)

val l2 = List(1, 3)
val intListType = new TypeToken[List[Int]] {}.getType
val json2 = gson.toJson(l2, intListType)
println(json2) // [1,3]
val newL2: List[Int] = gson.fromJson(json2, intListType)
assert(l2 == newL2)
"Or other suggestions"
The Jackson add-on jackson-module-scala provides Scala support, including serialization of lists.
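For instance, a minimal sketch with jackson-module-scala (registering the module is the key step):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)
println(mapper.writeValueAsString(List(1, 2, 3))) // [1,2,3]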

Filling a Scala immutable Map from a database table

I have a SQL database table with the following structure:
create table category_value (
  category varchar(25),
  property varchar(25)
);
I want to read this into a Scala Map[String, Set[String]] where each entry in the map is a set of all of the property values that are in the same category.
I would like to do it in a "functional" style with no mutable data (other than the database result set).
Modeled on the Clojure loop construct, here is what I have come up with:
def fillMap(statement: java.sql.Statement): Map[String, Set[String]] = {
  val resultSet = statement.executeQuery("select category, property from category_value")

  @annotation.tailrec
  def loop(m: Map[String, Set[String]]): Map[String, Set[String]] = {
    if (resultSet.next) {
      val category = resultSet.getString("category")
      val property = resultSet.getString("property")
      loop(m + (category -> (m.getOrElse(category, Set.empty) + property)))
    } else m
  }

  loop(Map.empty)
}
Is there a better way to do this, without using mutable data structures?
If you like, you could try something along these lines:
def fillMap(statement: java.sql.Statement): Map[String, Set[String]] = {
  val resultSet = statement.executeQuery("select category, property from category_value")
  Iterator.continually((resultSet, resultSet.next)).takeWhile(_._2).map(_._1).map { res =>
    val category = res.getString("category")
    val property = res.getString("property")
    (category, property)
  }.toIterable.groupBy(_._1).mapValues(_.map(_._2).toSet)
}
Untested, because I don’t have a proper sql.Statement. And the groupBy part might need some more love to look nice.
Edit: Added the requested changes.
There are two parts to this problem.
1. Getting the data out of the database and into a list of rows.
I would use Spring's SimpleJdbcOperations for the database access, so that things at least appear functional, even though the ResultSet is being changed behind the scenes.
First, a simple implicit conversion that lets us use a closure to map each row:
implicit def rowMapper[T <: AnyRef](func: (ResultSet) => T) =
  new ParameterizedRowMapper[T] {
    override def mapRow(rs: ResultSet, row: Int): T = func(rs)
  }
Then let's define a data structure to store the results. (You could use a tuple, but defining my own case class has the advantage of being a little clearer about the names of things.)
case class CategoryValue(category:String, property:String)
Now select from the database:
val db: SimpleJdbcOperations = // get this somehow
val resultList: java.util.List[CategoryValue] =
  db.query("select category, property from category_value",
    { rs: ResultSet => CategoryValue(rs.getString(1), rs.getString(2)) })
2. Converting the data from a list of rows into the format that you actually want.
import scala.collection.JavaConversions._

val result: Map[String, Set[String]] =
  resultList.groupBy(_.category).mapValues(_.map(_.property).toSet)
(You can omit the type annotations. I've included them to make it clear what's going on.)
Builders are made for this purpose. Get one via the companion object of the desired collection type, e.g. HashMap.newBuilder[String, Set[String]].
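For example (a minimal sketch; the tuples stand in for rows already read from the ResultSet):

import scala.collection.immutable.HashMap

val rows = List(("cat1", "p1"), ("cat1", "p2"), ("cat2", "p3")) // stand-in for DB rows
val builder = HashMap.newBuilder[String, Set[String]]
rows.groupBy(_._1).foreach { case (category, pairs) =>
  builder += (category -> pairs.map(_._2).toSet)
}
val result: Map[String, Set[String]] = builder.result()
// Map(cat1 -> Set(p1, p2), cat2 -> Set(p3))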
This solution is basically the same as my other solution, but it doesn't use Spring, and the logic for converting a ResultSet to some sort of list is simpler than Debilski's solution.
def streamFromResultSet[T](rs: ResultSet)(func: ResultSet => T): Stream[T] = {
  if (rs.next())
    func(rs) #:: streamFromResultSet(rs)(func)
  else {
    rs.close()
    Stream.empty
  }
}
def fillMap(statement: java.sql.Statement): Map[String, Set[String]] = {
  case class CategoryValue(category: String, property: String)
  val resultSet = statement.executeQuery("""
    select category, property from category_value
  """)
  val queryResult = streamFromResultSet(resultSet) { rs =>
    CategoryValue(rs.getString(1), rs.getString(2))
  }
  queryResult.groupBy(_.category).mapValues(_.map(_.property).toSet)
}
There is only one approach I can think of that does not include either mutable state or extensive copying*. It is actually a very basic technique I learnt in my first term studying CS. Here goes, abstracting from the database stuff:
def empty[K, V](k: K): Option[V] = None

def add[K, V](m: K => Option[V])(k: K, v: V): K => Option[V] = q => {
  if (k == q) {
    Some(v)
  } else {
    m(q)
  }
}

def build[K, V](input: TraversableOnce[(K, V)]): K => Option[V] = {
  input.foldLeft(empty[K, V] _)((m, i) => add(m)(i._1, i._2))
}
Usage example:
val map = build(List(("a",1),("b",2)))
println("a " + map("a"))
println("b " + map("b"))
println("c " + map("c"))
> a Some(1)
> b Some(2)
> c None
Of course, the resulting function does not have type Map (nor any of its benefits) and has linear lookup cost. I guess you could implement something in a similar way that mimics simple search trees.
(*) I am talking about concepts here. In reality, things like structural sharing can enable e.g. immutable list construction without memory overhead.