Scala Guava type mismatch issue

I am trying to implement a simple use case using Guava caching, but I am facing some issues, as shown below:
case class Person(x: Int, y: String)
val db = Map(1 -> Person(1, "A"), 2 -> Person(2, "B"), 3 -> Person(3, "C"))
val loader: CacheLoader[Int, Person] = new CacheLoader[Int, Person]() {
  def load(key: Int): Person = {
    db(key)
  }
}
lazy val someData = CacheBuilder.newBuilder().expireAfterWrite(60, MINUTES).maximumSize(10).build(loader)
someData.get(3)
The error I am getting is related to types, and I am not able to figure it out:
scala> someData.get(3)
<console>:24: error: type mismatch;
found : Int(3)
required: Int
someData.get(3)
Can someone advise on what the issue might be?

That's a common issue with Java's use-site covariance annotations.
The following works with Scala 2.12.4 and Guava 24.1:
import com.google.common.cache._
import java.util.concurrent.TimeUnit._

object GuavaCacheBuilderTypeProblem {
  case class Person(x: Int, y: String)
  val db = Map(1 -> Person(1, "A"), 2 -> Person(2, "B"), 3 -> Person(3, "C"))

  val loader: CacheLoader[java.lang.Integer, Person] =
    new CacheLoader[java.lang.Integer, Person]() {
      def load(key: java.lang.Integer): Person = {
        db(key)
      }
    }

  lazy val someData = CacheBuilder
    .newBuilder()
    .expireAfterWrite(60, MINUTES)
    .maximumSize(10)
    .build[java.lang.Integer, Person](loader)

  someData.get(3)
}
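If you prefer to keep Scala Int at the call sites, the boxing can be hidden behind a small helper; a minimal sketch (the lookup helper below is illustrative, not part of the original answer):

import com.google.common.cache.LoadingCache

// hypothetical convenience wrapper: takes a Scala Int and boxes it explicitly for Guava
def lookup(cache: LoadingCache[java.lang.Integer, Person], key: Int): Person =
  cache.get(Int.box(key))

lookup(someData, 3) // Person(3,"C")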
Answers with similar errors:
compiler error when using Google guava from scala code

Related

Evaluate complex type with quasiquote scala, unlifting

I need to compile a function and then evaluate it with different parameters of type List[Map[String, AnyRef]].
I have the following code that does not compile with that type, although it compiles with a simple type like List[Int].
I found that there are only certain implementations of Liftable in scala.reflect.api.StandardLiftables.StandardLiftableInstances.
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
val functionWrapper =
  """
  object FunctionWrapper {
    def makeBody(messages: List[Map[String, AnyRef]]) = Map.empty
  }""".stripMargin
val functionSymbol =
  tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
tb.eval(q"$functionSymbol.function($list)")
I am getting a compilation error for this; how can I make it work?
Error:(22, 38) Can't unquote List[Map[String,AnyRef]], consider using
... or providing an implicit instance of
Liftable[List[Map[String,AnyRef]]]
tb.eval(q"$functionSymbol.function($list)")
^
The problem comes not from the complicated type but from the attempt to use AnyRef. When you unquote some literal, it means you want the infrastructure to be able to create a valid syntax tree for an object that exactly matches the object you pass. Unfortunately, this is obviously not possible for all objects. For example, assume that you've passed a reference to Thread.currentThread() as part of the Map. How could that possibly work? The compiler is simply not able to recreate such a complicated object (not to mention making it the current thread). So you have two obvious alternatives:
Make your argument also a Tree, i.e. something like this:
def testTree() = {
  val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
  val functionWrapper =
    """
      | object FunctionWrapper {
      |
      |   def makeBody(messages: List[Map[String, AnyRef]]) = Map.empty
      |
      | }
    """.stripMargin
  val functionSymbol =
    tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
  //val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
  val list = q"""List(Map("1" -> "2"))"""
  val res = tb.eval(q"$functionSymbol.makeBody($list)")
  println(s"testTree = $res")
}
The obvious drawback of this approach is that you lose type safety at compile time and might need to provide a lot of context for the tree to work.
Another approach is not to pass anything containing AnyRef to the compiler infrastructure at all. Instead, you create some function-like Wrapper:
package so {
  trait Wrapper {
    def call(args: List[Map[String, AnyRef]]): Map[String, AnyRef]
  }
}
and then make your generated code return a Wrapper instead of executing the logic directly, and call the Wrapper from ordinary Scala code rather than from inside the compiled code. Something like this:
def testWrapper() = {
  val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
  val functionWrapper =
    """
      |object FunctionWrapper {
      |  import scala.collection._
      |  import so.Wrapper /* <- here probably different package :) */
      |
      |  def createWrapper(): Wrapper = new Wrapper {
      |    override def call(args: List[Map[String, AnyRef]]): Map[String, AnyRef] = Map.empty
      |  }
      |}
      | """.stripMargin
  val functionSymbol = tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
  val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
  val tree: tb.u.Tree = q"$functionSymbol.createWrapper()"
  val wrapper = tb.eval(tree).asInstanceOf[Wrapper]
  val res = wrapper.call(list)
  println(s"testWrapper = $res")
}
P.S. I'm not sure what you are doing, but beware of performance issues. Scala is a hard language to compile, and thus it might easily take more time to compile your custom code than to run it. If performance becomes an issue, you might need to use some other method, such as full-blown macro code generation or at least caching of the compiled code.
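To illustrate the caching idea, here is a minimal sketch (the WrapperCache object and its compile helper are illustrative, not part of the original answer): each distinct source string is compiled only once and the resulting Wrapper is reused.

import scala.collection.concurrent.TrieMap
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

// hypothetical helper: compile each source string once and cache the resulting Wrapper
object WrapperCache {
  private val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
  private val cache = TrieMap.empty[String, so.Wrapper]

  def compile(source: String): so.Wrapper =
    cache.getOrElseUpdate(source, {
      val symbol = tb.define(tb.parse(source).asInstanceOf[tb.u.ImplDef])
      tb.eval(q"$symbol.createWrapper()").asInstanceOf[so.Wrapper]
    })
}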

using datetime/timestamp in scala slick

Is there an easy way to use datetime/timestamp in Scala? What's best practice? I currently use "date" to persist data, but I'd also like to persist the current time.
I'm struggling to set the date. This is my code:
val now = new java.sql.Timestamp(new java.util.Date().getTime)
I also tried to do this:
val now = new java.sql.Date(new java.util.Date().getTime)
When changing the datatype in my evolutions to "timestamp", I got an error:
case class MyObjectModel(
  id: Option[Int],
  title: String,
  createdat: Timestamp,
  updatedat: Timestamp,
  ...)

object MyObjectModel {
  implicit val myObjectFormat = Json.format[MyObjectModel]
}
Console:
app\models\MyObjectModel.scala:31: No implicit format for
java.sql.Timestamp available.
[error] implicit val myObjectFormat = Json.format[MyObjectModel]
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
Update:
object ProcessStepTemplatesModel {
  implicit lazy val timestampFormat: Format[Timestamp] = new Format[Timestamp] {
    override def reads(json: JsValue): JsResult[Timestamp] = json.validate[Long].map(l => Timestamp.from(Instant.ofEpochMilli(l)))
    override def writes(o: Timestamp): JsValue = JsNumber(o.getTime)
  }
  implicit val processStepFormat = Json.format[ProcessStepTemplatesModel]
}
Try using this in your code:
implicit object timestampFormat extends Format[Timestamp] {
  val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SS'Z'")
  def reads(json: JsValue) = {
    val str = json.as[String]
    JsSuccess(new Timestamp(format.parse(str).getTime))
  }
  def writes(ts: Timestamp) = JsString(format.format(ts))
}
It is (de)serialized in a JS-compatible format like the following: "2018-01-06T18:31:29.436Z".
Please note: the implicit object must be declared in the code before it is used.
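A minimal usage sketch, assuming Play JSON and the implicit timestampFormat above in scope (the exact output depends on your timezone and the moment of execution):

import java.sql.Timestamp
import play.api.libs.json.Json

// serialize a Timestamp to a JSON string via the implicit Format above
val ts = new Timestamp(System.currentTimeMillis())
println(Json.toJson(ts)) // e.g. "2018-01-06T18:31:29.43Z"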
I guess your question is handled in What's the standard way to work with dates and times in Scala? Should I use Java types or there are native Scala alternatives?.
Go with Java 8 "java.time".
In the subject you mention Slick (a Scala database library), but the error you got comes from a JSON library: it says that you don't have a converter from java.sql.Timestamp to JSON. Without knowing which JSON library you are using, it's hard to help you with a working example.
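On the Slick side, if you follow the java.time suggestion, one common approach is to map java.time.Instant through java.sql.Timestamp. A minimal sketch, assuming Slick 3.x with the Postgres profile (adapt the profile import to your database; recent Slick versions may already support java.time types out of the box):

import java.sql.Timestamp
import java.time.Instant
import slick.jdbc.PostgresProfile.api._

// map java.time.Instant columns to the database TIMESTAMP type via java.sql.Timestamp
implicit val instantColumnType: BaseColumnType[Instant] =
  MappedColumnType.base[Instant, Timestamp](Timestamp.from, _.toInstant)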

Converting to net.liftweb.json.JsonAST.JObject Lift in Scala

I am trying to construct a JSON object from a list where the key is "products" and the value is a List[Product], where Product is a case class. But I am getting an error that says: "type mismatch; found: (String, List[com.mycompnay.ws.client.Product]) required: net.liftweb.json.JObject (which expands to) net.liftweb.json.JsonAST.JObject".
What I have done so far is as below:
val resultJson:JObject = "products" -> resultList
println(compact(render(resultJson)))
You're looking for decompose (doc). See this answer.
I tested the following code and it worked fine:
import net.liftweb.json._
import net.liftweb.json.JsonDSL._
import net.liftweb.json.Extraction._
implicit val formats = net.liftweb.json.DefaultFormats
case class Product(foo: String)
val resultList: List[Product] = List(Product("bar"), Product("baz"))
val resultJson: JObject = ("products" -> decompose(resultList))
println(compact(render(resultJson)))
Result:
{"products":[{"foo":"bar"},{"foo":"baz"}]}

Scala : How to find types of values inside a scala nested collection

Consider the following variables in Scala:
val nestedCollection_1 = Array(
  "key_1" -> Map("key_11" -> "value_11"),
  "key_2" -> Map("key_22" -> "value_22"))

val nestedCollection_2 = Map(
  "key_3" -> ["key_33", "value_33"],
  "key_4" -> ["key_44" -> "value_44"])
Following are my questions:
1) I want to read the values of the variables nestedCollection_1 and nestedCollection_2 and ensure that the values of the variables are of the format
Array[Map[String, Map[String, String]]]
and
Map[String, Array[String]]
2) Is it possible to get the detailed type of a variable in Scala? I.e., nestedCollection_1.SOME_METHOD should return Array[Map[String, Map[String, String]]] as the type of its values.
I am not sure what exactly you mean. The compiler can ensure the type of any variable if you just annotate the type:
val nestedCollection_2: Map[String, List[String]] = Map(
  "key_3" -> List("key_33", "value_33"),
  "key_4" -> List("key_44", "value_44"))
You can see the type of a variable in the Scala REPL when you define it, or by using Alt + = in IntelliJ IDEA.
scala> val nestedCollection_2 = Map(
| "key_3"-> List("key_33", "value_33"),
| "key_4"-> List("key_44", "value_44"))
nestedCollection_2: scala.collection.immutable.Map[String,List[String]] = Map(key_3 -> List(key_33, value_33), key_4 -> List(key_44, value_44))
Edit
I think I get your question now. Here is how you can get the type as a String:
import scala.reflect.runtime.universe._

def typeAsString[A: TypeTag](elem: A) = {
  typeOf[A].toString
}
Test:
scala> typeAsString(nestedCollection_2)
res0: String = Map[String,scala.List[String]]
scala> typeAsString(nestedCollection_1)
res1: String = scala.Array[(String, scala.collection.immutable.Map[String,String])]

How do I turn a Scala case class into a mongo Document

I'd like to build a generic method for transforming Scala Case Classes to Mongo Documents.
A promising Document constructor is
fromSeq(ts: Seq[(String, BsonValue)]): Document
I can turn a case class into a Map[String, Any], but then I've lost the type information I need in order to use the implicit conversions to BsonValue. Maybe TypeTags can help with this?
Here's what I've tried:
import org.mongodb.scala.bson.BsonTransformer
import org.mongodb.scala.bson.collection.immutable.Document
import org.mongodb.scala.bson.BsonValue

case class Person(age: Int, name: String)

// transform Scala values into BsonValues
def transform[T](v: T)(implicit transformer: BsonTransformer[T]): BsonValue = transformer(v)

// turn any case class into a Map[String, Any]
def caseClassToMap(cc: Product) = {
  val values = cc.productIterator
  cc.getClass.getDeclaredFields.map(_.getName -> values.next).toMap
}

// transform a Person into a Document
def personToDocument(person: Person): Document = {
  val map = caseClassToMap(person)
  val bsonValues = map.toSeq.map { case (key, value) =>
    (key, transform(value))
  }
  Document.fromSeq(bsonValues)
}
<console>:24: error: No bson implicit transformer found for type Any. Implement or import an implicit BsonTransformer for this type.
(key, transform(value))
def personToDocument(person: Person): Document = {
  // listing the fields explicitly keeps their static types, so the implicit
  // BsonTransformer instances for Int and String are found
  Document("age" -> person.age, "name" -> person.name)
}
The code below works without manual conversion of the object.
import reactivemongo.api.bson.{BSON, BSONDocument, Macros}
case class Person(name:String = "SomeName", age:Int = 20)
implicit val personHandler = Macros.handler[Person]
val bsonPerson = BSON.writeDocument[Person](Person())
println(s"${BSONDocument.pretty(bsonPerson.getOrElse(BSONDocument.empty))}")
You can use Salat: https://github.com/salat/salat. A nice example can be found here: https://gist.github.com/bhameyie/8276017. This is the piece of code that will help you:
import salat._
val dBObject = grater[Artist].asDBObject(artist)
artistsCollection.save(dBObject, WriteConcern.Safe)
I was able to serialize a case class to a BsonDocument using org.bson.BsonDocumentWriter. The code below runs with Scala 2.12 and mongo-scala-driver_2.12 version 2.6.0.
My quest for this solution was aided by this answer (where they are trying to serialize in the opposite direction): Serialize to object using scala mongo driver?
import org.mongodb.scala.bson.codecs.Macros
import org.mongodb.scala.bson.codecs.DEFAULT_CODEC_REGISTRY
import org.bson.codecs.configuration.CodecRegistries.{fromRegistries, fromProviders}
import org.bson.codecs.EncoderContext
import org.bson.BsonDocumentWriter
import org.mongodb.scala.bson.BsonDocument
import org.bson.codecs.configuration.CodecRegistry
import org.bson.codecs.Codec

case class Animal(name: String, species: String, genus: String, weight: Int)

object TempApp {
  def main(args: Array[String]) {
    val jaguar = Animal("Jenny", "Jaguar", "Panthera", 190)
    val codecProvider = Macros.createCodecProvider[Animal]()
    val codecRegistry: CodecRegistry = fromRegistries(fromProviders(codecProvider), DEFAULT_CODEC_REGISTRY)
    val codec = Macros.createCodec[Animal](codecRegistry)
    val encoderContext = EncoderContext.builder.isEncodingCollectibleDocument(true).build()
    val doc = BsonDocument()
    val writer = new BsonDocumentWriter(doc) // need to call new since this is a Java class without a companion object
    codec.encode(writer, jaguar, encoderContext)
    print(doc)
  }
}
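For completeness, a minimal sketch of the reverse direction (decoding the BsonDocument back into an Animal), assuming the codec and doc values from main above are in scope; this usage is illustrative, not part of the original answer:

import org.bson.BsonDocumentReader
import org.bson.codecs.DecoderContext

// read the document back through the same codec
val reader = new BsonDocumentReader(doc)
val decoderContext = DecoderContext.builder().build()
val roundTripped: Animal = codec.decode(reader, decoderContext)
println(roundTripped) // Animal(Jenny,Jaguar,Panthera,190)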