Convert PreparedStatement object to JSON in Scala

I'm trying to convert a PreparedStatement (the object used for sending SQL statements to the database) to JSON in Scala.
So far, I've found that the most common way to convert an object to JSON in Scala is with the net.liftweb library.
But when I tried it, I got an empty JSON object.
This is the code:
import java.sql.DriverManager
import net.liftweb.json._
import net.liftweb.json.Serialization.write

object Main {
  def main(args: Array[String]): Unit = {
    implicit val formats = DefaultFormats
    val jdbcSqlConnStr = "sqlserverurl**"
    val conn = DriverManager.getConnection(jdbcSqlConnStr)
    val statement = conn.prepareStatement("exec select_all")
    val piedPierJSON2 = write(statement)
    println(piedPierJSON2)
  }
}
This is the result:
{}
When I used an object I created myself, the conversion worked.
case class Person(name: String, address: Address)
case class Address(city: String, state: String)

val p = Person("Alvin Alexander", Address("Talkeetna", "AK"))
val piedPierJSON3 = write(p)
println(piedPierJSON3)
This is the result:
{"name":"Alvin Alexander","address":{"city":"Talkeetna","state":"AK"}}

I understood where the problem was: PreparedStatement is an interface, and none of its implementations are serializable...
I'm going to try wrapping the relevant data in a different class.
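A minimal sketch of that wrapper idea, assuming all you need is the SQL text plus a couple of JDBC settings. StatementInfo and toStatementInfo are hypothetical names, and note that PreparedStatement does not expose the original SQL string, so it has to be carried along separately:

import java.sql.PreparedStatement
import net.liftweb.json._
import net.liftweb.json.Serialization.write

// Hypothetical serializable wrapper around the parts of the
// statement we actually care about.
case class StatementInfo(sql: String, fetchSize: Int, queryTimeout: Int)

def toStatementInfo(sql: String, stmt: PreparedStatement): StatementInfo =
  StatementInfo(sql, stmt.getFetchSize, stmt.getQueryTimeout)

implicit val formats = DefaultFormats
// write(toStatementInfo("exec select_all", statement)) then yields
// something like {"sql":"exec select_all","fetchSize":...,"queryTimeout":...}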

Related

json4s: how to deserialize JSON with FullTypeHints without explicitly setting TypeHints

I do specify FullTypeHints before serialization:
def serialize(definition: Definition): String = {
  val hints = definition.tasks.map(_.getClass).groupBy(_.getName).values.map(_.head).toList
  implicit val formats = Serialization.formats(FullTypeHints(hints))
  writePretty(definition)
}
It produces json with type hints, great!
{
  "name": "My definition",
  "tasks": [
    {
      "jsonClass": "com.soft.RootTask",
      "name": "Root"
    }
  ]
}
Deserialization doesn't work, though; it ignores the "jsonClass" field with the type hint:
def deserialize(jsonString: String): Definition = {
  implicit val formats = DefaultFormats.withTypeHintFieldName("jsonClass")
  read[Definition](jsonString)
}
Why should I repeat typeHints using Serialization.formats(FullTypeHints(hints)) for deserialization if hints are in json string?
Can json4s infer them from json?
The deserialiser is not ignoring the type hint field name; it just does not have anything to map it to. This is where the hints come in. So you have to declare your hints list again and pass it to the DefaultFormats object, either by using the withHints method or by overriding the value when creating a new instance of DefaultFormats. Here's an example using the latter approach:
val hints = definition.tasks.map(_.getClass).groupBy(_.getName).values.map(_.head).toList

implicit val formats: Formats = new DefaultFormats {
  override val typeHintFieldName = "jsonClass"
  override val typeHints: TypeHints = FullTypeHints(hints)
}
I did it this way since I have a contract:

- the type hint field name is known in advance
- the type hint field contains a fully qualified class name, and it's always a case class
def deserialize(jsonString: String): Definition = {
  import org.json4s._
  import org.json4s.native.JsonMethods._
  import scala.util.Try

  val json = parse(jsonString)
  val classNames: List[String] = (json \\ $$definitionTypes$$ \\ classOf[JString])
  val hints: List[Class[_]] = classNames.map(clz => Try(Class.forName(clz)).getOrElse(throw new RuntimeException(s"Can't get class for $clz")))
  implicit val formats = Serialization.formats(FullTypeHints(hints)).withTypeHintFieldName($$definitionTypes$$)
  read[Definition](jsonString)
}

Spark unable to find encoder (case class) although providing it

Trying to figure out why I'm getting an error on encoders; any insight would be helpful!
ERROR Unable to find encoder for type SolrNewsDocument, An implicit Encoder[SolrNewsDocument] is needed to store
Clearly I have imported spark.implicits._. I have also provided an encoder as a case class.
def ingestDocsToSolr(newsItemDF: DataFrame) = {
  case class SolrNewsDocument(
    title: String,
    body: String,
    publication: String,
    date: String,
    byline: String,
    length: String
  )
  import spark.implicits._
  val solrDocs = newsItemDF.as[SolrNewsDocument].map { doc =>
    val solrDoc = new SolrInputDocument
    solrDoc.setField("title", doc.title.toString)
    solrDoc.setField("body", doc.body)
    solrDoc.setField("publication", doc.publication)
    solrDoc.setField("date", doc.date)
    solrDoc.setField("byline", doc.byline)
    solrDoc.setField("length", doc.length)
    solrDoc
  }

  // can be used for stream SolrSupport
  SolrSupport.indexDocs("localhost:2181", "collection", 10, solrDocs.rdd)
  val solrServer = SolrSupport.getCachedCloudClient("localhost:2181")
  solrServer.setDefaultCollection("collection")
  solrServer.commit(false, false)
}
Check this one: move the case class declaration before the function declaration. The encoder is created once the case class statement is executed by the compiler; only then is the compiler able to use the encoder inside the function declaration.
import spark.implicits._

case class SolrNewsDocument(title: String, body: String, publication: String, date: String, byline: String, length: String)

def ingestDocsToSolr(newsItemDF: DataFrame) = {
  val solrDocs = newsItemDF.as[SolrNewsDocument]
}
I got this error trying to iterate over a text file, and in my case, as of Spark 2.4.x, the issue was that I had to convert it to an RDD first (that conversion used to be implicit):
textFile
  .rdd
  .flatMap(line => line.split(" "))
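A self-contained version of that fix, as a sketch assuming a local SparkSession named spark and a hypothetical input path:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("WordCount").master("local[*]").getOrCreate()
val textFile = spark.read.textFile("input.txt") // Dataset[String]

// .rdd drops back to the RDD API, where no Encoder is required
val words = textFile.rdd.flatMap(line => line.split(" "))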

How to use TypeInformation in a generic method using Scala

I'm trying to create a generic method in Apache Flink to parse a DataSet[String] (JSON strings) using case classes. I tried to use TypeInformation as mentioned here: https://ci.apache.org/projects/flink/flink-docs-stable/dev/types_serialization.html#generic-methods
I'm using Liftweb to parse the JSON strings; this is my code:
import net.liftweb.json._
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.scala._

class Loader(settings: Map[String, String])(implicit environment: ExecutionEnvironment) {
  val env: ExecutionEnvironment = environment

  def load[T: TypeInformation](): DataSet[T] = {
    val data: DataSet[String] = env.fromElements(
      """{"name": "name1"}""",
      """{"name": "name2"}"""
    )
    implicit val formats = DefaultFormats
    data.map(item => parse(item).extract[T])
  }
}
But I got the error:
No Manifest available for T
data.map(item => parse(item).extract[T])
Then I tried adding a Manifest and removing the TypeInformation, like this:
def load[T: Manifest](): DataSet[T] = { ...
And I got this error:
could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[T]
I'm very confused about this; I'd really appreciate your help.
Thanks.
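Not from the original thread, but one common fix is to require both implicits, since liftweb's extract[T] needs a Manifest while Flink's map needs a TypeInformation for the result type. A sketch against the question's own Loader class:

def load[T: TypeInformation: Manifest](): DataSet[T] = {
  val data: DataSet[String] = env.fromElements(
    """{"name": "name1"}""",
    """{"name": "name2"}"""
  )
  implicit val formats = DefaultFormats
  // Manifest satisfies extract[T]; TypeInformation satisfies Flink's map
  data.map(item => parse(item).extract[T])
}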

Getting json parsing error org.json4s.package$MappingException

I am trying to parse JSON and extract elements into a case class. Just curious why the code runs one way and not the other.
This code works:
import org.json4s._
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods.parse

object JsonCase {
  def main(args: Array[String]): Unit = {
    implicit val formats = DefaultFormats
    val input = """{"InputDB: "XYZ"}"""
    case class config(stagingDB: String)
    val spec = parse(input).extract[config]
    println(spec.stagingDB)
  }
}
Why does the code below not work?
import org.json4s._
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods.parse
implicit val formats = DefaultFormats
val input = """{"stagingDB": "XYZ"}"""
case class config(stagingDB: String)
val spec = parse(input).extract[config]
println(spec.stagingDB)
I find the opposite to be true. The second block of code works, but the first fails because the quoting is wrong in input: there is no closing quote for "InputDB", so it is not valid JSON. (Note also that even with valid quoting, the first block's field name InputDB does not match the case class field stagingDB, so extraction would still fail.)
More generally, when comparing two blocks of code you should remove as much of the shared code as possible. So config, input and formats should be outside the object and shared between both examples, so that you know you are focusing on the differences in the code, not the similarities.
Everything needs to be defined at least at object/class level (it makes no sense otherwise).
Below is, I think, what you want:
object JsonCase {
  // This way it applies to the whole object
  implicit val formats = DefaultFormats

  def main(args: Array[String]): Unit = {
    // do stuff
    // ...
  }
}

How do I turn a Scala case class into a mongo Document

I'd like to build a generic method for transforming Scala Case Classes to Mongo Documents.
A promising Document constructor is
fromSeq(ts: Seq[(String, BsonValue)]): Document
I can turn a case class into a Map[String, Any], but then I've lost the type information I need to use the implicit conversions to BsonValues. Maybe TypeTags can help with this?
Here's what I've tried:
import org.mongodb.scala.bson.BsonTransformer
import org.mongodb.scala.bson.collection.immutable.Document
import org.mongodb.scala.bson.BsonValue

case class Person(age: Int, name: String)

// transform Scala values into BsonValues
def transform[T](v: T)(implicit transformer: BsonTransformer[T]): BsonValue = transformer(v)

// turn any case class into a Map[String, Any]
def caseClassToMap(cc: Product) = {
  val values = cc.productIterator
  cc.getClass.getDeclaredFields.map(_.getName -> values.next).toMap
}

// transform a Person into a Document
def personToDocument(person: Person): Document = {
  val map = caseClassToMap(person)
  val bsonValues = map.toSeq.map { case (key, value) =>
    (key, transform(value))
  }
  Document.fromSeq(bsonValues)
}
<console>:24: error: No bson implicit transformer found for type Any. Implement or import an implicit BsonTransformer for this type.
(key, transform(value))
A simple workaround is to write the conversion by hand:
def personToDocument(person: Person): Document = {
  Document("age" -> person.age, "name" -> person.name)
}
The code below works without manual conversion of the object:
import reactivemongo.api.bson.{BSON, BSONDocument, Macros}

case class Person(name: String = "SomeName", age: Int = 20)

implicit val personHandler = Macros.handler[Person]

val bsonPerson = BSON.writeDocument[Person](Person())
println(s"${BSONDocument.pretty(bsonPerson.getOrElse(BSONDocument.empty))}")
You can use Salat: https://github.com/salat/salat. A nice example can be found here: https://gist.github.com/bhameyie/8276017. This is the piece of code that will help you:
import salat._
val dBObject = grater[Artist].asDBObject(artist)
artistsCollection.save(dBObject, WriteConcern.Safe)
I was able to serialize a case class to a BsonDocument using org.bson.BsonDocumentWriter. The code below runs using Scala 2.12 and mongo-scala-driver_2.12 version 2.6.0.
My quest for this solution was aided by this answer (where they are trying to serialize in the opposite direction): Serialize to object using scala mongo driver?
import org.mongodb.scala.bson.codecs.Macros
import org.mongodb.scala.bson.codecs.DEFAULT_CODEC_REGISTRY
import org.bson.codecs.configuration.CodecRegistries.{fromRegistries, fromProviders}
import org.bson.codecs.configuration.CodecRegistry
import org.bson.codecs.EncoderContext
import org.bson.BsonDocumentWriter
import org.mongodb.scala.bson.BsonDocument

case class Animal(name: String, species: String, genus: String, weight: Int)

object TempApp {
  def main(args: Array[String]): Unit = {
    val jaguar = Animal("Jenny", "Jaguar", "Panthera", 190)

    val codecProvider = Macros.createCodecProvider[Animal]()
    val codecRegistry: CodecRegistry = fromRegistries(fromProviders(codecProvider), DEFAULT_CODEC_REGISTRY)
    val codec = Macros.createCodec[Animal](codecRegistry)
    val encoderContext = EncoderContext.builder.isEncodingCollectibleDocument(true).build()

    val doc = BsonDocument()
    val writer = new BsonDocumentWriter(doc) // need to call new since it's a Java lib without a companion object
    codec.encode(writer, jaguar, encoderContext)
    print(doc)
  }
}
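For reference, BsonDocument.toString renders the document as JSON, so this should print something like (field order assumed to follow the case class declaration):

{"name": "Jenny", "species": "Jaguar", "genus": "Panthera", "weight": 190}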