Deserialize binary Thrift message in Scala

I am trying to deserialize a binary message in Scala:

  val deserializer = new TDeserializer(new TBinaryProtocol.Factory())
  try {
    val obj = deserializer.deserialize(new ClientError {}, input._2.toArray)

where ClientError is the trait generated by Scrooge from a Thrift file. The problem is that deserialize() expects a TBase object, but TBase is just an interface, and the Scrooge-generated trait does not implement it. How do I do this? Do I have to create a new class that implements both?
Thanks for any help!

Try this:

  import org.apache.thrift.protocol.TBinaryProtocol
  import org.apache.thrift.transport.TMemoryInputTransport

  def decode(bytes: Array[Byte]): ClientError = {
    val protocolFactory = new TBinaryProtocol.Factory
    val buffer = new TMemoryInputTransport(bytes)
    val proto = protocolFactory.getProtocol(buffer)
    // Scrooge generates a decode method on the struct's companion object
    ClientError.decode(proto)
  }
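Usage is then just (assuming, as in the question, that input._2 holds the raw message bytes):

  val err: ClientError = decode(input._2.toArray)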

Note that the following only works if ClientError is the Java-generated Thrift class (which implements TBase and has a no-arg constructor), not the Scrooge-generated trait:

  def getClientError(binaryData: Array[Byte]): ClientError = {
    val tdser = new TDeserializer()
    val cliErr = new ClientError()
    tdser.deserialize(cliErr, binaryData)
    cliErr
  }

Related

Scala JavaFx -- Cannot resolve overloaded method 'add' when trying to add tree table columns

I am trying to write an application using JavaFX and Scala (not ScalaFX). When I tried out this example from http://tutorials.jenkov.com/javafx/treetableview.html (Add TreeTableColumn to TreeTableView), I got a "Cannot resolve overloaded method 'add'" error on the last two lines. I was wondering if you can help me get past this issue.
class Phase1 extends Application {
  import javafx.scene.control.TreeTableColumn
  import javafx.scene.control.TreeTableView
  import javafx.scene.control.cell.TreeItemPropertyValueFactory

  override def start(primaryStage: Stage): Unit = {
    primaryStage.setTitle("Experimental Blocking Tree")
    val scene = new Scene(new Group(), 1500, 800)
    val sceneRoot = scene.getRoot.asInstanceOf[Group]
    val treeTableView = new TreeTableView[Car]
    val treeTableColumn1: TreeTableColumn[Car, String] = new TreeTableColumn[Car, String]("Brand")
    val treeTableColumn2: TreeTableColumn[Car, String] = new TreeTableColumn[Car, String]("Model")
    treeTableColumn1.setCellValueFactory(new TreeItemPropertyValueFactory[Car, String]("brand"))
    treeTableColumn2.setCellValueFactory(new TreeItemPropertyValueFactory[Car, String]("model"))
    treeTableView.getColumns.add(treeTableColumn1) // cannot resolve overloaded method here
    treeTableView.getColumns.add(treeTableColumn2) // and here
  }
}
Thanks in advance.
I had the same issue with displaying data in TreeTableView.
Jarek posted a solution here: GitHub Issue
Also this works for me:
import scalafx.beans.property.ReadOnlyStringProperty

case class Car(
  brand: ReadOnlyStringProperty,
  model: ReadOnlyStringProperty
)

class CarStringFactory(val stringValue: ReadOnlyStringProperty) extends scalafx.beans.value.ObservableValue[String, String] {
  override def delegate: javafx.beans.value.ObservableValue[String] = stringValue
  override def value: String = stringValue.get
}

class YourScalaFXApp {
  // ... boilerplate code ...
  import scalafx.scene.control.{TreeTableView, TreeTableColumn}

  val treeTableView = new TreeTableView[Car]
  val treeTableColumn1: TreeTableColumn[Car, String] = new TreeTableColumn[Car, String]("Brand") {
    cellValueFactory = { p => new CarStringFactory(p.value.value.value.brand) }
  }
  val treeTableColumn2: TreeTableColumn[Car, String] = new TreeTableColumn[Car, String]("Model") {
    cellValueFactory = { p => new CarStringFactory(p.value.value.value.model) }
  }

  treeTableView.getColumns.add(treeTableColumn1)
  treeTableView.getColumns.add(treeTableColumn2)
}
Refer to:
- ScalaFX documentation: Properties
- TreeTableColumn.cellValueFactory
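As a side note, "Cannot resolve overloaded method" on getColumns.add is often an IDE-only report rather than a scalac error. If you want to stay with plain JavaFX, a commonly suggested workaround is to help inference with an explicit up-cast to the wildcard column type that the ObservableList expects; a sketch, untested here:

  treeTableView.getColumns.add(treeTableColumn1: TreeTableColumn[Car, _])
  treeTableView.getColumns.add(treeTableColumn2: TreeTableColumn[Car, _])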

Trying to define serializer using avro4s but getting a missing implicit error

I am using the Flink (1.7) Kafka client and avro4s (2.0.4), and I want to serialize to a byte array:
class AvroSerializationSchema[IN: SchemaFor: FromRecord: ToRecord] extends SerializationSchema[IN] {
  override def serialize(element: IN): Array[Byte] = {
    val str = AvroSchema[IN]
    val schema: Schema = new Parser().parse(str.toString)
    val out = new ByteArrayOutputStream()
    val os = AvroOutputStream.data[IN].to(out).build(schema)
    os.write(element)
    out.close()
    out.flush()
    os.flush()
    os.close()
    out.toByteArray
  }
}
However, I keep getting this exception:
Error:(15, 35) could not find implicit value for evidence parameter of type com.sksamuel.avro4s.Encoder[IN]
val os = AvroOutputStream.data[IN].to(out).build(schema)
and
Error:(15, 35) not enough arguments for method data: (implicit evidence$3: com.sksamuel.avro4s.Encoder[IN])com.sksamuel.avro4s.AvroOutputStreamBuilder[IN].
Unspecified value parameter evidence$3.
val os = AvroOutputStream.data[IN].to(out).build(schema)
According to the avro4s code, IN has to have an Encoder:

object AvroOutputStream {
  /**
   * An [[AvroOutputStream]] that does not write the schema. Use this when
   * you want the smallest messages possible at the cost of not having the schema available
   * in the messages for downstream clients.
   */
  def binary[T: Encoder] = new AvroOutputStreamBuilder[T](BinaryFormat)

  def json[T: Encoder] = new AvroOutputStreamBuilder[T](JsonFormat)

  def data[T: Encoder] = new AvroOutputStreamBuilder[T](DataFormat)
}
so it should be something like:
class AvroSerializationSchema[IN : Encoder] ...
You don't need to use FromRecord when writing to the output stream. That is for people who want to have a GenericRecord for their own use. You need to use Encoder.
class AvroSerializationSchema[IN: SchemaFor: Encoder] extends SerializationSchema[IN] {
  override def serialize(element: IN): Array[Byte] = {
    val str = AvroSchema[IN]
    val schema: Schema = new Parser().parse(str.toString)
    val out = new ByteArrayOutputStream()
    val os = AvroOutputStream.data[IN].to(out).build(schema)
    os.write(element)
    // flush and close the Avro stream before reading the bytes;
    // closing the underlying ByteArrayOutputStream first can truncate the output
    os.flush()
    os.close()
    out.toByteArray
  }
}
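For illustration, using the schema directly might look like this; the User case class is a made-up placeholder, and avro4s derives the SchemaFor/Encoder instances at compile time:

  case class User(name: String, age: Int) // hypothetical payload type

  val schema = new AvroSerializationSchema[User]
  val bytes: Array[Byte] = schema.serialize(User("Ada", 36))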

Scalacache with redis support

I am trying to integrate Redis with ScalaCache. Keys are usually strings, but values can be objects, Set[String], etc. The cache is initialized like this:
val cache: RedisCache = RedisCache(config.host, config.port)
private implicit val scalaCache: ScalaCache[Array[Byte]] = ScalaCache(cacheService.cache)
But when calling put, I get the error "Could not find any Codecs for type Set[String] and Repr". It looks like I need to provide a codec for my cache input, as suggested here, so I added:

  class A extends Codec[Set[String], Array[Byte]] with GZippingBinaryCodec[Set[String]]

Even with class A in place, I get the same error. What am I missing?
As you mentioned in the link, you can either serialize values in a binary format:
import scalacache.serialization.binary._
or as JSON using circe:
import scalacache.serialization.circe._
import io.circe.generic.auto._
It looks like this is solved in the next release by the binary and circe serialization modules. I am on version 10 and solved it with the following:
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

implicit object SetBinaryCodec extends Codec[Any, Array[Byte]] {
  // Java serialization to a byte array
  override def serialize(value: Any): Array[Byte] = {
    val stream = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(stream)
    oos.writeObject(value)
    oos.close()
    stream.toByteArray
  }

  override def deserialize(data: Array[Byte]): Any = {
    val ois = new ObjectInputStream(new ByteArrayInputStream(data))
    val value = ois.readObject
    ois.close()
    value
  }
}
Perks of staying up to date. I will upgrade the version; I posted this just in case somebody needs it.
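For completeness, with the codec in implicit scope a put/get roundtrip would look roughly like this (a sketch against the pre-1.0 ScalaCache API; exact signatures vary between versions):

  import scalacache._

  put("names")(Set("alice", "bob"))                   // serialized via SetBinaryCodec
  val cached = get[Set[String], Array[Byte]]("names") // Future[Option[Set[String]]]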

How to read serialized object with a method taking generic type by Scala/Kryo?

Using Kryo to read a serialized object is easy when I know the specific class type, but how do I write a method that takes a simple generic type?
Here is my code, which does not compile:
def load[T](path: String): T = {
  val instantiator = new ScalaKryoInstantiator
  instantiator.setRegistrationRequired(false)
  val kryo = instantiator.newKryo()
  val input = new Input(FileUtils.readFileToByteArray(new File(path)))
  kryo.readObject[T](input, classOf[T])
}
The error I got is:
class type required but T found
kryo.readObject[T](input, classOf[T])
I know what the error means, but don't know the right way to fix it.
The code is adapted from my original, type-specific version:
def load(path: String): SomeClassType = {
  val instantiator = new ScalaKryoInstantiator
  instantiator.setRegistrationRequired(false)
  val kryo = instantiator.newKryo()
  val input = new Input(FileUtils.readFileToByteArray(new File(path)))
  kryo.readObject(input, classOf[SomeClassType])
}
I've found the answer; the key is ClassTag. (Note that the explicit implicit tag parameter makes a [M: ClassTag] context bound redundant, and Scala 2 does not allow combining the two, so only the implicit parameter is kept here.)

import scala.reflect.ClassTag

def load[M](path: String)(implicit tag: ClassTag[M]): M = {
  val instantiator = new ScalaKryoInstantiator
  instantiator.setRegistrationRequired(false)
  val kryo = instantiator.newKryo()
  val input = new Input(FileUtils.readFileToByteArray(new File(path)))
  kryo.readObject(input, tag.runtimeClass.asInstanceOf[Class[M]])
}
In some threads, the last line is:

  kryo.readObject(input, tag.runtimeClass)

This doesn't work in my case; because runtimeClass is typed Class[_], the result is not inferred as M, so it has to be:

  tag.runtimeClass.asInstanceOf[Class[M]]
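For symmetry, a writing counterpart using the same chill/Kryo setup might look like this (a sketch; Output comes from com.esotericsoftware.kryo.io, and save is an illustrative name):

  import java.io.FileOutputStream
  import com.esotericsoftware.kryo.io.Output
  import com.twitter.chill.ScalaKryoInstantiator

  def save[M](path: String, obj: M): Unit = {
    val instantiator = new ScalaKryoInstantiator
    instantiator.setRegistrationRequired(false)
    val kryo = instantiator.newKryo()
    val output = new Output(new FileOutputStream(path))
    kryo.writeObject(output, obj) // stores the object without its class name; pair with readObject
    output.close()
  }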

Creating serializable objects from Scala source code at runtime

To embed Scala as a "scripting language", I need to be able to compile text fragments to simple objects, such as a Function0[Unit], that can be serialised to and deserialised from disk, loaded into the current runtime, and executed.
How would I go about this?
Say for example, my text fragment is (purely hypothetical):
Document.current.elements.headOption.foreach(_.open())
This might be wrapped into the following complete text:
package myapp.userscripts
import myapp.DSL._
object UserFunction1234 extends Function0[Unit] {
def apply(): Unit = {
Document.current.elements.headOption.foreach(_.open())
}
}
What comes next? Should I use IMain to compile this code? I don't want to use the normal interpreter mode, because the compilation should be "context-free" and not accumulate requests.
What I need to get hold of from the compilation is, I guess, the binary class file. In that case, serialisation is straightforward (a byte array). How would I then load that class into the runtime and invoke the apply method?
What happens if the code compiles to multiple auxiliary classes? The example above contains a closure _.open(). How do I make sure I "package" all those auxiliary things into one object to serialize and class-load?
Note: Given that Scala 2.11 is imminent and the compiler API has probably changed, I am happy to receive hints on how to approach this problem in Scala 2.11.
Here is one idea: use a regular Scala compiler instance. Unfortunately it seems to require the use of hard disk files both for input and output. So we use temporary files for that. The output will be zipped up in a JAR which will be stored as a byte array (that would go into the hypothetical serialization process). We need a special class loader to retrieve the class again from the extracted JAR.
The following assumes Scala 2.10.3 with the scala-compiler library on the class path:
import scala.tools.nsc
import java.io._
import scala.annotation.tailrec
Wrapping the user-provided code in a function class with a synthetic name that is incremented for each new fragment:

val packageName = "myapp"

var userCount = 0

def mkFunName(): String = {
  val c = userCount
  userCount += 1
  s"Fun$c"
}

def wrapSource(source: String): (String, String) = {
  val fun = mkFunName()
  val code = s"""package $packageName
                |
                |class $fun extends Function0[Unit] {
                |  def apply(): Unit = {
                |    $source
                |  }
                |}
                |""".stripMargin
  (fun, code)
}
A function to compile a source fragment and return the byte array of the resulting jar:
/** Compiles a source fragment consisting of a body which is wrapped in a `Function0`
  * apply method, and returns the function's class name (without package) and the
  * raw jar file produced in the compilation.
  */
def compile(source: String): (String, Array[Byte]) = {
  val set = new nsc.Settings
  val d = File.createTempFile("temp", ".out")
  d.delete(); d.mkdir() // reuse the temp file's name as an output directory
  set.d.value = d.getPath
  set.usejavacp.value = true
  val compiler = new nsc.Global(set)
  val f = File.createTempFile("temp", ".scala")
  val out = new BufferedOutputStream(new FileOutputStream(f))
  val (fun, code) = wrapSource(source)
  out.write(code.getBytes("UTF-8"))
  out.flush(); out.close()
  val run = new compiler.Run()
  run.compile(List(f.getPath))
  f.delete()
  val bytes = packJar(d)
  deleteDir(d)
  (fun, bytes)
}

def deleteDir(base: File): Unit = {
  base.listFiles().foreach { f =>
    if (f.isFile) f.delete()
    else deleteDir(f)
  }
  base.delete()
}
Note: Doesn't handle compiler errors yet!
The packJar method uses the compiler output directory and produces an in-memory jar file from it:
// cf. http://stackoverflow.com/questions/1281229
def packJar(base: File): Array[Byte] = {
  import java.util.jar._

  val mf = new Manifest
  mf.getMainAttributes.put(Attributes.Name.MANIFEST_VERSION, "1.0")
  val bs = new java.io.ByteArrayOutputStream
  val out = new JarOutputStream(bs, mf)

  def add(prefix: String, f: File): Unit = {
    val name0 = prefix + f.getName
    val name = if (f.isDirectory) name0 + "/" else name0
    val entry = new JarEntry(name)
    entry.setTime(f.lastModified())
    out.putNextEntry(entry)
    if (f.isFile) {
      val in = new BufferedInputStream(new FileInputStream(f))
      try {
        val buf = new Array[Byte](1024)
        @tailrec def loop(): Unit = {
          val count = in.read(buf)
          if (count >= 0) {
            out.write(buf, 0, count)
            loop()
          }
        }
        loop()
      } finally {
        in.close()
      }
    }
    out.closeEntry()
    if (f.isDirectory) f.listFiles.foreach(add(name, _))
  }

  base.listFiles().foreach(add("", _))
  out.close()
  bs.toByteArray
}
A utility function that takes the byte array found in deserialization and creates a map from class names to class byte code:
def unpackJar(bytes: Array[Byte]): Map[String, Array[Byte]] = {
  import java.util.jar._
  import scala.annotation.tailrec

  val in = new JarInputStream(new ByteArrayInputStream(bytes))
  val b = Map.newBuilder[String, Array[Byte]]

  @tailrec def loop(): Unit = {
    val entry = in.getNextJarEntry
    if (entry != null) {
      if (!entry.isDirectory) {
        val name = entry.getName
        // cf. http://stackoverflow.com/questions/8909743
        val bs = new ByteArrayOutputStream
        var i = 0
        while (i >= 0) {
          i = in.read()
          if (i >= 0) bs.write(i)
        }
        val bytes = bs.toByteArray
        b += mkClassName(name) -> bytes
      }
      loop()
    }
  }
  loop()
  in.close()
  b.result()
}

def mkClassName(path: String): String = {
  require(path.endsWith(".class"))
  path.substring(0, path.length - 6).replace("/", ".")
}
A suitable class loader:
class MemoryClassLoader(map: Map[String, Array[Byte]]) extends ClassLoader {
  override protected def findClass(name: String): Class[_] =
    map.get(name).map { bytes =>
      println(s"defineClass($name, ...)")
      defineClass(name, bytes, 0, bytes.length)
    } .getOrElse(super.findClass(name)) // throws exception
}
And a test case which contains additional classes (closures):
val exampleSource =
  """val xs = List("hello", "world")
    |println(xs.map(_.capitalize).mkString(" "))
    |""".stripMargin

def test(fun: String, cl: ClassLoader): Unit = {
  val clName = s"$packageName.$fun"
  println(s"Resolving class '$clName'...")
  val clazz = Class.forName(clName, true, cl)
  println("Instantiating...")
  val x = clazz.newInstance().asInstanceOf[() => Unit]
  println("Invoking 'apply':")
  x()
}

locally {
  println("Compiling...")
  val (fun, bytes) = compile(exampleSource)
  val map = unpackJar(bytes)
  println("Classes found:")
  map.keys.foreach(k => println(s"  '$k'"))
  val cl = new MemoryClassLoader(map)
  test(fun, cl) // should call `defineClass`
  test(fun, cl) // should find the cached class
}