Can I avoid using structural typing in this scenario?

I have some code that uses both a third-party library and my own library. In my own library, I don't want to have a dependency on the third-party one so I want one of my methods to accept a more generic type as a parameter. Unfortunately, I cannot extend or mixin a trait to the 3rd party classes since they are generated using factory methods & the classes are final.
I can get around this issue by using structural typing but I was wondering if there is an alternative? I don't want to have to iterate through each record returned by the factory method and "new up" instances of a separate type if possible.
I've boiled it down to a scenario like the following:
Third-party library code that cannot be changed
// Class inside library cannot be extended due to it being 'final'
final class SpecificRecord(val values: IndexedSeq[String]) {
  def get(i: Int): String = {
    values(i)
  }
}
// A companion object simply to create some sample data in an iterator
object SpecificRecord {
  def generateSpecificRecords(): Iterator[SpecificRecord] = new Iterator[SpecificRecord] {
    var pointerLocation: Int = 0
    private val state = IndexedSeq(
      IndexedSeq("Row1 Col1", "Row1 Col2", "Row 1 Col3"),
      IndexedSeq("Row2 Col1", "Row2 Col2", "Row 2 Col3")
    )
    override def hasNext: Boolean = {
      if (pointerLocation < state.length) true else false
    }
    override def next(): SpecificRecord = {
      val record = new SpecificRecord(state(pointerLocation))
      pointerLocation += 1
      record
    }
  }
}
As you can see above, the SpecificRecord class is final and the specificRecords val is an Iterator with a bunch of SpecificRecord in it. I don't want to have to iterate through each specificRecord and create a new, more generic, object if possible.
My code that can be changed
val specificRecords: Iterator[SpecificRecord] = SpecificRecord.generateSpecificRecords()
type gettable = {
  def get(i: Int): String
}
def printRecord(records: Iterator[gettable]): Unit = {
  for (record <- records) {
    println(record.get(0), record.get(1), record.get(2))
  }
}
printRecord(specificRecords)
This correctly prints:
(Row1 Col1,Row1 Col2,Row 1 Col3)
(Row2 Col1,Row2 Col2,Row 2 Col3)
I have a printRecord method that doesn't really care what type is passed in, as long as it has a method like get(Int): String. This is a pretty decent solution, but I was wondering if it would be possible to do this without structural typing?

This is a typical use case for type classes.
trait Gettable[T] {
  def get(t: T, i: Int): String
}
object Gettable {
  implicit object SpecificRecordGettable extends Gettable[SpecificRecord] {
    def get(sr: SpecificRecord, i: Int) = sr.get(i)
  }
}
def printRecord[T: Gettable](records: Iterator[T]) = {
  val getter = implicitly[Gettable[T]]
  records.foreach { record =>
    println(getter.get(record, 0), getter.get(record, 1), getter.get(record, 2))
  }
}
This is a bit more verbose than your approach with structural types: for each type you want to be gettable, you have to add an implicit object implementing get. But it works without reflection, which is a good thing.
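For example, with the instance above sitting in Gettable's companion object (so no import is needed), the original call site should work unchanged and print the same two rows:
printRecord(SpecificRecord.generateSpecificRecords())
// (Row1 Col1,Row1 Col2,Row 1 Col3)
// (Row2 Col1,Row2 Col2,Row 2 Col3)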
Another advantage of this approach is its flexibility: the underlying type does not have to have a get method specifically; you can implement anything in the implicit instance. E.g.:
implicit object ArrayGettable extends Gettable[Array[String]] {
  def get(a: Array[String], i: Int) = a(i)
}
implicit object ProductGettable extends Gettable[Product] {
  def get(p: Product, i: Int) = p.productIterator.drop(i).next.toString
}
Now, your printRecord works with string arrays too (as long as they have at least three elements), and even tuples and case classes.
Try this:
printRecord[Product](Iterator((1,2, "three"), ("foo", "bar", 5)))
Or this:
case class Foo(x: String, y: Int, z: Seq[Int])
printRecord[Product](Iterator(Foo("bar", 1, 1::2::Nil), ("foo", "bar", "baz")))
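And, assuming ArrayGettable is also in implicit scope (e.g. placed in Gettable's companion object or imported), plain string arrays work the same way:
printRecord(Iterator(Array("a1", "a2", "a3"), Array("b1", "b2", "b3")))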
A similar but a little bit less verbose approach is to just define an implicit 'getter' without bothering with type classes:
def printRecord[T](records: Iterator[T])(implicit getter: (T, Int) => String) =
  records.foreach { record =>
    println(getter(record, 0), getter(record, 1), getter(record, 2))
  }
object Getters {
  implicit def getter(sr: SpecificRecord, i: Int) = sr.get(i)
  implicit def getter(a: Array[String], i: Int) = a(i)
  implicit def getter(p: Product, i: Int) = p.productIterator.drop(i).next.toString
}
This is fairly equivalent in usage; the difference is that a type class lets you define more than one method, but if you only ever need get, this saves you a few keystrokes.
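Usage then looks like this (a sketch; it relies on the Scala 2 compiler eta-expanding the implicit defs in Getters to the expected (T, Int) => String function type):
import Getters._
printRecord(SpecificRecord.generateSpecificRecords())
If the compiler struggles with the overloaded implicit defs, an explicit implicit function value such as implicit val srGetter: (SpecificRecord, Int) => String = (sr, i) => sr.get(i) works just as well.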

Related

Scala ClassCastException on Option.orNull

When I try to run following code:
def config[T](key: String): Option[T] = {
  // in reality this is a map of various instance types as values
  Some("string".asInstanceOf[T])
}
config("path").orNull
I'm getting this error:
java.lang.ClassCastException: java.lang.String cannot be cast to scala.runtime.Null$
Following attempts are working fine:
config[String]("path").orNull
config("path").getOrElse("")
Since getOrElse works, it's confusing why null is so special and throws an error. Is there a way for orNull to work without specifying the generic type?
scalaVersion := "2.12.8"
Just to show how you may avoid using asInstanceOf to get the values from a typed config.
sealed trait Value extends Product with Serializable
final case class IntValue(value: Int) extends Value
final case class StringValue(value: String) extends Value
final case class BooleanValue(value: Boolean) extends Value
type Config = Map[String, Value]
sealed trait ValueExtractor[T] {
  def extract(config: Config)(fieldName: String): Option[T]
}
object ValueExtractor {
  implicit final val IntExtractor: ValueExtractor[Int] =
    new ValueExtractor[Int] {
      override def extract(config: Config)(fieldName: String): Option[Int] =
        config.get(fieldName).collect {
          case IntValue(value) => value
        }
    }
  implicit final val StringExtractor: ValueExtractor[String] =
    new ValueExtractor[String] {
      override def extract(config: Config)(fieldName: String): Option[String] =
        config.get(fieldName).collect {
          case StringValue(value) => value
        }
    }
  implicit final val BooleanExtractor: ValueExtractor[Boolean] =
    new ValueExtractor[Boolean] {
      override def extract(config: Config)(fieldName: String): Option[Boolean] =
        config.get(fieldName).collect {
          case BooleanValue(value) => value
        }
    }
}
implicit class ConfigOps(val config: Config) extends AnyVal {
  def getAs[T](fieldName: String)(default: => T)(implicit extractor: ValueExtractor[T]): T =
    extractor.extract(config)(fieldName).getOrElse(default)
}
Then, you can use it like this.
val config = Map("a" -> IntValue(10), "b" -> StringValue("Hey"), "d" -> BooleanValue(true))
config.getAs[Int](fieldName = "a")(default = 0) // res: Int = 10
config.getAs[Int](fieldName = "b")(default = 0) // res: Int = 0
config.getAs[Boolean](fieldName = "c")(default = false) // res: Boolean = false
Now, the problem becomes how to create the typed config from a raw source.
And even better, how to directly map the config to a case class.
But those are more complex, and it is probably better to just use something already done, like pureconfig.
Just as an academic exercise, let's see if we can support Lists & Maps.
Let's start with lists. A naive approach would be to have another case class for values which are lists, and to create a factory of extractors for every kind of list (this process is formally known as implicit derivation).
import scala.reflect.ClassTag
final case class ListValue[T](value: List[T]) extends Value
...
// Note that it has to be a def, since it is not only one implicit,
// but rather a factory of implicits.
// Also note that it needs another implicit parameter to construct the specific implicit:
// in this case, a ClassTag for the inner type of the list to extract.
implicit final def listExtractor[T: ClassTag]: ValueExtractor[List[T]] =
  new ValueExtractor[List[T]] {
    override def extract(config: Config)(fieldName: String): Option[List[T]] =
      config.get(fieldName).collect {
        case ListValue(value) => value.collect {
          // This works as a safe cast, dropping every value that could not be cast.
          case t: T => t
        }
      }
  }
Now, you can use it like this.
val config = Map("l" ->ListValue(List(1, 2, 3)))
config.getAs[List[Int]](fieldName = "l")(default = List.empty)
// res: List[Int] = List(1, 2, 3)
config.getAs[List[String]](fieldName = "l")(default = List("Hey"))
// res: String = List() - The default is not used, since the field is a List...
// whose no element could be casted to String.
However, this approach is limited to plain types; if you need a List of another generic type, like a List of Lists, it won't work.
val config = Map("l" ->ListValue(List(List(1, 2), List(3))))
val l = config.getAs[List[List[String]]](fieldName = "l")(default = List.empty)
// l: List[List[String]] = List(List(1, 2), List(3)) ???!!!
l.head
// res: List[String] = List(1, 2)
l.head.head
// java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
The problem here is type erasure, which ClassTags cannot solve. You may try TypeTags, which preserve the complete type, but the solution becomes more cumbersome.
For Maps the solution is quite similar, especially if you fix the key type to String (assuming what you really want is a nested config). But, this post is too long now, so I would leave it as an exercise for the reader.
Nevertheless, as already said, this can be broken easily, and is not completely robust.
There are better approaches, but I myself am not very skilled in those (yet), and even if I were, the answer would be even longer and really not necessary at all.
Luckily for you, even though pureconfig does not support YAML directly, there is a module which does: pureconfig-yaml.
I would suggest you take a look at the module, and if you have further problems, ask a new question tagging pureconfig and yaml directly. Also, if it is just a small doubt, you may try asking in the gitter channel.

Class type required but E found (scala macro)

I'm trying to remove some of the boilerplate in an API I am writing.
Roughly speaking, my API currently looks like this:
def toEither[E <: WrapperBase](priority: Int)(implicit factory: (String, Int) => E): Either[E, T] = {
  val either: Either[String, T] = generateEither()
  either.left.map(s => factory(s, priority))
}
Which means that the user has to generate an implicit factory for every E used. I am looking to replace this with a macro that gives a nice compile error if the user provided type doesn't have the correct ctor parameters.
I have the following:
import scala.language.experimental.macros
import scala.reflect.macros.blackbox

object GenericFactory {
  def create[T](ctorParams: Any*): T = macro createMacro[T]
  def createMacro[T](c: blackbox.Context)(ctorParams: c.Expr[Any]*)(implicit wtt: c.WeakTypeTag[T]): c.Expr[T] = {
    import c.universe._
    c.Expr[T](q"new $wtt(..$ctorParams)")
  }
}
If I provide a real type, as in GenericFactory.create[String]("hey"), I have no issues, but if I provide a generic type, as in GenericFactory.create[E]("hey"), then I get the following compile error: class type required but E found.
Where have I gone wrong? Or if what I want is NOT possible, is there anything else I can do to reduce the effort for the user?
Sorry, but I don't think you can make it work. The problem is that Scala (like Java) uses type erasure: there is only one compiled version of the code for all instantiations of a generic (possibly except for value-type specializations, which are not important here). It means the macro is expanded only once for all E, rather than once for each specialization of E provided by the user. And there is no way to express a restriction that some generic type E must have a constructor with a given signature (and if there were, you wouldn't need your macro in the first place). So obviously it cannot work, because the compiler can't generate a constructor call for a generic type E. What the compiler is telling you is that, to generate a constructor call, it needs a real class rather than the generic E.
To put it another way, a macro is not a magic tool. Using a macro is just a way to rewrite a piece of code early in the compilation process; after that, it is processed by the compiler in the usual way. And what your macro does is rewrite
GenericFactory.create[E]("hey")
with something like
new E("hey")
If you just write that in your code, you'll get the same error (and probably will not be surprised).
I don't think you can avoid using your implicit factory. You probably could modify your macro to generate those implicit factories for valid types but I don't think you can improve the code further.
Update: implicit factory and macro
If you have just one place where you need one kind of constructor, I think the best you can do (or rather the best I can do ☺) is the following.
Sidenote: the whole idea comes from the "Implicit macros" article.
You define a StringIntCtor[T] type class trait and a macro that generates its instances:
import scala.language.experimental.macros
import scala.reflect.macros._

trait StringIntCtor[T] {
  def create(s: String, i: Int): T
}
object StringIntCtor {
  implicit def implicitCtor[T]: StringIntCtor[T] = macro createMacro[T]
  def createMacro[T](c: blackbox.Context)(implicit wtt: c.WeakTypeTag[T]): c.Expr[StringIntCtor[T]] = {
    import c.universe._
    val targetTypes = List(typeOf[String], typeOf[Int])
    def testCtor(ctor: MethodSymbol): Boolean = {
      if (ctor.paramLists.size != 1)
        false
      else {
        val types = ctor.paramLists(0).map(sym => sym.typeSignature)
        (targetTypes.size == types.size) && targetTypes.zip(types).forall(tp => tp._1 =:= tp._2)
      }
    }
    val ctors = wtt.tpe.decl(c.universe.TermName("<init>"))
    if (!ctors.asTerm.alternatives.exists(sym => testCtor(sym.asMethod))) {
      c.abort(c.enclosingPosition, s"Type ${wtt.tpe} has no constructor with signature <init>${targetTypes.mkString("(", ", ", ")")}")
    }
    // Note that using fully qualified names for all types except those imported by default is important here
    val res = c.Expr[StringIntCtor[T]](
      q"""
        (new so.macros.StringIntCtor[$wtt] {
          override def create(s: String, i: Int): $wtt = new $wtt(s, i)
        })
      """)
    //println(res) // log the macro
    res
  }
}
You use that trait as
class WrapperBase(val s: String, val i: Int)
case class WrapperChildGood(override val s: String, override val i: Int, val float: Float) extends WrapperBase(s, i) {
  def this(s: String, i: Int) = this(s, i, 0f)
}
case class WrapperChildBad(override val s: String, override val i: Int, val float: Float) extends WrapperBase(s, i)
object EitherHelper {
  type T = String
  import scala.util._
  val rnd = new Random(1)
  def generateEither(): Either[String, T] = {
    if (rnd.nextBoolean()) {
      Left("left")
    } else {
      Right("right")
    }
  }
  def toEither[E <: WrapperBase](priority: Int)(implicit factory: StringIntCtor[E]): Either[E, T] = {
    val either: Either[String, T] = generateEither()
    either.left.map(s => factory.create(s, priority))
  }
}
So now you can do:
val x1 = EitherHelper.toEither[WrapperChildGood](1)
println(s"x1 = $x1")
val x2 = EitherHelper.toEither[WrapperChildGood](2)
println(s"x2 = $x2")
//val bad = EitherHelper.toEither[WrapperChildBad](3) // compilation error generated by c.abort
and it will print
x1 = Left(WrapperChildGood(left,1,0.0))
x2 = Right(right)
If you have many different places where you want to ensure different constructors exist, you'll need to make the macro much more complicated, so that it can generate constructor calls with arbitrary signatures passed from the outside.

Different ways of building typeclasses in Scala?

A typeclass example taken from the Programming Scala book:
case class Address(street: String, city: String)
case class Person(name: String, address: Address)
trait ToJSON {
  def toJSON(level: Int = 0): String
  val INDENTATION = " "
  def indentation(level: Int = 0): (String, String) =
    (INDENTATION * level, INDENTATION * (level + 1))
}
implicit class AddressToJSON(address: Address) extends ToJSON {
  def toJSON(level: Int = 0): String = {
    val (outdent, indent) = indentation(level)
    s"""{
       |${indent}"street": "${address.street}",
       |${indent}"city": "${address.city}"
       |$outdent}""".stripMargin
  }
}
implicit class PersonToJSON(person: Person) extends ToJSON {
  def toJSON(level: Int = 0): String = {
    val (outdent, indent) = indentation(level)
    s"""{
       |${indent}"name": "${person.name}",
       |${indent}"address": ${person.address.toJSON(level + 1)}
       |$outdent}""".stripMargin
  }
}
val a = Address("1 Scala Lane", "Anytown")
val p = Person("Buck Trends", a)
println(a.toJSON())
println()
println(p.toJSON())
The code works fine, but I am under the impression (from some blog posts) that type classes are typically done this way in Scala:
// src/main/scala/progscala2/implicits/toJSON-type-class.sc
case class Address(street: String, city: String)
case class Person(name: String, address: Address)
trait ToJSON[A] {
  def toJSON(a: A, level: Int = 0): String
  val INDENTATION = " "
  def indentation(level: Int = 0): (String, String) =
    (INDENTATION * level, INDENTATION * (level + 1))
}
object ToJSON {
  implicit def addressToJson: ToJSON[Address] = new ToJSON[Address] {
    override def toJSON(address: Address, level: Int = 0): String = {
      val (outdent, indent) = indentation(level)
      s"""{
         |${indent}"street": "${address.street}",
         |${indent}"city": "${address.city}"
         |$outdent}""".stripMargin
    }
  }
  implicit def personToJson: ToJSON[Person] = new ToJSON[Person] {
    override def toJSON(a: Person, level: Int): String = {
      val (outdent, indent) = indentation(level)
      s"""{
         |${indent}"name": "${a.name}",
         |${indent}"address": ${implicitly[ToJSON[Address]].toJSON(a.address, level + 1)}
         |$outdent}""".stripMargin
    }
  }
  def toJSON[A](a: A, level: Int = 0)(implicit ev: ToJSON[A]) = {
    ev.toJSON(a, level)
  }
}
val a = Address("1 Scala Lane", "Anytown")
val p = Person("Buck Trends", a)
import ToJSON.toJSON
println(toJSON(a))
println(toJSON(p))
Which way is better or more correct? Any insights are welcome.
It's a stretch to call the first ToJSON a "type class" at all (although it's not like these terms are standardized, and even your second, more Scala-idiomatic version differs in many important ways from e.g. type classes in Haskell).
One of the properties of type classes that I would consider definitional is that they allow you to constrain generic types. Scala provides special syntax to support this in the form of context bounds, so I can write e.g. the following:
import io.circe.Encoder
def foo[A: Numeric: Encoder](a: A) = ...
This constrains the type A to have both Numeric and Encoder instances.
This syntax isn't available for the first ToJSON; you'd have to use something like view bounds (now deprecated) or implicit conversion parameters instead.
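With the first encoding, the closest you can get is an implicit conversion parameter, which is roughly what the deprecated view bound A <% ToJSON desugared to. A sketch (jsonOf is a hypothetical helper, reusing the first ToJSON from the question):
def jsonOf[A](a: A, level: Int = 0)(implicit view: A => ToJSON): String =
  view(a).toJSON(level)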
There are also many kinds of operations that can't be provided by the first ToJSON style. For example, suppose we've got a Monoid that uses the standard Scala encoding of type classes:
trait Monoid[A] {
  def empty: A
  def plus(x: A, y: A): A
}
And we wanted to translate it into the first style, where we have an unparametrized Monoid trait that will be the target of implicit conversions from types we want to be able to treat as monoidal. We're entirely out of luck, since we don't have a type parameter to refer to in our empty and plus signatures.
One other argument: the type classes in the standard library (Ordering, CanBuildFrom, etc.) all use the second style, as do the vast majority of third-party Scala libraries you'll come across.
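For instance, Ordering from the standard library is used exactly like the second ToJSON (largest is a hypothetical helper):
def largest[A: Ordering](xs: List[A]): A = xs.max // max takes an implicit Ordering[A]
largest(List(3, 1, 2))       // 3
largest(List("b", "c", "a")) // "c"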
In short, don't ever use the first version. It only works when all your operations have the form A => Whatever (for some concrete Whatever), it doesn't have nice syntactic support, and it isn't generally considered idiomatic by the community.

Concise way to enforce implementation of factory in Scala

Let us assume we have a trait T. What is the best way to achieve the following:
Everybody who writes an implementation of T should be forced to provide a possibility that allows a parameter-free initialization of T, i.e., we probably have to enforce the implementation of a configurable factory.
All logic/data that only depends on the actual initialization parameters (of a certain implementation A of T) should be handled/stored centrally, but should be available in both the factory and A.
The most simple/concise way I see to achieve this (approximately) would be to add a trait for a factory and link T to this factory:
trait T {
  val factory: TFactory
}
trait TFactory {
  def build(): T
  val description: String // example for logic/data that only depend on the parameters
}
// example implementation:
class A(val factory: AFactory, paramA: Int, paramB: Int, paramC: Int) extends T
class AFactory(paramA: Int, paramB: Int, paramC: Int) extends TFactory {
  def build() = new A(this, paramA, paramB, paramC)
  val description = f"$paramA $paramB $paramC"
}
Obviously this does not really "enforce" the implementation of a factory (as long as there is an alternative implementation available) and obviously it is possible to generate instantiations of A which link to a "wrong" TFactory. What I also don't like about this approach is the repetition of the initialization parameters. I often create yet another class AParams which again wraps all parameters (for instance to facilitate adding new parameters). Thus, I end up with three classes, which imho is a lot of boilerplate for this simple problem.
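For concreteness, the three-class shape being described would look roughly like this (a sketch reusing T and TFactory from above; AParams is the hypothetical parameter wrapper):
case class AParams(paramA: Int, paramB: Int, paramC: Int)
class A(val factory: AFactory, params: AParams) extends T
class AFactory(params: AParams) extends TFactory {
  def build() = new A(this, params)
  val description = f"${params.paramA} ${params.paramB} ${params.paramC}"
}
This is exactly the kind of repetition I would like to avoid.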
My question is whether there is a (maybe completely) different approach, which achieves the same primary goals but is more concise?
I'm not quite sure I get the full intent of your requirements but what do you think of this behavior?
trait TFactory {
  def build(): T
  val description: String
}
trait T extends TFactory

// can't declare A without build and not make it abstract
class A(paramA: Int, paramB: Int, paramC: Int) extends T {
  def build() = new A(paramA, paramB, paramC)
  val description = f"$paramA $paramB $paramC"
}

val a1 = new A(1, 4, 5)
val a2 = a1.build()

// We can give ourselves as a factory to something that expects TFactory
val factory: TFactory = a1
val a_new = factory.build()

// More likely we can just give our build method
def func(f: () => T) = {
  val new_t = f()
  new_t
}
val a_newer = func(a1.build)

println(a1 + ": " + a1.description)
println(a2 + ": " + a2.description)
println(a_new + ": " + a_new.description)
println(a_newer + ": " + a_newer.description)
Output:
Main$$anon$1$A#69267649: 1 4 5
Main$$anon$1$A#69b1fbf4: 1 4 5
Main$$anon$1$A#24148662: 1 4 5
Main$$anon$1$A#3f829e6f: 1 4 5
Add a representation type parameter:
trait Factory[Prod] {
  def build(): Prod
}
trait Prod[Repr] {
  def factory: Factory[Repr]
}
Or, if you want to "enforce" that the type remains the same (I wouldn't do that unless you gain something from it):
trait Prod[Repr <: Prod[Repr]] {
  def factory: Factory[Repr]
}
Then:
case class AConfig(a: Int, b: Int)
case class A(config: AConfig) extends Prod[A] {
  def factory = AFactory(config)
}
case class AFactory(config: AConfig) extends Factory[A] {
  def build() = A(config)
}
val f0 = AFactory(AConfig(1, 2))
val p0 = f0.build()
val f1 = p0.factory
val p1 = f1.build()
assert(p0 == p1)

Same method, different argument types, implemented in a single go

I have a (Java) class with operations like this:
abstract class Holder {
  def set(i: Int): Unit
  def set(s: String): Unit
  def set(b: Boolean): Unit
  ...
}
Essentially, the all perform the same task, but just take different argument types. I would love to create a generic Accessor[T] that performs something like this:
class Accessor[T](holder: Holder) {
  def set(value: T) { holder.set(value) }
}
... but that gives:
<console>:16: error: overloaded method value set with alternatives:
  (s: String)Unit <and>
  (i: Int)Unit
  (b: Boolean)Unit
 cannot be applied to (T)
       def set(value: T) { holder.set(value) }
Is there any way out?
Use reflection.
class Setter(obj: AnyRef) {
  val clazz = obj.getClass
  def set[T: Manifest](v: T): Boolean = try {
    val paramType = manifest[T].erasure
    val method = clazz.getMethod("set", paramType)
    method.invoke(obj, v.asInstanceOf[AnyRef])
    true
  } catch {
    case ex => false
  }
}
val holder = ..
val setter = new Setter(holder)
setter.set(5) // returns true
setter.set(1.0) // double not accepted, returns false
There was an experimental shortcut for that in Scala, but it got removed before 2.8.0 was released.
I think matching should work nicely
def set(value: T) {
  value match {
    case s: String => holder.set(s)
    case i: Int => holder.set(i)
    case b: Boolean => holder.set(b)
  }
}
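Put back into the generic Accessor from the question, that would look like the sketch below; note that an unsupported T still compiles and only fails at runtime:
class Accessor[T](holder: Holder) {
  def set(value: T): Unit = value match {
    case s: String  => holder.set(s)
    case i: Int     => holder.set(i)
    case b: Boolean => holder.set(b)
  }
}
// e.g. new Accessor[Double](someHolder).set(3.14) would compile but throw a MatchError at runtime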
I don't fully understand your use case, but one thing that you might try doing--if performance is not of utmost importance--is creating a wrapper class that converts to a universal form for you, and then have all your methods take that wrapper class (with appropriate implicit conversions in place). For example:
class Wrap(val data: String)
implicit def wrapString(s: String) = new Wrap(s)
implicit def wrapBoolean(b: Boolean) = if (b) new Wrap("T") else new Wrap("F")
implicit def wrapLong(l: Long) = new Wrap(l.toString+"L")
class User {
  private[this] var myData = ""
  def set(w: Wrap) { println("Setting to " + w.data); myData = w.data }
}
val u = new User
u.set(true)
u.set(50L)
u.set(50) // Int gets upconverted to Long for free, so this works
u.set("Fish")
// u.set(3.14159) // This is a type mismatch
This is a little bit like taking an Any except that you can restrict the types however you like and specify the conversion into whatever universal representation you have in mind. However, if there does not exist a universal form, then I'm not sure in what sense that you mean the code is doing the same thing each time. (Maybe you mean that you can conceive of a macro (or another program) that would generate the code automatically--Scala doesn't have that support built in, but you can of course write a Scala program that produces Scala code.)
Looking back at the results gathered so far, there are a couple of solutions suggested:
Use pattern matching (leads to fragmentation of the different strategies of dealing with the different parameter types)
Use reflection (too expensive for something that ideally should be super fast)
... and adding the one that I eventually ended up implementing: write an adapter per type of parameter.
To be a little more precise, the whole exercise was about writing a wrapper around Kyoto Cabinet. Kyoto Cabinet has methods for associating byte array keys with byte array values and String keys with String values. And then it basically replicates most of the operations for dealing with keys and values for both byte array as well as Strings.
In order to create a Map wrapper around Kyoto Cabinet's DB class, I defined a trait TypedDBOperations[K, V], parameterized by the key and value types, and had it implemented twice. If I now construct a Map[Array[Byte], Array[Byte]], an implicit will automatically supply the proper instance of TypedDBOperations, calling the Array[Byte]-based operations of the DB class.
This is the trait that I have been talking about:
trait TypedDBOperations[K, V] {
  def get(db: DB, key: K): V
  def set(db: DB, key: K, value: V): Boolean
  def remove(db: DB, key: K): Boolean
  def get(cursor: Cursor): (K, V)
}
And these are the implementations for both types of key/value combinations:
implicit object StringDBOperations extends TypedDBOperations[String, String] {
  def get(cursor: Cursor) = {
    val Array(a, b) = cursor.get_str(false)
    (a, b)
  }
  def remove(db: DB, key: String) = db.remove(key)
  def set(db: DB, key: String, value: String) = db.set(key, value)
  def get(db: DB, key: String) = db.get(key)
}
implicit object ByteArrayOperations extends TypedDBOperations[Array[Byte], Array[Byte]] {
  def get(cursor: Cursor) = {
    val Array(a, b) = cursor.get(false)
    (a, b)
  }
  def remove(db: DB, key: Array[Byte]) = db.remove(key)
  def set(db: DB, key: Array[Byte], value: Array[Byte]) = db.set(key, value)
  def get(db: DB, key: Array[Byte]) = db.get(key)
}
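To illustrate how the right instance gets picked, a hypothetical map-like wrapper over Kyoto Cabinet's DB could take the operations as an implicit parameter (KyotoMap and example are illustrative only, not part of the actual wrapper, and they assume the two implicit objects above are in implicit scope):
class KyotoMap[K, V](db: DB)(implicit ops: TypedDBOperations[K, V]) {
  def get(key: K): V = ops.get(db, key)
  def put(key: K, value: V): Boolean = ops.set(db, key, value)
  def remove(key: K): Boolean = ops.remove(db, key)
}
// Given an open DB, the key/value types select the instance:
def example(db: DB): Unit = {
  val strings = new KyotoMap[String, String](db)           // picks up StringDBOperations
  val bytes   = new KyotoMap[Array[Byte], Array[Byte]](db) // picks up ByteArrayOperations
  strings.put("key", "value")
  bytes.put("key".getBytes, "value".getBytes)
}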
Not the most satisfying solution ever, but it gets the job done. Again, note there still is quite a bit of duplication, but it seems there's no way to get rid of it.