It does not look very good to repeat a line-long tuple definition every time I need it. Can I just name it and use it as a type name? It would also be nice to name its fields instead of using ._1, ._2, etc.
Regarding your first question, you can simply use a type alias:
type KeyValue = (Int, String)
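The alias then works anywhere a type annotation does, for example (my own illustration):
def lookup(id: Int): KeyValue = (id, s"name-$id")

val kv: KeyValue = lookup(42)
kv._2  // "name-42"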
And, of course, Scala is an object-oriented language, so regarding your second question about how to specialize a tuple, the magic word is inheritance:
case class KeyValue(key: Int, value: String) extends (Int, String)(key, value)
That's it. The class doesn't even need a body.
val kvp = KeyValue(42, "Hello")
kvp._1 // => res0: Int = 42
kvp.value // => res1: String = "Hello"
Note, however, that inheriting from case classes (which Tuple2 is), is deprecated and may be disallowed in the future. Here's the compiler warning you get for the above class definition:
warning: case class class KV has case class ancestor class Tuple2. This has been deprecated for unduly complicating both usage and implementation. You should instead use extractors for pattern matching on non-leaf nodes.
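A forward-compatible alternative (my own sketch, following the warning's advice) is a plain case class with explicit conversions to and from the tuple instead of inheritance:
case class KeyValue(key: Int, value: String) {
  def toTuple: (Int, String) = (key, value)
}

object KeyValue {
  def fromTuple(t: (Int, String)): KeyValue = KeyValue(t._1, t._2)
}

KeyValue(42, "Hello").value            // "Hello"
KeyValue.fromTuple((42, "Hello")).key  // 42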
A type alias is fine for naming your tuple, but try using a case class instead. You will be able to use named parameters.
Example with tuple:
def foo(a: Int): (Int, String) = {
  (a, "bar")
}
val res = foo(1)
val size = res._1
val name = res._2
With a case class:
case class Result(size: Int, name: String)
def foo(a: Int): Result = {
  Result(a, "bar")
}
val res = foo(1)
val size = res.size
val name = res.name
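For instance (my own illustration), named arguments and copy come for free with the case class:
val r = Result(size = 1, name = "bar")
val renamed = r.copy(name = "baz")  // Result(1, "baz")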
Here's a solution that creates a type alias and a factory object.
scala> type T = (Int, String)
defined type alias T
scala> object T { def apply(i: Int, s: String): T = (i, s) }
defined module T
scala> new T(1, "a")
res0: (Int, String) = (1,a)
scala> T(1, "a")
res1: (Int, String) = (1,a)
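If you also want pattern matching on the alias (my own extension of the idea), the factory object can carry an unapply as well:
object T {
  def apply(i: Int, s: String): T = (i, s)
  def unapply(t: T): Option[(Int, String)] = Some(t)
}

T(1, "a") match { case T(i, s) => s * i }  // "a"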
However as others have mentioned, you probably should just create a case class.
Although as others have said, explicit (case) classes are best in the general sense.
However for localized scenarios what you can do is to use the tuple extractor to improve code readability:
val (first, second) = incrementPair(3, 4)
println(s"$first ... $second")
Given a method returning a tuple:
def incrementPair(pair: (Int, Int)) : (Int, Int) = {
val (first, second) = pair
(first + 1, second + 1)
}
Related
I am new to Scala and I need to provide values extracted from an object/case class as a Seq. I was wondering whether there is any generic way of extracting the values of an object into a Seq of those values, in order?
Convert the following:
case class Customer(name: Option[String], age: Int)
val customer = Customer(Some("John"), 24)
into:
val values = Seq("John", 24)
A case class extends the Product trait, which provides such a method:
case class Person(age:Int, name:String, lastName:Option[String])
def seq(p:Product) = p.productIterator.toList
val s:Seq[Any] = seq(Person(100, "Albert", Some("Einstain")))
println(s) //List(100, Albert, Some(Einstain))
https://scalafiddle.io/sf/oD7qk8u/0
The problem is that you will get an untyped list/array from it. Most of the time this is not the optimal way of doing things, and you should always prefer statically typed solutions.
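For example (my own illustration), the element type is Any, so nothing type-specific compiles against the result:
val age = s.head  // age: Any = 100
// age + 1        // does not compile: the static type is Any, not Int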
Scala 3 (Dotty) might give us HList out-of-the-box, which is a way of getting a product's values without losing type information. Given val picard = Customer(Some("Picard"), 75), consider the difference between
val l: List[Any] = picard.productIterator.toList
l(1)
// val res0: Any = 75
and
val hl: (Option[String], Int) = Tuple.fromProductTyped(picard)
hl(1)
// val res1: Int = 75
Note how res1 did not lose type information.
Informally, it might help to think of an HList as making a case class more generic by dropping its name whilst retaining its fields. For example, whilst Person and Robot are two separate models
Robot(name: Option[String], age: Int)
Person(name: Option[String], age: Int)
they could both be represented by a common "HList" that looks something like
(_: Option[String], _: Int) // I dropped the names
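Concretely (my own sketch, in Scala 3), both case classes map to the same tuple type once the name is dropped:
// Scala 3
case class Robot(name: Option[String], age: Int)
case class Person(name: Option[String], age: Int)

val r: (Option[String], Int) = Tuple.fromProductTyped(Robot(Some("R2"), 3))
val p: (Option[String], Int) = Tuple.fromProductTyped(Person(Some("Jean-Luc"), 59))
// r and p now share the single type (Option[String], Int)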
If it's enough for you to have a Seq[Any], you can use the productIterator approach proposed by @Scalway. If I understood correctly, you also want to unpack Option fields, but you haven't specified what to do with the None case, e.g. Customer(None, 24).
val values: Seq[Any] = customer.productIterator.map {
case Some(x) => x
case x => x
}.toSeq // List(John, 24)
A statically typed solution would be to use a heterogeneous collection, e.g. Shapeless's HList:
import shapeless._

class Default[A](val value: A)
object Default {
implicit val int: Default[Int] = new Default(0)
implicit val string: Default[String] = new Default("")
//...
}
trait LowPriorityUnpackOption extends Poly1 {
implicit def default[A]: Case.Aux[A, A] = at(identity)
}
object unpackOption extends LowPriorityUnpackOption {
implicit def option[A](implicit default: Default[A]): Case.Aux[Option[A], A] = at {
case Some(a) => a
case None => default.value
}
}
val values: String :: Int :: HNil =
Generic[Customer].to(customer).map(unpackOption) // John :: 24 :: HNil
Generally, it would be better to work with Options monadically rather than to unpack them.
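For example (my own sketch), mapping over the Option keeps the absence case explicit instead of discarding it:
val greeting: Option[String] = customer.name.map(n => s"Hello, $n")  // Some("Hello, John")
val displayName: String      = customer.name.getOrElse("unknown")    // "John"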
I have a generic map with values, some of which can be in turn lists of values.
I'm trying to process a given key and convert the results to the type expected by an outside caller, like this:
// A map with some values being other collections.
val map: Map[String, Any] = Map("foo" -> 1, "bar" -> Seq('a', 'b', 'a'))
// A generic method with a "specialization" for collections (pseudocode)
def cast[T](key: String) = map.get(key).map(_.asInstanceOf[T])
def cast[C <: Iterable[T]](key: String) = map.get(key).map(list => list.to[C].map(_.asInstanceOf[T]))
// Expected usage
cast[Int]("foo") // Should return 1:Int
cast[Set[Char]]("bar") // Should return Set[Char]('a', 'b')
This is to show what I would like to do, but it does not work: the compiler complains (correctly) about two ambiguous matches. I've also tried to make this a single function with some sort of pattern match on the type, to no avail.
I've been reading about @specialized, TypeTag, CanBuildFrom and other Scala functionality, but I failed to find a simple way to put it all together. Separate examples I've found address different pieces or use ugly workarounds, but nothing that would simply allow an external caller to call cast and get an exception if the cast was invalid. Some of the material is also old; I'm using Scala 2.10.5.
This appears to work, but it has some problems.
def cast[T](m: Map[String, Any], k: String):T = m(k) match {
case x: T => x
}
With the right input you get the correct output.
scala> cast[Int](map,"foo")
res18: Int = 1
scala> cast[Set[Char]](map,"bar")
res19: Set[Char] = Set(a, b)
But it throws if the type is wrong for the key or if the map has no such key (of course).
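To be precise about where it fails (my own note): because T is erased, the type test inside cast is unchecked, so a wrong type argument still compiles (with an "unchecked" warning) and the ClassCastException is only raised at the call site, where the result is used at the requested type:
val wrong = cast[String](map, "foo")
// compiles; fails with a ClassCastException where the result is bound/used as a String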
You can do this via implicit parameters:
val map: Map[String, Any] = Map("foo" -> 1, "bar" -> Set('a', 'b'))
abstract class Casts[B] {def cast(a: Any): B}
implicit val doubleCast = new Casts[Double] {
override def cast(a: Any): Double = a match {
case x: Int => x.toDouble
}
}
implicit val intCast = new Casts[Int] {
override def cast(a: Any): Int = a match {
case x: Int => x
case x: Double => x.toInt
}
}
implicit val seqCharCast = new Casts[Seq[Char]] {
override def cast(a: Any): Seq[Char] = a match {
case x: Set[Char] => x.toSeq
case x: Seq[Char] => x
}
}
def cast[T](key: String)(implicit p:Casts[T]) = p.cast(map(key))
println(cast[Double]("foo")) // <- 1.0
println(cast[Int]("foo")) // <- 1
println(cast[Seq[Char]]("bar")) // <- ArrayBuffer(a, b) which is Seq(a, b)
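A side benefit (my own observation) is that asking for a type with no Casts instance is rejected at compile time rather than failing at runtime:
// cast[Boolean]("foo")
// error: could not find implicit value for parameter p: Casts[Boolean]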
But you still need to enumerate all the type-to-type conversions, which is reasonable since Set('a', 'b').asInstanceOf[Seq[Char]] throws and you cannot use a universal cast, so you need to handle such cases differently.
Still, it sounds like overkill, and you may need to review your approach from a more global perspective.
I need a Map where I can put values of different types (Double, String, Int, ...); the key can be a String.
Is there a way to do this, so that I get the correct type with map.apply(k) like
val map: Map[String, SomeType] = Map()
val d: Double = map.apply("double")
val str: String = map.apply("string")
I already tried it with a generic type
class Container[T](element: T) {
def get: T = element
}
val d: Container[Double] = new Container(4.0)
val str: Container[String] = new Container("string")
val m: Map[String, Container] = Map("double" -> d, "string" -> str)
but it's not possible since Container takes a type parameter. Is there any solution to this?
This is not straightforward.
The type of the value depends on the key, so the key has to carry the information about what type its value is. This is a common pattern: it is used, for example, in SBT (see SettingsKey[T]) and in Shapeless records. However, in SBT the keys are a huge, complex class hierarchy of their own, and the HList in Shapeless is pretty complex and also does more than you want.
So here is a small example of how you could implement this. The key knows the type, and the only way to create a Record or to get a value out of a Record is the key. We use a Map[Key, Any] internally as storage, but the casts are hidden and guaranteed to succeed. There is an operator to create records from keys, and an operator to merge records. I chose the operators so you can concatenate Records without having to use brackets.
sealed trait Record {
def apply[T](key:Key[T]) : T
def get[T](key:Key[T]) : Option[T]
def ++ (that:Record) : Record
}
private class RecordImpl(private val inner:Map[Key[_], Any]) extends Record {
def apply[T](key:Key[T]) : T = inner.apply(key).asInstanceOf[T]
def get[T](key:Key[T]) : Option[T] = inner.get(key).asInstanceOf[Option[T]]
def ++ (that:Record) = that match {
case that:RecordImpl => new RecordImpl(this.inner ++ that.inner)
}
}
final class Key[T] {
def ~>(value:T) : Record = new RecordImpl(Map(this -> value))
}
object Key {
def apply[T] = new Key[T]
}
Here is how you would use this. First define some keys:
val a = Key[Int]
val b = Key[String]
val c = Key[Float]
Then use them to create a record
val record = a ~> 1 ++ b ~> "abc" ++ c ~> 1.0f
When accessing the record using the keys, you will get a value of the right type back
scala> record(a)
res0: Int = 1
scala> record(b)
res1: String = abc
scala> record(c)
res2: Float = 1.0
I find this sort of data structure very useful. Sometimes you need more flexibility than a case class provides, but you don't want to resort to something completely type-unsafe like a Map[String,Any]. This is a good middle ground.
Edit: another option would be to have a map that uses a (name, type) pair as the real key internally. You have to provide both the name and the type when getting a value. If you choose the wrong type there is no entry. However this has a big potential for errors, like when you put in a byte and try to get out an int. So I think this is not a good idea.
import reflect.runtime.universe.TypeTag
class TypedMap[K](val inner:Map[(K, TypeTag[_]), Any]) extends AnyVal {
def updated[V](key:K, value:V)(implicit tag:TypeTag[V]) = new TypedMap[K](inner + ((key, tag) -> value))
def apply[V](key:K)(implicit tag:TypeTag[V]) = inner.apply((key, tag)).asInstanceOf[V]
def get[V](key:K)(implicit tag:TypeTag[V]) = inner.get((key, tag)).asInstanceOf[Option[V]]
}
object TypedMap {
def empty[K] = new TypedMap[K](Map.empty)
}
Usage:
scala> val x = TypedMap.empty[String].updated("a", 1).updated("b", "a string")
x: TypedMap[String] = TypedMap@30e1a76d
scala> x.apply[Int]("a")
res0: Int = 1
scala> x.apply[String]("b")
res1: String = a string
// this is what happens when you try to get something out with the wrong type.
scala> x.apply[Int]("b")
java.util.NoSuchElementException: key not found: (b,Int)
This is now very straightforward in shapeless,
scala> import shapeless._ ; import syntax.singleton._ ; import record._
import shapeless._
import syntax.singleton._
import record._
scala> val map = ("double" ->> 4.0) :: ("string" ->> "foo") :: HNil
map: ... <complex type elided> ... = 4.0 :: foo :: HNil
scala> map("double")
res0: Double with shapeless.record.KeyTag[String("double")] = 4.0
scala> map("string")
res1: String with shapeless.record.KeyTag[String("string")] = foo
scala> map("double")+1.0
res2: Double = 5.0
scala> val map2 = map.updateWith("double")(_+1.0)
map2: ... <complex type elided> ... = 5.0 :: foo :: HNil
scala> map2("double")
res3: Double = 5.0
This is with shapeless 2.0.0-SNAPSHOT as of the date of this answer.
I finally found my own solution, which worked best in my case:
case class Container[+T](element: T) {
def get[T]: T = {
element.asInstanceOf[T]
}
}
val map: Map[String, Container[Any]] = Map("a" -> Container[Double](4.0), "b" -> Container[String]("test"))
val double: Double = map.apply("a").get[Double]
val string: String = map.apply("b").get[String]
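If you want a mismatch to fail immediately with a clear message rather than as a deferred ClassCastException, a sketch (my own variant, not part of the answer above) is to check the requested type against a ClassTag:
import scala.reflect.ClassTag

case class CheckedContainer[+T](element: T) {
  // The requested type U is checked at runtime via its ClassTag.
  def as[U](implicit tag: ClassTag[U]): U = element match {
    case u: U  => u
    case other => throw new IllegalArgumentException(
      s"expected ${tag.runtimeClass.getSimpleName}, got ${other.getClass.getSimpleName}")
  }
}

val checked: Map[String, CheckedContainer[Any]] =
  Map("a" -> CheckedContainer(4.0), "b" -> CheckedContainer("test"))

checked("b").as[String]  // "test"
checked("a").as[String]  // throws IllegalArgumentException: expected String, got Double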
(a) Scala containers don't track type information for what's placed inside them, and
(b) the return type of an apply/get method that takes a plain String key is fixed statically for a given instance of the object the method is applied to.
This feels very much like a design decision that needs to be rethought.
I don't think there's a way to get bare map.apply() to do what you'd want. As the other answers suggest, some sort of container class will be necessary. Here's an example that restricts the values to be only certain types (String, Double, Int, in this case):
sealed trait MapVal
case class StringMapVal(value: String) extends MapVal
case class DoubleMapVal(value: Double) extends MapVal
case class IntMapVal(value: Int) extends MapVal
val myMap: Map[String, MapVal] =
Map("key1" -> StringMapVal("value1"),
"key2" -> DoubleMapVal(3.14),
"key3" -> IntMapVal(42))
myMap.keys.foreach { k =>
val message =
myMap(k) match { // map.apply() in your example code
case StringMapVal(x) => "string: %s".format(x)
case DoubleMapVal(x) => "double: %.2f".format(x)
case IntMapVal(x) => "int: %d".format(x)
}
println(message)
}
The main benefit of the sealed trait is compile-time checking for non-exhaustive matches in pattern matching.
I also like this approach because it's relatively simple by Scala standards. You can go off into the weeds for something more robust, but in my opinion you're into diminishing returns pretty quickly.
If you want to do this you'd have to specify the type of Container to be Any, because Any is a supertype of both Double and String.
val d: Container[Any] = new Container(4.0)
val str: Container[Any] = new Container("string")
val m: Map[String, Container[Any]] = Map("double" -> d, "string" -> str)
Or to make things easier, you can change the definition of Container so that it's no longer type invariant:
class Container[+T](element: T) {
def get: T = element
override def toString = s"Container($element)"
}
val d: Container[Double] = new Container(4.0)
val str: Container[String] = new Container("string")
val m: Map[String, Container[Any]] = Map("double" -> d, "string" -> str)
There is a way but it's complicated. See Unboxed union types in Scala. Essentially you'll have to type the Map to some type Int |v| Double to be able to hold both Int and Double. You'll also pay a high price in compile times.
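For reference (my own aside; the linked encoding targets Scala 2), Scala 3 has union types built in, so the same idea can be written directly:
// Scala 3
val m: Map[String, Int | Double] = Map("foo" -> 1, "bar" -> 2.0)

m("foo") match {
  case i: Int    => println(s"int: $i")
  case d: Double => println(s"double: $d")
}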
I want to build a Scala DSL to convert from a existing structure of Java POJOs to a structure equivalent to a Map.
However, the incoming object structure is very likely to contain a lot of null references, which will result in no value in the output map.
The performance is very important in this context so I need to avoid both reflection and throw/catch NPE.
I have already considered this topic, which does not meet my requirements.
I think the answer may lie in the usage of macros to generate some special type but I have no experience in the usage of scala macros.
More formally:
POJO classes provided by the project (there will be around 50 POJOs, nested, so I want a solution that does not require hand-writing and maintaining a class or trait for each of them):
case class Level1(
#BeanProperty var a: String,
#BeanProperty var b: Int)
case class Level2(
#BeanProperty var p: Level1,
#BeanProperty var b: Int)
Expected behaviour:
println(convert(null)) // == Map()
println(convert(Level2(null, 3))) // == Map("l2.b" -> 3)
println(convert(Level2(Level1("a", 2), 3))) // == Map(l2.p.a -> a, l2.p.b -> 2, l2.b -> 3)
A correct implementation, but I want an easier DSL for writing the mappings:
implicit def toOptionBuilder[T](f: => T) = new {
def ? : Option[T] = Option(f)
}
def convert(l2: Level2): Map[String, _] = l2? match {
case None => Map()
case Some(o2) => convert(o2.p, "l2.p.") + ("l2.b" -> o2.b)
}
def convert(l1: Level1, prefix: String = ""): Map[String, _] = l1? match {
case None => Map()
case Some(o1) => Map(
prefix + "a" -> o1.a,
prefix + "b" -> o1.b)
}
Here is how I want to write it with a DSL:
def convertDsl(l2:Level2)={
Map(
"l2.b" -> l2?.b,
"l2.p.a" -> l2?.l1?.a,
"l2.p.b" -> l2?.l1?.b
)
}
Note that it is perfectly fine for me to specify that the property is optional with '?'.
What I want is to statically generate, using a macro, a method l2.?l1 or l2?.l1 that returns Option[Level1] (so that type checking is done correctly in my DSL).
I couldn't refine it down to precisely the syntax you gave above, but generally, something like this might work:
sealed trait FieldSpec
sealed trait ValueFieldSpec[+T] extends FieldSpec
{
def value: Option[T]
}
case class IntFieldSpec(value: Option[Int]) extends ValueFieldSpec[Int]
case class StringFieldSpec(value: Option[String]) extends ValueFieldSpec[String]
case class Level1FieldSpec(input: Option[Level1]) extends FieldSpec
{
def a: ValueFieldSpec[_] = StringFieldSpec(input.map(_.a))
def b: ValueFieldSpec[_] = IntFieldSpec(input.map(_.b))
}
case class Level2FieldSpec(input: Option[Level2]) extends FieldSpec
{
def b: ValueFieldSpec[_] = IntFieldSpec(input.map(_.b))
def l1 = Level1FieldSpec(input.map(_.p))
}
case class SpecArrowAssoc(str: String)
{
def ->(value: ValueFieldSpec[_]) = (str, value)
}
implicit def str2SpecArrowAssoc(str: String) = SpecArrowAssoc(str)
implicit def Level2ToFieldSpec(input: Option[Level2]) = Level2FieldSpec(input)
def map(fields: (String, ValueFieldSpec[_])*): Map[String, _] =
Map[String, Any]((for {
field <- fields
value <- field._2.value
} yield (field._1, value)):_*)
def convertDsl(implicit l2: Level2): Map[String, _] =
{
map(
"l2.b" -> l2.?.b,
"l2.p.a" -> l2.?.l1.a,
"l2.p.b" -> l2.?.l1.b
)
}
Then we get:
scala> val myL2 = Level2(Level1("a", 2), 3)
myL2: Level2 = Level2(Level1(a,2),3)
scala> convertDsl(myL2)
res0: scala.collection.immutable.Map[String,Any] = Map(l2.b -> 3, l2.p.a -> a, l2.p.b -> 2)
Note that the DSL uses '.?' rather than just '?', as that was the only way I could see around Scala's trouble with semicolon inference and postfix operators (see, e.g., 0__'s answer to the Scala syntactic sugar question).
Also, the strings you can provide are arbitrary (no checking or parsing of them is done), and this simplistic 'FieldSpec' hierarchy will assume that all your POJOs use 'a' for String fields and 'b' for Int fields etc.
With a bit of time and effort I'm sure this could be improved on.
I am hoping to write a Scala method which takes in a tuple of any size and type along with an index, and returns the element in the tuple at that index. I know how to do everything but preserve the type. I haven't yet figured out a way to make the return value be of the dynamic type of the tuple item.
Here is the function I have so far:
def subscript_get(tup: Product, index: Int): Any = {
  tup.productElement(index)
}
The usage for example would be:
subscript_get((0,1,2,3),0)
--> Int = 0
subscript_get((0,1,"asdf",3),2)
--> java.lang.String = asdf
I know that I can cast the result back afterwards to what I am looking for, but this doesn't work for me because I can't always know what type I should cast to.
Is something like this even possible ? Thanks!
I'm not sure you want a solution that uses macros, but for the record (and since I've written precisely this method before), here's how you can implement this with the macro system in 2.10.
As I note in a comment above, this approach requires index to be an integer literal, and relies on "underspecified but intended" behavior in 2.10. It also raises some tricky questions about documentation.
import scala.language.experimental.macros
import scala.reflect.macros.Context
object ProductIndexer {
def at[T <: Product](t: T)(index: Int) = macro at_impl[T]
def at_impl[T <: Product: c.WeakTypeTag](c: Context)
(t: c.Expr[T])(index: c.Expr[Int]) = {
import c.universe._
index.tree match {
case Literal(Constant(n: Int)) if
n >= 0 &&
weakTypeOf[T].members.exists {
case m: MethodSymbol => m.name.decoded == "_" + (n + 1).toString
case _ => false
} => c.Expr[Any](Select(t.tree, newTermName("_" + (n + 1).toString)))
case Literal(Constant(_: Int)) => c.abort(
c.enclosingPosition,
"There is no element at the specified index!"
)
case _ => c.abort(
c.enclosingPosition,
"You must provide an integer literal!"
)
}
}
}
And then:
scala> import ProductIndexer._
import ProductIndexer._
scala> val triple = (1, 'a, "a")
triple: (Int, Symbol, String) = (1,'a,a)
scala> at(triple)(0)
res0: Int = 1
scala> at(triple)(1)
res1: Symbol = 'a
scala> at(triple)(2)
res2: String = a
All statically typed as expected, and if you give it an index that's out of range (or not a literal), you get a nice compile-time error.
You cannot do that. If you use Product, the (compile-time) type of the values in the tuple is lost. Further, a method cannot adapt its return type based on a value you pass in (not entirely true, see dependent method types, but true for an Int).
If you do not know what type to cast to, you could use pattern matching:
subscript_get(..., 1) match {
case v: Int => // do something with Int
case v: String => // do something with String
// snip
case _ => sys.error("don't know how to handle this")
}
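As an aside (my own note, outside the Scala 2.10 setting of the question), Scala 3's generic tuples make the element type available statically when the index is a literal:
// Scala 3
val triple = (1, 'a', "a")
val i: Int    = triple(0)  // statically typed as Int
val s: String = triple(2)  // statically typed as String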