Syntax for accepting a tuple in a function in Scala

I would like a function to consume a tuple of 7 elements, but the compiler won't let me, with the error shown below. I failed to find a proper way to do it. Is it even possible without explicitly writing out all the type parameters, like Tuple7[String,String...,String], and is it even a good idea to use Scala like this?
def store(record:Tuple7): Unit = {
}
Error:(25, 20) class Tuple7 takes type parameters
def store(record: Tuple7): Unit = {
^

As stated by Luis, you have to define which type goes in which position, for every position in the tuple.
I'd like to add some approaches to express the same behaviour in different ways:
Tuple Syntax
For that you have two choices of syntax:
Tuple3[String, Int, Double]
(String, Int, Double)
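Both notations denote the same type; the parenthesized form is just syntactic sugar for TupleN. A minimal sketch (the method and value names are made up for illustration):

```scala
object TupleSyntaxDemo {
  // Sugared form: (String, Int, Double)
  def storeSugared(record: (String, Int, Double)): String = record._1

  // Explicit form: Tuple3[String, Int, Double] -- the same type spelled out
  def storeExplicit(record: Tuple3[String, Int, Double]): String = record._1

  // Any (String, Int, Double) value is accepted by both methods.
  val sample: (String, Int, Double) = ("id-1", 42, 3.14)
}
```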
Approach using Case Classes for better readability
Long tuples are hard to handle, especially when types are repeated. Scala offers a different approach: instead of a Tuple7 you can use a case class with seven fields. The gain is that you can now attach meaningful names to each field, and the typing of each position makes more sense when a name is attached to it.
The chance of putting values in wrong positions is also reduced:
(String, Int, String, Int)
// vs
case class Person(name: String, age: Int, taxNumber: String, numberOfChildren: Int)
Using Seq with pattern matching
If your intention was to have a sequence of data, a Seq in combination with pattern matching could also be a nice fit:
List("name", 24, "", 5) match {
  case (name: String) :: (age: Int) :: _ :: _ :: Nil => doSomething(name, age)
}
This only works nicely in a quite reduced scope. Normally you would lose a lot of type information, as the List is of type List[Any].

You could do the following :
def store(record: (String, String, String, String, String, String, String)):Unit = {
}
which is the equivalent of :
def store(record: Tuple7[String, String, String, String, String, String, String]):Unit = {
}
You can read more about it in Programming in Scala, 2nd Edition, chapter "Next Steps in Scala", sub-chapter "Step 9. Use tuples".

Related

Reading data into custom case classes in Doobie

Let's say I have a case class X(id: Int, name: String, age: Int) and some function (in my case, withUniqueGeneratedKeys in doobie) that returns X. If I have already defined X I am good.
But in my case the core data structure is something like:
case class XData(name: String, age: Int)
case class MaterializedX(id: Int, element: XData)
And of course I could write a line like case class X(id: Int, name: String, age: Int) to create X, but that would duplicate logic: whenever something about XData changes, I'd have to change the code for X as well. Intuitively, it feels like there should be a way to derive X from XData and MaterializedX. This transformation might mean some code, but it would save me a lot of future work, because I have many item types beyond X.
How could this be done? Happy to hear other approaches.
I am using Scala 3.1.2 in case this matters. Thank you!
Edit: Changed title per helpful comment, to make this question easier to understand.
I think you should be clearer about the question: the title says something almost completely different from the question description (and from what you might actually be looking for). Anyway, under my assumptions, what you need is probably a custom Read[MaterializedX] and Write[MaterializedX], as follows:
implicit val pointMaterializedX: Read[MaterializedX] =
  Read[(Int, String, Int)]
    .map {
      case (id, name, age) => MaterializedX(id, XData(name, age))
    }

implicit val pointWrite: Write[MaterializedX] =
  Write[(Int, String, Int)]
    .contramap { materializedX =>
      (materializedX.id, materializedX.element.name, materializedX.element.age)
    }
More documentation here
This works out of the box with doobie. Nested case classes get flattened into one row. The following code compiles without defining any custom decoders.
case class XData(name: String, age: Int)
case class MaterializedX(id: Int, element: XData)
implicitly[doobie.Read[MaterializedX]]
implicitly[doobie.Write[MaterializedX]]

Why is the parameter type of a generic function not inferred?

In Scala 2.11.7, having the following case class and an additional apply method:
case class FieldValidator[T](key: String, isValid: T => Boolean,
                             errorMessage: Option[String] = None)

object FieldValidator {
  def apply[T](key: String, isValid: T => Boolean,
               errorMessage: String): FieldValidator[T] = ???
}
when I try to use:
FieldValidator[String](key, v => !required || v.nonEmpty, "xxx")
I'm getting a "missing parameter type" compilation error pointing at v.
When I explicitly specify the type of v, it compiles fine, and I can even skip the generic type of the apply method, i.e.
FieldValidator(key, (v: String) => !required || v.nonEmpty, "xxx")
Why isn't the type of v inferred when just the generic type of apply is provided?
It's not so much about generics; it's rather a problem with overloading and default parameters.
First, recall that since FieldValidator is a case class, a synthetic factory method
def apply[T](
  key: String,
  isValid: T => Boolean,
  errorMessage: Option[String] = None
): FieldValidator[T]
is automatically added to the companion object FieldValidator. This leaves FieldValidator with two identically named generic methods, one of which has a default parameter.
Here is a shorter example that behaves in roughly the same way:
def foo[A](f: A => Boolean, x: Int = 0): Unit = {}
def foo[A](f: A => Boolean, x: String): Unit = {}
foo[String](_.isEmpty)
It results in:
error: missing parameter type for expanded function ((x$1: ) =>
x$1.isEmpty)
foo[String](_.isEmpty)
^
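For reference, the question's observation reproduces in a self-contained form: once the lambda's parameter type is annotated, the call compiles even without the explicit type argument (the apply below returning a value instead of ??? is my addition, so the result can be inspected):

```scala
object OverloadDemo {
  case class FieldValidator[T](key: String, isValid: T => Boolean,
                               errorMessage: Option[String] = None)

  object FieldValidator {
    // Overload taking a bare String; delegates to the synthetic apply.
    def apply[T](key: String, isValid: T => Boolean,
                 errorMessage: String): FieldValidator[T] =
      FieldValidator(key, isValid, Some(errorMessage))
  }

  // FieldValidator[String]("key", v => v.nonEmpty, "xxx")  // missing parameter type
  val ok: FieldValidator[String] =
    FieldValidator("key", (v: String) => v.nonEmpty, "xxx") // compiles: v is annotated
}
```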
I can't pinpoint what exactly goes wrong, but essentially you have confused the compiler with too much ambiguity by throwing three different kinds of polymorphism at it at once:
Overloading: you have two methods named apply
Generics: your methods have a generic type parameter [T] ([A] in my shorter example)
Default arguments: your errorMessage (x in my shorter example) can be omitted.
Together, this leaves the compiler with the choice between two equally named methods with unclear types and unclear number of expected type arguments. While flexibility is good, too much flexibility is simply too much, and the compiler gives up trying to figure out what you wanted, and forces you to specify all types of every single argument explicitly, without relying on inference.
Theoretically, it could have figured it out in this particular case, but this would require much more complex inference algorithms and much more backtracking and trial-and-error (which would slow down the compilation in the general case). You don't want the compiler to spend half a day playing typesystem-sudoku, even if it theoretically could figure out a unique solution. Exiting quickly with an error message is a reasonable alternative.
Workaround
As a simple work-around, consider reordering the arguments in a way that allows the compiler to eliminate ambiguity as fast as possible. For example, splitting the arguments into two argument lists, with two Strings coming first, would make it unambiguous:
case class FieldValidator[T](
  key: String,
  isValid: T => Boolean,
  errorMessage: Option[String] = None
)

object FieldValidator {
  def apply[T]
    (key: String, errorMessage: String)
    (isValid: T => Boolean)
  : FieldValidator[T] = {
    ???
  }
}

val f = FieldValidator[String]("key", "err") {
  s => s.nonEmpty
}

Retaining type information in List[Any]

I'm using certain external library that has a method which is overloaded several times with different arguments, something like:
insertInto(index: Int, int: Int)
insertInto(index: Int, lng: Long)
insertInto(index: Int, dbl: Double)
insertInto(index: Int, str: String)
And a certain case class I'm using whose data I want to pass onto said methods, say:
case class C(str: String, lng: Long, dbl: Double, int: Int /* more values */)
val c = C("asd", 1, 1.1, 1)
Right now I'm using the library method like:
insertInto(1, c.int)
insertInto(2, c.lng)
insertInto(3, c.dbl)
insertInto(4, c.str)
//more insertions...
But since I'm always using the index of the value in the case class, I figured that maybe I could save some lines of code (around 10) with something like the following:
c.productIterator.zipWithIndex.toList.foreach {
  case (value, idx) => insertInto(idx, value)
}
But this doesn't work, because I'd be iterating over a List[Any], so the compiler complains that I'm not passing the correct argument type to insertInto, since Any is not String, Int, Long, Double, etc.
What would be the correct way of handling this? Thanks in advance
case class A[T](t: T)(implicit tag: ClassManifest[T]) {
  val newTag = tag
  override def toString = t + " " + tag.toString
}

case class C(xs: List[Any])

val c = C(List(A("a"), A[Long](1), A[Double](1.1), A[Int](1)))
c.xs.foreach(println)
Try this. I am using Scala 2.9.3; in newer versions you can use TypeTag and ClassTag instead. Check here.
So now you have the class type information. You can devise some mechanism to map the class type (as a string) to a .class instance and then use asInstanceOf to cast it.
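Alternatively (my own suggestion, not from the answer above): since the overload set is small and fixed, you can keep the plain case class and dispatch on the runtime type of each element with typed patterns. The insertInto definitions below are stubs standing in for the library's overloads, recording calls so the dispatch is observable:

```scala
object RuntimeDispatch {
  // Stubs standing in for the library's overloaded insertInto methods.
  val inserted = scala.collection.mutable.ListBuffer[(Int, Any)]()
  def insertInto(index: Int, int: Int): Unit    = inserted += ((index, int))
  def insertInto(index: Int, lng: Long): Unit   = inserted += ((index, lng))
  def insertInto(index: Int, dbl: Double): Unit = inserted += ((index, dbl))
  def insertInto(index: Int, str: String): Unit = inserted += ((index, str))

  case class C(str: String, lng: Long, dbl: Double, int: Int)

  def insertAll(c: C): Unit =
    c.productIterator.zipWithIndex.foreach {
      // zipWithIndex yields (value, index); indices start at 0,
      // while the question's insertInto calls are 1-based.
      case (v: Int, i)    => insertInto(i + 1, v)
      case (v: Long, i)   => insertInto(i + 1, v)
      case (v: Double, i) => insertInto(i + 1, v)
      case (v: String, i) => insertInto(i + 1, v)
      case (other, i)     => sys.error(s"unsupported type at index $i: $other")
    }
}
```

The typed patterns recover the static type inside each case, so the correct overload is chosen; the trade-off is that an unsupported field type only fails at runtime.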

Why can't tuples in Scala be traversed?

Suppose I create a Tuple6:
val tup = (true, 1, "Hello", 0.4, "World", 0)
tup: (Boolean, Int, String, Double, String, Int)
And I can access the elements of the tuple using ._<position> like
tup._1 and tup._2 and so on. But why does
for (i <- 1 to 6)
  println(tup._i)
give me an error saying that
value _i is not a member of (Boolean, Int, String, Double, String, Int)
I understand that it is clearly stated that Tuples are not iterable, but if ._1 works , shouldn't ._i work the same way ?
It all boils down to type.
What type would you like a dynamic accessor such as _<position> to have? In the general case, the only valid one would be Any. In a strongly-typed language such as Scala this is useless for most purposes.
The good news is that the problem can be handled in a type-safe manner - see e.g. the HList-style tuple handling in shapeless.
However, there is no such mechanism available in the standard Scala library (barring heavy metaprogramming such as macros).
For tuples there is a productIterator method that gives you an opportunity to iterate over the elements of the tuple. But obviously each element of such an iteration will be of type Any.
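For example (element values borrowed from the question's tuple):

```scala
object ProductIterDemo {
  val tup = (true, 1, "Hello", 0.4, "World", 0)

  // productIterator yields every element, but statically each one is Any.
  val elems: List[Any] = tup.productIterator.toList
}
```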
I understand that it is clearly stated that Tuples are not iterable, but if ._1 works , shouldn't ._i work the same way ?
Why should it? In one case you are calling the method _1 (which does exist), in the other case you are calling the method _i (which doesn't exist). Calling two different methods usually does not "work the same way", especially if one of them doesn't even exist.
You could provide an Iterator[U] where U is the least upper bound of the types T1 to Tn of the tuple.
implicit class FancyTuple2[T1, T2, U](private val tuple: (T1 with U, T2 with U)) extends AnyVal {
  def iterator: Iterator[U] = Iterator(tuple._1, tuple._2)
}
You'd have to write (or generate) this for every arity that you need a tuple of.

Scala Option implicit conversion - Bad practice or missing feature?

I represented my data model as case classes typing values that may be null as Option.
case class Document(id: Long, title: String, subtitle: Option[String])
Now I try to instantiate the case class:
Document(123, "The Title", "Subtitle") // Doesn't work
But NOPE! This doesn't work; I have to wrap the optional value in a Some.
Document(123, "The Title", Some("Subtitle")) // Works
Scala is very clever about types in general, but why is it not self-evident that a hard-coded literal (or any string, for that matter) is different from null/None?
I was able to fix this and make Scala "more clever" by adding this implicit conversion
implicit def autoSome[T](any: T): Some[T] = Some(any)
Document(123, "The Title", "Subtitle") // Now it works!
Question: Am I the only one who thinks the language should provide the implicit conversion T -> Some(T) out of the box? Or are there any gotchas I'm not aware of about having such a broad implicit in place by default?
This can cause an untold number of problems. The issue here isn't what you might think but what you don't think could happen: if you define another implicit class that works on Option types, you could wind up with artificial results you never intended, i.e. operators meant for your Option types becoming available on the plain, unwrapped values.
implicit class OptDouble(private val opt: Option[Double]) extends AnyVal {
  def *(x: Double) = Some((opt getOrElse 0.0) * x)
  def ^(x: Double) = Some(math.pow(opt getOrElse 1.0, x))
}
val q = 2.0     // a plain Double
val z = q ^ 4.5
The type of z is Option[Double]. You wouldn't expect that to happen, but Scala first applied the implicit conversion to Option and then used the implicit class to call the ^ operator. Now people looking at your code are going to scratch their heads, wondering why they have an Option. You might start seeing a few defensive x getOrElse 0.0 calls sprinkled around the code as people fight to keep Option away (and yes, this is from personal experience).
That said, what you should do is use another apply on the object:
object Document {
  def apply(id: Long, title: String, subtitle: String) = new Document(id, title, Some(subtitle))
}
which will do everything you wanted it to do as long as you don't have a default defined for subtitle, i.e. subtitle: Option[String] = None.
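Putting the answer's suggestion together as a self-contained sketch (wrapped in a demo object so it compiles standalone):

```scala
object ApplyOverloadDemo {
  case class Document(id: Long, title: String, subtitle: Option[String])

  object Document {
    // Extra factory: wraps the bare String, so call sites stay clean.
    // Note: no default value for subtitle here, per the caveat above.
    def apply(id: Long, title: String, subtitle: String): Document =
      new Document(id, title, Some(subtitle))
  }

  // Resolves to the String overload; no Some(...) needed at the call site.
  val doc = Document(123L, "The Title", "Subtitle")
}
```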
Most of the problems pointed out earlier can easily be fixed with a small change to the implicit:
implicit def autoOpt[T](x: T): Option[T] = Option(x)
I can't really think of a good reason Scala does not provide this conversion as part of its default library of implicit converters.
The fact that implicits make code harder to understand can be used as an argument against using any implicit, but not as one against using this particular one.
That's a pretty dangerous implementation:
scala> val s: String = null
s: String = null
scala> Document(123, "The Title", s)
res2: Document = Document(123,The Title,Some(null))
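Using Option(...) instead of Some(...) in the conversion, as suggested above, avoids wrapping null:

```scala
object NullSafetyDemo {
  // Option.apply checks for null; Some.apply does not.
  val guarded: Option[String]   = Option(null: String) // None
  val unguarded: Option[String] = Some(null: String)   // Some(null)
}
```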