DSL for capturing field name - Scala

I'm working on a mapper and wanted a type-safe way to capture class field names for mapping, so I went with a syntax I'd used in C#:
case class Person(name: String, age: Int)

new Mapping[Person]() {
  field(_.age).name("person_age").colType[java.lang.Integer]
  field(_.name).name("person_name")
}
where def field(m: T => Unit): FieldMap
This triggers the following warnings:
Warning:(97, 13) a pure expression does nothing in statement position; you may be omitting necessary parentheses
field(_.age).name("person_age").colType[java.lang.Integer]
^
Warning:(98, 13) a pure expression does nothing in statement position; you may be omitting necessary parentheses
field(_.name).name("person_name")
^
So clearly that's not a desirable syntax. Is there any way I can tweak the signature of field to avoid the warning, or is there a more idiomatic Scala way of mapping fields in a type-safe manner?
Note: #sjrd's answer indeed gets rid of the warning, but the attempted feature doesn't seem feasible with Scala reflection after all. My end goal is a Mapper that allows specifying T members in a compile-time-checked manner, rather than as strings, so it's less vulnerable to typos and refactoring issues.

The field method takes a T => Unit function as parameter. Hence, the lambda _.age, which is equivalent to x => x.age, is typechecked as returning Unit. The compiler warns that you are using a pure expression (x.age) in statement position (expected type Unit), which basically means that the expression is useless, and might as well be removed.
There is a very simple symptomatic solution to your problem: replace m: T => Unit by m: T => Any. Now your expression x.age is not in statement position anymore, and the compiler is happy.
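In terms of the question's API, that change is just the parameter type. A minimal sketch, assuming the FieldMap type and Mapping class from the question:

abstract class Mapping[T] {
  // T => Any accepts a lambda of any result type, so _.age is no longer
  // in statement position and the warning disappears
  def field(m: T => Any): FieldMap = ???
}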
But your code suggests that something is wrong a little deeper, since you obviously don't use the result of m anywhere. What is m for, anyway?

Related

Is returning Either/Option/Try/Or considered a viable / idiomatic approach when function has preconditions for arguments?

First of all, I'm very new to Scala and don't have any experience writing production code with it, so I lack an understanding of what is considered good/best practice in the community. I stumbled upon these resources:
https://github.com/alexandru/scala-best-practices
https://nrinaudo.github.io/scala-best-practices/
It is mentioned there that throwing exceptions is not very good practice, which made me wonder what a good way to define preconditions for a function would be, because
A function that throws is a bit of a lie: its type implies it's a total function when it's not.
After a bit of research, it seems that using Option/Either/Try/Or(scalactic) is a better approach, since you can use something like T Or IllegalArgumentException as the return type to clearly indicate that the function is actually partial, using the exception as a way to store a message that can be wrapped in other exceptions.
However, lacking Scala experience, I don't quite understand whether this is actually a viable approach for a real project, or whether using Predef.require is the way to go. I would appreciate it if someone explained how things are usually done in the Scala community and why.
I've also seen Functional assertion in Scala, but while the idea itself looks interesting, I think PartialFunction is not very suitable for the purpose as it stands, because often more than one argument is passed, and tuples look like a hack in this case.
Option or Either is definitely the way to go for functional programming.
With Option it is important to document why None might be returned.
With Either, the left side is the unsuccessful value (the "error"), while the right side is the successful value. The left side does not necessarily have to be an Exception (or a subtype of it); it can be a simple error-message String (type aliases are your friend here) or a custom data type that is suitable for your application.
As an example, I usually use the following pattern when error handling with Either:
// Somewhere in a package.scala
type Error = String // Or choose something more advanced
type EitherE[T] = Either[Error, T]
// Somewhere in the program
def fooMaybe(...): EitherE[Foo] = ...
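For illustration, values of this EitherE type compose in a for-comprehension, short-circuiting on the first Left. A minimal sketch, where Foo, Bar, fooMaybe and barMaybe are hypothetical stand-ins (Either is right-biased since Scala 2.12):

case class Foo(n: Int)
case class Bar(s: String)

def fooMaybe(n: Int): EitherE[Foo] =
  if (n > 0) Right(Foo(n)) else Left("n must be positive")

def barMaybe(foo: Foo): EitherE[Bar] = Right(Bar(foo.n.toString))

// Stops at the first Left encountered
val result: EitherE[Bar] = for {
  foo <- fooMaybe(42)
  bar <- barMaybe(foo)
} yield bar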
Try should only be used for wrapping unsafe (most of the time, plain Java) code, giving you the ability to pattern-match on the result:
Try(fooDangerous()) match {
  case Success(value) => ...
  case Failure(exception) => ...
}
But I would suggest only using Try locally and then going with the above-mentioned data types from there.
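For instance, since Scala 2.12 a Try can be converted to an Either at the boundary. A minimal sketch, with fooDangerous as a hypothetical unsafe call:

import scala.util.Try

def fooDangerous(): String = throw new java.io.IOException("boom")

// Wrap the unsafe call locally, then work with Either from here on
val result: Either[Throwable, String] = Try(fooDangerous()).toEither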
Some advanced datatypes like cats.effect.IO or monix.reactive.Observable contain error handling natively.
I would also suggest looking into cats.data.EitherT for typeclass-based error handling. Read the documentation, it's definitely worth it.
As a sidenote for everyone coming from Java: Scala treats all Exceptions as Java treats RuntimeExceptions. That means, even when an unsafe piece of code from one of your dependencies throws a (checked) IOException, Scala will never require you to catch or otherwise handle the exception. So as a rule of thumb, when using Java dependencies, almost always wrap them in a Try (or an IO if they execute side effects or block the thread).
I think your reasoning is correct. If you have a simple total (opposite of partial) function with arguments that can have invalid values, then the most common and simple solution is to return some optional result like Option, etc.
It's usually not advisable to throw exceptions, as they break referential transparency. You can use any library that can return a more advanced type than Option, like Scalaz Validation, if you need to compose results in ways that are awkward with Option.
Two other alternatives I could offer are to use:
Type-constrained arguments that enforce preconditions. Example: val i: Int Refined Positive = 5, based on https://github.com/fthomas/refined. You can also write your own types that wrap primitive types and assert some properties. The problem here is when you have arguments whose valid values are interdependent and mutually exclusive per argument, for instance x > 1 and y < 1, or x < 1 and y > 1. In such a case you can return an optional value instead of using this approach.
Partial functions, which in essence resemble optional return types: case i: Int if i > 0 => .... Docs: https://www.scala-lang.org/api/2.12.1/scala/PartialFunction.html.
For example, PF's def lift: (A) ⇒ Option[B] converts a PartialFunction to a regular function; as the docs put it, it "turns this partial function into a plain function returning an Option result", which is similar to returning an Option. The problem with partial functions is that they are a bit awkward to use and not fully FP-friendly.
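To make lift concrete, a minimal sketch:

// Only defined for positive Ints; applying it directly to -1 would throw a MatchError
val sqrtPF: PartialFunction[Int, Double] = {
  case i if i > 0 => math.sqrt(i.toDouble)
}

val sqrt: Int => Option[Double] = sqrtPF.lift
sqrt(4)  // Some(2.0)
sqrt(-1) // None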
I think Predef.require belongs to the very rare cases where you don't want to allow any invalid data to be constructed, and is more of a stop-everything-if-this-happens kind of measure. An example would be receiving arguments you were never supposed to get.
You use the return type of the function to indicate the type of the result.
If you want to describe a function that can fail for whatever reason, of the types you mentioned you would probably return Try or Either: I am going to "try" to give you a result, or I am going to return "either" a success or a failure.
Now you can specify a custom exception
case class ConditionException(message: String) extends RuntimeException(message)
that you would return if your condition is not satisfied, e.g.:
import scala.util._
def myfunction(a: String, minLength: Int): Try[String] = {
  if (a.size < minLength) {
    Failure(ConditionException(s"string $a is too short"))
  } else {
    Success(a)
  }
}
and with Either you would get
import scala.util._
def myfunction(a: String, minLength: Int): Either[ConditionException, String] = {
  if (a.size < minLength) {
    Left(ConditionException(s"string $a is too short"))
  } else {
    Right(a)
  }
}
Note that the Either solution clearly indicates the error your function might return.
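For completeness, a caller might consume the Either version like this (a sketch):

myfunction("abc", 5) match {
  case Left(err)    => println(s"invalid input: ${err.message}")
  case Right(value) => println(s"got: $value")
}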

Adaptation of argument list by inserting () is deprecated: this is unlikely to be what you want [duplicate]

I'm just in the process of upgrading from Scala 2.10.x to 2.11.2 and I'm receiving the following warning with the following code:
override def validateKey(key: String): Either[InvalidKeyError, Unit] =
  keys.contains(key) match {
    case true => Right()
    case false => Left(InvalidKeyError(context, key))
  }
Adaptation of argument list by inserting () has been deprecated: this is unlikely to be what you want.
  signature: Right.apply[A, B](b: B): scala.util.Right[A,B]
  given arguments:
  after adaptation: Right((): Unit)
I am able to solve this by changing the "true" case statement to:
case true => Right(()) // () is the Unit value
Is this the proper way to address this warning?
Edit: perhaps a "why we have to do this now" type answer would be appropriate; my cursory investigation seems to indicate that Scala inserting "Unit" when it thinks it needs to causes other problems.
Automatic Unit inference has been deprecated in Scala 2.11, and the reason behind this is that it can lead to confusing behavior, especially for people learning the language.
Here's an example
class Foo[T](value: T)
val x = new Foo
This should not compile, right? You are calling the constructor with no arguments where one is required. Surprisingly, until Scala 2.10.4 this compiles just fine, with no errors or warnings.
And that's because the compiler inferred a Unit argument, so it actually replaced your code with
val x = new Foo[Unit](()) // Foo[Unit]
As the newly introduced warning message says, this is unlikely to be what you want.
Another famous example is this
scala> List(1,2,3).toSet()
// res1: Boolean = false
Calling toSet() should be a compile-time error, since toSet does not take arguments, but the compiler desperately tries to make it compile, ultimately interpreting the code as
scala> List(1,2,3).toSet.apply(())
which means: test whether () belongs to the set. Since that's not the case, you get false!
So, starting from Scala 2.11, you have to be explicit if you want to pass () (aka Unit) as an argument. That's why you have to write:
Right(())
instead of
Right()
Examples taken from Simplifying Scala — The Past, Present and Future by Simon Ochsenreither.
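Applied to the code from the question, the fix is just the explicit Unit argument. A sketch, assuming the keys, context and InvalidKeyError definitions from the question:

override def validateKey(key: String): Either[InvalidKeyError, Unit] =
  if (keys.contains(key)) Right(()) else Left(InvalidKeyError(context, key))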
Perhaps it should be Right(()). Have you tried that?
My explanation is that since Right.apply is polymorphic, it can take all kinds of parameters. Doing Right() means passing in a Unit, and the compiler simply advises you that maybe that's not what you want; it doesn't know that this is what you actually want.
If you see your deprecate message it states:
... after adaptation: Right((): Unit)
This means that the compiler has automatically decided that you are passing in a Unit. Since Unit is kind of like void, the compiler doesn't really like it; passing in a Unit like () explicitly tells it that you do want a Unit there. Anyway, this seems to be a new deprecation from Scala 2.11; I can't reproduce it on 2.10.4.

What is the reasoning for the imbalance of Scala's regular value assignment vs extractor assignment?

Scala appears to have different semantics for regular value assignment versus assignment during an extraction. This has created some very subtle runtime bugs for me as my codebase has migrated over time.
To illustrate:
case class Foo(s: String)
def anyRef(): AnyRef = { ... }
val Foo(x) = anyRef() // compiles but throws exception if anyRef() is not a Foo
val f: Foo = anyRef() // does not compile
I don't see why the two val assignment lines would be imbalanced with regard to compile/runtime behavior.
So I am curious: Is there a reason for this behavior? Is it an undesirable artifact of the way extraction is implemented?
(tested in Scala 2.11.7)
Yes, there is.
In the case of an extractor you are specifying a pattern that you expect to match at this position. This is translated into a call of an extractor method from Any to Option[String], and this method can be called with the value of type AnyRef you provide. You are just asserting that you do indeed get a result and not None.
In the other line you are using a type annotation, i.e. you are specifying the type of "f" explicitly. But then you are assigning an incompatible value.
Of course the compiler could add implicit type casts but making type casts so easy would not really suit a language like Scala.
You should also keep in mind that extractors have nothing to do with types. In the pattern Foo(x), the name Foo has nothing to do with the type Foo. It is just the name of an extractor method.
Using patterns in this way is of course quite a dynamic feature, but I think it is intentional.
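To see this concretely, the extractor line roughly desugars to a pattern match. A sketch:

case class Foo(s: String)
def anyRef(): AnyRef = Foo("hello")

// val Foo(x) = anyRef() behaves roughly like:
val x: String = anyRef() match {
  case Foo(s) => s // type test plus Foo.unapply
  case other  => throw new MatchError(other)
}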

How can I serialize anything without specifying type?

I'm integrating with ZeroMQ and Akka to send case classes from different instances. Problem is that when I try to compile this code:
def persistRelay(relayEvent: String, relayData: Any) = {
  val relayData = ser.serialize(relayData).fold(throw _, identity)
  relayPubSocket ! ZMQMessage(Seq(Frame(relayEvent), Frame(relayData)))
}
The Scala compiler throws back recursive value relayData needs type.
The case classes going in are all different and look like Team(users:List[Long], teamId:Long) and so on.
Is there a method to allow any type in the serializer or a workaround? I'd prefer to avoid writing a serializer for every single function creating the data unless absolutely necessary.
Thanks!
This isn't really a typing issue. The problem is:
val relayData = ser.serialize(relayData).fold(throw _, identity)
You're declaring a val relayData on the same line where you reference the method parameter relayData. The Scala compiler doesn't understand that you have/want two variables with the same name and instead interprets it as a recursive definition of val relayData. Changing the name of one of those variables fixes the error.
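A minimal sketch of that fix, assuming the ser, relayPubSocket, ZMQMessage and Frame values from the question:

def persistRelay(relayEvent: String, relayData: Any) = {
  // Renamed so it no longer shadows (and recursively refers to) the parameter
  val serializedData = ser.serialize(relayData).fold(throw _, identity)
  relayPubSocket ! ZMQMessage(Seq(Frame(relayEvent), Frame(serializedData)))
}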
Regardless, since you didn't quite follow what the Scala compiler was asking for, I think that it would also be good to fill you in on what it is that the compiler even wanted from you (even though it's advice that, if followed, probably would have just led to you getting yet another error that wouldn't seem to make a lot of sense, given the circumstances).
It said "recursive value relayData needs type". The meaning of this is that it wanted you to simply specify the type of relayData by having
val relayData = ...
become something like
val relayData: Serializable = ...
(or, in place of Serializable, use whatever type it was that you wanted relayData to have)
It needs this information in order to create a recursive definition. For instance, take the simple case of
val x = x + 1
This code is... bizarre, to say the least, but what I'm doing is defining x in a (shallowly) recursive way. But there's a problem: how can the compiler know what type to use for the inner x? It can't really determine the type through type inference, because type inference involves leveraging the type information of other definitions, and this definition requires x's type information. Now, we might be able to infer that I'm probably talking about an Int, but, theoretically, x could be so many things! In fact, here's the ambiguity in action:
val x: Int = x + 1 // Default value for an Int is '0'
x: Int = 1
val y: String = y + 1 // Default value for a String is 'null'
y: String = null1
All that really changed was the type annotation, but the results are drastically different, and this is only a very simple case! So, yeah, to summarize all this... in most cases, when it's complaining about recursive values needing types, you should just have some empathy for the poor compiler and give it the type information that it so direly craves. It would do the same for you, DeLongey! It would do the same for you!

Inject methods into existing classes

I want to come out a way to define a new method in some existing class in scala.
For example, I think the asInstanceOf[T] method has too long a name, I want to replace it with as[T].
A straightforward approach would be:
class WrappedAny(val a: Any) {
  def as[T] = a.asInstanceOf[T]
}
implicit def wrappingAny(a: Any): WrappedAny = new WrappedAny(a)
Is there a more natural way with less code?
Also, a strange thing happens when I try this:
scala> class A
defined class A
scala> implicit def toA(x: Any): A = x
toA: (x: Any)A
scala> toA(1)
And the console hangs. It seems that toA(Any) should not pass the type-checking phase, and it doesn't when it's not implicit. Putting all the code into an external source file produces the same problem. How did this happen? Is it a bug in the compiler (version 2.8.0)?
There's nothing technically wrong with your approach to pimping Any, although I think it's generally ill-advised. Likewise, there's a reason asInstanceOf and isInstanceOf are so verbosely named; it's to discourage you from using them! There's almost certainly a better, statically type-safe way to do whatever you're trying to do.
Regarding the example which causes your console to hang: the declared type of toA is Any => A, yet you've defined its result as x, which has type Any, not A. How can this possibly compile? Well, remember that when an apparent type error occurs, the compiler looks around for any available implicit conversions to resolve the problem. In this case, it needs an implicit conversion Any => A... and finds one: toA! So the reason toA type checks is because the compiler is implicitly redefining it as:
implicit def toA(x: Any): A = toA(x)
... which of course results in infinite recursion when you try to use it.
In your second example you are passing Any to a function that must return A. However, it never returns A but the same Any you passed in. The compiler then tries to apply the implicit conversion, which in turn does not return an A but Any, and so on.
If you define toA as not being implicit you get:
scala> def toA(x: Any): A = x
<console>:6: error: type mismatch;
 found   : Any
 required: A
       def toA(x: Any): A = x
                            ^
As it happens, this has been discussed on the Scala mailing lists before. The pimp-my-class pattern is indeed a bit verbose for what it does, and, perhaps, there might be a way to clean up the syntax without introducing new keywords.
The bit about new keywords is that one of Scala's goals is to make the language scalable through libraries, instead of turning the language into a giant quilt of ideas that passed someone's criteria for "useful enough to add to the language" while, at the same time, making other ideas impossible because they weren't deemed useful and/or common enough.
Anyway, nothing so far has come up, and I haven't heard that there is any work in progress towards that goal. You are welcome to join the community through its mailing lists and contribute to its development.
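For reference: implicit classes, added later in Scala 2.10, trim exactly this boilerplate by merging the wrapper class and the implicit conversion into one definition. A sketch of the question's example in that style:

// Must live inside an object, class or trait; implicit classes cannot be top-level
implicit class WrappedAny(val a: Any) {
  def as[T]: T = a.asInstanceOf[T]
}

"hello".as[String] // usage: no separate implicit def needed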