Why are Map and Set aliased in scala.Predef? - scala

Nine times out of ten, simply using Map and Set behaves the way I expect, but occasionally I am unexpectedly hit with
error: type mismatch;
[INFO] found : scala.collection.Set[String]
[INFO] required: Set[String]
As an example, from the REPL:
scala> case class Calculator[+T](name: String, parameters: Set[String])
defined class Calculator
scala> val binding=Map.empty[String, String]
binding: scala.collection.immutable.Map[String,String] = Map()
scala> Calculator("Hello",binding.keySet)
<console>:9: error: type mismatch;
found : scala.collection.Set[String]
required: Set[String]
Calculator("Hello",binding.keySet)
^
I think I understand the error: that is, the method calls on the aliased types return the actual, non-aliased types.
And so it seems to me the solution is to import the un-aliased types. But then every other file in my project will generate type mismatch errors, so I will have to add the import to each file. Which leads to the question I ask in the title -- what was the purpose of the alias in Predef, if I eventually need to import the actual package anyway?
Is my understanding flawed, or is my use case not the typical one, or both?

You have misdiagnosed the problem. It isn't that it doesn't recognize the type alias is the same type as what it is aliasing. It's that the type alias is scala.collection.immutable.Set and that is not the same as scala.collection.Set.
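A small REPL check (not from the answer itself) makes that diagnosis concrete -- Predef's Set really is scala.collection.immutable.Set and nothing wider:
// Compiles, because Predef.Set is an alias for scala.collection.immutable.Set
implicitly[Set[String] =:= scala.collection.immutable.Set[String]]
// Does not compile: the general, read-only collection.Set is a different type
// implicitly[Set[String] =:= scala.collection.Set[String]]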
Edit: by the way, I thought I'd fixed this, as evinced by the comment in the type diagnostics:
... Also, if the
* type error is because of a conflict between two identically named
* classes and one is in package scala, fully qualify the name so one
* need not deduce why "java.util.Iterator" and "Iterator" don't match.
Apparently needs more work.
Edit 7/17/2010: OK, it took me a shockingly long time, but now at least it says something hard to misunderstand.
files/neg/type-diagnostics.scala:4: error: type mismatch;
found : scala.collection.Set[String]
required: scala.collection.immutable.Set[String]
def f = Calculator("Hello",binding.keySet)
^

The real problem is that scala.collection.immutable.Map#keySet returns a scala.collection.Set (a read-only Set) instead of a scala.collection.immutable.Set (an immutable Set). I'll leave it for someone else to explain why that is...
Edit
Someone asks for an explanation for the return type of Map#keySet in this thread, but doesn't get an answer.
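For what it's worth, here is a sketch of the two obvious workarounds implied by that diagnosis, using the Calculator definition from the question (neither comes from the answers above):
// Option 1: widen the parameter type to the read-only collection.Set
case class Calculator[+T](name: String, parameters: scala.collection.Set[String])

// Option 2: keep Predef's (immutable) Set and convert at the call site
Calculator("Hello", binding.keySet.toSet)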

Related

IntelliJ IDEA: default parameter values in Scala

In Scala REPL I can use Seq[String]() as a default value for a parameter of type Seq[T].
Welcome to Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101).
Type in expressions to have them evaluated.
Type :help for more information.
scala> def d[T](foo: Seq[T] = Seq[String]()) = 12
d: [T](foo: Seq[T])Int
scala> d()
res0: Int = 12
Trying the same in IDEA, it complains “Seq[String] doesn't conform to expected type Seq[T]”. Why?
IntelliJ IDEA 2016.2.4
Scala Plugin 2016.2.1
Scala 2.11.7
Note 1: Sorry, I know that my example function does not make much sense. However, my real (and useful) function is too complex to post here.
Note 2: At first, instead of T, the type name in my example was Any, which was not a good idea (because it shadowed scala.Any) and caused some confusion. Thus I fixed that.
When you say def d[Any], the Any here is a generic placeholder. It does not point to the class Any in scala; it shadows the Any class defined globally in Scala. So, when you assign Seq[String] to Seq[Any], the compiler does not know of any relation between String and this Any. Note that Any could be replaced by any other identifier as the generic placeholder; the result would be the same.
Now, coming to why this works in the REPL: I am not exactly sure why the REPL accepts Seq[String] when it is given as a default value, but I was able to reproduce the error in the REPL when I do the same assignment inside the method body.
The following code in REPL throws error:
def d[Any](foo: Seq[Any]) = {
  val a: Seq[Any] = Seq[String]()
}
<console>:12: error: type mismatch;
found : Seq[String]
required: Seq[Any]
val a: Seq[Any] = Seq[String]()
^
I am not sure why the REPL is not able to catch the error when the value is given as a default argument.
One alternative theory is that, in general, when you use generics, the concrete type is determined by the caller. For example,
def d[A](a:A) = {}
d(1) // Type of A is Int
d("a") // Type of A is String
So, when you give a default value, it assigns String as the value of Any; hence the compilation succeeds. IntelliJ's type checker works based on the first theory and shows an error. But strangely, as someone pointed out earlier, the compilation succeeds.
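If the goal is simply to avoid the mismatch, a hedged sketch that sidesteps the whole question is to make the default value generic in T instead of fixing it to String (this is an illustration, not taken from the question's real code):
def d[T](foo: Seq[T] = Seq.empty[T]) = 12

d()          // compiles; the default Seq.empty[T] fits whatever T is inferred
d(Seq(1, 2)) // T is Int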

Scala type system, constrain member's type by parameter of own type

Not really sure of the standard terminology here, so I'll try to describe what I'm trying to do. In case you're curious, the app I'm actually trying to write is an asynchronous task queue similar to Resque or rq.
I have a type TaskDef[ArgsT <: AnyVal, ResultT <: AnyVal]. In case you're curious, TaskDef represents "how to execute an asynchronous task which takes argument type ArgsT and result type ResultT, or, the code behind a task".
I'm trying to define a type TaskInst[DefT <: TaskDef]. In case you're curious, TaskInst represents "a TaskDef and associated argument to run it with, or, an actual task instance being submitted to the queue". TaskInst has two members, definition: DefT and arguments whose type I cannot write in code.
In English, my desired constraint is: "For a given DefT, where DefT is some TaskDef[ArgsT, ResultT], TaskInst[DefT] should contain a DefT and an ArgsT". That is, the argument type of the task definition should match the type of the argument given to the task.
How do I express this in the Scala type system?
Alternatively, am I modeling my domain incorrectly and attempting to do something un-idiomatic? Would some alternative approach be more idiomatic?
Thanks in advance!
EDIT:
I think my historical self writing Java would probably have resorted to unchecked casts at this point. This is definitely feasible with some amount of unchecked casts and just leaving out the constraint between the type of the TaskInst's arguments vs the type of the embedded TaskDef's arguments. But, I do wonder whether this is something the compiler can enforce, and hopefully without too scary a syntax.
Define them as abstract types:
trait TaskDef {
  type Arguments <: AnyVal
  type Result <: AnyVal
}
Then use a type projection:
trait TaskInst[DefT <: TaskDef] {
  def definition: DefT
  def arguments: DefT#Arguments
}
An add-on to the answer that @rightfold gave:
If you are looking to use type parameters throughout, you will need to properly pass the type parameters through to the type constructors.
Excuse me, that's a bit ambiguous stated that way, so let me use my current code as a concrete example.
trait TaskDef[ArgT_, ResT_] {
  type ArgT = ArgT_
  type ResT = ResT_
  val name: String
  def exec(arg: ArgT): String \/ ResT
}

class TaskInst[ArgT, ResT, DefT <: TaskDef[ArgT, ResT]] (
  val id: UUID,
  val defn: DefT,
  val arg: ArgT
)
The main divergence of my current code from @rightfold's example is that TaskDef takes type parameters. Then, when TaskInst's declaration references TaskDef, it must provide appropriate types to the type constructor.
I initially made the mistake of passing in placeholders. That is, I was writing class TaskInst[DefT <: TaskDef[_, _]]. It turns out this doesn't mean what I thought it meant. (Perhaps others might be inclined to follow the same line of thought, so this is just a warning not to.) Don't do that, because scalac will then interpret each underscore as a freshly generated placeholder type (which, as you might imagine, nothing matches), and you get an obscure error message like the following.
[error] /Users/mingp/Code/scala-redis-queue/src/main/scala/io/mingp/srq/core/TaskInst.scala:8: type mismatch;
[error] found : TaskInst.this.arg.type (with underlying type _$1)
[error] required: _$1
[error] val arg: DefT#ArgT_
[error] ^
[error] one error found
Just posting this in hopes that future readers don't fall into the same hole I did.
EDIT:
As a further addendum, now that I've tried it out for a day, my impression is that this isn't actually a good data model for asynchronous tasks. You're better off combining, since stand-alone instances of TaskDef aren't really useful.

[Akka]: Passing generic type function without type loss

I have an actor message of the following type:
case class RetrieveEntities[A](func:(Vector[A]) => Vector[A])
I then would like to handle the message in the following way:
def receive = {
  case RetrieveEntities(parameters, func) =>
    context.become(retrieveEntities(func))

def retrieveEntities(func: (Vector[T]) => Vector[T])(implicit mf: Manifest[T]) {
  case _ => ...
}
And I instantiate the actor in the following way:
TestActorRef(new RetrieveEntitiesService[Picture])
The problem is I receive the following compiler error:
type mismatch;
[error] found : Vector[Any] => Vector[Any]
[error] required: Vector[T] => Vector[T]
[error] context.become(retrieveEntities(func))
Which I suppose means I lost the type information but I am unsure why and how to prevent it.
Your example code is a bit too short to give you a solution, but from what you show, it seems like what you are trying to do is not possible.
This is why
In Scala (and Java) the type parameters are erased, which means they disappear after compilation, so during runtime they are no longer available. This means that your pattern match on RetrieveEntities(parameters, func) is really a match where A can be anything. You then go on and call a method that is typed with T and there is no way for the compiler to know what you mean with that.
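As a minimal, self-contained illustration of that erasure point (this snippet is not from the question's code), a pattern match cannot see the element type at runtime:
def describe(x: Any): String = x match {
  case _: Vector[String] => "a Vector[String]" // unchecked: the element type is erased
  case _                 => "something else"
}

describe(Vector(1, 2, 3)) // returns "a Vector[String]" -- only the Vector part survives erasure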
Manifest (which is deprecated), TypeTag and ClassTag are mechanisms that tell the compiler to create an object carrying type information that is still available after compilation, but you have to "save" that information.
To be able to know which A your RetrieveEntitiesService was typed with, you would need to take an implicit ClassTag in the constructor and base any logic on it (the constructor call is the moment at which you know what A is):
import scala.reflect.ClassTag
case class RetrieveEntities[A](func:(Vector[A]) => Vector[A])(implicit val tag: ClassTag[A])
You could then call runtimeClass on the tag to get the type of A:
scala> val retrieve = RetrieveEntities[String](identity)
scala> retrieve.tag.runtimeClass
res2: Class[_] = class java.lang.String
Note that this still would not let you type a method call with it, since we are now at runtime, but it would let you use that instance of Class to compare with the runtimeClass of the actor's E and then do a safe cast to RetrieveEntities[E] if you like (as well as regular runtime conditional flows, reflection, etc.).
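A rough sketch of what that runtime comparison and cast could look like inside the actor, assuming the ClassTag-carrying message defined above (the actor name and the guard-then-cast structure here are illustrative, not the question's actual code):
import akka.actor.Actor
import scala.reflect.ClassTag

class RetrieveEntitiesService[E](implicit tag: ClassTag[E]) extends Actor {
  def receive = {
    // Match untyped, then compare the message's stored runtime class with E's
    case r: RetrieveEntities[_] if r.tag.runtimeClass == tag.runtimeClass =>
      val typed = r.asInstanceOf[RetrieveEntities[E]] // safe after the class check above
      // typed.func is now known to be Vector[E] => Vector[E]
  }
}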
Two important notes before you start doing that
I would not advise you to go down that path until you are more confident with the type system and really, really know that there is no other reasonable design that solves your problem. Again, I cannot help you towards such a solution with the sparse example code given. (Maybe your actor does not really need to know about the type of A, for example, or there is a limited set of Es that you might match on with concrete types.)
As an additional warning, type and class tags are not thread safe in Scala 2.10, and might not be safe in 2.11 either, so mixing them with actors might be a bad idea. (http://docs.scala-lang.org/overviews/reflection/thread-safety.html)
johanandren's answer was certainly helpful, but in the end I found a way to make it compile, and it is working for now.
I needed to give the compiler a more precise type annotation to make it work:
case RetrieveEntities(parameters, func:(Vector[T]) => Vector[T])
I will still continue to use Manifest instead of the newer reflection API (TypeTag and ClassTag), mainly because the library I am using (json4s) also uses the Manifest implementation internally, and I assume it will lead to fewer problems this way.

Strange type mismatch error

I have a table column errorFixed of type TableColumn[Error, Boolean] inside a TableView[Error]. My Error class has a val fixed: Boolean which I try to put into this table column.
I tried
errorFixed.cellValueFactory = features =>
  ReadOnlyBooleanWrapper(features.value.fixed)
but it fails with
type mismatch;
found : scalafx.beans.property.ReadOnlyBooleanWrapper
required: scalafx.beans.value.ObservableValue[Boolean,Boolean]
which I really don't understand as ObservableValue[Boolean,Boolean] is a supertype of ReadOnlyBooleanWrapper according to the documentation.
If I cast it myself using .asInstanceOf[ObservableValue[Boolean, Boolean]] it seems to work. What is going on here?
Gist with stripped down project to reproduce
The short answer is: instead of
errorFixed.cellValueFactory = features =>
  ReadOnlyBooleanWrapper(features.value.fixed)
you should use
errorFixed.cellValueFactory = features =>
  ObjectProperty[Boolean](features.value.fixed)
or ReadOnlyObjectWrapper[Boolean].
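For completeness, a hedged sketch of the ReadOnlyObjectWrapper variant mentioned above, assuming its companion object offers an apply factory analogous to ObjectProperty's:
import scalafx.beans.property.ReadOnlyObjectWrapper

errorFixed.cellValueFactory = features =>
  ReadOnlyObjectWrapper[Boolean](features.value.fixed)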
A brief version of the long answer: there are certain "frictions" between Scala and Java when working with primitive Java types, like boolean or int. This inconvenience shows up in property binding in ScalaFX; not everything is inherited in an intuitive way. In this case
ReadOnlyBooleanWrapper
is a subclass of
ObservableValue[scala.Boolean, java.lang.Boolean]
but scala.Boolean is not a subclass of java.lang.Boolean, which, internally in ScalaFX, leads to complications. The interesting thing is that the cast .asInstanceOf[ObservableValue[scala.Boolean, scala.Boolean]] works, even though the type parameters do not match at compile time.
Thanks for posting a full code example (gist); this really helps in clarifying the question.

How to use LongSerializer in scala with hector?

val mutator = HFactory.createMutator(keyspace, StringSerializer.get())
mutator.addInsertion("rahul", "user", HFactory.createColumn("birth_year", 1990,
  StringSerializer.get(), LongSerializer.get())) // error in LongSerializer.get() as
mutator.execute()
I am using LongSerializer as shown above, and I am getting the following error.
type mismatch;
 found   : me.prettyprint.cassandra.serializers.LongSerializer
 required: me.prettyprint.hector.api.Serializer[Any]
Note: Long <: Any (and me.prettyprint.cassandra.serializers.LongSerializer <: me.prettyprint.cassandra.serializers.AbstractSerializer[Long]), but Java-defined trait Serializer is invariant in type T. You may wish to investigate a wildcard type such as `_ <: Any`. (SLS 3.2.10)
(Reported in User.scala, /winoria/app/models, line 22.)
Tell me the solution.
There are a couple of things happening here.
First, Java does not allow primitive types as generics, so Hector's LongSerializer is an AbstractSerializer[java.lang.Long]. However, you are working in Scala, so you need an AbstractSerializer[scala.Long]. Depending on circumstances, Scala's Long can be either a primitive long or a java.lang.Long. The good news is that Scala is smart enough to figure out which to use and when. What you need here is a little type coercion: LongSerializer.get().asInstanceOf[Serializer[Long]]
The other suspect is that you need me.prettyprint.hector.api.Serializer[Any]. It looks like whatever you are calling your method on is lacking proper type declarations. A sample of surrounding code would help diagnose this further.
Edit:
Thanks for posting the surrounding code. It looks like you have a type disagreement. createColumn[N, V] is inferred as createColumn[String, Int], because the 1990 argument you have provided is an Int. This gets converted to java.lang.Integer, which is a class and does not have type conversions like primitives do. This is why you get the error "int cannot be cast to long".
val mutator = HFactory.createMutator(keyspace, StringSerializer.get)
mutator.addInsertion(
  "rahul", "user",
  HFactory.createColumn( // Either do: HFactory.createColumn[String, Long]
    "birth_year", 1990L,  // Or make sure the type is inferred as Long
    StringSerializer.get,
    // Coerce serializer to scala.Long
    LongSerializer.get.asInstanceOf[Serializer[Long]]))
mutator.execute()