Can't create Breeze DenseMatrix with Scala 3

When trying to create a dense matrix of type Option with Scala 3 I receive an error.
val dm1 = DenseMatrix((1,2),(1,2)) // <- this works
val dm2 = DenseMatrix((Some(1),Some(2)),(Some(1),Some(2))) // <- doesn't work
Error: no implicit argument of type breeze.storage.Zero[V] was found for parameter zero of method apply in trait MatrixConstructors
Btw, it is working in Scastie and Scala 2.
https://scastie.scala-lang.org/89HUyuXNQrqWDPNpRbtrOw

Try adding the necessary implicit:
implicit val optIntZero: Zero[Option[Int]] = Zero(Some(0))
implicit val someIntZero: Zero[Some[Int]] = Zero(Some(0))
or more generally
implicit def optZero[A](implicit zero: Zero[A]): Zero[Option[A]] = Zero(Some(zero.zero))
implicit def someZero[A](implicit zero: Zero[A]): Zero[Some[A]] = Zero(Some(zero.zero))
or just
implicit def someZero[F[t] >: Some[t], A](implicit zero: Zero[A]): Zero[F[A]] = Zero(Some(zero.zero))
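For example (a minimal sketch, assuming Breeze is on the classpath and reusing the generic optZero instance from above; Option(...) is used so the element type is inferred as Option[Int]):
import breeze.linalg.DenseMatrix
import breeze.storage.Zero

implicit def optZero[A](implicit zero: Zero[A]): Zero[Option[A]] = Zero(Some(zero.zero))

// the element type V is Option[Int], so the Zero[Option[Int]] above satisfies the constraint
val dm2 = DenseMatrix((Option(1), Option(2)), (Option(1), Option(2)))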
Problem with evidence parameters in breeze

Another easy approach: create a "simple type" matrix and map it to the target type.
val v = DenseVector(1, 1, 1)
val dm1 = DenseMatrix(v,v)
val dm2 = dm1.map(Some(_))

Why is Future[Set[Unit]] not accepted as Future[Unit]?

A common gotcha when working with futures is that when you expect Future[Unit], even Future[Future[Unit]] will be accepted (see e.g. Why Shouldn’t You Use Future[Unit] as a Return Type in a Scala Program).
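For instance, this compiles only because the inner future's result is value-discarded (a minimal illustration of the gotcha):
import scala.concurrent._
import scala.concurrent.ExecutionContext.Implicits.global

// Future[Future[Unit]] sneaks through: the inner Future(()) is discarded to Unit
def f: Future[Unit] = Future(Future(()))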
I was surprised recently that Future.sequence(setOfFutures) is not accepted in such a situation:
import scala.concurrent._
import scala.concurrent.ExecutionContext.Implicits.global
val set = Set(Future(()))
def fuu: Future[Unit] = {
  Future.sequence(set)
}
With Scala 2.12.13 I get the error:
type mismatch;
found : scala.concurrent.Future[scala.collection.immutable.Set[Unit]]
With Scala 2.13 I get:
Cannot construct a collection of type Unit with elements of type Unit based on a collection of type scala.collection.immutable.Set[scala.concurrent.Future[Unit]].
When I change the body of the function to:
val s = Future.sequence(set)
s
I get the same error as before.
Why is Future[Future[Unit]] accepted as a Future[Unit], while Future[Set[Unit]] or Future[List[Unit]] is not?
Consider the signature of Future.sequence in Scala 2.13
def sequence[A, CC[X] <: IterableOnce[X], To](in: CC[Future[A]])(
implicit
bf: BuildFrom[CC[Future[A]], A, To],
executor: ExecutionContext
): Future[To]
so given
val set = Set(Future(()))
def fuu: Future[Unit] = Future.sequence(set)
then inference will assign the type parameters of sequence like so:
To = Unit
A = Unit
CC = Set
For example, consider fuu's declared return type: Future[Unit] = Future[To]. Hence we have
def fuu: Future[Unit] = Future.sequence[Unit, Set, Unit](set)
so the compiler needs to implicitly resolve the bf parameter:
scala> implicitly[BuildFrom[Set[Future[Unit]], Unit, Unit]]
^
error: Cannot construct a collection of type Unit with elements of type Unit based on a collection of type Set[scala.concurrent.Future[Unit]].
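For contrast, the instance the compiler can actually find is the one where To is the collection type (a quick check, assuming Scala 2.13):
import scala.collection.BuildFrom
import scala.concurrent.Future

// resolves fine, because To = Set[Unit] is something BuildFrom knows how to build
implicitly[BuildFrom[Set[Future[Unit]], Unit, Set[Unit]]]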
Now consider the Scala 2.12 signature of Future.sequence:
def sequence[A, M[X] <: TraversableOnce[X]](in: M[Future[A]])(
implicit
cbf: CanBuildFrom[M[Future[A]],A,M[A]],
executor: ExecutionContext
): Future[M[A]]
so given
val set = Set(Future(()))
def fuu: Future[Unit] = Future.sequence(set)
inference becomes
A = Unit
M = Set
so we have
def fuu: Future[Unit] = Future.sequence[Unit, Set](set)
where the compiler can successfully resolve the cbf parameter implicitly:
scala> implicitly[CanBuildFrom[Set[Future[Unit]],Unit,Set[Unit]]]
res4: scala.collection.generic.CanBuildFrom[Set[scala.concurrent.Future[Unit]],Unit,Set[Unit]] = scala.collection.generic.GenSetFactory$$anon$1@1bff70a6
hence we effectively have in 2.12 the following situation
scala> def fuu: Future[Unit] = Future.sequence(set) : Future[Set[Unit]]
<console>:25: error: type mismatch;
found : scala.concurrent.Future[Set[Unit]]
required: scala.concurrent.Future[Unit]
def fuu: Future[Unit] = Future.sequence(set) : Future[Set[Unit]]
This should explain that the difference between the compiler error messages in the two Scala versions is not related to value discarding but to how inference assigns the corresponding type parameters.
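A minimal sketch of one way to make fuu compile, given that Future.sequence inherently produces a Future[Set[Unit]] here: discard the Set explicitly.
def fuu: Future[Unit] =
  Future.sequence(set).map(_ => ())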

How to use the mutable.AnyRefMap.fromZip?

This is my code:
val nums = (2 to 10).toList
val flags = List.tabulate(nums.size)(_ => 1)
val num_flags = mutable.AnyRefMap.fromZip(nums, flags)
It fails with the following compile error:
cmd6.sc:1: overloaded method value fromZip with alternatives:
[K <: AnyRef, V](keys: scala.collection.mutable.Iterable[K], values: scala.collection.mutable.Iterable[V])scala.collection.mutable.AnyRefMap[K,V] <and>
[K <: AnyRef, V](keys: Array[K], values: Array[V])scala.collection.mutable.AnyRefMap[K,V]
cannot be applied to (List[Int], List[Int])
val num_flags = mutable.AnyRefMap.fromZip(nums, flags)
^
Compilation Failed.
How do I use mutable.AnyRefMap.fromZip? And why the error?
If you look at the AnyRefMap.fromZip method declaration, you'll see the following constraint:
K <: AnyRef
This means that K has to be a subtype of AnyRef, whereas Int is a subtype of AnyVal. Thus, the compiler fails, telling you the constraints do not match.
I don't see a reason to use AnyRefMap here; if you want a mutable map, just use mutable.Map:
import scala.collection.mutable

def main(args: Array[String]): Unit = {
  val keys = (2 to 10).toList
  val values = List.tabulate(keys.length)(_ => 1)
  val numFlags = mutable.Map(keys.zip(values): _*)
}
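If you specifically want an AnyRefMap, the keys themselves must be AnyRef values; a sketch using String keys instead (a hypothetical adaptation, not from the original question):
import scala.collection.mutable

val keys = (2 to 10).map(_.toString).toArray   // String keys are AnyRef
val values = Array.fill(keys.length)(1)
val numFlags = mutable.AnyRefMap.fromZip(keys, values)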

How do you implicitly convert to Java autoboxed types in Scala?

I've seen how we can go from Java to Scala, e.g. from Scala can't multiply java Doubles?:
implicit def javaToScalaDouble(d: java.lang.Double): Double = d.doubleValue
I tried something naive to go in the other direction:
implicit def toJavaDouble(d: Double): java.lang.Double = new java.lang.Double(d)
However, I still get a compilation error when a more complex type is involved:
Error:(123, 99) type mismatch;
found : java.util.Map[String,java.util.function.Function[edu.xxx.SimulationTimestep,scala.Double]]
required: java.util.Map[String,java.util.function.Function[edu.xxx.SimulationTimestep,java.lang.Double]]
EconomicTimeSeriesCSVWriter.write(timesteps(simulationId, cowId), new File(selected), datasetFunctions, isParallelizable)
The problem is with the type of datasetFunctions; it is defined as:
val datasetFunctions: util.Map[String, Function[SimulationTimestep, Double]] = ...
I've also imported JavaConversions._, though that doesn't seem to help with autoboxed types from what I can see so far.
Edited to add a simplified example:
import java.util
import scala.collection.JavaConversions._
type O = java.lang.Object
type JD = java.lang.Double
implicit def javaToScalaDouble(d: JD): Double = d.doubleValue
implicit def scalaToJavaDouble(d: Double): JD = new JD(d)
def myJavaFun(in: util.Map[O, JD]): Unit = {}
val myMap: util.Map[O, Double] = new util.HashMap[O, Double]()
//Doesn't work
// myJavaFun(myMap)
//Doesn't seem to work either
myJavaFun(mapAsJavaMap[O, JD](myMap.map(x => x).toSeq.toMap))
// A more real example (if it should make any difference)
//def myJavaFun2(in: util.Map[String, Function[O, Double]]): Unit = {}
Two problems:
Having an implicit B => C won't automatically let you use a Map[A, B] as a Map[A, C]. You need to do something like .mapValues(x => x) to provide some place for the implicit to apply.
The implicits between scala.Double and java.lang.Double are already defined in Predef. Your redefinitions of them create an ambiguity which prevents them from applying.
Your example, fixed and trimmed down:
import scala.collection.JavaConversions._
def myJavaFun(in: java.util.Map[Object, java.lang.Double]): Unit = {}
val myMap = new java.util.HashMap[Object, Double]()
myJavaFun(mapAsJavaMap(myMap.mapValues(x => x)))
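On newer Scala versions where JavaConversions has been removed, roughly the same fix can be written with explicit converters and an explicit boxing step (a sketch, assuming Scala 2.13's scala.jdk.CollectionConverters):
import scala.jdk.CollectionConverters._

def myJavaFun(in: java.util.Map[Object, java.lang.Double]): Unit = {}

val myMap = new java.util.HashMap[Object, Double]()
// ascribing java.lang.Double lets Predef's double2Double box each value
myJavaFun(myMap.asScala.map { case (k, v) => k -> (v: java.lang.Double) }.asJava)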

Scala implicit conversion of apply method

I tried the following to create an option-checking style in code:
object Test {
  trait Check
  object x extends Check

  def option() = false
  def option(xx: Check) = true

  implicit class AugmentCheck(s: String) {
    def apply() = ""
    def apply(xx: Check) = s
  }
}
import Test._
val delete = option ( x )
val force = option ( )
val res = "mystring" ( )
But I face the following problem:
<console>:11: error: type mismatch;
found : String("mystring")
required: ?{def apply: ?}
Note that implicit conversions are not applicable because they are ambiguous:
both method augmentString in object Predef of type (x: String)scala.collection.
immutable.StringOps
and method AugmentCheck in object Test of type (s: String)Test.AugmentCheck
are possible conversion functions from String("mystring") to ?{def apply: ?}
val res = "mystring" ( )
^
<console>:11: error: String("mystring") does not take parameters
val res = "mystring" ( )
This is unfortunate, because even if there is ambiguity in the symbol name, there shouldn't be any ambiguity in the function signature.
If I call the method "apply" explicitly, it works fine when there is an argument, but not without one:
val res = "mystring" apply ( x )
res == ""
How can I remove the keyword "apply" in this case?
Instead of representing the on and off states with x or an empty argument list, what about using y for yes and n for no?
object Test {
  val y = true
  val n = false

  class DefaultConfigValue[U](val v: U)

  implicit class ConfigValue[U](v: U) {
    def y = v
    def n(implicit default: DefaultConfigValue[U]) = default.v
  }

  implicit val defaultConfigString = new DefaultConfigValue("")
}
import Test._
import language.postfixOps
val delete = y
val force = n
val res1 = "my first string" y
val res2 = "my second string" n
Notice how you can import postfixOps and then just add extra whitespace to align all the ys and ns for your string configuration options into the same column.
I'm not sure if this makes sense, but I wrote the above code such that you should be able to use any type (not just strings) for the options with the y and n postfix operators. All you have to do to enable a new type is create an implicit DefaultConfigValue for that type so that the no option has something to return.
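For instance, enabling Int-valued options under the same pattern might look like this (a sketch, assuming the Test object above is in scope; defaultConfigInt is a name I made up):
implicit val defaultConfigInt = new DefaultConfigValue(0)

val retries = 3 y   // 3
val timeout = 5 n   // 0, the default for Int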
I'd strongly recommend against using this pattern, as it violates the principle of least surprise.
StringOps already defines apply(n) on a String to return the nth character; any other overload is going to confuse a lot of people, even in the context of a DSL.
Option is also a well-understood term, whose meaning is overloaded here.
Finally: x is very commonly used as a parameter to methods and as a bound value in pattern-matching. Using it in a wide scope like this is going to cause a lot of shadowing, and risks causing some very unexpected error messages.
Can you give a better idea of what you're trying to achieve so that we can suggest some better alternatives?

Java SortedMap to Scala TreeMap

I'm having trouble converting a Java SortedMap into a Scala TreeMap. The SortedMap comes from deserialization and needs to be converted into a Scala structure before being used.
Some background, for the curious, is that the serialized structure is written through XStream and on deserializing I register a converter that says anything that can be assigned to SortedMap[Comparable[_],_] should be given to me. So my convert method gets called and is given an Object that I can safely cast because I know it's of type SortedMap[Comparable[_],_]. That's where it gets interesting. Here's some sample code that might help explain it.
// a conversion from comparable to ordering
scala> implicit def comparable2ordering[A <: Comparable[A]](x: A): Ordering[A] = new Ordering[A] {
| def compare(x: A, y: A) = x.compareTo(y)
| }
comparable2ordering: [A <: java.lang.Comparable[A]](x: A)Ordering[A]
// jm is how I see the map in the converter. Just as an object. I know the key
// is of type Comparable[_]
scala> val jm : Object = new java.util.TreeMap[Comparable[_], String]()
jm: java.lang.Object = {}
// It's safe to cast as the converter only gets called for SortedMap[Comparable[_],_]
scala> val b = jm.asInstanceOf[java.util.SortedMap[Comparable[_],_]]
b: java.util.SortedMap[java.lang.Comparable[_], _] = {}
// Now I want to convert this to a tree map
scala> collection.immutable.TreeMap() ++ (for(k <- b.keySet) yield { (k, b.get(k)) })
<console>:15: error: diverging implicit expansion for type Ordering[A]
starting with method Tuple9 in object Ordering
collection.immutable.TreeMap() ++ (for(k <- b.keySet) yield { (k, b.get(k)) })
Firstly, to clarify your error:
// The type inferencer can't guess what you mean, you need to provide type arguments.
// new collection.immutable.TreeMap
// <console>:8: error: diverging implicit expansion for type Ordering[A]
//starting with method Tuple9 in object Ordering
// new collection.immutable.TreeMap
// ^
You can write an implicit to treat Comparable[T] as Ordering[T] as follows.
// This implicit only needs the type parameter.
implicit def comparable2ordering[A <: Comparable[A]]: Ordering[A] = new Ordering[A] {
  def compare(x: A, y: A) = x.compareTo(y)
}
trait T extends Comparable[T]
implicitly[Ordering[T]]
However, if you really don't know the type of the key, I don't think you can create the Ordering in terms of Comparable#compareTo, at least without reflection:
val comparableOrdering = new Ordering[AnyRef] {
  def compare(a: AnyRef, b: AnyRef) = {
    val m = classOf[Comparable[_]].getMethod("compareTo", classOf[Object])
    m.invoke(a, b).asInstanceOf[Int]
  }
}
new collection.immutable.TreeMap[AnyRef, AnyRef]()(comparableOrdering)
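Putting those two pieces together, a hedged sketch of converting the b from the question with that reflection-based ordering (untested; it assumes something like import scala.collection.JavaConversions._ so that b.keySet can be iterated, as the question's own snippet already does):
val tm = collection.immutable.TreeMap[AnyRef, Any]()(comparableOrdering) ++
  (for (k <- b.keySet) yield (k: AnyRef) -> b.get(k))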
You can probably also just give an explicit type to the TreeMap. That's how I just solved a similar problem:
collection.immutable.TreeMap[whatever,whatever]() ++ ...
(Sorry, I don't have the time to check how exactly this applies to the sources posted in the question.)