Addendum: This seems to be a Scala IDE bug, because everything compiles and runs smoothly when using command-line sbt. I'd close the question, but StackOverflow won't allow it.
I am unable to add or multiply a matrix by a scalar in Breeze.
If I try:
val z = DenseMatrix.zeros[Double](5,3)
z + 2.0
I get two errors:
could not find implicit value for parameter op: breeze.linalg.operators.OpAdd.Impl2[breeze.linalg.DenseMatrix[Double],Double,That]
not enough arguments for method +: (implicit op: breeze.linalg.operators.OpAdd.Impl2[breeze.linalg.DenseMatrix[Double],Double,That])That. Unspecified value parameter op.
The same thing happens if I try *, :*, *:, :+ and +:, with slightly different errors.
Things work if I use a DenseVector instead of a DenseMatrix and use :+.
If I try it in a Scala IDE worksheet, it shows the error but still correctly prints the resulting matrix.
I'm using Scala IDE 4.4.1, Breeze 0.12 and Scala 2.11.8.
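For reference, here is a minimal self-contained version (the object name is mine) that, as the addendum says, compiles and runs fine with plain command-line sbt, which suggests the errors come only from the IDE's presentation compiler:
import breeze.linalg._

object ScalarOps extends App {
  // The operations Scala IDE flags; scalac itself accepts them.
  val z = DenseMatrix.zeros[Double](5, 3)
  println(z + 2.0) // element-wise addition of a scalar
  println(z * 2.0) // element-wise multiplication by a scalar
}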
If someone faces this problem, the solution is quite simple.
Step 1
Close the project or Scala IDE
Step 2
From sbt, run clean and then the eclipse command.
Step 3
Open Eclipse again.
Eclipse should resolve the implicit conversions.
Related
I recently revived an old library that was written in Scala 2.9, and I created a new Scala project using Scala 2.13.2.
I am getting errors like the following:
type mismatch;
 found   : scala.collection.mutable.Buffer[Any]
 required: Seq[Any]
Was there a specific change between 2.9 and 2.13.2 that stopped implicitly converting sequences, or is there something that might resolve many of these kinds of compile errors?
I had to add .toSeq to many of my functions' return values that were vals of Buffer[Any] and needed to be passed as arguments to functions expecting a Seq.
Quite a lot happened in the last 7+ years (including a rewrite of the collections library).
If adding .toSeq solves your problem, just go for it.
If you want to know exactly what changed, try upgrading version by version: first upgrade to Scala 2.10.*, then to 2.11.*, then 2.12.*, and finally to 2.13.2.
At each upgrade you'll probably see deprecation warnings. Fix them before upgrading to the next version.
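For illustration, a hypothetical sketch of the kind of fix described (the names are made up); it relies on the fact that in 2.13 Seq means immutable.Seq, so a mutable.Buffer no longer conforms to it directly:
import scala.collection.mutable

def collectThings(): Seq[Any] = {
  val buf = mutable.Buffer[Any](1, "two", 3.0)
  buf.toSeq // required in 2.13: copies the mutable buffer into an immutable Seq
}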
Brave, but perhaps bad form, to disturb the dead. Nevertheless, maybe pass the mutable.Buffer as mutable.Seq instead of Seq, which by default means immutable.Seq. Consider:
val mb = mutable.Buffer(11, Some(42))
val ms: mutable.Seq[Any] = mb // OK
val is: Seq[Any] = mb // NOK
I'm using Breeze to do some simple linear algebra operations on dense matrices, in IntelliJ IDEA. Here is a snippet of my code:
import breeze.linalg._
val X1:DenseMatrix[Double] = DenseMatrix.zeros[Double](10, 5) + 1.0
val n1: Double = X1.rows.toDouble
val one_tall_t1 = DenseMatrix.zeros[Double](1, n1.toInt) + 1.0
val mu1 = one_tall_t1 * X1 / n1
In the last line, the symbols * and / are shown in red in the IDE. The error message is "Cannot resolve symbol *".
But IntelliJ builds the program without any errors, and it runs fine.
I've been trying to find out the reason: since I'm new to Scala, I'm not sure if it is because of IntelliJ, Breeze, or just my code. In some posts, people have suggested invalidating the caches and restarting IntelliJ, but this does not solve my issue.
I appreciate your comments or solutions!
IntelliJ gets confused by complex implicit searches like those used in Breeze. I file bugs when I can minimize them and get around to it, but it's a slog. (Eclipse, for what it's worth, isn't much better.)
It typically works better if you're just depending on Breeze, not developing inside of it. I assume you're doing that already though.
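One workaround I'd try (this is an assumption on my part, not something IntelliJ documents) is ascribing the expected result type on the line IntelliJ complains about, reusing the names from the snippet above; it sometimes gives the presentation compiler enough to resolve Breeze's operator implicits, and it changes nothing for scalac:
// Same computation as before, with the result type spelled out as a hint for the IDE.
val mu1: DenseMatrix[Double] = one_tall_t1 * X1 / n1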
val finalRDD = joinedRDD.map(x => {
val d1 = x._2._1
val d2 = x._2._2
(x._1, d1 + d2)
})
In the above code, joinedRDD has type RDD[(Row, (Double, Double))] (according to IntelliJ), while the Scala compiler says d1 and d2 are AnyVal.
For the time being I cast d1 and d2 to Double using asInstanceOf, but then it fails at runtime with
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Double
Is it a Scala compiler issue or an IntelliJ issue that shows me the wrong inferred types? Any insights?
Seems good to me :-S
Type inference is far from omniscient. Sometimes you need to specify the types explicitly. In my experience, this is especially true when the result type can be anything. Some things to try:
My preferred option since you are not touching the key: joinedRDD.mapValues(x => x._1 + x._2)
Add some type information: val d1: Double = x._2._1. With some luck, at least the compiler might be more explicit.
Define your function separately, assigning types to the parameters, and use it inside: map(myFunc) (a sketch combining this with the first option follows below)
Also, I've seen some differences between IntelliJ Scala Plugin and the actual Scala compiler. Given the errors you are getting and the fact that AnyVal is the common parent class for both Int and Double, there is a good chance you don't have doubles to begin with (and the compiler is trying to find a shared parent). Do double check that you are getting the type you mention by putting it explicitly. It is very possible that your type confusion occurs before this line.
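Putting the first and third suggestions together, here is a sketch that assumes joinedRDD really has the type IntelliJ reports (the helper names are placeholders):
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row

// Explicit parameter types make the real compiler confirm the value type;
// mapValues leaves the key untouched and preserves the partitioner.
def sumPair(pair: (Double, Double)): Double = pair._1 + pair._2

def addValues(joinedRDD: RDD[(Row, (Double, Double))]): RDD[(Row, Double)] =
  joinedRDD.mapValues(sumPair)

If the values are not actually (Double, Double), the call to mapValues(sumPair) will fail to compile with a message naming the real element type, which settles the original question.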
Good luck!
Well, I tried in IntelliJ IDEA 14 and the type inference is correct, recognizing d1 and d2 as Double (as expected). Nonetheless, I usually avoid the type-aware highlighting feature of IDEA since it often goes haywire and reports false errors.
As a side note, since you are not changing the key of your RDD, consider using mapValues instead of map (this provides clarity, as well as performance since it would take advantage of the partitioner of the input RDD and reuse it in the output RDD).
If I create a Scala Worksheet in Eclipse as follows:
object negative {
2.toString //> res0: String = 2
(2).toString //> res1: String = 2
// compile error
(-2).toString
}
the final line causes a compile error:
';' expected but ')' found. illegal start of simple expression
However, the same three lines compile and run fine within a normal Scala source file.
Why does this not work in the worksheet?
This is using Eclipse 3.7.2, Scala IDE 3.0.0.v-2_10, Scala Worksheet 0.1.4.v-2_10
[Updated: this question originally used toBinaryString, but the problem occurs even with toString, so I have simplified it]
It is a bug. The code in the main object (the first one) of a worksheet is instrumented before being executed. In the two cases mentioned, the result of the instrumentation is not valid Scala code.
But it is only a problem if the code is at the top level in the main object. If the code is moved to a function or a different object in the same file, it works fine.
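For example, a sketch of that workaround (the object and method names are placeholders):
object negative2 {
  // The same expression, but inside a def instead of at the worksheet's top level,
  // so the worksheet instrumentation no longer mangles it.
  def minusTwo: String = (-2).toString
  minusTwo
}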
Eclipse worksheets are quite beta; for example, last I checked they couldn't handle a @tailrec annotation on a function.
So this is most probably a bug or limitation in Eclipse. After all, the feature seems quite new, and there are many other bugs.
(-2).toBinaryString
gives the same error for me.
Note that java.lang.Integer.toBinaryString(-2) works just fine.
Back when reflection was still incipient, in the days of the Scala 2.10.0 milestones, I asked a question about how I could use it to see the trees of code snippets from the REPL. The excellent answer went further than I asked and showed how it can be used to parse and evaluate trees as well, so I went ahead and tried to use that on a little project I had going on today.
Unfortunately, code parsed and evaluated that way doesn't seem to see any REPL definitions:
scala> val x = 1
x: Int = 1
scala> import scala.tools.reflect.ToolBox
import scala.tools.reflect.ToolBox
scala> val tb = scala.reflect.runtime.universe.runtimeMirror(
getClass.getClassLoader).mkToolBox()
tb: scala.tools.reflect.ToolBox[reflect.runtime.universe.type] = ...
scala> tb.eval(tb.parse("x"))
scala.tools.reflect.ToolBoxError: reflective compilation has failed:
not found: value x
Is there a way to get it to recognize definitions made on REPL?
Recently I dug into the REPL when trying to make it support type macros, so I'm well equipped to explain why this doesn't work. Getting it to work would be the next step :)
I know that you know that every snippet entered into the REPL gets wrapped in some boilerplate before being compiled. Therefore that x ends up being a field in a nested-nested-nested object in a package with a weird name.
Apparently, the REPL keeps track of all defined symbols and then injects the necessary imports along with the boilerplate it generates, so subsequent lines can see that x unqualified. By contrast, toolboxes simply reuse the REPL's classloader but don't do anything about the imports, hence the failure.
A workaround would be to somehow get hold of the object representing the REPL, ask it about defined symbols, and then generate the corresponding imports into the code that you feed to the toolbox. If you file a ticket, I'll try to code up a workaround after the 2.10.1 code freeze madness ends (supposedly, the end of this week).
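In the meantime, a minimal sketch of a way around it (not the real fix) is to make the snippet self-contained, so the toolbox never has to see anything that only exists in the REPL's wrappers:
// The toolbox compiles the string in isolation, so definitions have to travel
// inside the snippet itself rather than come from the enclosing REPL session.
tb.eval(tb.parse("val x = 1; x"))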