I have a problem with IntelliJ running Scala tests. I guess it is a configuration issue, but I can't figure it out.
1. When I write a simple test like the one below, I get the message: "class MyTest must either be declared abstract or implement abstract member 'expectedTestCount' ..."
import org.scalatest.FunSuite

class MyTest extends FunSuite {
  test("a") {
    assert(true)
  }
}
But I am just following the ScalaTest documentation, in which no overriding is going on. It compiles and I can run the test, but it is annoying.
2. The other weird thing is the triple-equals operator (===). You can use it in ScalaTest for better failure messages, but IntelliJ does not seem to recognize it: a "cannot resolve symbol ===" message pops up when I write it. Again, it compiles and works fine, but all your code lights up in alarm red.
Any suggestions?
Best regards
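For reference, a minimal sketch of the kind of test in question (assuming a ScalaTest FunSuite-era artifact on the classpath; the suite and test names are made up). The === operator comes from org.scalatest.Assertions, which FunSuite already mixes in, so no extra import is needed and the IDE errors above are spurious:

```scala
import org.scalatest.FunSuite

class ArithmeticSuite extends FunSuite {
  test("addition") {
    // === reports both sides on failure (e.g. "2 did not equal 3"),
    // unlike a plain == inside assert
    assert(1 + 1 === 2)
  }
}
```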
When debugging Spark/Scala code with IntelliJ, using e.g. df.select($"mycol") does not work in the Evaluate Expression window, while df.select(col("mycol")) works fine (but requires a code change). It fails with:
Error during generated code invocation:
com.intellij.debugger.engine.evaluation.EvaluateException: Error evaluating method : 'invoke': Method threw 'java.lang.NoSuchFieldError' exception.: Error evaluating method : 'invoke': Method threw 'java.lang.NoSuchFieldError' exception.
Strangely, it seems to work sometimes, especially if the $ is already part of an existing expression in the code that I mark and evaluate. If I write arbitrary expressions (code fragments), it fails consistently.
EDIT: even repeating import spark.implicits._ in the code-fragment window does not help.
Try this workaround in the code-fragment window:
import spark.implicits._
$""  // dummy use of the $ interpolator so the import is not treated as unused
df.select($"diff").show()
It seems that putting import spark.implicits.StringToColumn at the top of the code fragment works.
I think the reason is that IntelliJ does not realize that the import of the implicits is used in the first place (it is rendered gray), and therefore it is not available in the Evaluate Expression context.
I've had a similar issue with NoSuchFieldError caused by a missing spark instance within the Evaluate Expression dialog. See the example below.
class MyClass(implicit spark: SparkSession) {
  import spark.implicits._

  def myFunc1() = {
    // breakpoint here
  }
}
My workaround was to modify the spark declaration in the constructor: I changed implicit spark to implicit val spark.
..and it works :)
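A guess at why this helps, sketched without Spark (the class names below are made up for illustration): without val, an implicit constructor parameter is not necessarily exposed as a member of the class, so the debugger's expression evaluator may have no field or accessor to read; adding val turns it into a stable member:

```scala
class Config

// Without val: cfg is only a constructor parameter, not a member;
// code outside the constructor (or a debugger) cannot access it by name.
class Without(implicit cfg: Config)

// With val: cfg becomes a member with an accessor, so it is visible
// to reflection and to the Evaluate Expression dialog.
class With(implicit val cfg: Config)

object Demo {
  def main(args: Array[String]): Unit = {
    implicit val c: Config = new Config
    val w = new With // the implicit Config in scope fills the parameter
    println(w.cfg == c) // prints "true"
  }
}
```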
I am working on a big sbt project and there is some functionality that I want to benchmark. I decided to use JMH, so I enabled the sbt-jmh plugin.
I wrote an initial test benchmark that looks like this:
import org.openjdk.jmh.annotations.Benchmark

class TestBenchmark {
  @Benchmark
  def functionToBenchMark = {
    5 + 5
  }
}
However, when I try to run it with jmh:run -i 20 -wi 10 -f1 -t1 .*TestBenchmark.* I get java.lang.InternalError: Malformed class name. I have freshly rebuilt the project and everything compiles and runs just fine.
The first printed message says
Processing 6718 classes from /path-to-repo/target/scala-2.11/classes
with "reflection" generator
I find it weird that the plugin tries to reflect over the whole project (I guess including classes within the standard library). Before rebuilding I was getting NoClassDefFoundError, although the project was otherwise working well.
Since there are plenty of classes within the project and I cannot make sure that every little bit conforms to JMH's requirements, I was wondering: is there a way to overcome this issue and reflect over only the relevant classes, i.e. those annotated with @Benchmark?
My sbt version is 0.13.6 and the sbt-jmh version is 0.2.25.
So this is an issue with Scala and Class.getSimpleName.
It's not abnormal in Scala to have types like this:
object Outer {
  sealed trait Inner

  object Inner {
    case object Inner1 extends Inner
    case object Inner2 extends Inner
  }
}
With the above, calling Outer.Inner.Inner1.getClass().getSimpleName() will throw the exception you're seeing.
I don't think sbt-jmh reflects over the full project, only over things that are directly referred to in the @State or @Benchmark classes.
Once I had my benchmark file written with that in mind, it worked.
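A small sketch of the underlying problem (plain Scala, no JMH needed). On the versions in the question (Scala 2.11 on Java 8), getSimpleName could throw java.lang.InternalError: Malformed class name for nested Scala objects like these; newer JDKs reworked that computation, so the call may succeed there. getName is safe everywhere:

```scala
import scala.util.Try

object Outer {
  sealed trait Inner
  object Inner {
    case object Inner1 extends Inner
    case object Inner2 extends Inner
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    val cls = Outer.Inner.Inner1.getClass
    // Always safe: the binary name, something like "Outer$Inner$Inner1$"
    println(cls.getName)
    // Historically unsafe for nested Scala objects on Java 8:
    // wrap it defensively if the code must run there.
    println(Try(cls.getSimpleName))
  }
}
```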
I've been using IntelliJ IDEA 2016 with the Scala plugin 3.0, but ran into the following error:
IntelliJ tells me that
Type "V" overrides nothing.
NodeVisitor and Visitor are both traits:
trait NodeVisitor extends Visitor[NodeBase] {
  override def visit[E >: NodeBase](node: E): Unit
}
However, the code compiles fine. The same code also shows no error in Eclipse. Is this an IDEA bug? Or do I have to configure something special?
An early definitions block is used to initialize fields of your class in the right order when a new instance is created. By the language specification it may contain only val and var definitions. You can override a type member in the body of your class instead, as it does not depend on the order of initialization.
I'm not sure why scalac compiles it; it seems like a bug to me.
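To illustrate the suggestion above with a self-contained sketch (generic names, not the asker's actual Visitor hierarchy): a type member can be overridden in the class body, where initialization order does not matter:

```scala
trait Visitor {
  type N                      // abstract type member
  def visit(node: N): Unit
}

// The type member is overridden in the body, not in an
// early-definitions block, so initialization order is irrelevant.
class PrintingVisitor extends Visitor {
  override type N = String
  override def visit(node: N): Unit = println(node)
}

object Demo {
  def main(args: Array[String]): Unit =
    new PrintingVisitor().visit("hello") // prints "hello"
}
```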
I have my very first Scala program, HelloWorld.scala, which is located in the ~/home/Jacobian/Projects/core/scala/ folder. Now I want to test the concept of packages and see how they can be imported/used. This is the code I have:
package Projects.core.scala

class HelloWorld {
  private[scala] var greeting: String = "Hello from package"
}
That's it. As you can see, I call my package Projects.core.scala just to reflect that I'm somewhere in /...very_long_path/Projects/core/scala/. I'm not sure what I'm doing wrong, but the problem is that when I compile my program with $ scalac HelloWorld.scala I get an error message:
error: '{' expected but ';' found
It's a funny error message, because I have no ';' in my code. If someone can show what I'm doing wrong, that would be great. But it would be even more useful, if someone could also share how this class HelloWorld wrapped in some arbitrary package (not necessary compatible with folders structure) can be imported. (PS. I'm reading Horstman book on Scala, but I can not figure out what can I do when it comes to a real world program and not to abstract concepts that are quite easy to understand)
FULL ERROR MESSAGE:
HelloWorld.scala:9: error: '{' expected but ';' found
class HelloWorld{
^
one error found
This error message pops up when there are no braces, and if I add
package Projects.core.scala {
class HelloWorld { ...
then the error goes away.
Importing issue
Now that I've wrapped class HelloWorld in braces and compiled the program, I see that a folder structure has appeared. So it seems it should now be quite easy to import the class, and this is how I try to do that:
import Projects.core.scala.HelloWorld

object Test {
  def main(args: Array[String]) {
    println("Hello from Test")
  }
}
Test.scala is located in the same folder as HelloWorld.scala, and in this very folder I see Projects/core/scala/HelloWorld.class and the other compiled files. However, when I now try to compile Test.scala, I get another error message:
Test.scala:3: error: not found: object Projects
It is again very interesting: why is it trying to find an object when I want it to import a class? And why does it not import this class? What other voodoo magic should take place?
File tree structure
HelloWorld.scala
Test.scala
Projects/core/scala/HelloWorld.class
And this is how Test.scala looks like:
import Projects.core.scala.HelloWorld._

object Test {
  def main(args: Array[String]) {
    println("Hello from test")
  }
}
I also tried import Projects.core.scala.HelloWorld and import Projects.core.scala._, but neither works. I always get an error message:
error: not found: object Projects
EDIT
I added the full path to Projects/core/scala/ to the CLASSPATH, but it does not help: the compiler is still unable to import my custom class. I find it really frustrating, because in Python or other languages I know this trick would take a couple of seconds, while here I have to spend hours on such a miserable task, just importing a class. Gosh!
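For what it's worth, a sketch of a compile/run sequence that usually works in this layout (assuming both .scala files sit in the same directory). Note the classpath entry must be the directory that contains the Projects folder, i.e. ., not Projects/core/scala/ itself:

```shell
# Compile HelloWorld.scala; -d . writes the class files under
# ./Projects/core/scala/ to mirror the package name
scalac -d . HelloWorld.scala

# Compile Test.scala with the parent of Projects on the classpath
scalac -classpath . Test.scala

# Run, again with the current directory on the classpath
scala -classpath . Test
```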
Here is my test case. When I right-click the file, Eclipse does not show any "Run As > JUnit Test" option. I tried to manually create a run configuration, but it makes no difference.
Scala version: 2.8.1, ScalaTest: 1.3, Eclipse: 3.6.2
package org.jilen.cache.segment
import org.junit.runner.RunWith
import org.scalatest.junit.JUnitRunner
import org.scalatest.FlatSpec
import org.scalatest.matchers.ShouldMatchers
@RunWith(classOf[JUnitRunner])
class RandomSegmentSpec extends FlatSpec with ShouldMatchers {
  val option = SegmentOptions()

  "A Simple Segment" should "contains (douglas,lea) after put into" in {
    val segment = RandomSegment.newSegment(option)
    segment.put("douglas", "lea")
    segment("douglas") should be("lea")
  }

  it should "return null after (douglas,lea) is remove" in {
    val segment = RandomSegment.newSegment(option)
    segment.put("douglas", "lea")
    segment -= ("douglas")
    segment("douglas") should equal(null)
  }

  it should "contains nothing after clear" in {
    val segment = RandomSegment.newSegment(option)
    segment.put("jilen", "zhang")
    segment.put(10, "ten")
    segment += ("douglas" -> "lea")
    segment += ("20" -> 20)
    segment.clear()
    segment.isEmpty should be(true)
  }
}
I've encountered this seemingly at random, and I think I've finally figured out why.
Unfortunately the plugin doesn't yet update package declarations when you move files, nor class names when you rename files. (Given that you can put multiple classes in one file, the latter will likely never be done.) If you are used to these renames being done automagically in Eclipse, like I am, you're bound to get caught out by this.
So... check carefully the following:
- the package declaration in your Scala file matches the Eclipse package name
- the name of the test class in the Scala file matches the name of the Scala file
I just ran into this, fixed both, and now my test runs!
This is a known problem with the Eclipse IDE for Scala. I'm currently working on the plugin for this. Watch this space.
I found ScalaTest to be very bad at integrating with Eclipse (running the tests from Eclipse showed that it ran them, but they would not pass or fail; they simply showed up as passive blank boxes).
For some reason I could NOT get it to work after 3 hours of trying things!
Finally I tried specs2, and it worked (Scala 2.9, JUnit 4 and Eclipse 3.6)!
They have a great doc here:
http://etorreborre.github.com/specs2/guide/org.specs2.guide.Runners.html#Runners+guide
Since I don't care which testing framework I use, I will try specs2 purely from a convenience point of view.
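For completeness, a minimal specs2 specification of the kind the runners guide above covers (a sketch assuming specs2 and JUnit are on the classpath; the class and example names are made up). The JUnitRunner is what lets Eclipse's JUnit support discover and run the spec:

```scala
import org.junit.runner.RunWith
import org.specs2.mutable.Specification
import org.specs2.runner.JUnitRunner

// @RunWith(classOf[JUnitRunner]) makes "Run As > JUnit Test" work in Eclipse
@RunWith(classOf[JUnitRunner])
class SegmentSpec extends Specification {
  "A segment" should {
    "return a stored value" in {
      Map("douglas" -> "lea").get("douglas") must beSome("lea")
    }
  }
}
```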