IntelliJ 16: Why does Scala source code show syntax errors in IntelliJ - scala

Today I was reading the Scala source code in IntelliJ. When I opened the HashMap class, IntelliJ reported syntax errors. The Scala source code must be correct, so I want to know why IntelliJ reports errors here.
It shows a 'reassignment to val' error.
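For context, 'reassignment to val' is the error scalac emits when code assigns a new value to an immutable val binding; if IntelliJ reports it on library source that scalac itself accepts, it is a false positive of the IDE's own error highlighting rather than a real error in the library. A minimal sketch of what the error normally means:

```scala
// Demonstrates the difference between `val` (immutable) and `var` (mutable).
object ValVsVar extends App {
  val x = 1
  // x = 2          // would not compile: "reassignment to val"
  var y = 1
  y = 2              // fine: `var` bindings may be reassigned
  println(x + y)     // prints 3
}
```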

Related

Scala worksheet evaluation results not showing up in Intellij

I am trying out Scala for the first time and am following this page to set up my first Scala project:
https://docs.scala-lang.org/getting-started-intellij-track/getting-started-with-scala-in-intellij.html
However, despite this, after creating a simple worksheet with println("hello"), no results come up upon evaluation.
What am I doing wrong?
This seems to be an issue that only happens on Scala 2.13. (I have not exhaustively tested it, though.)
I have used 2.12.9 successfully.
I opened an issue on the Scala docs project too.
https://github.com/scala/docs.scala-lang/issues/1486

Prevent IDEA from inserting back quote when auto complete attributes

When I autocomplete an attribute in Scala in IntelliJ IDEA, it inserts the name surrounded with back quotes ` like this:
`myAttribute` instead of myAttribute
This is annoying and I'd love to remove this behaviour. How do I do this?
IntelliJ only does this if the attribute or variable name you are using is also a keyword in Scala, so that the identifier is parsed correctly. This is a feature of the Scala language, not of IntelliJ.
If it is happening in other cases too, recheck your IntelliJ Scala code style settings, or reset IntelliJ to its defaults.
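To illustrate when the backticks are genuinely required, here is a sketch (the class and field names are made up for illustration):

```scala
// Backticks let an identifier that collides with a Scala keyword be used
// as an ordinary name; `type` is a keyword, so the quoting is mandatory here.
object BacktickDemo extends App {
  case class Field(`type`: String)       // hypothetical class for illustration
  val f = Field("text")
  println(f.`type`)                      // prints "text"
  val ordinary = "no backticks needed"   // non-keyword names stay unquoted
  println(ordinary)
}
```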

Finding unused methods in Scala using IntelliJ

I am working on a scala project in IntelliJ and want to clean up my code.
For this, I am looking for a way in which I can get a list of unused methods within my project.
This link didn't answer my question. It is for Java and not for Scala.
Finding unused methods in IntelliJ (excluding tests)
Analyze -> Inspect Code for Scala in IntelliJ has an option to select "unused symbol", but I am unable to narrow it down to check only methods.
Go to Analyze -> Inspect code
Under Inspection Profile click on "..." and make sure "Unused declaration" is checked. Choose the Whole project option and then click OK.
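For reference, this is the kind of symbol the inspection reports (a hypothetical example):

```scala
// `neverCalled` has no call sites, so "Unused declaration" should flag it;
// `entryPoint` is reachable and should not be reported.
object InspectionTarget extends App {
  private def neverCalled(): Int = 42   // unused: a candidate for cleanup
  def entryPoint(): String = "in use"
  println(entryPoint())
}
```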

How to skip error checking when compiling Scala files in IDEA?

The run config option Make, no error check in IntelliJ IDEA does not skip Scala errors that result in ambiguous NoClassDefFoundError failures upon running a project.
How do I skip error checking when compiling Scala files?
My end goal is for the program to fail at runtime when the classes with errors are accessed, not before runtime preventing the whole program from running.
As the name suggests, "Make, no error check" will simply pretend compile errors don't matter. You then end up with an incomplete classpath that is missing any classes that produced compiler errors. As a consequence, when you run that project, you should not be surprised to hit class-not-found errors.
I don't know how this "feature" interacts with the incremental Scala compiler. I wouldn't expect "Make, no error check" to produce any useful results.
If you want to skip parts of your code that currently don't compile, a good solution is to use the ??? "hole" value, i.e. a placeholder of type Nothing that you can plug into any position where you would have incomplete code. E.g.
def myMethod: String = ??? // implement later
This allows such code to compile, and it will only produce errors at runtime, when the placeholder is actually reached.
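To make the runtime behaviour concrete: ??? is defined in Predef with type Nothing and throws scala.NotImplementedError when evaluated, so unfinished code compiles but fails only if reached. A small sketch:

```scala
// `???` type-checks in any position (its type is Nothing) and only
// throws scala.NotImplementedError when the code path is executed.
object HoleDemo extends App {
  def done: String = "finished"
  def todo: String = ???        // compiles; fails only if called

  println(done)                 // runs normally
  try todo
  catch { case _: NotImplementedError => println("todo not implemented yet") }
}
```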

Why does IntelliJ IDEA throw compilation error?

Compiling Spark gives this compile error :
To fix it, I modified the Utils.classIsLoadable method to just return true:
def classIsLoadable(clazz: String): Boolean = {
  // Try { Class.forName(clazz, false, getContextOrSparkClassLoader) }.isSuccess
  true
}
I realise this is not a good fix, but so far Spark seems to be running correctly from source. Has this compile error been experienced before, and is there a fix? Will returning true suffice for now? I'm not sure what impact modifying this return value may have.
I suggest compiling Spark from the command-line using Maven or SBT instead of trying to use your IDE's compiler. Many of the core Spark developers use IntelliJ for editing Spark's source code but still use the command-line compilers, largely because it's been difficult to get the project to build correctly inside IDEs. Even if you're using an external compiler, you should still be able to benefit from IntelliJ's syntax highlighting, type checking, etc.
Here's a relevant discussion from the Spark developer mailing list: http://apache-spark-developers-list.1001551.n3.nabble.com/IntelliJ-IDEA-cannot-compile-TreeNode-scala-td7090.html
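For reference, a command-line build along those lines might look like the following; the build/mvn and build/sbt wrapper scripts ship with the Spark source tree, but the exact flags vary by Spark version, so treat this as a sketch:

```shell
# From the root of a Spark source checkout (flags are version-dependent):
./build/mvn -DskipTests clean package    # Maven build, skipping tests
# or, equivalently, with the bundled sbt launcher:
./build/sbt package
```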
Note that Spark users should be able to use IntelliJ to compile applications that depend on Spark; this issue only affects developers who want to build Spark itself.
If you're interested in fixing the build to work with IntelliJ, I recommend opening a ticket on the Spark issue tracker.