Why does IntelliJ IDEA throw a compilation error? - scala

Compiling Spark gives this compile error:
To fix it, I modified the Utils.classIsLoadable method to just return true:
def classIsLoadable(clazz: String): Boolean = {
// Try { Class.forName(clazz, false, getContextOrSparkClassLoader) }.isSuccess
true
}
I realise this is not a good fix, but so far Spark seems to be running correctly from source. Has this compile error been seen before, and is there a fix? Will returning true suffice for now? I'm not sure what impact modifying this return value may have.

I suggest compiling Spark from the command-line using Maven or SBT instead of trying to use your IDE's compiler. Many of the core Spark developers use IntelliJ for editing Spark's source code but still use the command-line compilers, largely because it's been difficult to get the project to build correctly inside IDEs. Even if you're using an external compiler, you should still be able to benefit from IntelliJ's syntax highlighting, type checking, etc.
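For reference, the usual invocations, run from the Spark source root (exact goals, profiles, and script names vary by Spark version, so treat these as indicative rather than exact):
mvn -DskipTests clean package
sbt/sbt assembly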
Here's a relevant discussion from the Spark developer mailing list: http://apache-spark-developers-list.1001551.n3.nabble.com/IntelliJ-IDEA-cannot-compile-TreeNode-scala-td7090.html
Note that Spark users should be able to use IntelliJ to compile applications that depend on Spark; this issue only affects developers who want to build Spark itself.
If you're interested in fixing the build to work with IntelliJ, I recommend opening a ticket on the Spark issue tracker.

Related

Scala worksheet evaluation results not showing up in Intellij

I am trying out Scala for the first time and am following this page to set up my first Scala project:
https://docs.scala-lang.org/getting-started-intellij-track/getting-started-with-scala-in-intellij.html
However, after creating a simple worksheet with println("hello"), no results come up when I evaluate it.
What am I doing wrong?
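For concreteness, the entire worksheet is this one line (hello.sc is a placeholder name of mine):
// hello.sc - on a working setup (e.g. Scala 2.12.9, per the answer below),
// evaluating the worksheet should show the printed "hello" in the results pane
println("hello")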
This seems to be an issue that only happens on Scala 2.13. (I have not exhaustively tested it, though.)
I have used 2.12.9 successfully.
I opened an issue on the Scala docs project too.
https://github.com/scala/docs.scala-lang/issues/1486

Issue with BufferedReader.readLine using sbt run or sbt console

My question is quick: I'm working on a small console for reading input and then calling the appropriate code. I'm using sbt, and I've encountered an issue when I try to read input after running my program with sbt run, inside sbt console, or even in the plain old Scala REPL.
The prompt appears to just hang, but if I hit return it does actually read the input in, though the shell's buffer remains empty (nothing I type is echoed). Here is the general code I've been trying that has been giving me the issue:
import java.io._
val s = new BufferedReader(new InputStreamReader(System.in))
val line = s.readLine
println(line)
Does anyone know why this is, and if so is there a way to fix this? I would love to be able to see what I type when I run my program from sbt. Without seeing my typing in the shell it makes the testing and development of my project much less enjoyable.
This is really a Java API question, although in Scala. BufferedReader.readLine() will consume the characters you type from System.in until it has a whole line, at which point it returns the line, as you said.
Console input was difficult in Java with the original java.io classes. Prior to Java 6, I saw a couple of messy solutions to this, but fortunately that release introduced a new class that makes it much easier: java.io.Console. I think it then becomes as simple as
val line = System.console.readLine
println(line)
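One caveat: System.console() returns null when standard input is not attached to an interactive terminal, which is exactly what can happen under some sbt and IDE setups, so it is worth guarding against. A minimal sketch with a fallback (the prompt string is just an example):
import java.io.{BufferedReader, InputStreamReader}

// java.io.Console echoes what you type; System.console() is null when
// stdin is not an interactive terminal (piped input, some sbt/IDE runs)
val console = System.console()
val line =
  if (console != null) console.readLine("> ")
  else new BufferedReader(new InputStreamReader(System.in)).readLine() // works everywhere, but may not echo
println(line)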

Import customization in Groovy-Eclipse for DSL

I'm using Groovy for a calculation engine DSL and really like the support we now have in Eclipse with STS and the Groovy-Eclipse plug-in (I'm on STS 2.8.0M2 with the latest milestone of Groovy-Eclipse, 2.5.2).
One issue I have is that I don't know how to get the Groovy editor to 'know' about the automatic imports I've added to my script runner, so Eclipse gives me a whole bunch of false errors. If you use the Groovy class loader, you can add additional imports for 'free', so you avoid needing to declare them in your script.
I've had a play with the DSLD support in Groovy-Eclipse (which can be used to add auto-completion support), but it's not obvious to me that this is something I could do with it; I don't find the DSLD documentation the simplest to follow.
The inferencing settings for Groovy in Eclipse didn't look like the right thing either.
For example:
def result = new CalculationResult()
gives me an error on the CalculationResult class as it's not imported, but the script will execute correctly in my environment because of the customized imports on the Groovy class loader. I'm using the standard import customization provided by Groovy, for example:
import org.codehaus.groovy.control.customizers.ImportCustomizer
import org.codehaus.groovy.control.CompilerConfiguration
def importCustomizer = new ImportCustomizer()
importCustomizer.addImport 'CalculationResult', 'ch.hedgesphere.core.type.CalculationResult'
def configuration = new CompilerConfiguration()
configuration.addCompilationCustomizers(importCustomizer)
...
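For context, a configuration like this is ultimately handed to whatever runs the scripts. A sketch of that wiring, written here in Scala against the same Groovy API (the use of GroovyShell as the runner is my assumption, not something stated in the question):
import groovy.lang.GroovyShell
import org.codehaus.groovy.control.CompilerConfiguration
import org.codehaus.groovy.control.customizers.ImportCustomizer

// Same customizer setup as above, then a shell built from it; scripts
// evaluated through this shell see CalculationResult without importing it
val importCustomizer = new ImportCustomizer()
importCustomizer.addImport("CalculationResult", "ch.hedgesphere.core.type.CalculationResult")

val configuration = new CompilerConfiguration()
configuration.addCompilationCustomizers(importCustomizer)

val shell = new GroovyShell(configuration)
shell.evaluate("def result = new CalculationResult(); result")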
Any pointers appreciated.
This seems to be in their bug tracker as coming in the 2.6 release of the plugin.
But the comment from Andrew Eisenberg doesn't bode well:
Unfortunately, this is not something that DSLDs can do. Since a missing import could mean compile errors, we would need a way to augment the compiler lookup for this. There might be a way to specify this information inside of a DSLD, but that would mean hooking into DSLDs in a very different way. More likely, this will have to be specified through an Eclipse plugin (like the gradle tooling).
Another possibility is that we can ensure that certain kinds of AST Transforms are applied during a reconcile and so the editor would just "magically" know about these extra imports. We will have to look into the feasibility of this, however.
Still, maybe a vote on that issue wouldn't go amiss?

Why am I getting a NullPointerException in the CompilationUnit instances returned from ASTParser.createASTs()?

I am working on an Eclipse JDT plugin that requires parsing large numbers of source files, so I am hoping to use the batch method ASTParser.createASTs(). The parsing executes without errors, but within the CompilationUnit instances it produces, many of the org.eclipse.jdt.internal.compiler.lookup.SourceTypeBinding instances have had their scope field set to null. The field is set to null in the CompilationUnitDeclaration.cleanUp() methods, which are invoked on a worker thread unrelated to my plugin's code (i.e., my plugin's classes do not appear on the cleanUp() call stack).
My parsing code looks like this (all rawSources are within the same project):
ASTParser parser = ASTParser.newParser(AST.JLS3);
parser.setResolveBindings(true);
parser.setStatementsRecovery(true);
parser.setBindingsRecovery(true);
parser.setIgnoreMethodBodies(false);
parser.setProject(project);
// Batch parse: no binding keys requested (empty String[]); 'this' is the
// ASTRequestor that receives each resulting CompilationUnit
parser.createASTs(rawSources.values().toArray(new ICompilationUnit[0]), new String[0], this, deltaAnalyzer.progressMonitor);
Alternatively, I can execute the parsing this way, and no such problems occur:
for (ICompilationUnit source : rawSources.values())
{
    // createAST() resets the parser after each call, so every setting is
    // re-applied before parsing the next unit
    parser.setResolveBindings(true);
    parser.setStatementsRecovery(true);
    parser.setBindingsRecovery(true);
    parser.setIgnoreMethodBodies(false);
    parser.setProject(project);
    parser.setSource(source);
    CompilationUnit ast = (CompilationUnit)parser.createAST(deltaAnalyzer.progressMonitor);
    parsedSources.add(deltaAnalyzer.createParsedSource(source, ast));
}
This issue occurs in both Helios and Indigo (the very latest release build). I filed a bug in Eclipse Bugzilla, but if anyone knows of a way to work around this (or whether I am using the API wrong), I would greatly appreciate your help.
Byron
Without knowing exactly what your exception is, I can still offer two suggestions:
Have a look at org.eclipse.jdt.ui.SharedASTProvider. If you are not making any changes to ASTs, this class may provide a more robust way of getting the ASTs.
Play around with some of the settings you are using. Do you really need bindingsRecovery set to true? What about statementsRecovery? Setting these to false may help.
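For the first suggestion, a rough sketch of what using the shared AST provider might look like (in Scala for brevity; it assumes you already hold an ICompilationUnit and a progress monitor):
import org.eclipse.core.runtime.IProgressMonitor
import org.eclipse.jdt.core.ICompilationUnit
import org.eclipse.jdt.core.dom.CompilationUnit
import org.eclipse.jdt.ui.SharedASTProvider

// Fetch the shared, read-only AST for one unit; WAIT_YES blocks until the
// AST is available. The returned tree must not be modified.
def sharedAst(source: ICompilationUnit, monitor: IProgressMonitor): CompilationUnit =
  SharedASTProvider.getAST(source, SharedASTProvider.WAIT_YES, monitor)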

NetBeans 7.0 and Scala results in StackOverflowError

I'm experiencing some rather annoying problems with Scala. I can compile small Scala projects perfectly, but when the projects get bigger, the compiler crashes with a StackOverflowError.
Clearly I have to increase the stack size for the compiler; however, and that's probably my main problem here, I don't know how.
I'm starting netbeans with these parameters:
netbeans_default_options="-J-client -J-Xmx512m -J-Xss8m -J-Xms512m -J-XX:PermSize=128m -J-XX:MaxPermSize=512m -J-Dapple.laf.useScreenMenuBar=true -J-Dapple.awt.graphics.UseQuartz=true -J-Dsun.java2d.noddraw=true"
So, as far as I'm aware, -J-Xss8m should increase the thread stack size to 8 MB. However, that doesn't seem to affect the compiler. So I tried to pass the same parameter to the compiler directly, using the compiler flags I can set in NetBeans, resulting in this:
-deprecation -J-Xss8m
But again, that doesn't help; I'm still getting the exception. I searched through the NetBeans documentation, but all I found was the NetBeans startup parameters, which I had already set. I hope somebody here can give me further information on how to handle this problem.
Further information:
So, after a day, I finally had the chance to try everything out on a different machine. I used the same settings and the same compiler, but to my surprise I didn't get the same result: on his machine the compiler compiles the whole code without any exception.
The only difference between my computer and his is that his has more RAM and CPU power, but that shouldn't matter, since we both use NetBeans with the same startup options.
By now I have even tried the RC of the Scala 2.9 compiler; it didn't help much. Also, I checked that I have the correct Scala plugin installed, since there can be problems when using the 2.8 plugin with the 2.9 compiler and vice versa. However, I'm using the 2.9 plugin and the 2.9 compiler, so that's fine.
The problem of giving the Scala compiler more stack space is similar to that of specifying more heap space: both must be passed as custom JVM arguments when running the Scala compiler. However, NetBeans lacks any documentation on how to do this, so here it is.
The way to specify custom JVM arguments for the Scala compiler in NetBeans is to customize build.xml for each project.
Open nbproject/build-impl.xml in the project's folder.
Search for "scalac" and you will find the following target: -init-macrodef-scalac.
Copy the whole target definition, paste it into your build.xml, and save it.
Close nbproject/build-impl.xml; from now on you will work with build.xml.
In the target you just copied, locate the <scalac> tag; the nesting is as follows: target > macrodef > sequential > scalac.
Add a custom "jvmargs" attribute to the scalac tag, it will look as follows: <scalac jvmargs="-Xss2048k -Xmx2000m" ... >
Save build.xml. Now, whenever you compile your project with NetBeans, the compiler will run with the custom JVM arguments.
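As a footnote, the same idea works outside NetBeans: scalac forwards any option prefixed with -J to the underlying JVM, so from a plain command line something like this achieves the same effect (the file name is just an example):
scalac -J-Xss8m -J-Xmx2000m Example.scala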