Error on opening the Apache Spark source code in IntelliJ IDEA

I’m trying to open the Apache Spark source code in IntelliJ IDEA.
I opened pom.xml in the Spark source root directory.
The project tree is displayed in the Project tool window.
But when I open a source file, say org.apache.spark.deploy.yarn.ClientBase.scala, a lot of red marks show on the editor scroll bar. They are ‘Cannot resolve symbol’ errors. It cannot even resolve StringOps.format.
How can I fix it?
In the File | Project Structure window, the following error message is displayed on a pink background:
Library ‘Maven: org.scala-lang:scala-compiler-bundle:2.10.4’ is not used
Can it be a hint?
The versions I’m using are as follows:
OS: Windows 7
IntelliJ IDEA: 13.1.6
Scala plugin: 0.41.2
Spark source code: 1.1.1 (with a few files modified by me)
I’ve tried to fix this and the error state changed somewhat, but eventually I gave up fixing it on my own (even with googling), deleted the .idea folder, and started over. So now I’m seeing the errors described above.
UPDATE:
I noticed the following popup:
Maven projects need to be imported: Import Changes Enable Auto-Import
I enabled auto-import according to the articles "IntelliJ: Maven projects need to be imported: Import Changes Enable Auto-Import" and http://javafortesters.blogspot.kr/2013/09/do-enable-auto-import-in-intellij-for.html. Now IntelliJ resolves the basic Scala symbols.
But it still cannot resolve a few symbols.
The notable file is yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala. In this file, the ClientArguments class is not resolved. IntelliJ suggests importing org.apache.spark.deploy.ClientArguments, but in fact the ClientArguments class is in the same package as ClientBase - that is, org.apache.spark.deploy.yarn.ClientArguments.
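In other words, the import that actually compiles here is:

import org.apache.spark.deploy.yarn.ClientArguments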
Why does IntelliJ confuse the two?
Thank you.

You need to change the Scala compiler from the IntelliJ compiler to the "sbt incremental compiler".
You can access this setting via "Preferences" -> "Scala".
NOTE: This is supported only for certain versions of the IntelliJ Scala plugin. See this link for details:
http://blog.jetbrains.com/scala/2014/01/30/try-faster-scala-compiler-in-intellij-idea-13-0-2/

It seems your Maven cannot download jars according to the dependencies in your pom.xml.
Two possible factors:
It could be due to your network, so check your proxy settings (press Ctrl+Shift+A in IntelliJ and type "proxy" to check the connection).
It could also be because your Maven home has not been set in IntelliJ. To set it, press Ctrl+Shift+A in IntelliJ, type "maven settings", and point the Maven home to the place where you installed Maven.
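If a proxy is the problem, Maven itself may also need it configured; a minimal sketch of a proxy entry in ~/.m2/settings.xml (the host and port are placeholders):

<settings>
  <proxies>
    <proxy>
      <id>example-proxy</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>proxy.example.com</host>
      <port>8080</port>
    </proxy>
  </proxies>
</settings>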

Related

"Error: Could not find or load main class" when trying to launch Run Configuration in Intellij

In a project that was originally set up for Python I have both Python and Java SDKs defined.
I am attempting to run a Scala program, and the src directory is correctly marked as sources.
The class itself does have a main method.
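For context, a runnable Scala entry point is an object with a main method, along these lines (a sketch only; the package and object names are taken from the error message below):

package com.blazedb.algos

object CourseraAlgos {
  def main(args: Array[String]): Unit = {
    println("reached main") // placeholder body
  }
}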
But IntelliJ does not offer to set up a Run Configuration, which should have been available via the right-click context menu. So I set one up manually.
But when trying to actually run the program, it fails with: Error: Could not find or load main class com.blazedb.algos.CourseraAlgos.
Update: for reference purposes, here is a Run Configuration from a similar project that does work. I do not see any structural differences between the two.
Any ideas why IntelliJ does not recognize the file as a Scala class?
In addition to the steps shown above I had also tried:
adding a new scala-specific module
nuking and recreating the IJ project
These did not resolve the issue. It turns out the problem was that there was no pom.xml in this project. The resulting behavior by IntelliJ made it difficult to trace down the root cause: there was no message like
You need a pom.xml or a build.sbt to proceed
Apparently stray Scala classes (dissociated from a formal build) are only haphazardly supported in IntelliJ.
So finally the answer is to create a new Scala-based project. Adding Scala back to a project built for Python is unreliable at best and maybe not possible at all.
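For illustration, even a minimal build.sbt at the project root would have given IntelliJ a real build to import (the name and versions here are hypothetical):

name := "coursera-algos"
version := "0.1.0-SNAPSHOT"
scalaVersion := "2.11.8"

With such a file in place, IntelliJ's sbt import creates a proper module structure, and the run context-menu entries should appear as expected.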

Spark SQL has no SparkSqlParser.scala file when compiling in IntelliJ IDEA

I have installed a Spark/Hadoop environment on my Red Hat 64-bit machine, and I also want to read and write code in the Spark source project in IntelliJ IDEA. I downloaded the Spark source code and made everything ready, but I got some errors when compiling the Spark project in IntelliJ IDEA.
Here are the errors:
/home/xuch/IdeaProjects/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/CatalystQl.scala
Error:(809, 34) not found: value SparkSqlParser
case ast if ast.tokenType == SparkSqlParser.TinyintLiteral =>
Error:(812, 34) not found: value SparkSqlParser
case ast if ast.tokenType == SparkSqlParser.SmallintLiteral =>
... ...
But I actually could not find a file named SparkSqlParser.scala anywhere in the project, nor a Scala class named SparkSqlParser.
I searched the web for files named SparkSqlParser.scala, but they don't have attributes like "TinyintLiteral", "SmallintLiteral", etc.
Here are the files link:
https://github.com/yjshen/zzzzobspk/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SparkSQLParser.scala
https://apache.googlesource.com/spark/+/c152dde78f73d5ce3a483fd60a47e7de1f1916da/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/SparkSQLParser.scala
I met the same problem. Here is my solution:
Download the antlr4 (i.e. ANTLR v4) plugin for IntelliJ. Then the file "spark-2.0.1\sql\catalyst\src\main\antlr4\org\apache\spark\sql\catalyst\parser\SqlBase.g4" can be recognized by IntelliJ IDEA.
Navigate to View -> Tool Windows -> Maven Projects, select the project "Spark Project Catalyst", right-click on it, and then select "Generate Sources and Update Folders".
After that you can see some files added under "spark-2.0.1\sql\catalyst\target\generated-sources\antlr4".
Then the project should build successfully.
Hope it can help you.
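The same generation can also be run from the command line; assuming Spark's bundled Maven and the standard plugin bindings, something like

build/mvn -pl sql/catalyst generate-sources

should produce the same files under target/generated-sources.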
None of the advice here worked for me. I noticed, however, that the generated code depends on ANTLR 3.x while ANTLR 4.x is what is in the dependencies (mvn dependency:tree). I don't know why this was the case. Maybe because I had earlier built it from the command line(?).
Anyway, try cleaning your Catalyst sub-project and then rebuilding the autogenerated sources. To do this in IntelliJ, go to View -> Tool Windows -> Maven Projects.
Then navigate to "Spark Project Catalyst" in the Maven Projects tab.
Navigate to clean -> clean:clean and double-click it. Then navigate to Plugins -> antlr4 -> antlr4:antlr4 and double-click it.
Now you'll see that the autogenerated ANTLR classes are different, and they should compile. YMMV.
1) First build Spark from the command line using the build instructions given at http://spark.apache.org/docs/latest/building-spark.html#building-with-buildmvn
2) Then check the
$SPARK_HOME/sql/catalyst/target/generated-sources/antlr3/org/apache/spark/sql/catalyst/parser folder.
Some of the generated classes, like SparkSqlLexer.java, should be there.
The list of classes it generates is:
SparkSqlLexer.java
SparkSqlParser.java
SparkSqlParser_ExpressionParser.java
SparkSqlParser_FromClauseParser.java
SparkSqlParser_IdentifiersParser.java
SparkSqlParser_KeywordParser.java
SparkSqlParser_SelectClauseParser.java
3) Open Module Settings. Click on the spark-catalyst module. Go to the Sources tab on the right and mark target/generated-sources as a source folder.
I also faced a similar problem when I updated my fork to the latest master. Unfortunately, I could not find a way to make it work from IDEA. What I did was compile the project from the command line, which generated the required ANTLR classes. I then added the generated sources directory target/generated-sources/antlr as a source directory. After that I could run the tests from IDEA. Ideally IDEA's generate-sources step should have generated the code; I need to check more why it did not. Maybe it is because I have Maven 3.3.3 configured.
I did as instructed by Rishitesh Mishra and got stuck at the first step. I always get errors when executing "build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package".
I have tried both the source code from https://spark.apache.org and a fork from GitHub.
I have attached a screenshot of the error log in a new answer below.

Why can't the IntelliJ Scala plugin pick up my dependencies?

I am using IntelliJ 11 Ultimate with the Scala plugin.
Most of my code has red highlights everywhere, meaning IntelliJ cannot properly resolve the symbols.
For example, I added the Amazon AWS dependency (using sbt) and half my code is red, and IntelliSense doesn't work either.
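For reference, the dependency is declared in build.sbt along these lines (the coordinates are the usual AWS SDK ones; the version is a placeholder):

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.11.0"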
Note: This is happening for many, many libraries I am using, and it is a real pain. I am also building from the terminal using the sbt command.
Is there something I am doing wrong? Or is it just a fact of life using Scala with IntelliJ?
Update:
I do see this message sometimes:
File '/path/to/app/web/built.sbt' seems to be SBT build file, but there is no external project related to it. Import the corresponding project?
"File -> Invalidate Caches -> Invalidate and Restart" often helps to prevent IntelliJ from showing non-existing Scala errors.
I have no IntelliJ 11, so I do not know if this exists in your Intellij version.

Scala IDE 4.0.0 thinks there are errors in an out-of-the-box Play Framework 2.3.7 program

I've created a Play Framework program via Typesafe Activator (so it follows the template exactly).
I used sbteclipse-plugin version 3.0.0 to create an Eclipse project and imported that into Scala IDE 4.0.0. These are all the latest versions at the time of writing.
The Scala IDE definitely seems to support the Play Framework. It has syntax highlighting for the custom formats, including the routing file and templates. Yet, it doesn't seem to be able to find the views from the controllers. In particular, the call to views.html.index triggers an error: "object index is not a member of package views.html".
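For reference, the failing call is the stock action from the Activator template, roughly:

def index = Action {
  Ok(views.html.index("Your new application is ready."))
}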
I tried enabling refreshing using native hooks or polling as detailed here, but it had no effect.
I should note that while the code has been compiled on the command line (with activator ~run), it hasn't been compiled in Scala IDE, since I don't know how to do that (it doesn't seem to be documented anywhere).
What can I do to get rid of these false errors?
EDIT: After running activator clean ~run, I have another error: The project cannot be built until build path errors are resolved. There are no further details on what these build path errors are.
Update: Just upgrade to sbteclipse version 5.1.0 and everything should work out of the box. Also make sure you follow the Play documentation on how to set up Eclipse/ScalaIDE.
This is a known bug in sbteclipse, which probably will be fixed soon.
For now, you can add the following line to your build.sbt:
EclipseKeys.createSrc := EclipseCreateSrc.All
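This setting tells sbteclipse to include every kind of source directory it knows about, including managed (generated) sources such as the compiled Twirl templates, when it writes the .classpath file.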
Kill the SBT console and run sbt eclipse again. That should add the following line to the .classpath file within your project folder as a workaround:
<classpathentry kind="src" path="target/scala-2.11/twirl/main"/>
Refresh your Eclipse project to pick up the change.
I had the same issue, also with Scala IDE 4.0.0. I followed mkurz's instructions and they worked like a charm. But instead of changing the .classpath file in the project folder manually, I used the Eclipse interface:
In the top menu of the main window, click on Project and then on Properties.
In the Properties window, click on the Java Build Path option (the options list is on the left).
In the Source tab, click the Add Folder... button.
In the Source Folder Selection window, choose the target/scala-2.11/twirl/main folder so it is included in the compilation path. Click the OK button.
Click OK in the Properties window.
Now the project should compile just fine :). With that I was able to finish the Play setup example on the Scala IDE website.
I tried @mkurz's solution first, but also ran into the same error as @matt. I became frustrated that I could not generate the Eclipse project without having to go into the Eclipse project properties to manually fix the build errors. After some investigation, I discovered a solution that removed all errors entirely. Add this to your build.sbt:
unmanagedSourceDirectories in Compile <+= twirlCompileTemplates.target
Or if that does not work for you, you could also use:
unmanagedSourceDirectories in Compile <+= target.zipWith(scalaBinaryVersion) { (b,v) => b / s"scala-$v/twirl/main" }
Good bye, build errors!
I got the same error message.
Are you using Java 8 as the JRE in Eclipse?
After switching back from Java 8 to Java 7, everything worked fine again.
If, after following mkurz's instructions (adding EclipseKeys.createSrc...), your problems are not solved, click on Project -> Properties -> Java Build Path and look at the Source folders tab.
You may find a duplicate folder entry named .../src_managed/main (thanks Matt). If so, close the project and remove ONE of the two .../src_managed/main entries from the .classpath file (located in the base of the activator/sbt project directory). Reopen and clean the project and you should be good to go.
For me, it turned out that the installed JRE in Scala IDE was OpenJDK; changing it to Oracle Java 8 made it work.

Starting a new project with sbt-idea plugin

I am new to sbt and the sbt-idea plugin. I created a new project with the plugin, and when opening the generated .idea project inside IntelliJ and compiling, I get "please specify compiler in Scala facet". Looking at the Scala compiler facet, all I see is "buildScala" in red.
Since many here seem to be using the plugin, can you explain the steps you took to correct this?
I have this problem when I use the sbt-idea processor and then import the module into an existing IDEA project. However, when I open the project created by sbt-idea I do not have the problem.
I have not been able to fix the red buildScala problem with imported modules. I suspect it would require tomfoolery in the project files, as I can't find config options to correct it via the GUI.
Personally, I always install sbt-idea as a processor in every new sbt install I make (see "Usage as processor" in the previous link). Then the correct way to generate project files is simply sbt idea.
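With later sbt versions, sbt-idea is installed as a regular plugin rather than a processor; a sketch of a global plugin definition in ~/.sbt/plugins/build.sbt (the version shown is just one that existed):

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

The project files are then generated with sbt gen-idea instead of sbt idea.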
Can you clarify exactly the steps you followed that led to the error?