Spark's examples cannot build in IntelliJ (Scala)

I followed a tutorial to compile Spark 2.1.0 and run one of its examples in IntelliJ IDEA 2016.3.4 (Scala 2.11.8). Here are my steps:
1. Download the Spark 2.1.0 source from Git (on Windows 10 64-bit).
2. Build it with "$ build/mvn -T 4 -DskipTests clean package" in Cygwin.
3. Open the project in IntelliJ.
4. Mark spark-streaming-flume-sink_2.11's target directory as Sources, and also its subdirectory \target\scala-2.11\src_managed\main\compiled_avro\org\apache\spark\streaming\flume\sink.
5. Add {spark dir}/spark/assembly/target/scala-2.11/jars/*.jar as dependencies of spark-examples_2.11.
But when I build the Spark project, I get this error message:
Even if I delete the test class SparkSinkSuite, similar "object xx is not a member of package xx" errors occur in other classes (all the problem classes I found are in the 'external' module).
I've tried restarting IntelliJ and even the computer; it doesn't help.
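For context, "object xx is not a member of package xx" errors in the external modules usually point at generated sources that were never produced or indexed. A sketch of a step that may help, assuming the Avro generation in the flume-sink module is bound to Maven's generate-sources phase (the exact binding is an assumption, not something confirmed in this thread):

$ build/mvn -pl external/flume-sink -am -DskipTests generate-sources
# -pl selects the flume-sink module; -am also builds the modules it depends on;
# this should regenerate target/scala-2.11/src_managed/main/compiled_avro

After this, re-marking the src_managed directory as Sources lets IntelliJ resolve the generated classes.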

Related

IntelliJ Unit Tests throw scalac: Output directory not specified

I am running into a weird error with my IntelliJ setup for Scala. When I try to run any unit test for my project, I get the error below.
scalac: Output directory not specified for ...
From Maven, I am able to successfully run the tests. No issues there.
To ensure my Scala setup is correct, I created a simple HelloWorld Scala project, and I am able to successfully add a unit test and run it from IntelliJ.
I am running into this error only for my project repo that I downloaded from Git. Any suggestions on how I can identify what I am missing in the setup?
Things I tried so far:
I ensured that the Java and Maven versions in IntelliJ match what I have on my Mac.
I tried deleting the .idea folder and "Invalidate Caches / Restart". No luck after any of these.
➜ ~ mvn -version
Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
Maven home: /Users/syncmaster/.asdf/installs/maven/3.6.3
Java version: 11.0.14.1, vendor: Azul Systems, Inc., runtime: /Users/syncmaster/.asdf/installs/java/zulu-11.54.25/zulu-11.jdk/Contents/Home
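For reference, scalac's "Output directory not specified" usually means the failing module has no compiler output path, often because it inherits the project output path and that is empty. A minimal sketch of the relevant entry in .idea/misc.xml; the out path below is an illustration, not a value from this project:

<component name="ProjectRootManager" version="2" project-jdk-name="11" project-jdk-type="JavaSDK">
  <!-- without this element, modules that inherit the project output path have nowhere to compile to -->
  <output url="file://$PROJECT_DIR$/out" />
</component>

The same value can be set in the UI under File > Project Structure > Project > Project compiler output, followed by a re-import of the Maven project.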

IntelliJ: multiple scala-library*.jar files

I tried to run a Scala app in IntelliJ IDEA Community 2018.3.6 and ran into this error:
Error:scalac: Multiple 'scala-library*.jar' files (scala-library-2.12.8.jar, scala-library-2.12.8.jar) in Scala compiler classpath in Scala SDK sbt: org.scala-lang:scala-library:2.12.8:jar
I checked the .idea/libraries folder and saw:
sbt__org_scala_lang_scala_library_2_11_8.jar.xml
sbt__org_scala_lang_scala_library_2_12_8.jar.xml
After I deleted sbt__org_scala_lang_scala_library_2_11_8.jar.xml I still got the same error; then I deleted all the 2.11.8 scala* files and still got the same error.
I also tried to open the same project in IntelliJ IDEA Community 2019.2.4, and it complained:
Cannot determine module type ("SBT_MODULE") for the following module:
There is no 'sbt' tool listed under Preferences -> Build, Execution, Deployment -> Build Tools.
Does anyone know how to get it to work?
## FIX
I ended up uninstalling and reinstalling the Scala plugin in IntelliJ IDEA 2019.2.4 to get it working again.
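(A heavier reset that is often suggested for stale .idea/libraries entries, offered here as an assumption rather than a verified fix for this exact error:

rm -rf .idea *.iml
# then reopen the project in IntelliJ and let it re-import the sbt build from scratch

This forces the sbt import to regenerate every library XML, so no leftover scala-library*.jar entry can end up on the Scala compiler classpath twice.)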

Cannot sync Scala project in IntelliJ IDEA Community 2019.2.4

Scala newbie here: I am attempting to get started with Scala on Windows 10 (Pro 10.0.18362 Build 18362), using a Hyper-V Quick Create VM of Ubuntu (18.04.3 LTS). I installed the JRE and JDK (11.0.4), IntelliJ IDEA 2019.2.4, and the Scala plugin (2019.2.37). I have left SBT Executor 1.2.1 disabled for now; it was enabled earlier, but it does not seem to affect the results. I tried to create the HelloWorld application (see below). I added the Scala framework to the project and, after encountering the error below, the Maven framework (adding Maven did not help). After correcting the error in build.sbt, it looks like this:
import com.sun.tools.javac.resources.version
name := "HelloWorld"
version := "0.1"
scalaVersion := "2.13.1"
I create a Scala worksheet by right-clicking on the scala folder and selecting Scala Worksheet:
object Hello extends App {
println("Hello, World!")
}
I get a pop-up saying the Maven project needs to be imported; this succeeds quickly. I get a second one saying the sbt project needs to be imported; this fails:
sbt.librarymanagement.ResolveException: Error downloading org.scala-sbt:zinc-compile-core_2.12:1.3.1
and this error accompanies it:
not found: https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.scala-sbt/util-position_2.12/1.3.2/ivys/ivy.xml
The first ResolveException appears to be about a part of Maven, but that should be present. This error message is followed by many other "not found" errors, but I assume they all stem from, or are related to, the errors above. I cannot seem to find a solution: most of the information and examples on IntelliJ IDEA and Scala on the web are several years (editions) out of date.
It appears Maven apps are meant to be deployed to Apache Spark. That is not my intention, but in the Maven panel I can successfully clean, validate, and compile.
When I run the Hello.sc app, I get this:
/snap/intellij-idea-community/185/jbr/bin/java -javaagent:/snap/intellij-idea-community/185/lib/idea_rt.jar=37033:/snap/intellij-idea-community/185/bin -Dfile.encoding=UTF-8 -classpath /home/perfwise/ideaProjects/HelloWorld/target/classes:/home/perfwise/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.13.1.jar:/home/perfwise/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.13.1.jar:/home/perfwise/.ivy2/cache/org.scala-lang/scala-library/srcs/scala-library-2.13.1-sources.jar:/home/perfwise/.ivy2/cache/org.scala-lang/scala-reflect/srcs/scala-reflect-2.13.1-sources.jar Hello
Error: Could not find or load main class Hello
Caused by: java.lang.ClassNotFoundException: Hello
Process finished with exit code 1
I would have expected this to work. Any pointers will be much appreciated.
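One likely detail, inferred from the file name Hello.sc rather than stated above: object Hello extends App lives in a Scala worksheet, and worksheets are evaluated by the IDE rather than compiled to the project output directory, so a plain Run configuration cannot find a Hello class. A minimal sketch of the same program as a regular compiled source file, assuming the standard sbt layout:

// src/main/scala/Hello.scala -- a regular source file, not a worksheet (.sc)
object Hello extends App {
  // App supplies the main method; this body runs when the class is launched
  println("Hello, World!")
}

With that file compiled as part of the project, a Run configuration with main class Hello should resolve.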

IntelliJ: Error while importing SBT project

I used IntelliJ for 4 months without any problems. Yesterday I installed it on another PC, but I can't create an SBT Scala project.
Here are the steps:
I create the project:
When it starts, it tells me that the "SBT project needs to be imported", so I click on "Enable Auto-Import", but then I get this error:
If I try to add the Scala SDK from the module settings, I get this error:
I downloaded Scala and SBT externally, and I also tried all the suggested solutions from other similar threads, but I can't resolve it.
What can I do?
Thank you in advance!
Add sbt.version to project/build.properties. You can check your SBT version using sbt sbtVersion.
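For example, a minimal project/build.properties (the version number below is an illustration; use whatever sbt sbtVersion reports):

sbt.version=1.3.13

Pinning the version here tells both the sbt launcher and IntelliJ's importer which sbt to use, instead of leaving them to guess.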

Building a Customized Spark

We are creating a customized version of Spark, since we are changing some lines of code in ALS.scala. We build the customized Spark version using the following command (make-distribution.sh drives the Maven build):
./make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn
However, upon using the customized version of Spark, we run into this error:
Do you have any idea what causes the error and how we might solve the issue?
On the local machine I am actually using a jar file built with sbt (sbt compile, then sbt clean package) and placed here: /Users/user/local/kernel/kernel-0.1.5-SNAPSHOT/lib.
However, in the Hadoop environment the installation is different, so I use Maven to build Spark, and that's where the error comes in. I suspect the error may depend on using Maven to build Spark, as there are some reports like this:
https://issues.apache.org/jira/browse/SPARK-2075
or perhaps on how the Spark assembly files are built.
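(One way to narrow this down, sketched under the assumption that the only change is in ALS.scala: rebuild just the mllib module plus the modules it depends on, with the same profiles used for the distribution, and test that jar before producing a full distribution:

./build/mvn -pl mllib -am -DskipTests -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn package
# -pl selects the mllib module; -am also builds the modules it depends on

The module name and profile list are taken from the Spark build and the make-distribution.sh command above; whether a partial rebuild reproduces the error is an open question.)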