IntelliJ Unit Tests throw scalac: Output directory not specified - scala

I am running into a weird error with my IntelliJ setup for Scala. When I try to run any Unit Test for my project, I get the below error.
scalac: Output directory not specified for ...
From Maven, I am able to successfully run the tests. No issues there.
To ensure my Scala setup is correct, I created a simple HelloWorld Scala project, and I was able to successfully add a unit test and run it from IntelliJ.
I run into this error only for the project repo that I downloaded from Git. Any suggestions on how I can identify what I am missing in the setup?
Things I tried so far:
I ensured that the Java and Maven versions in IntelliJ match what I have on my Mac.
I tried deleting the .idea folder and "Invalidate Caches / Restart". No luck after any of these.
➜ ~ mvn -version
Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
Maven home: /Users/syncmaster/.asdf/installs/maven/3.6.3
Java version: 11.0.14.1, vendor: Azul Systems, Inc., runtime: /Users/syncmaster/.asdf/installs/java/zulu-11.54.25/zulu-11.jdk/Contents/Home
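
In IntelliJ this error usually means the affected module has no compiler output path configured (File > Project Structure > Modules > Paths), which can happen when a Git-cloned Maven project is imported without IntelliJ regenerating the module settings. For reference, this is a sketch of the output entries IntelliJ normally writes into a Maven-imported module's .iml file (paths are illustrative, not taken from this project):

```xml
<!-- Sketch: output entries in a healthy Maven-imported module's .iml.
     When these are missing, scalac reports "Output directory not specified". -->
<component name="NewModuleRootManager">
  <output url="file://$MODULE_DIR$/target/classes" />
  <output-test url="file://$MODULE_DIR$/target/test-classes" />
  <exclude-output />
</component>
```

Re-importing the project from its pom.xml (Maven tool window > Reimport) typically regenerates these entries, which may explain why a fresh HelloWorld project works while the cloned repo does not.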

Related

intelliJ multiple scala-library*.jar file

I tried to run a Scala app in IJ community version 2018.3.6 and ran into this error:
Error:scalac: Multiple 'scala-library*.jar' files (scala-library-2.12.8.jar, scala-library-2.12.8.jar) in Scala compiler classpath in Scala SDK sbt: org.scala-lang:scala-library:2.12.8:jar
and when I checked the .idea/libraries folder I saw:
sbt__org_scala_lang_scala_library_2_11_8.jar.xml
sbt__org_scala_lang_scala_library_2_12_8.jar.xml
After I deleted sbt__org_scala_lang_scala_library_2_11_8.jar.xml I still got the same error; then I deleted all the 2.11.8 scala* files and still got the same error.
I also tried to open the same project in IJ community version 2019.2.4; it complained:
Cannot determine module type ("SBT_MODULE") for the following module:
There is no 'sbt' tool listed in Preferences -> Build, Execution, Deployment -> Build Tools.
Does anyone know how to get it working?
## FIX
I ended up uninstalling and reinstalling the Scala plugin in IJ 2019.2.4 to get it working again.
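
For reference, the descriptor cleanup the asker tried before reinstalling the plugin can be scripted. A sketch on a throwaway directory (in practice, point LIB_DIR at your project's .idea/libraries and close IntelliJ first):

```shell
# Sketch: remove stale Scala library descriptors so IntelliJ regenerates
# only the entries the build actually uses. Demonstrated on a temp dir.
LIB_DIR=$(mktemp -d)
touch "$LIB_DIR/sbt__org_scala_lang_scala_library_2_11_8.jar.xml"
touch "$LIB_DIR/sbt__org_scala_lang_scala_library_2_12_8.jar.xml"

# Delete every descriptor that still points at the old 2.11 library...
rm -f "$LIB_DIR"/*2_11*.xml

# ...then re-import the project so the remaining entries are regenerated.
ls "$LIB_DIR"
```

As the asker found, this alone may not be enough; reinstalling the Scala plugin was what ultimately cleared the duplicate-library error.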

spark's example can not build in intellij

I followed a tutorial to compile Spark 2.1.0 and run one of its examples in IntelliJ IDEA 2016.3.4 (Scala 2.11.8). Here are my steps:
Download Spark 2.1.0 from Git (Win 10 64-bit)
Build it using "$ build/mvn -T 4 -DskipTests clean package" in Cygwin
Open the project in IntelliJ
Mark spark-streaming-flume-sink_2.11's target directory as Sources, and also its subdirectory \target\scala-2.11\src_managed\main\compiled_avro\org\apache\spark\streaming\flume\sink
Add {spark dir}/spark/assembly/target/scala-2.11/jars/*.jars as spark-examples_2.11's dependencies
But when I build the Spark project, I get an error message:
Even if I delete the test class SparkSinkSuite, a similar error, "object xx is not a member of package xx", happens in other classes (I found that all the problem classes are in the 'external' module).
I've tried restarting IntelliJ and even the computer; it doesn't help.

IntelliJ with SBT plugin: Error Resolving [com.mycompany.mypackage]

I am trying to build an SBT-based Scala project. The project depends on packages hosted on my company's local Maven server. From within the company's network, I was able to compile/package the project successfully yesterday (from the CLI and from IntelliJ).
Today, working from home, when I run the following from a command prompt:
sbt assembly
to build the repo, it works fine.
But when I build from IntelliJ or do:
sbt compile|package
it hits the Maven server and fails. My question is: if the packages are already in the cache (~/.ivy2/), why is the Maven server being hit at all? Is there a way to avoid the server hit?
Doing a little searching, I found the answer:
$ sbt "set offline := true" run
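
The quoted command only sets offline for that one invocation. To make it stick across runs, the same setting can live in the build itself; a sketch (the global file location varies with your sbt version):

```scala
// build.sbt (or a global file such as ~/.sbt/0.13/global.sbt,
// depending on sbt version): tell sbt not to resolve from remote
// repositories when the artifacts are already in the local ivy cache.
offline := true
```

Note that if a dependency is genuinely absent from ~/.ivy2/, offline mode makes resolution fail fast instead of hitting the server, so this only helps once the cache is fully populated.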

Run a JMH benchmark for gradle project in eclipse

Is there any direct way to run JMH benchmarks in Eclipse for a Gradle project?
I tried, but ran into issues like:
No benchmarks to run (I then manually copied the generated META-INF with the Microbenchmark file into resources)
Then it gave a generated.package_name.testClass class-not-found exception. Adding this to the build path -> source gives errors, as it expects the package name to be gradle/classes/generated/package_name
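
Rather than hand-copying the generated META-INF metadata, a Gradle JMH plugin can drive the annotation processing and the benchmark run in one task, which Eclipse can then invoke as an external Gradle task. A minimal sketch, assuming the me.champeau.jmh plugin (the plugin id and version are assumptions; check the Gradle plugin portal for the current release):

```groovy
// build.gradle sketch; plugin id/version are assumptions to verify.
plugins {
    id 'java'
    id 'me.champeau.jmh' version '0.7.2'
}

repositories {
    mavenCentral()
}
```

With this in place, `./gradlew jmh` compiles the benchmarks, runs the JMH generators, and executes the benchmarks in one step, so the generated classes never need to be added to the Eclipse build path manually.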

maven in command line succeed, but failed when using the same maven in eclipse

I'm trying to build the Hadoop project. I followed the official documents below on how to do that:
http://wiki.apache.org/hadoop/HowToContribute
https://wiki.apache.org/hadoop/EclipseEnvironment
I git cloned the project and ran maven install successfully, but when I import the project, or even a sub-project like 'hadoop-yarn-api', I get the following error from Maven:
Description Resource Path Location Type
Plugin execution not covered by lifecycle configuration: org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (execution: compile-protoc, phase: generate-sources) pom.xml /hadoop-yarn-api line 73 Maven Project Build Lifecycle Mapping Problem
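
The "Plugin execution not covered by lifecycle configuration" marker is m2e refusing to run the protoc goal inside Eclipse's incremental build, rather than a Maven problem as such. One common workaround is to tell m2e to ignore that execution via the lifecycle-mapping plugin in pluginManagement; a sketch, with the filter values copied from the error above (adjust if your error text differs):

```xml
<!-- Sketch: goes in the pom's <build><pluginManagement><plugins> section.
     This only affects m2e inside Eclipse; command-line Maven ignores it. -->
<plugin>
  <groupId>org.eclipse.m2e</groupId>
  <artifactId>lifecycle-mapping</artifactId>
  <version>1.0.0</version>
  <configuration>
    <lifecycleMappingMetadata>
      <pluginExecutions>
        <pluginExecution>
          <pluginExecutionFilter>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-maven-plugins</artifactId>
            <versionRange>[3.0.0-SNAPSHOT,)</versionRange>
            <goals>
              <goal>protoc</goal>
            </goals>
          </pluginExecutionFilter>
          <action>
            <ignore />
          </action>
        </pluginExecution>
      </pluginExecutions>
    </lifecycleMappingMetadata>
  </configuration>
</plugin>
```

Ignoring the execution means the protoc-generated sources must still be produced by a command-line build before Eclipse can compile against them.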
Then I tried to mvn clean install the project via External Tools Configuration in Eclipse; it also failed with:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) on project hadoop-yarn-api: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
But the curious part is that when I cd into the root directory of 'hadoop-yarn-api' and invoke mvn clean install, it builds successfully.
I'm using m2eclipse in Eclipse, and I'm sure that I've switched to exactly the Maven I use on the command line, not the embedded one.
And I've installed protocol buffers 2.5.0:
$ protoc --version
libprotoc 2.5.0
Could anyone give me some ideas? Many thanks!
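
A likely culprit for "'protoc --version' did not return a version" only inside Eclipse: GUI-launched applications on a Mac do not inherit the shell's PATH, so the protoc that resolves in your terminal may be invisible to Eclipse's build. The effect can be simulated with a stub protoc on a custom PATH entry (the stub and paths below are illustrative, not your real installation):

```shell
# Create a stub protoc in a temp dir to stand in for a real install.
BIN_DIR=$(mktemp -d)
printf '#!/bin/sh\necho "libprotoc 2.5.0"\n' > "$BIN_DIR/protoc"
chmod +x "$BIN_DIR/protoc"

# With a PATH that lacks protoc (like a Finder-launched app), lookup fails:
(PATH="/nonexistent"; command -v protoc || echo "protoc not found")

# With the right entry on PATH (like your terminal), the same call works:
(PATH="$BIN_DIR"; protoc --version)
```

Common workarounds are starting Eclipse from a terminal (running the Eclipse executable directly, not via Finder) so it inherits the shell PATH, or configuring the build to use an absolute path to protoc where the project supports it.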
P.S.
Eclipse Java EE IDE - Juno Service Release 2
m2e - 1.4.1.20140328-1905
Mac 1- 0.9.4
maven - 3.0.5
Hadoop - 2.2.0