Apache Spark Build error - scala

I'm building the Apache Spark source code on Ubuntu 14.04.4 (Spark version: 1.6.0 with Scala code runner version 2.10.4) with the command
sudo sbt/sbt assembly
and I get the following error:
[warn]   def deleteRecursively(dir: TachyonFile, client: TachyonFS) {
[warn]       ^
[error]
[error]      while compiling: /home/ashish/spark-apps/spark-1.6.1/core/src/main/scala/org/apache/spark/util/random/package.scala
[error]         during phase: jvm
[error]      library version: version 2.10.5
[error]     compiler version: version 2.10.5
[error]   reconstructed args: -deprecation -Xplugin:/home/ashish/.ivy2/cache/org.spark-project/genjavadoc-plugin_2.10.5/jars/genjavadoc-plugin_2.10.5-0.9-spark0.jar -feature -P:genjavadoc:out=/home/ashish/spark-apps/spark-1.6.1/core/target/java -classpath /home/ashish/spark-apps/spark-1.6.1/core/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/launcher/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/common/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/shuffle/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/unsafe/target/scala-2.10/classes:/home/ashish/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:/home/ashish/.ivy2/cache/com.google.guava/guava/bundles/guava-14.0.1.jar:/home/ashish/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:/home/ashish/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.8.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-annotations/bundles/jackson-annotations-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-core/bundles/jackson-......and many other jars...
[error]
[error]   last tree to typer: Literal(Constant(collection.mutable.Map))
[error]               symbol: null
[error]    symbol definition: null
[error]                  tpe: Class(classOf[scala.collection.mutable.Map])
[error]        symbol owners:
[error]       context owners: package package -> package random
[error]
[error] == Enclosing template or block ==
[error]
[error] Template( // val <local package>: <notype> in package random, tree.tpe=org.apache.spark.util.random.package.type
[error]   "java.lang.Object" // parents
[error]   ValDef(
[error]     private
[error]     "_"
[error]     <tpt>
[error]     <empty>
[error]   )
[error]   DefDef( // def <init>(): org.apache.spark.util.random.package.type in package random
[error]     <method>
[error]     "<init>"
[error]     []
[error]     List(Nil)
[error]     <tpt> // tree.tpe=org.apache.spark.util.random.package.type
[error]     Block( // tree.tpe=Unit
[error]       Apply( // def <init>(): Object in class Object, tree.tpe=Object
[error]         package.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object
[error]         Nil
[error]       )
[error]       ()
[error]     )
[error]   )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] ConstantType(value = Constant(collection.mutable.Map))
[error]
[error] uncaught exception during compilation: java.io.IOException
[error] File name too long
[warn] 45 warnings found
[error] two errors found
[error] (core/compile:compile) Compilation failed
[error] Total time: 5598 s, completed 5 Apr, 2016 9:06:50 AM
Where am I going wrong?

You should build Spark with Maven...
download the source and run ./build/mvn clean package (the Maven wrapper ships under build/ in the Spark source tree).

Probably similar to http://apache-spark-user-list.1001560.n3.nabble.com/spark-github-source-build-error-td10532.html
Try sudo sbt/sbt clean assembly
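The actual failure above is the uncaught java.io.IOException: File name too long — scalac is emitting class files whose names exceed the filesystem's limit, which typically happens when building inside an ecryptfs-encrypted home directory on Ubuntu (its limit is roughly 143 characters). A minimal workaround sketch, assuming your sbt build lets you append scalac options:

// build.sbt sketch (assumption: scalac options can be added globally);
// caps the length of generated class file names so they fit on
// filesystems with short name limits such as ecryptfs
scalacOptions ++= Seq("-Xmax-classfile-name", "128")

Alternatively, move the checkout to a directory outside the encrypted home.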


Configuration with APB fails to elaborate

I created the following configuration in Configs.scala:
class APBConfig extends Config(new WithDebugAPB ++ new TinyConfig)
I tried to build it with the following command, run from /rocket/rocket-chip/vsim:
make CONFIG=freechips.rocketchip.system.APBConfig
and I get the following error:
[error] java.lang.UnsupportedOperationException: empty.init
[error] ...
[error] at freechips.rocketchip.regmapper.RegMapper$.apply(RegMapper.scala:49)
[error] at freechips.rocketchip.amba.apb.APBRegisterNode.regmap(RegisterRouter.scala:32)
[error] at freechips.rocketchip.devices.debug.APBDebugRegisters$$anon$1.<init>(APB.scala:27)
[error] at freechips.rocketchip.devices.debug.APBDebugRegisters.module$lzycompute(APB.scala:26)
[error] at freechips.rocketchip.devices.debug.APBDebugRegisters.module(APB.scala:26)
[error] at freechips.rocketchip.devices.debug.APBDebugRegisters.module(APB.scala:18)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.$anonfun$instantiate$2(LazyModule.scala:280)
[error] at chisel3.Module$.do_apply(Module.scala:52)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.$anonfun$instantiate$1(LazyModule.scala:280)
[error] at scala.collection.immutable.List.flatMap(List.scala:338)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.instantiate(LazyModule.scala:278)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.instantiate$(LazyModule.scala:273)
[error] at freechips.rocketchip.diplomacy.LazyRawModuleImp.instantiate(LazyModule.scala:357)
[error] at freechips.rocketchip.diplomacy.LazyRawModuleImp.$anonfun$x$23$1(LazyModule.scala:370)
[error] at chisel3.withClockAndReset$.apply(MultiClock.scala:25)
[error] at freechips.rocketchip.diplomacy.LazyRawModuleImp.<init>(LazyModule.scala:370)
[error] at freechips.rocketchip.devices.debug.TLDebugModuleOuterAsync$$anon$3.<init>(Debug.scala:634)
[error] at freechips.rocketchip.devices.debug.TLDebugModuleOuterAsync.module$lzycompute(Debug.scala:634)
[error] at freechips.rocketchip.devices.debug.TLDebugModuleOuterAsync.module(Debug.scala:634)
[error] at freechips.rocketchip.devices.debug.TLDebugModuleOuterAsync.module(Debug.scala:598)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.$anonfun$instantiate$2(LazyModule.scala:280)
[error] at chisel3.Module$.do_apply(Module.scala:52)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.$anonfun$instantiate$1(LazyModule.scala:280)
[error] at scala.collection.immutable.List.flatMap(List.scala:338)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.instantiate(LazyModule.scala:278)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.instantiate$(LazyModule.scala:273)
[error] at freechips.rocketchip.diplomacy.LazyRawModuleImp.instantiate(LazyModule.scala:357)
[error] at freechips.rocketchip.diplomacy.LazyRawModuleImp.$anonfun$x$23$1(LazyModule.scala:370)
[error] at chisel3.withClockAndReset$.apply(MultiClock.scala:25)
[error] at freechips.rocketchip.diplomacy.LazyRawModuleImp.<init>(LazyModule.scala:370)
[error] at freechips.rocketchip.devices.debug.TLDebugModule$$anon$10.<init>(Debug.scala:1770)
[error] at freechips.rocketchip.devices.debug.TLDebugModule.module$lzycompute(Debug.scala:1770)
[error] at freechips.rocketchip.devices.debug.TLDebugModule.module(Debug.scala:1770)
[error] at freechips.rocketchip.devices.debug.TLDebugModule.module(Debug.scala:1745)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.$anonfun$instantiate$2(LazyModule.scala:280)
[error] at chisel3.Module$.do_apply(Module.scala:52)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.$anonfun$instantiate$1(LazyModule.scala:280)
[error] at scala.collection.immutable.List.flatMap(List.scala:338)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.instantiate(LazyModule.scala:278)
[error] at freechips.rocketchip.diplomacy.LazyModuleImpLike.instantiate$(LazyModule.scala:273)
[error] at freechips.rocketchip.diplomacy.LazyModuleImp.instantiate(LazyModule.scala:348)
[error] at freechips.rocketchip.diplomacy.LazyModuleImp.<init>(LazyModule.scala:350)
[error] at freechips.rocketchip.subsystem.BareSubsystemModuleImp.<init>(BaseSubsystem.scala:31)
[error] at freechips.rocketchip.subsystem.BaseSubsystemModuleImp.<init>(BaseSubsystem.scala:130)
[error] at freechips.rocketchip.subsystem.RocketSubsystemModuleImp.<init>(RocketSubsystem.scala:55)
[error] at freechips.rocketchip.system.ExampleRocketSystemModuleImp.<init>(ExampleRocketSystem.scala:27)
[error] at freechips.rocketchip.system.ExampleRocketSystem.module$lzycompute(ExampleRocketSystem.scala:24)
[error] at freechips.rocketchip.system.ExampleRocketSystem.module(ExampleRocketSystem.scala:24)
[error] at freechips.rocketchip.system.TestHarness.$anonfun$dut$1(TestHarness.scala:17)
[error] at chisel3.Module$.do_apply(Module.scala:52)
[error] at freechips.rocketchip.system.TestHarness.<init>(TestHarness.scala:17)
[error] at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[error] at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
[error] at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error] at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
[error] at freechips.rocketchip.stage.phases.PreElaboration.$anonfun$transform$1(PreElaboration.scala:31)
[error] ... (Stack trace trimmed to user code only, rerun with --full-stacktrace if you wish to see the full stack trace)
Exception: sbt.TrapExitSecurityException thrown from the UncaughtExceptionHandler in thread "run-main-0"
[error] Nonzero exit code: 1
[error] (Compile / runMain) Nonzero exit code: 1
[error] Total time: 34 s, completed Aug 5, 2020, 1:58:55 PM
For reference, using WithJtagDTMSystem instead of WithDebugAPB works.
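For illustration, the working JTAG variant composes the same way (a sketch following the APBConfig pattern above; the class name JTAGConfig is made up here):

class JTAGConfig extends Config(new WithJtagDTMSystem ++ new TinyConfig)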
Any ideas what might be the problem?
I guess this error still hasn't been fixed in the rocket-chip generator. I believe this issue has come up on the rocket-chip forum as well.

Compilation error on insert-or-update action in Quill when using H2 database

Is it possible to perform an insert-or-update action in Quill when using an H2 database? If I add .onConflictIgnore to the .insert action, I get a compilation error:
[error] [...]/repository/HeadlinesRepository.scala:41:36: exception during macro expansion:
[error] java.lang.IllegalStateException: Action ast can't be translated to sql: 'querySchema("headlines").insert(v => v.link -> ?, v => v.title -> ?).onConflictIgnore'
[error] at io.getquill.util.Messages$.fail(Messages.scala:15)
[error] at io.getquill.context.sql.idiom.SqlIdiom.$anonfun$actionTokenizer$1(SqlIdiom.scala:387)
[error] at io.getquill.idiom.StatementInterpolator$Tokenizer$$anon$1.token(StatementInterpolator.scala:17)
[error] at io.getquill.idiom.StatementInterpolator$TokenImplicit.token(StatementInterpolator.scala:27)
[error] at io.getquill.context.sql.idiom.SqlIdiom.$anonfun$astTokenizer$1(SqlIdiom.scala:57)
[error] at io.getquill.idiom.StatementInterpolator$Tokenizer$$anon$1.token(StatementInterpolator.scala:17)
[error] at io.getquill.context.sql.idiom.SqlIdiom$$anon$1.token(SqlIdiom.scala:49)
[error] at io.getquill.context.sql.idiom.SqlIdiom$$anon$1.token(SqlIdiom.scala:46)
[error] at io.getquill.idiom.StatementInterpolator$TokenImplicit.token(StatementInterpolator.scala:27)
[error] at io.getquill.context.sql.idiom.SqlIdiom.translate(SqlIdiom.scala:39)
[error] at io.getquill.context.sql.idiom.SqlIdiom.translate$(SqlIdiom.scala:23)
[error] at io.getquill.H2Dialect$.translate(H2Dialect.scala:20)
[error] at io.getquill.context.ContextMacro.translateStatic(ContextMacro.scala:51)
[error] at io.getquill.context.ContextMacro.translate(ContextMacro.scala:37)
[error] at io.getquill.context.ContextMacro.expand(ContextMacro.scala:24)
[error] at io.getquill.context.ContextMacro.expand$(ContextMacro.scala:21)
[error] at io.getquill.context.ActionMacro.expand(ActionMacro.scala:10)
[error] at io.getquill.context.ActionMacro.expandBatchAction(ActionMacro.scala:121)
[error] at io.getquill.context.ActionMacro.runBatchAction(ActionMacro.scala:71)
It seems not. The Quill documentation's section on insert-or-update (upsert, conflict) states that upsert is supported by Postgres, SQLite, and MySQL:
https://github.com/getquill/quill
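For illustration, a minimal sketch of the failing pattern, reconstructed from the AST in the error message (the case class and the context's config prefix are assumptions):

import io.getquill._

case class Headline(link: String, title: String)

val ctx = new H2JdbcContext(SnakeCase, "ctx") // hypothetical config prefix
import ctx._

// Under H2Dialect the macro fails at compile time with
// "Action ast can't be translated to sql: ... .onConflictIgnore";
// the same quotation compiles against e.g. PostgresJdbcContext.
def insertIgnore(h: Headline) = {
  val q = quote {
    query[Headline].insert(lift(h)).onConflictIgnore
  }
  run(q)
}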

Why does Maven give me this Scala error when doing clean install

I can't figure out why Scala core packages are being "thought of" by Maven as members of package org.apache.kafka.streams.scala. I usually use sbt, but I have to use Maven for this one. Any help would be appreciated.
/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin/java -Dmaven.multiModuleProjectDirectory=/Users/jcena1/IdeaProjects/datapipe-scala-merge-transform "-Dmaven.home=/Applications/IntelliJ IDEA.app/Contents/plugins/maven/lib/maven3" "-Dclassworlds.conf=/Applications/IntelliJ IDEA.app/Contents/plugins/maven/lib/maven3/bin/m2.conf" "-javaagent:/Applications/IntelliJ IDEA.app/Contents/lib/idea_rt.jar=65172:/Applications/IntelliJ IDEA.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath "/Applications/IntelliJ IDEA.app/Contents/plugins/maven/lib/maven3/boot/plexus-classworlds-2.5.2.jar" org.codehaus.classworlds.Launcher -Didea.version=2018.2.1 install
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for io.confluent:Jointransform:jar:4.1.1
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.scala-lang:scala-library:jar -> duplicate declaration of version ${scala.version} @ io.confluent:Jointransform:[unknown-version], /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/pom.xml, line 259, column 21
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO] ------------------------------------------------------------------------
[INFO] Detecting the operating system and CPU architecture
[INFO] ------------------------------------------------------------------------
[INFO] os.detected.name: osx
[INFO] os.detected.arch: x86_64
[INFO] os.detected.version: 10.13
[INFO] os.detected.version.major: 10
[INFO] os.detected.version.minor: 13
[INFO] os.detected.classifier: osx-x86_64
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Jointransform 4.1.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (validate) @ Jointransform ---
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-versions) @ Jointransform ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.10:add-source (add-source) @ Jointransform ---
[INFO] Source directory: /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala added.
[INFO] Source directory: /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/generated-sources added.
[INFO]
[INFO] --- avro-maven-plugin:1.8.2:schema (default) @ Jointransform ---
[INFO]
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) @ Jointransform ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 28 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ Jointransform ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 11 source files to /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/classes
[INFO]
[INFO] --- scala-maven-plugin:3.2.1:compile (default) @ Jointransform ---
[INFO] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala:-1: info: compiling
[INFO] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/generated-sources:-1: info: compiling
[INFO] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/generated-sources/annotations:-1: info: compiling
[INFO] Compiling 16 source files to /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/classes at 1536749088682
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Filtertransform.scala:10: error: object language is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.language.implicitConversions
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Filtertransform.scala:14: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Filtertransform.scala:75: error: Symbol 'type org.apache.kafka.streams.Consumed' is missing from the classpath.
[ERROR] This symbol is required by 'value com.lightbend.kafka.scala.streams.StreamsBuilderS.consumed'.
[ERROR] Make sure that type Consumed is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[ERROR] A full rebuild may help if 'StreamsBuilderS.class' was compiled against an incompatible version of org.apache.kafka.streams.
[ERROR] val stream: KStreamS[String, Array[Byte]] = builder.stream(inputTopic)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:12: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.JavaConverters._
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:11: error: object language is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.language.implicitConversions
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:20: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:21: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable.ListBuffer
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:39: error: Symbol 'type org.apache.kafka.streams.Consumed' is missing from the classpath.
[ERROR] This symbol is required by 'method com.lightbend.kafka.scala.streams.ImplicitConversions.consumedFromSerde'.
[ERROR] Make sure that type Consumed is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[ERROR] A full rebuild may help if 'ImplicitConversions.class' was compiled against an incompatible version of org.apache.kafka.streams.
[ERROR] (out:+ s)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:57: error: value asScala is not a member of java.util.List[String]
[ERROR] possible cause: maybe a semicolon is missing before `value asScala'?
[ERROR] .asScala
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:147: error: not found: value Consumed
[ERROR] implicit val c = Consumed.`with`(Serdes.String(), Serdes.ByteArray())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:149: error: polymorphic expression cannot be instantiated to expected type;
[ERROR] found : [K, V]org.apache.kafka.streams.kstream.KTable[K,V]
[ERROR] required: org.apache.kafka.streams.scala.kstream.KTable[String,Array[Byte]]
[ERROR] val table: KTable[String, Array[Byte]] = builder.table(inputTopic)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:156: error: not found: value Serialized
[ERROR] implicit val sb = Serialized.`with`(Serdes.String(), Serdes.ByteArray())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:157: error: not found: value Serialized
[ERROR] implicit val ss = Serialized.`with`(Serdes.String(), Serdes.String())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:158: error: not found: value Produced
[ERROR] implicit val p = Produced.`with`(Serdes.String(),Serdes.String())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:241: error: value asScala is not a member of java.util.List[String]
[ERROR] possible cause: maybe a semicolon is missing before `value asScala'?
[ERROR] .asScala
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:245: error: value asScala is not a member of java.util.List[String]
[ERROR] val newValue = getValFromJSONMessage(lv, aggregateColumnList.asScala.toList.head)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:232: error: diverging implicit expansion for type org.apache.kafka.streams.kstream.Serialized[KR,VR]
[ERROR] starting with method consumedFromSerde in object ImplicitConversions
[ERROR] val groupedK = table.groupBy{ (key: String, value: Array[Byte]) =>
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:252: error: not found: type Aggregator
[ERROR] val newAggregator: Aggregator[String, Array[Byte], Array[Byte]] = new Aggregator[String, Array[Byte], Array[Byte]] {
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:252: error: not found: type Aggregator
[ERROR] val newAggregator: Aggregator[String, Array[Byte], Array[Byte]] = new Aggregator[String, Array[Byte], Array[Byte]] {
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:10: error: object language is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.language.implicitConversions
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:11: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.JavaConverters._
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:18: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable.ListBuffer
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:37: error: not found: type ListBuffer
[ERROR] val lst = new ListBuffer[String]
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:84: error: value asScala is not a member of java.util.List[String]
[ERROR] Note: implicit value j is not applicable here because it comes after the application point and it lacks an explicit result type
[ERROR] val leftRekeyColumnList = if(rekeyLeftNeeded) joinStep.getStringList("joinOn.leftFields").asScala.toList else List.empty
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:85: error: value asScala is not a member of java.util.List[String]
[ERROR] Note: implicit value j is not applicable here because it comes after the application point and it lacks an explicit result type
[ERROR] val rightRekeyColumnList = if(rekeyRightNeeded) joinStep.getStringList("joinOn.rightFields").asScala.toList else List.empty
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:96: error: value asScala is not a member of java.util.List[String]
[ERROR] Note: implicit value j is not applicable here because it comes after the application point and it lacks an explicit result type
[ERROR] if(!JoinUtils.validateFilterExpression(filterExpression.asScala.toList)) {
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:147: error: value asScala is not a member of java.util.List[String]
[ERROR] JoinUtils.joinValues(lv, rv, node, outputFields.asScala.toList)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:157: error: value asScala is not a member of java.util.List[String]
[ERROR] val filterPredicate = JoinUtils.buildPostJoinPredicate(filterExpression.asScala.toList, lv)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:165: error: value asScala is not a member of java.util.List[String]
[ERROR] val ba: Array[Byte] = JoinUtils.removeEntryFromJsonNode(extraFields.asScala.toList, value)
[ERROR] ^
[ERROR] 29 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.937 s
[INFO] Finished at: 2018-09-12T06:44:53-04:00
[INFO] Final Memory: 35M/376M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile (default) on project Jointransform: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Process finished with exit code 1
I needed to prefix _root_ to all the Scala core package imports, for example:
import _root_.scala.collection.JavaConverters._
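A short sketch of why this happens: the Kafka Streams Scala wrapper lives in a package literally named org.apache.kafka.streams.scala, so once its parent package is imported, the name scala resolves to that subpackage instead of the root scala package, and relative imports break (assuming the sources import org.apache.kafka.streams._ or sit under that package):

import org.apache.kafka.streams._   // brings the subpackage `scala` into scope

// A relative import now resolves against org.apache.kafka.streams.scala and fails:
//   import scala.collection.JavaConverters._
//   => error: object collection is not a member of package org.apache.kafka.streams.scala

import _root_.scala.collection.JavaConverters._   // _root_ anchors the lookup at the root package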

Object kkapi is not a member of package

I am trying to import a module from another project, as follows: the imported library is /home/developer/...kafka-api.
I am using the imported library in my tests.
When I compile my spec files with test:compile, I get the following error:
[IJ]sbt:auth_stream> test:compile
[info] Compiling 3 Scala sources to /home/developer/Desktop/microservices/bary/auth-stream/target/scala-2.12/test-classes ...
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:14:20: object kkapi is not a member of package io.khinkali
[error] import io.khinkali.kkapi.consumer.{KkConsumer, KkConsumerConfig, KkConsumerCreator}
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:15:20: object kkapi is not a member of package io.khinkali
[error] import io.khinkali.kkapi.producer.{KkProducer, KkProducerCreator, MaxBlockMsConfig}
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:24:56: not found: value KkConsumer
[error] private val consumer: IO[Consumer[String, String]] = KkConsumer.create(createConsumer())
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:52:5: not found: type KkConsumerCreator
[error] : KkConsumerCreator
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:25:56: not found: value KkProducer
[error] private val producer: IO[Producer[String, String]] = KkProducer.create(createProducer())
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:46:5: not found: type KkProducerCreator
[error] : KkProducerCreator
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:47:5: not found: value KkProducerCreator
[error] = KkProducerCreator(sys.env.get("KAFKA_SERVER").get,
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:49:10: not found: value MaxBlockMsConfig
[error] List(MaxBlockMsConfig(2000)))
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:53:5: not found: value KkConsumerCreator
[error] = KkConsumerCreator(sys.env.get("KAFKA_SERVER").get,
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:57:16: not found: type KkConsumerConfig
[error] List.empty[KkConsumerConfig])
[error] ^
[error] 10 errors found
[error] (test:compileIncremental) Compilation failed
What am I doing wrong?
Note that the package names of the current project and the imported project start with the same prefix and differ only at the end. Could that be the problem?
What I am trying to achieve is to use a function from the kafka-api project.
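For reference, a minimal build.sbt sketch of how a sibling project is usually wired in as a source dependency (the project id and path here are assumptions); if the build lacks such a dependency, the test sources cannot see io.khinkali.kkapi at all:

// hypothetical project id and path; point ProjectRef at the sibling build
lazy val kafkaApi = ProjectRef(file("/home/developer/.../kafka-api"), "kafka-api")

lazy val authStream = (project in file("."))
  .dependsOn(kafkaApi)   // puts kafka-api's classes on this project's classpath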

Why won't my scalatest test compile? (scala.MatchError)

This is all of the code in my project:

package fileSearcher

import org.scalatest.FlatSpec

class FilterCheckerTests extends org.scalatest.FlatSpec {
  "Foo" should "not do terrible things" in {
    assert(1 == 1)
  }
}
sbt test crashes with scala.MatchError (full details below).
What am I doing wrong?
[info] Compiling 1 Scala source to C:\scala\course\FileSearcher\target\scala-2.10\test-classes...
[error]
[error] while compiling: C:\scala\course\FileSearcher\src\test\scala\fileSearcher\FilterCheckerTests.scala
[error] during phase: typer
[error] library version: version 2.10.4
[error] compiler version: version 2.10.4
[error] reconstructed args: -classpath C:\scala\course\FileSearcher\target\scala-2.10\test-classes;C:\scala\course\FileSearcher\target\scala-2.10\classes;C:\Users\Max\.ivy2\cache\org.scalatest\scalatest_2.11\bundles\scalatest_2.11-2.2.4.jar;C:\Users\Max\.ivy2\cache\org.scala-lang\scala-reflect\jars\scala-reflect-2.11.2.jar;C:\Users\Max\.ivy2\cache\org.scala-lang.modules\scala-xml_2.11\bundles\scala-xml_2.11-1.0.2.jar;C:\Users\Max\.ivy2\cache\com.novocode\junit-interface\jars\junit-interface-0.11.jar;C:\Users\Max\.ivy2\cache\junit\junit\jars\junit-4.11.jar;C:\Users\Max\.ivy2\cache\org.hamcrest\hamcrest-core\jars\hamcrest-core-1.3.jar;C:\Users\Max\.ivy2\cache\org.scala-sbt\test-interface\jars\test-interface-1.0.jar -bootclasspath C:\Program Files\Java\jdk1.8.0_20\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_20\jre\lib\rt.jar;C:\Program Files\Java\jdk1.8.0_20\jre\lib\sunrsasign.jar;C:\Program Files\Java\jdk1.8.0_20\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_20\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_20\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_20\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_20\jre\classes;C:\Users\Max\.ivy2\cache\org.scala-lang\scala-library\jars\scala-library-2.10.4.jar
[error]
[error] last tree to typer: Literal(Constant(true))
[error] symbol: null
[error] symbol definition: null
[error] tpe: Boolean(true)
[error] symbol owners:
[error] context owners: value <local FilterCheckerTests> -> class FilterCheckerTests -> package fileSearcher
[error]
[error] == Enclosing template or block ==
[error]
[error] Template( // val <local FilterCheckerTests>: <notype> in class FilterCheckerTests
[error] "org.scalatest.FlatSpec" // parents
[error] ValDef(
[error] private
[error] "_"
[error] <tpt>
[error] <empty>
[error] )
[error] // 2 statements
[error]   DefDef( // def <init>(): fileSearcher.FilterCheckerTests in class FilterCheckerTests
[error] <method>
[error] "<init>"
[error] []
[error] List(Nil)
[error] <tpt> // tree.tpe=fileSearcher.FilterCheckerTests
[error] Block( // tree.tpe=Unit
[error]       Apply( // def <init>(): org.scalatest.FlatSpec in class FlatSpec, tree.tpe=org.scalatest.FlatSpec
[error]         FilterCheckerTests.super."<init>" // def <init>(): org.scalatest.FlatSpec in class FlatSpec, tree.tpe=()org.scalatest.FlatSpec
[error] Nil
[error] )
[error] ()
[error] )
[error] )
[error] Apply(
[error] "Foo".should("not do terrible things")."in"
[error] Apply(
[error] "assert"
[error]       Apply( // def ==(x: Int): Boolean in class Int, tree.tpe=Boolean(true)
[error]         1."$eq$eq" // def ==(x: Int): Boolean in class Int, tree.tpe=(x: Int)Boolean
[error] 1
[error] )
[error] )
[error] )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] ConstantType(value = Constant(true))
[error]
[error] uncaught exception during compilation: scala.MatchError
[trace] Stack trace suppressed: run last test:compile for the full output.
[error] (test:compile) scala.MatchError: false (of class scala.reflect.internal.Trees$Literal)
[error] Total time: 0 s, completed Jun 20, 2015 11:07:15 AM
1. Waiting for source changes... (press enter to interrupt)
As you can see by looking at the classpath printed by the compiler, you mixed Scala 2.10 with libraries built for 2.11. Since major versions of Scala are binary incompatible, this can never work.
This can be fixed by setting scalaVersion := "2.11.5", or by pinning all dependencies to their 2.10 versions. The latter is what libraryDependencies += "group" %% "libName" % "version" is for: %% tells sbt to automatically append your Scala binary version and pick the matching artifact.
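For instance, a minimal build.sbt sketch (the scalatest and junit-interface versions are the ones visible in the classpath above; adjust to taste):

scalaVersion := "2.11.5"

libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest"       % "2.2.4" % Test,  // %% resolves to scalatest_2.11
  "com.novocode"  %  "junit-interface" % "0.11"  % Test   // plain Java artifact, single %
)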
For anyone getting a similar error (as I did) running Scala 2.10 with Maven instead of sbt, the solution is simply to change the Maven dependency suggested on the ScalaTest website from
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId>
    <version>3.0.0</version>
    <scope>test</scope>
</dependency>
to
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.10</artifactId>
    <version>3.0.0</version>
    <scope>test</scope>
</dependency>