Getting scalac: Error: Error compiling the sbt component 'compiler-interface-2.11.8-62.0' while running a Scala Spark program - scala

build.sbt
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
ThisBuild / libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.8"
lazy val root = (project in file("."))
  .settings(
    name := "Spark_Practice"
  )
Error:
scalac: Error: Error compiling the sbt component 'compiler-interface-2.11.8-62.0'
sbt.internal.inc.CompileFailed: Error compiling the sbt component 'compiler-interface-2.11.8-62.0'
  at sbt.internal.inc.AnalyzingCompiler$.handleCompilationError$1(AnalyzingCompiler.scala:436)
  at sbt.internal.inc.AnalyzingCompiler$.$anonfun$compileSources$5(AnalyzingCompiler.scala:453)
  at sbt.internal.inc.AnalyzingCompiler$.$anonfun$compileSources$5$adapted(AnalyzingCompiler.scala:448)
  at sbt.io.IO$.withTemporaryDirectory(IO.scala:490)
  at sbt.io.IO$.withTemporaryDirectory(IO.scala:500)
  at sbt.internal.inc.AnalyzingCompiler$.$anonfun$compileSources$2(AnalyzingCompiler.scala:448)
  at sbt.internal.inc.AnalyzingCompiler$.$anonfun$compileSources$2$adapted(AnalyzingCompiler.scala:440)
  at sbt.io.IO$.withTemporaryDirectory(IO.scala:490)
  at sbt.io.IO$.withTemporaryDirectory(IO.scala:500)
  at sbt.internal.inc.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:440)
  at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$.org$jetbrains$jps$incremental$scala$local$CompilerFactoryImpl$$getOrCompileInterfaceJar(CompilerFactoryImpl.scala:163)
  at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.$anonfun$getScalac$1(CompilerFactoryImpl.scala:61)
  at scala.Option.map(Option.scala:242)
  at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.getScalac(CompilerFactoryImpl.scala:54)
  at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.createCompiler(CompilerFactoryImpl.scala:23)
  at org.jetbrains.jps.incremental.scala.local.CachingFactory.$anonfun$createCompiler$3(CachingFactory.scala:24)
  at org.jetbrains.jps.incremental.scala.local.Cache.$anonfun$getOrUpdate$2(Cache.scala:20)
  at scala.Option.getOrElse(Option.scala:201)
  at org.jetbrains.jps.incremental.scala.local.Cache.getOrUpdate(Cache.scala:19)
  at org.jetbrains.jps.incremental.scala.local.CachingFactory.createCompiler(CachingFactory.scala:24)
  at org.jetbrains.jps.incremental.scala.local.LocalServer.doCompile(LocalServer.scala:43)
  at org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:30)
  at org.jetbrains.jps.incremental.scala.remote.Main$.compileLogic(Main.scala:209)
  at org.jetbrains.jps.incremental.scala.remote.Main$.$anonfun$handleCommand$1(Main.scala:192)
  at org.jetbrains.jps.incremental.scala.remote.Main$.decorated$1(Main.scala:182)
  at org.jetbrains.jps.incremental.scala.remote.Main$.handleCommand(Main.scala:189)
  at org.jetbrains.jps.incremental.scala.remote.Main$.serverLogic(Main.scala:165)
  at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:105)
  at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
  at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
  at java.base/java.lang.reflect.Method.invoke(Method.java:577)
  at com.facebook.nailgun.NGSession.runImpl(NGSession.java:312)
  at com.facebook.nailgun.NGSession.run(NGSession.java:198)
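This error typically means the Zinc compiler bridge for Scala 2.11.8 is being compiled under a JDK that Scala 2.11 predates (the `java.base/...` frames in the trace suggest a recent JDK, 9 or newer). Two common ways out: run sbt and IntelliJ's build process on JDK 8, or move to a Scala 2.12 release that compiles on modern JDKs. A minimal sketch of the second option, assuming Spark 2.4.8 (which also publishes `_2.12` artifacts):

```scala
// build.sbt — sketch: switch to Scala 2.12, which Spark 2.4.8 also supports
// and whose compiler bridge builds on modern JDKs.
ThisBuild / version      := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.15"
ThisBuild / libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.8"

lazy val root = (project in file("."))
  .settings(
    name := "Spark_Practice"
  )
```

If you must stay on 2.11.8, point sbt and the IDE at a JDK 8 installation instead.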

Related

How to write Intellij IDEA plugin for scala macros

I wrote a macro that works after compilation, but the problem is that IntelliJ IDEA doesn't see my generated code, so red underlines appear. I found an explanation here that I need to write an IDEA plugin that will let IDEA recognize my generated code. The problem is that I cannot use SyntheticMembersInjector because of a missing dependency. Is it possible to write an IDEA plugin for my own Scala macros?
my plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.6")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.0.2")
addSbtPlugin("org.jetbrains" % "sbt-idea-plugin" % "3.8.4")
my build.sbt:
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport.dockerExposedPorts
import sbt.Keys.{scalacOptions, version}

lazy val coreProject = (project in file("."))
  .enablePlugins(JavaAppPackaging, DockerPlugin, AkkaGrpcPlugin)
  .settings(
    scalaVersion := "2.12.12",
    name := "CDMS",
    version := "0.1",
    libraryDependencies ++= BuildConfig.projectDependencies,
    dockerBaseImage := "adoptopenjdk/openjdk15:alpine",
    dockerExposedPorts += 9002
  )
  .dependsOn(validationProject)

lazy val validationProject = (project in file("validation"))
  .enablePlugins(SbtPlugin)
  .settings(
    scalaVersion := "2.12.12",
    sbtPlugin := true,
    libraryDependencies ++= BuildConfig.monocleDependencies
  )
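SyntheticMembersInjector is defined in the IntelliJ Scala plugin itself, so an IDEA-plugin module built with sbt-idea-plugin has to declare that plugin as a dependency. A hedged sketch, with key names as documented for sbt-idea-plugin 3.x (the module name and plugin name below are assumptions — verify against the version you use):

```scala
// build.sbt — hypothetical IDEA-plugin module using sbt-idea-plugin 3.x.
// The intellijPlugins entry pulls in the IntelliJ Scala plugin, which
// provides SyntheticMembersInjector.
lazy val ideaPlugin = (project in file("idea-plugin"))
  .enablePlugins(SbtIdeaPlugin)
  .settings(
    scalaVersion := "2.12.12",
    ThisBuild / intellijPluginName := "my-macro-support",
    intellijPlugins += "org.intellij.scala".toPlugin
  )
```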

How to set the Docker Registry with sbt-native-packager

I'm trying to build a Docker image using sbt-native-packager with the following build.sbt (trying to publish the image to a local repository)
val sparkVersion = "2.4.5"
scalaVersion in ThisBuild := "2.12.0"

val sparkLibs = Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion
)

// JAR build settings
lazy val commonSettings = Seq(
  organization := "dzlab",
  version := "0.1",
  scalaSource in Compile := baseDirectory.value / "src",
  scalaSource in Test := baseDirectory.value / "test",
  resourceDirectory in Test := baseDirectory.value / "test" / "resources",
  javacOptions ++= Seq(),
  scalacOptions ++= Seq(
    "-deprecation",
    "-feature",
    "-language:implicitConversions",
    "-language:postfixOps"
  ),
  libraryDependencies ++= sparkLibs
)

// Docker Image build settings
dockerBaseImage := "gcr.io/spark-operator/spark:v" + sparkVersion

lazy val root = (project in file("."))
  .enablePlugins(
    DockerPlugin,
    JavaAppPackaging
  )
  .settings(
    name := "spark-k8s",
    commonSettings,
    dockerAliases ++= Seq(
      dockerAlias.value.withRegistryHost(Some("localhost:5000"))
    ),
    mainClass in (Compile, run) := Some("dzlab.SparkJob")
  )
SBT and the packager versions
$ cat project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
$ cat project/build.properties
sbt.version=0.13.18
When I try to run the packager
$ sbt docker:publish
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
/Users/dzlab/Projects/spark-k8s/build.sbt:39: error: not found: value dockerAliases
dockerAliases ++= Seq(
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q
It does not recognize dockerAliases, and I'm not sure why, since it is part of the publishing settings.
What is the proper way to set the Docker registry?
Your sbt-native-packager version is hopelessly outdated, as is your sbt version. That SettingKey doesn't exist in that version.
Compare: sbt-native-packager 1.0 vs. sbt-native-packager 1.7.4
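A sketch of the upgrade, assuming the 1.7.x plugin line on sbt 1.x (`dockerAliases` was added to sbt-native-packager well after 1.0.0):

```scala
// project/plugins.sbt — a current sbt-native-packager release
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.4")
```

and in project/build.properties, a matching sbt 1.x version such as `sbt.version=1.3.13`. On recent plugin versions you can also point publishing at the local registry with `dockerRepository := Some("localhost:5000")` instead of rewriting the alias.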

SBT sync throws an error "Cannot add dependency" in IntelliJ

I get the following error in IntelliJ when I run sbt sync:
[error] stack trace is suppressed; run 'last ProjectRef(uri("file:/Users/tushar/Documents/Projects/zio-rocksdb/"), "zio-rocksdb") / updateSbtClassifiers' for the full output
[error] (ProjectRef(uri("file:/Users/tushar/Documents/Projects/zio-rocksdb/"), "zio-rocksdb") / updateSbtClassifiers) java.lang.IllegalArgumentException: Cannot add dependency 'com.typesafe#ssl-config-core_2.12;0.4.0' to configuration 'default' of module dev.zio#zio-rocksdb$sbt_2.12;0.2.0+3-114b41b9+20200418-1131 because this configuration doesn't exist!
[error] Total time: 2 s, completed 18-Apr-2020, 11:31:05 AM
build.sbt
name := "scala-interview-scheduler"
version := "0.1"
scalaVersion := "2.13.1"

// ZIO Core
lazy val zioVersion = "1.0.0-RC18"

// Project Scheduler
lazy val scheduler =
  (project in file("scheduler"))

// Project Storage
lazy val storage = (project in file("storage"))
  .settings(
    libraryDependencies ++= Seq(
      "io.suzaku" %% "boopickle" % "1.3.1"
    )
  )
  .dependsOn(ProjectRef(file("../zio-rocksdb"), "zio-rocksdb"))

// Project Program
lazy val program = (project in file("program"))
  .dependsOn(scheduler)

// Testing
ThisBuild / libraryDependencies ++= Seq(
  "dev.zio" %% "zio" % zioVersion,
  "dev.zio" %% "zio-test" % zioVersion % "test",
  "dev.zio" %% "zio-test-sbt" % zioVersion % "test"
)
ThisBuild / testFrameworks += new TestFramework("zio.test.sbt.ZTestFramework")

// Global Options
ThisBuild / scalacOptions ++= Seq(
  "-language:postfixOps",
  "-language:implicitConversions",
  "-feature"
)
What does it mean?
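The failing task is `updateSbtClassifiers` on the external build pulled in via ProjectRef, so IntelliJ is tripping while resolving the checked-out zio-rocksdb sources rather than your own modules. One hedged way to sidestep it, if you don't need to modify zio-rocksdb itself, is to depend on the published artifact instead of the source checkout (the version below is an assumption — pick the release matching your ZIO RC):

```scala
// build.sbt — sketch: replace the ProjectRef source dependency with a binary one
lazy val storage = (project in file("storage"))
  .settings(
    libraryDependencies ++= Seq(
      "io.suzaku" %% "boopickle"   % "1.3.1",
      "dev.zio"   %% "zio-rocksdb" % "0.2.0" // hypothetical version
    )
  )
```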

Can not find PlayEbean Play 2.5.x

I am using the Play Java Starter Example 2.5.x from the Play Framework page. This is my plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "3.0.0")
this is my build.sbt
name := """play-java"""
version := "1.0-SNAPSHOT"

lazy val root = (project in file(".")).enablePlugins(PlayJava, PlayEbean)

scalaVersion := "2.11.11"

libraryDependencies += filters
libraryDependencies ++= Seq(
  javaJdbc,
  cache,
  javaWs,
  evolutions
)
In application.conf:
ebean.default = ["models.*"]
When trying to run the application I always get:
error: not found: value PlayEbean
lazy val root = (project in file(".")).enablePlugins(PlayJava, PlayEbean)
                                                              ^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
Invalid response.
Help is very appreciated.
After researching, I solved it: the problem was that I did not put
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "3.0.0")
in the correct file.
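For reference, sbt only picks up addSbtPlugin lines from the build's project/ directory, not from build.sbt itself, so the layout should look roughly like this:

```
play-java/
├── build.sbt          // enablePlugins(PlayJava, PlayEbean) and other settings
└── project/
    └── plugins.sbt    // addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "3.0.0")
```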

How to add input files in sbt Scala

I am new to Scala. I am using sbt-assembly to create a fat JAR. My program reads input files, which I keep under the src/main/resources folder, but I am getting java.io.FileNotFoundException.
I don't know how to specify the path. I will be deploying the JAR on the server.
Here is my sbt build file
lazy val commonSettings = Seq(
  organization := "com.insnapinc",
  version := "0.1.0",
  scalaVersion := "2.11.4"
)

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "memcache-client"
  )

libraryDependencies ++= Seq(
  "org.scalaj" %% "scalaj-http" % "1.1.4",
  "org.json4s" %% "json4s-native" % "3.2.10",
  "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)

/* assembly plugin */
mainClass in AssemblyKeys.assembly := Some("com.insnap.memcache.MemcacheTest")
assemblySettings
test in AssemblyKeys.assembly := {}
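Files under src/main/resources are packaged inside the JAR, so once deployed they cannot be opened with java.io.File paths — that is what causes the FileNotFoundException. They have to be read from the classpath instead. A minimal sketch (the resource name passed in is a placeholder for whatever file you bundle):

```scala
import scala.io.Source

object ResourceReader {
  // Read a classpath resource (e.g. a file under src/main/resources) as UTF-8 text.
  // Returns None when the resource is not on the classpath, instead of throwing
  // the way a java.io.File path would.
  def readResource(name: String): Option[String] =
    Option(getClass.getResourceAsStream("/" + name)).map { in =>
      try Source.fromInputStream(in, "UTF-8").mkString
      finally in.close()
    }
}
```

Usage such as `ResourceReader.readResource("input.txt")` behaves the same under sbt run and inside the assembled fat JAR, since sbt-assembly copies resources into the JAR root.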