Attempting to execute compile task but mystery module can't be loaded - scala

I'm compiling a multi-part Scala project. It's not that large, but some of it is Scala 2.13 and some is Scala 3.
Attempting to compile generates the fatal error [UNRESOLVED DEPENDENCIES:
base#base_2.12;0.1.0-SNAPSHOT: not found]
The thing is, the string 0.1.0-SNAPSHOT doesn't occur anywhere in my build.sbt or anywhere else. It used to be there, but it's long gone. I assume some update cache contains it, but I've been unable to find it.
Here is my build.sbt:
ThisBuild / libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.7" % Test
ThisBuild / Compile / scalacOptions ++= Seq("--deprecation")
ThisBuild / Test / logBuffered := false
ThisBuild / Test / parallelExecution := false

lazy val scala213 = "2.13.5"
lazy val scala212 = "2.12.13"
lazy val scala3 = "3.0.0-RC2"
lazy val supportedScalaVersions = List(scala213, scala3)

lazy val root = (project in file("."))
  .aggregate(top, trans, base)
  .settings(
    name := "toysat"
  )

lazy val top = (project in file("top"))
  .settings(
    name := "main",
    scalaVersion := scala213,
    scalacOptions += "-Ytasty-reader",
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.7" % Test
  )
  .dependsOn(trans, base)

lazy val trans = (project in file("trans"))
  .settings(
    name := "trans",
    Compile / scalaVersion := scala3,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.7" % Test
  )
  .dependsOn(base)

lazy val base = (project in file("base"))
  .settings(
    name := "base",
    scalaVersion := scala213,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.7" % Test
  )

Most questions of this ilk on Stack Overflow are about downloading remotely defined modules. The problem I'm having is that sbt cannot find an obsolete version of one of my own (freshly compiled) modules.
And here is the sbt command output (this is an Emacs buffer):
sbt:toysat> reload
[info] welcome to sbt 1.5.5 (AdoptOpenJDK Java 1.8.0_292)
[info] loading project definition from /Users/drewmcdermott/BIG/RESEARCH/puzzles/toystory4/toysat/project
[info] loading settings for project root from build.sbt ...
[info] set current project to toysat (in build file:/Users/drewmcdermott/BIG/RESEARCH/puzzles/toystory4/toysat/)
sbt:toysat> compile
[info] compiling 4 Scala sources to /Users/drewmcdermott/BIG/RESEARCH/puzzles/toystory4/toysat/base/target/scala-2.13/classes ...
[warn]
[warn] Note: Unresolved dependencies path:
[info] done compiling
[error] stack trace is suppressed; run last trans / update for the full output
[error] (trans / update) sbt.librarymanagement.ResolveException: Error downloading base:base_2.12:0.1.0-SNAPSHOT
[error] Not found
[error] Not found
[error] not found: /Users/drewmcdermott/.ivy2/local/base/base_2.12/0.1.0-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/base/base_2.12/0.1.0-SNAPSHOT/base_2.12-0.1.0-SNAPSHOT.pom
[error] Total time: 25 s, completed Jul 28, 2021 11:06:18 PM
The 25 seconds were consumed compiling the 4 files in the base subproject, apparently successfully. I think it's when sbt tries to compile the trans subproject that it runs into trouble.
Here's a partial stack trace. It means nothing to me except that Coursier is involved.
sbt:toysat> last trans / update
[debug] not up to date. inChanged = true, force = false
[debug] Updating trans...
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading base:base_2.12:0.1.0-SNAPSHOT
[error] Not found
[error] Not found
[error] not found: /Users/drewmcdermott/.ivy2/local/base/base_2.12/0.1.0-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/base/base_2.12/0.1.0-SNAPSHOT/base_2.12-0.1.0-SNAPSHOT.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:258)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$38(CoursierDependencyResolution.scala:227)
[error] at lmcoursier.CoursierDependencyResolution$$Lambda$4262/0x0000000000000000.apply(Unknown Source)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:227)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:59)
It seems clear that some cache somewhere is holding onto the string 0.1.0-SNAPSHOT, but there are an ungodly number of caches. I've tried deleting several, but I haven't found the relevant one.
Can someone explain how to recover from a situation like this?

Your base project is compiled only for Scala 2.13, whereas it is declared (via dependsOn) as a dependency of trans, which targets Scala 3.
You should cross-build your base project for Scala 2.13 and 3 (and maybe 2.12, going by your error message, even though I don't see any use of Scala 2.12 in what you shared).
Edit: Scala 2.13 and 3 are compatible, so the issue should only happen if a dependency is built only for 2.12.
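For example, cross-building base could look like this (a sketch based on the build in the question; the crossScalaVersions line is the addition, and the exact version list is my assumption):

```scala
// build.sbt -- sketch; scala213 and scala3 are the vals from the question
lazy val base = (project in file("base"))
  .settings(
    name := "base",
    scalaVersion := scala213,
    // build base for every Scala version its dependents use
    crossScalaVersions := List(scala213, scala3),
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.7" % Test
  )
```

With that in place, prefixing a task with + (e.g. +compile or +publishLocal) runs it once per listed Scala version, so trans can resolve an artifact built for its own Scala version instead of hunting for a nonexistent base_2.12.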

I'm not answering my own question out of narcissism, but because I can't say what I want in a comment, and editing the original question would bury possibly useful information in an odd place. I've upvoted and accepted @GaelJ's answer.
My build.sbt doesn't look that different. The differences are captured by the revised trans subproject definition:
lazy val trans = (projectMatrix in file("trans"))
  .settings(
    name := "trans",
    version := "0.3",
    // I thought this line was unnecessary, but without it
    // sbt doesn't understand the command trans / compile
    Compile / scalaVersion := scala3,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.7" % Test
  )
  .jvmPlatform(scalaVersions = Seq(scala213))
  .dependsOn(base)
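For completeness, projectMatrix comes from the sbt-projectmatrix plugin, so a build like this also needs something along these lines (the plugin version is an assumption, and base would need a matching matrix definition):

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-projectmatrix" % "0.8.0")

// build.sbt -- sketch of a matching base definition, built for both
// Scala versions so every axis of trans has a base to depend on
lazy val base = (projectMatrix in file("base"))
  .settings(name := "base")
  .jvmPlatform(scalaVersions = Seq(scala213, scala3))
```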

Related

Unable to find PlayScala (Heroku tutorial)

I'm new to Scala and Heroku and I'm following the Heroku getting-started guide.
I'm using Mac (10.14.6) and followed the instructions here to install sbt and play.
I am now on the "Declare app dependencies" step, but when I type the sbt compile stage command I get the following error:
$ sbt compile stage
[info] welcome to sbt 1.3.13 (AdoptOpenJDK Java 11.0.8)
[info] loading project definition from /Users/jack/scala-getting-started/project
/Users/jack/scala-getting-started/build.sbt:5: error: not found: value PlayScala
lazy val root = (project in file(".")).enablePlugins(PlayScala)
^
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
The build.sbt file (provided automatically) is
name := """play-getting-started"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  jdbc,
  cache,
  "org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
  ws
)
libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-compiler" % _ )
How can I fix this error?
Thank you.

Scala IntelliJ library import errors

I am new to Scala and I am trying to import the following libraries in my build.sbt. When IntelliJ does an auto-update I get the following error:
Error while importing sbt project:
List([info] welcome to sbt 1.3.13 (Oracle Corporation Java 1.8.0_251)
[info] loading global plugins from C:\Users\diego\.sbt\1.0\plugins
[info] loading project definition from C:\Users\diego\development\Meetup\Stream-Processing\project
[info] loading settings for project stream-processing from build.sbt ...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] sbt server started at local:sbt-server-80d70f9339b81b4d026a
sbt:Stream-Processing>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from C:/Users/diego/.IntelliJIdea2019.3/config/plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] Total time: 2 s, completed Jun 28, 2020 12:11:24 PM
[info] shutting down sbt server)
This is my build.sbt file:
name := "Stream-Processing"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql-kafka-0-10_2.12
libraryDependencies += "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" %% "kafka-clients" % "2.3.1"
// https://mvnrepository.com/artifact/mysql/mysql-connector-java
libraryDependencies += "mysql" % "mysql-connector-java" % "8.0.18"
// https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1"
I made a Scala project just to make sure Spark works and my python project using Kafka works as well so I am sure it's not a spark/kafka problem. Any reason why I am getting that error?
Try removing one % before "kafka-clients":
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.3.1"
The semantics of %% in SBT is that it appends the Scala version being used to the artifact name, so it becomes org.apache.kafka:kafka-clients_2.11:2.3.1 as the error message shows as well. Note the _2.11 suffix.
This is a nice shorthand for Scala libraries, but can get confusing for beginners, when used with Java libs.
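A side-by-side sketch of the two operators (the resolved coordinates in the comments assume scalaVersion := "2.11.8", as in the question):

```scala
// %% appends the Scala binary version to the artifact name --
// use it for Scala libraries:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4"
// resolves org.apache.spark:spark-sql_2.11:2.4.4

// % takes the artifact name verbatim -- use it for plain Java libraries:
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.3.1"
// resolves org.apache.kafka:kafka-clients:2.3.1
```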

Compile error scala project

I'm taking the Coursera Scala course. I downloaded the sample project, but I cannot compile it: I get an error when I run the console command.
build.sbt:
name := course.value + "-" + assignment.value
scalaVersion := "2.12.4"
scalacOptions ++= Seq("-deprecation")
// grading libraries
libraryDependencies += "junit" % "junit" % "4.10" % Test
// for funsets
libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.4"
resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"
// include the common dir
commonSourcePackages += "common"
courseId := "bRPXgjY9EeW6RApRXdjJPw"
Error:
> console
[info] Updating root
[info] Resolved root dependencies
[trace] Stack trace suppressed: run last *:coursierResolution for the full output.
[error] (*:coursierResolution) coursier.ResolutionException: Encountered 1 error(s) in dependency resolution:
[error] org.scalatest:scalatest_2.12:2.2.4:
[error] not found:
[error] /Users/joaonobre/.ivy2/local/org.scalatest/scalatest_2.12/2.2.4/ivys/ivy.xml
[error] /Users/joaonobre/.sbt/preloaded/org.scalatest/scalatest_2.12/2.2.4/ivys/ivy.xml
[error] /Users/joaonobre/.sbt/preloaded/org/scalatest/scalatest_2.12/2.2.4/scalatest_2.12-2.2.4.pom
[error] https://repo1.maven.org/maven2/org/scalatest/scalatest_2.12/2.2.4/scalatest_2.12-2.2.4.pom
[error] http://repo.artima.com/releases/org/scalatest/scalatest_2.12/2.2.4/scalatest_2.12-2.2.4.pom
[error] Total time: 1 s, completed Dec 22, 2017 4:16:17 PM
scala -version
Scala code runner version 2.12.4 -- Copyright 2002-2017, LAMP/EPFL and Lightbend, Inc.
Any idea how to fix it?
Thanks.
As suggested by @laughedelic, there is no scalatest 2.2.4 built for Scala 2.12, so sbt cannot find the corresponding scalatest pom file. You can pick any of the versions mentioned here.
For instance, try changing the scalaTestDependency version to 3.0.5 in the CommonBuild.scala file:
lazy val scalaTestDependency = "org.scalatest" %% "scalatest" % "3.0.5"
Here is the course link (Practice Programming Assignment of Week1)

Modifying and Building Spark core

I am trying to make a modification to the Apache Spark source code. I created a new method and added it to the RDD.scala file within the Spark source code I downloaded. After making the modification to RDD.scala, I built Spark using
mvn -Dhadoop.version=2.2.0 -DskipTests clean package
I then created a sample Scala Spark Application as mentioned here
I tried using the new function I created, and I got a compilation error when running sbt to build a jar for my application. How exactly do I compile Spark with my modification and attach the modified jar to my project? The file I modified is RDD.scala within the core project. I run sbt package from the root dir of my Spark application project.
Here is the sbt file:
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.3.0"
Here is the error:
sbt package
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
[info] Set current project to Noah Spark (in build file:/Users/r/Downloads/spark-proj/n-spark/)
[info] Updating {file:/Users/r/Downloads/spark-proj/n-spark/}n-spark...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/r/Downloads/spark-proj/n-spark/target/scala-2.11/classes...
[error] /Users/r/Downloads/spark-proj/n-spark/src/main/scala/SimpleApp.scala:11: value reducePrime is not a member of org.apache.spark.rdd.RDD[Int]
[error] logData.reducePrime(_+_);
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 24 s, completed Apr 11, 2015 2:24:03 AM
UPDATE
Here is the updated sbt file
name := "N Spark"
version := "1.0"
scalaVersion := "2.10"
libraryDependencies += "org.apache.spark" % "1.3.0"
I get the following error for this file:
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
/Users/Raggy/Downloads/spark-proj/noah-spark/simple.sbt:7: error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.impl.GroupArtifactID] found,
so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += "org.apache.spark" % "1.3.0"
Delete the libraryDependencies line from build.sbt and just copy the custom-built Spark jar to the lib directory in your application project.
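A sketch of what that leaves in the application's build (the jar file name below is hypothetical; by default sbt treats every jar under lib/ as an unmanaged dependency, no declaration needed):

```scala
// build.sbt -- no Spark entry in libraryDependencies; instead the
// custom-built jar sits at lib/spark-core-custom.jar
// (hypothetical name -- use whatever your Spark build produced)
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"

// optional: point sbt somewhere other than lib/ for unmanaged jars
// unmanagedBase := baseDirectory.value / "custom_lib"
```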

Why does sbt keep telling me to add -deprecation to scalacOptions when it's already?

Here below is my multi-project structure:
myApp
  + build.sbt
  + sub-prj-1
    + build.sbt
  + sub-prj-2
    + build.sbt
  + project
    + Build.scala
I use to define common settings in project/Build.scala like this:
import sbt._
import Keys._

object ApplicationBuild extends Build {

  val defaultScalacOptions = Seq(
    "-unchecked", "-deprecation", "-feature", "-language:reflectiveCalls", "-language:implicitConversions",
    "-language:postfixOps", "-language:dynamics", "-language:higherKinds", "-language:existentials",
    "-language:experimental.macros", "-encoding", "UTF-8", "-Xmax-classfile-name", "140")

  val defaultResolvers = Seq(
    "Typesafe releases repository" at "http://repo.typesafe.com/typesafe/releases/"
  )

  val defaultSettings = Defaults.defaultSettings ++ Seq(
    scalaVersion := "2.10.4",
    scalacOptions ++= defaultScalacOptions,
    resolvers ++= defaultResolvers
  )
}
... and then I reference these common settings in each build.sbt file:
name := "myapp"
organization := "Test, Inc."
version := "1.0"
ApplicationBuild.defaultSettings // it looks like common settings defined in
                                 // project/Build.scala are not read...

scalacOptions += "-feature" // already defined in ApplicationBuild.defaultSettings...
                            // but if I don't define it here, it doesn't work

lazy val `sub-prj-1` = project.in(file("sub-prj-1"))
lazy val `sub-prj-2` = project.in(file("sub-prj-2"))

lazy val brix = project.in(file(".")).settings(
  publishArtifact := false
).aggregate(
  `sub-prj-1`,
  `sub-prj-2`
)
For example, scalacOptions += "-feature" is already defined in Build.scala... but if I don't define it in build.sbt I always get the following warning:
[warn] there were 1 deprecation warning(s); re-run with -deprecation for details
[warn] one warning found
Any idea? Am I missing something? This problem first appeared after I installed sbt 0.13.5.
EDIT
Here's the content of scalacOptions:
[info] sub-prj-1/*:scalacOptions
[info] List(-unchecked, -deprecation, -feature, -language:reflectiveCalls, -language:implicitConversions, -language:postfixOps, -language:dynamics, -language:higherKinds, -language:existentials, -language:experimental.macros, -encoding, UTF-8, -Xmax-classfile-name, 140)
[info] sub-prj-2/*:scalacOptions
[info] List(-unchecked, -deprecation, -feature, -language:reflectiveCalls, -language:implicitConversions, -language:postfixOps, -language:dynamics, -language:higherKinds, -language:existentials, -language:experimental.macros, -encoding, UTF-8, -Xmax-classfile-name, 140)
[info] myapp/*:scalacOptions
[info] List(-unchecked, -deprecation, -feature, -language:reflectiveCalls, -language:implicitConversions, -language:postfixOps, -language:dynamics, -language:higherKinds, -language:existentials, -language:experimental.macros, -encoding, UTF-8, -Xmax-classfile-name, 140)
I can only guess (and I'm counting on additional information to correct me if I'm mistaken), but the warning messages come from the build project (under project), not yours.
I'm on sbt 0.13.6-SNAPSHOT (built from the sources today) so your mileage may vary.
➜ myApp xsbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/sandbox/common-settings/myApp/project
[info] Updating {file:/Users/jacek/sandbox/common-settings/myApp/project/}myapp-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/jacek/sandbox/common-settings/myApp/project/target/scala-2.10/sbt-0.13/classes...
[warn] there were 1 deprecation warning(s); re-run with -deprecation for details
[warn] one warning found
[info] Set current project to brix (in build file:/Users/jacek/sandbox/common-settings/myApp/)
When I tried to reproduce your case, I ended up with the messages coming for the build definition under project:
[warn] there were 1 deprecation warning(s); re-run with -deprecation for details
[warn] one warning found
Are they what you want to get rid of? If so, read on; otherwise, please add more details to your question. Thanks.
Because sbt is recursive, what's beneath project is another build definition (and so on).
In order to get rid of the messages, you should follow their advice and add -deprecation to the build definition of the corresponding project. Add the following to project/build.sbt:
scalacOptions += "-deprecation"
With this, reload, and the mystery is revealed.
➜ myApp xsbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/sandbox/common-settings/myApp/project
[info] Compiling 1 Scala source to /Users/jacek/sandbox/common-settings/myApp/project/target/scala-2.10/sbt-0.13/classes...
[warn] /Users/jacek/sandbox/common-settings/myApp/project/Build.scala:15: value defaultSettings in object Defaults is deprecated: 0.13.2
[warn] val defaultSettings = Defaults.defaultSettings ++ Seq(
[warn] ^
[warn] one warning found
[info] Set current project to brix (in build file:/Users/jacek/sandbox/common-settings/myApp/)
>
As sbt.Defaults says:
@deprecated("0.13.2", "Default settings split into `coreDefaultSettings` and IvyModule/JvmModule plugins.")
To fix this, read the article "Preview of upcoming sbt 1.0 features", which describes the new plugins.
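Concretely, the deprecation warning disappears if Build.scala switches to the split-out core settings (a sketch; behavior should be otherwise unchanged):

```scala
// project/Build.scala -- replace the deprecated Defaults.defaultSettings
val defaultSettings = Defaults.coreDefaultSettings ++ Seq(
  scalaVersion := "2.10.4",
  scalacOptions ++= defaultScalacOptions,
  resolvers ++= defaultResolvers
)
```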