Encounter sbt.librarymanagement.ResolveException - scala

I am fairly new to Scala and sbt, and I ran into an issue when attempting to compile my Scala project. I have one hypothesis as to why I am seeing the error below:
The path to the dependency on my local machine is configured wrongly. It is currently "/Users/jutay/.ivy2/localorg.apache.spark/....." and there should be a "/" after "local", which would make the path look like ".../.ivy2/local/org.apache.spark/...".
But I am unsure whether this is really the case.
I did some research and found the online example below, which shows that others who encountered the same sbt.librarymanagement.ResolveException have a similarly shaped path (e.g. ".../.ivy2/localnet.cakesolutions/..."), so it seems the missing "/" in the path is not what is causing the problem?
But in case it is indeed the cause, where can I make the config change to add a "/" after the string "local"?
https://users.scala-lang.org/t/help-with-this-package-downloading-error/8257
sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core:3.0.1
If there is no issue with the path above and it is correct and expected, what am I doing wrong or missing that makes sbt compile fail with the error below? It does not seem to be able to download this dependency:
Error downloading org.apache.spark:spark-corcle_2.12:3.3.0
Note: I am currently using IntelliJ to work on this Scala sbt project.
I am kind of stuck on this issue.
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-corcle_2.12:3.3.0
[error] Not found
[error] Not found
[error] not found: /Users/jutay/.ivy2/localorg.apache.spark/spark-corcle_2.12/3.3.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/spark/spark-corcle_2.12/3.3.0/spark-corcle_2.12-3.3.0.pom
[error] Total time: 2 s, completed 28 Nov, 2022 5:27:51 PM
This is my build.sbt file:
name := "proj123"
version := "3.0"
scalaVersion := "2.12.15"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-corcle" % "3.3.0",
  "org.apache.spark" %% "spark-sql" % "3.3.0",
  "com.google.code.gson" % "gson" % "2.2.4",
  "com.google.cloud.spark" %% "spark-bigquery-with-dependencies" % "0.26.0" % "provided",
  "com.typesafe" % "config" % "1.3.2",
  "com.jcraft" % "jsch" % "0.1.55",
  "org.springframework" % "spring-web" % "5.3.18",
  "org.apache.httpcomponents" % "httpcore" % "4.4.15",
  "org.apache.httpcomponents" % "httpclient" % "4.5.13",
  "com.google.code.gson" % "gson" % "2.8.9"
)
I have tried looking up possible solutions online (via Google) related to sbt.librarymanagement.ResolveException, but I did not find anything helpful.
I also checked my .bash_profile and my sbtconfig.txt for any config option that would let me edit the path (by adding the "/") as a possible way to resolve the issue, but there seem to be no such options.
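From what I could find, sbt does not read resolver paths from .bash_profile or sbtconfig.txt; the local Ivy resolver is built in, and an explicit override, if one were ever needed, would be declared in build.sbt. Something like the sketch below is what I would expect such an override to look like (the resolver name "local-ivy" is just a placeholder I made up, and normally none of this should be necessary):
// Sketch only: explicitly pointing sbt at the local Ivy repository.
// The default local resolver already targets ~/.ivy2/local, so this is purely illustrative.
resolvers += Resolver.file(
  "local-ivy",
  file(sys.props("user.home")) / ".ivy2" / "local"
)(Resolver.ivyStylePatterns)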
This is my .bash_profile:
# Configuration for node to trust the xxxx Proxy Certificates
export NODE_EXTRA_CA_CERTS='/usr/local/etc/openssl/certs/xxxx_proxy_cacerts.pem'
# Configuration to load nvm - node version manager
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
# >>> coursier install directory >>>
export SPARK_HOME=/Users/jutay/Documents/spark-3.3.0-bin-hadoop3
export JAVA_HOME=/Library/Java/JavaVirtualMachines/zulu-8.jdk/Contents/Home
export SBT_HOME=/Users/jutay/Downloads/sbt
export PATH=$SPARK_HOME/bin:/$JAVA_HOME/bin:/$SBT_HOME/bin:$PATH
# <<< coursier install directory <<<
And this is the sbtconfig.txt inside the sbt folder that my .bash_profile references:
# sbt configuration file for Windows
# Set the java args
#-mem 1024 was added in sbt.bat as default
#-Xms1024m
#-Xmx1024m
#-Xss4M
#-XX:ReservedCodeCacheSize=128m
# Set the extra sbt options
# -Dsbt.log.format=true

Related

Compiling a Scala program failing Due to Dependencies not found

I have installed Flink, Scala and sbt
Flink Version: 1.9.1
Scala Version: 2.10.6
Sbt Version: 1.3.7
I made relevant changes in build.sbt.
The compile command is failing.
Here is the relevant information.
Any information is greatly appreciated.
Versions Information:
[osboxes#osboxes local]$ scala -version
Scala code runner version 2.10.6 -- Copyright 2002-2013, LAMP/EPFL
[osboxes#osboxes local]$ flink --version
Version: 1.9.1, Commit ID: 4d56de8
[osboxes#osboxes readcsvfile]$ sbt -version
sbt version in this project: 1.3.7
sbt script version: 1.3.7
build.sbt changes:
val flinkVersion = "1.9.1"
val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided")
Compile Errors:
sbt:readCsvfile> compile
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.flink:flink-streaming-scala_2.13:1.9.1
[error] Not found
[error] Not found
[error] not found: /home/osboxes/.ivy2/local/org.apache.flink/flink-streaming-scala_2.13/1.9.1/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/flink/flink-streaming-scala_2.13/1.9.1/flink-streaming-scala_2.13-1.9.1.pom
[error] Error downloading org.apache.flink:flink-scala_2.13:1.9.1
[error] Not found
[error] Not found
[error] not found: /home/osboxes/.ivy2/local/org.apache.flink/flink-scala_2.13/1.9.1/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/flink/flink-scala_2.13/1.9.1/flink-scala_2.13-1.9.1.pom
[error] Total time: 4 s, completed Jan 30, 2020 3:59:12 PM
sbt:readCsvfile>
A few points I want to mention here regarding the sbt dependency issues:
Add scalaVersion := "2.12.11" to your build.sbt file like this, which, because of %%, includes the Scala version in your sbt dependencies automatically:
name := "flink-streaming-demo"
scalaVersion := "2.12.11"
val flinkVersion = "1.10.0"
libraryDependencies += "org.apache.flink" %% "flink-scala" % flinkVersion % "provided"
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided"
If you want to pin a specific Scala version in the artifact name yourself, use a single % like this:
libraryDependencies += "org.apache.flink" % "flink-scala_2.12" % flinkVersion % "provided"
libraryDependencies += "org.apache.flink" % "flink-streaming-scala_2.12" % flinkVersion % "provided"
In the worst case, if none of this works, simply delete or rename the hidden .sbt and .ivy2 folders in your home directory, where all your dependencies and plugins are stored after being downloaded from Maven Central, and then refresh/rebuild the sbt project.
SBT dependency format
libraryDependencies += groupID % artifactID % revision % configuration
Meaning of % and %%
%: A method used to construct an Ivy Module ID from the strings you supply.
%%: When used after the groupID, it automatically adds your project’s Scala version (such as _2.12) to the end of the artifact name.
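For example (the Scala and library versions below are reused from elsewhere on this page, purely for illustration):
// With scalaVersion := "2.12.11", %% appends the Scala binary version to the artifact name,
// so this resolves org.apache.flink:flink-scala_2.12:1.10.0:
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.10.0" % "provided"
// A single % uses the artifact name exactly as written (typical for plain Java libraries),
// so this resolves com.google.code.gson:gson:2.8.9:
libraryDependencies += "com.google.code.gson" % "gson" % "2.8.9"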
Summing up the comments, since it is perhaps a bit hard to know what you should do:
In general, if you get an "Unresolved dependencies" error, look at mvnrepository.com, search for your artifact:
https://mvnrepository.com/artifact/org.apache.flink/flink-scala
This tells you (second column) which Scala versions are supported by it. In this case, the library is available for 2.11.x and 2.12.x.
Thus, you have to use a Scala version compatible with that in your build, in build.sbt:
ThisBuild / scalaVersion := "2.12.10"
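Putting that together, a build.sbt for this project might look roughly like the following (the project name and exact patch versions are guesses; the key point is using a Scala version the Flink artifacts are actually published for):
// Sketch of a consistent build for Flink 1.9.1, which is published for Scala 2.11 and 2.12 only.
name := "readcsvfile"  // hypothetical, matching the sbt prompt shown above
ThisBuild / scalaVersion := "2.12.10"
val flinkVersion = "1.9.1"
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided"
)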

sbt ivy2: configuration 'master' not found from 'compile'

I am getting the following error when I build a library bp-kafka-bp2 that depends on library bp-bp2-componentized:
sbt.librarymanagement.ResolveException: unresolved dependency:
com.foo#bp-bp2-componentized_2.11;3.3.+: configuration not found in
com.foo#bp-bp2-componentized_2.11;3.3.0: 'master'. It was required
from com.foo#bp-kafka-bp2_2.11;3.10.1 compile
The unresolved library bp-bp2-componentized does in fact exist in ~/.ivy2/local and does not have a master configuration in its ivy.xml.
My questions are:
Should the dependent library (bp-kafka-bp2) be looking for a master configuration of the (not really) missing library?
If it should not be looking for a master config, what can I do to make it stop doing so?
If it should be looking for a master config, how do I induce the build for the (not really) missing library to produce such a config?
I have tried this in sbt versions 1.1.5 and 1.2.1. I have deleted ~/.ivy2/local, ~/.ivy2/cache and ~/.sbt. I have also removed all the /target and project/target directories in the library I am building and run sbt clean.
This library has built just fine for a year or two now. The only recent change I can think of is the introduction of a 2.11/2.12 cross-compilation option, which is not being exercised here; I'm just building the 2.11 version on its own.
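(For reference, that cross-compilation option looks roughly like the following; the exact patch versions here are placeholders rather than my real settings:)
// Rough sketch of the cross-building setup mentioned above; versions are illustrative only.
crossScalaVersions := Seq("2.11.12", "2.12.6")
scalaVersion := crossScalaVersions.value.head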
The direct dependency is declared in a multi-project build.sbt as
lazy val bp2 = (project in file("bp2")).
  settings(commonSettings:_*).
  settings(Seq(
    name := "bp-kafka-bp2"
  ):_*).
  settings(
    libraryDependencies ++= Seq(
      "org.scalatest" %% "scalatest" % ScalaTestVersion % "test",
      "ch.qos.logback" % "logback-classic" % LogbackVersion % "test",
      "com.foo" %% "bp-bp2-componentized" % Constellation.dependency("bp-bp2"),
      "com.foo" %% "bp-akka-http" % Constellation.dependency("bp-akka")
    )
  ).
  dependsOn(reactiveComponentized)
where Constellation.dependency is just a function that looks up a version by name and turns it into a patch range:
object Constellation {
  ...
  def dependency(name: String): String =
    versions(name).split("\\.").dropRight(1).mkString(".") + ".+"
}
You can see from the error message that the version is being found and converted to 3.3.+ which is then resolved correctly to a 3.3.0 in the ivy cache. But then it insists on finding a master configuration which is not present.
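To illustrate that conversion, a minimal self-contained version of the helper (with a made-up version map) behaves like this:
// Sketch only: the map contents are invented for illustration.
object ConstellationExample {
  private val versions = Map("bp-bp2" -> "3.3.0")
  def dependency(name: String): String =
    versions(name).split("\\.").dropRight(1).mkString(".") + ".+"
}
// ConstellationExample.dependency("bp-bp2") returns "3.3.+", the patch range seen in the error.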

sbt assembly failing due to conflicting file

I am trying to make a fat jar by running sbt assembly for my project.
I am getting the following error :
[error] (root/*:assembly) deduplicate: different file contents found in the following:
[error] /Users/xyz/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-core/jars/hadoop-mapreduce-client-core-2.2.0.jar:org/apache/hadoop/filecache/DistributedCache.class
[error] /Users/xyz/.ivy2/cache/org.apache.hadoop/hadoop-core/jars/hadoop-core-2.0.0-mr1-cdh4.7.1.jar:org/apache/hadoop/filecache/DistributedCache.class
DistributedCache of hadoop-mapreduce-client-core is deprecated now.
In my build.sbt I have included:
"org.apache.hadoop" % "hadoop-client" % "2.0.0-mr1-cdh4.7.1" excludeAll(
ExclusionRule(organization = "javax.servlet"))
The dependency is like this :
org.apache.hadoop:hadoop-client:2.2.0
org.apache.hadoop:hadoop-mapreduce-client-app:2.2.0
org.apache.hadoop:hadoop-mapreduce-client-core:2.2.0
How do I handle this?
Thanks in advance!
If you want to drop the transitive mapreduce-client jars that get pulled in when you rely on hadoop-client:2.2.0, simply add intransitive():
"org.apache.hadoop" % "hadoop-client" % "2.2.0" intransitive()
This will only include the hadoop-client:2.2.0 jar and exclude all its dependencies.
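If you actually need those transitive dependencies, another approach commonly used with sbt-assembly is to declare a merge strategy for the conflicting path instead; roughly like the sketch below (keeping the "first" copy is just one possible choice, not necessarily the right one for your build, and older plugin versions write the key as assemblyMergeStrategy in assembly):
// Sketch: resolve the DistributedCache.class conflict by keeping the first copy found.
assembly / assemblyMergeStrategy := {
  case PathList("org", "apache", "hadoop", "filecache", xs @ _*) => MergeStrategy.first
  case other =>
    val oldStrategy = (assembly / assemblyMergeStrategy).value
    oldStrategy(other)
}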

Play 2.2.2 with IntelliJ 13 & SBT 0.13 can't run - No main class detected

I'm trying to run one of the Typesafe Activator projects in IntelliJ 13.1 with the latest Play 2 and Scala plugins.
I can run the project through the Typesafe Activator with no problems, but when I open the Activator project via the build.sbt file in IntelliJ, everything seems to work until I try to run it via the Play 2 App run configuration. Then I get the following error:
"C:\Program Files\Java\jdk1.8.0\bin\java" -Dfile.encoding=UTF8 -Djline.terminal=none -Dsbt.log.noformat=true -Dsbt.global.base=C:\Users\nw\AppData\Local\Temp\sbt-global-plugin5970836074908902993stub -Xms512M -Xmx1024M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256M -classpath C:\Users\nw\.IntelliJIdea13\config\plugins\Scala\launcher\sbt-launch.jar xsbt.boot.Boot run
Getting org.fusesource.jansi jansi 1.11 ...
:: retrieving :: org.scala-sbt#boot-jansi
confs: [default]
1 artifacts copied, 0 already retrieved (111kB/37ms)
Getting org.scala-sbt sbt 0.13.0 ...
:: retrieving :: org.scala-sbt#boot-app
confs: [default]
43 artifacts copied, 0 already retrieved (12440kB/109ms)
Getting Scala 2.10.2 (for sbt)...
:: retrieving :: org.scala-sbt#boot-scala
confs: [default]
5 artifacts copied, 0 already retrieved (24390kB/62ms)
[info] Set current project to modules (in build file:/J:/DEV/TYPESAFE/reactive-maps/.idea/modules/)
java.lang.RuntimeException: No main class detected.
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run 'last compile:run' for the full output.
[error] (compile:run) No main class detected.
[error] Total time: 0 s, completed 02/04/2014 10:31:12 PM
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
Process finished with exit code 1
My build.sbt looks like this:
name := """reactive-maps"""
version := "1.0-SNAPSHOT"
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.2.1",
  "com.typesafe.akka" %% "akka-contrib" % "2.2.1",
  "com.typesafe.play.extras" %% "play-geojson" % "1.0.0",
  "org.webjars" %% "webjars-play" % "2.2.1",
  "org.webjars" % "bootstrap" % "3.0.0",
  "org.webjars" % "knockout" % "2.3.0",
  "org.webjars" % "requirejs" % "2.1.8",
  "org.webjars" % "leaflet" % "0.7.2"
)
play.Project.playScalaSettings
Just running sbt run from the command line in the project folder works fine. So there is some issue with the way the SBT project has been imported.
You can also use the play command to fix it:
cd /path/to/project
play
[project] idea with-sources=yes
Now you can right-click on Application and run it as a Play 2 application.
Just ran into the same problem - my project is set up like so:
root/
  .idea/
  MainApp/    <-- this one is a Play 2.2.2 project
    .idea/
    apps/     <-- contains other Play2 projects
I was attempting to run the MainApp project and had the same problem.
If you notice in your run error: [info] Set current project to modules (in build file:/J:/DEV/TYPESAFE/reactive-maps/.idea/modules/)
states it is setting the project to modules. You want it to be reactive-maps.
Mine did the same - my quick solution:
-- Find the file .idea/modules/reactive-maps.iml and move it to the root-level reactive-maps/ directory.
-- Open the file, Ctrl-F Ctrl-R (find-and-replace)
-- Find: $MODULE_DIR$/../.. Replace with: $MODULE_DIR$/
Running the Play2 application again worked at this point.
The ideal solution would be importing it correctly (and I'm sure there is a way to do this), but this got my application running through IntelliJ quickly (although it will likely break on restart/SBT refresh).

new to scala & sbt - trying to compile a project that imports akka.stm._

I'm new to Scala & sbt, and I'm trying to compile a project that imports akka.stm._.
When I run sbt compile, the compilation fails with a message pointing to that import.
I tried using
https://github.com/paulp/sbt-extras
so that I would have the exact sbt version defined in the "build.sbt" file, but it did not help.
I downloaded the akka-2.0.3.tgz and opened the files, but I don't understand exactly how to install them by default or how to tell sbt to use them.
I also noticed that the build.sbt file contains:
resolvers += "repo.codahale.com" at "http://repo.codahale.com"
libraryDependencies ++= Seq(
  // "com.codahale" % "simplespec_2.9.0-1" % "0.3.4"
  "com.codahale" % "simplespec_2.9.0-1" % "0.4.1"
  // "se.scalablesolutions.akka" %% "akka-sbt-plugin" % "2.0-SNAPSHOT"
I tried uncommenting the "se.scalablesolutions.akka" line (assuming the original programmer used that version of the Akka library), but it only printed the message:
[error] Error parsing expression. Ensure that there are no blank lines within a setting.
(There are no blank lines, I just deleted the '//' and replaced the double '%' with a single one)
How do I tell sbt to find the Akka jar files in their place? Can I add another resolver to solve this problem?
I know this kind of question doesn't really fit on Stack Overflow, but can you at least refer me to some manuals I should read in order to solve this?
OK, first of all I want to apologize for the newbie question.
(Stackoverflow should make a separate "newbie" section)
First, the elements in the Seq part should be separated by commas.
Second, the akka snapshots were moved to http://repo.akka.io/snapshots so I fixed the build.sbt to:
resolvers += "repo.codahale.com" at "http://repo.codahale.com"
resolvers += "akka" at "http://repo.akka.io/snapshots"
libraryDependencies ++= Seq(
  // "com.codahale" % "simplespec_2.9.0-1" % "0.3.4"
  "com.codahale" % "simplespec_2.9.0-1" % "0.4.1",
  "com.typesafe.akka" % "akka-stm" % "2.0-SNAPSHOT"
)
And the compilation finished successfully.
I don't know if this is the exact configuration in which the original compilation was done, but this issue doesn't disturb me at the moment.