SBT not finding resolvers added in another sbt file

I have an sbt project where I added the resolvers in build.sbt. Now, instead of adding the resolvers in the build.sbt file, I am trying to put them in a new file, resolvers.sbt.
But sbt is unable to find the artifacts when the resolvers are in the separate file, even though I can see from the startup messages that my resolvers.sbt is being considered.
If I add the resolvers to the global file in the .sbt directory, the artifacts are resolved.
sbt version: 1.2.6
Has anyone else faced the same issue?
build.sbt
import sbt.Credentials
name := "sbt-sample"
version := "0.1"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "com.reactore" %% "reactore-infra" % "1.0.0.0-DEV-SNAPSHOT"
)
resolvers.sbt
credentials += Credentials("Artifactory Realm", "192.168.1.120", "yadu", "password")
resolvers ++= Seq(
  "Reactore-snapshots" at "http://192.168.1.120:8182/artifactory/libs-snapshot-local"
)
SBT Log
sbt:sbt-sample> reload
[info] Loading settings for project global-plugins from idea.sbt ...
[info] Loading global plugins from /home/administrator/.sbt/1.0/plugins
[info] Loading settings for project sbt-sample-build from resolvers.sbt ...
[info] Loading project definition from /home/administrator/source/poc/sbt-sample/project
[info] Loading settings for project sbt-sample from build.sbt ...
[info] Set current project to sbt-sample (in build file:/home/administrator/source/poc/sbt-sample/)

I am answering my own question here, hoping it will be useful for someone else.
I found the issue: I had put resolvers.sbt inside the project directory. I moved it to the project's base directory (where build.sbt is present), and now the artifacts resolve.
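For reference, a sketch of the working layout (paths taken from the question). The key point is that .sbt files under project/ belong to the build definition itself, which is why the log above shows resolvers.sbt being loaded for sbt-sample-build rather than sbt-sample:
sbt-sample/
├── build.sbt
├── resolvers.sbt    <- correct location, next to build.sbt
└── project/         <- .sbt files here configure the meta-build, not your project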

Related

Using IntelliJ, how to add dependency in an sbt project

I'm new to sbt and I wanted to learn it with a small Scala project in IntelliJ.
I started with the official sbt Getting Started guide, to learn the sbt basics on the console (https://www.scala-sbt.org/1.x/docs/sbt-by-example.html). Following the guide, everything compiles fine.
Then I created an sbt project in IntelliJ, trying to do the same thing there. When I add the org.scalatest dependency to the build.sbt file, the project no longer compiles. The error message is:
module not found: org.scalatest#scalatest_2.13;3.0.5
When I created the fresh sbt project in IntelliJ, first the build.sbt looked something like this:
name := "sbtTest"
version := "1.0"
scalaVersion := "2.13.0"
Then I added the dependency:
name := "sbtTest"
version := "1.0"
scalaVersion := "2.13.0"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test
After reloading the build file and getting the error when compiling, I tried to change the build.sbt according to the code that had already worked in the sbt Getting Started guide:
ThisBuild / scalaVersion := "2.13.0"
ThisBuild / organization := "me"
ThisBuild / version := "1.0"
lazy val sbtTest = (project in file("."))
  .settings(
    name := "sbtTest",
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test
  )
This again produces the same error after reloading and compiling.
My sbt version is 1.2.8.
Is something wrong with my build.sbt? Or is the sbt version too new for IntelliJ? Is IntelliJ perhaps not the recommended IDE for creating Scala sbt projects?
The structure of both of your build.sbt files is OK. The problem is the versions.
Scalatest 3.0.5 is not available for Scala 2.13.0; it is only available for Scala 2.13.0-M2.
Scalatest 3.0.8 is the release that is available for Scala 2.13.0:
https://mvnrepository.com/artifact/org.scalatest/scalatest
After you fix the versions in build.sbt, re-import the IntelliJ project via the pop-up that appears, the sbt tool window, or the sbt shell.
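Concretely, based on the availability noted above, the only line that needs to change in the original build.sbt is the Scalatest version:
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % Test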

Modifying and Building Spark core

I am trying to make a modification to the Apache Spark source code. I created a new method and added it to the RDD.scala file within the Spark source code I downloaded. After making the modification to RDD.scala, I built Spark using
mvn -Dhadoop.version=2.2.0 -DskipTests clean package
I then created a sample Scala Spark Application as mentioned here
I tried using the new function I created, and I got a compilation error when using sbt to create a jar for my application. How exactly do I compile Spark with my modification and attach the modified jar to my project? The file I modified is RDD.scala within the core project. I run sbt package from the root directory of my Spark application project.
Here is the sbt file:
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.3.0"
Here is the error:
sbt package
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
[info] Set current project to Noah Spark (in build file:/Users/r/Downloads/spark-proj/n-spark/)
[info] Updating {file:/Users/r/Downloads/spark-proj/n-spark/}n-spark...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/r/Downloads/spark-proj/n-spark/target/scala-2.11/classes...
[error] /Users/r/Downloads/spark-proj/n-spark/src/main/scala/SimpleApp.scala:11: value reducePrime is not a member of org.apache.spark.rdd.RDD[Int]
[error] logData.reducePrime(_+_);
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 24 s, completed Apr 11, 2015 2:24:03 AM
UPDATE
Here is the updated sbt file
name := "N Spark"
version := "1.0"
scalaVersion := "2.10"
libraryDependencies += "org.apache.spark" % "1.3.0"
I get the following error for this file:
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
/Users/Raggy/Downloads/spark-proj/noah-spark/simple.sbt:7: error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.impl.GroupArtifactID] found,
so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += "org.apache.spark" % "1.3.0"
Delete libraryDependencies from build.sbt and just copy the custom-built Spark jar to the lib directory in your application project.
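A sketch of that setup, assuming the Maven build left the modified jar somewhere like core/target/spark-core_2.11-1.3.0.jar in your Spark checkout (the exact path is an assumption): copy that jar into a lib/ directory at the root of the application project, and trim build.sbt to:
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"
// No spark-core entry in libraryDependencies is needed:
// sbt picks up any jar under lib/ as an unmanaged dependency.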

SBT cannot find snapshots in an Artifactory maven repository

I'm just starting out trying to set up a workflow with Scala and sbt, and I'm having trouble with my repository. I am trying to publish a simple test library, composed of two projects, and use it from another program.
My source library's build contains the following:
val sharedSettings = Seq(
  name := "test-lib",
  organization := "com.example",
  version := "0.1-SNAPSHOT",
  scalaVersion := "2.11.0",
  publishTo := Some("Artifactory Realm" at "http://localhost:8081/artifactory/libs-snapshot-local"),
  publishMavenStyle := true,
  credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")
)
lazy val root = project.in(file(".")).settings(sharedSettings: _*).aggregate(child1, child2)
lazy val sharedCode = project.settings(sharedSettings: _*)
val child1Settings = sharedSettings ++ Seq(unmanagedSourceDirectories in Compile <++= (unmanagedSourceDirectories in sharedCode) in Compile)
val child2Settings = sharedSettings ++ Seq(unmanagedSourceDirectories in Compile <++= (unmanagedSourceDirectories in sharedCode) in Compile)
lazy val child1 = project.settings(child1Settings: _*)
lazy val child2 = project.settings(child2Settings: _*)
I can run sbt publish okay, and it creates the directory com/example/test-lib/XXX in the repo.
In my test program, I have the following:
scalaVersion := "2.11.0",
resolvers += "Artifactory Realm" at "http://localhost:8081/artifactory/libs-snapshot-local",
libraryDependencies += "com.example" %% "test-lib" % "0.1-SNAPSHOT"
When the test program attempts to compile, it cannot resolve com.example, because of the following:
[warn] ==== Artifactory Realm: tried
[warn] http://localhost:8081/artifactory/libs-snapshot-local/com/example/test-lib_2.11/0.1-SNAPSHOT/test-lib_2.11-0.1-SNAPSHOT.pom
Looking at the repository directory itself, I am getting an additional timestamp on my pom files:
test-lib_2.11-0.1-20140510.183027-1.pom 10-May-2014 19:30 793 bytes
test-lib_2.11-0.1-20140510.183027-2.pom 10-May-2014 19:30 793 bytes
...
test-lib_2.11-0.1-20140510.183121-9.pom 10-May-2014 19:31 793 bytes
maven-metadata.xml in the directory references these correctly, but sbt looks directly for a pom file without a timestamp and cannot find it. The pom files themselves contain the correct information.
What am I doing wrong?
The issue was not with my sbt configuration after all, but with my repository server.
I'm using Artifactory, and the snapshot repository was configured to use "unique snapshots" by default. The filenames of these snapshots are modified to include a timestamp as they are published, which sbt 0.13.x doesn't seem to understand.
After changing the repository's "Maven Snapshot Version Behaviour" from "Unique" to "Nonunique", everything started to work.
Actually, the inconsistency between the timestamp and build-number suffix in maven-metadata.xml and the one in the jar/pom files generated by sbt publish leads to this error.
Adding the sbt-maven-resolver plugin on the deployment side keeps the suffixes the same; once the timestamp and build suffix match, both sbt and Maven can find the snapshots. In project/plugins.sbt:
addSbtPlugin("org.scala-sbt" % "sbt-maven-resolver" % "0.1.0")
Hope this solves your case.

Resolution failure for plugin when added to libraryDependencies in project?

When I run sbt publishLocal, the plugin is published to <ivy-repository>/<org>/<plugin>/<scala-version>/<sbt-version>/<plugin-version>/...
For example:
[info] published sbt-cloudengine to /Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1/jars/sbt-cloudengine.jar
How can I exclude <scala-version> and <sbt-version> from the output path?
This path causes a resolution failure when I add the plugin as a dependency in build.sbt:
[warn] ==== Local Ivy Repository: tried
[warn] file:///Users/hanxue/.ivy2/local/net/entrypass/sbt-cloudengine/0.2.1/sbt-cloudengine-0.2.1.pom
Plugin's build.sbt is:
sbtPlugin := true
name := "sbt-cloudengine"
organization := "net.entrypass"
version := "0.2.1"
description := "sbt plugin for managing Google Cloud Engine resources"
licenses := Seq("BSD License" -> url("https://github.com/hanxue/sbt-cloudengine/blob/master/LICENSE"))
scalacOptions := Seq("-deprecation", "-unchecked")
publishArtifact in (Compile, packageBin) := true
publishArtifact in (Test, packageBin) := false
publishArtifact in (Compile, packageDoc) := false
publishArtifact in (Compile, packageSrc) := false
publishMavenStyle := false
Update 1
This is how the plugin is referenced in a project's <rootdir>/build.sbt
resolvers += "Local Ivy Repository" at "file://"+Path.userHome.absolutePath+"/.ivy2/local"
libraryDependencies ++= Seq(
  "net.entrypass" % "sbt-cloudengine" % "0.2.1"
)
This is the directory listing
$ ls -R ~/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1/
ivys jars poms
/Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1//ivys:
ivy.xml ivy.xml.md5 ivy.xml.sha1
/Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1//jars:
sbt-cloudengine.jar sbt-cloudengine.jar.sha1
sbt-cloudengine.jar.md5
/Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1//poms:
sbt-cloudengine.pom sbt-cloudengine.pom.sha1
sbt-cloudengine.pom.md5
Since you're publishing an sbt plugin and not a library, the path will correctly contain the sbt version and the Scala version.
Your problem comes from the fact that you're trying to load the plugin via libraryDependencies. Instead, you have to reference it in the file project/plugins.sbt with the following line:
addSbtPlugin("net.entrypass" % "sbt-cloudengine" % "0.2.1")
This way, sbt searches for the plugin under the correct path, derived from the current Scala and sbt versions.

Why doesn't an sbt resolver defined with the Resolver method work?

I've installed jarjar:jarjar:1.0 in my local Maven repo, and my build.sbt is:
name := "test"
version := "1.0"
libraryDependencies += "jarjar" % "jarjar" % "1.0"
resolvers += Resolver.file("maven-l", file("/Volumes/Data/Repo/Maven"))
It says jarjar cannot be resolved, and show resolvers gives:
> show resolvers
[info] List(FileRepository(maven-l,FileConfiguration(true,None),sbt.Patterns#30ea6dbc))
This does not look right.
Using resolvers += "maven-l" at "/Volumes/Data/Repo/Maven" instead works just fine:
> show resolvers
[info] List(maven-l: /Volumes/Data/Repo/Maven)
I'm wondering why that is. Is this a bug, or specified behavior?
I'm using sbt 0.12.1
Reference: http://www.scala-sbt.org/release/docs/Detailed-Topics/Resolvers.html
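For what it's worth, the likely cause is that Resolver.file called with only a name and a directory falls back to sbt's default Ivy-style patterns (the sbt.Patterns#30ea6dbc in the show resolvers output above), while "name" at "path" creates a Maven-style resolver. Assuming the repository uses a standard Maven layout, passing Maven-style patterns explicitly should make the Resolver.file form work too:
resolvers += Resolver.file("maven-l", file("/Volumes/Data/Repo/Maven"))(Resolver.mavenStylePatterns)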