How to have an SBT subproject depend on the root project? - scala

I'm trying to use mdoc in a simple SBT single-project build (link). mdoc requires me to create a subproject for the documentation, but I'd like to avoid moving all the code to a subfolder. I was trying to create a docs subproject that depends on the root project:
lazy val core = project.in(file("."))
lazy val docs = project.in(file("docs")).dependsOn(core)
However, this makes SBT try to find a JAR for my root project (obviously not finding it):
(sbt) core ❯ docs/compile
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last docs / update for the full output
[error] (docs / update) sbt.librarymanagement.ResolveException: Error downloading net.ruippeixotog:akka-testkit-specs2_2.12:0.3.0-SNAPSHOT
[error] Not found
[error] Not found
[error] not found: /Users/rui/.ivy2/local/net.ruippeixotog/akka-testkit-specs2_2.12/0.3.0-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/net/ruippeixotog/akka-testkit-specs2_2.12/0.3.0-SNAPSHOT/akka-testkit-specs2_2.12-0.3.0-SNAPSHOT.pom
[error] Total time: 0 s, completed Dec 28, 2019 11:27:50 PM
With other subproject dependencies (e.g. if I make the core project point to another folder), SBT adds a direct classpath dependency on the subproject's target/ folder. Why is the root project handled differently? Is there another way to make this work?
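For context, the layout mdoc's documentation recommends looks roughly like this (names illustrative), with the library code moved into its own subfolder, which is exactly what I'd like to avoid:
lazy val myproject = project // library code lives in myproject/
lazy val docs = project
  .in(file("myproject-docs"))
  .dependsOn(myproject)
  .enablePlugins(MdocPlugin)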

Related

Scala SBT Plugin Version Error During Build

I keep getting the following error from my GitHub Actions workflow:
[info] welcome to sbt 1.7.1 (Eclipse Adoptium Java 11.0.16.1)
[info] loading settings for project plant-simulator-build from plugins.sbt ...
[info] loading project definition from /home/runner/work/plant-simulator/plant-simulator/project
[warn]
[warn] Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes.
[warn] org.scoverage:sbt-scoverage:2.0.7 (sbtVersion=1.0, scalaVersion=2.12)
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.scoverage:sbt-scoverage;sbtVersion=1.0;scalaVersion=2.12:2.0.7
[error] Not found
[error] Not found
[error] not found: https://repo1.maven.org/maven2/org/scoverage/sbt-scoverage_2.12_1.0/2.0.7/sbt-scoverage-2.0.7.pom
[error] not found: /home/runner/.ivy2/localorg.scoverage/sbt-scoverage/scala_2.12/sbt_1.0/2.0.7/ivys/ivy.xml
[error] not found: https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.scoverage/sbt-scoverage/scala_2.12/sbt_1.0/2.0.7/ivys/ivy.xml
[error] not found: https://repo.typesafe.com/typesafe/ivy-releases/org.scoverage/sbt-scoverage/scala_2.12/sbt_1.0/2.0.7/ivys/ivy.xml
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:345)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$38(CoursierDependencyResolution.scala:314)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:314)
I have the following defined in my project/plugins.sbt file:
// For code coverage test
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "2.0.7")
My question is, why is it taking the 2.12_1.0 scoverage version instead of 2.12.17_2.0.7? This is ruining my build. Any ideas on how to fix this?
My question is, why is it taking the 2.12_1.0 scoverage version instead of 2.12.17_2.0.7?
It's not. It is trying to find version 2.0.7 of the plugin. The 2.12 refers to the Scala version used by SBT plugins (different from your project's Scala version), and 1.0 refers to the SBT major version.
The error message is relatively clear:
Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes.
org.scoverage:sbt-scoverage:2.0.7 (sbtVersion=1.0, scalaVersion=2.12)
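Those two attributes are exactly what addSbtPlugin attaches for you; the line in your plugins.sbt is roughly equivalent to this sketch:
libraryDependencies += Defaults.sbtPluginExtra(
  "org.scoverage" % "sbt-scoverage" % "2.0.7", // the module you requested
  "1.0",  // sbt binary version
  "2.12"  // Scala binary version that sbt 1.x plugins are built against
)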
This is ruining my build. Any ideas on how to fix this?
There's no version 2.0.7 as of today; the latest is 2.0.5. Check out the plugin's GitHub page for reference: https://github.com/scoverage/sbt-scoverage.
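If you just need the build to resolve, pin the plugin to a published version in project/plugins.sbt, for example:
// For code coverage test (2.0.5 is the latest published version as of this answer)
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "2.0.5")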

How to correctly prepare and set up preloaded sbt libraries

In my situation, I cannot access the internet from my office Linux server, so I installed sbt on my personal computer. I followed the instructions on the official sbt website and ran/compiled a "hello world" to fetch all the dependency data.
Then I copied the ~/.ivy2, ~/.sbt, and sbt bin folders to my offline office desktop. But when I run sbt, I get these errors:
[info] welcome to sbt 1.4.9 (Oracle Corporation Java 1.8.0_25)
[info] loading project definition from /proj/mtk07847/test/scala/project
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.scala-lang:scala-library:2.12.12
[error] Not found
[error] Not found
[error] download error: Caught java.net.SocketException: Connection reset (Connection reset) while downloading https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.12.12/scala-library-2.12.12.pom
[error] not found: ~/.ivy2/localorg.scala-lang/scala-library/2.12.12/ivys/ivy.xml
...
It seems that sbt tried to download the missing scala-library:2.12.12, which I had already prepared.
In my ~/.ivy2 folder, the structure and paths look like the following:
~/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.12.jar
~/.ivy2/cache/org.scala-lang/scala-library/ivy-2.12.12.xml
~/.ivy2/cache/org.scala-lang/scala-library/ivy-2.12.12.xml.original
~/.ivy2/cache/org.scala-lang/scala-library/ivydata-2.12.12.properties
which is different from the path in the messages above:
[error] not found: ~/.ivy2/localorg.scala-lang/scala-library/2.12.12/ivys/ivy.xml
Should I modify any descriptor files to tell sbt the search path or resolution rules? Or is this method totally wrong (if so, can anyone who knows the flow guide me to a better way)?
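One mechanism worth trying here (a sketch, not a confirmed fix): the sbt launcher can be restricted to pre-seeded local repositories through a ~/.sbt/repositories file. Note also that sbt 1.4 resolves with coursier by default, so its cache under ~/.cache/coursier may need to be copied over as well, not just ~/.ivy2. The file would look like:
[repositories]
  local
  maven-local
Here local is the predefined ~/.ivy2/local repository and maven-local is ~/.m2/repository. Then launch sbt so this file takes precedence over any resolvers defined in builds:
sbt -Dsbt.override.build.repos=true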

Scala SBT compile failing

I am trying to follow the code from the link below:
http://spark.apache.org/docs/latest/quick-start.html
But when I try to create the package, it fails. I want to know two things:
obviously, why it is failing
why it is showing an older version of Scala, when I specified 2.11
Below is the error message.
[info] Set current project to default-0464ce (in build file:/home/ubuntu/simple_sbt/)
[info] Updating {file:/home/ubuntu/simple_sbt/}default-0464ce...
[info] Resolving org.scala-lang#scala-library;2.9.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/ubuntu/simple_sbt/target/scala-2.9.1/classes...
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:1: object apache is not a member of package org
[error] import org.apache.spark.SparkContext
[error] ^
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:2: object apache is not a member of package org
[error] import org.apache.spark.SparkContext._
[error] ^
[error] two errors found
[error] {file:/home/ubuntu/simple_sbt/}default-0464ce/compile:compile: Compilation failed
[error] Total time: 2 s, completed Aug 30, 2016 3:19:18 AM
When you run sbt package, it sometimes fails because the dependencies for the imported files have not been downloaded and resolved yet.
Try running sbt run first and then sbt package; sbt run should bring in all the dependencies, on top of which packaging and compiling become possible.
If the above does not solve your problem, you need to share your sbt build file and the environment you are using. The directory in which you run these commands also plays a role.
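It is also worth checking that a build.sbt actually exists in the directory where you run sbt: the default-0464ce project name and the scala-2.9.1 path in your log are what old sbt releases print when they find no build definition and fall back to their built-in defaults, which would explain the older Scala version. A minimal build.sbt for the quick-start looks roughly like this (versions illustrative; match them to your Spark installation):
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"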

Build.scala:1: not found: object sbt

I am using sbt version 1.0
$ sbt version
[info] Loading project definition from /Users/harit/code/learningScala/project
[info] Set current project to learningScala (in build file:/Users/harit/code/learningScala/)
[info] 1.0
I am using IntelliJ IDEA v14.1.3 for my project, and the structure looks like this: [screenshot of project structure omitted]. As you can see, the project was not able to resolve Build. When I try to run sbt on the command line, I see:
$ sbt
[info] Loading project definition from /Users/harit/code/learningScala/project
[info] Set current project to learningScala (in build file:/Users/harit/code/learningScala/)
> compile
[info] Updating {file:/Users/harit/code/learningScala/}learningscala...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/harit/code/learningScala/target/scala-2.11/classes...
[error] /Users/harit/code/learningScala/Build.scala:1: not found: object sbt
[error] import sbt.Build
[error] ^
[error] /Users/harit/code/learningScala/Build.scala:3: not found: type Build
[error] object MyBuild extends Build {
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed May 28, 2015 8:10:37 PM
>
I am very new to Scala and sbt, so I have no idea what is going wrong.
The MyBuild.scala file was at the root. It should be inside the project folder. I made that change and now it works. Thanks to tpolecat on IRC, who helped me with this:
> compile
[success] Total time: 0 s, completed May 28, 2015 8:20:57 PM
> compile
[info] Updating {file:/Users/harit/code/learningScala/}learningscala...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[success] Total time: 0 s, completed May 28, 2015 8:21:22 PM
>
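For anyone hitting the same error: .scala build definition files belong under the project/ folder, because only code there gets the sbt API on its classpath. A minimal sketch in the old Build-trait style used here (deprecated in late 0.13.x and removed in sbt 1.x) would be:
import sbt._

// project/Build.scala -- only code under project/ can see the sbt API
object MyBuild extends Build {
  lazy val root = Project(id = "learningScala", base = file("."))
}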
I have a multi-project sbt build, and two of the projects were declaring the same vals in their build.sbt files. I moved the duplicated vals into the root project's Build.scala as defs, and the error stopped. Found the answer here: https://github.com/sbt/sbt/issues/1465
Example: val samza_gid = "org.apache.samza" in a build.sbt file became
def samza_gid = "org.apache.samza" in the Build.scala file.

"./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project

I'm having trouble installing Apache Spark on Ubuntu 13.04. I'm using spark-0.8.1-incubating, and both ./sbt/sbt update and ./sbt/sbt compile work fine. However, when I do ./sbt/sbt assembly I get the following error:
[info] Set current project to default-289e76 (in build file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error]
I googled for stuff related to this but couldn't find anything useful. Any guidance would be much appreciated.
The "Set current project to default-289e76" message suggests that sbt was invoked from outside the Spark sources directory:
$ /tmp ./spark-0.8.1-incubating/sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Set current project to default-d0f036 (in build file:/private/tmp/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
Running ./sbt/sbt assembly works fine from the spark-0.8.1-incubating directory (note the log output showing that the current project was set correctly):
$ spark-0.8.1-incubating sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project/project
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project
[info] Set current project to root (in build file:/private/tmp/spark-0.8.1-incubating/)
...
You typed "abt" twice, but shouldn't that be "sbt"? Apache Spark has its own copy of sbt, so make sure you're running Spark's version to pick up the "assembly" plugin among other customizations.
To run the Spark installation of sbt, go to the Spark directory and run ./sbt/sbt assembly.
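As a side note, assembly is not a built-in sbt command; it comes from the sbt-assembly plugin, which Spark's bundled sbt wires up in its own project/ definition. In a standalone project you would have to add it yourself in project/plugins.sbt, something like (version illustrative):
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")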