In my situation, I cannot access the internet from my office Linux server, so I installed sbt on my personal computer. I followed the instructions on the official sbt website and ran/compiled a "hello world" project to fetch all the dependency data.
Then I copied the ~/.ivy2, ~/.sbt, and sbt bin folders to my offline office desktop. But when I run sbt, I get these errors:
[info] welcome to sbt 1.4.9 (Oracle Corporation Java 1.8.0_25)
[info] loading project definition from /proj/mtk07847/test/scala/project
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.scala-lang:scala-library:2.12.12
[error] Not found
[error] Not found
[error] download error: Caught java.net.SocketException: Connection reset (Connection reset) while downloading https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.12.12/scala-library-2.12.12.pom
[error] not found: ~/.ivy2/localorg.scala-lang/scala-library/2.12.12/ivys/ivy.xml
...
It seems that sbt tried to download the missing scala-library:2.12.12, which I had already prepared. In my ~/.ivy2 folder, the structure and paths look like the following:
~/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.12.jar
~/.ivy2/cache/org.scala-lang/scala-library/ivy-2.12.12.xml
~/.ivy2/cache/org.scala-lang/scala-library/ivy-2.12.12.xml.original
~/.ivy2/cache/org.scala-lang/scala-library/ivydata-2.12.12.properties
which is different from the path in the messages above:
[error] not found: ~/.ivy2/localorg.scala-lang/scala-library/2.12.12/ivys/ivy.xml
Should I modify some descriptor files to tell sbt the search path or resolution rules? Or is this method totally wrong? (If so, can anyone who knows the flow guide me to a better way?)
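For reference, the ivys/ivy.xml pattern in the error comes from an Ivy-style resolver; one can be declared explicitly in build.sbt as in the sketch below (the path and patterns here are assumptions to illustrate the mechanism, not a confirmed fix for this offline setup):
// build.sbt -- sketch: declare a local Ivy-style resolver explicitly
resolvers += Resolver.file(
  "local-ivy",
  file(sys.props("user.home")) / ".ivy2" / "local"
)(Resolver.ivyStylePatterns)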
I keep getting the following error from my GitHub Actions workflow:
[info] welcome to sbt 1.7.1 (Eclipse Adoptium Java 11.0.16.1)
[info] loading settings for project plant-simulator-build from plugins.sbt ...
[info] loading project definition from /home/runner/work/plant-simulator/plant-simulator/project
[warn]
[warn] Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes.
[warn] org.scoverage:sbt-scoverage:2.0.7 (sbtVersion=1.0, scalaVersion=2.12)
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.scoverage:sbt-scoverage;sbtVersion=1.0;scalaVersion=2.12:2.0.7
[error] Not found
[error] Not found
[error] not found: https://repo1.maven.org/maven2/org/scoverage/sbt-scoverage_2.12_1.0/2.0.7/sbt-scoverage-2.0.7.pom
[error] not found: /home/runner/.ivy2/localorg.scoverage/sbt-scoverage/scala_2.12/sbt_1.0/2.0.7/ivys/ivy.xml
[error] not found: https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.scoverage/sbt-scoverage/scala_2.12/sbt_1.0/2.0.7/ivys/ivy.xml
[error] not found: https://repo.typesafe.com/typesafe/ivy-releases/org.scoverage/sbt-scoverage/scala_2.12/sbt_1.0/2.0.7/ivys/ivy.xml
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:345)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$38(CoursierDependencyResolution.scala:314)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:314)
I have the following defined in my project/plugins.sbt file:
// For code coverage test
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "2.0.7")
My question is, why is it taking the 2.12_1.0 scoverage version instead of 2.12.17_2.0.7? This is ruining my build. Any ideas on how to fix this?
My question is, why is it taking the 2.12_1.0 scoverage version instead of 2.12.17_2.0.7?
It's not. It is trying to find version 2.0.7 of the plugin. The 2.12 refers to the Scala version of plugins expected by sbt (which can differ from your project's Scala version), and 1.0 refers to the sbt major version.
The error message is relatively clear:
Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes.
org.scoverage:sbt-scoverage:2.0.7 (sbtVersion=1.0, scalaVersion=2.12)
This is ruining my build. Any ideas on how to fix this?
There's no version 2.0.7 as of today; the latest is 2.0.5. Check the GitHub page of the plugin for reference: https://github.com/scoverage/sbt-scoverage.
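So pinning the plugin to a release that actually exists in project/plugins.sbt should let the build resolve; for example (2.0.5 being the latest at the time of this answer):
// project/plugins.sbt
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "2.0.5")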
I downloaded the Play Framework from here:
https://www.playframework.com/download
And I chose the Offline Distribution download.
I set the PATH environment variable and added this:
C:\Framework\activator-dist-1.3.10\bin
While trying to create a new project, I ran "activator new" in cmd and it worked.
Then I ran activator eclipse, but I got these errors:
C:\Users\***\Desktop\***\FirstPlayProject>activator eclipse
ACTIVATOR_HOME=C:\Framework\activator-dist-1.3.10
The system cannot find the file BIN_DIRECTORY\..\conf\sbtconfig.txt.
[info] Loading project definition from C:\Users\***\Desktop\·δ≡σ·\FirstPlayProject\project
[info] Updating {file:/C:/Users/***/Desktop/%D7%AA%D7%9B%D7%A0%D7%95%D7%AA/FirstPlayProject/project/}firstplayproject-build...
[info] Resolving com.typesafe.akka#akka-persistence-experimental_2.10;2.3.11 ...
[info] Resolving com.fasterxml.jackson.datatype#jackson-datatype-jsr310;2.5.4 ..
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-2cc8d2761242b072cedb0a04cb39435
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Set current project to FirstPlayProject (in build file:/C:/Users/***/Desktop/%D7%AA%D7%9B%D7%A0%D7%95%D7%AA/FirstPlayProject/)
[error] Not a valid command: eclipse (similar: help, alias)
[error] Not a valid project ID: eclipse
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: eclipse (similar: deliver, licenses, clean)
[error] eclipse
[error] ^
Maybe it's because of this error: The system cannot find the file BIN_DIRECTORY\..\conf\sbtconfig.txt.
You will need to add sbteclipse for that to work. In your project/plugins.sbt file, add the following:
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
Then compile your app first (run compile in the activator shell). After that, you can use the eclipse command in the activator shell.
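A typical session would then look something like this (a sketch; the prompt reflects the project name from the question):
$ activator
[FirstPlayProject] $ compile
[FirstPlayProject] $ eclipse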
More info: https://www.playframework.com/documentation/2.5.x/IDE#Eclipse
I am trying to follow the code from the link below:
http://spark.apache.org/docs/latest/quick-start.html
But when I try to create a package, it fails. I want to know two things:
1. obviously, why is it failing?
2. why is it showing an older version of Scala, when I specified 2.11?
Below is the error message.
[info] Set current project to default-0464ce (in build file:/home/ubuntu/simple_sbt/)
[info] Updating {file:/home/ubuntu/simple_sbt/}default-0464ce...
[info] Resolving org.scala-lang#scala-library;2.9.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/ubuntu/simple_sbt/target/scala-2.9.1/classes...
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:1: object apache is not a member of package org
[error] import org.apache.spark.SparkContext
[error] ^
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:2: object apache is not a member of package org
[error] import org.apache.spark.SparkContext._
[error] ^
[error] two errors found
[error] {file:/home/ubuntu/simple_sbt/}default-0464ce/compile:compile: Compilation failed
[error] Total time: 2 s, completed Aug 30, 2016 3:19:18 AM
When you run sbt package, it sometimes fails because the dependencies for the imported files have not yet been downloaded and resolved.
Try running sbt run first and then sbt package. sbt run should bring in all the dependencies, on top of which packaging and compiling will be possible.
If the above does not solve your problem, you need to share your sbt build file and the environment you are using. The directory from which you run these commands also plays a role.
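For reference, a minimal build.sbt for the quick-start app might look like the sketch below; the versions are assumptions, so match them to your installation:
// build.sbt -- minimal sketch for the Spark quick-start app
name := "simple-app"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
Without an explicit scalaVersion, sbt falls back to its own default, which is why the log above resolves scala-library;2.9.1 and compiles into target/scala-2.9.1.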
I am building Spark using sbt. When I run the following command:
sbt/sbt assembly
it takes some time to build Spark. Several warnings appear, and at the end I get the following errors:
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.
When I check the sbt version using the command sbt sbtVersion, I get the following result:
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn] * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info] 0.13.7
[info] repl/*:sbtVersion
[info] 0.13.7
[info] spark/*:sbtVersion
[info] 0.13.7
When I run ./bin/spark-shell, I get the following output:
ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.
What could the solution be?
You must configure the sbt heap size:
on Linux, type export SBT_OPTS="-Xmx2G" to set it temporarily
on Linux, edit ~/.bash_profile and add the line export SBT_OPTS="-Xmx2G" to set it permanently
on Windows, type set JAVA_OPTS=-Xmx2G to set it temporarily
on Windows, edit sbt\conf\sbtconfig.txt and set -Xmx2G to set it permanently (see the sketch below)
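For the Windows file edit, the relevant line in sbt\conf\sbtconfig.txt would look something like this (a sketch; the other options shipped in that file vary by sbt version):
# sbtconfig.txt -- raise the maximum heap size for sbt
-Xmx2G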
More info:
http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Setup.html
How to set heap size for sbt?
This is probably not a common resolution, but in my case I had to run this command to resolve the OutOfMemoryError when building a Spark project with sbt (the path is specific to macOS):
rm -rf /Users/markus.braasch/Library/Caches/Coursier/v1/https/
Increasing memory settings for a variety of arguments in SBT_OPTS did not solve it.
I'm having trouble installing Apache Spark on Ubuntu 13.04. I'm using spark-0.8.1-incubating, and both ./sbt/sbt update and ./sbt/sbt compile work fine. However, when I do ./sbt/sbt assembly, I get the following error:
[info] Set current project to default-289e76 (in build file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
I googled for stuff related to this but couldn't find anything useful. Any guidance would be much appreciated.
The "Set current project to default-289e76" message suggests that sbt was called from outside of the Spark sources directory:
$ /tmp ./spark-0.8.1-incubating/sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Set current project to default-d0f036 (in build file:/private/tmp/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
Running ./sbt/sbt assembly works fine from the spark-0.8.1-incubating directory (note the log output showing that the current project was set correctly):
$ spark-0.8.1-incubating sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project/project
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project
[info] Set current project to root (in build file:/private/tmp/spark-0.8.1-incubating/)
...
You typed "abt" twice, but shouldn't that be "sbt"? Apache Spark has its own copy of sbt, so make sure you're running Spark's version to pick up the "assembly" plugin among other customizations.
To run the Spark installation of sbt, go to the Spark directory and run ./sbt/sbt assembly.
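In other words, something like this (the path is an assumption; use whatever directory you extracted Spark into):
$ cd ~/spark-0.8.1-incubating
$ ./sbt/sbt assembly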