sbt fails to execute JUnit test - scala

I'm trying to execute JUnit tests with:
sbt test
sbt clean test
The test run fails with this error:
[info] Updating {file:/D:/sbt_projects/hello-word-ec/}hello-word-ec...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 2 Scala sources to D:\sbt_projects\hello-word-ec\target\scala-2.10\test-classes...
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test1.scala:1: object junit is not a member of package org
[error] import org.junit.Test
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test1.scala:3: not found: type Test
[error] @Test
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:1: object junit is not a member of package org
[error] import org.junit.Test
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:2: object junit is not a member of package org
[error] import org.junit.BeforeClass
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:6: not found: type BeforeClass
[error] @BeforeClass
[error] ^
[error] D:\sbt_projects\hello-word-ec\src\test\scala\Test2.scala:12: not found: type Test
[error] @Test
[error] ^
[error] 6 errors found
[error] (test:compileIncremental) Compilation failed
[error] Total time: 1 s, completed Jul 28, 2016 6:10:04 PM
My build.sbt file content:
name := "sbt junit test project"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += "com.novocode" % "junit-interface" % "0.8" % "test->default"
libraryDependencies += "junit" % "junit" % "4.12" % Test
EclipseKeys.withSource := true
The file %USERDIR%\.sbt\0.13\plugins\build.sbt looks like this:
resolvers += Classpaths.typesafeResolver
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")

Put build.sbt in the project's root folder; otherwise sbt clean test cannot see build.sbt and download the dependencies. Originally, build.sbt was located in the <project root>\project\ folder.
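For reference, here is a minimal JUnit test in Scala that compiles against the junit and junit-interface dependencies above (a sketch; the class and method names are illustrative):
import org.junit.Test
import org.junit.Assert.assertEquals

class Test1 {
  // A trivial test, just to prove the harness runs
  @Test
  def addition(): Unit = {
    assertEquals(4, 2 + 2)
  }
}
With build.sbt in the project root, sbt clean test resolves JUnit, compiles sources under src\test\scala, and runs them via junit-interface.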

Related

IntelliJ: errors with Scala, Spark

I have a Scala project using Spark libraries, and it works fine most of the time (using IntelliJ). But sometimes it starts giving errors on IntelliJ launch:
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-core_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-core_2.12\3.0.0-preview2\.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error] Not found
[error] Not found
[error] not found: C:\...\.ivy2\localorg.apache.spark\spark-sql_2.12\3.0.0-preview2\ivys\ivy.xml
[error] checksum format error: C:\....\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\org\apache\spark\spark-sql_2.12\3.0.0-preview2\.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] Total time: 1 s, completed 17 Sep 2022, 15:13:49
[info] shutting down sbt server
build.sbt is:
/*ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.8"
lazy val root = (project in file("."))
.settings(
name := "spark-learning"
)*/
// Name of the package
name := "spark-learning"
// Version of our package
version := "1.0"
// Version of Scala
scalaVersion := "2.12.14"
// Spark library dependencies
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "3.0.0-preview2",
"org.apache.spark" %% "spark-sql" % "3.0.0-preview2"
)
What causes these issues all of a sudden? And how can I get rid of them?
One weird thing in the error messages is the path where it looks for dependencies: ...ivy2\localorg.apache.spark\....
There should be a \ after local.
Any chance your sbt repositories configuration could be messed up? I'm not sure where it is on Windows; it's typically /etc/sbt/repositories on Linux, but maybe IntelliJ has its own setting.
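For reference, a default repositories file looks roughly like this (a sketch; sbt reads ~/.sbt/repositories or the file named by the sbt.repository.config system property, and Linux packages may install one under /etc/sbt):
[repositories]
  local
  maven-central
If an entry in that file points at a proxy or local path that has gone stale, resolution can start failing even though the project itself did not change.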

sbt error: object spark is not a member of package org.apache

I installed sbt-1.3.4.msi and when trying to build a sample SparkPi.scala app, I'm getting the following error:
C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("Spark Pi")
[error] ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error] val spark = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed
The SparkPi.scala file is in C:\myapps\sbt\sparksample\project\src\main\scala (as shown in the error messages above).
What am I missing here?
The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.12.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
The C:\myapps\sbt\sparksample\project\src\main\scala directory has the SparkPi.scala file.
That's the problem. You've got the Scala file(s) under the project directory, which is owned by sbt itself (not by your sbt-managed Scala project).
Move SparkPi.scala and the other Scala files to C:\myapps\sbt\sparksample\src\main\scala.
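For reference, the standard sbt layout keeps the build definition and the application sources apart, roughly like this (a sketch):
sparksample/
  sparksample.sbt        <- build definition (name, scalaVersion, libraryDependencies)
  project/               <- owned by sbt: plugins and build helper code only
  src/
    main/
      scala/
        SparkPi.scala    <- application sources belong here
    test/
      scala/             <- test sources
Anything under project/ is compiled as part of the build definition itself, against sbt's own classpath, which is why org.apache.spark could not be resolved there.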

Scala IntelliJ library import errors

I am new to Scala and am trying to import the following libraries in my build.sbt. When IntelliJ does an auto-update, I get the following error:
Error while importing sbt project:
List([info] welcome to sbt 1.3.13 (Oracle Corporation Java 1.8.0_251)
[info] loading global plugins from C:\Users\diego\.sbt\1.0\plugins
[info] loading project definition from C:\Users\diego\development\Meetup\Stream-Processing\project
[info] loading settings for project stream-processing from build.sbt ...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] sbt server started at local:sbt-server-80d70f9339b81b4d026a
sbt:Stream-Processing>
[info] Defining Global / sbtStructureOptions, Global / sbtStructureOutputFile and 1 others.
[info] The new values will be used by cleanKeepGlobs
[info] Run `last` for details.
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[info] Applying State transformations org.jetbrains.sbt.CreateTasks from C:/Users/diego/.IntelliJIdea2019.3/config/plugins/Scala/repo/org.jetbrains/sbt-structure-extractor/scala_2.12/sbt_1.0/2018.2.1+4-88400d3f/jars/sbt-structure-extractor.jar
[info] Reapplying settings...
[info] set current project to Stream-Processing (in build file:/C:/Users/diego/development/Meetup/Stream-Processing/)
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.kafka:kafka-clients_2.11:2.3.1
[error] Not found
[error] Not found
[error] not found: C:\Users\diego\.ivy2\local\org.apache.kafka\kafka-clients_2.11\2.3.1\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients_2.11/2.3.1/kafka-clients_2.11-2.3.1.pom
[error] Total time: 2 s, completed Jun 28, 2020 12:11:24 PM
[info] shutting down sbt server)
This is my build.sbt file:
name := "Stream-Processing"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql-kafka-0-10_2.12
libraryDependencies += "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.4"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" %% "kafka-clients" % "2.3.1"
// https://mvnrepository.com/artifact/mysql/mysql-connector-java
libraryDependencies += "mysql" % "mysql-connector-java" % "8.0.18"
// https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1"
I made a Scala project just to make sure Spark works, and my Python project using Kafka works as well, so I am sure it's not a Spark/Kafka problem. Any idea why I am getting this error?
Try removing one % before "kafka-clients":
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.3.1"
The semantics of %% in sbt is that it appends the Scala version being used to the artifact name, so the dependency becomes org.apache.kafka:kafka-clients_2.11:2.3.1, as the error message shows. Note the _2.11 suffix.
This is a nice shorthand for Scala libraries, but it can confuse beginners when used with Java libraries, as illustrated below.
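To make the semantics concrete, here is what the two operators resolve to with scalaVersion := "2.11.8" (a sketch):
// %% appends the Scala binary version to the artifact name:
libraryDependencies += "org.apache.kafka" %% "kafka-clients" % "2.3.1"
// -> org.apache.kafka:kafka-clients_2.11:2.3.1 (does not exist; kafka-clients is a Java library)

// % uses the artifact name verbatim:
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.3.1"
// -> org.apache.kafka:kafka-clients:2.3.1 (the real artifact)
Scala libraries such as spark-sql should keep %%, since they are published separately per Scala version.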

Apache Ignite integration with scala-spark using sbt

I am trying to integrate Ignite into Scala code and run the application using sbt. I cannot use an IDE for this.
Scala version - 2.11.0
Spark version - 2.3.0
Ignite version - 2.8.0
Sbt version - 1.3.3
I have tried adding the basic library dependency in build.sbt:
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
My complete build.sbt is:
scalaVersion := "2.11.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
but it is still not working; I am getting the following error:
$ sbt package
[info] Loading project definition from /home/testing/project
[info] Loading settings for project testing from build.sbt ...
[info] Set current project to testing (in build file:/home/testing/)
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:245)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$34(CoursierDependencyResolution.scala:214)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:214)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:52)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:102)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:69)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:115)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:115)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:96)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:150)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error] at sbt.Classpaths$.$anonfun$updateTask0$5(Defaults.scala:2946)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error] Not found
[error] Not found
[error] not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] Total time: 5 s, completed May 28, 2020 6:24:04 AM
I am working on the ubuntu:latest Docker image.
I am sure that Spark and Ignite are working, as I am also running PySpark and it works just fine. Please help me out; being a newbie, I think I am making some minor mistake that is becoming a major issue here.
It seems that sbt tries to pull "ignite-spark_2.11", not just "ignite-spark":
"not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom"
use
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.8.0"
instead of
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"
%% -> fetches the dependency with the Scala version appended
% -> fetches the dependency without the Scala version appended
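Applying that fix to the build file from the question gives (a sketch; versions unchanged, and ignite-spark is published without a Scala suffix per the fix above):
scalaVersion := "2.11.0"

libraryDependencies += "org.apache.spark"  %% "spark-core"   % "2.3.0"  // Scala artifact: %% appends _2.11
libraryDependencies += "org.apache.spark"  %% "spark-sql"    % "2.3.0"
libraryDependencies += "org.apache.ignite" %  "ignite-spark" % "2.8.0"  // Scala-suffix-free artifact: plain %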

What should I do to import cache, ws, jdbc and specs2 % Test?

I'm upgrading my project from Play 2.4.3 to 2.5.0.
I have added the sbt plugin for Play 2.5.0 in the plugin.sbt file like this:
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.0")
After adding it, compiling the code throws an error stating:
[info] Loading project definition from /Users/ege/Sites/Aeione/greenroom6-services-v3/greenroom6-services-v3/project
[info] Compiling 1 Scala source to /Users/ege/Sites/Aeione/greenroom6-services-v3/greenroom6-services-v3/project/target/scala-2.10/sbt-0.13/classes...
[error] /Users/ege/Sites/Aeione/greenroom6-services-v3/greenroom6-services-v3/project/Common.scala:4: object PlayScala is not a member of package play
[error] import play.PlayScala
[error] ^
[error] /Users/ege/Sites/Aeione/greenroom6-services-v3/greenroom6-services-v3/project/Common.scala:49: not found: value jdbc
[error] jdbc,
[error] ^
[error] /Users/ege/Sites/Aeione/greenroom6-services-v3/greenroom6-services-v3/project/Common.scala:50: not found: value cache
[error] cache,
[error] ^
[error] /Users/ege/Sites/Aeione/greenroom6-services-v3/greenroom6-services-v3/project/Common.scala:51: not found: value ws
[error] ws,
[error] ^
[error] /Users/ege/Sites/Aeione/greenroom6-services-v3/greenroom6-services-v3/project/Common.scala:52: not found: value specs2
[error] specs2 % Test,
[error] ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed
Can anyone help me import these dependencies?
I think your imports are wrong. According to the documentation, they should be:
import sbt._
import Keys._
import play.sbt._
import Play.autoImport._
import PlayKeys._
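For completeness, here is a minimal sketch of a project/Common.scala that compiles with those imports under Play 2.5 (the object and val names are illustrative; in Play 2.4+ the sbt plugin moved from the play package to play.sbt, which is why the old import play.PlayScala no longer resolves):
import sbt._
import Keys._
import play.sbt._
import play.sbt.Play.autoImport._  // brings jdbc, cache, ws and specs2 into scope
import PlayKeys._

object Common {
  // Shared Play 2.5 dependencies, usable as: libraryDependencies ++= Common.playDependencies
  val playDependencies: Seq[ModuleID] = Seq(
    jdbc,
    cache,
    ws,
    specs2 % Test
  )
}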