I'm getting the error below when compiling this toy class:
package com.example

import scala.tools.reflect.ToolBox
import scala.reflect.runtime.{currentMirror => m}

object Hello {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
  }
}
[info] Loading project definition from /Users/me/Temp/Bar/project
[info] Set current project to Bar (in build file:/Users/me/Temp/Bar/)
[info] Compiling 1 Scala source to /Users/me/Temp/Bar/target/scala-2.11/classes...
[error] /Users/me/Temp/Bar/src/main/scala/com/example/Hello.scala:3: object tools is not a member of package scala
[error] import scala.tools.reflect.ToolBox
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
This is my build.sbt file:
name := """Bar"""
version := "1.0"
scalaVersion := "2.11.8"
// Change this to another test framework if you prefer
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.8"
// Uncomment to use Akka
//libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.11"
The following dependency fixed the issue:
libraryDependencies += "org.scala-lang" % "scala-compiler" % "2.11.8"
Is this the best solution?
The ToolBox class is part of the compiler, not the public reflection API.
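Yes, that is the expected fix: ToolBox lives in the scala-compiler artifact, so scala-reflect alone is not enough. One small refinement (a sketch, not required) is to pin the dependency to the project's Scala version so the two cannot drift apart:
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
With that on the classpath, a minimal ToolBox round trip looks roughly like this (reusing the imports from the question; the mkToolBox enrichment comes from the ToolBox import):
import scala.reflect.runtime.{currentMirror => m}
import scala.tools.reflect.ToolBox

val tb = m.mkToolBox()               // build a toolbox from the runtime mirror
println(tb.eval(tb.parse("1 + 1")))  // parses and evaluates the snippet, printing 2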
I am trying to compile and run a simple H2O Scala example, but when I do sbt package I get errors.
Am I missing something in the sbt file?
This is my H2O Scala code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._
import ai.h2o.automl.AutoML
import ai.h2o.automl.AutoMLBuildSpec
import org.apache.spark.h2o._

object H2oScalaEg1 {
  def main(args: Array[String]): Unit = {
    val sparkConf1 = new SparkConf().setMaster("local[2]").setAppName("H2oScalaEg1App")
    val sparkSession1 = SparkSession.builder.config(conf = sparkConf1).getOrCreate()
    val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)

    import h2oContext._
    import java.io.File
    import h2oContext.implicits._
    import water.Key
  }
}
And this is my sbt file.
name := "H2oScalaEg1Name"
version := "1.0"
scalaVersion := "2.11.12"
scalaSource in Compile := baseDirectory.value / ""
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"
libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" % "runtime" pomOnly()
When I do sbt package I get these errors
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:7:8: not found: object ai
[error] import ai.h2o.automl.AutoML
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:8:8: not found: object ai
[error] import ai.h2o.automl.AutoMLBuildSpec
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:10:25: object h2o is not a member of package org.apache.spark
[error] import org.apache.spark.h2o._
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:20:20: not found: value H2OContext
[error] val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:28:10: not found: value water
[error] import water.Key
[error] ^
[error] 5 errors found
How can I fix this problem?
My Spark version is spark-2.2.3-bin-hadoop2.7.
Thanks,
marrel
pomOnly() in build.sbt tells the dependency-management handlers not to load jar artifacts for this dependency and to look only at the metadata (the POM).
Try to use libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" instead.
Edit 1: Additionally I think you are missing (at least) one library dependency:
libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"
see: https://search.maven.org/artifact/ai.h2o/h2o-automl/3.22.1.5/pom
Edit 2:
The last dependency you are missing is sparkling-water-core:
libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6" should do the trick.
Here is the GitHub source of sparkling-water/core/src/main/scala/org/apache/spark/h2o.
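Pulling the three edits together, the dependency section of build.sbt would look roughly like this (versions taken from the answer above; adjust them to match your Spark and H2O setup):
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"
libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3"                // no pomOnly(), so the jar is actually fetched
libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"              // provides ai.h2o.automl._
libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6"  // provides org.apache.spark.h2o._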
I am new to Scala and SBT build files. From the introductory tutorials, adding Spark dependencies to a Scala project should be straightforward via the sbt-spark-package plugin, but I am getting the following error:
[error] (run-main-0) java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
Please provide resources to learn more about what could be driving this error, as I want to understand the process more thoroughly.
CODE:
import org.apache.spark.sql.SparkSession  // needed for SparkSession below

trait SparkSessionWrapper {
  lazy val spark: SparkSession = {
    SparkSession
      .builder()
      .master("local")
      .appName("spark citation graph")
      .getOrCreate()
  }
  val sc = spark.sparkContext
}

import org.apache.spark.graphx.GraphLoader

object Test extends SparkSessionWrapper {
  def main(args: Array[String]) {
    println("Testing, testing, testing, testing...")
    var filePath = "Desktop/citations.txt"
    val citeGraph = GraphLoader.edgeListFile(sc, filePath)  // note: filePath, not filepath
    println(citeGraph.vertices.take(1))
  }
}
plugins.sbt
resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"
addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
build.sbt -- WORKING. Why does adding libraryDependencies make run work?
spName := "yewno/citation_graph"
version := "0.1"
scalaVersion := "2.11.12"
sparkVersion := "2.2.0"
sparkComponents ++= Seq("core", "sql", "graphx")
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.2.0",
"org.apache.spark" %% "spark-sql" % "2.2.0",
"org.apache.spark" %% "spark-graphx" % "2.2.0"
)
build.sbt -- NOT WORKING. I would expect this to compile and run correctly:
spName := "yewno/citation_graph"
version := "0.1"
scalaVersion := "2.11.12"
sparkVersion := "2.2.0"
sparkComponents ++= Seq("core", "sql", "graphx")
Bonus for explanation + links to resources to learn more about SBT build process, jar files, and anything else that can help me get up to speed!
The sbt-spark-package plugin adds the Spark dependencies in provided scope:
sparkComponentSet.map { component =>
  "org.apache.spark" %% s"spark-$component" % sparkVersion.value % "provided"
}.toSeq
We can confirm this by running show libraryDependencies from sbt:
[info] * org.scala-lang:scala-library:2.11.12
[info] * org.apache.spark:spark-core:2.2.0:provided
[info] * org.apache.spark:spark-sql:2.2.0:provided
[info] * org.apache.spark:spark-graphx:2.2.0:provided
provided scope means:
The dependency will be part of compilation and test, but excluded from
the runtime.
Thus sbt run throws java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
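That also explains why the "WORKING" build.sbt runs: its explicit entries use the default (compile) scope, so the Spark jars stay on the run classpath. Roughly, the two setups differ only in the scope suffix:
// what the plugin generates -- excluded from the runtime classpath
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
// what the explicit entry adds -- default scope, also available to sbt run
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"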
If we really want to include provided dependencies on the run classpath, then @douglaz suggests:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated
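On sbt 1.x the same override can be written with the slash syntax; a sketch, assuming the rest of the build is unchanged:
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated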
I have the following ScalaTest unit test using Mockito and ScalaCheck-style property checks:
import org.scalatest.mockito.MockitoSugar
import org.mockito.Mockito.when
import org.scalatest.prop.PropertyChecks
import org.mockito.Mockito.verify

class PlayerTest extends org.scalatest.FunSuite with MockitoSugar with PropertyChecks {
  test("doesn't accept anything but M") {
    val mockIOHandler = mock[IOHandler]
    val name = "me"
    val player = new Player(name)
    when(mockIOHandler.nextLine).thenReturn("m")

    val apiUser = new Player("player1")
    apiUser.chooseHand(mockIOHandler)

    verify(mockIOHandler).write("some value")
  }
}
In my build.sbt I have the following dependencies:
scalaVersion := "2.12.2"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
// https://mvnrepository.com/artifact/org.mockito/mockito-core
libraryDependencies += "org.mockito" % "mockito-core" % "1.8.5"
For it, I am getting this error:
Error:(12, 41) Symbol 'type <none>.scalacheck.Shrink' is missing from the classpath.
This symbol is required by 'value org.scalatest.prop.GeneratorDrivenPropertyChecks.shrA'.
Make sure that type Shrink is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'GeneratorDrivenPropertyChecks.class' was compiled against an incompatible version of <none>.scalacheck.
test("doesn't accept anything but M") {
Any idea what could be wrong?
Adding ScalaCheck did the trick for me:
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.+"
lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.+"
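A sketch of wiring those into build.sbt, scoping both to Test to match how they are used here:
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.+"
lazy val scalaCheck = "org.scalacheck" %% "scalacheck" % "1.13.+"

libraryDependencies ++= Seq(
  scalaTest % Test,
  scalaCheck % Test
)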
I have a small project where I have the following problem:
scalaTest needs to be added to the dependencies of all three projects (client, server, shared); otherwise the ScalaTest library is not accessible from all of them.
In other words, if I write
val jvmDependencies = Def.setting(Seq(
  "org.scalaz" %% "scalaz-core" % "7.2.8"
) ++ scalaTest)
then things work fine.
But if I don't add ++scalaTest to each of the three dependency definitions, it fails like this:
> test
[info] Compiling 1 Scala source to /Users/joco/tmp3/server/target/scala-2.11/test-classes...
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:1: object specs2 is not a member of package org
[error] import org.specs2.mutable.Specification
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:3: not found: type Specification
[error] class Test extends Specification {
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:5: value should is not a member of String
[error] "Test" should {
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:6: value in is not a member of String
[error] "one is one" in {
[error] ^
[error] /Users/joco/tmp3/server/src/test/scala/Test.scala:8: value === is not a member of Int
[error] 1===one
[error] ^
[error] 5 errors found
[error] (server/test:compileIncremental) Compilation failed
[error] Total time: 4 s, completed Mar 18, 2017 1:56:54 PM
However, for production (not test) code everything works just fine: if I want to use a library (in this example autowire) in all three projects, I don't have to add the same dependency three times. It is enough to add it to the shared project, and then I can use that library from all three projects.
For test code, however, as I mentioned above, currently I have to add the same library dependency (scalaTest - below) to all three projects.
Question: Is there a way to avoid this ?
Settings.scala:
import org.scalajs.sbtplugin.ScalaJSPlugin.autoImport._
import sbt.Keys._
import sbt._

object Settings {
  val scalacOptions = Seq(
    "-Xlint",
    "-unchecked",
    "-deprecation",
    "-feature",
    "-Yrangepos"
  )

  object versions {
    val scala = "2.11.8"
  }

  val scalaTest = Seq(
    "org.scalatest" %% "scalatest" % "3.0.1" % "test",
    "org.specs2" %% "specs2" % "3.7" % "test")

  val sharedDependencies = Def.setting(Seq(
    "com.lihaoyi" %%% "autowire" % "0.2.6"
  ) ++ scalaTest)

  val jvmDependencies = Def.setting(Seq(
    "org.scalaz" %% "scalaz-core" % "7.2.8"
  ))

  /** Dependencies only used by the JS project (note the use of %%% instead of %%) */
  val scalajsDependencies = Def.setting(Seq(
    "org.scala-js" %%% "scalajs-dom" % "0.9.1"
  ) ++ scalaTest)
}
build.sbt:
import sbt.Keys._
import sbt.Project.projectToRef
import webscalajs.SourceMappings

lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared"))
  .settings(
    scalaVersion := Settings.versions.scala,
    libraryDependencies ++= Settings.sharedDependencies.value,
    addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
  )
  .jsConfigure(_ enablePlugins ScalaJSWeb)

lazy val sharedJVM = shared.jvm.settings(name := "sharedJVM")
lazy val sharedJS = shared.js.settings(name := "sharedJS")

lazy val elideOptions = settingKey[Seq[String]]("Set limit for elidable functions")

lazy val client: Project = (project in file("client"))
  .settings(
    scalaVersion := Settings.versions.scala,
    scalacOptions ++= Settings.scalacOptions,
    libraryDependencies ++= Settings.scalajsDependencies.value,
    testFrameworks += new TestFramework("utest.runner.Framework")
  )
  .enablePlugins(ScalaJSPlugin)
  .disablePlugins(RevolverPlugin)
  .dependsOn(sharedJS)

lazy val clients = Seq(client)

lazy val server = (project in file("server"))
  .settings(
    scalaVersion := Settings.versions.scala,
    scalacOptions ++= Settings.scalacOptions,
    libraryDependencies ++= Settings.jvmDependencies.value
  )
  .enablePlugins(SbtLess, SbtWeb)
  .aggregate(clients.map(projectToRef): _*)
  .dependsOn(sharedJVM)

onLoad in Global := (Command.process("project server", _: State)) compose (onLoad in Global).value

fork in run := true

cancelable in Global := true
For test code, however, as I mentioned above, currently I have to add the same library dependency (scalaTest - below) to all three projects.
That is expected: test dependencies are not inherited along dependency chains. That makes sense, because you don't want to depend on JUnit just because you depend on a library that happens to be tested with JUnit.
Although yes, that calls for a bit of duplication when you have several projects in the same build, all using the same testing framework. This is why we often find some commonSettings that are added to all projects of an sbt build. This is also where we typically put things like organization, scalaVersion, and many other settings that usually apply to all projects inside one build.
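A minimal sketch of that pattern, reusing names from this build and shown for the server project (the client and shared projects would add the same commonSettings):
lazy val commonSettings = Seq(
  scalaVersion := Settings.versions.scala,
  scalacOptions ++= Settings.scalacOptions,
  libraryDependencies ++= Settings.scalaTest  // the shared test dependencies, declared once
)

lazy val server = (project in file("server"))
  .settings(commonSettings)
  .settings(libraryDependencies ++= Settings.jvmDependencies.value)
  .enablePlugins(SbtLess, SbtWeb)
  .aggregate(clients.map(projectToRef): _*)
  .dependsOn(sharedJVM)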
It appears that @js.native is not recognized by my compiler, even though Scala.js code compiles fine for me in this project in general.
Link to the file (and the project containing it on GitHub) where it fails.
Source of the file that fails on @js.native:
package example

import scala.scalajs.js
import js.annotation._

@js.native // sbt won't compile this: "native" not found; how to fix?
trait Funnel {
}
yields:
Funnel.scala:8: type native is not a member of package scala.scalajs.js
[error] @js.native
[error]     ^
[error] one error found
build.sbt for reference:
import com.lihaoyi.workbench.Plugin._
enablePlugins(ScalaJSPlugin)
workbenchSettings
name := "Example"
version := "0.1-SNAPSHOT"
scalaVersion := "2.11.5"
libraryDependencies ++= Seq(
  "org.scala-js" %%% "scalajs-dom" % "0.8.0",
  "com.lihaoyi" %%% "scalatags" % "0.5.4"
)
jsDependencies += "org.webjars" % "d3js" % "3.5.12" / "d3.js"
jsDependencies += ProvidedJS / "d3-funnel.js"
bootSnippet := "example.ScalaJSExample().main(document.getElementById('canvas'));"
updateBrowsers <<= updateBrowsers.triggeredBy(fastOptJS in Compile)
Looks like you are running Scala.js 0.6.1, where the @js.native annotation is not yet available. Try to upgrade your version to >= 0.6.5.
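A sketch of the change, assuming the plugin is declared in project/plugins.sbt as usual:
// project/plugins.sbt -- bump the Scala.js sbt plugin (0.6.5 is the minimum suggested above)
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.5")
After reloading sbt, the @js.native annotation on the Funnel facade should be recognized.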