Every now and then I start things from scratch to make sure I know all the intricate details of setting up a project. I have a simple app like the following, but I get some dependency issues that don't point me anywhere useful. I'm using Scala version 2.11.
My build.sbt:
name := "Helios"
version := "1.0"
scalaVersion := "2.11.8"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies ++= Seq(
"com.typesafe.akka" % "akka-actor" % "2.0.2",
"com.typesafe.akka" % "akka-slf4j" % "2.0.5")
My sample class:
import com.echostar.ese.helios.core.Asset
import akka.actor._
class NSPSG extends Actor {
  def receive = {
    case a: Asset => {
      println(s"NSPSG Received asset: ${a}")
    }
    case _ => println("Unexpected message received")
  }
}
(The Asset class is just a case class with an id and a title in it.)
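For completeness, a minimal stand-in for that class could look like the following; the field types are assumptions, since the question only says it has an id and a title:
case class Asset(id: String, title: String)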
Error Message:
C:\PROJECTS\active\Helios>sbt compile
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Loading global plugins from C:\Users\dana.murad\.sbt\0.13\plugins
[info] Loading project definition from C:\PROJECTS\active\Helios\project
[info] Set current project to Helios (in build file:/C:/PROJECTS/active/Helios/)
[info] Updating {file:/C:/PROJECTS/active/Helios/}helios...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 3 Scala sources to C:\PROJECTS\active\Helios\target\scala-2.11\classes...
[error] missing or invalid dependency detected while loading class file 'package.class'.
[error] Could not access type ScalaObject in package scala,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'package.class' was compiled against an incompatible version of scala.
[error] missing or invalid dependency detected while loading class file 'Actor.class'.
[error] Could not access type ScalaObject in package scala,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'Actor.class' was compiled against an incompatible version of scala.
[error] C:\PROJECTS\active\Helios\src\main\scala\com\echostar\ese\helios\workers\NSPSG.scala:9: illegal inheritance;
[error] self-type com.echostar.ese.helios.workers.NSPSG does not conform to akka.actor.Actor's selftype akka.actor.Actor
[error] class NSPSG extends Actor {
[error] ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Apr 4, 2016 9:19:45 AM
My main App is just a println for now; it's not even calling this actor.
Am I using the wrong version of Akka with Scala 2.11? -Ylog-classpath didn't help.
I don't know exactly what fixed it, but here's a list of things I did, and the project compiles now.
Changed the Akka dependency line to this (added the double percent and changed the version to 2.4.0):
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.4.0"
)
Removed my Run configuration and added it back again. (I think the path to main showed as red (invalid), so redoing it helped resolve it. I had changed my package name, and I don't think IntelliJ picked up the rename well in the build config.)
Started another test project from the IntelliJ menu (Akka Main in Scala), and that took a while to download all its dependencies. So maybe my project needed those and wasn't downloading them?
Removed commented-out lines (got rid of a semicolon). I don't think this did anything, but in full disclosure I technically did touch the code, even though my actor definition is exactly the same.
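For reference, the double percent is what makes the difference here: %% tells sbt to append the Scala binary version to the artifact name, so the dependency resolves to a build of Akka that actually exists for Scala 2.11, whereas the old akka-actor 2.0.x artifact was compiled against Scala 2.9 (whose scala.ScalaObject no longer exists in 2.11, hence the errors above). A minimal sketch of the resulting build.sbt:
name := "Helios"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  // %% appends the Scala binary version, so this resolves to akka-actor_2.11
  "com.typesafe.akka" %% "akka-actor" % "2.4.0"
)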
Related
I am studying chisel3 with a small trial project.
I finished the code and fixed several syntax issues during compilation; then it reported an error without indicating a file or line number.
$ sbt test
[info] welcome to sbt 1.4.9 (Red Hat, Inc. Java 1.8.0_292)
[info] loading settings for project fparser-build from plugins.sbt ...
[info] loading project definition from /mnt/disk1/yupeng/repos/fparser/project
[info] loading settings for project root from build.sbt ...
[info] set current project to fparser (in build file:/mnt/disk1/yupeng/repos/fparser/)
[info] compiling 3 Scala sources to /mnt/disk1/yupeng/repos/fparser/target/scala-2.12/classes ...
[error] ## Exception when compiling 3 sources to /mnt/disk1/yupeng/repos/fparser/target/scala-2.12/classes
[error] scala.reflect.internal.Types$TypeError: object plugin is not a member of package chisel3.internal
[error]
[error]
[error] scala.reflect.internal.Types$TypeError: object plugin is not a member of package chisel3.internal
[error] (Compile / compileIncremental) scala.reflect.internal.Types$TypeError: object plugin is not a member of package chisel3.internal
[error] Total time: 3 s, completed Jul 16, 2021 4:38:42 PM
What does it mean? Please help.
I just found that the error is gone after I changed the chisel3 version in build.sbt.
libraryDependencies ++= Seq(
  "edu.berkeley.cs" %% "chisel3" % "3.4.3",
  // "edu.berkeley.cs" %% "chisel3" % "3.2.6", // this one generates the plugin error above
  "edu.berkeley.cs" %% "chiseltest" % "0.3.3" % "test",
  "edu.berkeley.cs" %% "rocketchip" % "1.2.6"
)
Previously I changed from 3.4.3 to 3.2.6 because of an sbt warning:
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
I ran sbt evicted and it said rocketchip 1.2.6 should use chisel3 3.2.6.
Maybe someone can clarify.
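If the goal is to keep rocketchip 1.2.6 alongside a newer chisel3, one mechanism worth knowing about is sbt's dependencyOverrides, which pins a single version even when a transitive dependency asks for another. This is only a sketch, and whether chisel3 3.4.3 is actually binary compatible with rocketchip 1.2.6 still needs to be verified (sbt evicted shows what was evicted and why):
libraryDependencies ++= Seq(
  "edu.berkeley.cs" %% "chisel3" % "3.4.3",
  "edu.berkeley.cs" %% "chiseltest" % "0.3.3" % "test",
  "edu.berkeley.cs" %% "rocketchip" % "1.2.6"
)
// Force one chisel3 version across the whole graph, overriding whatever rocketchip pulls in transitively.
dependencyOverrides += "edu.berkeley.cs" %% "chisel3" % "3.4.3"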
I'm new to Spark and am following a tutorial to learn. I have installed:
openjdk version "1.8.0_121"(web-binary)
Hadoop 2.8.0 (web-binary)
scala version 2.11.8 (apt)
and Spark version 2.1.1 (web binary, pre-built with Hadoop 2.6.0 or later).
I ran the SparkPi example and it succeeded. But an error appears when I try to package my first Spark app with sbt 0.13.15 (apt), which I installed the way the official site said.
I know there must be a mistake in my settings somewhere, but I fail to find it from this link. Could anyone help me? Thanks :)
My project is like :
---SparkApp
|---simple.sbt
|---src
|---main
|---scala
|--- SimpleApp.scala
The .sbt file in my project is:
name := "Simple Project"
version := "0.13.15"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
The error log is like this:
hadoop#master:~/Mycode/SparkApp$ sbt package
[warn] Executing in batch mode.
[warn] For better performance, hit [ENTER] to switch to interactive mode, or
[warn] consider launching sbt without any commands, or explicitly passing 'shell'
[info] Loading project definition from /home/hadoop/Mycode/SparkApp/project
[info] Set current project to Simple Project (in build file:/home/hadoop/Mycode/SparkApp/)
[info] Compiling 1 Scala source to /home/hadoop/Mycode/SparkApp/target/scala-2.11/classes...
[error] missing or invalid dependency detected while loading class file 'SparkContext.class'.
[error] Could not access term akka in package <root>,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'SparkContext.class' was compiled against an incompatible version of <root>.
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed May 16, 2017 1:08:53 PM
Some hints about what the problem might be:
When I type spark-shell, I get Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131), which is different from what java -version shows: openjdk version "1.8.0_121". Could this be the problem?
I didn't do anything after installing sbt. Should I do some setup, like letting sbt know where my Scala and Spark are located? How?
I don't have Maven; should I?
------------------------ Second edit -------------------
After adding -Ylog-classpath to the .sbt file, as this link said, I got a very long classpath printout which is too long to show here. The problem is still unsolved.
As noted, here is SimpleApp.scala:
/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "file:///usr/local/spark/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
tl;dr If you want to develop Spark applications you don't have to install Spark.
Having Spark installed locally does help a lot in your early days as a Spark developer (with tools like spark-shell and spark-submit), but it is not required, though highly recommended.
In other words, what you've installed as a Spark package has nothing to do with what you can and want to use while developing a Spark application.
In sbt-managed Scala projects, you define what you want to use as a dependency, including the Spark dependency, in the libraryDependencies setting, as follows:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
And to my great surprise, you did that.
It appears that you use two different project directories to explain what you're doing: ~/Mycode/SparkApp (in which you execute sbt package) and ---Pro (of which you show build.sbt).
Assuming your simple.sbt looks as follows:
name := "Simple Project"
version := "0.13.15"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
I could find only one issue: the version setting, which I believe you set to 0.13.15 to reflect the version of sbt.
Please note that the two are not related in any way: version is the version of your application, while the version of sbt to use in the project is defined in project/build.properties, which (given the latest sbt version, 0.13.15) should be as follows:
sbt.version = 0.13.15
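With that split, a corrected simple.sbt would look like the following sketch (the application version string is arbitrary):
name := "Simple Project"
version := "1.0" // your application's version, not sbt's
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"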
The issue you face while executing sbt package (in /home/hadoop/Mycode/SparkApp) is that your application ends up depending on Akka, as you can see in the error message:
[info] Set current project to Simple Project (in build file:/home/hadoop/Mycode/SparkApp/)
[info] Compiling 1 Scala source to /home/hadoop/Mycode/SparkApp/target/scala-2.11/classes...
[error] missing or invalid dependency detected while loading class file 'SparkContext.class'.
[error] Could not access term akka in package <root>,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'SparkContext.class' was compiled against an incompatible version of <root>.
[error] one error found
[error] (compile:compileIncremental) Compilation failed
As of Spark 1.6 or so, Akka is no longer used by Spark, so I guess the project somehow references Akka libraries that it should not, if they come in through Spark.
Lots of guesswork, which I hope we'll sort out soon.
I have been following this tutorial to learn how to use Akka HTTP with Scala. I have no prior experience with Scala. I'm using IntelliJ IDEA 2016.3 Ultimate.
I created a project and configured it as the guide says.
name := "My Project"
version := "1.0"
scalaVersion := "2.9.1"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies += "com.typesafe.akka" % "akka-actor" % "2.0"
I have exactly the same code as in the guide, but when I run the sbt compile command I get the following error:
[info] Updating {file:/Users/Javyer/Testing/Akka-Test/}akka-test...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/Javyer/Testing/Akka-Test/target/scala-2.9.1/classes...
[info] 'compiler-interface' not yet compiled for Scala 2.9.1.final. Compiling...
error: error while loading CharSequence, class file '/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/jre/lib/rt.jar(java/lang/CharSequence.class)' is broken
(bad constant pool tag 18 at byte 10)
error: error while loading AnnotatedElement, class file '/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/jre/lib/rt.jar(java/lang/reflect/AnnotatedElement.class)' is broken
(bad constant pool tag 18 at byte 76)
error: error while loading Arrays, class file '/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/jre/lib/rt.jar(java/util/Arrays.class)' is broken
(bad constant pool tag 18 at byte 765)
error: error while loading Comparator, class file '/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/jre/lib/rt.jar(java/util/Comparator.class)' is broken
(bad constant pool tag 18 at byte 20)
/var/folders/kw/80c0kgzs0b9d06vjrbx0l6g40000gn/T/sbt_139df807/xsbt/ExtractAPI.scala:549: error: java.util.Comparator does not take type parameters
private[this] val sortClasses = new Comparator[Symbol] {
^
5 errors found
[info] Resolving org.scala-sbt#interface;0.13.13 ...
[error] (compile:compileIncremental) Error compiling sbt component 'compiler-interface'
[error] Total time: 5 s, completed Mar 22, 2017 11:42:21 AM
If I change my build.sbt to a more recent version of the Akka libraries:
name := "Akka-Test"
version := "1.0"
scalaVersion := "2.12.1"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.4.17",
"com.typesafe.akka" %% "akka-remote" % "2.4.17"
)
and compile again, I get a different error:
[info] Updating {file:/Users/Javyer/Testing/Akka-Test/}akka-test...
[info] Resolving jline#jline;2.14.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/Javyer/Testing/Akka-Test/target/scala-2.12/classes...
[error] /Users/Javyer/Testing/Akka-Test/src/main/scala/Pi.scala:5: object RoundRobinRouter is not a member of package akka.routing
[error] import akka.routing.RoundRobinRouter
[error] ^
[error] /Users/Javyer/Testing/Akka-Test/src/main/scala/Pi.scala:6: object Duration is not a member of package akka.util
[error] import akka.util.Duration
[error] ^
[error] /Users/Javyer/Testing/Akka-Test/src/main/scala/Pi.scala:7: object duration is not a member of package akka.util
[error] import akka.util.duration._
[error] ^
[error] /Users/Javyer/Testing/Akka-Test/src/main/scala/Pi.scala:17: not found: type Duration
[error] case class PiApproximation(pi: Double, duration: Duration)
[error] ^
[error] /Users/Javyer/Testing/Akka-Test/src/main/scala/Pi.scala:40: not found: value RoundRobinRouter
[error] val workerRouter = context.actorOf(Props[Worker].withRouter(RoundRobinRouter(nrOfWorkers)), name = "workerRouter")
[error] ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 4 s, completed Mar 22, 2017 11:47:47 AM
How can I fix this? I don't know how to debug this error or where to start tackling this problem. As I said, this is my first test with Scala and I don't have any prior experience with it, so please give me detailed answers.
Thanks!
First error: Scala 2.9.x requires a JVM runtime of version 1.6 or 1.7, while you are using 1.8, which doesn't work, as we can see from the message ('broken' class files mean the Scala 2.9 compiler doesn't understand the format of the JDK 8 class files it is being fed).
The second error requires a bit more context; could you post the code? I'm 99% sure you're following an outdated example while using fresh library releases. E.g. there's no Duration in akka.util and no RoundRobinRouter in akka.routing any more.
I would suggest you spend some time exploring and tweaking the available samples, e.g. this one: https://github.com/akka/akka-samples/tree/master/akka-sample-main-scala.
Also, note that Akka evolves pretty fast but has solid documentation versioning, so for each library release there's a separate documentation section on the site. You'll be notified about this at least once a day when you visit an outdated release's documentation page. Here's the link to the 2.4.17 docs: http://doc.akka.io/docs/akka/2.4.17/scala.html
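For the second error specifically, here is a hedged sketch of how the outdated Pi-tutorial imports map onto Akka 2.4.x with Scala 2.12; the Worker actor and nrOfWorkers below are stand-ins for the tutorial's own definitions, not its actual code:
import akka.actor._
import akka.routing.RoundRobinPool   // replaces the removed akka.routing.RoundRobinRouter
import scala.concurrent.duration._   // Duration now lives in the standard library, not akka.util
// Minimal stand-in for the tutorial's Worker actor, just so the sketch compiles.
class Worker extends Actor {
  def receive = { case msg => println(s"worker got: $msg") }
}
object RouterSketch extends App {
  val system = ActorSystem("pi")
  val nrOfWorkers = 4
  // RoundRobinPool(...).props(...) is the 2.4.x way to build a round-robin router of identical workers.
  val workerRouter = system.actorOf(RoundRobinPool(nrOfWorkers).props(Props[Worker]), name = "workerRouter")
  workerRouter ! "calculate"
  val elapsed: Duration = 1.second    // stands in for the tutorial's akka.util.Duration usage
  println(s"elapsed placeholder: $elapsed")
}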
There are two options:
Using Java 7 with Scala 2.7.2
Using another version of Scala (2.11.4) and Java 8:
Example build.sbt:
scalaVersion := Option(System.getProperty("scala.version")).getOrElse("2.11.4")
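That line reads the Scala version from a JVM system property, with a fallback, so it can be switched at launch time without editing the file; sbt launchers generally pass -D arguments through to the build JVM, for example:
// build.sbt
// Pick the Scala version from -Dscala.version, falling back to 2.11.4.
// Launch with, for example:  sbt -Dscala.version=2.12.1 compile
scalaVersion := Option(System.getProperty("scala.version")).getOrElse("2.11.4")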
When I try to run my Scala program (through sbt), I run into a bunch of errors.
Here's an excerpt:
[error] missing or invalid dependency detected while loading class file 'IterableUtils.class'.
[error] Could not access type ScalaObject in package scala,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'IterableUtils.class' was compiled against an incompatible version of scala.
[error] missing or invalid dependency detected while loading class file 'AsBooleanTrait.class'.
[error] Could not access type ScalaObject in package scala,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
.....
I did add the scala-library.jar to the classpath but to no avail. Does anyone know what might be missing?
P.S. I used a new SBT project (IntelliJ) on OS X.
Edit: here's the build.sbt:
name := "test"
version := "1.0"
scalaVersion := "2.11.8"
resolvers += "Scales XML" at "https://mvnrepository.com/artifact/org.scalesxml/scales-xml_2.9.1"
libraryDependencies += "org.scalesxml" % "scales-xml_2.9.1" % "0.3-RC7"
SBT is version 0.13.8
Edit 2:
Figured it out. I was trying to run a class (with a main method) without creating an instance... After changing it to an object, things work a lot better :)
Edit 3:
Spoke too soon. It turns out it has to do with setting the scalaVersion in build.sbt. When I leave that entire line out, it no longer complains about the missing dependencies. When I put it back in, I get the errors mentioned above again. I tried setting it to 2.11.7 as well (after installing that with brew install scala), but to no avail.
scalaVersion := "2.11.8"
libraryDependencies += "org.scalesxml" % "scales-xml_2.9.1" % "0.3-RC7"
You can't use a library compiled for Scala 2.9.1 with Scala 2.11.*. Write "org.scalesxml" %% "scales-xml" % some-version instead, which will look for scales-xml_2.11. See http://www.scala-sbt.org/0.13/docs/Cross-Build.html.
I'm new to Scala and SBT so I might be missing something obvious.
I was trying to compile the HelloWorld example on http://www.scalafx.org/docs/quickstart/
I created a file build.sbt containing:
scalaVersion := "2.11.5"
libraryDependencies += "org.scalafx" %% "scalafx" % "8.0.0-R4"
and a file src/main/scala/ScalaFXHelloWorld.scala containing the code from linked page.
However, when running sbt run I get:
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
[info] Set current project to scalafx (in build file:/home/kvbx/Projects/ScalaFX/)
[info] Compiling 1 Scala source to /home/kvbx/Projects/ScalaFX/target/scala-2.11/classes...
[error] missing or invalid dependency detected while loading class file 'Color.class'.
[error] Could not access term javafx in package <root>,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'Color.class' was compiled against an incompatible version of <root>.
[error] missing or invalid dependency detected while loading class file 'Color.class'.
[error] Could not access term scene in value javafx,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'Color.class' was compiled against an incompatible version of javafx.
[error] missing or invalid dependency detected while loading class file 'Stage.class'.
[error] Could not access term javafx in package <root>,
...
...
I'm running sbt 0.13.7 and Scala 2.11.5 on OpenJDK 1.8.0_31 on Arch Linux.
JavaFX isn't part of OpenJDK 8. I installed openjfx. Works. (Thanks Jasper)
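As a quick way to confirm whether the JDK in use ships JavaFX at all, a tiny scratch program like this can help (a hypothetical diagnostic, not part of the ScalaFX quickstart); it fails to compile on a JDK without JavaFX, such as stock OpenJDK 8 before openjfx is installed:
// Referencing JavaFX classes directly shows whether javafx.* is on the compile classpath.
object JavaFxCheck extends App {
  println(classOf[javafx.scene.paint.Color].getName) // the missing Color from the error above
  println(classOf[javafx.stage.Stage].getName)       // the missing Stage from the error above
}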