sbt assembly is giving an error - scala

C:\scala\spark-1.6.1-bin-hadoop2.6\spark-1.6.1-bin-hadoop2.6>sbt assembly
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Set current project to spark-1-6-1-bin-hadoop2-6 (in build file:/C:/scala/spark-1.6.1-bin-hadoop2.6/spark-1.6.1-bin-hadoop2.6/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: assembly
[error] assembly
[error] ^
I've tried everything given on the web but am still unable to sort out this issue. Any help/pointers would be appreciated.
"./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project
The above solution is also not working.

The problem is that you have not added the sbt-assembly plugin.
On Windows, open C:\Users\<username>\.sbt\0.13\plugins\plugins.sbt and add
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
and then it will work. Explained here.
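
For context, a minimal sketch of the global plugins file and the command (the plugin version is the one from the answer; newer sbt-assembly releases exist, so match it to your sbt version):

// C:\Users\<username>\.sbt\0.13\plugins\plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

Then, from the project root:

sbt assembly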

Related

Symbol 'term org.apache.spark.annotation' is missing from the classpath (Spark Kubernetes)

I am using Spark with Kubernetes and facing an issue while building a fat jar with the sbt assembly command, with libraryDependencies += "org.apache.spark" %% "spark-kubernetes" % "2.4.5" in my build.sbt. Everything works fine if I downgrade the spark-kubernetes version to "2.4.2", but not with "2.4.5".
In another case, when I rebuild the same code with sbt assembly after deleting the target folder generated by the previous sbt assembly run, it starts working fine with the same 2.4.5 spark-kubernetes jar.
I want to avoid building the assembly jar multiple times. I have also tried adding the spark-tags library to my build.sbt file, but it still doesn't work.
Please help me with this issue.
I am using Spark 2.4.4, Scala 2.11.12, and sbt 1.2.8.
The logs are below.
[error] import org.apache.spark.sql.{SQLContext, SparkSession}
[error] ^
[error] /d02/scala/sparktest/app/controllers/MainController.scala:92:15: Symbol 'term org.apache.spark.annotation' is missing from the classpath.
[error] This symbol is required by ' <none>'.
[error] Make sure that term annotation is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'SparkSession.class' was compiled against an incompatible version of org.apache.spark.
[error] var spark = SparkSession.builder().appName("sparktest").config(conf).getOrCreate()
[error] ^
[error] /d02/scala/sparktest/app/controllers/MainController.scala:94:24: Symbol 'term org.apache.spark.annotation' is missing from the classpath.
[error] This symbol is required by ' <none>'.
[error] Make sure that term annotation is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'SQLContext.class' was compiled against an incompatible version of org.apache.spark.
[error] var sqlContext = new SQLContext(sc)
[error] ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed.
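
For what it's worth, the "compiled against an incompatible version of org.apache.spark" hint usually points at mixed Spark artifact versions on the classpath (here, 2.4.4 core classes next to the 2.4.5 spark-kubernetes jar). A minimal build.sbt sketch under that assumption, pinning every Spark module to a single version:

// build.sbt (sketch): keep all Spark modules on one version
val sparkVersion = "2.4.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"       % sparkVersion,
  "org.apache.spark" %% "spark-sql"        % sparkVersion,
  "org.apache.spark" %% "spark-kubernetes" % sparkVersion
)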

failed to create .ensime file, how can I fix it?

I have been trying to use ENSIME with Sublime Text to write Scala. To install ENSIME I created a plugins.sbt at this location:
~/.sbt/1.0/plugins/plugins.sbt
I added addSbtPlugin("org.ensime" % "sbt-ensime" % "2.0.1") to the plugins.sbt. But when I run sbt and then run the ensimeConfig command to create the .ensime file, I get this error:
C:\Users\Mahadi>sbt
"C:\Users\Mahadi\.sbt\preloaded\org.scala-sbt\sbt\"1.0.1"\jars\sbt.jar"
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Loading global plugins from C:\Users\Mahadi\.sbt\1.0\plugins\project
[info] Loading settings from plugins.sbt ...
[info] Loading global plugins from C:\Users\Mahadi\.sbt\1.0\plugins
[info] Loading project definition from C:\Users\Mahadi\project
[info] Set current project to mahadi (in build file:/C:/Users/Mahadi/)
[info] sbt server started at 127.0.0.1:5547
sbt:mahadi> ensimeConfig
[error] Not a valid command: ensimeConfig
[error] Not a valid project ID: ensimeConfig
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: ensimeConfig
[error] ensimeConfig
[error] ^
sbt:mahadi>
So I am looking for your help.
You should run sbt from the root directory of your sbt project, as the ENSIME configuration file is built based on it.
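In the log above, sbt was started from C:\Users\Mahadi (the home directory), so it treated the home directory itself as the project. A sketch of the intended session, with a hypothetical project path:

cd C:\Users\Mahadi\my-scala-project
sbt
sbt:my-scala-project> ensimeConfig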

ScalaFX: HelloWorld compilation error

I'm new to Scala and SBT so I might be missing something obvious.
I was trying to compile the HelloWorld example on http://www.scalafx.org/docs/quickstart/
I created a file build.sbt containing:
scalaVersion := "2.11.5"
libraryDependencies += "org.scalafx" %% "scalafx" % "8.0.0-R4"
and a file src/main/scala/ScalaFXHelloWorld.scala containing the code from the linked page.
However, when running sbt run I get:
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
[info] Set current project to scalafx (in build file:/home/kvbx/Projects/ScalaFX/)
[info] Compiling 1 Scala source to /home/kvbx/Projects/ScalaFX/target/scala-2.11/classes...
[error] missing or invalid dependency detected while loading class file 'Color.class'.
[error] Could not access term javafx in package <root>,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'Color.class' was compiled against an incompatible version of <root>.
[error] missing or invalid dependency detected while loading class file 'Color.class'.
[error] Could not access term scene in value javafx,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'Color.class' was compiled against an incompatible version of javafx.
[error] missing or invalid dependency detected while loading class file 'Stage.class'.
[error] Could not access term javafx in package <root>,
...
...
I'm running sbt 0.13.7 and Scala 2.11.5 on OpenJDK 1.8.0_31 on Arch Linux.
JavaFX isn't part of OpenJDK 8. I installed openjfx and it works. (Thanks, Jasper.)
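For reference, the install commands (package names are assumptions and vary by distribution):

sudo pacman -S java-openjfx     # Arch Linux
sudo apt-get install openjfx    # Debian/Ubuntu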

How can I enable remote debugging for SBT in Windows?

I would like to run the equivalent of this:
sbt -jvm-debug 5005
However, I don't seem to be able to pass in arguments on Windows. This is what I am seeing:
>sbt -jvm-debug 5005
[info] Loading project definition from [myProject]
[info] Set current project to [myProject] (in build file myProject)
[error] Expected letter
[error] Expected symbol
[error] Expected '!'
[error] Expected '+'
[error] Expected '++'
[error] Expected ';'
[error] Expected end of input.
[error] Expected 'show'
[error] Expected '*'
[error] Expected '{'
[error] Expected project ID
[error] Expected configuration
[error] Expected key
[error] 5005
[error] ^
[error] Not a valid command: jvm-debug
[error] Not a valid project ID: jvm-debug
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: jvm-debug
[error] jvm-debug
[error] ^
I would like to be able to remote-debug this application from IntelliJ. Any help would be great!
set SBT_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005
sbt run
was the only working solution on Windows 7.
It seems the Windows version of sbt doesn't define this functionality.
On Linux it is defined in $SBT_HOME/sbt/bin/sbt-launch-lib.bash as:
addDebugger () {
  addJava "-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=$1"
}
You can achieve the same result by setting the SBT_OPTS environment variable on Windows. Run sbt like this to make the debugger listen on port 5005:
set SBT_OPTS="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005" && sbt
I had downloaded the latest sbt, namely sbt 1.3.3, but when I tried to enable debugging in my project I hit the same problem as you. Investigating, I saw that this file (sbt-launch-lib.bash) was missing from my installation root, C:\Program Files (x86)\sbt\bin. Checking another machine, I noticed it had sbt 1.2.8, and it did have the file mentioned above. So I uninstalled sbt 1.3.3, installed sbt 1.2.8, ran sbt -jvm-debug 9999 in my project, and it works.

"./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project

I'm having trouble installing Apache Spark on Ubuntu 13.04. I'm using spark-0.8.1-incubating, and both ./sbt/sbt update and ./sbt/sbt compile work fine. However, when I run ./sbt/sbt assembly I get the following error:
[info] Set current project to default-289e76 (in build file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
I googled for stuff related to this but couldn't find anything useful. Any guidance would be much appreciated.
The "Set current project to default-289e76" message suggests that sbt was called from outside of the Spark sources directory:
$ /tmp ./spark-0.8.1-incubating/sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Set current project to default-d0f036 (in build file:/private/tmp/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
Running ./sbt/sbt assembly works fine from the spark-0.8.1-incubating directory (note the log output showing that the current project was set correctly):
$ spark-0.8.1-incubating sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project/project
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project
[info] Set current project to root (in build file:/private/tmp/spark-0.8.1-incubating/)
...
You typed "abt" twice, but shouldn't that be "sbt"? Apache Spark has its own copy of sbt, so make sure you're running Spark's version to pick up the "assembly" plugin among other customizations.
To run the Spark installation of sbt, go to the Spark directory and run ./sbt/sbt assembly.
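That is (a sketch, with a hypothetical path):

cd /path/to/spark-0.8.1-incubating
./sbt/sbt assembly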