Scala REPL import SBT file

I have an SBT file that has the following contents:
name := "Scala Playground"
version := "1.0"
scalaVersion := "2.11.6"
resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies ++= Seq(
  "com.netflix.rxjava" %% "rxjava-scala" % "0.19.1",
  "com.typesafe.play" %% "play-json" % "2.2.1"
)
Saved as scala-playground.sbt. I want to use this in my Scala REPL. When I tried to do the following:
sbt scala-playground.sbt
I got the following error:
[info] Set current project to Scala Playground (in build file:/home/joe/Desktop/)
[error] Not a valid command: scala-playground
[error] Not a valid project ID: scala-playground
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: scala-playground (similar: scala-version, scalac-options, scala-binary-version)
[error] scala-playground
[error] ^
I can't see anything obviously wrong in my sbt file. Could anyone shed some light on this? Is this the proper way to get dependencies into my Scala REPL?
All I want is to pull some dependencies into my Scala REPL, so that I can quickly run and evaluate certain libraries.

The command-line arguments to sbt are commands, not the file you want to use. Just go to the directory with the scala-playground.sbt file and run from there:
sbt console
sbt automatically loads any .sbt file in the current directory and opens a Scala console with the project's dependencies on the classpath.
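Once the console is up, the libraries declared in the .sbt file can be imported directly. A hypothetical session using the play-json dependency from the question (outputs elided):

```scala
// Hypothetical sbt console session; assumes the play-json
// dependency declared in scala-playground.sbt resolved successfully
scala> import play.api.libs.json._
scala> val js = Json.parse("""{"name": "joe"}""")
scala> js \ "name"
```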

Related

Not able to cross publish my sbt plugin for multiple Scala versions

I have the following sbt file
lazy val root = (project in file(".")).
  settings(
    inThisBuild(List(
      sbtPlugin := true,
      organization := "com.foo",
      crossScalaVersions := Seq("2.11.2", "2.12.0"),
      version := "1.0.0"
    )),
    name := "myplugin",
    libraryDependencies ++= Seq(
      "org.scala-lang.modules" %% "scala-xml" % "1.0.6",
      "com.typesafe" % "config" % "1.3.3"
    )
  )
Now I can easily do sbt publishLocal and I see that it generates a jar file in the .ivy2/local/com.foo/myplugin/scala_2.12/sbt_1.0/1.0.0/jars/
but if I do a
sbt +publishLocal
I get an error
[error] Modules were resolved with conflicting cross-version suffixes in ProjectRef(uri("file:/Users/user/myplugin/"), "root"):
[error] org.scala-lang.modules:scala-xml _2.11, _2.12
[error] org.scala-lang.modules:scala-parser-combinators _2.11, _2.12
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error] at scala.sys.package$.error(package.scala:27)
[error] at sbt.librarymanagement.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:39)
[error] at sbt.librarymanagement.ConflictWarning$.apply(ConflictWarning.scala:19)
[error] at sbt.Classpaths$.$anonfun$ivyBaseSettings$64(Defaults.scala:1995)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
My expectation was that SBT would compile and publish twice, each time picking the right jars for the right Scala version. Why does it say that there is a conflict?
My end goal is to make SBT publish multiple jar files one for each scala version in my crossScalaVersions list.
sbt uses a fixed version of Scala: sbt 0.13 runs on Scala 2.10, sbt 1.x on Scala 2.12. So
one thing is that you cannot compile an sbt plugin for Scala 2.11,
and another thing is that you should cross-build your plugin for different versions of sbt (the Scala versions are implied).
There is documentation about Cross building plugins, but I'm not sure it is up to date, so it's better to see some examples in existing plugins. I think it should be enough to have this setup for your plugin project:
in project/build.properties:
sbt.version=0.13.17
in build.sbt settings:
sbtPlugin := true,
crossSbtVersions := Seq("0.13.17", "1.0.0"),
See sbt-boilerplate for an example.
After poking around with your build.sbt I found out the following:
removing all dependencies does not resolve the problem
removing sbtPlugin := true does resolve the problem
downgrading sbt from 1.x to 0.13.16 lets me compile your example
Additionally, we should remember that sbt 0.13.x was written with Scala 2.10.
So, if you are writing an sbt plugin:
downgrade sbt
change Scala from 2.11.2 to 2.10.x
if you are not writing a plugin:
remove sbtPlugin := true
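Putting the plugin route together, a minimal sketch (versions taken from the answer above; a plugin is compiled with the Scala version sbt itself uses, so there is no crossScalaVersions):

```scala
// project/build.properties contains: sbt.version=0.13.17

// build.sbt
sbtPlugin := true
organization := "com.foo"
name := "myplugin"
version := "1.0.0"
// cross-build against both sbt series instead of Scala versions
crossSbtVersions := Seq("0.13.17", "1.0.0")
```

Then, as far as I know, sbt ^publishLocal (caret, not plus; available from sbt 0.13.16) publishes the plugin once per entry in crossSbtVersions.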

scala error: bad symbolic reference

I was using sbt to compile a scala code, and got the following error message
[error] /user/xyin/Projects/measurement_wspark/src/main/scala/json2csv.scala:41: bad symbolic reference. A signature in DefaultReads.class refers to term time
[error] in package java which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling DefaultReads.class.
[error] val ts = (json \ "RequestTimestamp").validate[String]
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
I was using the play json library to parse json files in my main class. And the content of the build.sbt file is like the following:
name := "json_parser"
version := "1.0"
scalaVersion := "2.10.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.4.8"
I am totally new to Scala and the Play framework. Does anyone have any clue about this problem? Thanks!

"Filename too long" in sbt assembly inside a docker container

I have a Scala Play project and I need to create a fat jar at docker build time, but I get this error:
[warn] Error extracting zip entry [...] (File name too long)
I tried adding the option scalacOptions ++= Seq("-Xmax-classfile-name","72") in build.sbt, but it doesn't work. I also tried appending -Xmax-classfile-name=72 to sbt assembly, with the same result.
As I need to do this at docker build time, I can't use a mounted volume as mentioned here: https://github.com/sbt/sbt-assembly/issues/69#issuecomment-196901781
What do I need to do to fix this?
In /project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
and in build.sbt
assemblyJarName in assembly := "jarname.jar"
target in assembly := baseDirectory.value
Then run the assembly command from the project root, and it should generate the jar file.
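Putting the two fragments together (the jar name is an example; writing the jar to the base directory gives it a stable path that a docker COPY can pick up):

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

// build.sbt
assemblyJarName in assembly := "jarname.jar"  // name of the fat jar
target in assembly := baseDirectory.value     // write it to the project root
```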

Modifying and Building Spark core

I am trying to make a modification to the Apache Spark source code. I created a new method and added it to the RDD.scala file within the Spark source code I downloaded. After making the modification to RDD.scala, I built Spark using
mvn -Dhadoop.version=2.2.0 -DskipTests clean package
I then created a sample Scala Spark Application as mentioned here
I tried using the new function I created, and I got a compilation error when using sbt to build a jar of my application. How exactly do I compile Spark with my modification and attach the modified jar to my project? The file I modified is RDD.scala within the core project. I run sbt package from the root directory of my Spark application project.
Here is the sbt file:
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.3.0"
Here is the error:
sbt package
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
[info] Set current project to Noah Spark (in build file:/Users/r/Downloads/spark-proj/n-spark/)
[info] Updating {file:/Users/r/Downloads/spark-proj/n-spark/}n-spark...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/r/Downloads/spark-proj/n-spark/target/scala-2.11/classes...
[error] /Users/r/Downloads/spark-proj/n-spark/src/main/scala/SimpleApp.scala:11: value reducePrime is not a member of org.apache.spark.rdd.RDD[Int]
[error] logData.reducePrime(_+_);
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 24 s, completed Apr 11, 2015 2:24:03 AM
UPDATE
Here is the updated sbt file
name := "N Spark"
version := "1.0"
scalaVersion := "2.10"
libraryDependencies += "org.apache.spark" % "1.3.0"
I get the following error for this file:
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
/Users/Raggy/Downloads/spark-proj/noah-spark/simple.sbt:7: error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.impl.GroupArtifactID] found,
so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += "org.apache.spark" % "1.3.0"
Delete libraryDependencies from build.sbt and just copy the custom-built Spark jar to the lib directory in your application project.
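A minimal sketch of the resulting build.sbt, assuming the custom-built Spark jar has been copied into the project's lib/ directory (sbt treats jars in lib/ as unmanaged dependencies and puts them on the compile classpath automatically):

```scala
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"  // must match the Scala version the custom Spark build used
// no spark-core entry in libraryDependencies:
// the modified jar in lib/ supplies the org.apache.spark classes instead
```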

sbt-native-packager and RPM - how do I set required parameters?

I'm finding it difficult to build a Play project using the sbt native packager. I don't know where to set the RPM configuration when I am given the following error:
[error] `rpmVendor in Rpm` is empty. Please provide a valid vendor for the rpm SPEC.
[error] `packageSummary in Rpm` is empty. Please provide a valid summary for the rpm SPEC.
[error] `packageDescription in Rpm` is empty. Please provide a valid description for the rpm SPEC.
I've set the following in project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.8.0")
In my build.sbt:
name := """supersecretproject"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.1"
libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  ws
)
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.27"
javacOptions ++= Seq("-source", "1.6", "-target", "1.6")
tomcat()
The documentation merely states:
A rpm package needs some mandatory settings to be valid. Make sure you have these settings in your build:
rpmRelease := "1"

rpmVendor := "typesafe"

rpmUrl := Some("http://github.com/paulp/sbt-extras")

rpmLicense := Some("BSD")
Which is almost entirely useless if you don't know SBT very well! How do I "have these settings in your build", as the documentation instructs?
I've tried adding the above "settings" to build.sbt or a separate packageSettings.sbt but with no luck as I just get the following error:
error: not found: value rpmRelease
rpmRelease := "1"
^
[error] Type error in expression
Note: I run the sbt using sbt rpm:packageBin
It sounds like the developers of that plugin are trying to not be too prescriptive, but in doing so have not given you enough information to even get started! :-(
The simplest possible solution: Copy those four settings (including the blank lines between) into your build.sbt.
A logical place is probably towards the bottom of the file, as "packaging" your app is something that happens "towards the end" of the development cycle.
Another option: SBT automatically combines the contents of all .sbt files it finds in the project root. So if you prefer, you could create a new file such as packagingSettings.sbt and put those settings in there.
Edit: Help with imports:
Whichever option you choose, you'll need to add the following imports at the top of the file (as per the getting started guide):
import com.typesafe.sbt.SbtNativePackager._
import NativePackagerKeys._
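Put together, the packaging portion of build.sbt might look like the sketch below. The rpm* values are the documentation's own examples; the packageSummary and packageDescription values are placeholders I added to satisfy the other two error messages, so substitute your own:

```scala
import com.typesafe.sbt.SbtNativePackager._
import NativePackagerKeys._

// mandatory RPM metadata; rpm:packageBin fails without these
rpmRelease := "1"
rpmVendor := "typesafe"
rpmUrl := Some("http://github.com/paulp/sbt-extras")
rpmLicense := Some("BSD")

// placeholders for the summary/description errors
packageSummary := "Super secret project"
packageDescription := "A Play application packaged as an RPM"
```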