I have a problem with scala.tools.nsc
Scala file
Here I use the parser functionality to build an abstract syntax tree for the code 2 + 3:
import scala.tools.nsc._

object Main extends App {
  var i = new Interpreter
  println(i.parse("2 + 3"))
}
SBT configuration
name := "scalaSample"
version := "1.0-SNAPSHOT"
scalaVersion := "2.9.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "1.7.1" % "test"
libraryDependencies += "org.scala-lang" % "scala-compiler" % "2.9.1"
Error
Failed to initialize compiler: object scala not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
[error] (run-main) java.lang.NullPointerException
java.lang.NullPointerException
  at scala.tools.nsc.CompilationUnits$CompilationUnit.<init>(CompilationUnits.scala:16)
  at scala.tools.nsc.interpreter.ExprTyper$codeParser$.applyRule(ExprTyper.scala:22)
  at scala.tools.nsc.interpreter.ExprTyper$codeParser$.stmts(ExprTyper.scala:36)
  at scala.tools.nsc.interpreter.ExprTyper$$anonfun$parse$2.apply(ExprTyper.scala:47)
  at scala.tools.nsc.interpreter.ExprTyper$$anonfun$parse$2.apply(ExprTyper.scala:46)
  at scala.tools.nsc.reporters.Reporter.withIncompleteHandler(Reporter.scala:46)
  at scala.tools.nsc.interpreter.ExprTyper$class.parse(ExprTyper.scala:46)
  at scala.tools.nsc.interpreter.IMain$exprTyper$.parse(IMain.scala:1012)
  at scala.tools.nsc.interpreter.IMain.parse(IMain.scala:1013)
  at eu.semantiq.scalaToJS.Main$delayedInit$body.apply(Main.scala:7)
  at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
  at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
  at scala.App$$anonfun$main$1.apply(App.scala:60)
  at scala.App$$anonfun$main$1.apply(App.scala:60)
  at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
  at scala.collection.immutable.List.foreach(List.scala:45)
  at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:30)
  at scala.App$class.main(App.scala:60)
  at eu.semantiq.scalaToJS.Main$.main(Main.scala:5)
  at eu.semantiq.scalaToJS.Main.main(Main.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:616)
java.lang.RuntimeException: Nonzero exit code: 1
  at scala.sys.package$.error(package.scala:27)
In the Scala REPL everything works:
Welcome to Scala version 2.9.0.1 (OpenJDK 64-Bit Server VM, Java 1.6.0_23).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import scala.tools.nsc._
import scala.tools.nsc._
scala> var i = new Interpreter
warning: there were 4 deprecation warnings; re-run with -deprecation for details
warning: there were 1 deprecation warnings; re-run with -deprecation for details
i: scala.tools.nsc.Interpreter = scala.tools.nsc.Interpreter@786bfd73
scala> println(i.parse("2 + 3"))
Some(List(2.$plus(3)))
I feel really sorry for my bad English
According to xsbt's FAQ:
sbt runs tests in the same JVM as sbt itself and Scala classes are not
in the same class loader as the application classes.
And there's more:
The key is to initialize the Settings for the interpreter using
embeddedDefaults.
The example that is given there uses some arbitrary type MyType. In fact, you can use any of your types to help sbt find the appropriate class loader (see this answer).
Hence, your code should look like this:
import scala.tools.nsc._

trait Foo // Arbitrary type added to get stuff working

object Main extends App {
  val settings = new Settings
  settings.embeddedDefaults[Foo]
  val interpreter = new Interpreter(settings)
  println(interpreter.parse("2 + 3"))
}
Related
I am upgrading my version of Scala from 2.11 to 2.12, and the object mapper seems to break. Other parts of my code require features only available under 2.12.
This question, using scala 2.12 with spark 2.1, mentions rebuilding Jackson as a possible solution. Is this truly necessary, or is there a simpler solution?
SBT configuration for Scala 2.11.0
// Identity
name := "ScalaJsonSpike00"
organization := "com.acme"
// Versions
version := "1.0"
scalaVersion := "2.11.0"
// Scala test
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
// JSON
// https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.8"
// https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-scala_2.11
libraryDependencies += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.8"
Code for both 2.11 and 2.12
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper
object Main {
  def main(args: Array[String]): Unit = {
    val originalMap = Map("a" -> List(1,2), "b" -> List(3,4,5), "c" -> List())
    val mapper = new ObjectMapper() with ScalaObjectMapper
    mapper.registerModule(DefaultScalaModule)
    println(mapper.writeValueAsString(originalMap))
  }
}
Result with Scala 2.11.0
{"a":[1,2],"b":[3,4,5],"c":[]}
SBT configuration update to scala 2.12.1
scalaVersion := "2.12.1"
Result with Scala 2.12.1
com.intellij.rt.execution.application.AppMain Main
Exception in thread "main" java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper.$init$(Lcom/fasterxml/jackson/module/scala/experimental/ScalaObjectMapper;)V
at Main$$anon$1.<init>(Main.scala:9)
at Main$.main(Main.scala:9)
at Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
You should use %% to get the version of jackson-module-scala appropriate for your Scala version:
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.8.8"
With your current configuration you always use the Scala 2.11 version of this module, which is not binary compatible with Scala 2.12.
There is no such issue with the core Jackson libraries, as those are Java libraries and therefore not affected by the Scala version in any way.
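For reference, a minimal sketch of what %% buys you: with scalaVersion := "2.12.1" the cross-versioned form resolves to the _2.12 artifact, so the two declarations below are equivalent.
// With scalaVersion := "2.12.1" these two lines pull the same artifact
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.8.8"
libraryDependencies += "com.fasterxml.jackson.module" % "jackson-module-scala_2.12" % "2.8.8"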
The solution I found that works is to copy all the code from ScalaObjectMapper into a Scala class that extends ObjectMapper, and then instantiate that.
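A rough, illustrative skeleton of that approach (the class name is hypothetical and the copied trait body is omitted here):
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical class: paste the members of ScalaObjectMapper into this body
class MyScalaObjectMapper extends ObjectMapper {
  // ... methods copied from com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper ...
}

val mapper = new MyScalaObjectMapper
mapper.registerModule(DefaultScalaModule)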
I have Scala code like below:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark._
object RecipeIO {
  val sc = new SparkContext(new SparkConf().setAppName("Recipe_Extraction"))

  def read(INPUT_PATH: String): org.apache.spark.rdd.RDD[(String)] = {
    val data = sc.wholeTextFiles("INPUT_PATH")
    val files = data.map { case (filename, content) => filename }
    (files)
  }
}
When I compile this code using sbt it gives me the error:
value wholeTextFiles is not a member of org.apache.spark.SparkContext.
I am importing everything that is required, but it's still giving me this error.
But when I replace wholeTextFiles with textFile, the code compiles.
What might be the problem here and how do I resolve that?
Thanks in advance!
Environment:
Scala compiler version 2.10.2
spark-1.2.0
Error:
[info] Set current project to RecipeIO (in build file:/home/akshat/RecipeIO/)
[info] Compiling 1 Scala source to /home/akshat/RecipeIO/target/scala-2.10.4/classes...
[error] /home/akshat/RecipeIO/src/main/scala/RecipeIO.scala:14: value wholeTexFiles is not a member of org.apache.spark.SparkContext
[error] val data = sc.wholeTexFiles(INPUT_PATH)
[error] ^
[error] one error found
[error] {file:/home/akshat/RecipeIO/}default-55aff3/compile:compile: Compilation failed
[error] Total time: 16 s, completed Jun 15, 2015 11:07:04 PM
My build.sbt file looks like this:
name := "RecipeIO"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"
libraryDependencies += "org.eclipse.jetty" % "jetty-server" % "8.1.2.v20120308"
ivyXML :=
<dependency org="org.eclipse.jetty.orbit" name="javax.servlet" rev="3.0.0.v201112011016">
<artifact name="javax.servlet" type="orbit" ext="jar"/>
</dependency>
You have a typo: it should be wholeTextFiles instead of wholeTexFiles.
As a side note, I think you want sc.wholeTextFiles(INPUT_PATH) and not sc.wholeTextFiles("INPUT_PATH") if you really want to use the INPUT_PATH variable.
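Putting both fixes together, here is a minimal sketch of the corrected read method inside the same RecipeIO object, assuming the resolved spark-core version actually provides wholeTextFiles (it arrived in Spark 1.0, while the build above pins 0.9.0-incubating):
  // Use the INPUT_PATH parameter rather than the string literal "INPUT_PATH"
  def read(INPUT_PATH: String): org.apache.spark.rdd.RDD[String] = {
    val data = sc.wholeTextFiles(INPUT_PATH)
    data.map { case (filename, content) => filename }
  }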
Running with a simple SBT project with Java 7 (details below) and invoking sbt run at the command line (no IntelliJ or anything).
source
import scala.tools.nsc.{ Global, Settings }
object Playground extends App {
  val compiler = new Global(new Settings())
  val testFiles = List("Test.scala")
  val runner = new compiler.Run()
  val result = runner.compile(testFiles)
  println(result)
}
error
error: error while loading Object, Missing dependency 'object scala in compiler mirror', required by /Library/Java/JavaVirtualMachines/jdk1.7.0_60.jdk/Contents/Home/jre/lib/rt.jar(java/lang/Object.class)
[error] (run-main-0) scala.reflect.internal.MissingRequirementError: object scala in compiler mirror not found.
scala.reflect.internal.MissingRequirementError: object scala in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:173)
at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackage$lzycompute(Definitions.scala:161)
at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackage(Definitions.scala:161)
at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackageClass$lzycompute(Definitions.scala:162)
at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackageClass(Definitions.scala:162)
at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1388)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1053)
<etc...>
build.sbt
scalaVersion := "2.11.4"
val scalaV = "2.11.4"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-compiler" % scalaV,
"org.scala-lang" % "scala-library" % scalaV,
"org.scala-lang" % "scala-reflect" % scalaV
)
java
$ java -version
java version "1.7.0_60-ea"
Java(TM) SE Runtime Environment (build 1.7.0_60-ea-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.60-b09, mixed mode)
This is the one where you have to say:
trait Probe
object Playground extends App {
  //val compiler = new Global(new Settings())
  val s = new Settings()
  s.embeddedDefaults[Probe]
  val compiler = new Global(s)
  val testFiles = List("Test.scala")
  val runner = new compiler.Run()
  val result = runner.compile(testFiles)
  println(result)
}
That took me a couple of minutes. That method name, "embeddedDefaults", is as cryptic as any to come out of sbt.
The comment on MutableSettings (which suggests a side effect):
/** Initializes these settings for embedded use by type `T`.
* The class loader defining `T` should provide resources `app.class.path`
* and `boot.class.path`. These resources should contain the application
* and boot classpaths in the same form as would be passed on the command line.*/
The indentation is as in the source code.
I hit the same problem.
settings.usejavacp.value = true
solved the problem for me!
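For context, a minimal sketch of where that flag goes, assuming the same Global/Settings setup as in the question:
import scala.tools.nsc.{ Global, Settings }

val settings = new Settings()
settings.usejavacp.value = true // let the embedded compiler reuse the JVM's classpath
val compiler = new Global(settings)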
@som-snytt's solution worked for me on a clean sbt project. It didn't work on an akka-http project. This is the manual solution I found (hardcoded paths; adjust them to your environment or put them in a config file).
It just tells the compiler where to find the Scala libraries for compilation:
val settings = new Settings()
//didn't need this one:// settings.embeddedDefaults[Probe]
settings.classpath.value = "/home/oz/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.11.8.jar:/home/oz/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.8.jar:/home/oz/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.8.jar:/home/oz/.ivy2/cache/org.scala-lang.modules/scala-xml_2.11/bundles/scala-xml_2.11-1.0.4.jar:/home/oz/.ivy2/cache/org.scala-lang.modules/scala-parser-combinators_2.11/bundles/scala-parser-combinators_2.11-1.0.4.jar"
settings.bootclasspath append "/home/oz/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.8.jar:/home/oz/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.11.8.jar:/home/oz/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.8.jar:/home/oz/.ivy2/cache/org.scala-lang.modules/scala-xml_2.11/bundles/scala-xml_2.11-1.0.4.jar:/home/oz/.ivy2/cache/org.scala-lang.modules/scala-parser-combinators_2.11/bundles/scala-parser-combinators_2.11-1.0.4.jar:/home/oz/.ivy2/cache/jline/jline/jars/jline-2.12.1.jar"
I resolved it; the cause was this Maven dependency:
<dependency>
    <groupId>com.haizhi.spark</groupId>
    <artifactId>spark-assembly</artifactId>
    <version>1.6.1</version>
</dependency>
I removed this dependency, and then it worked.
Consider the following sample code: it writes a file to MongoDB and then tries to read it back.
import com.mongodb.casbah.Imports._
import com.mongodb.casbah.gridfs.Imports._
object TestGridFS {
  def main(args: Array[String]) {
    val mongoConn = MongoConnection()
    val mongoDB = mongoConn("gridfs_test")
    val gridfs = GridFS(mongoDB) // creates a GridFS handle on ``fs``

    val xls = new java.io.FileInputStream("ok.xls")
    val savedFile = gridfs.createFile(xls)
    savedFile.filename = "ok.xls"
    savedFile.save
    println("savedfile id: %s".format(savedFile._id.get))

    val file = gridfs.findOne(savedFile._id.get)
    val bytes = file.get.source.map(_.toByte).toArray
    println(bytes)
  }
}
This yields:
gridfs $ sbt run
[info] Loading global plugins from /Users/jean/.sbt/plugins
[info] Set current project to gridfs-test (in build file:/Users/jean/dev/sdev/src/perso/gridfs/)
[info] Running TestGridFS
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
savedfile id: 504c8cce0364a7cd145d5dc1
[error] (run-main) java.nio.charset.MalformedInputException: Input length = 1
java.nio.charset.MalformedInputException: Input length = 1
at java.nio.charset.CoderResult.throwException(CoderResult.java:260)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:319)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:158)
at java.io.InputStreamReader.read(InputStreamReader.java:167)
at java.io.BufferedReader.fill(BufferedReader.java:136)
at java.io.BufferedReader.read(BufferedReader.java:157)
at scala.io.BufferedSource$$anonfun$iter$1$$anonfun$apply$mcI$sp$1.apply$mcI$sp(BufferedSource.scala:38)
at scala.io.Codec.wrap(Codec.scala:64)
at scala.io.BufferedSource$$anonfun$iter$1.apply(BufferedSource.scala:38)
at scala.io.BufferedSource$$anonfun$iter$1.apply(BufferedSource.scala:38)
at scala.collection.Iterator$$anon$14.next(Iterator.scala:148)
at scala.collection.Iterator$$anon$25.hasNext(Iterator.scala:463)
at scala.collection.Iterator$$anon$19.hasNext(Iterator.scala:334)
at scala.io.Source.hasNext(Source.scala:238)
at scala.collection.Iterator$$anon$19.hasNext(Iterator.scala:334)
at scala.collection.Iterator$class.foreach(Iterator.scala:660)
at scala.collection.Iterator$$anon$19.foreach(Iterator.scala:333)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:99)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:250)
at scala.collection.Iterator$$anon$19.toBuffer(Iterator.scala:333)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:237)
at scala.collection.Iterator$$anon$19.toArray(Iterator.scala:333)
at TestGridFS$.main(test.scala:15)
at TestGridFS.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[error] {file:/Users/jean/dev/sdev/src/perso/gridfs/}default-b6ab90/compile:run: Nonzero exit code: 1
[error] Total time: 1 s, completed 9 sept. 2012 14:34:22
I don't understand what the charset problem can be; I just wrote the file to the database. When querying the database I DO see the files and chunks in there, but can't seem to be able to read them.
I tried this with Mongo 2.0 and 2.2, and Casbah 2.4 and 3.0.0-M2, to no avail, and I don't see what I could do to get the bytes. This is on Mac OS X Mountain Lion.
PS: To run the test, you can use the following build.sbt
name := "gridfs-test"
version := "1.0"
scalaVersion := "2.9.1"
libraryDependencies += "org.mongodb" %% "casbah" % "2.4.1"
libraryDependencies += "org.mongodb" %% "casbah-gridfs" % "2.4.1"
resolvers ++= Seq("Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases/",
"sonatype release" at "https://oss.sonatype.org/content/repositories/releases",
"OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/")
(The stack trace I get is shown above.)
I found a way to read the file contents back from MongoDB. The source method relies on underlying.inputStream, which is defined in GridFSDBFile.
Every test I did which uses underlying.inputStream failed with the same error.
However, the API offers another way to access the files: writeTo. writeTo does not use underlying.inputStream.
Here is the "fixed" code from the question :
import com.mongodb.casbah.Imports._
import com.mongodb.casbah.gridfs.Imports._
object TestGridFS {
  def main(args: Array[String]) {
    val mongoConn = MongoConnection()
    val mongoDB = mongoConn("gridfs_test")
    val gridfs = GridFS(mongoDB) // creates a GridFS handle on ``fs``

    val xls = new java.io.File("ok.xls")
    val savedFile = gridfs.createFile(xls)
    savedFile.filename = "ok.xls"
    savedFile.save
    println("savedfile id: %s".format(savedFile._id.get))

    val file = gridfs.findOne(savedFile._id.get)
    val byteArrayOutputStream = new java.io.ByteArrayOutputStream()
    file.map(_.writeTo(byteArrayOutputStream))
    byteArrayOutputStream.toByteArray
  }
}
The last line, byteArrayOutputStream.toByteArray, gives you an array of bytes which can then be used however you see fit.
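For example, a small hypothetical follow-up that writes the recovered bytes back to disk (the target filename is purely illustrative):
val bytes = byteArrayOutputStream.toByteArray
val out = new java.io.FileOutputStream("ok-copy.xls") // hypothetical output path
out.write(bytes)
out.close()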