Scala delimited continuations error at runtime - scala

Scala newbie here. I just downloaded Eclipse 3.6.2 and Scala IDE 2.0.0-beta4 (with Scala 2.9.0.final). I created a new Scala project to try delimited continuations:
package delimCCTests

import scala.util.continuations._

object Test extends App {
  val result = reset {
    1 + shift { k: (Int => Int) => k(k(5)) } + 1
  }
  println(result)
}
This compiles fine, then I click Run as -> Scala application and get this exception:
Exception in thread "main" java.lang.NoSuchMethodError: scala.util.continuations.package$.shift(Lscala/Function1;)Ljava/lang/Object;
at delimCCTests.Test$$anonfun$1.apply$mcI$sp(DelimCCTests.scala:7)
at delimCCTests.Test$$anonfun$1.apply(DelimCCTests.scala:7)
at delimCCTests.Test$$anonfun$1.apply(DelimCCTests.scala:7)
at scala.util.continuations.package$.reset(package.scala:20)
at delimCCTests.Test$delayedInit$body.apply(DelimCCTests.scala:6)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:60)
at scala.App$$anonfun$main$1.apply(App.scala:60)
at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
at scala.collection.immutable.List.foreach(List.scala:45)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:30)
at scala.App$class.main(App.scala:60)
at delimCCTests.Test$.main(DelimCCTests.scala:5)
at delimCCTests.Test.main(DelimCCTests.scala)
What am I doing wrong? Am I missing some configuration?
BTW I thought the compiler inferred the type of the continuation? This article uses:
val result = reset {
  1 + shift { k => k(k(5)) } + 1
}
but this doesn't compile in my environment...

This error means that you didn't add the Scala CPS compiler plugin - it's not part of the standard distribution (so far). Put the jar on the classpath, and run scala as follows to have continuations enabled:
$ scala -P:continuations:enable
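If you build with sbt instead, the equivalent is roughly the following (a sketch for Scala 2.9.x; the plugin's artifact coordinates changed across Scala versions, so match the version to your scalaVersion):

```scala
// build.sbt — sketch, assuming Scala 2.9.x coordinates for the CPS plugin
autoCompilerPlugins := true

addCompilerPlugin("org.scala-lang.plugins" % "continuations" % "2.9.0")

scalacOptions += "-P:continuations:enable"
```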

In Eclipse this can be solved by adding the CPS plugin class in the Scala Compiler > Advanced section and enabling the switch:
Xplugin should be scala.tools.selectivecps.SelectiveCPSPlugin, and Xpluginsdir should be the directory that contains org.scala-lang.plugins.scala-continuations-plugin.jar.
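For reference, the example should evaluate to 9, and you can check the arithmetic in plain Scala without the plugin: the continuation k captured by shift is the rest of the reset block, i.e. x => 1 + x + 1:

```scala
object ContinuationByHand {
  def main(args: Array[String]): Unit = {
    // k is the "rest of the computation" around shift: x => 1 + x + 1
    val k: Int => Int = x => 1 + x + 1
    // shift's body k(k(5)) becomes the value of the whole reset block
    println(k(k(5))) // k(5) = 7, then k(7) = 9, so this prints 9
  }
}
```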

Related

How to create ROM with VecInit(Array()) in Chisel?

I'm trying to declare a «rom» with VecInit() like this:
val GbColors = VecInit(Array(GB_GREEN0, GB_GREEN1, GB_GREEN2, GB_GREEN3))
With GB_GREENx declared like this:
class VgaColors extends Bundle {
  val red = UInt(6.W)
  val green = UInt(6.W)
  val blue = UInt(6.W)
}
//...
object GbConst {
  //...
  /* "#9BBC0F" */
  val GB_GREEN0 = (new VgaColors()).Lit(_.red -> "h26".U(6.W),
                                        _.green -> "h2F".U(6.W),
                                        _.blue -> "h03".U(6.W))
  /* "#8BAC0F" */
  val GB_GREEN1 = (new VgaColors()).Lit(_.red -> "h1E".U(6.W),
                                        _.green -> "h27".U(6.W),
                                        _.blue -> "h03".U(6.W))
  /* "#306230" */
  val GB_GREEN2 = (new VgaColors()).Lit(_.red -> "h0C".U(6.W),
                                        _.green -> "h18".U(6.W),
                                        _.blue -> "h0C".U(6.W))
  /* "#0F380F" */
  val GB_GREEN3 = (new VgaColors()).Lit(_.red -> "h03".U(6.W),
                                        _.green -> "h0E".U(6.W),
                                        _.blue -> "h03".U(6.W))
}
I can't manage to use GbColors as an indexable Vec:
io.vga_color := GbColors(io.mem_data)
It generates a Java stack trace:
[info] [0.004] Elaborating design...
[error] chisel3.internal.ChiselException: Connection between sink (VgaColors(IO in unelaborated MemVga)) and source (VgaColors(Wire in GbWrite)) failed #.blue: Sink or source unavailable to current module.
[error] ...
[error] at gbvga.MemVga.$anonfun$new$42(memvga.scala:87)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] at chisel3.WhenContext.<init>(When.scala:80)
[error] at chisel3.when$.apply(When.scala:32)
[error] at gbvga.MemVga.<init>(memvga.scala:86)
[error] at gbvga.GbVga.$anonfun$memvga$1(gbvga.scala:24)
[error] at chisel3.Module$.do_apply(Module.scala:54)
[error] at gbvga.GbVga.<init>(gbvga.scala:24)
[error] at gbvga.GbVgaDriver$.$anonfun$new$9(gbvga.scala:53)
[error] ... (Stack trace trimmed to user code only, rerun with --full-stacktrace if you wish to see the full stack trace)
...
To work around it, I have to use the switch(){is()} form:
switch(io.mem_data) {
  is("b00".U) {
    io.vga_color := GB_GREEN0
  }
  is("b01".U) {
    io.vga_color := GB_GREEN1
  }
  is("b10".U) {
    io.vga_color := GB_GREEN2
  }
  is("b11".U) {
    io.vga_color := GB_GREEN3
  }
}
But I think that's too verbose.
What is wrong with my VecInit() «rom»?
[edit]
My versions are :
$ java -version
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)
$ scala -version
Scala code runner version 2.11.7-20150420-135909-555f8f09c9 -- Copyright 2002-2013, LAMP/EPFL
In build.sbt :
val defaultVersions = Map(
  "chisel3" -> "3.4.0-RC1",
  "chisel-iotesters" -> "1.5.0-RC1",
  "chisel-formal" -> "0.1-SNAPSHOT",
)
I think the problem here is that the Bundles in GbConst are created outside of a Module. One potential fix is to make GbConst a trait and mix it into the Modules that need access to those values. (I have created a PR that seems to show this approach works, though it probably creates a lot of copies of the Bundles.) Another approach (that I have not tried) would be to create a Module that serves up all the Bundles as outputs, which should create fewer copies.
My PR also changed the chisel3 and chisel-testers dependencies to SNAPSHOTs.
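A minimal sketch of the trait approach, assuming chisel3 3.4+ (for BundleLiterals) and the VgaColors class from the question; the trait and module shown here are illustrative, not taken from the PR:

```scala
import chisel3._
import chisel3.experimental.BundleLiterals._

// Sketch: the constants live in a trait, so the Bundle literals are
// elaborated inside whichever Module mixes the trait in. Using defs
// (not vals) means each Module builds its own copies during elaboration.
trait GbConstants {
  private def green(r: String, g: String, b: String) =
    (new VgaColors).Lit(_.red -> r.U(6.W), _.green -> g.U(6.W), _.blue -> b.U(6.W))
  def GB_GREEN0 = green("h26", "h2F", "h03")
  def GB_GREEN1 = green("h1E", "h27", "h03")
  def GB_GREEN2 = green("h0C", "h18", "h0C")
  def GB_GREEN3 = green("h03", "h0E", "h03")
}

class MemVga extends Module with GbConstants {
  val io = IO(new Bundle {
    val mem_data  = Input(UInt(2.W))
    val vga_color = Output(new VgaColors)
  })
  // VecInit now runs inside the module, so plain indexing works
  private val GbColors = VecInit(GB_GREEN0, GB_GREEN1, GB_GREEN2, GB_GREEN3)
  io.vga_color := GbColors(io.mem_data)
}
```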

How to run Scala test in Scala native application?

I have a hello-world Scala Native app and wanted to run a small Scala test against it. I use the usual test command, but it throws an exception:
NativeMain.scala
object NativeMain {
  val p = new Person("xxxx")

  def main(args: Array[String]): Unit = {
    println("Hello world")
  }
}

class Person(var name: String)
NativeTest.scala
import org.scalatest.{FlatSpec, Matchers}
class NativeTest extends FlatSpec with Matchers {
"name" should "the name is set correctly in constructor" in {
assert(NativeMain.p.name == "xxxx")
}
}
I ran the test command in the sbt shell and got this error:
[IJ]> test
[info] Compiling 1 Scala source to /home/****/Documents/ScalaNativeFresh/target/scala-2.11/classes...
[info] Compiling 1 Scala source to /home/****/Documents/ScalaNativeFresh/target/scala-2.11/test-classes...
[info] Compiling 1 Scala source to /home/****/Documents/ScalaNativeFresh/target/scala-2.11/test-classes...
[info] Linking (28516 ms)
[error] cannot link: #java.lang.Thread::getStackTrace_scala.scalanative.runtime.ObjectArray
[error] unable to link
[error] (nativetest:nativeLink) unable to link
[error] Total time: 117 s, completed Apr 2, 2019 3:04:24 PM
Any help or suggestions? Thank you :)
There is an open issue for this, Add support for Scala Native #1112, and according to cheeseng:
3.1.0-SNAP6 and 3.2.0-SNAP10 are the only 2 versions (as of the time of writing) that support scala-native
Try adding the scalatest_native0.3_2.11 dependency like so:
libraryDependencies += "org.scalatest" % "scalatest_native0.3_2.11" % "3.2.0-SNAP10"
scalatest-native-example is a working example showing how to use scalatest with scala-native.

scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found

I'm trying to build a Spark Streaming application using sbt package, and I can't figure out the cause of this error.
Here is part of the error:
scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
and here is the code
import org.apache.spark.SparkContext
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.twitter._
import twitter4j.Status

object TrendingHashTags {
  def main(args: Array[String]): Unit = {
    val Array(consumerKey, consumerSecret, accessToken, accessTokenSecret,
      lang, batchInterval, minThreshold, showCount) = args.take(8)
    val filters = args.takeRight(args.length - 8)
    System.setProperty("twitter4j.oauth.consumerKey", consumerKey)
    System.setProperty("twitter4j.oauth.consumerSecret", consumerSecret)
    System.setProperty("twitter4j.oauth.accessToken", accessToken)
    System.setProperty("twitter4j.oauth.accessTokenSecret", accessTokenSecret)
    val conf = new SparkConf().setAppName("TrendingHashTags")
    val ssc = new StreamingContext(conf, Seconds(batchInterval.toInt))
    val tweets = TwitterUtils.createStream(ssc, None, filters)
    val tweetsFilteredByLang = tweets.filter { tweet => tweet.getLang() == lang }
    val statuses = tweetsFilteredByLang.map { tweet => tweet.getText() }
    val words = statuses.flatMap { status => status.split("""\s+""") }
    val hashTags = words.filter { word => word.startsWith("#") }
    val hashTagPairs = hashTags.map { hashtag => (hashtag, 1) }
    val tagsWithCounts = hashTagPairs.updateStateByKey(
      (counts: Seq[Int], prevCount: Option[Int]) =>
        prevCount.map { c => c + counts.sum }.orElse { Some(counts.sum) }
    )
    val topHashTags = tagsWithCounts.filter { case (t, c) =>
      c > minThreshold.toInt
    }
    val sortedTopHashTags = topHashTags.transform { rdd =>
      rdd.sortBy({ case (w, c) => c }, false)
    }
    sortedTopHashTags.print(showCount.toInt)
    ssc.start()
    ssc.awaitTermination()
  }
}
I solved this issue: I found that I was using Java 9, which isn't compatible with my Scala version, so I migrated from Java 9 to Java 8.
The error means that Scala was compiled with a version of Java different from the current one.
I am using maven instead of sbt, but the same behavior is observed.
Find the java version:
> /usr/libexec/java_home -V
Matching Java Virtual Machines (2):
15.0.1, x86_64: "OpenJDK 15.0.1" /Users/noname/Library/Java/JavaVirtualMachines/openjdk-15.0.1/Contents/Home
1.8.0_271, x86_64: "Java SE 8" /Library/Java/JavaVirtualMachines/jdk1.8.0_271.jdk/Contents/Home
If you installed Scala while you were on a Java version above 1.8 and then downgraded (edited $JAVA_HOME to point to 1.8), you will get this error.
Check the Scala version being used by the project:
$ ls -l /Users/noname/.m2/repository/org/scala-lang/scala-library/2.11.11/scala-library-2.11.11.jar
-rwxrwxrwx 1 noname staff 0 Nov 17 03:41 /Users/noname/.m2/repository/org/scala-lang/scala-library/2.11.11/scala-library-2.11.11.jar
To rectify the issue, remove the scala jar file:
$ rm /Users/noname/.m2/repository/org/scala-lang/scala-library/2.11.11/scala-library-2.11.11.jar
Now execute mvn clean install again and the project will compile.
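As a quick sanity check from code, the standard library exposes the running Java version and the Scala library version:

```scala
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Both values come from scala.util.Properties in the standard library
    println("Java:  " + scala.util.Properties.javaVersion)
    println("Scala: " + scala.util.Properties.versionString)
  }
}
```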
I faced this issue when I had to downgrade my project's Scala version to use a dependency compiled against a lower Scala version, and could not resolve it even after making sure the JDK and all other dependencies were compatible with the downgraded Scala library version.
As @ForeverLearner mentioned above, deleting from the Maven repo (/Users/<>/.m2/repository/org/scala-lang/scala-library/...) the Scala library versions higher than the one now used to compile the project got rid of this error for me.
The above fix (setting Java 8) resolved my issue as well. If you are using IntelliJ, you can go to Project Settings and, under Project, change the Project SDK to 1.8.

Scala UDF: java.lang.NoClassDefFoundError: scala/ScalaObject

I am trying to use Scala for UDFs, but the Pig job fails with the error "java.lang.NoClassDefFoundError: scala/ScalaObject". What am I doing wrong?
$ cat NonEmpty.scala
package nonempty

import org.apache.pig.FilterFunc
import org.apache.pig.data._

class NonEmpty extends FilterFunc {
  def exec(input: Tuple) = {
    val s = input.get(0)
    s match {
      case a: String => !a.isEmpty
      case _ => false
    }
  }
}
$ cat ex3.pig
register ./nonempty.jar;
register ./scala-library.jar;
define NonEmpty nonempty.NonEmpty();
raw = load 'excite-small.log' using PigStorage('\t') as (user: chararray, time: chararray, query: chararray);
locations = filter raw by NonEmpty(query);
Build:
scalac -cp ~/pig-0.9.2/pig-0.9.2.jar NonEmpty.scala
jar -cf nonempty.jar nonempty
Pig Stack Trace:
---------------
ERROR 2998: Unhandled internal error. scala/ScalaObject

java.lang.NoClassDefFoundError: scala/ScalaObject
(...)
ScalaObject is located in scala-library.jar, which needs to be on the runtime classpath. So add scala-library.jar to the runtime classpath of the command that runs the program.
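For example, assuming the stock Pig launcher script (which appends the PIG_CLASSPATH environment variable to Pig's own classpath):

```shell
# Make scala-library.jar visible to the JVM running Pig itself,
# not only to the UDF jars resolved through `register`
export PIG_CLASSPATH=./scala-library.jar
pig ex3.pig
```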

Scala SBT: scala.tools.nsc isn't running

I have a problem with scala.tools.nsc under sbt.
Scala file
Here I use the parser functionality to build an abstract syntax tree of the code 2 + 3:
import scala.tools.nsc._

object Main extends App {
  var i = new Interpreter
  println(i.parse("2 + 3"))
}
SBT configuration
name := "scalaSample"

version := "1.0-SNAPSHOT"

scalaVersion := "2.9.1"

libraryDependencies += "org.scalatest" %% "scalatest" % "1.7.1" % "test"

libraryDependencies += "org.scala-lang" % "scala-compiler" % "2.9.1"
Error
Failed to initialize compiler: object scala not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
[error] (run-main) java.lang.NullPointerException
java.lang.NullPointerException
at scala.tools.nsc.CompilationUnits$CompilationUnit.<init>(CompilationUnits.scala:16)
at scala.tools.nsc.interpreter.ExprTyper$codeParser$.applyRule(ExprTyper.scala:22)
at scala.tools.nsc.interpreter.ExprTyper$codeParser$.stmts(ExprTyper.scala:36)
at scala.tools.nsc.interpreter.ExprTyper$$anonfun$parse$2.apply(ExprTyper.scala:47)
at scala.tools.nsc.interpreter.ExprTyper$$anonfun$parse$2.apply(ExprTyper.scala:46)
at scala.tools.nsc.reporters.Reporter.withIncompleteHandler(Reporter.scala:46)
at scala.tools.nsc.interpreter.ExprTyper$class.parse(ExprTyper.scala:46)
at scala.tools.nsc.interpreter.IMain$exprTyper$.parse(IMain.scala:1012)
at scala.tools.nsc.interpreter.IMain.parse(IMain.scala:1013)
at eu.semantiq.scalaToJS.Main$delayedInit$body.apply(Main.scala:7)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:60)
at scala.App$$anonfun$main$1.apply(App.scala:60)
at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
at scala.collection.immutable.List.foreach(List.scala:45)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:30)
at scala.App$class.main(App.scala:60)
at eu.semantiq.scalaToJS.Main$.main(Main.scala:5)
at eu.semantiq.scalaToJS.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
In the Scala REPL everything works:
Welcome to Scala version 2.9.0.1 (OpenJDK 64-Bit Server VM, Java 1.6.0_23).
Type in expressions to have them evaluated. Type :help for more information.

scala> import scala.tools.nsc._
import scala.tools.nsc._

scala> var i = new Interpreter
warning: there were 4 deprecation warnings; re-run with -deprecation for details
warning: there were 1 deprecation warnings; re-run with -deprecation for details
i: scala.tools.nsc.Interpreter = scala.tools.nsc.Interpreter#786bfd73

scala> println(i.parse("2 + 3"))
Some(List(2.$plus(3)))
I feel really sorry for my bad English
According to xsbt's FAQ:
sbt runs tests in the same JVM as sbt itself and Scala classes are not
in the same class loader as the application classes.
And there's more:
The key is to initialize the Settings for the interpreter using
embeddedDefaults.
The example that is given there uses some arbitrary type MyType. In fact, you can use any of your types to help sbt find the appropriate class loader (see this answer).
Hence, your code should look like this:
import scala.tools.nsc._

trait Foo // Arbitrary type added to get stuff working

object Main extends App {
  val settings = new Settings
  settings.embeddedDefaults[Foo]
  val interpreter = new Interpreter(settings)
  println(interpreter.parse("2 + 3"))
}