Scala: delimited continuation issues

Here is some code I cannot get to compile.
My question is very simple: how do I fix the compilation error?
I tried running this from sbt and from the command prompt.
My sbt configuration is below. I did add references to the continuations plugin,
but that does not seem to help. See the error below.
I took the example from the scaladocs: http://www.scala-lang.org/files/archive/api/2.11.8/scala-continuations-library/#scala.util.continuations.package
object MainApp {
  def main(args: Array[String]): Unit = {
    import scala.util.continuations._

    val uuidGen: String = "UniqueValue"

    // The return type needs the @cps annotation (not a # type projection).
    def ask(prompt: String): Int @cps[Unit] =
      shift { k: (Int => Unit) =>
        val id = uuidGen
        // %s, since id is a String here (a %x conversion would fail at runtime)
        printf("%s\nrespond with: submit(%s, ...)\n", prompt, id)
      }

    def go =
      reset {
        println("Welcome!")
        val first = ask("Please give me a number")
        val second = ask("Please enter another number")
        printf("The sum of your numbers is: %d\n", first + second)
      }

    go
  }
}
The sbt configuration with the plugin references:
name := """scala-testing"""
version := "0.1.0"
scalaVersion := "2.11.2"
autoCompilerPlugins := true
addCompilerPlugin(
"org.scala-lang.plugins" % "scala-continuations-plugin_2.11.6" % "1.0.2")
libraryDependencies +=
"org.scala-lang.plugins" %% "scala-continuations-library" % "1.0.2"
fork in run := true
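A likely cause of the failure: the continuations plugin is cross-published against the full Scala version, so the hard-coded _2.11.6 suffix does not match scalaVersion := "2.11.2", and the plugin additionally does nothing unless it is enabled with a scalac flag. A minimal build.sbt sketch under those assumptions:
name := """scala-testing"""

version := "0.1.0"

scalaVersion := "2.11.2"

autoCompilerPlugins := true

// cross CrossVersion.full selects the plugin artifact matching the exact Scala version
addCompilerPlugin(
  "org.scala-lang.plugins" % "scala-continuations-plugin" % "1.0.2" cross CrossVersion.full)

libraryDependencies +=
  "org.scala-lang.plugins" %% "scala-continuations-library" % "1.0.2"

// the plugin must be switched on explicitly
scalacOptions += "-P:continuations:enable"

fork in run := true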

Related

IntelliJ Scala: Error: Could not find or load main class

I have an R&D project that reads data from Oracle and then writes to a Spark standalone cluster, using Scala and IntelliJ.
This is my build.sbt with its library dependencies:
name := "DB_Oracle_V07"
version := "0.1"
scalaVersion := "2.11.12"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.0"
My project's main class, without any Spark syntax:
package com.xxxx.spark
import java.io.FileWriter
import java.sql.{Connection, DriverManager}
import java.text.SimpleDateFormat
import java.util.Calendar
import scala.collection.mutable.ArrayBuffer
object query01 {

  var dateStart = ""

  case class DF_LOT_INFO(LOT_NUMBER: String, MACHINE: String, FACILITY: String, LOT_TYPE: String, REC_DATE: String, FILE_NAME: String)

  def main(args: Array[String]): Unit = {
    val cal = Calendar.getInstance
    cal.add(Calendar.DATE, 1)
    val date = cal.getTime
    val format1 = new SimpleDateFormat("yyyyMMdd_HHmmss")
    dateStart = format1.format(date)
    print_log("start")

    val url = "jdbc:oracle:thin:@TMDT1PEN.XXXX.XXXX.COM:1521:TMDT1"
    //val driver = "oracle.jdbc.OracleDriver"
    val driver = "oracle.jdbc.driver.OracleDriver"
    val username = "TMDB_XXXX"
    val password = "XXXXXXXX"
    val result = ArrayBuffer[String]()
    try {
      print_log("Class.forName start")
      val app_dir = System.getProperty("user.dir")
      print_log("current dir: " + app_dir)
      val java_class_path = System.getProperty("java.class.path")
      print_log("java_class_path: " + java_class_path)
      Class.forName(driver)
      print_log("Class.forName end")
      //DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver)
      val connection = DriverManager.getConnection(url, username, password)
      val statement = connection.createStatement
      val rs = statement.executeQuery("select * from tester")
      var i = 1
      while (rs.next) {
        val item = rs.getString("tester_name")
        println("data:" + item)
        print_log("data:" + item)
        result.append(item)
        i = i + 1
      }
      // values('aaaaa','bbbbb',sysdate)")  <- truncated in the original post
    }
    catch {
      //case unknown => println("Got this unknown exception: " + unknown)
      case unknown: Throwable => print_error_log("Unknown exception: " + unknown)
    }
    finally {
    }
    print_log("end")
  }

  def print_log(msg: String): Unit = {
    val fw = new FileWriter(dateStart + "_log.txt", true)
    try {
      fw.write("\n" + msg)
    }
    finally fw.close()
  }

  def print_error_log(msg: String): Unit = {
    val fw = new FileWriter(dateStart + "_error_log.txt", true)
    try {
      fw.write("\n" + msg)
    }
    finally fw.close()
  }
}
I build the artifact as usual and added ojdbc6.jar to my project .jar as my Oracle library.
But I fail to execute my project jar file, getting the error below:
Error: Could not find or load main class com.xxxx.spark.query01
Blindly, I removed all the extracted Spark jar files from my project .jar,
or created a new project without any library dependencies in build.sbt;
then I was able to execute my project .jar without error.
From this blind test I am sure the error is caused by the Spark libraries.
Can any expert advise me how to solve this error? Thanks for all :)
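A plausible explanation, for reference: when IntelliJ packs the extracted Spark jars into the artifact, the signature files from signed dependencies (META-INF/*.SF, *.DSA, *.RSA) end up in the uber jar and can make the JVM reject it, which surfaces as "Could not find or load main class". One hedged fix, assuming the jar is run via spark-submit so the cluster provides Spark itself, is to align the Spark module versions, mark them as provided, and build with sbt-assembly, discarding the offending META-INF entries:
// build.sbt sketch (assumes sbt-assembly is added in project/plugins.sbt)
name := "DB_Oracle_V07"

version := "0.1"

scalaVersion := "2.11.12"

// use one Spark version across modules and let the cluster provide them
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-hive" % "2.4.0" % "provided"
)

// drop signature files that would otherwise corrupt the uber jar
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}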

How to start a gRPC server/client using ScalaPB on Spark?

I have a problem running a gRPC server/client using ScalaPB on Spark.
It works fine when I run my code with "sbt run". I want to run this code on Spark because I will later load my Spark model to predict labels. But when I submit my jar to Spark, I get an error like this:
Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException:
No functional server found. Try adding a dependency on the grpc-netty artifact
This is my build.sbt:
scalaVersion := "2.11.7"

PB.targets in Compile := Seq(
  scalapb.gen() -> (sourceManaged in Compile).value
)

val scalapbVersion =
  scalapb.compiler.Version.scalapbVersion
val grpcJavaVersion =
  scalapb.compiler.Version.grpcJavaVersion

libraryDependencies ++= Seq(
  // protobuf
  "com.thesamet.scalapb" %% "scalapb-runtime" % scalapbVersion % "protobuf",
  // for grpc
  "io.grpc" % "grpc-netty" % grpcJavaVersion,
  "com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapbVersion
)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
Using shading still doesn't work:
assemblyShadeRules in assembly := Seq(ShadeRule.rename("com.google.**" -> "shadegoogle.@1").inAll)
And this is my main:
import java.util.logging.Logger

import io.grpc.{Server, ServerBuilder}
import org.apache.spark.ml.tuning.CrossValidatorModel
import org.apache.spark.sql.SparkSession
import testproto.test.{Email, EmailLabel, RouteGuideGrpc}

import scala.concurrent.{ExecutionContext, Future}

object HelloWorldServer {
  private val logger = Logger.getLogger(classOf[HelloWorldServer].getName)

  def main(args: Array[String]): Unit = {
    val server = new HelloWorldServer(ExecutionContext.global)
    server.start()
    server.blockUntilShutdown()
  }

  private val port = 50051
}

class HelloWorldServer(executionContext: ExecutionContext) { self =>
  private[this] var server: Server = null

  private def start(): Unit = {
    server = ServerBuilder.forPort(HelloWorldServer.port).addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext)).build.start
    HelloWorldServer.logger.info("Server started, listening on " + HelloWorldServer.port)
    sys.addShutdownHook {
      System.err.println("*** shutting down gRPC server since JVM is shutting down")
      self.stop()
      System.err.println("*** server shut down")
    }
  }

  private def stop(): Unit = {
    if (server != null) {
      server.shutdown()
    }
  }

  private def blockUntilShutdown(): Unit = {
    if (server != null) {
      server.awaitTermination()
    }
  }

  private class RouteGuideImpl extends RouteGuideGrpc.RouteGuide {
    override def getLabel(request: Email): Future[EmailLabel] = {
      val replay = EmailLabel(emailId = request.emailId, label = "aaaaa")
      Future.successful(replay)
    }
  }
}
Thanks.
It looks like grpc-netty is not found when an uber jar is made. Instead of using ServerBuilder, change your code to use io.grpc.netty.NettyServerBuilder.
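A sketch of that change (only the builder line differs; the merge strategy above discards the META-INF/services entries that gRPC normally uses to discover its Netty provider, so constructing the Netty server directly sidesteps that lookup):
import io.grpc.netty.NettyServerBuilder

// Build against the Netty transport explicitly instead of relying on
// the ServiceLoader-based provider discovery that the uber jar broke.
server = NettyServerBuilder
  .forPort(HelloWorldServer.port)
  .addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext))
  .build
  .start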

Akka.actor.dispatcher NoSuchMethod exception

I am trying to learn about Akka actors and I am running the following example. My problem is that when I run it through the IDEA IDE it works perfectly fine, but when I run it using the jar created by sbt assembly it throws java.lang.NoSuchMethodError: akka.actor.ActorSystem.dispatcher()Lscala/concurrent/ExecutionContextExecutor, which I cannot debug because it works fine in the IDE.
import akka.actor.{ActorRef, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout

import scala.collection.mutable
import scala.concurrent.Future
import scala.concurrent.duration._

object Runner {
  def main(args: Array[String]) {
    run()
  }

  def run() = {
    val system = ActorSystem("my-system")
    import system.dispatcher
    // Manager and Echo are defined elsewhere in the project
    val props = Props[Manager]
    val pool = mutable.ArrayBuffer.empty[(Int, ActorRef)]
    for (i <- 1 to 10) {
      pool += ((i, system.actorOf(props)))
    }
    val futures = pool.map {
      case (x: Int, y: ActorRef) =>
        val future = ask(y, Echo(x))(Timeout(100 seconds)).mapTo[Int]
        println(future.toString)
        future
    }
    /* Next line causes the exception */
    val futureList = Future.sequence(futures)
    val result = futureList.map(x => {
      x.sum
    })
    result onSuccess {
      case sum => println(sum)
    }
    pool.foreach(x => system.stop(x._2))
    system.shutdown()
  }
}
The sbt file I am using is the following.
lazy val commonSettings = Seq(
  organization := "foobar",
  version := "1.0",
  scalaVersion := "2.10.6",
  test in assembly := {}
)

lazy val root = (project).aggregate(redis).settings(commonSettings: _*).
  settings(
    name := "scala_code_root",
    version := "1.0",
    scalaVersion := "2.10.6",
    exportJars := false
  )

lazy val myakka = (project in file("myakka")).settings(commonSettings: _*).settings(
  libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.15"
)
The exception is thrown at the line val futureList = Future.sequence(futures). Apparently the method is there, because both IDEA and sbt-assembly use the same sbt file. What could be the cause of the exception?
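For reference, a NoSuchMethodError that appears only in the assembled jar usually means the uber jar contains a different Akka (or Scala) version than the one the code was compiled against, e.g. pulled in transitively by another aggregated project such as redis. A hedged way to check and pin it (the conflicting dependency here is an assumption, not something visible in the build above):
// In the sbt shell, inspect what actually lands on the classpath:
// > myakka/dependencyClasspath
// (or use the sbt-dependency-graph plugin's dependencyTree task)

// In build.sbt, force one akka-actor version across all transitive pulls:
dependencyOverrides += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.15"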

RxScala Observable never runs

With the following build.sbt:
name := "blah"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies ++= Seq("io.reactivex" % "rxscala_2.11" % "0.24.1", "org.scalaj" %% "scalaj-http" % "1.1.4")
and this code:
import rx.lang.scala.Observable
import scala.concurrent.duration._
import scala.language.postfixOps
object Main {
  def main(args: Array[String]): Unit = {
    println("Ready?")
    val o = Observable.interval(200 millis).take(5)
    o.subscribe(n => println(s"n = ${n}"))
  }
}
When I run it, all that's printed is Ready?; I see no n = ... at all.
I run it using sbt run; it's built with Scala 2.11.6 and RxScala 0.24.1, as well as sbt 0.13. Any ideas?
The problem is that your program exits before o fires. Try the following code:
import rx.lang.scala.Observable
import scala.concurrent.duration._
import scala.language.postfixOps
object Main {
  def main(args: Array[String]): Unit = {
    println("Ready?")
    val o = Observable.interval(200 millis).take(5)
    o.subscribe(n => println(s"n = ${n}"))
    Thread.sleep(5000)
  }
}
Alternatively you can replace Thread.sleep with o.toBlocking.last, which cannot return before o terminates.
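That alternative, as a sketch (same subscription, with a blocking terminal call instead of a sleep):
// Blocks the main thread until the observable completes,
// so the process cannot exit before all five values are printed.
val o = Observable.interval(200 millis).take(5)
o.subscribe(n => println(s"n = ${n}"))
o.toBlocking.last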

How to write a plugin which depends on a task from another plugin?

There is a great sbt plugin, sbt-dependency-graph, which provides a dependencyTree task to show the dependencies.
I want to write an sbt plugin which depends on it, but it always fails.
build.sbt
sbtPlugin := true
name := "my-sbt-plugin-depends-on-another"
version := "0.1.2.1"
organization := "test20140913"
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")
src/main/scala/MySbtPlugin.scala
import sbt._

object MySbtPlugin extends AutoPlugin {

  object autoImport {
    lazy val hello = taskKey[Unit]("hello task from my plugin")
    lazy val hello2 = taskKey[Unit]("hello task from my plugin2")
  }
  import autoImport._

  override def trigger = allRequirements
  override def requires = plugins.JvmPlugin

  val helloSetting = hello := println("Hello from my plugin")

  val helloSetting2 = hello2 := {
    println("hello2, task result from another plugins:")
    println(net.virtualvoid.sbt.graph.Plugin.dependencyTree.value)
    println("=========================================")
  }

  override def projectSettings = Seq(
    helloSetting, helloSetting2
  )
}
Then I published it locally and used it in another project:
build.sbt
name := "sbt--plugin-test"
version := "1.0"
scalaVersion := "2.11.6"
net.virtualvoid.sbt.graph.Plugin.graphSettings
project/plugins.scala
logLevel := Level.Info
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")
addSbtPlugin("test20140913" % "my-sbt-plugin-depends-on-another" % "0.1.2.1")
When I run sbt on the latter project, it reports:
Reference to undefined setting:
*:dependencyTree from *:hello2 (/Users/twer/workspace/my-sbt-plugin-depends-on-another/src/main/scala/test20140913/MySbtPlugin.scala:38)
Did you mean provided:dependencyTree ?
at sbt.Init$class.Uninitialized(Settings.scala:262)
at sbt.Def$.Uninitialized(Def.scala:10)
at sbt.Init$class.delegate(Settings.scala:188)
at sbt.Def$.delegate(Def.scala:10)
What is wrong here?
PS: The plugin code is here: https://github.com/freewind/my-sbt-plugin-depends-on-another
dependencyTree is only defined for specific configurations (well, all of them), but in the shell it automatically delegates to Compile.
Try defining hello2 like so:
val helloSetting2 = hello2 := {
  println("hello2, task result from another plugins:")
  import net.virtualvoid.sbt.graph.Plugin.dependencyTree
  println((dependencyTree in Compile).value)
  println("=========================================")
}