This is a follow-up question to https://stackoverflow.com/a/55440851/2691976
I have the following code:
import scala.io.Source
import scala.util.Using

object Problem {
  def main(args: Array[String]): Unit = {
    Using(Source.fromFile("thisfileexists.txt")) { source =>
      println(1 / 1)
      println(1 / 0)
    }
  }
}
Running it with scala3 prints a single line and no error:
scala3 test.scala
1
I am expecting an error like the following:
Exception in thread "main" java.lang.ArithmeticException: / by zero
at Problem$.main(test.scala:10)
at Problem.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at dotty.tools.scripting.ScriptingDriver.compileAndRun(ScriptingDriver.scala:42)
at dotty.tools.scripting.Main$.main(Main.scala:43)
at dotty.tools.MainGenericRunner$.run$1(MainGenericRunner.scala:230)
at dotty.tools.MainGenericRunner$.main(MainGenericRunner.scala:239)
at dotty.tools.MainGenericRunner.main(MainGenericRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at coursier.bootstrap.launcher.a.a(Unknown Source)
at coursier.bootstrap.launcher.Launcher.main(Unknown Source)
So why does it not print an error when I use Using (which I suspect is causing the problem here)?
And what is the solution so I can use both Using and Source.fromFile while still surfacing a potential error?
I have read the Using Scala 2 docs and the Scala 3 docs, but they don't say anything about errors.
In case this is important, I am on a Mac:
scala3 --version
Scala code runner version 3.1.2-RC1-bin-20211213-8e1054e-NIGHTLY-git-8e1054e -- Copyright 2002-2021, LAMP/EPFL
That's because Using returns a Try, as you can see here:
https://www.scala-lang.org/api/2.13.x/scala/util/Using$.html#apply[R,A](resource:=%3ER)(f:R=%3EA)(implicitevidence$1:scala.util.Using.Releasable[R]):scala.util.Try[A]
You can use .fold, .get, pattern matching, etc.
For example:
import scala.io.Source
import scala.util.Using

object Problem {
  def main(args: Array[String]): Unit = {
    Using(Source.fromFile("thisfileexists.txt")) { source =>
      println(1 / 1)
      println(1 / 0)
    }.get
  }
}
Or as follows:
import scala.io.Source
import scala.util.{Failure, Success, Using}

object Problem {
  def main(args: Array[String]): Unit = {
    Using(Source.fromFile("thisfileexists.txt")) { source =>
      println(1 / 1)
      println(1 / 0)
    } match {
      case Success(res) => println("Do something on success")
      case Failure(ex)  => println(s"Failed with ex: ${ex.getMessage}")
    }
  }
}
You can read more about Try in the Scala docs:
https://www.scala-lang.org/api/2.13.x/scala/util/Try.html
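For completeness, a minimal sketch of the .fold option mentioned above, reusing the same Problem object from the question:

import scala.io.Source
import scala.util.Using

object Problem {
  def main(args: Array[String]): Unit = {
    Using(Source.fromFile("thisfileexists.txt")) { source =>
      println(1 / 1)
      println(1 / 0)
    }.fold(
      ex => println(s"Failed with ex: ${ex.getMessage}"), // runs on Failure
      _ => println("Do something on success")             // runs on Success
    )
  }
}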
Related
I have a patient resource of the type below:
val p: Patient = new Patient
which comes from the package below:
import org.hl7.fhir.r4.model.Patient
Now, I want to set some value on it, such as an ID attribute with a value like example, and when I try something like p.getId() I should be able to retrieve it. I was trying Scala reflection and designed the methods below by referring to one of the posts, but I am not sure how to use it here. Below are the methods for get and set:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.hl7.fhir.r4.model.Patient

object PatientInvoker {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("Patient").master("local[1]").getOrCreate()
    val patientOutput = "C:\\Users\\siddheshk2\\IdeaProjects\\fhir\\mapper\\src\\main\\resources\\patientOutput.json"
    val idValue = spark.read.option("multiline", "true").json(patientOutput).select(col("id")).first.getString(0)

    implicit def reflector(ref: AnyRef) = new {
      def getV(name: String): Any =
        ref.getClass.getMethods.find(_.getName == name).get.invoke(ref)
      def setV(name: String, value: Any): Unit =
        ref.getClass.getMethods.find(_.getName == name + "_$eq").get.invoke(ref, value.asInstanceOf[AnyRef])
    }

    val p: Patient = new Patient
    p.setV("id", idValue)
    println("id:" + p.getV("id"))
  }
}
I am getting the error below:
Exception in thread "main" java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:347)
at scala.None$.get(Option.scala:345)
at com.fhir.mapper.io.PatientInvoker$$anon$1.setV(StudentInvoker.scala:16)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
I am unable to set the value of idValue using the reflector method. Please guide me through it.
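A note on the likely cause, with a sketch: org.hl7.fhir.r4.model.Patient is a plain Java class, so its accessors follow the JavaBean convention (getId/setId) rather than the Scala convention (id/id_$eq); find(_.getName == name + "_$eq") therefore returns None, and .get throws the NoSuchElementException. A reflector targeting JavaBean-style accessors might look like this (illustrative only; overload selection is naive):

implicit class BeanReflector(ref: AnyRef) {
  private def cap(s: String) = s.head.toUpper + s.tail

  // getV("id") looks up a zero-argument getter such as getId()
  def getV(name: String): Any =
    ref.getClass.getMethods
      .find(m => m.getName == "get" + cap(name) && m.getParameterCount == 0)
      .map(_.invoke(ref))
      .getOrElse(sys.error(s"No getter found for $name"))

  // setV("id", v) looks up a one-argument setter such as setId(v);
  // if several overloads exist, this naively picks the first match
  def setV(name: String, value: Any): Unit =
    ref.getClass.getMethods
      .find(m => m.getName == "set" + cap(name) && m.getParameterCount == 1)
      .fold(sys.error(s"No setter found for $name")) { m =>
        m.invoke(ref, value.asInstanceOf[AnyRef]); ()
      }
}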
I was running my Spark app in cluster mode and everything went well. Now I need to do some tests in my local installation (on my laptop), and I get a NullPointerException on the following line:
val brdVar = spark.sparkContext.broadcast(rdd.collectAsMap())
EDIT: This is the full stack trace:
Exception in thread "main" java.lang.NullPointerException
at learner.LearnCh$.learn(LearnCh.scala:81)
at learner.Learner.runLearningStage(Learner.scala:166)
at learner.Learner.run(Learner.scala:29)
at Driver$.runTask(Driver.scala:26)
at Driver$.main(Driver.scala:19)
at Driver.main(Driver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I have been reading a lot, and I couldn't find the answer to my problem (EDIT: I'm using def main(args: Array[String]): Unit = ...). The use case for this brdVar is to get a numerical id value from a string one:
val newRdd: RDD[(Long, Map[Byte, Int])] = origRdd
  .mapPartitions { partition => partition.map(r => (r.idString, r)) }
  .aggregateByKey // this line doesn't affect my problem ....
  .mapPartitions { partition => partition.map { case (idString, listIndexes) => (brdVar.value(idString), .....) } }
So, in order to continue and not get stuck with broadcast in local mode, I changed my approach: I wanted to simulate brdVar by saving its data in a file, then reading the file and searching for the key by calling a function, replacing brdVar.value(idString) with getNumericalID(id). To do so, I've written this function:
def getNumericalID(strID: String): Long = {
  val pathToRead = ....
  val file = spark.sparkContext.textFile(pathToRead)
  val process = file.map { line =>
    val l = line.split(",")
    (l(0), l(1))
  }.filter(e => e._1 == strID).collect()
  process(0)._2.toLong
}
But I'm still getting a NullPointerException, this time on the val file = .... line. I've checked, and the file has content. I think maybe I'm misunderstanding something; any ideas?
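One plausible explanation, offered as a guess from the symptoms: the SparkContext only exists on the driver, so if getNumericalID is called from inside an RDD transformation, spark.sparkContext is null on the executor and textFile throws the NullPointerException. The usual workaround is to materialize the lookup on the driver and ship a plain Map in the closure; a sketch under that assumption (pathToRead and origRdd as in the question):

// Build the lookup table once, on the driver, where the context exists.
val idLookup: Map[String, Long] =
  spark.sparkContext.textFile(pathToRead)
    .map { line =>
      val l = line.split(",")
      (l(0), l(1).toLong)
    }
    .collect()
    .toMap

// A plain Map serializes into the task closure, so executors never
// touch the SparkContext (it can be broadcast again once this works).
val newRdd = origRdd.mapPartitions { partition =>
  partition.map(r => (idLookup(r.idString), r))
}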
I am having trouble turning my data into nested JSON objects. My current data looks like this:
Map(23 -> {"errorCode":null,"runStatusId":null,"lakeHdfsPath":"/user/jmblnvr/20140817_011500_zoot_kohls_offer_init.dat","fieldIndex":23,"datasetFieldName":"TERM_MM","datasetFieldSum":0.0,"datasetFieldMin":0.0,"datasetFieldMax":0.0,"datasetFieldMean":0.0,"datasetFieldSigma":0.0,"datasetFieldNullCount":170544.0,"datasetFieldObsCount":0.0,"datasetFieldKurtosis":0.0,"datasetFieldSkewness":0.0,"frequencyDistribution":null,"id":null,"fieldType":"NUMBER"}, 32 -> {"errorCode":null,"runStatusId":null,"lakeHdfsPath":"/user/jmblnvr/20140817_011500_zoot_kohls_offer_init.dat","fieldIndex":32,"datasetFieldName":"ACCT_NBR","datasetFieldSum":0.0,"datasetFieldMin":0.0,"datasetFieldMax":0.0,"datasetFieldMean":0.0,"datasetFieldSigma":0.0,"datasetFieldNullCount":0.0,"datasetFieldObsCount":0.0,"datasetFieldKurtosis":0.0,"datasetFieldSkewness":0.0,"frequencyDistribution":"(6393050780099594,56810)","id":null,"fieldType":"STRING"} etc. etc.
When I run it through:
def jsonClob(json: scala.collection.mutable.Map[Int, String]): String = {
  implicit val formats = org.json4s.DefaultFormats
  val A = Serialization.write(json)
  A
}
I get the following error:
Exception in thread "main" scala.MatchError: (23,{"errorCode":null,"fieldIndex":23,"datasetFieldObsCount":0.0,"datasetFieldKurtosis":0.0,"datasetFieldSkewness":0.0,"frequencyDistribution":null,"runStatusId":null,"lakeHdfsPath":"/user/jmblnvr/20140817_011500_zoot_kohls_offer_init.dat","datasetFieldName":"TERM_MM","datasetFieldSum":0.0,"datasetFieldMin":0.0,"datasetFieldMax":0.0,"datasetFieldMean":0.0,"datasetFieldSigma":0.0,"datasetFieldNullCount":170544.0,"id":null,"fieldType":"NUMBER"}) (of class scala.Tuple2)
at org.json4s.Extraction$.internalDecomposeWithBuilder(Extraction.scala:132)
at org.json4s.Extraction$.decomposeWithBuilder(Extraction.scala:67)
at org.json4s.Extraction$.decompose(Extraction.scala:194)
at org.json4s.jackson.Serialization$.write(Serialization.scala:22)
at com.capitalone.dts.toolset.jsonWrite$.jsonClob(jsonWrite.scala:17)
at com.capitalone.dts.dq.profiling.DQProfilingEngine.profile(DQProfilingEngine.scala:264)
at com.capitalone.dts.dq.profiling.Profiler$.main(DQProfilingEngine.scala:64)
at com.capitalone.dts.dq.profiling.Profiler.main(DQProfilingEngine.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I am taking advice from another post I created, but am having no luck with a custom serializer. So far my code looks like this, but I'm completely lost on it:
class Tuple2Serializer extends CustomSerializer[(Int, String)](format => (
  {
    case JObject(JField(JInt(k), v)) => (k, v)
  },
  {
    case (t: Int, s: String) => (t -> s)
  }
))
Edit:
I have it working now thanks to the comment, but the output contains these \ escapes; I am not sure why, or how to remove them without ruining the JSON.
Example:
\"errorCode\":null,\"runStatusId\":null,\"lakeHdfsPath\":\"/user/jmblnvr/20140817_011500_zoot_kohls_offer_init.dat\",\"fieldIndex\":45,\"datasetFieldName\":\"PRESENTABLE_FLAG\"
I'm trying to get familiar with Slick 3.0 and Futures (using Scala 2.11.6). I am using simple code based on Slick's Multi-DB Cake Pattern example. Why does the following code terminate with an exception, and how can I fix it?
import scala.concurrent.Await
import scala.concurrent.duration._
import slick.jdbc.JdbcBackend.Database
import scala.concurrent.ExecutionContext.Implicits.global

class Dispatcher(db: Database, dal: DAL) {
  import dal.driver.api._

  def init() = {
    db.run(dal.create)
    try db.run(dal.stuffTable += Stuff(23, "hi"))
    finally db.close

    val x = {
      try db.run(dal.stuffTable.filter(_.serial === 23).result)
      finally db.close
    }
    // This crashes:
    val result = Await.result(x, 2 seconds)
  }
}
Execution fails with:
java.util.concurrent.RejectedExecutionException: Task slick.backend.DatabaseComponent$DatabaseDef$$anon$2#5c73f637 rejected from java.util.concurrent.ThreadPoolExecutor#4129c44c[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 2]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
at slick.backend.DatabaseComponent$DatabaseDef$class.runSynchronousDatabaseAction(DatabaseComponent.scala:224)
at slick.jdbc.JdbcBackend$DatabaseDef.runSynchronousDatabaseAction(JdbcBackend.scala:38)
at slick.backend.DatabaseComponent$DatabaseDef$class.runInContext(DatabaseComponent.scala:201)
at slick.jdbc.JdbcBackend$DatabaseDef.runInContext(JdbcBackend.scala:38)
at slick.backend.DatabaseComponent$DatabaseDef$class.runInternal(DatabaseComponent.scala:75)
at slick.jdbc.JdbcBackend$DatabaseDef.runInternal(JdbcBackend.scala:38)
at slick.backend.DatabaseComponent$DatabaseDef$class.run(DatabaseComponent.scala:72)
at slick.jdbc.JdbcBackend$DatabaseDef.run(JdbcBackend.scala:38)
at Dispatcher.init(Dispatcher.scala:15)
at SlickDemo$.main(SlickDemo.scala:16)
at SlickDemo.main(SlickDemo.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
I think that something is not correct in what you are trying to do: Slick's run method doesn't return Unit and doesn't fail with an exception, as it did in previous versions. run now returns a Future, so if you want to run actions in sequence you need to flatMap the steps, or use a for-comprehension:
def init() = {
  val results = for {
    _ <- db.run(dal.create)
    _ <- db.run(dal.stuffTable += Stuff(23, "hi"))
    r <- db.run(dal.stuffTable.filter(_.serial === 23).result)
  } yield r
}
I am not sure that you really need to use db.close that way: that is actually what may be causing the error (i.e. the db is closed concurrently with the future that runs the actual queries, so the execution can't happen).
If you want to handle errors, use Future's capabilities, e.g.:
import scala.util.control.NonFatal
results.onFailure { case NonFatal(ex) => /* do something with the exception */ }
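To tie it together, a sketch under the assumption that Dispatcher owns the database lifecycle: block on the combined future (acceptable in a small demo) and close the db exactly once, after all actions have completed:

import scala.concurrent.Await
import scala.concurrent.duration._

def init() = {
  val results = for {
    _ <- db.run(dal.create)
    _ <- db.run(dal.stuffTable += Stuff(23, "hi"))
    r <- db.run(dal.stuffTable.filter(_.serial === 23).result)
  } yield r

  try Await.result(results, 2.seconds)
  finally db.close() // single close, after everything has run
}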
I have Scala code like this for an echo service:
import scala.actors.Actor
import scala.actors.Actor._
import scala.actors.remote.RemoteActor._

class Echo extends Actor {
  def act() {
    alive(9010)
    register('myName, self)
    loop {
      react {
        case msg => println(msg)
      }
    }
  }
}

object EchoServer {
  def main(args: Array[String]): Unit = {
    val echo = new Echo
    echo.start
    println("Echo server started")
  }
}

EchoServer.main(null)
But I get an exception:
java.lang.NoClassDefFoundError: Main$$anon$1$Echo$$anonfun$act$1
at Main$$anon$1$Echo.act((virtual file):16)
at scala.actors.Reaction.run(Reaction.scala:76)
at scala.actors.Actor$$anonfun$start$1.apply(Actor.scala:785)
at scala.actors.Actor$$anonfun$start$1.apply(Actor.scala:783)
at scala.actors.FJTaskScheduler2$$anon$1.run(FJTaskScheduler2.scala:160)
at scala.actors.FJTask$Wrap.run(Unknown Source)
at scala.actors.FJTaskRunner.scanWhileIdling(Unknown Source)
at scala.actors.FJTaskRunner.run(Unknown Source)
Caused by: java.lang.ClassNotFoundException: Main$$anon$1$Echo$$anonfun$act$1
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
... 8 more
I don't know what causes it.
By the way, my Scala version is 2.7.5.
ClassNotFoundException indicates that something that should have been compiled probably was not. How did you compile it? Manually, using scalac?
Try the following: rm *.class && scalac *.scala && scala EchoServer.
The following works:
EchoServer.scala
import scala.actors.Actor
import scala.actors.Actor._
import scala.actors.remote.RemoteActor._

class Echo extends Actor {
  def act() {
    alive(9010)
    register('myName, self)
    loop {
      react {
        case msg => println(msg)
      }
    }
  }
}

object EchoServer {
  def main(args: Array[String]): Unit = {
    val echo = new Echo
    echo.start
    println("Echo server started")
  }
}
Client.scala
import scala.actors.Actor._
import scala.actors.remote.Node
import scala.actors.remote.RemoteActor._

object Client extends Application {
  override def main(args: Array[String]) {
    if (args.length < 1) {
      println("Usage: scala Client [msg]")
      return
    }
    actor {
      val remoteActor = select(Node("localhost", 9010), 'myName)
      remoteActor !? args(0) match {
        case msg => println("Server's response is [" + msg + "]")
      }
    }
  }
}
Command line:
rm *.class && scalac *.scala && scala EchoServer
And in another terminal:
scala Client hello
You need to set the classloader on the remote actors.
Before the act() method, add the line:
RemoteActor.classLoader = getClass.getClassLoader
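For example, placed in the Echo actor from the question (a sketch; the assignment only needs to run before the actor starts processing messages):

import scala.actors.Actor
import scala.actors.remote.RemoteActor
import scala.actors.remote.RemoteActor._

class Echo extends Actor {
  // Make the remote actor machinery use the classloader that loaded
  // this class, so message classes can be resolved on deserialization.
  RemoteActor.classLoader = getClass.getClassLoader

  def act() {
    alive(9010)
    register('myName, self)
    loop {
      react {
        case msg => println(msg)
      }
    }
  }
}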
Why is setting the classloader necessary with Scala RemoteActors?