Compiler crash when using macros and Play Framework - Scala

I wrote a macro that parses JSON into a matching case class.
def parse(jsonTree: JsValue): BaseType = macro parserImpl

def parserImpl(c: blackbox.Context)(jsonTree: c.Tree) = {
  import c.universe._
  val q"$json" = jsonTree
  val cases = List("X", "Y").map { caseClassName =>
    val caseClass = c.parse(caseClassName)
    val reader = c.parse(s"JSONHelp.${caseClassName}_reads")
    val y = cq"""$caseClassName => (($json \ "config").validate[$caseClass]($reader)).get"""
    println(showCode(y))
    y
  }.toList
  val r =
    q"""
      import play.api.libs.json._
      import JSONHelp._
      println($json)
      ($json \ "type").as[String] match { case ..$cases }
    """
  println(showCode(r))
  r
}
The following is the code it generates (printed by the last println):
{
  import play.api.libs.json._;
  import JSONHelp._;
  println(NodeParser.this.json);
  NodeParser.this.json.\("type").as[String] match {
    case "X" => NodeParser.this.json.\("config").validate[X](JSONHelp.X_reads).get
    case "Y" => NodeParser.this.json.\("config").validate[Y](JSONHelp.Y_reads).get
  }
}
The compilation of the subproject containing the macro definition works fine. But when I compile the project that uses the macro (with sbt 0.13.11 and Scala 2.11.8), I get the following error:
java.lang.NullPointerException
at play.routes.compiler.RoutesCompiler$GeneratedSource$.unapply(RoutesCompiler.scala:37)
at play.sbt.routes.RoutesCompiler$$anonfun$11$$anonfun$apply$2.isDefinedAt(RoutesCompiler.scala:180)
at play.sbt.routes.RoutesCompiler$$anonfun$11$$anonfun$apply$2.isDefinedAt(RoutesCompiler.scala:179)
at scala.Option.collect(Option.scala:250)
at play.sbt.routes.RoutesCompiler$$anonfun$11.apply(RoutesCompiler.scala:179)
at play.sbt.routes.RoutesCompiler$$anonfun$11.apply(RoutesCompiler.scala:178)

I'm not a Play user, but it looks like the routes compiler expects tree positions that carry a source file:
val routesPositionMapper: Position => Option[Position] = position => {
  position.sourceFile collect {
    case GeneratedSource(generatedSource) => {
      // ...
It's typical to use atPos(pos)(tree) to position the trees a macro creates. For synthetic trees, you might reuse the position of the incoming tree, i.e. tree.pos.
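A minimal sketch of doing that in a Scala 2.11 macro, using c.internal.setPos to reuse the incoming tree's position (the println body is a stand-in for the question's real expansion, and PositionedMacro is a made-up name):

import scala.reflect.macros.blackbox

object PositionedMacro {
  // Sketch: give the synthesized tree a real position, so tools that inspect
  // positions (like the routes position mapper above) find a source file
  // instead of NoPosition.
  def impl(c: blackbox.Context)(jsonTree: c.Tree): c.Tree = {
    import c.universe._
    val result = q"println($jsonTree)"      // stand-in for the real expansion
    c.internal.setPos(result, jsonTree.pos) // reuse the call-site position
  }
}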


Registering UDFs dynamically using Scala reflect not working

I am registering my UDFs dynamically using Scala reflection, as shown below, and this code works fine. However, when I try to list the Spark functions using spark.catalog, I don't see it. Can you please help me understand what I am missing here:
spark.catalog.listFunctions().foreach { fun =>
  if (fun.name == "ModelIdToModelYear") {
    println(fun.name)
  }
}
def registerUDF(spark: SparkSession): Unit = {
  val runtimeMirror = scala.reflect.runtime.universe.runtimeMirror(getClass.getClassLoader)
  val moduleSymbol = runtimeMirror.moduleSymbol(Class.forName("com.local.practice.udf.UdfModelIdToModelYear"))
  val targetMethod = moduleSymbol.typeSignature.members.filter {
    x => x.isMethod && x.name.toString == "ModelIdToModelYear"
  }.head.asMethod
  val function = runtimeMirror.reflect(runtimeMirror.reflectModule(moduleSymbol).instance).reflectMethod(targetMethod)
  function(spark.udf)
}
Below is my UDF definition:
package com.local.practice.udf

import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.udf

//noinspection ScalaStyle
object UdfModelIdToModelYear {
  val ModelIdToModelYear: UserDefinedFunction = udf((model_id: String) => {
    val numPattern = "(\\d{2})_.+".r
    numPattern.findFirstIn(model_id).getOrElse("0").toInt
  })
}
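One thing worth checking (an assumption, not a confirmed diagnosis): spark.catalog.listFunctions() only lists functions that were registered with the session by name, so a UserDefinedFunction created with udf(...) is usable from the DataFrame API without ever appearing in the catalog. A minimal sketch of a by-name registration, using the question's own names:

import org.apache.spark.sql.SparkSession

def registerByName(spark: SparkSession): Unit = {
  // Registers the UDF under a name; named registrations are what
  // spark.catalog.listFunctions() reports.
  spark.udf.register("ModelIdToModelYear", UdfModelIdToModelYear.ModelIdToModelYear)
}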

"graph-for-funcs.sc" script compilation error

I am trying to reproduce the Devign experiment. After joern was updated, the graph-for-funcs.sc script that parses the bin file into JSON started producing errors. I fixed some of them, but there are still errors, shown below. Did you encounter similar errors, and how did you solve them?
graph-for-funcs.sc
import scala.jdk.CollectionConverters._
import io.circe.syntax._
import io.circe.generic.semiauto._
import io.circe.{Encoder, Json}
import io.shiftleft.semanticcpg.language.types.expressions.generalizations.CfgNode
import io.shiftleft.codepropertygraph.generated.EdgeTypes
import io.shiftleft.codepropertygraph.generated.NodeTypes
import io.shiftleft.codepropertygraph.generated.nodes
import io.shiftleft.dataflowengineoss.language._
import io.shiftleft.semanticcpg.language._
import io.shiftleft.semanticcpg.language.types.expressions.Call
import io.shiftleft.semanticcpg.language.types.structure.Local
import io.shiftleft.codepropertygraph.generated.nodes.MethodParameterIn
import overflowdb._
import overflowdb.traversal._

final case class GraphForFuncsFunction(function: String,
                                       file: String,
                                       id: String,
                                       AST: List[nodes.AstNode],
                                       CFG: List[nodes.AstNode],
                                       PDG: List[nodes.AstNode])
final case class GraphForFuncsResult(functions: List[GraphForFuncsFunction])

implicit val encodeEdge: Encoder[Edge] =
  (edge: Edge) =>
    Json.obj(
      ("id", Json.fromString(edge.toString)),
      ("in", Json.fromString(edge.inNode.toString)),
      ("out", Json.fromString(edge.outNode.toString))
    )

implicit val encodeNode: Encoder[nodes.AstNode] =
  (node: nodes.AstNode) =>
    Json.obj(
      ("id", Json.fromString(node.toString)),
      ("edges",
        Json.fromValues((node.inE("AST", "CFG").l ++ node.outE("AST", "CFG").l).map(_.asJson))),
      ("properties", Json.fromValues(node.propertyMap.asScala.toList.map { case (key, value) =>
        Json.obj(
          ("key", Json.fromString(key)),
          ("value", Json.fromString(value.toString))
        )
      }))
    )

implicit val encodeFuncFunction: Encoder[GraphForFuncsFunction] = deriveEncoder
implicit val encodeFuncResult: Encoder[GraphForFuncsResult] = deriveEncoder

@main def main(): Json = {
  GraphForFuncsResult(
    cpg.method.map { method =>
      val methodName = method.fullName
      val methodId = method.toString
      val methodFile = method.location.filename
      val astChildren = method.astMinusRoot.l
      val cfgChildren = method.out(EdgeTypes.CONTAINS).asScala.collect { case node: nodes.CfgNode => node }.toList
      val local = new NodeSteps(
        method
          .out(EdgeTypes.CONTAINS)
          .hasLabel(NodeTypes.BLOCK)
          .out(EdgeTypes.AST)
          .hasLabel(NodeTypes.LOCAL)
          .cast[nodes.Local])
      val sink = local.evalType(".*").referencingIdentifiers.dedup
      val source = new NodeSteps(method.out(EdgeTypes.CONTAINS).hasLabel(NodeTypes.CALL).cast[nodes.Call]).nameNot("<operator>.*").dedup
      val pdgChildren = sink
        .reachableByFlows(source)
        .l
        .flatMap { path =>
          path.elements
            .map {
              case trackingPoint @ (_: MethodParameterIn) => trackingPoint.start.method.head
              case trackingPoint => trackingPoint.cfgNode
            }
        }
        .filter(_.toString != methodId)
      GraphForFuncsFunction(methodName, methodFile, methodId, astChildren, cfgChildren, pdgChildren.distinct)
    }.l
  ).asJson
}
The errors:
graph-for-funcs.sc:92: value evalType is not a member of io.shiftleft.semanticcpg.language.NodeSteps[io.shiftleft.codepropertygraph.generated.nodes.Local]
val sink = local.evalType(".*").referencingIdentifiers.dedup
^
graph-for-funcs.sc:93: value nameNot is not a member of io.shiftleft.semanticcpg.language.NodeSteps[io.shiftleft.codepropertygraph.generated.nodes.Call]
val source = new NodeSteps(method.out(EdgeTypes.CONTAINS).hasLabel(NodeTypes.CALL).cast[nodes.Call]).nameNot("<operator>.*").dedup
^
java.lang.RuntimeException: Compilation Failed
io.shiftleft.console.scripting.AmmoniteExecutor.$anonfun$runScript$7(AmmoniteExecutor.scala:50)
cats.effect.internals.IORunLoop$.liftedTree3$1(IORunLoop.scala:229)
cats.effect.internals.IORunLoop$.step(IORunLoop.scala:229)
cats.effect.IO.unsafeRunTimed(IO.scala:320)
cats.effect.IO.unsafeRunSync(IO.scala:239)
io.shiftleft.console.scripting.ScriptManager.runScript(ScriptManager.scala:130)
io.shiftleft.console.scripting.ScriptManager$CpgScriptRunner.runScript(ScriptManager.scala:64)
io.shiftleft.console.scripting.ScriptManager$CpgScriptRunner.runScript(ScriptManager.scala:54)
ammonite.$sess.cmd8$.<clinit>(cmd8.sc:1)
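For what it's worth, a hedged sketch of how the two failing lines might look on newer joern releases, where such steps are extension methods on plain traversals rather than members of NodeSteps (every step name below is an assumption; verify against the installed joern version):

// Assumed newer-joern style; step names are assumptions, not confirmed API.
val sink   = method.ast.isLocal.evalType(".*").referencingIdentifiers.dedup
val source = method.ast.isCall.nameNot("<operator>.*").dedup
val flows  = sink.reachableByFlows(source).l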

Generating import statements with scala macros

I have the following code:
@mymacro @imports
val _1 = { import scala.collection.mutable.ListBuffer }

@mymacro
val _2 = { val yy: ListBuffer[Int] = ListBuffer.empty }
@mymacro is a Scala macro annotation that checks whether it has also been annotated with the @imports annotation. Part of the implementation is as follows:
case (cc @ q"${mods: Modifiers} val $tname: ${tpt: Tree} = ${expr: Tree}") :: Nil =>
  if (tname.toString().startsWith("_"))
    if (checkImports(mods, expr)) {
      q"import scala.collection.mutable.ListBuffer"
    }
    else
      q"{$expr}"
Currently the macro is able to transform the whole val _1 = ... statement into import scala.collection.mutable.ListBuffer (without the {} brackets!). But when the compilation continues, I keep getting the not found: type ListBuffer compilation error. Now I wonder whether it is possible to fix this error without having to define the import statement at the top of the file.
I am using the Scala 2.10 macro paradise plugin.
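For background on why the error persists: in Scala an import is visible only from its own position to the end of the enclosing block, so an import that ends up inside one statement cannot cover the statements that follow it. A minimal illustration, independent of the macro:

object ImportScope {
  {
    import scala.collection.mutable.ListBuffer
    val ok: ListBuffer[Int] = ListBuffer.empty // compiles: the import is in scope here
  }
  // The import ended with its block, so the following would not compile:
  // val no: ListBuffer[Int] = ListBuffer.empty // error: not found: type ListBuffer
}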

Scala Akka Stream: How to Pass Through a Seq

I'm trying to wrap some blocking calls in a Future. The return type is Seq[User], where User is a case class. The following just wouldn't compile, with complaints about various overloaded versions being present. Any suggestions? I tried almost all the variations of Source.apply without any luck.
// All I want is Seq[User] => Future[Seq[User]]
def findByFirstName(firstName: String) = {
  val users: Seq[User] = userRepository.findByFirstName(firstName)
  val sink = Sink.fold[User, User](null)((_, elem) => elem)
  val src = Source(users) // doesn't compile
  src.runWith(sink)
}
First of all, I assume that you are using version 1.0 of akka-http-experimental, since the API may have changed from previous releases.
The reason your code does not compile is that akka.stream.scaladsl.Source.apply() requires a scala.collection.immutable.Seq instead of a scala.collection.mutable.Seq. Therefore you have to convert the mutable sequence to an immutable one using the to[T] method.
Document: akka.stream.scaladsl.Source
Additionally, as you can see in the documentation, Source.apply() also accepts a () => Iterator[T], so you can pass () => users.iterator as the argument instead.
Since Sink.fold(...) materializes the final value of the accumulator, you can give an empty Seq() as the zero element, append each user to the sequence as the stream runs, and finally get the whole sequence as the result.
However, there might be a better solution: a ready-made Sink that collects each element into a Seq. I could not find one.
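(Later akka-stream releases added exactly that: Sink.seq materializes every element into a Future[Seq[T]]. A hedged sketch assuming a modern akka-stream dependency, where the implicit ActorSystem provides the materializer:)

import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.Future

object SinkSeqExample extends App {
  implicit val system: ActorSystem = ActorSystem("example")
  case class User(name: String)
  val users = Seq(User("alice"), User("bob"))

  // Sink.seq collects all elements into a Future[Seq[T]]
  // (not available in the 1.0 experimental release).
  val fut: Future[Seq[User]] = Source(users.toList).runWith(Sink.seq)
}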
The following code works.
import akka.actor._
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Source, Sink}
import scala.concurrent.ExecutionContext.Implicits.global

case class User(name: String)

object Main extends App {
  implicit val system = ActorSystem("MyActorSystem")
  implicit val materializer = ActorMaterializer()

  val users = Seq(User("alice"), User("bob"), User("charlie"))
  val sink = Sink.fold[Seq[User], User](Seq())((seq, elem) => {
    println(s"elem => ${elem} \t| seq => ${seq}")
    seq :+ elem
  })
  val src = Source(users.to[scala.collection.immutable.Seq])
  // val src = Source(() => users.iterator) // this also works
  val fut = src.runWith(sink) // Future[Seq[User]]
  fut.onSuccess({
    case x =>
      println(s"result => ${x}")
  })
}
The output of the code above is:
elem => User(alice) | seq => List()
elem => User(bob) | seq => List(User(alice))
elem => User(charlie) | seq => List(User(alice), User(bob))
result => List(User(alice), User(bob), User(charlie))
If you just need Future[Seq[User]], don't use Akka Streams; plain Futures are enough:
import scala.concurrent._
import ExecutionContext.Implicits.global

val session = socialNetwork.createSessionFor("user", credentials)
val f: Future[List[Friend]] = Future {
  session.getFriends()
}
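Applied to the original question, a minimal sketch (User and userRepository are the asker's types, assumed to exist, with a blocking findByFirstName):

import scala.concurrent.{ExecutionContext, Future}

// Wrap the blocking repository call in a Future so the caller gets
// Future[Seq[User]] without involving streams at all.
def findByFirstName(firstName: String)(implicit ec: ExecutionContext): Future[Seq[User]] =
  Future {
    userRepository.findByFirstName(firstName) // runs on the given execution context
  }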

How do I provide basic configuration for a Scala application?

I am working on a small GUI application written in Scala. There are a few settings that the user will set in the GUI, and I want them to persist between program executions. Basically, I want a scala.collection.mutable.Map that automatically persists to a file when modified.
This seems like it must be a common problem, but I have been unable to find a lightweight solution. How is this problem typically solved?
I do a lot of this, and I use .properties files (they're idiomatic in Java-land). I keep my config pretty straightforward by design, though. If you have nested config constructs, you might want a different format like YAML (if humans are the main authors) or JSON or XML (if machines are the authors).
Here's some example code for loading props, manipulating them as a Scala Map, then saving them as .properties again:
import java.io._
import java.util._
import scala.collection.JavaConverters._

val f = new File("test.properties")
// test.properties:
//   foo=bar
//   baz=123

val props = new Properties

// Note: in real code make sure all these streams are
// closed carefully in try/finally
val fis = new InputStreamReader(new FileInputStream(f), "UTF-8")
props.load(fis)
fis.close()

println(props) // {baz=123, foo=bar}

val map = props.asScala // get to a Scala Map via JavaConverters
map("foo") = "42"
map("quux") = "newvalue"

println(map)   // Map(baz -> 123, quux -> newvalue, foo -> 42)
println(props) // {baz=123, quux=newvalue, foo=42}

val fos = new OutputStreamWriter(new FileOutputStream(f), "UTF-8")
props.store(fos, "")
fos.close()
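On Scala 2.13+ the try/finally caveat in the comment can be handled with scala.util.Using, which closes the streams even if load or store throws; a small sketch of the same load/save:

import java.io._
import java.util.Properties
import scala.util.Using

val props = new Properties

// Using.resource closes the reader/writer when the body finishes.
Using.resource(new InputStreamReader(new FileInputStream("test.properties"), "UTF-8")) { in =>
  props.load(in)
}
Using.resource(new OutputStreamWriter(new FileOutputStream("test.properties"), "UTF-8")) { out =>
  props.store(out, "")
}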
Here's an example of using XML and a case class for reading a config; a real class can be nicer than a Map. (You could also do what sbt and at least one other project do: take the config as Scala source and compile it in, though saving it back is less automatic. Or keep it as a REPL script. I haven't googled, but someone must have done that.)
Here's the simpler code first. This version uses a case class:
case class PluginDescription(name: String, classname: String) {
  def toXML: Node = {
    <plugin>
      <name>{name}</name>
      <classname>{classname}</classname>
    </plugin>
  }
}

object PluginDescription {
  def fromXML(xml: Node): PluginDescription = {
    // extract one field
    def getField(field: String): Option[String] = {
      val text = (xml \\ field).text.trim
      if (text == "") None else Some(text)
    }
    def extracted = {
      val name = "name"
      val claas = "classname"
      val vs = Map(name -> getField(name), claas -> getField(claas))
      if (vs.values exists (_.isEmpty)) fail()
      else PluginDescription(name = vs(name).get, classname = vs(claas).get)
    }
    def fail() = throw new RuntimeException("Bad plugin descriptor.")
    // check the top-level tag
    xml match {
      case <plugin>{_*}</plugin> => extracted
      case _ => fail()
    }
  }
}
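A quick usage sketch (the descriptor values here are made up for illustration):

// Round-trip: parse a descriptor, then serialize it back to XML.
val xml = <plugin><name>sample</name><classname>com.example.Sample</classname></plugin>
val desc = PluginDescription.fromXML(xml) // PluginDescription(sample,com.example.Sample)
println(desc.toXML)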
This code reflectively calls the apply method of a case class. The use case is that fields missing from the config can be supplied by default arguments; there are no type conversions here. E.g., case class Config(foo: String = "bar").
// isn't it easier to write a quick loop to reflect the field names?
import scala.reflect.runtime.{currentMirror => cm, universe => ru}
import ru._

def fromXML(xml: Node): Option[PluginDescription] = {
  def extract[A]()(implicit tt: TypeTag[A]): Option[A] = {
    // extract one field
    def getField(field: String): Option[String] = {
      val text = (xml \\ field).text.trim
      if (text == "") None else Some(text)
    }
    val apply = ru.newTermName("apply")
    val module = ru.typeOf[A].typeSymbol.companionSymbol.asModule
    val ts = module.moduleClass.typeSignature
    val m = (ts member apply).asMethod
    val im = cm reflect (cm reflectModule module).instance
    val mm = im reflectMethod m
    def getDefault(i: Int): Option[Any] = {
      val n = ru.newTermName("apply$default$" + (i + 1))
      val m = ts member n
      if (m == NoSymbol) None
      else Some((im reflectMethod m.asMethod)())
    }
    def extractArgs(pss: List[List[Symbol]]): List[Option[Any]] =
      pss.flatten.zipWithIndex map (p => getField(p._1.name.encoded) orElse getDefault(p._2))
    val args = extractArgs(m.paramss)
    if (args exists (!_.isDefined)) None
    else Some(mm(args.flatten: _*).asInstanceOf[A])
  }
  // check the top-level tag
  xml match {
    case <plugin>{_*}</plugin> => extract[PluginDescription]()
    case _ => None
  }
}
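For instance, with a default declared on the case class, a missing tag falls back to the default argument (this variant of the class is hypothetical, purely for illustration):

// Hypothetical: classname has a default, so the tag may be omitted in the XML.
case class PluginDescription(name: String, classname: String = "com.example.Default")

fromXML(<plugin><name>sample</name></plugin>)
// => Some(PluginDescription(sample,com.example.Default))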
scala.xml.XML has loadFile and save; it's too bad there seems to be no similar one-liner for Properties.
$ scala
Welcome to Scala version 2.10.0-RC5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_06).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import reflect.io._
import reflect.io._
scala> import java.util._
import java.util._
scala> import java.io.{StringReader, File=>JFile}
import java.io.{StringReader, File=>JFile}
scala> import scala.collection.JavaConverters._
import scala.collection.JavaConverters._
scala> val p = new Properties
p: java.util.Properties = {}
scala> p load new StringReader(
| (new File(new JFile("t.properties"))).slurp)
scala> p.asScala
res2: scala.collection.mutable.Map[String,String] = Map(foo -> bar)
As it all boils down to serializing a map / object to a file, your choices are:
classic Java binary serialization
serialization to XML
serialization to JSON (easy using Jackson, or Lift-JSON)
use of a properties file (ugly, no utf-8 support)
serialization to a proprietary format (ugly, why reinvent the wheel)
I suggest converting the Map to Properties and vice versa. "*.properties" files are the standard way of storing configuration in the Java world, so why not use them for Scala too?
The common formats are *.properties and *.xml. Since Scala supports XML natively, it is easier to use an XML config in Scala than in Java.
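A minimal sketch of that native XML route (the file name and keys are made up for illustration):

import scala.xml.XML

// config.xml (hypothetical): <config><foo>bar</foo></config>
val conf = XML.loadFile("config.xml")
val foo = (conf \ "foo").text // "bar"

// Saving is symmetric:
XML.save("config.xml", <config><foo>{foo}</foo></config>)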