scala.meta parent of parent of Defn.Object

Suppose we have the following hierarchy:
object X extends Y {
  ...
}
trait Y extends Z {
  ...
}
trait Z {
  def run(): Unit
}
I parse the Scala file containing X and want to know whether its parent or grandparent is Z.
I can check for the parent as follows.
Given that x: Defn.Object is the X object I parsed,
x
  .children.collect { case c: Template => c }
  .flatMap(p => p.children.collectFirst { case c: Init => c })
will give Y.
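For context, here is that snippet as a minimal self-contained program (a sketch; the object name SyntacticParents and the inlined source string are mine). It shows that, syntactically, only the direct parent Y is reachable:
import scala.meta._

object SyntacticParents extends App {
  // Parsing only sees what is literally written in the source:
  // X's init list mentions Y, never Z.
  val tree = "object X extends Y { def run(): Unit = ??? }".parse[Source].get
  val directParents = tree.collect {
    case obj: Defn.Object => obj.templ.inits.map(_.tpe.syntax)
  }.flatten
  println(directParents) // List(Y)
}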
Question: Any idea how I can get the parent of the parent of X (which is Z in the above example)?
Loading Y (the same way I loaded X) and finding its parent doesn't seem like a good idea: the above is part of a scan procedure where, among all files under src/main/scala, I'm trying to find all classes that extend Z and implement run, and I don't see an easy and performant way to build a graph of all the intermediate classes so that I could load them in the right order and check their parents.

It seems you want Scalameta to process your sources not syntactically but semantically. Then you need SemanticDB. Probably the most convenient way to work with SemanticDB is Scalafix:
rules/src/main/scala/MyRule.scala
import scalafix.v1._
import scala.meta._

class MyRule extends SemanticRule("MyRule") {
  override def isRewrite: Boolean = true

  override def description: String = "My Rule"

  override def fix(implicit doc: SemanticDocument): Patch = {
    doc.tree.traverse {
      case q"""..$mods object $ename extends ${template"""
        { ..$stats } with ..$inits { $self => ..$stats1 }"""}""" =>
        val initsParents = inits.collect(_.symbol.info.map(_.signature) match {
          case Some(ClassSignature(_, parents, _, _)) => parents
        }).flatten
        println(s"object: $ename, parents: $inits, grand-parents: $initsParents")
    }
    Patch.empty
  }
}
in/src/main/scala/App.scala
object X extends Y {
  override def run(): Unit = ???
}

trait Y extends Z {
}

trait Z {
  def run(): Unit
}
Output of sbt out/compile
object: X, parents: List(Y), grand-parents: List(AnyRef, Z)
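If you want to push this toward your original goal (objects whose ancestors transitively include Z), the rule can be extended with a small recursive helper over the scalafix v1 symbol API (ClassSignature, TypeRef). The following is only a sketch; allParents is my name and it is untested:
def allParents(sym: Symbol)(implicit doc: SemanticDocument): List[Symbol] =
  sym.info.map(_.signature) match {
    case Some(ClassSignature(_, parents, _, _)) =>
      // direct parents of sym, then recurse to collect the whole ancestry
      val direct = parents.collect { case TypeRef(_, s, _) => s }
      direct ++ direct.flatMap(p => allParents(p))
    case _ => Nil
  }
With that, a check such as allParents(ename.symbol).exists(_.value.endsWith("/Z#")) (again, an illustration rather than tested code) would flag the objects that transitively extend Z.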
build.sbt
name := "scalafix-codegen"
inThisBuild(
List(
//scalaVersion := "2.13.2",
scalaVersion := "2.11.12",
addCompilerPlugin(scalafixSemanticdb),
scalacOptions ++= List(
"-Yrangepos"
)
)
)
lazy val rules = project
.settings(
libraryDependencies += "ch.epfl.scala" %% "scalafix-core" % "0.9.16",
organization := "com.example",
version := "0.1",
)
lazy val in = project
lazy val out = project
.settings(
sourceGenerators.in(Compile) += Def.taskDyn {
val root = baseDirectory.in(ThisBuild).value.toURI.toString
val from = sourceDirectory.in(in, Compile).value
val to = sourceManaged.in(Compile).value
val outFrom = from.toURI.toString.stripSuffix("/").stripPrefix(root)
val outTo = to.toURI.toString.stripSuffix("/").stripPrefix(root)
Def.task {
scalafix
.in(in, Compile)
.toTask(s" --rules=file:rules/src/main/scala/MyRule.scala --out-from=$outFrom --out-to=$outTo")
.value
(to ** "*.scala").get
}
}.taskValue
)
project/plugins.sbt
addSbtPlugin("ch.epfl.scala" % "sbt-scalafix" % "0.9.16")
Other examples:
https://github.com/olafurpg/scalafix-codegen (semantic)
https://github.com/DmytroMitin/scalafix-codegen (semantic)
https://github.com/DmytroMitin/scalameta-demo (syntactic)
Is it possible to using macro to modify the generated code of structural-typing instance invocation? (semantic)
Scala conditional compilation (syntactic)
Macro annotation to override toString of Scala function (syntactic)
How to merge multiple imports in scala? (syntactic)
You can avoid Scalafix, but then you'll have to work with the internals of SemanticDB manually:
import scala.meta._
import scala.meta.interactive.InteractiveSemanticdb
import scala.meta.internal.semanticdb.{ClassSignature, Range, SymbolInformation, SymbolOccurrence, TypeRef}

val source: String =
  """object X extends Y{
    |  override def run(): Unit = ???
    |}
    |
    |trait Y extends Z
    |
    |trait Z {
    |  def run(): Unit
    |}""".stripMargin

val textDocument = InteractiveSemanticdb.toTextDocument(
  InteractiveSemanticdb.newCompiler(List(
    "-Yrangepos"
  )),
  source
)

implicit class TreeOps(tree: Tree) {
  val occurrence: Option[SymbolOccurrence] = {
    val treeRange = Range(tree.pos.startLine, tree.pos.startColumn, tree.pos.endLine, tree.pos.endColumn)
    textDocument.occurrences
      .find(_.range.exists(occurrenceRange => treeRange == occurrenceRange))
  }

  val info: Option[SymbolInformation] = occurrence.flatMap(_.symbol.info)
}

implicit class StringOps(symbol: String) {
  val info: Option[SymbolInformation] = textDocument.symbols.find(_.symbol == symbol)
}

source.parse[Source].get.traverse {
  case tree @ q"""..$mods object $ename extends ${template"""
    { ..$stats } with ..$inits { $self => ..$stats1 }"""}""" =>
    val initsParents = inits.collect(_.info.map(_.signature) match {
      case Some(ClassSignature(_, parents, _, _)) =>
        parents.collect {
          case TypeRef(_, symbol, _) => symbol
        }
    }).flatten
    println(s"object = $ename = ${ename.info.map(_.symbol)}, parents = $inits = ${inits.map(_.info.map(_.symbol))}, grand-parents = $initsParents")
}
Output:
object = X = Some(_empty_/X.), parents = List(Y) = List(Some(_empty_/Y#)), grand-parents = List(scala/AnyRef#, _empty_/Z#)
build.sbt
//scalaVersion := "2.13.3"
scalaVersion := "2.11.12"
lazy val scalametaV = "4.3.18"
libraryDependencies ++= Seq(
  "org.scalameta" %% "scalameta" % scalametaV,
  "org.scalameta" % "semanticdb-scalac" % scalametaV cross CrossVersion.full
)
The SemanticDB code seems to work in Scala 3 as well:
https://scastie.scala-lang.org/DmytroMitin/3QQwsDG2Rqm71qa6mMMkTw/36 (at Scastie, -Dscala.usejavacp=true didn't help with "object scala.runtime in compiler mirror not found", so I used Coursier to guarantee that scala-library is on the classpath; locally it works without Coursier)

Related

ZIO: How to return JSON? [instead of using a case class in ZIO-Http, use a schema to map?]

I'm trying to get the JSON body directly in the code, which I then want to convert to Avro and write to a Kafka topic.
Here is my code with a case class:
import zhttp.http._
import zio._
import zhttp.http.{Http, Method, Request, Response, Status}
import zhttp.service.Server
import zio.json._
import zio.kafka._
import zio.kafka.serde.Serde
import zio.schema._

case class Experiments(experimentId: String,
                       variantId: String,
                       accountId: String,
                       deviceId: String,
                       date: Int)
//case class RootInterface (events: Seq[Experiments])

object Experiments {
  implicit val encoder: JsonEncoder[Experiments] = DeriveJsonEncoder.gen[Experiments]
  implicit val decoder: JsonDecoder[Experiments] = DeriveJsonDecoder.gen[Experiments]
  implicit val codec: JsonCodec[Experiments] = DeriveJsonCodec.gen[Experiments]
  implicit val schema: Schema[Experiments] = DeriveSchema.gen
}

object HttpService {
  def apply(): Http[ExpEnvironment, Throwable, Request, Response] =
    Http.collectZIO[Request] {
      case req @ (Method.POST -> !! / "zioCollector") =>
        val c = req.body.asString.map(_.fromJson[Experiments])
        for {
          u <- req.body.asString.map(_.fromJson[Experiments])
          r <- u match {
            case Left(e) =>
              ZIO.debug(s"Failed to parse the input: $e").as(
                Response.text(e).setStatus(Status.BadRequest)
              )
            case Right(u) =>
              println(s"$u + =====")
              ExpEnvironment.register(u)
                .map(id => Response.text(id))
          }
        } yield r
    }
}

//  val experimentsSerde: Serde[Any, Experiments] = Serde.string.inmapM { string =>
//    //deserialization
//    ZIO.fromEither(string.fromJson[Experiments].left.map(errorMessage => new RuntimeException(errorMessage)))
//  } { theMatch =>
//    ZIO.effect(theMatch.toJson)
//  }

object ZioCollectorMain extends ZIOAppDefault {
  def run: ZIO[Environment with ZIOAppArgs with Scope, Any, Any] = {
    Server.start(
      port = 9001,
      http = HttpService()).provide(ZLayerExp.layer)
  }
}
I'm looking into zio-json but no success yet; any help is appreciated!
We could also use the schema somehow to get the Avro generic record.
Here's my JSON:
{
  "experimentId": "abc",
  "variantId": "123",
  "accountId": "123",
  "deviceId": "123",
  "date": 1664544365
}
This function works for me in Scala 3 (sorry, I didn't include all the code but it should be enough):
import zio.*
import zio.Console.printLine
import zhttp.http.*
import zhttp.service.Server
import zio.json.*
...
case class Experiments(experimentId: String,
                       variantId: String,
                       accountId: String,
                       deviceId: String,
                       date: Int)
//case class RootInterface (events: Seq[Experiments])

object Experiments:
  implicit val encoder: JsonEncoder[Experiments] = DeriveJsonEncoder.gen[Experiments]
  implicit val decoder: JsonDecoder[Experiments] = DeriveJsonDecoder.gen[Experiments]
  implicit val codec: JsonCodec[Experiments] = DeriveJsonCodec.gen[Experiments]

val postingExperiment: Http[Any, Throwable, Request, Response] =
  Http.collectZIO[Request] {
    case req @ (Method.POST -> !! / "zioCollector") =>
      //val c = req.body.asString.map(_.fromJson[Experiments])
      val experimentsZIO = req.body.asString.map(_.fromJson[Experiments])
      for {
        experimentsOrError <- experimentsZIO
        response <- experimentsOrError match {
          case Left(e) => ZIO.debug(s"Failed to parse the input: $e").as(
            Response.text(e).setStatus(Status.BadRequest)
          )
          case Right(experiments) => ZIO.succeed(Response.json(experiments.toJson))
        }
      } yield response
  }
I modified your code slightly (you didn't post your ExpEnvironment class), and it returns the object posted to the URL.
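I haven't shown the server wiring here; purely as a sketch (untested), an entrypoint reusing Server.start from your question could look like this, with port 9009 assumed so that it matches the test client below:
import zio.*
import zhttp.service.Server

object ZioCollectorMain extends ZIOAppDefault:
  // serve postingExperiment on the port the sttp test client targets
  def run = Server.start(port = 9009, http = postingExperiment)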
and the test code is:
import sttp.client3.{SimpleHttpClient, UriContext, basicRequest}

object TestExperiments:
  def main(args: Array[String]): Unit =
    val client = SimpleHttpClient()

    //post request
    val request = basicRequest
      .post(uri"http://localhost:9009/zioCollector")
      .body("{ \"experimentId\": \"abc\", \"variantId\": \"123\", \"accountId\": \"123\", \"deviceId\": \"123\", \"date\": 1664544365 }")
    val response = client.send(request)
    println(response.body)

    val invalidJsonRequest = basicRequest
      .post(uri"http://localhost:9009/zioCollector")
      .body("{ \"experimentId\": \"abc\", \"variantId\": \"123\", \"accountId\": \"123\", \"deviceId\": \"123\", \"date\": 1664544365 ") // missing the closing bracket
    val invalidJsonResponse = client.send(invalidJsonRequest)
    println(invalidJsonResponse.body)
You have to add: "com.softwaremill.sttp.client3" %% "core" % "3.8.3" to your sbt file.
build.sbt:
ThisBuild / scalaVersion := "3.2.0"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "TestSpeed"
ThisBuild / organizationName := "example"

lazy val root = (project in file("."))
  .settings(
    name := "TestZio",
    libraryDependencies ++= Seq(
      "dev.zio" %% "zio" % "2.0.2",
      "dev.zio" %% "zio-json" % "0.3.0-RC11",
      "io.d11" %% "zhttp" % "2.0.0-RC11",
      "dev.zio" %% "zio-test" % "2.0.2" % Test,
      "com.softwaremill.sttp.client3" %% "core" % "3.8.3" % Test
    ),
    testFrameworks += new TestFramework("zio.test.sbt.ZTestFramework")
  )
I didn't include anything related to avro because I am not familiar with it.
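That said, one possible direction for the Avro part (a rough, untested sketch that assumes the plain "org.apache.avro" % "avro" artifact is added as a dependency) is to build an Avro GenericRecord by hand from the parsed case class:
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericRecord}

object AvroSketch:
  // hand-written Avro schema mirroring the Experiments case class
  private val schemaJson =
    """{"type":"record","name":"Experiments","fields":[
      |  {"name":"experimentId","type":"string"},
      |  {"name":"variantId","type":"string"},
      |  {"name":"accountId","type":"string"},
      |  {"name":"deviceId","type":"string"},
      |  {"name":"date","type":"int"}
      |]}""".stripMargin

  val schema: Schema = new Schema.Parser().parse(schemaJson)

  def toRecord(e: Experiments): GenericRecord =
    val record = new GenericData.Record(schema)
    record.put("experimentId", e.experimentId)
    record.put("variantId", e.variantId)
    record.put("accountId", e.accountId)
    record.put("deviceId", e.deviceId)
    record.put("date", e.date)
    record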

What is the Scala 3 equivalent of this Scala 2 code that uses Enumeration and play-json?

I have some code that works in Scala 2.{10,11,12,13} that I'm now trying to convert to Scala 3. Scala 3 does Enumeration differently than Scala 2. I'm trying to figure out how to convert the following code that interacts with play-json so that it will work with Scala 3. Any tips or pointers to code from projects that have already crossed this bridge?
// Scala 2.x style code in EnumUtils.scala
import play.api.libs.json._
import scala.language.implicitConversions
// see: http://perevillega.com/enums-to-json-in-scala
object EnumUtils {
  def enumReads[E <: Enumeration](enum: E): Reads[E#Value] =
    new Reads[E#Value] {
      def reads(json: JsValue): JsResult[E#Value] = json match {
        case JsString(s) =>
          try {
            JsSuccess(enum.withName(s))
          } catch {
            case _: NoSuchElementException =>
              JsError(s"Enumeration expected of type: '${enum.getClass}', but it does not appear to contain the value: '$s'")
          }
        case _ => JsError("String value expected")
      }
    }

  implicit def enumWrites[E <: Enumeration]: Writes[E#Value] = new Writes[E#Value] {
    def writes(v: E#Value): JsValue = JsString(v.toString)
  }

  implicit def enumFormat[E <: Enumeration](enum: E): Format[E#Value] = {
    Format(EnumUtils.enumReads(enum), EnumUtils.enumWrites)
  }
}
// ----------------------------------------------------------------------------------
// Scala 2.x style code in Xyz.scala
import play.api.libs.json.{Reads, Writes}

object Xyz extends Enumeration {
  type Xyz = Value
  val name, link, unknown = Value

  implicit val enumReads: Reads[Xyz] = EnumUtils.enumReads(Xyz)
  implicit def enumWrites: Writes[Xyz] = EnumUtils.enumWrites
}
As an option, you can switch to jsoniter-scala.
It supports enums for Scala 2 and Scala 3 out of the box.
It also has handy derivation of safe and efficient JSON codecs.
You just need to add the required libraries to your dependencies:
libraryDependencies ++= Seq(
  // Use the %%% operator instead of %% for Scala.js and Scala Native
  "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-core" % "2.13.5",
  // Use the "provided" scope instead when the "compile-internal" scope is not supported
  "com.github.plokhotnyuk.jsoniter-scala" %% "jsoniter-scala-macros" % "2.13.5" % "compile-internal"
)
And then derive a codec and use it:
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._
implicit val codec: JsonValueCodec[Xyz.Xyz] = JsonCodecMaker.make
println(readFromString[Xyz.Xyz]("\"name\""))
BTW, you can run the full code on Scastie: https://scastie.scala-lang.org/Evj718q6TcCZow9lRhKaPw
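If you'd rather stay with play-json, another option (a sketch of mine, untested) is to replace Enumeration with a native Scala 3 enum and write the Reads/Writes directly; Xyz.valueOf plays the role of withName and throws IllegalArgumentException for unknown names:
import play.api.libs.json.*

enum Xyz:
  case name, link, unknown

object Xyz:
  implicit val reads: Reads[Xyz] = Reads {
    case JsString(s) =>
      try JsSuccess(Xyz.valueOf(s))
      catch case _: IllegalArgumentException => JsError(s"Unknown Xyz value: '$s'")
    case _ => JsError("String value expected")
  }
  implicit val writes: Writes[Xyz] = Writes(v => JsString(v.toString))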

Get annotations from a Class in Scala 3 macros

I am writing a macro to get annotations from a Class:
inline def getAnnotations(clazz: Class[?]): Seq[Any] = ${ getAnnotationsImpl('clazz) }

def getAnnotationsImpl(expr: Expr[Class[?]])(using Quotes): Expr[Seq[Any]] =
  import quotes.reflect.*
  val cls = expr.valueOrError // error: value value is not a member of quoted.Expr[Class[?]]
  val tpe = TypeRepr.typeConstructorOf(cls)
  val annotations = tpe.typeSymbol.annotations.map(_.asExpr)
  Expr.ofSeq(annotations)
but I get an error when I try to get the class value from the expr parameter:
@main def test(): Unit =
  val cls = getCls
  val annotations = getAnnotations(cls)

def getCls: Class[?] = Class.forName("Foo")
Is it possible to get the annotations of a Class at compile time with such a macro?
By the way, eval for Class[_] doesn't work even in Scala 2 macros: c.eval(c.Expr[Class[_]](clazz)) produces
java.lang.ClassCastException:
scala.reflect.internal.Types$ClassNoArgsTypeRef cannot be cast to java.lang.Class.
Class[_] is too much of a runtime thing. How can you extract its value from its tree (an Expr is a wrapper over a tree)?
If you already have a Class[?] you should use Java reflection rather than Scala 3 macros (with Tasty reflection).
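For completeness, that runtime route is as simple as this (a trivial sketch; note that only annotations retained at runtime, e.g. Java annotations, are visible this way, and most Scala annotations are not):
// runtime-only: no macro involved, works on whatever Class[?] you are handed
def runtimeAnnotations(clazz: Class[?]): Seq[Any] =
  clazz.getAnnotations.toSeq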
Actually, you can try to evaluate a tree from its source code (hacking multi-staging programming and implementing our own eval instead of the forbidden staging.run). It's a little similar to context.eval in Scala 2 macros (but we evaluate from source code rather than from a tree).
import scala.quoted.*

object Macro {
  inline def getAnnotations(clazz: Class[?]): Seq[Any] = ${ getAnnotationsImpl('clazz) }

  def getAnnotationsImpl(expr: Expr[Class[?]])(using Quotes): Expr[Seq[Any]] = {
    import quotes.reflect.*
    val str = expr.asTerm.pos.sourceCode.getOrElse(
      report.errorAndAbort(s"No source code for ${expr.show}")
    )
    val cls = Eval[Class[?]](str)
    val tpe = TypeRepr.typeConstructorOf(cls)
    val annotations = tpe.typeSymbol.annotations.map(_.asExpr)
    Expr.ofSeq(annotations)
  }
}
import dotty.tools.dotc.core.Contexts.Context
import dotty.tools.dotc.{Driver, util}
import dotty.tools.io.{VirtualDirectory, VirtualFile}
import java.net.URLClassLoader
import java.nio.charset.StandardCharsets
import dotty.tools.repl.AbstractFileClassLoader

object Eval {
  def apply[A](str: String): A = {
    val content =
      s"""
         |package $$generated
         |
         |object $$Generated {
         |  def run = $str
         |}""".stripMargin
    val sourceFile = util.SourceFile(
      VirtualFile(
        name = "$Generated.scala",
        content = content.getBytes(StandardCharsets.UTF_8)),
      codec = scala.io.Codec.UTF8
    )
    val files = this.getClass.getClassLoader.asInstanceOf[URLClassLoader].getURLs
    val depClassLoader = new URLClassLoader(files, null)
    val classpathString = files.mkString(":")
    val outputDir = VirtualDirectory("output")

    class DriverImpl extends Driver {
      private val compileCtx0 = initCtx.fresh
      val compileCtx = compileCtx0.fresh
        .setSetting(
          compileCtx0.settings.classpath,
          classpathString
        ).setSetting(
          compileCtx0.settings.outputDir,
          outputDir
        )
      val compiler = newCompiler(using compileCtx)
    }
    val driver = new DriverImpl

    given Context = driver.compileCtx

    val run = driver.compiler.newRun
    run.compileSources(List(sourceFile))

    val classLoader = AbstractFileClassLoader(outputDir, depClassLoader)
    val clazz = Class.forName("$generated.$Generated$", true, classLoader)
    val module = clazz.getField("MODULE$").get(null)
    val method = module.getClass.getMethod("run")
    method.invoke(module).asInstanceOf[A]
  }
}
package mypackage

import scala.annotation.experimental

@experimental
class Foo
Macro.getAnnotations(Class.forName("mypackage.Foo"))
// new scala.annotation.internal.SourceFile("/path/to/src/main/scala/mypackage/Foo.scala"), new scala.annotation.experimental()
scalaVersion := "3.1.3"
libraryDependencies += scalaOrganization.value %% "scala3-compiler" % scalaVersion.value
How to compile and execute scala code at run-time in Scala3?
(compile time of the code expanding macros is the runtime of macros)
Actually, there is even a way to evaluate a tree itself (not its source code). Such functionality exists in the Scala 3 compiler but is deliberately blocked because of the phase consistency principle. So for this to work, the code expanding the macros should be compiled with a patched compiler:
https://github.com/DmytroMitin/dotty-patched
scalaVersion := "3.2.1"
libraryDependencies += scalaOrganization.value %% "scala3-staging" % scalaVersion.value
// custom Scala settings
managedScalaInstance := false
ivyConfigurations += Configurations.ScalaTool
libraryDependencies ++= Seq(
scalaOrganization.value % "scala-library" % "2.13.10",
scalaOrganization.value %% "scala3-library" % "3.2.1",
"com.github.dmytromitin" %% "scala3-compiler-patched-assembly" % "3.2.1" % "scala-tool"
)
import scala.quoted.{Expr, Quotes, staging, quotes}

object Macro {
  inline def getAnnotations(clazz: Class[?]): Seq[String] = ${ impl('clazz) }

  def impl(expr: Expr[Class[?]])(using Quotes): Expr[Seq[String]] = {
    import quotes.reflect.*
    given staging.Compiler = staging.Compiler.make(this.getClass.getClassLoader)
    val tpe = staging.run[Any](expr).asInstanceOf[TypeRepr]
    val annotations = Expr(tpe.typeSymbol.annotations.map(_.asExpr.show))
    report.info(s"annotations=${annotations.show}")
    annotations
  }
}
Normally, for expr: Expr[A] staging.run(expr) returns a value of type A. But Class is specific. For expr: Expr[Class[_]] inside macros it returns a value of type dotty.tools.dotc.core.Types.CachedAppliedType <: TypeRepr. That's why I had to cast.
In Scala 2 this would also be c.eval(c.Expr[Any](/*c.untypecheck*/(clazz))).asInstanceOf[Type].typeSymbol.annotations, because for Class[_], c.eval returns scala.reflect.internal.Types$ClassNoArgsTypeRef <: Type.
https://github.com/scala/bug/issues/12680

Vec of Bundle as a Module parameter

I'm writing a Wishbone intercon module to do the address decoding automatically.
I have two Bundle classes that describe the Wishbone master and Wishbone slave interfaces.
class WbMaster (val dwidth: Int,
                val awidth: Int) extends Bundle {
  val adr_o = Output(UInt(awidth.W))
  //...
  val cyc_o = Output(Bool())
}

// Wishbone slave interface
class WbSlave (val dwidth: Int,
               val awidth: Int) extends Bundle {
  val adr_i = Input(UInt(awidth.W))
  //...
  val cyc_i = Input(Bool())
}
I want to pass these Bundles as parameters to my Wishbone module, like the following:
class WbInterconOneMaster(val awbm: WbMaster,
                          val awbs: Vec(WbSlave)) extends Module {
  val io = IO(new Bundle{
    val wbm = Flipped(new WbMaster(awbm.dwidth, awbm.awidth))
    val wbs = Vec(?)
  })
}
The objective is to permit a variable number of Wishbone slaves and let the module do the plumbing, like the following:
val spi2Wb = Module(new Spi2Wb(dwidth, awidth))
val wbMdio1 = Module(new MdioWb(mainFreq, targetFreq))
val wbMdio2 = Module(new MdioWb(mainFreq, targetFreq))
val slavesVec = Vec(Seq(wbMdio1, wbMdio2))
val wbIntercon = Module(new WbIntercon(spi2Wb.io.wbm, slavesVec))
The question is twofold:
Is this the right way to do it?
How do I declare the Vec() in the module parameters?
I tried this but it does not work:
// Wishbone Intercon with one master and several slaves
// data bus is same size as master
class WbInterconOneMaster(val awbm: WbMaster,
                          val awbs: Vec[Seq[WbSlave]]) extends Module {
  val io = IO(new Bundle{
    val wbm = Flipped(new WbMaster(awbm.dwidth, awbm.awidth))
    val wbs = Vec.fill(awbs.size){awbs.map(_.cloneType())}
  })
}
Try thinking about your parameters as generators of the types that you need. The following is a toy example of this idea. In this case the single constructor parameter bgen is a generator method that will return an instance of a Bundle. It shows the use of this generator as-is and also as part of a Vec.
class BundleX extends Bundle {
  val a = UInt(8.W)
  val b = UInt(8.W)
}

class ModuleX(bgen: () => BundleX, numInputs: Int) extends Module {
  val io = IO(new Bundle{
    val in1 = Input(Vec(numInputs, bgen()))
    val out1 = Output(bgen())
  })
  // output fields a and b are the sum of all the corresponding inputs
  io.out1.a := io.in1.foldLeft(0.U) { case (res, value) => res +% value.a }
  io.out1.b := io.in1.foldLeft(0.U) { case (res, value) => res +% value.b }
}

class BundleXSpec extends ChiselPropSpec {
  property("testname") {
    elaborate(new ModuleX(() => new BundleX, 4))
  }
}
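Applied to your Wishbone case, the same idea might look like this (an untested sketch; wbmGen/wbsGen are just names I picked), passing generator functions plus a slave count instead of a Vec:
class WbInterconOneMaster(wbmGen: () => WbMaster,
                          wbsGen: () => WbSlave,
                          numSlaves: Int) extends Module {
  val io = IO(new Bundle {
    // directions come from the Output/Input fields declared inside WbMaster/WbSlave
    val wbm = Flipped(wbmGen())
    val wbs = Vec(numSlaves, wbsGen())
  })
}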
I found a solution with the (experimental) MixedVec module. I simply pass a Seq of WbSlave Bundles as a module parameter and make a MixedVec out of it (the WbSlaves can in fact have different parameters):
class WbInterconOneMaster(val awbm: WbMaster,
                          val awbs: Seq[WbSlave]) extends Module {
  val io = IO(new Bundle{
    val wbm = Flipped(new WbMaster(awbm.dwidth, awbm.awidth))
    val wbs = MixedVec(awbs.map{i => new WbSlave(i.dwidth, i.awidth)})
  })

  io.wbm.dat_i := 0.U
  io.wbm.ack_i := 0.U
  for(i <- 0 until io.wbs.size){
    io.wbs(i).dat_o := 0.U
    io.wbs(i).ack_o := 0.U
  }
}
That compiles in the testbench.

Get a fully qualified name for references using scalameta

I'm trying to write a simple program to traverse all the referenced code starting from a given method using scalameta.
I was able to follow the calls but could not resolve method references.
analyzeme/src/main/scala/codelab/FindMe.scala
package codelab

object FindMe {
  def main(args: Array[String]): Unit = {
    val x = someRecognizeableName(1, 2)
    val y = List(1, 2, 3)
    y.foldLeft(0)(someRecognizeableName)
  }

  def someRecognizeableName(a: Int, b: Int): Int = a + b
}
I generated and loaded the semanticdb for FindMe.scala and am checking the usages of the someRecognizeableName method.
I can see the first call in the db.names list:
[87..108): someRecognizeableName => _root_.codelab.FindMe.someRecognizeableName(Int,Int).
The second one, though, where I don't call the method but just pass the reference, shows up as this:
[159..180): someRecognizeableName => local2_src_main_scala_codelab_FindMe_scala
So when I try to follow references starting from main, I don't get the fully qualified name of the someRecognizeableName reference in the second case.
Question: Is there a way to get a fully qualified name from semanticdb for that reference?
Full source to reproduce the above
run instructions:
analyzeme $ sbt compile
analyzer $ sbt "run ../analyzeme"
analyzeme/src/main/scala/codelab/FindMe.scala
package codelab

object FindMe {
  def main(args: Array[String]): Unit = {
    val x = someRecognizeableName(1, 2)
    val y = List(1, 2, 3)
    y.foldLeft(0)(someRecognizeableName)
  }

  def someRecognizeableName(a: Int, b: Int): Int = a + b
}
analyzer/src/main/scala/Main.scala
import org.langmeta.io.{Classpath, Sourcepath}
import scala.meta._

object Main {
  def main(args: Array[String]): Unit = {
    println(s"Loading from [${ args(0) }]")
    println()

    val cp = Classpath(s"${ args(0) }/target/scala-2.12/classes")
    val sp = Sourcepath(s"${ args(0) }/src/main/scala")
    val db = Database.load(cp, sp)

    println("* names:")
    db.names foreach println
    println()

    println("* symbols:")
    db.symbols foreach println
    println()

    println("* synthetics:")
    db.synthetics foreach println
    println()

    println("* messages:")
    db.messages foreach println
    println()
  }
}
analyzeme/build.sbt
name := "analyzee"
version := "0.1"
scalaVersion := "2.12.4"
addCompilerPlugin("org.scalameta" % "semanticdb-scalac" % "3.4.0" cross CrossVersion.full)
scalacOptions += "-Yrangepos"
analyzer/build.sbt
name := "analyzer"
version := "0.1"
scalaVersion := "2.12.4"
libraryDependencies += "org.scalameta" %% "scalameta" % "3.4.0"
libraryDependencies += "org.scalameta" %% "contrib" % "3.4.0"
package codelab

object FindMe {
  def main(args: Array[String]): Unit = {
    val x = someRecognizeableName(1, 2)
    val y = List(1, 2, 3)
    y.foldLeft(0)(someRecognizeableName)
    // same as
    y.foldLeft(0) { (a, b) => someRecognizeableName(a, b) }
  }
I debugged the code and found that in the second case the compiler passes an anonymous symbol which is not accessible from the current semanticdb; it should probably show up in the synthetics section, but I can't find it there.
So I guess the compiler-generated anonymous symbol is missing from the current semanticdb.
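A possible workaround (just a guess on my side, untested): write the eta-expansion out by hand, so the name appears as an ordinary reference, which semanticdb resolves to the fully qualified symbol exactly like the first call:
// instead of passing the method reference...
y.foldLeft(0)(someRecognizeableName)
// ...spell out the lambda, so the occurrence is a normal call again:
y.foldLeft(0)((a, b) => someRecognizeableName(a, b))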