How do I use Scala Meta to parse an object? - scala

I am trying to use Scala Meta to write an annotation so I can generate another case class from an existing object.
But when I try to do this:
MyObject.parse[Source].show[Structure]
I got this error:
Error:(5, 20) not enough arguments for method parse: (implicit convert: scala.meta.common.Convert[domain.MyObject.type,scala.meta.inputs.Input], implicit parse: scala.meta.parsers.Parse[scala.meta.Source], implicit dialect: scala.meta.Dialect)scala.meta.parsers.Parsed[scala.meta.Source].
Unspecified value parameters convert, parse, dialect.
MyObject.parse[Source].show[Structure];}
^
I am very confused because, based on their tutorial, that's what I'm supposed to start with:
http://scalameta.org/tutorial/#.parse[T]
How can I reflect this object to loop through all properties?
Thanks

parse[Source] parses text. You may try the following:
import scala.meta._
"object MyObject".parse[Source].get.show[Syntax]
If you are creating an annotation, then it might look like:
@MyAnnotation
object MyObject
And in another module:
import scala.meta._
class MyAnnotation extends StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    defn.show[Syntax]
    defn
  }
}
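To get at the object's members (the "loop through all properties" part of the question), a rough sketch, assuming scalameta 1.x with the macro paradise plugin, could pattern match the annotated tree with quasiquotes; the member handling below is illustrative only:
import scala.meta._

class MyAnnotation extends StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    defn match {
      // Deconstruct the annotated object and walk its statements
      case q"object $name { ..$stats }" =>
        stats.foreach {
          case v: Defn.Val => println(s"val member(s): ${v.pats.map(_.syntax).mkString(", ")}")
          case other       => println(s"other member: ${other.syntax}")
        }
        defn
      case _ =>
        abort("@MyAnnotation must annotate an object")
    }
  }
}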

Related

Scala Type Classes Understanding Interface Syntax

I was reading about cats and encountered the following code snippet, which is about serializing objects to JSON!
It starts with a trait like this:
trait JsonWriter[A] {
  def write(value: A): Json
}
After this, there are some instances of our domain object:
final case class Person(name: String, email: String)

object JsonWriterInstances {
  implicit val stringWriter: JsonWriter[String] =
    new JsonWriter[String] {
      def write(value: String): Json =
        JsString(value)
    }

  implicit val personWriter: JsonWriter[Person] =
    new JsonWriter[Person] {
      def write(value: Person): Json =
        JsObject(Map(
          "name" -> JsString(value.name),
          "email" -> JsString(value.email)
        ))
    }

  // etc...
}
So far so good! I can then use it like this:
import JsonWriterInstances._
Json.toJson(Person("Dave", "dave@example.com"))
Later on I come across something called the interface syntax, which uses extension methods to extend existing types with interface methods like below:
object JsonSyntax {
  implicit class JsonWriterOps[A](value: A) {
    def toJson(implicit w: JsonWriter[A]): Json =
      w.write(value)
  }
}
This then simplifies serializing a Person to:
import JsonWriterInstances._
import JsonSyntax._
Person("Dave", "dave#example.com").toJson
What I don't understand is how the Person is boxed into JsonWriterOps such that I can call toJson directly, as though toJson were defined in the Person case class itself. I like this magic, but I fail to understand this last step about JsonWriterOps. So what is the idea behind this interface syntax, and how does it work? Any help?
This is actually a standard Scala feature: since JsonWriterOps is marked implicit and is in scope, the compiler can apply it at compile time when needed.
Hence scalac will do the following transformations:
Person("Dave", "dave#example.com").toJson
new JsonWriterOps(Person("Dave", "dave#example.com")).toJson
new JsonWriterOps[Person](Person("Dave", "dave#example.com")).toJson
Side note:
It's much more efficient to define implicit classes as value classes, like this:
implicit class JsonWriterOps[A](val value: A) extends AnyVal
This also lets the compiler optimize away the object construction where possible, compiling the whole implicit conversion + method call down to a simple function call.
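For completeness, here is a minimal sketch of that value-class variant, reusing the JsonWriter trait and instances from the question (note the constructor parameter has to become a val for a value class):
object JsonSyntax {
  // Extending AnyVal lets the compiler avoid allocating the wrapper in many cases
  implicit class JsonWriterOps[A](val value: A) extends AnyVal {
    def toJson(implicit w: JsonWriter[A]): Json = w.write(value)
  }
}
The call site stays exactly the same: Person("Dave", "dave@example.com").toJson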

How does Scala use explicit types when resolving implicits?

I have the following code which uses spray-json to deserialise some JSON into a case class, via the parseJson method.
Depending on where the implicit JsonFormat[MyCaseClass] is defined (in-line or imported from companion object), and whether there is an explicit type provided when it is defined, the code may not compile.
I don't understand why importing the implicit from the companion object requires it to have an explicit type where it is defined, while defining it inline does not.
Interestingly, IntelliJ correctly locates the implicit parameters (via cmd-shift-p) in all cases.
I'm using Scala 2.11.7.
Broken Code - Wildcard import from companion object, inferred type:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}
Results in:
Cannot find JsonReader or JsonFormat type class for SampleAppObject.MyCaseClass
Note that the same thing happens with an explicit import of the myCaseClassSchemaFormat implicit.
Working Code #1 - Wildcard import from companion object, explicit type:
Adding an explicit type to the JsonFormat in the companion object causes the code to compile:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    // Explicit type added here now
    implicit val myCaseClassSchemaFormat: JsonFormat[MyCaseClass] = jsonFormat1(MyCaseClass)
  }
}
Working Code #2 - Implicits inline, inferred type:
However, putting the implicit parameters in-line where they are used, without the explicit type, also works!
import SampleApp._
import spray.json._

class SampleApp {
  import DefaultJsonProtocol._

  // Now an in-line custom JsonFormat rather than an imported one
  implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])
}
After searching for the error message Huw mentioned in his comment, I was able to find this StackOverflow question from 2010: Why does this explicit call of a Scala method allow it to be implicitly resolved?
This led me to this Scala issue created in 2008, and closed in 2011: https://issues.scala-lang.org/browse/SI-801 ('require explicit result type for implicit conversions?')
Martin stated:
I have implemented a slightly more permissive rule: An implicit conversion without explicit result type is visible only in the text following its own definition. That way, we avoid the cyclic reference errors. I close for now, to see how this works. If we still have issues we might come back to this.
This holds - if I re-order the breaking code so that the companion object is declared first, then the code compiles. (It's still a little weird!)
(I suspect I don't see the 'implicit method is not applicable here' message because I have an implicit value rather than a conversion - though I'm assuming here that the root cause is the same as the above).
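To make that concrete, here is the breaking example re-ordered as described, with the companion object (and its inferred-type implicit) declared before the class; as far as I can tell this compiles without the explicit type annotation:
import SampleApp._
import spray.json._

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    // No explicit type needed: the definition now textually precedes its use
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}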

Unable to override method from base class from Java library in Scala

I am trying to extend a class from a Java library and override the base method, but Eclipse keeps giving me errors even though the method signature is correct.
My code looks like this:
import someLibrary.ClassA
import someLibrary.TypeX
import someLibrary.TypeY
...

class MyClass extends ClassA {
  ...
  override def foo(s: TypeX, params: Array[String]): List[TypeY] = {
    ...
  }
}
The error I keep getting is:
overriding method foo in class ClassA of type (x$1: someLibrary.TypeX,
x$2: Array[String])java.util.List[someLibrary.TypeY]; method foo has
incompatible type
Note that my method signature is exactly the same.
EDIT:
After reading Ren's answer, here's how I fixed it:
import scala.collection.JavaConverters._
myScalaList.asJava
It could be that the original method returns a java.util.List, while your override returns a Scala List.
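For context, a sketch of what the fixed override might look like (someLibrary, ClassA, TypeX and TypeY are the placeholders from the question), keeping the Java return type and converting at the end:
import java.util.{List => JList}
import scala.collection.JavaConverters._

import someLibrary.ClassA
import someLibrary.TypeX
import someLibrary.TypeY

class MyClass extends ClassA {
  // The Java signature returns java.util.List, so the override must as well
  override def foo(s: TypeX, params: Array[String]): JList[TypeY] = {
    val results: List[TypeY] = ??? // build the result with ordinary Scala code
    results.asJava // convert the Scala List to java.util.List
  }
}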

spray-json for normal classes (non case) on a List

I'm finding myself in a situation in which I need to serialize a non-case class into JSON.
Having a class such as:
class MyClass(val name: String) {
  def SaySomething(): String = {
    return "Saying something... "
  }
}
I've created a JsonProtocol for this class:
object MyClassJsonProtocol extends DefaultJsonProtocol {
  implicit object MyClassJsonFormat extends JsonWriter[MyClass] {
    override def write(obj: MyClass): JsValue =
      JsObject(
        "name" -> JsString(obj.name)
      )
  }
}
Later on in the code I import the protocol:
val aListOfMyClasses = List[MyClass]() ... // let's assume it has items and is not an empty list
import spray.json._
import MyClassJsonProtocol._
val json = aListOfMyClasses.toJson
When trying to build the project I get the following error:
Cannot find JsonWriter or JsonFormat for type class List[MyClass]
spray-json already has a format for generic lists, and I'm providing a format for my class; what could the problem be?
Thanks in advance...!!!
When I extended MyClassJsonFormat from JsonFormat instead of JsonWriter, it started working fine. It looks like the CollectionFormats trait will only work if you extend from JsonFormat.
The following code compiles fine for me:
object MyClassJsonProtocol extends DefaultJsonProtocol {
  implicit object MyClassJsonFormat extends JsonFormat[MyClass] {
    override def write(obj: MyClass): JsValue =
      JsObject(
        "name" -> JsString(obj.name)
      )

    // Read back the "name" field rather than converting the whole object to a String
    override def read(json: JsValue): MyClass =
      new MyClass(json.asJsObject.fields("name").convertTo[String])
  }
}
The reason seems to be mentioned here:
An issue you might run into with just JsonReader/JsonWriter is that
when you try to lookup JsonReader/JsonWriter for Option or a
collection, it looks for a JsonFormat for the contained type, which
will fail. Not sure if there is something I am missing that will fix
that issue.
You and I have run into this. I don't see any other way out at the moment than @user007's suggestion to use a full JsonFormat. That, itself, brings more difficulties, at least for me - I was planning to use the default reader for my class.
Oh, well...
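For reference, a small usage sketch with the JsonFormat-based protocol above (the output shown in the comment is approximate):
import spray.json._
import MyClassJsonProtocol._

val aListOfMyClasses = List(new MyClass("first"), new MyClass("second"))
val json = aListOfMyClasses.toJson // resolves because MyClassJsonFormat is a full JsonFormat
println(json.compactPrint)         // roughly: [{"name":"first"},{"name":"second"}]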

Scala importing a file in all files of a package

I need to use an implicit ordering that has been defined in an object in a file abc in the following way:
object abc {
  implicit def localTimeOrdering: Ordering[LocalDate] = Ordering.fromLessThan(_.isBefore(_))
}
So, I make a package object xyz inside a file 'package.scala' that in turn is in the package 'xyz' that has the files in which I need the implicit ordering to be applicable. I write something like this:
package object xyz {
  import abc._
}
It does not seem to work. If I manually write the implicit definition statement inside the package object, it works perfectly. What is the correct way to import the object (abc) such that all of its objects/classes/definitions can be used in my entire package 'xyz' ?
You cannot import the implicit conversions in that way; you will have to either:
Manually write them inside the object:
package object obj {
  implicit def etc // ...
}
Or obtain them via inheritance/mixins:
package object obj extends SomeClassOrTraitWithImplicits with AnotherTraitWithImplicits {
}
For this reason, you usually define your implicit conversions in traits or class definitions; that way you can do a bulk import with a single package object.
The usual pattern is to define a helper trait for each case.
trait SomeClass {
  // all the implicits here
}

object SomeClass extends SomeClass {}
Doing this would allow you to:
package object abc extends SomeClass with SomeOtherClass with AThirdClass {
  // all implicits are now available in scope.
}