Scaladoc link to another method - scala

I have two methods on my companion object (model.Product):
def apply(p:ProductSyntax)(rs: WrappedResultSet): Product
def apply(p: ResultName[Product])(rs: WrappedResultSet): Product
The first method delegates to the second and I would like to indicate this in the docs. I tried using:
/**
 * delegates to [[apply]]
 */
But scaladoc complains that this is ambiguous but tells me that
(p: scalikejdbc.ResultName[model.Product])(rs: scalikejdbc.WrappedResultSet): model.Product in object Product
is an option
However I can't work out how to tell scaladoc to use this method. I tried
/**
 * Delegates to [[apply(scalikejdbc.ResultName[model.Product])(scalikejdbc.WrappedResultSet):model.Product]]
 */
But it tells me that no member is found:
Could not find any member to link for "apply(scalikejdbc.ResultName[model.Product])(scalikejdbc.WrappedResultSet):model.Product".
How would I add a link to the def apply(p: ResultName[Product])(rs: WrappedResultSet): Product method?

So this is what I discovered:
Everything must be fully qualified, even the class/object itself
Package dots should be escaped with \
You cannot use any spaces in the signature
Parameters should include the name, not just the type, i.e. foo(a:String) not foo(String)
The signature should end with a *
Finally this worked:
[[apply(p:scalikejdbc\.ResultName[model\.Product])(rs:scalikejdbc\.WrappedResultSet):model\.Product*]]
HOWEVER ... the backslash escapes and the * also appear in the generated HTML!
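Putting the rules above together, this is a sketch of what the full comment might look like. The package name, the method bodies, and the `p.resultName` call are assumptions, not from the original post; also, in recent Scaladoc versions you may be able to add a label after the signature (separated by a space) so that the escapes and the `*` don't show up in the rendered HTML:

```scala
package model

import scalikejdbc._

object Product {
  /**
   * Delegates to
   * [[apply(p:scalikejdbc\.ResultName[model\.Product])(rs:scalikejdbc\.WrappedResultSet):model\.Product* apply]]
   */
  def apply(p: ProductSyntax)(rs: WrappedResultSet): Product =
    apply(p.resultName)(rs)  // assumed delegation; adapt to your actual code

  def apply(p: ResultName[Product])(rs: WrappedResultSet): Product = ???
}
```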

Related

VS Code @typedef capitalization

I noticed today while working on a very small JS file that my defined type wasn't allowing intellisense when a variable was declared later on as being of that type. Turns out that changing the typedef to have a capital letter fixes the issue. My type definition is at the top of the file and the variable that is of that type is within an IIFE.
If I move the type definition inside the IIFE, then it works no matter the case of the type's name. However, leaving the type definition at the top of the file (outside the IIFE) and making the name capitalized also makes it work.
Is it documented anywhere that a capitalized type definition makes it global?
EDIT: Adding a couple of screenshots. This seems hard to reproduce consistently with simple examples.
Non-working
Working
EDIT_2: It seems to be related to having a variable with the exact same name as the type definition.
/**
 * An object that stores all the necessary contextual data
 * Defined inside the HTML file loaded for the alert
 * @typedef {Object} Test
 * @prop {Number} personId person_id of the patient
 */
(function() {
  /** @type {Test} */
  const Test = window.Test;
  Test
})();
You should consider the @typedef as being a variable – one that's only usable in your type declarations.
That means that you can override that variable in your local scope, so that it means something else there.
And that's exactly what you are doing here.

Why does spray-json use this hierarchy in RootJsonFormat?

Recently, I have been reading the source code of spray-json. I noticed the following hierarchy in JsonFormat.scala; see the code snippet below:
/**
* A special JsonFormat signaling that the format produces a legal JSON root
* object, i.e. either a JSON array
* or a JSON object.
*/
trait RootJsonFormat[T] extends JsonFormat[T] with RootJsonReader[T] with RootJsonWriter[T]
To illustrate my confusion, I drew a diagram of the hierarchy.
Based on my limited knowledge of Scala, I thought JsonFormat[T] with could be removed from the above code. So I cloned the spray-json repository and commented out JsonFormat[T] with:
trait RootJsonFormat[T] extends RootJsonReader[T] with RootJsonWriter[T]
Then I compiled it in sbt (using the package/compile commands); compilation succeeded and produced spray-json_2.11-1.3.4.jar.
However, when I ran the test cases via sbt's test command, they failed.
So I would like to know why. Thanks in advance.
I suggest you not think of it in terms of OOP; think of it in terms of type classes. When some entity must be both serialized and deserialized, there is a type class JsonFormat that includes both JsonWriter and JsonReader. This is convenient since you don't need to search for two type class instances when you need both capabilities. But for this approach to work, there has to be an instance of the JsonFormat type class, which is why you can't just drop it from the hierarchy. For instance:
def myMethod[T](t: T)(implicit format: JsonFormat[T]): Unit = {
  format.read(format.write(t))
}
If you want this method to work properly there has to be a direct descendant of JsonFormat and a concrete implicit instance of it for a specific type T.
UPD: By creating an instance of the JsonFormat type class, you get instances for JsonWriter and JsonReader type classes automatically (in case when you need both). So this is also a way to reduce boilerplate.
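As a sketch of that point, one instance can serve both capabilities. This assumes spray-json's DefaultJsonProtocol; the User class and roundTrip helper are made-up examples, not from the original answer:

```scala
import spray.json._
import DefaultJsonProtocol._

case class User(name: String)

// A single RootJsonFormat instance acts as both JsonReader and JsonWriter.
implicit val userFormat: RootJsonFormat[User] = jsonFormat1(User)

// Requires only one implicit, yet both read and write are available.
def roundTrip[T](t: T)(implicit format: JsonFormat[T]): T =
  format.read(format.write(t))
```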

Where does type class inherit `.apply` in scala?

I am looking at this code
def loginForm = Form(mapping("username" -> text, "password" -> text)
(LoginRequest.apply)(LoginRequest.unapply))
case class LoginRequest(username:String, password:String)
Here is the source code for Form in Play
def mapping[R, A1, A2](a1: (String, Mapping[A1]),
                       a2: (String, Mapping[A2]))
                      (apply: Function2[A1, A2, R])
                      (unapply: Function1[R, Option[(A1, A2)]]): Mapping[R] = {
  new ObjectMapping2(apply, unapply, a1, a2)
}
I am trying to figure out what
LoginRequest.apply
actually means, and what
ObjectMapping2(LoginRequest.apply, LoginRequest.unapply, "username" -> text, "password" -> text)
does.
Your LoginRequest is a case class and as such contains the methods apply and unapply, inserted automatically by the compiler. Play needs them to know how to construct and deconstruct your domain object from the mapping. If you had some custom code you would need a custom apply/unapply, but since you're using a case class (which is the correct practice) you can simply pass its apply and unapply functions, which you get out of the box.
So, to answer your first question, LoginRequest.apply is an invocation of apply function available for every case class.
To answer your second question,
ObjectMapping2(LoginRequest.apply, LoginRequest.unapply, "username" -> text, "password" -> text)
is saying that a new LoginRequest will be created from the "username" and "password" strings by passing them to LoginRequest.apply, which is a constructor for your LoginRequest case class. It also says that deconstruction of your LoginRequest is done using LoginRequest.unapply, which will return Option[(String, String)] (one string for each parameter, in your case username and password).
EDIT:
I've been rightfully warned in the comments to be more precise. When you define a case class, the compiler automatically generates apply() and unapply() in its companion object. You can always include these methods yourself in the companion object of any class you define; in this situation the compiler does it for you. Method apply() is "special" in the sense that it enables syntactic sugar: instead of invoking Something.apply(), you can simply write Something().
Note that apply() and unapply() are not overridden or implemented; you (or in this case the compiler) are simply defining them from scratch just like any other custom method. But it's the "trick" in the Scala compiler that allows the syntax sugar. So, if you define a method apply() in MyClass's companion object, you are of course allowed to call it in the usual way as MyClass.apply(whatever), but also as MyClass(whatever) (that's the trick part).
You use this all the time even if you're sometimes not aware of it - for example, List(1, 2, 3) is actually desugared by the compiler into List.apply(1, 2, 3). This trick allows us programmers to write prettier and more readable code (List.apply() is ugly, isn't it?)
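A minimal sketch of the desugaring described above, in plain Scala with no Play involved (the "alice"/"secret" values are just illustrative):

```scala
case class LoginRequest(username: String, password: String)

// The compiler generates apply and unapply on the companion object:
val req: LoginRequest = LoginRequest("alice", "secret")           // sugar for LoginRequest.apply("alice", "secret")
val parts: Option[(String, String)] = LoginRequest.unapply(req)   // deconstructs back into the field values
```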
Unrelated to this particular question, but let me mention it too – the compiler will also add some other methods for case classes: copy, toString, equals and hashCode. Unlike apply and unapply, these methods are added to the class itself (so instead of being invoked on the companion object like apply and unapply, they need to be invoked on an instance of the case class).

Is there a good way to determine the package of a type in a Scala macro?

I'm writing a macro that needs to determine the package of the type it's parameterised on. It's possible with something like:
def macroImpl[T: c.WeakTypeTag](c: Context) = {
  import c.universe._
  val typ = weakTypeOf[T]
  val pkg = typ.typeSymbol.fullName.stripSuffix(s".${typ.typeSymbol.name}")
}
but this feels distinctly hacky. Is there a better approach?
This need not be in a macro, but nothing here would preclude doing so. As long as you have a type symbol, you can continually check the type's owner until you find a package:
def owners[T: WeakTypeTag] =
  Iterator.iterate(weakTypeOf[T].typeSymbol.owner)(_.owner)
The first package-class element of this iterator is the enclosing package symbol. If all you care about is the package name, you could do (note that package is a reserved word, so the method needs a different name):
def packageName[T: WeakTypeTag] =
  owners[T].find(_.isPackageClass).get.fullName
The documentation on symbol reflection has a good note about the owner property (my bolding):
Symbols are organized in a hierarchy. For example, a symbol that represents a parameter of a method is owned by the corresponding method symbol, a method symbol is owned by its enclosing class, trait, or object, a class is owned by a containing package and so on.
If a symbol does not have an owner, for example, because it refers to a top-level entity, such as a top-level package, then its owner is the special NoSymbol singleton object. Representing a missing symbol, NoSymbol is commonly used in the API to denote an empty or default value. Accessing the owner of NoSymbol throws an exception. See the API docs for the general interface provided by type Symbol.
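As a self-contained sketch of the owner-chain walk, here is the same idea using runtime reflection rather than a macro (the Symbol API is shared between the two); enclosingPackage is a hypothetical helper name, not from the original answer:

```scala
import scala.reflect.runtime.universe._

// Walk the owner chain of a type's symbol until we reach a package class.
def enclosingPackage[T: TypeTag]: Symbol =
  Iterator.iterate(typeOf[T].typeSymbol.owner)(_.owner)
    .find(_.isPackageClass)
    .get
```

For example, enclosingPackage[Option[Int]] should be the scala package symbol.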

Invoke a method with a named parameter through reflection

Say I define the following case class:
case class A(x: Int, y: String, s: Double)
and I want to be able dynamically call the copy method via reflection to achieve something like the following:
val a1 = A(1, "hello", 2.3)
val a2 = a1.copy(y = "goodbye") // Do this with reflection???
Is it possible to do the copy via reflection?
Thanks
Des
At the moment, Scala's runtime reflection library doesn't provide a convenient way to invoke methods with named/default arguments. The current API only allows reflective calls when the caller explicitly provides arguments for all parameters, in their declaration order. Please submit an issue to our bug tracker, and I'll be happy to look into it.
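Given that limitation, a sketch of a workaround is to call copy reflectively with every argument supplied positionally, reading the unchanged fields from the original instance (this assumes the Scala 2.11 reflection API; the variable names are illustrative):

```scala
import scala.reflect.runtime.{universe => ru}

case class A(x: Int, y: String, s: Double)

val a1 = A(1, "hello", 2.3)

// Look up the copy method on A and invoke it through an instance mirror,
// passing all three arguments in declaration order.
val mirror     = ru.runtimeMirror(a1.getClass.getClassLoader)
val copyMethod = ru.typeOf[A].decl(ru.TermName("copy")).asMethod
val a2 = mirror.reflect(a1)
  .reflectMethod(copyMethod)(a1.x, "goodbye", a1.s)
  .asInstanceOf[A]
```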