A class imported from a companion object is not usable as a constructor parameter default value - scala

Consider the following code:
object Main extends App {
  object Project {
    case class Config(rules: Seq[String] = Seq.empty)
  }
  import Project._

  //case class Project(root: String, config: Config)                            // compiles fine
  //case class Project(root: String, config: Project.Config = Project.Config()) // compiles fine
  case class Project(root: String, config: Config = Config())                   // error: not found: type Config
}
Why does the last version not compile (same with Config = Config.apply())?

It is not clear to me if this is a bug or not, but here is why it produces an error:
Consider this, which works:
import Project._

object Project {
  case class Config()
}

case class Project(config: Config = Config())
When you add a default argument, the compiler generates a method that computes the default value. When the default belongs to a constructor parameter, that method is added to the companion object of the class. So the compiler will generate this method:
def <init>$default$1: Project.Config = Config()
which gets added to your Project companion object.
The Scala type checker builds a tree of Contexts, where each context holds a reference to the context of its outer scope. The generated method gets its own context, and that method's outer scope is the Project companion object.
When the type checker attempts to resolve Config(), it traverses all the enclosing contexts and cannot find Config (I am not sure why, and this may be a bug).
Once it has exhausted the contexts, it falls back to the imports, which include import Project._. The type checker is then happy because it can traverse the imports and find the apply method.
Now when you move the import below Project:
object Project {
  case class Config()
}

import Project._

case class Project(config: Config = Config())
In this case the imports available to the generated method do not include Project._ (this may also be a bug); I'm assuming that is because the import sits below the object definition, which is where the generated method lives. The type checker then throws an error because it can't find Config.
What appears to be happening is that, when resolving Config() from inside the generated method, the type checker can only use imports that appear above the Project companion object; unless the import comes before Project, it is simply not in scope for that method.
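Applied to the original snippet, that suggests the following arrangement should compile (a sketch based on the explanation above; the only change is that the import now precedes the companion object):
object Main extends App {
  import Project._   // the import now comes before the companion object

  object Project {
    case class Config(rules: Seq[String] = Seq.empty)
  }

  // The generated default method inside object Project can now see the
  // import above, so Config() resolves.
  case class Project(root: String, config: Config = Config())
}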
For those who wish to debug further, take a look at Contexts.lookupSymbol, which is where the lookup happens.

Related

How to invoke a class constructor if the class and object are in the same file

In MyFactory.scala, an object and a class are defined in the same file with the same name, like this:
package com.mydomain.app.module

object MyFactory {
  val a1 = "a1"
  val b1 = "b1"
}

class MyFactory(config: Configuration) {
  //blah....
}
The problem is that I cannot instantiate a MyFactory object in another class
var myFactory = new MyFactory(defaultConfiguration)
due to the error
not found: type MyFactory
All I did was an ordinary import:
import com.mydomain.app.module.MyFactory
What is the valid way to instantiate an object of the class, given that I can't modify anything in MyFactory.scala (legacy code)?
var myFactory = new MyFactory(defaultConfiguration)
is the valid way to instantiate an object of the class.
import com.mydomain.app.module.MyFactory should be enough to bring MyFactory (and its companion object) into scope.
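As a sanity check, here is a minimal, self-contained sketch showing that the single import brings both the class and its companion object into scope (Configuration here is a placeholder standing in for whatever type the legacy code actually uses):
// Minimal sketch; Configuration is a stand-in type for illustration.
package com.mydomain.app.module {
  class Configuration

  object MyFactory {
    val a1 = "a1"
    val b1 = "b1"
  }

  class MyFactory(config: Configuration)
}

package com.mydomain.app.client {
  import com.mydomain.app.module.{Configuration, MyFactory}

  object Demo {
    val defaultConfiguration = new Configuration

    // The single import brings in both the type MyFactory (for `new`)
    // and the companion object (for MyFactory.a1).
    var myFactory = new MyFactory(defaultConfiguration)
    val a = MyFactory.a1
  }
}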
Sometimes "object app is not a member of package com.mydomain" can mean that you're trying to recompile MyFactory.scala referring to something not compiled in com.mydomain...
Try mvn clean compile.

Unable to declare functor type that takes zero parameters?

I'm trying to make a type alias for the function type () => Unit. I use this signature quite a bit for cleanup callback functions, and I'd like to give them more meaningful names.
I've tried the following, which I think should be correct syntax, but it doesn't compile:
package myPackage
import stuff
type CleanupCallback = () => Unit
trait myTrait ...
class mObject ...
Why doesn't it compile? And what is the correct syntax?
The compilation error is: expected class or object definition
You can't declare a type alias outside of a class/trait/object scope, but you can declare it in a package object, as follows:
package object myPackage {
  type CleanupCallback = () => Unit
}
It will be visible to all classes in myPackage.
You can also import it into classes that belong to other packages:
import myPackage.CleanupCallback

trait MyTrait {
  def foo: CleanupCallback
}
The IntelliJ IDEA Scala plugin supports creating package objects; alternatively (if you don't have the IDEA plugin):
Create a file package.scala in your package. The file must contain:
package object packageName { // the name must match the package name
  // ...
}
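Putting it together, a minimal end-to-end sketch might look like this (the file names and the Resource/App types are made up for illustration):
// myPackage/package.scala
package object myPackage {
  type CleanupCallback = () => Unit
}

// myPackage/Resource.scala -- inside myPackage the alias needs no import
package myPackage

class Resource {
  def onClose(callback: CleanupCallback): Unit = callback()
}

// other/App.scala -- elsewhere, import the alias explicitly
package other

import myPackage.{CleanupCallback, Resource}

object App {
  val cleanup: CleanupCallback = () => println("cleaning up")

  def main(args: Array[String]): Unit =
    new Resource().onClose(cleanup)
}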

ficus configuration load generic

Loading a ficus configuration like
def loadConfiguration[T <: Product](): T = {
  import net.ceedubs.ficus.readers.ArbitraryTypeReader._
  import net.ceedubs.ficus.Ficus._

  val config: Config = ConfigFactory.load()
  config.as[T]
}
fails with:
Cannot generate a config value reader for type T, because it has no apply method in a companion object that returns type T, and it doesn't have a primary constructor
When I directly specify a concrete case class (e.g. SomeClass) instead of T, it works just fine. What am I missing here?
Ficus uses the type class pattern, which allows you to constrain generic types by specifying operations that must be available for them. Ficus also provides type class instance "derivation", which in this case is powered by a macro that can inspect the structure of a specific case class-like type and automatically create a type class instance.
The problem in this case is that T isn't a specific case class-like type—it's any old type that extends Product, which could be something nice like this:
case class EasyToDecode(a: String, b: String, c: String)
But it could also be:
trait X extends Product {
  val youWillNeverDecodeMe: String
}
The macro you've imported from ArbitraryTypeReader has no idea which of these it will get, since T is generic here. So you'll need a different approach.
The relevant type class here is ValueReader, and you could minimally change your code to something like the following to make sure T has a ValueReader instance (note that the T: ValueReader syntax here is what's called a "context bound"):
import net.ceedubs.ficus.Ficus._
import net.ceedubs.ficus.readers.ValueReader
import com.typesafe.config.{ Config, ConfigFactory }

def loadConfiguration[T: ValueReader]: T = {
  val config: Config = ConfigFactory.load()
  config.as[T]
}
This specifies that T must have a ValueReader instance (which allows us to use .as[T]) but says nothing else about T, or about where its ValueReader instance needs to come from.
The person calling this method with a concrete type MyType then has several options. Ficus provides instances that are automatically available everywhere for many standard library types, so if MyType is e.g. Int, they're all set:
scala> ValueReader[Int]
res0: net.ceedubs.ficus.readers.ValueReader[Int] = net.ceedubs.ficus.readers.AnyValReaders$$anon$2@6fb00268
If MyType is a custom type, then either they can manually define their own ValueReader[MyType] instance, or they can import one that someone else has defined, or they can use generic derivation (which is what ArbitraryTypeReader does).
The key point here is that the type class pattern allows you as the author of a generic method to specify the operations you need, without saying anything about how those operations will be defined for a concrete type. You just write T: ValueReader, and your caller imports ArbitraryTypeReader as needed.
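For example, a caller decoding a custom type might look like this (a sketch, assuming the loadConfiguration defined above is in scope; ServerConfig and its fields are made up and would have to match the keys in your application.conf):
import net.ceedubs.ficus.Ficus._
import net.ceedubs.ficus.readers.ArbitraryTypeReader._ // derives ValueReader instances at the call site

// Hypothetical case class mirroring the top-level keys of application.conf
case class ServerConfig(host: String, port: Int)

object ConfigLoading {
  // The ValueReader[ServerConfig] derived here by ArbitraryTypeReader
  // satisfies the context bound on loadConfiguration.
  val serverConfig: ServerConfig = loadConfiguration[ServerConfig]
}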

How does Scala use explicit types when resolving implicits?

I have the following code which uses spray-json to deserialise some JSON into a case class, via the parseJson method.
Depending on where the implicit JsonFormat[MyCaseClass] is defined (in-line or imported from companion object), and whether there is an explicit type provided when it is defined, the code may not compile.
I don't understand why importing the implicit from the companion object requires it to have an explicit type when it is defined, but if I put it inline, this is not the case?
Interestingly, IntelliJ correctly locates the implicit parameters (via cmd-shift-p) in all cases.
I'm using Scala 2.11.7.
Broken Code - Wildcard import from companion object, inferred type:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}
Results in:
Cannot find JsonReader or JsonFormat type class for SampleAppObject.MyCaseClass
Note that the same thing happens with an explicit import of the myCaseClassSchemaFormat implicit.
Working Code #1 - Wildcard import from companion object, explicit type:
Adding an explicit type to the JsonFormat in the companion object causes the code to compile:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    // Explicit type added here now
    implicit val myCaseClassSchemaFormat: JsonFormat[MyCaseClass] = jsonFormat1(MyCaseClass)
  }
}
Working Code #2 - Implicits inline, inferred type:
However, putting the implicit parameters in-line where they are used, without the explicit type, also works!
import SampleApp._
import spray.json._

class SampleApp {
  import DefaultJsonProtocol._

  // Now in-line custom JsonFormat rather than imported
  implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])
}
After searching for the error message Huw mentioned in his comment, I was able to find this StackOverflow question from 2010: Why does this explicit call of a Scala method allow it to be implicitly resolved?
This led me to this Scala issue created in 2008, and closed in 2011: https://issues.scala-lang.org/browse/SI-801 ('require explicit result type for implicit conversions?')
Martin stated:
I have implemented a slightly more permissive rule: An implicit conversion without explicit result type is visible only in the text following its own definition. That way, we avoid the cyclic reference errors. I close for now, to see how this works. If we still have issues we might come back to this.
This holds - if I re-order the breaking code so that the companion object is declared first, then the code compiles. (It's still a little weird!)
(I suspect I don't see the 'implicit method is not applicable here' message because I have an implicit value rather than a conversion - though I'm assuming here that the root cause is the same as the above).
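For reference, the re-ordered version mentioned above would look like this (a sketch; it is identical to the broken code except that the SampleApp companion object is now declared before the class):
import SampleApp._
import spray.json._

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    // Still no explicit type: the definition now textually precedes the
    // point where the implicit is needed, so the more permissive rule
    // from SI-801 applies.
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}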

Discover if a class has been declared inside a particular module type

I am writing a DSL which describes a structure. The DSL uses Types to reference classes which will later be instantiated. I would like to enforce that a particular Type has been declared within a particular module. This can be a runtime check after the DSL has been compiled.
In essence I need to access the outer element starting from the inner class and check it is of the correct type and get a reference to it.
If I get the child type, I can get its symbol using reflection by calling typeSymbol, and on the resulting symbol I can call owner to get the outer type symbol. However, if I try to reflect the parent as a module (even when the parent is a module), it throws an exception.
Let's look at an example:
trait TheMixin

object TheParent extends TheMixin {
  case class TheChild()
}

object TestDiscoverParent extends App {
  import scala.reflect.runtime.{currentMirror => cm, universe => ru}

  val theChildType = ru.typeOf[TheParent.TheChild]
  val theChildOwner = theChildType.typeSymbol.owner
  println(theChildOwner)

  val theParentModuleSymbol = theChildOwner.asModule
  val theParentRef = cm.reflectModule(theParentModuleSymbol).instance
  println(theParentRef.isInstanceOf[TheMixin])
}
In this example, the line println(theChildOwner) will print
object TheParent
But the call to theChildOwner.asModule throws an exception:
Exception in thread "main" scala.ScalaReflectionException: object TheParent is not a module
How can I get a reference to the outer object wrapper?
There seems to be one more level of indirection: the owner is a "module class". I'm not exactly sure what that means; probably just that, technically, behind every singleton object there is also a class that is instantiated once.
So the following seems to work:
object TestDiscoverParent extends App {
  import scala.reflect.runtime.{currentMirror => cm, universe => ru}

  val theChildType = ru.typeOf[TheParent.TheChild]
  val theChildOwner = theChildType.typeSymbol.owner
  println(theChildOwner)

  require(theChildOwner.isModuleClass)
  val theParentModuleClass = theChildOwner.asClass
  val theParentModuleSymbol = theParentModuleClass.module.asModule
  val theParentRef = cm.reflectModule(theParentModuleSymbol).instance
  println(theParentRef.isInstanceOf[TheMixin])
}
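If the DSL needs this check in more than one place, the same steps can be wrapped in a small helper (a sketch built directly on the code above; ModuleCheck and enclosingObject are made-up names):
import scala.reflect.runtime.{currentMirror => cm, universe => ru}

object ModuleCheck {
  // Returns the instance of the singleton object inside which T was
  // declared, or None if T's owner is not a module class.
  def enclosingObject[T: ru.TypeTag]: Option[Any] = {
    val owner = ru.typeOf[T].typeSymbol.owner
    if (owner.isModuleClass)
      Some(cm.reflectModule(owner.asClass.module.asModule).instance)
    else
      None
  }
}

// Usage with the types from the example above:
// ModuleCheck.enclosingObject[TheParent.TheChild].exists(_.isInstanceOf[TheMixin])  // true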