Unable to declare functor type that takes zero parameters? - scala

I'm trying to make a type definition for the function type () => Unit. I use this signature quite a bit for cleanup callback functions, and I'd like to give them a more meaningful name.
I've tried the following, which I think should be correct syntax, but it doesn't compile:
package myPackage
import stuff
type CleanupCallback = () => Unit
trait myTrait ...
class mObject ...
Why doesn't it compile? And what is the correct syntax?
The compilation error is: expected class or object definition

You can't declare a type alias outside of a class, trait, or object. But you can declare it in a package object as follows:
package object myPackage {
  type CleanupCallback = () => Unit
}
It will be visible to all classes in myPackage.
You can also import it into classes that belong to other packages:
import myPackage.CleanupCallback
trait MyTrait {
  def foo: CleanupCallback
}
The IntelliJ IDEA Scala plugin supports creating package objects directly. If you don't have the plugin, create a file named package.scala in your package. The file must contain:
package object packageName { // the name must match the package name
  // ...
}
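Putting it together with the original example, the layout would look roughly like this (a sketch; myTrait is from the question, while onCleanup is an illustrative member):
// file: myPackage/package.scala
package object myPackage {
  type CleanupCallback = () => Unit
}

// file: myPackage/myTrait.scala
package myPackage

trait myTrait {
  // the alias is in scope everywhere inside myPackage, no import needed
  def onCleanup(cb: CleanupCallback): Unit
}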

Related

A class imported from a companion not usable as the constructor parameter default value

Consider the following code:
object Main extends App {
  object Project {
    case class Config(rules: Seq[String] = Seq.empty)
  }
  import Project._
  //case class Project(root: String, config: Config) // compiles fine
  //case class Project(root: String, config: Project.Config = Project.Config()) // compiles fine
  case class Project(root: String, config: Config = Config()) // error: not found: type Config
}
Why does the last version not compile (same with Config = Config.apply())?
It is not clear to me if this is a bug or not, but here is why it produces an error:
Consider this, which works:
import Project._
object Project {
  case class Config()
}
case class Project(config: Config = Config())
When you add a default argument, the compiler generates a method to compute the value. When that value is a constructor default, the method is added to the companion object of the class. So the compiler will generate this method:
def <init>$default$1: Project.Config = Config()
Which will get added to your Project object.
The Scala type checker generates an object tree of Contexts. Each context has a reference to the context of its outer scope. So the generated method gets a context, and that context's outer scope is the Project companion object.
When the type checker attempts to resolve Config() it traverses all the enclosing contexts and cannot find Config (I am not sure why, and this may be a bug).
Once it has exhausted the contexts, it falls back to the imports, which include import Project._! The type checker is happy because it can now walk the imports and find the apply method.
Now when you move the import below Project:
object Project {
  case class Config()
}
import Project._
case class Project(config: Config = Config())
In this case the imports available to the generated method do not include Project._ (this may also be a bug), presumably because the import sits below the object definition, which is where the generated method lives. The type checker then throws an error because it can't find Config.
In short: when the type checker resolves Config() for the generated default method, it can only do so through the import, and that import is only in scope for the generated method if it appears above the Project companion object.
For those who wish to debug further, take a look at Contexts.lookupSymbol, which is where the lookup happens.
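The practical workaround is already in the question's commented-out lines: qualify the default explicitly, so the generated method can resolve it without relying on the import. A minimal sketch:
object Main extends App {
  object Project {
    case class Config(rules: Seq[String] = Seq.empty)
  }
  // no import needed when the default is spelled out with its full path
  case class Project(root: String, config: Project.Config = Project.Config())
}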

How to handle different package names in different versions?

I have a 3rd-party library with the package foo.bar.
I normally use it as:
import foo.bar.{Baz => MyBaz}
object MyObject {
  val x = MyBaz.getX // some method defined in Baz
}
The new version of the library has renamed the package from foo.bar to newfoo.newbar. I now have another version of my code with one slight change:
import newfoo.newbar.{Baz => MyBaz}
object MyObject {
  val x = MyBaz.getX // some method defined in Baz
}
Notice that only the first import is different.
Is there any way I can keep the same version of my code and still switch between different versions of the 3rd party library as and when needed?
I need something like conditional imports, or an alternative way.
The other answer is on the right track but doesn't really get you all the way there. The most common way to do this kind of thing in Scala is to provide a base compatibility trait that has different implementations for each version. In my little abstracted library, for example, I have the following MacrosCompat for Scala 2.10:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.Context

  def resultType(c: Context)(tpe: c.Type)(implicit
    tag: ClassTag[c.universe.MethodType]
  ): c.Type = {
    import c.universe.MethodType

    tpe match {
      case MethodType(_, res) => resultType(c)(res)
      case other => other
    }
  }
}
And this one for 2.11:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.whitebox.Context

  def resultType(c: Context)(tpe: c.Type): c.Type = tpe.finalResultType
}
And then my classes, traits, and objects that use the macro reflection API can just extend MacrosCompat and they'll get the appropriate Context and an implementation of resultType for the version we're currently building (this is necessary because of changes to the macros API between 2.10 and 2.11).
(This isn't originally my idea or pattern, but I'm not sure who to attribute it to. Probably Eugene Burmako?)
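Whatever the shim contains, the consuming side looks the same for both versions. A sketch (SomeMacros is an illustrative name, not part of the library):
package io.travisbrown.abstracted.internal

private[abstracted] object SomeMacros extends MacrosCompat {
  // Context and resultType come from whichever MacrosCompat was compiled
  def returnType(c: Context)(tpe: c.Type): c.Type = {
    import c.universe._ // brings in the implicit tags the 2.10 variant needs
    resultType(c)(tpe)
  }
}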
If you're using SBT, there's special support for version-specific source trees—you can have a src/main/scala for your shared code and e.g. src/main/scala-2.10 and src/main/scala-2.11 directories for version-specific code, and SBT will take care of the rest.
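For example, a minimal build.sbt sketch (version numbers are illustrative); recent SBT releases add the scala-2.x directories automatically when cross-building, and on older releases you can wire them up by hand:
// build.sbt
crossScalaVersions := Seq("2.10.6", "2.11.8")

// only needed on older SBT releases that don't add these directories themselves
unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / s"scala-${scalaBinaryVersion.value}"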
You can try to use type aliases:
package myfoo
object mybar {
  type MyBaz = newfoo.newbar.Baz
  // val MyBaz = newfoo.newbar.Baz // if Baz is a case class/object, it needs to be aliased twice: as a type and as a value
}
You can then simply import myfoo.mybar._ and swap out the object mybar to switch to a different version of the library.
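The matching shim for the old version aliases the old package instead, and you choose which of the two files gets compiled (for instance via the version-specific source directories described above). A sketch:
package myfoo
object mybar {
  type MyBaz = foo.bar.Baz
  // val MyBaz = foo.bar.Baz // again, alias the value too if Baz has a companion
}
Client code such as the question's MyObject then compiles unchanged against either version of the library.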

Why do I get "expected class or object definition" when defining a type in Scala?

If I write something like this (to define Slick tables as per docs):
type UserIdentity = (String, String)
class UserIdentity(tag: Tag) {
  ...
}
I get a compile error: "expected class or object definition" pointing to the type declaration. Why?
You can't define type aliases outside of a class, trait, or object definition.
If you want a type alias available at the package level (so you don't have to explicitly import it), the easiest way around this is to define a package object, which has the same name as a package and allows you to define anything inside of it, including type aliases.
So if you have a foo.bar package and you wish to add a type alias, do this:
package foo
package object bar {
  type UserIdentity = (String, String)
}
// in another file
package foo.bar
val x: UserIdentity = ...
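Code outside foo.bar can still reach the alias with an ordinary import. A small sketch (the client package and names are illustrative):
package com.example.client

import foo.bar.UserIdentity

object Client {
  def describe(u: UserIdentity): String = s"${u._1} / ${u._2}"
}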

Scala importing a file in all files of a package

I need to use an implicit ordering that has been defined in an object in a file abc in the following way:
object abc {
  implicit def localTimeOrdering: Ordering[LocalDate] = Ordering.fromLessThan(_.isBefore(_))
}
So I make a package object xyz inside a file package.scala, which in turn is in the package xyz that contains the files where I need the implicit ordering to be applicable. I write something like this:
package object xyz {
  import abc._
}
It does not seem to work. If I manually write the implicit definition inside the package object, it works perfectly. What is the correct way to import the object abc so that all of its objects/classes/definitions can be used in my entire package xyz?
You cannot bring the implicits into scope that way; you will have to either:
Manually write them inside the package object:
package object obj {
  implicit def localTimeOrdering: Ordering[LocalDate] = Ordering.fromLessThan(_.isBefore(_)) // etc.
}
Or obtain them via inheritance/mixins:
package object obj extends SomeClassOrTraitWithImplicits with AnotherTraitWithImplicits {
}
For this reason, you usually define your implicit conversions in traits or class definitions; that way you can bulk-import them with a single package object.
The usual pattern is to define a helper trait for each case:
trait SomeClass {
  // all the implicits here
}
object SomeClass extends SomeClass {}
Doing this would allow you to:
package object abc extends SomeClass with SomeOtherClass with AThirdClass {
  // all implicits are now available in scope
}
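Applied to the question, that means moving the ordering out of object abc into a trait and mixing that trait into the xyz package object. A sketch, assuming java.time.LocalDate:
// file: abc.scala
import java.time.LocalDate

trait AbcImplicits {
  implicit def localTimeOrdering: Ordering[LocalDate] = Ordering.fromLessThan(_.isBefore(_))
}

object abc extends AbcImplicits // existing call sites keep working

// file: xyz/package.scala
package object xyz extends AbcImplicits
// every file in package xyz now sees the ordering without an import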

Types as a mold for delegated classes in Scala

I am working on the ScalaFX project. At the moment I am adapting classes from javafx.scene.control.cell. In this package, methods with the same signature are duplicated across many classes, e.g. StringConverter<T> converter(). To avoid unnecessary code duplication (and to learn how to use existential types), I created the following code:
// Defined in the scalafx.util package. All classes in scalafx use this trait
package scalafx.util

trait SFXDelegate[+D <: Object] extends AnyRef {
  def delegate: D

  override def toString = "[SFX]" + delegate.toString

  override def equals(ref: Any): Boolean = {
    ref match {
      case sfxd: SFXDelegate[_] => delegate.equals(sfxd.delegate)
      case _ => delegate.equals(ref)
    }
  }

  override def hashCode = delegate.hashCode
}
// Package object
package scalafx.scene.control

import javafx.{ util => jfxu }
import javafx.beans.{ property => jfxbp }
import javafx.scene.{ control => jfxsc }
import scalafx.Includes._
import scalafx.beans.property.ObjectProperty
import scalafx.util.SFXDelegate
import scalafx.util.StringConverter

package object cell {
  type Convertable[T] = {
    def converterProperty: jfxbp.ObjectProperty[jfxu.StringConverter[T]]
  }

  type JfxConvertableCell[T] = jfxsc.Cell[T] with Convertable[T]

  trait ConvertableCell[C <: JfxConvertableCell[T], T] extends SFXDelegate[C] {
    def converter: ObjectProperty[StringConverter[T]] = ObjectProperty(delegate.converterProperty.getValue)
    def converter_=(v: StringConverter[T]) {
      converter() = v
    }
  }
}
With the JfxConvertableCell type I want to say:
My type is a javafx.scene.control.Cell of type T that has a method called converterProperty that returns a javafx.beans.property.ObjectProperty of type javafx.util.StringConverter[T].
In the ConvertableCell trait, my intention is to say that the delegate value (from the SFXDelegate trait) must be of type JfxConvertableCell. The first class I tried to create was the counterpart of CheckBoxListCell:
package scalafx.scene.control.cell

import javafx.scene.control.{cell => jfxscc}
import scalafx.scene.control.ListCell
import scalafx.util.SFXDelegate

class CheckBoxListCell[T](override val delegate: jfxscc.CheckBoxListCell[T] = new jfxscc.CheckBoxListCell[T])
  extends ListCell[T](delegate)
  with ConvertableCell[jfxscc.CheckBoxListCell[T], T]
  with SFXDelegate[jfxscc.CheckBoxListCell[T]] {
}
However, at this point I got this message from the compiler:
type arguments [javafx.scene.control.cell.CheckBoxListCell[T],T] do not conform to trait ConvertableCell's type parameter bounds [C <: scalafx.scene.control.cell.package.JfxConvertableCell[T],T]
Did I misunderstand something? CheckBoxListCell has the converterProperty method. Can't we use types and existential types as a mold into which we fit our delegated classes?
The problem is in your definition of converterProperty. You declared it as a parameterless method, while the Java method is seen by Scala as a method with an empty parameter list.
Just doing this makes it compile properly:
type Convertable[T] = {
  def converterProperty(): jfxbp.ObjectProperty[jfxu.StringConverter[T]]
}
While Scala treats a parameterless method and a method with an empty parameter list as essentially the same thing as far as overriding is concerned (see the Scala spec, section 5.1.4), they are still different entities.
And when interoperating with Java code (which has no notion of a parameterless method), a nullary method is seen as a method with an empty parameter list, not as a parameterless method, so the structural types don't match.
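The overriding half of that statement is easy to see in plain Scala. A minimal sketch (names are illustrative; newer compilers emit a deprecation warning for this):
object MatchingDemo {
  trait HasF { def f(): Int }           // declared with an empty parameter list
  class Impl extends HasF { def f = 1 } // implemented as parameterless: allowed, the members "match"
}
This is the same mechanism that lets override def toString = "..." override Java's toString().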