I need to use an implicit ordering that has been defined in an object abc in the following way:
import java.time.LocalDate

object abc {
  implicit def localTimeOrdering: Ordering[LocalDate] = Ordering.fromLessThan(_.isBefore(_))
}
So, I make a package object xyz inside a file 'package.scala', which in turn is in the package 'xyz' containing the files in which I need the implicit ordering to be applicable. I write something like this:
package object xyz {
  import abc._
}
It does not seem to work. If I manually write the implicit definition statement inside the package object, it works perfectly. What is the correct way to import the object (abc) such that all of its objects/classes/definitions can be used in my entire package 'xyz' ?
You cannot bring the implicits into scope for the whole package by importing them inside the package object; you will have to either:
Manually write them inside the package object:
package object xyz {
  implicit def localTimeOrdering: Ordering[LocalDate] = Ordering.fromLessThan(_.isBefore(_))
}
Or obtain them via inheritance/mixins:
package object xyz extends SomeClassOrTraitWithImplicits with AnotherTraitWithImplicits {
}
For this reason, you usually define your implicits in traits or classes; that way you can bring them all into scope at once through a single package object.
The usual pattern is to define a helper trait for each case:
trait SomeClass {
  // all the implicits here
}

object SomeClass extends SomeClass {}
Doing this would allow you to:
package object xyz extends SomeClass with SomeOtherClass with AThirdClass {
  // all implicits are now available in scope
}
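Putting the pieces together for the ordering in the question, a minimal sketch (the trait name LocalDateImplicits is illustrative, not part of the original code):

// abc.scala
import java.time.LocalDate

trait LocalDateImplicits {
  implicit def localTimeOrdering: Ordering[LocalDate] = Ordering.fromLessThan(_.isBefore(_))
}

object abc extends LocalDateImplicits // still usable via `import abc._`

// xyz/package.scala
package object xyz extends LocalDateImplicits // every file in package xyz now sees the ordering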
I'm trying to make a type definition for the function type () => Unit; I use this signature quite a bit for cleanup callback functions, and I'd like to give them more meaningful names.
I've tried the following, which I think should be correct syntax, but it doesn't compile:
package myPackage
import stuff
type CleanupCallback = () => Unit
trait myTrait ...
class mObject ...
Why doesn't it compile? And what is the correct syntax?
The compilation error is: expected class or object definition
You can't declare a type alias outside of a class, trait, or object. But you can declare it in a package object, as follows:
package object myPackage {
  type CleanupCallback = () => Unit
}
It will be visible to all classes in myPackage.
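For example, any class in the same package can use the alias without an import (FileWatcher is just an illustrative name):

package myPackage

class FileWatcher {
  def onClose(callback: CleanupCallback): Unit = callback()
}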
You can also import it into classes that belong to other packages:
import myPackage.CleanupCallback

trait MyTrait {
  def foo: CleanupCallback
}
The IntelliJ IDEA Scala plugin supports creating package objects directly; if you don't use it, you can create one by hand:
Create a file package.scala in your package. The file must contain:
package object packageName { // the name must match the package name
  // ...
}
I'm writing a set of implicit Scala wrapper classes for an existing Java library (so that I can decorate that library to make it more convenient for Scala developers).
As a trivial example, let's say that the Java library (which I can't modify) has a class such as the following:
public class Value<T> {
    // Etc.
    public void setValue(T newValue) {...}
    public T getValue() {...}
}
Now let's say I want to decorate this class with Scala-style getters and setters. I can do this with the following implicit class:
final implicit class RichValue[T](private val v: Value[T])
  extends AnyVal {
  // Etc.
  def value: T = v.getValue
  def value_=(newValue: T): Unit = v.setValue(newValue)
}
The implicit keyword tells the Scala compiler that it can convert instances of Value to be instances of RichValue implicitly (provided that the latter is in scope). So now I can apply methods defined within RichValue to instances of Value. For example:
def increment(v: Value[Int]): Unit = {
  v.value = v.value + 1
}
(Agreed, this isn't very nice code, and is not exactly functional. I'm just trying to demonstrate a simple use case.)
Unfortunately, Scala does not allow implicit classes to be top-level, so they must be defined within a package object, object, class or trait and not just in a package. (I have no idea why this restriction is necessary, but I assume it's for compatibility with implicit conversion functions.)
However, I'm also extending RichValue from AnyVal to make this a value class. If you're not familiar with them, they allow the Scala compiler to make allocation optimizations. Specifically, the compiler does not always need to create instances of RichValue, and can operate directly on the value class's constructor argument.
In other words, there's very little performance overhead from using a Scala implicit value class as a wrapper, which is nice. :-)
However, a major restriction of value classes is that they cannot be defined within a class or a trait; they can only be members of packages, package objects or objects. (This is so that they do not need to maintain a pointer to the outer class instance.)
An implicit value class must honor both sets of constraints, so it can only be defined within a package object or an object.
And therein lies the problem. The library I'm wrapping contains a deep hierarchy of packages with a huge number of classes and interfaces. Ideally, I want to be able to import my wrapper classes with a single import statement, such as:
import mylib.implicits._
to make using them as simple as possible.
The only way I can currently see of achieving this is to put all of my implicit value class definitions inside a single package object (or object) within a single source file:
package mylib
package object implicits {

  implicit final class RichValue[T](private val v: Value[T])
    extends AnyVal {
    // ...
  }

  // Etc. with hundreds of other such classes.
}
However, that's far from ideal, and I would prefer to mirror the package structure of the target library, yet still bring everything into scope via a single import statement.
Is there a straightforward way of achieving this that doesn't sacrifice any of the benefits of this approach?
(For example, I know that if I forego making these wrappers value classes, then I can define them within a number of different traits - one for each component package - and have my root package object extend all of them, bringing everything into scope through a single import, but I don't want to sacrifice performance for convenience.)
implicit final class RichValue[T](private val v: Value[T]) extends AnyVal
is essentially syntactic sugar for the following two definitions:
import scala.language.implicitConversions // or use a compiler flag
final class RichValue[T](private val v: Value[T]) extends AnyVal
@inline implicit def RichValue[T](v: Value[T]): RichValue[T] = new RichValue(v)
(which, as you can see, is why implicit classes have to be defined inside traits, objects or classes: they come with a matching def)
There is nothing that requires those two definitions to live together. You can put them into separate objects:
object wrappedLibValues {
  final class RichValue[T](private val v: Value[T]) extends AnyVal {
    // lots of implementation code here
  }
}

object implicits {
  @inline implicit def RichValue[T](v: Value[T]): wrappedLibValues.RichValue[T] = new wrappedLibValues.RichValue(v)
}
Or into traits:
object wrappedLibValues {
  final class RichValue[T](private val v: Value[T]) extends AnyVal {
    // implementation here
  }

  trait Conversions {
    @inline implicit def RichValue[T](v: Value[T]): RichValue[T] = new RichValue(v)
  }
}
object implicits extends wrappedLibValues.Conversions
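This also lets you mirror the library's package structure and still offer the single import the question asks for. A rough sketch (package, object and trait names are illustrative, and the import of the Java library's Value class is omitted):

// mylib/values/wrappers.scala
package mylib.values

object wrappers {
  final class RichValue[T](private val v: Value[T]) extends AnyVal {
    def value: T = v.getValue
    def value_=(newValue: T): Unit = v.setValue(newValue)
  }

  trait Conversions {
    @inline implicit def RichValue[T](v: Value[T]): RichValue[T] = new RichValue(v)
  }
}

// mylib/implicits/package.scala
package mylib

package object implicits extends values.wrappers.Conversions
// mix in further Conversions traits here, one per wrapped sub-package

// client code elsewhere:
// import mylib.implicits._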
I'm trying to import an object from another .scala file that doesn't exist inside a class. I've found how to import a class (see "Scala, importing class"). Is there a way to import an object without having a class around it?
Thanks
Importing a class and importing an object work the same way in Scala.
If you have a class
package com.package1
class MyClass{}
and an object
package com.package2
object MyObject{}
you import both in exactly the same way:
package com.package3
import com.package1.MyClass
import com.package2.MyObject
Import syntax is the same no matter what you are importing, whether it's an object, a class, a trait, a method, or a field.
Yes, Scala can do exactly what you ask, and this is used frequently. Here is an example:
object Blah {
  val x = 1
  val y = "hello"
}

object Main extends App {
  import Blah._
  println(s"x=$x; y=$y")
}
Output is:
x=1; y=hello
You can also import members of a class instance, which blew my mind the first time I saw it.
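For instance (Point here is just an illustrative class):

class Point(val x: Int, val y: Int)

object Demo extends App {
  val p = new Point(1, 2)
  import p._ // brings this particular instance's members into scope
  println(s"x=$x; y=$y")
}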
If you are talking about companion objects, they are not defined inside a class, but after the class definition:
class AClass {
  def sayHello() = {
    println(AClass.Hello)
  }
}

object AClass {
  private val Hello = "hello"
}
You should have no problem importing it.
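For example, from another package (a sketch that assumes AClass lives in a package called mypackage):

package otherpackage

import mypackage.AClass // the import picks up both the class and its companion

object Greeter {
  def greet(): Unit = new AClass().sayHello()
}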
I have the following code which uses spray-json to deserialise some JSON into a case class, via the parseJson method.
Depending on where the implicit JsonFormat[MyCaseClass] is defined (in-line or imported from companion object), and whether there is an explicit type provided when it is defined, the code may not compile.
I don't understand why importing the implicit from the companion object requires it to have an explicit type when it is defined, but if I put it inline, this is not the case?
Interestingly, IntelliJ correctly locates the implicit parameters (via cmd-shift-p) in all cases.
I'm using Scala 2.11.7.
Broken Code - Wildcard import from companion object, inferred type:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}
Results in:
Cannot find JsonReader or JsonFormat type class for SampleAppObject.MyCaseClass
Note that the same thing happens with an explicit import of the myCaseClassSchemaFormat implicit.
Working Code #1 - Wildcard import from companion object, explicit type:
Adding an explicit type to the JsonFormat in the companion object causes the code to compile:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    // Explicit type added here now
    implicit val myCaseClassSchemaFormat: JsonFormat[MyCaseClass] = jsonFormat1(MyCaseClass)
  }
}
Working Code #2 - Implicits inline, inferred type:
However, putting the implicit parameters in-line where they are used, without the explicit type, also works!
import SampleApp._
import spray.json._

class SampleApp {
  import DefaultJsonProtocol._

  // Now in-line custom JsonFormat rather than imported
  implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])
}
After searching for the error message Huw mentioned in his comment, I was able to find this StackOverflow question from 2010: Why does this explicit call of a Scala method allow it to be implicitly resolved?
This led me to this Scala issue created in 2008, and closed in 2011: https://issues.scala-lang.org/browse/SI-801 ('require explicit result type for implicit conversions?')
Martin stated:
I have implemented a slightly more permissive rule: An implicit conversion without explicit result type is visible only in the text following its own definition. That way, we avoid the cyclic reference errors. I close for now, to see how this works. If we still have issues we might come back to this.
This holds - if I re-order the breaking code so that the companion object is declared first, then the code compiles. (It's still a little weird!)
(I suspect I don't see the 'implicit method is not applicable here' message because I have an implicit value rather than a conversion - though I'm assuming here that the root cause is the same as the above).
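For reference, the reordering mentioned above looks roughly like this; with the companion object (and therefore the untyped implicit) textually before its use, the otherwise-broken example compiles:

import SampleApp._
import spray.json._

// Companion object (and the inferred-type implicit) now comes first
object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""
  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}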
An old trick I used in my previous Java projects was to create, e.g., a FileUtils class offering helper functions for common file operations needed by my project and not covered by, e.g., org.apache.commons.io.FileUtils. My custom FileUtils would extend org.apache.commons.io.FileUtils and thus offer all of its functions as well.
Now I am trying to do the same in Scala, but the Apache helper functions are not visible through my Scala FileUtils object. What is wrong here?
import org.apache.commons.io.{ FileUtils => ApacheFileUtils }

object FileUtils extends ApacheFileUtils {
  // ... additional helper methods
}

val content = FileUtils.readFileToString(new File("/tmp/whatever.txt"))
Here the compiler complains that readFileToString is not a member of my Scala FileUtils, even though it is a member of ApacheFileUtils and I extend it...
The Scala equivalent of a class with static methods is an object, so in Scala terms the static components of FileUtils are seen as:
object FileUtils {
  def readFile(s: String) = ???
  // ...
}
And in Scala, you can't extend an object. This is illegal:
object A
object B extends A // A is not a type
Therefore object FileUtils extends ApacheFileUtils only gives you access to the non-static (instance) definitions of ApacheFileUtils, of which, apart from the base Object methods like equals and hashCode, there are none; the static methods are not inherited.
You might find that Scala offers more elegant ways of providing extensions. Have a look at the 'pimp my library' pattern for a good starting point.
To apply this pattern to your example:
// definition of your "pimped" methods
import java.io.File
import scala.io.Source
import scala.language.implicitConversions

class RichFile(file: File) {
  // one possible implementation, using scala.io.Source
  def readToString(): String = {
    val source = Source.fromFile(file)
    try source.mkString finally source.close()
  }
}

// companion object defines the implicit conversion
object RichFile {
  implicit def fileToRichFile(f: File): RichFile = new RichFile(f)
}

// Usage
import RichFile._
val content = new File("/tmp/whatever.txt").readToString()
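Since Scala 2.10 the wrapper and the conversion can also be collapsed into a single implicit value class, along the lines of the RichValue example further up, which avoids allocating the wrapper object (a sketch; FileImplicits is an illustrative container name):

import java.io.File
import scala.io.Source

object FileImplicits {
  implicit class RichFile(private val file: File) extends AnyVal {
    def readToString(): String = {
      val source = Source.fromFile(file)
      try source.mkString finally source.close()
    }
  }
}

// Usage
import FileImplicits._
val content = new File("/tmp/whatever.txt").readToString()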