In Scala 2.9.x I was used to the following syntax:
class B(currencies: Seq[Currency])(implicit c: C) extends CSomething(c) {
  import c._
  // def mystuff = call() // in fact this is c.call()
}
This no longer works in Scala 2.10.x: if I do import c._, the members of c are not visible inside B, so I am forced to write c.call().
Is importing members like this forbidden in Scala 2.10.x? Why?
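For reference, one workaround that keeps the short call syntax is to bind the implicit to a val member and import from that (a sketch only, using the Currency, C, CSomething and call names from the snippet above):

class B(currencies: Seq[Currency])(implicit c: C) extends CSomething(c) {
  private val cc = c // a stable val member of B
  import cc._

  def mystuff = call() // resolves to cc.call()
}

But I'd still like to know why the direct import stopped working.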
Related
I'm trying to write some abstractions in some Spark Scala code, but I'm running into issues when using objects. I'm using Spark's Encoder, which converts case classes to database schemas, as an example here, but I think this question applies to any context bound.
Here is a minimal code example of what I'm trying to do:
package com.sample.myexample
import org.apache.spark.sql.Encoder
import scala.reflect.runtime.universe.TypeTag
case class MySparkSchema(id: String, value: Double)
abstract class MyTrait[T: TypeTag: Encoder]
object MyObject extends MyTrait[MySparkSchema]
Which fails with the following compilation error:
Unable to find encoder for type com.sample.myexample.MySparkSchema. An implicit Encoder[com.sample.myexample.MySparkSchema] is needed to store com.sample.myexample.MySparkSchema instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
I tried defining the implicit evidence in the object like so (the import statement was suggested by IntelliJ, but it looks a bit weird):
import com.sample.myexample.MyObject.encoder

object MyObject extends MyTrait[MySparkSchema] {
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}
Which fails with the error message
MyTrait.scala:13:25: super constructor cannot be passed a self reference unless parameter is declared by-name
One other thing I tried is to convert the object to a class and provide implicit evidence to the constructor:
class MyObject(implicit evidence: Encoder[MySparkSchema]) extends MyTrait[MySparkSchema]
This compiles and works fine, but at the expense of MyObject now being a class instead.
Question: Is it possible to provide the implicit evidence for the context bounds when extending a trait? Or does the implicit evidence force me to add a constructor parameter and use a class instead?
Your first error almost gives you the solution: you have to import spark.implicits._ for Product types.
You could do this:
val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._
Full Example
package com.sample.myexample

import org.apache.spark.sql.{Encoder, SparkSession}
import scala.reflect.runtime.universe.TypeTag

case class MySparkSchema(id: String, value: Double)

abstract class MyTrait[T: TypeTag: Encoder]

// wrapper object: vals and imports can't sit at a file's top level in Scala 2
object MyApp {
  val spark: SparkSession = SparkSession.builder().getOrCreate()
  import spark.implicits._

  object MyObject extends MyTrait[MySparkSchema]
}
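As an alternative sketch, in case you would rather not create a SparkSession at definition time: defining the encoder in the case class's companion object also satisfies the context bound, because companion objects are part of the implicit scope searched for Encoder[MySparkSchema], so no import is needed at the use site.

import org.apache.spark.sql.{Encoder, Encoders}
import scala.reflect.runtime.universe.TypeTag

case class MySparkSchema(id: String, value: Double)

object MySparkSchema {
  // companions are searched automatically during implicit resolution
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}

abstract class MyTrait[T: TypeTag: Encoder]

object MyObject extends MyTrait[MySparkSchema]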
Why can I do this in Java:
import javax.swing.GroupLayout.Group;
but if I do the same in Scala (by using Ammonite), I get this:
value Group is not a member of object javax.swing.GroupLayout
possible cause: maybe a semicolon is missing before `value Group'?
import javax.swing.GroupLayout.Group
Is it due to the fact that Group is a public class derived from a private class called Spring?
I can import neither SequentialGroup nor ParallelGroup.
Is it a bug in Scala?
I'm using Java 11 and Scala 2.12.10.
Scala 2.13.1 also fails. :-(
I need the import to define a generic method that can take a Group parameter, which could be either a ParallelGroup or a SequentialGroup.

That would be a type projection:
def method(group: GroupLayout#Group) = ...
or if you also have the layout the group belongs to,
def method(layout: GroupLayout)(group: layout.Group) = ...
or
val layout: GroupLayout = ...
def method(group: layout.Group) = ...
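For example, a minimal sketch (the JPanel host and the empty method body are just placeholders):

import javax.swing.{GroupLayout, JPanel}

// accepts a Group created by any GroupLayout instance
def configure(group: GroupLayout#Group): Unit = ()

val layout = new GroupLayout(new JPanel())
configure(layout.createSequentialGroup()) // SequentialGroup extends Group
configure(layout.createParallelGroup())   // ParallelGroup extends Group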
How am I supposed to mock a nested Java class using scalamock, especially when said nested Java class is coming from a third party library?
Given the following sources:
src/main/java/Outer.java
/**
 * Outer class that offers a Nested class inaccessible from Scala :(
 */
public class Outer {
    public class Nested {
    }
}
src/main/java/UseNestedJavaClassFromJava.java
public class UseNestedJavaClassFromJava {
    private Outer.Nested nested;
}
src/main/scala/ImportNestedJavaClass.scala
// TODO uncomment the below line to break the build
//import Outer.Nested
Uncommenting the Scala import line results in a compilation failure, while compiling UseNestedJavaClassFromJava.java works just fine.
Full minimal example with gradle: https://github.com/billyjf/async-http-client-gradle-scala.
Apparently this was already somewhat addressed in the question below, but resorting to Java glue code or reflection trickery just for the sake of testing Scala code that leverages a Java library with nested Java classes seems a bit unreasonable to me. Is there really no other solution?
Scala can't access Java inner class?
I finally found a solution using Mockito:
import org.mockito.Mockito
import org.scalamock.scalatest.MockFactory
import org.scalatest.mockito.MockitoSugar
import org.scalatest.{FlatSpec, Matchers}
class OuterNestedTest extends FlatSpec with MockFactory with Matchers with MockitoSugar {

  "Nested Java class" should "be mockable using Mockito from within a scalatest" in {
    // Outer#Nested is a type projection: "a Nested belonging to some Outer";
    // this assumes Nested declares a methodWithAParam(String) method
    val mockedNestedJavaClass = Mockito.mock(classOf[Outer#Nested])

    Mockito.when(mockedNestedJavaClass.methodWithAParam("some value"))
      .thenReturn("mocked", Nil: _*) // Nil: _* disambiguates Mockito's varargs overload

    mockedNestedJavaClass.methodWithAParam("some value") shouldBe "mocked"
  }
}
A path-dependent type also works, since the nested class is bound to an instance of the outer class:

class Main {
  val z = new Outer
  private[this] val y: z.Inner = null
}
For more context: Outer.Inner is interpreted as Outer$.Inner (a member of the companion object), which is why the import fails.
From the official Scala website:
As opposed to Java-like languages where such inner classes are members
of the enclosing class, in Scala such inner classes are bound to the
outer object.
https://docs.scala-lang.org/tour/inner-classes.html
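A quick Scala-only illustration of that difference (hypothetical Outer/Inner classes, just to show the typing):

class Outer {
  class Inner
}

val a = new Outer
val b = new Outer

val x: a.Inner = new a.Inner     // ok: Inner is bound to the instance `a`
// val y: a.Inner = new b.Inner  // does not compile: b.Inner is a different type
val z: Outer#Inner = new b.Inner // type projection: an Inner of any Outer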
I have the following code which uses spray-json to deserialise some JSON into a case class, via the parseJson method.
Depending on where the implicit JsonFormat[MyCaseClass] is defined (in-line or imported from companion object), and whether there is an explicit type provided when it is defined, the code may not compile.
I don't understand why importing the implicit from the companion object requires it to have an explicit type at its definition, while putting it inline does not.
Interestingly, IntelliJ correctly locates the implicit parameters (via cmd-shift-p) in all cases.
I'm using Scala 2.11.7.
Broken Code - Wildcard import from companion object, inferred type:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""

  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}
Results in:
Cannot find JsonReader or JsonFormat type class for SampleAppObject.MyCaseClass
Note that the same thing happens with an explicit import of the myCaseClassSchemaFormat implicit.
Working Code #1 - Wildcard import from companion object, explicit type:
Adding an explicit type to the JsonFormat in the companion object causes the code to compile:
import SampleApp._
import spray.json._

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""

  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    // explicit type added here now
    implicit val myCaseClassSchemaFormat: JsonFormat[MyCaseClass] = jsonFormat1(MyCaseClass)
  }
}
Working Code #2 - Implicits inline, inferred type:
However, putting the implicit parameters in-line where they are used, without the explicit type, also works!
import SampleApp._
import spray.json._

class SampleApp {
  import DefaultJsonProtocol._

  // now an in-line custom JsonFormat rather than an imported one
  implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)

  val inputJson = """{"children":["a", "b", "c"]}"""

  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}

object SampleApp {
  case class MyCaseClass(children: List[String])
}
After searching for the error message Huw mentioned in his comment, I was able to find this StackOverflow question from 2010: Why does this explicit call of a Scala method allow it to be implicitly resolved?
This led me to this Scala issue created in 2008, and closed in 2011: https://issues.scala-lang.org/browse/SI-801 ('require explicit result type for implicit conversions?')
Martin stated:
I have implemented a slightly more permissive rule: An implicit conversion without explicit result type is visible only in the text following its own definition. That way, we avoid the cyclic reference errors. I close for now, to see how this works. If we still have issues we might come back to this.
This holds: if I re-order the breaking code so that the companion object is declared first, then the code compiles. (It's still a little weird!)
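For reference, a sketch of that re-ordered file (the same code as the broken example, untested, just with object SampleApp moved above the class so the implicit's definition precedes its use):

import SampleApp._
import spray.json._

object SampleApp {
  case class MyCaseClass(children: List[String])

  object MyJsonProtocol extends DefaultJsonProtocol {
    // the inferred type is fine here because the definition now comes first
    implicit val myCaseClassSchemaFormat = jsonFormat1(MyCaseClass)
  }
}

class SampleApp {
  import MyJsonProtocol._

  val inputJson = """{"children":["a", "b", "c"]}"""

  println(s"Deserialise: ${inputJson.parseJson.convertTo[MyCaseClass]}")
}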
(I suspect I don't see the 'implicit method is not applicable here' message because I have an implicit value rather than a conversion, though I'm assuming the root cause is the same as the above.)
An old trick I used in my previous Java projects was to create e.g. a FileUtils class that offered helper functions for common file operations needed by my project and not covered by e.g. org.apache.commons.io.FileUtils. Therefore my custom FileUtils would extend org.apache.commons.io.FileUtils and offer all their functions as well.
Now I am trying to do the same in Scala, but the Apache helper functions are not visible through my Scala FileUtils object. What is wrong here?
import org.apache.commons.io.{FileUtils => ApacheFileUtils}

object FileUtils extends ApacheFileUtils {
  // ... additional helper methods
}

val content = FileUtils.readFileToString(new File("/tmp/whatever.txt"))
Here the compiler complains that readFileToString is not a member of my Scala FileUtils, even though it is a member of ApacheFileUtils and I extend from it.
The Scala equivalent of a class with static methods is an object, so in Scala terms the static components of FileUtils are seen as:

object FileUtils {
  def readFile(s: String) = ???
  // ...
}
And in Scala, you can't extend an object. This is illegal:

object A
object B extends A // does not compile: A is not a type

Therefore object FileUtils extends ApacheFileUtils only gives you access to the instance-level definitions of ApacheFileUtils, and since all of its helpers are static, there are none apart from the base Object methods like equals and hashCode.
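If you just want a single FileUtils entry point, one alternative sketch is to forward explicitly to the Apache statics instead of inheriting them:

import java.io.File
import org.apache.commons.io.{FileUtils => ApacheFileUtils}

object FileUtils {
  // explicit forwarder to the Java static method
  def readFileToString(f: File): String = ApacheFileUtils.readFileToString(f)

  // ... your additional helper methods
}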
You might find that Scala offers more elegant ways of providing extensions. Have a look at the 'pimp my library' pattern for a good starting point.
To apply this pattern to your example:
// definition of your "pimped" methods
import java.io.File
import org.apache.commons.io.{FileUtils => ApacheFileUtils}

class RichFile(file: File) {
  // delegate to the Apache static helper
  def readToString(): String = ApacheFileUtils.readFileToString(file)
}

// companion object defines the implicit conversion
object RichFile {
  implicit def fileToRichFile(f: File): RichFile = new RichFile(f)
}

// usage
import RichFile._
val content = new File("/tmp/whatever.txt").readToString()