Why do I get the error
error: illegal start of statement (no modifiers allowed here)
override def toString = {
^
when loading the following code (wrapped in a .scala file) from spark-shell (Spark version 2.2.0, Scala version 2.11.8)?
import org.apache.spark.util.StatCounter

class NAStatCounter extends Serializable {
  val stats: StatCounter = new StatCounter()
  var missing: Long = 0

  def add(x: Double): NAStatCounter = {
    if (java.lang.Double.isNaN(x)) {
      missing += 1
    } else {
      stats.merge(x)
    }
    this
  }

  def merge(other: NAStatCounter): NAStatCounter = {
    stats.merge(other.stats)
    missing += other.missing
    this
  }

  override def toString = {
    "stats: " + stats.toString + " NaN: " + missing
  }
}

object NAStatCounter extends Serializable {
  def apply(x: Double) = new NAStatCounter().add(x)
}
It is example code from a book, so it seems odd that I get this error.
The toString method returns a String, so just add the return type to it:
override def toString: String = {
  "stats: " + stats.toString + " NaN: " + missing
}
Related
class Complex(real: Double, imaginary: Double) {
  def re = real
  def im = imaginary
  override def toString(): String =
    "" + re + (if (im < 0) "" else "+") + im + "i"
}

object Runme {
  // making a new starting point...
  def main(args: Array[String]): Unit = {
    var c = new Complex(2.3, 4.5)
    print(c)
  }
}
When I run this code, why do I get "Complex#3834d63f" instead of "2.3+4.5i"?
I had accidentally nested the class Complex declaration inside another class Complex declaration. This question is now resolved.
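For reference, once the accidental nested declaration is removed, a self-contained sketch of the corrected code behaves as expected:

```scala
// The pitfall: an accidentally nested `class Complex` shadows this one, and
// the inner class inherits the default toString (the "Complex@3834d63f" form).
// With a single, top-level declaration the override is used.
class Complex(real: Double, imaginary: Double) {
  def re = real
  def im = imaginary
  override def toString: String =
    "" + re + (if (im < 0) "" else "+") + im + "i"
}

object Runme {
  def main(args: Array[String]): Unit = {
    val c = new Complex(2.3, 4.5)
    println(c) // prints 2.3+4.5i
  }
}
```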
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

class BaseType(val a: String) extends Serializable {
  override def toString = "(" + a + ")"
}

class TypeA(a: String, val b: String) extends BaseType(a) {
  override def toString = "(" + a + "," + b + ")"
}

class TypeB(a: String, val b: String) extends BaseType(a) {
  override def toString = "(" + a + "," + b + ")"
}

object EntityInheritance {
  def main(args: Array[String]) = {
    val sparkConf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("EntityInheritance Sample")
    val sc = new SparkContext(sparkConf)

    val text_file = sc.textFile("/dqa/sample_logs/tipologies/entityInheritance.txt")
    val items = text_file.flatMap(_.split("\n"))

    val itemsRDDa = items.map(newInstanceA(_))
    itemsRDDa.foreach { rdd => println(rdd) }
    val countAa = countersAttributeA[TypeA](itemsRDDa)

    val itemsRDDb = items.map(newInstanceB(_))
    itemsRDDb.foreach { rdd => println(rdd) }
    val countBa = countersAttributeA[TypeB](itemsRDDb)

    sc.stop()
  }

  def newInstanceA(str: String): TypeA = {
    val parts = str.split(" ")
    new TypeA(parts(0), parts(1))
  }

  def newInstanceB(str: String): TypeB = {
    val parts = str.split(" ")
    new TypeB(parts(0), parts(1))
  }

  // I want to implement a generic function that receives RDD[TypeA] or RDD[TypeB]
  // it's a simple example
  def countersAttributeA[A](rdd: org.apache.spark.rdd.RDD[A]) = {
    rdd
      .map(s => (s.a, 1))
      .reduceByKey(_ + _)
  }
}
Hello, I have a problem, though it's possible this idea just isn't a good one.
I'm trying to implement a generic function that receives different types. When I create different objects, for example TypeA and TypeB, I want to send them to countersAttributeA to count the number of appearances of attribute 'a', but the console reports this error:
[error] /src/main/scala/org/sparklambda/testing/EntityInheritance.scala:53: value a is not a member of type parameter A
[error] .map(s => (s.a, 1))
[error] ^
[error] one error found
Can anyone help me? Thanks for everything.
You need to tell the compiler that the type is going to be at least BaseType in order for it to know that it'll have access to the a property.
def countersAttributeA[A <: BaseType](rdd: org.apache.spark.rdd.RDD[A])
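A minimal sketch of why the bound matters, using a plain Seq in place of an RDD so it runs without Spark (the counting logic is rewritten accordingly):

```scala
class BaseType(val a: String) extends Serializable {
  override def toString = "(" + a + ")"
}
class TypeA(a: String, val b: String) extends BaseType(a)

object BoundDemo {
  // With plain [A] this would fail exactly as in the question:
  // "value a is not a member of type parameter A".
  // The upper bound A <: BaseType guarantees every element has member 'a'.
  def countersAttributeA[A <: BaseType](items: Seq[A]): Map[String, Int] =
    items.groupBy(_.a).map { case (k, vs) => (k, vs.size) }

  def main(args: Array[String]): Unit = {
    val rows = Seq(new TypeA("x", "1"), new TypeA("x", "2"), new TypeA("y", "3"))
    println(BoundDemo.countersAttributeA(rows))
  }
}
```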
I have written a little snippet of code to test the Dynamic trait capabilities:
class Foo extends Dynamic {
  def selectDynamic(name: String) {
    println("selectDynamic: " + name)
  }

  def applyDynamic(name: String)(args: Any*) {
    println("applyDynamic: " + name)
  }

  def applyDynamicNamed(name: String)(args: (String, Any)*) {
    println("applyDynamicNamed: " + name)
  }

  def updateDynamic(name: String)(value: Any) {
    println("updateDynamic: " + name)
  }
}

object Test {
  def main(args: Array[String]) {
    val foo = new Foo
    foo.bar(5)     // 1
    foo.bar(x = 5) // 2
    foo.bar        // 3
    foo.baz = 5    // 4
  }
}
The problem is that it won't compile on either Scala 2.9 or 2.10 because of the fourth line in main:
error: reassignment to val
foo.baz = 5
If I comment out this line, 2.9 complains about the second line:
error: not found: value x
foo.bar(x = 5)
Meanwhile 2.10 would compile and the program would produce:
applyDynamic: bar
applyDynamicNamed: bar
selectDynamic: bar
So now I wonder: am I doing something wrong (maybe missing some dependencies)? Is there a difference between Dynamic in Scala 2.9 and 2.10? And what's wrong with foo.baz = 5?
This is a bug: https://issues.scala-lang.org/browse/SI-5733
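The bug only affects that assignment rewrite. For reference, on Scala versions where SI-5733 is fixed (2.10.1+), a minimal Dynamic sketch along these lines works, assuming the scala.language.dynamics feature import:

```scala
import scala.language.dynamics

// A dynamic "bag" of fields: reads go through selectDynamic,
// writes of the form `b.x = ...` go through updateDynamic.
class Bag extends Dynamic {
  private var fields = Map.empty[String, Any]
  def selectDynamic(name: String): Any = fields(name)
  def updateDynamic(name: String)(value: Any): Unit = fields += (name -> value)
}

object BagDemo {
  def main(args: Array[String]): Unit = {
    val b = new Bag
    b.baz = 5      // rewritten to b.updateDynamic("baz")(5)
    println(b.baz) // rewritten to b.selectDynamic("baz")
  }
}
```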
From time to time, I deal with Java that has stuff like the following in it:
def printDbl(d:Double) { println("dbl: " + d) }
def printInt(i:Int) { println("int: " + i) }
Naturally, I'd like to wrap this in some Scala, which ends up looking like this:
def print[T: Manifest](t: T) {
  if (manifest[T] <:< manifest[Int]) { printInt(t.asInstanceOf[Int]); return }
  if (manifest[T] <:< manifest[Double]) { printDbl(t.asInstanceOf[Double]); return }
  throw new UnsupportedOperationException("not implemented: " + manifest[T])
}
But when I run the following, I get a runtime exception:
print(1)
print(2.0)
print("hello")
I seem to recall there being a way to catch this at compile time, but I can't seem to Google it up. Perhaps some clever implicit conversions?
Why don't you just take advantage of method overloading and write your Scala wrapper like this?:
object Printer {
  def print(d: Double) { printDbl(d) }
  def print(i: Int) { printInt(i) }
}
This is very simple and provides the desired behavior:
import Printer._
print(1.) // dbl: 1.0
print(1) // int: 1
print("hello") // compile-time type error
scala> object SpecType {
| trait SpecType[T] {
| def is(s: String): Boolean
| }
| implicit object DoubleType extends SpecType[Double] {
| def is(s: String) = s == "Double"
| }
| implicit object IntType extends SpecType[Int] {
| def is(s: String) = s == "Int"
| }
| }
defined module SpecType
scala> import SpecType._
import SpecType._
scala> def print[T: SpecType](x: T) {
| if(implicitly[SpecType[T]].is("Int")) println("Int")
| if(implicitly[SpecType[T]].is("Double")) println("Double")
| }
print: [T](x: T)(implicit evidence$1: SpecType.SpecType[T])Unit
scala> print(1)
Int
scala> print(1.0)
Double
scala> print("")
<console>:21: error: could not find implicit value for evidence parameter of type SpecType.SpecType[String]
print("")
This is the best I've come up with:
class CanPrint[T](t: T) { def getT = t }

implicit def canPrint(i: Int) = new CanPrint[Int](i)
implicit def canPrint(d: Double) = new CanPrint[Double](d)

def print[T: Manifest](t: CanPrint[T]) {
  if (manifest[T] <:< manifest[Int]) { printInt(t.getT.asInstanceOf[Int]); return }
  if (manifest[T] <:< manifest[Double]) { printDbl(t.getT.asInstanceOf[Double]); return }
  throw new UnsupportedOperationException("not implemented: " + manifest[T])
}
The following does not compile:
print(1)
print(1.0)
print("hello")
And the following does what I expect:
print(1)
print(1.0)
However, this is bad code: I have to import the implicit defs for it to work, and as a consumer of this code all I see is a method signature saying that I have to pass in a CanPrint object, which I can instantiate directly.
print(new CanPrint("hello")) // pwned
Can I make the constructor private and only accessible to the implicit methods, or some such?
def printDbl(d: Double) { println("dbl: " + d) }
def printInt(i: Int) { println("int: " + i) }

trait Printer[T] { def print(t: T) }
class PD extends Printer[Double] { def print(d: Double) = printDbl(d) }
class PI extends Printer[Int] { def print(i: Int) = printInt(i) }

implicit val pd = new PD()
implicit val pi = new PI()

def print[T](t: T)(implicit printer: Printer[T]) = printer.print(t)

print(1)       // 1
print(2.0)     // 2.0
print("hello") // Error:(88, 7) could not find implicit value for parameter printer: A$A336.this.Printer[String]
Given:
case class FirstCC {
  def name: String = ... // something that will give "FirstCC"
}

case class SecondCC extends FirstCC

val one = FirstCC()
val two = SecondCC()
How can I get "FirstCC" from one.name and "SecondCC" from two.name?
def name = this.getClass.getName
Or if you want only the name without the package:
def name = this.getClass.getSimpleName
See the documentation of java.lang.Class for more information.
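A quick sketch of the difference between the two (the class name here is illustrative):

```scala
// getName returns the fully qualified name (including the package, if any);
// getSimpleName returns just the bare class name.
class FirstCC {
  def fullName: String = this.getClass.getName
  def name: String = this.getClass.getSimpleName
}

object NameDemo {
  def main(args: Array[String]): Unit = {
    val one = new FirstCC
    println(one.name)     // FirstCC
    println(one.fullName) // FirstCC, prefixed by the package if there is one
  }
}
```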
You can use the property productPrefix of the case class:
case class FirstCC {
  def name = productPrefix
}

case class SecondCC extends FirstCC

val one = FirstCC()
val two = SecondCC()
one.name
two.name
N.B.
As of Scala 2.8, extending a case class from another case class is deprecated, and don't forget the parentheses () when declaring and instantiating them.
class Example {
  private def className[A](a: A)(implicit m: Manifest[A]) = m.toString
  override def toString = className(this)
}
def name = this.getClass.getName
Here is a Scala function that generates a human-readable string from any type, recursing on type parameters:
https://gist.github.com/erikerlandson/78d8c33419055b98d701
import scala.reflect.runtime.universe._

object TypeString {
  // return a human-readable type string for type argument 'T'
  // typeString[Int] returns "Int"
  def typeString[T: TypeTag]: String = {
    def work(t: Type): String = {
      t match { case TypeRef(pre, sym, args) =>
        val ss = sym.toString.stripPrefix("trait ").stripPrefix("class ").stripPrefix("type ")
        val as = args.map(work)
        if (ss.startsWith("Function")) {
          val arity = args.length - 1
          "(" + (as.take(arity).mkString(",")) + ")" + "=>" + as.drop(arity).head
        } else {
          if (args.length <= 0) ss else (ss + "[" + as.mkString(",") + "]")
        }
      }
    }
    work(typeOf[T])
  }

  // get the type string of an argument:
  // typeString(2) returns "Int"
  def typeString[T: TypeTag](x: T): String = typeString[T]
}
def name = getClass.getSimpleName.split('$').head
This will remove the $1 appearing at the end on some classes.
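For example (a small sketch): the class backing a top-level object is compiled with a trailing `$` in its name, so splitting at '$' recovers the plain name:

```scala
object Holder {
  // getSimpleName here is "Holder$"; split('$').head strips the suffix.
  def name: String = getClass.getSimpleName.split('$').head
}

object SplitDemo {
  def main(args: Array[String]): Unit = {
    println(Holder.name) // Holder
  }
}
```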