Add ArrayBuffer Element via def in Scala

I am curious how to add an element to an ArrayBuffer via a def. E.g.
def addToArray[T](data: ArrayBuffer[T]): ArrayBuffer[T] = {
return(data += T("XYZ"))
}
I tried this but no go. I assume we cannot do this generically, but I would like to know how it could be done. If I do return(new ArrayBuffer[T]()) instead, it works. Not the most difficult thing, but it is somehow escaping me.

Just give the def your generic buffer and the element you want to add, then return it:
def addToArrayBuffer[T](data: ArrayBuffer[T], elem: T): ArrayBuffer[T] = {
data += elem
data
}
println(addToArrayBuffer(ArrayBuffer(1, 2, 30), 4)) // ArrayBuffer(1, 2, 30, 4)
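If you prefer not to mutate the buffer that was passed in, a small variant (a sketch using the standard :+ operator) builds and returns a new buffer instead:
def appendedToArrayBuffer[T](data: ArrayBuffer[T], elem: T): ArrayBuffer[T] =
  data :+ elem
println(appendedToArrayBuffer(ArrayBuffer(1, 2, 30), 4)) // ArrayBuffer(1, 2, 30, 4); the original is left untouched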
If you are not passing the element you want to add as a parameter to the def, then you can't add it inside the def. The idea is that you cannot create an instance of a type parameter, because instantiation requires a constructor, which is unavailable if the type is unknown. This restriction is described in the Java generics documentation:
Cannot Create Instances of Type Parameters: You cannot create an
instance of a type parameter. For example, the following code causes a
compile-time error:
public static <E> void append(List<E> list) {
E elem = new E(); // compile-time error
list.add(elem);
}
Wikipedia also explains this very nicely:
Java generics differ from C++ templates. Java generics generate only
one compiled version of a generic class or function regardless of the
number of parameterizing types used. Furthermore, the Java run-time
environment does not need to know which parameterized type is used
because the type information is validated at compile-time and is not
included in the compiled code. Consequently, instantiating a Java
class of a parameterized type is impossible because instantiation
requires a call to a constructor, which is unavailable if the type is
unknown.
Note that there might be a workaround using reflection, which is detailed further in that documentation.
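In Scala, one way to sketch such a workaround is to carry a ClassTag and go through Java reflection. This assumes T has a public constructor taking a single String; it fails at runtime otherwise:
import scala.collection.mutable.ArrayBuffer
import scala.reflect.ClassTag

def addToArrayReflectively[T: ClassTag](data: ArrayBuffer[T]): ArrayBuffer[T] = {
  // Reach the runtime class behind T and invoke its String constructor reflectively.
  val instance = implicitly[ClassTag[T]].runtimeClass
    .getConstructor(classOf[String])
    .newInstance("XYZ")
    .asInstanceOf[T]
  data += instance
  data
}
println(addToArrayReflectively(ArrayBuffer.empty[String])) // ArrayBuffer(XYZ)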

Related

Scala, why do I not need to import deduced types

I feel like I should preface this with the fact that I'm building my projects with sbt.
My problem is this: if a method returns something of an unimported type, then in the file where I call the method everything compiles as long as I rely on type inference. But once I try to annotate the val/var holding the return value with the unimported type, I get a compiler error.
Let's say I have two classes in two packages: class App in package main and class Imported in package libraries. Let's furthermore say that there is a class ImportedFactory in package main, and that this class has a method for creating objects of type Imported.
This code compiles just fine:
class App() {
// method return object of type Imported
val imp = ImportedFactory.createImportedObject()
}
This doesn't:
class App() {
// method return object of type Imported
val imp : Imported = ImportedFactory.createImportedObject()
}
This yet again does:
import libraries.Imported
class App() {
// method return object of type Imported
val imp : Imported = ImportedFactory.createImportedObject()
}
This seems like rather strange behavior. Is this normal for languages with compile-time type inference, and have I simply not noticed it in Go/C++ until now out of ignorance?
Does one of the two valid approaches (import & explicit type vs. inferred) have advantages/drawbacks over the other? (Except, of course, for one being more explicit and verbose and the other being shorter.)
Is this black magic, or does the Scala compiler accomplish these deductions in a rather straightforward way?
The only thing importing does is make a not-fully-qualified name available in the current scope. You could just as well write this:
class App() {
val imp: libraries.Imported = ImportedFactory.createImportedObject()
}
The reason you import libraries.Imported is to make the shorter name Imported available for you to write. If you let the compiler infer the type, you never mention the type in your code, so you don't have to import its shorter name.
And by the way: this has nothing to do with dynamic casting in C++. The only mechanism at work in your code is type inference.
note: You'll get better search results with the term type inference
With val imp = ImportedFactory.createImportedObject() you are letting the compiler figure out what type imp should be based on type inference. Whatever type createImportObject returns, that's what type imp is.
With val imp : Imported = ImportedFactory.createImportedObject() you are explicitly stating that imp is an Imported. But the compiler doesn't know what you mean by that unless you... import... it.
Both approaches have merit:
inferred types
Inferred types are great for when you're throwing together code where the type should be obvious:
val i = 1 // obviously `i` is an Int
val j = i + 10 // obviously still an Int
It's also great for local vars/vals where the type would be too much of a pain to write:
val myFoo: FancyAbstractThing[TypeParam, AnotherTypeParam[OhNoMoreTypeParams]] = ...
// vs
val myFoo = FancyThingFactory.makeANewOne()
The downside is that if you have allowed a public def/val to have an inferred type, it can be more difficult to determine how to use that method. For this reason, omitting type annotations is typically only used for simple constants, and in local vals/vars that "client code" doesn't have to look at.
explicit types
When you do want to write library-ish code (i.e. public vals/defs), the convention is to explicitly-type them.
Probably the simplest reason for this is because this:
def myLibraryMethod = {
// super complicated implementation
}
is harder to understand than
def myLibraryMethod: String = {
// super complicated implementation
}
Another benefit to explicitly-typing your code is when you want to expose a less-specific type than what the value actually is:
val invalidNumbers: Set[Int] = TreeSet(4, 8, 15, 16, 23, 42)
In this example, you don't want client code to need to care that your invalidNumbers is actually a TreeSet. That's an implementation detail. In this case you're hiding some information that, while true, would be distracting.
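As a quick illustration of the payoff (reusing the invalidNumbers val above), client code can only see the Set[Int] interface, so the implementation remains free to change later:
val hasFifteen: Boolean = invalidNumbers.contains(15) // fine: contains is part of Set[Int]
// invalidNumbers.firstKey // does not compile: firstKey belongs to SortedSet/TreeSet, which the annotation hides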

Why do you need Arbitraries in scalacheck?

I wonder why Arbitrary is needed because automated property testing requires property definition, like
val prop = forAll(v: T => check that property holds for v)
and a generator for the value v. The user guide says that you can create custom generators for custom types (a generator for trees is given as an example). Yet it does not explain why you need arbitraries on top of that.
Here is a piece of the manual:
implicit lazy val arbBool: Arbitrary[Boolean] = Arbitrary(oneOf(true, false))
To get support for your own type T you need to define an implicit def
or val of type Arbitrary[T]. Use the factory method Arbitrary(...) to
create the Arbitrary instance. This method takes one parameter of type
Gen[T] and returns an instance of Arbitrary[T].
It clearly says that we need Arbitrary on top of Gen. The justification for Arbitrary is not satisfactory, though:
The arbitrary generator is the generator used by ScalaCheck when it
generates values for property parameters.
IMO, to use the generators, you need to import them rather than wrap them into arbitraries! Otherwise, one can argue that we need to wrap the arbitraries into something else to make them usable (and so on ad infinitum, wrapping the wrappers endlessly).
You could also explain how arbitrary[Int] converts the argument type into a generator. It is very curious, and I feel that these are related questions.
forAll { v: T => ... } is implemented with the help of Scala implicits. That means that the generator for the type T is found implicitly instead of being explicitly specified by the caller.
Scala implicits are convenient, but they can also be troublesome if you're not sure what implicit values or conversions currently are in scope. By using a specific type (Arbitrary) for doing implicit lookups, ScalaCheck tries to constrain the negative impacts of using implicits (this use also makes it similar to Haskell typeclasses that are familiar for some users).
So, you are entirely correct that Arbitrary is not really needed. The same effect could have been achieved through implicit Gen[T] values, arguably with a bit more implicit scoping confusion.
As an end-user, you should think of Arbitrary[T] as the default generator for the type T. You can (through scoping) define and use multiple Arbitrary[T] instances, but I wouldn't recommend it. Instead, just skip Arbitrary and specify your generators explicitly:
val myGen1: Gen[T] = ...
val myGen2: Gen[T] = ...
val prop1 = forAll(myGen1) { t => ... }
val prop2 = forAll(myGen2) { t => ... }
arbitrary[Int] works just like forAll { n: Int => ... }: it looks up the implicit Arbitrary[Int] instance and uses its generator. The implementation is simple:
def arbitrary[T](implicit a: Arbitrary[T]): Gen[T] = a.arbitrary
The implementation of Arbitrary might also be helpful here:
sealed abstract class Arbitrary[T] {
val arbitrary: Gen[T]
}
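To make the end-user picture concrete, here is a minimal sketch (the Temperature type and its bounds are made up for illustration) of an explicit Gen wrapped in an implicit Arbitrary so that the parameterless forAll can find it:
import org.scalacheck.{Arbitrary, Gen, Prop}

case class Temperature(celsius: Double)

// An explicit generator...
val temperatureGen: Gen[Temperature] =
  Gen.choose(-50.0, 50.0).map(Temperature.apply)

// ...flagged as the default by wrapping it in an implicit Arbitrary.
implicit val arbTemperature: Arbitrary[Temperature] = Arbitrary(temperatureGen)

// forAll without an explicit generator now resolves arbTemperature implicitly.
val prop = Prop.forAll { (t: Temperature) => t.celsius >= -50.0 && t.celsius <= 50.0 }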
ScalaCheck has been ported from the Haskell QuickCheck library. In Haskell, type classes allow only one instance for a given type, forcing you into this sort of separation.
In Scala, though, there is no such constraint, and it would be possible to simplify the library. My guess is that ScalaCheck, being (initially written as) a one-to-one mapping of QuickCheck, makes it easier for Haskellers to jump into Scala :)
Here is the Haskell definition of Arbitrary
class Arbitrary a where
-- | A generator for values of the given type.
arbitrary :: Gen a
And Gen
newtype Gen a
As you can see, they have very different semantics: Arbitrary is a type class, while Gen is a wrapper with a bunch of combinators for building generators.
I agree that the argument of "limiting the scope of implicits through a dedicated type" is a bit vague and does not seem to be taken seriously when it comes to organizing the code: the Arbitrary instances sometimes simply delegate to Gen instances, as in
/** Arbitrary instance of Calendar */
implicit lazy val arbCalendar: Arbitrary[java.util.Calendar] =
Arbitrary(Gen.calendar)
and sometimes defines its own generator
/** Arbitrary BigInt */
implicit lazy val arbBigInt: Arbitrary[BigInt] = {
val long: Gen[Long] =
Gen.choose(Long.MinValue, Long.MaxValue).map(x => if (x == 0) 1L else x)
val gen1: Gen[BigInt] = for { x <- long } yield BigInt(x)
/* ... */
Arbitrary(frequency((5, gen0), (5, gen1), (4, gen2), (3, gen3), (2, gen4)))
}
So in effect this leads to code duplication (each default Gen being mirrored by an Arbitrary) and some confusion (why isn't Arbitrary[BigInt] wrapping a default Gen[BigInt]?).
My reading of that is that you might need to have multiple instances of Gen, so Arbitrary is used to "flag" the one that you want ScalaCheck to use?

Possible to find the type parameter of a method's return type in Scala when that parameter is a primitive type?

Suppose I have:
class X {
  val listPrimitive: List[Int] = null
  val listX: List[X] = null
}
and I print out the return types of each method in Scala as follows:
classOf[X].getMethods().foreach { m => println(s"${m.getName}: ${m.getGenericReturnType()}") }
listPrimitive: scala.collection.immutable.List<Object>
listX: scala.collection.immutable.List<X>
So I can determine that listX's element type is X, but is there any way to determine via reflection that listPrimitive's element type is actually java.lang.Integer? ...
val list:List[Int] = List[Int](123);
val listErased:List[_] = list;
println(s"${listErased(0).getClass()}") // java.lang.Integer
NB: This does not seem to be an issue of JVM type erasure, since I can find the type parameter of List. It looks like the Scala compiler throws away this type information if and only if the parameter type is one of the java.lang number types.
UPDATE:
I suspect this type information is available, due to the following experiment. Suppose I define:
class TestX {
  def f(x: X): Unit = {
    val floats: List[Float] = x.listPrimitive // type mismatch error
  }
}
and X.class is imported via a jar. The full type information must be available in X.class in order for this case to correctly fail to compile.
UPDATE2:
Imagine you're writing a Scala extension to a Java serialization library. You need to implement a:
def getSerializer(clz:Class[_]):Serializer
function that needs to do different things depending on whether:
clz==List[Int] (or equivalently: List[java.lang.Integer])
clz==List[Float] (or equivalently: List[java.lang.Float])
clz==List[MyClass]
My problem is that I will only ever see:
clz==List[Object]
clz==List[Object]
clz==List[MyClass]
because clz is provided to this function as clz.getMethods()(i).getGenericReturnType().
Starting with clz:Class[_] how can I recover the element type information that was lost?
It's not clear to me that TypeTag will help me, because its usage:
typeTag[T]
requires that I provide T (i.e. at compile time).
So, one path to a solution: given some clz: Class[_], can I determine the TypeTags of its methods' return types? Clearly this is possible, as this information must be contained (somewhere) in the .class file for the Scala compiler to correctly generate type mismatch errors (see above).
At the Java bytecode level, Ints have to be represented as something else (here Object) because a List can only contain objects, not primitives. So that's all Java-level reflection can tell you. But the Scala type information is, as you infer, present (at the bytecode level it's stored in an annotation, IIRC), so you should be able to inspect it with Scala reflection:
import scala.reflect.runtime.universe._
val list:List[Int] = List[Int](123)
def printTypeOf[A: TypeTag](a: A) = println(typeOf[A])
printTypeOf(list)
Response to UPDATE2: you should use Scala reflection to obtain a mirror, rather than working with the Class[_] object directly. You can go via the class name if need be:
import scala.reflect.runtime.universe._
val rm = runtimeMirror(getClass.getClassLoader)
val someClass: Class[_] = ...
val scalaMirrorOfClass = rm.staticClass(someClass.getName)
// or rm.classSymbol(someClass) to go from the Class[_] directly
val someObject: Any = ...
val scalaMirrorOfObject = rm.reflect(someObject)
I guess if you really only have the class, you could create a classloader that only loads that class? I can't imagine a use case where you wouldn't have the class, or even a value, though.
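For completeness, a sketch of the last step, assuming the X class from the question is on the classpath: go from the java.lang.Class to a Scala type via the mirror, then read the members' Scala-level return types, which keep List[Int] intact:
import scala.reflect.runtime.universe._

val rm = runtimeMirror(getClass.getClassLoader)
val clz: Class[_] = classOf[X]
val tpe = rm.classSymbol(clz).toType

// Prints listPrimitive: List[Int] and listX: List[X], element types included.
tpe.decls.collect { case m: MethodSymbol if m.isGetter =>
  println(s"${m.name}: ${m.returnType}")
}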

Declaring types based on types of other values

At the moment, I have the following Scala code using the Scala Graph library:
val g = Graph.from(nodes, edges)
// nodes: List[Table]; edges: List[LkUnDiEdge[Table,String]]
// inferred - g: Graph[Table, LkUnDiEdge]
...
def nodeTransformer(innerNode: Graph[Table, LkUnDiEdge]#NodeT) = {
val node = innerNode.value
  Some((root, DotNodeStmt(node.name, style(node))))
}
You may notice the parameter to nodeTransformer is an inner type of the type of g. Similar code repeats the concrete Graph type or some of its type parameters in other places in the codebase.
Type inference does not work in all places, and I'd like a way to express this type dependency without repeating the same explicit types throughout the code. As an example, the C++ typeof operator would allow me to rewrite the function as follows:
def nodeTransformer(innerNode: typeof(g)#NodeT) = {
...
}
What is the sane way to express such static type dependencies in Scala when type inference fails?
What about:
def nodeTransformer(innerNode: g.NodeT) = {
...
}
In contrast with C++ typeof, you must pass an innerNode that belongs to g exactly, not to any other Graph instance (even one with the same type); see: What is meant by Scala's path-dependent types?
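A tiny self-contained illustration of that distinction (Outer and Inner are made-up stand-ins for Graph and NodeT):
class Outer {
  class Inner
  def make: Inner = new Inner
}

val a = new Outer
val b = new Outer

def useA(x: a.Inner): Unit = ()       // path-dependent: tied to the instance a
def useAny(x: Outer#Inner): Unit = () // type projection: any Outer's Inner, like Graph[...]#NodeT

useA(a.make)    // compiles
// useA(b.make) // does not compile: b.Inner is not a.Inner
useAny(b.make)  // compiles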
P.S. To extract a type from a generic (without a type member inside), see: How to capture T from TypeTag[T] or any other generic in scala?

Can structural typing work with generics?

I have an interface defined using a structural type like this:
trait Foo {
  def collection: {
    def apply(a: Int): String
    def values(): collection.Iterable[String]
  }
}
I wanted to have one of the implementers of this interface do so using a standard mutable HashMap:
import scala.collection.mutable.HashMap

class Bar extends Foo {
  val collection: HashMap[Int, String] = HashMap[Int, String]()
}
It compiles, but at runtime I get a NoSuchMethodException when referring to a Bar instance through a Foo-typed variable. Dumping out the object's methods via reflection, I see that the HashMap's apply method takes an Object due to type erasure, and there is some oddly renamed generated apply method that does take an int. Is there a way to make generics work with structural types? Note that in this particular case I was able to solve my problem using an actual trait instead of a structural type, and that is much cleaner overall.
The short answer is that the apply method parameter is causing you grief because it requires an implicit conversion of the parameter (Int => Integer). Implicits are resolved at compile time, so the NoSuchMethodException is likely a result of these missing conversions.
Try the values method instead; it should work, since no implicits are involved there.
I've attempted to find a way to make this example work but have had no success so far.
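For reference, here is a sketch of the plain-trait alternative the asker mentions (reusing the question's names; the StringLookup interface is made up), which avoids reflective calls and the erasure issue entirely:
import scala.collection.mutable

trait StringLookup {
  def apply(a: Int): String
  def values: Iterable[String]
}

trait Foo {
  def collection: StringLookup
}

class Bar extends Foo {
  private val underlying = mutable.HashMap[Int, String]()
  // Ordinary virtual dispatch: Int boxing and type erasure are non-issues here.
  val collection: StringLookup = new StringLookup {
    def apply(a: Int): String = underlying(a)
    def values: Iterable[String] = underlying.values
  }
}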