Scala and 'different type of instance of trait Map: java.util.Map[K,V]' - scala

I am getting the above-mentioned error from the Scala compiler.
I am quite new to Scala and am experimenting with it by converting a Java project of mine to Scala. The Java project uses Apache commons-chain, and I have a class that extends org.apache.commons.chain.impl.ContextBase, which triggers this error. Searching the internet suggests the problem has something to do with type erasure, but my class doesn't do anything special; it just inherits from this class.
class SpecialContext extends ContextBase {
}
and here is the exact error I get:
Error:(10, 7) illegal inheritance;
class SpecialContext inherits different type instances of trait Map:
java.util.Map[K,V] and java.util.Map[K,V]
class SpecialContext extends ContextBase {
One of the attractions of Scala for me is that, while using its nice language features, I would still be able to use Java's extensive collection of open-source libraries. After this experience I am questioning that assumption: considering my class does nothing special, is integrating the Java and Scala worlds always this problematic?
First, my question is of course: is there a solution to the problem I described above?
Second: what is your experience integrating Scala with Java libraries? Or am I going about this the wrong way? Are there ports of popular Java libraries to Scala, like commons-chain here, or, let's say, Spring?
Thanks for any answers.

The problem with ContextBase is that it uses raw types: in https://commons.apache.org/proper/commons-chain/apidocs/org/apache/commons/chain/impl/ContextBase.html you can see Map and HashMap instead of Map<Something, Something>.
Java only supports raw types to integrate with old, pre-generics code (as a reminder, Java 5 was released in 2004), so you shouldn't see them in modern Java libraries. Scala doesn't support them at all.
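For contrast, extending a properly generified Java class works fine from Scala; the clash only appears when the superclass chain uses raw types. A minimal sketch (the JHashMap alias and StringCounts name are invented for illustration):

```scala
import java.util.{HashMap => JHashMap}

// Extending a generified Java class is fine from Scala; the
// "different type instances of trait Map" error only appears when
// the superclass uses raw Map/HashMap.
class StringCounts extends JHashMap[String, Int]
```

So the usual way out is to keep the raw-typed Java class on the Java side of the project and only use (not extend) it from Scala.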

Related

How can I assert if a class extends "AnyVal" using ArchUnit

I want to write an ArchUnit test to assert that a class extends the AnyVal type.
val rule = classes().should().beAssignableTo(classOf[AnyVal])
val importedClasses = new ClassFileImporter().importPackages("a.b.c")
rule.check(importedClasses) // always passes
The above code doesn't actually catch anything and passes even for classes that don't extend AnyVal.
classOf[AnyVal] is java.lang.Object, so you are just asking that all classes extend Object, which they do.
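You can verify the erasure yourself; a quick sketch:

```scala
object ErasureCheck {
  // At the bytecode level that ArchUnit inspects, both AnyVal and AnyRef
  // erase to java.lang.Object, so a rule built from classOf[AnyVal]
  // matches every class.
  val anyValClass: Class[_] = classOf[AnyVal]
  val anyRefClass: Class[_] = classOf[AnyRef]
}
```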
From ArchUnit user guide:
It does so by analyzing given Java bytecode, importing all classes into a Java code structure.
I was hoping you'd get Class etc. and could go into Scala reflection from there, even if you wouldn't get the nice DSL, but they use their own API instead.
So to answer Scala-specific questions, it would need to parse @ScalaSignature annotations, and that would probably be a very large effort for the developers (not to mention maintenance, or dependence on a specific Scala version, at least until Scala 3).

Where is the Object class or java.lang imported into the scala package or Any class?

From my understanding, the ultimate class in Scala is the Any class. However, I thought Scala was built off Java, so wouldn't the ultimate class be Object? I have been checking the documentation, and I could be wrong, but it does not show that Object is the parent class of Any, nor can I see the java.lang package being imported anywhere into Scala, which should be its backbone, right?
You are confusing the Scala Programming Language with one of its Implementations.
A Programming Language is a set of mathematical rules and restrictions. Not more. It isn't "written in anything" (except maybe in English) and it isn't "built off anything" (except maybe the paper that the specification is written on).
An Implementation is a piece of software that either reads a program written in the Programming Language and executes it in such a way that exactly what the Specification says should happen does happen (in which case we call the Implementation an Interpreter), or reads the program and outputs another program in another language, in such a way that executing that output program with an interpreter for its language makes things happen exactly as the Specification for the input language says (a Compiler).
Either way, it is the job of the person writing the Implementation to make sure that his Implementation does what the Specification says it should do.
So, even if I am writing an Implementation of Scala that is "built off Java" and written in Java, I still need to make sure that Any is the top type, because that's what the Scala Language Specification says. It is probably instructive to look at how, exactly, the Scala Language Specification phrases this [bold emphasis mine]:
Classes AnyRef and AnyVal are required to provide only the members declared in class Any, but implementations may add host-specific methods to these classes (for instance, an implementation may identify class AnyRef with its own root class for objects).
There are currently three actively-maintained Implementations of Scala, and one abandoned one.
The abandoned Implementation of Scala is Scala.NET, which was a compiler targeting the Common Language Infrastructure. It was abandoned due to lack of interest and funding. (Basically, all the users that probably would have used Scala.NET were already using F#.)
The currently maintained Implementations of Scala are:
Scala-native: a compiled Implementation targeting unixoid Operating Systems and Windows.
Scala.js: a compiled Implementation targeting the ECMAScript and Web platform.
Scala (a rather unfortunately confusing name, because it is the same as the language): a compiled Implementation targeting the Java platform. And by "Java platform", I mean the Java Virtual Machine and the Java Runtime Environment but not the Java Programming Language.
All three Implementations are written 100% in Scala. Actually, they are not three fully independent implementations: they share the same compiler frontend with different backends, and they share the parts of the Scala Standard Library that are written in Scala, re-implementing only the parts written in other languages.
So, what is true is that the Java Implementation of Scala does indeed do something with java.lang.Object. However, java.lang.Object is not the superclass of scala.Any. In fact, it can't be because scala.Any is the root superclass of both reference types and value types, whereas java.lang.Object is only the root superclass of all reference types. Therefore, java.lang.Object is actually equivalent to scala.AnyRef and not to scala.Any. However, java.lang.Object is not the superclass of scala.AnyRef either, but rather, both are the same class.
Also, java.lang._ is automatically imported just like scala._ is. But this does not apply to the Scala Programming Language, it only applies to the Java Implementation of the Scala Programming Language, whose name is unfortunately also Scala.
So, only for one of the three Implementations, there is some truth to the statement that java.lang.Object is the root class, but it is not a superclass of scala.Any, rather it is the same class as scala.AnyRef.
But again, this is only true for the Java Implementation of Scala. For example, in Scala.NET, the root superclass would be identified with System.Object, not java.lang.Object, and it would be equivalent to scala.Any, not scala.AnyRef because the CLI has a unified type system like Scala where reference types and value types are unified in the same type system. And I haven't checked Scala.js, but I would assume that it would identify Object with scala.AnyRef.
Note, however, that none of this is because the Implementation is "built off" something. The reason for merging the Scala and Java / CLI / ECMAScript class hierarchies is interoperability: making it easy to call Scala code from every other language on the Java / CLI / ECMAScript platform and, vice versa, to call code written in other languages from Scala. If you didn't care about that, there would be no need to jump through these hoops.
java.lang.Object is not the parent of scala.Any. Consider the following relationships:
implicitly[Any <:< java.lang.Object] // error
implicitly[AnyVal <:< java.lang.Object] // error
implicitly[AnyRef <:< java.lang.Object] // ok
However, say you had the following Java class
public class Foo {
public void bar(Object o) {}
public void zar(int o) {}
public void qux(java.lang.Integer o) {}
}
then all of the following would still work when called from Scala
val foo = new Foo
foo.bar(42.asInstanceOf[Int])
foo.bar(42.asInstanceOf[Any])
foo.bar(42.asInstanceOf[AnyVal])
foo.bar(42.asInstanceOf[AnyRef])
foo.zar(42) // zar takes primitive int
foo.qux(42) // qux takes java.lang.Integer

using Calcite's ReflectiveSchema from scala

I'm experimenting with Calcite from Scala, trying to pass a simple Scala class for creating a schema at runtime (using ReflectiveSchema), and I'm running into some trouble.
For example, re-implementing the FoodMart JDBC example (which works well in Java), I'm calling it as simply as new ReflectiveSchema(new Hr()), using an Hr class rewritten in Scala as:
class HR {
val emps: Array[Employee] = Array(new Employee(100, "Bill"))
}
I'm getting the error ...SqlValidatorException: Object 'emps' not found within 'hr'. The problem seems to be that Scala val fields are compiled to private fields in the bytecode, while Calcite's implementation can only use (via Java reflection) fields accessible through a class's .getFields() method, which returns only public fields.
So I suppose this direction requires a lot more hacking than a simple my_field.setAccessible(true) or similar.
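The private-field behavior is easy to confirm with plain Java reflection; a minimal sketch (HR simplified here to drop the Calcite dependency):

```scala
class HR {
  val emps: Array[String] = Array("Bill")
}

object FieldVisibility {
  // A Scala val compiles to a private field plus a public accessor method,
  // so Class.getFields (public fields only) sees nothing, while the field
  // still shows up among the declared (private) fields.
  val publicFields: List[String]   = classOf[HR].getFields.map(_.getName).toList
  val declaredFields: List[String] = classOf[HR].getDeclaredFields.map(_.getName).toList
  val accessors: List[String]      = classOf[HR].getMethods.map(_.getName).toList
}
```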
Is there any other way to construct a schema through the API, avoiding reflection and the use of JSON?
Thanks in advance for any suggestions.

Can Scala classes be used in Java

class Wish {
  val s = "Hello! User. Wish you a Great day."
}

object Wish {
  def main(args: Array[String]): Unit = {
    val w = new Wish()
    println("Value - " + w.s)
  }
}
Java classes can be used in Scala. Similarly, can Scala classes be used in Java?
Yes, Scala classes can be called from Java and vice versa.
The text below is taken from the Scala FAQ:
What does it mean that Scala is compatible with Java?
The standard Scala backend is a Java VM. Scala classes are Java classes, and vice versa. You can call the methods of either language from methods in the other one. You can extend Java classes in Scala, and vice versa. The main limitation is that some Scala features do not have equivalents in Java, for example traits.
The following post could also be helpful to you: how to call Scala from Java
Yes. If you want to do this, there are a few things you might want to remember:
Do not use operators in your method names, or provide a wordy alternative alongside them. Operator names can be called from Java, but they are mangled into something very ugly.
Java users might expect Java-style getters and setters. You can generate those automatically by adding the @BeanProperty annotation to fields.
In the same way, Java users might be accustomed to factory methods named ClassName.of, where Scala uses .apply. Those you have to provide by hand if you want to offer that service.
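A small sketch pulling those three points together (the Counter class and its method names are invented for illustration):

```scala
import scala.beans.BeanProperty

class Counter(@BeanProperty var count: Int) {
  // Operator method: callable from Java only via the mangled name $plus$eq
  def +=(n: Int): Unit = count += n
  // Wordy alternative that Java callers can use directly
  def add(n: Int): Unit = this += n
}

object Counter {
  // Scala-idiomatic factory
  def apply(start: Int): Counter = new Counter(start)
  // Java-friendly factory in the ClassName.of style
  def of(start: Int): Counter = new Counter(start)
}
```

From Java this reads naturally as `Counter c = Counter.of(1); c.add(2); c.getCount();`, with the operator still available to Scala callers.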

Why does Array.fill take an implicit scala.reflect.ClassManifest?

So I'm playing with writing a battlecode player in Scala. In battlecode certain classes are disallowed and there is a runtime exception if you ever try to access them. When I use the Array.fill function I get a message from the battlecode server saying [java] Illegal class: scala/reflect/Manifest$. This is the offending line:
val g_score = Array.fill[Int](rc.getMapWidth(), rc.getMapHeight())(0)
The method takes an implicit ClassManifest argument which has the following documentation:
A ClassManifest[T] is an opaque descriptor for type T. It is used by the compiler
to preserve information necessary for instantiating Arrays in those cases where
the element type is unknown at compile time.
But I do know the type of the array elements at compile time; as shown above, I explicitly state that they will be Int. Is there a way to avoid this? As a workaround I've written my own version of Array.fill, but this seems like a hack. As an aside, does Scala have real 2D arrays? Array.fill seems to return an Array[Array[T]], which is also the only way I found to write my own. This too seems inelegant.
Edit: Using Scala 2.9.1
For background information, see this related question: What is a Manifest in Scala and when do you need it?. In that answer you will find an explanation of why manifests are needed for arrays.
In short: although the JVM uses type erasure, arrays are an exception and need a manifest. Since your code compiled, a manifest was found (manifests are always available for proper types); your error occurs at runtime.
I don't know the details of the battlecode server, but there are two possibilities: either you are running your compiled classes with a binary-incompatible version of Scala (a difference in major version, e.g. compiled with Scala 2.9 while the server uses 2.10), or the server doesn't even have scala-library.jar on its class path.
As said in the comment, manifests are deprecated in Scala 2.10 and replaced by ClassTag.
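To see that the manifest (a ClassTag on 2.10+) is resolved entirely at compile time for a concrete element type like Int, a quick sketch:

```scala
import scala.reflect.ClassTag

object TagDemo {
  // The compiler materializes ClassTag[Int] at the call site; at runtime
  // it carries the primitive int class used to instantiate the array.
  val tag: ClassTag[Int] = implicitly[ClassTag[Int]]
  val grid: Array[Array[Int]] = Array.fill[Int](2, 3)(0)
}
```

The runtime failure in the question is therefore not about the tag being unknown, but about the scala.reflect classes themselves being blacklisted by the server's class loader.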
EDIT: So it seems the class loader is artificially restricting the allowed classes. My suggestion: add a helper Java class; you can easily mix Java and Scala code. If it's just about the Int array instantiation, you could provide something like:
public class Helper {
    public static int[][] makeArray(int d1, int d2) { return new int[d1][d2]; }
}
(hope that's valid java code, a bit rusty)
Also, have you tried creating the outer array with new Array[Array[Int]](d1) and then iterating to create the inner arrays?
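That works because `new Array[Int](n)` needs no manifest when the element type is concrete; a minimal sketch of the suggestion (the makeGrid name is invented here):

```scala
object ManifestFree {
  // Build a 2D Int array without Array.fill, so no Manifest/ClassTag
  // implicit is ever materialized at the call site.
  def makeGrid(d1: Int, d2: Int): Array[Array[Int]] = {
    val outer = new Array[Array[Int]](d1)
    var i = 0
    while (i < d1) {
      outer(i) = new Array[Int](d2) // Int cells default to 0
      i += 1
    }
    outer
  }
}
```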