Problems with Scala Swing library - scala

Hello, I am having problems using the Scala Swing library in version 2.8 Beta1-prerelease. I have a situation where I want to show a table in a GUI and update it as results are returned from a SQL request. How could this be done in Scala? At the moment I am using the DefaultTableModel from the Java library.
Another thing is that I want the table to be sortable afterwards; I can't see whether the Scala Swing library supports this either.

No, the Scala Swing library does not support sorting in its Table component; your best bet is to revert to using JTable (i.e. the Java Swing class). A couple of things to note:
Don't use DefaultTableModel; use AbstractTableModel and implement the getRowCount, getColumnCount and getValueAt methods. A table model should follow the adapter pattern.
The appalling Java generics on RowSorter are annoying when used from Scala. You will have to use explicit generic type arguments.
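To make that concrete, here is a minimal sketch of such a table model backed by rows from a SQL query, with a TableRowSorter attached. The class, column names and data are hypothetical, and the explicit type argument on TableRowSorter is the part the generics complaint refers to:

```scala
import javax.swing.JTable
import javax.swing.table.{AbstractTableModel, TableRowSorter}

// Adapter over the query results: the model holds the data and the
// table pulls values from it on demand.
class ResultTableModel(columns: Vector[String]) extends AbstractTableModel {
  private var rows = Vector.empty[Vector[Any]]

  def getRowCount: Int = rows.size
  def getColumnCount: Int = columns.size
  def getValueAt(row: Int, col: Int): AnyRef = rows(row)(col).asInstanceOf[AnyRef]
  override def getColumnName(col: Int): String = columns(col)

  // Call this as each SQL result arrives (on the event dispatch thread)
  // to refresh the view.
  def addRow(row: Vector[Any]): Unit = {
    rows = rows :+ row
    fireTableRowsInserted(rows.size - 1, rows.size - 1)
  }
}

val model = new ResultTableModel(Vector("id", "name"))
val table = new JTable(model)
// Explicit type argument, as noted above.
table.setRowSorter(new TableRowSorter[ResultTableModel](model))
```

Sorting then works by clicking the column headers; note that TableRowSorter falls back to comparing cell values as strings unless the model overrides getColumnClass.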

Access scala function in PySpark

I have a Scala library which contains some utility code and UDFs for the Scala Spark API.
However, I would now love to start using this Scala library from PySpark. Using Java-based classes seems to work pretty well, as outlined in Running custom Java class in PySpark; however, since I use a library written in Scala, the names of some classes might not be straightforward and may contain characters like $.
How is interoperability still possible?
How can I use Java/Scala code which is offering a function requiring a generic type parameter?
In general you don't. While access in such cases is sometimes possible, using __getattribute__ / getattr, Py4j is simply not designed with Scala in mind (that's really not Python specific: while Scala is technically speaking interoperable with Java, it is a much richer language, and many of its features are not easily accessible from other JVM languages).
In practice you should do the same thing that Spark does internally: instead of exposing the Scala API directly, you create a lean* Java or Scala API which is specifically designed for interoperability with guest languages. Since Py4j provides translation only between basic Python and Java types, and doesn't handle commonly used Scala interfaces, you will need such an intermediate layer anyway, unless the Scala library was specifically designed for Java interoperability.
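To illustrate, here is a minimal sketch of such a lean layer; the package, object and method names are all hypothetical, and the body is a trivial stand-in for whatever the wrapped library really does:

```scscala
package com.example.interop

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.trim

// Lean facade: only Java-friendly types in the signature (DataFrame,
// String), no implicits, no default arguments, no Scala collections.
object PyFacade {
  def cleanColumn(df: DataFrame, column: String): DataFrame =
    df.withColumn(column, trim(df.col(column)))
}
```

A top-level Scala object compiles to a class with static forwarders, so from PySpark this should be reachable without any $-mangling, along the lines of spark._jvm.com.example.interop.PyFacade.cleanColumn(df._jdf, "name").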
As for your last concern:
How can I use Java/Scala code which is offering a function requiring a generic type parameter?
Py4j can handle Java generics just fine without any special treatment. Advanced Scala features (manifests, class tags, type tags) are typically a no-go, but once again, they are not designed with Java interoperability in mind (though it is possible).
* As a rule of thumb, if something is Java-friendly (doesn't require any crazy hacks, extensive type conversions, or filling in the blanks normally handled by the Scala compiler), it should be a good fit for PySpark as well.

External DSL for CRUD code generation for multiple frameworks

I am developing a general-purpose CRUD code generator application. The idea is that code/files (model, controller, view) for common insert, update, list, delete etc. operations will be automatically generated from a model definition (like the definition used in Grails). But the generated code can be for any framework, e.g. Play (Scala or Java version) or Django or Grails or whatever framework the user wants to use it for, even AngularJS. That is, the same model definition can be used for generating code for any framework.
My question is, what can I use for this task: Scala or Groovy or some specialized DSL tools like Xtext?
This seems like a good case for a DSL. A DSL can be summed up as the following 3 elements:
Abstract Syntax: the concepts of your DSL. Here you want to specify CRUD applications.
Concrete Syntax(es): a way to materialize your Abstract Syntax. As a programmer, the first thought is often text-based syntaxes, but you could also use a graphical or tree-like syntax, or even simply a GUI with text fields and checkboxes.
Semantics: the meanings of your DSL. Here you want to generate code.
I'll now suggest some solutions which are based on Java and come from the Eclipse Modeling ecosystem.
Eclipse EMF implements standards for the definition of so-called "metamodels" (basically abstract syntaxes). In the Eclipse world, EMF is the base for a lot of tooling.
Assuming you have an EMF metamodel, textual syntaxes can be specified using Eclipse Xtext, and graphical syntaxes using Eclipse Sirius. Note that you can also develop your own GUI in Java and create your model using the EMF Java APIs. Also note that Xtext can create your metamodel for you based on the grammar you want for your text-based syntax. This is nice if you don't want to dive too deep into EMF itself (thus steps 1 and 2 are one and the same).
Eclipse Acceleo provides a template language specifically designed to generate text, including code. Once again, you can also write your code generator in plain Java, or any JVM-based language, thanks to the EMF Java APIs. If you use Xtext, there is also a facility for including an Xtend-based code generator alongside your syntax.
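If you want to prototype without the Eclipse stack first, the same split can be sketched directly on the JVM. Here is a small, hypothetical Scala sketch in which case classes play the role of the abstract syntax and each target framework is just one more code generator over the same model (all names are illustrative):

```scala
// Abstract syntax: the concepts of the CRUD DSL.
sealed trait FieldType
case object StringField extends FieldType
case object IntField extends FieldType

case class Field(name: String, tpe: FieldType)
case class Entity(name: String, fields: List[Field])

// Semantics: one generator per target framework over the same model.
trait Generator { def generate(e: Entity): String }

object PlayControllerGen extends Generator {
  def generate(e: Entity): String =
    s"class ${e.name}Controller { /* CRUD actions for ${e.fields.map(_.name).mkString(", ")} */ }"
}

// The same model definition can be fed to DjangoGen, GrailsGen, etc.
val book = Entity("Book", List(Field("title", StringField), Field("pages", IntField)))
println(PlayControllerGen.generate(book))
```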

Scala Metaprogramming at Runtime

I'm building a tool that will receive unpredictable data structures, and I want to generate case classes matching the structure of the received data.
I'm trying to figure out whether it's possible to generate a case class at runtime. The structure will be known only at runtime.
It's something similar to what a macro does, but at runtime.
I've found this project on the internet
mars
Which is very close to what I want to do, but I couldn't find out whether it was successful or not.
Another way of doing it is to generate the code, compile it, and put the result on the classpath, like IScala does in order to use code interactively. But I don't think this will scale.
Has anybody already done something like runtime code generation?
This question was also posted in the scala-user mailing list.
UPDATE: (as per the comments)
If all you want is throw-away code generated at runtime to be fed into a library that cannot work with just lists and maps, and not code to be stored and used later, it would make sense to look for solutions to this problem for Java or the JVM in general. That is, unless the library requires some Scala-specific features not available in vanilla JVM bytecode (Scala adds some extras to the bytecode which Java code doesn't need/have).
What is the benefit of generating statically typed code dynamically, as opposed to using a dynamic data structure?
I would not attempt that at all. Just use a structure such as nested lists and maps.
Runtime code generation is one of the purposes of the Mars Project. Mars is under development; at the moment there is no release version. Mars requires its own toolchain to expand macros at runtime and uses several features unique to scala.meta (http://scalameta.org/), for example AST interpretation and AST persistence. Currently we are working on AST typechecking in scala-reflect, which is required for runtime macro expansion.
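For the generate-compile-load route mentioned in the question, the scala-reflect ToolBox (shipped in the scala-compiler artifact) can already parse and evaluate source strings at runtime. A minimal sketch, with a hard-coded source string standing in for one assembled from the received data:

```scala
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox // needs scala-compiler on the classpath

val tb = currentMirror.mkToolBox()

// Compile and evaluate a case class definition built at runtime.
val instance = tb.eval(tb.parse(
  """case class Person(name: String, age: Int)
    |Person("Ada", 36)""".stripMargin))

println(instance) // Person(Ada,36)
```

The limitation matches the caveat above: no statically compiled code can refer to Person by name, so values created this way are only useful through reflection or as input to a library that merely expects "some case class".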

Is it possible to find (at runtime) all subclasses which mix in some trait, using Scala 2.10?

I need to find all subclasses which mix in some trait (I want to do this at runtime). I know of a tool written in Scala (ClassUtil), but this tool is slow. I also know one tool written in Java (faster than ClassUtil), but given the choice I would rather not use external libraries. So my question is: does Scala 2.10 have support for solving my problem?
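For what it's worth, Scala 2.10 reflection has no classpath scanning, so the general problem still needs a library like the ones mentioned. The one built-in facility applies to sealed traits, where the compiler records all direct subclasses. A minimal sketch (trait and class names are illustrative):

```scala
import scala.reflect.runtime.universe._

sealed trait Animal
class Dog extends Animal
class Cat extends Animal

// knownDirectSubclasses works only because Animal is sealed: only then
// does the compiler know the complete set of direct subclasses.
val subclasses = typeOf[Animal].typeSymbol.asClass.knownDirectSubclasses
println(subclasses) // Set(class Dog, class Cat)
```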

Need to load a scala class at runtime from a jar and initialize it

To all -- I'm probably at best a new guy here, trying to wrap my head around Scala, and I find I need to do the following:
Assume I have a Scala class on disk somewhere, referenced in my classpath.
I have a Scala application that wants to dynamically load this class and call its constructor.
Once I have that class reference, I can use it to set up values in other classes and objects.
In Java, I'd use the Java class loader and create a new instance, whereupon I'd call its constructor. What is the right way to do this in Scala?
Scala classes are Java classes, so just do what you'd do in Java. The Scala Java Interoperability FAQ doesn't talk about classloaders specifically, but it might be helpful as you figure things out.
I wrote a blog entry on this quite a long time ago. Unfortunately I haven't found the time to update it for Scala 2.8.
Essentially it boils down to:
do it like you would in Java
use Scala features to improve the user interface
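A minimal sketch of both points, assuming a jar at a hypothetical path containing a hypothetical class com.example.Widget with a no-argument constructor:

```scala
import java.net.{URL, URLClassLoader}

// Do it like you would in Java: Scala classes are ordinary JVM classes,
// so the standard classloader machinery applies unchanged.
val loader = new URLClassLoader(Array(new URL("file:/path/to/my-lib.jar")),
                                getClass.getClassLoader)

// Use Scala features to improve the user interface: a small generic
// helper that loads, instantiates and casts in one call.
def instantiate[T](className: String): T =
  loader.loadClass(className).newInstance().asInstanceOf[T]

val widget = instantiate[AnyRef]("com.example.Widget")
```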