How to work around the XML serialization issue with type name aliases (in .NET)?

I have a collection of objects and for each object I have its type FullName and either its value (as string) or a subtree of inner objects details (again type FullName and value or a subtree).
For each root object I need to create a piece of XML that will be xml de-serializable.
The problem with XmlSerializer is that, e.g., the following object
int age = 33;
will be serialized to
<int>33</int>
At first this seems perfectly OK; however, when working with reflection you will be using System.Int32 as the type name, int is only an alias, and this
<System.Int32>33</System.Int32>
will not deserialize.
Now additional complexity comes from the fact that I need to handle any possible data type.
So solutions that utilize System.Activator.CreateInstance(..) and casting won't work, unless I go down the path of code generation and compilation as a way of achieving this (which I would rather avoid).
Notes:
Some quick research with .NET Reflector revealed that XmlSerializer uses the internal class TypeScope; look at its static constructor to see that it initializes an internal Hashtable with the mappings.
So the question is: what is the best (and I mean elegant and solid) way to work around this sad fact?

I don't exactly see where your problem originally stems from. XmlSerializer will use the same syntax/mapping for serializing as for deserializing, so there is no conflict when using it for both.
The type tags used are probably some XML-standard thing, but I'm not sure about that.
I guess the problem is more your usage of reflection. Do you instantiate your imported/deserialized objects by calling Activator.CreateInstance?
I would recommend the following instead, if you have some type Foo to be created from the XML in xmlReader:
Foo deserializedObject = (Foo)new XmlSerializer(typeof(Foo)).Deserialize(xmlReader);
Alternatively, if you don't want to switch to XmlSerializer completely, you could do some preprocessing of your input. The standard way would be to create an XSLT that transforms all those type elements into their aliases, or vice versa. Then, before processing the XML, you apply your transformation using System.Xml.Xsl.XslCompiledTransform and use your (or the Reflector-derived) mappings for each type.

Why don't you serialize each field's type as an attribute?
Instead of
<age>
<int>33</int>
</age>
you could do
<age type="System.Int32">
33
</age>

Related

Scala macros: Generate factory or ctor for Java Pojo

I'm currently working with a reasonably large code base where new code is written in Scala, but a lot of old Java code remains. In particular, there are a lot of Java APIs we have to talk to. The old code uses simple Java POJOs with public non-final fields, without any methods or constructors, e.g.:
public class MyJavaPojo {
  public String myField1;
  public MyOtheJavaPojo myField2;
}
Note that we don't have the option of adding helper methods or constructors to these types. They are currently created like old C structs (pre-named-parameters), like this:
val myPojo = new MyJavaPojo
myPojo.myField1 = ...
myPojo.myField2 = ...
Because of this, it's very easy to forget to assign one of the fields, especially when new fields are suddenly added to the MyJavaPojo class; the compiler won't complain that I've left one field null.
NOTE: We don't have the option of modifying the Java types or adding constructors the normal way. We also don't want to start creating lots and lots of manually written helper functions for object creation. We would really like to find a solution based on Scala macros instead, if possible!
What I would like to do is create a macro that generates either a constructor-like method for my POJOs or a factory, allowing for named parameters. (Basically letting a macro do the work instead of creating a gazillion manually written helper methods in Scala.)
Do you know of any way to do this with Scala macros? (I'm certain it's possible, but I've never written a Scala macro in my life.)
Desired API alternative 1:
val myPojo = someMacro[MyJavaPojo](myField1 = ..., myField2 = ...)
Desired API alternative 2
val factory = someMacro[MyJavaPojo]
val myPojo = factory.apply(myField1 = ..., myField2 = ...)
NOTE/Important: Named parameters!
I'm looking for either a ready-to-use solution or hints as to where I can read up on making one.
All ideas and input appreciated!
Take a look at scala-beanutils.
@beanCompanion[MyJavaPojo] object MyScalaPojo
MyScalaPojo(...)
It probably won't work directly, as your classes are not beans and it has only been made for Scala 2.10, but the source code is < 200 lines and should give you an idea of where to start.
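If a full macro turns out to be more than you want to take on, here is a minimal runtime sketch of the factory idea (not a macro; all names except MyJavaPojo are hypothetical, and field names are only checked at runtime, not at compile time): scala.Dynamic's applyDynamicNamed provides the named-argument call syntax, and Java reflection sets the public fields.

import scala.language.dynamics
import scala.reflect.ClassTag

// Runtime-only sketch: named-argument syntax via Dynamic, public fields set by reflection.
// Unlike the requested macro, a misspelled or missing field is caught only at runtime.
class PojoFactory[T](implicit ct: ClassTag[T]) extends Dynamic {
  def applyDynamicNamed(method: String)(args: (String, Any)*): T = {
    require(method == "apply", s"unsupported call: $method")
    val cls = ct.runtimeClass
    val instance = cls.getDeclaredConstructor().newInstance().asInstanceOf[T]
    args.foreach { case (name, value) =>
      cls.getField(name).set(instance, value) // public, non-final fields only
    }
    instance
  }
}

// usage:
// val factory = new PojoFactory[MyJavaPojo]
// val myPojo  = factory(myField1 = "foo", myField2 = new MyOtheJavaPojo)

This gives you the call syntax of desired API alternative 2, but without the compile-time checking a real macro (or scala-beanutils) would provide.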

Why does class Set not exist in Scala?

There are many classes which inherit trait Set.
HashSet, TreeSet, etc.
And there's the object Set (could I call it the companion object of the trait Set, even though there is no class Set?) and the trait Set.
It seems to me that just adding one more class, Set, to this list would make the structure really easy to understand.
Is there any reason a class Set should not exist?
If you just need a set, use Set.apply and you will have a valid set that supports all important operations. You don't need to worry about how it is implemented. It is prepared to work well for most use cases.
On the other hand, if performance of certain operations matters for you, create a concrete class for concrete implementation of set, and you will know exactly what you have.
In Java you would write:
Set<String> strings = new HashSet<>(Arrays.asList("a", "b"));
In Scala you could just as well use those types:
val strings: Set[String] = HashSet("a", "b")
but you can also use a handy factory if you don't need to worry about the type and simply use
val strings = Set("a", "b")
and nothing is wrong with this; I don't see how adding another class would help at all. It is a normal thing to have an interface/trait and concrete implementations; nothing in the middle is needed or helpful.
Set.apply is a factory for sets. You can check what the actual class of the resulting object is using getClass. This factory creates special, optimized sets for sizes 0-4, for example:
scala.collection.immutable.Set$EmptySet$
scala.collection.immutable.Set$Set1
scala.collection.immutable.Set$Set2
For bigger sets it is a hash set, namely scala.collection.immutable.HashSet$HashTrieSet.
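For example, a quick check in the REPL (exact class names as observed on Scala 2.11/2.12; they may differ between versions):

Set.empty[Int].getClass      // class scala.collection.immutable.Set$EmptySet$
Set(1, 2).getClass           // class scala.collection.immutable.Set$Set2
Set(1, 2, 3, 4, 5).getClass  // class scala.collection.immutable.HashSet$HashTrieSet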
In Scala there is no overlap between classes and traits. Classes are implementations that can be instantiated, while traits are independently mixable interfaces. The use of Set.apply gives an object with interface Set and that is all you need to know to use it. I fully understand wanting a concrete type, but that would be unnecessary. The right thing to do here is save it to a val of type Set and use only the interface Set provides.
I know that may not be satisfying, but give it time and the Scala type system will make sense in terms of itself, even if that is different than what Java does.

Scala: when to use explicit type annotations

I've been reading a lot of other people's Scala code recently, and one of the things that I have difficulty with (coming from Java) is the lack of explicit type annotations.
It's certainly convenient when writing code to be able to leave out type annotations; however, when reading code I often find that explicit type annotations help me understand at a glance what the code is doing.
The Scala style guide (http://docs.scala-lang.org/style/types.html) doesn't seem to provide any definitive guidance on this, stating:
Use type inference where possible, but put clarity first, and favour explicitness in public APIs.
To my mind, this is a bit contradictory. While it's clearly obvious what type this variable is:
val tokens = new HashMap[String, Int]
It's not so obvious what type this one is:
val tokens = readTokens()
So, if I was putting clarity first I would probably annotate all variables where the type is not already declared on the same line.
Do any Scala practitioners have guidance on this? Am I crazy to be considering adding type annotations to my local variables? I'm particularly interested in hearing from folks who spend a lot of time reading Scala code (for example, in code reviews), as well as writing it.
It's not so obvious what type this one is:
val tokens = readTokens()
Good names are important: the name is plural, ergo it returns some collection of some kind. The most general collection types in Scala are Traversable and Iterator, and they mostly share a common interface, so it's not really important which one of the two it is. The name also talks about "reading tokens", ergo it obviously should return Tokens in some fashion. And last but not least, the method call has parentheses, which according to the style guide means it has side-effects, so I wouldn't count on being able to traverse the collection more than once.
Ergo, the return type is something like
Traversable[Token]
or
Iterator[Token]
and which of the two it is doesn't really matter because their client interfaces are mostly identical.
Note also that the latter constraint (only traversing the collection once) isn't even captured in the type; even if you were providing an explicit type, you would still have to look at the name and the style!
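That said, if a call site bothers you during review, nothing stops you from annotating the local val; it costs one type ascription (Token and readTokens() are the question's hypothetical names):

val tokens: Iterator[Token] = readTokens()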

How can I look up a spray-json formatter by the runtime type of a value?

The traditional way to use spray-json seems to be to bind all your models to appropriate JsonFormats (built-in or custom) at compile time, with the formats all as implicits. Is there a way instead to look the formatters up at runtime? I'm trying to marshal a heterogeneous list of values, and the only ways I'm seeing to do it are
Write an explicit lookup (e.g. using pattern matching) that hard-codes which formatter to use for which value type, or
Something insane using reflection to find all the implicits
I'm pretty new to Scala and spray-json both, so I worry I'm missing some simpler approach.
More context: I'm trying to write a custom serializer that writes out only a specified subset of (lazy) object fields. I'm looping over the list of specified fields (field names) at runtime and getting the values by reflection (actually it's more complicated than that, but close enough), and now for each one I need to find a JsonFormat that can serialize it.
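For reference, option 1 above would look roughly like the following sketch: an explicit runtime lookup that pattern-matches on the value's class and delegates to the corresponding built-in format. The case list is hypothetical and has to be maintained by hand, which is exactly the drawback the question mentions.

import spray.json._
import DefaultJsonProtocol._

// Hand-maintained runtime dispatch: match on the value's runtime type and
// reuse the compile-time JsonFormats for each concrete type.
def toJsonByRuntimeType(value: Any): JsValue = value match {
  case s: String  => s.toJson
  case i: Int     => i.toJson
  case d: Double  => d.toJson
  case b: Boolean => b.toJson
  case xs: Seq[_] => JsArray(xs.map(toJsonByRuntimeType).toVector)
  case other      => serializationError(s"no JsonFormat known for ${other.getClass.getName}")
}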

In Scala is there any way to get a parameter's method name and class?

At my work we use a typical heavy enterprise stack of Hibernate, Spring, and JSF to handle our application, but after learning Scala I've wanted to try to replicate much of our functionality within a more minimal Scala stack (Squeryl, Scalatra, Scalate) to see if I can decrease code and improve performance (an Achilles' heel for us right now).
Often my way of doing things is influenced by our previous stack, so I'm open to advice on ways of doing things that are closer to Scala paradigms. However, I've chosen some of what I do based on paradigms we already have in the Java code base, so that other team members will hopefully be more receptive to the work I'm doing. But here is my question:
We have a domain class like so:
class Person(var firstName: String, var lastName: String)
Within a jade template I make a call like:
.section
- view(fields)
The backing class has a list of fields like so:
class PersonBean(val person: Person) {
  val fields: Fields = Fields(person,
    List(
      Text(person.firstName),
      Text(person.lastName)
    ))
}
Fields has a base object (person) and a list of Field objects. Its template prints all of its fields' templates. Text extends Field, and its Jade template is supposed to print:
<label for="person:firstName">#{label}</label>: <input type="text" id="person:firstName" value="#{value}" />
Now the #{value} is simply a call to person.firstName. However, to find out the label I reference a ResourceBundle and need to produce a string key. I was thinking of using a naming convention like:
person.firstName.field=First Name
So the problem then becomes: how can I, within the Text class (or the parent Field class), discover what the parameter being passed in is? Is there a way I can pass in person.firstName and find out that it is calling firstName on the class Person? And finally, am I going about this completely wrong?
If you want to take a walk on the wild side, there's a (hidden) API in Scala that allows you to grab the syntax tree for a thunk of code - at runtime.
This incantation goes something like:
scala.reflect.Code.lift(f).tree
This should contain all the information you need, and then some, but you'll have your work cut out for you interpreting the output.
You can also read a bit more on the subject here: Can I get AST from live scala code?
Be warned though... It's rightly classified as experimental, do this at your own risk!
You can never do this anywhere from within Java, so I'm not wholly clear as to how you are just following the idiom you are used to. The obvious reason that this is not possible is that Java is pass-by-value. So in:
public void foo(String s) { ... }
There is no sense that the parameter s is anything other than what it is. It is not person.firstName just because you called foo like:
foo(person.firstName);
Because person.firstName and s are completely separate references!
What you could do is replace the fields (e.g. firstName) with actual objects which have a name attribute.
I did something similar in a recent blog post: http://blog.schauderhaft.de/2011/05/01/binding-scala-objects-to-swing-components/
The property there doesn't have a name attribute (yet), but it is a full object and is still just as easy to use as a field.
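A minimal sketch of that idea, with hypothetical names (it changes the shape of Person, so it is only an illustration): each field becomes a small property object that knows its own name, so the ResourceBundle key can be built without any reflection.

// A field modelled as an object that carries its own name and its value.
class Property[A](val name: String, var value: A)

class Person {
  val firstName = new Property("firstName", "")
  val lastName  = new Property("lastName", "")
}

// Building the bundle key from the property itself, e.g. "person.firstName.field":
def labelKey(owner: String, p: Property[_]): String = s"$owner.${p.name}.field"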
I would not be very surprised if the following is complete nonsense:
Make the parameter that gets passed in not of type A but of type Context[A].
Create an implicit that turns any A into a Context[A] and, while doing so, captures the value of the parameter in a call-by-name parameter.
Then use reflection to inspect the call-by-name parameter that gets passed in.
For this to work, you'd need very specific knowledge of how stuff gets turned into call-by-name functions; and how to extract the information you want (if it's present at all).