ServiceStack.Text.XmlSerializer.DeserializeFromString result change when I change the order of xmlnode. Why? - xml-serialization

What is wrong with ServiceStack.Text.XmlSerializer?
I have these classes:
public class weatherdata : IReturn<WashService>
{
    public Location location { get; set; }
}
public class Location
{
    public string name { get; set; }
    public string country { get; set; }
}
I try to deserialize third-party XML like this:
var data = ServiceStack.Text.XmlSerializer.DeserializeFromString<weatherdata>("<weatherdata><location><name>Moscow</name><country>RU</country></location></weatherdata>");
The result: data.location.name == "Moscow", but data.location.country is null.
If I change the XML like this:
var data = ServiceStack.Text.XmlSerializer.DeserializeFromString<weatherdata>("<weatherdata><location><country>RU</country><name>Moscow</name></location></weatherdata>");
then both values come through:
data.location.name == "Moscow";
data.location.country == "RU";
Why are the results so different when I only change the order of the elements?

As explained here, the default XML serializer used by ServiceStack (.NET's DataContractSerializer) expects the XML elements to arrive in a fixed order. In XML schema terminology, the elements are declared as xs:sequence rather than xs:all. For a plain POCO without [DataContract]/[DataMember] attributes that expected order is alphabetical by property name, which is why <country> before <name> deserializes completely while the reverse order silently drops country. If you need to support XML elements in any possible ordering in the request, then you may need to override the XML serializer used by ServiceStack as explained in the link above.
If you just need to adjust the ordering of the XML elements, I believe you can specify an exact ordering for your elements by decorating your properties with DataMember attributes and specifying the Order property. If you do this, then you will also need to decorate your Location class with a DataContract attribute.
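For example, a rough sketch (the exact Name and Namespace values here are assumptions, and you may need to decorate weatherdata the same way so the namespaces line up):
[DataContract(Name = "location", Namespace = "")]
public class Location
{
    // Order controls the sequence in which the DataContractSerializer expects the elements
    [DataMember(Name = "name", Order = 1)]
    public string name { get; set; }

    [DataMember(Name = "country", Order = 2)]
    public string country { get; set; }
}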

Related

C# Serialization in MongoDb - _id on nested type, and some properties with private setters

I'm in a position where I need to serialize some complex documents into MongoDb, but I can't change the class definition as I don't have control over the source.
However, we need to ensure that callers can still use Linq, so we need to map the class correctly into MongoDb.
Currently there are a few issues we're faced with:
The _id representation is on a nested class.
There are properties with private setters that need to be serialized/deserialized.
The shape of the class looks a little like this:
public class AggregateType : AggregateBase
{
    public int IntProperty { get; private set; }
    public ComplexObject ComplexObjectProperty { get; private set; }
}
With AggregateBase looking like this:
public abstract class AggregateBase
{
    public AggregateDetails Details { get; set; }
}
And finally:
public class AggregateDetails
{
    public Guid Id { get; set; }
    ...other properties
}
On the base class AggregateBase, there is a property called Details which contains the Id of the aggregate, which is a Guid. This Id field needs to be mapped to the ObjectId or _id field within a MongoDb document.
I need to be able to serialize the document, forcing the use of the Details.Id as the _id, and have the private setters serialized too.
I've done this with CosmoDb using a custom JsonContractResolver without issue. But the move to MongoDb has proved a little more complex.
It's worth noting that there are many AggregateType classes, all with a different shape. I'd like to find a generic way of serializing them, without having to write lots of specific mappers if possible - much like we do with CosmoDb.
On top of that, we would need this solution to work with the Linq query provider for MongoDb too.
I've thought a little about this; the only way I can see this working is if you create matching types that serve as your POCOs for inserting into MongoDb. I'm going to assume you are using the C# driver for Mongo.
public class AggregateTypeDocument : AggregateBaseDocument
{
    public int IntProperty { get; private set; }
    public ComplexObject ComplexObjectProperty { get; private set; }
}
public abstract class AggregateBaseDocument
{
    public AggregateDetailsDocument Details { get; private set; }
}
public class AggregateDetailsDocument
{
    [BsonId]
    public Guid Id { get; private set; }
    ...other properties
}
You will end up replicating the structure, just appending Document to each name in this example; by no means do you have to conform to that naming.
Now you can mold your types to be more Mongo-friendly using various attributes.
The next step would be, in your repository (or wherever), to map the types whose definitions you don't have access to onto your new Mongo-friendly ones.
I would suggest AutoMapper for this, or plain old instantiation. Now you should be able to safely operate on the collection. See the example below using AutoMapper.
var normalAggregateType = new AggregateType();
var client = new MongoClient("yourconnectionstring");
var db = client.GetDatabase("mydatabase");
var collection = db.GetCollection<AggregateTypeDocument>("myaggregatetypes");
var mongoAggregateType = Mapper.Map<AggregateTypeDocument>(normalAggregateType);
collection.InsertOne(mongoAggregateType);
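One detail the snippet glosses over is that the static Mapper.Map call needs a configuration registered first, typically once at application start-up. A minimal sketch (it assumes the static AutoMapper API used above, and that your AutoMapper version writes to private setters when mapping by convention, which recent versions generally do):
// one-time AutoMapper configuration, e.g. in your composition root
Mapper.Initialize(cfg =>
{
    // map the domain types you don't control onto the Mongo-friendly documents
    cfg.CreateMap<AggregateType, AggregateTypeDocument>();
    cfg.CreateMap<AggregateDetails, AggregateDetailsDocument>();
});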

How to ignore properties marked with [IgnoreDataMember] when calling REST service

I am consuming a REST Xml service.
I have all the necessary classes to do this (supplied by the dev who wrote the service), but at my end I have to save some of the responses to the DB to perform the tasks of the app I am writing.
So I have marked the classes I need to put in the DB as partial and extended them to inherit from a DbEntity class, which specifies an ID property so I can use EF to save them to the DB, like this:
public interface IDbEntity
{
    int ID { get; set; }
}
[Serializable]
public class DbEntity : IDbEntity
{
    [IgnoreDataMember]
    [XmlIgnore]
    public int ID { get; set; }
}
The problem I am facing now is that when the service response is being deserialized I get the error:
Error in line 1 position 113. 'Element' 'ElementName' from namespace '' is not expected. Expecting element '_x003C_ID_x003E_k__BackingField'
I am simply making the call like this:
var response = await client.PostAsXmlAsync<TReq>("Some/API/Call", req);
TResp val = await response.Content.ReadAsAsync<TResp>();
All the properties in the original classes have Order values specified on their DataMember attributes, and I have clearly marked my DB properties to be ignored, but to no avail.
Is there any way I can get this to work, i.e. getting the DataContractSerializer to actually ignore the properties I have marked to be ignored when deserializing?
As an aside, these ignored properties are also being passed to the service when making a call; does IgnoreDataMember actually do anything?
It seems that the way to do this is like this:
public interface IDbEntity
{
    int ID { get; set; }
}
[Serializable]
[DataContract]
public class DbEntity : IDbEntity
{
    [XmlIgnore]
    public int ID { get; set; }
}
So basically: add the DataContract attribute to the class, but omit the DataMember attribute on the members you don't want serialized.
I don't know how I missed that the first time around; it seems it's opt-in rather than opt-out in this instance.
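To make the opt-in behaviour concrete, here is a sketch (the Customer class and its members are invented for illustration): once the type carries [DataContract], only members explicitly marked [DataMember] take part in serialization, so the inherited ID is skipped without needing [IgnoreDataMember].
[Serializable]
[DataContract]
public class Customer : DbEntity
{
    [DataMember(Order = 1)]
    public string Name { get; set; }

    [DataMember(Order = 2)]
    public string Email { get; set; }

    // no [DataMember]: skipped by the DataContractSerializer,
    // but still available for EF to persist
    public DateTime LastFetchedUtc { get; set; }
}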

Should I provide different views on the same REST entity?

I've seen this, which suggests I can build different views based on the user:
different json views for the same entity
However, in ASP.NET Web API one uses a model class; I can't just add new properties willy-nilly.
So, for example I may have uri:
http://host/api/products/id
Returning the model:
public class Product
{
    public string Code { get; set; }
    public string Description { get; set; }
}
But for another purpose I want to add more information, suppose this is expensive because it joins other data to build the model, or formats the data in a very specific way:
http://host/api/productsspecial/id
Returning the model:
public class ProductSpecial
{
    public string Code { get; set; }
    public string Description { get; set; }
    public decimal Price { get; set; } // assume expensive to look up
}
So obviously I have a way to do this: two different controllers returning different views of the data. My question is, is this OK or is there a better way?
Alternatively, could I do something like http://host/api/products/id?includeprice=true and use that to return the alternative model? And is that a good idea?
I would suggest
GET /host/api/products/{id}?fields=code,description,price
You should avoid complicating your resource URL in the manner you describe. Every possible configuration of values would need a new name: "productsReallySpecial", etc.
The problem with ?includePrice=true is that you end up with a parameter for every field you might want to make optional. With a fields parameter, your documentation can simply list the default return fields and the available ones.
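A rough sketch of how a fields parameter could be handled in a Web API controller (the controller layout and the GetProduct/GetPrice helpers are placeholders, not part of the question; the point is a single resource URL where the expensive price lookup only happens when asked for):
public class ProductsController : ApiController
{
    // GET /api/products/{id}?fields=code,description,price
    public IHttpActionResult Get(int id, string fields = "code,description")
    {
        var requested = new HashSet<string>(
            fields.Split(',').Select(f => f.Trim().ToLowerInvariant()));

        Product product = GetProduct(id); // cheap lookup, always performed (placeholder)

        var result = new Dictionary<string, object>();
        if (requested.Contains("code")) result["code"] = product.Code;
        if (requested.Contains("description")) result["description"] = product.Description;
        if (requested.Contains("price")) result["price"] = GetPrice(id); // expensive join, only when requested (placeholder)

        return Ok(result);
    }
}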

How to get EF POCOs from System.Data.Entities.DynamicProxies

My question is the same as this one
However, I don't really see a solution there. Let's say I have a simple model with two POCOs, Country and State.
public class Country
{
    public string Code { get; set; }
    public string Name { get; set; }
}
public class State
{
    public string Code { get; set; }
    public string Name { get; set; }
    public virtual Country Country { get; set; }
}
When I use the repository to .GetStateByCode(myCode), it retrieves a dynamic proxy object. I want to send that over the wire using a WCF service to my client. The dynamic proxy is not a known type, so it fails.
Here are my alternatives. I can set ProxyCreationEnabled to false on the context, and then my .GetStateByCode(myCode) gives me a POCO, which is great. However, the navigation property in the POCO to Country is then null (not great).
Should I new up a state POCO and manually populate and return that from the dynamic proxy that is returned from the repository? Should I try to use AutoMapper to map the dynamic proxy objects to POCOs? Is there something I'm totally missing here?
I think the answer from Ladislav Mrnka is clear, and the warnings still apply, even with the idea below. Be careful what gets picked up. What he didn't include is how, if you want to proceed, to easily get the data from object A to object B, which is really the question at hand.
Sample solution
See the NuGet package ValueInjecter (not the only tool that can do this, but very easy to use).
It allows easy copying of one object to another, especially when they share the same property names and types.
(Remember the lazy-loading/navigation implications.)
So the vanilla option is:
var PocoObject = new Poco();
PocoObject.InjectFrom(DynamicProxy); // copy contents of DynamicProxy to PocoObject
but check the default behaviour and consider a custom rule
var PocoObject = new Poco();
PocoObject.InjectFrom<CopyRule>(DynamicProxy);
public class CopyRule : ConventionInjection
{
    protected override bool Match(ConventionInfo c)
    {
        // return whether this property should be included in the injection process
        bool useProperty = c.SourceProp.Name == "Id"; // just an example
        // or match on the property type instead:
        // useProperty = c.SourceProp.Type == ...
        return useProperty;
    }
}
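Applied to the State/Country example from the question, the mapping might look roughly like this (a sketch using the repository call from the question; it assumes the Country navigation property was loaded before the context was disposed):
// map the EF proxy graph onto plain POCOs before handing it to WCF
var proxyState = repository.GetStateByCode(myCode);

var stateDto = new State();
stateDto.InjectFrom(proxyState);                 // copies Code and Name

stateDto.Country = new Country();
stateDto.Country.InjectFrom(proxyState.Country); // navigation must already be loaded

// stateDto is now a plain POCO graph that WCF can serialize as known types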

Using "custom data types" with entity framework

I'd like to know if it is possible to map some database columns to a custom data type (a custom class) instead of the basic data types like string, int, and so on. I'll try to explain it better with a concrete example:
Let's say I have a table where one column contains (text) data in a special format (e.g. a number followed by a separator character and then some arbitrary string). E.g. the table looks like this:
Table "MyData":
ID |Title(NVARCHAR) |CustomData (NVARCHAR)
---+----------------+-----------------------
1 |Item1 |1:some text
2 |Item2 |333:another text
(Assume I am not allowed to change the database.) In my domain model I'd like to have this table represented by two classes, e.g. something like this:
public class MyData
{
    public int ID { get; set; }
    public string Title { get; set; }
    public CustomData CustomData { get; set; }
}
public class CustomData
{
    public int ID { get; set; }
    public string Text { get; set; }

    public string SerializeToString()
    {
        // returns the string as it is stored in the DB
        return string.Format("{0}:{1}", ID, Text);
    }

    public void DeserializeFromString(string value)
    {
        // sets ID and Text from the stored string, e.g. "1:some text"
        // ...
    }
}
Does entity framework (V4) provide a way to create and use such "custom data types"?
No. Not like that, anyway.
However, you could work around this by:
Writing a DB function to do the mapping and then using a defining query in the SSDL.
Using one type for the EF mapping and another type like the one you show above, and then projecting between them.
Adding extension properties to your EF type to do the translation. You can't use these in LINQ to Entities, but it may be convenient in other code; see the sketch below.
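A sketch of the last option (the property names here are assumptions, not part of the question): keep the mapped EF property as a plain string holding the raw "1:some text" value, and add an unmapped wrapper property in a partial class that converts to and from CustomData:
public partial class MyData
{
    // CustomDataRaw is assumed to be the string property EF maps to the CustomData column.
    // This wrapper is not mapped by EF and cannot be used in LINQ to Entities queries.
    public CustomData CustomDataValue
    {
        get
        {
            var custom = new CustomData();
            custom.DeserializeFromString(CustomDataRaw);
            return custom;
        }
        set
        {
            CustomDataRaw = value.SerializeToString();
        }
    }
}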