IComparable sorts in unexpected order - .NET 2.0

I'm sorry but I think I'll have to stick in a lot of code into my question. The good news is though, if you have the time, you can just copy this into a Console Application and execute it so you can see the issue with the results.
I have been given a list (and yes, the list in the code below actually is the list!). Essentially, I will be given List<string, string> which I will call List<ColLeft, ColRight> just for clarity.
ColLeft is already grouped and must remain grouped.
ColRight is not alphabetical and needs to be within its group.
I am on .NET 2.0 and as such I've implemented IComparable<T>. However, the list is returned out of order and I can't understand why (the same issue persists whether run under VS2005 or VS2010).
using System;
using System.Collections.Generic;
using System.Diagnostics;
namespace SortingLists
{
class Program
{
static void Main()
{
List<ListContents> listContents = ListContents.GetListContents();
WriteOut(listContents);
Console.WriteLine("\r\n");
listContents.Sort();
WriteOut(listContents);
Console.ReadKey();
}
private static void WriteOut(List<ListContents> listContents)
{
foreach (ListContents content in listContents)
Console.WriteLine(content.ColLeft + " --- " + content.ColRight);
}
}
struct ListContents : IComparable<ListContents>
{
#region Constructor
public ListContents(string l, string r)
{
this.ColLeft = l;
this.ColRight = r;
}
#endregion
#region Fields
public string ColLeft;
public string ColRight;
#endregion
#region IComparable<ListContents> Members
public int CompareTo(ListContents other)
{
if (this.ColLeft.CompareTo(other.ColLeft) == -1)
return this.ColLeft.CompareTo(other.ColLeft);
else
return this.ColRight.CompareTo(other.ColRight);
}
#endregion
#region Methods
public static List<ListContents> GetListContents()
{
List<ListContents> lcList = new List<ListContents>();
lcList.Add(new ListContents("UFT", "a"));
lcList.Add(new ListContents("UFT", "c"));
lcList.Add(new ListContents("UFT", "b"));
lcList.Add(new ListContents("RT", "f"));
lcList.Add(new ListContents("RT", "e"));
lcList.Add(new ListContents("RT", "d"));
lcList.Add(new ListContents("UT", "m"));
lcList.Add(new ListContents("UT", "o"));
lcList.Add(new ListContents("UT", "n"));
return lcList;
}
}
}
I can solve it though - if I change the order of GetListContents() to something like...
public static List<ListContents> GetListContents()
{
List<ListContents> lcList = new List<ListContents>();
lcList.Add(new ListContents("UFT", "a"));
lcList.Add(new ListContents("UFT", "c"));
lcList.Add(new ListContents("UFT", "b"));
lcList.Add(new ListContents("RT", "e"));
lcList.Add(new ListContents("RT", "f"));//Moved this item
lcList.Add(new ListContents("RT", "d"));
lcList.Add(new ListContents("UT", "m"));
lcList.Add(new ListContents("UT", "o"));
lcList.Add(new ListContents("UT", "n"));
return lcList;
}
...Then the results come out as desired. Obviously this isn't a fix, as I can't predict what order the list will come in; the only constant is that ColLeft is grouped.
Can anyone help me understand why this behaviour occurs?

The list is returned out of order because you have not preserved the original order anywhere. In LINQ (.NET 3.5 and later) you could do it as follows:
list.GroupBy(x => x.ColLeft)
    .Select(g => g.OrderBy(x => x.ColRight))
    .SelectMany(x => x)
    .ToList();
In .NET 2.0 you'd have to do something similar by hand; i.e. first group by ColLeft (keeping the groups in their original order), then sort each group by ColRight, then concatenate the groups.
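A minimal .NET 2.0 sketch of that approach (the SortWithinGroups helper is hypothetical and assumes the ListContents struct from the question; groups keep their first-seen order and only ColRight is sorted within each group):
static List<ListContents> SortWithinGroups(List<ListContents> input)
{
    // Remember the order in which each ColLeft group first appears.
    List<string> groupOrder = new List<string>();
    Dictionary<string, List<ListContents>> groups = new Dictionary<string, List<ListContents>>();

    foreach (ListContents item in input)
    {
        List<ListContents> group;
        if (!groups.TryGetValue(item.ColLeft, out group))
        {
            group = new List<ListContents>();
            groups.Add(item.ColLeft, group);
            groupOrder.Add(item.ColLeft);
        }
        group.Add(item);
    }

    // Sort each group by ColRight only, then concatenate in the original group order.
    List<ListContents> result = new List<ListContents>(input.Count);
    foreach (string key in groupOrder)
    {
        groups[key].Sort(delegate(ListContents a, ListContents b)
        {
            return a.ColRight.CompareTo(b.ColRight);
        });
        result.AddRange(groups[key]);
    }
    return result;
}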
For example, if you control the ListContents struct, you could add a field or property int Index that represents the group order you want to preserve (0 for the first group, 1 for the second group, and so on). Then sort using a comparer that compares first by Index, then by ColRight:
int index = 0;
for (int i = 0; i < list.Count; i++)
{
    if (i > 0 && list[i].ColLeft != list[i - 1].ColLeft)
        index++;

    // ListContents is a struct, so copy the element, set the index, and write it back.
    ListContents item = list[i];
    item.Index = index;
    list[i] = item;
}
...
public int CompareTo(ListContents other)
{
int result = this.Index.CompareTo(other.Index);
if (result != 0) return result;
return this.ColRight.CompareTo(other.ColRight);
}
If you don't want to modify the ListContents struct, you could do something similar by wrapping each item together with its group index before sorting (Tuple<,> isn't available in .NET 2.0, but KeyValuePair<int, ListContents> works just as well).
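A sketch of that wrapping idea (again hypothetical, .NET 2.0 only, reusing the group-index logic from above):
static void SortGroupedList(List<ListContents> list)
{
    // Pair each item with its group index, sort the pairs, then unwrap them in place.
    List<KeyValuePair<int, ListContents>> wrapped = new List<KeyValuePair<int, ListContents>>(list.Count);

    int index = 0;
    for (int i = 0; i < list.Count; i++)
    {
        if (i > 0 && list[i].ColLeft != list[i - 1].ColLeft)
            index++;
        wrapped.Add(new KeyValuePair<int, ListContents>(index, list[i]));
    }

    wrapped.Sort(delegate(KeyValuePair<int, ListContents> a, KeyValuePair<int, ListContents> b)
    {
        int result = a.Key.CompareTo(b.Key);
        if (result != 0)
            return result;
        return a.Value.ColRight.CompareTo(b.Value.ColRight);
    });

    for (int i = 0; i < list.Count; i++)
        list[i] = wrapped[i].Value;
}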

Related

Get dependent ids when querying principal

I'm trying to get just the ids of the dependents every time the principal is queried.
My initial thought was to add it somehow in the OnModelCreating definitions; however, that appears to be limited to filtering down larger sets of data, unless I'm missing something.
Something like this:
builder.Entity<ListingModel>()
.AlsoDoThis(
x => x.MenuIds.AddRange(
Menus.Where(y => y.ListingId == x.Id).Select(y => y.Id).ToList()
)
);
There is a need to not do this in code for each individual place I have a Select, since that functionality is normalized in some base classes. The base classes have a <TModel> passed in and don't inherently know what properties need to be handled this way.
I do have a workaround where I'm grabbing everything with an AutoInclude(), then filtering it out in the model definition with a custom getter/setter that returns a list of ids. But rather than being more performant (grabbing the related FK ids at the DB level), it transfers all of that data to the application and then programmatically selects a list of ids, as far as I understand it.
private List<int> _topicsIds = new();
[NotMapped]
public List<int> TopicsIds
{
get { return Topics.Count > 0 ? Topics.Select(x => x.Id).ToList() : _topicsIds; }
set { _topicsIds = value; }
}
public List<TopicModel> Topics { get; set; } = new();
"Extra SQL that gets called with every select in a context" is (to my limited knowledge) almost what HasQueryFilter does, with a just slightly broader operation. I think this is the approach I'm looking for, just selecting more stuff instead of filtering stuff out.
You can project everything via Select
var result = ctx.ListingModels
.Select(lm => new // or to DTO
{
Id = lm.Id,
OtherProperty = lm.OtherProperty,
Ids = lm.Menus.Select(m => m.Id).ToList()
})
.ToList();
To make a more general solution, we can use annotations to define how such entities should be projected.
During model definition:
builder.Entity<ListingModel>()
.WithProjection(
x => x.MenuIds,
x => x.Menus.Where(y => y.ListingId == x.Id).Select(y => y.Id).ToList()
);
Then usage in common code:
public virtual List<TModel> GetList(List<int> ids)
{
var list = _context.Set<TModel>().Where(x => ids.Any(id => id == x.Id))
.ApplyCustomProjection(_context)
.ToList();
return list;
}
ApplyCustomProjection(_context) will find the previously defined annotation and apply the custom projection.
And the extension implementation:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Linq.Expressions;
using System.Reflection;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;
using Microsoft.EntityFrameworkCore.Query;
public static class ProjectionExtensions
{
public const string CustomProjectionAnnotation = "custom:member_projection";
public class ProjectionInfo
{
public ProjectionInfo(MemberInfo member, LambdaExpression expression)
{
Member = member;
Expression = expression;
}
public MemberInfo Member { get; }
public LambdaExpression Expression { get; }
}
public static bool IsUnderDotnetTool { get; }
= Process.GetCurrentProcess().ProcessName == "dotnet";
public static EntityTypeBuilder<TEntity> WithProjection<TEntity, TValue>(
this EntityTypeBuilder<TEntity> entity,
Expression<Func<TEntity, TValue>> propExpression,
Expression<Func<TEntity, TValue>> assignmentExpression)
where TEntity : class
{
// avoid registering non serializable annotations during migrations update
if (IsUnderDotnetTool)
return entity;
var annotation = entity.Metadata.FindAnnotation(CustomProjectionAnnotation);
var projections = annotation?.Value as List<ProjectionInfo> ?? new List<ProjectionInfo>();
if (propExpression.Body is not MemberExpression memberExpression)
throw new InvalidOperationException($"'{propExpression.Body}' is not member expression");
if (memberExpression.Expression is not ParameterExpression)
throw new InvalidOperationException($"'{memberExpression.Expression}' is not parameter expression. Only single nesting is allowed");
// removing duplicate
projections.RemoveAll(p => p.Member == memberExpression.Member);
projections.Add(new ProjectionInfo(memberExpression.Member, assignmentExpression));
return entity.HasAnnotation(CustomProjectionAnnotation, projections);
}
public static IQueryable<TEntity> ApplyCustomProjection<TEntity>(this IQueryable<TEntity> query, DbContext context)
where TEntity : class
{
var et = context.Model.FindEntityType(typeof(TEntity));
var projections = et?.FindAnnotation(CustomProjectionAnnotation)?.Value as List<ProjectionInfo>;
// nothing to do
if (projections == null || et == null)
return query;
var propertiesForProjection = et.GetProperties().Where(p =>
p.PropertyInfo != null && projections.All(pr => pr.Member != p.PropertyInfo))
.ToList();
var entityParam = Expression.Parameter(typeof(TEntity), "e");
var memberBinding = new MemberBinding[propertiesForProjection.Count + projections.Count];
for (int i = 0; i < propertiesForProjection.Count; i++)
{
var propertyInfo = propertiesForProjection[i].PropertyInfo!;
memberBinding[i] = Expression.Bind(propertyInfo, Expression.MakeMemberAccess(entityParam, propertyInfo));
}
for (int i = 0; i < projections.Count; i++)
{
var projection = projections[i];
var expression = projection.Expression.Body;
var assignExpression = ReplacingExpressionVisitor.Replace(projection.Expression.Parameters[0], entityParam, expression);
memberBinding[propertiesForProjection.Count + i] = Expression.Bind(projection.Member, assignExpression);
}
var memberInit = Expression.MemberInit(Expression.New(typeof(TEntity)), memberBinding);
var selectLambda = Expression.Lambda<Func<TEntity, TEntity>>(memberInit, entityParam);
var newQuery = query.Select(selectLambda);
return newQuery;
}
}
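For reference, the Select expression that ApplyCustomProjection builds is roughly what you would write by hand for a single entity type (a sketch using the hypothetical ListingModel from the question; Name stands in for whatever scalar properties the entity really has):
public virtual List<ListingModel> GetListingsWithMenuIds(List<int> ids)
{
    return _context.Set<ListingModel>()
        .Where(x => ids.Contains(x.Id))
        .Select(e => new ListingModel
        {
            Id = e.Id,
            Name = e.Name, // every mapped scalar property is copied like this
            MenuIds = e.Menus.Where(y => y.ListingId == e.Id).Select(y => y.Id).ToList()
        })
        .ToList();
}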

How can I use an extended entity to create a new property in my EF6 class with property changed notification?

I have a table in my entity model called prices. It has several fields named value0, value1, value2, value3, value4... (these are their literal names, sigh..). I cannot rename them or in any way change them.
What I would like is to use an extended entity to create a new property called values. This would be a collection containing value1, value2 etc...
To get access to the values I would then simply need to write prices.values[1]
I need property changed notification for this.
So far I have tried this;
public partial class Prices
{
private ObservableCollection<double?> values = null;
public ObservableCollection<double?> Values
{
get
{
if (values != null)
values.CollectionChanged -= values_CollectionChanged;
else
values = new ObservableCollection<double?>(new double?[14]);
values[0] = value0;
values[1] = value1;
values[2] = value2;
values.CollectionChanged += values_CollectionChanged;
return values;
}
private set
{
value0 = value[0];
value1 = value[1];
value2 = value[2];
}
}
private void values_CollectionChanged(object sender, NotifyCollectionChangedEventArgs e)
{
Values = values;
}
}
The issue comes when trying to set values. If I try to set a value by writing
prices.Values[0] = someValue;
the new value is not always reflected in the collection (i.e. when I have previously set a value and then try to overwrite it).
I am willing to try any approach that would achieve my goal, I am not precious about having my solution fixed (although if anyone can explain what I'm missing that would be great!)
You could implement an indexer on the Prices class without using a collection.
You can use a switch to select the property to read or write, or you can use reflection.
In this case I use reflection.
public double? this[int index]
{
get
{
if (index < 0 || index > 13) throw new ArgumentOutOfRangeException("index");
string propertyName = "Value" + index;
return (double?)GetType().GetProperty(propertyName).GetValue(this);
}
set
{
if (index < 0 || index > 13) throw new ArgumentOutOfRangeException("index");
string propertyName = "Value" + index;
GetType().GetProperty(propertyName).SetValue(this, value);
// Raise your event here
}
}
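To get the property changed notification the question asks for, the partial class can implement INotifyPropertyChanged itself and raise the event from the indexer's setter. A sketch (it assumes the generated Prices class doesn't already implement the interface, and that the generated properties really are named Value0 through Value13):
using System;
using System.ComponentModel;

public partial class Prices : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    public double? this[int index]
    {
        get
        {
            if (index < 0 || index > 13) throw new ArgumentOutOfRangeException("index");
            return (double?)GetType().GetProperty("Value" + index).GetValue(this);
        }
        set
        {
            if (index < 0 || index > 13) throw new ArgumentOutOfRangeException("index");
            GetType().GetProperty("Value" + index).SetValue(this, value);

            // Notify listeners about the underlying property and about the indexer itself
            // ("Item[]" is the name WPF bindings use for indexers).
            PropertyChangedEventHandler handler = PropertyChanged;
            if (handler != null)
            {
                handler(this, new PropertyChangedEventArgs("Value" + index));
                handler(this, new PropertyChangedEventArgs("Item[]"));
            }
        }
    }
}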

Querydsl Path Depth

I have an entity Document with a list of DocumentValue:
@QueryEntity
@Document
public class Document{
private List<DocumentValue> documentValues;
}
DocumentValue can also have a list of DocumentValue:
@QueryEntity
public class DocumentValue {
String value;
String name;
String id;
List<DocumentValue> documentValues;
}
I am now trying to do something like
private QDocumentValue getDocumentValuePathByDepth(Integer depth) {
ListPath path = QDocument.document.documentValues;
if (depth != null) {
for (int i = 0; i < depth; i++) {
path = path.documentValues.any();
}
}
}
Does anybody know if it's possible to do an elemMatch at that depth?
Like
ListPath<QDocumentValue> query = getDocumentValuePathByDepth(5);
return query.fieldId.eq(documentFilter.getFieldId()).and(query.value.between(from, to));
One element of documentValues at that depth should fulfill both conditions.
BR D.C.
elemMatch is supported in Querydsl MongoDB like this:
QDocumentValue documentValue = QDocumentValue.documentValue;
query.anyEmbedded(document.documentValues, documentValue)
.on(documentValue.id.eq(documentFilter.getFieldId()),
documentValue.value.between(from, to));

EF Code-first: Increment a non-autoincremented field manually

I'm using an existing database from our ERP.
In all my database tables there is a float field called "r_e_c_n_o_", but this field is not auto-incremented by the database and I can't change it.
For all added entities I would like to increment this field "r_e_c_n_o_". How could I accomplish that in DbContext's SaveChanges() method?
Using ADO.NET I'd do something like this:
public static int GetNext(string tableName, string fieldName)
{
var cmd = _conn.CreateCommand(string.Format("SELECT MAX({0}) + 1 FROM {1}", fieldName, tableName));
var result = (int)cmd.ExecuteScalar();
return result;
}
UPDATE:
Please take a look at the comment in the code below; it's just what I need to solve my problem:
public override int SaveChanges()
{
var entries = this.ChangeTracker.Entries();
Dictionary<string, int> lastRecnos = new Dictionary<string, int>();
foreach (var entry in entries)
{
var typeName = entry.Entity.GetType().Name;
if (lastRecnos.ContainsKey(typeName))
lastRecnos[typeName]++;
else
lastRecnos[typeName] = 0; // How can I get the max here?
int nextRecnoForThisEntity = lastRecnos[typeName];
var entity = entry.Entity as EntityBase;
entity.Recno = nextRecnoForThisEntity;
}
return base.SaveChanges();
}
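One possible way to seed that starting value is a helper on the same DbContext (a sketch only, not tested; it uses EF6's Database.SqlQuery, and how the physical table name is resolved for each entity type is left as a placeholder since the ERP's naming isn't shown):
private int GetCurrentMaxRecno(string tableName)
{
    // r_e_c_n_o_ is a float column, so read it as double and convert;
    // an empty table returns NULL, which arrives here as null.
    double? max = this.Database
        .SqlQuery<double?>(string.Format("SELECT MAX(r_e_c_n_o_) FROM {0}", tableName))
        .FirstOrDefault();
    return max.HasValue ? (int)max.Value : 0;
}
The lastRecnos[typeName] = 0; line would then become something like lastRecnos[typeName] = GetCurrentMaxRecno(tableNameFor(typeName)) + 1;, where tableNameFor is whatever lookup maps an entity type to its ERP table.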
Tks,
William

Serializing Entity Framework problems

Like several other people, I'm having problems serializing Entity Framework objects, so that I can send the data over AJAX in a JSON format.
I've got the following server-side method, which I'm attempting to call using AJAX through jQuery
[WebMethod]
public static IEnumerable<Message> GetAllMessages(int officerId)
{
SIBSv2Entities db = new SIBSv2Entities();
return (from m in db.MessageRecipients
where m.OfficerId == officerId
select m.Message).AsEnumerable<Message>();
}
Calling this via AJAX results in this error:
A circular reference was detected while serializing an object of type 'System.Data.Metadata.Edm.AssociationType'
This is because of the way the Entity Framework creates circular references to keep all the objects related and accessible server-side.
I came across the following code from http://hellowebapps.com/2010-09-26/producing-json-from-entity-framework-4-0-generated-classes/ which claims to get around this problem by capping the maximum depth for references. I've added the code below, because I had to tweak it slightly to get it to work (all angle brackets are missing from the code on the website).
using System.Web.Script.Serialization;
using System.Collections.Generic;
using System.Collections;
using System.Linq;
using System;
public class EFObjectConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 2;
private readonly List<int> _processedObjects = new List<int>();
private readonly Type[] _builtInTypes = new[]{
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(Guid)
};
public EFObjectConverter( int maxDepth = 2,
EFObjectConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize( IDictionary<string,object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string,object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
Type type = obj.GetType();
var properties = from p in type.GetProperties()
                 where p.CanRead &&
                       p.CanWrite &&
                       _builtInTypes.Contains(p.PropertyType)
                 select p;
var result = properties.ToDictionary(
property => property.Name,
property => (Object)(property.GetValue(obj, null)
== null
? ""
: property.GetValue(obj, null).ToString().Trim())
);
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
foreach (var property in complexProperties)
{
var js = new JavaScriptSerializer();
js.RegisterConverters(new List<JavaScriptConverter> { new EFObjectConverter(_maxDepth - _currentDepth, this) });
result.Add(property.Name, js.Serialize(property.GetValue(obj, null)));
}
}
return result;
}
public override IEnumerable<System.Type> SupportedTypes
{
get
{
return GetType().Assembly.GetTypes();
}
}
}
However even when using that code, in the following way:
var js = new System.Web.Script.Serialization.JavaScriptSerializer();
js.RegisterConverters(new List<System.Web.Script.Serialization.JavaScriptConverter> { new EFObjectConverter(2) });
return js.Serialize(messages);
I'm still seeing the A circular reference was detected... exception being thrown!
I solved these issues with the following classes:
public class EFJavaScriptSerializer : JavaScriptSerializer
{
public EFJavaScriptSerializer()
{
RegisterConverters(new List<JavaScriptConverter>{new EFJavaScriptConverter()});
}
}
and
public class EFJavaScriptConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 1;
private readonly List<object> _processedObjects = new List<object>();
private readonly Type[] _builtInTypes = new[]
{
typeof(int?),
typeof(double?),
typeof(bool?),
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(DateTime?),
typeof(Guid)
};
public EFJavaScriptConverter() : this(1, null) { }
public EFJavaScriptConverter(int maxDepth = 1, EFJavaScriptConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize(IDictionary<string, object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string, object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
var type = obj.GetType();
var properties = from p in type.GetProperties()
where p.CanRead && p.GetIndexParameters().Count() == 0 &&
_builtInTypes.Contains(p.PropertyType)
select p;
var result = properties.ToDictionary(
p => p.Name,
p => (Object)TryGetStringValue(p, obj));
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanRead &&
p.GetIndexParameters().Count() == 0 &&
!_builtInTypes.Contains(p.PropertyType) &&
p.Name != "RelationshipManager" &&
!AllreadyAdded(p, obj)
select p;
foreach (var property in complexProperties)
{
var complexValue = TryGetValue(property, obj);
if(complexValue != null)
{
var js = new EFJavaScriptConverter(_maxDepth - _currentDepth, this);
result.Add(property.Name, js.Serialize(complexValue, new EFJavaScriptSerializer()));
}
}
}
return result;
}
private bool AllreadyAdded(PropertyInfo p, object obj)
{
var val = TryGetValue(p, obj);
return _processedObjects.Contains(val == null ? 0 : val.GetHashCode());
}
private static object TryGetValue(PropertyInfo p, object obj)
{
var parameters = p.GetIndexParameters();
if (parameters.Length == 0)
{
return p.GetValue(obj, null);
}
else
{
//cant serialize these
return null;
}
}
private static object TryGetStringValue(PropertyInfo p, object obj)
{
if (p.GetIndexParameters().Length == 0)
{
var val = p.GetValue(obj, null);
return val;
}
else
{
return string.Empty;
}
}
public override IEnumerable<Type> SupportedTypes
{
get
{
var types = new List<Type>();
//ef types
types.AddRange(Assembly.GetAssembly(typeof(DbContext)).GetTypes());
//model types
types.AddRange(Assembly.GetAssembly(typeof(BaseViewModel)).GetTypes());
return types;
}
}
}
You can now safely make a call like new EFJavaScriptSerializer().Serialize(obj).
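For example, the page method from the question could hand back the JSON string itself (a sketch reusing the question's SIBSv2Entities model; the client then parses the returned string):
[WebMethod]
public static string GetAllMessagesJson(int officerId)
{
    SIBSv2Entities db = new SIBSv2Entities();
    var messages = (from m in db.MessageRecipients
                    where m.OfficerId == officerId
                    select m.Message).AsEnumerable();

    // EFJavaScriptSerializer registers EFJavaScriptConverter in its constructor,
    // so navigation-property cycles are cut off instead of throwing.
    return new EFJavaScriptSerializer().Serialize(messages);
}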
Update: since Telerik v1.3+ you can now override the GridActionAttribute.CreateActionResult method, and hence you can easily integrate this serializer into specific controller methods by applying your custom [GridAction] attribute:
[Grid]
public ActionResult _GetOrders(int id)
{
return new GridModel(Service.GetOrders(id));
}
and
public class GridAttribute : GridActionAttribute, IActionFilter
{
/// <summary>
/// Determines the depth that the serializer will traverse
/// </summary>
public int SerializationDepth { get; set; }
/// <summary>
/// Initializes a new instance of the <see cref="GridAttribute"/> class.
/// </summary>
public GridAttribute()
: base()
{
ActionParameterName = "command";
SerializationDepth = 1;
}
protected override ActionResult CreateActionResult(object model)
{
return new EFJsonResult
{
Data = model,
JsonRequestBehavior = JsonRequestBehavior.AllowGet,
MaxSerializationDepth = SerializationDepth
};
}
}
and finally..
public class EFJsonResult : JsonResult
{
{
const string JsonRequest_GetNotAllowed = "This request has been blocked because sensitive information could be disclosed to third party web sites when this is used in a GET request. To allow GET requests, set JsonRequestBehavior to AllowGet.";
public EFJsonResult()
{
MaxJsonLength = 1024000000;
RecursionLimit = 10;
MaxSerializationDepth = 1;
}
public int MaxJsonLength { get; set; }
public int RecursionLimit { get; set; }
public int MaxSerializationDepth { get; set; }
public override void ExecuteResult(ControllerContext context)
{
if (context == null)
{
throw new ArgumentNullException("context");
}
if (JsonRequestBehavior == JsonRequestBehavior.DenyGet &&
String.Equals(context.HttpContext.Request.HttpMethod, "GET", StringComparison.OrdinalIgnoreCase))
{
throw new InvalidOperationException(JsonRequest_GetNotAllowed);
}
var response = context.HttpContext.Response;
if (!String.IsNullOrEmpty(ContentType))
{
response.ContentType = ContentType;
}
else
{
response.ContentType = "application/json";
}
if (ContentEncoding != null)
{
response.ContentEncoding = ContentEncoding;
}
if (Data != null)
{
var serializer = new JavaScriptSerializer
{
MaxJsonLength = MaxJsonLength,
RecursionLimit = RecursionLimit
};
serializer.RegisterConverters(new List<JavaScriptConverter> { new EFJavaScriptConverter(MaxSerializationDepth) });
response.Write(serializer.Serialize(Data));
}
}
}
You can also detach the object from the context, which removes the navigation properties so that it can be serialized. For my data repository classes that are used with JSON, I use something like this.
public DataModel.Page GetPage(Guid idPage, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idPage == idPage
select p;
if (results.Count() == 0)
return null;
else
{
var result = results.First();
if (detach)
DataContext.Detach(result);
return result;
}
}
By default the returned object will have all of the complex/navigation properties, but by setting detach = true it will remove those properties and return the base object only. For a list of objects the implementation looks like this
public List<DataModel.Page> GetPageList(Guid idSite, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idSite == idSite
select p;
if (results.Count() > 0)
{
if (detach)
{
List<DataModel.Page> retValue = new List<DataModel.Page>();
foreach (var result in results)
{
DataContext.Detach(result);
retValue.Add(result);
}
return retValue;
}
else
return results.ToList();
}
else
return new List<DataModel.Page>();
}
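Usage is then straightforward (sketch; PageRepository is a hypothetical wrapper around the two methods above):
var repository = new PageRepository();
List<DataModel.Page> pages = repository.GetPageList(idSite, detach: true);

// Detached entities no longer carry their navigation properties, so they serialize cleanly.
string json = new JavaScriptSerializer().Serialize(pages);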
I have just successfully tested this code.
It may be that in your case your Message object is in a different assembly? The overridden SupportedTypes property returns only the types in its own assembly, so when Serialize is called the JavaScriptSerializer falls back to the standard JavaScriptConverter.
You should be able to verify this by debugging.
Your error occurred due to some "Reference" classes generated by EF for entities with 1:1 relations, which the JavaScriptSerializer failed to serialize.
I've used a workaround by adding a new condition:
!p.Name.EndsWith("Reference")
The code to get the complex properties looks like this:
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!p.Name.EndsWith("Reference") &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
Hope this helps you.
I had a similar problem with pushing my view via Ajax to UI components.
I also found and tried to use that code sample you provided. Some problems I had with that code:
SupportedTypes wasn't grabbing the types I needed, so the converter wasn't being called
If the maximum depth is hit, the serialization would be truncated
It threw out any other converters I had on the existing serializer by creating its own new JavaScriptSerializer
Here are the fixes I implemented for those issues:
Reusing the same serializer
I simply reused the existing serializer that is passed into Serialize to solve this problem. This broke the depth hack though.
Truncating on already-visited, rather than on depth
Instead of truncating on depth, I created a HashSet<object> of already-seen instances (with a custom IEqualityComparer that checked reference equality). I simply didn't recurse if I found an instance I'd already seen. This is the same detection mechanism built into the JavaScriptSerializer itself, so it worked quite well.
The only problem with this solution is that the serialization output isn't very deterministic. The order of truncation is strongly dependent on the order in which reflection finds the properties. You could solve this (with a perf hit) by sorting before recursing.
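A sketch of that already-seen bookkeeping (the comparer name is mine, not from the original code; it needs System.Runtime.CompilerServices for RuntimeHelpers.GetHashCode):
// "Seen" means "same instance", not "Equals() returns true".
class ReferenceComparer : IEqualityComparer<object>
{
    public new bool Equals(object x, object y) { return ReferenceEquals(x, y); }
    public int GetHashCode(object obj) { return RuntimeHelpers.GetHashCode(obj); }
}

// Inside the converter:
private readonly HashSet<object> _seen = new HashSet<object>(new ReferenceComparer());

// ...and before recursing into a complex property value:
// if (!_seen.Add(value)) { /* already visited, skip it */ }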
SupportedTypes needed the right types
My JavaScriptConverter couldn't live in the same assembly as my model. If you plan to reuse this converter code, you'll probably run into the same problem.
To solve this I had to pre-traverse the object tree, keeping a HashSet<Type> of already seen types (to avoid my own infinite recursion), and pass that to the JavaScriptConverter before registering it.
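A rough sketch of that pre-traversal (the helper and its name are illustrative, not the original code):
// Recursively collect every type reachable through public properties, so the
// converter's SupportedTypes can return exactly the types it must handle.
static void CollectTypes(Type type, HashSet<Type> seen)
{
    // Stop at primitives/strings and at anything already visited.
    if (type == null || type.IsPrimitive || type == typeof(string) || !seen.Add(type))
        return;

    foreach (var property in type.GetProperties())
    {
        var propertyType = property.PropertyType;

        // Unwrap generic collections (e.g. ICollection<Message>) to their element types.
        if (propertyType.IsGenericType)
        {
            foreach (var argument in propertyType.GetGenericArguments())
                CollectTypes(argument, seen);
        }

        CollectTypes(propertyType, seen);
    }
}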
Looking back on my solution, I would now use code generation templates to create a list of the entity types. This would be much more foolproof (it uses simple iteration), and have much better perf since it would produce a list at compile time. I'd still pass this to the converter so it could be reused between models.
My final solution
I threw out that code and tried again :)
I simply wrote code to project onto new types ("ViewModel" types - in your case, it would be service contract types) before doing my serialization. This made the intention of my code more explicit, allowed me to serialize just the data I wanted, and removed the potential of accidentally slipping in extra queries (e.g. serializing my whole DB).
My types were fairly simple, and I didn't need most of them for my view. I might look into AutoMapper to do some of this projection in the future.
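In the terms of the original question, that projection approach would look something like this (MessageViewModel and its properties are hypothetical; pick only the fields the client actually needs):
public class MessageViewModel
{
    public int Id { get; set; }
    public string Subject { get; set; }
    public DateTime Sent { get; set; }
}

[WebMethod]
public static List<MessageViewModel> GetAllMessageViewModels(int officerId)
{
    using (var db = new SIBSv2Entities())
    {
        return (from m in db.MessageRecipients
                where m.OfficerId == officerId
                select new MessageViewModel
                {
                    Id = m.Message.Id,          // hypothetical Message properties
                    Subject = m.Message.Subject,
                    Sent = m.Message.Sent
                }).ToList();
    }
}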