Entity Framework + OData: side-stepping the pagination

The project I'm working on has Entity Framework on top of an OData layer. The OData layer has its server-side pagination set to a value of 75. My reading on the subject leads me to believe that this pagination value is applied across the board, rather than on a per-table basis. The table I'm currently looking to extract all the data from has, of course, more than 75 rows. Using the Entity Framework, my code is simply this:
public IQueryable<ProductColor> GetProductColors()
{
return db.ProductColors;
}
where db is the entity context. This returns only the first 75 records. I read that I could append a query option, inlinecount, set to allpages, giving me the following code:
public IQueryable<ProductColor> GetProductColors()
{
return db.ProductColors.AddQueryOption("inlinecount","allpages");
}
However, this too returns 75 rows!
Can anyone shed light on how to truly get all the records regardless of the OData server-side pagination stuff?
important: I cannot remove the pagination or turn it off! It's extremely valuable in other scenarios where performance is a concern.
Update:
Through some more searching I've found an MSDN article that describes how to do this task.
I'd love to be able to turn it into a fully generic method, but this was as close as I could get to generic without using reflection:
public IQueryable<T> TakeAll<T>(QueryOperationResponse<T> qor)
{
    var collection = new List<T>();
    DataServiceQueryContinuation<T> next = null;
    QueryOperationResponse<T> response = qor;
    do
    {
        if (next != null)
        {
            response = db.Execute<T>(next) as QueryOperationResponse<T>;
        }
        foreach (var elem in response)
        {
            collection.Add(elem);
        }
    } while ((next = response.GetContinuation()) != null);
    return collection.AsQueryable();
}
calling it like:
public IQueryable<ProductColor> GetProductColors()
{
    QueryOperationResponse<ProductColor> response = db.ProductColors.Execute() as QueryOperationResponse<ProductColor>;
    var productColors = this.TakeAll<ProductColor>(response);
    return productColors.AsQueryable();
}

If you can't turn off paging, you'll always receive 75 rows per call. You can get all the rows in the following ways:
Add another operation, IQueryable<ProductColor> AllProductColors (see the sketch after the configuration below), and modify InitializeService:
public static void InitializeService(DataServiceConfiguration config)
{
    config.UseVerboseErrors = true;
    config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    config.SetEntitySetPageSize("ProductColors", 75); // note: only the entity sets listed here are paged
    config.SetServiceOperationAccessRule("*", ServiceOperationRights.AllRead);
    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
}
Then call ProductColors as many times as needed, following the continuations; for example:
var cat = new NetflixCatalog(new Uri("http://odata.netflix.com/v1/Catalog/"));
var x = from t in cat.Titles
        where t.ReleaseYear == 2009
        select t;
var response = (QueryOperationResponse<Title>)((DataServiceQuery<Title>)x).Execute();
while (true)
{
    foreach (Title title in response)
    {
        Console.WriteLine(title.Name);
    }
    var continuation = response.GetContinuation();
    if (continuation == null)
    {
        break;
    }
    response = cat.Execute(continuation);
}
I use Rx with the following code:
public sealed class DataSequence<TEntry> : IObservable<TEntry>
{
private readonly DataServiceContext context;
private readonly Logger logger = LogManager.GetCurrentClassLogger();
private readonly IQueryable<TEntry> query;
public DataSequence(IQueryable<TEntry> query, DataServiceContext context)
{
this.query = query;
this.context = context;
}
public IDisposable Subscribe(IObserver<TEntry> observer)
{
QueryOperationResponse<TEntry> response;
try
{
response = (QueryOperationResponse<TEntry>)((DataServiceQuery<TEntry>)query).Execute();
if (response == null)
{
return Disposable.Empty;
}
}
catch (Exception ex)
{
logger.Error(ex);
return Disposable.Empty;
}
var initialState = new State
{
CanContinue = true,
Response = response
};
IObservable<TEntry> sequence = Observable.Generate(
initialState,
state => state.CanContinue,
MoveToNextState,
GetCurrentValue,
Scheduler.ThreadPool).Merge();
return new CompositeDisposable(initialState, sequence.Subscribe(observer));
}
private static IObservable<TEntry> GetCurrentValue(State state)
{
if (state.Response == null)
{
return Observable.Empty<TEntry>();
}
return state.Response.ToObservable();
}
private State MoveToNextState(State state)
{
DataServiceQueryContinuation<TEntry> continuation = state.Response.GetContinuation();
if (continuation == null)
{
state.CanContinue = false;
return state;
}
QueryOperationResponse<TEntry> response;
try
{
response = context.Execute(continuation);
}
catch (Exception)
{
state.CanContinue = false;
return state;
}
state.Response = response;
return state;
}
private sealed class State : IDisposable
{
public bool CanContinue { get; set; }
public QueryOperationResponse<TEntry> Response { get; set; }
public void Dispose()
{
CanContinue = false;
}
}
}
So, to get any data through OData, create a sequence and Rx does the rest:
var sequence = new DataSequence<Product>(context.Products, context);
sequence.OnErrorResumeNext(Observable.Empty<Product>())
.ObserveOnDispatcher().SubscribeOn(Scheduler.NewThread).Subscribe(AddProduct, logger.Error);

The page size is set by the service author and can be set per entity set (but a service may choose to apply the same page size to all entity sets). There's no way to avoid it from the client (which is by design since it's a security feature).
The inlinecount option asks the server to include the total count of the results (just the number), it doesn't disable the paging.
From the client, the only way to read all the data is to issue the request, which will return the first page and may contain a next link; you then request that link to read the next page, and so on until the last response doesn't have a next link.
If you're using the WCF Data Services client library it has support for continuations (the next link) and a simple sample can be found in this blog post (for example): http://blogs.msdn.com/b/phaniraj/archive/2010/04/25/server-driven-paging-with-wcf-data-services.aspx
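A hedged sketch of both points using the WCF Data Services client API (db and ProductColors come from the question; everything else is illustrative):
// Ask the server to include the total count ($inlinecount=allpages), then follow
// the next links until there is no continuation left.
var query = db.ProductColors.IncludeTotalCount();
var response = (QueryOperationResponse<ProductColor>)query.Execute();
long totalCount = response.TotalCount; // just the count; it does not disable paging

var all = new List<ProductColor>();
while (true)
{
    all.AddRange(response);
    var continuation = response.GetContinuation();
    if (continuation == null)
    {
        break;
    }
    response = db.Execute(continuation);
}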

Related

AspNet Boilerplate Parallel DB Access through Entity Framework from an AppService

We are using ASP.NET Zero and are running into issues with parallel processing from an AppService. We know requests must be transactional, but unfortunately we need to break out to slow-running APIs for numerous calls, so we have to do parallel processing.
As expected, we are running into a DbContext concurrency issue on the second database call we make:
System.InvalidOperationException: A second operation started on this context
before a previous operation completed. This is usually caused by different
threads using the same instance of DbContext, however instance members are
not guaranteed to be thread safe. This could also be caused by a nested query
being evaluated on the client, if this is the case rewrite the query avoiding
nested invocations.
We read that a new UOW is required, so we tried using both the method attribute and the explicit UowManager, but neither of the two worked.
We also tried creating instances of the referenced AppServices using the IocResolver, but we are still not able to get a unique DbContext per thread (please see below).
public List<InvoiceDto> CreateInvoices(List<InvoiceTemplateLineItemDto> templateLineItems)
{
List<InvoiceDto> invoices = new InvoiceDto[templateLineItems.Count].ToList();
ConcurrentQueue<Exception> exceptions = new ConcurrentQueue<Exception>();
Parallel.ForEach(templateLineItems, async (templateLineItem) =>
{
try
{
XAppService xAppService = _iocResolver.Resolve<XAppService>();
InvoiceDto invoice = await xAppService
.CreateInvoiceInvoiceItem();
invoices.Insert(templateLineItems.IndexOf(templateLineItem), invoice);
}
catch (Exception e)
{
exceptions.Enqueue(e);
}
});
if (exceptions.Count > 0) throw new AggregateException(exceptions);
return invoices;
}
How can we ensure that a new DbContext is available per thread?
I was able to replicate and resolve the problem with a generic version of ABP. I'm still experiencing the problem in my original solution, which is far more complex. I'll have to do some more digging to determine why it is failing there.
For others who come across this problem, which is exactly the same issue as referenced here, you can simply disable the UnitOfWork through an attribute, as illustrated in the code below.
public class InvoiceAppService : ApplicationService
{
private readonly InvoiceItemAppService _invoiceItemAppService;
public InvoiceAppService(InvoiceItemAppService invoiceItemAppService)
{
_invoiceItemAppService = invoiceItemAppService;
}
// Just add this attribute
[UnitOfWork(IsDisabled = true)]
public InvoiceDto GetInvoice(List<int> invoiceItemIds)
{
_invoiceItemAppService.Initialize();
ConcurrentQueue<InvoiceItemDto> invoiceItems =
new ConcurrentQueue<InvoiceItemDto>();
ConcurrentQueue<Exception> exceptions = new ConcurrentQueue<Exception>();
Parallel.ForEach(invoiceItemIds, (invoiceItemId) =>
{
try
{
InvoiceItemDto invoiceItemDto =
_invoiceItemAppService.CreateAsync(invoiceItemId).Result;
invoiceItems.Enqueue(invoiceItemDto);
}
catch (Exception e)
{
exceptions.Enqueue(e);
}
});
if (exceptions.Count > 0) {
AggregateException ex = new AggregateException(exceptions);
Logger.Error("Unable to get invoice", ex);
throw ex;
}
return new InvoiceDto {
Date = DateTime.Now,
InvoiceItems = invoiceItems.ToArray()
};
}
}
public class InvoiceItemAppService : ApplicationService
{
private readonly IRepository<InvoiceItem> _invoiceItemRepository;
private readonly IRepository<Token> _tokenRepository;
private readonly IRepository<Credential> _credentialRepository;
private Token _token;
private Credential _credential;
public InvoiceItemAppService(IRepository<InvoiceItem> invoiceItemRepository,
IRepository<Token> tokenRepository,
IRepository<Credential> credentialRepository)
{
_invoiceItemRepository = invoiceItemRepository;
_tokenRepository = tokenRepository;
_credentialRepository = credentialRepository;
}
public void Initialize()
{
_token = _tokenRepository.FirstOrDefault(x => x.Id == 1);
_credential = _credentialRepository.FirstOrDefault(x => x.Id == 1);
}
// Create an invoice item using info from an external API and some db records
public async Task<InvoiceItemDto> CreateAsync(int id)
{
// Get db record
InvoiceItem invoiceItem = await _invoiceItemRepository.GetAsync(id);
// Get price
decimal price = await GetPriceAsync(invoiceItem.Description);
return new InvoiceItemDto {
Id = id,
Description = invoiceItem.Description,
Amount = price
};
}
private async Task<decimal> GetPriceAsync(string description)
{
// Simulate a slow API call to get price using description
// We use the token and credentials here in the real deal
await Task.Delay(5000);
return 100.00M;
}
}

How to write a generic WebAPI Put method against Entity Framework that works with child lists?

I am tinkering with WebAPI to create a generic implementation for entity framework. I am able to implement most of the methods just fine, but am finding PUT to be tricky in non-trivial cases. The implementation most commonly found online works for simple entities:
[HttpPut]
[ActionName("Endpoint")]
public virtual T Put(T entity)
{
    var db = GetDbContext();
    var entry = db.Entry(entity);
    entry.State = EntityState.Modified;
    var set = db.Set<T>();
    set.Attach(entity);
    db.SaveChanges();
    return entity;
}
...but does not delete or update child lists:
public class Invoice
{
    ...
    public virtual ICollection<InvoiceLineItem> InvoiceLineItems { get; set; } // Attach method doesn't address these
}
In an MVC controller, you could simply use "UpdateModel" and it would add/update/delete children as needed; however, that method is not available on ApiController. I understand that some code would be necessary to get the original item from the database, and that it would need to use Include to get the child lists, but I can't quite figure out the best way to replicate UpdateModel's functionality:
[HttpPut]
[ActionName("Endpoint")]
public virtual T Put(T entity)
{
    var db = GetDbContext();
    var original = GetOriginalFor(entity);
    //TODO: Something similar to UpdateModel(original), such as UpdateModel(original, entity);
    db.SaveChanges();
    return original;
}
How can I implement UpdateModel OR somehow implement Put in such a way that it will handle child lists?
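For reference, a hedged sketch of the kind of GetOriginalFor helper described above, assuming the entity exposes an Id key and a known child collection (a truly generic version would need model metadata or reflection):
private Invoice GetOriginalFor(Invoice entity)
{
    var db = GetDbContext();
    // Load the original together with its children so they can be compared and updated.
    return db.Set<Invoice>()
             .Include("InvoiceLineItems")
             .Single(i => i.Id == entity.Id);
}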
This routine doesn't validate the entity; it just fills in the pre-existing entity.
protected virtual void UpdateModel<T>(T original, bool overrideForEmptyList = true)
{
    var json = ControllerContext.Request.Content.ReadAsStringAsync().Result;
    UpdateModel<T>(json, original, overrideForEmptyList);
}

private void UpdateModel<T>(string json, T original, bool overrideForEmptyList = true)
{
    var newValues = JsonConvert.DeserializeObject<T>(json);
    foreach (var property in original.GetType().GetProperties())
    {
        var isEnumerable = property.PropertyType.GetInterfaces().Any(t => t.IsGenericType && t.GetGenericTypeDefinition() == typeof(IEnumerable<>));
        if (isEnumerable && property.PropertyType != typeof(string))
        {
            var propertyOriginalValue = property.GetValue(original, null);
            if (propertyOriginalValue != null)
            {
                var propertyNewValue = property.GetValue(newValues, null);
                if (propertyNewValue != null && (overrideForEmptyList || ((IEnumerable<object>)propertyNewValue).Any()))
                {
                    property.SetValue(original, null);
                }
            }
        }
    }
    JsonConvert.PopulateObject(json, original);
}

public void Post()
{
    var sample = Pessoa.FindById(12);
    UpdateModel(sample);
}

Generating Cache Keys from IQueryable For Caching Results of EF Code First Queries

I'm trying to implement a caching scheme for my EF repository similar to the one blogged here. As the author and commenters have reported, the limitation is that the key generation method cannot produce cache keys that vary with a given query's parameters. Here is the cache key generation method:
private static string GetKey<T>(IQueryable<T> query)
{
    string key = string.Concat(query.ToString(), "\n\r",
        typeof(T).AssemblyQualifiedName);
    return key;
}
So the following queries will yield the same cache key:
var isActive = true;
var query = context.Products
    .OrderBy(one => one.ProductNumber)
    .Where(one => one.IsActive == isActive).AsCacheable();
and
var isActive = false;
var query = context.Products
    .OrderBy(one => one.ProductNumber)
    .Where(one => one.IsActive == isActive).AsCacheable();
Notice that the only difference is that isActive = true in the first query and isActive = false in the second.
Any suggestions/insight to efficiently generating cache keys which vary by IQueryable parameters would be truly appreciated.
Kudos to Sergey Barskiy for sharing the EF CodeFirst caching scheme.
Update
I took the approach of traversing the IQueryable's expression tree myself with the goal of resolving the values of the parameters used in the query. With maxlego's suggestion, I extended the System.Linq.Expressions.ExpressionVisitor class to visit the expression nodes that we're interested in - in this case, the MemberExpression. The updated GetKey method looks something like this:
public static string GetKey<T>(IQueryable<T> query)
{
    var keyBuilder = new StringBuilder(query.ToString());
    var queryParamVisitor = new QueryParameterVisitor(keyBuilder);
    queryParamVisitor.GetQueryParameters(query.Expression);
    keyBuilder.Append("\n\r");
    keyBuilder.Append(typeof (T).AssemblyQualifiedName);
    return keyBuilder.ToString();
}
And the QueryParameterVisitor class, which was inspired by the answers of Bryan Watts and Marc Gravell to this question, looks like this:
/// <summary>
/// <see cref="ExpressionVisitor"/> subclass which encapsulates logic to
/// traverse an expression tree and resolve all the query parameter values
/// </summary>
internal class QueryParameterVisitor : ExpressionVisitor
{
public QueryParameterVisitor(StringBuilder sb)
{
QueryParamBuilder = sb;
Visited = new Dictionary<int, bool>();
}
protected StringBuilder QueryParamBuilder { get; set; }
protected Dictionary<int, bool> Visited { get; set; }
public StringBuilder GetQueryParameters(Expression expression)
{
Visit(expression);
return QueryParamBuilder;
}
private static object GetMemberValue(MemberExpression memberExpression, Dictionary<int, bool> visited)
{
object value;
if (!TryGetMemberValue(memberExpression, out value, visited))
{
UnaryExpression objectMember = Expression.Convert(memberExpression, typeof (object));
Expression<Func<object>> getterLambda = Expression.Lambda<Func<object>>(objectMember);
Func<object> getter = null;
try
{
getter = getterLambda.Compile();
}
catch (InvalidOperationException)
{
}
if (getter != null) value = getter();
}
return value;
}
private static bool TryGetMemberValue(Expression expression, out object value, Dictionary<int, bool> visited)
{
if (expression == null)
{
// used for static fields, etc
value = null;
return true;
}
// Mark this node as visited (processed)
int expressionHash = expression.GetHashCode();
if (!visited.ContainsKey(expressionHash))
{
visited.Add(expressionHash, true);
}
// Get Member Value, recurse if necessary
switch (expression.NodeType)
{
case ExpressionType.Constant:
value = ((ConstantExpression) expression).Value;
return true;
case ExpressionType.MemberAccess:
var me = (MemberExpression) expression;
object target;
if (TryGetMemberValue(me.Expression, out target, visited))
{
// instance target
switch (me.Member.MemberType)
{
case MemberTypes.Field:
value = ((FieldInfo) me.Member).GetValue(target);
return true;
case MemberTypes.Property:
value = ((PropertyInfo) me.Member).GetValue(target, null);
return true;
}
}
break;
}
// Could not retrieve value
value = null;
return false;
}
protected override Expression VisitMember(MemberExpression node)
{
// Only process nodes that haven't been processed before, this could happen because our traversal
// is depth-first and will "visit" the nodes in the subtree before this method (VisitMember) does
if (!Visited.ContainsKey(node.GetHashCode()))
{
object value = GetMemberValue(node, Visited);
if (value != null)
{
QueryParamBuilder.Append("\n\r");
QueryParamBuilder.Append(value.ToString());
}
}
return base.VisitMember(node);
}
}
I'm still doing some performance profiling on the cache key generation and hoping that it isn't too expensive (I'll update the question with the results once I have them). I'll leave the question open, in case anyone has suggestions on how to optimize this process or has a recommendation for a more efficient method of generating cache keys that vary with the query parameters. Although this method produces the desired output, it is by no means optimal.
I suggest using ExpressionVisitor:
http://msdn.microsoft.com/en-us/library/bb882521(v=vs.90).aspx
Just for the record, "Caching the results of LINQ queries" works well with the EF and it's able to work with parameters correctly, so it can be considered as a good second level cache implementation for EF.
While the solution of the OP works quite well, I found that its performance is a bit poor.
The duration of key generation varied between 300 ms and 1200 ms for my queries.
However, I've found another solution that has much better performance (< 10 ms):
public static string ToTraceString<T>(DbQuery<T> query)
{
    var internalQueryField = query.GetType().GetFields(BindingFlags.NonPublic | BindingFlags.Instance).Where(f => f.Name.Equals("_internalQuery")).FirstOrDefault();
    var internalQuery = internalQueryField.GetValue(query);
    var objectQueryField = internalQuery.GetType().GetFields(BindingFlags.NonPublic | BindingFlags.Instance).Where(f => f.Name.Equals("_objectQuery")).FirstOrDefault();
    var objectQuery = objectQueryField.GetValue(internalQuery) as ObjectQuery<T>;
    return ToTraceStringWithParameters(objectQuery);
}

private static string ToTraceStringWithParameters<T>(ObjectQuery<T> query)
{
    string traceString = query.ToTraceString() + "\n";
    foreach (var parameter in query.Parameters)
    {
        traceString += parameter.Name + " [" + parameter.ParameterType.FullName + "] = " + parameter.Value + "\n";
    }
    return traceString;
}

How can I pass an Entity Framework Entity from aspx page to a user control on the page?

I have a page that uses a large filtered EF object in several datagrids, so each datagrid displays a different "Status". The page code is getting a bit out of control, so I wanted to separate the sections out into user controls. I only want to get the data once, so I want to be able to pass the properly filtered data object list to the appropriate user control. I'm just not sure how to go about it. Any suggestions?
Thanks,
Rhonda
My data object
activeDisplayChecklist = allDisplayChecklist.Where(x => x.ChecklistStatus.ToString().ToUpper() != Checklist.ChecklistStatus.Approved.ToDescriptionString().ToUpper() && x.ChecklistStatus.ToString().ToUpper() != Checklist.ChecklistStatus.Canceled.ToDescriptionString().ToUpper()).ToList();
Workqueue.ascx.cs
public partial class WorkQueue : System.Web.UI.UserControl
{
public List<Entities.Checklist> activeChecklists { get; set; }
private List<AMWOTPortalDisplay> ActiveDisplayChecklist = new List<AMWOTPortalDisplay>();
public List<AMWOTPortalDisplay> activeDisplayChecklist
{
get
{
return ActiveDisplayChecklist;
}
set
{
ActiveDisplayChecklist = value;
}
}
protected void Page_Load(object sender, EventArgs e)
{
PopulateWorkQueueGrid();
}
//show statuses that require approval (Submitted or CTO Exception)
//public void PopulateWorkQueueGrid(List<AMWOTPortalDisplay> ActiveDisplayChecklist)
public void PopulateWorkQueueGrid()
{
// ActiveDisplayChecklist is always Count = 0 or null.
List<AMWOTPortalDisplay> filteredChecklist = ActiveDisplayChecklist.Where(x => x.ChecklistStatus.ToString().ToUpper() == Checklist.ChecklistStatus.Submitted.ToDescriptionString().ToUpper() || x.ChecklistStatus.ToString().ToUpper() == Checklist.ChecklistStatus.CTOException.ToDescriptionString().ToUpper()).ToList();
WorkQueueGrid.DataSource = filteredChecklist.ToList();
WorkQueueGrid.DataBind();
}
}
<WQ:WorkQueueList ID="WorkQueueList" runat="Server"></WQ:WorkQueueList>
portal.aspx.cs (the objects contain 9 and 12 rows here)
WorkQueue wq = new WorkQueue();
wq.activeChecklists = activeChecklists.ToList();
wq.activeDisplayChecklist = activeDisplayChecklist.ToList();

Serializing Entity Framework problems

Like several other people, I'm having problems serializing Entity Framework objects, so that I can send the data over AJAX in a JSON format.
I've got the following server-side method, which I'm attempting to call using AJAX through jQuery
[WebMethod]
public static IEnumerable<Message> GetAllMessages(int officerId)
{
    SIBSv2Entities db = new SIBSv2Entities();
    return (from m in db.MessageRecipients
            where m.OfficerId == officerId
            select m.Message).AsEnumerable<Message>();
}
Calling this via AJAX results in this error:
A circular reference was detected while serializing an object of type 'System.Data.Metadata.Edm.AssociationType'
Which is because of the way the Entity Framework creates circular references to keep all the objects related and accessible server side.
I came across the following code from http://hellowebapps.com/2010-09-26/producing-json-from-entity-framework-4-0-generated-classes/, which claims to get around this problem by capping the maximum depth for references. I've added the code below, because I had to tweak it slightly to get it to work (all angle brackets are missing from the code on the website):
using System.Web.Script.Serialization;
using System.Collections.Generic;
using System.Collections;
using System.Linq;
using System;
public class EFObjectConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 2;
private readonly List<int> _processedObjects = new List<int>();
private readonly Type[] _builtInTypes = new[]{
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(Guid)
};
public EFObjectConverter( int maxDepth = 2,
EFObjectConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize( IDictionary<string,object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string,object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
Type type = obj.GetType();
var properties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
_builtInTypes.Contains(p.PropertyType)
select p;
var result = properties.ToDictionary(
property => property.Name,
property => (Object)(property.GetValue(obj, null)
== null
? ""
: property.GetValue(obj, null).ToString().Trim())
);
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
foreach (var property in complexProperties)
{
var js = new JavaScriptSerializer();
js.RegisterConverters(new List<JavaScriptConverter> { new EFObjectConverter(_maxDepth - _currentDepth, this) });
result.Add(property.Name, js.Serialize(property.GetValue(obj, null)));
}
}
return result;
}
public override IEnumerable<System.Type> SupportedTypes
{
get
{
return GetType().Assembly.GetTypes();
}
}
}
However, even when using that code in the following way:
var js = new System.Web.Script.Serialization.JavaScriptSerializer();
js.RegisterConverters(new List<System.Web.Script.Serialization.JavaScriptConverter> { new EFObjectConverter(2) });
return js.Serialize(messages);
I'm still seeing the "A circular reference was detected..." exception being thrown!
I solved these issues with the following classes:
public class EFJavaScriptSerializer : JavaScriptSerializer
{
    public EFJavaScriptSerializer()
    {
        RegisterConverters(new List<JavaScriptConverter> { new EFJavaScriptConverter() });
    }
}
and
public class EFJavaScriptConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 1;
private readonly List<object> _processedObjects = new List<object>();
private readonly Type[] _builtInTypes = new[]
{
typeof(int?),
typeof(double?),
typeof(bool?),
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(DateTime?),
typeof(Guid)
};
public EFJavaScriptConverter() : this(1, null) { }
public EFJavaScriptConverter(int maxDepth = 1, EFJavaScriptConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize(IDictionary<string, object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string, object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
var type = obj.GetType();
var properties = from p in type.GetProperties()
where p.CanRead && p.GetIndexParameters().Count() == 0 &&
_builtInTypes.Contains(p.PropertyType)
select p;
var result = properties.ToDictionary(
p => p.Name,
p => (Object)TryGetStringValue(p, obj));
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanRead &&
p.GetIndexParameters().Count() == 0 &&
!_builtInTypes.Contains(p.PropertyType) &&
p.Name != "RelationshipManager" &&
!AllreadyAdded(p, obj)
select p;
foreach (var property in complexProperties)
{
var complexValue = TryGetValue(property, obj);
if(complexValue != null)
{
var js = new EFJavaScriptConverter(_maxDepth - _currentDepth, this);
result.Add(property.Name, js.Serialize(complexValue, new EFJavaScriptSerializer()));
}
}
}
return result;
}
private bool AllreadyAdded(PropertyInfo p, object obj)
{
var val = TryGetValue(p, obj);
return _processedObjects.Contains(val == null ? 0 : val.GetHashCode());
}
private static object TryGetValue(PropertyInfo p, object obj)
{
var parameters = p.GetIndexParameters();
if (parameters.Length == 0)
{
return p.GetValue(obj, null);
}
else
{
//cant serialize these
return null;
}
}
private static object TryGetStringValue(PropertyInfo p, object obj)
{
if (p.GetIndexParameters().Length == 0)
{
var val = p.GetValue(obj, null);
return val;
}
else
{
return string.Empty;
}
}
public override IEnumerable<Type> SupportedTypes
{
get
{
var types = new List<Type>();
//ef types
types.AddRange(Assembly.GetAssembly(typeof(DbContext)).GetTypes());
//model types
types.AddRange(Assembly.GetAssembly(typeof(BaseViewModel)).GetTypes());
return types;
}
}
}
You can now safely make a call like new EFJavaScriptSerializer().Serialize(obj)
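For example, applied to the original page method it might look roughly like this (a sketch only; it assumes returning the JSON string directly from the [WebMethod] is acceptable in your setup):
[WebMethod]
public static string GetAllMessages(int officerId)
{
    using (var db = new SIBSv2Entities())
    {
        var messages = (from m in db.MessageRecipients
                        where m.OfficerId == officerId
                        select m.Message).ToList();
        // EFJavaScriptSerializer registers the depth-limited converter shown above.
        return new EFJavaScriptSerializer().Serialize(messages);
    }
}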
Update: since Telerik v1.3+ you can override the GridActionAttribute.CreateActionResult method, and hence you can easily integrate this serializer into specific controller methods by applying your custom [GridAction] attribute:
[Grid]
public ActionResult _GetOrders(int id)
{
return new GridModel(Service.GetOrders(id));
}
and
public class GridAttribute : GridActionAttribute, IActionFilter
{
/// <summary>
/// Determines the depth that the serializer will traverse
/// </summary>
public int SerializationDepth { get; set; }
/// <summary>
/// Initializes a new instance of the <see cref="GridActionAttribute"/> class.
/// </summary>
public GridAttribute()
: base()
{
ActionParameterName = "command";
SerializationDepth = 1;
}
protected override ActionResult CreateActionResult(object model)
{
return new EFJsonResult
{
Data = model,
JsonRequestBehavior = JsonRequestBehavior.AllowGet,
MaxSerializationDepth = SerializationDepth
};
}
}
and finally..
public class EFJsonResult : JsonResult
{
const string JsonRequest_GetNotAllowed = "This request has been blocked because sensitive information could be disclosed to third party web sites when this is used in a GET request. To allow GET requests, set JsonRequestBehavior to AllowGet.";
public EFJsonResult()
{
MaxJsonLength = 1024000000;
RecursionLimit = 10;
MaxSerializationDepth = 1;
}
public int MaxJsonLength { get; set; }
public int RecursionLimit { get; set; }
public int MaxSerializationDepth { get; set; }
public override void ExecuteResult(ControllerContext context)
{
if (context == null)
{
throw new ArgumentNullException("context");
}
if (JsonRequestBehavior == JsonRequestBehavior.DenyGet &&
String.Equals(context.HttpContext.Request.HttpMethod, "GET", StringComparison.OrdinalIgnoreCase))
{
throw new InvalidOperationException(JsonRequest_GetNotAllowed);
}
var response = context.HttpContext.Response;
if (!String.IsNullOrEmpty(ContentType))
{
response.ContentType = ContentType;
}
else
{
response.ContentType = "application/json";
}
if (ContentEncoding != null)
{
response.ContentEncoding = ContentEncoding;
}
if (Data != null)
{
var serializer = new JavaScriptSerializer
{
MaxJsonLength = MaxJsonLength,
RecursionLimit = RecursionLimit
};
serializer.RegisterConverters(new List<JavaScriptConverter> { new EFJavaScriptConverter(MaxSerializationDepth) });
response.Write(serializer.Serialize(Data));
}
}
}
You can also detach the object from the context and it will remove the navigation properties so that it can be serialized. For my data repository classes that are used with JSON, I use something like this:
public DataModel.Page GetPage(Guid idPage, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idPage == idPage
select p;
if (results.Count() == 0)
return null;
else
{
var result = results.First();
if (detach)
DataContext.Detach(result);
return result;
}
}
By default the returned object will have all of the complex/navigation properties, but by setting detach = true it will remove those properties and return the base object only. For a list of objects the implementation looks like this
public List<DataModel.Page> GetPageList(Guid idSite, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idSite == idSite
select p;
if (results.Count() > 0)
{
if (detach)
{
List<DataModel.Page> retValue = new List<DataModel.Page>();
foreach (var result in results)
{
DataContext.Detach(result);
retValue.Add(result);
}
return retValue;
}
else
return results.ToList();
}
else
return new List<DataModel.Page>();
}
I have just successfully tested this code.
It may be that in your case your Message object is in a different assembly? The overridden SupportedTypes property returns only the types in its own assembly, so when Serialize is called the JavaScriptSerializer falls back to the standard JavaScriptConverter.
You should be able to verify this by debugging.
Your error occurred due to some "Reference" classes generated by EF for entities with 1:1 relations, which the JavaScriptSerializer failed to serialize.
I worked around it by adding a new condition:
!p.Name.EndsWith("Reference")
The code to get the complex properties looks like this:
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!p.Name.EndsWith("Reference") &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
Hope this helps you.
I had a similar problem with pushing my view via Ajax to UI components.
I also found and tried to use that code sample you provided. Some problems I had with that code:
SupportedTypes wasn't grabbing the types I needed, so the converter wasn't being called
If the maximum depth is hit, the serialization would be truncated
It threw out any other converters I had on the existing serializer by creating its own new JavaScriptSerializer
Here are the fixes I implemented for those issues:
Reusing the same serializer
I simply reused the existing serializer that is passed into Serialize to solve this problem. This broke the depth hack though.
Truncating on already-visited, rather than on depth
Instead of truncating on depth, I created a HashSet<object> of already-seen instances (with a custom IEqualityComparer that checked reference equality). I simply didn't recurse if I found an instance I'd already seen. This is the same detection mechanism built into the JavaScriptSerializer itself, so it worked quite well (a minimal sketch follows).
The only problem with this solution is that the serialization output isn't very deterministic. The order of truncation is strongly dependent on the order in which reflection finds the properties. You could solve this (with a perf hit) by sorting before recursing.
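A minimal sketch of that tracking, with names of my own choosing (the converter would call _seen.Add(...) before recursing into a complex property):
// Reference equality, so two distinct entities that happen to produce equal
// hash codes are still treated as different instances.
sealed class ReferenceEqualityComparer : IEqualityComparer<object>
{
    public new bool Equals(object x, object y)
    {
        return ReferenceEquals(x, y);
    }

    public int GetHashCode(object obj)
    {
        return System.Runtime.CompilerServices.RuntimeHelpers.GetHashCode(obj);
    }
}

// Inside the converter (illustrative):
// private readonly HashSet<object> _seen = new HashSet<object>(new ReferenceEqualityComparer());
// if (!_seen.Add(value)) return; // already serialized this instance, don't recurse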
SupportedTypes needed the right types
My JavaScriptConverter couldn't live in the same assembly as my model. If you plan to reuse this converter code, you'll probably run into the same problem.
To solve this I had to pre-traverse the object tree, keeping a HashSet<Type> of already seen types (to avoid my own infinite recursion), and pass that to the JavaScriptConverter before registering it.
Looking back on my solution, I would now use code generation templates to create a list of the entity types. This would be much more foolproof (it uses simple iteration), and have much better perf since it would produce a list at compile time. I'd still pass this to the converter so it could be reused between models.
My final solution
I threw out that code and tried again :)
I simply wrote code to project onto new types ("ViewModel" types - in your case, it would be service contract types) before doing my serialization. This made the intention of my code more explicit, allowed me to serialize just the data I wanted, and removed the potential of slipping in queries by accident (e.g. serializing my whole DB).
My types were fairly simple, and I didn't need most of them for my view. I might look into AutoMapper to do some of this projection in the future.
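As an illustration only, a projection in the spirit of the original GetAllMessages method might look like this (MessageViewModel and its properties are assumptions, not types from the question):
[WebMethod]
public static List<MessageViewModel> GetAllMessages(int officerId)
{
    using (var db = new SIBSv2Entities())
    {
        // Project straight onto a flat view model so no EF proxies or navigation
        // cycles ever reach the serializer.
        return (from m in db.MessageRecipients
                where m.OfficerId == officerId
                select new MessageViewModel
                {
                    Id = m.Message.Id,           // assumed property names, for illustration
                    Subject = m.Message.Subject,
                    Body = m.Message.Body
                }).ToList();
    }
}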