Get original type from DbEntityEntry - entity-framework

We have overridden the SaveChanges method because we want to set some final properties automatically upon saving, and we have to set SETCONTEXT on each connection. Our current override looks as follows:
public override int SaveChanges()
{
// Use basic SaveChanges if SessionInfo is not initialized
if (SessionInfo.ContextInfo == null)
{
return base.SaveChanges();
}
// SessionInfo was initialized, so use custom logic now
// Set the SqlId according to sessioninfo for each entity to add
foreach (DbEntityEntry entry in ChangeTracker.Entries()
.Where(x => x.State == EntityState.Added))
{
string sqlIdPropertyName =
entry.CurrentValues.PropertyNames.First(x => x.EndsWith("SqlId"));
entry.Property(sqlIdPropertyName).CurrentValue = SessionInfo.ServerSqlId;
}
// Set the IsDeleted boolean to true for each entity to delete
foreach (DbEntityEntry entry in ChangeTracker.Entries()
.Where(x => x.State == EntityState.Deleted))
{
entry.Property("IsDeleted").CurrentValue = true;
entry.State = EntityState.Modified;
}
// Begin custom transaction if SessionInfo was set
this.Database.Connection.Open();
SessionInfo.SetContextInfo(this);
int result = base.SaveChanges();
this.Database.Connection.Close();
return result;
}
As you can see, when we add a new record to the database, the save logic sets the SqlId for the object according to the SessionInfo. However, this now depends on PropertyNames.First(), which is a risk.
The PropertyName of the SqlId we want to set is always equal to the name of the POCO class type + SqlId, so for the class "Invoice" it would be "InvoiceSqlId".
How can we get the typename of the original POCO class from a DbEntityEntry?

Try this: entry.Entity.GetType().Name
EDIT - for when you may be using proxies
var entityType = entry.Entity.GetType();
string name;
if (entityType.BaseType != null &&
entityType.Namespace == "System.Data.Entity.DynamicProxies")
{
name = entityType.BaseType.Name;
}
else
{
name = entityType.Name;
}
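For what it's worth, EF also ships a built-in helper that unwraps dynamic proxy types, which avoids the namespace check above. A minimal sketch combining it with the goal from the question (ObjectContext.GetObjectType is real EF API; the surrounding lines are just illustrative):
// EF6: System.Data.Entity.Core.Objects.ObjectContext; EF4/5: System.Data.Objects.ObjectContext
var pocoType = ObjectContext.GetObjectType(entry.Entity.GetType());
string sqlIdPropertyName = pocoType.Name + "SqlId"; // e.g. "InvoiceSqlId"
entry.Property(sqlIdPropertyName).CurrentValue = SessionInfo.ServerSqlId;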

Related

EF Core 3 Has Value Generator

In the modelBuilder for an entity, I am trying to have the created and modified dates set on adds and updates by a custom generator. The reason for going this route is that the DbContext that creates the models is being used as a base class, which is inherited by SQL Server and SQLite EF Core extensions. Because of this, there should be no database-specific functionality in the context; the GetDateUTC() defaults and triggers were originally implemented for SQL Server.
modelBuilder.Entity<CommunicationSendRequest>(entity =>
{
...
entity.Property(p => p.CreatedAt).ValueGeneratedOnAdd().HasValueGenerator<CreatedAtTimeGenerator>();
entity.Property(p => p.ModifiedAt).ValueGeneratedOnUpdate().HasValueGenerator<ModifiedAtTimeGenerator>();
});
But what is happening is that on both adds and updates the two properties are always set to new values: on brand-new inserts ModifiedAt is set, and on updates CreatedAt is set, which wipes out the true created-at date.
The question is: are those value generators set up correctly? Is there a way to accomplish this using the generators? In the generators I also tried to check the state and return the value only if the state was Added or Modified, but the state always equaled Detached.
public class CreatedAtTimeGenerator : ValueGenerator<DateTimeOffset>
{
public override DateTimeOffset Next(EntityEntry entry)
{
if (entry == null)
{
throw new ArgumentNullException(nameof(entry));
}
return DateTimeOffset.UtcNow;
}
public override bool GeneratesTemporaryValues { get; }
}
public class ModifiedAtTimeGenerator : ValueGenerator<DateTimeOffset>
{
public override DateTimeOffset Next(EntityEntry entry)
{
if (entry == null)
{
throw new ArgumentNullException(nameof(entry));
}
return DateTimeOffset.UtcNow;
}
public override bool GeneratesTemporaryValues { get; }
}
What I actually did was move away from the ValueGenerator concept altogether and handle the setting of the CreatedAt, ModifiedAt, and DeletedAt dates (the last one added after this post was created) by overriding the SaveChanges() method.
public override int SaveChanges()
{
var selectedEntityList = ChangeTracker.Entries()
.Where(x => (x.Entity is IEntityCreatedAt ||
x.Entity is IEntityModifiedAt ||
x.Entity is IEntityIsDeleted) &&
(x.State == EntityState.Added || x.State == EntityState.Modified)).ToList();
selectedEntityList.ForEach(entity =>
{
if (entity.State == EntityState.Added)
{
if (entity.Entity is IEntityCreatedAt)
((IEntityCreatedAt)entity.Entity).CreatedAt = DateTimeOffset.UtcNow;
}
if (entity.State == EntityState.Modified)
{
if (entity.Entity is IEntityModifiedAt)
((IEntityModifiedAt)entity.Entity).ModifiedAt = DateTimeOffset.UtcNow;
if (entity.Entity is IEntityIsDeleted)
if (((IEntityIsDeleted)entity.Entity).IsDeleted)
((IEntityIsDeleted)entity.Entity).DeletedAt = DateTimeOffset.UtcNow;
}
});
return base.SaveChanges();
}
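If the context is also saved asynchronously, the same stamping presumably needs to happen in SaveChangesAsync too. A minimal sketch under that assumption, with the stamping pulled into a shared helper (ApplyAuditDates is a hypothetical name for the ForEach logic shown above):
public override Task<int> SaveChangesAsync(bool acceptAllChangesOnSuccess,
    CancellationToken cancellationToken = default)
{
    ApplyAuditDates(); // hypothetical helper wrapping the ChangeTracker loop from SaveChanges()
    return base.SaveChangesAsync(acceptAllChangesOnSuccess, cancellationToken);
}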

How to write a generic WebAPI Put method against Entity Framework that works with child lists?

I am tinkering with WebAPI to create a generic implementation for entity framework. I am able to implement most of the methods just fine, but am finding PUT to be tricky in non-trivial cases. The implementation most commonly found online works for simple entities:
[HttpPut]
[ActionName("Endpoint")]
public virtual T Put(T entity)
{
var db = GetDbContext();
var entry = db.Entry(entity);
entry.State = EntityState.Modified;
var set = db.Set<T>();
set.Attach(entity);
db.SaveChanges();
return entity;
}
...but does not delete or update child lists:
public class Invoice
{
...
public virtual ICollection<InvoiceLineItem> LineItems { get; set; } // Attach method doesn't address these
}
In an MVC Controller, you could simply use "UpdateModel" and it would add/update/delete children as needed, however that method is not available on ApiController. I understand that some code would be necessary to get the original item from the database, and that it would need to use Include to get the child lists, but can't quite figure out the best way to replicate UpdateModel's functionality:
[HttpPut]
[ActionName("Endpoint")]
public virtual T Put(T entity)
{
var db = GetDbContext();
var original = GetOriginalFor(entity);
//TODO: Something similar to UpdateModel(original), such as UpdateModel(original, entity);
db.SaveChanges();
return original;
}
How can I implement UpdateModel OR somehow implement Put in such a way that it will handle child lists?
The routine doesn't validate the entity, it just fills the pre-existing entity.
protected virtual void UpdateModel<T>(T original, bool overrideForEmptyList = true)
{
var json = ControllerContext.Request.Content.ReadAsStringAsync().Result;
UpdateModel<T>(json, original, overrideForEmptyList);
}
private void UpdateModel<T>(string json, T original, bool overrideForEmptyList = true)
{
var newValues = JsonConvert.DeserializeObject<T>(json);
foreach (var property in original.GetType().GetProperties())
{
var isEnumerable = property.PropertyType.GetInterfaces().Any(t => t.IsGenericType && t.GetGenericTypeDefinition() == typeof(IEnumerable<>));
if (isEnumerable && property.PropertyType != typeof(string))
{
var propertyOriginalValue = property.GetValue(original, null);
if (propertyOriginalValue != null)
{
var propertyNewValue = property.GetValue(newValues, null);
if (propertyNewValue != null && (overrideForEmptyList || ((IEnumerable<object>)propertyNewValue).Any()))
{
property.SetValue(original, null);
}
}
}
}
JsonConvert.PopulateObject(json, original);
}
public void Post()
{
var sample = Pessoa.FindById(12);
UpdateModel(sample);
}
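Tying that back to the question, a hypothetical wiring of this helper into the generic Put action could look like the sketch below (GetOriginalFor is assumed to load the entity with its child collections included):
[HttpPut]
[ActionName("Endpoint")]
public virtual T Put(T entity)
{
    var db = GetDbContext();
    var original = GetOriginalFor(entity); // assumed to Include the child lists
    UpdateModel(original);                 // repopulates original from the request body JSON
    db.SaveChanges();
    return original;
}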

How to ensure proxies are created when using the repository pattern with entity framework?

I have this method in my SurveyController class:
public ActionResult AddProperties(int id, int[] propertyids, int page = 1)
{
var survey = _uow.SurveyRepository.Find(id);
if (propertyids == null)
return GetPropertiesTable(survey, page);
var repo = _uow.PropertySurveyRepository;
propertyids.Select(propertyid => new PropertySurvey
{
//Setting the Property rather than the PropertyID
//prevents the error occurring later
//Property = _uow.PropertyRepository.Find(propertyid),
PropertyID = propertyid,
SurveyID = id
})
.ForEach(x => repo.InsertOrUpdate(x));
_uow.Save();
return GetPropertiesTable(survey, page);
}
The GetPropertiesTable redisplays Properties, but PropertySurvey.Property is marked virtual and I have created the entity using the new operator, so a proxy to support lazy loading was never created and it is null when I access it. When we have direct access to the DbContext we can use the Create method to explicitly create the proxy. But I have a unit of work and repository pattern here. I guess I could expose the context.Create method via a repository.Create method, and then I would need to remember to use that instead of the new operator when I add an entity. But wouldn't it be better to encapsulate the problem in my InsertOrUpdate method? Is there some way to detect that the entity being added is not a proxy when it should be, and substitute a proxy? This is my InsertOrUpdate method in my base repository class:
protected virtual void InsertOrUpdate(T e, int id)
{
if (id == default(int))
{
// New entity
context.Set<T>().Add(e);
}
else
{
// Existing entity
context.Entry(e).State = EntityState.Modified;
}
}
Based on the answer supplied by qujck, here is how you can do it without having to employ AutoMapper:
Edited to always check for proxy - not just during insert - as suggested in comments.
Edited again to use a different way of checking whether a proxy was passed in to the method. The reason for changing the technique is that I ran into a problem when I introduced an entity that inherited from another. In that case an inherited entity can fail the e.GetType().Equals(instance.GetType()) check even if it is a proxy. I got the new technique from this answer.
public virtual T InsertOrUpdate(T e)
{
DbSet<T> dbSet = Context.Set<T>();
DbEntityEntry<T> entry;
if (e.GetType().BaseType != null
&& e.GetType().Namespace == "System.Data.Entity.DynamicProxies")
{
//The entity being added is already a proxy type that supports lazy
//loading - just get the context entry
entry = Context.Entry(e);
}
else
{
//The entity being added has been created using the "new" operator.
//Generate a proxy type to support lazy loading and attach it
T instance = dbSet.Create();
instance.ID = e.ID;
entry = Context.Entry(instance);
dbSet.Attach(instance);
//and set its values to those of the entity
entry.CurrentValues.SetValues(e);
e = instance;
}
entry.State = e.ID == default(int) ?
EntityState.Added :
EntityState.Modified;
return e;
}
public abstract class ModelBase
{
public int ID { get; set; }
}
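For context, a hypothetical caller (modelled on the question's AddProperties action) that keeps using the new operator but still gets a lazy-loading proxy back from InsertOrUpdate:
// Hypothetical usage: InsertOrUpdate swaps the plain POCO for a proxy and returns it
var propertySurvey = repo.InsertOrUpdate(new PropertySurvey
{
    PropertyID = propertyid,
    SurveyID = id
});
_uow.Save();
var property = propertySurvey.Property; // lazy loads now, because propertySurvey is a proxy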
I agree with you that this should be handled in one place and the best place to catch all looks to be your repository. You can compare the type of T with an instance created by the context and use something like AutoMapper to quickly transfer all of the values if the types do not match.
private bool mapCreated = false;
protected virtual void InsertOrUpdate(T e, int id)
{
T instance = context.Set<T>().Create();
if (e.GetType().Equals(instance.GetType()))
instance = e;
else
{
//this bit should really be managed somewhere else
if (!mapCreated)
{
Mapper.CreateMap(e.GetType(), instance.GetType());
mapCreated = true;
}
instance = Mapper.Map(e, instance);
}
if (id == default(int))
context.Set<T>().Add(instance);
else
context.Entry(instance).State = EntityState.Modified;
}

Efficient way of checking if many-to-many relationship exists in EF4.1

I have a many-to-many relationship between two entities - Media and MediaCollection. I want to check if a certain Media already exists in a collection. I can do this as follows:
mediaCollection.Media.Any(m => m.id == mediaId)
However, mediaCollection.Media is an ICollection, so to me this looks like it will have to retrieve every Media in the collection from the database just to make this check. As there could be many media in a collection, this seems very inefficient. I'm thinking that I should use a method of IQueryable, but I can't see how to do this for many-to-many relationships.
How can I check for the existence of the relationship without retrieving the whole collection?
EDIT
I am generating the EF data model from my database, then using the built in VS POCO T4 templates to generate my data context and entity classes. I think the problem is that the generated code does not return EntityCollection for the navigation properties, but instead ObjectSet. ObjectSet implements IQueryable, but does not expose a CreateSourceQuery() method.
Here is a stripped down version of the relevant lines from the context:
public partial class Entities : ObjectContext
{
public const string ConnectionString = "name=Entities";
public const string ContainerName = "Entities";
#region Constructors
public Entities()
: base(ConnectionString, ContainerName)
{
this.ContextOptions.LazyLoadingEnabled = true;
}
public Entities(string connectionString)
: base(connectionString, ContainerName)
{
this.ContextOptions.LazyLoadingEnabled = true;
}
public Entities(EntityConnection connection)
: base(connection, ContainerName)
{
this.ContextOptions.LazyLoadingEnabled = true;
}
#endregion
#region ObjectSet Properties
public ObjectSet<MediaCollection> MediaCollections
{
get { return _mediaCollections ?? (_mediaCollections = CreateObjectSet<MediaCollection>("MediaCollections")); }
}
private ObjectSet<MediaCollection> _mediaCollections;
// snipped many more
#endregion
}
And here is a stripped down version of the class for the MediaCollection entity:
public partial class MediaCollection
{
#region Primitive Properties
// snipped
#endregion
#region Navigation Properties
public virtual ICollection<Medium> Media
{
get
{
if (_media == null)
{
var newCollection = new FixupCollection<Medium>();
newCollection.CollectionChanged += FixupMedia;
_media = newCollection;
}
return _media;
}
set
{
if (!ReferenceEquals(_media, value))
{
var previousValue = _media as FixupCollection<Medium>;
if (previousValue != null)
{
previousValue.CollectionChanged -= FixupMedia;
}
_media = value;
var newValue = value as FixupCollection<Medium>;
if (newValue != null)
{
newValue.CollectionChanged += FixupMedia;
}
}
}
}
private ICollection<Medium> _media;
private void FixupMedia(object sender, NotifyCollectionChangedEventArgs e)
{
if (e.NewItems != null)
{
foreach (Medium item in e.NewItems)
{
if (!item.MediaCollections.Contains(this))
{
item.MediaCollections.Add(this);
}
}
}
if (e.OldItems != null)
{
foreach (Medium item in e.OldItems)
{
if (item.MediaCollections.Contains(this))
{
item.MediaCollections.Remove(this);
}
}
}
}
// snip
#endregion
}
And finally, here is the FixupCollection that the template also generates:
public class FixupCollection<T> : ObservableCollection<T>
{
protected override void ClearItems()
{
new List<T>(this).ForEach(t => Remove(t));
}
protected override void InsertItem(int index, T item)
{
if (!this.Contains(item))
{
base.InsertItem(index, item);
}
}
}
You can do that but you need a context for that:
bool exists = context.Entry(mediaCollection)
.Collection(m => m.Media)
.Query()
.Any(x => x.Id == mediaId);
Edit:
If you are using ObjectContext API with proxied POCOs instead of DbContext API the former sample will not work. You can try this:
context.ContextOptions.LazyLoadingEnabled = false;
bool exists = ((EntityCollection<Medium>)mediaCollection.Media).CreateSourceQuery()
.Any(x => x.Id == mediaId);
context.ContextOptions.LazyLoadingEnabled = true;
So it seems that the built-in VS POCO T4 template does not generate anything equivalent to CreateSourceQuery(). No matter! We can code it ourselves. If you add the following code to the context's .tt file and regenerate:
public ObjectQuery<T> CreateNavigationSourceQuery<T>(object entity, string navigationProperty)
{
var ose = ObjectStateManager.GetObjectStateEntry(entity);
var rm = ObjectStateManager.GetRelationshipManager(entity);
var entityType = (System.Data.Metadata.Edm.EntityType)ose.EntitySet.ElementType;
var navigation = entityType.NavigationProperties[navigationProperty];
var relatedEnd = rm.GetRelatedEnd(navigation.RelationshipType.FullName, navigation.ToEndMember.Name);
return ((dynamic)relatedEnd).CreateSourceQuery();
}
then we can check for the existence of a many-to-many as follows:
var exists = _context.CreateNavigationSourceQuery<Medium>(mediaCollection, "Media")
.Any(m => m.Id == medium.Id);
Props to Rowan's answer on Using CreateSourceQuery in CTP4 Code First for this one.
Try:
mediaCollection.CreateSourceQuery()
    .Any(....
CreateSourceQuery will create an IQueryable for the association.
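Another option, not shown in the answers above but worth sketching: query from the entity set side instead of the loaded collection, which also runs as a single server-side query. The set name (Mediums) and the key names below are assumptions based on the question's model:
// Hypothetical: assumes an ObjectSet<Medium> named Mediums on the context
// and an integer key named Id on both entities
bool exists = context.Mediums
    .Any(m => m.Id == mediaId &&
              m.MediaCollections.Any(mc => mc.Id == mediaCollectionId));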

Serializing Entity Framework problems

Like several other people, I'm having problems serializing Entity Framework objects, so that I can send the data over AJAX in a JSON format.
I've got the following server-side method, which I'm attempting to call using AJAX through jQuery
[WebMethod]
public static IEnumerable<Message> GetAllMessages(int officerId)
{
SIBSv2Entities db = new SIBSv2Entities();
return (from m in db.MessageRecipients
where m.OfficerId == officerId
select m.Message).AsEnumerable<Message>();
}
Calling this via AJAX results in this error:
A circular reference was detected while serializing an object of type 'System.Data.Metadata.Edm.AssociationType'
This is because of the way Entity Framework creates circular references to keep all the objects related and accessible server side.
I came across the following code from (http://hellowebapps.com/2010-09-26/producing-json-from-entity-framework-4-0-generated-classes/) which claims to get around this problem by capping the maximum depth for references. I've added the code below, because I had to tweak it slightly to get it to work (all angle brackets are missing from the code on the website):
using System.Web.Script.Serialization;
using System.Collections.Generic;
using System.Collections;
using System.Linq;
using System;
public class EFObjectConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 2;
private readonly List<int> _processedObjects = new List<int>();
private readonly Type[] _builtInTypes = new[]{
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(Guid)
};
public EFObjectConverter( int maxDepth = 2,
EFObjectConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize( IDictionary<string,object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string,object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
Type type = obj.GetType();
var properties = from p in type.GetProperties()
where p.CanRead &&
p.CanWrite &&
_builtInTypes.Contains(p.PropertyType)
select p;
var result = properties.ToDictionary(
property => property.Name,
property => (Object)(property.GetValue(obj, null)
== null
? ""
: property.GetValue(obj, null).ToString().Trim())
);
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
foreach (var property in complexProperties)
{
var js = new JavaScriptSerializer();
js.RegisterConverters(new List<JavaScriptConverter> { new EFObjectConverter(_maxDepth - _currentDepth, this) });
result.Add(property.Name, js.Serialize(property.GetValue(obj, null)));
}
}
return result;
}
public override IEnumerable<System.Type> SupportedTypes
{
get
{
return GetType().Assembly.GetTypes();
}
}
}
However even when using that code, in the following way:
var js = new System.Web.Script.Serialization.JavaScriptSerializer();
js.RegisterConverters(new List<System.Web.Script.Serialization.JavaScriptConverter> { new EFObjectConverter(2) });
return js.Serialize(messages);
I'm still seeing the A circular reference was detected... exception being thrown!
I solved these issues with the following classes:
public class EFJavaScriptSerializer : JavaScriptSerializer
{
public EFJavaScriptSerializer()
{
RegisterConverters(new List<JavaScriptConverter>{new EFJavaScriptConverter()});
}
}
and
public class EFJavaScriptConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 1;
private readonly List<object> _processedObjects = new List<object>();
private readonly Type[] _builtInTypes = new[]
{
typeof(int?),
typeof(double?),
typeof(bool?),
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(DateTime?),
typeof(Guid)
};
public EFJavaScriptConverter() : this(1, null) { }
public EFJavaScriptConverter(int maxDepth = 1, EFJavaScriptConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize(IDictionary<string, object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string, object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
var type = obj.GetType();
var properties = from p in type.GetProperties()
where p.CanRead && p.GetIndexParameters().Count() == 0 &&
_builtInTypes.Contains(p.PropertyType)
select p;
var result = properties.ToDictionary(
p => p.Name,
p => (Object)TryGetStringValue(p, obj));
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanRead &&
p.GetIndexParameters().Count() == 0 &&
!_builtInTypes.Contains(p.PropertyType) &&
p.Name != "RelationshipManager" &&
!AllreadyAdded(p, obj)
select p;
foreach (var property in complexProperties)
{
var complexValue = TryGetValue(property, obj);
if(complexValue != null)
{
var js = new EFJavaScriptConverter(_maxDepth - _currentDepth, this);
result.Add(property.Name, js.Serialize(complexValue, new EFJavaScriptSerializer()));
}
}
}
return result;
}
private bool AllreadyAdded(PropertyInfo p, object obj)
{
var val = TryGetValue(p, obj);
return _processedObjects.Contains(val == null ? 0 : val.GetHashCode());
}
private static object TryGetValue(PropertyInfo p, object obj)
{
var parameters = p.GetIndexParameters();
if (parameters.Length == 0)
{
return p.GetValue(obj, null);
}
else
{
//cant serialize these
return null;
}
}
private static object TryGetStringValue(PropertyInfo p, object obj)
{
if (p.GetIndexParameters().Length == 0)
{
var val = p.GetValue(obj, null);
return val;
}
else
{
return string.Empty;
}
}
public override IEnumerable<Type> SupportedTypes
{
get
{
var types = new List<Type>();
//ef types
types.AddRange(Assembly.GetAssembly(typeof(DbContext)).GetTypes());
//model types
types.AddRange(Assembly.GetAssembly(typeof(BaseViewModel)).GetTypes());
return types;
}
}
}
You can now safely make a call like new EFJavaScriptSerializer().Serialize(obj)
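For example, a hypothetical way to use it from the page method in the question, returning the JSON as a string instead of relying on ASP.NET's default serialization:
[WebMethod]
public static string GetAllMessagesJson(int officerId)
{
    using (var db = new SIBSv2Entities())
    {
        var messages = (from m in db.MessageRecipients
                        where m.OfficerId == officerId
                        select m.Message).AsEnumerable();
        // EFJavaScriptSerializer registers the converter in its constructor
        return new EFJavaScriptSerializer().Serialize(messages);
    }
}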
Update: since Telerik v1.3+ you can now override the GridActionAttribute.CreateActionResult method, and hence you can easily integrate this serializer into specific controller methods by applying your custom [GridAction] attribute:
[Grid]
public ActionResult _GetOrders(int id)
{
return new GridModel(Service.GetOrders(id));
}
and
public class GridAttribute : GridActionAttribute, IActionFilter
{
/// <summary>
/// Determines the depth that the serializer will traverse
/// </summary>
public int SerializationDepth { get; set; }
/// <summary>
/// Initializes a new instance of the <see cref="GridActionAttribute"/> class.
/// </summary>
public GridAttribute()
: base()
{
ActionParameterName = "command";
SerializationDepth = 1;
}
protected override ActionResult CreateActionResult(object model)
{
return new EFJsonResult
{
Data = model,
JsonRequestBehavior = JsonRequestBehavior.AllowGet,
MaxSerializationDepth = SerializationDepth
};
}
}
and finally..
public class EFJsonResult : JsonResult
{
const string JsonRequest_GetNotAllowed = "This request has been blocked because sensitive information could be disclosed to third party web sites when this is used in a GET request. To allow GET requests, set JsonRequestBehavior to AllowGet.";
public EFJsonResult()
{
MaxJsonLength = 1024000000;
RecursionLimit = 10;
MaxSerializationDepth = 1;
}
public int MaxJsonLength { get; set; }
public int RecursionLimit { get; set; }
public int MaxSerializationDepth { get; set; }
public override void ExecuteResult(ControllerContext context)
{
if (context == null)
{
throw new ArgumentNullException("context");
}
if (JsonRequestBehavior == JsonRequestBehavior.DenyGet &&
String.Equals(context.HttpContext.Request.HttpMethod, "GET", StringComparison.OrdinalIgnoreCase))
{
throw new InvalidOperationException(JsonRequest_GetNotAllowed);
}
var response = context.HttpContext.Response;
if (!String.IsNullOrEmpty(ContentType))
{
response.ContentType = ContentType;
}
else
{
response.ContentType = "application/json";
}
if (ContentEncoding != null)
{
response.ContentEncoding = ContentEncoding;
}
if (Data != null)
{
var serializer = new JavaScriptSerializer
{
MaxJsonLength = MaxJsonLength,
RecursionLimit = RecursionLimit
};
serializer.RegisterConverters(new List<JavaScriptConverter> { new EFJavaScriptConverter(MaxSerializationDepth) });
response.Write(serializer.Serialize(Data));
}
}
}
You can also detach the object from the context and it will remove the navigation properties so that it can be serialized. For my data repository classes that are used with JSON I use something like this:
public DataModel.Page GetPage(Guid idPage, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idPage == idPage
select p;
if (results.Count() == 0)
return null;
else
{
var result = results.First();
if (detach)
DataContext.Detach(result);
return result;
}
}
By default the returned object will have all of the complex/navigation properties, but by setting detach = true it will remove those properties and return the base object only. For a list of objects the implementation looks like this:
public List<DataModel.Page> GetPageList(Guid idSite, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idSite == idSite
select p;
if (results.Count() > 0)
{
if (detach)
{
List<DataModel.Page> retValue = new List<DataModel.Page>();
foreach (var result in results)
{
DataContext.Detach(result);
retValue.Add(result);
}
return retValue;
}
else
return results.ToList();
}
else
return new List<DataModel.Page>();
}
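A hypothetical caller of the detach-based methods above (the repository/wrapper class name is assumed), serializing straight to JSON:
// Hypothetical usage: detach strips the navigation properties before serialization
var repo = new PageRepository();
var pages = repo.GetPageList(idSite, detach: true);
var js = new System.Web.Script.Serialization.JavaScriptSerializer();
string json = js.Serialize(pages); // no navigation properties left to cause circular references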
I have just successfully tested this code.
It may be that in your case your Message object is in a different assembly? The overridden SupportedTypes property returns only the types in its own assembly, so when Serialize is called the JavaScriptSerializer falls back to the standard JavaScriptConverter.
You should be able to verify this by debugging.
Your error occurred due to some "Reference" classes generated by EF for entities with 1:1 relations, which the JavaScriptSerializer failed to serialize.
I've used a workaround by adding a new condition:
!p.Name.EndsWith("Reference")
The code to get the complex properties looks like this:
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!p.Name.EndsWith("Reference") &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
Hope this helps you.
I had a similar problem with pushing my view via Ajax to UI components.
I also found and tried to use that code sample you provided. Some problems I had with that code:
SupportedTypes wasn't grabbing the types I needed, so the converter wasn't being called
If the maximum depth is hit, the serialization would be truncated
It threw out any other converters I had on the existing serializer by creating its own new JavaScriptSerializer
Here are the fixes I implemented for those issues:
Reusing the same serializer
I simply reused the existing serializer that is passed into Serialize to solve this problem. This broke the depth hack though.
Truncating on already-visited, rather than on depth
Instead of truncating on depth, I created a HashSet<object> of already seen instances (with a custom IEqualityComparer that checked reference equality). I simply didn't recurse if I found an instance I'd already seen. This is the same detection mechanism built into the JavaScriptSerializer itself, so worked quite well.
The only problem with this solution is that the serialization output isn't very deterministic. The order of truncation is strongly dependent on the order in which reflection finds the properties. You could solve this (with a perf hit) by sorting before recursing.
SupportedTypes needed the right types
My JavaScriptConverter couldn't live in the same assembly as my model. If you plan to reuse this converter code, you'll probably run into the same problem.
To solve this I had to pre-traverse the object tree, keeping a HashSet<Type> of already seen types (to avoid my own infinite recursion), and pass that to the JavaScriptConverter before registering it.
Looking back on my solution, I would now use code generation templates to create a list of the entity types. This would be much more foolproof (it uses simple iteration), and have much better perf since it would produce a list at compile time. I'd still pass this to the converter so it could be reused between models.
My final solution
I threw out that code and tried again :)
I simply wrote code to project onto new types ("ViewModel" types - in your case, it would be service contract types) before doing my serialization. The intention of my code was made more explicit, it allowed me to serialize just the data I wanted, and it didn't have the potential of slipping in queries by accident (e.g. serializing my whole DB).
My types were fairly simple, and I didn't need most of them for my view. I might look into AutoMapper to do some of this projection in the future.
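A minimal sketch of that projection approach applied to the query from the question; MessageViewModel and its properties are hypothetical stand-ins for whatever fields the client actually needs:
// Hypothetical DTO - only the fields the client needs, no navigation properties
public class MessageViewModel
{
    public int Id { get; set; }
    public string Subject { get; set; }
    public string Body { get; set; }
}
// Project inside the query so only these columns are fetched and serialized
var messages = (from m in db.MessageRecipients
                where m.OfficerId == officerId
                select new MessageViewModel
                {
                    Id = m.Message.Id,          // assumed property names
                    Subject = m.Message.Subject,
                    Body = m.Message.Body
                }).ToList();
return new JavaScriptSerializer().Serialize(messages);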