How to add an entity with an existing entity attached to it - entity-framework

I want to add a Payment entity, containing an EXISTING Currency object, to the EF database:

public class Payment
{
    public int Id { get; set; }
    public int Value { get; set; }
    public Currency SelectedCurrency { get; set; }
}

public class Currency
{
    public int Id { get; set; }
    public string Name { get; set; }
}
Suppose I have an existing Currency attached to a new Payment entity. When I add such a Payment, this error appears:

Violation of PRIMARY KEY constraint 'PK_dbo.Currency'. Cannot insert duplicate key in object 'dbo.MwbeCurrency'. The duplicate key value is (GBP). The statement has been terminated.
How can I add a higher-level entity with an existing lower-level entity attached?
My code for adding entity is:
public virtual TEntity Add(TEntity entity)
{
    return DbSet.Add(entity);
}

public void SaveChanges()
{
    Context.SaveChanges();
}

I suspect you retrieved Currency with a different context instance than the one that retrieved Payment and did something like this:

payment.Currency = retrievedCurrency;

Therefore, the Payment context thinks that Currency is a new object and tries to persist it. Since it already exists, you get a PRIMARY KEY violation.
If you want to persist Payment correctly, add the following lines:
if (payment.Currency != null && payment.Currency.Id != 0)
{
    context.Entry(payment.Currency).State = EntityState.Unchanged;
}
although it would probably be cleaner to retrieve Payment and Currency with the same context, so you can persist them appropriately.
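For the cleaner single-context route, a minimal sketch (the context, DbSet, and value names here are assumptions for illustration, not the asker's actual code):

```csharp
// Load the existing Currency through the same context that will save
// the Payment, so EF already tracks it as an existing row.
using (var context = new PaymentsContext()) // assumed context name
{
    var gbp = context.Currencies.Single(c => c.Name == "GBP");

    var payment = new Payment { Value = 100, SelectedCurrency = gbp };

    // Only the Payment is inserted; the tracked Currency is left alone.
    context.Payments.Add(payment);
    context.SaveChanges();
}
```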

Calling DbSet.Add(entity) adds the entire graph for persistence, which means it will go through all the navigation properties of entity and set each one's state to EntityState.Added.
While the other answer might work, a better approach is to change the way you are adding the objects, and be explicit about what entities you are adding / updating / etc.
To do this, change:
public virtual void Add(TEntity entity)
{
    DbSet.Add(entity);
}
To:
public virtual void Add(TEntity entity)
{
    context.Entry(entity).State = EntityState.Added;
}
This will add only the supplied entity. If one of the objects in your navigation properties is also new, call .Add(entity) on it as well.
If you do need to add the entire graph in other situations, you can add an additional method that works the way your original one does, but has a better name to indicate its function:
public virtual void AddGraph(TEntity entity)
{
    DbSet.Add(entity);
}
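To make the two methods' behaviour concrete, a hedged usage sketch (the repository and entity variable names are illustrative):

```csharp
// Payment references an EXISTING Currency: add only the Payment itself.
// Entry(payment).State = Added leaves the Currency's state untouched.
repository.Add(payment);

// Payment AND its Currency are both new: add the whole graph, so that
// DbSet.Add marks every reachable entity as Added.
repository.AddGraph(paymentWithNewCurrency);

repository.SaveChanges();
```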
Good Luck
Update
Additionally, since it looks like you are using a repository, I prefer to disable automatic change detection by setting context.Configuration.AutoDetectChangesEnabled = false;. If you modify a property on an entity that you want persisted, you then need to set the entity's state to Modified, like so:
public virtual void Update(TEntity entity)
{
    context.Entry(entity).State = EntityState.Modified;
}

Related

.NET 6 EF deleting DB record on an update?

The entity/model has a child object. During ADD (POST) operations, where I just want the parent object to be written to the database, I simply set the child object to null. The parent object is added to the database just fine and the child object doesn't touch the database.
However, when I do an UPDATE (PUT) and set the same child object to null, the parent object is actually deleted from the database and the child object is not touched in the database. Why?
Model code:
namespace PROJ.API.Models
{
    public partial class Todo
    {
        public Todo()
        {
        }

        public long TdoId { get; set; }
        public string TdoDescription { get; set; } = null!;
        public long PtyId { get; set; }
        public virtual Priority? Priority { get; set; }
    }

    public partial class Priority
    {
        public Priority()
        {
        }

        public long PtyId { get; set; }
        public byte PtyLevel { get; set; }
        public string PtyDescription { get; set; } = null!;
    }
}
Entities code:
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

namespace PROJ.API.Entities
{
    public class Todo
    {
        [Key]
        [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
        public long TdoId { get; set; }
        public string TdoDescription { get; set; } = null!;
        public long PtyId { get; set; }
        [ForeignKey("PtyId")]
        public virtual Priority? Priority { get; set; }
    }

    public class Priority
    {
        [Key]
        [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
        public long PtyId { get; set; }
        public byte PtyLevel { get; set; }
        public string PtyDescription { get; set; } = null!;
    }
}
Repository code:
public async Task<Todo?> GetTodoAsync(long tdoId)
{
    var todo = await _context.Todo.Where(c => c.TdoId == tdoId)
        .Include(x => x.Priority)
        .FirstOrDefaultAsync();
    return todo;
}
Controller code:
[HttpPut()] // UPDATE
public async Task<ActionResult> UpdateTodoAsync(Todo todo)
{
    var eTodo = await _myRepository.GetTodoAsync(todo.TdoId);
    if (todo.Priority == null || todo.Priority.PtyId == 0)
    {
        var priority = await _myRepository.GetPriorityAsync(todo.PtyId);
        if (priority != null)
        {
            _mapper.Map(priority, todo.Priority);
        }
    }
    _mapper.Map(todo, eTodo);
    await _myRepository.SaveChangesAsync();
    return NoContent();
}
My understanding is that setting the child object to null tells EF NOT to perform any operation on it in the database. TODO.PtyId is set up with a FK to PRIORITY.PtyId in the SQL database, but I have NOT defined this in the context (OnModelCreating) as I don't "think" I need the Fluent API approach here.
Any thoughts on what I'm doing wrong and/or why an UPDATE is actually deleting a record when I set a child object to NULL? As I noted before, an ADD using the same null approach works just fine.
A couple of things.
Given your naming convention, you should be explicitly nominating your FK, either by attribute or fluent declaration. EF's convention is to base FK names on the "type" of the relationship, not the property name. So for instance if you have:
public virtual Priority? Pty { get; set; }
EF will be looking for a FK named Priority_ID or PriorityID, not PtyID. This behaviour may have changed in EF Core; I honestly haven't delved back into whether EF conventions can be trusted to work this out.
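For reference, the FK can be nominated explicitly with the fluent API; a sketch against the Todo/Priority classes above:

```csharp
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Tie Todo.PtyId to the Todo -> Priority relationship explicitly,
    // instead of relying on EF's FK naming conventions.
    modelBuilder.Entity<Todo>()
        .HasOne(t => t.Priority)
        .WithMany()
        .HasForeignKey(t => t.PtyId);
}
```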
Lastly, this is overall a typical example of issues that can come up whenever you mix concerns with entities and use detached entities as view models. It's also outlining an issue with your repository implementation. In your case you are detaching an entity, then when passing it back to the server to update, loading the entity from data state, and using Automapper to copy the values across.
The first problem is that your repository is automatically and unconditionally eager-loading the related entity when, in this example at least, you don't need or want it. When EF eager-loads a relationship and you then set that related entity to #null, the proxy records this action and EF will interpret it as "remove this relationship". If the related entity is not loaded/associated and left as #null when saving the top-level entity, nothing is removed. Either way, you will want to avoid setting a related entity to #null if you don't want to save changes to it. The solution is either to not load related entities in the first place, to ignore mapping across related entities, or to mark those entities as Unchanged to avoid persisting changes to them.
Not loading Related entities:
This could either be solved by adding arguments to indicate what should be eager loaded, or considering adopting IQueryable for the repository method:
Argument:
public async Task<Todo?> GetTodoAsync(long tdoId, bool includeRelated = true)
{
    var query = _context.Todo.Where(c => c.TdoId == tdoId);
    if (includeRelated)
    {
        query = query.Include(c => c.Priority);
    }
    return await query.FirstOrDefaultAsync();
}
In simple cases this isn't too bad, but with more complex entities it can be a pain, especially if you want to selectively include relatives. This way, when you load eTodo from data state, you can tell it not to eager-load the Priority. This isn't foolproof, as a Priority could still be associated if that DbContext instance had previously loaded the Priority associated with that Todo; tracked entities will be associated even if you don't explicitly eager-load them. To be safe, this should be combined with the Automapper changes further below ("Excluding mapping changes"). It is still a worthwhile change, as you avoid the resource/performance cost of unconditionally eager-loading on every read.
IQueryable:
public IQueryable<Todo> GetTodoById(long tdoId)
{
    var query = _context.Todo.Where(c => c.TdoId == tdoId);
    return query;
}
IQueryable gives your consumer a lot more flexibility into what it wants to do with regards to data that will be coming back, but it does require a shift in thinking around the unit of work pattern to move the scope of the DbContext out into a Unit of Work so that consumers are responsible for that scope rather than at the individual repository level. The advantages of this approach are that the unit of work (DbContext scope) can be shared across repositories if needed, and with this pattern your consumer has control over things like:
Projection using Select or Count, Any, etc.
async vs. synchronous operations.
Assessing whether or not to eager load related entities.
So as an example with this pattern, the controller or service code would function more like:
[HttpPut()] // UPDATE
public async Task<ActionResult> UpdateTodoAsync(Todo todo)
{
    using (var contextScope = _contextScopeFactory.Create())
    {
        var eTodo = await _myRepository.GetTodoById(todo.TdoId)
            .SingleAsync();
        _mapper.Map(todo, eTodo);
        await contextScope.SaveChangesAsync();
        return NoContent();
    }
}
contextScope / _contextScopeFactory come from a UoW pattern called DbContextScope by Mehdi El Gueddari for EF6, which has a number of forks covering EF Core. I like this pattern because it gives the repository a dependency on a locator to resolve the DbContext from the scope rather than passing around DbContext instances, giving that scope full control over whether or not SaveChanges() gets committed. Leveraging IQueryable also enables projection, so it can help avoid this issue altogether when reading data for a view: project to a ViewModel using Automapper's ProjectTo rather than sending entities to a view, where they come back to the controller as deserialized and typically incomplete shells of what they once were.
Excluding mapping changes:
This involves adjusting the mapping you use so that it excludes changes to the related entity when mapping one Todo across to another. If the mapping ignores Todo.Pty, then you can map just the Todo fields from the one instance to the DB instance and save the DB instance without change tracking detecting a change in Pty or the relationship.
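With AutoMapper, such an ignore might look like the following sketch (the profile name is an assumption; it uses the Priority navigation property from the model above):

```csharp
public class TodoProfile : Profile
{
    public TodoProfile()
    {
        // Copy the scalar Todo fields, but never overwrite the tracked
        // Priority navigation (or the relationship) on the destination.
        CreateMap<Todo, Todo>()
            .ForMember(dest => dest.Priority, opt => opt.Ignore());
    }
}
```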
Marking as Unchanged:
Given that your repository is managing the scope of the DbContext, what you will likely need to do is add a method to isolate changes to just the top-level entity. Since the repository scopes the DbContext, this means a somewhat clunky method, because we need to pass the entity in to tweak its tracking.
// eTodo.Pty = null; // don't do this
_myRepository.IgnoreRelatedChanges(eTodo);
await _myRepository.SaveChangesAsync();
then...
public void IgnoreRelatedChanges(Todo todo)
{
    _context.Entry(todo.Pty).State = EntityState.Unchanged;
}
The trouble with this approach is that it is clumsy and prone to bugs/exceptions.
In any case that should provide you with some options to consider to solve your issue, and possibly consider for updating your repository pattern.

EFCore - Why do I have to null child objects to stop them inserting? One-to-many

A small amount of context: I have been using NHibernate mapping-by-code for a few years; in the last few months I have started using Entity Framework Core.
I'm trying to understand why I have to null child objects to stop them inserting new records. I'm not sure if it's an understanding issue on my part or if this is just how Entity Framework works.
I have two classes, Command and CommandCategory. Command has a single CommandCategory and CommandCategory can have many commands. For example, The command "set timeout" would go under the "Configuration" category. Similarly, the "set URL" command would also go under the "Configuration" category.
class Command
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string CommandString { get; set; }
    public Guid CommandCategoryId { get; set; }
    public CommandCategory CommandCategory { get; set; }
}

class CommandCategory
{
    public CommandCategory(string id, string name)
    {
        Id = Guid.Parse(id);
        Name = name;
        Commands = new List<Command>();
    }

    public Guid Id { get; set; }
    public string Name { get; set; }
    public ICollection<Command> Commands { get; set; }
}
My DbContext is setup like so:
class EfContext : DbContext
{
    private const string DefaultConnection = "XXXXX";

    public virtual DbSet<Command> Command { get; set; }
    public virtual DbSet<CommandCategory> CommandCategory { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        if (!optionsBuilder.IsConfigured)
        {
            optionsBuilder.UseSqlServer(DefaultConnection);
            optionsBuilder.EnableSensitiveDataLogging();
        }
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Command>()
            .HasOne(x => x.CommandCategory)
            .WithMany(x => x.Commands);
    }
}
Then here is the code that actually runs it all. First I call Add(). Add creates a new Command and adds it to the database. It also creates a CommandCategory called "Configuration" and inserts both correctly.
Next I call AddWithExisting(). This creates a new Command but uses the existing CommandCategory. When it tries to add to the database, it first inserts the Command and then tries to insert the CommandCategory. Because the CommandCategory.Id already exists, and it's set up as the primary key, this fails with a duplicate key error. To get around this I have to make sure the CommandCategory property on the Command object is set to null; this then inserts only the Command to the database and not the CommandCategory object.
I know usually you wouldn't create a new CommandCategory object, but in this instance I am simulating the object coming up from the client via an ApiController. My application sends data back and forth via WebApi so the object is basically being created new when a request is made.
Nulling the property seems like a strange thing to do; I thought the point of object-relational mapping was to avoid dealing with individual properties like this.
Is this how it's supposed to function, or am I doing something wrong?
class Program
{
    static void Main(string[] args)
    {
        var dbContext = new EfContext();
        Add(dbContext);
        AddWithExisting(dbContext);
        Console.WriteLine("Hello World!");
    }

    private static void Add(EfContext dbContext)
    {
        var newCommand = new Command();
        newCommand.Id = Guid.NewGuid();
        newCommand.Name = "set timeout";
        newCommand.CommandString = "timeout:500;";

        var newCommandCategory = new CommandCategory("8C0D0E31-950E-4062-B783-6817404417D4", "Configuration");
        newCommandCategory.Commands.Add(newCommand);
        newCommand.CommandCategory = newCommandCategory;

        dbContext.Command.Add(newCommand);
        dbContext.SaveChanges();
    }

    private static void AddWithExisting(EfContext dbContext)
    {
        var newCommand = new Command();
        newCommand.Id = Guid.NewGuid();
        newCommand.Name = "set URL";
        newCommand.CommandString = "url:www.stackoverflow.com";

        // This uses the same Id and Name as the existing category, to simulate a REST call coming up with all the data.
        var newCommandCategory = new CommandCategory("8C0D0E31-950E-4062-B783-6817404417D4", "Configuration");
        newCommandCategory.Commands.Add(newCommand);

        // If I don't null the below line, it will insert to the database a second time
        newCommand.CommandCategory = newCommandCategory;
        newCommand.CommandCategoryId = newCommandCategory.Id;

        dbContext.Command.Add(newCommand);
        dbContext.SaveChanges();
    }
}
This is by design; you can do one of two things here:
You can look up the existing command category from the DB and set that as the property (as this object is 'attached' to the DB context, EF won't create a new one).
Or simply set the ID of the command category on the command.
e.g.
newCommand.CommandCategory = dbContext.CommandCategories.Find("8C0D0E31-950E-4062-B783-6817404417D4");
or
newCommand.CommandCategoryId = new Guid("8C0D0E31-950E-4062-B783-6817404417D4");
At the minute, it is seeing a new command category (not attached) so is trying to create it.
EF doesn't perform insert-or-update checks. Entities are tracked by a DbContext with a state such as Added or Modified. If you interact with a tracked entity or "Add" an entity to the DbContext, all untracked related entities will be treated as Added, resulting in inserts.
The simplest advice I can give is to give EF the benefit of the doubt when it comes to entities and not try to prematurely optimize. It can save headaches.
using (var dbContext = new EfContext())
{
    var newCommand = Add(dbContext);
    AddWithExisting(newCommand, dbContext);
    dbContext.SaveChanges();
    Console.WriteLine("Hello World!");
}

private static Command Add(EfContext dbContext)
{
    var newCommand = new Command
    {
        Id = Guid.NewGuid(), // Should either let the DB set this by default, or use a sequential ID implementation.
        Name = "set timeout",
        CommandString = "timeout:500;"
    };

    var commandCategoryId = new Guid("8C0D0E31-950E-4062-B783-6817404417D4");
    var commandCategory = dbContext.CommandCategory
        .FirstOrDefault(x => x.Id == commandCategoryId)
        ?? new CommandCategory(commandCategoryId.ToString(), "Configuration");

    newCommand.CommandCategory = commandCategory;
    dbContext.Command.Add(newCommand);
    return newCommand;
}

private static Command AddWithExisting(Command command, EfContext dbContext)
{
    var newCommand = new Command
    {
        Id = Guid.NewGuid(),
        Name = "set URL",
        CommandString = "url:www.stackoverflow.com",
        CommandCategory = command.CommandCategory
    };
    dbContext.Command.Add(newCommand);
    return newCommand;
}
So what's changed here?
First the DbContext reference is Disposable, so it should always be wrapped with a using block. Next, we create the initial Command, and as a safety measure to avoid an assumption we search the context for an existing CommandCategory by ID and associate that, otherwise we create the command category and associate it to the Command. 1-to-many relationships do not need to be bi-directional, and even if you do want bi-directional relationships you don't typically need to set both references to each other if the mappings are set up correctly. If it makes sense to ever load a CommandCategory and navigate to all commands using that category then keep it, but even to query all commands for a specific category, that is easy enough to query from the command level. Bi-directional references can cause annoying issues so I don't recommend using them unless they will be really necessary.
We return the new command object back from the first call and pass it into the second. We really only needed to pass the reference to the CommandCategory loaded/created in the first call, but in case it makes sense to check/copy info from the first command, I used this example. We create the new additional command instance and set its CommandCategory reference to the same instance as the first one, then return the new command as well (we don't use that second reference here). The important difference between this and what you had tried is that the CommandCategory here points to the same reference, not two references with the same ID. EF will track this instance as it is associated/added and wire up the appropriate SQL.
Lastly, note that the SaveChanges call is moved outside of the two calls. A context generally should only save changes once in its lifetime; everything gets committed together. Having multiple SaveChanges calls is usually a smell where developers want to manually wire up associations whose keys are auto-generated by the DB (identity or defaults). Provided relationships are mapped correctly with navigation properties and their FKs, EF is quite capable of managing these automatically. This means that if you set up your DB to default your Command IDs to newsequentialid(), for instance, and tell EF to treat the PK as an identity column, EF will handle this all automatically. This goes for associating those new PKs as FKs to related entities as well: no need to save the parent record so the parent ID can be set in the child entities; map it, associate them, and let EF take care of it.

EF6 proxy's reference is sometimes null when entity's IValidatableObject.Validate method is called

An EF6 proxy has a reference that is sometimes null when my entity's IValidatableObject.Validate method is called by the DbContext's SaveChangesAsync method.
Running the same exact code multiple times results in different behavior. If I check my stock's Sku property (i.e. stock.Sku == null) outside of the Validate method it always returns a materialized entity. If I do not do that and only check this.Sku within the Validate method then this.Sku will sometimes be null for the exact same entity. And by "exact same entity" I mean that I am testing the one stock multiple times that has the same Id and SkuId across all test runs. I'm not creating a new stock here or changing the value of its SkuId property. The one thing I am doing is calling the stock's ChangeQuantity method and then saving changes.
My best guess at this point is that once save changes is called all entity and reference materialization is frozen. If the Sku property has not already been accessed at least once then it will be null and remain null when the DB context's save changes code calls my object's Validate method.
My questions are: Why is this happening and why can't I depend on that property being available to be lazy loaded at anytime?
public abstract class StockBase : RecordBase
{
    // Snipped //

    [Required, Display(Name = "SKU")]
    public Guid SkuId { get; set; }

    [Display(Name = "SKU")]
    public virtual Sku Sku { get; protected set; }

    [Required]
    public int Quantity { get; private set; }

    [DataType("StockActions")]
    public virtual ICollection<StockAction> Actions { get; private set; }

    public void ChangeQuantity(DateTime logged, Guid loggedById, int changeInQuantity, string notes = null)
    {
        TrackChange(logged, loggedById);
        Quantity += changeInQuantity;
        Actions.Add(new StockAction(logged, loggedById, changeInQuantity));
    }
}

public class StandardStock : StockBase, IValidatableObject
{
    // Snipped //

    public IEnumerable<ValidationResult> Validate(ValidationContext validationContext)
    {
        // Right here is where `this.Sku` is sometimes null!
        if (Sku.IsExpiringStock)
        {
            throw new InvalidOperationException("Standard stock must have a non-expiring SKU.");
        }
        yield break;
    }
}
Unfortunately, lazy loading is disabled during validation when performed through Entity Framework.
https://msdn.microsoft.com/en-us/data/gg193959#Considerations
It looks like you are using lazy loading to populate the Sku object. When you test the Sku property manually, you are forcing the lazy loading to run, and the value gets materialised. If you're already doing something with the context at the time you're expecting it to be loaded, or the context has been disposed, then it will remain as null.
If you always need to populate this property, consider explicitly loading it when you load the entity. This will stop your lazy loading problem, and also eliminate a trip to the database.
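A sketch of what that explicit load could look like (the DbSet and variable names are assumptions):

```csharp
// Eager-load the Sku together with the stock so it is materialised
// before SaveChangesAsync runs validation, where lazy loading is off.
var stock = await context.Stocks
    .Include(s => s.Sku)
    .SingleAsync(s => s.Id == stockId);

stock.ChangeQuantity(DateTime.UtcNow, currentUserId, -5);
await context.SaveChangesAsync(); // Validate can now read this.Sku safely
```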

Getting JSON Serialization Entity Framework Self Reference Loop error even after ProxyCreation false when using explicit Include

JSON Serialization (ASP.Net Web API) fails because of self-referencing loop (it’s a common problem, Reason: an entity being requested lazy loads child entities and every child has a back reference to parent entity).
Workarounds I found that don't help me:
Use [JsonIgnore] for navigation properties to be ignored:
This solution works but doesn’t apply in my case. For Example: To get a Customer information along with his Orders, I would quickly add [JsonIgnore] to Customer property in Order class, but when I want to get an Order information along with the Customer details, since there’s [JsonIgnore] on Customer property, it won’t include Customer details.
Change JSON.Net Serializer Settings to Preserve References:
Can’t Preserve because I don’t need Circular referenced data.
Disable Proxy Creation at the Data Context and use explicit loading(this should ideally solve the problem):
Disabling proxy creation stops lazy loading and returns data without error, but when I explicitly Include child entities, I again get the unexpected self-referencing loop error! The error occurs at the back-reference to the parent entity.
Any experiences along the same lines/suggestions?
I tried all the suggested solutions but none worked. I ended up overriding the JSON.Net serializer's DefaultContractResolver like this:
public class FilterContractResolver : DefaultContractResolver
{
    Dictionary<Type, List<string>> _propertiesToIgnore;

    public FilterContractResolver(Dictionary<Type, List<string>> propertiesToIgnore)
    {
        _propertiesToIgnore = propertiesToIgnore;
    }

    protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
    {
        var property = base.CreateProperty(member, memberSerialization);
        List<string> toIgnore;
        property.Ignored |= (_propertiesToIgnore.TryGetValue(member.DeclaringType, out toIgnore)
            || _propertiesToIgnore.TryGetValue(member.DeclaringType.BaseType, out toIgnore))
            && toIgnore.Contains(property.PropertyName);
        return property;
    }
}
Then I created a static class which returns a dictionary of properties to be ignored, based on the controller:
public static class CriteriaDefination
{
    private static Dictionary<string, Dictionary<Type, List<string>>> ToIgnore = new Dictionary<string, Dictionary<Type, List<string>>>
    {
        {
            "tblCustomer", new Dictionary<Type, List<string>>
            {
                {
                    typeof(tblCustomer), new List<string>
                    {
                        // include all
                    }
                },
                {
                    typeof(tblOrder), new List<string>
                    {
                        "tblCustomer" // ignore back reference to tblCustomer
                    }
                }
            }
        },
        {
            "tblOrder", new Dictionary<Type, List<string>>
            {
                {
                    typeof(tblCustomer), new List<string>
                    {
                        "tblOrders" // ignore back reference to tblOrders
                    }
                },
                {
                    typeof(tblOrder), new List<string>
                    {
                        // include all
                    }
                }
            }
        }
    };

    public static Dictionary<Type, List<string>> IgnoreList(string key)
    {
        return ToIgnore[key];
    }
}
And inside every controller, change the JSON formatter to something like:
GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ContractResolver = new FilterContractResolver(CriteriaDefination.IgnoreList("tblCustomer"));
This is what I ended up settling on, hopefully it helps someone else.
Say the EF classes are structured like this:
public partial class MyEF
{
    public virtual ICollection<MyOtherEF> MyOtherEFs { get; set; }
}

public partial class MyOtherEF
{
    public virtual MyEF MyEF { get; set; }
}
To keep serialization from happening in JSON.NET, you can extend the class and add a method named "ShouldSerialize" + property name, like so:

public partial class MyEF
{
    public bool ShouldSerializeMyOtherEFs() { return false; }
}
If you wanted to get a little more fancy, you could add logic in the method so that it would serialize in certain cases. This allows you to keep serialization logic out of the EF Model First code creation as long as this code is in a different physical code file.
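For instance, a conditional variant might look like this sketch (the IncludeChildren flag is an invented example, not part of the model):

```csharp
public partial class MyEF
{
    // Hypothetical, non-mapped flag deciding whether children serialize.
    public bool IncludeChildren { get; set; }

    public bool ShouldSerializeMyOtherEFs() { return IncludeChildren; }
}
```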
Instead of letting the Entity Framework generate the model, use Code First with an existing database. Now you are more in control.
See this blog entry from Scott Guthrie

GenericRepository TEntity change attribute value

I am using EF 5.0 and the model-first approach. I have built a GenericRepository that has the basic get, insert, delete, etc. statements. Like:

public virtual void Insert(TEntity entity)
{
    dbSet.Add(entity);
}
My EF entities all have the attributes Modified and ModifiedBy. Now I want to change these values every time I save an entity.
Is it possible to modify these two attributes (set the value) without writing a specific implementation every time?
Thank you
I see two options for you to do this, but they both entail either introducing a base type or an interface for all of your entities to cover them in a generic function. I would prefer an interface, although each entity would have to implement it again and again.
Let's say you create
interface IAuditable
{
    DateTime Modified { get; set; }
    string ModifiedBy { get; set; } // User id?
}
Now you can do:
// Note: the constraint belongs on the repository's type parameter declaration,
// e.g. public class GenericRepository<TEntity> where TEntity : class, IAuditable
public virtual void Insert(TEntity entity)
{
    entity.Modified = DateTime.Now;
    entity.ModifiedBy = ???? // Whatever you get the name from
    ...
}
(Same for edit)
You can also subscribe to the context's SavingChanges event:
// In the constructor:
context.SavingChanges += this.context_SavingChanges;

private void context_SavingChanges(object sender, EventArgs e)
{
    foreach (var auditable in context.ObjectStateManager
        .GetObjectStateEntries(EntityState.Added | EntityState.Modified)
        .Select(entry => entry.Entity)
        .OfType<IAuditable>())
    {
        auditable.Modified = DateTime.Now;
        auditable.ModifiedBy = ????;
    }
}
If you work with DbContext you can get to the event by
((IObjectContextAdapter)this).ObjectContext.SavingChanges
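Alternatively, with DbContext you can centralise the same logic by overriding SaveChanges; a sketch, assuming the IAuditable interface above:

```csharp
public override int SaveChanges()
{
    // Stamp every added or modified auditable entity before saving.
    var auditables = ChangeTracker.Entries()
        .Where(e => e.State == EntityState.Added || e.State == EntityState.Modified)
        .Select(e => e.Entity)
        .OfType<IAuditable>();

    foreach (var auditable in auditables)
    {
        auditable.Modified = DateTime.Now;
        // auditable.ModifiedBy = ...; resolve the current user here
    }

    return base.SaveChanges();
}
```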
I'd like to add that more reliable time tracking can (and maybe should) be achieved by database triggers. Now you depend on a client's clock.
You can do this using the following code in all the repository methods where you want it:
public virtual void Edit(TEntity entity)
{
    entity.Modified = DateTime.Now;
    entity.ModifiedBy = User.Identity.Name;
    // Other saving-to-repository code
}