I am writing integration tests and I want to use transaction scope.
We use EF and Repositories with Contexts.
If I have one Repository and one Context, it would look like this:
[TestInitialize]
public void RuleEngineTestsStart() {
customContext = new CustomContext();
transaction = customContext.Database.BeginTransaction();
repo = new CustomRepository(customContext);
// I need to make this context to work in the same transaction as above
anotherContext = new AnotherContext();
anotherRepo = new AnotherRepository(anotherContext);
}
At the end of the tests (TestCleanup) I would like to call transaction.Rollback() to roll everything back.
I want all repositories to share the same transaction even though they work with different contexts. Is that possible? How do I create a transaction and 'send' it to all three contexts?
Please don't suggest using one Context for all repositories; that is not possible in our case (we want each context to have its own DbSets so that they can later be used within microservices).
Edit
In the comments I was asked to include more code; however, I don't think it is necessary to answer my question.
customContext = new CustomContext();
repo = new CustomRepository(customContext);
customContext2 = new CustomContext2();
otherRepository = new CustomRepository2(customContext2);
// class to be tested needs both repositories
ToBeTestedClass cl = new ToBeTestedClass(repo, otherRepository);
// "BASE" interface
public interface IRepository<TEntity> where TEntity : class
{
TEntity GetById(long id);
IEnumerable<TEntity> GetByFilter(Expression<Func<TEntity, bool>> predicate);
TEntity GetSingleByFilter(Expression<Func<TEntity, bool>> filter);
void Insert(TEntity entity);
void Delete(long id);
void Update(TEntity entity);
...
}
// BASE CLASS
public class Repository<TEntity> : IRepository<TEntity> where TEntity : class
{
protected readonly DbContext _context;
protected readonly DbSet<TEntity> _dbSet;
public Repository(ColldeskDbContext context)
{
_context = context;
_dbSet = context.Set<TEntity>();
}
// GetSingle, GetAll, Insert, Update etc.
}
// CustomRepository (other Repositories are similar, with custom methods)
public interface ICustomRepository : IRepository<CustomData>
{
// some specific methods that are not in Base class
}
public class CustomRepository: Repository<CustomData>, ICustomRepository
{
public CustomRepository(CustomContext context) : base(context)
{
}
// custom methods that are specific for given context
}
// Contexts - each context has its own DbSets
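To make the setup concrete, here is a minimal sketch of what one of those contexts might look like (only CustomData is taken from the code above; everything else is illustrative):
public class CustomContext : DbContext
{
    // Only the sets that belong to this context.
    public DbSet<CustomData> CustomData { get; set; }
    // ... other DbSets owned exclusively by this context ...
}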
Don't use dbContext.SaveChanges() in your repositories. Use ONE dbContext when creating repositories. Sample:
using ( var db = new YourDbContext() )
{
// Create and begin transaction
using ( var transaction = db.Database.BeginTransaction() )
{
try
{
// ONE dbContext for all repositories
var firstRepo = new Custom1Repository(db);
var secondRepo = new Custom2Repository(db);
City city = new City { Description = "My city" };
Street street = new Street { Description = "My street", City = city};
firstRepo.Insert(city);
secondRepo.Insert(street);
// Save all your changes and after that commit transaction
db.SaveChanges();
transaction.Commit();
}
catch ( Exception ec)
{
transaction.Rollback();
}
}
}
Done like this, your repositories become just thin wrappers over DbSet<TEntity>.
I have figured out that I can simply use TransactionScope like this:
private TransactionScope _scope;
[TestInitialize]
public void TestInitialize()
{
_scope = new TransactionScope();
}
[TestCleanup]
public void TestCleanup()
{
_scope.Dispose();
}
And then each Context would be running within this TransactionScope.
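For completeness, a minimal sketch of how the two contexts from the question can be used inside that ambient scope (only the context and repository names come from the question; the test body is illustrative):
[TestMethod]
public void InsertsAreRolledBackAfterTheTest()
{
    // Both contexts are created while the ambient TransactionScope from
    // TestInitialize is active, so they enlist in the same transaction.
    using (var customContext = new CustomContext())
    using (var anotherContext = new AnotherContext())
    {
        var repo = new CustomRepository(customContext);
        var anotherRepo = new AnotherRepository(anotherContext);

        // ... exercise the class under test with both repositories ...

        customContext.SaveChanges();
        anotherContext.SaveChanges();
    }

    // TestCleanup disposes the scope without calling Complete(),
    // so everything saved above is rolled back.
}
Be aware that two separate connections inside one TransactionScope can escalate to a distributed transaction (MSDTC), depending on the database and provider.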
Related
I have an Azure WebJob running continuously that performs CRUD operations against my database. I'm using Entity Framework with the UnitOfWork pattern, and in my WebJob I use Autofac to inject my dependencies (service and repository layers). I'm having some issues with stale data when running my WebJob.
Example:
I update a record on my website and my WebJob is then kicked off, but the WebJob can't see this change in the database. It sees the record as it was prior to the change.
To fix this I tried to inject my custom context like this:
builder.RegisterType<PCContext>().As<IPCContext>().InstancePerDependency();
After doing that I can see the newest changes in the database, but now I have another issue: when I insert a new record from my WebJob and then read it back, I can't see the new record. This worked fine before I injected my context as shown above.
If I create a new context in my WebJob function I can read the updates from the database, but I want to use my service layer instead like this:
_services.UserExport.ExportUsers();
I can't figure out what I'm doing wrong here. Basically, every time my WebJob function is kicked off I want a new context to be created, so I'm sure I have the newest data from the database, and I want to be able to insert into the database and then read that data back in my WebJob using my service layer.
Can someone point me in the right direction?
Note that my WebJob is continuous, so its Autofac registration code is executed only once when the WebJob starts, not every time a function in the WebJob is executed.
Please let me know if more description or code is necessary.
Thanks.
Based on your description, I tested a similar scenario on my side and found I could read and update my database as expected. I defined my generic Repository and UnitOfWork as follows; you can refer to them:
Repository:
public interface IRepository<T>
{
T GetById(object id);
IQueryable<T> GetAll();
void Edit(T entity);
void Insert(T entity);
void Delete(T entity);
}
public class Repository<T> : IRepository<T> where T : class
{
public DbContext context;
public DbSet<T> dbset;
public Repository(DbContext context)
{
this.context = context;
dbset = context.Set<T>();
}
public T GetById(object id)
{
return dbset.Find(id);
}
public IQueryable<T> GetAll()
{
return dbset;
}
public void Insert(T entity)
{
dbset.Add(entity);
}
public void Edit(T entity)
{
context.Entry(entity).State = EntityState.Modified;
}
public void Delete(T entity)
{
context.Entry(entity).State = EntityState.Deleted;
}
}
UnitOfWork:
public class UnitOfWork : IDisposable
{
private DbContext _context;
private Repository<TodoItem> toDoItemRepository;
public Repository<TodoItem> ToDoItemRepository
{
get
{
if (toDoItemRepository == null)
toDoItemRepository = new Repository<TodoItem>(_context);
return toDoItemRepository;
}
}
public UnitOfWork() : this(new BruceDbContext()) { }
public UnitOfWork(DbContext context)
{
_context = context;
}
public void Commit()
{
_context.SaveChanges();
}
#region Dispose
private bool disposed = false;
protected virtual void Dispose(bool disposing)
{
if (!this.disposed)
{
if (disposing)
{
_context.Dispose();
}
}
this.disposed = true;
}
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
#endregion
}
For my WebJob I defined Functions.cs and initialized the JobActivator as follows:
Functions.cs
public class Functions
{
private UnitOfWork _unitOfWork;
public Functions(UnitOfWork unitOfWork)
{
_unitOfWork = unitOfWork;
}
public async Task CronJob([TimerTrigger("0/30 * * * * *")] TimerInfo timer, CancellationToken cancelToken)
{
//retrieve the latest record
var item = _unitOfWork.ToDoItemRepository.GetAll().OrderByDescending(i => i.CreateDate).FirstOrDefault();
Console.WriteLine($"[{item.CreateDate}] {item.Text}");
//insert a new record
_unitOfWork.ToDoItemRepository.Insert(new Entities.TodoItem()
{
Id = Guid.NewGuid().ToString(),
CreateDate = DateTime.Now,
Text = $"hello world -{DateTime.Now}"
});
_unitOfWork.Commit();
//retrieve the previous added record
item = _unitOfWork.ToDoItemRepository.GetAll().OrderByDescending(i => i.CreateDate).FirstOrDefault();
Console.WriteLine($"[{item.CreateDate}] {item.Text}");
}
}
Program.cs
var builder = new ContainerBuilder();
builder.Register<UnitOfWork>(c => new UnitOfWork(new BruceDbContext())).InstancePerDependency();
builder.RegisterType<Functions>();
var container = builder.Build();
var config = new JobHostConfiguration()
{
JobActivator = new AutoFacJobActivator(container)
};
var host = new JobHost(config);
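The AutoFacJobActivator used above is not shown in the answer; here is a minimal sketch of what it could look like, assuming the WebJobs SDK IJobActivator interface:
using Autofac;
using Microsoft.Azure.WebJobs.Host;

public class AutoFacJobActivator : IJobActivator
{
    private readonly IContainer _container;

    public AutoFacJobActivator(IContainer container)
    {
        _container = container;
    }

    public T CreateInstance<T>()
    {
        // Functions (and its UnitOfWork dependency) are resolved from the container;
        // with InstancePerDependency this yields a fresh context per resolution.
        return _container.Resolve<T>();
    }
}
The host is presumably then started with host.RunAndBlock() since this is a continuous WebJob.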
I'm using EF and the MVVM pattern. My question is about the Data Access Layer. In the DAL I have the following classes:
MyObjectContext, which is technically the standard ObjectContext for now, but some unit-of-work methods will be added to it later.
Repository<TModel>, which handles the most-needed queries (such as Add, GetAll, ...) on different ObjectSets.
A bunch of DataServices, which make use of the repositories to provide a higher level of data access for the Core.
The project I'm working on is a business application with about 100 EntitySets so far, and there are times when a single user interaction can involve up to 20 different EntitySets (updating most of them). I currently add .Include(params string[]) to my queries to prevent an ObjectDisposedException, but it doesn't seem like a reliable solution.
The question is: should I create an instance of MyObjectContext (and therefore a Repository) in each DataService method (like the code below, which seems to make the unit-of-work ability useless), or should I create it outside the DataServices and pass it in through their constructors (or directly to each DataService method) so that a group of database actions against different tables and queries can be handled together? And how?
Here's what MyObjectContext looks like:
public class MyObjectContext : ObjectContext, IUnitOfWork
{
public MyObjectContext()
: base("name=EdmContainer", "EdmContainer")
{
ContextOptions.LazyLoadingEnabled = true;
}
#region IUnitOfWork Members
public void Commit()
{
SaveChanges();
}
#endregion
}
This is what Repository looks like:
public class Repository<TModel>
{
private readonly MyObjectContext _context;
public Repository(IUnitOfWork unitOfWork)
{
if (unitOfWork == null)
throw new ArgumentNullException("unitOfWork");
_context = unitOfWork as MyObjectContext;
}
public TModel FirstOrDefault(Expression<Func<TModel, bool>> where)
{
return _context.CreateObjectSet<TModel>().FirstOrDefault(where);
}
public void Add(TModel entity)
{
_context.CreateObjectSet<TModel>().AddObject(entity);
}
...
}
And this is what a common DataService looks like:
public class JobDataService : IDataService<Job>
{
#region IDataService<Job> Members
public Job GetSingle(int id)
{
Job model = null;
using (var context = new MyObjectContext())
{
var repos = new Repository<Job>(context);
model = repos.FirstOrDefault(x => x.Id == id);
}
return model;
}
public IEnumerable<Job> GetAll()
{
using (var context = new MyObjectContext())
{
var repos = new Repository<Job>(context);
var models = repos.GetAll();
return models;
}
}
public IEnumerable<Job> GetActives()
{
throw new NotImplementedException();
}
public int AddModel(Job model)
{
    using (var context = new MyObjectContext())
    {
        var repos = new Repository<Job>(context);
        repos.Add(model);
        // Return the number of affected entries so the method actually returns an int.
        return context.SaveChanges();
    }
}
public void UpdateModel(Job model)
{
throw new NotImplementedException();
}
public void DeleteModel(Job model)
{
using (var context = new MyObjectContext())
{
var repos = new Repository<Job>(context);
var existing = repos.FirstOrDefault(x => x.Id == model.Id);
if (existing == null) return;
repos.Delete(existing);
context.SaveChanges();
}
}
#endregion
}
Any ideas or insights would be appreciated.
You can create an instance of MyObjectContext in each service, like JobDataService, but that makes your code messy and hard to maintain. Creating the MyObjectContext instance outside of the DataServices is better. With what you have now, if you have 100 EntitySets you have to create 100 DataServices, because the Repository pattern and UnitOfWork are not used efficiently here. I would suggest the following:
ObjectContext
public class MyObjectContext : ObjectContext
{
public MyObjectContext() : base("name=EdmContainer", "EdmContainer")
{
ContextOptions.LazyLoadingEnabled = true;
}
#region IUnitOfWork Members
public void Commit()
{
SaveChanges();
}
#endregion
}
Generic Repository
public interface IRepository<TModel> where TModel : class
{
void Add(TModel entity);
IEnumerable<TModel> GetAll();
// Do some more implement
}
public class Repository<TModel> : IRepository<TModel> where TModel : class
{
private readonly ObjectContext _context;
public Repository(ObjectContext context)
{
_context = context;
}
public virtual void Add(TModel entity)
{
_context.CreateObjectSet<TModel>().AddObject(entity);
}
public virtual IEnumerable<TModel> GetAll()
{
return _context.CreateObjectSet<TModel>();
}
}
UnitOfWork
public interface IUnitOfWork : IDisposable
{
IRepository<Job> Jobs { get; }
IRepository<User> Users { get;}
void Commit();
}
public class UnitOfWork : IUnitOfWork
{
private readonly MyObjectContext _context;
private readonly IRepository<Job> _jobRepository;
private readonly IRepository<User> _userRepository;
public UnitOfWork(MyObjectContext context)
{
_context = context;
_jobRepository = new Repository<Job>(_context);
_userRepository = new Repository<User>(_context);
}
public IRepository<Job> Jobs { get { return _jobRepository; } }
public IRepository<User> Users { get { return _userRepository; } }
public void Commit() { _context.Commit(); }
public void Dispose()
{
if (_context != null)
{
_context.Dispose();
}
GC.SuppressFinalize(this);
    }
}
JobDataService
public interface IDataService
{
IEnumerable<Job> GetAll();
}
public class DataService : IDataService
{
private readonly IUnitOfWork _unitOfWork;
public DataService(IUnitOfWork unitOfWork)
{
_unitOfWork = unitOfWork;
}
public IEnumerable<Job> GetAll()
{
return _unitOfWork.Jobs.GetAll();
}
}
Here I used interfaces for everything; if you want to do the same, you need to use an IoC container. I used Simple Injector, which you can find here:
Simple Injector
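As an illustration only, the registrations for the types above could look roughly like this with Simple Injector (the lifestyles are an assumption and depend on your application type):
using SimpleInjector;

var container = new Container();

// Transient means a new context and unit of work per GetInstance call; in a real
// application you would typically pick a scoped lifestyle (per web request,
// per operation) so that one unit of work spans a whole operation.
container.Register<MyObjectContext>(Lifestyle.Transient);
container.Register<IUnitOfWork, UnitOfWork>(Lifestyle.Transient);
container.Register<IDataService, DataService>(Lifestyle.Transient);

var dataService = container.GetInstance<IDataService>();
var jobs = dataService.GetAll();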
One more suggestion: if you have a lot of I/O-bound operations, such as database access and querying data, you should consider using asynchronous methods. Below is a good video on the topic.
How to Build ASP.NET Web Applications Using Async
So the problem I am trying to solve is this: we are using Entity Framework to access our Oracle database, which has 1200-1500 tables. Mind you, we are not accessing them all, but we could eventually need access to 800+. We are using the UnitOfWork --> Repository --> Service pattern, and that works great, but we are trying to figure out whether we should have one big DbContext or multiple small contexts that are specific to the task at hand.
Our UnitOfWork is set up using an EFUnitOfWorkBase like so:
public abstract class EFUnitOfWorkBase : IUnitOfWork
{
private bool isDisposed = false;
public DbContextBase Context { get; set; }
protected EFUnitOfWorkBase(DbContextBase context)
{
Context = context;
}
public int Commit()
{
return Context.SaveChanges();
}
public void Dispose()
{
if (!isDisposed)
Dispose(true);
GC.SuppressFinalize(this);
}
private void Dispose(bool disposing)
{
isDisposed = true;
if (disposing)
{
if (this.Context != null)
this.Context.Dispose();
}
}
public IRepository<TEntity> GetRepository<TEntity>() where TEntity : Common.EntityBase<TEntity>
{
return new Repository<TEntity>(this);
}
}
Any unit of work we create extends that base one and provides the context like so:
public class EmployeeDirectoryUnitOfWork : EFUnitOfWorkBase
{
public EmployeeDirectoryUnitOfWork(string connectionString)
: base(new EmployeeDirectoryContext(connectionString))
{
}
}
The DbContext is passed a connection string through the unit of work.
The Repository looks like this:
public abstract class RepositoryBase<TEntity> : IRepository<TEntity> where TEntity : class
{
protected DbContextBase Context;
protected DbSet<TEntity> EntitySet;
public RepositoryBase(EFUnitOfWorkBase unitOfWork)
{
Enforce.ArgumentNotNull(unitOfWork, "unitOfWork");
Context = unitOfWork.Context;
EntitySet = Context.Set<TEntity>();
}
public TEntity Add(TEntity entity)
{
Enforce.ArgumentNotNull(entity, "entity");
return EntitySet.Add(entity);
}
public TEntity Attach(TEntity entity)
{
Enforce.ArgumentNotNull(entity, "entity");
return EntitySet.Attach(entity);
}
public TEntity Delete(TEntity entity)
{
Enforce.ArgumentNotNull(entity, "entity");
return EntitySet.Remove(entity);
}
public System.Linq.IQueryable<TEntity> Query()
{
return EntitySet.AsQueryable();
}
public TEntity Save(TEntity entity)
{
Enforce.ArgumentNotNull(entity, "entity");
Attach(entity);
Context.MarkModified(entity);
return entity;
}
}
Any suggestions on how to best handle this situation?
For a large application like this, I think you should go for a more Domain-Driven Design approach and split the model into separate bounded contexts. That way, when developers later add features to the program, they will be confined to the tables exposed by the context they are working with.
For more information, Julie Lerman recently released a Pluralsight course on Entity Framework in the Enterprise that is really good. She posted a small clip of it (actually about bounded contexts) on this site. I highly recommend the course, especially for what you appear to be doing.
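To make the idea concrete, a bounded context in EF is usually just a slim DbContext that exposes only the sets its area needs. A purely illustrative sketch (EmployeeDirectoryContext is the only name taken from the question; the entities are invented):
// Two narrow contexts over the same database, each limited to its own area.
public class EmployeeDirectoryContext : DbContext
{
    public EmployeeDirectoryContext(string connectionString) : base(connectionString) { }

    public DbSet<Employee> Employees { get; set; }
    public DbSet<Department> Departments { get; set; }
}

public class PayrollContext : DbContext
{
    public PayrollContext(string connectionString) : base(connectionString) { }

    // Payroll needs only a small slice of employee data plus its own tables.
    public DbSet<Employee> Employees { get; set; }
    public DbSet<Payslip> Payslips { get; set; }
}

// Minimal entity classes so the sketch is self-contained; the real ones map to the Oracle tables.
public class Employee { public int Id { get; set; } public string Name { get; set; } }
public class Department { public int Id { get; set; } public string Name { get; set; } }
public class Payslip { public int Id { get; set; } public decimal Amount { get; set; } }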
I'm considering the options for implementing a single unit of work that deals with multiple data sources in Entity Framework. I came up with a tentative approach (for now dealing with a single context), but apparently it isn't a good idea.
If we were to analyze the code below, would you consider it a bad implementation? Is the lifetime of the transaction scope a potential problem?
Of course, if we wrapped several contexts in the same transaction scope, we'd be covered if the second context.SaveChanges() failed...
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Transactions;
namespace ConsoleApplication2
{
class Program
{
static void Main(string[] args)
{
using(UnitOfWork unitOfWork = new UnitOfWork())
{
var repository = new EmployeeRepository(unitOfWork);
var employee = repository.CreateOrGetEmployee("Whatever Name");
Console.Write(employee.Id);
unitOfWork.SaveChanges();
}
}
}
class UnitOfWork : IDisposable
{
TestEntities _context;
TransactionScope _scope;
public UnitOfWork()
{
_scope = new TransactionScope();
_context = new TestEntities();
}
public void SaveChanges()
{
_context.SaveChanges();
_scope.Complete();
}
public TestEntities Context
{
get
{
return _context;
}
}
public void Dispose()
{
_scope.Dispose();
_context.Dispose();
}
}
class EmployeeRepository
{
UnitOfWork _unitOfWork;
public EmployeeRepository(UnitOfWork unitOfWork)
{
_unitOfWork = unitOfWork;
}
public Employee GetEmployeeById(int employeeId)
{
return _unitOfWork.Context.Employees.SingleOrDefault(e => e.Id == employeeId);
}
public Employee CreateEmployee(string fullName)
{
    Employee employee = new Employee();
    employee.FullName = fullName;
    // The new entity must be added to the context before SaveChanges can persist it.
    _unitOfWork.Context.Employees.AddObject(employee);
    _unitOfWork.Context.SaveChanges();
    return employee;
}
public Employee CreateOrGetEmployee(string fullName)
{
var employee = _unitOfWork.Context.Employees.FirstOrDefault(e => e.FullName == fullName);
if (employee == null)
{
employee = new Employee();
employee.FullName = fullName;
this.AddEmployee(employee);
}
return employee;
}
public Employee AddEmployee(Employee employee)
{
_unitOfWork.Context.Employees.AddObject(employee);
_unitOfWork.Context.SaveChanges();
return employee;
}
}
}
Why do you start the TransactionScope in the constructor? You only need it when saving changes.
public void SaveChanges()
{
// SaveChanges also uses transaction which uses by default ReadCommitted isolation
// level but TransactionScope uses by default more restrictive Serializable isolation
// level
using (var scope = new TransactionScope(TransactionScopeOption.Required,
new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted }))
{
_context.SaveChanges();
scope.Complete();
}
}
If you want a unit of work with more contexts, you simply wrap all of those contexts in the same unit of work class. Your SaveChanges becomes a little more complicated:
public void SaveChanges()
{
using (var scope = new TransactionScope(TransactionScopeOption.Required,
new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted }))
{
_contextA.SaveChanges(SaveOptions.DetectChangesBeforeSave);
_contextB.SaveChanges(SaveOptions.DetectChangesBeforeSave);
scope.Complete();
_contextA.AcceptAllChanges();
_contextB.AcceptAllChanges();
}
}
This version separates the saving operation from resetting the inner state of the contexts. The reason is that if the first context successfully saves its changes but the second throws an exception, the transaction will be rolled back. In that case we don't want the first context to have already accepted (cleared) its changes, because we would lose the information about the performed changes and would not be able to save them again.
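Put together, a unit of work over two contexts could look roughly like this (TestEntities comes from the question; OtherEntities is an assumed second context):
class MultiContextUnitOfWork : IDisposable
{
    private readonly TestEntities _contextA = new TestEntities();
    private readonly OtherEntities _contextB = new OtherEntities();

    public TestEntities ContextA { get { return _contextA; } }
    public OtherEntities ContextB { get { return _contextB; } }

    public void SaveChanges()
    {
        using (var scope = new TransactionScope(TransactionScopeOption.Required,
            new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted }))
        {
            // Save both contexts but keep their change trackers intact until
            // the transaction has committed successfully.
            _contextA.SaveChanges(SaveOptions.DetectChangesBeforeSave);
            _contextB.SaveChanges(SaveOptions.DetectChangesBeforeSave);
            scope.Complete();
            _contextA.AcceptAllChanges();
            _contextB.AcceptAllChanges();
        }
    }

    public void Dispose()
    {
        _contextA.Dispose();
        _contextB.Dispose();
    }
}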
I am looking into creating an Entity Framework 4 generic repository for a new ASP.NET MVC project I am working on. I have been looking at various tutorials, and they all seem to use the Unit of Work pattern ...
From what I have been reading, EF already uses this pattern within the ObjectContext, and you are simply extending it to make your own Units of Work.
Source: http://dotnet.dzone.com/news/using-unit-work-pattern-entity?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+zones%2Fdotnet+(.NET+Zone)
Why would one go to the effort of doing this?
Is this the preferred way of working with generic repositories?
Many thanks,
Kohan.
This is not the way I would work with generic repositories. First of all, I would share the ObjectContext between ClassARepository, ClassBRepository and any other repositories in the current request. Using an IoC container with constructor injection and per-request lifetime is recommended.
This is what my generic repositories look like:
public interface IRepository<T>
{
//Retrieves list of items in table
IQueryable<T> List();
IQueryable<T> List(params string[] includes);
//Creates from detached item
void Create(T item);
void Delete(int id);
T Get(int id);
T Get(int id, params string[] includes);
void SaveChanges();
}
public class Repository<T> : IRepository<T> where T : EntityObject
{
private ObjectContext _ctx;
public Repository(ObjectContext ctx)
{
_ctx = ctx;
}
private static string EntitySetName
{
get
{
return String.Format(@"{0}Set", typeof(T).Name);
}
}
private ObjectQuery<T> ObjectQueryList()
{
var list = _ctx.CreateQuery<T>(EntitySetName);
return list;
}
#region IRepository<T> Members
public IQueryable<T> List()
{
return ObjectQueryList().OrderBy(@"it.ID").AsQueryable();
}
public IQueryable<T> List(params string[] includes)
{
var list = ObjectQueryList();
foreach(string include in includes)
{
list = list.Include(include);
}
return list;
}
public void Create(T item)
{
_ctx.AddObject(EntitySetName, item);
}
public void Delete(int id)
{
var item = Get(id);
_ctx.DeleteObject(item);
}
public T Get(int id)
{
var list = ObjectQueryList();
return list.Where("ID = #0", id).First();
}
public T Get(int id, params string[] includes)
{
var list = List(includes);
return list.Where("ID = #0", id).First();
}
public void SaveChanges()
{
_ctx.SaveChanges();
}
#endregion
}
The ObjectContext is injected through the constructor. The List() methods return IQueryable for further processing in business-layer (service) objects. The service layer returns List or IEnumerable, so there is no deferred execution in the views.
This code was created using EF1; the EF4 version can be a little different and simpler.
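For example, a service-layer method built on this repository could materialize the query before handing the data to a view (the entity and service names below are illustrative, not from the answer):
using System.Collections.Generic;
using System.Linq;

// In a real application this would be the EF-generated EntityObject for the table.
public class Product
{
    public int ID { get; set; }
    public string Name { get; set; }
}

public class ProductService
{
    private readonly IRepository<Product> _repository;

    public ProductService(IRepository<Product> repository)
    {
        _repository = repository;
    }

    public List<Product> ListProducts()
    {
        // ToList() executes the query here, while the ObjectContext is still alive,
        // so the view never triggers deferred execution later.
        return _repository.List().ToList();
    }
}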