Multiple Telerik MVC grids in TabStrip not working with Ninject and Entity Framework, unit of work, repository pattern - entity-framework

I am creating an ASP.NET MVC 3 e-commerce website and I am currently working on the admin area where you can add/edit products. To create the UI for the product page I am using Telerik MVC controls.
My problem is that when I added a second Telerik grid (both grids retrieve data from the database through an Ajax call), I started receiving a couple of different errors, listed below:
{"There is already an open DataReader associated with this Command
which must be closed first."}
{"The connection was not closed. The connection's current state is
connecting."}
Database Context Code
public interface IUnitOfWork
{
    void Commit();
}

public class GSPDataContext : DbContext, IUnitOfWork
{
    /* (omitted) IDbSet's for entities */

    public GSPDataContext()
        : base("GSPConnectionString")
    {
    }

    public virtual IDbSet<T> DbSet<T>() where T : class
    {
        return Set<T>();
    }

    public virtual void Commit()
    {
        base.SaveChanges();
    }
}
Generic Repository Code
public class Repository<T> : IRepository<T> where T : class
{
    private GSPDataContext m_dataContext;
    private readonly IDbSet<T> m_entity;

    public Repository(GSPDataContext dataContext)
    {
        if (dataContext == null)
            throw new ArgumentException();

        m_dataContext = dataContext;
        m_entity = m_dataContext.Set<T>();
    }

    public T GetById(int id)
    {
        return this.m_entity.Find(id);
    }

    public void Insert(T entity)
    {
        if (entity == null)
            throw new ArgumentException();

        this.m_entity.Add(entity);
        //this.m_dataContext.SaveChanges();
    }

    public void Delete(T entity)
    {
        if (entity == null)
            throw new ArgumentException();

        this.m_entity.Remove(entity);
        //this.m_dataContext.SaveChanges();
    }

    public virtual IQueryable<T> Table
    {
        get
        {
            return this.m_entity;
        }
    }
}
Ninject Code
private static void RegisterServices(IKernel kernel)
{
    //Customer
    kernel.Bind<IAddressValidationService>().To<AddressValidationService>().InRequestScope();
    kernel.Bind<ICustomerService>().To<CustomerService>().InRequestScope();
    kernel.Bind<ICustomerProductService>().To<CustomerProductService>().InRequestScope();

    //Authentication
    kernel.Bind<IOpenIDLoginService>().To<OpenIDLoginService>().InRequestScope();
    kernel.Bind<IAuthenticationService>().To<FormsAuthenticationService>().InRequestScope();

    //Products
    kernel.Bind<IProductService>().To<ProductService>().InRequestScope();
    kernel.Bind<IRecentlyViewedProductService>().To<RecentlyViewedProductService>().InRequestScope();
    kernel.Bind<IProductPictureService>().To<ProductPictureService>().InRequestScope();
    kernel.Bind<ICategoryService>().To<CategoryService>().InRequestScope();
    kernel.Bind<IPictureService>().To<PictureService>().InRequestScope();

    //Shopping Cart
    kernel.Bind<IShoppingCartService>().To<ShoppingCartService>().InRequestScope();

    //Shipping and Payment
    kernel.Bind<IShippingService>().To<ShippingService>().InRequestScope();
    kernel.Bind<IPaymentService>().To<PaymentService>().InRequestScope();

    //Orders
    kernel.Bind<IOrderCalculationService>().To<OrderCalculationService>().InRequestScope();
    kernel.Bind<IOrderProcessingService>().To<OrderProcessingService>().InRequestScope();
    kernel.Bind<IOrderService>().To<OrderService>().InRequestScope();

    //
    kernel.Bind<IEncryptionService>().To<EncryptionService>().InRequestScope();
    kernel.Bind<ILogger>().To<LoggingService>().InRequestScope();
    kernel.Bind<IWebManager>().To<WebManager>().InRequestScope();

    //Messages
    kernel.Bind<IEmailService>().To<EmailService>().InRequestScope();
    kernel.Bind<IMessageTemplateService>().To<MessageTemplateService>().InRequestScope();
    kernel.Bind<IWorkflowMessageService>().To<WorkflowMessageService>().InRequestScope();

    //Data
    kernel.Bind<GSPDataContext>().ToSelf().InSingletonScope();
    kernel.Bind<IUnitOfWork>().ToMethod(ctx => ctx.Kernel.Get<GSPDataContext>()).InSingletonScope();
    kernel.Bind(typeof(IRepository<>)).To(typeof(Repository<>)).InRequestScope();
    kernel.Bind<IWorkContext>().To<WebWorkContext>().InRequestScope();
}
I suspect it has something to do with how Ninject is managing the lifetimes of the various services, but I am not sure what I need to do to make it work.
Any advice would be much appreciated.
Thanks
UPDATE
Following Remo's comment, I changed my code to the following:
//Data
kernel.Bind<GSPDataContext>().ToSelf().InRequestScope();
kernel.Bind<IUnitOfWork>().ToMethod(ctx => ctx.Kernel.Get<GSPDataContext>()).InRequestScope();
kernel.Bind(typeof (IRepository<>)).To(typeof (Repository<>)).InRequestScope();
And I am now getting the following error:
The ObjectContext instance has been disposed and can no longer be used
for operations that require a connection.
Any ideas?

No, it has nothing to do with how Ninject manages lifetimes as such, but it does have to do with how you configured the scopes.
It is important that a new DbContext is used for each request, so it has to be bound InRequestScope.

Related

Using Microsoft.AspNetCore.Identity.MongoDB for multi-tenancy: how do we inject a dynamic tenant into MongoDbContext?

Does anyone know how we can inject the context into UserManager > MongoDB UserStore at runtime in .NET Core 2.0?
We cannot do this at startup because the context is dynamic, but the UserStore is not accessible, and UserManager has too many dependencies to new up manually (and doing so feels wrong). Are there any solutions?
public class UserStore<TUser> :
    IUserPasswordStore<TUser>,
    IUserRoleStore<TUser>,
    IUserLoginStore<TUser>,
    IUserSecurityStampStore<TUser>,
    IUserEmailStore<TUser>,
    IUserClaimStore<TUser>,
    IUserPhoneNumberStore<TUser>,
    IUserTwoFactorStore<TUser>,
    IUserLockoutStore<TUser>,
    IQueryableUserStore<TUser>,
    IUserAuthenticationTokenStore<TUser>
    where TUser : IdentityUser
{
    private readonly IMongoCollection<TUser> _Users;

    // THIS IS WHERE WE WANT TO INJECT THE users AT RUNTIME
    public UserStore(IMongoCollection<TUser> users)
    {
        _Users = users;
    }

    public virtual void Dispose()
    {
        // no need to dispose of anything, mongodb handles connection pooling automatically
    }

    public virtual async Task<IdentityResult> CreateAsync(TUser user, CancellationToken token)
    {
        await _Users.InsertOneAsync(user, cancellationToken: token);
        return IdentityResult.Success;
    }

    // ... remaining members omitted ...
}
Unfortunately, users is null at startup, and it should be, since the tenant has not been created at that point.
We have also been using saaskit.Multitenancy and just can't find a solution.
Any help would be much appreciated.
Thanks
I think you need a generic repository to act as a wrapper for IMongoCollection, and then inject that repository into your controllers:
public class Repository<T> : IRepository<T>
{
    public IMongoCollection<T> Collection { get; private set; }

    public Repository(IDbFactory dbFactory)
    {
        // The client/connection string could come from dbFactory instead of being hard-coded here.
        MongoClient client = new MongoClient("your connection string");
        this.Collection = client.GetDatabase("db").GetCollection<T>(typeof(T).Name);
    }

    public T Find(Expression<Func<T, bool>> filter)
    {
        return this.Collection.AsQueryable<T>().FirstOrDefault<T>(filter);
    }

    public async Task<T> FindAsync(Expression<Func<T, bool>> filter)
    {
        return await this.Collection.AsQueryable<T>().FirstOrDefaultAsync<T>(filter);
    }

    // add more methods here
}
Then register the dependency as below inside Startup.cs:
public void ConfigureServices(IServiceCollection services)
{
    services.AddTransient(typeof(IRepository<>), typeof(Repository<>));
    services.AddMvc();
}
Finally, inject the generic repository into your controllers. Also, don't forget to implement IDisposable in the generic repository (a minimal sketch follows the controller example below).
public class ProductController : Controller
{
    private readonly IRepository<Product> _productRepository = null;

    public ProductController(IRepository<Product> productRepository)
    {
        this._productRepository = productRepository;
    }
}
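As a rough sketch (a fragment, not a complete class): since the MongoDB driver handles connection pooling itself, the Dispose implementation added to Repository<T> can be essentially empty; implementing IDisposable just lets the DI container dispose the repository cleanly at the end of a request. The class declaration would then read ": IRepository<T>, IDisposable".
// Added to Repository<T> (which would then declare : IRepository<T>, IDisposable)
public void Dispose()
{
    // Nothing to release: MongoClient is designed to be long-lived and
    // manages its connection pool internally.
    GC.SuppressFinalize(this);
}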

Using a single DbContext for a unit of work in the service layer

I am trying to implement a business layer (service layer) along with a repository layer, so my project has the following layers: EF <--- Repository <--- Service <--- Controller <--- View.
My context class looks like this:
public class ToDoContext : DbContext
{
    public ToDoContext()
        : base("ToDoContext")
    {
    }

    public virtual DbSet<Project> Projects { get; set; }
    public virtual DbSet<Collaborator> Collaborators { get; set; }
    public virtual DbSet<ActionTask> Tasks { get; set; }
}
My ProjectRepository looks like this:
public class ProjectRepository : IProjectRepository, IDisposable
{
    ToDoContext Context;

    public ProjectRepository(ToDoContext context)
    {
        this.Context = context;
    }

    public virtual List<Project> AllProjects()
    {
        IQueryable<Project> projects = Context.Projects;
        return projects.ToList<Project>();
    }

    public Project Find(int? id)
    {
        // some code
    }

    public void InsertOrUpdate(Project project)
    {
        // some code
    }

    public void Delete(int? id)
    {
        // some code
    }

    public void Save()
    {
        // some code
    }

    public void Dispose()
    {
        Context.Dispose();
    }
}
My ProjectService class looks like this:
public class ProjectService : IProjectService
{
    IProjectRepository ProjectRepo;
    ICollaboratorRepository CollaboratorRepo;

    public ProjectService(IProjectRepository projectRepo, ICollaboratorRepository collaboratorRepo)
    {
        this.ProjectRepo = projectRepo;
        this.CollaboratorRepo = collaboratorRepo;
    }

    public List<Project> GetAllProjects()
    {
        return ProjectRepo.AllProjects();
    }

    public void CreateProject(FormCollection formData)
    {
        // code to parse form data as per my business needs
        // code to fetch the related collaborator list
        // save the project data
    }

    public List<Collaborator> GetCollaborators(string[] collaboratorId)
    {
        // Fetch collaborator list using the collaborator repository
        return CollaboratorRepo.GetAllCollaborators();
    }
}
Similarly, I have implemented a service and repository layer for my collaborators.
To summarize: my CreateProject() method in the ProjectService class fetches the collaborator list using the collaborator repository, creates a new project, attaches the collaborator list to the newly created project, and saves it. So I consider this whole process one unit of work.
When I fetch the list of collaborators using the CollaboratorRepository and then try to save the newly created project using the ProjectRepository, it throws the error "An entity object cannot be referenced by multiple instances of IEntityChangeTracker." I guess this is because the CollaboratorRepository's DbContext is not disposed yet, so I am manually disposing each context before using a new one. I know I can't afford the overhead of manually disposing contexts like this. Can anyone help me, please?
I know I should be using the same DbContext object for one unit of work, but I don't know how to achieve this when the DbContext is exposed to the repository layer rather than the service layer.
How is the IoC container configured? You should register the lifetime of the DbContext per request/thread in order to get the same instance per business transaction.
Tip: in order to have a service that is reusable outside of a web environment, do not pass the FormCollection to the CreateProject method; the parsing of the FormCollection should be done by the controller/model binder.
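For illustration only, a rough sketch of what that could look like: a simple view model is filled by MVC's default model binder, and the service receives typed values instead of a FormCollection. ProjectInputModel, its properties, and the changed CreateProject signature are assumptions, not code from the question.
// Hypothetical view model; the default model binder populates it from the posted form.
public class ProjectInputModel
{
    public string Name { get; set; }
    public string[] CollaboratorIds { get; set; }
}

public class ProjectController : Controller
{
    private readonly IProjectService _projectService;

    public ProjectController(IProjectService projectService)
    {
        _projectService = projectService;
    }

    [HttpPost]
    public ActionResult Create(ProjectInputModel input)
    {
        // The controller (via model binding) owns the HTTP-specific parsing,
        // so the service stays reusable outside of a web environment.
        _projectService.CreateProject(input.Name, input.CollaboratorIds);
        return RedirectToAction("Index");
    }
}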
For Ninject, try InRequestScope
kernel.Bind<ToDoContext>().To<ToDoContext>().InRequestScope();
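With the context bound per request, the repositories and the service can keep taking their dependencies through constructors; Ninject will hand every one of them the same ToDoContext instance within a single request, so the change tracker is shared. A rough sketch of the related bindings (the interface names come from the question; the concrete CollaboratorRepository class name is assumed):
// All of these resolve within one web request, so every repository
// receives the same ToDoContext instance.
kernel.Bind<IProjectRepository>().To<ProjectRepository>().InRequestScope();
kernel.Bind<ICollaboratorRepository>().To<CollaboratorRepository>().InRequestScope();
kernel.Bind<IProjectService>().To<ProjectService>().InRequestScope();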

Passing a connection string to Entity Framework at runtime for each call

My Entity Framework context is as follows:
public partial class MyContext : DbContext, IMyContext
{
    static MyContext()
    {
        System.Data.Entity.Database.SetInitializer<MyContext>(null);
    }

    public MyContext()
        : base("Name=MyContext")
    {
    }

    // ...
}
I am resolving it through Autofac in the following way:
builder.RegisterType(typeof(MainContext)).As(typeof(DbContext)).InstancePerLifetimeScope();
builder.RegisterType<MainContext>().As<IMainContext>().InstancePerRequest();
This DbContext gets used in the repository layer:
#region Fields
private readonly IMyContext _context;
#endregion

#region Constructors and Destructors
public EmployeeRepository(IMyContext context)
{
    _context = context;
}
#endregion

public void Create(Employee emp)
{
    this._context.Employee.Add(emp);
}
Now my issue is: I want to set the connection string dynamically per call. The connection string will be passed in through a Web API call, and I want to pass it on to this context. Can anyone help me with how to do that? I am confused about Autofac here. Secondly, how can I make sure each call sets the connection string and does not cache it?
You can use a factory that will build the context and set the connection string for you.
public interface IContextFactory
{
    IContext GetInstance();
}

public class MyContextFactory : IContextFactory
{
    public IContext GetInstance()
    {
        String connectionString = this.GetConnectionString(HttpContext.Current);
        return new MyContext(connectionString);
    }

    private String GetConnectionString(HttpContext context)
    {
        // do what you want
    }
}
builder.RegisterType<MyContextFactory>()
       .As<IContextFactory>()
       .InstancePerRequest();

builder.Register(c => c.Resolve<IContextFactory>().GetInstance())
       .As<IContext>()
       .InstancePerRequest();
If you can't derive the connection string from HttpContext, you can change the context factory implementation to expect initialization by the Web API before creating the instance. For example:
public interface IContextFactory
{
    IContext GetInstance();
    void Initialize(String connectionString);
}

public class MyContextFactory : IContextFactory
{
    private String _connectionString;

    public void Initialize(String connectionString)
    {
        this._connectionString = connectionString;
    }

    public IContext GetInstance()
    {
        if (this._connectionString == null)
        {
            throw new Exception("connectionString not initialized");
        }

        return new MyContext(this._connectionString);
    }
}
At the beginning of your Web API call (through an attribute, for example), you can call the Initialize method. Because the factory is InstancePerRequest, you will have one instance for the duration of the request.
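As a rough sketch of such an attribute, assuming ASP.NET Web API 2 with the Autofac Web API integration wired up, and assuming (purely for illustration) that the caller sends the connection string, or a tenant key that maps to one, in a custom X-Connection-String header:
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Web.Http.Controllers;
using System.Web.Http.Filters;

public class InitializeContextAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        // Resolve the factory from the same dependency scope Web API
        // uses for this request, so we get the per-request instance.
        var factory = (IContextFactory)actionContext.Request
            .GetDependencyScope()
            .GetService(typeof(IContextFactory));

        // Hypothetical transport: a custom request header.
        IEnumerable<string> values;
        if (actionContext.Request.Headers.TryGetValues("X-Connection-String", out values))
        {
            factory.Initialize(values.First());
        }

        base.OnActionExecuting(actionContext);
    }
}
The attribute can then be applied to the controllers or actions that need the dynamic connection string.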
By the way, I'm not sure I understand this registration:
builder.RegisterType(typeof(MainContext)).As(typeof(DbContext)).InstancePerLifetimeScope();
builder.RegisterType<MainContext>().As<IMainContext>().InstancePerRequest();
It looks buggy because you will have two different registrations of the same type, and not with the same scope; is that intended? Furthermore, it doesn't sound like a good idea to register the context as DbContext; do you need that registration?
The following registration looks better:
builder.RegisterType<MainContext>()
       .As<IMainContext>()
       .As<DbContext>()
       .InstancePerRequest();

What's the best practice for DataServices using Entity Framework with the Repository and UnitOfWork patterns?

I'm using EF and the MVVM pattern. My question is about the data access layer (DAL). In the DAL I have the following classes:
MyObjectContext, which is technically the standard ObjectContext for now, but some unit-of-work methods will be added to it later.
Repository<TModel>, which handles the most commonly needed queries (such as Add, GetAll, ...) on different ObjectSets.
A bunch of DataServices, which make use of repositories to provide a higher level of data access for the Core.
The project I'm working on is a business application with about 100 EntitySets so far, and there are times when a single user interaction can involve up to 20 different EntitySets (updating most of them). I currently add .Include(params string[]) to my queries to prevent the ObjectContext-disposed exception, but it doesn't seem to be a reliable solution.
The question is: should I create an instance of MyObjectContext (and therefore Repository) in each DataService method (as in the code below, in which case it seems to me that the unit-of-work capability would be useless), or should I create it outside of the DataServices and pass it to them through their constructors (or directly to each DataService method) so that a bunch of database actions (different tables and queries) can be handled together? And how?
Here's what MyObjectContext looks like:
public class MyObjectContext : ObjectContext, IUnitOfWork
{
    public MyObjectContext()
        : base("name=EdmContainer", "EdmContainer")
    {
        ContextOptions.LazyLoadingEnabled = true;
    }

    #region IUnitOfWork Members
    public void Commit()
    {
        SaveChanges();
    }
    #endregion
}
This is what Repository looks like:
public class Repository<TModel> where TModel : class
{
    private readonly MyObjectContext _context;

    public Repository(IUnitOfWork unitOfWork)
    {
        if (unitOfWork == null)
            throw new ArgumentNullException("unitOfWork");

        _context = unitOfWork as MyObjectContext;
    }

    public TModel FirstOrDefault(Expression<Func<TModel, bool>> where)
    {
        return _context.CreateObjectSet<TModel>().FirstOrDefault(where);
    }

    public void Add(TModel entity)
    {
        _context.CreateObjectSet<TModel>().AddObject(entity);
    }

    // ...
}
And this is what a common DataService looks like:
public class JobDataService : IDataService<Job>
{
    #region IDataService<Job> Members
    public Job GetSingle(int id)
    {
        Job model = null;
        using (var context = new MyObjectContext())
        {
            var repos = new Repository<Job>(context);
            model = repos.FirstOrDefault(x => x.Id == id);
        }
        return model;
    }

    public IEnumerable<Job> GetAll()
    {
        using (var context = new MyObjectContext())
        {
            var repos = new Repository<Job>(context);
            var models = repos.GetAll();
            return models;
        }
    }

    public IEnumerable<Job> GetActives()
    {
        throw new NotImplementedException();
    }

    public int AddModel(Job model)
    {
        using (var context = new MyObjectContext())
        {
            var repos = new Repository<Job>(context);
            repos.Add(model);
            context.SaveChanges();
            return model.Id;
        }
    }

    public void UpdateModel(Job model)
    {
        throw new NotImplementedException();
    }

    public void DeleteModel(Job model)
    {
        using (var context = new MyObjectContext())
        {
            var repos = new Repository<Job>(context);
            var entity = repos.FirstOrDefault(x => x.Id == model.Id);
            if (entity == null) return;
            repos.Delete(entity);
            context.SaveChanges();
        }
    }
    #endregion
}
Any kind of idea or insight would be appreciated.
You can create an instance of MyObjectContext in each service, like JobDataService; however, that makes your code messy and hard to maintain. Creating the MyObjectContext instance outside of the DataServices is better. With what you have now, if you have 100 EntitySets, you have to create 100 DataServices, because the way the Repository pattern and UnitOfWork are used here is not efficient. I would suggest doing the following:
ObjectContext
public class MyObjectContext : ObjectContext
{
    public MyObjectContext() : base("name=EdmContainer", "EdmContainer")
    {
        ContextOptions.LazyLoadingEnabled = true;
    }

    #region IUnitOfWork Members
    public void Commit()
    {
        SaveChanges();
    }
    #endregion
}
Generic Repository
public interface IRepository<TModel> where TModel : class
{
    void Add(TModel entity);
    IEnumerable<TModel> GetAll();
    // Add more members as needed
}

public class Repository<TModel> : IRepository<TModel> where TModel : class
{
    private readonly ObjectContext _context;

    public Repository(ObjectContext context)
    {
        _context = context;
    }

    public virtual void Add(TModel entity)
    {
        _context.CreateObjectSet<TModel>().AddObject(entity);
    }

    public virtual IEnumerable<TModel> GetAll()
    {
        return _context.CreateObjectSet<TModel>();
    }
}
UnitOfWork
public interface IUnitOfWork : IDisposable
{
    IRepository<Job> Jobs { get; }
    IRepository<User> Users { get; }
    void Commit();
}

public class UnitOfWork : IUnitOfWork
{
    private readonly MyObjectContext _context;
    private readonly IRepository<Job> _jobRepository;
    private readonly IRepository<User> _userRepository;

    public UnitOfWork(MyObjectContext context)
    {
        _context = context;
        _jobRepository = new Repository<Job>(_context);
        _userRepository = new Repository<User>(_context);
    }

    public IRepository<Job> Jobs { get { return _jobRepository; } }
    public IRepository<User> Users { get { return _userRepository; } }

    public void Commit() { _context.Commit(); }

    public void Dispose()
    {
        if (_context != null)
        {
            _context.Dispose();
        }
        GC.SuppressFinalize(this);
    }
}
JobDataService
public interface IDataService
{
    IEnumerable<Job> GetAll();
}

public class DataService : IDataService
{
    private readonly IUnitOfWork _unitOfWork;

    public DataService(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }

    public IEnumerable<Job> GetAll()
    {
        return _unitOfWork.Jobs.GetAll();
    }
}
Here I used interfaces for everything; if you want to do the same, you need to use an IoC container. I used Simple Injector, which you can find here:
Simple Injector
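As a rough illustration only (the exact API differs between Simple Injector versions; this sketch assumes Simple Injector 3+ with the SimpleInjector.Integration.Web package), the registrations for this design could look like the following, using a per-web-request lifestyle so every repository in a request shares one context:
using SimpleInjector;
using SimpleInjector.Integration.Web;

public static class Bootstrapper
{
    public static Container BuildContainer()
    {
        var container = new Container();
        container.Options.DefaultScopedLifestyle = new WebRequestLifestyle();

        // One context / unit of work per web request, shared by all repositories.
        container.Register<MyObjectContext>(Lifestyle.Scoped);
        container.Register<IUnitOfWork, UnitOfWork>(Lifestyle.Scoped);
        container.Register<IDataService, DataService>(Lifestyle.Scoped);

        container.Verify();
        return container;
    }
}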
One more suggestion: if you have a lot of I/O-bound operations, like database access and querying data, you should consider using asynchronous methods. Below is a good video on the topic.
How to Build ASP.NET Web Applications Using Async
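As a very rough, hedged sketch of what that can look like: this assumes EF6 and DbContext-based access (the question uses ObjectContext, so treat it as a direction rather than a drop-in change), with a hypothetical JobsContext exposing a DbSet<Job>:
using System.Collections.Generic;
using System.Data.Entity; // EF6: provides ToListAsync()
using System.Threading.Tasks;

public class AsyncJobDataService
{
    // Hypothetical DbContext with a DbSet<Job> named Jobs.
    private readonly JobsContext _context;

    public AsyncJobDataService(JobsContext context)
    {
        _context = context;
    }

    // Runs the query without blocking the calling thread.
    public async Task<List<Job>> GetAllAsync()
    {
        return await _context.Jobs.ToListAsync();
    }
}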

UnitOfWork and Entity Framework Contexts

The problem I am trying to solve is this: we are using Entity Framework to access our Oracle database, which has 1200-1500 tables. Now, mind you, we are not accessing them all, but we could possibly have 800+ to access. We are using the UnitOfWork --> Repository --> Service pattern and that works great, but we are trying to figure out whether we should have one big DbContext, or multiple little contexts that are specific to the task at hand.
Our UnitOfWork is set up using an EFUnitOfWorkBase like so:
public abstract class EFUnitOfWorkBase : IUnitOfWork
{
    private bool isDisposed = false;

    public DbContextBase Context { get; set; }

    protected EFUnitOfWorkBase(DbContextBase context)
    {
        Context = context;
    }

    public int Commit()
    {
        return Context.SaveChanges();
    }

    public void Dispose()
    {
        if (!isDisposed)
            Dispose(true);
        GC.SuppressFinalize(this);
    }

    private void Dispose(bool disposing)
    {
        isDisposed = true;
        if (disposing)
        {
            if (this.Context != null)
                this.Context.Dispose();
        }
    }

    public IRepository<TEntity> GetRepository<TEntity>() where TEntity : Common.EntityBase<TEntity>
    {
        return new Repository<TEntity>(this);
    }
}
Any unit of work we create extends that base one and provides the context like so:
public class EmployeeDirectoryUnitOfWork : EFUnitOfWorkBase
{
    public EmployeeDirectoryUnitOfWork(string connectionString)
        : base(new EmployeeDirectoryContext(connectionString))
    {
    }
}
The DbContext is passed a connection string through the unit of work.
The Repository looks like this:
public abstract class RepositoryBase<TEntity> : IRepository<TEntity> where TEntity : class
{
    protected DbContextBase Context;
    protected DbSet<TEntity> EntitySet;

    public RepositoryBase(EFUnitOfWorkBase unitOfWork)
    {
        Enforce.ArgumentNotNull(unitOfWork, "unitOfWork");
        Context = unitOfWork.Context;
        EntitySet = Context.Set<TEntity>();
    }

    public TEntity Add(TEntity entity)
    {
        Enforce.ArgumentNotNull(entity, "entity");
        return EntitySet.Add(entity);
    }

    public TEntity Attach(TEntity entity)
    {
        Enforce.ArgumentNotNull(entity, "entity");
        return EntitySet.Attach(entity);
    }

    public TEntity Delete(TEntity entity)
    {
        Enforce.ArgumentNotNull(entity, "entity");
        return EntitySet.Remove(entity);
    }

    public System.Linq.IQueryable<TEntity> Query()
    {
        return EntitySet.AsQueryable();
    }

    public TEntity Save(TEntity entity)
    {
        Enforce.ArgumentNotNull(entity, "entity");
        Attach(entity);
        Context.MarkModified(entity);
        return entity;
    }
}
Any suggestions on how to best handle this situation?
In a case like this, when you have a large application, I think you should go for a more Domain-Driven Design approach and split the model into several separate, bounded contexts. That way, when developers later add features to the program, they are confined to only the tables exposed by the particular context they are working with.
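As a rough sketch of what a bounded context can look like (the entity names here are hypothetical, and the classes derive directly from DbContext for brevity rather than from the question's DbContextBase), each context maps only the handful of sets its feature area needs, even though all the tables live in the same Oracle database:
using System.Data.Entity;

// Bounded context for the employee-directory feature area: it maps only
// the tables that feature needs, not all 1200+ tables in the database.
public class EmployeeDirectoryContext : DbContext
{
    public EmployeeDirectoryContext(string connectionString)
        : base(connectionString)
    {
    }

    public DbSet<Employee> Employees { get; set; }
    public DbSet<Department> Departments { get; set; }
}

// A separate bounded context for another feature area; it can even map
// some of the same tables with its own, narrower entity types.
public class PayrollContext : DbContext
{
    public PayrollContext(string connectionString)
        : base(connectionString)
    {
    }

    public DbSet<PaycheckRecord> Paychecks { get; set; }
}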
For more information, Julie Lerman recently came out with a course on Pluralsight about Entity Framework in the enterprise. She posted a small clip of it (actually about bounded contexts) on this site. It's a very good course and I highly recommend it, especially for what you appear to be doing.