Runtime swap implementation for Dagger provided dependency [duplicate] - dagger-2

This question already has answers here:
How do I tell Dagger 2 which implementation to instantiate based on X?
(3 answers)
Closed 5 years ago.
Using Dagger 2, I have a domain object that I provide to presenters. That domain object has a dependency on a repository. The repository has two implementations, both of which implement the same interface. I need to be able to set up Dagger somehow to swap between the two repository implementations at runtime, based on the user selecting a "Demo Mode" option.
So I have the following domain object:
public class SomeAwesomeBusinessLogic {
    Repository repository;

    @Inject
    public SomeAwesomeBusinessLogic(Repository repository) {
        this.repository = repository;
    }
    //awesome stuff goin down
}
And the two repositories:
public class RemoteRepository implements Repository {
    @Inject
    public RemoteRepository() {
        //setup
    }
}

public class DemoRepository implements Repository {
    @Inject
    public DemoRepository() {
        //setup
    }
}
Any ideas on how to structure my modules and components to get this to work?

A couple of ideas come to mind, depending on how and when you want to swap the implementation. One possibility is to instantiate your module with a configuration flag; it can be a simple boolean:
@Module
public class RepositoryModule {
    private final boolean isDemo;

    public RepositoryModule(boolean isDemo) {
        this.isDemo = isDemo;
    }

    @Provides
    public Repository providesRepository() {
        return isDemo ? new DemoRepository() : new RemoteRepository();
    }
    // Other providers
}
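If it helps, wiring this first approach up could look roughly like the following sketch; the AppComponent name and the way you read the demo flag are placeholders, not something from your question:
@Component(modules = {RepositoryModule.class})
public interface AppComponent {
    SomeAwesomeBusinessLogic someAwesomeBusinessLogic();
}

// Somewhere at startup, e.g. in your Application subclass:
boolean isDemo = readDemoModeSetting(); // hypothetical helper for the "Demo Mode" option
AppComponent component = DaggerAppComponent.builder()
        .repositoryModule(new RepositoryModule(isDemo))
        .build();
Note that with this setup, switching between demo and remote after the component has been built means rebuilding the component.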
I'm not a fan of this approach, even though it fits your use case: it's quite restrictive and doesn't allow for easy maintenance. I would probably use a factory instead and provide the factory rather than the repository itself.
public interface RepositoryFactory {
    Repository getRepository(Configuration configuration);
}

public class RepositoryFactoryImpl implements RepositoryFactory {
    @Inject
    public RepositoryFactoryImpl() {}
    // The implementation
}

@Module
public class RepositoryModule {
    @Provides
    public RepositoryFactory providesRepositoryFactory(
            RepositoryFactoryImpl factory) {
        return factory;
    }
    // Other providers
}
Configuration would be a simple POJO where you can specify several attributes to configure your repo. Say for example:
public class Configuration {
    private final boolean isDemo;
    // all the POJO stuff you need
}
You can then make your domain object depend on the factory:
public class SomeAwesomeBusinessLogic {
    private final RepositoryFactory repositoryFactory;

    @Inject
    public SomeAwesomeBusinessLogic(
            RepositoryFactory repositoryFactory) {
        this.repositoryFactory = repositoryFactory;
    }
    //awesome stuff going down
}
You can then do repositoryFactory.getRepository(new Configuration(true/false)); in your business logic.
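For illustration, here's one way RepositoryFactoryImpl could be fleshed out, letting Dagger inject a javax.inject.Provider for each implementation. This is only a sketch, and the isDemo() getter on Configuration is an assumption on my part:
public class RepositoryFactoryImpl implements RepositoryFactory {
    private final Provider<RemoteRepository> remoteProvider;
    private final Provider<DemoRepository> demoProvider;

    @Inject
    public RepositoryFactoryImpl(Provider<RemoteRepository> remoteProvider,
                                 Provider<DemoRepository> demoProvider) {
        this.remoteProvider = remoteProvider;
        this.demoProvider = demoProvider;
    }

    @Override
    public Repository getRepository(Configuration configuration) {
        // Both repositories have @Inject constructors, so Dagger can supply these Providers.
        return configuration.isDemo() ? demoProvider.get() : remoteProvider.get();
    }
}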
I prefer this approach because it's easier to extend to new types of repositories. You can also test whether the factory logic is correct; hardly anyone tests Dagger modules, which is why I'm not so keen on the first approach.
Another benefit is that this approach lets you keep the same module instance even if the app's configuration can change at runtime and suddenly switch to the demo repository: you simply pass a different Configuration object to the factory.
A third possibility might be Producers. However, this really depends on your use case and how you want to handle this runtime dependency exchange. I find this approach quite good, but it might be a bit overkill.
Hope this helps

Related

Spring Data repository implementation

I am using Spring Data JPA and chose the @Query annotation for creating queries (instead of using NamedQueries or queries derived from the method name).
I have a data repository as below
public interface EventRepository extends CrudRepository<Event, Long> {
    @Query("select e from Event e where e.name = :eventName")
    public List<Event> findEventByName(@Param("eventName") String eventName);
}
The interface looks good, and it's enough according to the Spring reference doc.
But I need an impl class because I need many other methods in addition to the above.
I am facing 2 issues when I create an EventRepositoryImpl class implementing EventRepository:
It asks me to implement all the methods in EventRepository. findEventByName is self-contained in the interface, so why do I need to implement it again in the Impl class?
It asks me to implement all the methods in CrudRepository. I know that's per OO design, but there are many methods.
So, for these issues, can I define my EventRepositoryImpl as abstract?
This seems to be working fine.
But do I need to worry about anything else when Spring uses an abstract class as a bean?
Or is there a more elegant way to solve this issue?
Appreciate your help.
You do not have to implement all of those methods, nor create an abstract class. Take a look at the official documentation.
interface UserRepositoryCustom {
    public void someCustomMethod(User user);
}

class UserRepositoryImpl implements UserRepositoryCustom {
    public void someCustomMethod(User user) {
        // Your custom implementation
    }
}

interface UserRepository extends CrudRepository<User, Long>, UserRepositoryCustom {
    // Declare query methods here
}
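Note that the naming convention matters here: Spring Data finds the custom implementation by looking for a class named after the repository interface with an Impl suffix, as with UserRepositoryImpl above. As a usage sketch (the UserService consumer below is hypothetical, not from the documentation), the custom fragment and the inherited CRUD methods end up on the same injected bean:
@Service
public class UserService {

    private final UserRepository userRepository;

    @Autowired
    public UserService(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    public void registerUser(User user) {
        userRepository.save(user);              // inherited from CrudRepository
        userRepository.someCustomMethod(user);  // backed by UserRepositoryImpl
    }
}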

Dagger2 Using an entire Dependency Graph

I have set up Dagger 2 dependencies in my app as I understand them from the many examples. What I have not found is the proper way to use all of the dependencies once they are injected.
Each of the singletons in the module depends on the output of the singleton before it. How is the entire dependency graph used without calling each singleton in turn to get the required inputs?
Given the following:
AppComponent
@Singleton
@Component(modules = {
        DownloaderModule.class
})
public interface AppComponent {
    void inject(MyGameActivity activity);
}
DownloaderModule
@Module
public class DownloaderModule {
    public static final String NETWORK_CACHE = "game_cache";
    private static final int GLOBAL_TIMEOUT = 30; // seconds

    private final HttpUrl endpoint;

    public DownloaderModule(@NonNull String endpoint) {
        this(HttpUrl.parse(endpoint));
    }

    public DownloaderModule(@NonNull HttpUrl endpoint) {
        this.endpoint = endpoint;
    }

    @Provides @NonNull @Singleton
    public HttpUrl getEndpoint() {
        return this.endpoint;
    }

    @Provides @NonNull @Singleton @Named(NETWORK_CACHE)
    public File getCacheDirectory(@NonNull Context context) {
        return context.getDir(NETWORK_CACHE, Context.MODE_PRIVATE);
    }

    @Provides @NonNull @Singleton
    public Cache getNetworkCache(@NonNull @Named(NETWORK_CACHE) File cacheDir) {
        int cacheSize = 20 * 1024 * 1024; // 20 MiB
        return new Cache(cacheDir, cacheSize);
    }

    @Provides @NonNull @Singleton
    public OkHttpClient getHttpClient(@NonNull Cache cache) {
        return new OkHttpClient.Builder()
                .cache(cache)
                .connectTimeout(GLOBAL_TIMEOUT, TimeUnit.SECONDS)
                .readTimeout(GLOBAL_TIMEOUT, TimeUnit.SECONDS)
                .writeTimeout(GLOBAL_TIMEOUT, TimeUnit.SECONDS)
                .build();
    }
}
MyGameApp
public class MyGameApp extends Application {
    private AppComponent component;
    private static Context context;

    public static MyGameApp get(@NonNull Context context) {
        return (MyGameApp) context.getApplicationContext();
    }

    @Override
    public void onCreate() {
        super.onCreate();
        component = buildComponent();
        MyGameApp.context = getApplicationContext();
    }

    public AppComponent component() {
        return component;
    }

    protected AppComponent buildComponent() {
        return DaggerAppComponent.builder()
                .downloaderModule(new DownloaderModule("https://bogus.com/"))
                .build();
    }
}
I'll try to shed some light on this, but there are several ways you can read it. I prefer a bottom-up approach - basically start with what your objects require and work my way up. In this case, I would start at MyGameActivity. Unfortunately, you didn't paste the code for it, so I'll have to be a bit creative, but that's ok for the purpose of the exercise.
So in your app you're probably getting the AppComponent and calling inject for your MyGameActivity. So I guess this activity has some injectable fields. I'm not sure whether you're using OkHttpClient directly there, but let's say you are. Something like:
public class MyGameActivity extends SomeActivity {
    @Inject
    OkHttpClient okHttpClient;
    // ...
}
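For completeness, that field injection is usually triggered in the activity itself, along these lines (a sketch only; the onCreate wiring is assumed, since the activity code wasn't posted):
public class MyGameActivity extends SomeActivity {
    @Inject
    OkHttpClient okHttpClient;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Ask the application's component to fill in the @Inject fields above.
        MyGameApp.get(this).component().inject(this);
        // okHttpClient is usable from this point on.
    }
}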
The way I like to think about this is as follows: Dagger knows you need an OkHttpClient given by the AppComponent. So it will look into how this can be provided - can it build the object itself because you annotated the constructor with @Inject? Does it require more dependencies?
In this case it will look into the modules of the component where this client is being provided. It will reach getHttpClient and realise it needs a Cache object. It will again look for how this object can be provided - Constructor injection, another provider method?.
It's again provided in the module, so it will reach getNetworkCache and once more realise it needs yet another dependency.
This behaviour will carry on, until it reaches objects that require no other dependencies, such as your HttpUrl in getEndpoint.
After all this is done, your OkHttpClient can be created.
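To make that order concrete, the effect the first time the client is needed is roughly the following sequence of calls. This is a conceptual sketch only, not the actual generated DaggerAppComponent source, and it assumes a Context is bound somewhere in the graph, which your module doesn't show:
// Roughly what Dagger does for you when OkHttpClient is first requested:
OkHttpClient buildClientByHand(DownloaderModule module, Context context) {
    File cacheDir = module.getCacheDirectory(context); // leaf dependency: needs only Context
    Cache cache = module.getNetworkCache(cacheDir);    // needs the cache directory
    return module.getHttpClient(cache);                // needs the Cache
}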
I think it's easy to understand from this why you can't have cycles in your dependency graph - you cannot create an object A if it depends on B and B depends on A. So imagine that for some weird reason you reached the method getEndpoint and it depended on the OkHttpClient from that module. This wouldn't work. You'd be going in circles and never reach an end.
So if I understand your question: How is the entire dependency graph used without calling each singleton in turn to get the required inputs?
It's not. It has to call all the methods to be able to get the singletons - at least the first time they're provided within the same component/scope. After that, as long as you keep the same instance of your component, the scoped dependencies will always return the same instance. Dagger will make sure of this. If you for some reason destroy or recreate the component, then the dependencies won't be the same instances. More info here. In fact this is true for all scopes, not just @Singleton.
However, as far as I can tell you're doing it right. When your application is created you create the component and cache it. After that, every time you call component() you return the same component, so the scoped dependencies are always the same.

Entity Framework Core DbContext and Dependency Injection

I'm building a service application using Web API, .Net Core and EntityFramework Core.
For configuring options in my DbContext, I'm using these lines in the "ConfigureServices" method in Startup.cs:
var connection = @"Server=ISSQLDEV;Database=EventManagement;Trusted_Connection=True;";
services.AddDbContext<EMContext>(options => options.UseSqlServer(connection));
I know that if I add the context as a constructor parameter in the controller .Net will inject the context in the constructor.
But this is not the behavior I want. I don't want my web api to know anything about the dbcontext. I have a DataAccess Project with a repository class that handles all CRUD operations.
This means that I just want to say Repository.AddEvent(evt) in my controller and then repository knows how to handle that.
On the other hand, the repository uses a simple dependency resolver to get the right "IDataAdapter" implementation. One of those implementations is SQLDataAdapter. This is the point where I need my context.
How can I pass my context all the way to this point?
You can solve this by injecting your DbContext via constructor injection into the classes of your data access layer.
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDbContext<ApplicationDbContext>(o => o.UseSqlServer(myConnStr));
        services.AddScoped<Repository>(); // 'scoped' in ASP.NET means "per HTTP request"
    }
}

public class MvcController
{
    private Repository repo;

    public MvcController(Repository repo)
    {
        this.repo = repo;
    }

    [HttpPost]
    public void SomeEndpoint()
    {
        this.repo.AddFoo(new Foo());
    }
}

public class Repository
{
    private DbContext db;

    public Repository(ApplicationDbContext db)
    {
        this.db = db;
    }

    public void AddFoo(Foo obj)
    {
        this.db.Set<Foo>().Add(obj);
        this.db.SaveChanges();
    }
}
If you want to further customize how your DbContext is injected into your DI container, I suggest you look at what .AddDbContext is actually doing. See https://github.com/aspnet/EntityFramework/blob/1.0.0/src/Microsoft.EntityFrameworkCore/EntityFrameworkServiceCollectionExtensions.cs#L142-L158

Generic Repository session management asp.net-mvc fluent nhibernate

I have run into a problem with my project. I am using a generic repository with StructureMap together with Fluent NHibernate. Everything works rather well, but when it comes to transactions and session management I really have no clue what to do. I have looked around for answers but I can't really find anything that fits my needs.
What I do in my application is let StructureMap instantiate a repository class when it gets a request for one, like so:
internal class RepositoryRegistry : Registry
{
    public RepositoryRegistry()
    {
        For<IRepository<User>>().Use<Repository<User>>();
        For<IRepository<Tasks>>().Use<Repository<Tasks>>();
    }
}

internal class NHibernateRegistry : Registry
{
    public NHibernateRegistry()
    {
        For<ISessionFactory>()
            .Singleton()
            .Use(() => new NHibernateSessionFactory().GetSessionFactory());

        For<ISession>()
            .Singleton()
            .Use(x => x.GetInstance<ISessionFactory>().OpenSession());
    }
}

public interface IRepository<T>
{
    T GetById(int id);
    void SaveOrUpdate(T entity);
    IList<T> GetAll();
    IQueryable<T> Linq();
    void Add(T entity);
}
Edit: I have concluded what I need. I want to use the Unit of Work pattern along with StructureMap, but I also want some kind of repository wrapper which can be accessed through a unit of work.
Thanks,
James Ford
I think you are looking for the Unit of Work pattern, where the transaction lifetime is controlled by a unit of work that you inject into the repositories/services.
See this answer for a sample implementation of a UoW with NHibernate and StructureMap.
Edit:
Provided you have implemented a Unit of Work and a generic repository you would basically use them by:
1) Mapping them in structure map:
c.For(typeof(IRepository<>)).Use(typeof(Repository<>));
c.For<IUnitOfWork>().Use<UnitOfWork>();
2) Having the Controller accept a Repository (or a Service encapsulating the repository; this approach is often preferred) and the UnitOfWork:
public class MyController
{
    private readonly IRepository<MyEntity> _repository;
    private readonly IUnitOfWork _unitOfWork;

    public MyController(IRepository<MyEntity> repository, IUnitOfWork uow)
    {
        _repository = repository;
        _unitOfWork = uow;
    }
}
This of course also requires that you have created a custom ControllerFactory.
3) Using the Unit of Work and Repository in the controller action:
public ViewResult MyAction(MyEntity entity)
{
_repository.Save(entity);
_unitOfWork.Commit();
return View();
}

How to dispose resources with dependency injection

I'm using StructureMap to resolve references to my repository class. My repository interface implements IDisposable, e.g.
public interface IMyRepository : IDisposable
{
    SomeClass GetById(int id);
}
An implementation of the interface using Entity Framework:
public class MyRepository : IMyRepository
{
    private MyDbContext _dbContext;

    public MyRepository()
    {
        _dbContext = new MyDbContext();
    }

    public SomeClass GetById(int id)
    {
        var query = from x in _dbContext.Set<SomeClass>()
                    where x.Id == id
                    select x;
        return query.FirstOrDefault();
    }

    public void Dispose()
    {
        _dbContext.Dispose();
    }
}
Anyway as mentioned I'm using StructureMap to resolve IMyRepository. So when, where and how should I call my dispose method?
WARNING: please note that my views have changed, and you should consider the following advice outdated. Please see this answer for an updated view: https://stackoverflow.com/a/30287923/264697
While DI frameworks can manage the lifetime of objects for you, and some can even dispose of objects once you're done using them, it makes object disposal just too implicit. The IDisposable interface exists because there is a need for deterministic clean-up of resources. Therefore, in the context of DI, I personally like to make this clean-up very explicit. When you make it explicit, you basically have two options:
1. Configure the DI container to return transient objects and dispose of these objects yourself.
2. Configure a factory and instruct the factory to create new instances.
I favor the second approach over the first, because especially when doing Dependency Injection, your code isn't as clean as it could be. Look for instance at this code:
public sealed class Client : IDisposable
{
    private readonly IDependency dependency;

    public Client(IDependency dependency)
    {
        this.dependency = dependency;
    }

    public void Do()
    {
        this.dependency.DoSomething();
    }

    public void Dispose()
    {
        this.dependency.Dispose();
    }
}
While this code explicitly disposes the dependency, it could raise some eyebrows to readers, because resources should normally only be disposed by the owner of the resource. Apparently, the Client became the owner of the resource, when it was injected.
Because of this, I favor the use of a factory. Look for instance at this example:
public sealed class Client
{
    private readonly IDependencyFactory factory;

    public Client(IDependencyFactory factory)
    {
        this.factory = factory;
    }

    public void Do()
    {
        using (var dependency = this.factory.CreateNew())
        {
            dependency.DoSomething();
        }
    }
}
This example has the exact same behavior as the previous example, but see how the Client class doesn't have to implement IDisposable anymore, because it creates and disposes the resource within the Do method.
Injecting a factory is the most explicit way (the path of least surprise) to do this. That's why I prefer this style. Downside of this is that you often need to define more classes (for your factories), but I personally don't mind.
RPM1984 asked for a more concrete example.
I would not have the repository implement IDisposable, but instead have a Unit of Work that implements IDisposable and controls/contains the repositories, plus a factory that knows how to create new units of work. With that in mind, the above code would look like this:
public sealed class Client
{
    private readonly INorthwindUnitOfWorkFactory factory;

    public Client(INorthwindUnitOfWorkFactory factory)
    {
        this.factory = factory;
    }

    public void Do()
    {
        using (NorthwindUnitOfWork db =
            this.factory.CreateNew())
        {
            // 'Customers' is a repository.
            var customer = db.Customers.GetById(1);
            customer.Name = ".NET Junkie";
            db.SubmitChanges();
        }
    }
}
In the design I use, and have described here, I use a concrete NorthwindUnitOfWork class that wraps an IDataMapper, which is the gateway to the underlying LINQ provider (such as LINQ to SQL or Entity Framework). In summary, the design is as follows:
An INorthwindUnitOfWorkFactory is injected in a client.
The particular implementation of that factory creates a concrete NorthwindUnitOfWork class and injects an O/RM-specific IDataMapper class into it.
The NorthwindUnitOfWork is in fact a type-safe wrapper around the IDataMapper and the NorthwindUnitOfWork requests the IDataMapper for repositories and forwards requests to submit changes and dispose to the mapper.
The IDataMapper returns Repository<T> classes and a repository implements IQueryable<T> to allow the client to use LINQ over the repository.
The specific implementation of the IDataMapper holds a reference to the O/RM specific unit of work (for instance EF's ObjectContext). For that reason the IDataMapper must implement IDisposable.
This results in the following design:
public interface INorthwindUnitOfWorkFactory
{
    NorthwindUnitOfWork CreateNew();
}

public interface IDataMapper : IDisposable
{
    Repository<T> GetRepository<T>() where T : class;
    void Save();
}

public abstract class Repository<T> : IQueryable<T>
    where T : class
{
    private readonly IQueryable<T> query;

    protected Repository(IQueryable<T> query)
    {
        this.query = query;
    }

    public abstract void InsertOnSubmit(T entity);
    public abstract void DeleteOnSubmit(T entity);

    // IQueryable<T> members omitted.
}
The NorthwindUnitOfWork is a concrete class that contains properties to specific repositories, such as Customers, Orders, etc:
public sealed class NorthwindUnitOfWork : IDisposable
{
    private readonly IDataMapper mapper;

    public NorthwindUnitOfWork(IDataMapper mapper)
    {
        this.mapper = mapper;
    }

    // Repository properties here:
    public Repository<Customer> Customers
    {
        get { return this.mapper.GetRepository<Customer>(); }
    }

    public void Dispose()
    {
        this.mapper.Dispose();
    }
}
What's left is a concrete implementation of the INorthwindUnitOfWorkFactory and a concrete implementation of the IDataMapper. Here's one for Entity Framework:
public class EntityFrameworkNorthwindUnitOfWorkFactory
    : INorthwindUnitOfWorkFactory
{
    public NorthwindUnitOfWork CreateNew()
    {
        var db = new ObjectContext("name=NorthwindEntities");
        db.DefaultContainerName = "NorthwindEntities";

        var mapper = new EntityFrameworkDataMapper(db);

        return new NorthwindUnitOfWork(mapper);
    }
}
And the EntityFrameworkDataMapper:
public sealed class EntityFrameworkDataMapper : IDataMapper
{
    private readonly ObjectContext context;

    public EntityFrameworkDataMapper(ObjectContext context)
    {
        this.context = context;
    }

    public void Save()
    {
        this.context.SaveChanges();
    }

    public void Dispose()
    {
        this.context.Dispose();
    }

    public Repository<T> GetRepository<T>() where T : class
    {
        string setName = this.GetEntitySetName<T>();
        var query = this.context.CreateQuery<T>(setName);
        return new EntityRepository<T>(query, setName);
    }

    private string GetEntitySetName<T>()
    {
        EntityContainer container =
            this.context.MetadataWorkspace.GetEntityContainer(
                this.context.DefaultContainerName, DataSpace.CSpace);

        return (
            from item in container.BaseEntitySets
            where item.ElementType.Name == typeof(T).Name
            select item.Name).First();
    }

    private sealed class EntityRepository<T>
        : Repository<T> where T : class
    {
        private readonly ObjectQuery<T> query;
        private readonly string entitySetName;

        public EntityRepository(ObjectQuery<T> query,
            string entitySetName) : base(query)
        {
            this.query = query;
            this.entitySetName = entitySetName;
        }

        public override void InsertOnSubmit(T entity)
        {
            this.query.Context.AddObject(entitySetName, entity);
        }

        public override void DeleteOnSubmit(T entity)
        {
            this.query.Context.DeleteObject(entity);
        }
    }
}
You can find more information about this model here.
UPDATE December 2012
This is an update written two years after my original answer. In the last two years, much has changed in the way I try to design the systems I work on. Although it has suited me in the past, I don't like to use the factory approach anymore when dealing with the Unit of Work pattern. Instead, I simply inject a Unit of Work instance into consumers directly. Whether this design is feasible for you, however, depends a lot on the way your system is designed. If you want to read more about this, please take a look at this newer Stack Overflow answer of mine: One DbContext per web request…why?
If you want to get it right, I'd advise a couple of changes:
1 - Don't keep private instances of the data context in the repository. If you're working with multiple repositories then you'll end up with multiple contexts.
2 - To solve the above, wrap the context in a Unit of Work. Pass the unit of work to the repositories via the ctor: public MyRepository(IUnitOfWork uow)
3 - Make the Unit of Work implement IDisposable. The Unit of Work should be "newed up" when a request begins, and therefore should be disposed when the request finishes. The Repository should not implement IDisposable, as it is not directly working with resources - it is simply mediating them. The DataContext / Unit of Work should implement IDisposable.
4 - Assuming you are using a web application, you do not need to explicitly call Dispose - I repeat, you do not need to explicitly call your Dispose method. StructureMap has a method called HttpContextBuildPolicy.DisposeAndClearAll(), which invokes the "Dispose" method on any HTTP-scoped objects that implement IDisposable. Stick this call in Application_EndRequest (Global.asax). Also - I believe there is an updated method, called ReleaseAllHttpScopedObjects or something - I can't remember the name.
Instead of adding Dispose to IMyRepository, you could declare IMyRepository like this:
public interface IMyRepository : IDisposable
{
    SomeClass GetById(int id);
}
This way, you ensure every repository can have Dispose called on it at some point, and you can use the C# "using" pattern on a repository object:
using (IMyRepository rep = GetMyRepository(...))
{
    // ... do some work with rep
}