I have set up Dagger 2 dependencies in my app as I understand them from the many examples. What I have not found is the proper way to use all of the dependencies once they are injected.
Each of the singletons in the module depends on the output of the singleton before it. How is the entire dependency graph used without calling each singleton in turn to get the required inputs?
Given the following:
AppComponent
@Singleton
@Component(modules = {
DownloaderModule.class
})
public interface AppComponent {
void inject(MyGameActivity activity);
}
DownloaderModule
@Module
public class DownloaderModule {
public static final String NETWORK_CACHE = "game_cache";
private static final int GLOBAL_TIMEOUT = 30; // seconds
private final HttpUrl endpoint;
public DownloaderModule(@NonNull String endpoint) {
this(HttpUrl.parse(endpoint));
}
public DownloaderModule(@NonNull HttpUrl endpoint) {
this.endpoint = endpoint;
}
@Provides @NonNull @Singleton
public HttpUrl getEndpoint() {
return this.endpoint;
}
@Provides @NonNull @Singleton @Named(NETWORK_CACHE)
public File getCacheDirectory(@NonNull Context context) {
return context.getDir(NETWORK_CACHE, Context.MODE_PRIVATE);
}
@Provides @NonNull @Singleton
public Cache getNetworkCache(@NonNull @Named(NETWORK_CACHE) File cacheDir) {
int cacheSize = 20 * 1024 * 1024; // 20 MiB
return new Cache(cacheDir, cacheSize);
}
@Provides @NonNull @Singleton
public OkHttpClient getHttpClient(@NonNull Cache cache) {
return new OkHttpClient.Builder()
.cache(cache)
.connectTimeout(GLOBAL_TIMEOUT, TimeUnit.SECONDS)
.readTimeout(GLOBAL_TIMEOUT, TimeUnit.SECONDS)
.writeTimeout(GLOBAL_TIMEOUT, TimeUnit.SECONDS)
.build();
}
}
MyGameApp
public class MyGameApp extends Application {
private AppComponent component;
private static Context context;
public static MyGameApp get(@NonNull Context context) {
return (MyGameApp) context.getApplicationContext();
}
@Override
public void onCreate() {
super.onCreate();
component = buildComponent();
MyGameApp.context = getApplicationContext();
}
public AppComponent component() {
return component;
}
protected AppComponent buildComponent() {
return DaggerAppComponent.builder()
.downloaderModule(new DownloaderModule("https://bogus.com/"))
.build();
}
}
I'll try to shed some light on this, though there are several ways you can read it. I prefer a bottom-up approach: basically, start with what your objects require and work your way up. In this case, I would start at MyGameActivity. Unfortunately, you didn't paste the code for this, so I'll have to be a bit creative, but that's OK for the purpose of the exercise.
So in your app you're probably getting the AppComponent and calling inject for your MyGameActivity, which means this activity has some injectable fields. I'm not sure whether you're using OkHttpClient directly there, but let's say you do. Something like:
public class MyGameActivity extends SomeActivity {
@Inject
OkHttpClient okHttpClient;
// ...
}
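For completeness, here is a minimal sketch of how the injection call itself might look in the activity. You didn't post this part, so the exact wiring is an assumption based on the MyGameApp shown above:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Ask the application-scoped component to fill in the @Inject fields:
    MyGameApp.get(this).component().inject(this);
    // From here on, okHttpClient is ready to use.
}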
The way I like to think about this is as follows: Dagger knows you need an OkHttpClient provided by the AppComponent, so it will look into how this object can be provided. Can it build the object itself because you annotated the constructor with @Inject? Does it require more dependencies?
In this case it will look into the modules of the component where this client is being provided. It will reach getHttpClient and realise it needs a Cache object. It will again look for how this object can be provided: constructor injection, another provider method?
It's again provided in the module, so it will reach getNetworkCache and once more realise it needs yet another dependency.
This behaviour will carry on, until it reaches objects that require no other dependencies, such as your HttpUrl in getEndpoint.
After all this is done, your OkHttpClient can be created.
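Conceptually, the generated code ends up doing something equivalent to the following hand-written sketch (simplified, not the actual generated class; it assumes a DownloaderModule instance called downloaderModule, and context comes from whatever binding your graph provides):

HttpUrl endpoint = downloaderModule.getEndpoint();                 // no dependencies
File cacheDir = downloaderModule.getCacheDirectory(context);       // needs a Context
Cache cache = downloaderModule.getNetworkCache(cacheDir);          // needs the cache dir
OkHttpClient okHttpClient = downloaderModule.getHttpClient(cache); // needs the cache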
I think it's easy to understand from this why you can't have cycles in your dependency graph: you cannot create an object A if it depends on B and B depends on A. So imagine that for some weird reason you'd reach the method getEndpoint and it would depend on the OkHttpClient from that module. This wouldn't work; you'd be going in circles and never reach an end.
So if I understand your question: How is the entire dependency graph used without calling each singleton in turn to get the required inputs?
It's not. It has to call all the methods to be able to get the singletons, at least the first time they're provided within the same component/scope. After that, as long as you keep the same instance of your component, the scoped dependencies will always return the same instance; Dagger will make sure of this. If you for some reason destroy the component or recreate it, then the dependencies won't be the same instances. In fact, this is true for all scopes, not just @Singleton.
However, as far as I can tell you're doing it right. When your application is created, you create the component and cache it. After that, every time you use the method component() you always return the same component, so the scoped dependencies are always the same.
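To make the scoping guarantee concrete, here is a small sketch. It assumes you added a provision method like OkHttpClient okHttpClient(); to AppComponent, which is not in your code:

// Given some Context available here:
AppComponent component = MyGameApp.get(context).component();
// Same component instance, same @Singleton instance:
assertSame(component.okHttpClient(), component.okHttpClient());

// A freshly built component has its own scope, so instances differ:
AppComponent other = DaggerAppComponent.builder()
    .downloaderModule(new DownloaderModule("https://bogus.com/"))
    .build();
assertNotSame(component.okHttpClient(), other.okHttpClient());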
I'm trying to achieve encapsulation by using a subcomponent, as described here, but I got infinite recursion.
Here is my code:
// tried adding @ScopeA, still the same.
public class A {
@Inject
A(B b) {
}
}
@ScopeA
public class B {
@Inject
B() {
}
}
@Component(modules = AModule.class)
@Singleton
public interface AComponent {
public A a();
}
@Module(subcomponents = SComponent.class)
class AModule {
@Provides
@Singleton
A a(SComponent.Factory factory) {
return factory.component().a();
}
}
@Subcomponent
@ScopeA
interface SComponent {
@ScopeA
A a();
@Subcomponent.Factory
interface Factory {
SComponent component();
}
}
public class MainActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
DaggerAComponent.create().a();
}
}
After checking the generated Dagger code, I found this:
private final class SComponentImpl implements SComponent {
private SComponentImpl() {}
@Override
public A a() {
return DaggerAComponent.this.aProvider.get();
}
}
It seems that SComponent is getting A from the parent component, which is not what I wanted. Where is the problem in my code?
Note that the example from the Subcomponents for Encapsulation page uses a qualifier annotation, @PrivateToDatabase, which is not a scoping annotation and which distinguishes the binding of Database from the binding of @PrivateToDatabase Database.
Subcomponents inherit all of the bindings from their parent components, so you currently have A available from the parent component and also from the subcomponent. This would be especially tricky if anything in your subcomponent needed to inject A and it weren't marked @Singleton: would you want the A from the parent component, or the A from the subcomponent?
Another tricky part of this situation is that you can't use qualifier annotations on classes that use @Inject constructors.
I'd recommend that you do the following:
Extract an interface from A, so then you have A and AImpl.
Keep your @Provides method that gets an A instance from the subcomponent.
Have the subcomponent expose AImpl and, to best avoid ambiguity, only inject AImpl in the classes in your subcomponent, not A.
If you'd rather not extract an interface, you could also work around this problem by removing @Inject from A and writing a @Provides method in a module in the subcomponent that returns a qualified A, so the unqualified A goes through the top-level component and the qualified A is only available within the subcomponent.
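A sketch of that last workaround; the @PrivateToS qualifier is a name I made up, and A's constructor no longer carries @Inject:

@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@interface PrivateToS {}

@Module
class SModule {
    @Provides
    @ScopeA
    @PrivateToS
    static A a(B b) {
        return new A(b); // plain constructor call, no @Inject on A
    }
}

@ScopeA
@Subcomponent(modules = SModule.class)
interface SComponent {
    @PrivateToS
    A a();

    @Subcomponent.Factory
    interface Factory {
        SComponent component();
    }
}

AModule's @Provides method can stay as it is: the unqualified A at the top level now delegates to the qualified binding, which only exists inside the subcomponent, so the recursion disappears.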
I am pretty new to Dagger and find the component body a bit difficult to understand. I have 2 specific questions related to the component implementation:
1)
@Singleton
@Component(modules = { UserModule.class, BackEndServiceModule.class })
public interface MyComponent {
BackendService provideBackendService();// Line 1
void inject(Main main); // Line 2
}
What is the purpose of Line 2? Also, will an instance of BackendService be created even if Line 1 is removed?
And in the code below, where the implementation of the above interface is generated, what does component.inject(this) actually do?
public class Main {
@Inject
BackendService backendService;
private MyComponent component;
private Main() {
component = DaggerMyComponent.builder().build();
component.inject(this);
}
private void callServer() {
boolean callServer = backendService.callServer();
if (callServer) {
System.out.println("Server call was successful. ");
} else {
System.out.println("Server call failed. ");
}
}
}
And also, why is backendService not obtained using component.provideBackendService()?
What is the purpose of void inject(Main main);?
It lets you perform field injection on the concrete class Main, assuming that Main is a class that cannot be created by Dagger.
Where the implementation of the above interface is generated, what does component.inject(this) actually do?
It uses a MembersInjector to inject the package-protected or public fields marked with @Inject. You can see the implementation of the inject(Main) method in the DaggerMyComponent class.
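Roughly, the generated inject(Main) boils down to a field assignment like this (a simplified sketch, not the exact generated code):

@Override
public void inject(Main main) {
    // Each field marked with @Inject is set from its provider:
    main.backendService = backendServiceProvider.get();
}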
Of course, if possible it is better to arrange things so that:
1.) Main does not instantiate/know about its own injector
2.) Main is created by the Dagger component and an @Inject constructor is used
@Singleton
public class Main {
private final BackendService backendService;
@Inject
Main(BackendService backendService) {
this.backendService = backendService;
}
}
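Once Main is constructor-injected like this, the component can hand it out directly. The main() provision method below is an assumption, not something from the original code:

@Singleton
@Component(modules = { UserModule.class, BackEndServiceModule.class })
public interface MyComponent {
    Main main();
}

// At the composition root:
Main main = DaggerMyComponent.create().main();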
Using Dagger 2, I have a domain object that I provide to presenters. That domain object has a dependency on a repository. That repository has two implementations, but both implement the same interface. I need to be able to set up Dagger somehow to swap between the two implementations of the repository at runtime, based on a user selecting a "Demo Mode" option.
So I have the following domain object:
public class SomeAwesomeBusinessLogic {
Repository repository;
@Inject
public SomeAwesomeBusinessLogic(Repository repository) {
this.repository = repository;
}
//awesome stuff goin down
}
And the two repositories:
public class RemoteRepository implements Repository {
@Inject
public RemoteRepository() {
//setup
}
}
public class DemoRepository implements Repository {
@Inject
public DemoRepository() {
//setup
}
}
Any ideas on how to structure my modules and components to get this to work?
A couple of ideas come to mind, depending on how and when you want to exchange this. One possibility is to instantiate your module with a configuration; it can be a simple boolean:
@Module
public class RepositoryModule {
private final boolean isDemo;
public RepositoryModule(boolean isDemo) {
this.isDemo = isDemo;
}
@Provides
public Repository providesRepository() {
return isDemo ? new DemoRepository() : new RemoteRepository();
}
// Other providers
}
I'm not a fan of this approach, but it fits your use case. I believe it's quite restrictive and doesn't allow for easy maintainability. I would perhaps choose to use a factory and provide the factory instead of the repository itself.
public interface RepositoryFactory {
Repository getRepository(Configuration configuration);
}
public class RepositoryFactoryImpl implements RepositoryFactory {
@Inject
public RepositoryFactoryImpl() {}
// A minimal implementation, assuming Configuration exposes isDemo():
@Override
public Repository getRepository(Configuration configuration) {
return configuration.isDemo() ? new DemoRepository() : new RemoteRepository();
}
}
@Module
public class RepositoryModule {
@Provides
public RepositoryFactory providesRepositoryFactory(
RepositoryFactoryImpl factory) {
return factory;
}
// Other providers
}
Configuration would be a simple POJO where you can specify several attributes to configure your repo. Say for example:
public class Configuration {
private final boolean isDemo;
public Configuration(boolean isDemo) { this.isDemo = isDemo; }
public boolean isDemo() { return isDemo; }
// ...plus all the other POJO stuff you need
}
You can then make your domain object depend on the factory:
public class SomeAwesomeBusinessLogic {
private final RepositoryFactory repositoryFactory;
@Inject
public SomeAwesomeBusinessLogic(
RepositoryFactory repositoryFactory) {
this.repositoryFactory = repositoryFactory;
}
//awesome stuff going down
}
You can then do repositoryFactory.getRepository(new Configuration(true/false)); in your business logic.
I prefer this approach because it's easier to extend to new types of repositories. You can also test whether the factory logic is correct. No one usually tests Dagger modules; that's why I'm not so keen on the first approach.
Another good thing is that this approach lets you keep the same module instance even if the app's configuration can change at runtime and suddenly switch to a demo repo: you just provide a different Configuration object to the factory.
A third possibility might be Producers. However, this really depends on your use case and how you want to handle this runtime dependency exchange. I find this approach quite good, but it might be a bit overkill.
Hope this helps
I thought I knew the GWT serialization rules, but apparently I don't. This case is just weird; I've been trying to figure it out for a couple of hours, still no luck. Maybe you guys could lend me a hand on this one?
First things first: the stack trace.
...blah blah blah...
Caused by: com.google.gwt.user.client.rpc.SerializationException: Type 'geos.dto.common.client.Market' was not included in the set of types which can be serialized by this SerializationPolicy or its Class object could not be loaded. For security purposes, this type will not be serialized.: instance = null
at com.google.gwt.user.server.rpc.impl.ServerSerializationStreamWriter.serialize(ServerSerializationStreamWriter.java:619)
at com.google.gwt.user.client.rpc.impl.AbstractSerializationStreamWriter.writeObject(AbstractSerializationStreamWriter.java:126)
at com.google.gwt.user.client.rpc.core.java.util.Collection_CustomFieldSerializerBase.serialize(Collection_CustomFieldSerializerBase.java:44)
at com.google.gwt.user.client.rpc.core.java.util.HashSet_CustomFieldSerializer.serialize(HashSet_CustomFieldSerializer.java:39)
at com.google.gwt.user.client.rpc.core.java.util.HashSet_CustomFieldSerializer.serializeInstance(HashSet_CustomFieldSerializer.java:51)
at com.google.gwt.user.client.rpc.core.java.util.HashSet_CustomFieldSerializer.serializeInstance(HashSet_CustomFieldSerializer.java:28)
at com.google.gwt.user.server.rpc.impl.ServerSerializationStreamWriter.serializeImpl(ServerSerializationStreamWriter.java:740)
at com.google.gwt.user.server.rpc.impl.ServerSerializationStreamWriter.serialize(ServerSerializationStreamWriter.java:621)
at com.google.gwt.user.client.rpc.impl.AbstractSerializationStreamWriter.writeObject(AbstractSerializationStreamWriter.java:126)
at com.extjs.gxt.ui.client.data.RpcMap_CustomFieldSerializer.serialize(RpcMap_CustomFieldSerializer.java:35)
... 78 more
So it appears the problem lies in geos.dto.common.client.Market. Let's see the minimal version that still compiles.
package geos.dto.common.client;
public class Market extends RowModel<Integer> {
public static final String ID="id";
public static final String NAME="name";
public Market() { }
public Market(int id, String name) { }
public String getName() { }
public void setName(String name) { }
}
Either I really need a vacation, or it's just fine. A LOT of DTO classes inherit from RowModel; they are working and serialized properly, no problems there. But of course I'll show it to you anyway. This time some GXT stuff is ahead. This class is unedited, but still fairly simple.
package geos.dto.common.client;
import com.extjs.gxt.ui.client.data.BaseModelData;
public class RowModel<I> extends BaseModelData implements IdentifiableModelData<I> {
private I identifier;
private String identifierProperty;
public RowModel() { }
public RowModel(String identifierProperty) {
this.identifierProperty=identifierProperty;
}
@Override
public I getIdentifier() {
return identifier;
}
public void setIdentifier(I identifier) {
this.identifier = identifier;
if((identifierProperty!=null)&&(!identifierProperty.isEmpty())) {
set(identifierProperty,identifier);
}
}
public void setIdentifierProperty(String identifierProperty) {
this.identifierProperty = identifierProperty;
if(identifier!=null) {
set(identifierProperty,identifier);
}
}
public String getIdentifierProperty() {
return identifierProperty;
}
@Override
public <X> X set(String property, X value) {
if(property.equals(identifierProperty)&&((identifier==null)||(!getIdentifier().equals(value)))) {
setIdentifier((I)value);
}
return super.set(property, value);
}
}
Looks somewhat weird, I know, but this identifier is really important. I removed toString(), which in this case returns null (because the internal RpcMap is null, and it's null because no values are set in the Market class). And the last piece of code, the interface implemented by RowModel:
package geos.dto.common.client;
import com.extjs.gxt.ui.client.data.ModelData;
import java.io.Serializable;
public interface IdentifiableModelData<I> extends ModelData, Serializable {
public I getIdentifier();
}
The versions are GWT 2.4.0 and GXT 2.2.5. I want to upgrade soon, but first I want to deal with problems like this one.
And that would be all, I think. Do you see anything I can't see? I certainly hope so! Thanks!
Assuming that your package structure follows the naming conventions: is it possible that you have to move your Market class into the shared package?
If you make an RPC call, the class is serialized on the client side and deserialized on the server side. Therefore the class has to be accessible from both the client and the server. If your class lies in the client package, the server can't access it. Classes that are used on both the client and the server side are put in a package called shared.
So, all classes that are only needed in your client should be inside a package called client. Classes that are needed on both the server and the client side should be inside the shared package, and classes that are only needed on the server side go inside the server package.
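For illustration, with the package root from the example below, the layout would be:

de.gishmo.leela.application.client (client-only classes)
de.gishmo.leela.application.shared (classes used on both sides; Market would belong here)
de.gishmo.leela.application.server (server-only classes)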
This is my abstract class, which extends BaseModelData and lies inside the shared package:
package de.gishmo.leela.application.shared.models;
import java.io.Serializable;
import com.extjs.gxt.ui.client.data.BaseModelData;
@SuppressWarnings("serial")
public abstract class MyBaseModel
extends BaseModelData
implements Serializable {
public final static String MYFIELD = "myField";
public abstract String getModelName();
}
It works well in RPC calls.
And please implement the Serializable interface.
It turns out I'd simply forgotten something.
The problem wasn't in that class at all. The thing is, Market is transferred using another class that extends RowModel as well, and it's set this way:
public void setMarkets(Set<Market> markets) {
set(MARKETS,markets);
}
And because I hadn't included the Market type as a field in that class, GWT didn't know it should be serializable at compilation time. Adding private Market _market; in that class did the trick. It's actually a well-known issue with subclasses of BaseModelData (they can't serialize types that are not declared as class fields), but I had totally forgotten it...
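In other words, the fix looks like this; MarketsModel is a made-up name standing in for my actual class that extends RowModel:

public class MarketsModel extends RowModel<Integer> {
    // Never read directly; it only forces Market into the set of types
    // the GWT compiler adds to the serialization policy.
    private Market _market;

    public void setMarkets(Set<Market> markets) {
        set(MARKETS, markets);
    }
}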
I want to run the same JUnit tests for different interface implementations. I found a nice solution with the @Parameters option:
@RunWith(Parameterized.class) // the JUnit 4 Parameterized runner
public class InterfaceTest {
MyInterface myInterface; // "interface" is a reserved word, so the field needs another name
public InterfaceTest(MyInterface myInterface) {
this.myInterface = myInterface;
}
@Parameters
public static Collection<Object[]> getParameters()
{
return Arrays.asList(new Object[][] {
{ new GoodInterfaceImpl() },
{ new AnotherInterfaceImpl() }
});
}
}
This test would be run twice, first with the GoodInterfaceImpl class and then with the AnotherInterfaceImpl class. But the problem is that for most of the test cases I need a new object. A simplified example:
@Test
public void isEmptyTest() {
assertTrue(myInterface.isEmpty());
}
@Test
public void insertTest() {
myInterface.insert(new Object());
assertFalse(myInterface.isEmpty());
}
If isEmptyTest runs after insertTest, it fails.
Is there an option to automatically run each test case with a new instance of an implementation?
BTW: implementing a clear() or reset() method for the interface is not really an option, since I would not need it in production code.
Here is another approach with the Template Method pattern:
The interface-oriented tests go into the base class:
public abstract class MyInterfaceTest {
private MyInterface myInterface;
protected abstract MyInterface makeContractSubject();
@Before
public void setUp() {
myInterface = makeContractSubject();
}
@Test
public void isEmptyTest(){
assertTrue(myInterface.isEmpty());
}
@Test
public void insertTest(){
myInterface.insert(new Object());
assertFalse(myInterface.isEmpty());
}
}
For each concrete class, define a concrete test class:
public class GoodInterfaceImplTest extends MyInterfaceTest {
@Override
protected MyInterface makeContractSubject() {
// initialize new GoodInterfaceImpl
// insert proper stubs
return ...;
}
@Test
public void additionalImplementationSpecificStuff() {
...
}
}
A slight advantage over @Parameters is that you get the name of the concrete test class reported when a test fails, so you know right away which implementation failed.
Btw, in order for this approach to work at all, the interface must be designed in a way that allows testing through the interface methods only. This implies state-based testing: you cannot verify mocks in the base test class. If you need to verify mocks in implementation-specific tests, those tests must go into the concrete test classes.
Create a factory interface and implementations, possibly only in your test hierarchy if you don't need such a thing in production, and make getParameters() return a list of factories.
Then you can invoke the factory in an @Before annotated method to get a new instance of your actual class under test for each test method run.
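A sketch of that factory idea, with all the names assumed:

interface MyInterfaceFactory {
    MyInterface create();
}

@RunWith(Parameterized.class)
public class InterfaceTest {
    private final MyInterfaceFactory factory;
    private MyInterface myInterface;

    public InterfaceTest(MyInterfaceFactory factory) {
        this.factory = factory;
    }

    @Parameters
    public static Collection<Object[]> getParameters() {
        return Arrays.asList(new Object[][] {
            { (MyInterfaceFactory) GoodInterfaceImpl::new },
            { (MyInterfaceFactory) AnotherInterfaceImpl::new }
        });
    }

    @Before
    public void setUp() {
        myInterface = factory.create(); // a fresh instance for every test method
    }

    // ...the tests then use myInterface exactly as before
}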
Just in case somebody reaches here (like I did) looking for how to test multiple implementations of the same interface in .NET, you can see one of the approaches I used in one of my projects here.
Below, in short, is what we are following:
The same test project DLL is run twice using vstest.console, with an environment variable set. Inside the test (either in the assembly initializer or the test initializer), register the appropriate implementations into an IoC container, based on the environment variable's value.
In JUnit 5 you could do:
@ParameterizedTest
@MethodSource("myInterfaceProvider")
void test(MyInterface myInterface) {}
static Stream<MyInterface> myInterfaceProvider() {
return Stream.of(new ImplA(), new ImplB());
}
interface MyInterface {}
static class ImplA implements MyInterface {}
static class ImplB implements MyInterface {}
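Note that JUnit 5 invokes the @MethodSource provider again for every @ParameterizedTest method, so each test method works with fresh instances; that also solves the shared-state problem from the question.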