Profiles in Dagger 2

I am new to Dagger and am looking for a way to implement functionality like Spring profiles in Dagger 2.x. I want different beans for my devo and prod environments, but I am using the Dagger framework with Java.
@Provides
@Singleton
public void providesDaggerCoffeeShopClient(Stage stage) {
    DaggerCoffeeShop.builder()
            .dripCoffeeModule(new DripCoffeeModule())
            .qualifier(stage)
            .build();
}
Here, I want to skip this bean creation if stage is "Devo". Any help will be appreciated.

Well, I ran into this question two days ago and have since researched the matter. I was looking for a solution that would allow me to run the application with different profiles passed as a system property at launch, like:
java -Denv=local-dev-env -jar java-app.jar
The only appropriate solution I was able to find is to follow the official documentation's testing guide:
https://dagger.dev/dev-guide/testing
and divide my single module into several modules. In particular, I had to separate out and substitute the database dependency so that, when I run my app locally, it avoids connecting to, and executing any commands against, the real database.
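The split might look roughly like this (a minimal sketch; Dao, DatabaseDao and InMemoryDao are hypothetical names, while DaoModule and DaoMockModule are the modules wired into the components below):

import javax.inject.Singleton;
import dagger.Module;
import dagger.Provides;

// Real module (its own file): connects to the actual database.
@Module
public class DaoModule {
    @Provides
    @Singleton
    Dao provideDao() {
        return new DatabaseDao(); // hypothetical real implementation
    }
}

// Mock module (its own file): same binding, no database connection.
@Module
public class DaoMockModule {
    @Provides
    @Singleton
    Dao provideDao() {
        return new InMemoryDao(); // hypothetical in-memory implementation
    }
}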
When I run my app, I check the system property like this:
public boolean isLocalDevEnv() {
    return Environments.LOCAL_DEV.envName.equals(System.getProperty("env", Environments.PRODUCTION.envName));
}
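(For completeness, here is a minimal sketch of the Environments enum this check assumes; the LOCAL_DEV name matches the -Denv value above, and the PRODUCTION name is a guess.)

enum Environments {
    LOCAL_DEV("local-dev-env"),
    PRODUCTION("production");

    final String envName;

    Environments(String envName) {
        this.envName = envName;
    }
}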
If the system property does not contain the value I am looking for, I create the PRODUCTION instance of my component (which is configured to use production modules):
DaggerMyAppComponent.create()
Which approximately looks like:
@Component(modules = {MyAppModule.class, DaoModule.class})
@Singleton
public interface MyAppComponent {...}
Otherwise, I create the local-dev-env version of the component, which uses a module that produces a mock of the Dao (the real Dao would otherwise open a connection to the real database):
DaggerMyAppLocalDevEnvComponent.create()
Which approximately looks like:
@Component(modules = {MyAppModule.class, DaoMockModule.class})
@Singleton
public interface MyAppLocalDevEnvComponent {...}
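At startup, the decision then looks roughly like this (a minimal sketch; myApp().run() is a hypothetical entry point on each component, and isLocalDevEnv() is the check shown above, assumed here to be statically accessible):

public static void main(String[] args) {
    if (isLocalDevEnv()) {
        // local-dev-env profile: mock Dao, no real database
        DaggerMyAppLocalDevEnvComponent.create().myApp().run();
    } else {
        // production profile: real Dao
        DaggerMyAppComponent.create().myApp().run();
    }
}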
Hope it was clear; just think of Spring profiles for Dagger 2 in terms of system properties and programmatic decision making. This approach definitely requires a lot of boilerplate code compared to Spring's profiles implementation, but it is the only viable approach I was able to come up with.
Hope it helps.

Related

Does NUnit 3.9 support Test Suites?

I am trying to find a way to create custom suites of NUnit tests to target our wide variety of environments. The closest thing I found was this: http://nunit.org/docs/2.5.6/suite.html which is exactly what I am looking for. Trying to implement this, though, the [Suite] attribute doesn't even seem to exist. Was it taken away? Is there a better solution now?
The SuiteAttribute was eliminated in NUnit 3. It never got a lot of use as most people simply organize their tests by namespace, which provides the same grouping of tests that the SuiteAttribute used to do.
FUN FACT: "Automatic namespace suites" were once a new cool thing!
If you want the ability to group tests in different ways, across namespace boundaries, you can use categories to do it. It's not as easy of course.
An alternative, if you are using the command-line console runner, is to list the fixtures you want to run in a file and use the --testlist option.
Building off of Charlie's post above: the way I was able to set this up was using the --testlist option.
First, create a testlist.txt file and store it somewhere in your solution. Structure the file so that if you have a class like:
namespace NamespaceA
{
    class TestGroup
    {
        [Test]
        public void TestOne()
        {
        }

        [Test]
        public void TestTwo()
        {
        }
    }
}
the file's contents would look like this:
NamespaceA.TestGroup.TestOne
or, for both:
NamespaceA.TestGroup
Then run your standard console runner command:
"nunit-console.exe" "path/to/.dll" --testlist="path/to/testlist.txt"

How to choose MappingContext in spring-data-jpa (2x) + spring-rest-webmvc?

I've got a Module A that provides authentication through users, groups and related classes. This module uses org.springframework.data:spring-data-jpa:1.6.0.RELEASE to access this data from a database. Of note might be that Module A uses a custom BaseRepository configured by extending JpaRepositoryFactoryBean, but removing this does not resolve the issue below.
A second module, Module B, also has some classes and repositories to manage, unrelated to the Module A classes, again using spring-data-jpa for storage but connected to a different database. This project exposes its repositories via REST using org.springframework.data:spring-data-rest-webmvc:2.1.0.RELEASE. Module B uses the classes in Module A for authenticating users, but does not manipulate those class instances, nor does it store any references.
The issue I'm having now is that my Module B REST APIs work flawlessly when Module A is not present (or with an older version not yet using spring-data-jpa), but when it is present, it breaks on creating self-referential links with the stacktrace below:
java.lang.IllegalArgumentException: Cannot create self link for class Document! No persistent entity found!
at org.springframework.data.rest.webmvc.PersistentEntityResourceAssembler.getSelfLinkFor(PersistentEntityResourceAssembler.java:81) ~[spring-data-rest-webmvc-2.1.0.M1.jar:na]
at org.springframework.data.rest.webmvc.PersistentEntityResourceAssembler.toResource(PersistentEntityResourceAssembler.java:64) ~[spring-data-rest-webmvc-2.1.0.M1.jar:na]
at org.springframework.data.rest.webmvc.PersistentEntityResourceAssembler.toResource(PersistentEntityResourceAssembler.java:32) ~[spring-data-rest-webmvc-2.1.0.M1.jar:na]
at org.springframework.data.web.PagedResourcesAssembler.createResource(PagedResourcesAssembler.java:144) ~[spring-data-commons-1.8.0.M1.jar:na]
at org.springframework.data.web.PagedResourcesAssembler.toResource(PagedResourcesAssembler.java:96) ~[spring-data-commons-1.8.0.M1.jar:na]
at org.springframework.data.rest.webmvc.AbstractRepositoryRestController.entitiesToResources(AbstractRepositoryRestController.java:220) ~[spring-data-rest-webmvc-2.1.0.M1.jar:na]
at org.springframework.data.rest.webmvc.AbstractRepositoryRestController.resultToResources(AbstractRepositoryRestController.java:207) ~[spring-data-rest-webmvc-2.1.0.M1.jar:na]
at org.springframework.data.rest.webmvc.RepositoryEntityController.getCollectionResource(RepositoryEntityController.java:135) ~[spring-data-rest-webmvc-2.1.0.M1.jar:na]
See also: https://github.com/spring-projects/spring-data-rest/blob/master/spring-data-rest-webmvc/src/main/java/org/springframework/data/rest/webmvc/PersistentEntityResourceAssembler.java#L80
It looks to be talking to the wrong MappingContext in the RepositoryFactoryBeanSupport, even if my org.springframework.data.repository.support.Repositories contains all the repositoryBeanNames from both Module A and Module B.
Does anyone know how I can enforce the use of a particular MappingContext, perhaps through my extension of RepositoryRestMvcConfiguration?
** Edit **
Here's a GitHub repository illustrating the problem:
https://github.com/timtebeek/dual-data-jpa-rest-webmvc
It's since been reported as a bug on the data-rest project:
https://jira.spring.io/browse/DATAREST-312
This happened to me today. I was trying to query a specific entity, and I fixed it by creating a repository for that class. In your case it would be:
@Repository
public interface DocumentRepository extends JpaRepository<Document, Long> {
}
You also need to do all the configuration needed to use JPA repositories; a sketch follows below.
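A minimal sketch of that configuration, assuming placeholder package names and connection settings (none of them come from the question):

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableJpaRepositories(basePackages = "com.mycompany.repositories") // placeholder package
@EnableTransactionManagement
public class JpaConfig {

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setUrl("jdbc:h2:mem:testdb"); // placeholder connection
        return ds;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(dataSource());
        emf.setPackagesToScan("com.mycompany.entities"); // placeholder package
        emf.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        return emf;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager tm = new JpaTransactionManager();
        tm.setEntityManagerFactory(entityManagerFactory().getObject());
        return tm;
    }
}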
I hope that helps.

How do I detect whether a mongodb serializer is already registered?

I have created a custom serializer for mongoDB.
I can register it and it works as expected.
However, my application sometimes throws an error because it tries to register the serializer twice.
How do I detect whether a serializer has already been registered and thus stop my application from registering a second time?
If you are using
BsonSerializer.RegisterSerializer(typeof(Type), typeSerializer);
you might get the error "there is already a serializer registered for type", because you cannot register a serializer for the same type twice. But you can write your own serialization provider, and it will be consulted before the default serializers.
For instance, suppose you want to use local DateTime instead of UTC, which is the default. All you need to do is write a class implementing IBsonSerializationProvider and register this provider with BsonSerializer as early as possible. Here is the sample code:
public class LocalDateTimeSerializationProvider : IBsonSerializationProvider
{
    public IBsonSerializer GetSerializer(Type type)
    {
        return type == typeof(DateTime) ? DateTimeSerializer.LocalInstance : null;
    }
}
and to register it:
BsonSerializer.RegisterSerializationProvider(new LocalDateTimeSerializationProvider());
I hope this helps. You can also read the original documentation here. This applies to version 2.4 of the MongoDB .NET driver.
TL;DR: If you are lazy, use BsonSerializer.LookupSerializer or BsonMemberMap.GetSerializer. To do it right, make sure the registration code is called once and only once.
The best approach to avoid this is to make sure the serializer is registered only once. It's a good idea to have some global startup code that registers anything global to the application once, and only once. That includes things like dependency injector configuration and tools like AutoMapper and the MongoDB driver. If you call this code only once, from a single point in the code, you don't need to worry about thread safety, deadlocks or similar troubles.
The MongoDB driver configuration settings are thread-safe, but don't assume that this is true for all software packages that you might need to configure. Also, locking can be very expensive performance-wise if your code is multi-threaded, for instance in a web application. Last but not least, the lookup you're doing might not be trivial in the first place, because some methods need to walk an entire inheritance tree.

Selenium junit tests - how do I run tests within a test in sequential order?

I am using JUnit with Eclipse to write functional tests.
When running an individual test class, the methods run in the order in which I have defined them within the class.
Eg.
testCreateUser
testJoinUserToRoom
testVerify
testDeleteUser
However, when I run this test as part of a suite (so, in a package), the order is random. It will, for example, do the verify, then deleteUser, then joinUserToRoom, then createUser.
The tests within the suite are not dependent on each other, but the individual test methods within each test class do depend on being run in the correct order.
Is there any way I can achieve this?
Thanks.
You can't guarantee the order of execution of test methods in JUnit.
The order of execution of test classes within a suite is guaranteed (if you're using Suite), but the order of execution if the test classes are found by reflection isn't (for instance, if you're running a package in Eclipse, or a set of tests from maven or ant). This may be definable by ant or maven, but it isn't defined by JUnit.
In general, JUnit executes the test methods in the order in which they are defined in the source file, but not all JVMs guarantee this (particularly with JVM 7). If some of the methods are inherited from an abstract base test class, then this may not hold either. (This sounds like your case, but I can't tell from your description.)
For more information on this, see my answer to Has JUnit4 begun supporting ordering of test? Is it intentional?.
So what can you do to fix your problem? There are two solutions.
In your original example, you've actually only got one test (verify), but you've got 4 methods, two setup (createUser, joinUserToRoom) and one teardown (deleteUser). So your first option is to better define your test cases, using a TestRule, in particular ExternalResource. ExternalResource allows you to define before/after behaviour for a test, similar to @Before/@After. However, the advantage of ExternalResource is that you can factor this out of your test.
So, you would create/delete the user in your external resource:
public class UsesExternalResource {
    @Rule
    public ExternalResource resource = new ExternalResource() {
        @Override
        protected void before() throws Throwable {
            // create user
        }

        @Override
        protected void after() {
            // destroy user
        }
    };

    @Test
    public void testJoinUserToRoom() {
        // join user to room
        // verify all ok
    }
}
For me, this is simpler and easier to understand, and you get independent tests, which is a good thing. This is what I would do, but you will need to refactor your tests a bit. You can also stack these Rules using RuleChain.
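For example, stacking two resources might look roughly like this (a minimal sketch; both resources are hypothetical and mirror the ExternalResource above):

import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExternalResource;
import org.junit.rules.RuleChain;

public class UsesRuleChain {
    private final ExternalResource createUser = new ExternalResource() {
        @Override protected void before() { /* create user */ }
        @Override protected void after() { /* delete user */ }
    };
    private final ExternalResource joinRoom = new ExternalResource() {
        @Override protected void before() { /* join user to room */ }
        @Override protected void after() { /* remove user from room */ }
    };

    // outerRule is set up first and torn down last; around() nests inside it
    @Rule
    public RuleChain chain = RuleChain.outerRule(createUser).around(joinRoom);

    @Test
    public void testVerify() {
        // both resources have run their before() methods, in order, by now
    }
}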
Your second option, if you really want to introduce dependencies between your test methods, is to look at TestNG, in which you can define dependencies from one test to another.
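In TestNG the dependency might look like this (a minimal sketch using TestNG's dependsOnMethods attribute; the method names are taken from the question):

import org.testng.annotations.Test;

public class UserLifecycleTest {
    @Test
    public void testCreateUser() { /* create user */ }

    @Test(dependsOnMethods = "testCreateUser")
    public void testJoinUserToRoom() { /* join user to room */ }

    @Test(dependsOnMethods = "testJoinUserToRoom")
    public void testVerify() { /* verify all ok */ }

    @Test(dependsOnMethods = "testVerify")
    public void testDeleteUser() { /* delete user */ }
}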
If they have a 'correct' order, then they are not multiple tests, but a single test that you have incorrectly annotated as being multiple independent tests.
Best practice would be to rewrite them in the approved JUnit style (setup, act, verify), supported by @Before or @BeforeClass methods that do any required common setup.
The quick workaround would be to have a single @Test-annotated method that calls the other test methods in sequence; a sketch follows at the end of this answer. That becomes something like the preferred alternative if you are using JUnit not for strict unit testing but for something more like scenario-driven system testing. It's not necessarily the best tool for such use, but it does work perfectly well in some cases.
What you would have so far, then, is a single test:
@Test public void testUserNominalLifeCycle(...
which could then, if you are feeling virtuous, be supplemented by extra new tests like
@Test public void testUserWhoNeverJoinsARoom(...
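A minimal sketch of the quick workaround mentioned above, with the former test methods demoted to plain helpers (the helper names are illustrative):

@Test
public void testUserNominalLifeCycle() {
    createUser();
    joinUserToRoom();
    verifyUserIsInRoom();
    deleteUser();
}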

Can I use RequestFactory without getId() and getVersion() methods?

We are trying to use RequestFactory with an existing Java entity model. Our Java entities all implement a DomainObject interface and expose a getObjectId() method (this name was chosen because getId() can be ambiguous and conflict with the domain object's actual ID from the domain being modeled).
The ServiceLayerDecorator interface allows for customization of ID and Version property lookup strategies.
public class MyServiceLayerDecorator extends ServiceLayerDecorator {
    @Override
    public Object getId(Object object) {
        DomainObject domainObject = (DomainObject) object;
        return domainObject.getObjectId();
    }
}
So far, so good. However, trying to deploy this solution yields runtime errors. In particular, RequestFactoryInterfaceValidator complains:
[ERROR] There is no getId() method in type com.mycompany.server.MyEntity
Then later on:
[ERROR] Type type com.mycompany.client.MyEntityProxy was previously marked as bad
[ERROR] The type com.mycompany.client.MyEntityProxy did not pass RequestFactory validation
[ERROR] Unexpected error
com.google.web.bindery.requestfactory.server.UnexpectedException: The type com.mycompany.client.MyEntityProxy did not pass RequestFactory validation
at com.google.web.bindery.requestfactory.server.ServiceLayerDecorator.die(ServiceLayerDecorator.java:212) ~[gwt-servlet.jar:na]
My question is - why does the ServiceLayerDecorator allow for customized ID and Version lookup strategies if RequestFactoryInterfaceValidator is hardcoding the convention of getId() and getVersion()?
I guess I could override ServiceLayerDecorator.resolveClass() to ignore "poisoned" proxy classes but at this point it seems like I'm fighting the framework too much...
A couple of options, some of which have already been mentioned:
Locator. I like to make a single Locator for the entire project, or at least for groups of related objects that have similar key types; see the sketch after this list. The getId() call will be able to invoke your DomainObject.getObjectId() method and return that value. Note that the getDomainType() method is currently unused, and can return null or throw an exception.
ValueProxy. Instead of having your objects map to something RF can understand as an entity, map them to plain value objects - no id or version required. RF then misses out on a lot of the clever things it can do, especially with regard to avoiding sending redundant data to the server.
ServiceLayerDecorator. This worked pre 2.4, but with the annotation processing that goes on now, it works less well, since it tries to do some of the work for you. It seems ServiceLayerDecorator has lost a lot of its teeth in the last few months - in theory, you could use it to rebuild getters to talk directly to your persistence mechanism, but now that the annotation processing verifies your code, that is no longer an option.
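As mentioned in the first option, a single shared Locator might look roughly like this (a minimal sketch; the Long ID type, the Persistence lookup and the getVersion() accessor are assumptions, while getObjectId() comes from the question):

import com.google.web.bindery.requestfactory.shared.Locator;

public class DomainObjectLocator extends Locator<DomainObject, Long> {
    @Override
    public DomainObject create(Class<? extends DomainObject> clazz) {
        try {
            return clazz.newInstance();
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public DomainObject find(Class<? extends DomainObject> clazz, Long id) {
        return Persistence.find(clazz, id); // hypothetical lookup in your persistence layer
    }

    @Override
    public Class<DomainObject> getDomainType() {
        return null; // currently unused, per the note above
    }

    @Override
    public Long getId(DomainObject domainObject) {
        return domainObject.getObjectId();
    }

    @Override
    public Class<Long> getIdType() {
        return Long.class;
    }

    @Override
    public Object getVersion(DomainObject domainObject) {
        return domainObject.getVersion(); // assumes a version accessor on your entities
    }
}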
The big issue in all of this is that RequestFactory is designed to solve a single problem, and solve it well: allow developers to use POJOs mapped to some persistence mechanism, and refer to those objects from the client, following certain conventions to avoid writing extra code or configuration.
As a result, it solves its own problem pretty well, and ends up being a bad fit for many other problems/use-cases. You might be finding that it isn't worth it: if so, a few thoughts you might consider:
RPC. It isn't perfect for much, but it does an okay job for a lot.
AutoBeans (which RF is based on) is still a pretty fast, lightweight way to send data over the wire and get it into the app. You could build your own wrapper around it, like RF has done, and slim down the problem it is trying to solve to just your use-case.
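If you go that route, the core of an AutoBeans wrapper looks roughly like this (a minimal sketch; Person and PersonFactory are hypothetical, while the AutoBean APIs are the real GWT ones):

import com.google.web.bindery.autobean.shared.AutoBean;
import com.google.web.bindery.autobean.shared.AutoBeanCodex;
import com.google.web.bindery.autobean.shared.AutoBeanFactory;
import com.google.web.bindery.autobean.vm.AutoBeanFactorySource;

public class AutoBeanExample {
    // The bean is just an interface; AutoBeans supplies the implementation.
    public interface Person {
        String getName();
        void setName(String name);
    }

    public interface PersonFactory extends AutoBeanFactory {
        AutoBean<Person> person();
    }

    public static void main(String[] args) {
        // On the client you would use GWT.create(PersonFactory.class) instead.
        PersonFactory factory = AutoBeanFactorySource.create(PersonFactory.class);
        AutoBean<Person> bean = factory.person();
        bean.as().setName("Alice");

        // Encode to JSON for the wire, then decode on the other side.
        String json = AutoBeanCodex.encode(bean).getPayload();
        Person decoded = AutoBeanCodex.decode(factory, Person.class, json).as();
        System.out.println(decoded.getName());
    }
}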