Where do the Locator methods go when using a ServiceLocator? - gwt

If I use separate service classes with a ServiceLocator, do I still need to define the Locator methods somewhere, i.e.:
T create(Class<? extends T> clazz)
I getId(T domainObject)
T find(Class<? extends T> clazz, I id)
Object getVersion(T domainObject)
Do they go on the service class?

In general, these two types serve orthogonal purposes: A ServiceLocator finds code; a Locator finds entities.
Using a ServiceLocator does not change where the entity support methods are declared. The entity support methods are still searched for in the domain types.
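For illustration, here is a minimal sketch of a Locator supplying those methods for a domain type (the Employee entity and the dao lookup are hypothetical), sitting alongside whatever ServiceLocator you use for your service classes:
public class EmployeeLocator extends Locator<Employee, Long> {
    // The entity-support methods live here (or on the entity itself), never on the service class.
    @Override public Employee create(Class<? extends Employee> clazz) { return new Employee(); }
    @Override public Employee find(Class<? extends Employee> clazz, Long id) { return dao.find(id); } // hypothetical DAO
    @Override public Class<Employee> getDomainType() { return Employee.class; }
    @Override public Long getId(Employee domainObject) { return domainObject.getId(); }
    @Override public Class<Long> getIdType() { return Long.class; }
    @Override public Object getVersion(Employee domainObject) { return domainObject.getVersion(); }
}

// Referenced from the proxy, independently of any ServiceLocator on the RequestContext:
@ProxyFor(value = Employee.class, locator = EmployeeLocator.class)
interface EmployeeProxy extends EntityProxy { /* getters/setters */ }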

Related

Issue while implementing an interface which extends the MongoRepository interface in Kotlin

I am trying to use the built-in methods of the MongoRepository<T, ID> interface to interact with Mongo.
interface MovieRepository : MongoRepository<Movie, String> {
}
But when I try to implement "MovieRepository" in a class, it asks me to implement all the member functions defined in "MongoRepository" as well:
class ControllerClass(private val MovieRepository: MovieRepository): MovieRepository {}
This is what I get when I initialize my controller class:
Class 'ControllerClass' is not abstract and does not implement abstract member public abstract fun <S : Movie!> save(entity: S): S
Is there any way so that I do not need to define all those MongoRepository functions again in my ControllerClass?
You don't usually implement a repository interface yourself: you let Spring do it for you!
First, you define your interface, as you have done:
interface MovieRepository : MongoRepository<Movie, String> {
// Add any abstract methods you'll need here…
}
Then, you autowire a property of that type. In Kotlin, you can either do it in the primary constructor, e.g.:
@Controller
class ControllerClass @Autowired constructor(
    private val movieRepository: MovieRepository
) {
    // …code…
}
Or as a plain property. (In this case, because you can't give it an initial value, the property has to be a var; it must either be nullable, which means !! everywhere you use it, or, better, lateinit.)
@Controller
class ControllerClass {
    @Autowired private lateinit var movieRepository: MovieRepository
    // …code…
}
Spring will then create some synthetic class implementing that interface, and set your property to it. (You don't need to worry about how it does that — just as you don't need to worry about all the other magic it does, much of which involves creating synthetic subclasses. That's why Spring objects generally need to be made open — and why there's a Spring plugin which takes care of doing that.)
It's more usual to use the repository in a service class, and then call that from your controller class — at least, that pattern tends to scale better, and be easier to follow and to test. But doing so directly should work too. Either way, you can call whichever repository method you need, e.g. movieRepository.findAll().
See the Spring docs; they use Java, but it's mostly trivial to convert to Kotlin.

Dagger 2, scopes + annotation

I have never worked with such a confusing DI framework as Dagger! However, I am trying to wrap my head around it.
I have two scopes: ActivityScope and FragmentScope
In some of the provided samples, e.g. StatisticsFragment.java, you see the fragment annotated with the scope:
@ActivityScoped
public class StatisticsFragment extends DaggerFragment implements
        StatisticsContract.View {
    ...
}
Question 1:
Is this just documentation or not? In my app it makes no difference if I annotate the concrete fragment or not.
Question 2: Where in the generated code can I see which scope is used? My fragment injects a Presenter and an AuthProvider. The AuthProvider is annotated with Singleton (in AppModule); the Presenter is defined in UIModule -> LoginModule, which looks like this:
UIModule.java:
@Module(includes = AndroidSupportInjectionModule.class)
public abstract class UIModule {

    @ActivityScope
    @ContributesAndroidInjector(modules = LoginModule.class)
    abstract LoginActivity loginActivity();

    @ChildFragmentScope
    @ContributesAndroidInjector(modules = LoginModule.class)
    abstract LoginFragment loginFragment();

    @Binds
    //@ChildFragmentScope
    public abstract LoginContract.View loginView(final LoginFragment fragment);
}
LoginModule.java
@Module
public abstract class LoginModule {

    @Provides
    //@ChildFragmentScope
    static LoginContract.Presenter provideLoginPresenter(final LoginContract.View view, final BaseStore store) {
        return new LoginPresenter(view, store);
    }
}
LoginFragment.java
public class LoginFragment extends DaggerFragment {

    @Inject
    LoginContract.Presenter presenter;

    @Inject
    Provider<MyAuthClass> myAuthClass;
    ...
}
presenter is created every time the Fragment gets created; myAuthClass gets created only once and is a singleton.
Perfect - but I have no idea HOW this works!!!
DaggerFragment#onAttach must somehow know that Presenter is a "local" singleton and MyAuthClass is a global-singleton ...
Scope is one of two ways you can tell Dagger to always bind the same object, rather than returning a newly-created one on each injection request. (The other way is the manual way: Just return the same object in a @Provides method.)
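For instance, the manual way might look roughly like this (MyAuthClass is borrowed from the question; the module name is made up):
@Module
public class AuthModule {
    // The module itself holds one instance, so every injection request gets the same object
    // even though the binding carries no scope annotation.
    private final MyAuthClass authClass = new MyAuthClass();

    @Provides
    MyAuthClass provideMyAuthClass() {
        return authClass;
    }
}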
First, a scope overview: Let's say you have a component, FooComponent, which has a @FooScoped annotation. You define a subcomponent, BarComponent, which has a @BarScoped annotation. That means that using a single FooComponent instance, you can create as many BarComponent instances as you want.
@FooScoped
@Component(modules = /*...*/)
public interface FooComponent {
    BarComponent createBarComponent(/* ... */); // Subcomponent factory method
    YourObject1 getYourObject1(); // no scope
    YourObject2 getYourObject2(); // FooScoped
}

@BarScoped
@Subcomponent(modules = /*...*/)
public interface BarComponent {
    YourObject3 getYourObject3(); // no scope
    YourObject4 getYourObject4(); // BarScoped
    YourObject5 getYourObject5(); // FooScoped
}
When you call fooComponent.getYourObject1(), YourObject1 is unscoped, so Dagger does its default: create a brand new one. When you call fooComponent.getYourObject2(), though, if you've configured YourObject2 to be @FooScoped, Dagger will return exactly one instance for the entire lifetime of that FooComponent. Of course, you could create two FooComponent instances, but you'll never see multiple instances of a @FooScoped object from the same @FooScoped component (FooComponent).
Now onto BarComponent: getYourObject3() is unscoped, so it returns a new instance every time; getYourObject4() is @BarScoped, so it returns one instance per instance of BarComponent; and getYourObject5() is @FooScoped, so you'll get the same instance for the lifetime of the FooComponent from which the BarComponent was created.
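To make that concrete, here is a rough sketch of what those guarantees mean at the call site (assuming FooComponent can be built with the generated create() and that createBarComponent takes no arguments):
FooComponent foo = DaggerFooComponent.create(); // DaggerFooComponent is the generated implementation
BarComponent bar1 = foo.createBarComponent();
BarComponent bar2 = foo.createBarComponent();

assert foo.getYourObject1() != foo.getYourObject1();   // unscoped: new instance on every call
assert foo.getYourObject2() == foo.getYourObject2();   // @FooScoped: cached in foo
assert bar1.getYourObject4() == bar1.getYourObject4(); // @BarScoped: cached in bar1
assert bar1.getYourObject4() != bar2.getYourObject4(); // different BarComponents, different instances
assert bar1.getYourObject5() == bar2.getYourObject5(); // @FooScoped: shared via the parent foo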
Now to your questions:
Question 1: Is this just documentation or not? In my app it makes no difference if I annotate the concrete fragment or not.
In classes that have an @Inject-annotated constructor like StatisticsFragment does, adding a scope annotation is not simply documentation: Without the scope annotation, any requests to inject a StatisticsFragment will generate a brand new one. If you only expect there to be a single instance of StatisticsFragment per Activity, this may be surprising behavior, but it might be hard to notice the difference.
However, adding an @Inject annotation to a Fragment may be something of a controversial move, because the Android infrastructure is able to create and destroy Fragment instances itself. The object that Android recreates will not be the scoped one, and it will have its members re-injected in onAttach due to DaggerFragment's superclass behavior. I think a better practice is to drop the @Inject annotation from the constructor and stick with field injection for your Fragment. At that point you can drop the scope, because Dagger will never create your Fragment, so it'll never decide whether to create a new one or return an existing one.
Question 2: Where in the generated code can I see which scope is used? My fragment injects a Presenter and an AuthProvider. The AuthProvider is annotated with Singleton (in AppModule), the Presenter is defined in UIModule -> LoginModule
The scoping code is always generated in the Component; subcomponents have their implementations generated as inner classes of the Component implementation. For each scoped binding, there will be a place in the initialize method where the Provider (e.g. the one for AuthProvider) is wrapped in an instance of DoubleCheck, which manages the double-checked locking behind scoped bindings. If nothing ever asks Dagger to create an object (like StatisticsFragment), Dagger can detect the lack of component factory methods or injection points in the graph and avoid generating any code for it at all, which might be why you're not seeing any.
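As a rough illustration only (the exact generated field and factory names will differ), the scoped wiring inside the generated component looks something like this:
private void initialize(final Builder builder) {
    // Scoped binding: the factory is wrapped in DoubleCheck, so the Provider caches its one instance.
    this.provideAuthProviderProvider =
        DoubleCheck.provider(AppModule_ProvideAuthProviderFactory.create(builder.appModule));
    // Unscoped bindings use their factories directly, without the DoubleCheck wrapper.
}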

Why do we need to create xxxCustom and xxxImpl classes when we create a repository?

From this information, when we create a repository, it is better to create two interfaces and one class per repository: UserRepository (interface), UserRepositoryCustom (interface), UserRepositoryImpl (class).
http://docs.spring.io/spring-data/jpa/docs/current/reference/html/#repositories.custom-implementations
But we can create a repository without these extra types... why do we need to create them, and what is the merit (or demerit) of creating them?
If you check out the Spring Data core concepts, a UserRepository interface definition that extends CrudRepository or JpaRepository gives you all the basic CRUD operations on an entity for free.
public interface UserRepository extends JpaRepository<User, Long>
You can add your own basic custom queries in this repository interface using the naming-convention approach or by using the @Query annotation.
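For example, added to the interface above (assuming a User entity with lastName and active fields):
public interface UserRepository extends JpaRepository<User, Long> {

    // Derived query: Spring Data builds the query from the method name.
    List<User> findByLastName(String lastName);

    // Explicit JPQL via @Query.
    @Query("select u from User u where u.active = true")
    List<User> findActiveUsers();
}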
In case you want to execute some custom logic which cannot easily be expressed in UserRepository, e.g. complex joins or stored procedures where you need access to the underlying EntityManager, you need a UserRepositoryCustom interface. UserRepository then extends this interface to inherit the methods.
public interface UserRepositoryCustom {
    void myCustomMethod();
}

public interface UserRepository extends JpaRepository<User, Long>, UserRepositoryCustom {
}
You need to provide the implementation of these methods yourself in a UserRepositoryImpl class. Spring Data looks for the custom method implementations in this class (by the Impl naming convention) and invokes them when they are called.
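A minimal sketch of what that implementation class might look like (the entity and the query are illustrative):
public class UserRepositoryImpl implements UserRepositoryCustom {

    @PersistenceContext
    private EntityManager em;

    @Override
    public void myCustomMethod() {
        // Full access to the EntityManager for complex joins, native queries, stored procedures, etc.
        em.createQuery("select u from User u where u.active = true", User.class).getResultList();
    }
}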
Hope this explanation helps.

How to use QueryDslJpaRepository?

In my current project setup I'm defining repositories as:
public interface CustomerRepository extends JpaRepository<Customer, Long>, QueryDslPredicateExecutor<Customer> {
}
The QueryDslPredicateExecutor provides additional findAll methods which return e.g. an Iterable.
For example, it does not contain a method that takes only an OrderSpecifier.
I just came across the QueryDslJpaRepository which contains more variants of these Predicate and OrderSpecifier aware methods, and also return Lists instead of Iterables.
I wonder why QueryDslPredicateExecutor is limited and if it is possible to use QueryDslJpaRepository methods?
I was already using a custom BaseRepository, so it was easy to make my repositories use the List variants (instead of Iterable):
@NoRepositoryBean
public interface BaseRepository<T, ID extends Serializable> extends JpaRepository<T, ID>, QueryDslPredicateExecutor<T> {

    @Override
    List<T> findAll(Predicate predicate);

    @Override
    List<T> findAll(Predicate predicate, Sort sort);

    @Override
    List<T> findAll(Predicate predicate, OrderSpecifier<?>... orders);

    @Override
    List<T> findAll(OrderSpecifier<?>... orders);
}
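Usage is then the same as before, except the overloads return List. (QCustomer is the Querydsl-generated query type for Customer; the premium/lastName properties are hypothetical.)
public interface CustomerRepository extends BaseRepository<Customer, Long> {
}

// In a service method:
List<Customer> premiumCustomers = customerRepository.findAll(
        QCustomer.customer.premium.isTrue(),
        QCustomer.customer.lastName.asc());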
Note that my comment regarding missing methods in QueryDslPredicateExecutor was incorrect.
QueryDslJpaRepository extends SimpleJpaRepository
SimpleJpaRepository is used when you want to add custom behavior to all repositories. It takes three steps to do so:
Step 1: Create an interface (e.g. CustomRepository) that extends JpaRepository, then add your own interface methods.
Step 2: Create a class (e.g. CustomRepositoryImpl) that implements your CustomRepository, which naturally requires you to supply concrete implementations of every method defined not only in CustomRepository but also in JpaRepository and JpaRepository's ancestor interfaces. That would be a tedious job, so Spring provides the SimpleJpaRepository concrete class to do it for you. All you need to do is make CustomRepositoryImpl extend SimpleJpaRepository and then only write concrete methods for the methods in your own CustomRepository interface.
Step 3: Make CustomRepositoryImpl the new base class in the JPA configuration (either in XML or Java config); see the sketch after these steps.
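A rough sketch of those three steps, with hypothetical names (the constructor signature matches SimpleJpaRepository in recent Spring Data versions; older versions differ slightly):
// Step 1
@NoRepositoryBean
public interface CustomRepository<T, ID extends Serializable> extends JpaRepository<T, ID> {
    void sharedCustomMethod(T entity);
}

// Step 2
public class CustomRepositoryImpl<T, ID extends Serializable>
        extends SimpleJpaRepository<T, ID> implements CustomRepository<T, ID> {

    private final EntityManager entityManager;

    public CustomRepositoryImpl(JpaEntityInformation<T, ?> entityInformation, EntityManager entityManager) {
        super(entityInformation, entityManager);
        this.entityManager = entityManager;
    }

    @Override
    public void sharedCustomMethod(T entity) {
        // custom behavior available to every repository extending CustomRepository
    }
}

// Step 3 (Java config variant)
@Configuration
@EnableJpaRepositories(repositoryBaseClass = CustomRepositoryImpl.class)
class JpaConfig { }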
Similarly, QueryDslJpaRepository is the drop-in replacement for SimpleJpaRepository when your CustomRepository extends not only JpaRepository but also QueryDslPredicateExecutor interface, to add QueryDsl support to your repositories.
I wish the Spring Data JPA documentation made it clear what to do if someone is using QueryDslPredicateExecutor but also wants to add his/her own customized methods. It took me some time to figure out what to do when the application throws errors like "No property findAll found for type xxx" or "No property exists found for type xxx".
Check your Predicate import in your service class. In my case, auto-import had pulled in java.util.function.Predicate instead of com.querydsl.core.types.Predicate, which causes confusing errors when calling findAll with a Predicate.
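In other words, the fix is just the import at the top of the file:
// Wrong (what auto-import tends to pick):
// import java.util.function.Predicate;

// Correct for the Querydsl findAll overloads:
import com.querydsl.core.types.Predicate;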

GWT RequestFactory: implementing InstanceRequest methods in separate services

I am using a class separate from my DAO to handle access requests, but I don't know how to implement InstanceRequest methods:
public class Service {
    public static DAO findDAO(Long id);
}
@Service(Service.class)
public interface DAORequestContext extends RequestContext {
    Request<ProxyForDAO> findDAO(Long id);
    InstanceRequest<ProxyForDAO, Long> persist();
}
I define public Long persist() in my DAO implementation, because the GWT docs say, "On the server, instance methods must be implemented as non-static methods in the entity type," but RequestFactory can't seem to find it:
SEVERE: Could not find any methods named persist in com.activegrade.server.data.Service
Feb 23, 2011 10:03:02 PM com.google.gwt.requestfactory.server.ServiceLayerDecorator die
How do I implement an instance method in Service? I don't know how to transform the InstanceRequest. Do I need to connect the DAO class to the Service some way, so that the code generator knows to look in Service for most methods but to skip over to the DAO for InstanceRequest calls?
AFAIK it's not possible, because the InstanceRequest methods are bound to the class specified in the @Service annotation. With Request methods it is possible to have methods for multiple data classes in one service class, using locators. But from what I understand of RequestFactory, this is not possible with InstanceRequest methods. See also this project, which uses Locators, with methods in one place: http://code.google.com/p/listwidget/
Edit: I've rewritten the answer based on the update and the comment below.
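One common workaround, sketched here as an assumption rather than taken from the answer above: drop the InstanceRequest and expose persist as a regular Request that takes the proxy, backed by a static method on the service class that accepts the domain object.
@Service(Service.class)
public interface DAORequestContext extends RequestContext {
    Request<ProxyForDAO> findDAO(Long id);
    Request<Void> persist(ProxyForDAO proxy); // instead of InstanceRequest<ProxyForDAO, Long> persist()
}

public class Service {
    public static DAO findDAO(Long id) { /* look up the entity */ return null; }
    public static void persist(DAO domainObject) { /* save the entity */ }
}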