How to access stepExecution in partitioner - spring-batch

We are using a partitioner that is annotated with @Scope(value = "step") and has a setter method annotated with @BeforeStep, but the framework still does not inject the StepExecution object.
What are we doing wrong?

Did you register the partitioner as a listener on the step? As soon as you use step scope, your bean is hidden behind a proxy, which makes it impossible for Spring to register it automatically as a step listener (it should work if your bean is not step-scoped).
It is explained here:
Spring-batch @BeforeStep does not work with @StepScope
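If registering the listener explicitly is inconvenient, another commonly used option is to late-bind the StepExecution into the step-scoped partitioner via SpEL instead of relying on @BeforeStep. A minimal Java-config sketch (the bean method and the partition contents are illustrative):

@Bean
@StepScope
public Partitioner myPartitioner(@Value("#{stepExecution}") StepExecution stepExecution) {
    return gridSize -> {
        // The injected StepExecution is available here, e.g. to read job parameters
        // or the execution context while building the partitions.
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.putLong("runId", stepExecution.getJobExecutionId());
            partitions.put("partition" + i, context);
        }
        return partitions;
    };
}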

Related

What is the difference between the TestNG annotation @BeforeMethod and the beforeInvocation() listener?

Do the @BeforeMethod annotation and the beforeInvocation() listener do the same thing? Please help with the difference!
@BeforeMethod: the annotated method will run before each test method, i.e. before methods annotated with @Test.
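For example, a minimal sketch (class and method names are illustrative):

public class LoginTests {

    @BeforeMethod
    public void setUp() {
        // Runs before each @Test method in this class only,
        // e.g. reset per-test state or navigate to a start page.
    }

    @Test
    public void validLogin() {
        // test body
    }

    @Test
    public void invalidLogin() {
        // test body
    }
}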
If you have implemented the IInvokedMethodListener listener,
void beforeInvocation(IInvokedMethod method, ITestResult testResult)
void afterInvocation(IInvokedMethod method, ITestResult testResult)
then IInvokedMethodListener will be invoked for configuration methods (@BeforeSuite, ...) and test methods (@Test, ...).
From an execution-stage perspective, both get executed before a method annotated with @Test is called, and both allow you to do any kind of setup you need for a test.
But when to use which depends on what you want to do.
If you have setup that is specific to only a few tests, I would use @BeforeMethod in a class so that it applies only to those tests.
But if it is suite-wide setup, say initializing a driver object or creating an API token that needs to be done for every single test in your suite, then I would prefer listeners.
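As a rough sketch of the listener route (the class name and the println bodies are placeholders), you implement IInvokedMethodListener:

import org.testng.IInvokedMethod;
import org.testng.IInvokedMethodListener;
import org.testng.ITestResult;

public class SuiteSetupListener implements IInvokedMethodListener {

    @Override
    public void beforeInvocation(IInvokedMethod method, ITestResult testResult) {
        // Called before every configuration and @Test method in the suite,
        // e.g. initialize a driver or fetch an API token here.
        System.out.println("About to invoke: " + method.getTestMethod().getMethodName());
    }

    @Override
    public void afterInvocation(IInvokedMethod method, ITestResult testResult) {
        // Called after every invocation, e.g. release resources.
        System.out.println("Finished: " + method.getTestMethod().getMethodName());
    }
}

The listener is then registered either with @Listeners(SuiteSetupListener.class) on a test class or via a <listeners> entry in testng.xml.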

Spring data inject repository without explicit type

I have a service that needs to use Neo4jRepository (the regular repository interface provided by Spring Data).
public class SomeServiceBean<T> {
    @Autowired
    private Neo4jRepository<T, Long> neo4jRepository;
}
This class will generate an error:
expected single matching bean but found 2: systemUserRepository,systemClaimRepository
The problem is that systemClaimRepository and systemUserRepository both extend Neo4jRepository<T,Long> as interfaces without an implementation.
Spring sees systemClaimRepository and systemUserRepository as Neo4jRepository<T,Long> beans because they extend it.
Is there any way to inject Neo4jRepository<T,Long>?
Thanks
No. How should this work?
You have two beans that match the interface and Spring does not know which implementation to inject.
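As a hedged sketch of the usual workarounds (the entity types SystemUser and SystemClaim are assumptions; the bean names come from the error message): either depend on the concrete repository interface, or keep the generic supertype and disambiguate with @Qualifier.

public class SomeServiceBean {

    // Option 1: depend on the specific repository interface directly.
    @Autowired
    private SystemUserRepository systemUserRepository;

    // Option 2: keep the generic supertype but tell Spring which bean you mean.
    @Autowired
    @Qualifier("systemClaimRepository")
    private Neo4jRepository<SystemClaim, Long> claimRepository;
}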

Can I use SpringData by itself [duplicate]

I'm trying to wire up Spring Data JPA objects manually so that I can generate DAO proxies (aka Repositories) - without using a Spring bean container.
Inevitably, I will be asked why I want to do this: it is because our project is already using Google Guice (and on the UI using Gin with GWT), and we don't want to maintain another IoC container configuration, or pull in all the resulting dependencies. I know we might be able to use Guice's SpringIntegration, but this would be a last resort.
It seems that everything is available to wire the objects up manually, but since it's not well documented, I'm having a difficult time.
According to the Spring Data user's guide, using repository factories standalone is possible. Unfortunately, the example shows RepositoryFactorySupport, which is an abstract class. After some searching I managed to find JpaRepositoryFactory.
JpaRepositoryFactory actually works fairly well, except it does not automatically create transactions. Transactions must be managed manually, or nothing will get persisted to the database:
entityManager.getTransaction().begin();
repositoryInstance.save(someJpaObject);
entityManager.getTransaction().commit();
The problem turned out to be that @Transactional annotations are not honored automatically; they need the help of a TransactionInterceptor.
Thankfully, the JpaRepositoryFactory can take a callback to add more AOP advice to the generated Repository proxy before returning:
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(emf.createEntityManager());
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new AnnotationTransactionAttributeSource()));
    }
});
This is where things are not working out so well. Stepping through the code in the debugger, the TransactionInterceptor is indeed creating a transaction - but on the wrong EntityManager. Spring manages the active EntityManager by looking at the currently executing thread. The TransactionInterceptor does this, sees there is no active EntityManager bound to the thread, and decides to create a new one.
However, this new EntityManager is not the same instance that was created and passed into the JpaRepositoryFactory constructor, which requires an EntityManager. The question is, how do I make the TransactionInterceptor and the JpaRepositoryFactory use the same EntityManager?
Update:
While writing this up, I found out how to solve the problem, but it still may not be the ideal solution. I will post this solution as a separate answer. I would be happy to hear any suggestions on a better way to use Spring Data JPA standalone than how I've solved it.
The general principle behind the design of JpaRepositoryFactory and the corresponding Spring integration JpaRepositoryFactoryBean is the following:
We're assuming you run your application inside a managed JPA runtime environment, not caring about which one.
That's the reason we rely on an injected EntityManager rather than an EntityManagerFactory. By definition, the EntityManager is not thread-safe. So if we dealt with an EntityManagerFactory directly, we would have to rewrite all the resource-managing code that a managed runtime environment (such as Spring or EJB) would provide for you.
To integrate with the Spring transaction management we use Spring's SharedEntityManagerCreator, which actually does the transaction resource binding magic you've implemented manually. So you probably want to use that one to create EntityManager instances from your EntityManagerFactory.
If you want to activate the transactionality at the repository beans directly (so that a call to e.g. repo.save(…) creates a transaction if none is already active), have a look at the TransactionalRepositoryProxyPostProcessor implementation in Spring Data Commons. It actually activates transactions when Spring Data repositories are used directly (e.g. for repo.save(…)) and slightly customizes the transaction configuration lookup to prefer interfaces over implementation classes, to allow repository interfaces to override transaction configuration defined in SimpleJpaRepository.
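As a rough sketch of that suggestion (the persistence unit name, FooRepository and the single-argument postProcess signature match the code elsewhere in this thread; details vary between Spring Data versions):

EntityManagerFactory emf = Persistence.createEntityManagerFactory("com.foo.model", properties);

// SharedEntityManagerCreator (org.springframework.orm.jpa) returns a thread-safe,
// transaction-aware proxy that delegates to the EntityManager bound to the current transaction.
EntityManager sharedEm = SharedEntityManagerCreator.createSharedEntityManager(emf);

final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(sharedEm);
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new AnnotationTransactionAttributeSource()));
    }
});

FooRepository repository = factory.getRepository(FooRepository.class);
// repository.save(...) should now open and commit its own transaction via the
// @Transactional metadata, without manually binding an EntityManager to the thread.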
I solved this by manually binding the EntityManager and EntityManagerFactory to the executing thread, before creating repositories with the JpaRepositoryFactory. This is accomplished using the TransactionSynchronizationManager.bindResource method:
emf = Persistence.createEntityManagerFactory("com.foo.model", properties);
em = emf.createEntityManager();

// Create your transaction manager and RepositoryFactory
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(em);

// Make sure calls to the repository instance are intercepted for annotated transactions
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new MatchAlwaysTransactionAttributeSource()));
    }
});

// Create your repository proxy instance
FooRepository repository = factory.getRepository(FooRepository.class);

// Bind the same EntityManager used to create the repository to the thread
TransactionSynchronizationManager.bindResource(emf, new EntityManagerHolder(em));
try {
    repository.save(someInstance); // Done in a transaction using one EntityManager
} finally {
    // Make sure to unbind when done with the repository instance
    TransactionSynchronizationManager.unbindResource(emf);
}
There must be a better way, though. It seems strange that the RepositoryFactory was designed to use an EntityManager instead of an EntityManagerFactory. I would expect that it would first look to see if an EntityManager is bound to the thread, and then either create a new one and bind it, or use the existing one.
Basically, I would want to inject the repository proxies and expect that on every call they internally create a new EntityManager, so that calls are thread-safe.

Changing annotations from JBoss Seam to CDI (JEE6)

We are migrating our app from JBoss Seam to CDI (JEE6), so we are changing some annotations like @In and @Out. There is a lot of information that we have found helpful, but we are having trouble figuring out how to replace annotations with particular patterns:
For the @In annotation:
#Name("comprobantes")//context name
...
#In(create=false,value="autenticadoPOJO",required=false)
private UsuarioPOJO autenticadoPOJO;
We can use @Inject from CDI, but how do we set the name of the context variable in this case?
For the @Out annotation:
@Out(scope = ScopeType.SESSION, value = "autenticadoPOJO", required = false)
I have read some blogs that say I can use @Produces in CDI, but how can we set the scope, before or after adding this annotation?
I appreciate any help or any helpful documentation.
I'm afraid there is no such thing as 1:1 compatibility for @Out.
Technically, @Out in Seam 2 was realized by an interceptor on all method invocations; this turned out to be quite a performance bottleneck.
In CDI, most managed beans are proxied, which makes it technically impossible to implement outjection the Seam 2 way.
What you can do (well, what you actually have to do) is go through all usages of @Out and replace each one individually with some @Produces logic. Have a look at this official example here. In Seam 2, you would have outjected the authenticated user to the session scope; in CDI, a little producer method does (almost) the same.
That should hopefully give you a good start, feel free to ask further questions :)
http://docs.jboss.org/weld/reference/1.0.0/en-US/html/producermethods.html
8.1. Scope of a producer method
The scope of the producer method defaults to #Dependent, and so it will be called every time the container injects this field or any other field that resolves to the same producer method. Thus, there could be multiple instances of the PaymentStrategy object for each user session.
To change this behavior, we can add a #SessionScoped annotation to the method.
@Produces @Preferred @SessionScoped
public PaymentStrategy getPaymentStrategy() {
    ...
}
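Applied to the autenticadoPOJO case from the question, a hedged sketch could look like this (the holder class Autenticador is an assumption; UsuarioPOJO and the name autenticadoPOJO come from the question):

@SessionScoped
public class Autenticador implements Serializable {

    private UsuarioPOJO autenticadoPOJO;

    public void login(UsuarioPOJO usuario) {
        // Instead of outjecting, keep the state in this session-scoped bean.
        this.autenticadoPOJO = usuario;
    }

    // Replaces @Out(scope = ScopeType.SESSION, value = "autenticadoPOJO"):
    // other beans can now use @Inject @Named("autenticadoPOJO") UsuarioPOJO autenticadoPOJO;
    @Produces
    @Named("autenticadoPOJO")
    public UsuarioPOJO getAutenticadoPOJO() {
        return autenticadoPOJO;
    }
}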

Creating new entities while enabling injection

I have a method on a stateless session bean which creates a new instance of an entity and persists it. You might normally use new MyEntity() to create the object, but I would like injection to populate some of the properties of the entity for me.
I got partial success using
@Inject
@New
private MyEntity myNewEntity;
in the session bean and then using that instance in my method.
The problem I have now is that the second time the method is called, myNewEntity isn't a new object; it's the same object as the one created the first time. As a result I'm getting:
com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '9' for key 'PRIMARY'
Or at least that's why I think I'm getting this exception. Certainly if I use new MyEntity() I don't get the exception but my injection doesn't happen.
Am I on the wrong track? How can I create a new local entity object while enabling injection?
Any help would be great!
First of all, I have serious doubts that it's a good idea to use CDI to control the lifecycle of an entity. See this quote from the documentation (here):
According to this definition, JPA entities are technically managed beans. However, entities have their own special lifecycle, state and identity model and are usually instantiated by JPA or using new. Therefore we don't recommend directly injecting an entity class. We especially recommend against assigning a scope other than @Dependent to an entity class, since JPA is not able to persist injected CDI proxies.
What you should do to create new instances of entities is add a layer of indirection, either with @Produces or @Unwraps (Seam Solder, if you need it to be truly stateless), thereby making sure that your code explicitly calls new.
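A hedged sketch of that @Produces indirection, reusing the ProblemReport and PhoneService names from this question (the factory class and the Instance lookup are assumptions): the producer explicitly calls new, and resolving it through Instance gives each method call a fresh @Dependent instance instead of one cached in the stateless bean's field.

public class ProblemReportFactory {

    @Produces
    public ProblemReport createProblemReport() {
        // Explicitly call new so JPA (not CDI) owns the entity's lifecycle;
        // set any defaults that should be "injected" here.
        return new ProblemReport();
    }
}

@Stateless
public class PhoneService {

    @Inject
    private Instance<ProblemReport> problemReports;

    public void reportProblem() {
        // get() resolves the @Dependent producer again on every call,
        // so each invocation persists a brand-new entity.
        ProblemReport report = problemReports.get();
        // ... populate and persist report ...
    }
}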
I think I have a working solution now which seems okay, though I'm not quite sure why it works, so I welcome your feedback on a better solution. I am now injecting a DAO-style bean into my stateless session bean:
@Stateless
public class PhoneService {
    @Inject
    protected ProblemReports problemReports;
And injecting my entity into the ProblemReports bean:
public class ProblemReports {
    @Inject
    @New
    private ProblemReport newProblemReport;
I assume that ProblemReports defaults to @Dependent scope, which as I understand it should be the same as the stateless session bean that contains it. I could understand this if the scope of ProblemReports were shorter, causing a new instance of ProblemReport to be created when a new ProblemReports is created; but it isn't.
Is this just an example of EJB and CDI not playing well together?