SimpleTriggerFactoryBean in spring integration with javaconfig - quartz-scheduler

I need a Quartz job in a cluster to initialize a Spring Integration flow on one node only.
Please help me set up a Spring Integration poller with SimpleTriggerFactoryBean.
I would prefer a javaconfig solution.
And how do I integrate this:

@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class HarvestStateJob extends QuartzJobBean {

    @Override
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        // currently calls a Spring Integration gateway to run the flow
    }
}

Right now I just call a Spring Integration gateway from the Quartz job directly to run the integration flow. But I do not like this solution. Is there a way to run the integration flow directly?

Unfortunately no, there is no other way.
Spring Integration doesn't support integration with Quartz (yet): https://jira.spring.io/browse/INT-2731
Feel free to comment there to track more info and ideas.
Right now your solution sounds good. You have a gateway into the Spring Integration universe that hides everything from the Quartz job, and you just initiate the invocation from there as usual.
Even if it weren't Spring Integration, you would need something to invoke from your Quartz jobs. So everything sounds good for you.
And that may be the reason why we don't have a Quartz poller out-of-the-box :-).
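For illustration, a minimal javaconfig sketch of that arrangement (each class in its own file). The HarvestGateway interface, channel, and bean names are hypothetical, and the wiring assumes SpringBeanJobFactory's documented behavior of applying scheduler-context entries as bean properties on the job instance:

import java.util.Collections;

import org.quartz.DisallowConcurrentExecution;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.PersistJobDataAfterExecution;
import org.quartz.Trigger;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.scheduling.quartz.JobDetailFactoryBean;
import org.springframework.scheduling.quartz.QuartzJobBean;
import org.springframework.scheduling.quartz.SchedulerFactoryBean;
import org.springframework.scheduling.quartz.SimpleTriggerFactoryBean;
import org.springframework.scheduling.quartz.SpringBeanJobFactory;

// Hypothetical gateway into the integration flow
@MessagingGateway(defaultRequestChannel = "harvestChannel")
public interface HarvestGateway {
    void startHarvest(String payload);
}

@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class HarvestStateJob extends QuartzJobBean {

    // set from the scheduler context by SpringBeanJobFactory
    private HarvestGateway harvestGateway;

    public void setHarvestGateway(HarvestGateway harvestGateway) {
        this.harvestGateway = harvestGateway;
    }

    @Override
    protected void executeInternal(JobExecutionContext context) {
        harvestGateway.startHarvest("nightly-run");
    }
}

@Configuration
public class QuartzConfig {

    @Bean
    public JobDetailFactoryBean harvestJobDetail() {
        JobDetailFactoryBean jobDetail = new JobDetailFactoryBean();
        jobDetail.setJobClass(HarvestStateJob.class);
        jobDetail.setDurability(true);
        return jobDetail;
    }

    @Bean
    public SimpleTriggerFactoryBean harvestTrigger(JobDetail harvestJobDetail) {
        SimpleTriggerFactoryBean trigger = new SimpleTriggerFactoryBean();
        trigger.setJobDetail(harvestJobDetail);
        trigger.setRepeatInterval(60_000L); // fire every minute
        return trigger;
    }

    @Bean
    public SchedulerFactoryBean scheduler(Trigger harvestTrigger, HarvestGateway harvestGateway) {
        SchedulerFactoryBean scheduler = new SchedulerFactoryBean();
        scheduler.setTriggers(harvestTrigger);
        scheduler.setJobFactory(new SpringBeanJobFactory());
        // expose the gateway so it gets injected into the job as a property
        scheduler.setSchedulerContextAsMap(
                Collections.singletonMap("harvestGateway", harvestGateway));
        return scheduler;
    }
}

Note that running on one node only in a cluster still requires a persistent, clustered Quartz JobStore; the sketch above only covers the trigger and the gateway hand-off.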

Related

@SpringBootTest how to pre-populate embedded MongoDB?

I'm implementing a microservice with a few endpoints based on Spring Boot and MongoDB and trying to write integration tests using the @SpringBootTest annotation capabilities.
At the moment, I am facing an issue: I need to pre-populate an embedded MongoDB instance, which is instantiated only during the 'test' phase, with some test data.
And I did not find any out-of-the-box option available in Spring Boot for this purpose.
Some people advise using tools like mongobee, mongoprefill, or nosql-unit for test data pre-population, but to me this seems like overhead or a workaround; I do not want to introduce any new dependencies, even in test scope.
So could you please advise: in the current Spring Boot ecosystem, what is the right way to pre-populate MongoDB for testing purposes, when we are talking about integration (end-to-end) testing with @SpringBootTest?
There are multiple ways to pre-populate data:
Use the JUnit lifecycle methods like @BeforeEach or @BeforeAll to fill in data.
You could disable the Spring Boot autoconfiguration for the embedded MongoDB, set it up on your own, and insert data after creating the connection.
You could mirror the @Sql feature we have for testing relational databases and write something similar using an AbstractTestExecutionListener. For this, have a look at the Spring class SqlScriptsTestExecutionListener.
Provide a class that implements the CommandLineRunner interface and only activate this bean for your integration test profile with @Profile("integration-test"); see the sketch below.
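A minimal sketch of that last option, seeding a hypothetical 'customers' collection through MongoTemplate:

import org.bson.Document;
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Profile;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Component;

@Component
@Profile("integration-test") // only active for the integration test profile
public class TestDataLoader implements CommandLineRunner {

    private final MongoTemplate mongoTemplate;

    public TestDataLoader(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public void run(String... args) {
        // seed whatever fixtures the integration tests expect;
        // collection and field names here are placeholders
        mongoTemplate.save(new Document("name", "alice"), "customers");
    }
}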

Spring-Boot @RepositoryRestResource How to override a save method with custom behaviour

In my app, I need to customize the call of the save method over a POST RESTful service to post an event to a RabbitMQ queue.
Each time a consumer of my API fires a POST on my resource, I want to publish an event on my RabbitMQ queue to do some asynchronous processing.
Right now, I use @RepositoryRestResource and Spring Data JPA to expose a CRUD API over my Spring Data JPA repository. It does the job, very straightforward and simple. I'd like to stick with that, so in the case of a POST (save method) I'd like to compose or change the behaviour. I need to store the data in my database but also publish an event to the RabbitMQ queue.
I tried several solutions but failed.
Maybe you have the solution.
How do I extend a particular method in my REST CRUD repository?
One way that I have solved this kind of problem in the past is to use Aspect Oriented Programming, and luckily since you are using the Spring Framework it is quite easy and well documented. You could put "Around" advice around the constructor for the domain objects (just a suggestion) and have it send a message to the RabbitMQ Exchange.
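Since Spring's proxy-based AOP advises Spring bean methods rather than constructors, a sketch of this idea would more realistically put @Around advice on the repository's save method. All names here are hypothetical:

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class PublishOnSaveAspect {

    private final RabbitTemplate rabbitTemplate;

    public PublishOnSaveAspect(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    // intercept saves on the (hypothetical) repository interface
    @Around("execution(* com.example.YourEntityRepository.save(..))")
    public Object publishAfterSave(ProceedingJoinPoint pjp) throws Throwable {
        Object saved = pjp.proceed(); // persist first
        rabbitTemplate.convertAndSend("your-entity.saved", saved); // then notify
        return saved;
    }
}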
Another way of doing this is to use the Log4j AMQP appender and log the object prior to saving it.
You can use a custom org.springframework.context.ApplicationListener. Spring Data REST offers the convenient base class AbstractRepositoryEventListener.
@Component
public class PublishToRabbitMQAfterSavingYourEntity extends AbstractRepositoryEventListener<YourEntity> {

    @Override
    public void onAfterSave(YourEntity entity) {
        // publish to RabbitMQ
    }
}
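To sketch the publish step itself with Spring AMQP's RabbitTemplate: the exchange and routing key below are made up, and YourEntity must be convertible by your configured message converter.

import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.data.rest.core.event.AbstractRepositoryEventListener;
import org.springframework.stereotype.Component;

@Component
public class PublishToRabbitMQAfterSavingYourEntity extends AbstractRepositoryEventListener<YourEntity> {

    private final RabbitTemplate rabbitTemplate;

    public PublishToRabbitMQAfterSavingYourEntity(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @Override
    protected void onAfterSave(YourEntity entity) {
        // routing details are hypothetical; adapt to your broker setup
        rabbitTemplate.convertAndSend("your-exchange", "your-entity.saved", entity);
    }
}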

Correct way of using/testing event service in Eclipse E4 RCP

Let me ask two coupled questions that might boil down to one about good application design ;-)
What is the best practice for using event based communication in an e4 RCP application?
How can I write simple unit tests (using JUnit) for classes that send/receive events using dependency injection and IEventBroker?
Let’s be more concrete: say I am developing an Eclipse e4 RCP application consisting of several plugins that need to communicate. For communication I want to use the event service provided by org.eclipse.e4.core.services.events.IEventBroker so my plugins stay loosely coupled. I use dependency injection to inject the event broker to a class that dispatches events:
@Inject static IEventBroker broker;

private void sendEvent() {
    broker.post(MyEventConstants.SOME_EVENT, payload);
}
On the receiver side, I have a method like:
@Inject
@Optional
private void receiveEvent(@UIEventTopic(MyEventConstants.SOME_EVENT) Object payload)
Now the questions:
In order for IEventBroker to be successfully injected, my class needs access to the current IEclipseContext. Most of my classes using the event service are not referenced by the e4 application model, so I have to manually inject the context on instantiation using e.g. ContextInjectionFactory.inject(myEventSendingObject, context);
This approach works but I find myself passing around a lot of context to wherever I use the event service. Is this really the correct approach to event based communication across an E4 application?
How can I easily write JUnit tests for a class that uses the event service (either as a sender or receiver)? Obviously, none of the above annotations work in isolation since there is no context available. I understand everyone's convinced that dependency injection simplifies testability. But does this also apply to injecting services like the IEventBroker?
This article describes creating your own IEclipseContext to include the process of DI in tests. I'm not sure if this could resolve my 2nd issue, but I also hesitate to run all my tests as JUnit plug-in tests, as it appears impractical to fire up the PDE for each unit test. Maybe I just misunderstand the approach.
This article speaks about “simply mocking IEventBroker”. Yes, that would be great! Unfortunately, I couldn’t find any information on how this can be achieved.
All this makes me wonder whether I am still on the right track or if this is already a case of bad design? And if so, how would you go about redesigning? Move all event related actions to dedicated event sender/receiver classes or a dedicated plugin?
Actually, running a JUnit plug-in test is not that expensive. You can configure the launch configuration to run in headless mode so the only thing loaded is a lightweight PDE without workbench. The same happens when you run a headless build with for example Tycho. Surefire launches your test-bundle as headless plug-in test by default.
The advantage over isolated unit tests is that you can access your plug-in's resources and, most importantly, use dependency injection. If you want to mock an injected object you have to run a plug-in test so you can use InjectorFactory.
This is how you would go about mocking the event service: IEventBroker is an interface, so the only thing you need to do is write a mock implementation for it:
public class IEventBrokerMock implements IEventBroker {
    // @Override and implement the IEventBroker methods with mock behavior
}
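Fleshed out, a minimal recording mock might look like this; the method set is assumed from the e4 IEventBroker interface, so verify it against your target platform version:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.eclipse.e4.core.services.events.IEventBroker;
import org.osgi.service.event.EventHandler;

public class IEventBrokerMock implements IEventBroker {

    // records every topic/payload pair so tests can assert on it
    public final Map<String, List<Object>> events = new HashMap<>();

    @Override
    public boolean send(String topic, Object data) {
        return record(topic, data);
    }

    @Override
    public boolean post(String topic, Object data) {
        return record(topic, data);
    }

    @Override
    public boolean subscribe(String topic, EventHandler eventHandler) {
        return true; // no-op: this mock only verifies sending
    }

    @Override
    public boolean subscribe(String topic, String filter, EventHandler eventHandler, boolean headless) {
        return true;
    }

    @Override
    public boolean unsubscribe(EventHandler eventHandler) {
        return true;
    }

    private boolean record(String topic, Object data) {
        events.computeIfAbsent(topic, t -> new ArrayList<>()).add(data);
        return true;
    }
}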
In your test method you would have something like
InjectorFactory.getDefault().addBinding(IEventBroker.class).implementedBy(IEventBrokerMock.class);
ClassUnderTest myObject = InjectorFactory.getDefault().make(ClassUnderTest.class, null);
If you want to work with a context the test method would instead contain
IEclipseContext context = EclipseContextFactory.create();
context.set(IEventBroker.class, new IEventBrokerMock());
ClassUnderTest myObject = ContextInjectionFactory.make(ClassUnderTest.class, context);
If you run this as JUnit plug-in test your object will have the mocked event service injected.
For testing, instead of DI, I use eventBroker = new org.eclipse.e4.ui.services.internal.events.EventBroker(); to get an event broker object to use; it works OK.

GwtTestCase and Spring

So I managed to create a GWT-SpringMVC setup. Wasn't easy (not too many resources), but possible. I even autowired and stuff. It even works :)
However, I can't figure out how to make the GwtTestCase run. Obviously it needs the "server" to be up, and because I use Spring, it needs to pass through the dispatching servlet (no?). But I can't figure out how to connect the two. In production or hosted mode, I got the web.xml and the spring-servlet.xml to configure these things. What can I do for tests?
I thought of ignoring the web part and testing the service directly - but this would deny me the option to automatically test that everything is "transferable".
(if you have an idea on how to do that, I might ditch the GWTTestCase altogether).
An alternative to GWTTestCase could be the gwt-test-utils framework, which provides a simple integration with Spring (see its documentation for details).

Should I invoke a Grails controller from a Quartz job for REST API calls?

I've seen a number of postings saying that Quartz jobs should not invoke controllers. I'm using Grails to work with salesforce.com's new support for the REST API. The nightly job would use that API to update customer data from our proprietary DB to the salesforce environment. There is a session that is created using a login id.
So... I would like to use the jobs plug-in for Grails to give me a cron-style way to invoke controllers that interact with services in order to send REST API calls via httpclient to update/upsert our objects in salesforce.com land.
It seems like this would be a legitimate reason for invoking controllers from the jobs area in Grails.
Would love any feedback or alternative approaches (within Grails) for handling this.
thx, David
Why would you invoke controllers from Quartz jobs? This looks very awkward.
Use Grails services.
The Quartz plugin supports dependency injection, so it should be easy to invoke service methods.
Even if you invoke a controller from a Quartz task, you won't be able to access the session because there will be no authenticated user. If you want to run some complex business logic, put it in a service and then call it from your job. Declaring services in Quartz jobs works exactly the same as in controllers.
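For example (in Groovy, matching the Grails code elsewhere in this thread), a Quartz-plugin job with an injected service might look like this sketch; the job, service, and cron expression are all made up:

// grails-app/jobs/CustomerSyncJob.groovy
class CustomerSyncJob {

    // injected by name, just like in a controller
    def salesforceSyncService

    static triggers = {
        cron name: 'nightlyCustomerSync', cronExpression: '0 0 2 * * ?' // 2 AM daily
    }

    def execute() {
        salesforceSyncService.syncCustomers()
    }
}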
I feel the question is a valid one. I had a similar requirement: a Grails controller action that exports some report data to Excel and emails it to a mailing list on a daily basis. I used the Grails rest plugin. So I created a method in the controller:

def exportToExcel() {
    myService.exportToExcel(response)
}

Then, besides the exportToExcel() implementation in myService.groovy, I created an additional method as follows:

def runExportToExcelJob() {
    withHttp(uri: "http://localhost:9092/myProject/") {
        return get(path: 'myController/exportToExcel')
    }
}
And finally, in my Grails Quartz job, I invoked myService.runExportToExcelJob().
It works fine, but I too wonder if there is another way of making a REST call from a Grails job. Any feedback is really appreciated.