SFTP outbound channel adapter with retry - spring-batch

I am using Spring Batch and Spring Integration. Once my batch job is completed, it creates text files, and those need to be uploaded to an FTP server. Sometimes we notice that those connections drop and the upload needs to be retried. Is there any way we can use the Spring Retry project to try again a few seconds later and see if it can upload those files? We want it to be configurable.
If so, is there any example out there?
Thanks

Yes, Spring Integration provides a retry component for you. It is called RequestHandlerRetryAdvice:
<int-sftp:outbound-channel-adapter>
    <int-sftp:request-handler-advice-chain>
        <bean class="org.springframework.integration.handler.advice.RequestHandlerRetryAdvice" />
    </int-sftp:request-handler-advice-chain>
</int-sftp:outbound-channel-adapter>
Please find more info in the Reference Manual.
Also consider the RequestHandlerCircuitBreakerAdvice for your "connection drops" cases.
And here is the sample.
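Since you want the number of attempts and the delay to be configurable, here is a minimal sketch of defining the advice as a bean backed by a Spring Retry RetryTemplate. The sftp.retry.* property names are hypothetical placeholders of my own, not properties Spring defines:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.handler.advice.RequestHandlerRetryAdvice;
import org.springframework.retry.backoff.FixedBackOffPolicy;
import org.springframework.retry.policy.SimpleRetryPolicy;
import org.springframework.retry.support.RetryTemplate;

@Configuration
public class RetryAdviceConfig {

    // Externalized knobs with defaults: 3 attempts, 5 seconds apart
    @Value("${sftp.retry.max-attempts:3}")
    private int maxAttempts;

    @Value("${sftp.retry.back-off-ms:5000}")
    private long backOffMs;

    @Bean
    public RequestHandlerRetryAdvice retryAdvice() {
        SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy();
        retryPolicy.setMaxAttempts(maxAttempts);

        FixedBackOffPolicy backOffPolicy = new FixedBackOffPolicy();
        backOffPolicy.setBackOffPeriod(backOffMs);

        RetryTemplate retryTemplate = new RetryTemplate();
        retryTemplate.setRetryPolicy(retryPolicy);
        retryTemplate.setBackOffPolicy(backOffPolicy);

        RequestHandlerRetryAdvice advice = new RequestHandlerRetryAdvice();
        advice.setRetryTemplate(retryTemplate);
        return advice;
    }
}

The XML advice chain above can then reference this bean with <ref bean="retryAdvice"/> instead of declaring the advice inline.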

Related

Spring batch integration using OutBoundGateway and ReplyingKafkaTemplate

My Goal
I need to read a file, turn each line into a message, and send it to Kafka from a Spring Batch project; another Spring Integration project will receive the messages and process them asynchronously. After processing, I want to return those messages to the batch project and create 4 different files out of them.
I am trying to use an outbound gateway and ReplyingKafkaTemplate here, but I am unable to configure them properly. Is there any example or reference guide for this configuration?
I have checked the Spring Batch integration samples GitHub repository; there is no sample for an outbound gateway or ReplyingKafkaTemplate.
Thanks in advance.
For ReplyingKafkaTemplate logic in Spring Integration there is a dedicated KafkaProducerMessageHandler which can be configured with a ReplyingKafkaTemplate.
See more info in docs:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-outbound-gateway
And more about ReplyingKafkaTemplate:
https://docs.spring.io/spring-kafka/reference/html/#replying-template
On the other side, a KafkaInboundGateway probably needs to be configured, respectively:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-inbound-gateway
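As a minimal sketch of the outbound side, following the pattern shown in the Spring Integration Kafka docs (the channel names kafkaRequests/kafkaReplies are placeholder assumptions, and the ReplyingKafkaTemplate bean with its reply container still has to be defined as described in the Spring Kafka link above):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;

@Configuration
public class KafkaGatewayConfig {

    // Same handler class as the outbound channel adapter; passing a
    // ReplyingKafkaTemplate is what turns it into a request/reply gateway.
    @Bean
    @ServiceActivator(inputChannel = "kafkaRequests", outputChannel = "kafkaReplies")
    public KafkaProducerMessageHandler<String, String> outGateway(
            ReplyingKafkaTemplate<String, String, String> kafkaTemplate) {
        return new KafkaProducerMessageHandler<>(kafkaTemplate);
    }
}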

Connecting Spring Batch to Remote Cassandra Database

I'm hoping to read data from a CSV file, process the data, and upload it to a remote database. I'm using Spring's starter repo as a base. Code is here
I tried putting this information in the properties file:
spring.data.cassandra.contact-points=IP, IP
spring.data.cassandra.port=PORT
spring.data.cassandra.keyspace-name=KEYSPACE_NAME
spring.data.cassandra.username=USER
spring.data.cassandra.password=PASS
spring.data.cassandra.ssl=TRUE
However, I think it keeps defaulting to a local Tomcat JDBC connection instead. I'm not really sure where to start. Any help is appreciated! Thanks.
Your code doesn't have anything that uses Cassandra. It doesn't declare any of the Cassandra dependencies, and the ItemWriter implementation is a JdbcBatchItemWriter, which I don't think will work for Cassandra. You need to configure your application to actually use Cassandra: the Spring Data Cassandra starter as well as an ItemWriter implementation that can write to Cassandra.
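As a sketch of that second piece, a hand-rolled ItemWriter delegating to Spring Data Cassandra's CassandraOperations could look like the following. The Person entity here is a hypothetical example, and the List-based write signature is the one used up to Spring Batch 4 (Spring Batch 5 switched to Chunk):

import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.data.cassandra.core.CassandraOperations;
import org.springframework.data.cassandra.core.mapping.PrimaryKey;
import org.springframework.data.cassandra.core.mapping.Table;

// Hypothetical entity; map it to a table in your keyspace.
@Table("person")
class Person {
    @PrimaryKey
    String id;
    String name;
}

// A writer that inserts each item of the chunk via Spring Data Cassandra.
class CassandraItemWriter implements ItemWriter<Person> {

    private final CassandraOperations cassandraOperations;

    CassandraItemWriter(CassandraOperations cassandraOperations) {
        this.cassandraOperations = cassandraOperations;
    }

    @Override
    public void write(List<? extends Person> items) throws Exception {
        for (Person item : items) {
            cassandraOperations.insert(item);
        }
    }
}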

How to monitor multiple JNDI datasources with FlexyPool?

I am starting to use FlexyPool to monitor a JNDI DataSource managed by Tomcat.
I found how to monitor one DataSource in this answer and in the FlexyPool doc. I cannot, however, figure out how to configure the monitoring of multiple sources through the flexy-pool.properties file. Is this possible?
Currently, the declarative configuration only supports a single DataSource. You can open an issue on GitHub for this. I would not mind if you sent a pull request for it.
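Until that exists, the programmatic configuration is not limited to a single pool; you can wrap each JNDI DataSource in its own FlexyPoolDataSource. A rough sketch along the lines of the FlexyPool docs, with JNDI names of my own invention (double-check the adapter and Configuration class names against your FlexyPool version):

import javax.naming.InitialContext;

import org.apache.tomcat.jdbc.pool.DataSource;

import com.vladmihalcea.flexypool.FlexyPoolDataSource;
import com.vladmihalcea.flexypool.adaptor.TomcatCPPoolAdapter;
import com.vladmihalcea.flexypool.config.Configuration;

public class FlexyPoolSetup {

    // Wrap one Tomcat-managed JNDI DataSource with FlexyPool monitoring;
    // call this once per pool, each with its own uniqueId for the metrics.
    static FlexyPoolDataSource<DataSource> monitor(String jndiName, String uniqueId)
            throws Exception {
        DataSource target = (DataSource) new InitialContext().lookup(jndiName);
        Configuration<DataSource> configuration =
                new Configuration.Builder<>(uniqueId, target, TomcatCPPoolAdapter.FACTORY)
                        .build();
        return new FlexyPoolDataSource<>(configuration);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical JNDI names; repeat for every pool you want monitored
        FlexyPoolDataSource<DataSource> ordersPool =
                monitor("java:comp/env/jdbc/orders", "orders-pool");
        FlexyPoolDataSource<DataSource> billingPool =
                monitor("java:comp/env/jdbc/billing", "billing-pool");
    }
}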

WTRN0006W for liberty on bluemix

I have a long-running REST method on a Bluemix Liberty runtime and I am getting the WTRN0006W warning message because the method takes more than 120 seconds to return. However, it's a single-user app, this method is expected to take long, and there is no database data to persist, so it's OK to set the timeout to N minutes in this case.
The problem is that I can't find exactly which file to change, and how to change it, when using the Bluemix Liberty profile in order to increase the timeout.
Any help is welcome.
The file to modify is server.xml. See this blog post on Custom Liberty server.xml configurations in IBM Bluemix. Then, try configuration like:
<server>
    ...
    <transaction totalTranLifetimeTimeout="5m"/>
</server>
See the Transaction Manager (transaction) topic in the Knowledge Center for other transaction configuration options.

How to integrate ActiveMQ with Dell Boomi

I am having a tough time integrating ActiveMQ with Dell Boomi, as the Dell Boomi documentation is old and can sometimes be misleading. As I could not find a good suggestion on the web, I am putting my query here. Can someone please help with the steps for integrating ActiveMQ with Boomi?
With the steps below I got it working:

1. Copy the activemq-core-5.4.3.jar and geronimo-j2ee-management_1.1_spec-1.0.1.jar files from your ActiveMQ installation to your Atom/usrlib/database directory (create it if it is not there).
2. Create a JNDI properties file and place it in the ActiveMQ home directory. Reference this.
3. You might get a NoClassDefFoundError for JMS/Topic etc., which means your Boomi lib does not have the implementation for it. You need to copy activemq-all-5.4.3.jar from the ActiveMQ home folder to Atom/lib.

I am not specifying how to create the JMS Connection and Operation in Boomi; however, you can use the properties below for the JMS connection in Boomi:

Connection Factory JNDI Lookup: ConnectionFactory
Initial Context Factory: org.apache.activemq.jndi.ActiveMQInitialContextFactory (default)
Provider URL: tcp://localhost:61616 (default port)

JMS Operation:

Destination: dynamicQueues/Dell_Boomi (a dynamic destination creates the queue if it does not exist)

That's all; try your luck and share your experience!
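If you want to sanity-check those JNDI settings outside Boomi first, a minimal JMS 1.1 sketch using the same values (broker URL and queue name as above) might look like this:

import java.util.Properties;

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.naming.Context;
import javax.naming.InitialContext;

// Sends one test message through the same JNDI names Boomi will use.
public class ActiveMqJndiSmokeTest {

    public static void main(String[] args) throws Exception {
        Properties env = new Properties();
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "org.apache.activemq.jndi.ActiveMQInitialContextFactory");
        env.put(Context.PROVIDER_URL, "tcp://localhost:61616");

        Context ctx = new InitialContext(env);
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("ConnectionFactory");
        // "dynamicQueues/..." asks ActiveMQ to create the queue if it does not exist
        Destination queue = (Destination) ctx.lookup("dynamicQueues/Dell_Boomi");

        Connection connection = factory.createConnection();
        try {
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            producer.send(session.createTextMessage("hello from JNDI smoke test"));
        } finally {
            connection.close();
        }
    }
}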
Pick the jars activemq-client, hawtbuf, geronimo-jms_1.1_spec, and geronimo-j2ee-management_1.1_spec from lib\plugin\queue and copy them to the lib folder. Restart the Atom and it should work now.