Spring Batch integration using an outbound gateway and ReplyingKafkaTemplate - spring-batch

My Goal
I need to read a file, turn each line into a message, and send it to Kafka from a Spring Batch project; another Spring Integration project will receive the messages and process them asynchronously. I want to return those messages to the batch project after processing and create 4 different files out of them.
I am trying to use an outbound gateway and ReplyingKafkaTemplate here, but I am unable to configure them properly. Is there any example or reference guide for this configuration?
I have checked the Spring Batch integration samples GitHub repository; there is no sample for an outbound gateway or ReplyingKafkaTemplate.
Thanks in advance.

For the ReplyingKafkaTemplate logic, Spring Integration provides a dedicated KafkaProducerMessageHandler which can be configured with a ReplyingKafkaTemplate.
See more info in docs:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-outbound-gateway
And more about ReplyingKafkaTemplate:
https://docs.spring.io/spring-kafka/reference/html/#replying-template
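Putting the two together, a minimal Java config sketch for the batch (client) side could look like this; it closely follows the examples in those docs, and the topic names, group id and channel names are assumptions to adjust to your setup:

@Bean
public ReplyingKafkaTemplate<String, String, String> replyingTemplate(
        ProducerFactory<String, String> pf,
        ConcurrentMessageListenerContainer<String, String> repliesContainer) {
    // request/reply template: sends requests and correlates the reply records
    return new ReplyingKafkaTemplate<>(pf, repliesContainer);
}

@Bean
public ConcurrentMessageListenerContainer<String, String> repliesContainer(
        ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {
    // listener container for the reply topic ("replies" is a placeholder name)
    ConcurrentMessageListenerContainer<String, String> container =
            containerFactory.createContainer("replies");
    container.getContainerProperties().setGroupId("repliesGroup");
    container.setAutoStartup(false);
    return container;
}

@Bean
@ServiceActivator(inputChannel = "kafkaRequests", outputChannel = "kafkaReplies")
public KafkaProducerMessageHandler<String, String> outGateway(
        ReplyingKafkaTemplate<String, String, String> kafkaTemplate) {
    // same handler class as the outbound channel adapter; passing a
    // ReplyingKafkaTemplate is what turns it into a gateway. The request
    // topic comes from the KafkaHeaders.TOPIC header or setTopicExpression().
    return new KafkaProducerMessageHandler<>(kafkaTemplate);
}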
On the other side, a KafkaInboundGateway probably must be configured, respectively:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-inbound-gateway
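And on the processing (server) side, a corresponding sketch (the channel name and reply timeout are assumptions):

@Bean
public KafkaInboundGateway<String, String, String> inboundGateway(
        AbstractMessageListenerContainer<String, String> container,
        KafkaTemplate<String, String> replyTemplate) {
    // receives requests from the listener container and sends replies back
    KafkaInboundGateway<String, String, String> gateway =
            new KafkaInboundGateway<>(container, replyTemplate);
    gateway.setRequestChannelName("serverRequests"); // your processing flow starts here
    gateway.setReplyTimeout(30_000);
    return gateway;
}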

Related

spring cloud stream app starter File Source to Spring Batch Cloud Task

I have a Spring Batch Boot app which takes a flat file as input. I converted the app into a cloud task and deployed it in the local Spring Cloud Data Flow server. Next, I created a stream starting with File Source -> tasklaunchrequest-transform -> task-launcher-local, which starts my batch cloud task app.
It looks like the file does not make it into the batch app; I do not see anything in the logs to indicate that it does.
I checked the docs at https://github.com/spring-cloud-stream-app-starters/tasklaunchrequest-transform/tree/master/spring-cloud-starter-stream-processor-tasklaunchrequest-transform
It says:
Any input type. (payload and header are discarded)
My question is: how do I pass the file as payload from the File Source to the batch app? It seems like a very basic feature.
Any help is very much appreciated.
You'll need to write your own transformer that takes the data from the source and packages it up so your task can consume it.
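For example, assuming the file source is configured with mode=ref (so the payload is a java.io.File), a sketch of such a transformer could build the TaskLaunchRequest itself and hand the file path to the task; the Maven URI and argument name below are placeholders, and the constructor shown is the five-argument one from spring-cloud-task 1.x:

@EnableBinding(Processor.class)
public class FileToTaskLaunchRequestTransformer {

    @Transformer(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
    public TaskLaunchRequest transform(File file) {
        // pass the file location to the batch task as a command-line argument
        List<String> args = Collections.singletonList(
                "localFilePath=" + file.getAbsolutePath());
        // the artifact URI is a placeholder for your batch task app
        return new TaskLaunchRequest("maven://io.example:batch-task:jar:1.0.0",
                args, null, null, null);
    }
}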

Connecting Spring Batch to Remote Cassandra Database

I'm hoping to read data from a CSV file, process the data, and upload it to a remote database. I'm using Spring's starter repo as a base. Code is here.
I tried putting this information in the properties file:
spring.data.cassandra.contact-points=IP, IP
spring.data.cassandra.port=PORT
spring.data.cassandra.keyspace-name=KEYSPACE_NAME
spring.data.cassandra.username=USER
spring.data.cassandra.password=PASS
spring.data.cassandra.ssl=TRUE
However, I think it keeps defaulting to pushing to some local Tomcat JDBC pool. I'm not really sure where to start. Any help is appreciated! Thanks.
Your code doesn't have anything to use Cassandra: it has none of the dependencies, and the ItemWriter implementation is a JdbcBatchItemWriter, which I don't think will work for Cassandra. You need to configure your application to actually use Cassandra (the Spring Data starter as well as an ItemWriter implementation that can write to Cassandra).
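For example, with spring-boot-starter-data-cassandra on the classpath, a minimal ItemWriter sketch (the entity mapping and wiring are up to you) could delegate to CassandraOperations:

public class CassandraItemWriter<T> implements ItemWriter<T> {

    private final CassandraOperations cassandraOperations;

    public CassandraItemWriter(CassandraOperations cassandraOperations) {
        this.cassandraOperations = cassandraOperations;
    }

    @Override
    public void write(List<? extends T> items) throws Exception {
        // insert each mapped entity into its Cassandra table
        for (T item : items) {
            cassandraOperations.insert(item);
        }
    }
}

You would then register this as the step's writer in place of the JdbcBatchItemWriter.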

Spring Cloud Stream Rabbit Binder Routing Key always '#'

Version: Spring Boot: 1.4.2.RELEASE
Spring Cloud Deps: Brixton.SR7
Here is the application.properties of my processor app.
logging.level.=DEBUG
server.port=0
logging.file=traveller-events-processor.log
spring.cloud.stream.rabbit.bindings.input.consumer.bindingRoutingKey='aa'
spring.cloud.stream.rabbit.bindings.input.consumer.bindingRoutingKey=aa
spring.cloud.stream.rabbit.bindings.input.consumer.bindQueue=true
spring.cloud.stream.rabbit.bindings.input.consumer.routing-key='aa'
spring.cloud.stream.rabbit.bindings.input.consumer.routingKey='aa'
spring.cloud.stream.bindings.input.destination=events-exchange
spring.cloud.stream.bindings.input.group=eventconsumersgroup
spring.cloud.stream.bindings.output.destination=work.out
spring.cloud.stream.bindings.output.contentType=text/plain
spring.cloud.stream.bindings.output.binder=rabbit
spring.cloud.stream.bindings.output.group=traveller-events-output-group
When I start this app, events-exchange is created as expected and bound to a queue named events-exchange.eventconsumersgroup (which is also OK). But the routing key is always '#'. I've tried all the options I could fish out of various documentation. Am I missing something here?
I want this app to subscribe only to certain messages (which I want to achieve via the routing key).
I see that Brixton.SR7 uses Spring Cloud Stream 1.0.2.RELEASE, and I don't find routingKey as a Rabbit consumer property in that version. Would you upgrade to the Spring Cloud Camden release (or the latest one) so that you can use the consumer property bindingRoutingKey, as mentioned here?
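For reference, with Camden (Spring Cloud Stream 1.1.x) the binding routing key would be set like this; note there are no quotes around the value:

spring.cloud.stream.bindings.input.destination=events-exchange
spring.cloud.stream.bindings.input.group=eventconsumersgroup
spring.cloud.stream.rabbit.bindings.input.consumer.bindingRoutingKey=aa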

Configuring Spring Batch jobs via database instead of xml

I am new to Spring Batch, and I need a few clarifications regarding Spring Batch Admin.
Can I keep the job configuration related information in a database instead of uploading XML file based configuration?

sftp outbound channel adapter with retry

I am using Spring Batch and Spring Integration; once my batch job is completed, it creates text files, and those need to be uploaded to an FTP server. Sometimes we notice that those connections drop, and the upload needs to be retried. Is there any way to use the Spring Retry project to try again a few seconds later to see if it can upload those files? We want it to be configurable.
If so, is there any example out there?
Thanks
Yes, Spring Integration provides a retry component for you. It is called RequestHandlerRetryAdvice:
<!-- channel, session-factory and remote-directory are placeholders for your own beans/values -->
<int-sftp:outbound-channel-adapter id="sftpOutbound"
        channel="sftpOutboundChannel"
        session-factory="sftpSessionFactory"
        remote-directory="/remote/dir">
    <int-sftp:request-handler-advice-chain>
        <bean class="org.springframework.integration.handler.advice.RequestHandlerRetryAdvice" />
    </int-sftp:request-handler-advice-chain>
</int-sftp:outbound-channel-adapter>
Please find more info in the Reference Manual.
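To control how long it waits and how many times it retries, the advice can be given a RetryTemplate. Here is a sketch in Java config (the 5 attempts and 5-second back-off are arbitrary examples), which you can then reference from the advice chain with <ref bean="retryAdvice"/> instead of the inline bean:

@Bean
public RequestHandlerRetryAdvice retryAdvice() {
    RequestHandlerRetryAdvice advice = new RequestHandlerRetryAdvice();
    RetryTemplate retryTemplate = new RetryTemplate();

    // retry up to 5 times...
    SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy();
    retryPolicy.setMaxAttempts(5);
    retryTemplate.setRetryPolicy(retryPolicy);

    // ...waiting 5 seconds between attempts
    FixedBackOffPolicy backOffPolicy = new FixedBackOffPolicy();
    backOffPolicy.setBackOffPeriod(5000);
    retryTemplate.setBackOffPolicy(backOffPolicy);

    advice.setRetryTemplate(retryTemplate);
    return advice;
}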
Consider using RequestHandlerCircuitBreakerAdvice as well for your "connections drop" cases.
And here is the sample.