I am new to Spring Batch, and I need a few clarifications regarding Spring Batch Admin.
Can I keep job configuration information in a database instead of uploading XML-based configuration files?
My Goal
I need to read a file, split each line into a message, and send the messages to Kafka from a Spring Batch project. Another Spring Integration project will receive the messages and process them asynchronously. I want to return those messages to the batch project after processing and create four different files out of them.
Here I am trying to use an outbound gateway and a ReplyingKafkaTemplate, but I am unable to configure them properly. Is there an example or reference guide for this configuration?
I have checked the Spring Batch integration samples repository on GitHub; there is no sample for an outbound gateway or a ReplyingKafkaTemplate.
Thanks in Advance.
For ReplyingKafkaTemplate logic, Spring Integration provides a dedicated KafkaProducerMessageHandler, which can be configured with a ReplyingKafkaTemplate.
See more info in docs:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-outbound-gateway
And more about ReplyingKafkaTemplate:
https://docs.spring.io/spring-kafka/reference/html/#replying-template
On the other side, a KafkaInboundGateway probably has to be configured, respectively:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-inbound-gateway
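To make this concrete, here is a minimal configuration sketch on the batch side, closely following the patterns in those docs. The topic names, channel names, group id, and the String key/value types are assumptions for illustration, not taken from the question:

```java
// Replies listener container for the ReplyingKafkaTemplate.
// Topic and group id are illustrative assumptions.
@Bean
public ConcurrentMessageListenerContainer<String, String> repliesContainer(
        ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {
    ConcurrentMessageListenerContainer<String, String> container =
            containerFactory.createContainer("replies-topic");
    container.getContainerProperties().setGroupId("batch-replies");
    container.setAutoStartup(false); // started by the ReplyingKafkaTemplate
    return container;
}

@Bean
public ReplyingKafkaTemplate<String, String, String> replyingTemplate(
        ProducerFactory<String, String> pf,
        ConcurrentMessageListenerContainer<String, String> repliesContainer) {
    return new ReplyingKafkaTemplate<>(pf, repliesContainer);
}

// Outbound gateway: sends requests to Kafka and routes replies
// to the "fromKafka" channel.
@Bean
@ServiceActivator(inputChannel = "toKafka", outputChannel = "fromKafka")
public KafkaProducerMessageHandler<String, String> outGateway(
        ReplyingKafkaTemplate<String, String, String> template) {
    KafkaProducerMessageHandler<String, String> handler =
            new KafkaProducerMessageHandler<>(template);
    handler.setTopicExpression(new LiteralExpression("requests-topic"));
    return handler;
}
```

The Spring Integration side of the other project would then use a KafkaInboundGateway listening on "requests-topic" and replying to "replies-topic".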
I have a Spring Batch Boot app which takes a flat file as input. I converted the app into a cloud task and deployed it on a local Spring Cloud Data Flow server. Next, I created a stream starting with File Source -> tasklaunchrequest-transform -> task-launcher-local, which starts my batch cloud task app.
It looks like the file does not make it into the batch app; I do not see anything in the logs to indicate that it does.
I checked the docs at https://github.com/spring-cloud-stream-app-starters/tasklaunchrequest-transform/tree/master/spring-cloud-starter-stream-processor-tasklaunchrequest-transform
It says
Any input type. (payload and header are discarded)
My question is: how do I pass the file as the payload from the File Source to the batch app? This seems like a very basic feature.
Any help is very much appreciated.
You'll need to write your own transformer that takes the data from the source and packages it up so your task can consume it.
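As a sketch of what such a transformer could look like (the class name, artifact URI, and command-line argument key are illustrative assumptions, and the TaskLaunchRequest constructor shown is the one from the spring-cloud-task launcher module):

```java
// Sketch of a custom stream processor that turns an incoming File payload
// into a TaskLaunchRequest pointing the batch task at that file.
@EnableBinding(Processor.class)
public class FileToTaskLaunchRequestTransformer {

    @Transformer(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
    public TaskLaunchRequest transform(Message<File> message) {
        File file = message.getPayload();
        return new TaskLaunchRequest(
                "maven://com.example:batch-task:1.0.0",      // task artifact (hypothetical)
                Collections.singletonList("inputFile=" + file.getAbsolutePath()),
                null,   // environment properties
                null,   // deployment properties
                null);  // application name
    }
}
```

The batch task would then read the "inputFile" argument from its job parameters instead of expecting the file content itself as the stream payload.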
I'm hoping to read data from a CSV file, process the data, and upload it to a remote database. I'm using Spring's starter repo as a base. Code is here
I tried putting this information in the properties file:
spring.data.cassandra.contact-points=IP, IP
spring.data.cassandra.port=PORT
spring.data.cassandra.keyspace-name=KEYSPACE_NAME
spring.data.cassandra.username=USER
spring.data.cassandra.password=PASS
spring.data.cassandra.ssl=TRUE
However, I think it keeps defaulting to the local Tomcat JDBC connection pool. I'm not really sure where to start. Any help is appreciated! Thanks.
Your code doesn't have anything that uses Cassandra. It doesn't have any of the required dependencies, and the ItemWriter implementation is a JdbcBatchItemWriter, which I don't think will work for Cassandra. You need to configure your application to actually use Cassandra (the Spring Data starter as well as an ItemWriter implementation that can write to Cassandra).
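Such a writer could look roughly like this sketch, assuming the spring-boot-starter-data-cassandra dependency is added and a mapped Person entity exists (both the entity and the class name here are hypothetical):

```java
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.data.cassandra.core.CassandraOperations;

// Sketch of an ItemWriter that saves each chunk to Cassandra through
// Spring Data's CassandraOperations.
public class CassandraPersonItemWriter implements ItemWriter<Person> {

    private final CassandraOperations cassandraOperations;

    public CassandraPersonItemWriter(CassandraOperations cassandraOperations) {
        this.cassandraOperations = cassandraOperations;
    }

    @Override
    public void write(List<? extends Person> items) {
        // Insert every item of the chunk into the mapped Cassandra table.
        items.forEach(cassandraOperations::insert);
    }
}
```

This writer would replace the JdbcBatchItemWriter in the step configuration, and the spring.data.cassandra.* properties you already have would then configure the connection.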
I have a Jenkins job that runs Selenium tests. The results are stored in a CSV file and then fed to Cassandra. My requirement is to create a JIRA request when a test fails, either by analyzing the CSV file or by querying Cassandra. Please suggest possible approaches.
Use the Jira REST API together with a CSV reader, or the Cassandra API:
https://docs.atlassian.com/jira/REST/latest/
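As a concrete starting point, a failed row from the CSV can be turned into the JSON body that Jira's create-issue endpoint (POST /rest/api/2/issue) expects. The project key, issue type, and CSV layout below are assumptions for illustration:

```java
import java.util.Optional;

class JiraIssueFromCsv {

    // Build the JSON body for Jira's create-issue REST endpoint.
    // The field structure follows the Jira REST API; the project key
    // and issue type are hypothetical examples.
    static String createIssuePayload(String projectKey, String summary, String description) {
        return "{\"fields\":{"
                + "\"project\":{\"key\":\"" + projectKey + "\"},"
                + "\"summary\":\"" + summary + "\","
                + "\"description\":\"" + description + "\","
                + "\"issuetype\":{\"name\":\"Bug\"}}}";
    }

    // Assumes a CSV line of the form "testName,STATUS"; returns a payload
    // only for failed tests.
    static Optional<String> payloadForResult(String csvLine) {
        String[] cols = csvLine.split(",");
        if (cols.length < 2 || !"FAIL".equalsIgnoreCase(cols[1].trim())) {
            return Optional.empty();
        }
        return Optional.of(createIssuePayload("QA",
                "Selenium test failed: " + cols[0].trim(),
                "Automated result imported from the Jenkins CSV report."));
    }
}
```

The resulting string would then be POSTed to /rest/api/2/issue with authentication; the HTTP call itself is omitted here.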
Our application uses Spring Integration to kick off a Spring Batch job.
It works as follows:
1) The main application class is run, and this loads the Spring application context.
2) A Spring Integration bean is configured to read a file from the file system, and to place the file on a channel.
<int-file:inbound-channel-adapter
        directory="${...}" channel="channel"
        filename-pattern="*.csv" auto-create-directory="false"
        prevent-duplicates="true">
    <int:poller fixed-delay="${...}"/>
</int-file:inbound-channel-adapter>
3) The channel connects to a service-activator bean.
<int:service-activator input-channel="channel" ref="launcher" />
4) The Launcher bean gets the file from the message payload, and launches the Spring Batch job.
The problem is that after the poller's fixed-delay time has elapsed, the launcher bean is called again, and a new job is started with the same parameters.
This throws a JobInstanceAlreadyCompleteException.
I do not see any way to tell the poller to run only once.
What is the recommended way to use Spring Integration and Spring Batch together to avoid this issue?
Thanks
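One common workaround (a sketch, not an official recommendation; the class, channel, and parameter names are illustrative) is to have the launcher build identifying job parameters per launch, for example from the file path plus a timestamp, so a repeated poll can never collide with an already completed JobInstance:

```java
// Sketch of the launcher bean: derives unique job parameters from the
// incoming file so JobInstanceAlreadyCompleteException cannot occur.
public class Launcher {

    private final JobLauncher jobLauncher;
    private final Job job;

    public Launcher(JobLauncher jobLauncher, Job job) {
        this.jobLauncher = jobLauncher;
        this.job = job;
    }

    public void launch(Message<File> message) throws Exception {
        File file = message.getPayload();
        JobParameters params = new JobParametersBuilder()
                .addString("input.file", file.getAbsolutePath())
                .addLong("launch.time", System.currentTimeMillis()) // makes each launch unique
                .toJobParameters();
        jobLauncher.run(job, params);
    }
}
```

With prevent-duplicates="true" on the adapter, the same file should also not be re-sent within one application run; the timestamp parameter additionally covers restarts of the application against an already processed file.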