With Spring Batch XML-based configuration you can parameterize the commit-interval / chunk size like this:
<job id="basicSimpleJob"
xmlns="http://www.springframework.org/schema/batch">
<step id="basicSimpleStep" >
<tasklet>
<chunk
reader="reader"
processor="processor"
writer="writer"
commit-interval="#{jobParameters['commit.interval']}">
</chunk>
</tasklet>
</step>
</job>
With JavaConfig-based configuration it could look like this:
@Bean
public Step step(
        ItemStreamReader<Map<String, Object>> reader,
        ItemWriter<Map<String, Object>> writer,
        @Value("#{jobParameters['commit.interval']}") Integer commitInterval
) throws Exception {
    return steps
            .get("basicSimpleStep")
            .<Map<String, Object>, Map<String, Object>>chunk(commitInterval)
            .reader(reader)
            .processor(new FilterItemProcessor())
            .writer(writer)
            .build();
}
but it does not work; I get either
Caused by: org.springframework.expression.spel.SpelEvaluationException: EL1008E:(pos 0): Property or field 'jobParameters' cannot be found on object of type 'org.springframework.beans.factory.config.BeanExpressionContext' - maybe not public?
or, while using @StepScope for the step bean,
Caused by: java.lang.IllegalStateException: No context holder available for step scope
I know I have a working step scope; other step-scoped beans defined inside the same class as the step work fine.
Right now I use a CompletionPolicy, which does work with step scope, but I would like to know if someone got it to work the "normal" way, or if it's time for a JIRA ticket
... which is created at https://jira.spring.io/browse/BATCH-2263
Adding the @JobScope annotation to the Step definition works in Spring Batch 3:
@Bean
@JobScope
public Step step(
        ItemStreamReader<Map<String, Object>> reader,
        ItemWriter<Map<String, Object>> writer,
        @Value("#{jobParameters['commit.interval']}") Integer commitInterval
) throws Exception {
    return steps.get("basicSimpleStep")
            .<Map<String, Object>, Map<String, Object>>chunk(commitInterval)
            .reader(reader)
            .processor(new FilterItemProcessor())
            .writer(writer)
            .build();
}
This will initialize the step bean at job execution time, so late binding of jobParameters works in this case.
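For completeness, the parameter is then supplied at launch time, for example (a usage sketch; jobLauncher and job are assumed to be wired elsewhere):

JobParameters params = new JobParametersBuilder()
        .addLong("commit.interval", 100L)
        .toJobParameters();
jobLauncher.run(job, params);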
I have little experience with JavaConfig, so maybe this is an issue only for commit-interval late binding in Java configuration (in the Spring Batch ChunkElementParser.java source there are a few lines of code that check whether commit-interval starts with a # and, if so, inject a step-scoped SimpleCompletionPolicy); you can try injecting a StepExecutionSimpleCompletionPolicy and check whether that solution works.
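For illustration, here is a minimal sketch of that approach against the beans from the question (the policy's default parameter key is already "commit.interval"; it must also be registered as a listener so it sees the StepExecution when the step starts):

@Bean
public Step step(
        ItemStreamReader<Map<String, Object>> reader,
        ItemWriter<Map<String, Object>> writer
) throws Exception {
    // Reads jobParameters['commit.interval'] in beforeStep(), no step scope needed
    StepExecutionSimpleCompletionPolicy policy = new StepExecutionSimpleCompletionPolicy();
    return steps
            .get("basicSimpleStep")
            .<Map<String, Object>, Map<String, Object>>chunk(policy)
            .reader(reader)
            .processor(new FilterItemProcessor())
            .writer(writer)
            .listener((StepExecutionListener) policy) // lets the policy observe the step start
            .build();
}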
Also, I have never tried late binding of commit-interval with XML config, but there is an open ticket titled "Commit Interval not working as intended when used in Late Binding".
As a last resort, if you are using version 3.0, you can also annotate the step with @JobScope and check if that works.
Related
The documentation is not very clear about the role of the PlatformTransactionManager in step configuration.
First, stepBuilder.tasklet and stepBuilder.chunk require a PlatformTransactionManager as second parameter, while the migration guide says "it is now required to manually configure the transaction manager on any tasklet step definition (...) This is only required for tasklet steps, other step types do not require a transaction manager by design".
Moreover, in the documentation the transactionManager is injected via a method parameter:
/**
* Note the TransactionManager is typically autowired in and not needed to be explicitly
* configured
*/
But the transactionManager created by Spring Boot is linked to the DataSource created by Spring Boot based on spring.datasource.url. So with auto-configuration, the following beans work together: dataSource, platformTransactionManager, jobRepository. That makes sense for job and step execution metadata management.
But unless the readers, writers and tasklets work with this default DataSource used by the JobOperator, the auto-configured transactionManager must not be used for the step configuration. Am I right?
Tasklets or chunk-oriented steps will often need another PlatformTransactionManager:
if a step writes data to a specific database, it needs a specific DataSource (not necessarily declared as a bean, otherwise the JobRepository will use it) and a specific PlatformTransactionManager linked to that DataSource
if a step writes data to a file or sends messages to a MOM, the ResourcelessTransactionManager is more appropriate. This useful implementation is not mentioned in the documentation.
As far as I understand it, the implementation of PlatformTransactionManager for a step depends on where the data is written and has nothing to do with the transactionManager bean used by the JobOperator. Am I right?
Example:
var builder = new StepBuilder("step-1", jobRepository);
PlatformTransactionManager txManager = new ResourcelessTransactionManager();
return builder.<Input, Output>chunk(10, txManager)
        .reader(reader())
        .processor(processor())
        .writer(writer() /* a FlatFileItemWriter */)
        .build();
or
@SpringBootApplication
@EnableBatchProcessing
public class MyJobConfiguration {

    private DataSource dsForStep1Writer;

    public MyJobConfiguration(@Value("${ds.for.step1.writer.url}") String url) {
        this.dsForStep1Writer = new DriverManagerDataSource(url);
    }

    // reader() method, processor() method

    JdbcBatchItemWriter<Output> writer() {
        return new JdbcBatchItemWriterBuilder<Output>()
                .dataSource(this.dsForStep1Writer)
                .sql("...")
                .itemPreparedStatementSetter((item, ps) -> { /* code */ })
                .build();
    }

    @Bean
    Step step1(JobRepository jobRepository) {
        var builder = new StepBuilder("step-1", jobRepository);
        var txManager = new JdbcTransactionManager(this.dsForStep1Writer);
        return builder.<Input, Output>chunk(10, txManager)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .build();
    }

    // other methods
}
Is that correct?
Role of PlatformTransactionManager with Spring batch 5
The role of the transaction manager did not change between v4 and v5. I wrote an answer about this a couple of years ago for v4, so I will update it for v5 here:
In Spring Batch, there are two places where a transaction manager is used:
In the proxies created around the JobRepository/JobExplorer to create transactional methods when interacting with the job repository/explorer
In each step definition to drive the step's transaction
Typically, the same transaction manager is used in both places, but this is not a requirement. It is perfectly fine to use a ResourcelessTransactionManager with the job repository to not store any meta-data and a JpaTransactionManager in the step to persist data in a database.
Now in v5, #EnableBatchProcessing does not register a transaction manager bean in the application context anymore. You either need to manually configure one in the application context, or use the one auto-configured by Spring Boot (if you are a Spring Boot user).
What #EnableBatchProcessing will do though is look for a bean named transactionManager in the application context and set it on the auto-configured JobRepository and JobExplorer beans (this is configurable with the transactionManagerRef attribute). Again, this transaction manager bean could be manually configured or auto-configured by Boot.
Once that is in place, it is up to you to set that transaction manager on your steps or not.
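For illustration, here is a minimal sketch of that split; the bean names and step contents are mine, and it assumes a DataSource bean already exists (for example, auto-configured by Boot). The transactionManager bean is picked up for the job repository, while the step uses a ResourcelessTransactionManager because it only writes to the console:

import java.util.List;

import javax.sql.DataSource;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.support.JdbcTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
@EnableBatchProcessing // looks for a bean named "transactionManager" for the JobRepository
public class BatchConfig {

    // Drives the job repository's transactional methods (meta-data tables)
    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) {
        return new JdbcTransactionManager(dataSource);
    }

    // The step only writes to the console, so no transactional resource is involved
    @Bean
    public Step step(JobRepository jobRepository) {
        return new StepBuilder("step", jobRepository)
                .<String, String>chunk(2, new ResourcelessTransactionManager())
                .reader(new ListItemReader<>(List.of("a", "b", "c")))
                .writer(items -> items.forEach(System.out::println))
                .build();
    }
}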
We have a Spring Batch project which is XML-based.
We need to create a new job and add it as a nested job to the previous XML-based job.
Is it possible to create the new job annotation-based and add it as a step to the existing XML-based job?
I have created a TaskletStep and tried adding it to the XML-based job as a step, and I am getting:
Cannot convert value of type 'org.springframework.batch.core.step.tasklet.TaskletStep' to required type 'org.springframework.batch.core.step.tasklet.Tasklet' for property 'tasklet': no matching editors or conversion strategy found
A tasklet is not the appropriate type to delegate step processing to a job; you should use a JobStep instead.
The main job can be defined in XML and refer to the "delegate" job (which could be a bean defined in XML or Java config). Here is an example:
<batch:job id="mainJob">
<batch:step id="step">
<batch:job ref="subjob">
</batch:job>
</batch:step>
</batch:job>
In this example, subjob could be a Spring Batch job defined in XML or Java config.
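If the new job itself is defined in Java config, the wrapping step can be built the same way; a sketch using the pre-v5 StepBuilderFactory, where subjob and jobLauncher are assumed to be existing beans:

@Bean
public Step jobStep(StepBuilderFactory steps, Job subjob, JobLauncher jobLauncher) {
    // Builds a JobStep that launches the delegate job, instead of a Tasklet
    return steps.get("jobStep")
            .job(subjob)
            .launcher(jobLauncher)
            .build();
}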
On Finchley.SR2, here is the code:
@Configuration
@EnableAutoConfiguration
@SpringBootApplication
@EnableBinding(Processor.class)
@RestController
public class Application {

    private static Logger log = LoggerFactory.getLogger(Application.class);

    @Autowired
    private Processor processor;

    @Autowired
    MappingJackson2MessageConverter testConverter;

    @Bean
    @StreamMessageConverter
    MappingJackson2MessageConverter createTestConverter() {
        return new MappingJackson2MessageConverter();
    }

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
When I start up, I get:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.cloud.stream.messaging.Processor' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {#org.springframework.beans.factory.annotation.Autowired(required=true)}
But if I take out @StreamMessageConverter, the Processor can be autowired successfully.
What should I do to keep my customized message converter and autowired Processor at the same time? Thanks!
There is a lot going on there, so let's try to parse it out...
First question, why do you need to autowire the following?
@Autowired
private Processor processor;
You generally don't need to interact with Processor directly, since it is used by the framework to provide a delegation/connectivity model between the remote destinations exposed by the binders and your message handlers.
Furthermore, your actual issue is related to lifecycle ordering, which may be a minor yet harmless bug on our end, and probably relates to configuring and autowiring Processor in the same configuration class.
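To illustrate, the typical Finchley-era programming model never touches the Processor bean directly (a hypothetical handler; the payload type and logic are mine):

@SpringBootApplication
@EnableBinding(Processor.class)
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    // The framework binds input/output; no @Autowired Processor needed
    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String handle(String payload) {
        return payload.toUpperCase();
    }
}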
Second:
@Configuration
@EnableAutoConfiguration
@SpringBootApplication
You only need one
@SpringBootApplication
Third:
Why do you need to configure MappingJackson2MessageConverter? Content-type conversion is a transparent feature of the framework, and while we do provide the ability to configure custom message converters, the one you are configuring is already configured by the framework and is in fact the first in the stack of seven pre-configured message converters.
The final question:
What is it that you are trying to do? Can you explain your use case?
Is there any way I can inject TestContext into my Cucumber step class?
I am using Citrus, Spring, and Cucumber together, all in the latest versions, but when I use the injection below I always get a NullPointerException for the TestContext. For TestDesigner and TestRunner I have no issue.
@CitrusResource private TestContext tContext;
and in the log I am seeing
Failed to get proper TestContext from Cucumber Spring application context: No qualifying bean of type 'com.consol.citrus.context.TestContext' available
You are obviously using the setting
cucumber.api.java.ObjectFactory=cucumber.runtime.java.spring.CitrusSpringObjectFactory
in the cucumber.properties.
When doing so you need to manually add the Citrus Spring configuration with the @ContextConfiguration annotation on your steps class.
@ContextConfiguration(classes = CitrusSpringConfig.class)
public class MySteps {

    @CitrusResource
    private TestDesigner designer;

    [...]
}
In case you are using the default cucumber.xml Spring XML application context you need to add the Citrus Spring config as bean to that file:
<!-- JavaConfig bean post-processor -->
<bean class="org.springframework.context.annotation.ConfigurationClassPostProcessor"/>
<!-- Citrus Java config -->
<bean id="citrusSpringConfig" class="com.consol.citrus.config.CitrusSpringConfig"/>
I am trying to write an example app using Spring Batch:
@Bean
public Step testStep() {
    return steps.get("testStep").<String, String>chunk(1)
            .reader(testReader())
            .processor(testProcessor())
            .writer(testWriter())
            .listener(testListener())
            .build();
}
When I throw an exception in the reader, processor, or writer, the job stops with status FAILED. How can I make the job ignore the exception and keep running?
I'm not using any XML config, just annotations and classes. Please give me a hint or a link.
Thanks for any support!
Edit: Can we add skip dynamically, like I posted in the answer below?
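For reference, the usual way to do this with JavaConfig is the fault-tolerant builder API; a sketch based on the step above (the skipped exception type and the limit are illustrative):

@Bean
public Step testStep() {
    return steps.get("testStep").<String, String>chunk(1)
            .reader(testReader())
            .processor(testProcessor())
            .writer(testWriter())
            .faultTolerant()        // enable skip/retry handling
            .skip(Exception.class)  // exception types to ignore
            .skipLimit(10)          // fail the step after 10 skips
            .listener(testListener())
            .build();
}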