I’m trying to update a Camunda DMN table programmatically and deploy it again after the update.
But while creating the process engine I get an exception about the H2 driver, even though my project uses a PostgreSQL database for the Camunda tables.
ProcessEngine processEngine = ProcessEngineConfiguration
        .createStandaloneInMemProcessEngineConfiguration()
        .buildProcessEngine();

org.camunda.bpm.engine.repository.Deployment deployment = processEngine.getRepositoryService()
        .createDeployment()
        .addString(fileName, Dmn.convertToString(dmnModelInstance))
        .name("Deployment after update")
        .deploy();
java.sql.SQLException: Error setting driver on UnpooledDataSource. Cause: java.lang.ClassNotFoundException: org.h2.Driver
at org.apache.ibatis.datasource.unpooled.UnpooledDataSource.initializeDriver(UnpooledDataSource.java:221)
at org.apache.ibatis.datasource.unpooled.UnpooledDataSource.doGetConnection(UnpooledDataSource.java:200)
at org.apache.ibatis.datasource.unpooled.UnpooledDataSource.doGetConnection(UnpooledDataSource.java:196)
at org.apache.ibatis.datasource.unpooled.UnpooledDataSource.getConnection(UnpooledDataSource.java:93)
at org.apache.ibatis.datasource.pooled.PooledDataSource.popConnection(PooledDataSource.java:385)
at org.apache.ibatis.datasource.pooled.PooledDataSource.getConnection(PooledDataSource.java:89)
at org.camunda.bpm.engine.impl.cfg.ProcessEngineConfigurationImpl.initDatabaseType(ProcessEngineConfigurationImpl.java:1300)
The exception occurs because createStandaloneInMemProcessEngineConfiguration() defaults to an in-memory H2 datasource. You need to create a DataSource bean explicitly, or declare the datasource attributes in your bootstrap.yml or application.properties file.
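For example, assuming a Spring Boot application with the Camunda Spring Boot starter (the URL and credentials below are placeholders), the PostgreSQL datasource can be declared with the standard Spring Boot datasource properties:

spring.datasource.url=jdbc:postgresql://localhost:5432/camunda
spring.datasource.username=camunda
spring.datasource.password=secret
spring.datasource.driver-class-name=org.postgresql.Driver

If you prefer to configure the engine programmatically instead, an explicit bean configuration looks like this: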
@Configuration
public class ExampleProcessEngineConfiguration {

    @Bean
    public DataSource dataSource() {
        // Use a JNDI data source or read the properties from
        // env or a properties file.
        // Note: The following shows only a simple data source
        // for an in-memory H2 database.
        SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
        dataSource.setDriverClass(org.h2.Driver.class);
        dataSource.setUrl("jdbc:h2:mem:camunda;DB_CLOSE_DELAY=-1");
        dataSource.setUsername("sa");
        dataSource.setPassword("");
        return dataSource;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        return new DataSourceTransactionManager(dataSource());
    }

    @Bean
    public SpringProcessEngineConfiguration processEngineConfiguration() {
        SpringProcessEngineConfiguration config = new SpringProcessEngineConfiguration();
        config.setDataSource(dataSource());
        config.setTransactionManager(transactionManager());
        config.setDatabaseSchemaUpdate("true");
        config.setHistory("audit");
        config.setJobExecutorActivate(true);
        return config;
    }

    @Bean
    public ProcessEngineFactoryBean processEngine() {
        ProcessEngineFactoryBean factoryBean = new ProcessEngineFactoryBean();
        factoryBean.setProcessEngineConfiguration(processEngineConfiguration());
        return factoryBean;
    }

    @Bean
    public RepositoryService repositoryService(ProcessEngine processEngine) {
        return processEngine.getRepositoryService();
    }

    @Bean
    public RuntimeService runtimeService(ProcessEngine processEngine) {
        return processEngine.getRuntimeService();
    }

    @Bean
    public TaskService taskService(ProcessEngine processEngine) {
        return processEngine.getTaskService();
    }

    // more engine services and additional beans ...
}
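With the engine wired against your actual datasource, the deployment from the question can then go through the Spring-managed RepositoryService instead of a standalone in-memory engine. A sketch reusing the calls from the question (the redeployDmn method name is illustrative):

@Autowired
private RepositoryService repositoryService;

public void redeployDmn(String fileName, DmnModelInstance dmnModelInstance) {
    // Redeploy the updated DMN model through the Spring-managed engine
    repositoryService.createDeployment()
            .addString(fileName, Dmn.convertToString(dmnModelInstance))
            .name("Deployment after update")
            .deploy();
}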
Related
I wrote a simple demo to override the default job repository. Instead of the map-based repository, I wanted an H2 database to hold persistent metadata.
Therefore I wrote a CustomBatchConfigurer like this:
@Configuration
public class CustomBatchConfigurer extends DefaultBatchConfigurer {

    @Autowired
    @Qualifier("repo-db")
    DataSource dataSource;

    @Override
    public void setDataSource(DataSource dataSource) {
        super.setDataSource(dataSource);
    }

    @Bean(name = "repo-db")
    public DataSource getJobRepoDataSource() {
        return DataSourceBuilder
                .create()
                .url("jdbc:h2:tcp://localhost/~/src/spring-batch/batch_repo")
                .driverClassName("org.h2.Driver")
                .username("sa")
                .password("test")
                .type(HikariDataSource.class)
                .build();
    }
}
But Spring Batch is not picking it up:
o.s.b.c.c.a.DefaultBatchConfigurer: No datasource was provided...using a Map based JobRepository
What am I doing wrong? I thought I had followed the instructions in the Spring reference documentation.
Thanks and regards,
Jörg
Your configuration should look more like this:
@Configuration
public class CustomBatchConfiguration {

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("repo-db") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    @Bean(name = "repo-db")
    public DataSource jobRepoDataSource() {
        return DataSourceBuilder
                .create()
                .url("jdbc:h2:tcp://localhost/~/src/spring-batch/batch_repo")
                .driverClassName("org.h2.Driver")
                .username("sa")
                .password("test")
                .type(HikariDataSource.class)
                .build();
    }
}
If your bean methods are proxied (which is the default), you can also simplify the first bean method to
@Bean
public BatchConfigurer batchConfigurer() {
    return new DefaultBatchConfigurer(jobRepoDataSource());
}
Please also have a second look at the official documentation: https://docs.spring.io/spring-batch/docs/4.3.x/reference/html/job.html#javaConfig
I am trying to find what went wrong with my code, which worked fine until I moved to JtaTransactionManager. It now has an issue saving records to the database, although fetching records works fine. Below are my sample TransactionConfig class and service class method.
@Configuration
@ComponentScan
@EnableTransactionManagement
public class TransactionConfig {

    @Bean(name = "userTransaction")
    public UserTransaction userTransaction() throws Throwable {
        UserTransactionImp userTransactionImp = new UserTransactionImp();
        //userTransactionImp.setTransactionTimeout(10000);
        return userTransactionImp;
    }

    @Bean(name = "atomikosTransactionManager", initMethod = "init", destroyMethod = "close")
    public TransactionManager atomikosTransactionManager() throws Throwable {
        UserTransactionManager userTransactionManager = new UserTransactionManager();
        userTransactionManager.setForceShutdown(false);
        return userTransactionManager;
    }

    @Bean(name = "transactionManager")
    @DependsOn({ "userTransaction", "atomikosTransactionManager" })
    public PlatformTransactionManager transactionManager() throws Throwable {
        UserTransaction userTransaction = userTransaction();
        TransactionManager atomikosTransactionManager = atomikosTransactionManager();
        return new JtaTransactionManager(userTransaction, atomikosTransactionManager);
    }
}
---Employee Service Class Method---
@Transactional
public void appExample() {
    try {
        Employee emp = new Employee();
        emp.setFirstName("Veer");
        emp.setLastName("kumar");
        empRepo.save(emp);
    } catch (Exception e) {
        log.error(e);
    }
}
I think the issue is with the empRepo.save() call. This call is not committing any changes to the database because you are using @Transactional for transaction management.
Please try empRepo.saveAndFlush(), which immediately flushes the data to the database. You can refer to the answer "Difference between save and saveAndFlush in Spring Data JPA".
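A minimal sketch of the suggested change, assuming empRepo is a JpaRepository (which is what provides saveAndFlush):

@Transactional
public void appExample() {
    Employee emp = new Employee();
    emp.setFirstName("Veer");
    emp.setLastName("kumar");
    // saveAndFlush writes the pending changes to the database immediately
    empRepo.saveAndFlush(emp);
}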
I am new to Spring Batch. I have configured my job with an in-memory repository, but it still seems to be using the database to persist job metadata.
My Spring Batch configuration is:
@Configuration
public class BatchConfiguration {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobBuilderFactory jobBuilder;

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher job = new SimpleJobLauncher();
        job.setJobRepository(getJobRepo());
        job.afterPropertiesSet();
        return job;
    }

    @Bean
    public PlatformTransactionManager getTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobRepository getJobRepo() throws Exception {
        return new MapJobRepositoryFactoryBean(getTransactionManager()).getObject();
    }

    @Bean
    public Step step1(JdbcBatchItemWriter<Person> writer) throws Exception {
        return stepBuilderFactory.get("step1")
                .<Person, Person>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer)
                .repository(getJobRepo())
                .build();
    }

    @Bean
    public Job job(@Qualifier("step1") Step step1) throws Exception {
        return jobBuilder.get("myJob").start(step1).repository(getJobRepo()).build();
    }
}
How can I resolve the above issue?
If you are using Spring Boot, a simple property in your application.properties will solve the issue:
spring.batch.initialize-schema=ALWAYS
For a non-Spring Boot setup: this error shows up when a datasource bean is declared in the batch configuration. To work around the problem I added an embedded datasource, since I didn't want to create those tables in the application database:
@Bean
public DataSource mysqlDataSource() {
    // create your application datasource here
}

@Bean
@Primary
public DataSource batchEmbeddedDatasource() {
    // in-memory datasource required by Spring Batch
    EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
    return builder.setType(EmbeddedDatabaseType.H2)
            .addScript("classpath:schema-drop-h2.sql")
            .addScript("classpath:schema-h2.sql")
            .build();
}
The initialization scripts can be found inside the spring-batch-core-xxx.jar under the org.springframework.batch.core package. Note that I used an in-memory database, but the solution is also valid for other database systems.
For those who face the same problem with a MySQL database on CentOS (or most Unix-based systems): table names are case-sensitive on Linux. Setting lower_case_table_names=1 solved the problem.
Find the official documentation here.
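A sketch of where the setting goes, assuming MySQL is configured through my.cnf (the file location varies by distribution, and the server must be restarted for the change to take effect):

[mysqld]
lower_case_table_names=1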
For those using Spring Boot versions greater than 2.5, this worked inside application.properties:
spring.batch.jdbc.initialize-schema = ALWAYS
As of now, I'm able to connect to Cassandra via the following code:
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;
public static Session connection() {
    Cluster cluster = Cluster.builder()
            .addContactPoints("IP1", "IP2")
            .withCredentials("user", "password")
            .withSSL()
            .build();
    Session session = null;
    try {
        session = cluster.connect("database_name");
        session.execute("CQL Statement");
    } finally {
        IOUtils.closeQuietly(session);
        IOUtils.closeQuietly(cluster);
    }
    return session;
}
The problem is that I need to write to Cassandra in a Spring Batch project. Most of the starter kits seem to use a JdbcBatchItemWriter to write to a MySQL database from a chunk. Is this possible? It seems that a JdbcBatchItemWriter cannot connect to a Cassandra database.
The current ItemWriter code is below:
@Bean
public JdbcBatchItemWriter<Person> writer() {
    JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
    writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
    writer.setDataSource(dataSource);
    return writer;
}
Spring Data Cassandra provides repository abstractions for Cassandra that you should be able to use in conjunction with the RepositoryItemWriter to write to Cassandra from Spring Batch.
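A minimal sketch of that approach, assuming a hypothetical PersonRepository extending Spring Data Cassandra's CassandraRepository (the repository and entity names are illustrative; RepositoryItemWriterBuilder is available from Spring Batch 4 onwards):

// public interface PersonRepository extends CassandraRepository<Person, UUID> {}

@Bean
public RepositoryItemWriter<Person> cassandraItemWriter(PersonRepository personRepository) {
    return new RepositoryItemWriterBuilder<Person>()
            .repository(personRepository)  // delegate each chunk to the Spring Data repository
            .methodName("save")            // repository method invoked for each item
            .build();
}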
It is possible to extend Spring Batch to support Cassandra by customising ItemReader and ItemWriter.
ItemWriter example:
public class CassandraBatchItemWriter<Company> implements ItemWriter<Company>, InitializingBean {

    protected static final Log logger = LogFactory.getLog(CassandraBatchItemWriter.class);

    private final Class<Company> aClass;

    @Autowired
    private CassandraTemplate cassandraTemplate;

    public CassandraBatchItemWriter(final Class<Company> aClass) {
        this.aClass = aClass;
    }

    @Override
    public void afterPropertiesSet() throws Exception { }

    @Override
    public void write(final List<? extends Company> items) throws Exception {
        logger.debug("Write operation is performing, the size is " + items.size());
        if (!items.isEmpty()) {
            logger.info("Deleting in a batch...");
            cassandraTemplate.deleteAll(aClass);
            logger.info("Inserting in a batch...");
            cassandraTemplate.insert(items);
        } else {
            logger.debug("Items list is empty...");
        }
    }
}
Then you can register it as a @Bean in a @Configuration class:
@Bean
public ItemWriter<Company> writer(final DataSource dataSource) {
    final CassandraBatchItemWriter<Company> writer = new CassandraBatchItemWriter<Company>(Company.class);
    return writer;
}
Full source code can be found in the GitHub repo: Spring-Batch-with-Cassandra
My setup works locally but not when I deploy it to Cloud Foundry/MongoLab.
The config is very similar to the docs.
My local Spring config:
@Configuration
@Profile("dev")
@EnableMongoAuditing
@EnableMongoRepositories(basePackages = "com.foo.model")
public class SpringMongoConfiguration extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "myDb";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient("localhost");
    }

    @Bean
    public AuditorAware<User> myAuditorProvider() {
        return new SpringSecurityAuditorAware();
    }
}
This is the Cloud Foundry setup:
@Configuration
@Profile("cloud")
@EnableMongoAuditing
@EnableMongoRepositories(basePackages = "com.foo.model")
public class SpringCloudMongoDBConfiguration extends AbstractMongoConfiguration {

    private Cloud getCloud() {
        CloudFactory cloudFactory = new CloudFactory();
        return cloudFactory.getCloud();
    }

    @Bean
    public MongoDbFactory mongoDbFactory() {
        Cloud cloud = getCloud();
        MongoServiceInfo serviceInfo = (MongoServiceInfo) cloud.getServiceInfo(cloud.getCloudProperties().getProperty("cloud.services.mongo.id"));
        String serviceID = serviceInfo.getId();
        return cloud.getServiceConnector(serviceID, MongoDbFactory.class, null);
    }

    @Override
    protected String getDatabaseName() {
        Cloud cloud = getCloud();
        return cloud.getCloudProperties().getProperty("cloud.services.mongo.id");
    }

    @Override
    public Mongo mongo() throws Exception {
        Cloud cloud = getCloud();
        return new MongoClient(cloud.getCloudProperties().getProperty("cloud.services.mongo.connection.host"));
    }

    @Bean
    public MongoTemplate mongoTemplate() {
        return new MongoTemplate(mongoDbFactory());
    }

    @Bean
    public AuditorAware<User> myAuditorProvider() {
        return new SpringSecurityAuditorAware();
    }
}
And the error I'm getting when I try to save a document in Cloud Foundry is:
OUT ERROR: org.springframework.data.support.IsNewStrategyFactorySupport - Unexpected error
OUT java.lang.IllegalArgumentException: Unsupported entity com.foo.model.project.Project! Could not determine IsNewStrategy.
OUT at org.springframework.data.mongodb.core.MongoTemplate.insert(MongoTemplate.java:739)
OUT at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221)
OUT at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
Any ideas? Is it my config file, etc.?
Thanks in advance
Niclas
This is usually caused when the Mongo mapping metadata for your entities is not obtained by scanning them at application startup. By default, AbstractMongoConfiguration uses the package of the actual configuration class to look for @Document annotated classes at startup.
The exception message makes me assume that SpringCloudMongoDBConfiguration is not located in any of the super-packages of com.foo.model.project. There are two solutions to this:
1. Stick to the convenience of putting application configuration classes into the root package of your application. This causes your application packages to be scanned for domain classes, the metadata to be obtained, and the is-new detection to work as expected.
2. Manually hand the package containing the domain classes to the infrastructure by overriding AbstractMongoConfiguration.getMappingBasePackage(), as sketched below.
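A minimal sketch of the second option, added to SpringCloudMongoDBConfiguration (the package name is taken from the @EnableMongoRepositories setting in the question):

@Override
protected String getMappingBasePackage() {
    // scan this package for @Document classes at startup
    return "com.foo.model";
}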
The reason you might see the configuration working in the local environment is that the mapping metadata might be obtained through a non-persisting persistence operation (e.g. a query), with everything else proceeding from there.