IndexMissingException: [News] missing - postgresql

I have a project in Spring Boot where I use Elasticsearch. I save my data to my primary database in Postgres and to Elasticsearch (ES) for searching. I have configured everything and added some data, and I can see the data in Elasticsearch as well as in Postgres. But when I try to run a custom query through searchQuery, it returns an error:
2016-12-09 20:03:09.586 ERROR 1704 --- [pool-2-thread-1] o.s.s.s.TaskUtils$LoggingErrorHandler : Unexpected error occurred in scheduled task.
org.elasticsearch.indices.IndexMissingException: [provenNews] missing
at org.elasticsearch.cluster.metadata.MetaData.convertFromWildcards(MetaData.java:868)
at org.elasticsearch.cluster.metadata.MetaData.concreteIndices(MetaData.java:685)
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.<init>(TransportSearchTypeAction.java:113)
at org.elasticsearch.action.search.type.TransportSearchDfsQueryThenFetchAction$AsyncAction.<init>(TransportSearchDfsQueryThenFetchAction.java:75)
at org.elasticsearch.action.search.type.TransportSearchDfsQueryThenFetchAction$AsyncAction.<init>(TransportSearchDfsQueryThenFetchAction.java:68)
at org.elasticsearch.action.search.type.TransportSearchDfsQueryThenFetchAction.doExecute(TransportSearchDfsQueryThenFetchAction.java:65)
at org.elasticsearch.action.search.type.TransportSearchDfsQueryThenFetchAction.doExecute(TransportSearchDfsQueryThenFetchAction.java:55)
at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:75)
Here is my code.
Configuration class:
@Configuration
@EnableTransactionManagement
@EnableElasticsearchRepositories(basePackages = "org.news.proven.repository")
public class ProjectConfiguration {

    @Bean
    public HibernateJpaSessionFactoryBean sessionFactory() {
        return new HibernateJpaSessionFactoryBean();
    }

    @Bean
    public ElasticsearchTemplate elasticsearchTemplate() {
        return new ElasticsearchTemplate(getNodeClient());
    }

    private static NodeClient getNodeClient() {
        return (NodeClient) nodeBuilder().clusterName(UUID.randomUUID().toString()).local(true).node()
                .client();
    }
}
The class that calls the searchQuery:
@Service
public class NewsSearchServiceBean implements NewsSearchService {

    @Autowired
    private ProvenNewsRepository newsSearchRepository;

    @Autowired
    private ElasticsearchTemplate elasticsearchTemplate;

    public Page<ProvenNews> search(String query, int page) {
        SearchQuery searchQuery = new NativeSearchQueryBuilder()
                .withQuery(QueryBuilders.multiMatchQuery(query)
                        .field("title", 0.6f)    // boosting title
                        .field("newsText", 0.4f)
                        .type(MultiMatchQueryBuilder.Type.BEST_FIELDS)
                        .slop(50)
                        .fuzziness(Fuzziness.ONE)) // ~80% of misspellings are within an edit distance of 1 (Damerau-Levenshtein)
                .withPageable(new PageRequest(page, 15))
                .build();

        Page<ProvenNews> result = elasticsearchTemplate.queryForPage(searchQuery, ProvenNews.class); // newsSearchRepository.search(searchQuery);
        return result;
        // return newsSearchRepository.findByNewsTextAndTitle(query, query, new PageRequest(page, 10, Direction.DESC, "newsDate"));
    }
}
My repository:
public interface ProvenNewsRepository extends ElasticsearchCrudRepository<ProvenNews, Long> {
    public Page<ProvenNews> findByNewsTextAndTitle(String newsText, String title, Pageable page);
}
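For reference, the dual-write flow described at the top of the post (save to Postgres as the primary store, then index into ES for search) presumably looks something like the sketch below; the JPA repository name and the ingestion service are assumptions and not part of the original post:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical service illustrating the described flow: persist to Postgres (JPA),
// then index the same entity into Elasticsearch. Names are assumed for illustration.
@Service
public class NewsIngestionService {

    @Autowired
    private ProvenNewsJpaRepository jpaRepository;      // assumed JPA repository (Postgres)

    @Autowired
    private ProvenNewsRepository newsSearchRepository;  // the ES repository from the post

    @Transactional
    public ProvenNews save(ProvenNews news) {
        ProvenNews saved = jpaRepository.save(news);  // primary store
        newsSearchRepository.save(saved);             // index for search
        return saved;
    }
}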
Any advice and assistance will be appreciated.

Related

Configuration for Mvc testing

Guys, I have a Spring MVC project and I want to test CoursesController, but I cannot figure out how to do it.
Do I need to make a separate configuration class for tests?
Before Spring MVC I used a separate configuration class for tests with an embedded database.
I'd appreciate all the help I can get.
CoursesController class:
@Controller
@RequestMapping("/courses")
public class CoursesController {

    private final CourseService courseService;

    @Autowired
    public CoursesController(CourseService courseService) {
        this.courseService = courseService;
    }

    @GetMapping()
    public String index(Model model, @RequestParam("page") Optional<Integer> page,
                        @RequestParam("size") Optional<Integer> size) throws ServiceException {
        int currentPage = page.orElse(1);
        int pageSize = size.orElse(10);
        Page<Course> coursePage = courseService.findPaginated(PageRequest.of(currentPage - 1, pageSize));
        model.addAttribute("coursePage", coursePage);
        int totalPages = coursePage.getTotalPages();
        if (totalPages > 0) {
            List<Integer> pageNumbers = IntStream.rangeClosed(1, totalPages).boxed().collect(Collectors.toList());
            model.addAttribute("pageNumbers", pageNumbers);
        }
        return "courses/index";
    }
}
Configuration class:
@Configuration
@ComponentScan("com.university")
@PropertySource("classpath:/application.properties")
@EnableWebMvc
public class Config implements WebMvcConfigurer {

    @Autowired
    private Environment env;

    private final ApplicationContext applicationContext;

    @Autowired
    public Config(ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
    }

    @Bean
    public HikariDataSource dataSource() {
        return (HikariDataSource) DataSourceBuilder.create().type(HikariDataSource.class)
                .url(env.getProperty("spring.datasource.url"))
                .driverClassName(env.getProperty("spring.datasource.driverClassName"))
                .username(env.getProperty("spring.datasource.username"))
                .password(env.getProperty("spring.datasource.password")).build();
    }

    @Bean
    public JdbcTemplate jdbcTemplate() {
        return new JdbcTemplate(dataSource());
    }

    @Bean
    public SpringResourceTemplateResolver templateResolver() {
        SpringResourceTemplateResolver templateResolver = new SpringResourceTemplateResolver();
        templateResolver.setApplicationContext(applicationContext);
        templateResolver.setPrefix("/WEB-INF/views/");
        templateResolver.setSuffix(".html");
        return templateResolver;
    }

    @Bean
    public SpringTemplateEngine templateEngine() {
        SpringTemplateEngine templateEngine = new SpringTemplateEngine();
        templateEngine.setTemplateResolver(templateResolver());
        templateEngine.setEnableSpringELCompiler(true);
        return templateEngine;
    }

    @Override
    public void configureViewResolvers(ViewResolverRegistry registry) {
        ThymeleafViewResolver resolver = new ThymeleafViewResolver();
        resolver.setTemplateEngine(templateEngine());
        registry.viewResolver(resolver);
    }

    @Bean
    public SessionLocaleResolver localeResolver() {
        SessionLocaleResolver localeResolver = new SessionLocaleResolver();
        localeResolver.setDefaultLocale(Locale.ENGLISH);
        return localeResolver;
    }
}
I did make a separate configuration class for tests with an H2 database.
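A minimal sketch of how CoursesController could be tested without a database at all, using a standalone MockMvc setup and a mocked CourseService. It assumes JUnit 5, Mockito and spring-test are on the test classpath, and that CoursesController, CourseService and Course from the question are importable; everything else is hypothetical:

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.model;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.view;

import java.util.Collections;
import org.junit.jupiter.api.Test;
import org.springframework.data.domain.PageImpl;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;

public class CoursesControllerTest {

    @Test
    public void indexReturnsCoursesView() throws Exception {
        // Mock the service so no database (H2 or otherwise) is needed for this test.
        CourseService courseService = mock(CourseService.class);
        when(courseService.findPaginated(any())).thenReturn(new PageImpl<>(Collections.<Course>emptyList()));

        // Standalone setup: only the controller is bootstrapped, not the full Config class.
        MockMvc mockMvc = MockMvcBuilders.standaloneSetup(new CoursesController(courseService)).build();

        mockMvc.perform(get("/courses").param("page", "1").param("size", "10"))
               .andExpect(status().isOk())
               .andExpect(view().name("courses/index"))
               .andExpect(model().attributeExists("coursePage"));
    }
}

If the Thymeleaf view rendering itself needs to be exercised, a separate test configuration (the H2-backed one mentioned above) with @WebAppConfiguration would be the heavier alternative.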

Dynamic config Spring batch execution time and parameters via database configured parameters

I am a fresh Spring Batch user, please help me. Here is my requirement:
I have built several Spring Batch jobs with different names. I want to execute these jobs with different job parameters, and I would like those parameters to be configurable dynamically in the database, so I can add new job executions with different job names and different parameters.
I also want to schedule my job executions at different times, and the crontab expression should also be configurable.
Maybe the database structure is like:
id
task_name
spring_batch_job_name
cron_expression
I wonder if someone can guide me. Many thanks!
Here is my job setting entity:
@Entity
@Table(name = "report_tasks_manager", schema = "reconciliation", catalog = "")
public class ReportTasksManager {
    private int id;
    private String taskDesc;
    private String taskName;
    // crontab expression
    private String cronExpression;
    // class name to execute job logic
    private String methodName;
    private int state;
    private Integer conCurrent;
    private String reserved1;
    private String reserved2;
    private String reserved3;
    private Timestamp startTime;
    private Timestamp endTime;
    private Timestamp createTime;
    // getters and setters omitted
}
I defined a class which implements the Job interface, and the execute() method in this class executes the business logic, like launching a Spring Batch job:
public class QuartzJobFactory implements Job {

    public QuartzJobFactory() {
    }

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        System.out.println("time ={" + new Date() + "}");
        System.out.println("starting job build factory");
        ReportTasksManager reportTasksManager = (ReportTasksManager) jobExecutionContext.getMergedJobDataMap().get("scheduleJob");
        System.out.println("job name = {" + reportTasksManager.getTaskName() + "}");
    }
}
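A minimal sketch of what actually launching a Spring Batch job from such a Quartz execute() method could look like. It assumes a JobLauncher and a Spring Batch Job bean exist in the context, and that the Quartz scheduler is configured with a Spring-aware job factory so @Autowired works inside the job; the bean name and parameter keys are assumptions:

import java.util.Date;

import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;

public class BatchLaunchingQuartzJob implements Job {

    @Autowired
    private JobLauncher jobLauncher;                       // Spring Batch launcher

    @Autowired
    private org.springframework.batch.core.Job reportJob;  // the batch job to run (assumed bean)

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        ReportTasksManager settings =
                (ReportTasksManager) context.getMergedJobDataMap().get("scheduleJob");
        try {
            // Pass the database-configured values on as job parameters;
            // the timestamp keeps each run's JobInstance unique.
            JobParameters params = new JobParametersBuilder()
                    .addString("taskName", settings.getTaskName())
                    .addDate("runDate", new Date())
                    .toJobParameters();
            jobLauncher.run(reportJob, params);
        } catch (Exception e) {
            throw new JobExecutionException(e);
        }
    }
}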
For registering a cron trigger, I defined a REST controller that reads the job parameters from the database and configures the scheduler:
@RestController
@RequestMapping(path = "test")
public class QuartzManager {

    private SchedulerFactory schedulerFactory = new StdSchedulerFactory();

    @Autowired
    private ReportTaskManagerDAO reportTaskManagerDAO;

    @GetMapping(value = "schedule")
    public void scheduleJob() {
        // Read settings from database
        List<ReportTasksManager> quartzList = reportTaskManagerDAO.findAll();
        if (quartzList.size() > 0) {
            quartzList.forEach(reportTasksManager -> {
                try {
                    configQuartz(reportTasksManager, schedulerFactory.getScheduler());
                } catch (SchedulerException | ClassNotFoundException e) {
                    e.printStackTrace();
                }
            });
        }
    }

    @SuppressWarnings("unchecked")
    private void configQuartz(ReportTasksManager reportTasksManager, Scheduler scheduler) throws SchedulerException, ClassNotFoundException {
        TriggerKey triggerKey = TriggerKey.triggerKey(reportTasksManager.getTaskName(), Scheduler.DEFAULT_GROUP);
        // check whether the trigger is already defined in the scheduler
        CronTrigger trigger = (CronTrigger) scheduler.getTrigger(triggerKey);
        if (null == trigger) {
            // not defined yet: create a new trigger and jobDetail
            JobDetail jobDetail =
                    JobBuilder.newJob((Class<? extends Job>) Class.forName(reportTasksManager.getMethodName()))
                            .withIdentity(reportTasksManager.getTaskName(), Scheduler.DEFAULT_GROUP)
                            .build();
            jobDetail.getJobDataMap().put("scheduleJob", reportTasksManager);
            CronScheduleBuilder scheduleBuilder = CronScheduleBuilder.cronSchedule(reportTasksManager.getCronExpression());
            trigger = TriggerBuilder.newTrigger()
                    .withIdentity(reportTasksManager.getTaskName(), Scheduler.DEFAULT_GROUP)
                    .withSchedule(scheduleBuilder)
                    .build();
            scheduler.scheduleJob(jobDetail, trigger);
            scheduler.start();
        } else {
            // already defined: update the existing trigger
            CronScheduleBuilder scheduleBuilder = CronScheduleBuilder.cronSchedule(reportTasksManager.getCronExpression());
            trigger = trigger.getTriggerBuilder()
                    .withIdentity(triggerKey)
                    .withSchedule(scheduleBuilder)
                    .build();
            scheduler.rescheduleJob(triggerKey, trigger);
        }
    }
}
You can create a util class (initialized via @PostConstruct) which loads your job config from the DB.
For example:
@Entity
public class Configuration {
    @Id
    private long id;
    private String field;
    private String value;
    // getters and setters
}

@Component
public interface ConfigurationRepo extends JpaRepository<Configuration, Long> {
}
public final class ConfigurationUtil {

    private ConfigurationUtil() {
    }

    private static List<Configuration> defaultConfiguration;

    /**
     * @return the defaultConfiguration
     */
    public static List<Configuration> getDefaultConfiguration() {
        return defaultConfiguration;
    }

    /**
     * @param defaultConfiguration the defaultConfiguration to set
     */
    public static void setDefaultConfiguration(List<Configuration> defaultConfiguration) {
        ConfigurationUtil.defaultConfiguration = defaultConfiguration;
    }

    public static String getValueByField(String field) {
        return defaultConfiguration.stream()
                .filter(s -> s.getField()
                        .equalsIgnoreCase(field))
                .findFirst()
                .get()
                .getValue();
    }
}
@Component
public class ConfigurationContextInitializer {

    @Autowired
    ConfigurationRepo configurationRepo;

    @PostConstruct
    public void init() {
        ConfigurationUtil.setDefaultConfiguration(configurationRepo.findAll());
    }
}
// To access a DB value:
ConfigurationUtil.getValueByField("JOB_NAME"); // depends on your DB key
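Building on that, a minimal sketch of how the loaded values could drive the Quartz registration described in the question. The configuration keys are hypothetical, a Quartz Scheduler bean is assumed to be available (e.g. from a SchedulerFactoryBean), and ordering between the two @PostConstruct initializers is glossed over here:

import javax.annotation.PostConstruct;

import org.quartz.CronScheduleBuilder;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class DynamicJobScheduler {

    @Autowired
    private Scheduler scheduler; // e.g. the scheduler exposed by a SchedulerFactoryBean

    @PostConstruct
    public void scheduleFromDb() throws SchedulerException {
        // Hypothetical keys; they depend on what is stored in the configuration table.
        String jobName = ConfigurationUtil.getValueByField("JOB_NAME");
        String cron = ConfigurationUtil.getValueByField("CRON_EXPRESSION");

        JobDetail jobDetail = JobBuilder.newJob(QuartzJobFactory.class)
                .withIdentity(jobName, Scheduler.DEFAULT_GROUP)
                .build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity(jobName, Scheduler.DEFAULT_GROUP)
                .withSchedule(CronScheduleBuilder.cronSchedule(cron))
                .build();

        scheduler.scheduleJob(jobDetail, trigger);
    }
}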

Multi-Tenancy in Reactive Spring boot application using mongodb-reactive

How can we create a multi-tenant application in Spring WebFlux using the reactive MongoDB repository?
I cannot find any complete resources on the web for reactive applications; all the resources available are for non-reactive applications.
UPDATE:
In a non-reactive application we used to store contextual data in a ThreadLocal, but this cannot be done in reactive applications because of thread switching. There is a way to store contextual info in the Reactor Context inside a WebFilter, but I don't know how to get hold of that data in the ReactiveMongoDatabaseFactory class.
Thanks.
I was able to implement multi-tenancy in a Spring reactive application using MongoDB. The main classes responsible for realizing it were: a custom MongoDbFactory class, a WebFilter class (instead of a servlet Filter) for capturing tenant info, and a ThreadLocal holder class for storing the tenant info. The flow is very simple:
Capture the tenant-related info from the request in the WebFilter and set it in the ThreadLocal. Here I am sending the tenant info using the header X-Tenant.
Implement a custom MongoDbFactory class and override the getMongoDatabase() method to return the database based on the current tenant available in the ThreadLocal class.
Source code is:
CurrentTenantHolder.java
package com.jazasoft.demo;

public class CurrentTenantHolder {

    private static final ThreadLocal<String> currentTenant = new InheritableThreadLocal<>();

    public static String get() {
        return currentTenant.get();
    }

    public static void set(String tenant) {
        currentTenant.set(tenant);
    }

    public static String remove() {
        synchronized (currentTenant) {
            String tenant = currentTenant.get();
            currentTenant.remove();
            return tenant;
        }
    }
}
TenantContextWebFilter.java
package com.example.demo;

import org.springframework.http.server.reactive.ServerHttpRequest;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;

@Component
public class TenantContextWebFilter implements WebFilter {

    public static final String TENANT_HTTP_HEADER = "X-Tenant";

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        ServerHttpRequest request = exchange.getRequest();
        if (request.getHeaders().containsKey(TENANT_HTTP_HEADER)) {
            String tenant = request.getHeaders().getFirst(TENANT_HTTP_HEADER);
            CurrentTenantHolder.set(tenant);
        }
        return chain.filter(exchange).doOnSuccessOrError((Void v, Throwable throwable) -> CurrentTenantHolder.remove());
    }
}
MultiTenantMongoDbFactory.java
package com.example.demo;

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoDatabase;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;

public class MultiTenantMongoDbFactory extends SimpleReactiveMongoDatabaseFactory {

    private final String defaultDatabase;

    public MultiTenantMongoDbFactory(MongoClient mongoClient, String databaseName) {
        super(mongoClient, databaseName);
        this.defaultDatabase = databaseName;
    }

    @Override
    public MongoDatabase getMongoDatabase() throws DataAccessException {
        final String tlName = CurrentTenantHolder.get();
        final String dbToUse = (tlName != null ? tlName : this.defaultDatabase);
        return super.getMongoDatabase(dbToUse);
    }
}
MongoDbConfig.java
package com.example.demo;

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.ReactiveMongoClientFactoryBean;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;

@Configuration
public class MongoDbConfig {

    @Bean
    public ReactiveMongoTemplate reactiveMongoTemplate(MultiTenantMongoDbFactory multiTenantMongoDbFactory) {
        return new ReactiveMongoTemplate(multiTenantMongoDbFactory);
    }

    @Bean
    public MultiTenantMongoDbFactory multiTenantMongoDbFactory(MongoClient mongoClient) {
        return new MultiTenantMongoDbFactory(mongoClient, "test1");
    }

    @Bean
    public ReactiveMongoClientFactoryBean mongoClient() {
        ReactiveMongoClientFactoryBean clientFactory = new ReactiveMongoClientFactoryBean();
        clientFactory.setHost("localhost");
        return clientFactory;
    }
}
UPDATE:
In reactive streams we cannot store contextual information in a ThreadLocal any more, as the request is not tied to a single thread, so this is not the correct solution.
However, contextual information can be stored in the Reactor Context in the WebFilter like this: chain.filter(exchange).subscriberContext(context -> context.put("tenant", tenant));. The problem is how to get hold of this contextual info in the ReactiveMongoDatabaseFactory implementation class.
Here is my very rough working solution for Spring WebFlux. They have since updated ReactiveMongoDatabaseFactory so that getMongoDatabase returns a Mono.
Create the web filter:
public class TenantContextFilter implements WebFilter {

    private static final Logger LOGGER = LoggerFactory.getLogger(TenantContextFilter.class);

    @Override
    public Mono<Void> filter(ServerWebExchange swe, WebFilterChain wfc) {
        ServerHttpRequest request = swe.getRequest();
        HttpHeaders headers = request.getHeaders();
        if (headers.getFirst("X-TENANT-ID") == null) {
            LOGGER.info(String.format("Missing X-TENANT-ID header"));
            throw new ResponseStatusException(HttpStatus.UNAUTHORIZED);
        }
        String tenantId = headers.getFirst("X-TENANT-ID");
        LOGGER.info(String.format("Processing request with tenant identifier [%s]", tenantId));
        return wfc.filter(swe)
                .contextWrite(TenantContextHolder.setTenantId(tenantId));
    }
}
Create a class to get the context (credit to wherever I found this):
public class TenantContextHolder {

    public static final String TENANT_ID = TenantContextHolder.class.getName() + ".TENANT_ID";

    public static Context setTenantId(String id) {
        return Context.of(TENANT_ID, Mono.just(id));
    }

    public static Mono<String> getTenantId() {
        return Mono.deferContextual(contextView -> {
            if (contextView.hasKey(TENANT_ID)) {
                return contextView.get(TENANT_ID);
            }
            return Mono.empty();
        });
    }

    public static Function<Context, Context> clearContext() {
        return (context) -> context.delete(TENANT_ID);
    }
}
My spring security setup (all requests allowed for testing)
@EnableWebFluxSecurity
@EnableReactiveMethodSecurity
public class SecurityConfig {

    @Bean
    public SecurityWebFilterChain webFilterChain(ServerHttpSecurity http) {
        return http
                .formLogin(it -> it.disable())
                .cors(it -> it.disable()) // fix this
                .httpBasic(it -> it.disable())
                .csrf(it -> it.disable())
                .securityContextRepository(NoOpServerSecurityContextRepository.getInstance())
                .authorizeExchange(it -> it.anyExchange().permitAll()) // allow anonymous
                .addFilterAt(new TenantContextFilter(), SecurityWebFiltersOrder.HTTP_BASIC)
                .build();
    }
}
Create Tenant Mongo DB Factory
I still have some clean-up work for defaults etc...
public class MultiTenantMongoDBFactory extends SimpleReactiveMongoDatabaseFactory {

    private static final Logger LOGGER = LoggerFactory.getLogger(MultiTenantMongoDBFactory.class);

    private final String defaultDb;

    public MultiTenantMongoDBFactory(MongoClient mongoClient, String databaseName) {
        super(mongoClient, databaseName);
        this.defaultDb = databaseName;
    }

    @Override
    public Mono<MongoDatabase> getMongoDatabase() throws DataAccessException {
        return TenantContextHolder.getTenantId()
                .map(id -> {
                    LOGGER.info(String.format("Database being retrieved is [%s]", id));
                    return super.getMongoDatabase(id);
                })
                .flatMap(db -> db)
                .log();
    }
}
Configuration Class
@Configuration
@EnableReactiveMongoAuditing
@EnableReactiveMongoRepositories(basePackages = {"com.order.repository"})
class MongoDbConfiguration {

    @Bean
    public ReactiveMongoDatabaseFactory reactiveMongoDatabaseFactory() {
        return new MultiTenantMongoDBFactory(MongoClients.create("mongodb://user:password@localhost:27017"), "tenant_catalog");
    }

    @Bean
    public ReactiveMongoTemplate reactiveMongoTemplate() {
        ReactiveMongoTemplate template = new ReactiveMongoTemplate(reactiveMongoDatabaseFactory());
        template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
        return template;
    }
}
Entity Class
@Document(collection = "order")
// getters
// setters
Testing
Create two Mongo databases with the same collection, and put different documents in each.
In Postman I just did a GET request with the "X-TENANT-ID" header and the database name as the value (e.g. tenant-12343 or tenant-34383) and good to go!
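The same check can be scripted; here is a minimal sketch using WebTestClient against a locally running instance (the /orders endpoint and tenant name are hypothetical, matching the style of the Postman test above; assumes spring-test on the classpath):

import org.junit.jupiter.api.Test;
import org.springframework.test.web.reactive.server.WebTestClient;

public class TenantRoutingSmokeTest {

    // Points at a locally running instance of the application.
    private final WebTestClient client = WebTestClient.bindToServer()
            .baseUrl("http://localhost:8080")
            .build();

    @Test
    public void routesRequestToTenantDatabase() {
        client.get()
              .uri("/orders")                        // hypothetical endpoint backed by the "order" collection
              .header("X-TENANT-ID", "tenant-12343") // selects which MongoDB database is used
              .exchange()
              .expectStatus().isOk();
    }
}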

Spring Data MongoDB Converter not getting registered

I have a setup with multiple MongoDB configurations. Here is the configuration class:
@Configuration
@RequiredArgsConstructor
@EnableConfigurationProperties(MongoConfigProperties.class)
public class MultipleMongoConfig {

    private static final Logger logger = LoggerFactory.getLogger(MultipleMongoConfig.class);

    private final MongoConfigProperties mongoProperties;

    @Primary
    @Bean(name = "sysdiagMongoTemplate")
    public MongoOperations sysdiagMongoTemplate() {
        MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(new DefaultDbRefResolver(sysdiagFactory(mongoProperties.getSysdiag())),
                new MongoMappingContext());
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new AggregationResultReadConverter());
        mappingMongoConverter.setCustomConversions(new CustomConversions(CustomConversions.StoreConversions.NONE, converters));
        mappingMongoConverter.afterPropertiesSet();
        boolean canConvert = mappingMongoConverter.getConversionService().canConvert(Document.class, AggregationResult.class);
        mappingMongoConverter.afterPropertiesSet();
        logger.info("canConvertFromDocumentToAggResult:: " + canConvert); // gives TRUE
        return new MongoTemplate(sysdiagFactory(this.mongoProperties.getSysdiag()), mappingMongoConverter);
    }

    @Bean(name = "monitoringMongoTemplate")
    public MongoOperations monitoringMongoTemplate() {
        return new MongoTemplate(monitoringFactory(this.mongoProperties.getMonitoring()));
    }

    public MongoDbFactory sysdiagFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClient(mongo.getHost(), mongo.getPort()),
                mongo.getDatabase());
    }

    public MongoDbFactory monitoringFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClient(mongo.getHost(), mongo.getPort()),
                mongo.getDatabase());
    }
}
Here is the read converter class (I only require reading from MongoDB). We have dynamic keys in the document, which is why I need to convert them into a Map:
public class AggregationResultReadConverter implements Converter<Document, AggregationResult> {

    @Override
    public AggregationResult convert(Document source) {
        AggregationResult aggregationResult = new AggregationResult();
        aggregationResult.setData(new HashMap());
        for (Map.Entry<String, Object> entry : source.entrySet()) {
            if (entry.getKey().matches("[A-Z][A-Z][A-Z]")) {
                aggregationResult.getData().put(entry.getKey(), entry.getValue());
            }
        }
        return aggregationResult;
    }
}
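For context, the target entity is not shown in the question; based on the converter and repository it presumably looks roughly like the sketch below (field and collection names are assumptions, not the actual class):

import java.util.Map;

import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// Hypothetical shape of the entity the converter populates; the real class is not in the question.
@Document(collection = "aggregation_results")
public class AggregationResult {

    @Id
    private ObjectId id;

    // Holds the dynamically keyed values (e.g. the three-letter keys matched by the converter).
    private Map<String, Object> data;

    public Map<String, Object> getData() {
        return data;
    }

    public void setData(Map<String, Object> data) {
        this.data = data;
    }
}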
Here is the mapping configuration for one of the MongoDB databases:
@Configuration
@EnableMongoRepositories(basePackages = {"com.hns.services.restapi.db.mongo.sysdiag.entity", "com.hns.services.restapi.db.mongo.sysdiag.repo"}, mongoTemplateRef = "sysdiagMongoTemplate")
public class SysdiagMongoConfig {
}
And here is the repository interface
@Repository
public interface AggregationResultRepository extends MongoRepository<AggregationResult, ObjectId> {

    @Query("{ TIME: {$gte : ?0, $lt: ?1}}")
    List<AggregationResult> findInTimeRange(Long startTime, Long endTime);
}
When I query using AggregationResultRepository, I expect the converter code to be executed so that I can convert the fields and put them in the entity (document) class object as per the logic. The query runs fine, as I can see in the debug logs, and I get output, but the converter is not getting called.
The converter is getting registered with the Mongo template, as the canConvertFromDocumentToAggResult logger gives TRUE. I tried changing the converter from Document -> AggregationResult to DBObject -> AggregationResult, but no luck. Not sure what I am missing here.

How to run update query in Spring JPA for quartz job

I have a Quartz job in Spring 4 and I am using JPA (Hibernate) to update database values through the Quartz job, but I am getting javax.persistence.TransactionRequiredException: Executing an update/delete query.
I don't understand what kind of configuration is missing in the Quartz job. I referred to the SpringBeanAutowiringSupport example; the update is still failing, but the select works fine.
Below is my code
@Configuration
@ComponentScan("com.stock")
public class QuartzConfiguration {

    @Autowired
    private ApplicationContext applicationContext;

    @Bean
    public JobDetailFactoryBean jobDetailBalanceCarryForward() {
        JobDetailFactoryBean factory = new JobDetailFactoryBean();
        factory.setJobClass(BillingCroneSvcImpl.class);
        Map<String, Object> map = new HashMap<String, Object>();
        map.put("task", "balanceCarryForward");
        factory.setJobDataAsMap(map);
        factory.setGroup("BalanceCarryForwardJob");
        factory.setName("balance carry forward");
        return factory;
    }

    @Bean
    public CronTriggerFactoryBean cronTriggerBalanceCarryForward() {
        CronTriggerFactoryBean stFactory = new CronTriggerFactoryBean();
        stFactory.setJobDetail(jobDetailBalanceCarryForward().getObject());
        stFactory.setStartDelay(3000);
        stFactory.setName("balancCarryForwardTrigger");
        stFactory.setGroup("balanceCarryForwardgroup");
        stFactory.setCronExpression("0 0/1 * 1/1 * ? *");
        return stFactory;
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        return jobFactory;
    }

    @Bean
    public SchedulerFactoryBean schedulerFactoryBean() {
        SchedulerFactoryBean schedulerFactory = new SchedulerFactoryBean();
        schedulerFactory.setJobFactory(springBeanJobFactory());
        schedulerFactory.setTriggers(cronTriggerBalanceCarryForward().getObject());
        return schedulerFactory;
    }
}
Below is the class where the Quartz executeInternal method is written:
@Service
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class BillingCroneSvcImpl extends QuartzJobBean implements BillingCroneSvc {

    @Autowired
    private BillingCroneRepo billingCroneRepo;

    @Override
    @Transactional
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(context);
        billingCroneRepo.updateBalance();
        // this method throws javax.persistence.TransactionRequiredException: Executing an update/delete query
    }
}
App config class
@EnableWebMvc
@EnableTransactionManagement
@Configuration
@ComponentScan({ "com.stock.*" })
@Import({ SecurityConfig.class })
@PropertySource("classpath:jdbc.properties")
public class AppConfig extends WebMvcConfigurerAdapter {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";
    private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
    private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
    private static final String PROPERTY_NAME_DATABASE_USERNAME = "db.username";
    private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
    private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
    private static final String PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN = "entitymanager.packages.to.scan";

    @Resource
    private Environment env;

    @Bean(name = "dataSource")
    public DriverManagerDataSource dataSource() {
        DriverManagerDataSource driverManagerDataSource = new DriverManagerDataSource();
        driverManagerDataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        driverManagerDataSource.setUrl(env.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
        driverManagerDataSource.setUsername(env.getRequiredProperty(PROPERTY_NAME_DATABASE_USERNAME));
        driverManagerDataSource.setPassword(env.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
        return driverManagerDataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(dataSource());
        entityManagerFactoryBean.setPersistenceProviderClass(HibernatePersistence.class);
        entityManagerFactoryBean.setPackagesToScan(env.getRequiredProperty(PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN));
        entityManagerFactoryBean.setJpaProperties(hibProperties());
        return entityManagerFactoryBean;
    }

    private Properties hibProperties() {
        Properties properties = new Properties();
        properties.put(PROPERTY_NAME_HIBERNATE_DIALECT, env.getRequiredProperty(PROPERTY_NAME_HIBERNATE_DIALECT));
        properties.put(PROPERTY_NAME_HIBERNATE_SHOW_SQL, env.getRequiredProperty(PROPERTY_NAME_HIBERNATE_SHOW_SQL));
        return properties;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactory().getObject());
        return transactionManager;
    }

    @Bean
    public ReloadableResourceBundleMessageSource messageSource() {
        ReloadableResourceBundleMessageSource messageSource = new ReloadableResourceBundleMessageSource();
        String[] resources = { "classpath:messages" };
        messageSource.setBasenames(resources);
        return messageSource;
    }

    @Bean
    public LocaleResolver localeResolver() {
        final CookieLocaleResolver ret = new CookieLocaleResolver();
        ret.setDefaultLocale(new Locale("en_IN"));
        return ret;
    }

    @Bean
    public LocaleChangeInterceptor localeChangeInterceptor() {
        LocaleChangeInterceptor localeChangeInterceptor = new LocaleChangeInterceptor();
        localeChangeInterceptor.setParamName("language");
        return localeChangeInterceptor;
    }

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/Angular/**").addResourceLocations("/Angular/");
        registry.addResourceHandler("/css/**").addResourceLocations("/css/");
        registry.addResourceHandler("/email_templates/**").addResourceLocations("/email_templates/");
        registry.addResourceHandler("/fonts/**").addResourceLocations("/fonts/");
        registry.addResourceHandler("/img/**").addResourceLocations("/img/");
        registry.addResourceHandler("/js/**").addResourceLocations("/js/");
        registry.addResourceHandler("/Landing_page/**").addResourceLocations("/Landing_page/");
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer properties() {
        PropertySourcesPlaceholderConfigurer pspc = new PropertySourcesPlaceholderConfigurer();
        org.springframework.core.io.Resource[] resources = new ClassPathResource[] { new ClassPathResource("application.properties") };
        pspc.setLocations(resources);
        pspc.setIgnoreUnresolvablePlaceholders(true);
        return pspc;
    }

    @Bean
    public InternalResourceViewResolver viewResolver() {
        InternalResourceViewResolver viewResolver = new InternalResourceViewResolver();
        viewResolver.setViewClass(JstlView.class);
        viewResolver.setPrefix("/WEB-INF/pages/");
        viewResolver.setSuffix(".jsp");
        return viewResolver;
    }

    // through the bean below we read the properties file directly in JSP files
    @Bean(name = "propertyConfigurer")
    public PropertiesFactoryBean mapper() {
        PropertiesFactoryBean bean = new PropertiesFactoryBean();
        bean.setLocation(new ClassPathResource("application.properties"));
        return bean;
    }
}
Can anybody please assist me in resolving this transactional issue in Spring JPA with Quartz?
Thank you all for your help. Finally I autowired EntityManagerFactory instead of the persistence-context EntityManager and it is working fine. I tried every scenario, but nothing worked to get Spring-managed transactions into the Quartz job, so I finally autowired the EntityManagerFactory.
Below is my repo class code.
@Repository
public class BillingCroneRepoImpl implements BillingCroneRepo {

    /*@PersistenceContext
    private EntityManager entityManager;*/

    @Autowired
    EntityManagerFactory entityManagerFactory;

    public boolean updateTable() {
        EntityManager entityManager = entityManagerFactory.createEntityManager();
        EntityTransaction entityTransaction = entityManager.getTransaction();
        entityTransaction.begin(); // this will go in try catch
        Query query = entityManager.createQuery(updateSql);
        // update table code goes here
        entityTransaction.commit(); // this will go in try catch
        return true;
    }
}
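A minimal sketch of how the "this will go in try catch" comments above might be filled in, with rollback and cleanup; the JPQL string is a placeholder, since the real update statement is not shown in the post:

public boolean updateTable() {
    EntityManager entityManager = entityManagerFactory.createEntityManager();
    EntityTransaction entityTransaction = entityManager.getTransaction();
    try {
        entityTransaction.begin();
        // Placeholder JPQL; the actual update statement is not part of the original post.
        Query query = entityManager.createQuery("UPDATE Balance b SET b.amount = b.amount");
        query.executeUpdate();
        entityTransaction.commit();
        return true;
    } catch (RuntimeException e) {
        if (entityTransaction.isActive()) {
            entityTransaction.rollback(); // undo the partial work on failure
        }
        throw e;
    } finally {
        entityManager.close(); // always release the EntityManager created from the factory
    }
}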
I'm not a Spring specialist, but I think new ... doesn't work with @Transactional:
@Service
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class BillingCroneSvcImpl extends QuartzJobBean implements BillingCroneSvc {

    @Autowired
    BillingCroneRepo billingCroneRepo;

    @Override
    @Transactional
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(context);
        billingCroneRepo.updateBalance();
    }
}
It's because Quartz is using the bean itself instead of the proxy generated for @Transactional.
Use either MethodInvokingJobDetailFactoryBean (instead of inheriting from QuartzJobBean), or use a dedicated wrapper Quartz bean (inheriting from QuartzJobBean) that calls the Spring bean (not inheriting from QuartzJobBean) carrying the @Transactional annotation.
EDIT: this is in fact not the problem.
The problem is here:
JobDetailFactoryBean factory = new JobDetailFactoryBean();
factory.setJobClass(BillingCroneSvcImpl.class);
By passing the class, I presume that Quartz will instantiate it itself, so Spring won't create it and won't wrap the bean in a proxy that handles the @Transactional behaviour.
Instead you must use something along the lines of:
@Bean(name = "billingCroneSvc")
public BillingCroneSvc getSvc() {
    return new BillingCroneSvcImpl();
}

@Bean
public JobDetailFactoryBean jobDetailBalanceCarryForward() {
    JobDetailFactoryBean factory = new JobDetailFactoryBean();
    getSvc(); // just make sure the bean is instantiated
    factory.setBeanName("billingCroneSvc");
    ...
}