Spring Batch metadata tables in a Mongo database - mongodb

I tried to store the Spring Batch metadata tables in a Mongo database, but it isn't working correctly. I referred to and used the GitHub project mentioned below to configure the JobRepository to store job data in MongoDB. That GitHub project was last updated three years ago and looks discontinued.
https://github.com/vfouzdar/springbatch-mongoDao
https://jbaruch.wordpress.com/2010/04/27/integrating-mongodb-with-spring-batch/
Currently my application uses in-memory tables for Spring Batch, and the functional part is done. But I want the job data to be stored in MongoDB.
I have already used MySQL for Spring Batch job data, but in the current application I don't want MySQL.
If anybody has any other solution/link which can help me, please share.

Related

Spring Batch and Azure Cosmos DB

I am planning to run Spring Batch on Azure as a serverless workload and am looking to explore Cosmos DB to store and manipulate the data.
Can I still use the Spring Batch metadata tables with Cosmos DB? If not, where should the Spring Batch metadata be stored?
How can we schedule a batch job on Azure? Is there any complete working example?
Cosmos DB is not a supported database in Spring Batch, but you may be able to use one of the supported types if the SQL variant is close enough.
Please refer to the Non-standard Database Types in a Repository section of the documentation for more details and a sample.
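A minimal sketch of that "non-standard database type" approach: configure the `JobRepositoryFactoryBean` yourself and force a supported database type instead of relying on auto-detection. The choice of `"POSTGRES"` here is an assumption for illustration; pick whichever supported type's SQL variant is closest to yours, as the documentation section describes.

```java
import javax.sql.DataSource;

import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class ForcedTypeRepositoryConfig {

    @Bean
    public JobRepository jobRepository(DataSource dataSource,
                                       PlatformTransactionManager transactionManager) throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);           // DataSource pointing at your database
        factory.setTransactionManager(transactionManager);
        factory.setDatabaseType("POSTGRES");         // force a supported type instead of auto-detection
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}
```

Whether this actually works against Cosmos DB depends on how compatible its SQL dialect is with the DDL/DML Spring Batch generates for the type you pick.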

MongoDB database is getting reset randomly

I have created an instance of MongoDB on an AWS EC2 instance.
I have two Spring Boot microservices connected to this database: one only inserts data and the other only fetches it. My microservices contain no delete operations, and no such code exists anywhere, so a scenario where data gets deleted by mistake cannot occur.
**But somehow the MongoDB database is getting reset/cleared. The whole database is getting deleted.**
I have checked the MongoDB configuration and I haven't explicitly changed anything.
MongoDB is the Community Edition.
In the Spring Boot microservices, I am directly using a Spring Data MongoRepository to insert and fetch the data.
I don't have any constraints in the POJOs.
Can someone point out what might be the issue? Does MongoDB have any default setting that resets the database?

Using Cosmos DB for the Spring Batch job repository

Is it possible to use Cosmos DB as a job repository for Spring Batch?
If that is not possible, can we go with an in-memory DB to handle our Spring Batch jobs?
The job itself is triggered on message arrival in a remote queue. We use a variation of the process-indicator pattern in our current Spring Batch job to keep track of the "chunks" being processed. Our save-state attributes are also disabled. The reader always uses a DB query to avoid picking up the same chunks and to prevent duplicate processing.
We don't commit the message on the queue until all records for that job are processed. So if the node dies and comes back up in the middle of processing, the same message is redelivered, which takes care of job restarts. Given all this, we have a choice of either coming up with a way to implement a Cosmos job repository, or simply using an in-memory repository and plugging in an "afterJob" listener to clean up the in-memory job data, to ensure Java memory is not exhausted in production. Any recommendations?
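The "afterJob" cleanup idea above could be sketched roughly like this, assuming the Map-based (in-memory) job repository from `MapJobRepositoryFactoryBean`, whose `clearRepository()` method wipes the accumulated metadata:

```java
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;

// Registered on the job as a listener; clears the in-memory job metadata
// after each run so the map-backed repository does not grow unbounded.
public class InMemoryCleanupListener implements JobExecutionListener {

    private final MapJobRepositoryFactoryBean repositoryFactory;

    public InMemoryCleanupListener(MapJobRepositoryFactoryBean repositoryFactory) {
        this.repositoryFactory = repositoryFactory;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // nothing to do before the job
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        // Drop all in-memory job/step execution data once the job finishes.
        repositoryFactory.clearRepository();
    }
}
```

Note that clearing after every job also removes restart history, which is acceptable here only because queue redelivery (as described above) already handles restarts.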
Wanted to share that Azure Cosmos DB just released v3 of the Spring Data connector for the SQL API:
The Spring on Azure team, in partnership with the Azure Cosmos DB team, is proud to have just made Spring Data Azure Cosmos DB v3 generally available. This is the latest version of Azure Cosmos DB's SQL API Spring Data connector.
Also, Spring.io has an example microservices solution (Spring Cloud Data Flow) based on batch that could be used as an example for your solution.
Additional Information:
Spring Data Azure Cosmos DB v3 for Core (SQL) API: Release notes and resources (link)
A well-written third-party blog that is super helpful:
Introduction to Spring Data Azure Cosmos DB (link)

How can the Spring Cloud Data Flow server use new tables (with a custom prefix) created for Spring Batch and Spring Cloud Task?

I have created the Spring Cloud Task tables (i.e. TASK_EXECUTION, TASK_TASK_BATCH) with the prefix MYTASK_, and the Spring Batch tables with the prefix MYBATCH_, in an Oracle database.
There are also default-prefix tables in the same schema, which were created automatically or by another teammate.
I have bound my Oracle database service to the SCDF server deployed on PCF.
How can I tell my Spring Cloud Data Flow server to use the tables created with my prefix to render data on the Data Flow server dashboard?
Currently, the SCDF dashboard uses the tables with the default prefix to render data. That works fine, but I want my tables to be used to render the SCDF dashboard screens.
I am using Data Flow server version 1.7.3 and deployed it on PCF using manifest.yml.
There's an open story to add this enhancement via spring-cloud/spring-cloud-dataflow#2048.
Feel free to consider contributing or share use-case details in the issue.
Currently, spring-cloud-dataflow and spring-cloud-skipper use Flyway to manage database schemas, and it's not possible to prefix table names. Trying to support this would add too much complexity, and I'm not even sure it would be possible.
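For the task/batch applications themselves (as opposed to the SCDF server, which as noted above cannot be pointed at prefixed tables), custom prefixes like the ones in the question are typically set with properties along these lines. The property names assume Spring Cloud Task's `tablePrefix` setting and Spring Boot's batch table-prefix support for that era; verify against your Boot version:

```properties
# Spring Cloud Task metadata tables (TASK_EXECUTION, ...) -> MYTASK_ prefix
spring.cloud.task.table-prefix=MYTASK_

# Spring Batch metadata tables (BATCH_JOB_EXECUTION, ...) -> MYBATCH_ prefix
spring.batch.table-prefix=MYBATCH_
```

This only changes where the applications write their metadata; the SCDF dashboard will still read the default-prefix tables.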

Spring Boot app with Redis insert data on startup

I am working on a Spring Boot app that uses Redis as the primary data store. Is there a way to insert data into the Redis DB on application startup, the way we do it for MongoDB using a Mongeez script?
Thanks in advance.
Many years ago, I used an ApplicationRunner to do something similar.
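A minimal sketch of that ApplicationRunner approach, assuming `spring-boot-starter-data-redis` is on the classpath (so a `StringRedisTemplate` is auto-configured) and using an invented example key/value to seed:

```java
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.stereotype.Component;

// Runs once on application startup, after the context is fully initialized --
// the rough Redis equivalent of seeding MongoDB with a Mongeez script.
@Component
public class RedisSeedRunner implements ApplicationRunner {

    private final StringRedisTemplate redisTemplate;

    public RedisSeedRunner(StringRedisTemplate redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    @Override
    public void run(ApplicationArguments args) {
        // setIfAbsent only writes when the key is missing,
        // so restarting the app doesn't overwrite existing data.
        redisTemplate.opsForValue().setIfAbsent("config:featureFlag", "enabled");
    }
}
```

For larger seed sets, the same runner could read a JSON/CSV file from the classpath and loop over the entries instead of hard-coding values.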