I am working on a Spring Boot app that uses Redis as the primary data store. Is there a way to insert data into the Redis db on application startup the way we do it for MongoDB using a Mongeez script?
Thanks in advance.
Many years ago, I used an ApplicationRunner to do something similar. It runs once after the application context has started, so it's a natural place to seed data.
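As far as I know there is no direct Mongeez equivalent for Redis, but an `ApplicationRunner` bean covers the seeding use-case. A minimal sketch, assuming `spring-boot-starter-data-redis` is on the classpath; the class name, keys, and values below are illustrative only:

```java
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.stereotype.Component;

@Component
public class RedisSeedRunner implements ApplicationRunner {

    private final StringRedisTemplate redisTemplate;

    public RedisSeedRunner(StringRedisTemplate redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    @Override
    public void run(ApplicationArguments args) {
        // Runs once after the application context has started.
        // setIfAbsent keeps the seeding idempotent across restarts,
        // so existing values are never overwritten.
        redisTemplate.opsForValue().setIfAbsent("config:featureFlag", "enabled");
        redisTemplate.opsForValue().setIfAbsent("config:maxItems", "100");
    }
}
```

If you need ordering relative to other startup beans, `@Order` works on `ApplicationRunner` beans as well.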
I have created an instance of MongoDB on an AWS EC2 instance.
I have two Spring Boot microservices connected to this database: one only inserts data and the other fetches it. My microservices have no delete operation, and no such code is deployed at any time, so the scenario of data getting deleted by mistake cannot occur.
**But somehow the MongoDB database is getting reset/cleared. The entire database is getting deleted.**
I have checked the MongoDB configuration and I haven't explicitly changed anything.
MongoDB is the Community edition.
In the Spring Boot microservices, I am directly using MongoRepository (Spring Data) to insert and fetch the data.
I don't have any constraints in the POJOs.
Can someone point out what the issue might be? Does MongoDB have any default setting for resetting the database?
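MongoDB has no default setting that resets or clears a database. A common cause of this exact symptom on cloud VMs is an instance bound to a public interface with access control disabled: automated scripts scan for open port 27017 and drop the databases. A minimal sketch of locking this down in `mongod.conf` (the bind address shown is an assumption; use your VPC-private address):

```yaml
# /etc/mongod.conf
net:
  bindIp: 127.0.0.1   # or the private VPC address only, never 0.0.0.0
  port: 27017
security:
  authorization: enabled   # require authenticated users
```

It is also worth checking the EC2 security group to confirm port 27017 is not open to 0.0.0.0/0.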
I have created the Spring Cloud Task tables (i.e. TASK_EXECUTION, TASK_TASK_BATCH) with the prefix MYTASK_ and the Spring Batch tables with the prefix MYBATCH_ in an Oracle database.
The default tables are also present in the same schema; they were created automatically or by another teammate.
I have bound my Oracle database service to the SCDF server deployed on PCF.
How can I tell my Spring Cloud Data Flow server to use the tables created with my prefix to render data on the Data Flow server dashboard?
Currently, the SCDF dashboard uses the tables with the default prefix to render data, and that works fine. I want it to use my tables to render the SCDF dashboard screens instead.
I am using Data Flow server version 1.7.3, deployed on PCF using a manifest.yml.
There's an open story to add this enhancement via spring-cloud/spring-cloud-dataflow#2048.
Feel free to consider contributing or share use-case details in the issue.
Currently, in spring-cloud-dataflow and spring-cloud-skipper we use Flyway to manage the database schemas, and it's not possible to prefix table names. Adding support for this would introduce too much complexity, and I'm not even sure it would be possible.
I created a Spring Boot application (a simple one with a database connection) and I would like to deploy it on OpenShift. Generating a Docker image and putting it on OpenShift is not a problem for me, but I also want a MongoDB database instance on OpenShift. I have already created it on OpenShift, but now I have no idea how to connect to it from the Spring Boot application. I recently heard that I need to use the pod name as the connection string. Is that correct? How exactly should I connect to the MongoDB pod from the Spring Boot pod? Should I create some route between those two? I am new to Docker and OpenShift, so please give me as much info as you can.
Are you using your own OS3 VM?
I'm no expert on the matter, but in the OS3 web console, once you create a database from the templates provided by OpenShift, OS3 shows a connection string at the end of the process.
I'm pretty sure OS3 creates a service for your DB; the link looks like this:
mysql://servicename:3306/database
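To expand on that: pods reach a database through the service name, which cluster DNS resolves inside the project; a route is not needed, since routes are for exposing services outside the cluster. A sketch of the Spring Boot side, assuming the MongoDB service is named `mongodb` and the database/credentials match what you entered in the template (all of these names are assumptions):

```properties
# application.properties
spring.data.mongodb.host=mongodb      # the OpenShift service name
spring.data.mongodb.port=27017
spring.data.mongodb.database=sampledb
spring.data.mongodb.username=user
spring.data.mongodb.password=secret
```

In practice you would inject the credentials via environment variables from a Secret rather than hard-coding them.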
I tried to store the Spring Batch metadata tables in a Mongo database, but it's not working correctly. I referred to and used the GitHub project mentioned below to configure a JobRepository that stores job data in MongoDB. That project was last updated 3 years ago and looks discontinued.
https://github.com/vfouzdar/springbatch-mongoDao
https://jbaruch.wordpress.com/2010/04/27/integrating-mongodb-with-spring-batch/
Currently my application uses in-memory tables for Spring Batch, and the functional part is done. But I want the job data to be stored in MongoDB.
I have already used MySQL for Spring Batch job data, but I don't want MySQL in the current application.
If anybody has any other solution/link that can help, please share.
Can anyone please advise what the address is if I am trying to connect to an Elasticsearch node that is in-memory (just using the default config) while using Spring Boot? I am looking for something like localhost:port.
Thanks!
Ok, so it is localhost:9200, same as if running it externally.
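For reference, a sketch of pointing Spring Boot at that address; the exact property name depends on the Spring Boot version (older versions used different `spring.data.elasticsearch.*` properties, so check the property list for your release):

```properties
# application.properties (Spring Boot 2.6+)
spring.elasticsearch.uris=http://localhost:9200
```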