Spring Boot + MongoDB: How to reuse connections

I'm using Spring Boot + MongoDB. I created my configuration object as follows.
I am able to @Autowire the DocumentStoreConfig object in my Service/Controller and make calls to Mongo.
Sample call:

@Autowired
private DocumentStoreConfig docStoreConfig;

this.docStoreConfig.mongoClient().getDatabase("db_name").getCollection(collection).insertOne(doc);

The problem I see is that each call creates a new MongoClient and opens new connections.
What is the guidance on setting up a pool, or reusing the same connection object, rather than paying the cost of opening a brand-new connection every time?
@Configuration
public class DocumentStoreConfig extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.uri}")
    private String connectionString;

    @Value("${documentstore.database}")
    private String databaseName;

    @Override
    public String getDatabaseName() {
        return this.databaseName;
    }

    @Override
    public MongoClient mongoClient() {
        System.out.println("**** \n\n\n NEW MONGO \n\n\n");
        return new MongoClient(new MongoClientURI(this.connectionString));
    }

    public MongoCollection<Document> getFailureCollection() {
        return this.mongoClient().getDatabase(this.databaseName).getCollection("failure");
    }
}
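
For reference, one common way to avoid the repeated new MongoClient is to let Spring manage the client as a singleton bean and inject it (or a MongoTemplate) where it is needed, instead of calling mongoClient() on the config class for every operation. The following is only a minimal sketch under that assumption, not a confirmed fix for this exact setup; it relies on the fact that @Bean methods in a @Configuration class are proxied, so repeated calls return the one cached instance:

@Configuration
public class DocumentStoreConfig extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.uri}")
    private String connectionString;

    @Value("${documentstore.database}")
    private String databaseName;

    @Override
    public String getDatabaseName() {
        return this.databaseName;
    }

    // Declaring the client as a @Bean lets Spring create it once; the
    // MongoClient keeps its own internal connection pool and is reused.
    @Override
    @Bean
    public MongoClient mongoClient() {
        return new MongoClient(new MongoClientURI(this.connectionString));
    }
}

In the service, injecting the client (or a MongoTemplate) then reuses the same pooled instance on every call:

@Autowired
private MongoClient mongoClient; // same pooled client every time

public void insert(String collection, Document doc) {
    mongoClient.getDatabase("db_name").getCollection(collection).insertOne(doc);
}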

Related

Spring Boot ignores the MongoDB Atlas URI, trying to connect to hosts=[127.0.0.1:27017]

I have been working on an application with Spring WebFlux and reactive MongoDB. There I used MongoDB Atlas as the database and it worked fine.
Recently I had to introduce a Mongo custom conversion to handle ZonedDateTime objects.
@Configuration
public class MongoReactiveConfiguration extends AbstractReactiveMongoConfiguration {

    @Override
    public MongoCustomConversions customConversions() {
        ZonedDateTimeReadConverter zonedDateTimeReadConverter = new ZonedDateTimeReadConverter();
        ZonedDateTimeWriteConverter zonedDateTimeWriteConverter = new ZonedDateTimeWriteConverter();
        List<Converter<?, ?>> converterList = new ArrayList<>();
        converterList.add(zonedDateTimeReadConverter);
        converterList.add(zonedDateTimeWriteConverter);
        return new MongoCustomConversions(converterList);
    }

    @Override
    protected String getDatabaseName() {
        return "stlDB";
    }
}
However, now I can no longer connect to MongoDB Atlas; it ignores the property spring.data.mongodb.uri and tries to connect to the local server with the default configuration.
I tried
@EnableAutoConfiguration(exclude = {MongoReactiveAutoConfiguration.class})
but then it ignored the above conversions as well. Is there any other configuration to override in AbstractReactiveMongoConfiguration so it stops using the default server IP and port?
I had the same issue and could not find a solution other than configuring converters differently, without extending AbstractReactiveMongoConfiguration:
@Configuration
public class MongoAlternativeConfiguration {

    @Bean
    public MongoCustomConversions mongoCustomConversions() {
        return new MongoCustomConversions(
                Arrays.asList(
                        new ZonedDateTimeReadConverter(),
                        new ZonedDateTimeWriteConverter()));
    }
}
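
With the conversions exposed as a plain @Bean and no AbstractReactiveMongoConfiguration subclass, Spring Boot's auto-configuration builds the reactive client itself, so the connection settings in application.properties should be honoured again; for example (the URI value here is purely illustrative):

spring.data.mongodb.uri=mongodb+srv://<user>:<password>@cluster0.example.mongodb.net/stlDB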

Spring Boot MongoDB connection URI is specified in another key (not using spring.data.mongodb.uri)

For my Spring Boot application, I have a requirement that the MongoDB URI should be specified with "app1.mongodb.uri" in application.properties. Yes, we don't want to use "spring.data.mongodb.uri" because I was told that it's misleading (what!?). Does anyone know the simplest way to do that? My application is all running fine, and I'm reluctant to make any big change because of this "requirement".
Figured out how to do it. The trick is to override the MongoClient and MongoTemplate beans.
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;

@SuppressWarnings("deprecation")
@Configuration
@EnableMongoRepositories
@PropertySource("classpath:application.properties")
public class MongoDBConfig extends AbstractMongoConfiguration {

    @Value("${app1.mongodb.db}")
    private String database;

    @Value("${app1.mongodb.uri}")
    private String uri;

    @Override
    @Bean
    public MongoClient mongoClient() {
        MongoClientURI mongoURI = new MongoClientURI(uri);
        MongoClient client = new MongoClient(mongoURI);
        return client;
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(mongoClient(), getDatabaseName());
    }
}
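
For illustration, the matching entries in application.properties would then look something like this (the values are placeholders; only the keys need to match the @Value expressions above):

app1.mongodb.uri=mongodb://localhost:27017
app1.mongodb.db=mydatabase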

Spring Data MongoDB Converter not getting registered

I have a setup with multiple MongoDB configurations. Here is the configuration class:
@Configuration
@RequiredArgsConstructor
@EnableConfigurationProperties(MongoConfigProperties.class)
public class MultipleMongoConfig {

    private static final Logger logger = LoggerFactory.getLogger(MultipleMongoConfig.class);

    private final MongoConfigProperties mongoProperties;

    @Primary
    @Bean(name = "sysdiagMongoTemplate")
    public MongoOperations sysdiagMongoTemplate() {
        MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(
                new DefaultDbRefResolver(sysdiagFactory(mongoProperties.getSysdiag())),
                new MongoMappingContext());
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new AggregationResultReadConverter());
        mappingMongoConverter.setCustomConversions(new CustomConversions(CustomConversions.StoreConversions.NONE, converters));
        mappingMongoConverter.afterPropertiesSet();
        boolean canConvert = mappingMongoConverter.getConversionService().canConvert(Document.class, AggregationResult.class);
        mappingMongoConverter.afterPropertiesSet();
        logger.info("canConvertFromDocumentToAggResult:: " + canConvert); // gives TRUE
        return new MongoTemplate(sysdiagFactory(this.mongoProperties.getSysdiag()), mappingMongoConverter);
    }

    @Bean(name = "monitoringMongoTemplate")
    public MongoOperations monitoringMongoTemplate() {
        return new MongoTemplate(monitoringFactory(this.mongoProperties.getMonitoring()));
    }

    public MongoDbFactory sysdiagFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClient(mongo.getHost(), mongo.getPort()),
                mongo.getDatabase());
    }

    public MongoDbFactory monitoringFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClient(mongo.getHost(), mongo.getPort()),
                mongo.getDatabase());
    }
}
Here is the read converter class (I only require reading from MongoDB). We have dynamic keys in the document, which is why I need to convert them into a Map:
public class AggregationResultReadConverter implements Converter<Document, AggregationResult> {

    @Override
    public AggregationResult convert(Document source) {
        AggregationResult aggregationResult = new AggregationResult();
        aggregationResult.setData(new HashMap());
        for (Map.Entry<String, Object> entry : source.entrySet()) {
            if (entry.getKey().matches("[A-Z][A-Z][A-Z]")) {
                aggregationResult.getData().put(entry.getKey(), entry.getValue());
            }
        }
        return aggregationResult;
    }
}
Here is the mapping configuration for one of the MongoDB databases:
@Configuration
@EnableMongoRepositories(basePackages = {"com.hns.services.restapi.db.mongo.sysdiag.entity", "com.hns.services.restapi.db.mongo.sysdiag.repo"}, mongoTemplateRef = "sysdiagMongoTemplate")
public class SysdiagMongoConfig {
}
And here is the repository interface
@Repository
public interface AggregationResultRepository extends MongoRepository<AggregationResult, ObjectId> {

    @Query("{ TIME: {$gte : ?0, $lt: ?1}}")
    List<AggregationResult> findInTimeRange(Long startTime, Long endTime);
}
When I query using AggregationResultRepository, I expect the converter code to be executed so that I can convert the fields and put them in the entity (document) class object as per the logic. The query runs fine, as I can see in the debug logs, and I get output, but the converter is never called.
The converter does get registered with the Mongo template, since the canConvertFromDocumentToAggResult log statement prints TRUE. I tried changing the converter from Document -> AggregationResult to DBObject -> AggregationResult, but no luck. Not sure what I am missing here.
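
For reference, a common way to wire custom read converters into a hand-built MappingMongoConverter is through MongoCustomConversions, also sharing its simple-type holder with the mapping context. The sketch below is a hypothetical variant of sysdiagMongoTemplate() along those lines; whether it resolves the issue described above is not confirmed:

@Primary
@Bean(name = "sysdiagMongoTemplate")
public MongoOperations sysdiagMongoTemplate() {
    MongoDbFactory factory = sysdiagFactory(mongoProperties.getSysdiag());

    // Register the read converter via MongoCustomConversions.
    MongoCustomConversions conversions =
            new MongoCustomConversions(Arrays.asList(new AggregationResultReadConverter()));

    // Let the mapping context know which types are handled by custom converters.
    MongoMappingContext mappingContext = new MongoMappingContext();
    mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
    mappingContext.afterPropertiesSet();

    MappingMongoConverter converter =
            new MappingMongoConverter(new DefaultDbRefResolver(factory), mappingContext);
    converter.setCustomConversions(conversions);
    converter.afterPropertiesSet();

    return new MongoTemplate(factory, converter);
}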

MongoTemplate Custom Config using Spring Boot

I was looking to change the WriteResultChecking property of MongoTemplate whilst working on a Spring Boot app (2.0.5). I found a way, by extending AbstractMongoConfiguration as below.
I got it working, however I found this approach a bit risky.
I say this because this approach forced me to write an implementation for

public MongoClient mongoClient() {
    return new MongoClient(host, port);
}

Now MongoClient is the central class for maintaining connections with MongoDB, and if I am forced to write an implementation for it, then I may possibly be missing out on optimizations that the Spring framework does.
Can someone please suggest any other, more optimal way of overriding some properties/behaviours without having to tinker too much?
@Configuration
public class MyMongoConfigs extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.database}")
    private String databaseName;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private int port;

    @Override
    public MongoClient mongoClient() {
        return new MongoClient(host, port);
    }

    @Override
    protected String getDatabaseName() {
        return databaseName;
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        MongoTemplate myTemp = new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
        myTemp.setWriteResultChecking(WriteResultChecking.EXCEPTION); // the customization in question
        return myTemp;
    }
}
You are going in the right direction. Using AbstractMongoConfiguration, you override the configs you need to customize, and that's the right way to do it. AbstractMongoConfiguration is still a Spring-provided class, so you don't have to worry about optimizations unless you mess up your configuration.
This is my approach:
package app.config;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;
import org.springframework.data.mongodb.core.convert.MongoConverter;

import com.mongodb.WriteConcern;

@Configuration
class ApplicationConfig {

    @Bean
    MongoTemplate mongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter converter) {
        MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory, converter);
        mongoTemplate.setWriteConcern(WriteConcern.MAJORITY);
        mongoTemplate.setWriteResultChecking(WriteResultChecking.EXCEPTION);
        return mongoTemplate;
    }
}
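
For completeness, the customized template is then injected like any other bean; a minimal usage sketch, where the service and collection names are made up for illustration:

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderService { // hypothetical service, for illustration only

    private final MongoTemplate mongoTemplate;

    OrderService(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void save(Object order) {
        // Writes now use WriteConcern.MAJORITY and throw on write errors,
        // as configured in ApplicationConfig above.
        mongoTemplate.save(order, "orders");
    }
}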
I figured this out by inspecting the source code of org.springframework.boot.autoconfigure.data.mongo.MongoDataAutoConfiguration.
I wrote a CustomMongoTemplate which overrides the find() method, adding a criterion that checks whether the document has a 'deletedAt' field.
Custom MongoTemplate:
public class CustomMongoTemplate extends MongoTemplate {

    public CustomMongoTemplate(MongoDatabaseFactory mongoDbFactory, MongoConverter mongoConverter) {
        super(mongoDbFactory, mongoConverter);
    }

    @Override
    public <T> List<T> find(Query query, Class<T> entityClass, String collectionName) {
        query.addCriteria(Criteria.where("deletedAt").exists(Boolean.FALSE));
        return super.find(query, entityClass, collectionName);
    }
}
Then create a bean in the configuration class:

@Configuration
public class MyConfiguration {
    //...
    @Bean(name = "mongoTemplate")
    CustomMongoTemplate customMongoTemplate(MongoDatabaseFactory databaseFactory, MappingMongoConverter converter) {
        return new CustomMongoTemplate(databaseFactory, converter);
    }
    //...
}

And the last thing: allow Spring to override the default MongoTemplate bean. Add the following to your application.properties file:
spring.main.allow-bean-definition-overriding=true

MongoDB creates a connection for every single operation on a collection

I'm using the mongodb-java-driver, and nothing else. I created a singleton EJB with a connection to Mongo.
@Singleton
public class MongoConnection {

    private DB db = null;
    private MongoClient mongoClient = null;

    @PostConstruct
    public void init() {
        try {
            mongoClient = new MongoClient("localhost", 27017);
            db = mongoClient.getDB("mydb");
        } catch (UnknownHostException e) {
            e.printStackTrace();
        }
    }

    public MongoConnection() {
    }

    public DB getDb() {
        return db;
    }

    public DBCollection getCollectionInDatabase(String collection) {
        DBCollection coll;
        coll = db.getCollection(collection);
        return coll;
    }
}
I inject this EJB into an ApplicationScoped bean (JSF), just to be sure that I will have only ONE instance of the DB connection.
#Named("appMongo")
#ApplicationScoped
public class MongoApplicationScope implements Serializable{
private static final long serialVersionUID = 1L;
#EJB MongoConnection mu;
public MongoConnection getMu() {
return mu;
}
public void setMu(MongoConnection mu) {
this.mu = mu;
}
}
Then, in a request-scoped bean, I get data from the db:
#Named("mongoBean")
#SessionScoped
public class MongoBean implements Serializable {
private static final long serialVersionUID = 1L;
#Inject MongoApplicationScope mongoAccess;
public void mongoDzialanie() {
DBCollection coll = mongoAccess.getMu().getDb().getCollection("oko"); //at this step everything is correct
System.out.println(coll.getCount()); //new connection is created text from mongoDB console -> connection accepted from 127.0.0.1:57700 #2 (2 connections now open)
}
Why even if I have the same "db" object instance I can't get data without creating new connection, why I can't share this connection as it should be due to pooling?
}
MongoDB drivers open new connections in the background as you work with a collection; the driver decides when to open a new one. I believe it depends on the driver's implementation.
You can control the maximum number of connections opened by setting the poolSize value (the default is 5 for the Node.js MongoDB driver: http://mongodb.github.io/node-mongodb-native/2.2/api/MongoClient.html#connect). It could be different for Java or other languages; check your documentation.
If you are going to have more than one db object, each object will have its own connection pool. In my case I have mainDb and logsDb, each with a pool of 10, so up to 20 concurrent connections can be opened.
Finally, if you are using the Node.js driver, make sure to pass a Number as the value and not a string (i.e. poolSize: 10). This will save you hours or days of troubleshooting :)
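
For the legacy Java driver used in the question, the closest equivalent knob is connectionsPerHost on MongoClientOptions; a minimal sketch, with the pool size of 10 and the helper class purely illustrative:

import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ServerAddress;

public class PooledMongoClientFactory { // hypothetical helper, for illustration

    public static MongoClient create() {
        // Cap the driver's internal pool at 10 connections per host.
        MongoClientOptions options = MongoClientOptions.builder()
                .connectionsPerHost(10)
                .build();
        return new MongoClient(new ServerAddress("localhost", 27017), options);
    }
}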