Spring Boot Mongo Repository Hangs on Second Request - mongodb

I'm running into a strange issue using Spring Boot's MongoRepository.
I've narrowed the problem down to returning the response to the request after it has successfully queried my Mongo instance. I'm querying a simple object. Right after startup I can query and get back a response instantly. After that, every request just hangs after it has successfully queried Mongo.
This is the extent of the logs when the issue happens
2021-04-13 21:20:03 DEBUG [http-nio-8080-exec-2] [MongoQueryCreator.java:160] Created query Query: { "trackingCode" : "PERS4J"}, Fields: {}, Sort: {}
2021-04-13 21:20:03 DEBUG [http-nio-8080-exec-2] [MongoTemplate.java:2551] find using query: { "trackingCode" : "PERS4J"} fields: Document{{}} for class: class com.gotem.domain.Link in collection: link
2021-04-13 21:20:03 DEBUG [http-nio-8080-exec-2] [SLF4JLogger.java:56] Sending command '{"find": "link", "filter": {"trackingCode": "PERS4J"}, "limit": 2, "$db": "gotem"}' with request id 9 to database gotem on connection [connectionId{localValue:2, serverValue:11}] to server localhost:27017
2021-04-13 21:20:03 DEBUG [http-nio-8080-exec-2] [SLF4JLogger.java:56] Execution of command with request id 9 completed successfully in 2.47 ms on connection [connectionId{localValue:2, serverValue:11}] to server localhost:27017
This is using Spring Boot 2.2.0.RELEASE against Mongo 4.4.3.
I'm stumped :/
Adding simplified setup and config.
application.properties
spring.data.mongodb.uri=mongodb://localhost:27017/linkTrack
Repository
@Repository
public interface LinkRepository extends MongoRepository<Link, Long> {
    Link findOneByTrackingCode(String trackingCode);
}
Query Controller
@RestController
public class LinkController {

    private static final Logger LOG = LoggerFactory.getLogger(LinkController.class);

    @Autowired
    private LinkRepository linkRepository;

    @RequestMapping(value = "/retrieve/{trackingCode}", method = RequestMethod.GET)
    public Link findOneByTrackingCode(@PathVariable String trackingCode) {
        Link link = linkRepository.findOneByTrackingCode(trackingCode);
        LOG.debug("Link: " + link);
        return link;
    }
}
Object
@Document
public class Link implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    private String id;

    private String trackingCode;

    public Link() {
        this.trackingCode = "123456"; // THIS WAS THE ISSUE :(
    }

    public String getTrackingCode() {
        return this.trackingCode;
    }
}

Well. Crap. I had a constructor in the Link object that assigned a generated value to the trackingCode field, and once I removed it, everything worked as expected.
I'm still at a loss as to why it worked on the first request after restarting the service and only hung after that.
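For reference, a minimal sketch of the corrected entity, with the default constructor left empty so Spring Data can populate trackingCode from the stored document:

@Document
public class Link implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    private String id;

    private String trackingCode;

    // Left empty on purpose; the mapper sets trackingCode from the document,
    // so nothing here can overwrite the value that was queried.
    public Link() {
    }

    public String getTrackingCode() {
        return this.trackingCode;
    }
}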

Related

Why does MongoDB Atlas not work with reactive Spring Data and Spring Boot?

I'm developing a simple sample application. One component is a gateway service using Spring Boot and Reactive Spring Data for Mongo, because that's where I want to store user and login information.
To test out different solutions, I wanted to use MongoDB Atlas, so I set it up. But when I try to save just a sample user, nothing happens; the data is not saved to the database. However, it looks like the application is connected to MongoDB Atlas, and there are no error logs about failed connections.
This is the main class, where I have the @EnableReactiveMongoRepositories annotation:
@SpringBootApplication
@EnableReactiveMongoRepositories("com.bkk.sm.authentication.repository")
public class GatewayApplication {
    public static void main(String[] args) {
        SpringApplication.run(GatewayApplication.class, args);
    }
}
Here is how I set up Mongo in application.yml and the repository:
spring:
  data:
    mongodb:
      database: users
      uri: mongodb+srv://${MONGO_USER}:${MONGO_PASSWORD}@taocluster.qa3sd.mongodb.net/users?retryWrites=true&w=majority
@Repository
public interface ReactiveUserRepository extends ReactiveMongoRepository<User, String> {
    Mono<User> findByUsername(String username);
}
I don't use any specific reactive MongoDB config and I don't extend AbstractReactiveMongoConfiguration (this is really just an experiment to see how this works); I use the defaults.
In my UserDetailsServiceImpl, I try to save a sample record right after the bean is constructed:
@Slf4j
@Service
public class UserDetailsServiceImpl implements UserDetailsService {

    private ReactiveUserRepository repository;

    public UserDetailsServiceImpl(ReactiveUserRepository repository) {
        this.repository = repository;
    }

    @PostConstruct
    public void setup() {
        BCryptPasswordEncoder encoder = new BCryptPasswordEncoder();
        String pwd = encoder.encode("user");
        User user = User.builder()
                .username("user")
                .password(pwd)
                .accountExpired(false)
                .accountLocked(false)
                .activationCode(null)
                .activatedTime(Date.from(Instant.now()))
                .email("user@user.com")
                .enabled(true)
                .firstName("User")
                .failedLoginAttempts(0)
                .lastModificationTime(Date.from(Instant.now()))
                .lastName("User")
                .middleName("User")
                .passwordExpiryTime(Date.from(Instant.now()))
                .registrationTime(Date.from(Instant.now()))
                .roles(List.of(CompanyRole.builder().companyCode("bkk")
                        .companyName("Beszterce KK")
                        .role(Role.ROLE_USER)
                        .build())
                )
                .passwordExpiryTime(null)
                .version(0)
                .build();
        this.repository.save(user).map(user1 -> {
            log.info("User saved. {}", user1);
            return user1;
        }).onErrorResume(Objects::nonNull, throwable -> {
            log.error("Something is not right here.", throwable);
            return Mono.error(throwable);
        }).switchIfEmpty(Mono.defer(() -> {
            log.info("Cannot save user={}", user.toString());
            return Mono.error(new Exception("WTF?"));
        }));
    }

    // ... SOME MORE METHODS COME HERE
}
When it executes the this.repository.save(user) line, nothing happens. I tried to debug and went deeper into the framework, but ultimately nothing happens, which is why I added the log messages. Still nothing. If I put a breakpoint in the map, onErrorResume, or switchIfEmpty branches, execution doesn't stop there. No log is written to the console other than this line:
2022-04-09 00:02:46.061 INFO 72528 --- [ntLoopGroup-3-7] org.mongodb.driver.connection : Opened connection [connectionId{localValue:7, serverValue:78530}] to taocluster-shard-00-02.qa3sd.mongodb.net:27017
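Worth noting: a Mono returned by ReactiveMongoRepository.save() is lazy and sends nothing to the database until something subscribes to it, and nothing in the setup() method above subscribes. A minimal sketch (using the same repository and user as above) of actually triggering the pipeline:

// Sketch only: without a subscriber the reactive chain built in @PostConstruct is inert.
this.repository.save(user)
        .doOnNext(saved -> log.info("User saved. {}", saved))
        .doOnError(throwable -> log.error("Something is not right here.", throwable))
        .subscribe(); // or .block() while experimenting; avoid block() on request paths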
And here is my data object where I declare the collection name:
@Getter
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Document(collection = "users")
public class User implements UserDetails {

    @Id
    private String id;

    @Indexed
    @NonNull
    private String username;

    // ... SOME MORE FIELDS COME HERE ...
}
So, my question is: what am I doing wrong? Why don't I see anything added to my MongoDB Atlas sample database? I have set network access to 0.0.0.0/0 to accept connections from everywhere while testing this out.
Any help would be appreciated.

Spring Boot Cassandra Error connecting to Node(endPoint=localhost:1234, hostId=null, hashCode=37hfeouh3),

In my Spring Boot Cassandra build I am getting the following error: s0-admin-1] c.d.o.d.i.c.control.ControlConnection : [s0] Error connecting to Node(endPoint=localhost:1234, hostId=null, hashCode=37hfeouh3), trying next node (ConnectionInitException: [s0|control|connecting...] Protocol initialization request, step 1 (OPTIONS): failed to send request (io.netty.channel.StacklessClosedChannelException))
Entity classes:
@Data
@Builder
@Table
public class Class1 {
    @Id
    private String id;
    private String data;
    private Class2 data2;
    private Integer data3;
    ...
}
public class Class2 {
    @Id
    @JsonProperty
    private String id;
    @Indexed
    @JsonProperty
    private String data;
    @JsonProperty
    private String data2;
    @JsonProperty
    private Integer data3;
    ...
}
@Configuration
@EnableCassandraRepositories
@ConfigurationProperties(prefix = "DBProperties")
public class ApplicationConfig extends AbstractCassandraConfiguration {

    private String DBKEYSPACE;

    @Override
    protected String getKeyspaceName() {
        return DBKEYSPACE;
    }

    public String[] getEntityBasePackages() {
        return new String[] { "com.oreilly.springdata.cassandra" };
    }
}
@ConfigurationProperties(prefix = "DBPROPERTIES")
@Slf4j
public class FactoryBeanAppConfig {

    private String contactPoints;
    private String keySpace;
    private Integer port;
    private String password;
    private String username;
    private String dataCenter;

    /*
     * Factory bean that creates the com.datastax.oss.driver.api.core.CqlSession instance
     */
    @Bean
    public CqlSessionFactoryBean session() {
        // log that we made it
        log.info("I made it to CqlSessionFactoryBean");
        CqlSessionFactoryBean session = new CqlSessionFactoryBean();
        session.setContactPoints(URLINFO);
        log.info("Contact Points: " + URLINFO);
        session.setKeyspaceName(DBKEYSPACE);
        // session.setPort(OURPORT);
        session.setUsername(username);
        session.setPassword(password);
        session.setLocalDatacenter(LOCALDCENTER INFORMATION);
        return session;
    }
}
I am unable to find a good example or even get it to work correctly. Looking at this documentation: https://docs.spring.io/spring-data/cassandra/docs/current/reference/html/#cassandra.core, that's the only thing I should have to do to implement example 55.
For Spring Boot to run your application, it needs to reach the DB when your server application (Tomcat, for example) is starting, so your schema should be created first. If that is OK, you could change "localhost" to "127.0.0.1" in your cassandra.yaml file.
Important: given "[s0] Error connecting to Node(endPoint=localhost:1234,...", please check Cassandra's port. The default is 9042.
That should solve your problem. However, other errors can still happen because of the other classes.
Then you could correct the classes as below:
@SpringBootApplication
@EnableCassandraRepositories(basePackages = { "<package's path>" })
@EntityScan(basePackages = { "<package's path>" })
@ComponentScan(basePackages = { "<package's path>" })
public class ApplicationConfig extends SpringBootServletInitializer {

    public static void main(String[] args) {
        SpringApplication.run(ApplicationConfig.class, args);
    }
}
Entity:
@Table("<table name>")
public class Class1 {
    @PrimaryKeyColumn(name = "<field name id>", type = PrimaryKeyType.PARTITIONED)
    private String id;

    @Column("<field name data>")
    private String data;

    private Class2 data2; // I think this declaration can cause an error

    @Column("<field name data3>")
    private Integer data3;
    ...
}
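On that last nested field: one common way to map a nested type in Spring Data Cassandra is a user-defined type. A minimal sketch (the UDT name "class2" is an assumption, not something from the original answer):

// Assumed UDT named "class2"; the Class2 field in Class1 then maps onto a UDT column.
@UserDefinedType("class2")
public class Class2 {
    private String id;
    private String data;
}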
This FactoryBeanAppConfig class does not look right. I created a class to read application.properties and injected that class to supply the connection data such as the keyspace name, port, and so on. This link will help you create such a class: https://docs.spring.io/spring-boot/docs/current/reference/html/features.html#features.external-config . You can then use that class in your FactoryBeanAppConfig to get DBKEYSPACE, OURPORT, and so on. A rough sketch of what that could look like follows.
Here is an example that helps explain what I mean: https://www.baeldung.com/spring-data-cassandra-tutorial .
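A minimal sketch of that idea (the "cassandra" property prefix and the property names are assumptions for illustration):

// Binds assumed entries such as cassandra.contact-points, cassandra.port,
// cassandra.keyspace-name and cassandra.local-datacenter from application.properties.
@Data
@Configuration
@ConfigurationProperties(prefix = "cassandra")
public class CassandraProperties {
    private String contactPoints;
    private int port = 9042; // Cassandra's default native transport port
    private String keyspaceName;
    private String localDatacenter;
    private String username;
    private String password;
}

@Configuration
public class FactoryBeanAppConfig {

    // CqlSessionFactoryBean (from Spring Data Cassandra) builds the CqlSession from these values.
    @Bean
    public CqlSessionFactoryBean session(CassandraProperties props) {
        CqlSessionFactoryBean session = new CqlSessionFactoryBean();
        session.setContactPoints(props.getContactPoints());
        session.setPort(props.getPort());
        session.setKeyspaceName(props.getKeyspaceName());
        session.setLocalDatacenter(props.getLocalDatacenter());
        session.setUsername(props.getUsername());
        session.setPassword(props.getPassword());
        return session;
    }
}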

Use multiple mongo DBs in same application for same model & same Repository

I need to implement a Spring Boot + MongoDB application where there are 2 Mongo DBs which have the exact same database name & collections. Based on the user making a request, I need to choose whether to fetch data from DB1 or DB2 (the only difference is the host/IP in the Mongo URI).
E.g. I need some way to create 2 MongoTemplates, like mTempA & mTempB, in my repository and, based on some condition, use either template to execute the query, as below:
@Repository
public class MyCustomRepository {

    private Logger logger = LoggerFactory.getLogger(MyCustomRepository.class);

    @Autowired
    private MongoTemplateA mongoTemplateA; // Need to know if this is possible & how

    @Autowired
    private MongoTemplateB mongoTemplateB; // Need to know if this is possible & how

    public List<MyModel> findByCriteria(MyRequest request) {
        List<MyModel> result;
        // Query query = <build query based on request>
        if (request.getUserType().equals("A")) {
            result = mongoTemplateA.find(query, MyModel.class);
        } else {
            result = mongoTemplateB.find(query, MyModel.class);
        }
        logger.debug("Result fetched with {} records", result.size());
        return result;
    }
}
I don't want 2 separate repositories (classes or interfaces) or different models. I just want 2 different MongoTemplates to be injected into a single repository.
Is this possible? If yes, please give some example code.
I followed the tutorial below:
https://dzone.com/articles/multiple-mongodb-connectors-with-spring-boot
As rightly pointed out by @Lucia, below is how it can be done:
Have 2 different configuration placeholders
@Configuration
@EnableMongoRepositories(basePackages = "com.snk.repository", mongoTemplateRef = "mongoTemplateA")
public class MongoConfigA {
    // Configuration class for DB 1 access
}

@Configuration
@EnableMongoRepositories(basePackages = "com.snk.repository", mongoTemplateRef = "mongoTemplateB")
public class MongoConfigB {
    // Configuration class for DB 2 access
}
Create a class which will help in reading the custom MongoDB properties from application.properties:
@ConfigurationProperties(prefix = "mongodb")
public class MultipleMongoProperties {

    private MongoProperties adb = new MongoProperties();
    private MongoProperties bdb = new MongoProperties();

    public MongoProperties getAdb() {
        return adb;
    }

    public MongoProperties getBdb() {
        return bdb;
    }
}
Add a configuration class to create mongoTemplates:
@Configuration
@EnableConfigurationProperties(MultipleMongoProperties.class)
public class MultipleMongoConfig {

    @Autowired
    private MultipleMongoProperties mongoProperties = new MultipleMongoProperties();

    @Bean(name = "mongoTemplateA")
    @Primary
    public MongoTemplate mongoTemplateA() {
        return new MongoTemplate(aDbFactory(this.mongoProperties.getAdb()));
    }

    @Bean(name = "mongoTemplateB")
    public MongoTemplate mongoTemplateB() {
        return new MongoTemplate(bDbFactory(this.mongoProperties.getBdb()));
    }

    @Bean
    @Primary
    public MongoDbFactory aDbFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClientURI(mongo.getUri()));
    }

    @Bean
    public MongoDbFactory bDbFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClientURI(mongo.getUri()));
    }
}
Add the below declarations to your service/repository:
@Autowired
@Qualifier("mongoTemplateA")
private MongoTemplate mongoTemplateA;

@Autowired
@Qualifier("mongoTemplateB")
private MongoTemplate mongoTemplateB;
Add below properties in your application.properties:
mongodb.adb.uri=mongodb://user:pass@myhost1:27017/adb
mongodb.bdb.uri=mongodb://user:pass@myhost2:27017/bdb
If you have a Mongo replica set, the URI can be set as:
mongodb.adb.uri=mongodb://user:pass@myhost1,myhost2,myhost13/adb?replicaSet=rsName
mongodb.bdb.uri=mongodb://user:pass@myhost1,myhost2,myhost13/bdb?replicaSet=rsName
Based on your logic, use either template.
Though, there are a few catches:
Notice the @Primary annotation; one bean needs to be marked as primary. I haven't found any solution for the case where no template is marked as primary.
If either Mongo DB is down and the application is started/restarted, the application will not start/deploy. To avoid this, @Autowired needs to be changed to @Autowired(required = false) (see the sketch below).
If either Mongo DB is down while the application is already running, it automatically uses the 2nd Mongo DB (the one that is not down). So even if you want to use DB A, if it's down, requests are processed with DB B, and vice versa.
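A minimal sketch of that second catch (same field name and qualifier as above):

// Optional injection: the application can still start even if this template's
// Mongo instance is unreachable at startup; check for null before using it.
@Autowired(required = false)
@Qualifier("mongoTemplateB")
private MongoTemplate mongoTemplateB;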

Mongo Morphia mapPackage doesn't map classes in package

I am using the Morphia mapper for MongoDB/Java. I have successfully used the web application on a GlassFish server. I am migrating my project to WildFly 8.2.Final/JBoss and am having issues with Morphia mapping packages. Morphia mapping/scanning of packages doesn't work. It worked fine on GlassFish but doesn't work on WildFly.
I thought it was a classpath issue and did a small test.
I experimented by individually mapping a class and it worked fine. It's just that mapping a package doesn't work. I have the following code for Morphia.
Code:
public class MongoDataSource {

    private static final String IP = XXXXXX;
    private static final Integer PORT = XXXXXX;
    private static final String DB_NAME = XXXXXX;
    private static final String USERNAME = XXXXXX;
    private static final String PWD = XXXXXX;

    private static Morphia m;
    private static Datastore ds;
    private static DB db;
    private static MongoClient client;

    private static MongoDataSource INSTANCE = new MongoDataSource();

    private MongoDataSource() {
        m = new Morphia();
        m.mapPackage("xxxx.model.user"); // Works on GlassFish but doesn't work on WildFly/JBoss
        m.map(xxxx.model.user.User.class); // My experiment with loading a specific class in the package
        try {
            List<MongoCredential> credentials = new ArrayList<>();
            credentials.add(MongoCredential.createMongoCRCredential(USERNAME, DB_NAME, PWD.toCharArray()));
            ServerAddress servAddr = new ServerAddress(IP, PORT);
            client = new MongoClient(servAddr, credentials);
            db = client.getDB(DB_NAME);
            ds = m.createDatastore(client, DB_NAME);
        } catch (Exception e) {
            // Log
        }
    }

    public static Morphia getMorphia() {
        return m;
    }

    public static Datastore getDatastore() {
        return ds;
    }

    public static DB getDataBase() throws Exception {
        return db;
    }
}
What I don't understand is: if the code is not able to find the package, how is it able to find a class in that package? Is this a bug in the Morphia API or some classpath issue when running the application on WildFly/JBoss? I cannot convince myself that it's a classpath issue.
There have been several bugs related to mapPackage in Morphia. Two days ago, using version 0.110, I experienced an error with that method and added to an existing issue on their GitHub.
Check the related GitHub issues for mapPackage; as a workaround you can just provide the classes directly using: morphia.map(ClassA.class, ClassB.class, ClassC.class);
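Applied to the MongoDataSource above, the workaround would look roughly like this (the second entity class is a placeholder):

private MongoDataSource() {
    m = new Morphia();
    // Workaround: map the entity classes explicitly instead of scanning the package.
    m.map(xxxx.model.user.User.class,
          xxxx.model.user.SomeOtherEntity.class); // placeholder for the remaining entities
    ...
}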

MongoDb creates a connection on every single operation on a collection

I'm using the mongodb-java-driver and nothing else. I created a singleton EJB with a connection to Mongo.
@Singleton
public class MongoConnection {

    private DB db = null;
    private MongoClient mongoClient = null;

    @PostConstruct
    public void init() {
        try {
            mongoClient = new MongoClient("localhost", 27017);
            db = mongoClient.getDB("mydb");
        } catch (UnknownHostException e) {
            e.printStackTrace();
        }
    }

    public MongoConnection() {
    }

    public DB getDb() {
        return db;
    }

    public DBCollection getCollectionInDatabase(String collection) {
        DBCollection coll;
        coll = db.getCollection(collection);
        return coll;
    }
}
I inject this EJB into an ApplicationScoped bean (JSF), just to be sure that I will have only ONE instance of the DB connection.
#Named("appMongo")
#ApplicationScoped
public class MongoApplicationScope implements Serializable{
private static final long serialVersionUID = 1L;
#EJB MongoConnection mu;
public MongoConnection getMu() {
return mu;
}
public void setMu(MongoConnection mu) {
this.mu = mu;
}
}
Then, in a request scoped bean, I get data from the db:
#Named("mongoBean")
#SessionScoped
public class MongoBean implements Serializable {
private static final long serialVersionUID = 1L;
#Inject MongoApplicationScope mongoAccess;
public void mongoDzialanie() {
DBCollection coll = mongoAccess.getMu().getDb().getCollection("oko"); //at this step everything is correct
System.out.println(coll.getCount()); //new connection is created text from mongoDB console -> connection accepted from 127.0.0.1:57700 #2 (2 connections now open)
}
Why even if I have the same "db" object instance I can't get data without creating new connection, why I can't share this connection as it should be due to pooling?
}
MongoDB drivers open new connections in the background each time you work with a collection. The driver decides when to open a new one; I believe it depends on the driver's implementation.
You can control the maximum number of connections opened by setting the poolSize value (the default is 5 for the Node.js MongoDB driver: http://mongodb.github.io/node-mongodb-native/2.2/api/MongoClient.html#connect). It could be different for Java or other languages; check your documentation.
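Since the question uses the legacy Java driver, the equivalent knob there is connectionsPerHost; a minimal sketch (the value 10 is just illustrative):

// Legacy mongodb-java-driver: cap the pool at 10 connections per host.
MongoClientOptions options = MongoClientOptions.builder()
        .connectionsPerHost(10)
        .build();
MongoClient mongoClient = new MongoClient(new ServerAddress("localhost", 27017), options);
DB db = mongoClient.getDB("mydb");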
If you are going to have more than one db object, each object will have its own connection pool. In my case I have mainDb and logsDb, each with a pool of 10, therefore up to 20 concurrent connections can be opened.
Finally, if you are using the Node.js driver, make sure to pass a Number as the value and not a string (i.e. poolSize: 10). This will save you hours/days of troubleshooting :)