How to initialize a datasource via code while retrieving the password from Vault? - postgresql

I have a service with app and it (integration test) modules, and for integration testing I deploy the necessary services to Argo CD. When I initialize the datasource via application.yml and set the password directly there, it works:
datasource:
  url: jdbc:postgresql://${POSTGIS_HOST:localhost}:${POSTGIS_PORT:5433}/${POSTGIS_DB:mydb}
  username: ${POSTGIS_USER:username}
  password: ${POSTGIS_PASSWORD:password}
But since I need to retrieve the password from Vault, I need to initialize the datasource via code:
@Configuration
@Slf4j
public class DatabaseConfig {

    @Value(value = "${spring.datasource.url}")
    private String url;

    @Value(value = "${spring.datasource.username}")
    private String username;

    @Bean
    @Primary
    // @ConfigurationProperties("spring.datasource")
    public DataSource dataSource() throws IOException {
        String password = getPostgresPasswordFromVault();
        log.info("Password retrieved! {}", password);
        return DataSourceBuilder.create()
                .url(url)
                .username(username)
                .password(password)
                .build();
    }

    private String getPostgresPasswordFromVault() throws IOException {
        // return blabla
    }
}
datasource:
  url: jdbc:postgresql://${POSTGIS_HOST:localhost}:${POSTGIS_PORT:5433}/${POSTGIS_DB:mydb}
  username: ${POSTGIS_USER:username}
  password: ${POSTGIS_PASSWORD:password}
The code retrieves the password correctly from Vault, no issue there. But the initialization above does not work.
When I run this code, I see this trace:
Caused by: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "some-ip-address", user "postgres", database "mydb", SSL encryption
at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:659)
at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:180)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:235)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:247)
at org.postgresql.Driver.makeConnection(Driver.java:434)
at org.postgresql.Driver.connect(Driver.java:291)
at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:138)
at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:364)
at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:206)
at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:476)
at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:561)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:115)
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112)
at org.flywaydb.core.internal.jdbc.JdbcUtils.openConnection(JdbcUtils.java:48)
... 60 more
Suppressed: org.postgresql.util.PSQLException: FATAL: password authentication failed for user "postgres"
at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:659)
at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:180)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:244)
... 72 more
But as I said, if I remove the DatabaseConfig class and only use application.yml with the password set explicitly, it works.
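For reference, here is a minimal sketch of the same pattern using Spring Boot's DataSourceProperties, so that url, username and the other non-secret settings are still bound from spring.datasource and only the password is supplied in code. This is not presented as the fix for the pg_hba.conf/authentication error above; the class name is made up and getPostgresPasswordFromVault() just stands in for the Vault lookup shown in the question:

import java.io.IOException;
import javax.sql.DataSource;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class VaultAwareDataSourceConfig { // hypothetical name

    // Binds url, username, driver-class-name, etc. from spring.datasource as usual.
    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSourceProperties dataSourceProperties() {
        return new DataSourceProperties();
    }

    // Only the password comes from code; everything else comes from the bound properties.
    @Bean
    @Primary
    public DataSource dataSource(DataSourceProperties properties) throws IOException {
        String password = getPostgresPasswordFromVault();
        return properties.initializeDataSourceBuilder()
                .password(password)
                .build();
    }

    private String getPostgresPasswordFromVault() throws IOException {
        return "..."; // placeholder for the Vault call shown in the question
    }
}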

Related

com.atomikos.jdbc.AtomikosSQLException Connection pool exhausted - try increasing 'maxPoolSize' and/or 'borrowConnectionTimeout' on the DataSourceBean

We have an application that ran into this error after successfully executing a bunch of database transactions, and we realized the JDBC connection pool is full of idle connections that won't be closed and keep piling up.
Here is the error log from the Spring Boot app:
2021-08-12T00:50:28,078 INFO [org.apache.cxf.binding.soap.interceptor.Soap12FaultOutInterceptor:66] (http-nio-8080-exec-83) eventId=w:ef9c7df4-fae6-11eb-b2fa-000d3a434359, tenantId=, pe=, user=, app=, pid=, message=class org.apache.cxf.binding.soap.interceptor.Soap12FaultOutInterceptor$Soap12FaultOutInterceptorInternalmultipart/related; type="application/xop+xml"; boundary="uuid:c21d3fc7-b321-4895-a608-1e4e58824a90"; start="<root.message#cxf.apache.org>"; start-info="application/soap+xml"
2021-08-12 00:51:28 org.hibernate.engine.jdbc.spi.SqlExceptionHelper:137 [WARN] SQL Error: 0, SQLState: null
2021-08-12 00:51:28 org.hibernate.engine.jdbc.spi.SqlExceptionHelper:142 [ERROR] Connection pool exhausted - try increasing 'maxPoolSize' and/or 'borrowConnectionTimeout' on the DataSourceBean.
2021-08-12T00:51:28,074 WARN [org.apache.cxf.phase.PhaseInterceptorChain:475] (http-nio-8080-exec-84) eventId=w:e29c0214-fae6-11eb-8a27-000d3a434359, tenantId=, pe=, user=, app=, pid=, message=Interceptor for {urn:ihe:iti:xds-b:2007}HIEDirectXDRService#{urn:ihe:iti:xds-b:2007}DocumentRepository_ProvideAndRegisterDocumentSet-b has thrown exception, unwinding now
org.springframework.orm.hibernate5.HibernateJdbcException: JDBC exception on Hibernate data access: SQLException for SQL [n/a]; SQL state [null]; error code [0]; Unable to acquire JDBC Connection; nested exception is org.hibernate.exception.GenericJDBCException: Unable to acquire JDBC Connection
at org.springframework.orm.hibernate5.SessionFactoryUtils.convertHibernateAccessException(SessionFactoryUtils.java:252) ~[spring-orm-5.2.8.RELEASE.jar!/:5.2.8.RELEASE]
at org.springframework.orm.hibernate5.HibernateExceptionTranslator.convertHibernateAccessException(HibernateExceptionTranslator.java:102) ~[spring-orm-5.2.8.RELEASE.jar!/:5.2.8.RELEASE]
......
Caused by: org.hibernate.exception.GenericJDBCException: Unable to acquire JDBC Connection
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:47) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:113) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:99) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.acquireConnectionIfNeeded(LogicalConnectionManagedImpl.java:107) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.getPhysicalConnection(LogicalConnectionManagedImpl.java:134) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.connection(StatementPreparerImpl.java:50) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$5.doPrepare(StatementPreparerImpl.java:149) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:176) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.prepareQueryStatement(StatementPreparerImpl.java:151) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
......
Caused by: com.atomikos.jdbc.AtomikosSQLException: Connection pool exhausted - try increasing 'maxPoolSize' and/or 'borrowConnectionTimeout' on the DataSourceBean.
at com.atomikos.jdbc.AtomikosSQLException.throwAtomikosSQLException(AtomikosSQLException.java:29) ~[transactions-jdbc-4.0.6.jar!/:?]
at com.atomikos.jdbc.AbstractDataSourceBean.throwAtomikosSQLException(AbstractDataSourceBean.java:76) ~[transactions-jdbc-4.0.6.jar!/:?]
at com.atomikos.jdbc.AbstractDataSourceBean.throwAtomikosSQLException(AbstractDataSourceBean.java:71) ~[transactions-jdbc-4.0.6.jar!/:?]
at com.atomikos.jdbc.AbstractDataSourceBean.getConnection(AbstractDataSourceBean.java:351) ~[transactions-jdbc-4.0.6.jar!/:?]
at com.atomikos.jdbc.nonxa.AtomikosNonXADataSourceBean.getConnection(AtomikosNonXADataSourceBean.java:189) ~[transactions-jdbc-4.0.6.jar!/:?]
at org.hibernate.engine.jdbc.connections.internal.DatasourceConnectionProviderImpl.getConnection(DatasourceConnectionProviderImpl.java:122) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
at org.hibernate.internal.NonContextualJdbcConnectionAccess.obtainConnection(NonContextualJdbcConnectionAccess.java:38) ~[hibernate-core-5.4.18.Final.jar!/:5.4.18.Final]
......
We tried increasing maxPoolSize and borrowConnectionTimeout, but that has not resolved the issue. The app uses Atomikos as the transaction manager, and we are trying to find out how to close idle connections after several minutes without activity and keep the total number of connections down; fewer idle connections hanging around should mean more active connections can run. We use XA transactions with Atomikos. Please advise how we can keep idle connections down and close or flush them after a certain time without an active transaction, so that more active connections can start without exhausting the connection pool (a sketch of the relevant pool-maintenance settings follows after the configuration below). Thanks. BTW, we use Atomikos 4.0.6.
The application.yml looks like:
app:
  datasource:
    transaction:
      transaction_timeout: 600
    postgres:
      dialect: org.hibernate.dialect.PostgreSQL82Dialect
      driverClassName: org.postgresql.Driver
    tac:
      borrow_conn_timeout: 30
      max_idle_time: 300
      min_pool_size: 5
      max_pool_size: 20
      show_sql: false
      max_lifetime: 1800
      test_query: 'select 1'
hibernate:
  connection:
    release_mode: after_transaction
  current_session_context_class: jta
spring:
  profiles:
    active: PostgreSql
  jpa:
    hibernate:
      use-new-id-generator-mappings: false
    properties:
      hibernate:
        id:
          new_generator_mappings: false
  jta:
    atomikos:
      connectionfactory:
        ignore-session-transacted-flag: false
        max-pool-size: 50
        min-pool-size: 5
      datasource:
        max-pool-size: 50
        min-pool-size: 5
      properties:
        service: com.atomikos.icatch.standalone.UserTransactionServiceFactory
    enabled: true
Here is the DatabaseTransactionConfiguration:
@Configuration
public class DatabaseTransactionConfiguration {

    @Value("${app.datasource.transaction.transaction_timeout}")
    Integer transactionTimeout = 300;

    @Bean(name = "userTransaction")
    public UserTransaction userTransaction() throws Throwable {
        final UserTransactionImp userTransactionImp = new UserTransactionImp();
        userTransactionImp.setTransactionTimeout(transactionTimeout);
        return userTransactionImp;
    }

    @Bean(name = "atomikosTransactionManager", initMethod = "init", destroyMethod = "close")
    public TransactionManager atomikosTransactionManager() throws Throwable {
        final UserTransactionManager userTransactionManager = new UserTransactionManager();
        userTransactionManager.setTransactionTimeout(transactionTimeout);
        userTransactionManager.setStartupTransactionService(false);
        userTransactionManager.setForceShutdown(false);
        return userTransactionManager;
    }

    @Bean(name = { "transactionManager", "jtaTransactionManager" })
    @DependsOn({ "userTransaction", "atomikosTransactionManager" })
    public PlatformTransactionManager transactionManager() throws Throwable {
        final UserTransaction userTransaction = userTransaction();
        final TransactionManager atomikosTransactionManager = atomikosTransactionManager();
        return new JtaTransactionManager(userTransaction, atomikosTransactionManager);
    }
}
Here is the DataSourceConfigurationTAC.java:
@Configuration
@EnableTransactionManagement
public class DataSourceConfigurationTAC {

    @Value("${app.datasource.postgres.driverClassName}")
    String postgresDriverClass;

    @Value("${app.datasource.oracle.dialect}")
    String oracleDialect;

    @Value("${app.datasource.postgres.dialect}")
    String postgresDialect;

    @Value("${app.datasource.danc.borrow_conn_timeout}")
    Integer borrowConnectionTimeout;

    @Value("${app.datasource.danc.max_idle_time}")
    Integer maxIdleTime;

    @Value("${app.datasource.danc.min_pool_size}")
    Integer minPoolSize;

    @Value("${app.datasource.danc.max_pool_size}")
    Integer maxPoolSize;

    /**
     * Maximum amount of seconds that a connection is kept in the pool before
     * it is destroyed automatically. Optional, defaults to 0 (no limit).
     */
    @Value("${app.datasource.tac.max_lifetime:0}")
    Integer maxLifetime;

    /**
     * SQL query or statement used to validate a connection before returning it. Optional.
     */
    @Value("${app.datasource.tac.test_query:#null}")
    String testQuery;

    @Bean(name = "dataSource")
    @Primary
    public DataSource tacDataSource() throws SQLException {
        AtomikosNonXADataSourceBean bean = new AtomikosNonXADataSourceBean();
        bean.setUniqueResourceName(dataSourceName);
        bean.setDriverClassName(OdxcEnvironment.isPostgreSql() ? postgresDriverClass : oracleDriverClass);
        bean.setBorrowConnectionTimeout(borrowConnectionTimeout);
        bean.setMaxIdleTime(maxIdleTime);
        bean.setMinPoolSize(minPoolSize);
        bean.setMaxPoolSize(maxPoolSize);
        bean.setMaxLifetime(maxLifetime);
        bean.setTestQuery(testQuery);
        return bean;
    }

    @Bean(name = { "sessionFactory" })
    @Primary
    public LocalSessionFactoryBean customerEntityManager(
            @Autowired @Qualifier("jtaTransactionManager") PlatformTransactionManager transactionManager,
            @Autowired @Qualifier("dataSource") DataSource tacDataSource) throws Throwable {
        LocalSessionFactoryBean sessionFactoryBean = new LocalSessionFactoryBean();
        sessionFactoryBean.setDataSource(tacDataSource);
        Properties hibernateProperties = new Properties();
        hibernateProperties.setProperty("hibernate.dialect", postgresDialect);
        hibernateProperties.setProperty("hibernate.transaction.jta.platform", AtomikosJtaPlatform.class.getName());
        hibernateProperties.setProperty("javax.persistence.transactionType", "JTA");
        hibernateProperties.setProperty("hibernate.transaction.coordinator_class", "jta");
        hibernateProperties.setProperty("hibernate.jdbc.use_streams_for_binary", Boolean.FALSE.toString());
        hibernateProperties.setProperty("hibernate.id.new_generator_mappings", Boolean.FALSE.toString());
        hibernateProperties.setProperty("hibernate.show_sql", Boolean.FALSE.toString());
        hibernateProperties.setProperty("hibernate.format_sql", Boolean.FALSE.toString());
        hibernateProperties.setProperty("hibernate.use_sql_comments", Boolean.FALSE.toString());
        hibernateProperties.setProperty("hibernate.default_batch_fetch_size", "16");
        hibernateProperties.setProperty("hibernate.cache.use_second_level_cache", Boolean.TRUE.toString());
        hibernateProperties.setProperty("hibernate.cache.region.factory_class",
                "org.hibernate.cache.ehcache.SingletonEhCacheRegionFactory");
        hibernateProperties.setProperty("hibernate.cache.region_prefix", "connnectivity_springboot");
        hibernateProperties.setProperty("hibernate.cache.use_query_cache", Boolean.TRUE.toString());
        hibernateProperties.setProperty("hibernate.generate_statistics", Boolean.FALSE.toString());
        sessionFactoryBean.setHibernateProperties(hibernateProperties);
        sessionFactoryBean.setJtaTransactionManager(transactionManager);
        return sessionFactoryBean;
    }
}
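As mentioned above, here is a hedged sketch of the pool-maintenance settings that, as far as I know, Atomikos exposes on AbstractDataSourceBean (which AtomikosNonXADataSourceBean extends). It mirrors the tacDataSource bean rather than replacing it; the class name, URL, credentials and all numeric values are illustrative assumptions, not the poster's actual settings:

import java.sql.SQLException;
import javax.sql.DataSource;
import com.atomikos.jdbc.nonxa.AtomikosNonXADataSourceBean;

public class IdleConnectionTuningSketch { // hypothetical name

    public DataSource tunedDataSource() throws SQLException {
        AtomikosNonXADataSourceBean bean = new AtomikosNonXADataSourceBean();
        bean.setUniqueResourceName("dataSource");
        bean.setDriverClassName("org.postgresql.Driver");
        bean.setUrl("jdbc:postgresql://localhost:5432/mydb"); // placeholder
        bean.setUser("user");                                 // placeholder
        bean.setPassword("password");                         // placeholder
        bean.setMinPoolSize(5);
        bean.setMaxPoolSize(20);
        bean.setBorrowConnectionTimeout(30); // seconds to wait for a free connection before failing
        bean.setMaxIdleTime(60);             // connections idle longer than this (seconds) become eligible for removal
        bean.setMaxLifetime(1800);           // recycle connections after 30 minutes regardless of activity
        bean.setMaintenanceInterval(20);     // how often (seconds) the pool maintenance thread runs, as far as I know
        bean.setTestQuery("select 1");
        return bean;
    }
}

The idea is that a maxIdleTime shorter than the current 300 seconds, combined with a maintenance interval smaller than that value, should let the pool shrink back toward minPoolSize between bursts; whether this fully addresses the exhaustion still depends on connections actually being returned to the pool.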

Spring boot integration test With Dockerized postgres

Trying to do integration testing using dockerized Postgres.
12:49:19.647 [main] ERROR org.testcontainers.dockerclient.EnvironmentAndSystemPropertyClientProviderStrategy - ping failed with configuration Environment variables, system properties and defaults. Resolved:
dockerHost=unix:///var/run/docker.sock
apiVersion='{UNKNOWN_VERSION}'
registryUrl='https://index.docker.io/v1/'
registryUsername='aequalis'
registryPassword='null'
registryEmail='null'
dockerConfig='DefaultDockerClientConfig[dockerHost=unix:///var/run/docker.sock,registryUsername=aequalis,registryPassword=<null>,registryEmail=<null>,registryUrl=https://index.docker.io/v1/,dockerConfigPath=/home/aequalis/.docker,sslConfig=<null>,apiVersion={UNKNOWN_VERSION},dockerConfig=<null>]'
due to org.rnorth.ducttape.TimeoutException: Timeout waiting for result with exception
org.rnorth.ducttape.TimeoutException: Timeout waiting for result with exception
at org.rnorth.ducttape.unreliables.Unreliables.retryUntilSuccess(Unreliables.java:51)
at org.testcontainers.dockerclient.DockerClientProviderStrategy.ping(DockerClientProviderStrategy.java:190)
at org.testcontainers.dockerclient.EnvironmentAndSystemPropertyClientProviderStrategy.test(EnvironmentAndSystemPropertyClientProviderStrategy.java:42)
at org.testcontainers.dockerclient.DockerClientProviderStrategy.lambda$getFirstValidStrategy$2(DockerClientProviderStrategy.java:113)
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
org.testcontainers.dockerclient.DockerClientProviderStrategy.getFirstValidStrategy(DockerClientProviderStrategy.java:148)
at org.testcontainers.DockerClientFactory.client(DockerClientFactory.java:105)
at org.testcontainers.containers.GenericContainer.<init>(GenericContainer.java:142)
at org.testcontainers.containers.JdbcDatabaseContainer.<init>(JdbcDatabaseContainer.java:45)
at org.testcontainers.containers.PostgreSQLContainer.<init>(PostgreSQLContainer.java:30)
at com.lava.configuration.management.activity.AbstractIntegrationTest.<clinit>(AbstractIntegrationTest.java:21)
at sun.misc.Unsafe.ensureClassInitialized(Native Method)
at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:156)
at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1088)
at java.lang.reflect.Field.getFieldAccessor(Field.java:1069)
at java.lang.reflect.Field.get(Field.java:393)
at org.junit.runners.model.FrameworkField.get(FrameworkField.java:73)
at org.junit.runners.model.TestClass.getAnnotatedFieldValues(TestClass.java:230)
at org.junit.runners.ParentRunner.classRules(ParentRunner.java:255)
at org.junit.runners.ParentRunner.withClassRules(ParentRunner.java:244)
at org.junit.runners.ParentRunner.classBlock(ParentRunner.java:194)
at org.junit.runners.ParentRunner.run(ParentRunner.java:362)
org.rnorth.ducttape.unreliables.Unreliables.lambda$retryUntilSuccess$0(Unreliables.java:41)
at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
at java.util.concurrent.FutureTask.run(FutureTask.java)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.sun.jna.LastErrorException: [13] Permission denied
at org.testcontainers.shaded.org.scalasbt.ipcsocket.UnixDomainSocketLibrary.connect(Native Method)
at org.testcontainers.shaded.org.scalasbt.ipcsocket.UnixDomainSocket.<init>(UnixDomainSocket.java:57)
... 36 common frames omitted
The above error is thrown while connecting to the dockerized Postgres for the integration test. Below is the configuration code used to connect. It seems like a permission issue with Docker.
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = LibraryConfigurationApplication.class, webEnvironment = WebEnvironment.RANDOM_PORT)
@ContextConfiguration(initializers = AbstractIntegrationTest.Initializer.class)
public abstract class AbstractIntegrationTest {

    @ClassRule
    public static PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer("kartoza/postgis:12.0")
            .withDatabaseName("integration-tests-db")
            .withUsername("docker")
            .withPassword("docker");

    static class Initializer
            implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        public void initialize(ConfigurableApplicationContext configurableApplicationContext) {
            TestPropertyValues.of(
                "spring.datasource.url=" + postgreSQLContainer.getJdbcUrl(),
                "spring.datasource.username=" + postgreSQLContainer.getUsername(),
                "spring.datasource.password=" + postgreSQLContainer.getPassword()
            ).applyTo(configurableApplicationContext.getEnvironment());
        }
    }
}
Please help fix this issue.
You can try this; I think it should solve your problem, as it is most likely a missing permission:
https://docs.docker.com/engine/install/linux-postinstall/

Hazelcast needs to connect as a client to the existing cluster instead of as a member

The changes which I made on the server side:
@Bean(name = {"hazelcast"})
public HazelcastInstance hazelcastInstance() {
    ClientConfig clientConfig = new ClientConfig();
    clientConfig.getGroupConfig().setName(integrationSettings.getHazelcastClusterGroupName())
            .setPassword(integrationSettings.getHazelcastClusterGroupPass());
    final ClientNetworkConfig clientNetworkConfig = new ClientNetworkConfig();
    clientNetworkConfig.addAddress("127.0.0.1:6701");
    clientConfig.setNetworkConfig(clientNetworkConfig);
    clientConfig.setInstanceName("INTEGRATION_INSTANCE");
    final String hazelcastEnterpriseLicenseKey = null;
    if (hazelcastEnterpriseLicenseKey != null) {
        clientConfig.setLicenseKey(hazelcastEnterpriseLicenseKey);
    }
    return HazelcastClient.newHazelcastClient(clientConfig);
}
I will be getting my groupname and password from my property file.
My client side code:
ClientConfig clientConfig = new ClientConfig();
clientConfig.getGroupConfig().setName(hazelcastGroupName).setPassword(hazelcastGroupPwd);
clientConfig.getNetworkConfig().addAddress(serverAddress);
hazelcastInstance = HazelcastClient.newHazelcastClient(clientConfig);
My error log:
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.hazelcast.core.HazelcastInstance]: Factory method 'hazelcastInstance' threw exception; nested exception is java.lang.IllegalStateException: Unable to connect to any address in the config! The following addresses were tried: [[127.0.0.1]:6701]
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:189)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:588)
... 37 more
Caused by: java.lang.IllegalStateException: Unable to connect to any address in the config! The following addresses were tried: [[127.0.0.1]:6701]
at com.hazelcast.client.spi.impl.ClusterListenerSupport.connectToCluster(ClusterListenerSupport.java:178)
at com.hazelcast.client.spi.impl.ClientClusterServiceImpl.start(ClientClusterServiceImpl.java:189)
at com.hazelcast.client.impl.HazelcastClientInstanceImpl.start(HazelcastClientInstanceImpl.java:404)
at com.hazelcast.client.HazelcastClientManager.newHazelcastClient(HazelcastClientManager.java:78)
at com.hazelcast.client.HazelcastClient.newHazelcastClient(HazelcastClient.java:72)
at com.zafin.zrpe.integration.config.ZrpeIntegrationConfiguration.hazelcastInstance(ZrpeIntegrationConfiguration.java:85)
at com.zafin.zrpe.integration.config.ZrpeIntegrationConfiguration$$EnhancerBySpringCGLIB$$7af6798e.CGLIB$hazelcastInstance$6(<generated>)
at com.zafin.zrpe.integration.config.ZrpeIntegrationConfiguration$$EnhancerBySpringCGLIB$$7af6798e$$FastClassBySpringCGLIB$$25f010cb.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:358)
at com.zafin.zrpe.integration.config.ZrpeIntegrationConfiguration$$EnhancerBySpringCGLIB$$7af6798e.hazelcastInstance(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:162)
... 38 more
I need to connect my Hazelcast as a client, but this bean exception is failing the deployments. Is there any other way of doing it?
Looking at your code, you are creating a Hazelcast client on both the server side and the client side. In the server-side code, please create a Hazelcast server member instance by passing a "Config" object and not a "ClientConfig", more like:
@Bean
public HazelcastInstance hazelcastInstance() throws Exception {
    Config cfg = new Config();
    ...
    ...
    HazelcastInstance instance = Hazelcast.newHazelcastInstance(cfg);
    return instance;
}
Then the Hazelcast client can connect to the Hazelcast server member. You also need to ensure the server member is started before the client connects to it.
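For illustration, a sketch of what that member-side bean might look like with the Hazelcast 3.x core API (imports com.hazelcast.config.Config and com.hazelcast.core.Hazelcast). The group settings, instance name and port 6701 are copied from the question and are only assumptions about the actual setup:

@Bean(name = {"hazelcast"})
public HazelcastInstance hazelcastInstance() {
    Config config = new Config();
    config.setInstanceName("INTEGRATION_INSTANCE");
    config.getGroupConfig()
            .setName(integrationSettings.getHazelcastClusterGroupName())
            .setPassword(integrationSettings.getHazelcastClusterGroupPass());
    // Make the member listen on the address the client tries to reach (127.0.0.1:6701).
    config.getNetworkConfig().setPort(6701).setPortAutoIncrement(false);
    return Hazelcast.newHazelcastInstance(config);
}

The client code shown above should then connect unchanged, as long as this member is up before the client starts.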

Invalid mongo configuration, either uri or host/port/credentials must be specified

I'm getting this exception:
Caused by: java.lang.IllegalStateException: Invalid mongo configuration, either uri or host/port/credentials must be specified
at org.springframework.boot.autoconfigure.mongo.MongoProperties.createMongoClient(MongoProperties.java:207)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration.mongo(MongoAutoConfiguration.java:73)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration$$EnhancerBySpringCGLIB$$15f9b896.CGLIB$mongo$1(<generated>)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration$$EnhancerBySpringCGLIB$$15f9b896$$FastClassBySpringCGLIB$$c0338f6a.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:356)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration$$EnhancerBySpringCGLIB$$15f9b896.mongo(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:162)
... 25 common frames omitted
Here is my application.yml content:
spring:
  data:
    mongodb:
      uri: mongodb://develop:d3VeL0p$@<my_host>:27017/SHAM
Here is my configuration class:
package com.me.service.testservice.config;

import com.mongodb.MongoClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration
@EnableMongoRepositories(basePackages = {"com.me.service.testservice.repository"}, considerNestedRepositories = true)
public class SpringMongoConfiguration {

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(new MongoClient("<my_host>"), "SHAM");
    }
}
Now I'm getting this stack trace on startup (the app no longer fails to start); it looks like the user develop doesn't have the right to connect:
Caused by: com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server phelbwlabect003.karmalab.net:27017. The full response is { "ok" : 0.0, "errmsg" : "Authentication failed.", "code" : 18, "codeName" : "AuthenticationFailed" }
at com.mongodb.connection.CommandHelper.createCommandFailureException(CommandHelper.java:170)
at com.mongodb.connection.CommandHelper.receiveCommandResult(CommandHelper.java:123)
at com.mongodb.connection.CommandHelper.executeCommand(CommandHelper.java:32)
at com.mongodb.connection.SaslAuthenticator.sendSaslStart(SaslAuthenticator.java:117)
at com.mongodb.connection.SaslAuthenticator.access$000(SaslAuthenticator.java:37)
at com.mongodb.connection.SaslAuthenticator$1.run(SaslAuthenticator.java:50)
... 9 common frames omitted
You are mixing the URI-style connection settings with the individual-properties-style settings.
Either use
spring:
  data:
    mongodb:
      host: localhost
      port: 27017
      database: SHAM_TEST
      username: develop
      password: pass
Or
spring:
  data:
    mongodb:
      uri: mongodb://develop:pass@localhost:27017/SHAM_TEST
If your project has more than one application.properties/application.yml file, they can conflict if one has the URI and another has host/port/credentials.
This results in the error "Invalid mongo configuration, either uri or host/port/credentials must be specified".
To avoid the conflict, make sure you use either the URI or host/port/credentials in all of them.
The issue was that authentication-database was missing.
Here is the working configuration now:
spring:
  data:
    mongodb:
      host: <my_host>
      username: develop
      password: d3VeL0p$
      port: 27017
      database: SHAM
      repositories:
        enabled: true
      authentication-database: admin
If you are using an application.properties file to store your configuration, use the following structure.
spring.data.mongodb.host = localhost
spring.data.mongodb.port = 27017
spring.data.mongodb.database = SHAM_TEST
spring.data.mongodb.username = develop
spring.data.mongodb.password = pass
This issue appears with Mongo version 4.x; Spring/Spring Boot supports version 2.x only, so when we try to connect to Mongo 4 this error appears.
Downgrade your Mongo from 4.x to 3.x or lower.
If not, then change these placeholders.
Change from:
spring.data.mongodb.host= localhost
spring.data.mongodb.port=27017
spring.data.mongodb.database= your db name
spring.data.mongodb.username= something
spring.data.mongodb.password= ***
-to-
mongo.replica.hosts=localhost:27017
mongo.dbname= your db name
mongo.username= something
mongo.password= ***

mongoDB async driver java - unable to connect with authentication

Recently we've updated our MongoDB to run on port 35489 and also added authentication and roles to the database.
I've granted the readWrite and readWriteAnyDatabase roles to a user and can successfully connect from our C# code by referencing the connection string in web.config.
My web.config:
<add key="MongoConnectionString" value="mongodb://readWriteUser:********@192.168.1.225:35489/admin" />
Now, my problem is that I'm unable to connect to MongoDB from the Java async driver, and it is throwing an exception like the one below.
Exception
Exception in monitor thread while connecting to server localhost:35489
com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=null, userName='AdminAllDatabases', source='admin', password=, mechanismProperties={}}
at com.mongodb.connection.SaslAuthenticator.authenticate(SaslAuthenticator.java:61)
at com.mongodb.connection.DefaultAuthenticator.authenticate(DefaultAuthenticator.java:32)
at com.mongodb.connection.InternalStreamConnectionInitializer.authenticateAll(InternalStreamConnectionInitializer.java:99)
at com.mongodb.connection.InternalStreamConnectionInitializer.initialize(InternalStreamConnectionInitializer.java:44)
at com.mongodb.connection.InternalStreamConnection.open(InternalStreamConnection.java:115)
at com.mongodb.connection.DefaultServerMonitor$ServerMonitorRunnable.run(DefaultServerMonitor.java:127)
at java.lang.Thread.run(Unknown Source)
Caused by: com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server localhost:35489. The full response is { "ok" : 0.0, "code" : 18, "errmsg" : "Authentication failed." }
at com.mongodb.connection.CommandHelper.createCommandFailureException(CommandHelper.java:170)
at com.mongodb.connection.CommandHelper.receiveCommandResult(CommandHelper.java:123)
at com.mongodb.connection.CommandHelper.executeCommand(CommandHelper.java:32)
at com.mongodb.connection.SaslAuthenticator.sendSaslContinue(SaslAuthenticator.java:99)
at com.mongodb.connection.SaslAuthenticator.authenticate(SaslAuthenticator.java:58)
... 6 common frames omitted
My async Java driver code:
public static final String DEFAULT_URI = "mongodb://readWriteUser:******@localhost:35489/";

public static synchronized ConnectionString getConnectionString() {
    if (connectionString == null) {
        connectionString = new ConnectionString(DEFAULT_URI);
    }
    return connectionString;
}

public static synchronized MongoClient getMongoClient() {
    if (mongoClient == null) {
        mongoClient = MongoClients.create(getConnectionString());
    }
    return mongoClient;
}
We've tried both localhost and the IP address, but nothing seems to work.
Can anyone suggest the right way to do this?