Dataproc trying to connect to Postgres through JDBC, missing permissions - postgresql

Thanks in advance...
I want to programmatically connect to and write to a PostgreSQL instance running on Cloud SQL using the JDBC API. I have used the following jars:
postgresql
postgres-socket-factory
postgres-socket-factory-1.0.11-jar-with-dependencies.jar
I am running a Dataproc job that tries to connect using the jars above, but I get the following exception:
2019-04-01 11:05:03.998 IST
Something unusual has occurred to cause the driver to fail. Please report this exception.
at org.postgresql.Driver.connect(Driver.java:277)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:270)
at rdsConnector$.getConnection(rdsConnector.scala:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.RuntimeException: Unable to retrieve information about Cloud SQL instance [projectID:us-east1:dB]
at com.google.cloud.sql.core.SslSocketFactory.obtainInstanceMetadata(SslSocketFactory.java:459)
at com.google.cloud.sql.core.SslSocketFactory.fetchInstanceSslInfo(SslSocketFactory.java:333)
at com.google.cloud.sql.core.SslSocketFactory.getInstanceSslInfo(SslSocketFactory.java:313)
at com.google.cloud.sql.core.SslSocketFactory.createAndConfigureSocket(SslSocketFactory.java:194)
at com.google.cloud.sql.core.SslSocketFactory.create(SslSocketFactory.java:160)
at com.google.cloud.sql.postgres.SocketFactory.createSocket(SocketFactory.java:96)
at org.postgresql.core.PGStream.<init>(PGStream.java:62)
at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:91)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:192)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:195)
at org.postgresql.Driver.makeConnection(Driver.java:454)
at org.postgresql.Driver.connect(Driver.java:256)
... 11 more
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
2019-04-01 11:05:03.000 IST
User class threw exception: org.postgresql.util.PSQLException: Something unusual has occurred to cause the driver to fail. Please report this exception.
I understand this to be a permission issue, but since I am using Dataproc to connect to Postgres, what permission is missing? If I were running from a local laptop, I would set GOOGLE_APPLICATION_CREDENTIALS to a JSON key file. But what is the process in the case of Dataproc?

The JDBC SocketFactory uses the Application Default Credentials strategy for accessing account credentials.
For Cloud Dataproc, a default service account is provided for you at [project-number]-compute@developer.gserviceaccount.com. If you grant this account the "Cloud SQL Client" IAM role, the JDBC SocketFactory will use it to authenticate, and you will be able to connect to your Cloud SQL instance.
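For example, the role can be granted with gcloud (the project ID and project number are placeholders):
gcloud projects add-iam-policy-binding <your-project-id> \
    --member="serviceAccount:<project-number>-compute@developer.gserviceaccount.com" \
    --role="roles/cloudsql.client"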

Just leaving some further notes for future visitors:
If the error you're getting is like:
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Insufficient Permission",
    "reason" : "insufficientPermissions"
  } ],
  "message" : "Request had insufficient authentication scopes.",
  "status" : "PERMISSION_DENIED"
}
then it could be caused by issues with the Cloud API access scopes of the VM instances. One way to fix this is to add the sql-admin scope when creating the Dataproc cluster. For example:
gcloud dataproc clusters create <your-cluster-name> \
    --region=<your-region> \
    --zone=<your-zone> \
    --scopes=https://www.googleapis.com/auth/sqlservice.admin \
    ...
Another way is to edit the access scopes directly on the cluster's VM instances; a sketch of that approach follows.
See the gcloud documentation for details.
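A sketch of the second approach (instance name and zone are placeholders; note that an instance's scopes can only be changed while it is stopped):
gcloud compute instances stop <worker-instance> --zone=<your-zone>
gcloud compute instances set-scopes <worker-instance> \
    --zone=<your-zone> \
    --scopes=https://www.googleapis.com/auth/sqlservice.admin
gcloud compute instances start <worker-instance> --zone=<your-zone>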

Related

Keycloak - Connection Null issue when spun up on AWS ECS

I have checked that all relevant parameters are being parsed correctly, and I am able to see them in my AWS ECS service. When the service is spun up I get the below error:
WARN  [org.jboss.jca.core.connectionmanager.pool.strategy.OnePool] (ServerService Thread Pool -- 60) IJ000604: Throwable while attempting to get a new connection: null: javax.resource.ResourceException: IJ031084: Unable to create connection
which is then followed by this error:
Caused by: org.postgresql.util.PSQLException: FATAL: database "keycloak" does not exist
I am unsure if the two are related. I have validated the relevant parameters and tried to reproduce the same issue locally.
Any help or advice would be appreciated.
TYIA

How to use the spark-bigquery-connector in an existing Spark environment (not with Google Dataproc)

I am working on creating a data pipeline that takes data from various conventional databases and CSV files. It uses PySpark for preprocessing and then writes the resulting dataframe into a BigQuery table.
I have copied the jar file gs://spark-lib/bigquery/spark-bigquery-with-dependencies_2.12-0.23.1.jar to the $SPARK_HOME/jars folder, and while creating the Spark session I have specified the same package in spark.jars.packages.
Please give some suggestions, as all the blogs I can find use Dataproc as the example.
spark = SparkSession \
    .builder \
    .enableHiveSupport() \
    .appName('data_load') \
    .master('yarn') \
    .config('spark.jars.packages', 'com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.23.1') \
    .getOrCreate()
Scala Version = 2.12.15
While writing to BigQuery, I am getting an error related to authentication.
The BigQuery Connection API is enabled.
df.write.format('bigquery') \
    .option('table', big_table_id) \
    .save()
ERROR:
Py4JJavaError: An error occurred while calling o59.save.
: com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryException: Request had insufficient authentication scopes.
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getTable(HttpBigQueryRpc.java:286)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryImpl$18.call(BigQueryImpl.java:746)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryImpl$18.call(BigQueryImpl.java:743)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.RetryHelper.run(RetryHelper.java:76)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryImpl.getTable(BigQueryImpl.java:742)
at com.google.cloud.bigquery.connector.common.BigQueryClient.getTable(BigQueryClient.java:107)
at com.google.cloud.spark.bigquery.BigQueryInsertableRelation.getTable$lzycompute(BigQueryInsertableRelation.scala:67)
at com.google.cloud.spark.bigquery.BigQueryInsertableRelation.getTable(BigQueryInsertableRelation.scala:67)
at com.google.cloud.spark.bigquery.BigQueryInsertableRelation.exists$lzycompute(BigQueryInsertableRelation.scala:53)
at com.google.cloud.spark.bigquery.BigQueryInsertableRelation.exists(BigQueryInsertableRelation.scala:49)
at com.google.cloud.spark.bigquery.BigQueryRelationProvider.createRelation(BigQueryRelationProvider.scala:114)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:110)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:128)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:848)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
GET https://www.googleapis.com/bigquery/v2/projects/project212/datasets/NYSE_datasets/tables/NYSE_table?prettyPrint=false
{
"code" : 403,
"details" : [ {
"#type" : "type.googleapis.com/google.rpc.ErrorInfo",
"reason" : "ACCESS_TOKEN_SCOPE_INSUFFICIENT"
} ],
"errors" : [ {
"domain" : "global",
"message" : "Insufficient Permission",
"reason" : "insufficientPermissions"
} ],
"message" : "Request had insufficient authentication scopes.",
"status" : "PERMISSION_DENIED"
}
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
at com.google.cloud.spark.bigquery.repackaged.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getTable(HttpBigQueryRpc.java:284)
You have to use a service account to authenticate outside Dataproc, as described in the spark-bigquery-connector documentation:
Use a service account JSON key and GOOGLE_APPLICATION_CREDENTIALS as described here.
Credentials can also be provided explicitly, either as a parameter or from Spark runtime configuration. They can be passed in as a base64-encoded string directly, or as a file path that contains the credentials (but not both).
Example: spark.read.format("bigquery").option("credentials", "<SERVICE_ACCOUNT_JSON_IN_BASE64>")
or
spark.conf.set("credentials", "<SERVICE_ACCOUNT_JSON_IN_BASE64>")
Alternatively, specify the credentials file name.
spark.read.format("bigquery").option("credentialsFile","</path/to/key/file>")
or
spark.conf.set("credentialsFile","</path/to/key/file>")
Another alternative to passing the credentials is to pass the access token used for authenticating the API calls to the Google Cloud Platform APIs. You can get the access token by running gcloud auth application-default print-access-token.
spark.read.format("bigquery").option("gcpAccessToken","<acccess token>")
or
spark.conf.set("gcpAccessToken","<access-token>")

How do I manually install the Google Cloud JDBC driver for the Flyway CLI?

Looking at this reference: https://docs.kony.com/konylibrary/konyfabric/kony_fabric_manual_install_guide/Content/FlywayNew.htm
It says the Google Cloud SQL drivers need to be installed manually for the Flyway CLI, but how do I install them manually? I can't find any documentation on it.
EDIT:
I added this to my flyway.conf: flyway.jarDirs=/Users/my/flyway
Then I downloaded the driver into that folder:
mvn dependency:get -Ddest=/Users/my/flyway -Dartifact=com.google.cloud.sql:postgres-socket-factory:1.3.0
but when I try to use it I get this error:
The SocketFactory class provided com.google.cloud.sql.postgres.SocketFactory could not be instantiated.
EDIT: this is the debug output when running flyway baseline against my DB
Flyway Community Edition 7.10.0 by Redgate
DEBUG: AWS SDK available: false
DEBUG: Google Cloud Storage available: true
DEBUG: Scanning for filesystem resources at 'sql'
ERROR: Skipping filesystem location:sql (not found).
ERROR: Unexpected error
org.flywaydb.core.internal.exception.FlywaySqlException:
Unable to obtain connection from database (jdbc:postgresql:///mydb ?unixSocketPath=/var/run/cloudsql/ &cloudSqlInstance=my-gcp-project:us-central1:my-cloudsql-instance &socketFactory=com.google.cloud.sql.postgres.SocketFactory &user=jenkins-flyway &password=******************************************) for user 'null': The SocketFactory class provided com.google.cloud.sql.postgres.SocketFactory could not be instantiated.
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
SQL State : 08006
Error Code : 0
Message : The SocketFactory class provided com.google.cloud.sql.postgres.SocketFactory could not be instantiated.
at org.flywaydb.core.internal.jdbc.JdbcUtils.openConnection(JdbcUtils.java:71)
at org.flywaydb.core.internal.jdbc.JdbcConnectionFactory.<init>(JdbcConnectionFactory.java:68)
at org.flywaydb.core.Flyway.execute(Flyway.java:510)
at org.flywaydb.core.Flyway.baseline(Flyway.java:406)
at org.flywaydb.commandline.Main.executeOperation(Main.java:226)
at org.flywaydb.commandline.Main.main(Main.java:149)
Caused by: org.postgresql.util.PSQLException: The SocketFactory class provided com.google.cloud.sql.postgres.SocketFactory could not be instantiated.
at org.postgresql.core.SocketFactoryFactory.getSocketFactory(SocketFactoryFactory.java:43)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:182)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:223)
at org.postgresql.Driver.makeConnection(Driver.java:465)
at org.postgresql.Driver.connect(Driver.java:264)
at org.flywaydb.core.internal.jdbc.DriverDataSource.getConnectionFromDriver(DriverDataSource.java:268)
at org.flywaydb.core.internal.jdbc.DriverDataSource.getConnection(DriverDataSource.java:232)
at org.flywaydb.core.internal.jdbc.JdbcUtils.openConnection(JdbcUtils.java:55)
... 5 more
Caused by: java.lang.ClassNotFoundException: com.google.cloud.sql.postgres.SocketFactory
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:636)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:182)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:519)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:375)
at org.postgresql.util.ObjectFactory.instantiate(ObjectFactory.java:46)
at org.postgresql.core.SocketFactoryFactory.getSocketFactory(SocketFactoryFactory.java:39)
... 13 more
This is for Postgres on Google Cloud, yes?
The intention is that you put driver JARs into the drivers folder inside the Flyway product (not 100% sure for v4.0, as that's very elderly now; 7.11 is current). jarDirs should also work, but its intended purpose is for Java-based migrations.
However, it looks like a problem within the driver. Are you making sure to put all dependencies of the driver there too? Could you provide a full debug log (that is, flyway migrate -X) with exception details so we could take a look?
EDIT: It's definitely a problem within the driver, looking at that debug log - it's not able to instantiate a class internal to the driver.
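Since the debug log ends in a ClassNotFoundException for the socket factory class, one thing to try is fetching the jar-with-dependencies variant of the artifact so that no transitive jars are missing. A sketch, reusing the path and version from the question (the classifier syntax is group:artifact:version:type:classifier):
# Sketch: copy the socket factory with its dependencies bundled into jarDirs
mvn dependency:copy \
    -Dartifact=com.google.cloud.sql:postgres-socket-factory:1.3.0:jar:jar-with-dependencies \
    -DoutputDirectory=/Users/my/flyway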

Error: FATAL: role "login" does not exist

Some context: I am a novice engineer developing a Grails 3.3.3 app that requires a database connection. Currently, there is an Amazon RDS instance created and configured for this purpose. A proper connection from the application to the db has not been established.
I've confirmed my security configuration is correct, as well as my credentials, since I am able to connect from the shell using the standard psql command.
Unless I've missed something, the RDS troubleshooting guide provided insight but little help.
I've also seen this, which is helpful; however, I (seemingly) don't have operating-system privileges, as AWS restricts this type of access.
My issue seems to stem from either PostgreSQL or RDS, and not specifically from the application.yml configuration in the Grails project. Furthermore, I have the necessary dependencies installed for the connection. application.yml is included below:
dataSource:
    pooled: true
    jmxExport: true
    logSql: true
    formatSql: true
    driverClassName: org.postgresql.Driver
    dialect: org.hibernate.dialect.PostgreSQLDialect
    dbCreate: "update"
    url: jdbc:postgresql:endpoint.rds.amazonaws.com:port/database
    username: "login"
    password: "password"
I have examined the Postgres roles and databases. Furthermore, I have created and deleted roles, as well as granted and revoked various high-level permissions for these roles. I have been unable to log in using any role in the role table; the error org.postgresql.util.PSQLException: FATAL: role "login" does not exist indicates that it does not exist; however, I have contrary evidence:
postgres=> \du
                                        List of roles
    Role name    |                         Attributes                          |             Member of
-----------------+-------------------------------------------------------------+-------------------------------------
 login           | Create role, Create DB                                     +| {rds_superuser}
                 | Password valid until infinity                               |
 rds_replication | Cannot login                                                | {}
 rds_superuser   | Cannot login                                                | {rds_replication,pg_signal_backend}
 rdsadmin        | Superuser, Create role, Create DB, Replication, Bypass RLS+ | {}
                 | Password valid until infinity                               |
 rdsrepladmin    | No inheritance, Cannot login, Replication                   | {}
The provided stacktrace with the relevant first error:
| Running application...
2018-06-22 12:12:20.343 ERROR --- [ main] org.postgresql.Driver : Connection error:
org.postgresql.util.PSQLException: FATAL: role "login" does not exist
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
at org.postgresql.core.v3.QueryExecutorImpl.readStartupMessages(QueryExecutorImpl.java:2566)
at org.postgresql.core.v3.QueryExecutorImpl.<init>(QueryExecutorImpl.java:131)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:210)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:195)
at org.postgresql.Driver.makeConnection(Driver.java:452)
at org.postgresql.Driver.connect(Driver.java:254)
at org.apache.tomcat.jdbc.pool.PooledConnection.connectUsingDriver(PooledConnection.java:310)
at org.apache.tomcat.jdbc.pool.PooledConnection.connect(PooledConnection.java:203)
at org.apache.tomcat.jdbc.pool.ConnectionPool.createConnection(ConnectionPool.java:735)
at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:667)
at org.apache.tomcat.jdbc.pool.ConnectionPool.init(ConnectionPool.java:482)
at org.apache.tomcat.jdbc.pool.ConnectionPool.<init>(ConnectionPool.java:154)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.pCreatePool(DataSourceProxy.java:118)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.createPool(DataSourceProxy.java:107)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection(DataSourceProxy.java:131)
at org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy.afterPropertiesSet(LazyConnectionDataSourceProxy.java:162)
at org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy.<init>(LazyConnectionDataSourceProxy.java:106)
at org.grails.datastore.gorm.jdbc.connections.DataSourceConnectionSourceFactory.proxy(DataSourceConnectionSourceFactory.java:95)
at org.grails.datastore.gorm.jdbc.connections.DataSourceConnectionSourceFactory.create(DataSourceConnectionSourceFactory.java:88)
at org.grails.datastore.gorm.jdbc.connections.CachedDataSourceConnectionSourceFactory.create(CachedDataSourceConnectionSourceFactory.java:37)
at org.grails.orm.hibernate.connections.AbstractHibernateConnectionSourceFactory.create(AbstractHibernateConnectionSourceFactory.java:38)
at org.grails.orm.hibernate.connections.AbstractHibernateConnectionSourceFactory.create(AbstractHibernateConnectionSourceFactory.java:23)
at org.grails.datastore.mapping.core.connections.AbstractConnectionSourceFactory.create(AbstractConnectionSourceFactory.java:64)
at org.grails.datastore.mapping.core.connections.AbstractConnectionSourceFactory.create(AbstractConnectionSourceFactory.java:52)
at org.grails.datastore.mapping.core.connections.ConnectionSourcesInitializer.create(ConnectionSourcesInitializer.groovy:24)
at org.grails.orm.hibernate.HibernateDatastore.<init>(HibernateDatastore.java:204)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrConstructorNewInstance(ReflectiveInterceptor.java:1076)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:142)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:122)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:271)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1193)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1095)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:513)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:351)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:108)
at org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:648)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:145)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1193)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1095)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getSingletonFactoryBeanForTypeCheck(AbstractAutowireCapableBeanFactory.java:923)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:804)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:558)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:432)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:395)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:220)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.findAutowireCandidates(DefaultListableBeanFactory.java:1267)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1101)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1066)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:835)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:741)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:467)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1173)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1067)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:513)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.context.support.PostProcessorRegistrationDelegate.registerBeanPostProcessors(PostProcessorRegistrationDelegate.java:225)
at org.springframework.context.support.AbstractApplicationContext.registerBeanPostProcessors(AbstractApplicationContext.java:703)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:528)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:122)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:693)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:360)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:303)
at grails.boot.GrailsApp.run(GrailsApp.groovy:84)
at grails.boot.GrailsApp.run(GrailsApp.groovy:393)
at grails.boot.GrailsApp.run(GrailsApp.groovy:380)
at grails.boot.GrailsApp$run.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:136)
at charity.navigator.Application.main(Application.groovy:8)
It's fixed! I realized the dataSource.url connection string was misconfigured; it needs two forward slashes.
jdbc:postgresql://endpoint.rds.amazonaws.com:port/database
The application defaults to a localhost data source when it can't interpret the URL string. I realized this when the application tried to connect to a local psql instance that was not running, causing the error: Unable to create initial connections of pool. org.postgresql.util.PSQLException: Connection to localhost:5432 refused. When it was running, my local psql did not have the correct roles, hence the "role does not exist" error.
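For reference, the relevant part of the corrected application.yml (endpoint, port, and database remain the question's placeholders):
dataSource:
    driverClassName: org.postgresql.Driver
    # note the double slash after jdbc:postgresql:
    url: jdbc:postgresql://endpoint.rds.amazonaws.com:port/database
    username: "login"
    password: "password"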
Are you using the "postgres-extensions" plugin by any chance? Yesterday, I ran into the same exact issue and found that rolling back the version of this plugin to 4.6.1 did the trick. For some reason, the newer versions of the plugin do not play nice with Grails 3.
To rollback, change the line in build.gradle that adds the plugin to:
compile 'org.grails.plugins:postgresql-extensions:4.6.1'
Hope this helps!

MongoDB: PLAIN method not supported

I'm getting this message when I try to deploy my WAR artifact. My app uses Hibernate OGM, and it tries to build its persistence context when deployed. The message I'm getting is:
org.hibernate.service.spi.ServiceException: OGM000071: Unable to start datatore provider
Caused by: org.hibernate.service.spi.ServiceException: OGM000071: Unable to start datatore provider
Caused by: org.hibernate.HibernateException: OGM001214: Unable to connect to MongoDB instance: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=mongo:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=PLAIN, userName='living', source='lvdb', password=, mechanismProperties={}}}, caused by {com.mongodb.MongoCommandException: Command failed with error 2: 'Unsupported mechanism PLAIN' on server mongo:27017. The full response is { "supportedMechanisms" : ["MONGODB-CR", "MONGODB-X509", "SCRAM-SHA-1"], "ok" : 0.0, "errmsg" : "Unsupported mechanism PLAIN", "code" : 2, "codeName" : "BadValue" }}}]
Caused by: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=mongo:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=PLAIN, userName='living', source='lvdb', password=, mechanismProperties={}}}, caused by {com.mongodb.MongoCommandException: Command failed with error 2: 'Unsupported mechanism PLAIN' on server mongo:27017. The full response is { "supportedMechanisms" : ["MONGODB-CR", "MONGODB-X509", "SCRAM-SHA-1"], "ok" : 0.0, "errmsg" : "Unsupported mechanism PLAIN", "code" : 2, "codeName" : "BadValue" }}}]
What do I need to do in order to use the other mechanisms?
You'd have to specify the property hibernate.ogm.mongodb.authentication_mechanism (see here).
Which is the authentication mechanism you'd want to use? We may not support it yet, but if you let us know about your preference, we can try and look into making it work quickly.
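For example, in persistence.xml (a sketch; the SCRAM_SHA_1 value is an assumption based on the mechanism names the property accepts):
<!-- Sketch: select SCRAM-SHA-1 explicitly instead of PLAIN -->
<property name="hibernate.ogm.mongodb.authentication_mechanism" value="SCRAM_SHA_1"/>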
For MongoDB, the default supported mechanism is SCRAM-SHA-1, so you need to specify this mechanism when connecting. I saw the same error when connecting to MongoDB through Spring Data. Since I hadn't made any special configuration for Mongo, the first and obvious option for me to try was the PLAIN authentication mechanism; I finally resolved it using SCRAM-SHA-1.
For Spring-data, code looked something like this:
MongoCredential mongoCredential = MongoCredential.createScramSha1Credential(mongoUser, mongoDB, mongoPass.toCharArray());
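Expanded into a self-contained sketch (host, port, and credential values are placeholders based on the question; this uses the legacy 3.x MongoClient constructor that takes a credential list):
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import java.util.Collections;

public class MongoScramSha1Example {
    public static void main(String[] args) {
        // Build a SCRAM-SHA-1 credential; user 'living' and source db 'lvdb'
        // mirror the question, the password is a placeholder
        MongoCredential credential = MongoCredential.createScramSha1Credential(
                "living", "lvdb", "secret".toCharArray());
        // Connect with the explicit credential so the driver never tries PLAIN
        MongoClient client = new MongoClient(
                new ServerAddress("mongo", 27017), Collections.singletonList(credential));
        client.close();
    }
}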
Hope this helps.