Redshift-Postgres RDS federated query: authentication method 10 not supported

The VPC is configured, and the secret is stored in Secrets Manager with the correct policy attached to the Redshift cluster.
I created the external schema using:
CREATE EXTERNAL SCHEMA schema_ext
FROM POSTGRES
DATABASE 'db' SCHEMA 'schema'
URI 'rds.some_symbols.eu-west-1.rds.amazonaws.com' PORT 5432
IAM_ROLE 'arn:aws:iam::999999999999:role/redshift-iam-role'
SECRET_ARN 'arn:aws:secretsmanager:eu-west-1:999999999999:secret:some-secret-some-symbols';
But when I try to query a table in this schema, I get this error:
SQL Error [XX000]: ERROR:
-----------------------------------------------
error: authentication method 10 not supported
code: 25300
context:
query: 0
location: pgclient.cpp:535
process: padbmaster [pid=2022]
-----------------------------------------------
The details for this error are as follows:
org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [XX000]: ERROR:
-----------------------------------------------
error: authentication method 10 not supported
code: 25300
context:
query: 0
location: pgclient.cpp:535
process: padbmaster [pid=2022]
-----------------------------------------------
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:133)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:575)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$1(SQLQueryJob.java:484)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:172)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:491)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:878)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3526)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:118)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:172)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:116)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4868)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: com.amazon.redshift.util.RedshiftException: ERROR:
-----------------------------------------------
error: authentication method 10 not supported
code: 25300
context:
query: 0
location: pgclient.cpp:535
process: padbmaster [pid=2022]
-----------------------------------------------
at com.amazon.redshift.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2601)
at com.amazon.redshift.core.v3.QueryExecutorImpl.processResultsOnThread(QueryExecutorImpl.java:2269)
at com.amazon.redshift.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1880)
at com.amazon.redshift.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1872)
at com.amazon.redshift.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:368)
at com.amazon.redshift.jdbc.RedshiftStatementImpl.executeInternal(RedshiftStatementImpl.java:514)
at com.amazon.redshift.jdbc.RedshiftStatementImpl.execute(RedshiftStatementImpl.java:435)
at com.amazon.redshift.jdbc.RedshiftStatementImpl.executeWithFlags(RedshiftStatementImpl.java:376)
at com.amazon.redshift.jdbc.RedshiftStatementImpl.executeCachedSql(RedshiftStatementImpl.java:362)
at com.amazon.redshift.jdbc.RedshiftStatementImpl.executeWithFlags(RedshiftStatementImpl.java:339)
at com.amazon.redshift.jdbc.RedshiftStatementImpl.execute(RedshiftStatementImpl.java:329)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:329)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.lambda$0(JDBCStatementImpl.java:131)
at org.jkiss.dbeaver.utils.SecurityManagerUtils.wrapDriverActions(SecurityManagerUtils.java:94)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:131)
... 12 more
I tried running the same query from the query editor in the AWS Console for Redshift. The error seems to be the same:
ERROR: ----------------------------------------------- error: authentication method 10 not supported code: 25300 context: query: 0 location: pgclient.cpp:535 process: padbmaster [pid=12588] -----------------------------------------------
I tried updating the JDBC client drivers, to no effect.
Maybe the problem is the custom KMS key used to encrypt the secret, but a colleague who understands how it works says that "it decrypts first and only then goes for authorization".
What should I do to avoid this error?

For now, there is no answer in the Redshift documentation. "Authentication method 10" is SCRAM-SHA-256 (authentication request code 10, AuthenticationSASL, in the PostgreSQL wire protocol), which is the default password_encryption since PostgreSQL 14, and the Redshift federated-query client apparently does not support it. What helped was explicitly switching the Redshift user's password encryption to MD5 on the RDS side:
set password_encryption = 'md5';
ALTER ROLE ... PASSWORD ...;
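Spelled out, a minimal sketch with a hypothetical role name (federated_user) and a placeholder password; run it on the RDS PostgreSQL instance as a user allowed to alter the role:
-- Make the next password assignment hash with MD5 instead of SCRAM-SHA-256
set password_encryption = 'md5';
show password_encryption;  -- confirm the session setting took effect
-- Re-setting the password stores it re-hashed with the current password_encryption
ALTER ROLE federated_user PASSWORD 'placeholder-password';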
Also, don't forget to run ANALYZE after dropping and recreating an object on the RDS side; a SELECT from Redshift against fresh objects can otherwise return an error, as shown below.
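For example (the table name is a placeholder):
-- After recreating an object on the RDS side, refresh its statistics
ANALYZE some_schema.fresh_table;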

Related

LIQUIBASE: [Amazon](500310) Invalid operation: terminating connection due to administrator command

I'm trying to do a liquibase generateChangeLog on 48 schemas in an Amazon Redshift database and I'm getting the following error:
[Amazon](500310) Invalid operation: terminating connection due to administrator command
When I do a smaller number of schemas, I don't get the error. I've tried adding &pooling=false to my JDBC URL, but that hasn't helped. Thanks in advance for your help!

WSO2 API Manager with Postgres database is not working properly

I have switched the default H2 database to PostgreSQL for WSO2 API Manager by following this documentation: https://apim.docs.wso2.com/en/latest/install-and-setup/setup/setting-up-databases/changing-default-databases/changing-to-postgresql/
Creating a new API throws:
"Something went wrong while getting the Revisions!"
On server found this error
ERROR - ApiMgtDAO Failed to get API Revision deployment mapping details for api uuid: a96f7266-c340-49b6-bbe1-cb252b49860e
org.postgresql.util.PSQLException: ERROR: UNION types integer and boolean cannot be matched
Any help would be greatly appreciated... Thanks...
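For reference, PostgreSQL refuses to combine mismatched column types across UNION branches; a minimal query (illustrative only, not taken from the WSO2 code) reproduces the exact error:
-- Each UNION branch must yield a compatible type per column
SELECT 1      -- integer
UNION
SELECT true;  -- boolean: ERROR: UNION types integer and boolean cannot be matched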

Can't import table to H2O using PostgreSQL JDBC in Ubuntu

I am having trouble importing a SQL table into H2O.ai using the PostgreSQL JDBC driver on Ubuntu. I'm getting the following error:
ERROR MESSAGE:
SQLException: ERROR: relation "XXX" does not exist
Position: 22
Failed to connect and read from SQL database with connection_url: jdbc:postgresql://localhost:5432/...
I am executing H2O with the following command:
java -cp h2o.jar:/usr/share/java/postgresql-9.4.1212.jar water.H2OApp
The JDBC driver is installed, and I have already tried constructing the connection URL in several ways.
I'm using this one right now:
jdbc:postgresql://localhost:5432/XXX?&useSSL=false
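One cause worth ruling out (an assumption on my part, not stated in the question): PostgreSQL folds unquoted identifiers to lowercase, so a table created with a quoted mixed-case name must be referenced with the same quoting:
-- Created as a quoted, case-sensitive identifier:
CREATE TABLE "XXX" (id integer);
SELECT * FROM XXX;    -- fails: relation "xxx" does not exist
SELECT * FROM "XXX";  -- succeeds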

Permission denied after pg_upgrade on RDS

After trying to upgrade a shadow copy of our PostgreSQL 9.6.6 RDS instance to 10.4, most operations on the database, including those done with the "root" user (the one created when setting up the database), result in an error like this:
SQL Error [42501]: ERROR: permission denied for schema public
Position: 15
Another example is a query like select * from example_table limit 100; which results in the error:
SQL Error [42P01]: ERROR: relation "example_table" does not exist
Position: 15
However, I am able to execute SELECT * FROM pg_catalog.pg_tables where schemaname = 'public'; which correctly lists all my tables.
The upgrade logs don't show anything unusual. I've been unable to find any RDS-specific instructions for upgrading from 9.x to 10.x, so I assumed that the normal upgrade procedure in the interface (which I've used in the past and which seems to use a pg_upgrade operation) would "just work". Is there anything I'm missing?
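A few diagnostics may narrow this down (a sketch; master_user and the table name are placeholders, and the GRANTs assume they are run by a role that still owns the objects):
SHOW search_path;                              -- an empty or odd search_path makes unqualified names fail
SELECT * FROM public.example_table LIMIT 100;  -- schema-qualify to rule out search_path
-- If the qualified query still reports 'permission denied for schema public':
GRANT USAGE ON SCHEMA public TO master_user;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO master_user;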

S3ServiceException when using AWS RedshiftBasicEmitter

I am using the sample AWS Kinesis/Redshift connector code from GitHub. I ran the code on an EC2 instance and hit the following exception. Note that emitting from Kinesis to S3 actually succeeded, but emitting from S3 to Redshift failed. As both emitters in the same program used the same credentials, I am very puzzled why only one of them failed!
I understand that most people getting the "The AWS Access Key Id you provided does not exist in our records" exception probably have an issue with setting up the S3 key pair properly. But that does not seem to be the case here, since emitting to S3 succeeded. If the credentials lacked read access, it should throw an authorization error instead.
Please comment if you have any insight.
Mar 16, 2014 4:32:49 AM com.amazonaws.services.kinesis.connectors.s3.S3Emitter emit
INFO: Successfully emitted 31 records to S3 in s3://mybucket/495362565978733426345566872055061454326385819810529281-49536256597873342638068737503047822713441029589972287489
Mar 16, 2014 4:32:50 AM com.amazonaws.services.kinesis.connectors.redshift.RedshiftBasicEmitter executeStatement
SEVERE: org.postgresql.util.PSQLException: ERROR: S3ServiceException:The AWS Access Key Id you provided does not exist in our records.,Status 403,Error InvalidAccessKeyId,Rid 5TY6Y784TT67,ExtRid qKzklJflmmgnhtttthbce+8T0NIR/sdd4RgffTgfgfdfgdfgfffgghgdse56f,CanRetry 1
Detail:
-----------------------------------------------
error: S3ServiceException:The AWS Access Key Id you provided does not exist in our records.,Status 403,Error InvalidAccessKeyId,Rid 5TY6Y784TT67,ExtRid qKzklJflmmgnhtttthbce+8T0NIR/sdd4RgffTgfgfdfgdfgfffgghgdse56f,CanRetry 1
code: 8001
context: Listing bucket=mfpredshift prefix=49536256597873342637951299872055061454326385819810529281-49536256597873342638068737503047822713441029589972287489
query: 3464108
location: s3_utility.cpp:536
process: padbmaster [pid=8116]
-----------------------------------------------
Mar 16, 2014 4:32:50 AM com.amazonaws.services.kinesis.connectors.redshift.RedshiftBasicEmitter emit
SEVERE: java.io.IOException: org.postgresql.util.PSQLException: ERROR: S3ServiceException:The AWS Access Key Id you provided does not exist in our records.,Status 403,Error InvalidAccessKeyId,Rid 5TY6Y784TT67,ExtRid qKzklJflmmgnhtttthbce+8T0NIR/sdd4RgffTgfgfdfgdfgfffgghgdse56f,CanRetry 1
Detail:
-----------------------------------------------
error: S3ServiceException:The AWS Access Key Id you provided does not exist in our records.,Status 403,Error InvalidAccessKeyId,Rid 5TY6Y784TT67,ExtRid qKzklJflmmgnhtttthbce+8T0NIR/sdd4RgffTgfgfdfgdfgfffgghgdse56f,CanRetry 1
code: 8001
context: Listing bucket=mybucket prefix=495362565978733426345566872055061454326385819810529281-49536256597873342638068737503047822713441029589972287489
query: 3464108
location: s3_utility.cpp:536
process: padbmaster [pid=8116]
-----------------------------------------------
I encountered the same errors. I'm using an IAM role to get credentials. In my case, it was solved by modifying RedshiftBasicEmitter to add ;token=TOKEN to the CREDENTIALS parameter (in the end I created my own IEmitter).
See http://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html
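For illustration, a COPY that authenticates with temporary STS credentials must carry the session token alongside the key pair; with placeholder values it looks roughly like this:
COPY target_table
FROM 's3://mybucket/prefix'
CREDENTIALS 'aws_access_key_id=<temporary-access-key-id>;aws_secret_access_key=<temporary-secret-access-key>;token=<temporary-session-token>'
DELIMITER '|';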