Error connecting to PostgreSQL 12 when creating a datasource - postgresql

I'm getting an exception when connecting to PostgreSQL 12 through a data source created via the WildFly admin console.
It works fine on another machine which has PostgreSQL 9.5, and it also works when I connect without a data source:
try (Connection conn = DriverManager.getConnection(
        "jdbc:postgresql://127.0.0.1:5432/registrar", "postgres", "postgres")) {
    if (conn != null) {
        System.out.println("Connected to the database!");
        Statement statement = conn.createStatement();
        ResultSet rs = statement.executeQuery("SELECT * FROM public.user LIMIT 10");
        while (rs.next()) {
            System.out.println(rs.getString(2));
        }
    } else {
        System.out.println("Failed to make connection!");
    }
} catch (SQLException e) {
    e.printStackTrace();
}
But when I create a data-source like this:
and this is the exception I get when I try to test the connection from the UI.
Request
{
"address" => [
("subsystem" => "datasources"),
("data-source" => "RegistrarDS")
],
"operation" => "test-connection-in-pool"
}
Response
Internal Server Error
{
"outcome" => "failed",
"failure-description" => "WFLYJCA0040: failed to invoke operation: WFLYJCA0047: Connection is not valid",
"rolled-back" => true,
"response-headers" => {"process-state" => "reload-required"}
}
Has anything changed in PostgreSQL 12? What could be the issue?

This could be an issue with the datasource-class defined in your driver, as described in this issue on issues.redhat.com.
If a datasource-class is defined, it takes precedence when creating the datasource, and as such it can only use connection-properties.
Since there is no single connection-url property common to all drivers, you have to use connection-properties if you define the datasource-class.
Another thing you could try is to set "Connection Properties" instead of "Connection URL", and make sure you are using the correct/latest driver JAR file.
Also check your PostgreSQL logs for further info on the exception.
-> https://www.postgresql.org/about/news/2000/
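For reference, a datasource that relies only on connection-properties might look like the sketch below in standalone.xml. This is illustrative, not your exact config; the property names assume the bean setters on org.postgresql.ds.PGSimpleDataSource:
<datasource jndi-name="java:/RegistrarDS" pool-name="RegistrarDS">
    <!-- no connection-url: with a datasource-class, only connection-properties are used -->
    <connection-property name="ServerName">127.0.0.1</connection-property>
    <connection-property name="PortNumber">5432</connection-property>
    <connection-property name="DatabaseName">registrar</connection-property>
    <driver>postgresql</driver>
    <security>
        <user-name>postgres</user-name>
        <password>postgres</password>
    </security>
</datasource>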

Related

Logstash refuses to see a postgres table

So I created my Logstash conf file and spun up Logstash, Kibana, Postgres, and Elasticsearch in one docker compose file. It connected seamlessly with my database; however, it says the table "products" doesn't exist.
[2023-01-18T14:06:00,182][WARN ][logstash.inputs.jdbc ][main][6a13cd40fa144828caae9db4ed20b978765149c99cc59d5830fa4ccad80b4017] Exception when executing JDBC query {:exception=>"Java::OrgPostgresqlUtil::PSQLException: ERROR: relation \"products\" does not exist\n Position: 15"}
This is my conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://elastic-postgres-1:5432/shopdb"
    jdbc_user => "postgres"
    jdbc_password => "****"
    jdbc_driver_library => "./postgresql-42.2.27.jre7.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM products;"
    schedule => "* * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "PostgreSQL"
  }
}
Granted, I did link my Postgres to Logstash with the conf BEFORE creating the table, but I have since restarted the containers and the error persists. I tried putting in a wrong table name to check whether the conf is even being picked up, and Logstash noticed, so why isn't it seeing the table "products", which has now been created and populated?
Try explicitly using the schema name of the object in your query to avoid this table-not-found error, like:
SELECT * FROM schema_name.object_name
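Assuming the table was created in the default public schema, the statement in your jdbc input would become:
statement => "SELECT * FROM public.products;"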

How to solve the connection issue between Logstash and AWS Postgres?

I have a Postgres RDS database and I need to query it using Logstash through the JDBC input. I have tried most of the suggested solutions, such as adding useSSL=false, but the issue remains, and the logs are not very informative ("maybe bad JDBC string"):
Error: Sequel::DatabaseError: driver.new.connect returned nil: probably bad JDBC connection string
Exception: Sequel::DatabaseConnectionError
Stack: /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/adapters/jdbc.rb:228:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool.rb:122:in `make_new'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:209:in `assign_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:139:in `acquire'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:91:in `hold'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:270:in `synchronize'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:279:in `test_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:58:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/core.rb:125:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/jdbc.rb:114:in `block in jdbc_connect'
and here is the .conf file which I use to connect:
input {
  jdbc {
    jdbc_connection_string => "jdbc:psql://domain.xxxxxx.us-east-1.rds.amazonaws.com:5432/dbtest?useSSL=true"
    jdbc_user => "USERNAME"
    jdbc_password => "PASSWORD"
    jdbc_driver_library => "path/to/postgresql-42.2.16.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from dbtableX"
  }
}
output {
  stdout { codec => json_lines }
}
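One thing that stands out in the posted config: the connection string uses the scheme jdbc:psql://, but the PostgreSQL JDBC driver only accepts jdbc:postgresql://, which would explain the driver returning nil and the "probably bad JDBC connection string" error. Note also that useSSL is a MySQL driver parameter; the PostgreSQL equivalent is sslmode. A corrected string (keeping the question's placeholder host and database) would look like:
jdbc_connection_string => "jdbc:postgresql://domain.xxxxxx.us-east-1.rds.amazonaws.com:5432/dbtest?sslmode=require"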

Aurora Serverless (Postgresql) - execute statement timeout 502

I am building a lambda api connecting to AWS Aurora Serverless Postgres.
Followed the AWS instructions to set up Aurora Serverless with Postgres compatibility (the tutorial targets MySQL but is still useful for Postgres): https://aws.amazon.com/getting-started/tutorials/configure-connect-serverless-mysql-database-aurora/
I added port 5432 to the inbound rules of the security group.
I also use data-api-client to query the database (https://github.com/jeremydaly/data-api-client).
I built the API lambda with the Serverless Framework, set the timeout to 1 min, and added the AmazonRDSDataFullAccess role.
My lambda code (built in Serverless framework) is simple:
async function query_db(_sql) {
  const data = require('data-api-client')({
    secretArn: constants.DBSecretsStoreArn,
    resourceArn: constants.DBAuroraClusterArn,
    database: constants.DatabaseName
  });
  try {
    let result = await data.query(_sql);
    return result.records;
  } catch (error) {
    console.log('Lambda :: query_db :: Error: ' + error);
    return error;
  }
}

async function run() {
  let sql = 'SELECT * FROM products LIMIT 10';
  let result = await query_db(sql);
  console.log('result: ' + JSON.stringify(result));
  // async handlers return the response object directly instead of using callback()
  return {
    headers: {
      'Access-Control-Allow-Origin': '*'
    },
    statusCode: 200,
    body: JSON.stringify({ msg: 'done' })
  };
}
Result:
It ran successfully locally (serverless-offline).
After deploying, it timed out and returned 502 with the error "Internal server error".
Any suggestion is appreciated.
The data-api-client doesn't officially support Postgres yet:
https://github.com/jeremydaly/data-api-client/issues/27
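If that's the blocker, one possible workaround is to call the Data API directly through the AWS SDK's RDSDataService instead of going through data-api-client. A minimal sketch, assuming the aws-sdk v2 that ships in the Lambda Node.js runtime and the same constants object from the question:
const AWS = require('aws-sdk');

// Query Aurora Serverless through the Data API without data-api-client
const rdsData = new AWS.RDSDataService();

async function query_db(sql) {
  const result = await rdsData.executeStatement({
    secretArn: constants.DBSecretsStoreArn,    // same ARNs as in the question
    resourceArn: constants.DBAuroraClusterArn,
    database: constants.DatabaseName,
    sql: sql
  }).promise();
  // records come back in the raw Data API format, not unwrapped values
  return result.records;
}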

Problem with PostgreSQL json object (PGobject) to logstash - type cast problem

I am trying to index data from a PostgreSQL database into Elasticsearch using Logstash. I have a column in the database of type JSON. When I run Logstash, I get a "Missing Converter handling" error; I understand that Java/Logstash doesn't recognize the JSON column type.
I can typecast the column with col_name::TEXT, which works fine. However, I need that column as JSON in the Elasticsearch index. Is there any workaround?
Note: the keys in the JSON object are not fixed; they vary.
Logstash postgres query example:
input {
  jdbc {
    # Postgres jdbc connection string to our database
    jdbc_connection_string => "***"
    # The user we wish to execute our statement as
    jdbc_user => "***"
    jdbc_password => "***"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "<jar_file_path>"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT col_1, col_2, col_3, json_col_4 from my_db;"
  }
}
filter {
}
output {
  stdout { codec => json_lines }
}
I need the JSON object to go from PostgreSQL into Elasticsearch.
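A common workaround, sketched below with the question's own column name: keep the ::TEXT cast in the SQL statement, then use Logstash's json filter to parse the string back into a structured field before it reaches Elasticsearch. Because the filter parses whatever object each row contains, varying keys are not a problem:
input {
  jdbc {
    # ... connection settings as above ...
    statement => "SELECT col_1, col_2, col_3, json_col_4::TEXT AS json_col_4 from my_db;"
  }
}
filter {
  json {
    source => "json_col_4"   # the text column holding the JSON string
    target => "json_col_4"   # overwrite it with the parsed object
  }
}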

How to connect a Postgres database with Logstash via JDBC to import data?

I'm trying to connect a PostgreSQL database with Logstash to import data from Postgres into Elasticsearch.
I'm using the JDBC driver to connect Logstash with Postgres, but I'm getting the following error:
[2019-06-27T13:04:05,943][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"postgres", jdbc_password=><password>,
statement=>"SELECT * FROM public.\"contacts\";", jdbc_driver_library=>"postgresql-42.2.6.jar",
jdbc_connection_string=>"jdbc:postgresql://localhost:5432/LogstashTest",
id=>"a76a604bb9cb591dd4a19afc95e03873023e008c564101b4ac19aefe30071213",
jdbc_driver_class=>"org.postgresql.Driver", enable_metric=>true,
codec=><LogStash::Codecs::Plain id=>"plain_8f80bf3a-29fe-49e8-86b1-c94e9a298ffb", enable_metric=>true, charset=>"UTF-8">,
jdbc_paging_enabled=>false, jdbc_page_size=>100000,
jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5,
sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5,
parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC},
last_run_metadata_path=>"C:\Users\roshan/.logstash_jdbc_last_run",
use_column_value=>false, tracking_column_type=>"numeric",
clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: org.postgresql.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack: D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
D:/Swares/logstash-7.2.0/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
D:/Swares/logstash-7.2.0/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'
[2019-06-27T13:04:06,946][ERROR][logstash.inputs.jdbc ] Failed to load postgresql-42.2.6.jar {:exception=>#}
My configuration:
Java version - "1.8.0_211"
postgres (PostgreSQL) 11.0
logstash-7.2.0
And here is my logstash conf file
input {
  jdbc {
    # input configuration
    jdbc_driver_library => "postgresql-42.2.6.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/LogstashTest"
    jdbc_user => "postgres"
    jdbc_password => "root"
    statement => 'SELECT * FROM public."contacts";'
  }
}
output {
  stdout { codec => json_lines }
}
The problem can occur due to a corrupted PostgreSQL JDBC driver, or when you pass an incorrect config file to Logstash. I had the same issue; you need to check your JDBC driver file.
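One thing worth double-checking is the bare file name in jdbc_driver_library, which Logstash resolves against its own working directory (the last error line even says it failed to load postgresql-42.2.6.jar). A sketch with an absolute path (the location is hypothetical; point it at wherever your downloaded JAR actually lives):
jdbc_driver_library => "D:/Swares/logstash-7.2.0/postgresql-42.2.6.jar"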