Problem with a PostgreSQL JSON object (PGobject) in Logstash - type cast problem

I am trying to index data from a PostgreSQL database into Elasticsearch using Logstash. One column in the database is of type JSON. When I run Logstash, I get a "Missing Converter handling" error. I understand this means Java/Logstash does not recognize the JSON column type.
I can typecast the column with "col_name::TEXT", which works fine. However, I need that column as JSON in the Elasticsearch index. Is there any workaround?
Note: The keys in the JSON object are not fixed; they vary.
Logstash postgres query example:
input {
  jdbc {
    # Postgres jdbc connection string to our database, bot
    jdbc_connection_string => "***"
    # The user we wish to execute our statement as
    jdbc_user => "***"
    jdbc_password => "***"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "<jar_file_path>"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT col_1, col_2, col_3, json_col_4 from my_db;"
  }
}
filter {
}
output {
  stdout { codec => json_lines }
}
I need the JSON object from PostgreSQL to end up as a JSON object in the Elasticsearch index.
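One common workaround is to cast the JSON column to text in SQL and let Logstash's json filter parse it back into a structured field, which also copes with varying keys. This is only a sketch using the placeholders from the question (col_1 to json_col_4, my_db, and the *** connection settings), not a confirmed configuration for this setup:
input {
  jdbc {
    # connection settings as in the original input above
    jdbc_connection_string => "***"
    jdbc_user => "***"
    jdbc_password => "***"
    jdbc_driver_library => "<jar_file_path>"
    jdbc_driver_class => "org.postgresql.Driver"
    # cast the JSON column to text so the jdbc input hands it to Logstash as a plain string
    statement => "SELECT col_1, col_2, col_3, json_col_4::TEXT AS json_col_4 FROM my_db;"
  }
}
filter {
  # parse the string back into a JSON object nested under the same field name
  json {
    source => "json_col_4"
    target => "json_col_4"
  }
}
output {
  stdout { codec => json_lines }
}
With target set, the parsed object keeps all of its original keys under json_col_4, so the varying key names do not need to be known in advance.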

Related

Logstash refuses to see a Postgres table

I created my Logstash conf file and spun up Logstash, Kibana, Postgres, and Elasticsearch in one Docker Compose file. It connected seamlessly to my database, however it says the table "products" doesn't exist.
[2023-01-18T14:06:00,182][WARN ][logstash.inputs.jdbc ][main][6a13cd40fa144828caae9db4ed20b978765149c99cc59d5830fa4ccad80b4017] Exception when executing JDBC query {:exception=>"Java::OrgPostgresqlUtil::PSQLException: ERROR: relation \"products\" does not exist\n Position: 15"}
This is my conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://elastic-postgres-1:5432/shopdb"
    jdbc_user => "postgres"
    jdbc_password => "****"
    jdbc_driver_library => "./postgresql-42.2.27.jre7.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM products;"
    schedule => "* * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "PostgreSQL"
  }
}
Granted, I did link my Postgres to Logstash with this conf BEFORE creating the table, but I have since tried restarting the containers and the error persists. I tried putting a wrong table name in the statement to check whether the conf is even being reloaded, and it picked that change up, so why isn't it seeing the table "products", which has now been created and populated?
Try explicitly using the schema name of the object in your query to avoid this "relation does not exist" error, like:
SELECT * FROM schema_name.object_name
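Applied to the configuration above, that could look like the sketch below. It assumes the products table lives in the default public schema; adjust the schema name if yours differs:
statement => "SELECT * FROM public.products;"
# or pin the default schema on the connection string (currentSchema is a PostgreSQL JDBC driver parameter)
jdbc_connection_string => "jdbc:postgresql://elastic-postgres-1:5432/shopdb?currentSchema=public"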

How to solve the connection issue between Logstash and AWS Postgres?

I have a Postgres RDS database and I need to query it using Logstash through the JDBC input. I tried most of the suggested solutions, such as adding useSSL=false, etc., but the issue remains, and the logs are not very informative, only 'maybe bad JDBC string':
Error: Sequel::DatabaseError: driver.new.connect returned nil: probably bad JDBC connection string
Exception: Sequel::DatabaseConnectionError
Stack:
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/adapters/jdbc.rb:228:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool.rb:122:in `make_new'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:209:in `assign_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:139:in `acquire'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:91:in `hold'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:270:in `synchronize'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:279:in `test_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:58:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/core.rb:125:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/jdbc.rb:114:in `block in jdbc_connect'
And here is the .conf file I use to connect:
input {
  jdbc {
    jdbc_connection_string => "jdbc:psql://domain.xxxxxx.us-east-1.rds.amazonaws.com:5432/dbtest?useSSL=true"
    jdbc_user => "USERNAME"
    jdbc_password => "PASSWORD"
    jdbc_driver_library => "path/to/postgresql-42.2.16.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from dbtableX"
  }
}
output {
  stdout { codec => json_lines }
}
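One detail that stands out (an observation, not a confirmed fix for this setup): the PostgreSQL JDBC driver registers the jdbc:postgresql:// scheme, not jdbc:psql://, which matches the "probably bad JDBC connection string" message, and SSL in that driver is normally controlled with ssl/sslmode rather than MySQL's useSSL. A sketch of the input under those assumptions, reusing the placeholders from the question:
input {
  jdbc {
    # jdbc:postgresql:// is the scheme the PostgreSQL driver accepts
    jdbc_connection_string => "jdbc:postgresql://domain.xxxxxx.us-east-1.rds.amazonaws.com:5432/dbtest?ssl=true&sslmode=require"
    jdbc_user => "USERNAME"
    jdbc_password => "PASSWORD"
    jdbc_driver_library => "path/to/postgresql-42.2.16.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from dbtableX"
  }
}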

How to connect a Postgres database with Logstash via JDBC to import data?

I'm trying to connect a PostgreSQL database with Logstash to import data from Postgres into Elasticsearch.
I'm using the JDBC driver to connect Logstash with Postgres.
But I'm getting the following error:
[2019-06-27T13:04:05,943][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: "postgres", jdbc_password=>, statement=>"SELECT * FROM public.\"contacts\";", jdbc_driver_library=>"postgresql-42.2.6.jar", jdbc_connection_string=>"jdbc:postgresql://localhost:5432/LogstashTest", id=>"a76a604bb9cb591dd4a19afc95e03873023e008c564101b4ac19aefe30071213", jdbc_driver_class=>"org.postgresql.Driver", enable_metric=>true, codec=>"plain_8f80bf3a-29fe-49e8-86b1-c94e9a298ffb", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"C:\Users\roshan/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: org.postgresql.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack:
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
D:/Swares/logstash-7.2.0/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
D:/Swares/logstash-7.2.0/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'
[2019-06-27T13:04:06,946][ERROR][logstash.inputs.jdbc ] Failed to load postgresql-42.2.6.jar {:exception=>#}
My setup:
Java version - "1.8.0_211"
postgres (PostgreSQL) 11.0
logstash-7.2.0
And here is my Logstash conf file:
input {
  jdbc {
    # input configuration
    jdbc_driver_library => "postgresql-42.2.6.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/LogstashTest"
    jdbc_user => "postgres"
    jdbc_password => "root"
    statement => 'SELECT * FROM public."contacts";'
  }
}
output {
  stdout { codec => json_lines }
}
This problem can occur when the PostgreSQL JDBC driver JAR is corrupted or when the config file passed to Logstash points to the wrong driver. I had the same issue; check your JDBC driver file.
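If the JAR itself is intact, it is also worth ruling out path-resolution problems by pointing jdbc_driver_library at an absolute path. A sketch, with a hypothetical Windows path that would need to match the real driver location:
# hypothetical absolute path; adjust to where the driver JAR actually lives
jdbc_driver_library => "D:/drivers/postgresql-42.2.6.jar"
jdbc_driver_class => "org.postgresql.Driver"
Another commonly reported workaround for "driver not loaded" errors on newer JVMs is to copy the driver JAR into Logstash's logstash-core/lib/jars directory and leave jdbc_driver_library unset, though whether that applies here depends on the exact Logstash and Java combination.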

Logstash, MongoDB and JDBC

I have a problem configuring Logstash. I want to be able to use the jdbc input with MongoDB.
My config:
input {
  jdbc {
    jdbc_driver_library => "mongo-java-driver-3.2.2.jar"
    jdbc_driver_class => "com.mongodb.MongoClient"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017"
    jdbc_user => ""
  }
}
output {
  stdout {
  }
}
The problem is :
:error=>"Java::JavaSql::SQLException: No suitable driver found for jdbc:mongodb://localhost:27017/"}
The MongoDB JDBC driver setting is not correct. You must specify the name of the driver class, not the client class:
jdbc_driver_class => "mongodb.jdbc.MongoDriver"
Also make sure that jdbc_driver_library contains the full absolute path to your mongo-java-driver-3.2.2.jar file.
More inputs would be good.
You must specify the location of the mongo-java-driver-3.2.2.jar in jdbc_driver_library.
Please see the following links:
Documentation
Similar problem
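Putting both answers together, the driver settings would look roughly like the sketch below. It assumes the JAR given in jdbc_driver_library actually contains the mongodb.jdbc.MongoDriver class, which is provided by a JDBC wrapper driver for MongoDB rather than by the plain mongo-java-driver client JAR on its own:
# hypothetical absolute path to a JAR that ships mongodb.jdbc.MongoDriver
jdbc_driver_library => "/absolute/path/to/mongodb-jdbc-driver.jar"
jdbc_driver_class => "mongodb.jdbc.MongoDriver"
The jdbc input also requires a statement option, so a query against the target collection would still need to be added.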

Logstash MongoDB connection issue

I am unable to push data to MongoDB using Logstash.
My config file looks like:
input {
  file {
    type => "log"
    path => "d:\logs\*.txt"
  }
}
output {
  mongodb {
    database => "abhi1"
    collection => "plain"
    uri => "mongodb://127.0.0.1:27017"
  }
}
The command used to execute the configuration file is logstash -f ./conf/demo.conf
ERROR:
[2015-09-08T16:26:04.883000 #4528] DEBUG -- : MONGODB | COMMAND | namespace=admin.$cmd selector={:ismaster=>1} flags=[] limit=-1 skip=0 project=nil | runtime: 46.9999ms
Hoping to get a workaround soon. Thanks.
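The mongodb output is a community-maintained plugin and is not always bundled with a Logstash distribution, so one hedged thing to check is that it is installed before running the pipeline. A sketch, assuming a standard Logstash install directory:
# install the community mongodb output plugin
bin/logstash-plugin install logstash-output-mongodb
# then run the pipeline as before
bin/logstash -f ./conf/demo.conf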