Logstash, MongoDB and JDBC

I have a problem configuring Logstash. I want to use the jdbc input with MongoDB.
My config:
input {
  jdbc {
    jdbc_driver_library => "mongo-java-driver-3.2.2.jar"
    jdbc_driver_class => "com.mongodb.MongoClient"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017"
    jdbc_user => ""
  }
}
output {
  stdout {
  }
}
The problem is:
:error=>"Java::JavaSql::SQLException: No suitable driver found for jdbc:mongodb://localhost:27017/"}

The jdbc_driver_class setting is not correct. You must specify the name of the JDBC driver class, not the client class:
jdbc_driver_class => "mongodb.jdbc.MongoDriver"
Also make sure that jdbc_driver_library contains the full absolute path to your mongo-java-driver-3.2.2.jar file.
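Putting both fixes together, a minimal corrected input might look like this (the absolute path /path/to/mongo-java-driver-3.2.2.jar is a placeholder to adjust):

input {
  jdbc {
    # full absolute path to the driver JAR (placeholder path)
    jdbc_driver_library => "/path/to/mongo-java-driver-3.2.2.jar"
    # the JDBC driver class suggested above, not the client class
    jdbc_driver_class => "mongodb.jdbc.MongoDriver"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017"
    jdbc_user => ""
  }
}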

More details would help, but in short: you must specify the full location of the mongo-java-driver-3.2.2.jar in jdbc_driver_library.
Please see the following links:
Documentation
Similar problem


Logstash refuses to see a Postgres table

I created my Logstash conf file and spun up Logstash, Kibana, Postgres, and Elasticsearch in one Docker Compose file. It connected seamlessly to my database, but it says the table "products" doesn't exist.
[2023-01-18T14:06:00,182][WARN ][logstash.inputs.jdbc ][main][6a13cd40fa144828caae9db4ed20b978765149c99cc59d5830fa4ccad80b4017] Exception when executing JDBC query {:exception=>"Java::OrgPostgresqlUtil::PSQLException: ERROR: relation \"products\" does not exist\n Position: 15"}
This is my conf:
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://elastic-postgres-1:5432/shopdb"
    jdbc_user => "postgres"
    jdbc_password => "****"
    jdbc_driver_library => "./postgresql-42.2.27.jre7.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM products;"
    schedule => "* * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "PostgreSQL"
  }
}
Granted, I did link my Postgres to Logstash with this conf BEFORE creating the table, but I have tried restarting the containers and the error persists. I tried putting a wrong table name into the conf to check whether the conf is even being re-read, and Logstash noticed the change; so why isn't it seeing the table "products", which has by now been created and populated?
Try explicitly qualifying the object with its schema name in your query to avoid this "relation does not exist" error, like:
SELECT * FROM schema_name.object_name
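
For example, if the table was created in the default public schema (an assumption; adjust to your actual schema), the input from the question would become:

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://elastic-postgres-1:5432/shopdb"
    jdbc_user => "postgres"
    jdbc_password => "****"
    jdbc_driver_library => "./postgresql-42.2.27.jre7.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # schema-qualified table name; "public" is an assumption here
    statement => "SELECT * FROM public.products;"
    schedule => "* * * * *"
  }
}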

How to solve the connection issue between Logstash and AWS Postgres?

I have a Postgres RDS database and I need to query it from Logstash through the jdbc input. I tried most of the usual solutions, such as adding useSSL=false, etc., but the issue persists, and the logs are not very informative beyond "probably bad JDBC connection string":
Error: Sequel::DatabaseError: driver.new.connect returned nil: probably bad JDBC connection string
Exception: Sequel::DatabaseConnectionError
Stack:
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/adapters/jdbc.rb:228:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool.rb:122:in `make_new'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:209:in `assign_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:139:in `acquire'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/connection_pool/threaded.rb:91:in `hold'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:270:in `synchronize'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:279:in `test_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/database/connecting.rb:58:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.35.0/lib/sequel/core.rb:125:in `connect'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.6/lib/logstash/plugin_mixins/jdbc/jdbc.rb:114:in `block in jdbc_connect'
And here is the .conf file I use to connect:
input {
  jdbc {
    jdbc_connection_string => "jdbc:psql://domain.xxxxxx.us-east-1.rds.amazonaws.com:5432/dbtest?useSSL=true"
    jdbc_user => "USERNAME"
    jdbc_password => "PASSWORD"
    jdbc_driver_library => "path/to/postgresql-42.2.16.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from dbtableX"
  }
}
output {
  stdout { codec => json_lines }
}
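The connection string scheme is the likely culprit: the PostgreSQL JDBC driver only accepts URLs that start with jdbc:postgresql://, so jdbc:psql:// matches no registered driver and Sequel's connect returns nil. Note also that useSSL is a MySQL-style parameter; the Postgres driver uses sslmode instead. A corrected input, keeping the placeholder host, user, and driver path from the question (sslmode=require is one reasonable choice for RDS):

input {
  jdbc {
    # "jdbc:postgresql://" is the only scheme the Postgres driver registers for
    jdbc_connection_string => "jdbc:postgresql://domain.xxxxxx.us-east-1.rds.amazonaws.com:5432/dbtest?sslmode=require"
    jdbc_user => "USERNAME"
    jdbc_password => "PASSWORD"
    jdbc_driver_library => "path/to/postgresql-42.2.16.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from dbtableX"
  }
}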

Sync MongoDB data to Elasticsearch using Logstash

I want to sync my MongoDB data (local MongoDB) to Elasticsearch (local Elastic) using the mongodb Logstash plugin.
I installed the Logstash plugin using
bin/logstash-plugin install logstash-input-mongodb
Then I created a mongodata.conf file in the /usr/share/logstash directory.
When I execute the conf file, it shows
--> Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
My config file is:
input {
  mongodb {
    uri => "mongodb://localhost:27017/reporterDB"
    placeholder_db_dir => "/opt/logstash-mongodb/"
    placeholder_db_name => "logstash_sqlite.db"
    collection => "iam_ms_test"
    batch_size => 5000
  }
}
filter {
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    user => "elastic"
    password => "changeme"
    index => "mongo_log"
    document_type => "document_type"
    document_id => "%{id}"
  }
}
I am getting the lines below in the logstash-plain.log file:
[2019-11-01T15:41:00,869][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-11-01T15:41:00,871][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>6, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>750, :thread=>"#<Thread:0x351f7fd1#/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:245 run>"}
[2019-11-01T15:41:01,068][INFO ][logstash.inputs.mongodb ] Registering MongoDB input
[2019-11-01T15:41:01,116][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::MongoDB uri=>\"mongodb://localhost:27017/anchorReports\", placeholder_db_dir=>\"/opt/logstash-mongodb/\", placeholder_db_name=>\"logstash_sqlite.db\", collection=>\"hi_p5m\", batch_size=>5000, id=>\"ec7682e8c6c5676deca84d5072c5f7865120a107ffce81ce21caa878c6e4ed09\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_441f95b8-cc8a-4b9e-a45f-657ed2011e2b\", enable_metric=>true, charset=>\"UTF-8\">, since_table=>\"logstash_since\", since_column=>\"_id\", since_type=>\"id\", parse_method=>\"flatten\", isodate=>false, retry_delay=>3, generateId=>false, unpack_mongo_id=>false, message=>\"Default message...\", interval=>1>", :error=>"Java::JavaSql::SQLException: path to '/opt/logstash-mongodb/logstash_sqlite.db': '/opt/logstash-mongodb' does not exist", :thread=>"#<Thread:0x351f7fd1#/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:245 run>"}
[2019-11-01T15:41:01,869][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Sequel::DatabaseConnectionError: Java::JavaSql::SQLException: path to '/opt/logstash-mongodb/logstash_sqlite.db': '/opt/logstash-mongodb' does not exist>, :backtrace=>["org.sqlite.core.CoreConnection.open(org/sqlite/core/CoreConnection.java:190)", "org.sqlite.core.CoreConnection.<init>(org/sqlite/core/CoreConnection.java:74)", "org.sqlite.jdbc3.JDBC3Connection.<init>(org/sqlite/jdbc3/JDBC3Connection.java:24)", "org.sqlite.jdbc4.JDBC4Connection.<init>(org/sqlite/jdbc4/JDBC4Connection.java:23)", "org.sqlite.SQLiteConnection.<init>(org/sqlite/SQLiteConnection.java:45)", "org.sqlite.JDBC.createConnection(org/sqlite/JDBC.java:114)", "org.sqlite.JDBC.connect(org/sqlite/JDBC.java:88)"]}
I want the records in my Elasticsearch under the index "mongo_log".
I also want to know the uses of placeholder_db_dir and placeholder_db_name, and what these values should be when MongoDB is the input database.
Problem solved! Actually the /opt/logstash-mongodb directory had not been created, so I created the folder manually under /opt. After that I gave write permission to that directory, so that when we execute the Logstash command it can create its file inside this folder.
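
As for the second part of the question: judging by the since_table/since_column values visible in the error dump above, the mongodb input keeps a small local SQLite database to remember how far into the collection it has already read. placeholder_db_dir is the directory where that SQLite file lives and placeholder_db_name is its filename; any existing directory the Logstash user can write to should do, since (as the error shows) the plugin does not create the directory itself. A minimal sketch, reusing the values from the question:

mongodb {
  uri => "mongodb://localhost:27017/reporterDB"
  # must already exist and be writable by the logstash user
  placeholder_db_dir => "/opt/logstash-mongodb/"
  # name of the SQLite file used to track the last document read
  placeholder_db_name => "logstash_sqlite.db"
  collection => "iam_ms_test"
  batch_size => 5000
}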

Problem with PostgreSQL JSON object (PGobject) in Logstash - type cast problem

I am trying to index data from a PostgreSQL database into Elasticsearch using Logstash. I have a column in the database of type JSON. When I run Logstash, I get a "Missing Converter handling" error. I understand that this means Java/Logstash doesn't recognize the JSON column type.
I can typecast the column with "col_name::TEXT", which works fine. However, I need that column as JSON in the Elasticsearch index. Is there any workaround?
Note: the keys in the JSON object are not fixed; they vary.
Logstash Postgres query example:
input {
  jdbc {
    # Postgres JDBC connection string to our database
    jdbc_connection_string => "***"
    # The user we wish to execute our statement as
    jdbc_user => "***"
    jdbc_password => "***"
    # The path to our downloaded JDBC driver
    jdbc_driver_library => "<jar_file_path>"
    # The name of the driver class for PostgreSQL
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT col_1, col_2, col_3, json_col_4 from my_db;"
  }
}
filter {
}
output {
  stdout { codec => json_lines }
}
I need the JSON object to arrive in Elasticsearch as JSON, not as text.
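
A sketch of one common workaround, building on the ::TEXT cast the question already mentions: cast the column to TEXT in the SQL statement, then parse it back into a structured field with Logstash's json filter. The column name json_col_4 is taken from the question; since the filter parses whatever keys it finds, varying keys are not a problem.

input {
  jdbc {
    jdbc_connection_string => "***"
    jdbc_user => "***"
    jdbc_password => "***"
    jdbc_driver_library => "<jar_file_path>"
    jdbc_driver_class => "org.postgresql.Driver"
    # cast the JSON column to TEXT so the jdbc input can convert it
    statement => "SELECT col_1, col_2, col_3, json_col_4::TEXT AS json_col_4 from my_db;"
  }
}
filter {
  # re-parse the stringified JSON into a real object before indexing
  json {
    source => "json_col_4"
    target => "json_col_4"
  }
}
output {
  stdout { codec => json_lines }
}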

How to connect a Postgres database with Logstash via JDBC to import data?

I'm trying to connect a PostgreSQL database with Logstash to import data from Postgres into Elasticsearch.
I'm using the JDBC driver to connect Logstash with Postgres, but I'm getting the following error:
[2019-06-27T13:04:05,943][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"postgres", jdbc_password=><password>, statement=>"SELECT * FROM public.\"contacts\";", jdbc_driver_library=>"postgresql-42.2.6.jar", jdbc_connection_string=>"jdbc:postgresql://localhost:5432/LogstashTest", id=>"a76a604bb9cb591dd4a19afc95e03873023e008c564101b4ac19aefe30071213", jdbc_driver_class=>"org.postgresql.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_8f80bf3a-29fe-49e8-86b1-c94e9a298ffb", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"C:\Users\roshan/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: org.postgresql.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack:
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
D:/Swares/logstash-7.2.0/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
D:/Swares/logstash-7.2.0/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
D:/Swares/logstash-7.2.0/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'
[2019-06-27T13:04:06,946][ERROR][logstash.inputs.jdbc ] Failed to load postgresql-42.2.6.jar {:exception=>#}
My configuration is:
Java version - "1.8.0_211"
postgres (PostgreSQL) 11.0
logstash-7.2.0
And here is my Logstash conf file:
input {
  jdbc {
    # input configuration
    jdbc_driver_library => "postgresql-42.2.6.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/LogstashTest"
    jdbc_user => "postgres"
    jdbc_password => "root"
    statement => 'SELECT * FROM public."contacts";'
  }
}
output {
  stdout { codec => json_lines }
}
The problem can occur due to a corrupted PostgreSQL JDBC driver JAR, or when the config file you pass to Logstash points at the wrong driver file. I had the same issue; you need to check your JDBC driver file.
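Also note that a relative path like "postgresql-42.2.6.jar" is resolved against the directory Logstash runs from, so a common remedy is to re-download the JAR (in case it is corrupted) and point jdbc_driver_library at it with a full absolute path. A sketch; the D:/drivers/ location is a hypothetical path:

input {
  jdbc {
    # hypothetical absolute path; place the freshly downloaded JAR here
    jdbc_driver_library => "D:/drivers/postgresql-42.2.6.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/LogstashTest"
    jdbc_user => "postgres"
    jdbc_password => "root"
    statement => 'SELECT * FROM public."contacts";'
  }
}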