JDBC Logstash Elastic Kibana - mongodb

I'm using the JDBC input plugin to ingest data from MongoDB into Elasticsearch.
My config is:
input {
jdbc {
jdbc_driver_class => "mongodb.jdbc.MongoDriver"
jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mongodb_unityjdbc_free.jar"
jdbc_user => ""
jdbc_password => ""
jdbc_connection_string => "jdbc:mongodb://localhost:27017/pritunl"
schedule => "* * * * *"
jdbc_page_size => 100000
jdbc_paging_enabled => true
statement => "select * from servers_output"
}
}
filter {
mutate {
copy => { "_id" => "[@metadata][id]" }
remove_field => ["_id"]
}
}
output {
elasticsearch {
hosts => "localhost:9200"
index => "pritunl"
document_id => "%{[@metadata][_id]}"
}
stdout {}
}
In Kibana I see only one hit, but in stdout I see many records from the MongoDB collection. What should I do to see them all?

The problem is that all your documents are saved with the same "_id": since "[@metadata][_id]" does not exist on the event, the sprintf reference is never substituted and every record gets the same literal document_id. So even though you're sending different records to ES, one document keeps being overwritten internally, and that is why you only get one hit in Kibana.
There is a typo in your configuration causing this: you're copying "_id" into "[@metadata][id]", but you're trying to read "[@metadata][_id]", with an underscore.
Removing the underscore when reading the value for document_id should fix your issue.
output {
elasticsearch {
hosts => "localhost:9200"
index => "pritunl"
document_id => "%{[@metadata][id]}"
}
stdout {}
}
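For reference, the [@metadata] namespace is handy here because fields under [@metadata] are visible to filters and outputs but are not part of the document that gets indexed. A minimal sketch of the filter and output working together, using the field names from the question:

filter {
  mutate {
    # keep the Mongo _id available for routing without indexing it as a regular field
    copy => { "_id" => "[@metadata][id]" }
    remove_field => ["_id"]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "pritunl"
    # reuse the Mongo _id as the Elasticsearch document id
    document_id => "%{[@metadata][id]}"
  }
}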

Related

Not getting SQL Statement output using ELK logstash jdbc plugin

I'm trying to get output using Logstash with the JDBC plugin, but I'm not getting any data. I have tried different ways to get pg_stat_replication data, but under Index Management I did not find the index, which should be named pg_stat_replication.
Config file path: /etc/logstash/conf.d/postgresql.conf
input {
# pg_stat_replication;
jdbc {
jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/postgresql-jdbc.jar"
jdbc_driver_class => "org.postgresql.Driver"
jdbc_connection_string => "jdbc:postgresql://192.168.43.21:5432/postgres"
jdbc_user => "postgres"
jdbc_password => "p"
statement => "SELECT * FROM pg_stat_replication"
schedule => "* * * * *"
type => "pg_stat_replication"
}
}
output {
elasticsearch {
hosts => "http://192.168.43.100:9200"
index => "%{type}"
user => "elastic"
password => "abc#123"
}
}
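One way to check whether the jdbc input is emitting events at all, independent of Elasticsearch, is to add a stdout output next to the elasticsearch one; a minimal sketch, leaving the rest of the config unchanged:

output {
  elasticsearch {
    hosts => "http://192.168.43.100:9200"
    index => "%{type}"
    user => "elastic"
    password => "abc#123"
  }
  # print every event to the Logstash log so you can confirm rows are actually being read
  stdout { codec => rubydebug }
}

If events show up on stdout but the index still does not appear, the problem is likely on the Elasticsearch side (permissions or index naming) rather than in the jdbc input.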

Connect to mongodb using logstash Jdbc_streaming filter plugin

I'm trying to fetch data from MongoDB using the jdbc_streaming filter plugin in Logstash on Windows.
I'm using mongo-java-driver-3.4.2.jar to connect to the database but I'm getting an error like this:
JavaSql::SQLException: No suitable driver found for jdbc:mongo://localhost:27017/EmployeeDB
No luck with existing references. I'm using Logstash version 7.8.0. This is my logstash config:
jdbc_streaming {
jdbc_driver_library => "C:/Users/iTelaSoft-User/Downloads/logstash-7.8.0/mongo-java-driver-3.4.2.jar"
jdbc_driver_class => "com.mongodb.MongoClient"
jdbc_connection_string => "jdbc:mongo://localhost:27017/EmployeeDB"
statement => "select * from Employee"
target => "name"
}
You can also try the following:
download https://dbschema.com/jdbc-drivers/MongoDbJdbcDriver.zip
unzip it and copy all the files to the path (~/logstash-7.8.0/logstash-core/lib/jars/)
modify the .config file
Example:
input {
jdbc{
jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
jdbc_driver_library => "mongojdbc2.1.jar"
jdbc_user => "user"
jdbc_password => "pwd"
jdbc_connection_string => "jdbc:mongodb://localhost:27017/EmployeeDB"
statement => "select * from Employee"
}
}
output {
stdout { }
}
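If you specifically need the jdbc_streaming filter from the original question rather than the jdbc input, the same DBSchema driver class and connection string should work there too. A sketch, not tested, assuming the jar has been copied into the jars directory as above:

filter {
  jdbc_streaming {
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_driver_library => "mongojdbc2.1.jar"
    jdbc_user => "user"
    jdbc_password => "pwd"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/EmployeeDB"
    statement => "select * from Employee"
    # the rows returned by the query are stored in this field of the event
    target => "name"
  }
}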

How to sync changes from MongoDB to Elasticsearch (i.e. reflect changes in Elastic when a CRUD operation is performed in Mongo) using Logstash

I just want to sync the data from MongoDB to Elastic using Logstash. It's working well: when any new record arrives in MongoDB, Logstash pushes it into Elastic. But when I update a record in MongoDB, the change is not reflected in Elasticsearch. I want to change the config file so that when any record is updated in Mongo, it is reflected in Elastic as well.
I have already tried both action => "index" and action => "update" in the .conf file.
Here is my mongodata.conf file
input{
mongodb{
uri => 'mongodb://localhost:27017/DBname'
placeholder_db_dir => '/opt/logstash'
placeholder_db_name => 'logstash_sqlite.db'
collection => "collectionName"
batch_size => 500
}
}
filter {
mutate {
remove_field => [ "_id" ]
}
}
output{
elasticsearch{
action => "index"
hosts => ["localhost:9200"]
index => "mongo_hi_log_data"
}
stdout{ codec => rubydebug }
}
I want to sync the data from mongodb to elastic using logstash when any record is updated or inserted in mongodb.
input {
mongodb {
uri => 'mongodb://mongo-db-url/your-collection'
placeholder_db_dir => '/opt/logstash-mongodb/'
placeholder_db_name => 'logstash_sqlite.db'
collection => 'products'
batch_size => 5000
}
}
filter {
date {
match => [ "logdate", "ISO8601" ]
}
mutate {
rename => { "[_id]" => "product_id" }
}
}
output {
stdout {
codec => rubydebug
}
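# "update" together with doc_as_upsert => true inserts the document when it does
# not exist yet and updates it when it does; document_id is built from the Mongo
# _id (renamed to product_id in the filter above), so re-reading the same record
# overwrites the existing Elasticsearch document instead of creating a duplicate.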
elasticsearch {
hosts => ["http://es-url"]
user => "elastic"
password => "changeme"
index => "your-index"
action => "update"
doc_as_upsert => true
document_id => "%{product_id}"
timeout => 30
workers => 1
}
}

Logstash-input-mongodb configuration file

I am trying to pull data from MongoDB into Elasticsearch using Logstash.
I am using the Logstash-input-mongodb plugin. This is my config file:
input {
mongodb {
uri => 'mongodb://localhost:27017/test'
placeholder_db_dir => '/opt/logstash-mongodb/'
placeholder_db_name => 'logstash_sqlite.db'
collection => 'student'
batch_size => 202
}
}
filter {
}
output {
elasticsearch {
host => "localhost:9200"
user => elastic
password => changeme
index => "testStudent"
}
stdout { codec => rubydebug }
}
I am getting an error saying:
Pipelines YAML file is empty.
Is this because I left the filter part empty?
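For what it's worth, an empty filter block is valid; "Pipelines YAML file is empty" usually refers to config/pipelines.yml, which Logstash falls back to when it is started without -f or -e. A minimal sketch of a pipelines.yml entry, with a hypothetical path for the config above:

- pipeline.id: mongo-student
  path.config: "/etc/logstash/conf.d/mongo-student.conf"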

How to have an input of type MongoDB for Logstash

I know we can input files and output to a MongoDB database, but I have a collection in my MongoDB that I would like to use as an input so that I can use it with ES. Is this possible?
Thank you.
I have had a similar problem. The logstash-input-mongodb plugin is fine, but it is very limited and it also seems that it is no longer being maintained, so I have opted for the logstash-integration-jdbc plugin.
I have followed these steps to sync a MongoDB collection with ES:
First, I have downloaded the JDBC driver for MongoDB developed by DBSchema, which you can find here.
I have prepared a custom Dockerfile to integrate the driver and plugins as you can see below:
FROM docker.elastic.co/logstash/logstash:7.9.2
RUN mkdir /usr/share/logstash/drivers
COPY ./drivers/* /usr/share/logstash/drivers/
RUN logstash-plugin install logstash-integration-jdbc
RUN logstash-plugin install logstash-output-elasticsearch
I have configured a query that will be executed every 30 seconds and will look for documents with an insert timestamp later than the timestamp of the last query (provided with the parameter :sql_last_value)
input {
jdbc {
jdbc_driver_library => "/usr/share/logstash/drivers/mongojdbc2.3.jar"
jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
jdbc_connection_string => "jdbc:mongodb://devroot:devroot@mongo:27017/files?authSource=admin"
jdbc_user => "devroot"
jdbc_password => "devroot"
schedule => "*/30 * * * * *"
statement => "db.processed_files.find({ 'document.processed_at' : {'$gte': :sql_last_value}},{'_id': false});"
}
}
output {
stdout {
codec => rubydebug
}
elasticsearch {
action => "create"
index => "processed_files"
hosts => ["elasticsearch:9200"]
user => "elastic"
password => "password"
ssl => true
ssl_certificate_verification => false
cacert => "/etc/logstash/keys/certificate.pem"
}
}
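A note on :sql_last_value: between scheduled runs the jdbc input persists the value it substitutes there (by default the time of the last run) in a small metadata file, which is what lets the query above pick up only documents processed since the previous run. A sketch of making that location explicit inside the jdbc block, with the path being an assumption:

input {
  jdbc {
    # ... same driver, connection string and statement as above ...
    schedule => "*/30 * * * * *"
    # file where the timestamp substituted for :sql_last_value is stored between runs
    last_run_metadata_path => "/usr/share/logstash/.logstash_jdbc_last_run"
    # set clean_run => true only if you want to re-read everything from scratch
    clean_run => false
  }
}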
Hope it can help someone, regards
You could set up a river to pull data from MongoDB to Elasticsearch.
See the instructions here - http://www.codetweet.com/ubuntu-2/configuring-elasticsearch-mongodb/
I tried out Sergio Sánchez Sánchez's suggested solution and found the following updates and improvements:
input {
jdbc {
jdbc_driver_library => "/usr/share/logstash/drivers/mongojdbc3.0.jar"
jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
jdbc_connection_string => "jdbc:mongodb://devroot:devroot@mongo:27017/files?authSource=admin"
jdbc_user => "devroot"
jdbc_password => "devroot"
schedule => "*/30 * * * * *"
statement => "db.processed_files.find({ 'document.processed_at' : {'$gte': new ISODate(:sql_last_value)}},{'_id': false});"
}
}
output {
stdout {
codec => rubydebug
}
elasticsearch {
action => "update"
doc_as_upsert => true
document_id => "%{[document][uuid]}"
index => "processed_files"
hosts => ["elasticsearch:9200"]
user => "elastic"
password => "password"
ssl => true
ssl_certificate_verification => false
cacert => "/etc/logstash/keys/certificate.pem"
}
}
Explanation:
The date comparison in MongoDB has to use new ISODate to convert :sql_last_value.
I'd like to use "update" instead of "create" to cover the update case. The query result from the input section is contained in "document". Assuming you have a field with a unique value, "uuid", you have to use it to identify the document, because MongoDB's "_id" is not supported anyway.
If you have any embedded document which also has an "_id" field, you have to exclude it too, e.g.
statement => "db.profiles.find({'updatedAt' : {'$gte': new ISODate(:sql_last_value)}},
{'_id': false, 'embedded_doc._id': false});"
So apparently, the short answer is No, it is not possible to have an input from a database in Logstash.
EDIT
@elssar, thank you for your answer:
Actually, there is a 3rd party mongodb input for logstash -
github.com/phutchins/logstash-input-mongodb – elssar