Having multiple outputs using Logstash - mongodb

I am new to Logstash and I have trouble understanding how to configure the following process:
Let's say I want Logstash to collect tweets and simultaneously index them in my ES and store them in a MongoDB.
I have succeeded in having Logstash collect tweets and index them in ES, but I don't know how to configure it to store the tweets in my MongoDB as well.
Is it possible? How do I configure it?

It's possible: you can configure multiple plugins in the output section of the conf file:
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["my-elasticsearch:9200"]
    index => "logs"
    document_type => "applog"
  }
  mongodb {
    isodate => true
    database => "metrics"
    collection => "logs"
    uri => "mongodb://127.0.0.1:27017"
  }
}
Check the Logstash documentation for all available mongodb options, as these may vary depending on the Logstash version (collection, database, and uri are required).
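For the tweet use case in the question, a minimal end-to-end sketch could look like the following. It assumes the logstash-input-twitter and logstash-output-mongodb plugins are installed; the credentials are placeholders, and the index/database/collection names are illustrative, not taken from the question:

input {
  twitter {
    # placeholder credentials from a Twitter developer account
    consumer_key => "<consumer_key>"
    consumer_secret => "<consumer_secret>"
    oauth_token => "<oauth_token>"
    oauth_token_secret => "<oauth_token_secret>"
    keywords => ["logstash"]    # collect tweets matching these terms
  }
}
output {
  # index each tweet in Elasticsearch...
  elasticsearch {
    hosts => ["my-elasticsearch:9200"]
    index => "tweets"
  }
  # ...and store the same event in MongoDB
  mongodb {
    uri => "mongodb://127.0.0.1:27017"
    database => "twitter"
    collection => "tweets"
  }
}

Every event flows to all plugins listed in the output section, so no extra routing is needed.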

Related

Logstash can't send logs to mongoDB

I'm trying to save logs to my local MongoDB with Logstash. This is my Logstash configuration:
input {
  tcp {
    port => 5001
    codec => "json_lines"
  }
}
output {
  file {
    path => "/var/log/some.log"
  }
  mongodb {
    id => "logstash-mongo"
    collection => "logs"
    database => "somelogs"
    uri => "mongodb://my-user:my-password@127.0.0.1:27017/somelogs"
    codec => "json"
  }
  stdout { codec => rubydebug }
}
Output to file actually works, but output to MongoDB does not. I set up MongoDB with authentication: I created both an admin user and a basic user with read and write access to the specific db. When I run Logstash it gives me this:
[2021-10-29T06:37:51,798][WARN ][logstash.outputs.mongodb ][main][logstash-mongo] Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2f0b40c8>, :exception=>#<Mongo::Auth::Unauthorized: User my-user is not authorized to access somelogs.>}
[2021-10-29T06:37:54,887][WARN ][logstash.outputs.mongodb ][main][logstash-mongo] Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2f0b40c8>, :exception=>#<Mongo::Error::OperationFailure: command insert requires authentication (13)>}
I tried both users to access MongoDB, but it still gives me this output. Any suggestions on what to do about this issue? Many thanks.
NOTE: I have already installed the MongoDB output plugin.
My mistake, the uri must be
uri => "mongodb://my-user:my-password@127.0.0.1:27017"
instead of
uri => "mongodb://my-user:my-password@127.0.0.1:27017/somelogs"
Logstash is running well now.
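The reason: when a database name is appended to the connection string, the MongoDB driver also uses it as the authentication database, and this user was not defined in somelogs. If you want to keep the database in the URI, the standard authSource connection-string option should also work (assuming the user was created in admin):

uri => "mongodb://my-user:my-password@127.0.0.1:27017/somelogs?authSource=admin"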

How to sync a MongoDB collection with an Elasticsearch index using Logstash?

I'm trying to use logstash-7.6.2 to sync my Mongo with Elasticsearch. I'm using the DbSchema JDBC driver.
input {
  jdbc {
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_driver_library => "/home/user/mongojdbc2.3.jar,/home/user/mongo-java-driver-3.12.6.jar,/home/user/gson-2.8.6.jar"
    jdbc_user => ""
    jdbc_password => ""
    jdbc_connection_string => "jdbc:mongodb://localhost:27027/test"
    statement => "db.mycollection.find()"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9220"]
    manage_template => false
    index => "testing"
  }
  stdout { codec => rubydebug }
}
But I get the following error:
Error: Java::JavaSql::SQLException: No suitable driver found for jdbc:mongodb://localhost:27027/test
Exception: Sequel::DatabaseConnectionError
Stack: java.sql.DriverManager.getConnection(java/sql/DriverManager.java:689)
       java.sql.DriverManager.getConnection(java/sql/DriverManager.java:247)
I also tried the Mongo native Java driver and the Unity JDBC driver, different versions of Mongo (from localhost and from a remote server), and different versions of Logstash. Everything comes down to this error.
I've met the same problems syncing MongoDB with Elastic, and I want to share my experience with you. What I've tried:
1. Logstash MongoDB input plugin. The easiest way to integrate; it needs minimal configuration. But it doesn't support arrays of objects: it simply ignores this data structure. You may flatten them at the filter stage using a Ruby script, converting arr: [{a: 1, b: 2}, {a: 3, b: 4}] -> arr_1_a = 1, arr_1_b = 2, ... and so on (see the filter sketch at the end of this answer).
2. Logstash + a JDBC Mongo driver. The problem is that all mentions of and instructions for this approach are somewhat outdated, and none of them link to the necessary Mongo library; the libraries used in those instructions have been updated many times since. We could search for older versions, but they do not support recent MongoDB versions. I spent a few days iterating through tens of different options (Wise Coders DbSchema, the Unity driver, many others). I opened every library to find the correct path to the Java class, but none of them worked: each time I got a different error (the most popular being No suitable driver found for jdbc:mongodb, as in your case).
3. Monstache. It was my last hope, and it worked flawlessly. The only things I had to do were convert my Mongo instance to a replica set (it can be single-node for dev purposes) and prepare a simple .toml config for Monstache. It also has a ready-to-go Docker image.
So I can personally recommend using Monstache for Mongo + Elastic syncing.
Hope this helps you.
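For option 1, here is a minimal sketch of the kind of Ruby filter flattening described above. The arr field name and the flattened key format are illustrative, not from the original answer:

filter {
  ruby {
    code => '
      arr = event.get("arr")
      if arr.is_a?(Array)
        arr.each_with_index do |item, i|
          next unless item.is_a?(Hash)
          # turn {"a" => 1, "b" => 2} at position 0 into arr_1_a and arr_1_b
          item.each { |k, v| event.set("arr_#{i + 1}_#{k}", v) }
        end
        event.remove("arr")  # drop the original array field
      end
    '
  }
}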

Mongo::Error::NoServerAvailable: No primary server is available in cluster -- an error occurred when I used Logstash to send data to MongoDB in Docker

This is my logstash.conf config:
input {
  stdin {
  }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://ip:3306/database"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "/app/mysql-connector-java-5.1.46/mysql-connector-java-5.1.46-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    record_last_run => "true"
    use_column_value => "false"
    tracking_column => "id"
    last_run_metadata_path => "/app/info/station_parameter.txt"
    clean_run => "false"
    jdbc_paging_enabled => "true"
    jdbc_page_size => 100
    statement => "select * from database where id>=1"
    schedule => "* * * * *"
  }
}
output {
  mongodb {
    uri => "mongodb://192.168.1.103:27017"
    database => "mydata"
    collection => "mycollection"
  }
}
The error is:
Failed to send event to MongoDB, retrying in 3 seconds {:event=>#, :exception=>#]> with timeout=30, LT=0.015>}
Not sure if you'll find this useful, but we experienced the same issue in two different scenarios and resolved it, so I'd like to share:
1. The client was not able to resolve the primary server's address due to network-related issues in the Docker files. The MongoDB server was up and running, but because the client could not resolve its address, this error was thrown.
2. MongoDB was restarted (or the connection was lost for some other reason), and when MongoDB came back online, the Mongoid client (Ruby) was not able to reconnect. The solution was to restart the client as well, to re-establish the connection. After investigation, I would say it was most likely a version incompatibility between the driver and the MongoDB server.
In general, we were not able to find a permanent solution; it appears to be an issue with the MongoDB drivers. But one thing is for sure: check the compatibility of the MongoDB driver and the server.
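For the first scenario, the fix amounts to making the MongoDB hostname in the URI resolvable from inside the Logstash container. A minimal sketch, assuming both containers share a Docker network on which the MongoDB service is named mongo (the name is illustrative):

output {
  mongodb {
    # use the Docker network alias rather than a host IP that may not be
    # resolvable or reachable from inside the Logstash container
    uri => "mongodb://mongo:27017"
    database => "mydata"
    collection => "mycollection"
  }
}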

logstash connection to database issue

Using Logstash, what I'm trying to do is dump all tagnames from a table in a database to an index. The problem I'm facing is that Logstash works fine if I specify the IP address as 127.0.0.1 for the PostgreSQL connection.
But if I specify my actual IP, or some other user's IP, the connection is never made and Logstash eventually fails.
Here's the configuration:
input {
  jdbc {
    # PostgreSQL JDBC connection string to our database, empris
    jdbc_connection_string => "jdbc:postgresql://127.0.0.1:5432/empris"
    # The user we wish to execute our statement as
    jdbc_user => "postgres"
    # The password for the user
    jdbc_password => ""
    # The path to our downloaded JDBC driver
    jdbc_driver_library => "/home/aravind/Desktop/postgresql-9.4-1206-jdbc4.jar"
    # The name of the driver class for PostgreSQL
    jdbc_driver_class => "org.postgresql.Driver"
    # Our query
    statement => "SELECT distinct tagname from filemetadata"
  }
}
output {
  elasticsearch {
    hosts => "192.168.16.238:9200"
    index => "arempris"
    document_type => "tags"
  }
}
And if the jdbc_connection_string is jdbc:postgresql://192.168.16.233:5432, Logstash fails.
Would greatly appreciate your help in understanding the cause of this. Thanks in advance.
It seems like the IP you are mentioning is not accessible from Logstash. Can you try to ping the IP and see if it is reachable?
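One likely cause worth checking (an assumption, since the question doesn't show the server configuration): PostgreSQL listens only on localhost by default, so connections addressed to the machine's network IP are refused even from the same host. The server-side changes would look roughly like this, followed by a PostgreSQL restart (the subnet is an example based on the addresses in the question):

# postgresql.conf: listen on all interfaces instead of only localhost
listen_addresses = '*'

# pg_hba.conf: let hosts on the local subnet authenticate with a password
host    empris    postgres    192.168.16.0/24    md5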

How to read from a replica set Mongo with mongodb-erlang

1. {ok, P} = mongoc:connect({rs, <<"dev_mongodb">>, ["dev_mongodb001:27017", "dev_mongodb002:27017"]},
                            [{name, mongopool}, {register, mongotopology}, {rp_mode, primary}, {rp_tags, [{tag, 1}]}],
                            [{login, <<"root">>}, {password, <<"mongoadmin">>}, {database, <<"admin">>}]).
2. {ok, Pool} = mc_topology:get_pool(P, []).
3. mongoc:find(Pool, {<<"DoctorLBS">>, <<"mongoMessage">>}, #{<<"type">> => <<"5">>}).
I used the latest version from GitHub and got an error at step 3.
It seems my selector is not valid. Is there any example of how to use mongodb-erlang?
My MongoDB version is 3.2.6; the auth type is SCRAM-SHA-1.
mongoc:find(Pool, <<"mongoMessage">>, #{<<"type">> => <<"5">>}).
I tried this in both rs and single mode, and still got this error.
Is there any other simple way to connect and read?
I just need to read some data from Mongo once, when my Erlang program starts; no other actions.
Today's version of the driver does not support the {Db, Collection} tuple form, due to the new query API introduced in Mongo 2.6.
You should connect to the DoctorLBS database instead, and then use
mongoc:find(Pool, <<"mongoMessage">>, #{<<"type">> => <<"5">>}).
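Concretely, that means moving the database option in the connect call of step 1 to DoctorLBS and dropping the database name from the find call. A sketch reusing the question's own calls (assuming the root credentials are valid for that database; adjust the login options to your setup):

% connect against the DoctorLBS database instead of admin
{ok, P} = mongoc:connect({rs, <<"dev_mongodb">>, ["dev_mongodb001:27017", "dev_mongodb002:27017"]},
                         [{name, mongopool}, {register, mongotopology}, {rp_mode, primary}],
                         [{login, <<"root">>}, {password, <<"mongoadmin">>}, {database, <<"DoctorLBS">>}]),
{ok, Pool} = mc_topology:get_pool(P, []),
% no {Db, Collection} tuple: the collection name alone is enough now
mongoc:find(Pool, <<"mongoMessage">>, #{<<"type">> => <<"5">>}).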