I'm trying to use Logstash 7.6.2 to sync my MongoDB with Elasticsearch. I'm using the DbSchema JDBC driver.
input {
  jdbc {
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_driver_library => "/home/user/mongojdbc2.3.jar,/home/user/mongo-java-driver-3.12.6.jar,/home/user/gson-2.8.6.jar"
    jdbc_user => ""
    jdbc_password => ""
    jdbc_connection_string => "jdbc:mongodb://localhost:27027/test"
    statement => "db.mycollection.find()"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9220"]
    manage_template => false
    index => "testing"
  }
  stdout { codec => rubydebug }
}
But I get the following error:
Error: Java::JavaSql::SQLException: No suitable driver found for jdbc:mongodb://localhost:27027/test
Exception: Sequel::DatabaseConnectionError
Stack:
java.sql.DriverManager.getConnection(java/sql/DriverManager.java:689)
java.sql.DriverManager.getConnection(java/sql/DriverManager.java:247)
I also tried the MongoDB native Java driver and the UnityJDBC driver, different versions of MongoDB (both on localhost and on a remote server), and different versions of Logstash. Everything comes down to this error.
I've met the same problems syncing MongoDB with Elastic and I want to share my experience with you. What I've tried:
Logstash MongoDB input plugin. The easiest way to integrate, and it needs minimal configuration. But it doesn't support arrays of objects; it simply ignores this data structure. You can flatten them at the filter stage using a Ruby script (converting arr: [{a: 1, b: 2}, {a: 3, b: 4}] -> arr_1_a = 1, arr_1_b = 2, ... and so on).
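The flattening logic can be sketched in plain Ruby (the helper name and the one-based indexing are my choice; inside Logstash this would live in a ruby filter's code block and operate on event fields):

```ruby
# Flatten an array of hashes into scalar fields:
# [{a: 1, b: 2}, {a: 3, b: 4}] -> {"arr_1_a"=>1, "arr_1_b"=>2, "arr_2_a"=>3, "arr_2_b"=>4}
def flatten_array_field(name, arr)
  arr.each_with_index.flat_map do |item, i|
    # Prefix each key with the field name and the element's 1-based position
    item.map { |k, v| ["#{name}_#{i + 1}_#{k}", v] }
  end.to_h
end

puts flatten_array_field("arr", [{a: 1, b: 2}, {a: 3, b: 4}])
```

In a real pipeline you would call something like this for each array-valued field, write the resulting keys back onto the event, and remove the original array.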
Logstash + a JDBC MongoDB driver. The problem is that all mentions of and instructions for this setup are a little outdated, and none of them link to the exact Mongo library needed; the libraries used in those instructions have been updated many times since. You can of course hunt down older versions, but they don't support recent MongoDB releases. I spent a few days iterating through tens of different options (Wise Coders DbSchema, the Unity driver, many others). I opened every library to work out the correct path to the Java class, but none of them worked: every time I hit a different error (the most common being No suitable driver found for jdbc:mongodb, like yours).
Monstache. It was my last hope and it worked flawlessly. The only thing I had to do was convert my Mongo instance to a replica set (it can be single-node for dev purposes) and prepare a simple .toml config for Monstache. It also has a ready-to-go Docker image.
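For illustration, a minimal Monstache .toml along these lines (the URLs and the namespace are placeholders, and the exact option set depends on your Monstache version):

```toml
# Connection endpoints (placeholders)
mongo-url = "mongodb://localhost:27017"
elasticsearch-urls = ["http://localhost:9200"]

# Namespaces (db.collection) to copy over initially and then tail via change streams
direct-read-namespaces = ["test.mycollection"]
change-stream-namespaces = ["test.mycollection"]
```

Converting a standalone mongod to a single-node replica set amounts to starting it with --replSet rs0 and running rs.initiate() once in the shell.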
So I can personally recommend using Monstache for Mongo + Elastic syncing.
Hope this helps you.
I am using the Slim 4 framework along with the Jenssegers MongoDB library and Capsule (Illuminate\Database from Laravel). I have the MongoDB extension installed on my Linux server and everything seems OK connection-wise, but I cannot seem to insert data into the database or get anything from it. I have tried both the query builder and Eloquent. My query builder example is below.
use Illuminate\Database\Capsule\Manager as Capsule;

$capsule = new Capsule;

$capsule->getDatabaseManager()->extend('mongodb', function ($config, $name) {
    $config['name'] = $name;
    return new \Jenssegers\Mongodb\Connection($config);
});

$capsule->addConnection([
    'host' => '127.0.0.1',
    'port' => 27017,
    'database' => 'testing',
    'username' => '',
    'password' => '',
], 'mongodb');

// Set the event dispatcher used by Eloquent models... (optional)
use Illuminate\Events\Dispatcher;
use Illuminate\Container\Container;
$capsule->setEventDispatcher(new Dispatcher(new Container));

// Make this Capsule instance available globally via static methods... (optional)
$capsule->setAsGlobal();

// Setup the Eloquent ORM... (optional; unless you've used setEventDispatcher())
$capsule->bootEloquent();

$capsule->connection('mongodb')->collection('testing')->insert([
    'test1' => 'hello',
    'test2' => 'world',
]);
The database and collection exist as I can see them in Compass. Can anyone see where I'm going wrong with the code or is it a configuration issue?
There were two problems. One was the missing driver, as you point out; the other was that my PHP runs on Linux in Vagrant and was pointing to localhost, while the MongoDB server is running on the Windows host machine, not in the Linux guest. Thanks.
This is my logstash.conf:
input {
  stdin {
  }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://ip:3306/database"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "/app/mysql-connector-java-5.1.46/mysql-connector-java-5.1.46-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    record_last_run => "true"
    use_column_value => "false"
    tracking_column => "id"
    last_run_metadata_path => "/app/info/station_parameter.txt"
    clean_run => "false"
    jdbc_paging_enabled => "true"
    jdbc_page_size => 100
    statement => "select * from database where id>=1"
    schedule => "* * * * *"
  }
}
output {
  mongodb {
    uri => "mongodb://192.168.1.103:27017"
    database => "mydata"
    collection => "mycollection"
  }
}
The error is:
Failed to send event to MongoDB, retrying in 3 seconds {:event=>#, :exception=>#]> with timeout=30, LT=0.015>}
Not sure if you'll find this useful, but we experienced the same issue in two different scenarios and resolved it, so I'd like to share:
1. The client was not able to resolve the primary server's address due to network-related issues in our Docker files, which resulted in this error. The MongoDB server was up and running, but because the client could not resolve the address, the error was thrown.
2. MongoDB was restarted (or the connection was lost for some other reason), and when MongoDB came back online, the Mongoid client (Ruby) was not able to reconnect. The solution was to restart the client as well, to establish the connection again. After investigation, I'd say it was most likely a version incompatibility between the driver and the MongoDB server.
In general, we were not able to find a permanent solution, and it appears to be an issue with the MongoDB drivers. But one thing is for sure: check the compatibility of the MongoDB driver and the server.
I have an application using .NET Core 2.2 that connects to a MongoDB 3.6 cluster in Atlas. The application is hosted in Azure as a Linux Docker container and uses the MongoDB .NET driver 2.7.3. The app periodically (once every couple of minutes) receives the following timeout exceptions:
System.TimeoutException at MongoDB.Driver.Core.Clusters.Cluster.ThrowTimeoutException
and
System.TimeoutException at MongoDB.Driver.Core.Connections.TcpStreamFactory.ConnectAsync
The mongo client instance is configured according to the MongoDb docs, i.e.
var url = MongoUrl.Create("mongodb+srv://user:password@cluster.gcp.mongodb.net/?authSource=admin&retryWrites=true&ssl=true");
var clientSettings = MongoClientSettings.FromUrl(url);
clientSettings.SslSettings = new SslSettings() { EnabledSslProtocols = SslProtocols.Tls12 };
void SocketConfigurator(Socket s) => s.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.KeepAlive, true);
clientSettings.ClusterConfigurator = builder =>
builder.ConfigureTcp(tcp => tcp.With(socketConfigurator: (Action<Socket>)SocketConfigurator));
return new MongoClient(clientSettings);
I checked a number of SO questions, including MongoDB C# 2.0 TimeoutException and SocketTimeout with opened connection in MongoDB, but the suggestions seem to be either outdated (reported as fixed in the current version of the driver) or to have no permanent positive effect (setting timeouts in the connection string, i.e. connectTimeoutMS=90000&socketTimeoutMS=90000&maxIdleTimeMS=90000). The second suggestion (setting tcp_keepalive_time) seems not to be applicable to a Docker container in Azure. Please help.
Have you tried a setting like this:
var client = new MongoClient(new MongoClientSettings
{
    Server = new MongoServerAddress("xxxx"),
    ClusterConfigurator = builder =>
    {
        builder.ConfigureCluster(settings => settings.With(serverSelectionTimeout: TimeSpan.FromSeconds(10)));
    }
});
I am new to Logstash and have trouble understanding how to configure the following process:
Let's say I want Logstash to collect tweets and simultaneously index them in my ES and store them in a MongoDB.
I've succeeded in having Logstash collect tweets and index them in ES, but I don't know how to configure it to store the tweets in my MongoDB as well.
Is it possible? How do I configure it?
It's possible; you can configure multiple plugins in the output section of the conf file:
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["my-elasticsearch:9200"]
    index => "logs"
    document_type => "applog"
  }
  mongodb {
    isodate => true
    database => "metrics"
    collection => "logs"
    uri => "mongodb://127.0.0.1:27017"
  }
}
Check the Logstash documentation for all available mongodb options, as these may vary depending on the Logstash version (collection, database, and uri are required).
1. {ok,P}= mongoc:connect({rs, <<"dev_mongodb">>, [ "dev_mongodb001:27017", "dev_mongodb002:27017"]}, [{name, mongopool}, {register, mongotopology}, { rp_mode, primary},{ rp_tags, [{tag,1}]}], [{login, <<"root">>}, {password, <<"mongoadmin">>}, {database, <<"admin">>}]).
2. {ok, Pool} = mc_topology:get_pool(P, []).
3. mongoc:find(Pool, {<<"DoctorLBS">>, <<"mongoMessage">>}, #{<<"type">> => <<"5">>}).
I used the latest version from GitHub, and got an error at step 3.
It seems my selector is not valid. Is there any example of how to use mongodb-erlang?
My mongodb version is 3.2.6, auth type is SCRAM-SHA1.
mongoc:find(Pool, <<"mongoMessage">>, #{<<"type">> => <<"5">>}).
I tried this in rs and single mode, and still got this error.
Is there any other simple way to connect and read?
I just need to read some data once from mongo when my erlang program start, no other actions.
Today's version of the driver does not support the {Db, Collection} tuple form, due to the new query API introduced in MongoDB 2.6.
You should connect to the DoctorLBS database instead, and then use
mongoc:find(Pool, <<"mongoMessage">>, #{<<"type">> => <<"5">>}).