I'm not able to configure my Logstash 2.3.2 with my PostgreSQL 9.5.4-1 (Windows x64).
Here is my log-config.conf file:
input {
  jdbc {
    # Postgres jdbc connection string to our database, mydb
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/ambule"
    # The user we wish to execute our statement as
    jdbc_user => "postgres"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "C:\Users\Administrator\Downloads\postgresql-9.4-1201-jdbc41.jar"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT * from table1"
  }
}
output {
  stdout { codec => json_lines }
}
I'm getting an error.
I guess the exception lies within the jdbc_connection_string. What if you have it as such:
jdbc_connection_string => "jdbc:postgresql://host:port/database?user=username"
----------------------------------------------------------------------------------------^ try adding the user
Seems like it has been missed out of the docs. Hope it helps!
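A minimal sketch of that suggestion applied to the config above, assuming the same database (ambule) and user (postgres) from the question; a jdbc_password setting would still be needed if the role requires one:

jdbc_connection_string => "jdbc:postgresql://localhost:5432/ambule?user=postgres"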
I'm trying to get output using Logstash with the JDBC plugin, but I'm not getting any data. I tried a different way to get pg_stat_replication data, but under Index Management I could not find that index (my index name is pg_stat_replication).
Config file path: /etc/logstash/conf.d/postgresql.conf
input {
  # pg_stat_replication;
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/postgresql-jdbc.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://192.168.43.21:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "p"
    statement => "SELECT * FROM pg_stat_replication"
    schedule => "* * * * *"
    type => "pg_stat_replication"
  }
}
output {
  elasticsearch {
    hosts => "http://192.168.43.100:9200"
    index => "%{type}"
    user => "elastic"
    password => "abc#123"
  }
}
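(Not from the original thread, just a debugging sketch: temporarily adding a stdout output alongside elasticsearch shows whether the jdbc input emits events at all; if rows print to the console but the index never appears, the problem is on the Elasticsearch side. Multiple output blocks in one config are merged by Logstash.)

output {
  stdout { codec => rubydebug }   # print each event to the console for debugging
}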
I'm trying to fetch data from MongoDB using the jdbc_streaming filter plugin in Logstash on Windows.
I'm using mongo-java-driver-3.4.2.jar to connect to the database, but I'm getting an error like this:
Java::JavaSql::SQLException: No suitable driver found for jdbc:mongo://localhost:27017/EmployeeDB
No luck with existing references. I'm using Logstash 7.8.0. This is my Logstash config:
jdbc_streaming {
  jdbc_driver_library => "C:/Users/iTelaSoft-User/Downloads/logstash-7.8.0/mongo-java-driver-3.4.2.jar"
  jdbc_driver_class => "com.mongodb.MongoClient"
  jdbc_connection_string => "jdbc:mongo://localhost:27017/EmployeeDB"
  statement => "select * from Employee"
  target => "name"
}
You can also try the following:
1. Download https://dbschema.com/jdbc-drivers/MongoDbJdbcDriver.zip
2. Unzip it and copy all the files to the path ~/logstash-7.8.0/logstash-core/lib/jars/
3. Modify the config file
Example:
input {
  jdbc {
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_driver_library => "mongojdbc2.1.jar"
    jdbc_user => "user"
    jdbc_password => "pwd"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/EmployeeDB"
    statement => "select * from Employee"
  }
}
output {
  stdout { }
}
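Since the original question used the jdbc_streaming filter rather than the jdbc input, the same driver settings should carry over; a sketch under that assumption (the jdbc_* options of the filter mirror the input's, and the jar name assumes the dbschema driver copied in step 2):

filter {
  jdbc_streaming {
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_driver_library => "mongojdbc2.1.jar"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/EmployeeDB"
    statement => "select * from Employee"
    target => "name"   # enrichment results land in this event field
  }
}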
I have to import MongoDB data into Elasticsearch, so I used the following config with Logstash:
input {
  jdbc {
    jdbc_driver_library => "D:/mongodb_unityjdbc_full.jar"
    jdbc_driver_class => "mongodb.jdbc.MongoDriver"
    jdbc_connection_string => "jdbc:mongodb://10.10.20.125:27017"
    jdbc_user => ""
    statement => "SELECT * FROM collection_name.documentname"
  }
}
output {
  elasticsearch {
    hosts => 'http://localhost:9200'
    index => 'person_data'
    document_type => "person_data"
  }
  stdout { codec => rubydebug }
}
But I receive the following error:
Error: mongodb.jdbc.MongoDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
The file D:/mongodb_unityjdbc_full.jar either does not exist or is the wrong file. In either case, you should download the official file and put it at the specified location. This is the official download URL: http://www.unityjdbc.com/mongojdbc/mongo_jdbc.php
The file path you have used is incorrect. Please use:
jdbc_driver_library => "D:\mongodb_unityjdbc_full.jar"
That is, change the forward slash to a backslash. Hope it works!
I want to query one of our DB2 database tables via Logstash, but I got this exception:
[2018-02-06T13:34:34,175][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#, :backtrace=>["com.ibm.as400.access.JDError.createSQLExceptionSubClass(com/ibm/as400/access/JDError.java:824)", "com.ibm.as400.access.JDError.throwSQLException(com/ibm/as400/access/JDError.java:553)", "com.ibm.as400.access.AS400JDBCConnection.setProperties(com/ibm/as400/access/AS400JDBCConnection.java:3391)", "com.ibm.as400.access.AS400JDBCDriver.prepareConnection(com/ibm/as400/access/AS400JDBCDriver.java:1419)", "com.ibm.as400.access.AS400JDBCDriver.initializeConnection(com/ibm/as400/access/AS400JDBCDriver.java:1256)", "com.ibm.as400.access.AS400JDBCDriver.connect(com/ibm/as400/access/AS400JDBCDriver.java:395)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)",
This is my input config:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  jdbc {
    jdbc_connection_string => "jdbc:as400://ip/db"
    jdbc_user => "usr"
    jdbc_password => "pass"
    jdbc_driver_library => "/etc/logstash/lib/jt400-9.1.jar"
    jdbc_driver_class => "com.ibm.as400.access.AS400JDBCDriver"
    statement => "SELECT * FROM table1 FETCH FIRST ROWS ONLY"
  }
}
I should mention that the firewall on the database server has been disabled.
I'm trying to index a large database table with more than 10,000,000 rows, and Logstash is running out of memory. :(
The Error:
logstash_1 | Error: Your application used more memory than the safety cap of 1G.
logstash_1 | Specify -J-Xmx####m to increase it (#### = cap size in MB).
logstash_1 | Specify -w for full OutOfMemoryError stack trace
My logstash configuration:
input {
  jdbc {
    # Postgres jdbc connection string to our database, mydb
    jdbc_connection_string => "jdbc:postgresql://database:5432/predictiveparking"
    # The user we wish to execute our statement as
    jdbc_user => "predictiveparking"
    jdbc_password => "insecure"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/app/postgresql-9.4.1212.jar"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT * from scans_scan limit 10"
  }
}
#output {
#  stdout { codec => json_lines }
#}
output {
  elasticsearch {
    index => "scans"
    sniffing => false
    document_type => "scan"
    document_id => "id"
    hosts => ["elasticsearch"]
  }
}
Just enabling paging fixed it. I added:
jdbc_paging_enabled => true
Now the data from the database gets fetched in chunks and we no longer run out of memory. Make sure the SQL query is ordered!
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#plugins-inputs-jdbc-jdbc_paging_enabled
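A minimal sketch of the relevant input options, assuming the scans_scan table from the question has an id column to order by (the page size is an illustrative value, not a recommendation):

jdbc {
  jdbc_connection_string => "jdbc:postgresql://database:5432/predictiveparking"
  jdbc_user => "predictiveparking"
  jdbc_password => "insecure"
  jdbc_driver_library => "/app/postgresql-9.4.1212.jar"
  jdbc_driver_class => "org.postgresql.Driver"
  jdbc_paging_enabled => true
  jdbc_page_size => 50000                                 # rows fetched per page
  statement => "SELECT * FROM scans_scan ORDER BY id"     # ordered, so pages are stable across LIMIT/OFFSET
}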