logstash JDBC connection to DB2 failed - db2

I want to query one of our DB2 database tables via Logstash, but I get this exception:
[2018-02-06T13:34:34,175][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#, :backtrace=>["com.ibm.as400.access.JDError.createSQLExceptionSubClass(com/ibm/as400/access/JDError.java:824)", "com.ibm.as400.access.JDError.throwSQLException(com/ibm/as400/access/JDError.java:553)", "com.ibm.as400.access.AS400JDBCConnection.setProperties(com/ibm/as400/access/AS400JDBCConnection.java:3391)", "com.ibm.as400.access.AS400JDBCDriver.prepareConnection(com/ibm/as400/access/AS400JDBCDriver.java:1419)", "com.ibm.as400.access.AS400JDBCDriver.initializeConnection(com/ibm/as400/access/AS400JDBCDriver.java:1256)", "com.ibm.as400.access.AS400JDBCDriver.connect(com/ibm/as400/access/AS400JDBCDriver.java:395)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)",
This is my input config:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  jdbc {
    jdbc_connection_string => "jdbc:as400://ip/db"
    jdbc_user => "usr"
    jdbc_password => "pass"
    jdbc_driver_library => "/etc/logstash/lib/jt400-9.1.jar"
    jdbc_driver_class => "com.ibm.as400.access.AS400JDBCDriver"
    statement => "SELECT * FROM table1 FETCH FIRST ROWS ONLY"
  }
}
I should mention that the firewall on the database server has been disabled.
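For reference, the jt400 connection string takes the form jdbc:as400://host/default-schema, with optional driver properties appended as semicolon-separated pairs. Since the stack trace fails inside AS400JDBCConnection.setProperties, an unsupported property or a malformed URL is one plausible cause. A minimal sketch of the jdbc block (the schema qualifier, row limit, and property values are illustrative assumptions, not taken from the question):

jdbc {
  # "db" is assumed to be the default schema on the IBM i system;
  # naming/errors are standard jt400 driver properties
  jdbc_connection_string => "jdbc:as400://ip/db;naming=sql;errors=full"
  jdbc_user => "usr"
  jdbc_password => "pass"
  jdbc_driver_library => "/etc/logstash/lib/jt400-9.1.jar"
  jdbc_driver_class => "com.ibm.as400.access.AS400JDBCDriver"
  # FETCH FIRST n ROWS ONLY keeps the result set small while testing the connection
  statement => "SELECT * FROM db.table1 FETCH FIRST 10 ROWS ONLY"
}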

Related

Not getting SQL Statement output using ELK logstash jdbc plugin

I am trying to get output using Logstash with the JDBC plugin but I am not getting any data. I tried a different way to get the pg_stat_replication data, but under Index Management I cannot find that index, where my index name should be pg_stat_replication.
Config file path: /etc/logstash/conf.d/postgresql.conf
input {
  # pg_stat_replication;
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/postgresql-jdbc.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://192.168.43.21:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "p"
    statement => "SELECT * FROM pg_stat_replication"
    schedule => "* * * * *"
    type => "pg_stat_replication"
  }
}
output {
  elasticsearch {
    hosts => "http://192.168.43.100:9200"
    index => "%{type}"
    user => "elastic"
    password => "abc#123"
  }
}
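One way to narrow this down is to confirm that the jdbc input emits events at all before looking at the index name. A temporary stdout output alongside elasticsearch can show that (a debugging sketch, not part of the original question):

output {
  # print every event to the Logstash log so you can see whether rows arrive
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://192.168.43.100:9200"
    index => "%{type}"
    user => "elastic"
    password => "abc#123"
  }
}

If events show up on stdout but the index still does not appear under Index Management, querying GET _cat/indices on Elasticsearch directly will show whether the %{type} reference resolved to the expected index name.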

Connect to mongodb using logstash Jdbc_streaming filter plugin

I'm trying to fetch data from MongoDB using the jdbc_streaming filter plugin in Logstash on Windows.
I'm using mongo-java-driver-3.4.2.jar to connect to the database, but I get an error like this:
JavaSql::SQLException: No suitable driver found for jdbc:mongo://localhost:27017/EmployeeDB
I haven't had any luck with existing references. I'm using Logstash 7.8.0. This is my logstash config:
jdbc_streaming {
  jdbc_driver_library => "C:/Users/iTelaSoft-User/Downloads/logstash-7.8.0/mongo-java-driver-3.4.2.jar"
  jdbc_driver_class => "com.mongodb.MongoClient"
  jdbc_connection_string => "jdbc:mongo://localhost:27017/EmployeeDB"
  statement => "select * from Employee"
  target => "name"
}
You can also try the following:
1. Download https://dbschema.com/jdbc-drivers/MongoDbJdbcDriver.zip
2. Unzip it and copy all the files to ~/logstash-7.8.0/logstash-core/lib/jars/
3. Modify the config file
Example:
input {
  jdbc {
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_driver_library => "mongojdbc2.1.jar"
    jdbc_user => "user"
    jdbc_password => "pwd"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/EmployeeDB"
    statement => "select * from Employee"
  }
}
output {
  stdout { }
}
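If the goal is specifically the jdbc_streaming filter from the original question rather than a jdbc input, the same DbSchema driver can be plugged into the filter. A sketch, assuming the jar has been copied into logstash-core/lib/jars/ as described above:

filter {
  jdbc_streaming {
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_driver_library => "mongojdbc2.1.jar"
    jdbc_user => "user"
    jdbc_password => "pwd"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/EmployeeDB"
    statement => "select * from Employee"
    # the query result is stored in the "name" field, mirroring the question's target
    target => "name"
  }
}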

Pushing data to a kafka server with Logstash

I have a Logstash conf file that looks like this:
input {
  jdbc {
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:#//the-s2.db.oracle.yn:1521/DPP2.mind.com"
    jdbc_user => "STG_TEST"
    jdbc_password => "cddcdcd"
    parameters => { "orderid" => 1212332365 }
    statement => "select PO_SEARCH_IL_ID,ORDER_DATE,REF_1,SHIPPING_WINDOW_START,SHIPPING_WINDOW_END FROM ods.po_search_il where PO_SEARCH_IL_ID =:orderid"
    schedule => "* * * * *"
    clean_run => true
  }
}
output {
  kafka {
    bootstrap_servers => "mykafkaservername.kn:9092"
    topic_id => ["test3"]
  }
}
The script runs and the topic test3 is created on the Kafka server, but there is no data in it.
Could somebody help with that issue?
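A debugging sketch (assuming the query itself returns rows): add a stdout output to confirm the jdbc input emits events, pass topic_id as a plain string, and set an explicit json codec so the whole event is serialized into the Kafka message rather than relying on the default plain codec:

output {
  # confirm that the jdbc input actually produces events
  stdout { codec => rubydebug }
  kafka {
    bootstrap_servers => "mykafkaservername.kn:9092"
    topic_id => "test3"   # topic_id expects a string
    codec => json         # serialize the full event as JSON into the message
  }
}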

Can't config logstash to postgres

I'm not able to configure my logstash-2.3.2 with my postgresql-9.5.4-1-windows-x64.
Here is my log-config.conf file:
input {
  jdbc {
    # Postgres jdbc connection string to our database, mydb
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/ambule"
    # The user we wish to execute our statement as
    jdbc_user => "postgres"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "C:\Users\Administrator\Downloads\postgresql-9.4-1201-jdbc41.jar"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT * from table1"
  }
}
output {
  stdout { codec => json_lines }
}
I'm getting an error.
I guess the exception lies within the jdbc_connection_string. What if you write it like this:
jdbc_connection_string => "jdbc:postgresql://host:port/database?user=username"
(note the user parameter appended to the connection string). It seems to have been missed out of the docs. Hope it helps!
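Applied to the config above, that suggestion would look roughly like this (a sketch; the username and password values are placeholders, and the PostgreSQL JDBC driver also accepts them via the separate jdbc_user/jdbc_password options):

jdbc {
  # user (and optionally password) passed as URL parameters; values are hypothetical
  jdbc_connection_string => "jdbc:postgresql://localhost:5432/ambule?user=postgres&password=secret"
  jdbc_driver_library => "C:\Users\Administrator\Downloads\postgresql-9.4-1201-jdbc41.jar"
  jdbc_driver_class => "org.postgresql.Driver"
  statement => "SELECT * from table1"
}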

logstash out of memory reading postgres large table

I'm trying to index a large database table with more than 10,000,000 rows,
and Logstash is running out of memory.
The Error:
logstash_1 | Error: Your application used more memory than the safety cap of 1G.
logstash_1 | Specify -J-Xmx####m to increase it (#### = cap size in MB).
logstash_1 | Specify -w for full OutOfMemoryError stack trace
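The cap referenced in this error is the Logstash JVM heap. It can be raised in config/jvm.options or via the LS_JAVA_OPTS environment variable (values below are illustrative), although the accepted fix further down avoids needing a bigger heap by paging the query:

# config/jvm.options (illustrative sizes)
-Xms2g
-Xmx2g

# or, one-off via the environment (pipeline path is a placeholder)
LS_JAVA_OPTS="-Xmx2g" bin/logstash -f /path/to/pipeline.conf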
My logstash configuration:
input {
  jdbc {
    # Postgres jdbc connection string to our database, mydb
    jdbc_connection_string => "jdbc:postgresql://database:5432/predictiveparking"
    # The user we wish to execute our statement as
    jdbc_user => "predictiveparking"
    jdbc_password => "insecure"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/app/postgresql-9.4.1212.jar"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT * from scans_scan limit 10"
  }
}
#output {
#  stdout { codec => json_lines }
#}
output {
  elasticsearch {
    index => "scans"
    sniffing => false
    document_type => "scan"
    document_id => "id"
    hosts => ["elasticsearch"]
  }
}
Just enabling paging fixed it. I added:
jdbc_paging_enabled => true
Now the data from the database gets fetched in chunks and we do not run out of memory. Make sure the SQL query is ordered!
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#plugins-inputs-jdbc-jdbc_paging_enabled
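Put together, a sketch of the input with paging enabled (the page size and the ORDER BY column are illustrative assumptions; jdbc_page_size defaults to 100000 if omitted):

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://database:5432/predictiveparking"
    jdbc_user => "predictiveparking"
    jdbc_password => "insecure"
    jdbc_driver_library => "/app/postgresql-9.4.1212.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # paging splits the result set into multiple queries using LIMIT/OFFSET
    jdbc_paging_enabled => true
    jdbc_page_size => 50000
    # a deterministic ORDER BY keeps the pages consistent across queries
    statement => "SELECT * FROM scans_scan ORDER BY id"
  }
}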