Replace the last search result - elastic-stack

I am currently monitoring a database with the JDBC input.
The database tracks product sales: a sale first gets the status "WAITING FOR PAYMENT" and, after payment, the status "PAID".
However, when the payment status changes, the indexed document is not updated, so only the first status is ever counted.
How can I make Logstash update the status field dynamically?
The idea is for the index to mirror the database, receiving updates on top of the documents that already exist, so that we can build dashboards on top of those updates.
My input is like this:
input {
  jdbc {
    tags => ["oracle"]
    jdbc_driver_library => "/usr/share/logstash/lib/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "***"
    jdbc_user => "***"
    jdbc_password => "***"
    jdbc_validate_connection => true
    jdbc_paging_enabled => true
    use_column_value => true
    tracking_column => "unix_ts_in_secs"
    tracking_column_type => "timestamp"
    schedule => "*/1 * * * *"
    statement => "SELECT PRODUCTS.ID, PRODUCTS.STATUS, TO_TIMESTAMP(PRODUCTS.CREATION_DATE) AS unix_ts_in_secs FROM PRODUCTS WHERE (TO_TIMESTAMP(PRODUCTS.CREATION_DATE) > :sql_last_value) ORDER BY PRODUCTS.CREATION_DATE ASC"
    record_last_run => true
  }
}

In the output Elasticsearch configuration, just update each document, keyed by the product ID, as below:
output {
  elasticsearch {
    ssl => true
    user => "***"
    password => "***"
    hosts => ["https://elastic-ip:9200"]
    keystore => "path to cert"
    keystore_password => "****"
    truststore => "path to cert"
    truststore_password => "****"
    index => "data-%{+yyyy.MM.dd}"
    # The jdbc input lowercases column names by default, so the field
    # produced by "SELECT PRODUCTS.ID ..." is "id", not "PRODUCTS.ID".
    document_id => "%{id}"
    doc_as_upsert => true
    action => "update"
  }
}
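For the status change to be picked up at all, the scheduled query also has to re-select rows that were modified after they were first read; tracking on CREATION_DATE only ever sees new rows. Below is a sketch of the input, assuming the PRODUCTS table has (or can be given) a column that records the last modification time, called LAST_UPDATE_DATE here purely for illustration:

input {
  jdbc {
    tags => ["oracle"]
    jdbc_driver_library => "/usr/share/logstash/lib/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "***"
    jdbc_user => "***"
    jdbc_password => "***"
    jdbc_validate_connection => true
    jdbc_paging_enabled => true
    use_column_value => true
    tracking_column => "unix_ts_in_secs"
    tracking_column_type => "timestamp"
    schedule => "*/1 * * * *"
    # LAST_UPDATE_DATE is a placeholder name: use whichever column your
    # table actually updates when the sale status changes.
    statement => "SELECT PRODUCTS.ID, PRODUCTS.STATUS, PRODUCTS.LAST_UPDATE_DATE AS unix_ts_in_secs FROM PRODUCTS WHERE PRODUCTS.LAST_UPDATE_DATE > :sql_last_value ORDER BY PRODUCTS.LAST_UPDATE_DATE ASC"
    record_last_run => true
  }
}

Combined with the upsert output above, each re-read row then overwrites its existing document. Note also that with a daily index pattern (data-%{+yyyy.MM.dd}), a status change indexed on a later day lands in a different index than the original document; a single rolling index keeps every update hitting the same document.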

Related

Logstash Reading Same Data (Duplicates)

I'm using the Logstash JDBC input plugin to read one database and send the data to Elasticsearch.
My Logstash.conf file looks like this:
input {
  jdbc {
    jdbc_driver_library => "${LOGSTASH_JDBC_DRIVER_JAR_LOCATION}"
    jdbc_driver_class => "${LOGSTASH_JDBC_DRIVER}"
    jdbc_connection_string => "${LOGSTASH_JDBC_URL}"
    jdbc_user => "${LOGSTASH_JDBC_USERNAME}"
    jdbc_password => "${LOGSTASH_JDBC_PASSWORD}"
    schedule => "* * * * *"
    statement => "select * from testtable"
    use_column_value => true
    tracking_column => "time"
  }
}
filter {
  mutate {
    add_field => { "message" => "%{time}" }
    convert => [ "time", "string" ]
  }
  date {
    timezone => "Etc/GMT+3"
    match => ["time", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS"]
    target => "@timestamp"
    remove_field => [ "time", "timestamp" ]
  }
  fingerprint {
    source => ["testid", "programid", "unitid"]
    target => "[@metadata][fingerprint]"
    method => "MD5"
    key => "${LOGSTASH_JDBC_PASSWORD}"
  }
  ruby {
    code => "event.set('[@metadata][tsprefix]', event.get('@timestamp').to_i.to_s(16))"
  }
}
output {
  elasticsearch {
    hosts => ["${LOGSTASH_ELASTICSEARCH_HOST}"]
    user => "${ELASTIC_USER}"
    password => "${ELASTIC_PASSWORD}"
    index => "test"
    document_id => "%{[@metadata][tsprefix]}%{[@metadata][fingerprint]}"
  }
  stdout { codec => json_lines }
}
I tried using this .conf without these lines:
use_column_value => true
tracking_column => "time"
I also tried using:
clean_run => true
But Logstash keeps reading the same data over and over again.
Can you help me understand why Logstash keeps re-reading it?
Logstash (8.3.1)
Database (PostgreSQL 14.5)
JDBC (42.4.1)
The statement query in your jdbc input configuration, "select * from testtable", reads the entire table on every run. The input configuration should be as below, filtering on :sql_last_value, to avoid reading the same data repeatedly.
jdbc {
  jdbc_driver_library => "${LOGSTASH_JDBC_DRIVER_JAR_LOCATION}"
  jdbc_driver_class => "${LOGSTASH_JDBC_DRIVER}"
  jdbc_connection_string => "${LOGSTASH_JDBC_URL}"
  jdbc_user => "${LOGSTASH_JDBC_USERNAME}"
  jdbc_password => "${LOGSTASH_JDBC_PASSWORD}"
  schedule => "* * * * *"
  statement => "select * from testtable where time > :sql_last_value"
  use_column_value => true
  tracking_column => "time"
  # "time" is a timestamp column, so track it as one
  # (the default tracking_column_type is numeric).
  tracking_column_type => "timestamp"
  record_last_run => true
  last_run_metadata_path => <valid file path>
}
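For reference, :sql_last_value is filled in from the value the plugin persists at last_run_metadata_path between runs, and clean_run => true wipes that state, which is why enabling it brings the whole table back on the next run. A minimal illustration of the two settings, with a path chosen purely as an example (any file the Logstash user can write to will do):

record_last_run => true
# Example path only; the plugin stores the last seen "time" value here
# and substitutes it for :sql_last_value on the next scheduled run.
last_run_metadata_path => "/usr/share/logstash/.testtable_jdbc_last_run"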

Not getting SQL Statement output using ELK logstash jdbc plugin

I'm trying to get output using Logstash with the JDBC plugin, but I'm not getting any data. I tried different ways to get the pg_stat_replication data, but under Index Management I could not find the index (my index name is pg_stat_replication).
Config file path: /etc/logstash/conf.d/postgresql.conf
input {
  # pg_stat_replication;
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/postgresql-jdbc.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://192.168.43.21:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "p"
    statement => "SELECT * FROM pg_stat_replication"
    schedule => "* * * * *"
    type => "pg_stat_replication"
  }
}
output {
  elasticsearch {
    hosts => "http://192.168.43.100:9200"
    index => "%{type}"
    user => "elastic"
    password => "abc#123"
  }
}
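One way to narrow this down (a debugging sketch rather than a confirmed fix) is to add a temporary stdout output alongside the elasticsearch one. pg_stat_replication only has rows while a standby or other WAL receiver is connected, so if the query returns zero rows Logstash produces no events and the index is never created.

output {
  # Temporary debugging output: if nothing is printed here, the SELECT is
  # returning zero rows and there is nothing to index.
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://192.168.43.100:9200"
    index => "%{type}"
    user => "elastic"
    password => "abc#123"
  }
}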

Pushing data to a kafka server with Logstash

I have a Logstash conf file that looks like this:
input {
  jdbc {
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//the-s2.db.oracle.yn:1521/DPP2.mind.com"
    jdbc_user => "STG_TEST"
    jdbc_password => "cddcdcd"
    parameters => { "orderid" => 1212332365 }
    statement => "select PO_SEARCH_IL_ID, ORDER_DATE, REF_1, SHIPPING_WINDOW_START, SHIPPING_WINDOW_END FROM ods.po_search_il where PO_SEARCH_IL_ID = :orderid"
    schedule => "* * * * *"
    clean_run => true
  }
}
output {
  kafka {
    bootstrap_servers => "mykafkaservername.kn:9092"
    topic_id => ["test3"]
  }
}
When the script runs, the topic test3 is created on the Kafka server, but there is no data in it.
Could somebody help with that issue?
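Two things are worth checking here, sketched below rather than offered as a confirmed fix: whether the jdbc input is emitting any events at all (the :orderid filter may simply match no rows), and how the events are serialized onto the topic; codec => json on the kafka output writes the whole structured event as JSON.

output {
  # Temporary debugging output: if nothing is printed here, the SELECT
  # with the :orderid parameter is returning no rows, so there is
  # nothing to send to Kafka.
  stdout { codec => rubydebug }
  kafka {
    bootstrap_servers => "mykafkaservername.kn:9092"
    topic_id => "test3"
    # Serialize the whole event as JSON instead of the default plain codec.
    codec => json
  }
}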

logstash JDBC connection to DB2 failed

I want to check one of our DB2 database's tables via Logstash, but I get this exception.
[2018-02-06T13:34:34,175][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#, :backtrace=>["com.ibm.as400.access.JDError.createSQLExceptionSubClass(com/ibm/as400/access/JDError.java:824)", "com.ibm.as400.access.JDError.throwSQLException(com/ibm/as400/access/JDError.java:553)", "com.ibm.as400.access.AS400JDBCConnection.setProperties(com/ibm/as400/access/AS400JDBCConnection.java:3391)", "com.ibm.as400.access.AS400JDBCDriver.prepareConnection(com/ibm/as400/access/AS400JDBCDriver.java:1419)", "com.ibm.as400.access.AS400JDBCDriver.initializeConnection(com/ibm/as400/access/AS400JDBCDriver.java:1256)", "com.ibm.as400.access.AS400JDBCDriver.connect(com/ibm/as400/access/AS400JDBCDriver.java:395)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)",
This is my input config:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  jdbc {
    jdbc_connection_string => "jdbc:as400://ip/db"
    jdbc_user => "usr"
    jdbc_password => "pass"
    jdbc_driver_library => "/etc/logstash/lib/jt400-9.1.jar"
    jdbc_driver_class => "com.ibm.as400.access.AS400JDBCDriver"
    statement => "SELECT * FROM table1 FETCH FIRST ROWS ONLY"
  }
}
I should mention that the firewall on the database side has been disabled.
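With the firewall already out of the picture and the trace ending in AS400JDBCConnection.setProperties, the JDBC connection properties are worth a look. The sketch below is only a suggestion under that assumption; prompt, naming and errors are documented jt400 (IBM Toolbox for Java) connection properties, passed directly in the connection string so the driver cannot try to prompt interactively and reports full error text:

jdbc {
  # prompt, naming and errors are standard jt400 JDBC connection
  # properties; adjust the values to your system.
  jdbc_connection_string => "jdbc:as400://ip/db;prompt=false;naming=sql;errors=full"
  jdbc_user => "usr"
  jdbc_password => "pass"
  jdbc_driver_library => "/etc/logstash/lib/jt400-9.1.jar"
  jdbc_driver_class => "com.ibm.as400.access.AS400JDBCDriver"
  statement => "SELECT * FROM table1 FETCH FIRST ROWS ONLY"
}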

Missing CreditCard Class when using Omnipay Paypal for Laravel 5

Here are the repos included in my composer.json: omnipay & paypal.
In my config/laravel-omnipay.php:
'gateways' => [
    'paypal' => [
        'driver' => 'PayPal_Rest',
        'options' => [
            'solutionType' => '',
            'landingPage' => '',
            'headerImageUrl' => ''
        ]
    ]
]
Here is my Controller:
// omnipay start
$gateway = Omnipay::create('PayPal_Rest');

// Initialise the gateway
$gateway->initialize(array(
    'clientId' => 'xxxxxx',
    'secret' => 'xxxxxx',
    'testMode' => true, // Or false when you are ready for live transactions
));

// Create a credit card object
// DO NOT USE THESE CARD VALUES -- substitute your own
$card = new CreditCard(array(
    'firstName' => $request->firstname,
    'lastName' => $request->lastname,
    'number' => $request->cardnumber,
    'expiryMonth' => $month_year[0],
    'expiryYear' => $month_year[1],
    'cvv' => $request->ccv,
    'billingAddress1' => $request->address
    /*
    'billingCountry' => 'AU',
    'billingCity' => 'Scrubby Creek',
    'billingPostcode' => '4999',
    'billingState' => 'QLD',*/
));

// Do an authorisation transaction on the gateway
$transaction = $gateway->authorize(array(
    'amount' => '100',
    'currency' => 'USD',
    'description' => $eventName->event_title,
    'card' => $card,
));
$response = $transaction->send();
if ($response->isSuccessful()) {
    echo "Authorize transaction was successful!\n";
    // Find the authorization ID
    $auth_id = $response->getTransactionReference();
}
I've got this error:
Class 'App\Http\Controllers\CreditCard' not found
Note: If I use RestGateway to replace PayPal_Rest, I get this error instead:
Class '\Omnipay\RestGateway\Gateway' not found
I've been searching for an answer for a long time but haven't found a solution that works for me, so I'm not entirely sure how to proceed.
You need to have this at the top of your class file:
use Omnipay\Common\CreditCard;
Alternatively, reference the class by its fully qualified name with a leading backslash:
$creditCard = new \Omnipay\Common\CreditCard([...]);
Further reading: https://stackoverflow.com/questions/4790020/what-does-a-backslash-do-in-php-5-3
Without the use statement (or the leading backslash), PHP looks for CreditCard in the current namespace, App\Http\Controllers, which is why the class is not found; the backslash makes PHP resolve the name from the global namespace instead.