I receive a "Precondition Failed" error when I try to run:
my $obj = $s3->SelectObjectContent(
    Bucket         => 'MyBucket',
    Expression     => 'SELECT * FROM s3object s',
    ExpressionType => 'SQL',
    InputSerialization => {
        JSON => {
            Type => 'LINES'
        }
    },
    Key => 'MyKey',
    OutputSerialization => {
        JSON => {}
    }
);
I followed https://metacpan.org/pod/Paws::S3::SelectObjectContent#RequestProgress-=%3E-Paws::S3::RequestProgress
I have successfully called "GetObject" using my bucket and key, so those are not responsible for the error.
I'm using the Logstash jdbc input plugin to read from a database and send the data to Elasticsearch.
My Logstash.conf file looks like this:
input {
    jdbc {
        jdbc_driver_library => "${LOGSTASH_JDBC_DRIVER_JAR_LOCATION}"
        jdbc_driver_class => "${LOGSTASH_JDBC_DRIVER}"
        jdbc_connection_string => "${LOGSTASH_JDBC_URL}"
        jdbc_user => "${LOGSTASH_JDBC_USERNAME}"
        jdbc_password => "${LOGSTASH_JDBC_PASSWORD}"
        schedule => "* * * * *"
        statement => "select * from testtable"
        use_column_value => true
        tracking_column => "time"
    }
}
filter {
    mutate {
        add_field => { "message" => "%{time}" }
        convert => [ "time", "string" ]
    }
    date {
        timezone => "Etc/GMT+3"
        match => ["time", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS"]
        target => "@timestamp"
        remove_field => [ "time", "timestamp" ]
    }
    fingerprint {
        source => ["testid", "programid", "unitid"]
        target => "[@metadata][fingerprint]"
        method => "MD5"
        key => "${LOGSTASH_JDBC_PASSWORD}"
    }
    ruby {
        code => "event.set('[@metadata][tsprefix]', event.get('@timestamp').to_i.to_s(16))"
    }
}
output {
    elasticsearch {
        hosts => ["${LOGSTASH_ELASTICSEARCH_HOST}"]
        user => "${ELASTIC_USER}"
        password => "${ELASTIC_PASSWORD}"
        index => "test"
        document_id => "%{[@metadata][tsprefix]}%{[@metadata][fingerprint]}"
    }
    stdout { codec => json_lines }
}
I tried using this .conf without these lines:
use_column_value => true
tracking_column => "time"
I also tried using:
clean_run => true
But Logstash keeps reading the same data over and over again.
Can you help me understand why?
Logstash (8.3.1)
Database (PostgreSQL 14.5)
JDBC (42.4.1)
The statement query in your jdbc input configuration, "select * from testtable", reads the entire table on every scheduled run. To avoid reading the same data repeatedly, the input configuration should look like this:
jdbc {
    jdbc_driver_library => "${LOGSTASH_JDBC_DRIVER_JAR_LOCATION}"
    jdbc_driver_class => "${LOGSTASH_JDBC_DRIVER}"
    jdbc_connection_string => "${LOGSTASH_JDBC_URL}"
    jdbc_user => "${LOGSTASH_JDBC_USERNAME}"
    jdbc_password => "${LOGSTASH_JDBC_PASSWORD}"
    schedule => "* * * * *"
    statement => "select * from testtable where time > :sql_last_value"
    use_column_value => true
    tracking_column => "time"
    record_last_run => true
    last_run_metadata_path => <valid file path>
}
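One caveat worth adding: the jdbc input treats the tracking column as numeric by default, and time here looks like a timestamp column, so you will most likely also need the plugin's tracking_column_type option inside the jdbc block:

    tracking_column_type => "timestamp"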
I'm attempting to do something that should be simple, but I cannot get it to work. I've searched all over for detailed documentation on the Perl Search::Elasticsearch module, but I can only find the CPAN docs, and as far as searching is concerned it is barely mentioned. I've searched here and cannot find a duplicate question.
I have Elasticsearch and Filebeat, with Filebeat sending syslog to Elasticsearch. I just want to search for messages with matching text and a date range. I can find the messages, but when I try to add the date range the query fails. Here is the query from the Kibana dev tools:
GET _search
{
  "query": {
    "bool": {
      "filter": [
        { "term": { "message": "metrics" }},
        { "range": { "timestamp": { "gte": "now-15m" }}}
      ]
    }
  }
}
I don't get exactly what I'm looking for, but there isn't an error.
Here is my attempt in Perl:
my $results = $e->search(
    body => {
        query => {
            bool => {
                filter => {
                    term  => { message => 'metrics' },
                    range => { timestamp => { 'gte' => 'now-15m' } }
                }
            }
        }
    }
);
This is the error:
[Request] ** [http://x.x.x.x:9200]-[400]
[parsing_exception]
[range] malformed query, expected [END_OBJECT] but found [FIELD_NAME],
with: {"col":69,"line":1}, called from sub Search::Elasticsearch::Role::Client::Direct::__ANON__
at ./elasticsearchTest.pl line 15.
With vars: {'body' => {'status' => 400,'error' => {
'root_cause' => [{'col' => 69,'reason' => '[range]
malformed query, expected [END_OBJECT] but found [FIELD_NAME]',
'type' => 'parsing_exception','line' => 1}],'col' => 69,
'reason' => '[range] malformed query, expected [END_OBJECT] but found [FIELD_NAME]',
'type' => 'parsing_exception','line' => 1}},'request' => {'serialize' => 'std',
'path' => '/_search','ignore' => [],'mime_type' => 'application/json',
'body' => {
'query' => {
'bool' =>
{'filter' => {'range' => {'timestamp' => {'gte' => 'now-15m'}},
'term' => {'message' => 'metrics'}}}}},
'qs' => {},'method' => 'GET'},'status_code' => 400}
Can someone help me figure out how to search with the Search::Elasticsearch Perl module?
Multiple filter clauses must be passed as separate JSON objects within an array (as in your original JSON query), not as multiple keys in a single object. Your Perl data structure has to map to that same JSON shape:
filter => [
    { term  => { message => 'metrics' } },
    { range => { timestamp => { 'gte' => 'now-15m' } } }
]
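Putting it together, the full call would look like this (a sketch, reusing your existing $e client and field names):

my $results = $e->search(
    body => {
        query => {
            bool => {
                filter => [
                    { term  => { message => 'metrics' } },
                    { range => { timestamp => { 'gte' => 'now-15m' } } }
                ]
            }
        }
    }
);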
I have a hash of hashes as below. Please note that the hash is very large and contains a PluginsStatus of either Success or Error. When PluginsStatus for a key is Success I need not process anything (I have handled this scenario), but if it is Error I need to display, in this order: PluginsStatus, PluginspatchLogName, PluginsLogFileName_0, PluginsLogFileLink_0, PluginsLogFileErrors_0, and so on.
Please note, I do not know in advance how many numbered keys (PluginsLogFileName_N, PluginsLogFileLink_N, PluginsLogFileErrors_N) exist; the count is dynamic.
$VAR1 = {
    'Applying Template Changes' => {
        'PluginsLogFileErrors_2' => 'No Errors',
        'PluginsStatus' => 'Error',
        'PluginsLogFileName_1' => 'Applying_Template_Changes_2015-05-12_02-57-40AM.log',
        'PluginsLogFileName_2' => 'ApplyingTemplates.log',
        'PluginsLogFileErrors_1' => 'ERROR: FAPSDKEX-00024 : Error in undeploying template.Cause : Unknown.Action : refer to log file for more details.',
        'PluginspatchLogName' => '2015-05-11_08-14-28PM.log',
        'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
        'PluginsLogFileName_0' => '2015-05-11_08-14-28PM.log',
        'PluginsLogFileErrors_0' => 'No Errors',
        'PluginsLogFileLink_2' => 'configlogs/ApplyingTemplates.log',
        'PluginsLogFileLink_1' => 'configlogs/Applying_Template_Changes_2015-05-12_02-57-40AM.log'
    },
    'Configuring Keystore Service' => {
        'PluginsStatus' => 'Error',
        'PluginsLogFileName_1' => 'Configuring_Keystore_Service_2015-05-11_11-11-37PM.log',
        'PluginsLogFileErrors_1' => 'ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.',
        'PluginspatchLogName' => '2015-05-11_08-14-28PM.log',
        'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
        'PluginsLogFileName_0' => '2015-05-11_08-14-28PM.log',
        'PluginsLogFileErrors_0' => 'No Errors',
        'PluginsLogFileLink_1' => 'configlogs/Configuring_Keystore_Service_2015-05-11_11-11-37PM.log'
    },
    'Applying Main Configuration' => {
        'PluginsStatus' => 'Error',
        'PluginspatchLogName' => '2015-05-11_08-14-28PM.log',
        'PluginsLogFileName_0' => 'Applying_Main_Configuration_2015-05-12_01-11-21AM.log',
        'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
        'PluginsLogFileErrors_0' => 'ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.apps.ad.common.exception.ADException: Failed to query taxonomy attribute AllProductFamilyAndDomains.... 104 lines more'
    }
};
Below is a snippet of the output I am looking for:
Plugin name is = Applying Template Changes
PluginsStatus = Error
PluginspatchLogName = 2015-05-11_08-14-28PM.log
PluginsLogFileName_0 = 2015-05-11_08-14-28PM.log
PluginsLogFileLink_0 = /tmp/xpath/2015-05-11_08-14-28PM.log
PluginsLogFileErrors_0 = No Errors
PluginsLogFileName_1 = Applying_Template_Changes_2015-05-12_02-57-40AM.log
PluginsLogFileLink_1 = configlogs/Applying_Template_Changes_2015-05-12_02-57-40AM.log
PluginsLogFileErrors_1 = ERROR: FAPSDKEX-00024 : Error in undeploying template.Cause : Unknown.Action : refer to log file for more details.
PluginsLogFileName_2 = ApplyingTemplates.log
PluginsLogFileLink_2 = configlogs/ApplyingTemplates.log
PluginsLogFileErrors_2 = No Errors
Could someone please help me here?
You have built a hash that is less than ideal for your purposes. You should create a LogFile hash element that has an array reference as its value, as shown below; after that, the processing is trivial.
{
    "Applying Main Configuration" => {
        LogFile => [
            {
                Errors => "ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.apps.ad.common.exception.ADException: Failed to query taxonomy attribute AllProductFamilyAndDomains.... 104 lines more",
                Link   => "/tmp/xpath/2015-05-11_08-14-28PM.log",
                Name   => "Applying_Main_Configuration_2015-05-12_01-11-21AM.log",
            },
        ],
        patchLogName => "2015-05-11_08-14-28PM.log",
        Status       => "Error",
    },
    "Applying Template Changes" => {
        LogFile => [
            {
                Errors => "No Errors",
                Link   => "/tmp/xpath/2015-05-11_08-14-28PM.log",
                Name   => "2015-05-11_08-14-28PM.log",
            },
            {
                Errors => "ERROR: FAPSDKEX-00024 : Error in undeploying template.Cause : Unknown.Action : refer to log file for more details.",
                Link   => "configlogs/Applying_Template_Changes_2015-05-12_02-57-40AM.log",
                Name   => "Applying_Template_Changes_2015-05-12_02-57-40AM.log",
            },
            {
                Errors => "No Errors",
                Link   => "configlogs/ApplyingTemplates.log",
                Name   => "ApplyingTemplates.log",
            },
        ],
        patchLogName => "2015-05-11_08-14-28PM.log",
        Status       => "Error",
    },
    "Configuring Keystore Service" => {
        LogFile => [
            {
                Errors => "No Errors",
                Link   => "/tmp/xpath/2015-05-11_08-14-28PM.log",
                Name   => "2015-05-11_08-14-28PM.log",
            },
            {
                Errors => "ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.",
                Link   => "configlogs/Configuring_Keystore_Service_2015-05-11_11-11-37PM.log",
                Name   => "Configuring_Keystore_Service_2015-05-11_11-11-37PM.log",
            },
        ],
        patchLogName => "2015-05-11_08-14-28PM.log",
        Status       => "Error",
    },
}
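Here is one way to build that structure from the hash you already have (a minimal sketch: %plugins stands for your original hash of hashes, and it assumes every index N has matching Name/Link/Errors keys):

my %restructured;
for my $plugin ( keys %plugins ) {
    my $src = $plugins{$plugin};
    my $dst = $restructured{$plugin} = {
        Status       => $src->{PluginsStatus},
        patchLogName => $src->{PluginspatchLogName},
        LogFile      => [],
    };

    # Fold the numbered triplets into the LogFile array, in index order
    for ( my $i = 0; exists $src->{"PluginsLogFileName_$i"}; ++$i ) {
        push @{ $dst->{LogFile} }, {
            Name   => $src->{"PluginsLogFileName_$i"},
            Link   => $src->{"PluginsLogFileLink_$i"},
            Errors => $src->{"PluginsLogFileErrors_$i"},
        };
    }
}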
Just iterate over the keys of the hash. Use the $hash{key}{inner_key} syntax to get into the nested hash.
#!/usr/bin/perl
use warnings;
use strict;
use feature qw{ say };
my %error = (
    'Applying Template Changes' => {
        'PluginsLogFileErrors_2' => 'No Errors',
        'PluginsStatus' => 'Error',
        'PluginsLogFileName_1' => 'Applying_Template_Changes_2015-05-12_02-57-40AM.log',
        # ...
        'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
        'PluginsLogFileErrors_0' => 'ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.apps.ad.common.exception.ADException: Failed to query taxonomy attribute AllProductFamilyAndDomains.... 104 lines more',
    },
);

for my $step (keys %error) {
    print "Plugin name is = $step\n";
    for my $detail (sort keys %{ $error{$step} }) {
        print "$detail = $error{$step}{$detail}\n";
    }
}
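Note that sort keys gives plain asciibetical order, not the exact order shown in the question. If you want PluginsStatus and PluginspatchLogName first and then the numbered triplets in index order, a sketch along these lines works (same %error hash as above; it assumes every index has Name, Link, and Errors keys):

for my $step (keys %error) {
    my $h = $error{$step};
    next if $h->{PluginsStatus} eq 'Success';    # skip the plugins that worked

    print "Plugin name is = $step\n";
    print "PluginsStatus = $h->{PluginsStatus}\n";
    print "PluginspatchLogName = $h->{PluginspatchLogName}\n";

    for ( my $i = 0; exists $h->{"PluginsLogFileName_$i"}; ++$i ) {
        for my $part (qw( Name Link Errors )) {
            my $key = "PluginsLogFile${part}_$i";
            print "$key = $h->{$key}\n";
        }
    }
}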
I'm using Search::Elasticsearch to query MetaCPAN.
my $es = Search::Elasticsearch->new(
    cxn_pool => 'Static::NoPing',
    nodes    => 'api.metacpan.org:80',
);

my $scroller = $es->scroll_helper(
    index       => 'v0',
    type        => 'release',
    search_type => 'scan',
    scroll      => '2m',
    size        => $size,
    body        => {
        fields => [qw(author archive date)],
        query  => { range => { date => { gte => $date } } },
    },
);
This works ok, but I'd like to set the HTTP User-Agent header to a custom value so my requests can be identified if there's a problem. How do I do that with Search::Elasticsearch?
You can pass arguments to the handle constructor using handle_args. So for the default HTTP::Tiny backend you would use agent:
my $es = Search::Elasticsearch->new(
    cxn_pool    => 'Static::NoPing',
    nodes       => 'api.metacpan.org:80',
    handle_args => { agent => "youragent/0.1" },
);
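One detail from the HTTP::Tiny documentation worth knowing: if the agent string ends in a space, HTTP::Tiny appends its own default agent string after it. To sanity-check what was actually set, you can peek at the underlying handle (illustrative only, since this reaches into Search::Elasticsearch internals):

print $es->transport->cxn_pool->cxns->[0]->handle->agent, "\n";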
I configured Logstash to send email alerts when certain combinations of words appear in the log message. I get the alerts, but instead of the message field value I get the word "@message".
How can I solve this problem?
Here is my logstash config file:
root@srv-syslog:~# cat /etc/logstash/conf.d/central.conf
input {
    syslog {
        type => "syslog"
        port => 5144
    }
    tcp {
        type => "cisco_asa"
        port => 5145
    }
    tcp {
        type => "cisco_ios"
        port => 5146
    }
}
output {
    elasticsearch {
        bind_host => "127.0.0.1"
        port => "9200"
        protocol => "http"
    }
    if "executed the" in [message] {
        email {
            from => "logstash_alert@company.local"
            subject => "logstash alert"
            to => "myemail@company.local"
            via => "smtp"
            body => "Here is the event line that occurred: %{@message}"
        }
    }
}
The field name in this case is message, not @message.
See demo:
input {
    generator {
        count => 1
        lines => ["Example line."]
    }
}
filter {
    mutate {
        add_field => {
            "m1" => "%{message}"
            "m2" => "%{@message}"
        }
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
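With this demo, m1 resolves to the event's message, while m2 is left as the literal placeholder text because no @message field exists. The rubydebug output looks roughly like this (other fields trimmed):

{
    "message" => "Example line.",
         "m1" => "Example line.",
         "m2" => "%{@message}",
    ...
}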
In your case, you should just need to fix the one line:
body => "Here is the event line that occured: %{message}"
Remove the @ sign. The field is message, not @message.