Problems with date logstash - date

I have a log in a CSV file with the date field in this pattern "24/09/2014", but when I read the file with Logstash the date field has a string type.
CSV file example:
fecha,cant,canb,id,zon
24/09/2014,80,20,1.5,2
01/12/2014,50,25,1,3
My Logstash conf file:
input {
  file {
    path => "path/to/data.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["fecha","cant","canb","id","zon"]
  }
  date {
    match => ["fecha","dd/MM/yyyy"]
  }
  mutate {convert => ["cant", "integer"]}
  mutate {convert => ["canb", "integer"]}
  mutate {convert => ["id", "float"]}
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "barrena"
    workers => 1
  }
  stdout {}
}
Thanks!

The date is copied to a field called @timestamp (the date filter does this), and that field has a date type.
You can safely remove the fecha field once the date filter has used it.
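For example, a minimal sketch of the date filter above extended with remove_field, which drops the original string field only if the parse succeeds (remove_field is a common option of the date filter):

date {
  # parse "24/09/2014" into @timestamp, then drop the original string column
  match        => ["fecha", "dd/MM/yyyy"]
  remove_field => ["fecha"]
}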

Related

Set Logstash Date Filter from current date to last 3 days

I have the complete database in one index, and on a daily basis I need to fetch the last 3 days of records and store them in CSV format. The goal is that each day it takes the records from the last 3 days and stores them in a CSV file. How do I set the range from the current date back to the last 3 days using only logstash.config?
My Logstash Config File
input {
  elasticsearch {
    hosts => "**Endpoint URL**"
    index => "**Index NAME**"
    user => "***"
    password => "***"
    query => '{ "query": { "query_string": { "query": "*" } } }'
  }
}
filter {
  csv {
    separator => ","
    autodetect_column_names => true
    autogenerate_column_names => true
  }
}
output {
  stdout {
    codec => json_lines
  }
  csv {
    fields => []
    path => "C:/ELK_csv/**cvs_File_Name**.csv"
  }
}
I need to add a date filter range:
{"query":{"bool":{"must":[{"range":{"createddate":{"gte":"","lt":""}}}],"must_not":[],"should":[]}},"from":0,"size":5000,"sort":[],"aggs":{}}
gte should start from the current date and lt should go back to the last 3 days.
Working logstash.config file:
input {
  elasticsearch {
    hosts => "**ELK ENDPOINT URL**"
    index => "**INDEX NAME**"
    user => "***"
    password => "***"
    query => '{ "query":{"bool":{"must":[{"range":{"createddate":{"gt":"now-3d/d","lte":"now/d"}}}],"must_not":[],"should":[]}},"from":0,"size":10000,"sort":[],"aggs":{} }'
  }
}
filter {
  csv {
    separator => ","
    autodetect_column_names => true
    autogenerate_column_names => true
  }
}
output {
  stdout {
    codec => json_lines
  }
  csv {
    fields => [**FIELDS NAMES**]
    path => "C:/ELK6.4.2/logstash-6.4.2/bin/tmp/**CSV_3days**.csv"
  }
}
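For reference, the working query relies on Elasticsearch date math: now-3d/d is the current time minus three days, rounded to day granularity, and now/d is the current time rounded to day granularity (exactly how the rounding is applied depends on whether the bound is gt/gte or lt/lte). A minimal sketch of just that range, with the createddate field name kept from the query above:

# "now-3d/d" = now minus three days, rounded to day granularity
# "now/d"    = now, rounded to day granularity
query => '{ "query": { "range": { "createddate": { "gt": "now-3d/d", "lte": "now/d" } } } }'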

Logstash date parsing failing when using default target

I can't parse a date field using the Logstash date plugin; my config is as follows:
if "test" in [tags] {
csv {
separator => ","
columns => [ "value", "received_date" ]
convert => {
"value" => "float"
}
}
mutate {
gsub => [ "received_date" , ".\d*$" , ""]
}
date {
match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
}
}
I get the error:
[2018-06-19T11:51:20,583][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"f2d34d84-1ea4-4510-8237-2329a4d1ffba", :_index=>"logstash-2018.06.19", :_type=>"doc", :_routing=>nil}, #], :response=>{"index"=>{"_index"=>"logstash-2018.06.19", "_type"=>"doc", "_id"=>"f2d34d84-1ea4-4510-8237-2329a4d1ffba", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [received_date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2018-06-19 11:51:15\" is malformed at \" 11:51:15\""}}}}}
If I add a target:
date {
  match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
  target => "received_date"
}
Then it works, but the timestamp field takes the date Logstash received the input, which is not what I want.
Why would the target impact the date parsing?
The timestamp field is mapped as a date in Elasticsearch for some reason. You can delete the timestamp field:
date {
  locale => "en"
  remove_field => ["timestamp"]
}
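A sketch (not part of the answer above) of an alternative that keeps the default @timestamp target and removes the original string field instead, so Elasticsearch never tries to index the space-separated value against its date mapping; the field name received_date is taken from the question:

date {
  # parse into the default @timestamp target
  match        => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
  # drop the raw string field only if parsing succeeded
  remove_field => [ "received_date" ]
}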

Logstash date parse failure with milliseconds since epoch

Logstash is not able to parse milliseconds since the epoch and returns a parse failure. There are no whitespaces in the content of the timestamp field in the XML, and Logstash selects the right value.
filter {
  xml {
    source => "message"
    remove_namespaces => true
    store_xml => false
    xpath => ["//event/@timestamp", "@time_since_epoch"]
  }
  date {
    match => [ "@time_since_epoch", "UNIX_MS" ]
    target => "@time"
  }
}
What am I doing wrong?
EDIT
Sample XML data line:
<event timestamp="1494599590213" ><message>Dummy message</message></event>
Apparently the value extracted by the xpath is put in an array (see "@time_since_epoch":["1494599590213"] with the stdout plugin and the json codec).
So you'll need to access the time as an array element:
date {
  match => [ "[@time_since_epoch][0]", "UNIX_MS" ]
  target => "@time"
}

Logstash - Custom Timestamp Error

I am trying to input a timestamp field in Logstash and I am getting a dateparsefailure message.
My Message -
2014-08-01;11:00:22.123
Pipeline file
input {
  stdin{}
  #beats {
  #  port => "5043"
  #}
}
# optional.
filter {
  date {
    locale => "en"
    match => ["message", "YYYY-MM-dd;HH:mm:ss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched"}
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
  stdout { codec => rubydebug }
}
Can someone tell me what I am missing?
Update 1
I referred to the link "How to remove trailing newline from message field" and now it works.
But in my log message I have multiple values other than the timestamp:
<B 2014-08-01;11:00:22.123 Field1=Value1 Field2=Value2
When I give this as input, it is not working. How do I read a part of the log and use it as the timestamp?
Update 2
It works now.
I changed the config file as below:
filter {
  kv {
  }
  mutate {
    strip => "message"
  }
  date {
    locale => "en"
    match => ["timestamp1", "YYYY-MM-dd;HH:mm:ss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched"}
  }
}
I am posting the answer below, along with the steps I used to solve the issue, so that I can help people like me.
Step 1 - I read the message in the form of key and value pairs.
Step 2 - I trimmed off the extra space that leads to the parse exception.
Step 3 - I read the timestamp value and the other fields into their respective fields.
input {
  beats {
    port => "5043"
  }
}
# optional.
filter {
  kv { }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    remove_field => [ "timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
}

Filter Date Logstash

I just started with Logstash, parsing a CSV document. The CSV document only has two columns, "Date" and "High". I have read various configurations to parse the date but I cannot get it to work; I get an error on that field. The date has the format DD/MM/YYYY and the error tells me the following:
Failed parsing date from field {:field=>"Date", :value=>"Date", :exception=>"Invalid format: \"Date\"", :config_parsers=>"dd/MM/YYYY", :config_locale=>"default=es_ES", :level=>:warn}
This is my Logstash configuration file:
input {
  file {
    path => "/path/to/data.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","High"]
  }
  date {
    match => [ "Date", "dd/MM/YYYY" ]
  }
  mutate {convert => ["High", "float"]}
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "machine"
    workers => 1
  }
  stdout { codec => rubydebug }
}
Thank you!!
In your date plugin, try to change the letter cases in the match setting. Something like this:
date {
  match => [ "Date", "DD/MM/YYYY" ]
}
If that doesn't help, try making them all lowercase.
The format string dd/MM/yyyy should work. You can find detailed specifications for formatting strings in the JodaTime documentation.
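For reference, a minimal sketch of the corrected filter block using that lowercase pattern, with the column names taken from the question above:

filter {
  csv {
    separator => ","
    columns   => ["Date", "High"]
  }
  date {
    # Joda-Time pattern: dd = day of month, MM = month, yyyy = year
    match => [ "Date", "dd/MM/yyyy" ]
  }
  mutate { convert => ["High", "float"] }
}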