Logstash date parse failure with milliseconds since epoch

Logstash is not able to parse milliseconds since epoch and returns a date parse failure. There is no whitespace in the content of the timestamp attribute in the XML, and Logstash selects the right value.
filter {
  xml {
    source => "message"
    remove_namespaces => true
    store_xml => false
    xpath => ["//event/@timestamp", "@time_since_epoch"]
  }
  date {
    match => [ "@time_since_epoch", "UNIX_MS" ]
    target => "@time"
  }
}
What am I doing wrong?
EDIT
sample xml data line:
<event timestamp="1494599590213" ><message>Dummy message</message></event>

Apparently the value extracted by the xpath expression is put in an array (see "@time_since_epoch":["1494599590213"] in the output of the stdout plugin with the json codec).
So you need to access the time as an array element:
date {
  match => [ "[@time_since_epoch][0]", "UNIX_MS" ]
  target => "@time"
}
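For reference, here is a minimal end-to-end pipeline sketch combining both filters (field names as in the question; the stdin/stdout plugins are only there for testing):
input { stdin {} }

filter {
  xml {
    source            => "message"
    remove_namespaces => true
    store_xml         => false
    # xpath destinations are always stored as arrays,
    # even when the expression matches a single attribute
    xpath             => ["//event/@timestamp", "@time_since_epoch"]
  }
  date {
    # [0] unwraps the single-element array produced by the xpath option
    match  => [ "[@time_since_epoch][0]", "UNIX_MS" ]
    target => "@time"
  }
}

output { stdout { codec => rubydebug } }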

Related

Ingesting data in MongoDB with mongodb-output-plugin in Logstash

I am trying to ingest data from a txt file into MongoDB (Machine 1), using Logstash (Machine 2).
I set up a DB and a collection with Compass, and I am using the mongodb-output-plugin in Logstash.
Here's the Logstash conf file:
input
{
  file {
    path => "/home/user/Data"
    type => "cisco-asa"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter
{
  grok {
    # the pattern tail was cut off in the original post; a %{GREEDYDATA} completion is assumed here
    match => { "message" => "^%{SYSLOGTIMESTAMP:syslog_timestamp} %{HOSTNAME:device_src} %%{CISCO_REASON:facility}-%{INT:severity_level}-%{CISCO_REASON:facility_mnemonic}: %{GREEDYDATA:event_message}" }
  }
  date {
    match => ["syslog_timestamp", "MMM dd HH:mm:ss" ]
    target => "@timestamp"
  }
}
output
{
  stdout {
    codec => dots
  }
  mongodb {
    id => "mongo-cisco"
    collection => "Cisco ASA"
    database => "Logs"
    uri => "mongodb+srv://user:pass@192.168.10.9:27017/Logs"
  }
}
Here's the Logstash output:
[2021-03-27T13:29:35,178][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
.............................................................................................................................
[2021-03-27T13:30:06,201][WARN ][logstash.outputs.mongodb ][main][mongo-cisco] Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x6d0984a>, :exception=>#<Mongo::Error::NoServerAvailable: No server is available matching preference: #<Mongo::ServerSelector::Primary:0x6711494c @tag_sets=[], @server_selection_timeout=30, @options={:database=>"Logs", :user=>"username", :password=>"passwd"}>>}
PS: this is my first time using MongoDB.
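One likely culprit, judging from the NoServerAvailable error: the mongodb+srv:// scheme performs a DNS SRV lookup on a hostname, so it cannot be used with a raw IP address. A plain mongodb:// URI may be what is needed here; a hedged sketch, with credentials and address as in the question:
output {
  mongodb {
    id => "mongo-cisco"
    collection => "Cisco ASA"
    database => "Logs"
    # plain mongodb:// scheme: no DNS SRV lookup, works with an IP address
    uri => "mongodb://user:pass@192.168.10.9:27017/Logs"
  }
}
If that still fails, it is worth checking that mongod on Machine 1 is listening on a non-loopback interface (bindIp) and that the port is reachable from Machine 2.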

Logstash date parsing failing when using default target

I can't parse a date field using the Logstash date plugin; my config is as follows:
if "test" in [tags] {
csv {
separator => ","
columns => [ "value", "received_date" ]
convert => {
"value" => "float"
}
}
mutate {
gsub => [ "received_date" , ".\d*$" , ""]
}
date {
match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
}
}
I get the error:
[2018-06-19T11:51:20,583][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"f2d34d84-1ea4-4510-8237-2329a4d1ffba", :_index=>"logstash-2018.06.19", :_type=>"doc", :_routing=>nil}, #], :response=>{"index"=>{"_index"=>"logstash-2018.06.19", "_type"=>"doc", "_id"=>"f2d34d84-1ea4-4510-8237-2329a4d1ffba", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [received_date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2018-06-19 11:51:15\" is malformed at \" 11:51:15\""}}}}}
If I add a target:
date {
  match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
  target => "received_date"
}
Then it works, but the @timestamp field takes the date Logstash received the input, which is not what I want.
Why would the target impact the date parsing?
The timestamp field is mapped as date in Elasticsearch for some reason.
You can delete the timestamp field:
date {
  locale => "en"
  # match as in the question; remove_field fires only after a successful match
  match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
  remove_field => ["timestamp"]
}
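Putting it together, a hedged sketch of the whole filter (field names from the question): parse received_date into the default @timestamp target, then drop the raw string field so the Elasticsearch date mapping never sees it:
filter {
  if "test" in [tags] {
    csv {
      separator => ","
      columns   => [ "value", "received_date" ]
      convert   => { "value" => "float" }
    }
    mutate {
      # strip a trailing fractional-seconds suffix, as in the question
      # (dot escaped here; the original pattern used an unescaped ".")
      gsub => [ "received_date", "\.\d*$", "" ]
    }
    date {
      match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
      # no target: the parsed value goes to @timestamp;
      # removing the raw string field avoids the mapping conflict
      remove_field => [ "received_date" ]
    }
  }
}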

Logstash - Custom Timestamp Error

I am trying to input a timestamp field in Logstash and I am getting a date parse failure.
My message:
2014-08-01;11:00:22.123
Pipeline file
input {
  stdin {}
  # beats {
  #   port => "5043"
  # }
}
# optional.
filter {
  date {
    locale => "en"
    match => ["message", "YYYY-MM-dd;HH:mm:ss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched" }
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
  stdout { codec => rubydebug }
}
Can someone tell me what I am missing?
Update 1
I referred to the link How to remove trailing newline from message field, and now it works.
But in my log message I have multiple values other than the timestamp:
<B 2014-08-01;11:00:22.123 Field1=Value1 Field2=Value2
When I give this as input, it is not working. How do I read part of the log and use it as the timestamp?
Update 2
It works now.
I changed the config file as below:
filter {
  kv { }
  mutate {
    strip => "message"
  }
  date {
    locale => "en"
    match => ["timestamp1", "YYYY-MM-dd;HH:mm:ss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched" }
  }
}
I am posting the answer below, with the steps I used to solve the issue, so that I can help people like me.
Step 1 - I read the message as key/value pairs.
Step 2 - I trimmed off the extra space that led to the parse exception.
Step 3 - I read the timestamp value and the other values into their respective fields.
input {
  beats {
    port => "5043"
  }
}
# optional.
filter {
  kv { }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    remove_field => [ "timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
}
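As an alternative to kv for pulling the timestamp out of a line like <B 2014-08-01;11:00:22.123 Field1=Value1 Field2=Value2, a grok capture also works. A hedged sketch; the ts and kv_pairs field names are made up for illustration:
filter {
  grok {
    # capture the second whitespace-separated token as the timestamp,
    # and the rest of the line as the key=value payload
    match => { "message" => "^<B %{NOTSPACE:ts} %{GREEDYDATA:kv_pairs}" }
  }
  kv {
    source => "kv_pairs"
  }
  date {
    match => [ "ts", "YYYY-MM-dd;HH:mm:ss.SSS" ]
    remove_field => [ "ts", "kv_pairs" ]
  }
}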

Filter Date Logstash

I just started with Logstash, parsing a CSV document. The CSV document only has two columns, "Date" and "High". I have read various configurations to parse a date, but I cannot get it to work; it gives me an error on that field. The date has the format DD/MM/YYYY and the error tells me the following:
Failed parsing date from field {:field=>"Date", :value=>"Date", :exception=>"Invalid format: \"Date\"", :config_parsers=>"dd/MM/YYYY", :config_locale=>"default=es_ES", :level=>:warn}
This is my Logstash filter configuration:
input {
  file {
    path => "/path/to/data.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","High"]
  }
  date {
    match => [ "Date", "dd/MM/YYYY" ]
  }
  mutate { convert => ["High", "float"] }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "machine"
    workers => 1
  }
  stdout { codec => rubydebug }
}
Thank you!!
In your date plugin, try changing the letter case in the match setting. Something like this:
date {
  match => [ "Date", "DD/MM/YYYY" ]
}
If that does not help, try making them all lowercase.
The format string dd/MM/yyyy should work; in Joda-Time, uppercase DD means day-of-year and YYYY means year-of-era, so the lowercase forms are the safe choice. You can find detailed specifications for formatting strings in the JodaTime documentation. Note also that the warning shows :value=>"Date", meaning the CSV header row itself is being run through the filter, and that row can never parse as a date.
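A hedged sketch of how to skip that header row before date parsing (column names as in the question):
filter {
  csv {
    separator => ","
    columns => ["Date","High"]
  }
  # the first line of the file carries the literal header "Date",
  # which can never parse as a date, so drop that event
  if [Date] == "Date" {
    drop { }
  }
  date {
    match => [ "Date", "dd/MM/yyyy" ]
  }
  mutate { convert => ["High", "float"] }
}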

Problems with date logstash

I have a log in a CSV file with the date field in the pattern "24/09/2014", but when I read the file with Logstash the date field has a string type.
CSV file example:
fecha,cant,canb,id,zon
24/09/2014,80,20,1.5,2
01/12/2014,50,25,1,3
My Logstash conf file:
input {
  file {
    path => "path/to/data.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["fecha","cant","canb","id","zon"]
  }
  date {
    match => ["fecha","dd/MM/yyyy"]
  }
  mutate { convert => ["cant", "integer"] }
  mutate { convert => ["canb", "integer"] }
  mutate { convert => ["id", "float"] }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "barrena"
    workers => 1
  }
  stdout {}
}
Thanks!
The date is copied to a field called @timestamp (the date filter does this), and that field has a date type.
You can safely remove the fecha field once you have used the date filter.
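For example, a small sketch based on the config above:
date {
  match => ["fecha", "dd/MM/yyyy"]
  # on a successful match the parsed value lands in @timestamp,
  # so the original string field can be dropped
  remove_field => ["fecha"]
}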