Filebeat gives: object mapping for [error] tried to parse field [error] as object, but found a concrete value

In Elasticsearch I have created an ingest pipeline with the following grok pattern:
OK -%{DATA:label},%{INT:samples},%{BASE16FLOAT:average},%{BASE16FLOAT:min},%{BASE16FLOAT:max},%{BASE16FLOAT:p90},%{BASE16FLOAT:stddev},(?<error>([0-9].[0-9]*%)),
Simulating this with the following line:
OK - test,272,2275,593,14830,4581,1826.76,0.00%,.0,9.53,291717.4,30-04-2018 10:29:09
works perfectly in Kibana.
When I let Filebeat index this file, I get this error: object mapping for [error] tried to parse field [error] as object, but found a concrete value. What is going wrong?

Renaming the field error to run_error solves all the problems. It looks like the field error is reserved.
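For reference, the fix is just renaming the capture group in the grok pattern:
OK -%{DATA:label},%{INT:samples},%{BASE16FLOAT:average},%{BASE16FLOAT:min},%{BASE16FLOAT:max},%{BASE16FLOAT:p90},%{BASE16FLOAT:stddev},(?<run_error>([0-9].[0-9]*%)),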

Looks like a mapping error in your Elasticsearch index template, so check the mapping for error in your index template.
That's why it works with the grok matcher in Kibana.
Anyway, I would suggest the csv filter plugin, because it would parse this for you.

I had a similar case. The cause of your problem is that within the same Elasticsearch index, your error property is sometimes mapped as an object and sometimes holds a concrete value; Elasticsearch does not allow this.


How to use ReactiveMongo streaming with aggregation?

I'm using ReactiveMongo. I want to run a set of pipelines (i.e., use the Mongo aggregation framework) on a collection, and stream the results. I want to retrieve them as BSON documents.
I've seen examples that suggest something like:
coll.aggregatorContext[BSONDocument](pipelines.head, pipelines.tail).prepared[AkkaStreamCursor].cursor.documentSource()
This gets me a compilation error because I'm missing an implicit CursorProducer.Aux[BSONDocument, AkkaStreamCursor], but I have no idea how to import or construct one of these -- can't find examples. (Please don't refer me to the ReactiveMongo tests, as I can't see an example of this in them, or don't understand what I'm looking at if there is one.)
I should add that I'm using version 0.20.3 of ReactiveMongo.
If my issue is due to my choice of retrieving results as BSON documents, and there's another type that would find an existing implicit CursorProducer.Aux, I'd happily switch if someone can tell me how to get this to compile.
So, IntelliJ is telling me that I'm missing an implicit for .prepared.
But an sbt compile is telling me that the problem is that AkkaStreamCursor doesn't fulfill the type bounds of .prepared:
type arguments [reactivemongo.akkastream.AkkaStreamCursor] do not conform to method prepared's type parameter bounds [AC[_] <: reactivemongo.api.Cursor.WithOps[_]]
What ReactiveMongo type is available to use for this?
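For what it's worth, here is a minimal sketch of how this is usually wired up, assuming matching versions of the reactivemongo and reactivemongo-akkastream modules (a mismatch between the two would explain the WithOps bound failure), and assuming the implicit producer is exposed by the reactivemongo.akkastream package object as I remember it from 0.20.x:

import akka.stream.Materializer
import reactivemongo.akkastream.{ AkkaStreamCursor, cursorProducer }

// cursorProducer supplies the missing CursorProducer.Aux[BSONDocument, AkkaStreamCursor],
// and documentSource() additionally needs an implicit Materializer in scope.
coll.aggregatorContext[BSONDocument](pipelines.head, pipelines.tail)
  .prepared[AkkaStreamCursor]
  .cursor
  .documentSource()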

DataStage job gets stuck with warnings

I am trying to stage a dataset from source to my server. When I run my job in DataStage, it gets stuck with no errors.
All I see is a warning which says:
When checking operator: When binding output interface field "DRIVERS" to field "DRIVERS": Implicit conversion from source type "dfloat" to result type "sfloat": Possible range/precision limitation.
Try to reset the job and see if you get any other information. Otherwise, one thing you can do straight away: if the source is a database, use a Cast function to convert it to integer and process; if it is a file, read it as-is and change the type in a Transformer. Hope this helps.
When checking operator: When binding output interface field "DRIVERS" to field "DRIVERS": Implicit conversion from source type "dfloat" to result type "sfloat": Possible range/precision limitation.
You have to learn to read APT/Torrent error messages; Torrent is the company that originally created DataStage PX. It is saying:
When checking operator ===> As the compiler is pre-checking a stage
When binding ... "DRIVERS" ===> I'm looking at a stage in which you are assigning the input field "DRIVERS" to the output field "DRIVERS"
Implicit conversion from source type "dfloat" to result type "sfloat": ===> You've got a type mismatch
I believe you can tell DataStage to compile even if you get warnings, but the real answer is to go back inside your job and figure out why you're sticking a dfloat (double precision) into an sfloat (single precision). It's likely that you need to specify how to get from the dfloat to the sfloat using a Transformer and a user-specified rule for accuracy truncation.
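To see why the warning exists, here is a quick Scala illustration of the precision loss in a double-to-single narrowing (plain JVM arithmetic, nothing DataStage-specific):

// A double (dfloat) holds roughly 15-16 significant decimal digits,
// a float (sfloat) only about 7, so narrowing silently loses precision.
val d: Double = 123456789.123456789
val f: Float  = d.toFloat          // explicit narrowing conversion
println(d)                         // 1.2345678912345679E8
println(f)                         // 1.23456792E8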

REST API V2: Converting String type parameter to List type

I'm trying to execute a JasperReports report through the REST v2 services by passing the parameter value as part of the URL. In the report, I have a SQL query which takes a List-type parameter. How do I convert the String-type parameter to a List to run the query?
Here, the String parameter has comma-separated values, like below:
https://[host_name]:[port]/jasperserver/rest_v2/reports/reports/samples/[report_name].pdf?param_str=value1,value2,value3
We need to convert param_str from String to List.
I'm getting a cast exception like:
Caused by: net.sf.jasperreports.engine.fill.JRExpressionEvalException: Error evaluating expression :
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to java.util.List
Can't believe I am doing this; it seems you can earn quite a lot consulting on Jasper's obscure and badly documented areas. But I work with open-source tech, so I should have the correct mindset. :-)
Here you are: pass the parameter once per value, param_str=value1&param_str=value2&param_str=value3, and the repeated values are collected into one multi-valued parameter.
If you find out how to do the same with a Map, please tell.
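If you need to build that query string programmatically, a trivial sketch in Scala:

// Repeat the query parameter once per value; the REST v2 service
// gathers the repeats into a single collection-typed report parameter.
val values = List("value1", "value2", "value3")
val query  = values.map(v => s"param_str=$v").mkString("&")
// query == "param_str=value1&param_str=value2&param_str=value3"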

Fragment Evaluation Error

Can someone tell me what "Fragment evaluation error" means, or where I might look for solutions? I sometimes (but not always) get lots of these errors (without changing my code):
[error] ! Fragment evaluation error
[error] ThrowableException: Could not initialize class code.model.Post$ (FutureTask.java:138)
[error] code.model.PostSpec$$anonfun$1$$anonfun$apply$1.apply$mcZ$sp(PostSpec.scala:68)
[error] code.model.PostSpec$$anonfun$1$$anonfun$apply$1.apply(PostSpec.scala:51)
[error] code.model.PostSpec$$anonfun$1$$anonfun$apply$1.apply(PostSpec.scala:51)
Line 68 of PostSpec is the first line in the (specs2) test that references the Post model companion object:
val test4 = Post.fixJValue(toextract4).extract[Selection]
I'm using Scala 2.9.0-1.
Also: I have no idea whether it matters, but Post is a net.liftweb.mongodb.record.MongoRecord class companion object:
object Post extends Post with MongoMetaRecord[Post] { ... }
In a specs2 specification, Fragments are pieces of the specification. A Fragment can be a Text, an Example, or a Step.
Some fragments, like Example and Step, are meant to be executed and are supposed to catch Exceptions so that they can be marked as failures. But they won't catch Errors (except AssertionErrors). So if an Example throws an OutOfMemoryError, this will be reported as a Fragment evaluation error.
Other fragments, like Text fragments are not supposed to throw exceptions when being evaluated. If they do, you will get the same Fragment evaluation error message.
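A small sketch of the distinction (my own illustration, not code from the question):

import org.specs2.mutable.Specification

class FragmentErrorSpec extends Specification {
  "an Example" should {
    "catch plain Exceptions and report them as example failures" in {
      throw new RuntimeException("caught by the Example")
    }
    "not catch Errors, which surface as a Fragment evaluation error" in {
      // "Could not initialize class ..." is exactly this kind of Throwable
      throw new ExceptionInInitializerError("not caught by the Example")
    }
  }
}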
Without seeing the full specification it's hard for me to say what's happening there, but I suspect that you had a non-Exception type thrown in the body of an Example. But I have more questions than answers for now:
Where is test4 declared? Inside the specification body? Inside a Context case class?
Since the errors happen intermittently, are you sure you always have a proper mongodb context? Maybe your specification examples are being executed concurrently against the same mongodb instance?

ReCaptcha.scala.scala:114: not found: value compact

I have the following code:
http://www.assembla.com/spaces/liftweb/wiki/ReCaptcha
and when I compile the project (sbt), it sends me the following message:
src/main/scala/code/model/ReCaptcha.scala.scala:114: not found: value compact
[error] val RecaptchaOptions = compact(render(reCaptchaOptions))
Any suggestions please? :(
compact comes from lift-json, I would imagine, as that's the only place I recall that method being defined. Try adding this import:
import net.liftweb.json.JsonAST._
In addition, you may want to refer to the lift-json documentation for usage of compact(...).
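For completeness, a minimal sketch of the imports that should make that line compile with older lift-json releases, where compact lives in the Printer object (treat the exact version split as an assumption):

import net.liftweb.json.JsonAST._   // render(...) and the JValue AST
import net.liftweb.json.Printer._   // compact(...) and pretty(...) in older lift-json

// reCaptchaOptions is the JValue built in the linked ReCaptcha.scala code
val recaptchaOptionsStr = compact(render(reCaptchaOptions))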