I have a label cs_job_time in Prometheus/Alertmanager and would like to send an email alert when a condition is met for another job. The email sends fine, but is it possible to include the value of cs_job_time within the email? I can use {{$value}} for the metric in question, but I would also like to print the value of cs_job_time.
I came across this but when I try
time = "{{ `cs_job_time{instance='%s', job='/'}` $labels.instance | query | first }}"
or similar variants, I get the error message "Error expanding alert template CSJobAlert with data '{map[] 2123}': runtime error: invalid memory address or nil pointer dereference" source="alerting.go:199"
Is it possible to email metric values?
You're missing the printf from the example:
"{{ printf `cs_job_time{instance='%s', job='/'}` $labels.instance | query | first }}"
Be careful though: if there are no results, then first will fail. It's generally best to use a range statement, as that is resilient to empty results.
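Put together in an alerting rule annotation, a range-based version might look like the sketch below. The expr and the rest of the rule are placeholders; only the annotation template reflects the answer, and the job='/' value is carried over from the question. Inside the range, each sample exposes .Value and .Labels:

```yaml
groups:
  - name: cs-job
    rules:
      - alert: CSJobAlert
        expr: some_metric > 100   # placeholder expression
        annotations:
          description: >-
            Alert value: {{ $value }}.
            {{ range printf `cs_job_time{instance='%s', job='/'}` $labels.instance | query }}
            cs_job_time is {{ .Value }}
            {{ end }}
```

If the query returns nothing, the range body simply never runs, instead of failing the way first does.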
Related
I need to read my logs just like they were displayed in the application console.
My Grafana LogQL query looks like this:
{job="myjob", host="myserver"}
| json
| line_format `{{ regexReplaceAll "{(.*?)}" .MessageTemplate "{{.Properties_${1}}}" }}`
It outputs a JSON object. I would like to use the "MessageTemplate" property and substitute all the bracketed variables back where they belong, so the log is readable in Grafana log dashboards.
My problem is that when using regexReplaceAll("<regexp>" <String_To_search> <output>) you cannot parse the output as a variable. Is there a way to do this that I missed? I've tried hundreds of combinations and gone through the official documentation.
Please let me know if it's impossible !
I am working on a Loki-based Dashboard on Grafana. I have one panel for searching text in the Loki trace logs, the current query is like:
{job="abc-service"}
|~ "searchTrace"
|json
|line_format "{{if .trace_message}} Message: \t{{.trace_message}} {{end}}"
Where searchTrace is a variable of type "Text box" for the user to input search text.
I want to include another variable skipTestLog to skip logs created by some test cron tasks. skipTestLog is a custom variable of two options: Yes,No.
Suppose the logs created by test cron tasks contain the text CronTest in the field trace_message after the json parser, are there any ways to filter them out based on the selected value of skipTestLog?
Create a key/value custom variable like in the following example:
Use the variable like in the following example:
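The example screenshots didn't survive here, so as a hedged reconstruction of the approach: define skipTestLog as a custom key/value variable whose value is a regex fragment, for example

Yes : CronTest,No : a^

(a^ is a pattern that can never match, so selecting "No" filters nothing out). The variable can then be interpolated into a label filter on trace_message; whether the regex needs the surrounding .* depends on anchoring, so adjust to taste:

```
{job="abc-service"}
|~ "searchTrace"
| json
| trace_message !~ ".*${skipTestLog}.*"
```

With "Yes" selected, lines whose trace_message contains CronTest are dropped; with "No", every line passes through.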
I have several metrics with the label "service". I want to get a list of all the "service" levels that begin with "abc" and end with "xyz". These will be the values of a grafana template variable.
This is what I have tried:
label_values(service) =~ "abc.*xyz"
However, this produces an error: Template variables could not be initialized: parse error at char 13: could not parse remaining input "(service_name) "...
Any ideas on how to filter the label values?
This should work (replacing up with the metric you mention):
label_values(up{service=~"abc.*xyz"}, service)
Or, in case you actually need to look across multiple metrics (assuming that for some reason some metrics have some service label values and other metrics have other values):
label_values({__name__=~"metric1|metric2|metric3", service=~"abc.*xyz"}, service)
I am using Kapacitor 1.3 and trying to use the following where node to keep measurements with an empty tag. Nothing passes through, and I get the same result with == ''.
| where(lambda: 'process-cpu__process-name' =~ /^$/)
I can work around this issue by setting a default value for missing tags and filtering on that default, as in the following node, but I am wondering if there is a better way to structure the initial where statement and avoid the extra node.
| default()
.tag('process-cpu__process-name','system')
| where(lambda: "process-cpu__process-name" == 'system')
Sure it doesn't pass, because this
'process-cpu__process-name'
is a string literal in TICKScript, not a reference to a field, which would be
"process-cpu__process-name"
With a string literal on the left-hand side, the condition is obviously always false.
It's quite a common mistake though, especially for someone with previous experience in languages that tolerate both single and double quotes for plain strings. :-)
Also, there's a function available in TICKScript lambdas called strLength(); please find the doc here.
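Applying that to the original node, the fix is just the quoting: double quotes make it a field/tag reference. A sketch, assuming the points actually carry the tag (truly missing tags may still need the default() workaround):

```
// Field reference in double quotes, string literal in single quotes
| where(lambda: "process-cpu__process-name" == '')
// or, equivalently, using strLength() from TICKScript's lambda functions
| where(lambda: strLength("process-cpu__process-name") == 0)
```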
I have just started investigating the treeline.io beta, and I could not find anything in the existing machinepacks that would do the job (sanitizing user inputs). I'm wondering if I can do it in any way, ideally within Treeline.
Treeline automatically does type-checking on all incoming request parameters. If you create a route POST /foo with parameter age and give it 123 as an example, it will automatically display an error message if you try to post to /foo with age set to abc, because it's not a number.
As far as more complex validation goes, you can certainly do it in Treeline: just add more machines to the beginning of your route. The if machine works well for simple tasks; for example, to ensure that age is < 150, you can use if and set the left-hand value to the age parameter, the right-hand value to 150, and the comparison to "<". For more custom validations, you can create your own machine using the built-in editor and add pass and fail exits like the if machine has!
The schema-inspector machinepack allows you to sanitize and validate inputs in Treeline: machinepack-schemainspector
Here is a screenshot of how I'm using it in my Treeline project:
The content of the Sanitize element:
The content of the Validate element (using the Sanitize output):
For the next parts, I'm always using the Sanitize output (email trimmed and in lowercase for this example).