I'm using InfluxDB as a data source and have two filters, currency and host.
When viewing my data in Grafana for a specific field, the legend contains the 2 filters:
total is the field and currency and host are the filters.
Is there a way to format the legend to either remove these filters or change the label to Total USD?
My query if it helps:
from(bucket: "Test")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "orders")
|> filter(fn: (r) => r["_field"] == "total")
|> filter(fn: (r) => r["host"] == "EU")
|> filter(fn: (r) => r["currency"] == "${currency}")
|> aggregateWindow(every: v.windowPeriod, fn: mean, createEmpty: false)
|> yield(name: "mean")
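One approach (my sketch, not from the original post; it assumes the host and currency tags aren't needed downstream) is to relabel the field and drop the tag columns in Flux, so the legend collapses to a single label. Grafana interpolates ${currency} before the query runs, so the label becomes e.g. Total USD:
from(bucket: "Test")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "orders")
|> filter(fn: (r) => r["_field"] == "total")
|> filter(fn: (r) => r["host"] == "EU")
|> filter(fn: (r) => r["currency"] == "${currency}")
|> aggregateWindow(every: v.windowPeriod, fn: mean, createEmpty: false)
// relabel the field so the legend shows "Total <currency>" instead of "total"
|> set(key: "_field", value: "Total ${currency}")
// remove the tag columns so they no longer appear in the legend
|> drop(columns: ["host", "currency"])
|> yield(name: "mean")
Alternatively, a Grafana field override (Standard options → Display name) can relabel the series without touching the query.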
I have data from a vehicle's route in InfluxDB with geotagging in the variables Latitude and Longitude. I would like to display the data as Markers in a Grafana Geomap panel.
My query for the data in Grafana looks as below:
from(bucket: v.defaultBucket)
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "1F4F8F26")
|> filter(fn: (r) => r["_field"] == "Latitude" or r["_field"] == "Longitude")
|> aggregateWindow(every: v.windowPeriod, fn: median)
|> yield(name: "median")
This query lets me plot the data as in the following picture:
In table form, the data looks as follows (note how I can switch between displaying Latitude and Longitude separately):
The problem is that when I try to use a Geomap panel, nothing is displayed:
I have also tried restructuring my data into a Time, Latitude, Longitude format with the query below:
from(bucket: v.defaultBucket)
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "1F4F8F26")
|> filter(fn: (r) => r["_field"] == "Latitude" or r["_field"] == "Longitude")
|> aggregateWindow(every: v.windowPeriod, fn: median)
|> pivot(rowKey: ["_time"], columnKey: ["_field"], valueColumn: "_value")
|> yield(name: "median")
However, this still does not enable me to display the points as markers in Grafana. Any inputs are welcome.
I suggest removing the aggregation function.
I'm using this query with Geomap:
import "influxdata/influxdb/schema"
from(bucket: "fleet")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r._measurement == "points")
|> filter(fn: (r) => r._field == "lat" or r._field == "lon" or r._field == "alt" or r._field == "speed")
|> schema.fieldsAsCols()
|> group()
|> sort(columns: ["_time"])
It seems that Geomap does not handle grouped data well, so the pivoted data is ungrouped (and sort() is then needed to restore the correct order).
I'm trying to create a dashboard where I can filter data by gas station location and fuel type.
This is the table I get from this query:
from(bucket: "homeassistant")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["entity_id"] == "tankerkoenig_aral_tankstelle_bat_waldmohr_e5")
|> filter(fn: (r) => r["_field"] == "city_str" or r["_field"] == "value")
|> aggregateWindow(every: v.windowPeriod, fn: last, createEmpty: false)
|> yield(name: "last")
How can I get the _value of _field "city_str" and the _value of _field "value" into one table, so I can query the location and the price at the same time within Grafana?
Use the schema.fieldsAsCols() function. You will get one table with city_str and value columns.
import "influxdata/influxdb/schema"
from(bucket: "homeassistant")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["entity_id"] == "tankerkoenig_aral_tankstelle_bat_waldmohr_e5")
|> filter(fn: (r) => r["_field"] == "city_str" or r["_field"] == "value")
|> aggregateWindow(every: v.windowPeriod, fn: last, createEmpty: false)
|> schema.fieldsAsCols()
|> yield(name: "last")
When I explore my data in the Influx web frontend, all data points are evenly spaced in 10s steps (12:00:00, 12:00:10, 12:00:20, ...).
When I use the same query in a Grafana panel, the timestamps shift slightly, by 1-2s (12:00:01, 12:00:12, 12:00:21, ...).
How can I force the Grafana panel to keep the 10s steps?
from(bucket: "my_bucket")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "docker_container_cpu")
|> filter(fn: (r) => r["_field"] == "usage_percent")
|> aggregateWindow(every: v.windowPeriod, fn: mean, createEmpty: false)
|> yield(name: "mean")
Aggregate per 10s:
from(bucket: "my_bucket")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "docker_container_cpu")
|> filter(fn: (r) => r["_field"] == "usage_percent")
|> aggregateWindow(every: 10s, fn: mean, createEmpty: false)
|> yield(name: "mean")
But it is a good idea to use the v.windowPeriod macro, because it generates the aggregation period automatically based on the selected dashboard time range. A static 10-second aggregation will overload your server and browser if you select, for example, the last year of data. See the docs.
I'm using InfluxDB v2.1 and hitting a source-measurement issue when running the query below as a task. Can someone help?
from(bucket: "volume_discount")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "nginx_access_log")
|> filter(fn: (r) => r["_field"] == "request")
|> filter(fn: (r) => r["host"] == "volume-discount-process-server-1")
|> count(column: "_value")
|> group(columns: ["_time"])
|> sum(column: "_value")
|> map(fn: (r) => ({
    _time: now(),
    _field: "request",
    _measurement: "nginx_access_log",
    _value: r._value
}))
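For reference, here is a minimal sketch of how this could run as a task (the task name, schedule, and destination bucket are my assumptions, not from the original post). Dashboard macros such as v.timeRangeStart are not available inside tasks, so the range has to come from the task's own schedule:
option task = {name: "nginx-request-count", every: 1h}

from(bucket: "volume_discount")
|> range(start: -task.every) // no dashboard time range in a task; cover the last scheduling interval
|> filter(fn: (r) => r["_measurement"] == "nginx_access_log")
|> filter(fn: (r) => r["_field"] == "request")
|> filter(fn: (r) => r["host"] == "volume-discount-process-server-1")
|> count(column: "_value")
|> group(columns: ["_time"])
|> sum(column: "_value")
|> map(fn: (r) => ({
    _time: now(),
    _field: "request",
    _measurement: "nginx_access_log",
    _value: r._value
}))
// assumed destination bucket; to() requires the _time, _measurement, _field, and _value columns, which map() provides
|> to(bucket: "volume_discount_counts")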
I made this call:
curl -X POST -H "Content-Type: application/json" -d '{
"user": {
"email": "leo.hetsch#testapp.com",
"first_name": "Léo",
"last_name": "Hetsch",
"password": "notsosecure",
"username": "test1"
}
}' "http://localhost:4000/api/users"
And on the server I get:
[info] POST /api/users
[debug] Processing by VirtualTeams.UserController.create/2
Parameters: %{"user" => %{"email" => "leo.hetsch#testapp.com", "first_name" => "Léo", "last_name" => "Hetsch", "password" => "[FILTERED]", "username" => "test1"}}
Pipelines: [:api]
This is in my user.ex file:
schema "users" do
field :email, :string
field :password, :string
field :first_name, :string
field :last_name, :string
field :api_token, :string
field :username, :string
timestamps()
end
#doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:email, :password, :first_name, :last_name, :api_token, :username])
|> validate_required([:email, :password, :first_name, :last_name, :api_token, :username])
|> unique_constraint(:email)
|> unique_constraint(:username)
end
def create(params) do
changeset(%VirtualTeams.User{}, params)
|> put_change(:password, hashed_password(params["password"]))
|> put_change(:api_token, :base64.encode(:crypto.strong_rand_bytes(24)))
|> VirtualTeams.Repo.insert()
end
defp hashed_password(password) do
Comeonin.Pbkdf2.hashpwsalt(password)
end
Just to verify I did:
mix ecto.migrate
00:40:35.074 [info] Already up
Why am I getting an error?
UPDATE:
Forgot the error:
{"error":"error creating user"}
In my controller I have this code, which is where the error comes from:
def create(conn, %{"user" => user_params}) do
  case User.create(user_params) do
    {:ok, user} ->
      conn
      |> put_status(:created)
      |> render("user.json", user: user)
    {:error, changeset} ->
      conn
      |> put_status(:bad_request)
      |> json(%{error: "error creating user"})
  end
end
The problem appears to be that :api_token is required by changeset/2, but not present in the params from the request.
In general, it's useful to inspect the changeset in {:error, changeset} (returned by create/1 in this case) to see which validation failed.
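A minimal sketch of one possible fix (my suggestion, not something confirmed in the thread): since :api_token is generated server-side in create/1 via put_change/3, drop it from cast/3 and validate_required/2 so the missing param no longer invalidates the changeset:
def changeset(struct, params \\ %{}) do
  struct
  # :api_token is set in create/1 via put_change/3, so don't cast or require it here
  |> cast(params, [:email, :password, :first_name, :last_name, :username])
  |> validate_required([:email, :password, :first_name, :last_name, :username])
  |> unique_constraint(:email)
  |> unique_constraint(:username)
end
While debugging, the controller's error branch can also surface the failed validations:
{:error, changeset} ->
  # log which validations failed, e.g. [api_token: {"can't be blank", [validation: :required]}]
  IO.inspect(changeset.errors)
  conn
  |> put_status(:bad_request)
  |> json(%{error: "error creating user"})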