Stefano L

Reputation: 1582

Prometheus barely used counters not showing in Grafana

I have a counter that rarely increases. The low frequency of increments seems to cause trouble for us: the event the counter is supposed to represent does not show up in Grafana, even though I can see the counter on the /prometheus endpoint:

my_counter{client="some_label", capture_channel="DESKTOP", instace_name="foo",stage="dev",testRequest="false",validation_issue="INVALID_SELECTION",} 1.0

In Grafana, this looks as follows:

[screenshot: Grafana panel showing no data]

Interestingly though, if I graph the data in a raw fashion, I do see that Prometheus has scraped it (and other variants of this counter too, all distinguished by one of the labels):

[screenshot: raw time series visible in Grafana]

Am I getting this wrong? Is it because of Prometheus' data model, which "counts" something by processing deltas between scrapes? And if there is a very slowly increasing counter with a lot of labels, is Prometheus then unable to calculate the "increase" correctly?

I also tried the rate() function, but it returns nothing:

sum(rate(my_counter[$__rate_interval])) by (validation_issue)

Upvotes: 3

Views: 1493

Answers (1)

trallnag

Reputation: 2386

This is expected, because counters are (generally) not initialized with a value of zero at the time they are defined. And in the second screenshot you see nothing because there is only a single data point for each individual time series, so increase() fails to return anything: with a single point there is no increase to calculate.

Under some circumstances you can counteract this by manually initializing the metrics in your application. This only makes sense if the cardinality of your label values is well defined.
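As a sketch of what "manually initializing" can look like (assuming a Python service using the official prometheus_client library; the label set here is illustrative, not taken from your app): touching `.labels(...)` once at startup creates each child time series with value 0.0, so Prometheus scrapes a baseline point before the first real increment and rate()/increase() have something to work with.

```python
# Hypothetical sketch: pre-create known label combinations so each
# time series is exposed at 0.0 from application startup.
from prometheus_client import Counter, generate_latest

# Assumed, well-defined set of label values (replace with your own).
VALIDATION_ISSUES = ["INVALID_SELECTION", "MISSING_FIELD"]

my_counter = Counter(
    "my_counter",
    "Validation issues observed",
    ["validation_issue"],
)

# Calling .labels() without .inc() registers the child at 0.0.
for issue in VALIDATION_ISSUES:
    my_counter.labels(validation_issue=issue)

# The exposition now contains a 0.0 sample per label value.
print(generate_latest(my_counter).decode())
```

Note that the Python client appends the `_total` suffix to counters in the exposition format, so the series appear as `my_counter_total{validation_issue="..."} 0.0`.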

Upvotes: 4
