promql: rename holt_winters to double_exponential_smoothing

Signed-off-by: Jan Fajerski <jfajersk@redhat.com>
Jan Fajerski 2024-09-18 11:20:17 +02:00
parent 15cea39136
commit 96e5a94d29
13 changed files with 117 additions and 100 deletions

@@ -380,17 +380,22 @@ do not show up in the returned vector.
Similarly, `histogram_stdvar(v instant-vector)` returns the estimated standard
variance of observations in a native histogram.

-## `holt_winters()`
+## `double_exponential_smoothing()`

**This function has to be enabled via the [feature flag](../feature_flags.md#experimental-promql-functions) `--enable-feature=promql-experimental-functions`.**

-`holt_winters(v range-vector, sf scalar, tf scalar)` produces a smoothed value
+`double_exponential_smoothing(v range-vector, sf scalar, tf scalar)` produces a smoothed value
for time series based on the range in `v`. The lower the smoothing factor `sf`,
the more importance is given to old data. The higher the trend factor `tf`, the
more trends in the data are considered. Both `sf` and `tf` must be between 0 and
1.

+For additional details, refer to [NIST Engineering Statistics Handbook](https://www.itl.nist.gov/div898/handbook/pmc/section4/pmc433.htm).
+
+In Prometheus V2 this function was called `holt_winters`. This caused confusion
+since the Holt-Winters method usually refers to triple exponential smoothing.
+Double exponential smoothing as implemented here is also referred to as "Holt
+Linear".

-`holt_winters` should only be used with gauges.
+`double_exponential_smoothing` should only be used with gauges.

## `hour()`

@@ -117,7 +117,7 @@ func rangeQueryCases() []benchCase {
		},
		// Holt-Winters and long ranges.
		{
-			expr: "holt_winters(a_X[1d], 0.3, 0.3)",
+			expr: "double_exponential_smoothing(a_X[1d], 0.3, 0.3)",
		},
		{
			expr: "changes(a_X[1d])",

@@ -350,7 +350,7 @@ func calcTrendValue(i int, tf, s0, s1, b float64) float64 {
// data. A lower smoothing factor increases the influence of historical data. The trend factor (0 < tf < 1) affects
// how trends in historical data will affect the current data. A higher trend factor increases the influence
// of trends. Algorithm taken from https://en.wikipedia.org/wiki/Exponential_smoothing titled: "Double exponential smoothing".
-func funcHoltWinters(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) (Vector, annotations.Annotations) {
+func funcDoubleExponentialSmoothing(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper) (Vector, annotations.Annotations) {
	samples := vals[0].(Matrix)[0]
	// The smoothing factor argument.
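To make the recurrence in the comment above concrete, here is a minimal, hedged Go sketch of double exponential smoothing ("Holt linear") in the Wikipedia formulation the comment cites. Function names, sample data, and edge-case handling are illustrative only and not the exact Prometheus implementation:

```go
package main

import "fmt"

// doubleExponentialSmoothing returns the smoothed value after the last sample.
// The level follows the raw data (weighted by the smoothing factor sf); the
// trend follows the slope of the level (weighted by the trend factor tf).
func doubleExponentialSmoothing(samples []float64, sf, tf float64) float64 {
	if len(samples) < 2 {
		return 0 // need at least two samples to establish an initial trend
	}
	level := samples[0]
	trend := samples[1] - samples[0] // initial trend: the first observed change
	for i := 1; i < len(samples); i++ {
		// Level update: blend the new sample with the previous level projected
		// forward by the trend. A lower sf gives old data more weight.
		newLevel := sf*samples[i] + (1-sf)*(level+trend)
		// Trend update: blend the latest level change with the previous trend.
		// A higher tf makes trends in the data count for more.
		trend = tf*(newLevel-level) + (1-tf)*trend
		level = newLevel
	}
	return level
}

func main() {
	// Made-up gauge samples: roughly linear growth with a little noise.
	samples := []float64{10, 12, 15, 14, 18, 21, 22}
	fmt.Println(doubleExponentialSmoothing(samples, 0.3, 0.3))
}
```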
@@ -1657,82 +1657,82 @@ func funcYear(vals []parser.Value, args parser.Expressions, enh *EvalNodeHelper)
// FunctionCalls is a list of all functions supported by PromQL, including their types.
var FunctionCalls = map[string]FunctionCall{
	"abs": funcAbs,
	"absent": funcAbsent,
	"absent_over_time": funcAbsentOverTime,
	"acos": funcAcos,
	"acosh": funcAcosh,
	"asin": funcAsin,
	"asinh": funcAsinh,
	"atan": funcAtan,
	"atanh": funcAtanh,
	"avg_over_time": funcAvgOverTime,
	"ceil": funcCeil,
	"changes": funcChanges,
	"clamp": funcClamp,
	"clamp_max": funcClampMax,
	"clamp_min": funcClampMin,
	"cos": funcCos,
	"cosh": funcCosh,
	"count_over_time": funcCountOverTime,
	"days_in_month": funcDaysInMonth,
	"day_of_month": funcDayOfMonth,
	"day_of_week": funcDayOfWeek,
	"day_of_year": funcDayOfYear,
	"deg": funcDeg,
	"delta": funcDelta,
	"deriv": funcDeriv,
	"exp": funcExp,
	"floor": funcFloor,
	"histogram_avg": funcHistogramAvg,
	"histogram_count": funcHistogramCount,
	"histogram_fraction": funcHistogramFraction,
	"histogram_quantile": funcHistogramQuantile,
	"histogram_sum": funcHistogramSum,
	"histogram_stddev": funcHistogramStdDev,
	"histogram_stdvar": funcHistogramStdVar,
-	"holt_winters": funcHoltWinters,
+	"double_exponential_smoothing": funcDoubleExponentialSmoothing,
	"hour": funcHour,
	"idelta": funcIdelta,
	"increase": funcIncrease,
	"irate": funcIrate,
	"label_replace": funcLabelReplace,
	"label_join": funcLabelJoin,
	"ln": funcLn,
	"log10": funcLog10,
	"log2": funcLog2,
	"last_over_time": funcLastOverTime,
	"mad_over_time": funcMadOverTime,
	"max_over_time": funcMaxOverTime,
	"min_over_time": funcMinOverTime,
	"minute": funcMinute,
	"month": funcMonth,
	"pi": funcPi,
	"predict_linear": funcPredictLinear,
	"present_over_time": funcPresentOverTime,
	"quantile_over_time": funcQuantileOverTime,
	"rad": funcRad,
	"rate": funcRate,
	"resets": funcResets,
	"round": funcRound,
	"scalar": funcScalar,
	"sgn": funcSgn,
	"sin": funcSin,
	"sinh": funcSinh,
	"sort": funcSort,
	"sort_desc": funcSortDesc,
	"sort_by_label": funcSortByLabel,
	"sort_by_label_desc": funcSortByLabelDesc,
	"sqrt": funcSqrt,
	"stddev_over_time": funcStddevOverTime,
	"stdvar_over_time": funcStdvarOverTime,
	"sum_over_time": funcSumOverTime,
	"tan": funcTan,
	"tanh": funcTanh,
	"time": funcTime,
	"timestamp": funcTimestamp,
	"vector": funcVector,
	"year": funcYear,
}

// AtModifierUnsafeFunctions are the functions whose result

@@ -202,8 +202,8 @@ var Functions = map[string]*Function{
		ArgTypes: []ValueType{ValueTypeScalar, ValueTypeVector},
		ReturnType: ValueTypeVector,
	},
-	"holt_winters": {
-		Name: "holt_winters",
+	"double_exponential_smoothing": {
+		Name: "double_exponential_smoothing",
		ArgTypes: []ValueType{ValueTypeMatrix, ValueTypeScalar, ValueTypeScalar},
		ReturnType: ValueTypeVector,
		Experimental: true,

@@ -651,7 +651,7 @@ eval_ordered instant at 50m sort_by_label(node_uname_info, "release")
  node_uname_info{job="node_exporter", instance="4m5", release="1.11.3"} 100
  node_uname_info{job="node_exporter", instance="4m1000", release="1.111.3"} 100

-# Tests for holt_winters
+# Tests for double_exponential_smoothing
clear

# positive trends
@@ -661,7 +661,7 @@ load 10s
  http_requests{job="api-server", instance="0", group="canary"} 0+30x1000 300+80x1000
  http_requests{job="api-server", instance="1", group="canary"} 0+40x2000

-eval instant at 8000s holt_winters(http_requests[1m], 0.01, 0.1)
+eval instant at 8000s double_exponential_smoothing(http_requests[1m], 0.01, 0.1)
  {job="api-server", instance="0", group="production"} 8000
  {job="api-server", instance="1", group="production"} 16000
  {job="api-server", instance="0", group="canary"} 24000

@@ -675,7 +675,7 @@ load 10s
  http_requests{job="api-server", instance="0", group="canary"} 0+30x1000 300-80x1000
  http_requests{job="api-server", instance="1", group="canary"} 0-40x1000 0+40x1000

-eval instant at 8000s holt_winters(http_requests[1m], 0.01, 0.1)
+eval instant at 8000s double_exponential_smoothing(http_requests[1m], 0.01, 0.1)
  {job="api-server", instance="0", group="production"} 0
  {job="api-server", instance="1", group="production"} -16000
  {job="api-server", instance="0", group="canary"} 24000

ui-commits (new file, 12 lines)

@@ -0,0 +1,12 @@
dfec29d8e Fix border color for target pools with one target that is failing
65743bf9b ui: drop template readme
a7c1a951d Add general Mantine overrides CSS file
0757fbbec Make sure that alert element table headers are not wrapped
0180cf31a Factor out common icon and card styles
50af7d589 Fix tree line drawing by using a callback ref
ac01dc903 Explain, vector-to-vector: Do not compute results for set operators
9b0dc68d0 PromQL explain view: Support set operators
57898c792 Refactor and fix time formatting functions, add tests
091fc403c Fiddle with targets table styles to try and improve things a bit
a1908df92 Don't wrap action buttons below metric name in metrics explorer
ac5377873 mantine UI: Distinguish between Not Ready and Stopping

@@ -1277,17 +1277,17 @@ const funcDocs: Record<string, React.ReactNode> = {
      </p>
    </>
  ),
-  holt_winters: (
+  double_exponential_smoothing: (
    <>
      <p>
-        <code>holt_winters(v range-vector, sf scalar, tf scalar)</code> produces a smoothed value for time series based on
+        <code>double_exponential_smoothing(v range-vector, sf scalar, tf scalar)</code> produces a smoothed value for time series based on
        the range in <code>v</code>. The lower the smoothing factor <code>sf</code>, the more importance is given to old
        data. The higher the trend factor <code>tf</code>, the more trends in the data are considered. Both <code>sf</code>{' '}
        and <code>tf</code> must be between 0 and 1.
      </p>
      <p>
-        <code>holt_winters</code> should only be used with gauges.
+        <code>double_exponential_smoothing</code> should only be used with gauges.
      </p>
    </>
  ),

@@ -17,7 +17,7 @@ export const functionArgNames: Record<string, string[]> = {
  // exp: [],
  // floor: [],
  histogram_quantile: ['target quantile', 'histogram'],
-  holt_winters: ['input series', 'smoothing factor', 'trend factor'],
+  double_exponential_smoothing: ['input series', 'smoothing factor', 'trend factor'],
  hour: ['timestamp (default = vector(time()))'],
  // idelta: [],
  // increase: [],
@@ -68,7 +68,7 @@ export const functionDescriptions: Record<string, string> = {
  exp: 'calculate exponential function for input vector values',
  floor: 'round down values of input series to nearest integer',
  histogram_quantile: 'calculate quantiles from histogram buckets',
-  holt_winters: 'calculate smoothed value of input series',
+  double_exponential_smoothing: 'calculate smoothed value of input series',
  hour: 'return the hour of the day for provided timestamps',
  idelta: 'calculate the difference between the last two samples of a range vector (for counters)',
  increase: 'calculate the increase in value over a range of time (for counters)',

@@ -60,8 +60,8 @@ export const functionSignatures: Record<string, Func> = {
  histogram_stddev: { name: 'histogram_stddev', argTypes: [valueType.vector], variadic: 0, returnType: valueType.vector },
  histogram_stdvar: { name: 'histogram_stdvar', argTypes: [valueType.vector], variadic: 0, returnType: valueType.vector },
  histogram_sum: { name: 'histogram_sum', argTypes: [valueType.vector], variadic: 0, returnType: valueType.vector },
-  holt_winters: {
-    name: 'holt_winters',
+  double_exponential_smoothing: {
+    name: 'double_exponential_smoothing',
    argTypes: [valueType.matrix, valueType.scalar, valueType.scalar],
    variadic: 0,
    returnType: valueType.vector,

@@ -258,7 +258,7 @@ export const functionIdentifierTerms = [
    type: 'function',
  },
  {
-    label: 'holt_winters',
+    label: 'double_exponential_smoothing',
    detail: 'function',
    info: 'Calculate smoothed value of input series',
    type: 'function',

@@ -46,7 +46,7 @@ import {
  HistogramStdDev,
  HistogramStdVar,
  HistogramSum,
-  HoltWinters,
+  DoubleExponentialSmoothing,
  Hour,
  Idelta,
  Increase,
@@ -312,8 +312,8 @@ const promqlFunctions: { [key: number]: PromQLFunction } = {
    variadic: 0,
    returnType: ValueType.vector,
  },
-  [HoltWinters]: {
-    name: 'holt_winters',
+  [DoubleExponentialSmoothing]: {
+    name: 'double_exponential_smoothing',
    argTypes: [ValueType.matrix, ValueType.scalar, ValueType.scalar],
    variadic: 0,
    returnType: ValueType.vector,

@@ -20,7 +20,7 @@ export const promQLHighLight = styleTags({
  NumberDurationLiteral: tags.number,
  NumberDurationLiteralInDurationContext: tags.number,
  Identifier: tags.variableName,
-  'Abs Absent AbsentOverTime Acos Acosh Asin Asinh Atan Atanh AvgOverTime Ceil Changes Clamp ClampMax ClampMin Cos Cosh CountOverTime DaysInMonth DayOfMonth DayOfWeek DayOfYear Deg Delta Deriv Exp Floor HistogramAvg HistogramCount HistogramFraction HistogramQuantile HistogramSum HoltWinters Hour Idelta Increase Irate LabelReplace LabelJoin LastOverTime Ln Log10 Log2 MaxOverTime MinOverTime Minute Month Pi PredictLinear PresentOverTime QuantileOverTime Rad Rate Resets Round Scalar Sgn Sin Sinh Sort SortDesc SortByLabel SortByLabelDesc Sqrt StddevOverTime StdvarOverTime SumOverTime Tan Tanh Time Timestamp Vector Year':
+  'Abs Absent AbsentOverTime Acos Acosh Asin Asinh Atan Atanh AvgOverTime Ceil Changes Clamp ClampMax ClampMin Cos Cosh CountOverTime DaysInMonth DayOfMonth DayOfWeek DayOfYear Deg Delta Deriv Exp Floor HistogramAvg HistogramCount HistogramFraction HistogramQuantile HistogramSum DoubleExponentialSmoothing Hour Idelta Increase Irate LabelReplace LabelJoin LastOverTime Ln Log10 Log2 MaxOverTime MinOverTime Minute Month Pi PredictLinear PresentOverTime QuantileOverTime Rad Rate Resets Round Scalar Sgn Sin Sinh Sort SortDesc SortByLabel SortByLabelDesc Sqrt StddevOverTime StdvarOverTime SumOverTime Tan Tanh Time Timestamp Vector Year':
    tags.function(tags.variableName),
  'Avg Bottomk Count Count_values Group LimitK LimitRatio Max Min Quantile Stddev Stdvar Sum Topk': tags.operatorKeyword,
  'By Without Bool On Ignoring GroupLeft GroupRight Offset Start End': tags.modifier,

@@ -141,7 +141,7 @@ FunctionIdentifier {
  HistogramStdVar |
  HistogramSum |
  HistogramAvg |
-  HoltWinters |
+  DoubleExponentialSmoothing |
  Hour |
  Idelta |
  Increase |
@@ -388,7 +388,7 @@ NumberDurationLiteralInDurationContext {
HistogramStdDev { condFn<"histogram_stddev"> }
HistogramStdVar { condFn<"histogram_stdvar"> }
HistogramSum { condFn<"histogram_sum"> }
-HoltWinters { condFn<"holt_winters"> }
+DoubleExponentialSmoothing { condFn<"double_exponential_smoothing"> }
Hour { condFn<"hour"> }
Idelta { condFn<"idelta"> }
Increase { condFn<"increase"> }