Fix duplicate sample detection at chunks size limit

Before cutting a new XOR chunk because the current chunk has gone over
the size limit, check that the incoming timestamp is in order, i.e.
strictly newer than the latest sample in the old chunk; otherwise reject
it as a duplicate or out-of-order sample.

Signed-off-by: György Krajcsovits <gyorgy.krajcsovits@grafana.com>
Author: György Krajcsovits
Date: 2023-09-20 14:49:56 +02:00
Parent: 56b3a015b6
Commit: 96d03b6f46

@@ -1283,6 +1283,9 @@ func (s *memSeries) appendPreprocessor(t int64, e chunkenc.Encoding, o chunkOpts
 		c = s.cutNewHeadChunk(t, e, o.chunkRange)
 		chunkCreated = true
 	} else if len(c.chunk.Bytes()) > maxBytesPerXORChunk {
+		if c.maxTime >= t {
+			return c, false, false
+		}
 		c = s.cutNewHeadChunk(t, e, o.chunkRange)
 		chunkCreated = true
 	}