So far, we have been compiling the C/C++ code without any
optimization. In non-representative but practically relevant tests,
-O3 improved the total query time for a demanding graph by ~20%.
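For illustration only, the flag would be injected through the C/C++
compiler flags roughly as follows; the variable names are an
assumption, not necessarily what our build scripts actually use:

    # Hypothetical invocation; the Makefile may plumb the flag differently.
    CFLAGS="-O3" CXXFLAGS="-O3" make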
Change-Id: I5e8123650e53a4933ed4fbe63d0b1ca67217b865
If you want to use a different binary package than the officially
provided one, you can now point the GOURL environment variable at the
package you want to fetch instead.
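For example, with a placeholder URL standing in for the package you
actually want:

    # Hypothetical invocation; GOURL overrides the default download location.
    GOURL=https://example.com/go1.1.linux-amd64.tar.gz make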
Change-Id: I1cefe2998bc86435cfbd058ba398a7b6c4e7d031
This reverts commit 0e9e3e068d.
Not only is this known to produce problematic artifacts, it also
refuses to build on Mac OS X.
TMPDIR=/tmp GOROOT=/Users/mattproud/Development/go/src/github.com/prometheus/prometheus/.build/root/go GOPATH=/Users/mattproud/Development/go/src/github.com/prometheus/prometheus/.build/root/gopath /Users/mattproud/Development/go/src/github.com/prometheus/prometheus/.build/root/go/bin/go build -o prometheus -ldflags " -X main.buildVersion c7052ed -X main.buildBranch refactor/storage/modify-append-signature -X main.buildUser mattproud@Berlin.local -X main.buildDate 20130815-11:15:49 -X main.goVersion 1.1 -X main.leveldbVersion 1.12.0 -X main.protobufVersion 2.5.0 -X main.snappyVersion 1.1.0 -linkmode external -extldflags '-lstdc++ -lpthread -static /Users/mattproud/Development/go/src/github.com/prometheus/prometheus/.build/root/lib/libleveldb.a /Users/mattproud/Development/go/src/github.com/prometheus/prometheus/.build/root/lib/libsnappy.a'" .
# _/Users/mattproud/Development/go/src/github.com/prometheus/prometheus
ld: library not found for -lcrt0.o
Change-Id: I4f42161aebfd35a6f09cd7f984b78cc4498774aa
The race detection binary target is special in that it needs to
explicitly link against the dependent libraries and recompile the
cgo bindings. Because of this, we have a new build target that
produces these binaries.
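For illustration, the new target boils down to something like the
following sketch; BUILD_ROOT and the output name are placeholders,
while the library list mirrors the one used by the regular build:

    # Hypothetical sketch of the race-enabled build invocation.
    CGO_CFLAGS="-I$BUILD_ROOT/include" \
    CGO_LDFLAGS="-L$BUILD_ROOT/lib -lleveldb -lsnappy -lstdc++" \
    go build -race -o prometheus.race .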
This commit updates the documentation, Makefiles, formatting, and
code semantics to support the Go 1.1 runtime, which includes ...
1. ``make advice``,
2. ``make format``, and
3. ``go fix`` on various targets.
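For reference, the full sweep looks roughly like this; the ``./...``
pattern is an assumption about which targets ``go fix`` runs over:

    make advice
    make format
    go fix ./...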
This commit extracts the model.Values truncation behavior into the
actual tiered storage, which uses it and behaves in a peculiar way:
notably, the retention of previous elements if the chunk would
otherwise go empty. This is done to enable interpolation between
sparse sample values in the evaluation cycle. Nothing necessarily new
here, just an extraction.
Now, the model.Values TruncateBefore functionality does what a user
would expect, without any surprises. This is required for the
DeletionProcessor, which may decide to split a large chunk in two if
it determines that the chunk contains the cut-off time.