
[receiver/prometheus] update github.com/prometheus/prometheus to v0.49.1 breaks build #30883

Closed
songy23 opened this issue Jan 30, 2024 · 4 comments · Fixed by #30934
Assignee: @dashpole
Labels: bug (Something isn't working), receiver/prometheus (Prometheus receiver)

Comments

songy23 (Member) commented Jan 30, 2024

Component(s)

receiver/prometheus

What happened?

See https://github.com/open-telemetry/opentelemetry-collector-contrib/actions/runs/7715094549/job/21028827561?pr=30881

Collector version

mainline

Environment information

No response

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

No response

songy23 added the bug (Something isn't working), needs triage (New item requiring triage), and receiver/prometheus (Prometheus receiver) labels on Jan 30, 2024

Pinging code owners for receiver/prometheus: @Aneurysm9 @dashpole. See Adding Labels via Comments if you do not have permissions to add labels yourself.

codeboten changed the title from "[receiver/prometheus] update github.com/prometheus/prometheus to v0.49.1 broke build" to "[receiver/prometheus] update github.com/prometheus/prometheus to v0.49.1 breaks build" on Jan 30, 2024
dashpole (Contributor) commented:

PR: #30881

crobert-1 (Member) commented:

The NewManager function signature was updated to require an extra registerer argument, and it now also returns an error in addition to the newly created manager. Reference

It's a trivial change for the receiver to handle the returned error, but I'm not familiar enough to know how much effort is required to pass in a prometheus.Registerer variable.

Prometheus change that caused the issue: prometheus/prometheus@5752050
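For context, here is a minimal sketch (not the receiver's actual code) of what a call site adapting to the new signature could look like. The exact parameter order, the go-kit logger, and the helper name are assumptions based on the description above and the linked Prometheus change:

```go
package scrapeadapter // hypothetical package name for this sketch

import (
	"github.com/go-kit/log"
	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/prometheus/scrape"
	"github.com/prometheus/prometheus/storage"
)

// newScrapeManager adapts to the v0.49.x scrape.NewManager signature, which
// now takes a prometheus.Registerer and returns an error alongside the
// newly created manager.
func newScrapeManager(app storage.Appendable) (*scrape.Manager, error) {
	logger := log.NewNopLogger()    // placeholder; the real receiver wires its own logger
	reg := prometheus.NewRegistry() // any Registerer; the eventual fix passes a no-op one

	mgr, err := scrape.NewManager(&scrape.Options{}, logger, app, reg)
	if err != nil {
		return nil, err // callers must now handle the returned error
	}
	return mgr, nil
}
```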

crobert-1 removed the needs triage (New item requiring triage) label on Jan 31, 2024
dashpole self-assigned this on Jan 31, 2024
codeboten added a commit to open-telemetry/opentelemetry-collector that referenced this issue Mar 12, 2024
Long story, but I'm working on updating the Prometheus dependency:
open-telemetry/opentelemetry-collector-contrib#30934

As part of that update, we need to adapt to a change that makes the
Prometheus server's self-observability metrics independent. See
prometheus/prometheus#13507 and
prometheus/prometheus#13610

One way to adapt to this change is to add a label to each receiver's
metrics to differentiate one Prometheus receiver from another. I've
tried taking that approach in
open-telemetry/opentelemetry-collector-contrib#30934,
but the current issue is that the NoOp components all have the same
name, which causes the self-observability metrics to collide.

I can work around this in the prometheus receiver's own tests, but I
can't work around the issue in the `generated_component_test.go` tests,
since those are generated.

This PR makes the ID returned by `NewNopCreateSettings` unique by giving
it a unique name.

**Link to tracking Issue:**

Part of
open-telemetry/opentelemetry-collector-contrib#30883

cc @Aneurysm9

---------

Co-authored-by: Alex Boten <[email protected]>
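For illustration, the per-receiver label idea mentioned in that commit message can be sketched with client_golang's registerer wrapping; the `receiver_id` label name and the helper function are assumptions for this sketch, not code taken from #30934:

```go
package scrapeadapter // hypothetical package name for this sketch

import "github.com/prometheus/client_golang/prometheus"

// registererFor wraps a shared Registerer so that every metric registered
// through it carries a label identifying the owning receiver instance.
// Receivers with distinct IDs can then register otherwise-identical scrape
// metrics without colliding, which is what identical NoOp component names broke.
func registererFor(receiverID string, base prometheus.Registerer) prometheus.Registerer {
	return prometheus.WrapRegistererWith(
		prometheus.Labels{"receiver_id": receiverID}, // label name is illustrative
		base,
	)
}
```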
tomasmota pushed a commit to tomasmota/opentelemetry-collector that referenced this issue Mar 14, 2024
dmitryax pushed a commit that referenced this issue Mar 25, 2024
…30934)

Fixes
#30883

Prometheus PR we need to adapt to:
prometheus/prometheus#12738
The configuration option we were using to enable protobuf has been removed, so this PR removes the `enable_protobuf_negotiation` option from the prometheus receiver. In its place, the Prometheus server now has a `scrape_protocols` config option that specifies which protocols it negotiates when scraping, including protobuf. Use `config.global.scrape_protocols = [ PrometheusProto, OpenMetricsText1.0.0, OpenMetricsText0.0.1, PrometheusText0.0.4 ]` to enable protobuf scraping instead of the removed option.
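As a concrete illustration (a sketch assembled from the option names above, not an example taken from the PR), the replacement configuration on the receiver would look roughly like this; the scrape_configs section is purely illustrative:

```yaml
receivers:
  prometheus:
    config:
      global:
        scrape_protocols:
          - PrometheusProto
          - OpenMetricsText1.0.0
          - OpenMetricsText0.0.1
          - PrometheusText0.0.4
      scrape_configs:
        - job_name: example            # illustrative job, not from the PR
          static_configs:
            - targets: ["localhost:8888"]
```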

Prometheus PR we need to adapt to:
prometheus/prometheus#12958
We now need to pass a prometheus.Registerer to the scrape manager so it can register metrics about its scrapes. This PR disables the collection of those scrape metrics from the Prometheus server: we were not using the default registry, so we were previously collecting these metrics without ever exposing them. This PR instead passes a no-op registerer to avoid spending memory on storing those metrics. In the future, we can consider using the Prometheus-to-OTel bridge to export them through the self-observability pipeline.
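The no-op registerer described above can be pictured as a type that satisfies prometheus.Registerer but discards everything it is given; a minimal sketch, not the PR's actual implementation:

```go
package scrapeadapter // hypothetical package name for this sketch

import "github.com/prometheus/client_golang/prometheus"

// nopRegisterer implements prometheus.Registerer but keeps nothing, so the
// scrape manager's self-observability metrics are never stored or exposed.
type nopRegisterer struct{}

func (nopRegisterer) Register(prometheus.Collector) error  { return nil }
func (nopRegisterer) MustRegister(...prometheus.Collector) {}
func (nopRegisterer) Unregister(prometheus.Collector) bool { return true }

var _ prometheus.Registerer = nopRegisterer{}
```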