Closed as not planned
Description
Component(s)
receiver/prometheus
What happened?
Description
When the Prometheus scrape config is updated with only a regex change in the metric relabel config, the Prometheus receiver does not update its config hash and therefore does not pick up the metric relabel config with the new regex.
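Not part of the original report: a minimal sketch of the suspected failure mode, assuming the hash is computed from a serialized form of the scrape config in which the compiled regex contributes nothing, so a regex-only change leaves the hash unchanged. The relabelRule type and field names below are made up for illustration; this is not the receiver's actual code.

```go
// Hypothetical sketch: illustrates how a config hash can stay identical when
// the compiled regex is the only field that changed, because the
// serialization used for hashing drops the regex pattern.
package main

import (
	"encoding/json"
	"fmt"
	"hash/fnv"
	"regexp"
)

// relabelRule is a stand-in for a relabel config entry; field names are made up.
type relabelRule struct {
	SourceLabels []string
	TargetLabel  string
	Regex        *regexp.Regexp // compiled regex has no exported fields
	Replacement  string
}

// hashRule derives a hash from the JSON form of the rule. json.Marshal renders
// *regexp.Regexp as "{}", so the pattern never reaches the hash.
func hashRule(r relabelRule) uint64 {
	b, _ := json.Marshal(r)
	h := fnv.New64a()
	h.Write(b)
	return h.Sum64()
}

func main() {
	a := relabelRule{
		SourceLabels: []string{"__address__"},
		TargetLabel:  "__address__",
		Regex:        regexp.MustCompile("regex1"),
		Replacement:  "$1:10249",
	}
	b := a
	b.Regex = regexp.MustCompile("regex2") // only the regex differs

	// Both hashes are identical, so a change detector comparing them would
	// conclude the scrape config did not change.
	fmt.Println(hashRule(a) == hashRule(b)) // true
}
```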
Steps to Reproduce
With the OpenTelemetry Operator's target allocator component enabled, update only the regex field of the scrape job to a different value. The Prometheus receiver does not receive the updated regex in the metric relabel config.
Expected Result
The new config should contain the updated regex for metric relabeling.
Actual Result
The new regex is not picked up by the prometheus metrics receiver.
Collector version
v0.85.0
Environment information
Environment
OS: (e.g., "Ubuntu 20.04")
Compiler (if manually compiled): (e.g., "go 14.2")
OpenTelemetry Collector configuration
Configuration applied to the Collector through the OpenTelemetry Operator. Original scrape config:
```yaml
scrape_configs:
  - job_name: kube-proxy
    scrape_interval: 30s
    label_limit: 63
    label_name_length_limit: 511
    label_value_length_limit: 1023
    kubernetes_sd_configs:
      - role: pod
    relabel_configs:
      - action: keep
        source_labels:
          - __meta_kubernetes_namespace
          - __meta_kubernetes_pod_name
        separator: "/"
        regex: kube-system/kube-proxy.+
      - source_labels:
          - __address__
        action: replace
        target_label: __address__
        regex: "regex1"
        replacement: "$$1:10249"
```
Updated scrape config (only the regex changed, from "regex1" to "regex2"):

```yaml
scrape_configs:
  - job_name: kube-proxy
    scrape_interval: 30s
    label_limit: 63
    label_name_length_limit: 511
    label_value_length_limit: 1023
    kubernetes_sd_configs:
      - role: pod
    relabel_configs:
      - action: keep
        source_labels:
          - __meta_kubernetes_namespace
          - __meta_kubernetes_pod_name
        separator: "/"
        regex: kube-system/kube-proxy.+
      - source_labels:
          - __address__
        action: replace
        target_label: __address__
        regex: "regex2"
        replacement: "$$1:10249"
```
Log output
No response
Additional context
This is related to this issue; the same fix needs to be made here.
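Not part of the original report: a hedged sketch of one possible shape of a fix, where the regex pattern text is fed into the hash explicitly so that a pattern-only change produces a different hash. The hashPattern helper below is hypothetical, not the actual patch.

```go
// Hypothetical sketch: hash the regex *pattern text* (not the compiled value)
// together with the other relabel fields, so regex-only changes are detected.
package main

import (
	"fmt"
	"hash/fnv"
	"regexp"
)

// hashPattern feeds the regex pattern string into the hash along with any
// other relabel fields passed as strings.
func hashPattern(re *regexp.Regexp, rest ...string) uint64 {
	h := fnv.New64a()
	h.Write([]byte(re.String()))
	for _, s := range rest {
		h.Write([]byte(s))
	}
	return h.Sum64()
}

func main() {
	a := hashPattern(regexp.MustCompile("regex1"), "__address__", "$1:10249")
	b := hashPattern(regexp.MustCompile("regex2"), "__address__", "$1:10249")
	fmt.Println(a == b) // false: the regex change is now detected
}
```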