#### Component(s)

exporter/awsxray
#### What happened?

**Description**

When configured with the AWS X-Ray exporter, the OTEL Collector is receiving the following error from AWS X-Ray:

```
error internal/base_exporter.go:153 Exporting failed. Rejecting data. {"kind": "exporter", "data_type": "traces", "name": "awsxray", "error": "Permanent error: : \n\tstatus code: 408, request id: ", "rejected_items": 512}
```
#### Steps to Reproduce

Run the collector with the following configuration:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  memory_limiter:
    check_interval: 1s
    limit_percentage: 75
    spike_limit_percentage: 15
  batch:
    send_batch_max_size: 50
    send_batch_size: 50

exporters:
  awsxray:
    region: us-east-1

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [awsxray]
```
OTEL Collector version: otelcol-contrib 0.111.0. The collector is deployed in a pod with 1 GB RAM and 500m CPU; there are 2 replicas.
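To drive traces through the pipeline above without a real application, one option is to post a single span to the collector's OTLP/HTTP endpoint by hand. This is a minimal stdlib-only sketch; the JSON field names follow the OTLP JSON encoding, the endpoint matches the receiver config above, and the service/span names are made up for the repro:

```python
# Minimal OTLP/HTTP JSON trace request for reproducing the issue by hand.
# Builds one span and (optionally) POSTs it to the collector's :4318 endpoint.
import json
import os
import time
import urllib.request


def build_payload() -> dict:
    """Return a one-span OTLP JSON payload (trace id: 16 random bytes, span id: 8)."""
    now = time.time_ns()
    return {
        "resourceSpans": [{
            "resource": {"attributes": [{
                "key": "service.name",
                "value": {"stringValue": "repro-service"},  # hypothetical name
            }]},
            "scopeSpans": [{
                "spans": [{
                    "traceId": os.urandom(16).hex(),
                    "spanId": os.urandom(8).hex(),
                    "name": "repro-span",
                    "kind": 2,  # SPAN_KIND_SERVER
                    "startTimeUnixNano": str(now - 1_000_000),
                    "endTimeUnixNano": str(now),
                }],
            }],
        }],
    }


def send(endpoint: str = "http://localhost:4318/v1/traces") -> None:
    """POST the payload to the collector; raises on a non-2xx response."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(build_payload()).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


# With a collector listening on :4318, uncomment to send one span:
# send()
```

Repeating the call at a steady rate should fill batches of 50 and surface the 408 once the exporter starts flushing to X-Ray.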
#### Expected Result

No errors should occur, and all traces should be visible in AWS X-Ray.
#### Actual Result

```
warn [email protected]/batch_processor.go:279 Sender failed {"kind": "processor", "name": "batch", "pipeline": "traces", "error": "Permanent error: : \n\tstatus code: 408, request id: "}
```
#### Collector version

otelcol-contrib version 0.111.0
#### Environment information

**Environment**

- OTEL Contrib binary 0.111.0 downloaded from GitHub
- OS: Rocky Linux (`uname -a`: 6.1.79-99.167.amzn2023.x86_64)
#### OpenTelemetry Collector configuration

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  memory_limiter:
    check_interval: 1s
    limit_percentage: 75
    spike_limit_percentage: 15
  batch:
    send_batch_max_size: 50
    send_batch_size: 50

exporters:
  awsxray:
    region: us-east-1

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [awsxray]
```
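Since HTTP 408 means the X-Ray API timed out the request, one thing worth trying is raising the request timeout on the exporter. This is a sketch, assuming the `awsxray` exporter exposes the shared AWS session settings (`request_timeout_seconds`, `max_retries`) that several AWS components in contrib use; verify the exact option names against the awsxray exporter README for v0.111.0:

```yaml
exporters:
  awsxray:
    region: us-east-1
    # Assumption: these keys come from the shared AWS session settings;
    # confirm they are supported by the awsxray exporter in v0.111.0.
    request_timeout_seconds: 30
    max_retries: 5
```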
#### Log output

```
2024-11-12T20:59:56.392Z error internal/base_exporter.go:153 Exporting failed. Rejecting data. {"kind": "exporter", "data_type": "traces", "name": "awsxray", "error": "Permanent error: : \n\tstatus code: 408, request id: ", "rejected_items": 50}
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*BaseExporter).Send
	go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/base_exporter.go:153
go.opentelemetry.io/collector/exporter/exporterhelper.NewTracesRequestExporter.func1
	go.opentelemetry.io/collector/[email protected]/exporterhelper/traces.go:136
go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
	go.opentelemetry.io/collector/[email protected]/traces.go:26
go.opentelemetry.io/collector/processor/batchprocessor.(*batchTraces).export
	go.opentelemetry.io/collector/processor/[email protected]/batch_processor.go:434
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/[email protected]/batch_processor.go:277
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).processItem
	go.opentelemetry.io/collector/processor/[email protected]/batch_processor.go:249
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).startLoop
	go.opentelemetry.io/collector/processor/[email protected]/batch_processor.go:234
2024-11-12T20:59:56.392Z warn [email protected]/batch_processor.go:279 Sender failed {"kind": "processor", "name": "batch", "pipeline": "traces", "error": "Permanent error: : \n\tstatus code: 408, request id: "}
```
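Note the "Permanent error" wrapper in the log: when an exporter classifies a failure as permanent, the exporterhelper drops the batch outright (hence "Rejecting data" with `rejected_items`) instead of retrying it. A stdlib sketch of that retry-vs-permanent decision, with names and the retryable-status set chosen for illustration only (not the collector's actual code):

```python
# Illustrative sketch of the retry-vs-permanent decision made when an
# export fails. The set of retryable codes is an assumption for this demo.
RETRYABLE_STATUS_CODES = {429, 500, 502, 503, 504}


def classify_export_error(status_code: int) -> str:
    """Return 'retry' for transient HTTP errors, 'permanent' otherwise.

    A 'permanent' classification causes the whole batch to be rejected
    immediately ("Exporting failed. Rejecting data.") rather than retried.
    """
    return "retry" if status_code in RETRYABLE_STATUS_CODES else "permanent"


# A 408 (Request Timeout) is arguably transient, but if the exporter wraps
# it as permanent, all spans in the batch are dropped rather than retried.
print(classify_export_error(408))  # -> permanent
print(classify_export_error(503))  # -> retry
```

This is why every 408 here costs a full batch (50 or 512 spans) with no retry attempt.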
#### Additional context

No response