
[exporterhelper] An exporter hangs when logs are bigger than sending_queue.queue_size #12928

Closed
@at-ishikawa

Description

Component(s)

exporter/exporterhelper

What happened?

Describe the bug

An exporter stops working if it is built on exporterhelper with block_on_overflow enabled and sizer set to bytes.
If the logs are grouped into a single chunk before reaching the exporter, and the byte size of that chunk is bigger than sending_queue.queue_size, the exporter no longer handles any incoming telemetry.

I confirmed this happens on a log exporter, but it may happen for other telemetry types as well.
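
For context, the serialized size of such a chunk can be checked with the pdata API. The sketch below is my own illustration (not code from the collector): it builds a batch of single-word records like the ones the filelog receiver produces in the reproduction, and compares its protobuf-encoded size against queue_size=1000, assuming the bytes sizer counts the serialized request size.

package main

import (
	"fmt"

	"go.opentelemetry.io/collector/pdata/plog"
)

func main() {
	const queueSize = 1000 // sending_queue.queue_size in bytes, as in the config below

	// Build a batch of 19 single-word records, roughly mimicking the
	// "text" lines the filelog receiver picks up in the reproduction.
	logs := plog.NewLogs()
	scope := logs.ResourceLogs().AppendEmpty().ScopeLogs().AppendEmpty()
	for i := 0; i < 19; i++ {
		rec := scope.LogRecords().AppendEmpty()
		rec.Body().SetStr("text")
		rec.Attributes().PutStr("log.file.name", "sample.txt")
	}

	// Protobuf-encoded size of the whole batch (assumed to be what the
	// bytes sizer measures, based on the sizes reported further below).
	sizer := &plog.ProtoMarshaler{}
	size := sizer.LogsSize(logs)
	fmt.Printf("records=%d bytes=%d exceeds queue_size=%v\n",
		logs.LogRecordCount(), size, size > queueSize)
}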

Steps to reproduce

Please see the configuration attached under OpenTelemetry Collector configuration below.
In addition, I set up a second collector to receive requests from that collector (i.e. to act as the receiver):

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:24317

exporters:
  debug:
    verbosity: detailed

service:
  pipelines:
    logs:
      receivers:
        - otlp
      exporters:
        - debug
  telemetry:
    logs:
      level: debug
    metrics:
      readers:
        - pull:
            exporter:
              prometheus:
                host: '0.0.0.0'
                port: 18888

Then, after running both collectors, I write lines into files and check whether the receiving collector outputs the logs via its debug exporter.

First, confirm that the above setup works correctly.

for i in (seq 1); echo 'text' >> filelog/sample2.txt; end

Then I was able to see the debug exporter's output on the receiving collector:

2025-04-25T17:28:35.163-0700    info    Logs    {"resource logs": 1, "log records": 1}
2025-04-25T17:28:35.163-0700    info    ResourceLog #0
Resource SchemaURL:
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope
LogRecord #0
ObservedTimestamp: 2025-04-26 00:28:35.060991139 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText:
SeverityNumber: Unspecified(0)
Body: Str(text)
Attributes:
     -> log.file.name: Str(sample2.txt)
Trace ID:
Span ID:
Flags: 0

Then add many lines to the files that the filelog receiver reads.

for i in (seq 19); echo 'text' >> filelog/sample.txt; end

After that, I couldn't see any output from the debug exporter.
And when I write a new line into the file, that line is not handled by the exporter either:

for i in (seq 1); echo 'text' >> filelog/sample.txt; end

What did you expect to see?

The exporter splits the data into separate batches and keeps sending telemetry.

What did you see instead?

The exporter stops working, and no new telemetry is handled.

Collector version

v0.124.0

Environment information

Environment

OS: Ubuntu 24.04
Compiler(if manually compiled): go version go1.24.1
Collector builder version: ocb version v0.124.0

OpenTelemetry Collector configuration

receivers:
  filelog:
    include: [ "filelog/*" ]

exporters:
  otlp:
    endpoint: localhost:24317
    tls:
      insecure: true
    sending_queue:
      enabled: true
      block_on_overflow: true
      sizer: bytes
      queue_size: 1000
      batch:
        flush_timeout: 1s
        max_size: 200
    timeout: 1s

service:
  pipelines:
    logs:
      receivers:
        - filelog
      exporters:
        - otlp
  telemetry:
    metrics:
      readers:
        - pull:
            exporter:
              prometheus:
                host: '0.0.0.0'
                port: 8888
    logs:
      level: debug

Log output

Additional context

For the reproduction step, I decided to write 19 lines.
When I checked the log byte sizes with sending_queue.batch.max_size=200 on another exporter, the sizes were as follows:

# 2025-04-25T17:14:19.196-0700    debug   customexporter/logs.go:78 logExporter.pushLogs    {"count": 1, "bytes": 62}
# 2025-04-25T17:14:44.196-0700    debug   customexporter/logs.go:78 logExporter.pushLogs    {"count": 2, "bytes": 116}
# 2025-04-25T17:15:12.196-0700    debug   customexporter/logs.go:78 logExporter.pushLogs    {"count": 3, "bytes": 172}
# 4 items with max_size=200
# 2025-04-25T17:16:28.196-0700    debug   customexporter/logs.go:78 logExporter.pushLogs    {"count": 3, "bytes": 172}
# 2025-04-25T17:16:28.197-0700    debug   customexporter/logs.go:78 logExporter.pushLogs    {"count": 1, "bytes": 62}

I thought 18 lines would be sufficient to make it stop, since their total is bigger than 1000 bytes, but it wasn't; it only stopped after 19 lines for some reason.
Note that count is the value of plog.Logs.LogRecordCount() and bytes is the serialized byte size of the logs (plog's ProtoMarshaler LogsSize) measured in an exporter.
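
For reference, this is roughly how those numbers can be obtained inside a custom exporter's push function. The customexporter above is my own test exporter; the type, field, and function names here are illustrative, not the actual implementation.

package customexporter

import (
	"context"

	"go.opentelemetry.io/collector/pdata/plog"
	"go.uber.org/zap"
)

type logExporter struct {
	logger *zap.Logger
	sizer  plog.ProtoMarshaler
}

// pushLogs logs the record count and the serialized byte size of each
// incoming batch, producing the "count"/"bytes" values quoted above.
func (e *logExporter) pushLogs(_ context.Context, ld plog.Logs) error {
	e.logger.Debug("logExporter.pushLogs",
		zap.Int("count", ld.LogRecordCount()),
		zap.Int("bytes", e.sizer.LogsSize(ld)),
	)
	return nil
}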
