[PR] Support raw_json Encoding for Kafka Exporter (Azure Event Hub Compatibility) #38495

@capiisco

Description

Component(s)

exporter/kafka

Is your feature request related to a problem? Please describe.

Context:

Currently, there is no dedicated OpenTelemetry exporter for Azure Event Hub. To send logs to Azure Event Hub, we must use the Kafka exporter (kafkaexporter) over Event Hub's Kafka-compatible endpoint.

However, when using encoding: raw, the native log structure is not preserved: only the log body is written to the Kafka message, so resource attributes and any attribute transformations applied by processors are lost. This prevents us from leveraging the transformed log format when sending logs to Event Hub.
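To illustrate the problem, here is a hypothetical sketch (not the exporter's actual Go code; the record layout and field names are assumptions for illustration) of what the raw encoding emits today versus what a structure-preserving JSON encoding would emit:

```python
import json

# Hypothetical log record as the collector might hold it after
# transform processors have run (field names are illustrative).
record = {
    "body": "user login failed",
    "attributes": {"http.status_code": 401, "env": "prod"},
    "resource": {"service.name": "auth-service"},
}

# encoding: raw — only the body becomes the Kafka message value;
# attributes and resource attributes are dropped.
raw_payload = record["body"].encode("utf-8")

# Proposed encoding: raw_json — the full structure, including
# transformed attributes, is preserved as one JSON document.
raw_json_payload = json.dumps(record).encode("utf-8")
```

With raw, a consumer on the Event Hub side sees only the string "user login failed"; with raw_json it would receive a parseable JSON document carrying the attributes as well.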

Describe the solution you'd like

Proposed solution:

  • Introduce a new encoding format: raw_json.
  • Ensure raw_json preserves the native log structure while incorporating attribute transformations.
  • Validate that logs sent in raw_json format are correctly structured for Event Hub ingestion.
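The steps above might be sketched as a per-record marshaling helper plus a round-trip validation (hypothetical Python; the actual kafkaexporter is written in Go, and the helper name and record shape are assumptions):

```python
import json

def marshal_raw_json(record: dict) -> bytes:
    """Serialize a log record, including resource and transformed
    attributes, as a single compact JSON document (hypothetical)."""
    return json.dumps(record, separators=(",", ":")).encode("utf-8")

record = {
    "resource": {"service.name": "auth-service"},
    "attributes": {"env": "prod"},
    "body": "user login failed",
}
payload = marshal_raw_json(record)

# Round-trip check: the payload must parse back as structured JSON
# so Event Hub consumers can read attributes, not just the body.
parsed = json.loads(payload)
assert parsed["attributes"]["env"] == "prod"
```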

Example Configuration
exporters:
  kafka:
    brokers: ["{{ az_namespace }}.servicebus.windows.net:9093"]
    topic: "{{ az_eventhubs_name }}" # Event Hub name
    protocol_version: "2.0.0"
    encoding: raw_json
    auth:
      tls:
        insecure: true
      sasl:
        mechanism: PLAIN
        username: "$$ConnectionString" # Use this for Kafka with Event Hubs
        password: "Endpoint=sb://{{ az_namespace }}.servicebus.windows.net/;SharedAccessKeyName={{ az_authorization_rule }};SharedAccessKey={{ az_sharedaccesskeyname }}"

Impact & Benefits

  • Ensures logs retain their full structure when using raw_json.
  • Preserves attribute transformations applied by processors before sending logs.
  • Enhances compatibility with Azure Event Hub ingestion.

Describe alternatives you've considered

No response

Additional context

No response
