Caution
Grafana Agent reached End-of-Life (EOL) on November 1, 2025. It no longer receives vendor support, security fixes, or bug fixes. Current users of Agent Static mode, Agent Flow mode, and Agent Operator should migrate to Grafana Alloy. If you have already migrated to Alloy, no further action is required. Read more about why we recommend migrating to Grafana Alloy.
loki.source.kafka
loki.source.kafka reads messages from Kafka using a consumer group
and forwards them to other loki.* components.
The component starts a new Kafka consumer group for the given arguments
and fans out incoming entries to the list of receivers in forward_to.
Before using loki.source.kafka, Kafka should have at least one producer
writing events to at least one topic. Follow the steps in the
Kafka Quick Start
to get started with Kafka.
Multiple loki.source.kafka components can be specified by giving them
different labels.
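For example, the following sketch runs two independent consumers for different topics; the labels, broker address, and topic names are illustrative, and a loki.write.local component is assumed to be defined elsewhere:

loki.source.kafka "orders" {
  brokers    = ["localhost:9092"]
  topics     = ["orders"]
  forward_to = [loki.write.local.receiver]
}

loki.source.kafka "payments" {
  brokers    = ["localhost:9092"]
  topics     = ["payments"]
  forward_to = [loki.write.local.receiver]
}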
Usage
loki.source.kafka "LABEL" {
brokers = BROKER_LIST
topics = TOPIC_LIST
forward_to = RECEIVER_LIST
}Arguments
loki.source.kafka supports the following arguments:
The assignor argument can be set to "range", "roundrobin", or "sticky".
Labels from the labels argument are applied to every message that the component reads.
The relabel_rules field can make use of the rules export value from a
loki.relabel component to apply one or more relabeling rules to log entries
before they’re forwarded to the list of receivers in forward_to.
In addition to custom labels, the following internal labels prefixed with __ are available:
- __meta_kafka_message_key
- __meta_kafka_message_offset
- __meta_kafka_topic
- __meta_kafka_partition
- __meta_kafka_member_id
- __meta_kafka_group_id
All labels starting with __ are removed prior to forwarding log entries. To
keep these labels, relabel them using a loki.relabel component and pass its
rules export to the relabel_rules argument.
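As a sketch, a component that sets a custom assignor, applies a static label, uses the Kafka message timestamp, and keeps internal labels through relabel rules might look like this; the loki.relabel.kafka and loki.write.local components are assumed to be defined elsewhere:

loki.source.kafka "example" {
  brokers                = ["localhost:9092"]
  topics                 = ["events"]
  assignor               = "roundrobin"
  use_incoming_timestamp = true
  labels                 = {component = "loki.source.kafka"}
  relabel_rules          = loki.relabel.kafka.rules
  forward_to             = [loki.write.local.receiver]
}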
Blocks
The following blocks are supported inside the definition of loki.source.kafka:
authentication block
The authentication block defines the authentication method when communicating with the Kafka event brokers.
type supports the values "none", "ssl", and "sasl". If "ssl" is used,
you must set the tls_config block. If "sasl" is used, you must set the sasl_config block.
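For example, a TLS connection to the brokers could be sketched as follows; the nesting of tls_config inside authentication and the certificate path are illustrative:

loki.source.kafka "secure" {
  brokers    = ["broker-1:9093"]
  topics     = ["quickstart-events"]
  forward_to = [loki.write.local.receiver]

  authentication {
    type = "ssl"

    tls_config {
      ca_file = "/etc/kafka/ca.crt"
    }
  }
}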
tls_config block
The following pairs of arguments are mutually exclusive and can’t both be set simultaneously:
- ca_pem and ca_file
- cert_pem and cert_file
- key_pem and key_file
When configuring client authentication, both the client certificate (using
cert_pem or cert_file) and the client key (using key_pem or key_file)
must be provided.
When min_version is not provided, the minimum acceptable TLS version is
inherited from Go’s default minimum version, TLS 1.2. If min_version is
provided, it must be set to one of the following strings:
"TLS10"(TLS 1.0)"TLS11"(TLS 1.1)"TLS12"(TLS 1.2)"TLS13"(TLS 1.3)
sasl_config block
The sasl_config block defines the SASL authentication configuration to use when connecting to the Kafka brokers.
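As a hedged sketch, a SASL/PLAIN setup might look as follows; the mechanism, user, and password attribute names are assumptions based on other Grafana Kafka SASL configurations and should be checked against the arguments listed for your version:

authentication {
  type = "sasl"

  sasl_config {
    // Assumed attribute names; verify against the sasl_config arguments for your version.
    mechanism = "PLAIN"
    user      = "kafka-user"
    password  = "kafka-password"
  }
}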
oauth_config block
The oauth_config block is required when the SASL mechanism is set to OAUTHBEARER.
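For an OAUTHBEARER setup, the blocks might be combined roughly as follows; the token_provider and scopes attribute names and values are assumptions and should be checked against the oauth_config arguments for your version:

authentication {
  type = "sasl"

  sasl_config {
    mechanism = "OAUTHBEARER"

    oauth_config {
      // Assumed attribute names and values.
      token_provider = "azure"
      scopes         = ["my-scope"]
    }
  }
}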
Exported fields
loki.source.kafka does not export any fields.
Component health
loki.source.kafka is only reported as unhealthy if given an invalid
configuration.
Debug information
loki.source.kafka does not expose additional debug info.
Example
This example consumes Kafka events from the specified brokers and topics, then relabels them and forwards them to a loki.write component, using the Kafka message timestamp for each entry.
loki.source.kafka "local" {
brokers = ["localhost:9092"]
topics = ["quickstart-events"]
labels = {component = "loki.source.kafka"}
forward_to = [loki.relabel.kafka.receiver]
use_incoming_timestamp = true
relabel_rules = loki.relabel.kafka.rules
}
loki.relabel "kafka" {
forward_to = [loki.write.local.receiver]
rule {
source_labels = ["__meta_kafka_topic"]
target_label = "topic"
}
}
loki.write "local" {
endpoint {
url = "loki:3100/api/v1/push"
}
}Compatible components
loki.source.kafka can accept arguments from the following components:
- Components that export Loki LogsReceiver
Note
Connecting some components may not be sensible or components may require further configuration to make the connection work correctly. Refer to the linked documentation for more details.



