Overview

Use Apache Kafka as a destination in Mage to stream records from your pipeline into Kafka topics in real time. Each record is serialized as a JSON object and published to the brokers listed in bootstrap_server.

If a Key Property is defined in the source schema, Mage uses the corresponding field value as the Kafka message key. This enables partitioned message routing and supports downstream systems that rely on message keys.
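Under the hood this maps onto an ordinary Kafka producer. As a rough sketch of the behavior (not Mage's actual implementation), here is how JSON serialization and keyed sends look with the kafka-python client; the broker addresses, topic, and user_id key property are placeholders:

```python
import json

from kafka import KafkaProducer  # pip install kafka-python

# Brokers and topic mirror the example configuration below; the
# "user_id" key property is a hypothetical placeholder.
producer = KafkaProducer(
    bootstrap_servers=["broker1.kafka.internal:9092", "broker2.kafka.internal:9092"],
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
    key_serializer=lambda key: key.encode("utf-8") if key is not None else None,
)

record = {"user_id": "42", "event": "signup", "ts": "2024-01-01T00:00:00Z"}

# With user_id as the key property, its value becomes the message key,
# so every record for the same user routes to the same partition.
producer.send("mage_data_stream", key=record["user_id"], value=record)
producer.flush()
```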

This is ideal for:

  • Real-time data streaming
  • Event-driven microservices
  • Kafka-based data pipelines and messaging queues

Configuration Parameters

| Key | Description | Default | Required |
| --- | --- | --- | --- |
| bootstrap_server | Comma-separated list of Kafka brokers with ports (e.g., kafka:9093 or broker1:9092,broker2:9092). | None | ✓ |
| topic | Kafka topic to publish messages to. | None | ✓ |
| api_version | Kafka API version to use. If unspecified, Mage auto-detects the broker version. | None | |

Notes

  • Records are serialized as JSON using the kafka-python client.
  • To set Kafka message keys, define a Key Property in your source schema.
  • You can explicitly set api_version to match your Kafka broker version if auto-detection is unreliable; see the sketch after this list.
  • Compatible with self-hosted Kafka, Confluent Cloud, Amazon MSK, and other Kafka-compatible platforms.
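For reference, kafka-python accepts the API version as a tuple; presumably the api_version setting is forwarded to this client parameter (an assumption about Mage's internals):

```python
from kafka import KafkaProducer

# When api_version is omitted, the client probes the broker at connect
# time to negotiate a version; pinning it skips that probe.
producer = KafkaProducer(
    bootstrap_servers=["broker1.kafka.internal:9092"],  # placeholder broker
    api_version=(0, 10, 1),  # matches the example configuration below
)
```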

Example Configuration

```yaml
bootstrap_server: broker1.kafka.internal:9092,broker2.kafka.internal:9092
topic: mage_data_stream
api_version: (0, 10, 1)
```
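To confirm that records published by the destination are arriving, a quick consumer-side check with kafka-python, reusing the brokers and topic from the example above (the consumer settings are illustrative):

```python
import json

from kafka import KafkaConsumer

# Read back a few messages from the destination topic to verify output.
consumer = KafkaConsumer(
    "mage_data_stream",
    bootstrap_servers=["broker1.kafka.internal:9092", "broker2.kafka.internal:9092"],
    auto_offset_reset="earliest",  # start from the beginning of the topic
    consumer_timeout_ms=5000,      # stop iterating after 5s of silence
    key_deserializer=lambda k: k.decode("utf-8") if k else None,
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    print(message.key, message.value)
```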