Build pipelines that ingest data from event streaming sources like Kafka.
First, set up a local Kafka cluster using the `kafka-docker` repo:

```sh
git clone https://github.com/wurstmeister/kafka-docker.git
cd kafka-docker
```

Edit the `docker-compose.yml` file to match this:
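A minimal sketch of the file, assuming the standard wurstmeister two-listener layout (an `INSIDE` listener on port 9093 for other containers and an `OUTSIDE` listener on port 9092 for the host, matching the addresses used later in this tutorial):

```yaml
version: "3"
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"   # OUTSIDE listener, reachable from the host
    expose:
      - "9093"        # INSIDE listener, reachable from other containers
    environment:
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9093,OUTSIDE://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9093,OUTSIDE://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
```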
Start the containers with `docker-compose up`. If you see `command not found: docker-compose`, try running the following command instead:
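Newer Docker installations ship Compose as a plugin, invoked without the hyphen:

```sh
docker compose up
```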
Create a topic named `test`:
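One way to do this is to shell into the Kafka container and use the bundled CLI scripts (the container name below is an assumption; check `docker ps` for yours):

```sh
docker exec -it kafka-docker_kafka_1 bash
# inside the container ($KAFKA_HOME is set by the wurstmeister image):
$KAFKA_HOME/bin/kafka-topics.sh --create --topic test \
  --partitions 1 --replication-factor 1 \
  --bootstrap-server localhost:9092
```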
Start a console producer on the topic `test`:
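Still inside the container:

```sh
$KAFKA_HOME/bin/kafka-console-producer.sh --topic test \
  --bootstrap-server localhost:9092
```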
Send a few messages by typing the following JSON strings in the terminal (note that Kafka messages in Mage are assumed to be in JSON format):
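Any valid JSON objects will do; the fields here are placeholders:

```json
{"event": "signup", "user_id": 1}
{"event": "click", "user_id": 2}
```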
To verify the messages, start a console consumer on the topic `test` in a separate terminal:
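For example, from a second shell into the Kafka container:

```sh
$KAFKA_HOME/bin/kafka-console-consumer.sh --topic test --from-beginning \
  --bootstrap-server localhost:9092
```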
If the Docker network `kafka-docker_default` doesn't exist, create a new network:
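Compose derives this default network name from the `kafka-docker` directory:

```sh
docker network create kafka-docker_default
```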
Next, set up Mage:

```sh
git clone https://github.com/mage-ai/mage-ai.git
cd mage-ai
```

Edit the `docker-compose.yml` file to match this:
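The substantive change is attaching Mage's service to the `kafka-docker_default` network so it can reach the broker at `kafka:9093`. A sketch of the relevant additions (the service name `server` is an assumption; keep whatever the repo's file defines):

```yaml
services:
  server:
    # ...existing Mage service definition from the repo...
    networks:
      - kafka-docker_default

networks:
  kafka-docker_default:
    external: true   # reuse the network created by kafka-docker
```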
Then start Mage by running:

```sh
./scripts/dev.sh
```

Note that this script uses `docker compose` instead of `docker run`.
In the Mage UI, click `+ New pipeline`, then select `Streaming`.
For the source, select `Kafka`, and paste the following:
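A sketch of the source config, following the field names Mage's Kafka connector documents (`consumer_group` can be any unique string; double-check against the template Mage generates for your version):

```yaml
connector_type: kafka
bootstrap_server: "localhost:9092"
topic: test
consumer_group: unique_consumer_group
```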
Make sure `bootstrap_server` is set to `localhost:9092`. If you're running Mage in a Docker container, the `bootstrap_server` should be `kafka:9093` instead.

For the destination, select `OpenSearch` and paste the following:
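A sketch with placeholder values for the endpoint and index:

```yaml
connector_type: opensearch
host: https://search-my-domain.us-west-2.es.amazonaws.com  # placeholder endpoint
index_name: test_index                                     # placeholder index
```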
Update `host` to match your OpenSearch domain's endpoint, and `index_name` to match the index you want to export data into.

Click `Execute pipeline` to test the pipeline.
You should see the messages you publish to the topic appear in the pipeline's output.
You can also publish messages programmatically with the `kafka-python` library:
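A minimal producer sketch (install the library with `pip install kafka-python`; the payload fields are placeholders):

```python
import json
import time

from kafka import KafkaProducer

# serialize dicts to the JSON bytes the Mage pipeline expects
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for i in range(5):
    producer.send("test", {"event": "test", "value": i})
    time.sleep(1)

producer.flush()
```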
Finally, click the `Start trigger` button at the top of the page to make the streaming pipeline active.