Example: Bulk indexing from a Kafka topic

This example demonstrates using the BulkIndexer component to ingest data consumed from a Kafka topic.
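Conceptually, the indexer drains messages from the topic and batches them into Elasticsearch's newline-delimited `_bulk` request format, flushing once a threshold is reached. A minimal stdlib-only sketch of that framing (the function and index name below are illustrative, not the example's actual code):

```go
package main

import (
	"bytes"
	"fmt"
)

// buildBulkBody frames a batch of JSON documents as an Elasticsearch
// _bulk request body: an action metadata line, then the document source,
// each terminated by a newline.
func buildBulkBody(index string, docs []string) []byte {
	var buf bytes.Buffer
	for _, doc := range docs {
		fmt.Fprintf(&buf, `{"index":{"_index":%q}}`+"\n", index)
		buf.WriteString(doc)
		buf.WriteByte('\n')
	}
	return buf.Bytes()
}

func main() {
	body := buildBulkBody("stocks", []string{
		`{"symbol":"IMN","price":123}`,
		`{"symbol":"WLD","price":456}`,
	})
	fmt.Print(string(body))
}
```

In the example itself this batching is handled by the BulkIndexer component, which also manages worker goroutines, flush thresholds, and error callbacks.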

The provided docker-compose.yml file launches a realistic environment with Zookeeper, Kafka, Confluent Control Center, Elasticsearch, and Kibana, and allows you to inspect data flows and indexer metrics, and to see the ingested data in a dashboard.
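An abbreviated sketch of the shape of such a Compose file (the image versions and settings here are placeholders; see the actual docker-compose.yml for the real configuration):

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper          # version tag omitted here
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka
    depends_on: [zookeeper]
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch
    environment:
      discovery.type: single-node
  kibana:
    image: docker.elastic.co/kibana/kibana
    depends_on: [elasticsearch]
```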


First, launch the environment and wait until it's ready:

```shell
make setup
```

Then, launch the Kafka producers and consumers and the Elasticsearch indexer:

```shell
make run
```

Open the Kibana dashboard to see the results, the Kibana APM application to see the indexer metrics, and Confluent Control Center to inspect the Kafka cluster, including topic details and consumer performance.

See `producer/producer.go` for the Kafka producer, `consumer/consumer.go` for the Kafka consumer, and `kafka.go` for the main workflow. The default configuration launches one producer, four consumers, and one indexer, and sends 1,000 messages per second; run `go run kafka.go --help` to change the defaults.