This document explains how to run the Kafka integration tests that verify end-to-end functionality of the Kafka exporter.

## Prerequisites

- Docker and Docker Compose installed
- Rust toolchain with cargo
- The `integration-tests` feature enabled
## Quick Start

1. Start the Kafka test environment:

   ```bash
   ./scripts/kafka-test-env.sh start
   ```

2. Run the integration tests (the OpenSSL variables are needed on macOS with a Homebrew-installed OpenSSL):

   ```bash
   export OPENSSL_DIR=$(brew --prefix openssl@3)
   export OPENSSL_ROOT_DIR=$(brew --prefix openssl@3)
   export PKG_CONFIG_PATH="$(brew --prefix openssl@3)/lib/pkgconfig:$PKG_CONFIG_PATH"
   cargo test --test kafka_integration_tests --features integration-tests
   ```

3. Stop the Kafka test environment:

   ```bash
   ./scripts/kafka-test-env.sh stop
   ```
## Test Environment

The test environment includes:

- Kafka broker on `localhost:9092`
- Zookeeper on `localhost:2181`
- Pre-created topics: `otlp_traces`, `otlp_metrics`, `otlp_logs`
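An environment like this is typically defined with a Docker Compose file along the following lines. This is an illustrative sketch only, not the repository's actual file — the image tags and broker settings are assumptions; the container name `rotel-test-kafka` matches the `docker exec` command used later in this document:

```yaml
# Illustrative sketch of a Kafka + Zookeeper test environment.
# See the repository's actual compose file for the real definition.
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    ports:
      - "2181:2181"

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    container_name: rotel-test-kafka
    depends_on:
      - zookeeper
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
    ports:
      - "9092:9092"
```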
## Test Scenarios

The integration tests verify:

**JSON trace export**

- Sends trace data to Kafka in JSON format
- Verifies the message is received and contains valid JSON
- Validates trace structure (resource, scope_spans)

**Protobuf metrics export**

- Sends metrics data in Protobuf format
- Verifies the binary message is received
- Confirms a non-empty protobuf payload

**Compressed log export**

- Tests log export with gzip compression
- Verifies compressed messages are handled correctly
- Validates log structure after decompression

**Multiple telemetry types**

- Sends traces, metrics, and logs simultaneously
- Verifies all telemetry types reach their respective topics
- Tests concurrent operation
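The JSON and gzip checks above can be sketched in a few lines of Python. This is illustrative only — the actual tests are Rust code running against the live broker, and the exact JSON field names here are assumptions based on the checks listed above:

```python
import gzip
import json

# A minimal trace-like payload with the fields the tests check for
# (field names are illustrative, not the exact OTLP schema).
payload = json.dumps({
    "resource": {"attributes": []},
    "scope_spans": [{"scope": {"name": "demo"}, "spans": []}],
}).encode()

# Gzip round-trip, mirroring the compressed-log scenario: the message
# must decompress back into valid JSON.
compressed = gzip.compress(payload)
decoded = json.loads(gzip.decompress(compressed))

# Structure checks, mirroring the trace-structure validation.
assert "resource" in decoded and "scope_spans" in decoded
print("payload valid")
```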
## Management Script

Use the management script for common operations:

```bash
# Start Kafka environment
./scripts/kafka-test-env.sh start

# Check status
./scripts/kafka-test-env.sh status

# Consume messages from a topic (for debugging)
./scripts/kafka-test-env.sh consume otlp_traces

# View Kafka logs
./scripts/kafka-test-env.sh logs

# Stop environment
./scripts/kafka-test-env.sh stop
```

## Manual Testing

You can also manually test the Kafka exporter:
1. Start Kafka:

   ```bash
   ./scripts/kafka-test-env.sh start
   ```

2. Run Rotel with the Kafka exporter:

   ```bash
   cargo run -- start --exporter kafka \
     --kafka-exporter-brokers localhost:9092 \
     --kafka-exporter-format json \
     --debug-log traces
   ```

3. Send test data (in another terminal):

   ```bash
   # Install telemetrygen if not already installed
   go install github.com/open-telemetry/opentelemetry-collector-contrib/cmd/telemetrygen@latest

   # Send traces
   telemetrygen traces --otlp-insecure --duration 5s
   ```

4. Consume messages:

   ```bash
   ./scripts/kafka-test-env.sh consume otlp_traces
   ```
## Troubleshooting

**Kafka connection errors**

- Ensure Kafka is running: `./scripts/kafka-test-env.sh status`
- Check the Docker containers: `docker ps`
- Restart the environment: `./scripts/kafka-test-env.sh stop && ./scripts/kafka-test-env.sh start`

**OpenSSL build errors**

- Set the environment variables as shown in the Quick Start
- On macOS, ensure OpenSSL is installed: `brew install openssl@3`

**Test timeouts or failures**

- Increase the test timeout in the test code if needed
- Check the Kafka logs: `./scripts/kafka-test-env.sh logs`
- Verify the topics exist: `docker exec rotel-test-kafka kafka-topics --bootstrap-server localhost:9092 --list`

**Script errors**

- Make sure the script is executable: `chmod +x scripts/kafka-test-env.sh`
- Check Docker permissions for your user
## Test Coverage

- ✅ JSON serialization
- ✅ Protobuf serialization
- ✅ Gzip compression
- ✅ Multiple telemetry types (traces, metrics, logs)
- ✅ Topic routing
- ✅ Producer error handling
- ✅ Concurrent message sending

## Notes

These integration tests require a running Kafka instance and are not run by default with `cargo test`. They must be explicitly enabled with the `integration-tests` feature flag so that they don't interfere with regular unit-testing workflows.
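Feature-gating of this kind is typically wired up with an empty Cargo feature that the test file opts into. The snippet below is an illustrative sketch, not the repository's actual `Cargo.toml` — only the `integration-tests` feature name comes from this document:

```toml
# Cargo.toml (illustrative sketch): declare an empty, opt-in feature.
# The test file can then begin with
#   #![cfg(feature = "integration-tests")]
# so it is compiled only when the feature is enabled, e.g. via
#   cargo test --features integration-tests
[features]
integration-tests = []
```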