Kafka Consumer (Events)
You can also create events by routing them to a specific Kafka topic; the Kafka Consumer feature consumes events from that topic and passes them into Meridian.
Create events in Google Protocol Buffers (GPB) format. See kafka-consumer-events.proto for the model definitions.
By default, the Kafka Consumer consumes events from the configured topic and forwards them to eventd.
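The following is a minimal producer sketch showing how an external application could publish a protobuf-encoded event to that topic. It assumes the payload has already been built from the kafka-consumer-events.proto model and serialized with toByteArray(); the broker addresses and topic name are placeholders and must match the bootstrap.servers and eventsTopic values configured in the steps below.
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaEventPublisher {

    // serializedEvent: an Event message built from kafka-consumer-events.proto
    // and serialized with toByteArray().
    public static void publish(byte[] serializedEvent) throws Exception {
        Properties props = new Properties();
        // Placeholder brokers; use the same values as bootstrap.servers below.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka-ip-1:9092,my-kafka-ip-2:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());

        try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
            // Placeholder topic; must match the eventsTopic configured for the Kafka Consumer.
            producer.send(new ProducerRecord<>("opennms-kafka-events", serializedEvent)).get();
        }
    }
}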
Enable Kafka Consumer
The Kafka Consumer is disabled by default and can be enabled as follows.
First, log in to the Karaf shell of your Meridian instance and configure the Kafka client settings to point to your Kafka broker. See Consumer Configs for a complete list of available options.
ssh -p 8101 admin@localhost
config:edit org.opennms.features.kafka.consumer.client
config:property-set bootstrap.servers my-kafka-ip-1:9092,my-kafka-ip-2:9092 (1)
config:update
(1) Connect to these Kafka nodes; adjust the IPs or FQDNs and the Kafka port (9092) accordingly.
Next, install the opennms-kafka-consumer feature from that same shell:
feature:install opennms-kafka-consumer
To ensure that the feature continues to be installed on subsequent restarts, add opennms-kafka-consumer to a file in featuresBoot.d:
echo "opennms-kafka-consumer" | sudo tee ${OPENNMS_HOME}/etc/featuresBoot.d/kafka-consumer.boot
Configure topic name
To change the topic that the Kafka Consumer reads events from, set the eventsTopic property in the Karaf shell:
ssh -p 8101 admin@localhost
config:edit org.opennms.features.kafka.consumer
config:property-set eventsTopic opennms-kafka-events
config:update
When configuring eventsTopic, make sure it does not conflict with other topics used by the Meridian subsystem. If you are unsure, keep the default.
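If you set a custom topic name, you may want to confirm that it exists on the brokers (or that topic auto-creation is enabled). The sketch below is one way to check, assuming the kafka-clients AdminClient and the same placeholder broker addresses and topic name used above.
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class TopicCheck {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder brokers; use the same values as bootstrap.servers above.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka-ip-1:9092,my-kafka-ip-2:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // List all topic names and check for the configured eventsTopic.
            boolean exists = admin.listTopics().names().get().contains("opennms-kafka-events");
            System.out.println("eventsTopic present on brokers: " + exists);
        }
    }
}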