Kafka
Kafka Plugin – Publish
Summary (SEO Meta Description):
The Kafka Publish action in WeHub sends messages to an Apache Kafka topic.
Configure connection, topic, partition, headers, and offset for reliable event streaming.
Overview
The Kafka Publish Action allows workflows to send messages to Kafka topics.
It is commonly used in event-driven workflows, streaming pipelines, and integrations that require reliable message delivery.
⚙️ Action: Kafka Publish
- Description: Sends a message to a specified Kafka topic.
- Use Case: Stream workflow outputs into Kafka for downstream services or analytics.
Configuration Options:
| Field | Type | Description | 
|---|---|---|
| Label | String | Custom name for the node (e.g., "Publish Patient Update"). | 
| Select Connection | Connection | Choose or create a Kafka connection. | 
| Topic | String | The Kafka topic to publish to. | 
| Group ID | String | Consumer group identifier for offset tracking. | 
| Partition | Integer | Partition number to send the message to (default: 0). | 
| Start Offset | Integer | Offset position for the message (default: 0; -1 for latest). | 
| Header(s) | Key/Value (String) | Optional metadata headers to attach to the Kafka message. | 
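For reference, the sketch below shows roughly what a publish with these options looks like against a Kafka client directly, using the Python kafka-python library. The broker address, topic name, payload, and header values are illustrative assumptions, not WeHub defaults.

```python
# Minimal sketch of a publish equivalent to this action, using kafka-python.
# Broker address, topic, payload, and headers are illustrative assumptions.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",          # corresponds to "Select Connection"
    value_serializer=lambda v: v.encode("utf-8"),
)

# "Topic", "Partition", and "Header(s)" map onto send() arguments.
future = producer.send(
    "patient-updates",                            # Topic
    value='{"event": "patient_update"}',          # message payload
    partition=0,                                  # Partition (default: 0)
    headers=[("source", b"wehub"), ("content-type", b"application/json")],
)

record_metadata = future.get(timeout=10)          # block until the broker acknowledges
print(record_metadata.topic, record_metadata.partition, record_metadata.offset)
producer.flush()
```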
💡 Example Use Cases
- Healthcare → Publish HL7/FHIR updates to Kafka for interoperability.
- Finance → Stream payment events into Kafka for fraud detection systems.
- IoT → Send telemetry events from workflows into Kafka topics.
- Analytics → Forward processed workflow results into Kafka for real-time dashboards.
FAQ
Q: What happens if the Kafka topic does not exist?
A: The workflow will fail unless the Kafka broker is configured to auto-create topics.
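If auto-creation is disabled on the broker, the topic can be created ahead of time. A minimal sketch using kafka-python's admin client, assuming an illustrative broker address, topic name, and partition count:

```python
# Sketch: pre-create the topic so the publish does not fail when the broker
# has auto.create.topics.enable=false. All values are illustrative assumptions.
from kafka.admin import KafkaAdminClient, NewTopic
from kafka.errors import TopicAlreadyExistsError

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
try:
    admin.create_topics([NewTopic(name="patient-updates", num_partitions=3, replication_factor=1)])
except TopicAlreadyExistsError:
    pass  # topic already exists; nothing to do
```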
Q: Can I publish messages with headers?
A: Yes, custom key/value headers can be added for metadata.
Q: Do I need to specify a partition?
A: If left empty, Kafka’s partitioner determines which partition to use.
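To illustrate: when no partition is given but a message key is, the default partitioner hashes the key, so messages with the same key land on the same partition. A sketch, assuming the same illustrative broker and topic as above:

```python
# Sketch: omit the partition and let Kafka's partitioner choose based on the key.
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
# Messages sharing the key "patient-123" are routed to the same partition.
producer.send("patient-updates", key=b"patient-123", value=b'{"event": "patient_update"}')
producer.flush()
```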
- Keywords: WeHub Kafka Publish, Kafka Workflow Plugin, Kafka Integration, Publish to Kafka Topic, Event Streaming