How to use IBM App Connect with Kafka
Kafka is a real-time event streaming platform that you can use to publish and subscribe to, store, and process events as they happen. IBM® App Connect provides a Kafka connector that you can use to connect to a number of supported Kafka implementations.
You can use App Connect to connect to a Kafka broker and configure an integration flow that gets triggered whenever a message is received on the configured topic. You can also create an integration flow that puts messages to a configured topic.
The Kafka connector is available in the following environments:
- App Connect Enterprise as a Service connector
- Local connector in containers (Continuous Delivery release) 11.0.0.11-r1 or later
- Local connector in containers (Extended Update Support release)
- Local connector in containers (Long Term Support release)
The following information describes how to use App Connect to connect to Kafka.
- What to consider first
- Connecting to Kafka
- General considerations for using Kafka in App Connect
- Events and actions
What to consider first
Before you use App Connect Designer with Kafka, note that the Kafka connector supports the following implementations:
- Apache Kafka
- Apache Kafka on Confluent Platforms
- IBM Event Streams on premises
- IBM Event Streams for IBM Cloud
Connecting to Kafka
Kafka supports various security mechanisms for connecting to a cluster. The default setting is nonsecured, but a cluster can use a mixture of unauthenticated, authenticated, encrypted, and nonencrypted channels. When you connect to your Kafka implementation in App Connect, you need to select an authorization method that reflects how your brokers are configured. You might need to ask your Kafka administrator for the values that are needed to connect in App Connect.
One of the following four authorization methods is needed to authenticate your connection. Each method requires its own set of credentials for connecting to Kafka, as shown in Table 1. If you want to connect to a Kafka schema registry, you need to supply additional credentials as described in Connecting to a Kafka schema registry.
- PLAINTEXT
- The default setting for Kafka communication. Select this option to connect over an unauthenticated and nonencrypted channel. If your brokers are configured to authenticate or encrypt communication, select one of the following options instead.
- SASL_PLAINTEXT
- Select this option for authentication by Simple Authentication and Security Layer (SASL) (a username and password on a nonencrypted channel).
- SASL_SSL
- Select this option for authentication by Simple Authentication and Security Layer (SASL) (a username and password on a Secure Sockets Layer (SSL) encrypted channel).
- SSL
- Select this option to authenticate over an SSL-encrypted channel.
Table 1. Credentials required for each authorization method

| Credential | PLAINTEXT | SASL_PLAINTEXT | SASL_SSL | SSL |
| --- | --- | --- | --- | --- |
| Kafka brokers list | Required | Required | Required | Required |
| Client ID | Optional | Optional | Optional | Optional |
| Username | N/A | Required | Required | N/A |
| Password | N/A | Required | Required | N/A |
| Security mechanism | N/A | Required | Required | N/A |
| CA certificate | N/A | N/A | Optional | Required |
| Private network connection | Optional | Optional | Optional | Optional |
- Kafka brokers list
- Specify the list of Kafka brokers in the format ["x.x.x.x:9092","y.y.y.y:9092"].
- Client ID
- Specify a default client ID for all the producer and consumer instances that are associated with this account.
- Username
- Specify the Kafka server username.
- Password
- Specify the Kafka server password.
- Security mechanism
- Specify the SASL mechanism that the broker is configured to accept for secure connections. The options are PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512 (Salted Challenge Response Authentication Mechanism, SCRAM).
- CA certificate
- The certificate authority (CA) that is used to authenticate the broker, and which is needed for SSL connections only.
- Private network connection
- Select the name of a private network connection that App Connect uses to connect to your private network. This list is populated with the names of private network connections that are created from the Private network connections page in the Designer instance. You see this field only if a switch server is configured for this Designer instance. For more information, see Connecting to a private network from App Connect Designer. (In App Connect Designer 12.0.10.0-r1 or earlier instances that include this field, the display name is shown as Agent name.)
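If you want to verify the same credentials outside App Connect, the following minimal sketch shows how they map to a standard Kafka client configuration for the SASL_SSL method, using the confluent-kafka Python package. This is not App Connect code; the broker addresses, client ID, username, password, CA file path, and group ID are placeholder values.

```python
# Illustrative only: standard Kafka client configuration that corresponds to the
# App Connect account credentials described above (SASL_SSL authorization method).
from confluent_kafka import Consumer

conf = {
    # Kafka brokers list, for example ["x.x.x.x:9092","y.y.y.y:9092"] in App Connect
    "bootstrap.servers": "x.x.x.x:9092,y.y.y.y:9092",
    # Client ID (optional default identifier for producer and consumer instances)
    "client.id": "app-connect-demo",
    # SASL_SSL: username and password over an SSL-encrypted channel
    "security.protocol": "SASL_SSL",
    # Security mechanism: PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512
    "sasl.mechanism": "SCRAM-SHA-512",
    "sasl.username": "kafka-user",
    "sasl.password": "kafka-password",
    # CA certificate used to authenticate the broker
    # (optional for SASL_SSL, required for SSL)
    "ssl.ca.location": "/path/to/ca.pem",
    # Consumers also need a group ID; see the Group ID notes later in this topic
    "group.id": "demo-group",
}

consumer = Consumer(conf)
```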
To connect to a Kafka endpoint from the App Connect Designer Catalog page for the first time, expand Kafka, then click Connect.
Before you use the account that is created in App Connect in a flow, rename the account to something meaningful that helps you to identify it. To rename the account on the Catalog page, select the account, open its options menu (⋮), then click Rename Account.
General considerations for using Kafka in App Connect
- You can add a New message event node to an event-driven flow to trigger
the flow when a new message is received on a Kafka topic. To
configure App Connect to detect the event, select the topic, and
then optionally specify a message offset, client ID, group ID, and output schema.
- Message offset
- Specify the message offset that the flow (consumer) uses when it starts for the first time (with a specific group ID).
  - Select Earliest to start reading published messages (including historical messages) from the beginning of the topic.
  - Select Latest (the default) to read only messages that are published after the flow starts.

  Note: If you stop the flow and restart it later (with the same group ID), the flow resumes reading messages from where it left off regardless of the Message offset setting.
- Client ID
- Specify a unique identifier that can be used to help trace activity in Kafka. This client ID overrides any client ID that is specified in the credentials for the selected Kafka account for this event node.
- Group ID
- Specify a unique ID for a consumer group.
  You might have more than one flow that listens for new messages on the same topic. When a new message is received on a topic, you can use the Group ID field to define how App Connect consumes that message.
  - If you leave the Group ID field empty, all the flows are triggered when a new message is received on the specified topic. (An ID value of flowName_message is automatically generated.)
  - If you assign the same group ID to all the flows, the throughput of messages from Kafka is shared such that only one flow is triggered when a new message is received on the specified topic. This behavior might be useful when you don’t want all the flows to run at the same time; for example, for scaling purposes, if one flow goes down, another flow in the group is triggered.
  - If you assign different group IDs in each flow, all the flows are triggered when a new message is received on the specified topic.

  (The sketch after this list shows the equivalent consumer settings for the message offset and group ID.)
- Select output schema
- If you have configured a schema registry, click Select output schema to select a schema type (AVRO or JSON), subject, and schema version. For more information, see Connecting to a Kafka schema registry.
- (General consideration) You can see lists of the trigger events and
actions that are available on the Catalog page of the App Connect Designer.
For some applications, the events and actions in the catalog depend on the environment and whether the connector supports configurable events and dynamic discovery of actions. If the application supports configurable events, you see a Show more configurable events link under the events list. If the application supports dynamic discovery of actions, you see a Show more link under the actions list.
- (General consideration) If you are using multiple accounts for an application, the set of fields that is displayed when you select an action for that application can vary for different accounts. In the flow editor, some applications always provide a curated set of static fields for an action. Other applications use dynamic discovery to retrieve the set of fields that are configured on the instance that you are connected to. For example, if you have two accounts for two instances of an application, the first account might use settings that are ready for immediate use. However, the second account might be configured with extra custom fields.
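The Message offset and Group ID behavior described above mirrors standard Kafka consumer semantics. The following sketch, which uses the confluent-kafka Python package rather than App Connect, shows the equivalent consumer settings; the broker address, group ID, and topic name are placeholder values.

```python
# Illustrative only: standard Kafka consumer settings that correspond to the
# Message offset and Group ID options described above.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "x.x.x.x:9092",
    # Same group ID across consumers: each message is delivered to only one
    # consumer in the group (like flows that share a group ID).
    # Different group IDs: every group receives every message (like flows with
    # different, or empty, group IDs).
    "group.id": "demo-group",
    # "earliest" reads from the beginning of the topic (including historical
    # messages); "latest" (shown here) reads only messages published after the
    # consumer starts. Once the group has committed offsets, the consumer
    # resumes from where it left off, regardless of this setting.
    "auto.offset.reset": "latest",
})

consumer.subscribe(["demo-topic"])
try:
    while True:
        msg = consumer.poll(1.0)            # wait up to 1 second for a message
        if msg is None or msg.error():
            continue
        print(msg.value().decode("utf-8"))  # process the new message
finally:
    consumer.close()
```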
Events and actions
Kafka events
These events are for changes in this application that trigger a flow to start completing the actions in the flow.
- Messages
  - New message
Kafka actions
Your flow completes these actions on this application.
- Messages
  - Send message
- Topics
  - Retrieve topics
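For comparison, the Send message action corresponds to a standard Kafka producer publish. A minimal sketch with the confluent-kafka Python package follows; the broker address, topic name, and payload are placeholder values, and this is not what App Connect runs internally.

```python
# Illustrative only: a standard Kafka producer publish, roughly what the
# Send message action does on your behalf.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "x.x.x.x:9092"})

def on_delivery(err, msg):
    # Report whether the broker acknowledged the message.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}")

producer.produce("demo-topic", value=b'{"orderId": 1234}', callback=on_delivery)
producer.flush()  # block until outstanding messages are delivered
```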