Adapter commands for consumers and producers
Kafka adapter commands for consumers and producers are valid for input data sources and output data targets. See the related Kafka configurations for both consumers and producers in the Apache Kafka documentation for additional details.
- -SERVER hostname:port [,hostname:port[,hostname:port[, ...]]]
- -SRV hostname:port [,hostname:port[,hostname:port[, ...]]]
- Identifies one or more servers in a Kafka cluster that the adapter is to establish an initial
connection with. After the initial connection, the adapter discovers and uses the full set of
servers. If the connection to the first server fails, the adapter attempts to connect to each
subsequent server in the list until it succeeds.
Related Kafka configurations: bootstrap.servers
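For example, to point the adapter at a hypothetical three-broker cluster (hostnames and ports are illustrative):

```
-SERVER broker1.example.com:9092,broker2.example.com:9092,broker3.example.com:9092
```

Listing more than one server is recommended so that the initial connection succeeds even if one broker is down; after connecting, the adapter discovers the remaining brokers on its own.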
- -TOPIC topicname[topic_info] [topicname[topic_info]] [topicname[topic_info]] [...]
- -TP topicname[topic_info] [topicname[topic_info]] [topicname[topic_info]] [...]
- The target topic to publish to, or one or more source topics to consume from. Separate multiple
topic names with spaces. This command is required unless producers identify the topic for each
message separately using message header version 2 or later. See Headers for data payloads and the
-HDR command. To produce messages, specify a single target topic as either:
- topicname
- Publish the target message to a partition of the topic selected by the Kafka cluster.
- topicname:partition
- Publish the target message to the specified partition of the topic.
To consume messages, specify one or more source topics to consume from, using any combination of the following:
- topicname
- Consume all source messages from the topic.
- topicname:partition
- Consume all source messages from the specified partition of the topic.
- topicname:*
- Consume source messages from all partitions of the topic.
- topicname:partition-offset
- Consume the source message at the specified offset of the specified partition of the topic.
- If the offset does not exist in the specified partition, the adapter consumes messages based on the policy specified by the -AOR command.
- If the -AOR command is not specified, the adapter consumes the message from the most recent offset position.
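For example, a hypothetical consumer fragment that combines these forms (topic names are illustrative):

```
-TOPIC orders:* payments:0 audit:2-1500
```

This consumes all partitions of orders, only partition 0 of payments, and the single message at offset 1500 in partition 2 of audit.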
- -CLIENTID clientID
- -CID clientID
- A logical application name to identify the source of requests in Kafka server logs. The default
value is a null string.
Related Kafka configurations: client.id
- -HEADER {1 | 2 | 3}
- -HDR {1 | 2 | 3}
- The version of the header that precedes the message payload size and message payload data. See
Headers for data payloads for
details about the header versions.
This command is optional.
- -SECURITYPROTOCOL {PLAINTEXT | SSL | SASL_PLAINTEXT | SASL_SSL}
- -SP {PLAINTEXT | SSL | SASL_PLAINTEXT | SASL_SSL}
- The security protocol:
- PLAINTEXT
- Use a plain connection. This is the default.
- SSL
- Use an SSL connection for host authentication and data encryption. With SSL connections, you must also provide the path (-TSL command) and password (-TSP command) of the truststore.
- SASL_PLAINTEXT
- Use a Simple Authentication and Security Layer (SASL) mechanism for authentication over a plain connection.
- SASL_SSL
- Use a SASL mechanism for authentication over an SSL connection.
See the security section in the Apache Kafka documentation for more details.
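For example, a hypothetical command fragment that connects over SSL (the truststore path and password are illustrative):

```
-SP SSL -TSL /etc/kafka/client.truststore.jks -TSP changeit
```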
- -SASLMECHANISM {PLAIN | GSSAPI | SCRAM-SHA-256 | SCRAM-SHA-512}
- -SM {PLAIN | GSSAPI | SCRAM-SHA-256 | SCRAM-SHA-512}
- The Simple Authentication and Security Layer (SASL) mechanism to use. This command is valid when the -SP command is set to SASL_PLAINTEXT or SASL_SSL.
- PLAIN
- Use simple user name/password authentication.
- GSSAPI
- Use Kerberos authentication.
- SCRAM-SHA-256
- Use Salted Challenge Response Authentication Mechanism (SCRAM) with SHA-256 hash function.
- SCRAM-SHA-512
- Use SCRAM with SHA-512 hash function.
See the security section in the Apache Kafka documentation for more details.
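For example, a hypothetical fragment that authenticates with SCRAM over an encrypted connection (the truststore path and password are illustrative; the SCRAM credentials themselves are supplied through the JAAS login configuration):

```
-SP SASL_SSL -SM SCRAM-SHA-512 -TSL /etc/kafka/client.truststore.jks -TSP changeit
```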
- -LOGINCONFIGFILELOCATION file_path
- -LCFL file_path
- The location of the Java Authentication and Authorization Service (JAAS) login configuration
file, which contains information about the security model and parameters to use for authentication.
This command is valid when the -SP command is set to SASL_PLAINTEXT or SASL_SSL.
Use the -LCFL command only for testing and debugging. This command sets the java.security.auth.login.config system property each time it connects to Kafka, and the property applies to the entire JVM process and all threads and adapters that run in it.
In a production environment, specify the location of the login configuration file with the Java Virtual Machine (JVM) -Djava.security.auth.login.config=file_path parameter for the Java process in which the adapter runs.
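A minimal JAAS login configuration for the SCRAM mechanism might look like the following (the user name and password are illustrative; KafkaClient is the login context name that Kafka clients read):

```
KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="app-user"
    password="app-secret";
};
```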
- -KERBEROSCONFIGFILELOCATION file_path
- -KCFL file_path
- The location of the Kerberos configuration file, which contains information about the security
model and parameters to use for authentication. This command is valid when the -SP command is set to
SASL_PLAINTEXT or SASL_SSL and the -SM command is set to
GSSAPI. The file is typically named krb5.conf.
Use the -KCFL command only for testing and debugging. This command sets the java.security.krb5.conf system property each time it connects to Kafka, and the property applies to the entire JVM process and all threads and adapters that run in it.
In a production environment, specify the location of the Kerberos configuration file with the Java Virtual Machine (JVM) -Djava.security.krb5.conf=file_path parameter for the Java process in which the adapter runs.
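A minimal krb5.conf for a hypothetical realm might look like this (the realm name and KDC host are illustrative):

```
[libdefaults]
    default_realm = EXAMPLE.COM

[realms]
    EXAMPLE.COM = {
        kdc = kdc.example.com
    }
```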
- -TRUSTSTORELOCATION file_path
- -TSL file_path
- The full path to the truststore. This command is valid only when the -SP command is set to SSL or SASL_SSL.
- -TRUSTSTOREPASSWORD password
- -TSP password
- The truststore password. This command is valid only when the -SP command is set to SSL or SASL_SSL.
- -KEYSTORELOCATION file_path
- -KSL file_path
- The full path to the keystore. This command is valid only when the -SP command is set to SSL or SASL_SSL and the Kafka cluster is configured for client-host authentication.
- -KEYSTOREPASSWORD password
- -KSP password
- The keystore password. This command is valid only when the -SP command is set to SSL or SASL_SSL and the Kafka cluster is configured for client-host authentication.
- -KEYPASSWORD password
- -KP password
- The key password. This command is valid only when the -SP command is set to SSL or SASL_SSL, the Kafka cluster is configured for client-host authentication, and the key password is different from the keystore password specified on the -KSP command.
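Taken together, a hypothetical fragment for mutual (client and server) authentication over SSL might look like this (paths and passwords are illustrative):

```
-SP SSL -TSL /etc/kafka/client.truststore.jks -TSP trustpass -KSL /etc/kafka/client.keystore.jks -KSP storepass -KP keypass
```

Omit -KP when the key password is the same as the keystore password.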
- -LOGICALMESSAGEMODE
- -LMM
- Specifies logical message mode, in which the adapter processes multiple physical Kafka messages as a single record (a logical message). In this mode, the message payload of each physical message within the logical message is preceded with the 4-byte size of the payload, regardless of whether the -HDR command is specified.
- -ADDPROPFILE filename
- -APF filename
- The name of a file that contains additional Kafka producer or consumer configurations in Java™ properties format. The values specified in the file override those specified on the adapter command.
- -ADDPROP key=value [key=value [key=value]]
- -AP key=value [key=value [key=value]]
- Specifies one or more Kafka producer or consumer configurations. Separate key=value pairs with spaces. Values specified on the -AP command override both the values specified on the adapter command and the configurations in the file specified by the -APF command.
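For example, a hypothetical properties file combined with an inline override (the file name and values are illustrative; compression.type and acks are standard Kafka producer configurations):

```
# extra_kafka.properties
compression.type=gzip
acks=all
```

With -APF extra_kafka.properties -AP acks=1, the producer uses compression.type=gzip from the file but acks=1, because values on the -AP command take precedence over the file.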
- -T [E | V] [+] [file_path]
- The adapter trace level and full path to the adapter trace log.
- -TRACE
- -T
- Log adapter informational messages.
- -TRACEERROR
- -TE
- Log only adapter errors during map execution.
- -TRACEVERBOSE
- -TV
- Use verbose (debug) logging. The log file records all activity that occurs while the adapter is producing or consuming messages.
- +
- Appends the trace information to the existing log file. Omit this keyword to create a new log file.
- file_path
- The full path to the adapter trace log. If you omit this keyword, the adapter creates the m4kafka.mtr log file in the map directory.
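For example, a hypothetical fragment that appends verbose trace output to a custom log file (the path is illustrative):

```
-TV + /tmp/m4kafka_debug.mtr
```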