SASL Inspection filter guide
Extract authenticated principals from SASL exchanges for use by downstream filters
Abstract
Providing feedback on Red Hat documentation
We appreciate your feedback on our documentation.
To propose improvements, open a Jira issue and describe your suggested changes. Provide as much detail as possible to enable us to address your request quickly.
Prerequisites
- You have a Red Hat Customer Portal account. This account enables you to log in to the Red Hat Jira Software instance. If you do not have an account, you will be prompted to create one.
Procedure
- Click Create issue.
- In the Summary text box, enter a brief description of the issue.
- In the Description text box, provide the following information:
  - The URL of the page where you found the issue.
  - A detailed description of the issue.
  You can leave the information in any other fields at their default values.
- Add a reporter name.
- Click Create to submit the Jira issue to the documentation team.
Thank you for taking the time to provide feedback.
About this guide
This guide covers using the Streams for Apache Kafka Proxy SASL Inspection Filter. This filter extracts the authenticated principal from a successful SASL exchange between Kafka Client and Kafka Broker and makes it available to the other filters in the chain.
Refer to other Streams for Apache Kafka Proxy guides for information on running the proxy or for advanced topics such as plugin development.
This filter inspects the SASL exchange between Kafka Client and Broker and extracts the authenticated principal. If the client’s authentication with the broker is successful, the filter makes the authenticated principal available to the other filters in the chain, so that they may know on whose behalf they are acting.
The SASL Inspection Filter plays no part in deciding if the authentication is successful or not. That role remains the exclusive responsibility of the broker.
To use this filter, the Kafka Cluster’s listener must be configured to authenticate using SASL, and it must use a SASL mechanism that is enabled by this filter. If the Kafka Client is configured to use a SASL mechanism that is not supported by the proxy, or the proxy and Kafka Cluster do not have the same mechanism available, the client will be disconnected with an unsupported SASL mechanism error.
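For instance, a Java Kafka client authenticating with SCRAM against such a listener would typically carry settings like the following (illustrative values only; the username and password are placeholders):

```properties
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
    username="alice" \
    password="alice-secret";
```

If `sasl.mechanism` here names a mechanism that the proxy has not enabled, the client is disconnected with an unsupported SASL mechanism error, as described above.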
This filter supports the following SASL mechanisms:
| SASL mechanism | Enabled by default |
|---|---|
| PLAIN | No |
| SCRAM-SHA-256 | Yes |
| SCRAM-SHA-512 | Yes |
| OAUTHBEARER | Yes |
Mechanisms that transmit credentials in plain text are disabled by default. This avoids holding plain-text passwords in the proxy’s memory. To use such a mechanism, you must enable it explicitly in the filter’s configuration.
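For example, a sketch of opting in to PLAIN through the filter’s `enabledMechanisms` property (the surrounding `config` layout is an assumption; only the `type` and property names come from this guide):

```yaml
type: SaslInspection
config:
  # Listing PLAIN explicitly opts in to a plain-text mechanism.
  # Mechanisms not listed here remain disabled.
  enabledMechanisms:
    - PLAIN
    - SCRAM-SHA-512
```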
For the OAUTHBEARER inspection, only JWT tokens that use signatures (JWS) are supported. JWT tokens that use encryption (JWE) are not supported. Unsigned JWT tokens are supported but not recommended for production use.
If an attempt is made to use an unsupported token type, the authentication will fail with a SASL error.
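The supported and unsupported token types can be told apart from the token’s compact serialization alone: a JWS has three dot-separated parts (header, payload, signature) and a JWE has five. The following sketch, using only the Python standard library, illustrates that distinction and the `"alg": "none"` header that marks an unsigned token; it is an illustration of the JOSE formats, not the filter’s actual implementation.

```python
import base64
import json

def inspect_token(token: str) -> str:
    """Classify a compact-serialized JWT by its number of parts.

    JWS compact serialization has 3 dot-separated parts
    (header.payload.signature); JWE has 5. This mirrors the
    distinction the filter draws: JWS is accepted, JWE is not.
    """
    parts = token.split(".")
    if len(parts) == 3:
        # Decode the protected header to read the signing algorithm.
        pad = "=" * (-len(parts[0]) % 4)  # restore base64url padding
        header = json.loads(base64.urlsafe_b64decode(parts[0] + pad))
        alg = header.get("alg", "?")
        # "alg": "none" marks an unsigned token (allowed, discouraged).
        return "unsigned JWS" if alg == "none" else f"JWS ({alg})"
    if len(parts) == 5:
        return "JWE (unsupported)"
    return "not a compact JWT"

# A minimal signed-token header, for illustration only.
header = base64.urlsafe_b64encode(b'{"alg":"RS256"}').rstrip(b"=").decode()
print(inspect_token(f"{header}.payload.sig"))  # → JWS (RS256)
```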
Figure 1. Sequence diagram showing the SASL inspection filter extracting an authenticated principal from a SASL negotiation.
Chapter 1. Configuring the SASL inspection filter
This procedure describes how to set up the SASL Inspection filter by configuring it in Streams for Apache Kafka Proxy.
Prerequisites
- An instance of Streams for Apache Kafka Proxy. For information on deploying Streams for Apache Kafka Proxy, see the Deploying and Managing Streams for Apache Kafka Proxy on OpenShift guide.
Procedure
- Configure a `SaslInspection` type filter.
  - In an OpenShift deployment, use a `KafkaProtocolFilter` resource. See Section 1.1, “Example `KafkaProtocolFilter` resource”.
1.1. Example KafkaProtocolFilter resource
If your instance of Streams for Apache Kafka Proxy runs on OpenShift, you must use a KafkaProtocolFilter resource to define the filter configuration.
The following example shows a complete filterDefinitions entry that is configured for OAUTHBEARER validation:
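A minimal sketch of such a resource follows, assuming the Kroxylicious-style `KafkaProtocolFilter` schema; the `apiVersion`, `metadata`, and `spec` layout are assumptions, and only `SaslInspection`, `enabledMechanisms`, and `requireAuthentication` come from this guide. Check the product reference for the exact resource schema.

```yaml
apiVersion: filter.kroxylicious.io/v1alpha1  # assumed API group/version
kind: KafkaProtocolFilter
metadata:
  name: sasl-inspection
spec:
  type: SaslInspection
  configTemplate:
    enabledMechanisms:        # restrict the filter to OAUTHBEARER only
      - OAUTHBEARER
    requireAuthentication: true
```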
- `enabledMechanisms` restricts the filter to the given SASL mechanisms. Refer to the SASL mechanism names listed in the supported mechanisms table. If this field is omitted, the filter enables `SCRAM-SHA-256`, `SCRAM-SHA-512`, and `OAUTHBEARER` by default.
- If `requireAuthentication` is `true`, successful authentication is required before the filter forwards any requests other than those strictly required to perform SASL authentication. If `false`, the filter forwards all requests regardless of whether SASL authentication has been attempted or was successful. The default value is `false`.
Refer to the Deploying and Managing Streams for Apache Kafka Proxy on OpenShift guide for more information about configuration on OpenShift.
Chapter 2. Glossary
Glossary of terms used in the SASL Inspection guide.
- JWE
- JSON Web Encryption is an IETF standard for exchanging encrypted data using JSON.
- JWT
- JSON Web Token is an IETF standard for securely transmitting information between parties as a JSON object.
- JWS
- JSON Web Signature is an IETF-proposed standard for signing arbitrary data.
- SASL
- Simple Authentication and Security Layer, a framework for handling authentication.
Appendix A. Using your subscription
Streams for Apache Kafka is provided through a software subscription. To manage your subscriptions, access your account at the Red Hat Customer Portal.
A.1. Accessing Your Account
- Go to access.redhat.com.
- If you do not already have an account, create one.
- Log in to your account.
A.2. Activating a Subscription
- Go to access.redhat.com.
- Navigate to My Subscriptions.
- Navigate to Activate a subscription and enter your 16-digit activation number.
A.3. Downloading Zip and Tar Files
To access zip or tar files, use the customer portal to find the relevant files for download. If you are using RPM packages, this step is not required.
- Open a browser and log in to the Red Hat Customer Portal Product Downloads page at access.redhat.com/downloads.
- Locate the Streams for Apache Kafka entries in the INTEGRATION AND AUTOMATION category.
- Select the desired Streams for Apache Kafka product. The Software Downloads page opens.
- Click the Download link for your component.
A.4. Installing packages with DNF
To install a package and all the package dependencies, use:
dnf install <package_name>
To install a previously-downloaded package from a local directory, use:
dnf install <path_to_download_package>
Revised on 2025-12-16 10:58:10 UTC