
SASL Inspection filter guide


Red Hat Streams for Apache Kafka 3.1

Extract authenticated principals from SASL exchanges for use by downstream filters

Abstract

Streams for Apache Kafka Proxy is a protocol-aware proxy that extends and secures Kafka-based systems with a flexible filtering mechanism. This guide explains how to use the SASL Inspection filter.

Providing feedback on Red Hat documentation

We appreciate your feedback on our documentation.

To propose improvements, open a Jira issue and describe your suggested changes. Provide as much detail as possible to enable us to address your request quickly.

Prerequisite

  • You have a Red Hat Customer Portal account. This account enables you to log in to the Red Hat Jira Software instance. If you do not have an account, you will be prompted to create one.

Procedure

  1. Click Create issue.
  2. In the Summary text box, enter a brief description of the issue.
  3. In the Description text box, provide the following information:

    • The URL of the page where you found the issue.
    • A detailed description of the issue.
      You can leave the information in any other fields at their default values.
  4. Add a reporter name.
  5. Click Create to submit the Jira issue to the documentation team.

Thank you for taking the time to provide feedback.

About this guide

This guide covers using the Streams for Apache Kafka Proxy SASL Inspection Filter. This filter extracts the authenticated principal from a successful SASL exchange between Kafka Client and Kafka Broker and makes it available to the other filters in the chain.

Refer to other Streams for Apache Kafka Proxy guides for information on running the proxy or for advanced topics such as plugin development.

This filter inspects the SASL exchange between Kafka Client and Broker and extracts the authenticated principal. If the client’s authentication with the broker is successful, the filter makes the authenticated principal available to the other filters in the chain, so that they may know on whose behalf they are acting.

Note

The SASL Inspection Filter plays no part in deciding whether authentication succeeds. That decision remains the exclusive responsibility of the broker.

To use this filter, the Kafka Cluster’s listener must be configured to authenticate using SASL, and it must use a SASL mechanism that is enabled by this filter. If the Kafka Client is configured to use a SASL mechanism that is not supported by the proxy, or the proxy and Kafka Cluster do not have the same mechanism available, the client will be disconnected with an unsupported SASL mechanism error.
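For reference, a client authenticating with one of the supported mechanisms (SCRAM-SHA-512 here) typically carries properties like the following. This is an illustrative sketch with placeholder credentials; the mechanism chosen must be enabled in both the filter and the broker listener:

```properties
# Illustrative Kafka client configuration (placeholder username/password)
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="alice" \
  password="alice-secret";
```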

This filter supports the following SASL mechanisms:

Table 1. Supported SASL mechanisms

  SASL mechanism   Enabled by default
  --------------   ------------------
  PLAIN            No
  SCRAM-SHA-256    Yes
  SCRAM-SHA-512    Yes
  OAUTHBEARER      Yes

Mechanisms that transmit credentials in plain text are disabled by default, to avoid holding plain-text passwords in the proxy’s memory. To use such a mechanism, you must enable it explicitly in the filter’s configuration.
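As a sketch, opting in to PLAIN means listing it in the enabledMechanisms field of the filter configuration shown in Chapter 1 (note that listing mechanisms explicitly replaces the default set, so include any other mechanisms you still need):

```yaml
type: SaslInspection
configTemplate:
  enabledMechanisms: [ "PLAIN", "SCRAM-SHA-512" ]
```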

For the OAUTHBEARER inspection, only JWT tokens that use signatures (JWS) are supported. JWT tokens that use encryption (JWE) are not supported. Unsigned JWT tokens are supported but not recommended for production use.
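A quick way to tell the two token forms apart is by their compact serialization: a JWS has three dot-separated parts (header.payload.signature), while a JWE has five (header.key.iv.ciphertext.tag). The helper below is a hypothetical illustration, not part of the filter:

```python
def jwt_serialization_type(token: str) -> str:
    """Classify a compact-serialized JWT by its dot-separated part count.

    A signed token (JWS) has three parts: header.payload.signature.
    An encrypted token (JWE) has five parts: header.key.iv.ciphertext.tag.
    """
    parts = token.count(".") + 1
    if parts == 3:
        return "JWS"
    if parts == 5:
        return "JWE"
    return "unknown"

# An unsigned ("alg": "none") token is still a JWS with an empty
# signature part, so it has three parts and classifies as JWS.
print(jwt_serialization_type("eyJhbGciOiJub25lIn0.eyJzdWIiOiJhbGljZSJ9."))  # JWS
```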

If an attempt is made to use an unsupported token type, the authentication will fail with a SASL error.

Figure 1. Sequence diagram showing the SASL inspection filter extracting an authenticated principal from a SASL negotiation.

Chapter 1. Configuring the SASL inspection filter

This procedure describes how to set up the SASL Inspection filter by configuring it in Streams for Apache Kafka Proxy.

Prerequisites

Procedure

  1. Configure a SaslInspection type filter.

1.1. Example KafkaProtocolFilter resource

If your instance of Streams for Apache Kafka Proxy runs on OpenShift, you must use a KafkaProtocolFilter resource to define the filter configuration.

The following example shows a complete KafkaProtocolFilter resource that is configured for OAUTHBEARER validation:

kind: KafkaProtocolFilter
metadata:
  name: my-sasl-inspection-filter
spec:
  type: SaslInspection
  configTemplate:
    enabledMechanisms: [ "OAUTHBEARER" ]
    requireAuthentication: true
  • enabledMechanisms restricts the filter to the given SASL mechanisms. Use the mechanism names listed in the supported mechanisms table. If this field is omitted, the filter enables SCRAM-SHA-256, SCRAM-SHA-512, and OAUTHBEARER by default.
  • If requireAuthentication is true then successful authentication is required before the filter forwards any requests other than those strictly required to perform SASL authentication. If false, the filter forwards all requests regardless of whether SASL authentication has been attempted or was successful. The default value is false.

Refer to the Deploying and Managing Streams for Apache Kafka Proxy on OpenShift guide for more information about configuration on OpenShift.

Chapter 2. Glossary

Glossary of terms used in the SASL Inspection guide.

JWE
JSON Web Encryption is an IETF standard for exchanging encrypted data using JSON.
JWT
JSON Web Token is an IETF standard for securely transmitting information between parties as a JSON object.
JWS
JSON Web Signature is an IETF-proposed standard for signing arbitrary data.
SASL
Simple Authentication and Security Layer, a framework for handling authentication.

Appendix A. Using your subscription

Streams for Apache Kafka is provided through a software subscription. To manage your subscriptions, access your account at the Red Hat Customer Portal.

A.1. Accessing Your Account

  1. Go to access.redhat.com.
  2. If you do not already have an account, create one.
  3. Log in to your account.

A.2. Activating a Subscription

  1. Go to access.redhat.com.
  2. Navigate to My Subscriptions.
  3. Navigate to Activate a subscription and enter your 16-digit activation number.

A.3. Downloading Zip and Tar Files

To access zip or tar files, use the customer portal to find the relevant files for download. If you are using RPM packages, this step is not required.

  1. Open a browser and log in to the Red Hat Customer Portal Product Downloads page at access.redhat.com/downloads.
  2. Locate the Streams for Apache Kafka entries in the INTEGRATION AND AUTOMATION category.
  3. Select the desired Streams for Apache Kafka product. The Software Downloads page opens.
  4. Click the Download link for your component.

A.4. Installing packages with DNF

To install a package and all the package dependencies, use:

dnf install <package_name>

To install a previously-downloaded package from a local directory, use:

dnf install <path_to_download_package>

Revised on 2025-12-16 10:58:10 UTC

Legal Notice

Copyright © Red Hat.
The text of and illustrations in this document are licensed by Red Hat under a Creative Commons Attribution–Share Alike 3.0 Unported license ("CC-BY-SA"). An explanation of CC-BY-SA is available at http://creativecommons.org/licenses/by-sa/3.0/. In accordance with CC-BY-SA, if you distribute this document or an adaptation of it, you must provide the URL for the original version.
Red Hat, as the licensor of this document, waives the right to enforce, and agrees not to assert, Section 4d of CC-BY-SA to the fullest extent permitted by applicable law.
Red Hat, Red Hat Enterprise Linux, the Shadowman logo, JBoss, OpenShift, Fedora, the Infinity logo, and RHCE are trademarks of Red Hat, Inc., registered in the United States and other countries.
Linux® is the registered trademark of Linus Torvalds in the United States and other countries.
Java® is a registered trademark of Oracle and/or its affiliates.
XFS® is a trademark of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries.
MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.
Node.js® is an official trademark of Joyent. Red Hat Software Collections is not formally related to or endorsed by the official Joyent Node.js open source or commercial project.
The OpenStack® Word Mark and OpenStack logo are either registered trademarks/service marks or trademarks/service marks of the OpenStack Foundation, in the United States and other countries and are used with the OpenStack Foundation's permission. We are not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.
All other trademarks are the property of their respective owners.