Authorization filter guide
Enforce authorization policies on Kafka resources at the proxy
Providing feedback on Red Hat documentation
We appreciate your feedback on our documentation.
To propose improvements, open a Jira issue and describe your suggested changes. Provide as much detail as possible to enable us to address your request quickly.
Prerequisite
- You have a Red Hat Customer Portal account. This account enables you to log in to the Red Hat Jira Software instance. If you do not have an account, you will be prompted to create one.
Procedure
- Click Create issue.
- In the Summary text box, enter a brief description of the issue.
- In the Description text box, provide the following information:
  - The URL of the page where you found the issue.
  - A detailed description of the issue.

  You can leave the information in any other fields at their default values.
- Add a reporter name.
- Click Create to submit the Jira issue to the documentation team.
Thank you for taking the time to provide feedback.
About this guide
This guide covers using the Streams for Apache Kafka Proxy Authorization filter to enforce authorization rules on client requests before they reach the Kafka brokers. Refer to other Streams for Apache Kafka Proxy guides for information about running the proxy or for advanced topics such as plugin development.
Chapter 1. Authorization overview
The Authorization filter enables the proxy to enforce an authorization policy on Kafka resources. These authorization checks are performed in addition to any authorization checks made by the broker itself. For an action to be allowed, both the Authorization filter and the broker must allow it.
In general, the Authorization filter makes access decisions in the same manner as Kafka itself. A client cannot distinguish between authorization enforced at the proxy and authorization enforced on the Kafka cluster itself.
To use the Authorization filter, the proxy must be able to determine the authenticated subject. The authenticated subject is the verified identity of the client, derived from its successful authentication.
- If your applications use SASL authentication, configure the SASL inspection filter to build the authenticated subject from the successful SASL exchange between the client and the broker.
Chapter 2. Authorization Model
In Kafka, clients perform operations on resources.
The following tables list the resource types and the operations that apply to them.
2.1. Resource types and operations
This table lists the resource types and operations enforced by the authorization filter:
| Resource type | Operations | Typical use-case |
|---|---|---|
| Topic | READ | Required for a consuming client to fetch records. |
| Topic | WRITE | Required for a producing client to produce records. |
| Topic | CREATE, DELETE, ALTER | Required for an admin client to create, delete, or alter topics. |
| Topic | DESCRIBE | Required for an admin client to perform the describe operations that refer to topic resources. |
| Topic | DESCRIBE_CONFIGS | Required for an admin client to perform describe config operations that refer to topic configuration. |
| Topic | ALTER_CONFIGS | Required for an admin client to perform alter config operations that relate to topic configuration. |

NOTE: Other Kafka resource types will be included in a future release.
2.2. Implied operation permissions
In the authorization model, some operations imply permission to perform other operations. This table lists the higher-level operations and the implied lower-level operations they include.
| Resource type | Operation | Implied operation |
|---|---|---|
| Topic | READ | DESCRIBE |
| Topic | WRITE | DESCRIBE |
| Topic | DELETE | DESCRIBE |
| Topic | ALTER | DESCRIBE |
| Topic | ALTER_CONFIGS | DESCRIBE_CONFIGS |
Chapter 3. Authorization rules
The authorization rules define which principals can perform specific operations on specific Kafka resources.
3.1. Outline of a rule file
The following example shows the overall outline of a rule file. The sections that follow give more details.
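The example rule file itself does not appear in this extract. The following sketch, assembled from the constructs described in the sections that follow, shows the overall shape; the user and topic names are invented, and the predicate syntax (name = "…") is an assumption rather than confirmed syntax:

```
// Imports: resource types must be imported before use.
from io.kroxylicious.filter.authorization import TopicResource as Topic;

/* Rules: deny rules must precede allow rules. */
deny User with name = "mallory" to * Topic with name = "orders";
allow User with name = "alice" to {READ, WRITE} Topic with name = "orders";

// Every rule file must end with the default decision.
otherwise deny;
```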
3.2. Comments
Both line and block comments are supported. Line comments are preceded by //. Block comments are bracketed by /* … */ markers. Comments are ignored.
3.3. Imports
Resource types must be imported before use. This is achieved using a from / import statement.
from <package> import <element> [as <alias>][, ..., <elementN> [as <aliasN>]];
where:
- <package> is io.kroxylicious.filter.authorization
- <element> is a ResourceType implementation name.
- <alias> is an optional alias for the resource type.
For example, TopicResource is the implementation that represents Kafka topics. To declare it, use an import statement like this:
from io.kroxylicious.filter.authorization import TopicResource;
To declare it with an alias, use an import statement like this:
from io.kroxylicious.filter.authorization import TopicResource as Topic;
3.4. Rules
The basic form of a rule is as follows:
<allow|deny> User with <user predicate> to <operation> <resource type> with <resource predicate>;
where:
- <allow|deny> indicates whether to allow or deny the action.
- <user predicate> matches the user principal performing the action.
- <resource type> identifies the resource type being acted upon. This can be either the resource type name or an alias for it.
- <resource predicate> identifies the resource.
- <operation> identifies the operation(s) to be performed on the resource.
Rules must be ordered so that any deny rules precede the allow rules.
When rules are evaluated, they are considered from top to bottom, with the first matching rule taking precedence.
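To illustrate the ordering, here is a hypothetical pair of rules; the user and topic names are invented, and the predicate syntax shown is assumed rather than taken from this guide:

```
// This deny rule must come first: the first matching rule takes precedence.
deny User with name = "intern" to WRITE Topic with name = "payments";
// The broader allow rule below cannot override the deny above.
allow User with name = "intern" to {READ, WRITE} Topic with name = "payments";
```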
3.5. Otherwise deny
Rules files must end with the statement otherwise deny. This stipulation means that all rules files have deny-by-default semantics.
...
otherwise deny;
3.6. User predicates
The following User predicates are supported:
- Equals
- Set inclusion
- Prefix (note that the wildcard * is only permitted at the end of the prefix)
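The example column for these predicates did not survive in this extract. Purely as an illustration (the operators shown here, = for equality, in {…} for set inclusion, and a trailing * for a prefix, are assumptions, not confirmed syntax), user predicates might look like:

```
name = "alice"            // equals: matches a single principal
name in {"bob", "carol"}  // set inclusion: matches any principal in the set
name = "analytics-*"      // prefix: * is permitted only at the end
```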
3.7. Resource predicates
The following resource predicates are supported:
- Equality
- Set inclusion
- Prefix (wildcard * is permitted only at the end)
- Regular expression match
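As with the user predicates, the example column did not survive in this extract. Purely as an illustration (the operators and the regular-expression keyword shown here are assumptions, not confirmed syntax), resource predicates might look like:

```
name = "orders"                  // equality
name in {"orders", "payments"}   // set inclusion
name = "billing-*"               // prefix: * is permitted only at the end
name matching "orders-[0-9]+"    // regular expression match (keyword assumed)
```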
3.8. Operations
Operations in rules can be specified in the following ways:
- As a single operation, for example READ
- As a set of operations, for example {READ, WRITE}
- As a wildcard that matches any operation, for example *
The Authorization filter does not support the keyword ALL.
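Combining these three forms with the rule syntax from Section 3.4 gives rules like the following; the user and topic names are invented, and the name = "…" predicate syntax is assumed:

```
allow User with name = "alice" to READ Topic with name = "orders";          // single operation
allow User with name = "alice" to {READ, WRITE} Topic with name = "orders"; // set of operations
allow User with name = "admin" to * Topic with name = "orders";             // any operation
```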
Chapter 4. Configuring the Authorization filter
This procedure describes how to set up the Authorization filter by configuring it in Streams for Apache Kafka Proxy.
Prerequisites
- An instance of Streams for Apache Kafka Proxy. For information on deploying Streams for Apache Kafka Proxy, see the Deploying and Managing Streams for Apache Kafka Proxy on OpenShift guide.
Procedure
- Configure an Authorization-type filter.
  - In an OpenShift deployment, use a KafkaProtocolFilter resource. See Section 4.1, “Example KafkaProtocolFilter resource”.
- Configure the ACL rules.
  - In an OpenShift deployment, use a ConfigMap resource. See Section 4.2, “Example ACL Rules”.
4.1. Example KafkaProtocolFilter resource
If your instance of Streams for Apache Kafka Proxy runs on OpenShift, you must use a KafkaProtocolFilter resource to contain the filter configuration.
Here’s a complete example of a KafkaProtocolFilter resource configured for authorization:
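The example resource itself is not reproduced in this extract. The following sketch shows one plausible shape; apart from the authorizer and aclFile keys, which the callouts below describe, every field name, API version, and value here is an assumption:

```yaml
apiVersion: filter.kroxylicious.io/v1alpha1   # assumed API group and version
kind: KafkaProtocolFilter
metadata:
  name: authorization-filter                  # illustrative name
spec:
  type: Authorization                         # assumed filter type name
  configTemplate:
    authorizer: AclAuthorizerService          # the authorizer service implementation
    authorizerConfig:
      aclFile: ${configmap:acl-rules:rules.txt}  # interpolation reference (assumed syntax)
```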
- authorizer is the name of the authorizer service implementation. Currently, this must be AclAuthorizerService.
- aclFile references the file containing the ACL rules. You can use an interpolation reference to rules stored within a Kubernetes ConfigMap or Secret resource.
4.2. Example ACL Rules
If your instance of Streams for Apache Kafka Proxy runs on OpenShift, you must use a ConfigMap resource to contain the ACL rules.
Here’s a complete example of a ConfigMap resource configured for authorization:
Example ConfigMap resource containing the ACL rules
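The ConfigMap example itself is missing from this extract. A sketch of what it might contain follows; the resource name, data key, and rules are illustrative, and the rule predicate syntax is assumed:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: acl-rules                 # illustrative name, referenced by the filter's aclFile setting
data:
  rules.txt: |
    from io.kroxylicious.filter.authorization import TopicResource as Topic;
    allow User with name = "alice" to {READ, WRITE} Topic with name = "orders";
    otherwise deny;
```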
Chapter 5. Glossary
Glossary of terms used in the Authorization guide.
- Subject
- The identity of the client for the purposes of applying policies within the proxy. Whether this is the same as the broker’s notion of subject depends on how authentication is configured in the proxy.
- Principal
- A component of a subject.
- Resource
- An entity which an authorizer can control access to. Resources are identified by a type and a name. Examples include Kafka topics and consumer groups (where the group ID is treated as the resource name).
- Operations
- The things that can be done to resources of a particular type. For example, a Kafka topic resource has operations which include describe, read and write.
- Action
- A resource and an operation, such as read the topic called invoices.
- Authorizer
- A component that makes an authorization decision, usually based on some kind of access policy.
- Decision
- The outcome of the authorization of a particular action. This is either allow or deny.
Appendix A. Using your subscription
Streams for Apache Kafka is provided through a software subscription. To manage your subscriptions, access your account at the Red Hat Customer Portal.
A.1. Accessing Your Account
- Go to access.redhat.com.
- If you do not already have an account, create one.
- Log in to your account.
A.2. Activating a Subscription
- Go to access.redhat.com.
- Navigate to My Subscriptions.
- Navigate to Activate a subscription and enter your 16-digit activation number.
A.3. Downloading Zip and Tar Files
To access zip or tar files, use the customer portal to find the relevant files for download. If you are using RPM packages, this step is not required.
- Open a browser and log in to the Red Hat Customer Portal Product Downloads page at access.redhat.com/downloads.
- Locate the Streams for Apache Kafka entries in the INTEGRATION AND AUTOMATION category.
- Select the desired Streams for Apache Kafka product. The Software Downloads page opens.
- Click the Download link for your component.
A.4. Installing packages with DNF
To install a package and all the package dependencies, use:
dnf install <package_name>
To install a previously-downloaded package from a local directory, use:
dnf install <path_to_download_package>
Revised on 2025-12-16 10:57:38 UTC