Chapter 6. Securing Kafka
A secure deployment of AMQ Streams can encompass:
- Encryption for data exchange
- Authentication to prove identity
- Authorization to allow or decline actions executed by users
6.1. Encryption
AMQ Streams supports Transport Layer Security (TLS), a protocol for encrypted communication.
Communication is always encrypted between:
- Kafka brokers
- ZooKeeper nodes
- Operators and Kafka brokers
- Operators and ZooKeeper nodes
- Kafka Exporter
You can also configure TLS between Kafka brokers and clients by applying TLS encryption to the listeners of the Kafka broker. TLS is specified for external clients when configuring an external listener.
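For example, a minimal Kafka custom resource sketch that enables TLS encryption on an internal listener and an external listener. The cluster name, listener names, and ports are illustrative placeholders, and the exact apiVersion depends on your AMQ Streams version:

```yaml
apiVersion: kafka.strimzi.io/v1beta2   # may differ by AMQ Streams version
kind: Kafka
metadata:
  name: my-cluster                     # placeholder cluster name
spec:
  kafka:
    listeners:
      # Internal listener with TLS encryption for clients inside OpenShift
      - name: tls
        port: 9093
        type: internal
        tls: true
      # External listener exposed as a route; TLS encryption is applied
      - name: external
        port: 9094
        type: route
        tls: true
```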
AMQ Streams components and Kafka clients use digital certificates for encryption. The Cluster Operator sets up certificates to enable encryption within the Kafka cluster. You can provide your own server certificates, referred to as Kafka listener certificates, for communication between Kafka clients and Kafka brokers, and inter-cluster communication.
AMQ Streams uses Secrets to store the certificates and private keys required for TLS in PEM and PKCS #12 format.
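As a sketch of providing your own Kafka listener certificates, assuming a Secret named my-listener-certificate that stores the server certificate and private key under the keys tls.crt and tls.key (names chosen here for illustration), a TLS listener can reference it through brokerCertChainAndKey:

```yaml
# Fragment of the Kafka resource spec
spec:
  kafka:
    listeners:
      - name: external
        port: 9094
        type: route
        tls: true
        configuration:
          # Serve your own certificate instead of one signed by the cluster CA
          brokerCertChainAndKey:
            secretName: my-listener-certificate   # placeholder Secret name
            certificate: tls.crt                  # Secret key holding the certificate chain
            key: tls.key                          # Secret key holding the private key
```

By default, the Cluster Operator stores the cluster CA certificate in a Secret, typically named after the cluster with a -cluster-ca-cert suffix, with both PEM (ca.crt) and PKCS #12 (ca.p12, ca.password) entries; the exact entry names can vary between versions.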
A TLS Certificate Authority (CA) issues certificates to authenticate the identity of a component. AMQ Streams verifies the certificates for the components against the CA certificate.
- AMQ Streams components are verified against the cluster CA
- Kafka clients are verified against the clients CA
6.2. Authentication
Kafka listeners use authentication to ensure a secure client connection to the Kafka cluster.
Supported authentication mechanisms:
- Mutual TLS client authentication (on listeners with TLS encryption enabled)
- SASL SCRAM-SHA-512
- OAuth 2.0 token-based authentication
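For example, authentication is configured per listener in the Kafka resource. A hedged sketch that enables mutual TLS client authentication on one listener and SASL SCRAM-SHA-512 on another; listener names and ports are placeholders:

```yaml
# Fragment of the Kafka resource spec
spec:
  kafka:
    listeners:
      # Mutual TLS client authentication on a TLS-encrypted listener
      - name: tls
        port: 9093
        type: internal
        tls: true
        authentication:
          type: tls
      # SASL SCRAM-SHA-512 authentication
      - name: scram
        port: 9095
        type: internal
        tls: true
        authentication:
          type: scram-sha-512
```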
The User Operator manages user credentials for TLS and SCRAM authentication, but not OAuth 2.0. For example, through the User Operator you can create a user representing a client that requires access to the Kafka cluster, and specify TLS as the authentication type.
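A minimal KafkaUser sketch for such a client; the user and cluster names are placeholders, and the strimzi.io/cluster label ties the user to its Kafka cluster:

```yaml
apiVersion: kafka.strimzi.io/v1beta2   # may differ by AMQ Streams version
kind: KafkaUser
metadata:
  name: my-user                        # placeholder user name
  labels:
    strimzi.io/cluster: my-cluster     # must match the Kafka cluster name
spec:
  authentication:
    type: tls                          # the User Operator issues a client certificate for this user
```

The User Operator then creates a Secret for the user, typically named after the user, containing the client certificate and private key the client presents when connecting.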
Using OAuth 2.0 token-based authentication, application clients can access Kafka brokers without exposing account credentials. An authorization server handles the granting of access and inquiries about access.
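As a sketch, OAuth 2.0 authentication is also enabled per listener; the issuer and JWKS endpoint URIs below are placeholders for your authorization server, and the available options depend on the AMQ Streams version:

```yaml
# Fragment of the Kafka resource spec
spec:
  kafka:
    listeners:
      - name: external
        port: 9094
        type: route
        tls: true
        authentication:
          type: oauth
          # Placeholder endpoints on the authorization server
          validIssuerUri: https://auth-server.example.com/realms/my-realm
          jwksEndpointUri: https://auth-server.example.com/realms/my-realm/protocol/openid-connect/certs
          userNameClaim: preferred_username   # claim used as the Kafka principal name
```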
6.3. Authorization
Kafka clusters use authorization to control the operations that are permitted on Kafka brokers by specific clients or users. If applied to a Kafka cluster, authorization is enabled for all listeners used for client connection.
If a user is added to a list of super users in a Kafka broker configuration, the user is allowed unlimited access to the cluster regardless of any authorization constraints implemented through authorization mechanisms.
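For example, a hedged sketch that enables simple authorization for the cluster and declares a super user; the principal name is a placeholder, and for a TLS-authenticated user it takes the form CN=<user-name>:

```yaml
# Fragment of the Kafka resource spec
spec:
  kafka:
    authorization:
      type: simple
      superUsers:
        - CN=my-admin-user   # placeholder; unrestricted access regardless of ACLs
```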
Supported authorization mechanisms:
- Simple authorization
- OAuth 2.0 authorization (if you are using OAuth 2.0 token-based authentication)
- Open Policy Agent (OPA) authorization
- Custom authorization
Simple authorization uses AclAuthorizer, the default Kafka authorization plugin. AclAuthorizer uses Access Control Lists (ACLs) to define which users have access to which resources. For custom authorization, you configure your own Authorizer plugin to enforce ACL rules.
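With simple authorization, ACL rules for a user can be declared through the User Operator in the KafkaUser resource. A sketch, assuming a topic named my-topic; the exact ACL fields (for example a single operation versus an operations list) vary between versions:

```yaml
apiVersion: kafka.strimzi.io/v1beta2   # may differ by AMQ Streams version
kind: KafkaUser
metadata:
  name: my-user                        # placeholder user name
  labels:
    strimzi.io/cluster: my-cluster     # placeholder cluster name
spec:
  authentication:
    type: tls
  authorization:
    type: simple
    acls:
      # Allow the user to read and describe the placeholder topic my-topic
      - resource:
          type: topic
          name: my-topic
          patternType: literal
        operation: Read
      - resource:
          type: topic
          name: my-topic
          patternType: literal
        operation: Describe
```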
OAuth 2.0 and OPA provide policy-based control from an authorization server. Security policies and permissions used to grant access to resources on Kafka brokers are defined in the authorization server.
URLs are used to connect to the authorization server and check whether an operation requested by a client or user is allowed or denied. Users and clients are matched against the policies created in the authorization server that permit access to perform specific actions on Kafka brokers.
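For instance, a hedged sketch of Open Policy Agent (OPA) authorization, where url points at the policy decision endpoint the brokers query for each request; the server address and policy path are placeholders:

```yaml
# Fragment of the Kafka resource spec
spec:
  kafka:
    authorization:
      type: opa
      # Placeholder URL of the OPA policy used to allow or deny Kafka operations
      url: http://opa.example.com:8181/v1/data/kafka/authz/allow
      superUsers:
        - CN=my-admin-user   # placeholder super user
```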