Respond to events from external systems
Simplified event routing gives Event-Driven Ansible controller the ability to capture and analyze data from remote systems (such as GitHub or GitLab) using event streams. You can attach one or more event streams to an activation by swapping out sources in a rulebook.
Event streams simplify connecting sources to rulebooks: a single endpoint can receive alerts from an event source and feed multiple rulebooks.
Event streams
Event streams provide the secure, authenticated entry point for external systems to send events over the internet directly to Event-Driven Ansible controller, simplifying remote data ingestion.
Event-Driven Ansible controller supports seven different event stream types.
| Type | Description | Vendor examples |
|---|---|---|
| Hashed Message Authentication Code (HMAC) | Uses a shared secret between Event-Driven Ansible controller and the vendor's webhook server. This guarantees message integrity. | GitHub |
| Basic Authentication | Uses HTTP basic authentication. | Datadog, Dynatrace |
| Token Authentication | Validates incoming event data using a security token passed in the request header. The standard HTTP header is Authorization, but it can be customized for specific platforms, such as X-Gitlab-Token for GitLab integrations. | GitLab, ServiceNow |
| OAuth2 | Uses Machine-to-Machine (M2M) mode with the client credentials grant type. The token is opaque. | Dynatrace |
| OAuth2 with JWT | Uses M2M mode with the client credentials grant type. The token is a JSON Web Token (JWT). | Datadog |
| Elliptic Curve Digital Signature Algorithm (ECDSA) | Verifies message authenticity using a public/private key pair. The sender signs the message with a private key, and the receiver (Event-Driven Ansible controller) validates it with a public key. | SendGrid, Twilio |
| Mutual Transport Layer Security (mTLS) | Ensures two-way authentication between Event-Driven Ansible controller and the client sending events through an event stream. Requires the vendor's CA certificate to be present on the server at startup. This supports non-repudiation. | PagerDuty |
If you are using an mTLS event stream with a load balancer, you must enable SSL pass-through (or L4 routing) in your load balancer configuration.
This is required because the SSL termination and client certificate validation for mTLS must occur at the platform gateway proxy server. Consult your load balancer documentation for details on enabling SSL pass-through.
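The HMAC flow described in the table can be sketched in Python. This is a minimal illustration of the shared-secret verification idea, not the controller's actual implementation; the X-Hub-Signature-256 header name and the sha256= prefix follow GitHub's webhook convention.

```python
import hashlib
import hmac

def verify_hmac_signature(secret: bytes, payload: bytes, signature_header: str) -> bool:
    """Recompute the HMAC over the payload and compare it to the header value.

    GitHub, for example, sends the signature as 'sha256=<hexdigest>'
    in the X-Hub-Signature-256 header.
    """
    expected = "sha256=" + hmac.new(secret, payload, hashlib.sha256).hexdigest()
    # compare_digest performs a constant-time comparison to avoid timing attacks
    return hmac.compare_digest(expected, signature_header)

secret = b"shared-webhook-secret"      # placeholder shared secret
payload = b'{"action": "opened"}'      # placeholder event payload
header = "sha256=" + hmac.new(secret, payload, hashlib.sha256).hexdigest()
print(verify_hmac_signature(secret, payload, header))      # True
print(verify_hmac_signature(secret, b"tampered", header))  # False
```

Because both sides derive the signature from the same secret and payload, any modification of the payload in transit makes verification fail, which is what guarantees message integrity.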
Event-Driven Ansible controller also supports four other specialized event streams that are based on the seven basic event stream types:
- GitLab event stream
- GitHub event stream
- ServiceNow event stream
- Dynatrace event stream
These specialized types limit the parameters you use by adding default values. For example, the GitHub event stream is a specialization of the HMAC event stream with many of the fields already populated. After the GitHub event stream credential has been saved, the recommended defaults for this event stream are displayed.
Create an event stream credential
Create a credential to establish the authentication mechanism (like basic auth or HMAC) required for external systems to securely send events to an event stream.
Before you begin
- Each event stream must have exactly one credential.
Procedure
Results
The Details page is displayed. From there or the Credentials list view, you can edit or delete it.
Create an event stream
Create a dedicated stream endpoint to simplify how external systems send events, making it easier to route data to multiple rulebook activations.
Before you begin
- If you plan to attach your event stream to a rulebook activation, ensure that the activation already has a decision environment and project set up.
- If you plan to connect to automation controller to run your rulebook activation, ensure that you have created a Red Hat Ansible Automation Platform credential type in addition to the decision environment and project. For more information, see Setting up a Red Hat Ansible Automation Platform credential.
Procedure
Results
After creating your event stream, the following outputs occur:
- The Details page is displayed. From there or the Event Streams list view, you can edit or delete it. The Event Streams page also shows all of the event streams you have created, with the following columns for each: Events received, Last event received, and Event stream type. The first two columns are continuously updated as the event stream receives external data, so you can confirm that events are arriving from remote systems.
- If you disabled the event stream, the Details page is displayed with a warning message, This event stream is disabled.
Note: After an event stream is created, the associated credential cannot be deleted until the event stream it is attached to is deleted.
- Your new event stream generates a URL that is necessary when you configure the webhook on the remote system that sends events.
HTTP headers
In the context of Event-Driven Ansible and event streams, HTTP headers play a significant role because they carry the necessary context and security information about the incoming event from a third-party source (for example, GitHub, a monitoring tool, or a proprietary webhook).
They include the following capabilities:
- Authentication and non-repudiation: This is the most critical use. Headers often contain tokens, API keys, or security signatures (like an HMAC in an X-Hub-Signature header) that Event-Driven Ansible uses to verify the sender’s identity and ensure the event payload has not been tampered with. This supports non-repudiation: proof that the event came from a legitimate source.
- Debugging and logging: Headers provide crucial data points (Date, User-Agent, X-Request-ID) for tracing the event’s path, helping system administrators and SREs debug issues related to delayed or failed event processing.
Headers are essential for all HTTP communication, serving several distinct purposes:
- Context and metadata: Describe the data being sent (for example, Content-Type: application/json, Content-Length: 1024).
- Client/server capabilities: Inform the receiving party of the sender’s capabilities or preferences (for example, Accept-Language: en-US).
- Authentication/authorization: Carry security credentials (for example, Authorization: Bearer <token>).
- Caching: Controls how content should be cached by clients and proxies (for example, Cache-Control: max-age=3600).
- Routing and tracking: Facilitate network routing and transaction tracking, often via custom headers (for example, X-Request-ID).
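As an illustration of the authentication role headers play, the following Python sketch checks a security token carried in a request header, in the style of the token-authentication event stream type. The header names come from the table earlier in this document; the token value is a placeholder, and this is not the controller's actual validation code.

```python
import hmac

def validate_token(headers: dict, expected_token: str,
                   header_name: str = "Authorization") -> bool:
    """Token-authentication check against a header value.

    The header defaults to Authorization but can be customized,
    for example X-Gitlab-Token for GitLab integrations.
    """
    received = headers.get(header_name, "")
    # Strip an optional 'Bearer ' prefix used with the Authorization header
    if received.startswith("Bearer "):
        received = received[len("Bearer "):]
    # Constant-time comparison, as with HMAC verification
    return hmac.compare_digest(received, expected_token)

print(validate_token({"Authorization": "Bearer s3cret"}, "s3cret"))              # True
print(validate_token({"X-Gitlab-Token": "s3cret"}, "s3cret", "X-Gitlab-Token"))  # True
print(validate_token({}, "s3cret"))                                              # False
```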
Configuring HTTP headers securely for event streams
To enhance event stream security, you must explicitly define which HTTP headers are passed. These headers carry the critical context and authentication data required for processing.
Procedure
Static Universally Unique Identifiers (UUIDs) for event streams
You can configure an event stream with a static Universally Unique Identifier (UUID) to ensure its webhook URL remains consistent, even if the event stream service is recreated.
This feature is relevant for disaster recovery scenarios where external systems, like firewalls or third-party webhooks, cannot be easily reconfigured to use a new URL. Here are key concepts when considering using static UUIDs:
- Disaster recovery support: A static UUID ensures that the external webhook URL, which follows the format https://your-eda-server.com/api/eda/v1/external_event_stream/{uuid}/, does not change upon service recreation.
- Uniqueness: The UUID you provide must be unique across all existing event streams in the system.
- Security warning: Static UUIDs make your webhook URLs predictable and, therefore, could reduce security. Only use this feature when URL consistency is critical and you have implemented strong additional security measures, such as robust credential types (HMAC, mTLS) and network restrictions. For normal operations, always use autogenerated (dynamic) UUIDs.
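The URL format described above can be sketched in Python. The server hostname is a placeholder, and the uniqueness check against existing streams is only noted in a comment, since it must be performed against the system's actual event stream list.

```python
import uuid

def event_stream_url(server: str, stream_uuid: str) -> str:
    """Build the external webhook URL for an event stream.

    Raises ValueError if stream_uuid is not a well-formed UUID.
    Uniqueness across all existing event streams must still be
    checked against the system itself.
    """
    parsed = uuid.UUID(stream_uuid)  # validates the UUID format
    return f"https://{server}/api/eda/v1/external_event_stream/{parsed}/"

# With a static UUID, this URL stays stable across service recreations
print(event_stream_url("your-eda-server.com",
                       "123e4567-e89b-12d3-a456-426614174000"))
```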
Update an event stream with a static UUID (API method)
Access the API to set a static UUID, a feature critical for maintaining webhook URL consistency across service recreations, such as in disaster recovery scenarios.
Before you begin
- Ansible Automation Platform 2.6-next
Procedure
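A PATCH request of the kind this procedure describes could be constructed as follows. This is a hedged sketch only: the endpoint path (/api/eda/v1/event-streams/{id}/), the request field name (uuid), and the bearer-token authentication are assumptions for illustration; consult the Event-Driven Ansible controller API reference for the exact endpoint and payload.

```python
import json
import urllib.request

def build_update_request(server: str, stream_id: int,
                         static_uuid: str, token: str) -> urllib.request.Request:
    """Construct (without sending) a PATCH request that sets a static UUID.

    The endpoint path and the 'uuid' field name are assumptions for this
    sketch; check your API documentation before use.
    """
    url = f"https://{server}/api/eda/v1/event-streams/{stream_id}/"
    body = json.dumps({"uuid": static_uuid}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="PATCH",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # placeholder token
        },
    )

req = build_update_request("your-eda-server.com", 42,
                           "123e4567-e89b-12d3-a456-426614174000",
                           "example-token")
print(req.get_method())  # PATCH
```

To actually send the request you would pass it to urllib.request.urlopen(req) (or use an HTTP client of your choice) against a reachable controller.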
Results
- Confirm that the UUID of your event stream has been updated to the new static string.
Configure your remote system to send events
After you have created your event stream, you must configure your remote system to send events to Event-Driven Ansible controller. The method used for this configuration varies, depending on the vendor for the event stream credential type you select.
Before you begin
- The URL that was generated when you created your event stream
- Secrets or passwords that you set up in your event stream credential
About this task
The following example demonstrates how to configure webhooks in a remote system, such as GitHub, to send events to Event-Driven Ansible controller. Each vendor has its own method for configuring a remote system to send events to Event-Driven Ansible controller.
Procedure
Results
After the webhook has been added, it attempts to send a test payload to ensure there is connectivity between the two systems (GitHub and Event-Driven Ansible controller). If it can successfully send the data, you will see a green check mark next to the Webhook URL with the message, Last delivery was successful.
Verify your event streams work
Confirm end-to-end event flow by verifying the event stream receives data from the remote system, validating the webhook URL and authentication setup.
Procedure
Results
This moves the event stream to production mode and makes it easy to attach to rulebook activations. When this option is toggled off, your ability to forward events to a rulebook activation is disabled and the This event stream is disabled message is displayed.
Replace sources and attach event streams to activations
Replace complex source mappings with pre-configured event streams to simplify rulebook activation design and centralize incoming event routing.
About this task
There are several key points to keep in mind regarding source mapping:
- An event stream can only be used once in a rulebook source swap. If you have multiple sources in the rulebook, you can only replace each source once.
- The source mapping happens only in the current rulebook activation. You must repeat this process for any other activations using the same rulebook.
- The source mapping is valid only if the rulebook is not modified. If the rulebook is modified during the source mapping process, the mapping fails and must be repeated.
- If the rulebook is modified after the source mapping has been created and a restart occurs, the rulebook activation fails.
Procedure
Results
After you create your rulebook activation, the Details page is displayed. You can navigate to the Event streams page to confirm your events have been received.