Configure logging components
Automation controller provides several loggers that can be configured to deliver structured log data for analysis and monitoring.
The following special loggers (except for awx, which provides generic server logs) deliver large amounts of information in a predictable structured or semi-structured format, using the same structure you would get when fetching the data from the API:
- job_events: Provides data returned from the Ansible callback module.
- job_lifecycle: Provides data about the lifecycle of jobs, including when they are created, started, and finished.
- broadcast_websocket: Provides data about broadcast websocket messages sent to the clients.
- activity_stream: Displays the record of changes to the objects within the application.
- system_tracking: Provides fact data gathered by the Ansible setup module (that is, gather_facts: true) when job templates are run with Enable Fact Cache selected.
- awx: Provides generic server logs, which include logs that would normally be written to a file. It contains the standard metadata that all logs have, except it only has the message from the log statement.
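A downstream log consumer might route incoming records by their logger name. The following is a minimal sketch under that assumption; the router function and the record values are hypothetical, not part of the controller itself:

```python
# Hypothetical router that buckets structured log records by logger name.
SPECIAL_LOGGERS = {
    "job_events", "job_lifecycle", "broadcast_websocket",
    "activity_stream", "system_tracking", "awx",
}

def route(record: dict) -> str:
    """Return the destination bucket for a structured log record,
    or "unknown" if the logger name is not one of the special loggers."""
    name = record.get("logger_name", "")
    return name if name in SPECIAL_LOGGERS else "unknown"

print(route({"logger_name": "job_events"}))  # job_events
print(route({"logger_name": "django"}))      # unknown
```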
These loggers log only at the INFO level, except for the awx logger, which can log at any level.
Additionally, the standard automation controller logs can be delivered through this same mechanism. You can enable or disable each of these five sources of data without manipulating a complex dictionary in your local settings file, and you can adjust the log level consumed from the standard automation controller logs.
From the navigation panel, select to configure the logging components in automation controller.
Log message schema
This section describes the common schema for log messages generated by various automation controller components. Understanding the log message schema can help in effectively monitoring and troubleshooting the system.
Common schema for all loggers:
- cluster_host_id: Unique identifier of the host within the automation controller cluster.
- level: Standard Python log level, roughly reflecting the significance of the event. All of the data loggers use the INFO level, but the other automation controller logs use different levels as appropriate.
- logger_name: Name of the logger used in the settings, for example, "activity_stream".
- @timestamp: Time of the log.
- path: File path in the code where the log was generated.
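The common fields above can be illustrated with a minimal sketch. All field values here are invented for illustration and do not come from a real controller:

```python
# A hypothetical structured log record carrying the common schema fields.
record = {
    "cluster_host_id": "controller-node-1",
    "level": "INFO",
    "logger_name": "activity_stream",
    "@timestamp": "2024-01-01T00:00:00.000Z",
    "path": "awx/main/signals.py",
}

# Every logger described in this section is expected to carry these fields.
COMMON_FIELDS = {"cluster_host_id", "level", "logger_name", "@timestamp", "path"}

def has_common_fields(msg: dict) -> bool:
    """Return True if the log message contains all common schema fields."""
    return COMMON_FIELDS <= msg.keys()

print(has_common_fields(record))  # True
```

A consumer could use a check like this to validate records before indexing them.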
Activity stream schema
Automation controller includes an activity_stream logger that records changes to objects in the system, such as job templates, inventories, and credentials.
This uses the fields common to all loggers listed in Log message schema.
It has the following additional fields:
- actor: Username of the user who took the action documented in the log.
- changes: JSON summary of what fields changed, and their old or new values.
- operation: The category of the changes logged in the activity stream, for example, "associate".
- object1: Information about the object being operated on, consistent with what is shown in the activity stream.
- object2: If applicable, the second object involved in the action.
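A hypothetical activity stream record combining the common fields with the fields above might look like the following sketch; all values are invented for illustration:

```python
# Hypothetical activity_stream log message; values are invented.
activity_event = {
    # Common schema fields
    "cluster_host_id": "controller-node-1",
    "level": "INFO",
    "logger_name": "activity_stream",
    "@timestamp": "2024-01-01T00:00:00.000Z",
    "path": "awx/main/signals.py",
    # Activity-stream-specific fields
    "actor": "admin",
    "operation": "update",
    "object1": "job_template",
    "object2": "",
    "changes": {"name": ["Old Name", "New Name"]},
}

def summarize(msg: dict) -> str:
    """Build a one-line human-readable summary of an activity stream event."""
    return f'{msg["actor"]} performed "{msg["operation"]}" on {msg["object1"]}'

print(summarize(activity_event))  # admin performed "update" on job_template
```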
Job event schema

The job_events logger reflects the data being saved into job events, except when it would otherwise conflict with expected standard fields from the logger, in which case the fields are nested. For example, the field host on the job_event model is given as event_host. There is also a sub-dictionary field, event_data, within the payload, which contains different fields depending on the specifics of the Ansible event.
This logger also includes the common fields in Log message schema.
Scan / fact / system tracking data schema
This section describes the schema for log messages produced by the scan/fact/system tracking logger in automation controller.
These contain detailed dictionary-type fields that are either services, packages, or files.
- services: For services scans, this field is included and has keys based on the name of the service. Note: Periods are not allowed in names by Elasticsearch, and are replaced with "_" by the log formatter.
- package: Included for log messages from package scans.
- files: Included for log messages from file scans.
- host: Name of the host the scan applies to.
- inventory_id: The id of the inventory that the host is inside of.
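The period replacement described in the note for the services field can be sketched as follows. The function name is hypothetical; the real substitution happens inside the controller's log formatter:

```python
def sanitize_service_keys(services: dict) -> dict:
    """Replace periods in service-name keys with underscores,
    mirroring the behavior described for the log formatter
    (Elasticsearch does not allow periods in field names)."""
    return {name.replace(".", "_"): state for name, state in services.items()}

# Hypothetical services-scan fragment.
scan = {"sshd.service": "running", "crond.service": "running"}
print(sanitize_service_keys(scan))
# {'sshd_service': 'running', 'crond_service': 'running'}
```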
This logger also includes the common fields in Log message schema.
Job status changes
The job status changes logger captures changes in the status of jobs as they occur.
This is a lower-volume source of information about changes in job states compared to job events, and it also captures changes to types of unified jobs other than job template-based jobs.
This logger also includes the common fields in Log message schema and fields present on the job model.
Automation controller logs
Automation controller uses the standard Python logging module to log messages from various parts of the system.
This logger also includes the common fields in Log message schema.
In addition, this contains a msg field with the log message. Errors contain a separate traceback field.

From the navigation panel, select . On the Logging Settings page, click and use the ENABLE EXTERNAL LOGGING option to enable or disable the logging components.
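As a sketch of how a downstream consumer might handle these records, the following hypothetical helper extracts the message and any traceback from an awx log record. The helper name and field values are assumptions for illustration; only the msg and traceback fields come from the schema described above:

```python
import json

def describe_awx_record(raw: str) -> str:
    """Return the log message from a JSON-encoded awx record,
    appending the traceback when one is present."""
    msg = json.loads(raw)
    text = msg.get("msg", "")
    if "traceback" in msg:
        text += "\n" + msg["traceback"]
    return text

# Hypothetical records, one normal and one carrying an error traceback.
ok = json.dumps({"logger_name": "awx", "msg": "job finished"})
err = json.dumps({"logger_name": "awx", "msg": "job failed",
                  "traceback": "Traceback (most recent call last): ..."})
print(describe_awx_record(ok))
print(describe_awx_record(err))
```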