Chapter 2. Metrics file locations
Reporting metrics to Red Hat is a requirement. Logging metrics for your automation jobs is automatically enabled when you install Ansible SDK. You cannot disable it.
Every time an automation job runs, a new tarball is created. You are responsible for scraping the data from the storage location and for monitoring the size of the directory.
You can customize the metrics storage location for each Python file that runs a playbook, or you can use the default location.
2.1. Default location for metrics files
When you install Ansible SDK, the default metrics storage location is set to the ~/.ansible/metrics directory.
After an automation job is complete, the metrics are written to a tarball in the directory. Ansible SDK creates the directory if it does not already exist.
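As an illustration, the default path expands to a directory under the current user's home. The following sketch (plain Python, not SDK code) shows where the tarballs land:

```python
from pathlib import Path

# Illustrative only: the default metrics directory used by Ansible SDK.
# The SDK creates this directory itself if it does not already exist.
default_metrics_dir = Path("~/.ansible/metrics").expanduser()
print(default_metrics_dir)
```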
2.2. Customizing the metrics storage location
You can specify the path to the directory to store your metrics files in the Python file that runs your playbook.
You can set a different directory path for every Python automation job file, or you can store the tarballs for multiple jobs in one directory. If you do not set the path in a Python file, the tarballs for the jobs that it runs will be saved in the default directory (~/.ansible/metrics).
Procedure
- Decide on a location on your file system to store the metrics data. Ensure that the location is readable and writable. Ansible SDK creates the directory if it does not already exist.
- In the job_options in the main() function of your Python file, set the metrics_output_path parameter to the directory where the tarballs are to be stored. In the following example, the metrics files are stored in the /tmp/metrics directory after the pb.yml playbook has been executed.
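The code sample for this step did not survive extraction from the original page. The sketch below shows the general shape of such a Python file, assuming the AnsibleJobDef job definition and subprocess executor described in the Ansible SDK documentation; treat the class names and signatures as assumptions, with only the metrics_output_path parameter confirmed by this section:

```python
import asyncio

# Assumed imports: these names follow the Ansible SDK documentation
# and may differ in your SDK version.
from ansible_sdk import AnsibleJobDef
from ansible_sdk.executors import AnsibleSubprocessJobExecutor


async def main():
    executor = AnsibleSubprocessJobExecutor()
    # metrics_output_path directs this job's metrics tarball to
    # /tmp/metrics instead of the default ~/.ansible/metrics directory.
    jobdef = AnsibleJobDef(
        data_dir="datadir",
        playbook="pb.yml",
        metrics_output_path="/tmp/metrics",
    )
    job_status = await executor.submit_job(jobdef)
    async for event in job_status.events:
        pass  # consume job events as the playbook runs


if __name__ == "__main__":
    asyncio.run(main())
```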
2.3. Viewing metrics files
After an automation job has completed, navigate to the directory that you specified for storing the data and list the files.
The data for the newly completed job is contained in a tarball whose name begins with the date and time that the automation job was run. For example, the following file records data for an automation job executed on 8 March 2023 at 02:30 AM.
$ ls
2023_03_08_02_30_24__aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa_job_data.tar.gz
To extract the files from the tarball, run tar xvf.
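You can also list and extract the tarball from Python with the standard library's tarfile module. The sketch below builds a stand-in tarball first so it is self-contained; the file name and jobs.csv member mirror the SDK's output, but the contents here are illustrative:

```python
import io
import tarfile

# Build a stand-in tarball so the sketch is self-contained; the name
# follows the SDK's <timestamp>__<job-id>_job_data.tar.gz pattern.
name = "2023_03_08_02_30_24__aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa_job_data.tar.gz"
payload = b"job_id,job_type\n"
with tarfile.open(name, "w:gz") as tar:
    info = tarfile.TarInfo("jobs.csv")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

# Python analogue of extracting with `tar xvf`: list the members and
# read jobs.csv out of the archive.
with tarfile.open(name) as tar:
    members = [m.name for m in tar.getmembers()]
    jobs_csv = tar.extractfile("jobs.csv").read().decode()
print(members)
```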
The following example shows the jobs.csv file.
$ cat jobs.csv
job_id,job_type,started,finished,job_state,hosts_ok,hosts_changed,hosts_skipped,hosts_failed,hosts_unreachable,task_count,task_duration
84896567-a586-4215-a914-7503010ef281,local,2023-03-08 02:30:22.440045,2023-03-08 02:30:24.316458,,5,0,0,0,0,2,0:00:01.876413
When a parameter value is not available, the corresponding entry in the CSV file is empty. In the jobs.csv file above, the job_state value is not available.
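For instance, reading that row with Python's csv module returns the missing job_state as an empty string. The data below is copied from the example jobs.csv above:

```python
import csv
import io

# The two lines of the example jobs.csv file shown above.
data = (
    "job_id,job_type,started,finished,job_state,"
    "hosts_ok,hosts_changed,hosts_skipped,hosts_failed,hosts_unreachable,"
    "task_count,task_duration\n"
    "84896567-a586-4215-a914-7503010ef281,local,"
    "2023-03-08 02:30:22.440045,2023-03-08 02:30:24.316458,"
    ",5,0,0,0,0,2,0:00:01.876413\n"
)
row = next(csv.DictReader(io.StringIO(data)))
print(repr(row["job_state"]))  # empty string: the value was not available
print(row["hosts_ok"])
```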