Chapter 2. Initializing InstructLab
You must initialize the InstructLab environment to begin working with the Red Hat Enterprise Linux AI models.
2.1. Creating your RHEL AI environment
You can start interacting with LLM models and the RHEL AI tooling by initializing the InstructLab environment.
Prerequisites
- You installed RHEL AI with the bootable container image.
- You have root user access on your machine.
Procedure
1. Optional: To set up training profiles, you need to know which GPU accelerators are in your machine. You can view your system information by running the following command:

       $ ilab system info

2. Initialize InstructLab by running the following command:

       $ ilab config init

   The CLI prompts you to set up your config.yaml file.

   Example output:

       Welcome to InstructLab CLI. This guide will help you to setup your environment.
       Please provide the following values to initiate the environment [press Enter for defaults]:
       Generating `/home/<example-user>/.config/instructlab/config.yaml` and `/home/<example-user>/.local/share/instructlab/internal/train_configuration/profiles`...

3. Follow the CLI prompts to set up your training hardware configuration. This updates your config.yaml file and adds the proper train configurations for training an LLM model. Type the number of the YAML file that matches your hardware specifications.

   Important: These profiles only add the necessary configurations to the train section of your config.yaml file, so any profile can be selected for inference serving a model.

   Example output of selecting training profiles:

       Please choose a train profile to use:
       [0] No profile (CPU-only)
       [1] A100_H100_x2.yaml
       [2] A100_H100_x4.yaml
       [3] A100_H100_x8.yaml
       [4] L40_x4.yaml
       [5] L40_x8.yaml
       [6] L4_x8.yaml
       Enter the number of your choice [hit enter for the default CPU-only profile] [0]:

   Example output of a completed ilab config init run:

       You selected: A100_H100_x8.yaml
       Initialization completed successfully, you're ready to start using `ilab`. Enjoy!

4. Configure your system's GPU for inference serving. This step is only required if you are using Red Hat Enterprise Linux AI exclusively for inference serving.
   a. Edit your config.yaml file by running the following command:

          $ ilab config edit

   b. In the evaluate section of the configuration file, edit the gpus: parameter and add the number of accelerators on your machine:

          evaluate:
            base_branch: null
            base_model: ~/.cache/instructlab/models/granite-7b-starter
            branch: null
            gpus: <num-gpus>

   c. In the vllm section of the serve field in the configuration file, edit the gpus: and vllm_args: ["--tensor-parallel-size"] parameters and add the number of accelerators on your machine:

          serve:
            backend: vllm
            chat_template: auto
            host_port: 127.0.0.1:8000
            llama_cpp:
              gpu_layers: -1
              llm_family: ''
              max_ctx_size: 4096
            model_path: ~/.cache/instructlab/models/granite-7b-redhat-lab
            vllm:
              llm_family: ''
              vllm_args: ["--tensor-parallel-size", "<num-gpus>"]
              gpus: <num-gpus>
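The accelerator count must agree across the evaluate gpus: value, the serve vllm gpus: value, and the --tensor-parallel-size argument. The following is a minimal consistency-check sketch, run here against an illustrative config fragment in a temporary file (the temporary file and the awk/sed patterns are assumptions for illustration, not part of the ilab tooling):

```shell
# Sketch: verify that every gpus: value and --tensor-parallel-size agree.
# The fragment below stands in for ~/.config/instructlab/config.yaml.
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
evaluate:
  gpus: 8
serve:
  vllm:
    vllm_args: ["--tensor-parallel-size", "8"]
    gpus: 8
EOF

# Collect all distinct gpus: values and the tensor-parallel size.
gpus=$(awk '$1 == "gpus:" {print $2}' "$CFG" | sort -u)
tp=$(sed -n 's/.*--tensor-parallel-size", "\([0-9]*\)".*/\1/p' "$CFG")

if [ "$(printf '%s\n' "$gpus" | wc -l)" -eq 1 ] && [ "$gpus" = "$tp" ]; then
  echo "GPU settings consistent: $gpus"
else
  echo "GPU settings disagree: gpus=[$gpus] tensor-parallel-size=$tp"
fi
rm -f "$CFG"
```

On a real system, point CFG at your actual config.yaml and drop the here-document.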
Optional: If you want to use the skeleton taxonomy tree, which includes two skills and one knowledge qna.yaml file, you can clone the skeleton repository and place it in the taxonomy directory by running the following command:

    $ rm -rf ~/.local/share/instructlab/taxonomy/ ; git clone https://github.com/RedHatOfficial/rhelai-sample-taxonomy.git ~/.local/share/instructlab/taxonomy/
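After cloning, you can confirm the tree contains qna.yaml files. Below is a self-contained sketch that uses a mock directory as a stand-in for ~/.local/share/instructlab/taxonomy (the subdirectory names are illustrative assumptions; the real skeleton repository defines its own layout):

```shell
# Sketch: count qna.yaml files in a taxonomy tree.
# A mock tree stands in here so the example is self-contained.
TAXONOMY=$(mktemp -d)
mkdir -p "$TAXONOMY/compositional_skills/example_skill_a" \
         "$TAXONOMY/compositional_skills/example_skill_b" \
         "$TAXONOMY/knowledge/example_topic"
touch "$TAXONOMY/compositional_skills/example_skill_a/qna.yaml" \
      "$TAXONOMY/compositional_skills/example_skill_b/qna.yaml" \
      "$TAXONOMY/knowledge/example_topic/qna.yaml"

# The skeleton repository ships two skill and one knowledge qna.yaml file,
# so a count of 3 is expected after a successful clone.
count=$(find "$TAXONOMY" -name qna.yaml | wc -l)
echo "qna.yaml files found: $count"
rm -rf "$TAXONOMY"
```

On a real system, set TAXONOMY to ~/.local/share/instructlab/taxonomy instead of creating the mock tree.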
Directory structure of the InstructLab environment:

    ├─ ~/.cache/instructlab/models/ (1)
    ├─ ~/.local/share/instructlab/datasets (2)
    ├─ ~/.local/share/instructlab/taxonomy (3)
    ├─ ~/.local/share/instructlab/phased/<phase1-or-phase2>/checkpoints/ (4)

(1) ~/.cache/instructlab/models/: Contains all downloaded large language models, including the saved output of the ones you generate with RHEL AI.
(2) ~/.local/share/instructlab/datasets/: Contains data output from the SDG phase, built on modifications to the taxonomy repository.
(3) ~/.local/share/instructlab/taxonomy/: Contains the skill and knowledge data.
(4) ~/.local/share/instructlab/phased/<phase1-or-phase2>/checkpoints/: Contains the output of the multi-phase training process.
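To see which of these locations already exist on a given host, a quick loop over the four base paths works. This is a sketch; the paths are the defaults listed above and may differ on a customized install:

```shell
# Sketch: report which InstructLab data directories exist on this host.
for d in "$HOME/.cache/instructlab/models" \
         "$HOME/.local/share/instructlab/datasets" \
         "$HOME/.local/share/instructlab/taxonomy" \
         "$HOME/.local/share/instructlab/phased"; do
  if [ -d "$d" ]; then
    echo "present: $d"
  else
    echo "missing: $d"
  fi
done
```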
Verification
You can view the full config.yaml file by running the following command:

    $ ilab config show

You can also manually edit the config.yaml file by running the following command:

    $ ilab config edit