Chapter 4. Tool choice options reference
You can configure how the language model decides when to call a tool by using the `tool_choice` parameter.
Red Hat AI Inference Server supports multiple tool choice modes that give you different levels of control over tool calling behavior. You set the `tool_choice` value in the request that makes the available tools visible to the model, for example:
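A minimal sketch of such a request body follows. The model name, tool name, and schema are illustrative assumptions, not values from this document; the payload shape follows the OpenAI-compatible chat completions API that the server exposes.

```python
import json

# Hypothetical tool definition; the function name and parameter schema
# are illustrative assumptions.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

# Request body for POST /v1/chat/completions.
# tool_choice="auto" lets the model decide whether to call the tool.
payload = {
    "model": "my-model",  # placeholder model name
    "messages": [
        {"role": "user", "content": "What is the weather in Boston?"}
    ],
    "tools": tools,
    "tool_choice": "auto",
}

print(json.dumps(payload, indent=2))
```

The same payload shape is used for all modes; only the `tool_choice` value changes.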
Select the appropriate tool choice mode based on your application requirements:
| Mode | Behavior | Use case |
|---|---|---|
| `auto` | The model decides whether to call a tool based on the conversation context and available tools. | General-purpose tool calling where the model determines when tools are needed. |
| `required` | The model must call at least one of the available tools and cannot respond without making a tool call. | Scenarios where a tool call is mandatory for the task, such as data retrieval or structured output generation, for example, database queries or API interactions. |
| `none` | The model does not call any tools and provides a text-based response only. | Disabling tool calling for specific requests while keeping tool definitions available for context. |
| Named tool | The model must call the specific named tool. | Forcing the model to use a particular tool, useful for structured data extraction or specific workflows. |
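To force a specific named tool, `tool_choice` is an object that names the function rather than a string mode. A minimal sketch, where `get_weather` is a hypothetical tool name used for illustration:

```python
import json

# Forcing a specific named tool: tool_choice is an object, not a string.
# "get_weather" is a hypothetical tool name used for illustration.
tool_choice = {"type": "function", "function": {"name": "get_weather"}}

payload = {
    "model": "my-model",  # placeholder model name
    "messages": [{"role": "user", "content": "Weather in Boston?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": tool_choice,
}

print(json.dumps(payload, indent=2))
```

With this setting the model always responds with a call to the named tool, so the named function must appear in the `tools` list of the same request.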
When using `tool_choice="required"`, ensure that at least one of the available tools is appropriate for the request. If no suitable tool exists, the model might call an inappropriate tool or generate errors.