From e3b6bfc79d7a95b39f27ff23d97b7036c7e97bd0 Mon Sep 17 00:00:00 2001
From: Sangeetha Hariharan
Date: Tue, 14 May 2024 10:53:33 -0700
Subject: [PATCH] Chore: Add documentation for new tool parameters

---
 docs/docs/07-gpt-file-reference.md | 27 +++++++++++++++------------
 1 file changed, 15 insertions(+), 12 deletions(-)

diff --git a/docs/docs/07-gpt-file-reference.md b/docs/docs/07-gpt-file-reference.md
index 2f8fcb8b..c6207ad2 100644
--- a/docs/docs/07-gpt-file-reference.md
+++ b/docs/docs/07-gpt-file-reference.md
@@ -43,18 +43,21 @@ Tool instructions go here.
 
 Tool parameters are key-value pairs defined at the beginning of a tool block, before any instructional text. They are specified in the format `key: value`. The parser recognizes the following keys (case-insensitive and spaces are ignored):
 
-| Key               | Description |
-|-------------------|-------------|
-| `Name`            | The name of the tool. |
-| `Model Name`      | The OpenAI model to use, by default it uses "gpt-4-turbo" |
-| `Description`     | The description of the tool. It is important that this properly describes the tool's purpose as the description is used by the LLM. |
-| `Internal Prompt` | Setting this to `false` will disable the built-in system prompt for this tool. |
-| `Tools`           | A comma-separated list of tools that are available to be called by this tool. |
-| `Credentials`     | A comma-separated list of credential tools to run before the main tool. |
-| `Args`            | Arguments for the tool. Each argument is defined in the format `arg-name: description`. |
-| `Max Tokens`      | Set to a number if you wish to limit the maximum number of tokens that can be generated by the LLM. |
-| `JSON Response`   | Setting to `true` will cause the LLM to respond in a JSON format. If you set true you must also include instructions in the tool. |
-| `Temperature`     | A floating-point number representing the temperature parameter. By default, the temperature is 0. Set to a higher number for more creativity. |
+| Key                 | Description |
+|---------------------|-------------|
+| `Name`              | The name of the tool. |
+| `Model Name`        | The LLM model to use; defaults to "gpt-4-turbo". |
+| `Global Model Name` | The LLM model to use for all tools. |
+| `Description`       | The description of the tool. It is important that this properly describes the tool's purpose as the description is used by the LLM. |
+| `Internal Prompt`   | Setting this to `false` will disable the built-in system prompt for this tool. |
+| `Tools`             | A comma-separated list of tools that are available to be called by this tool. |
+| `Global Tools`      | A comma-separated list of tools that are available to be called by all tools. |
+| `Credentials`       | A comma-separated list of credential tools to run before the main tool. |
+| `Args`              | Arguments for the tool. Each argument is defined in the format `arg-name: description`. |
+| `Max Tokens`        | Set to a number if you wish to limit the maximum number of tokens that can be generated by the LLM. |
+| `JSON Response`     | Setting this to `true` will cause the LLM to respond in JSON format. If you set this to `true`, you must also include formatting instructions in the tool. |
+| `Temperature`       | A floating-point number representing the temperature parameter. By default, the temperature is 0. Set to a higher number for more creativity. |
+| `Chat`              | Setting this to `true` will enable an interactive chat session for the tool. |
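
Taken together, a minimal `.gpt` file using several of the parameters documented above might look like the sketch below. The tool names, argument, and instruction text are hypothetical illustrations, not part of this patch; only the parameter keys come from the table:

```
chat: true
tools: summarize
model name: gpt-4-turbo
temperature: 0.2

Ask the user for some text, then call summarize on it and show the result.

---
name: summarize
description: Produces a one-paragraph summary of the given text.
args: text: The text to summarize
max tokens: 500

Write a one-paragraph summary of the following text: ${text}
```

Each tool block lists its parameters first; everything after the first blank line is the tool's instruction text. Because keys are case-insensitive and spaces are ignored, `model name`, `Model Name`, and `modelname` are all accepted.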