Configuration of a Generative AI chat control
Use the Generative AI chat control to send requests to an AI provider and retrieve information. You can submit a prompt to a Generative AI provider or perform a search using an AI knowledge base provider. You configure these AI providers within the Integration menu.
You can include case data for analysis when using Generative AI or an AI knowledge base. This lets you use case data to gather the information you need through natural language queries, while keeping the actual case data hidden from view.
Configuration to search using Generative AI
Configure the following settings to perform a search using Generative AI.
| Setting | Description |
| --- | --- |
| Header label | Enter the header label to be displayed at runtime. (Default: Generative AI Chat) |
| Search type | To execute the search based on Generative AI, select Generative AI. |
| Provider | Select the Generative AI provider to use, such as ChatGPT, OpenAI, or AI Agent. (Default: Tungsten) The Provider list includes the AI providers configured in the Integration menu. See Integrate Generative AI with TotalAgility. If a Generative AI provider is not already configured, a warning message appears when you add the Generative AI chat control to a form. When you select an AI Agent provider, lists of additionally added input and output variables appear. You can set a static value or select a dynamic variable (form control, form variable, global variable, or Copilot insights) for the input and output variables. |
| Source | Select one of the following options. |
| Persist conversation | If selected, allows the context to persist for the duration of the conversation. (Default: Clear) Persisting the conversation increases token usage, which may exceed the token limit. When the prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit (see the sketch after this table). |
| Use response profile | If selected, lets you add a response profile that is applied at runtime, for example when you want responses to follow a specific word count or style. (Default: Clear) |
| Response profile | Provide an inline value, such as "Summarize the response in 20 words or less.", or select a global variable, form variable, or form control. |
| Use seed | If selected, lets you specify a seed value. (Default: Clear) The seed value increases the likelihood of obtaining consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Seed | Provide an inline value or select a numeric variable. (Default: 0, Maximum: 32,767) |
| Include case data | If selected, lets you include case data for analysis when using Generative AI. (Default: Clear) When you include case data, you are prompted to provide descriptions for both the process and its activities to achieve optimal results. The Include case data option is not available when using a Custom LLM provider. At runtime, the case details are included in the information available to search and in the response. |
| Override default message | If selected, enables the Welcome message and Input help text boxes where you can enter a customized welcome message and help text to be displayed in the Chat control at runtime. (Default: Clear) |
| Response format | Select either option for the response format. |
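The following is a minimal sketch, not the TotalAgility implementation, of how the Persist conversation, Response profile, and Use seed settings could map onto a generic OpenAI-style chat request. The names TOKEN_LIMIT, estimate_tokens, trim_history, and build_request are hypothetical, and the token estimate is deliberately crude; the sketch only illustrates that older turns are dropped first when the persisted conversation exceeds the token limit.

```python
# Hypothetical sketch: mapping Persist conversation, Response profile, and Seed
# onto a generic chat request. All names and limits below are assumptions.

TOKEN_LIMIT = 4096                    # assumed provider token limit
RESPONSE_PROFILE = "Summarize the response in 20 words or less."
SEED = 0                              # Seed property value (0 to 32,767)

def estimate_tokens(text: str) -> int:
    # Rough stand-in for a real tokenizer: roughly 4 characters per token.
    return max(1, len(text) // 4)

def trim_history(history: list[dict], new_prompt: str) -> list[dict]:
    """Drop the oldest turns until the persisted conversation fits the token
    budget, mirroring the 'earlier conversation history is removed' behavior."""
    budget = TOKEN_LIMIT - estimate_tokens(new_prompt) - estimate_tokens(RESPONSE_PROFILE)
    trimmed = list(history)
    while trimmed and sum(estimate_tokens(m["content"]) for m in trimmed) > budget:
        trimmed.pop(0)                # remove the earliest message first
    return trimmed

def build_request(history: list[dict], prompt: str) -> dict:
    messages = trim_history(history, prompt)                       # Persist conversation
    messages.append({"role": "user",
                     "content": f"{prompt}\n{RESPONSE_PROFILE}"})  # Response profile
    return {"messages": messages,
            "seed": SEED}             # Use seed: more repeatable, but not guaranteed

# A persisted two-turn conversation followed by a new prompt.
history = [
    {"role": "user", "content": "What is the case status?"},
    {"role": "assistant", "content": "The case is in progress."},
]
print(build_request(history, "Who is the assigned owner?"))
```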
When you submit a prompt in a Generative AI Chat control on a form using Generative AI as the search type, an entry is recorded in the audit log.
When you view the Generative AI chat control at runtime:
- The Copilot chat window appears with a welcome message, "Hello <username>, how can I help you today?". You can modify the welcome message and input help text, and customize the appearance of the chat window, such as changing the header color of the Copilot chat window, setting an icon for Copilot, and more. See Generative AI chat control style.
- When you enter any input, the response is streamed before it is displayed. You can click the Cancel button to cancel the request (see the sketch after this list). If you add an image or text as part of the input, such as uploading an image of a car registration and asking the AI provider for the registration number, the response is returned accordingly.
- To clear the current conversation (including the context if Persist conversation is selected in Search using Generative AI) and start a new conversation, click the New chat icon in the top right corner of the Chat box.
- The prompt within the control is executed based on the search type (Generative AI or AI knowledge base) configured.
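As a rough illustration of the streaming-and-cancel behavior described above, the sketch below is not the TotalAgility runtime; the URL, payload, and cancel_requested flag are hypothetical, and it simply shows a client collecting a streamed response and stopping early when cancellation is requested.

```python
# Hypothetical sketch of streaming a response with user cancellation.
import threading
import requests

cancel_requested = threading.Event()  # set by the UI when Cancel is clicked

def stream_chat(url: str, prompt: str) -> str:
    """Collect the streamed response, discarding the rest if cancelled."""
    collected = []
    with requests.post(url, json={"prompt": prompt}, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
            if cancel_requested.is_set():
                break                     # Cancel clicked: stop reading the stream
            if chunk:
                collected.append(chunk)   # each chunk is shown as it arrives
    return "".join(collected)
```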
Configuration to search using AI knowledge base
Configure the following settings to search both TotalAgility and non-TotalAgility documents with the AI knowledge base in a Generative AI chat control.
Access to the knowledge base is available only to Enterprise tier customers. For users on the Standard and Advanced tiers, this feature is restricted, as indicated by a lock icon.
| Setting | Description |
| --- | --- |
| Header label | Enter the header label to be displayed at runtime. (Default: Generative AI Chat) |
| Search type | To execute the search based on the AI knowledge base, select AI Knowledge Base. |
| AI Knowledge Base | Select the configured AI knowledge base provider. (Default: Tungsten) See Integrate with AI knowledge base. The list of indexes from the selected AI knowledge base integration appears. |
| Index | Select an index. |
| Override max number of matches | If selected, allows you to override the maximum number of documents configured in the AI knowledge base integration. (Default: Clear) |
| Max number of matches | Specify the maximum number of documents to return, overriding the integration setting. (Default: 5) This property is available only if Override max number of matches is selected (see the sketch after this table). |
| Index Fields | The source identifier in the AI knowledge base determines relevant sources, formulates queries to retrieve information, and ranks the retrieved information based on relevance and reliability to ensure a well-informed response. You can add one or more source identifiers. |
| Persist conversation | If selected, allows the context to persist for the duration of the conversation. (Default: Clear) Persisting the conversation increases token usage, which may exceed the token limit. When the prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit. |
| Use response profile | If selected, lets you add a response profile that is applied at runtime, for example when you want responses to follow a specific word count or style. (Default: Clear) |
| Response profile | Provide an inline value, such as "Summarize the response in 20 words or less.", or select a global variable, form variable, or form control. |
| Use seed | If selected, lets you specify a seed value. (Default: Clear) The seed value increases the likelihood of obtaining consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Seed | Provide an inline value or select a numeric variable. (Default: 0, Maximum: 32,767) |
| Include case data | If selected, lets you include case data for analysis when using the AI knowledge base. (Default: Clear) When you include case data, you are prompted to provide descriptions for both the process and its activities to achieve optimal results. The Include case data option is not available when using a Custom LLM provider. At runtime, the case details are included in the information available to search and in the response. |
| Override default message | If selected, overrides the default welcome message and input help text with custom text at runtime. (Default: Clear) Enter the text for the Welcome message and Input help text to be displayed in the Chat window at runtime. For example, if you enter "Good day!" for the Welcome message, the default "Hello <username>, how can I help you today?" text in the chat window is replaced with "Good day!" at runtime. |
| Response format | Select either option for the response format. |
| Events | Add one or more actions for the Document clicked event. Within the action, you must get access to the document ID of the clicked document. For example, add a Same page event to display the document in a Web capture control. Save the event. |
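The following is a minimal sketch, not the TotalAgility API, of how the Index, Max number of matches, and Index Fields settings above could shape a knowledge-base query. The function search_knowledge_base and its parameters are hypothetical; it only illustrates that the override takes precedence over the integration setting and that source identifiers restrict which documents are searched.

```python
# Hypothetical sketch of building a knowledge-base query from the control settings.

DEFAULT_MAX_MATCHES = 5      # assumed integration-level setting
OVERRIDE_MAX_MATCHES = 10    # "Max number of matches" value when the override is selected

def search_knowledge_base(index: str, query: str,
                          source_identifiers: list[str] | None = None,
                          override_max: int | None = None) -> dict:
    """Shape a query: the override wins over the integration default, and
    source identifiers (Index Fields) limit the documents that are searched."""
    top_k = override_max if override_max is not None else DEFAULT_MAX_MATCHES
    query_spec = {"index": index, "query": query, "top": top_k}
    if source_identifiers:                     # if none are added, all documents are searched
        query_spec["filter"] = {"source_id": source_identifiers}
    return query_spec

print(search_knowledge_base("invoices", "late payment policy",
                            source_identifiers=["contracts", "policies"],
                            override_max=OVERRIDE_MAX_MATCHES))
```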
At runtime, when you search the knowledge base for results based on TotalAgility and non-TotalAgility document data:
- The response is streamed before it is displayed. You can click the Cancel button to cancel the search.
- The number of documents returned in a search depends on the integration setting or the AI knowledge base configuration. The number specified in the "Override max number of matches" setting takes precedence over the integration setting.
- The TotalAgility documents added to the knowledge base appear as references with hyperlinks. Clicking these hyperlinks triggers the associated actions.
- The documents used for the search are from all added sources; if no sources are added, all documents in the knowledge base are used. You can perform the search within the control only if you have search access (configured in the AI knowledge base integration).
- The search results display both TotalAgility and non-TotalAgility document data along with relevant references. In the text response, references are indicated using numbers, such as [1]. At runtime, for TotalAgility document references, the names of documents are displayed as hyperlinks that direct you to the respective documents; for non-TotalAgility document references, the text specified for the reference property is displayed without hyperlinks. Additionally, non-TotalAgility document references do not trigger any document click events (see the sketch after this list).
- When you select a reference in the search results, the corresponding section of the document is highlighted. If the reference is configured to display the document in a Web capture control, that specific section is highlighted and focused within the control. The same document may appear multiple times in the search results, and clicking the document can highlight different sections each time.
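To illustrate the distinction between TotalAgility and non-TotalAgility references described above, the sketch below is not the runtime implementation; the reference structure and field names are hypothetical. It only shows that TotalAgility document references render as hyperlinks carrying a document ID (so the Document clicked event can act on it), while non-TotalAgility references render as plain text with no link or event.

```python
# Hypothetical sketch of rendering numbered references from the search results.

references = [
    {"number": 1, "is_totalagility": True,  "name": "Invoice-1042.pdf", "document_id": "doc-1042"},
    {"number": 2, "is_totalagility": False, "reference": "Vendor policy handbook, section 3"},
]

def render_reference(ref: dict) -> str:
    if ref["is_totalagility"]:
        # Rendered as a hyperlink; clicking it would raise the Document clicked
        # event with the document ID available to the configured actions.
        return f'[{ref["number"]}] <a data-document-id="{ref["document_id"]}">{ref["name"]}</a>'
    # Non-TotalAgility references show the reference text only, with no link or event.
    return f'[{ref["number"]}] {ref["reference"]}'

for ref in references:
    print(render_reference(ref))
```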