Configuration of a Generative AI chat control
Use the Generative AI chat control to send requests to an AI provider to get information. You can submit a prompt to a Generative AI provider or perform a search using an AI knowledge base provider. You can configure these AI providers in the Integration menu.
You can include case data for analysis when using Generative AI or an AI knowledge base. This lets you use the case data to gather the information you need through natural language queries, while the actual case data remains hidden from view.
Search using Generative AI
Configure the following settings to perform a search using Generative AI.
| Setting | Description |
|---|---|
| Header label | Enter the header label to be displayed at runtime. (Default: Generative AI Chat) |
| Search type | To execute the search based on Generative AI, select Generative AI. |
| Provider | Select the Generative AI provider to use, such as OpenAI or an AI Agent. (Default: Tungsten) The Provider list includes the AI providers configured in the Integration menu; see Integrate Generative AI with TotalAgility. If a Generative AI provider is not already configured, a warning message appears when you add the Generative AI chat control to a form. When you select an AI Agent provider, lists of the additionally added input and output variables appear; you can set a static value or select a dynamic variable (form control, form variable, global variable, or Copilot insights) for each input and output variable. |
| Source | Select one of the following options. |
| Persist conversation | If selected, allows the context to persist for the duration of the conversation. (Default: Clear) Persisting the conversation increases token usage, which may exceed the token limit. When the prompt exceeds the token limit, the earliest conversation history is removed automatically to stay within the limit. |
| Use response profile | If selected, lets you add a response profile to apply at runtime, for example when you want responses to follow a specific word count or style. (Default: Clear) |
| Response profile | Provide an inline value, such as "Summarize the response in 20 words or less.", or select a global variable, form variable, or form control. |
| Use seed | If selected, lets you dynamically set "Use seed" using a form variable (Boolean) or form control. (Default: Clear) A seed value increases the likelihood of obtaining consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Seed | Select a form variable or form control, or enter a numeric value. (Default: 0; Maximum: 32,767) |
| Include case data | If selected, lets you include case data for analysis when using Generative AI. (Default: Clear) When you include case data, you are prompted to provide descriptions for both the process and the activities to achieve optimal results. The "Include case data" option is not available when using a Custom LLM provider. At runtime, the case details are included in the information available to search and in the response. |
| Override default message | If selected, enables the Welcome message and Input help text boxes, where you can enter a customized welcome message and help text to be displayed in the Chat control at runtime. (Default: Clear) |
| Response format | Select either option for the response format. |
| Use MCP servers | If selected, lets you add MCP servers. (Default: Clear) |
| Audit conversation | If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear) |
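The token-limit behavior described for the Persist conversation setting can be sketched as follows. This is a minimal illustration, not TotalAgility's actual implementation; the message format and the word-based token estimate are assumptions.

```python
# Sketch: drop the oldest conversation turns when the prompt would exceed
# the provider's token limit (illustrative; real token counting is model-specific).

def estimate_tokens(text: str) -> int:
    # Crude assumption: ~1 token per word. Real providers use a tokenizer.
    return len(text.split())

def trim_history(history: list[dict], new_prompt: str, token_limit: int) -> list[dict]:
    """Remove the earliest turns until the history plus the new prompt fit the limit."""
    trimmed = list(history)

    def total() -> int:
        return sum(estimate_tokens(m["content"]) for m in trimmed) + estimate_tokens(new_prompt)

    while trimmed and total() > token_limit:
        trimmed.pop(0)  # the earliest conversation history is removed first
    return trimmed

history = [
    {"role": "user", "content": "one two three four five"},
    {"role": "assistant", "content": "six seven eight"},
    {"role": "user", "content": "nine ten"},
]
kept = trim_history(history, "eleven twelve", token_limit=7)
# The oldest five-word turn is dropped; the remaining turns plus the prompt fit.
```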
When you submit a prompt in a Generative AI Chat control on a form using Generative AI as the search type, an entry is recorded in the audit log.
When you view the Generative AI chat control at runtime:

- The Copilot chat window appears with the following:
  - A "This conversation is being audited" message, if the "Audit conversation" setting is enabled, and a "Learn more about use of AI" link. Click the link to learn more about Tungsten Automation's commitment to responsible AI and compliance.
  - A welcome message, "Hello <username>, how can I help you today?". You can modify the welcome message and input help text, and customize the appearance of the chat window, such as the header color of the Copilot chat window and the Copilot icon. See Generative AI chat control style.
- When you enter input, the response is streamed before it is displayed. You can click the Cancel button to cancel the request. If you add an image or text as part of the input, such as an uploaded image of a car registration, and ask the AI provider for the registration number, the response is returned accordingly.
- If MCP servers are used, you can perform MCP-related functions. For example, you can enter a prompt to upload AI-generated content or a document to SharePoint or Google Drive.
- To clear the current conversation (including the context, if Persist conversation is selected in Search using Generative AI) and start a new conversation, click the New chat icon in the top right corner of the Chat box.
- The prompt within the control is executed based on the configured search type (Generative AI or AI knowledge base).
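The search-type routing described above can be sketched as a simple dispatch. The function and handler names here are hypothetical, for illustration only; they are not the TotalAgility API.

```python
# Sketch: route each prompt based on the control's configured search type
# (hypothetical names; the real control does this internally).

from typing import Callable

def make_executor(search_type: str,
                  generative_search: Callable[[str], str],
                  knowledge_base_search: Callable[[str], str]) -> Callable[[str], str]:
    handlers = {
        "Generative AI": generative_search,
        "AI Knowledge Base": knowledge_base_search,
    }
    if search_type not in handlers:
        raise ValueError(f"Unknown search type: {search_type}")
    return handlers[search_type]

run = make_executor("AI Knowledge Base",
                    generative_search=lambda p: f"LLM answer to: {p}",
                    knowledge_base_search=lambda p: f"KB results for: {p}")
result = run("vacation policy")  # routed to the knowledge base handler
```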
Search using AI knowledge base
Configure the following settings to search the AI knowledge base using both TotalAgility documents and non-TotalAgility documents.
Access to the knowledge base is available only to Enterprise tier customers. For users on the Standard and Advanced tiers, this feature is restricted and indicated by a lock icon.
| Setting | Description |
|---|---|
| Header label | Enter the header label to be displayed at runtime. (Default: Generative AI Chat) |
| Search type | To execute the search based on the AI knowledge base, select AI Knowledge Base. |
| AI Knowledge Base | Select the configured AI knowledge base provider. (Default: Tungsten) See Integrate with AI knowledge base. The list of indexes from the selected AI knowledge base integration appears. |
| Index | Select an index from the list of indexes associated with the selected AI knowledge base provider. |
| Override max number of matches | If selected, allows you to override the maximum number of documents set when configuring the AI knowledge base integration. (Default: Clear) |
| Max number of matches | Specify the maximum number of documents to return. (Default: 5) This property is available only if "Override max number of matches" is selected. |
| Filter fields | The filter fields in the AI knowledge base determine the relevant sources, formulate queries to retrieve information, and rank the retrieved information by relevance and reliability to ensure a well-informed response. You can add one or more filter fields to create advanced search filters for more flexible and precise results. The index fields configured as filterable in the AI knowledge base integration are available for use in the Chat control. |
| Filter operator | Select the filter operator to use: And or Or. |
| Content fields | Define the fields that are used during search and passed to the LLM. This gives you greater control over the data included in search and retrieval operations. The chunk text field is always included automatically as search input. |
| Override query type | If selected, allows you to select a different query type, overriding the query type set at the integration level. This setting is available on the control only if "Enable query type override" is selected in the AI knowledge base integration. |
| Query type | By default, the Query type list shows the query type set in the selected AI knowledge base integration. Select a different query type to override it. The following query types are available: Simple, Semantic, Vector, Vector simple hybrid, and Vector semantic hybrid. At runtime, the query type defined at the control level takes precedence over the query type set at the integration level. |
| Vectorized fields | Define the vectorized fields used during the knowledge base search. The "Vectorized fields" section is available in the Chat control only when "Enable query type override" is selected and the query type is set to Vector, Vector simple hybrid, or Vector semantic hybrid in the AI knowledge base integration. To add vectorized fields: |
| Persist conversation | If selected, allows the context to persist for the duration of the conversation. (Default: Clear) Persisting the conversation increases token usage, which may exceed the token limit. When the prompt exceeds the token limit, the earliest conversation history is removed automatically to stay within the limit. |
| Use response profile | If selected, lets you add a response profile to apply at runtime, for example when you want responses to follow a specific word count or style. (Default: Clear) |
| Response profile | Provide an inline value, such as "Summarize the response in 20 words or less.", or select a global variable, form variable, or form control. |
| Use seed | If selected, lets you specify the seed value dynamically using a form variable or form control when searching the knowledge base. (Default: Clear) A seed value increases the likelihood of obtaining consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Seed | Select a form variable or form control, or enter a numeric value. (Default: 0; Maximum: 32,767) |
| Include case data | If selected, lets you include case data for analysis when using the AI knowledge base. (Default: Clear) When you include case data, you are prompted to provide descriptions for both the process and the activities to achieve optimal results. The "Include case data" option is not available when using a Custom LLM provider. At runtime, the case details are included in the information available to search and in the response. |
| Override default message | If selected, enables the Welcome message and Input help text boxes, where you can enter a customized welcome message and help text to be displayed in the Chat window at runtime. (Default: Clear) |
| Response format | Select either option for the response format. |
| Use MCP servers | If selected, lets you add MCP servers. (Default: Clear) |
| Audit conversation | If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear) |
| Events | Add one or more actions for the Document clicked event. Within the action, you must access the document ID of the clicked document. For example, add a Same page event to display the document in a Web capture control. Save the event. |
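The Filter fields and Filter operator settings combine per-field conditions into a single search filter. A minimal sketch follows; the expression syntax and field names are illustrative assumptions, not the exact syntax of any particular knowledge base provider.

```python
# Sketch: combine filterable index-field conditions with an And/Or operator
# (illustrative expression syntax; the actual filter syntax is provider-specific).

def build_filter(conditions: list[tuple[str, str, str]], operator: str) -> str:
    """conditions: (field, comparison, value) triples; operator: 'And' or 'Or'."""
    if operator not in ("And", "Or"):
        raise ValueError("Filter operator must be 'And' or 'Or'")
    joiner = f" {operator.lower()} "
    return joiner.join(f"{field} {cmp} {value}" for field, cmp, value in conditions)

# Hypothetical filterable index fields "department" and "year":
expr = build_filter([("department", "eq", "'HR'"), ("year", "ge", "2023")], "And")
# expr == "department eq 'HR' and year ge 2023"
```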
At runtime, you can open the Chat window by clicking the Chat with Copilot icon. The Chat window displays the following:

- An "Interaction with AI is being audited" message appears if the "Audit conversation" setting is enabled.
- The response is streamed before it is displayed. You can click the Cancel button to cancel the search.
- The number of documents returned in a search depends on the integration setting or the AI knowledge base configuration. The number specified in the "Override max number of matches" setting takes precedence over the integration setting.
- The TotalAgility documents added to the knowledge base appear as references with hyperlinks. Clicking these hyperlinks triggers the associated actions.
- The documents used for the search come from all added sources; if no sources are added, all documents in the knowledge base are used. You can perform the search within the control only if you have search access (configured in the AI knowledge base integration).
- The search results display both TotalAgility and non-TotalAgility document data along with relevant references. In the text response, references are indicated with numbers, such as [1]. At runtime, for TotalAgility document references, the document names are displayed as hyperlinks that direct you to the respective documents; for non-TotalAgility document references, the text specified for the reference property is displayed without a hyperlink. Additionally, non-TotalAgility document references do not trigger any document click events.
- When you select a reference in the search results, the corresponding section of the document is highlighted. If the reference is configured to display the document in a Web capture control, that section is highlighted and focused within the control. The same document may appear multiple times in the search results, and clicking the document can highlight a different section each time.
- When you execute the search, all content (retrievable) fields are passed to the LLM, and vector fields are used in the knowledge base search.
- If MCP servers are used, you can perform MCP-related functions. For example, you can enter a prompt to retrieve content from the knowledge base and mail it to a user.
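The reference behavior described above — numbered markers, with hyperlinks only for TotalAgility documents — can be sketched as follows. The record fields, document names, URLs, and HTML markup are assumptions for illustration only.

```python
# Sketch: render numbered references; only TotalAgility documents become
# hyperlinks (document names, URLs, and markup here are illustrative).

def render_references(refs: list[dict]) -> list[str]:
    lines = []
    for i, ref in enumerate(refs, start=1):
        if ref.get("is_totalagility"):
            # TotalAgility documents: name shown as a hyperlink to the document
            lines.append(f"[{i}] <a href=\"{ref['url']}\">{ref['name']}</a>")
        else:
            # Non-TotalAgility documents: reference text shown without a hyperlink,
            # and no document click event is triggered
            lines.append(f"[{i}] {ref['reference_text']}")
    return lines

rendered = render_references([
    {"is_totalagility": True, "name": "Invoice.pdf", "url": "/docs/42"},
    {"is_totalagility": False, "reference_text": "External policy handbook"},
])
# rendered[0] == '[1] <a href="/docs/42">Invoice.pdf</a>'
# rendered[1] == '[2] External policy handbook'
```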