Configure Generative AI chat control to search using an AI knowledge base
Configure the following settings to search the AI knowledge base using both TotalAgility documents and non-TotalAgility documents.
Access to the knowledge base is available only to Enterprise-tier customers. For users on the Standard and Advanced tiers, this feature is restricted, as indicated by a lock icon.
| Setting | Description |
|---|---|
| Header label | Enter the header label to display at runtime. (Default: Generative AI Chat) |
| Search type | To run the search against an AI knowledge base, select AI Knowledge Base. |
| AI Knowledge Base | Select the configured AI knowledge base provider. (Default: Tungsten) See Integrate with AI knowledge base. The list of indexes from the selected AI knowledge base integration appears. |
| Index | Select an index from the list. The list includes the indexes associated with the selected AI knowledge base provider. |
| Override max number of matches | If selected, overrides the maximum number of documents set in the AI knowledge base integration. (Default: Clear) |
| Max number of matches | Specify the maximum number of documents to return, overriding the value set in the AI knowledge base integration. (Default: 5) This property is available only if "Override max number of matches" is enabled. |
| Filter Fields | The filter fields in the AI knowledge base determine relevant sources, formulate queries to retrieve information, and rank the retrieved information by relevance and reliability to ensure a well-informed response. You can add one or more filter fields to create advanced search filters for more flexible and precise results. The index fields configured as filterable in the AI knowledge base integration are available for use in the Chat control. |
| Filter operator | Select the filter operator to use: And or Or. |
| Content fields | Define the fields that are used during search and passed to the LLM. This gives you greater control over the data included in search and retrieval operations. The chunk text field is always included automatically as search input. |
| Override query type | If selected, enables selecting a different query type, overriding the query type set at the integration level. This setting is available on the control only if "Enable query type override" is selected in the AI knowledge base integration. |
| Query type | By default, the Query type list shows the query type set in the selected AI knowledge base integration. Select a different query type to override it. The following query types are available: Simple, Semantic, Vector, Vector simple hybrid, and Vector semantic hybrid. At runtime, the query type defined at the control level takes precedence over the query type set at the integration level. |
| Semantic configuration | Use a semantic configuration to improve the relevance of semantic search results. By default, the Default semantic configuration is applied. To use a different semantic configuration defined for the index, select it from the Semantic configuration list. See Semantic configuration for search indexes. This property is available only if the query type is set to Semantic or Vector semantic hybrid. |
| Vectorized fields | Define the vectorized fields to use in the knowledge base search. This section is available in the Chat control only if "Enable query type override" is selected and the query type is set to Vector, Vector simple hybrid, or Vector semantic hybrid in the AI knowledge base integration. |
| Persist conversation | If selected, the context persists during the conversation. (Default: Clear) Persisting the conversation increases token usage. If the prompt exceeds the token limit, the earlier conversation history is automatically removed to stay within the limit. |
| Use response profile | If selected, enables adding a response profile so that, at runtime, responses are displayed in a specific word count or style. (Default: Clear) |
| Response profile | Provide an inline value, such as "Summarize the response in 20 words or less.", or select a global variable, form variable, or form control. |
| Use seed | If selected, enables setting the seed value dynamically, using a form variable or form control, when searching the knowledge base. (Default: Clear) The seed value increases the likelihood of obtaining more consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Seed | Select a form variable or form control, or specify a numeric value (default: 0; maximum: 32,767). |
| Include case data | If selected, enables including case data for analysis when using the AI knowledge base. (Default: Clear) When you include case data, you are prompted to provide descriptions for both the process and its activities to achieve optimal results. This option is not available when using a Custom LLM provider. At runtime, case details are added to the searchable context and are also included in the response output. |
| Override default message | If selected, overrides the default welcome message and input help text with custom text at runtime. (Default: Clear) Enter the details to display at runtime. |
| Response format | Select one of the available options for the response format. |
| Use MCP servers | If selected, enables adding MCP servers. (Default: Clear) |
| Audit conversation | If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear) |
| Events | Add one or more actions for the Document clicked event. Within the action, you must access the document ID of the clicked document. For example, add a Same page action to display the document in a Web capture control. Save the event. |
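The override settings above follow a simple precedence rule: a value set on the control wins over the integration-level default, and the query-type override is honored only when the integration permits it. A minimal sketch of that rule (the dictionary keys and function name are hypothetical illustrations, not a TotalAgility API):

```python
# Illustrative sketch of the precedence rules described above:
# control-level overrides win over integration-level defaults.
# All names here are hypothetical, not a TotalAgility API.

def resolve_search_settings(control: dict, integration: dict) -> dict:
    """Return the effective query type and match count used at runtime."""
    effective = dict(integration)
    # "Override query type" on the control is honored only when
    # "Enable query type override" is set on the integration.
    if integration.get("allow_query_type_override") and control.get("override_query_type"):
        effective["query_type"] = control["query_type"]
    # "Override max number of matches" replaces the integration value.
    if control.get("override_max_matches"):
        effective["max_matches"] = control["max_matches"]
    return effective

integration = {"query_type": "Simple", "max_matches": 5,
               "allow_query_type_override": True}
control = {"override_query_type": True, "query_type": "Vector",
           "override_max_matches": True, "max_matches": 10}
settings = resolve_search_settings(control, integration)
# settings["query_type"] -> "Vector", settings["max_matches"] -> 10
```

If the integration does not enable the query-type override, the control's selection is ignored and the integration's query type is used, matching the availability rule in the table.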
At runtime, you can open the Chat window by selecting the Chat with Copilot icon. The Chat window displays the following:
- An "Interaction with AI is being audited" message appears if the "Audit conversation" setting is enabled.
- The response is streamed before it is displayed. You can select the Cancel button to cancel the search.
- The number of documents returned in a search depends on the integration setting or the AI knowledge base configuration. The number specified in the "Override max number of matches" setting takes precedence over the integration setting.
- The TotalAgility documents added to the knowledge base appear as references with hyperlinks. Selecting these hyperlinks triggers the associated actions.
- The documents used for the search come from all added sources; if no sources are added, all documents in the knowledge base are used. You can perform the search within the control only if you have search access (configured in the AI knowledge base integration).
- The search results display both TotalAgility and non-TotalAgility document data along with relevant references. In a text response, references appear as numbers, such as [1]. At runtime, for TotalAgility document references, document names are displayed as hyperlinks that direct you to the respective documents; for non-TotalAgility document references, the text specified for the reference property is displayed without hyperlinks. Non-TotalAgility document references also do not trigger "Document clicked" events.
- On selecting a reference in the search results, the corresponding section of the document is highlighted. If the reference is configured to display the document in a Web capture control, that section is highlighted and focused within the control. The same document may appear multiple times in the search results, and selecting it can highlight a different section each time.
- When you execute the search, all retrievable content fields are passed to the LLM, and vector fields are used in the knowledge base search.
- If MCP servers are used, you can perform MCP-related functions, such as entering a prompt to retrieve content from the knowledge base and mailing it to a user.
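The "Persist conversation" behavior described in the table (earlier history is removed automatically when the prompt exceeds the token limit) amounts to trimming the oldest turns first. A simplified sketch, assuming a naive word-count tokenizer rather than the provider's real one:

```python
# Sketch of oldest-first history trimming, as described under
# "Persist conversation". Token counting is approximated by
# whitespace-separated words; real providers use model-specific
# tokenizers, so treat the numbers as illustrative.

def trim_history(history: list[str], new_prompt: str, token_limit: int) -> list[str]:
    """Drop the earliest turns until prompt + history fits the limit."""
    def tokens(text: str) -> int:
        return len(text.split())

    kept = list(history)
    while kept and tokens(new_prompt) + sum(tokens(t) for t in kept) > token_limit:
        kept.pop(0)  # remove the earliest exchange first
    return kept

history = ["q1 a1 a1", "q2 a2", "q3 a3"]          # 3 + 2 + 2 = 7 tokens
kept = trim_history(history, "next question", 8)  # prompt adds 2 tokens
# kept -> ["q2 a2", "q3 a3"]: the oldest turn was dropped to fit
```

Because the most recent turns survive the trim, the chat keeps short-range context while bounding token usage, which is why persisting long conversations increases token consumption up to the limit rather than without bound.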