Copilot
You can use the Dashboard, Generative AI, or AI Knowledge Base to get insights into the data on a form at runtime.
On the Insights type list, select an insight type and configure it:
- None (default)
- Dashboard
- Generative AI
- AI Knowledge Base
When you enable Dashboard, Generative AI, or AI Knowledge Base insights for Copilot on the main form, the chat window remains open as you navigate through the child forms in a multiview form. The context remains intact, and queries continue to work.
- In a multiview form using Generative AI insights, all child forms inherit this functionality, and each child form displays the Copilot configuration regardless of the individual form settings.
- In a multiview form that is not using Copilot, each child form displays the Copilot configuration specific to that form, but the context is not maintained between the child forms.
- Multiple instances of a Web capture control are not supported in a multiview form.
Dashboard
Use a dashboard to get insight into form data at runtime. Ensure that you include at least one Chart, Tile, Table, Work queue, Job list, or Workload control on the form. However, do not include the Generative AI chat control.
Configure the dashboard as follows.
| Name | Description |
|---|---|
| Header label | Enter a name for the dashboard to be displayed at runtime. |
| Provider | Select a Generative AI provider. (Default: Tungsten) The Provider list includes the AI providers configured in . See Integrate Generative AI with TotalAgility. Custom LLM and AI Agent providers are not supported. If a Generative AI provider is not already configured, a warning message appears. |
| Persist conversation | If selected, allows the context to persist for the duration of the conversation. (Default: Clear) Persisting the conversation increases token usage, which may result in exceeding the token limit. When the prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit. The message that exceeded the token limit and the context are also cleared. |
| Use response profile | If selected, adds a response profile that can be used at runtime, for example when you want responses to follow a specific word count or style. (Default: Clear) |
| Response profile | Click and provide an inline value, select a String variable (form or global), or select a form control. |
| Use seed | If selected, lets you specify the seed. (Default: Clear) Provide an inline value or select a numeric variable. (Default: 0; Maximum: 32,767) A seed increases the likelihood of obtaining consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Override default message | If selected, enables the Welcome message and Input help text boxes, where you can enter a customized welcome message and help text to be displayed in the chat window at runtime. (Default: Clear) |
| Response format | Select either option for the response format. |
| Audit conversation | If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear) |
| Display | Select either option for Width mode to set the width of the Copilot chat window displayed when you open the form at runtime. The default Height is 450 pixels; you can change this value, and the Copilot chat window is displayed at that height at runtime. |
| Events | Configure the following events to add actions as needed. See Configure actions for a form control event. |
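The "Persist conversation" trimming behavior described in the table can be sketched as a history-pruning loop. This is an illustrative sketch, not TotalAgility internals: the word-based token counter and the limit value are stand-ins for the provider's real tokenizer and token limit.

```python
def trim_history(messages, token_limit, count_tokens=lambda m: len(m.split())):
    """Drop the oldest turns until the conversation fits the token limit.

    `count_tokens` is a placeholder tokenizer (words as tokens); a real
    provider would count model tokens instead.
    """
    history = list(messages)
    while history and sum(count_tokens(m) for m in history) > token_limit:
        history.pop(0)  # earlier conversation history is removed first
    return history

history = ["hello there", "tell me about jobs", "list active jobs this week"]
print(trim_history(history, token_limit=9))
# → ['tell me about jobs', 'list active jobs this week']
```

The oldest turns are sacrificed first so the most recent context survives, matching the behavior the table describes.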
At runtime, the Chat with Copilot icon appears at the bottom-right corner of the form. When you click this icon, the Chat window opens and displays the following details:
- The message "Interaction with AI is being audited", if the "Audit conversation" setting is enabled, along with a Learn more about use of AI link. Click the link to learn more about Tungsten Automation's commitment to responsible AI and compliance.
- A welcome message, "Hello <username>, how can I help you today?"

You can increase or decrease the size of the Chat window and position it anywhere on the screen.
On the Chat window, you can do the following:
- Enter a request, such as "How many activities are listed in Process_1?", in the Copilot chat box, and then click the Return icon or press Enter to get a response from Copilot.
- Ask about the details of the Chart, Tile, Table, Work queue, Job list, and Workload controls.
- Large Language Models (LLMs) have limitations in performing mathematical calculations. Therefore, when using dashboard insights, you may get inaccurate results when requesting counts, totals, and similar metrics.
- When you request the AI provider to create a chart, explicitly mention an HTML chart in the prompt. For example, "Create an HTML chart for the active and finished jobs created this week."
- Clear the current conversation and start a new one by clicking the Start over icon in the top-right corner of the Chat box. The context of the conversation is maintained if "Persist conversation" was selected when designing the form. However, the context is not maintained between two different forms.
- Close the Chat window by clicking the Collapse icon.
Generative AI
Use Generative AI to get insights from attached image files, text files, and plain text, as well as to handle general queries at runtime.
Configure Generative AI insight type as follows.
| Name | Description |
|---|---|
| Header label | Enter a name for the header to be displayed at runtime. |
| Provider | Select the Generative AI provider to use, such as OpenAI or AI Agent. (Default: Tungsten) The Provider list includes the AI providers configured in . See Integrate Generative AI with TotalAgility. If a Generative AI provider is not already configured, a warning message appears. When you select an AI Agent provider, lists of additionally added input and output variables appear. You can set a static value or select a dynamic variable (form control, form variable, global variable, or Copilot insights) for the input and output variables. |
| Source | Select the source on which the AI provider bases its response. |
| Persist conversation | If selected, allows the context to persist for the duration of the conversation. (Default: Clear) Persisting the conversation increases token usage, which may result in exceeding the token limit. When the prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit. The message that exceeded the token limit and the context are also cleared. |
| Use response profile | If selected, adds a response profile that can be used at runtime, for example when you want responses to follow a specific word count or style. (Default: Clear) |
| Response profile | On the Response profile list, click and provide an inline value, select a String variable (form or global), or select a form control. |
| Use seed | If selected, lets you specify the seed. (Default: Clear) Provide an inline value or select a numeric variable. (Default: 0; Maximum: 32,767) A seed increases the likelihood of obtaining consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Include case data | If selected, includes case data for analysis when using Generative AI. This lets you obtain the information you need from the case data through natural language queries, while ensuring that the case data itself is not displayed on the screen. For Case identifier, select a variable (global variable, form variable, or form control) or provide an inline value for Case ref or Case ID. A list of case data items is displayed; by default, Key Case Details, History (full), Variables, and Notes are selected. Select items from the list as needed. Include case data is not available when using a Custom LLM provider. |
| Override default message | If selected, enables the Welcome message and Input help text boxes, where you can enter a customized welcome message and help text to be displayed in the Chat window at runtime. (Default: Clear) |
| Response format | Select either option for the response format. |
| Use MCP servers | If selected, lets you add MCP servers. (Default: Clear) |
| Audit conversation | If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear) |
| Display | Select either option for Width mode to set the width of the Copilot chat window displayed when you open the form at runtime. The default Height is 450 pixels; you can change this value, and the Copilot chat window is displayed at that height at runtime. |
| Events | Configure the following events to add actions as needed. See Configure actions for a form control event. |
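Seed-based repeatability works by sending the same seed value with every request, so the provider can reuse its sampling path. The sketch below assembles an OpenAI-style Chat Completions payload, which does expose a `seed` field; how TotalAgility forwards the configured seed and response profile to a provider is an assumption here, and the model name is a placeholder.

```python
def build_chat_request(prompt, seed=0, response_profile=None):
    """Assemble an OpenAI-style chat request body.

    The 0-32,767 range mirrors the 'Use seed' setting above.
    """
    if not 0 <= seed <= 32767:
        raise ValueError("seed must be between 0 and 32,767")
    messages = []
    if response_profile:
        # A response profile (e.g. a word-count or style constraint) is
        # modeled here as a system instruction.
        messages.append({"role": "system", "content": response_profile})
    messages.append({"role": "user", "content": prompt})
    return {"model": "gpt-4o", "messages": messages, "seed": seed}

body = build_chat_request(
    "Summarize this week's jobs",
    seed=42,
    response_profile="Answer in at most 50 words.",
)
```

Sending `body` twice with the same seed makes identical responses more likely, though, as the table notes, determinism is not guaranteed.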
At runtime, you can open the chat window by clicking the Chat with Copilot icon. The Chat window displays the following:
- The message "This conversation is being audited", if the "Audit conversation" setting is enabled, along with a Learn more about use of AI link. Click the link to learn more about Tungsten Automation's commitment to responsible AI and compliance.
- A welcome message, "Hello <username>, how can I help you today?"

You can increase or decrease the size of the Chat window and position it anywhere on the screen.
In the Chat window, you can do the following:
- Enter image or text as input and request a response from the AI provider. For example, you can type "List the countries that generate electricity from nuclear power plants". When you enter a request, the response is streamed before it is displayed. You can click the Cancel button to cancel the request.
- If MCP servers are used, you can perform MCP-related functions. For example, you can enter a prompt to upload AI-generated content or a document to SharePoint or Google Drive.
- Clear the current conversation and start a new one by clicking the New chat icon in the top-right corner of the Chat box. The context of the conversation is maintained if "Persist conversation" was selected when designing the form. However, the context is not maintained between two different forms.
- When you include case data and interact with Copilot at runtime, the case details are included in the information available for search and in the response.
- When you submit a prompt in the Copilot chat window while using a form configured with Copilot, an entry is recorded in the audit log.
AI Knowledge Base
Use the AI Knowledge Base to gain insights by enabling users to search across multiple documents for answers to natural language queries within their solutions. You can create a knowledge base by uploading various types of documents, including both captured documents and non-document data, which are stored in Azure Search. This knowledge base can then be queried in natural language in conjunction with a large language model (LLM). When you perform a search, the response is streamed before it is displayed. You can click the Cancel button to cancel the search.
Access to the knowledge base is available only to Enterprise tier customers. For users on the Standard and Advanced tiers, this feature is restricted, as indicated by a lock icon.
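The retrieval flow described above (search the index for matching documents, then answer with an LLM) can be sketched as a minimal retrieval-augmented generation loop. Here `search` and `llm` are toy stand-ins for the Azure Search index and the model client, and the sample chunks are invented for illustration.

```python
def answer_with_knowledge_base(question, search, llm, max_matches=5):
    """Retrieve up to `max_matches` chunks for the question, then ask the
    LLM to answer using only those chunks as context."""
    chunks = search(question)[:max_matches]  # mirrors "Max number of matches"
    context = "\n".join(chunks)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return llm(prompt)

# Toy stand-ins: a keyword "search" over two chunks, and an echoing "LLM".
docs = ["Invoices are due in 30 days.", "Refunds take 5 business days."]
reply = answer_with_knowledge_base(
    "When are invoices due?",
    search=lambda q: [d for d in docs if "nvoice" in d],
    llm=lambda p: p,
)
```

Grounding the prompt in retrieved chunks is what lets the LLM answer from your documents rather than from its training data alone.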
| Name | Description |
|---|---|
| Header label | Enter a name for the header to be displayed at runtime. |
| AI Knowledge Base | Select the configured AI knowledge base provider. (Default: Tungsten) |
| Index | The list of index names in which you can add, edit, and delete index fields. You must specify at least one index. |
| Override max number of matches | If selected, allows you to override the maximum number of documents set when configuring the AI knowledge base integration. (Default: Clear) |
| Max number of matches | Specify the maximum number of documents to use instead. (Default: 5) This property is available only if "Override max number of matches" is enabled. |
| Filter Fields | The filter fields in the AI knowledge base determine relevant sources, formulate queries to retrieve information, and rank the retrieved information based on relevance and reliability to ensure a well-informed response. You can add one or more filter fields to create advanced search filters for more flexible and precise results. The index fields that are configured as filterable in the AI knowledge base integration are available for use in the Chat control. |
| Filter operator | Select the Filter operator to use. Available options are And and Or. |
| Content fields | Define the fields that are used during search and passed to the LLM. This gives you greater control over the data included in search and retrieval operations. The chunk text field is always included automatically as search input. |
| Override query type | If selected, allows you to select a different query type, overriding the query type set at the integration level. The "Override query type" setting on the control becomes available only if "Enable query type override" is selected in the AI knowledge base integration. |
| Query type | By default, the Query type list is populated with the query type set in the selected AI knowledge base integration. Select a different query type to override it. The following query types are available: Simple, Semantic, Vector, Vector simple hybrid, and Vector semantic hybrid. At runtime, the query type defined at the control level takes precedence over the query type set at the integration level. |
| Vectorized fields | Define the vectorized fields that are used during the knowledge base search. The "Vectorized fields" section becomes available in the Chat control only when you select "Enable query type override" and set the query type to Vector, Vector simple hybrid, or Vector semantic hybrid in the AI knowledge base integration. |
| Persist conversation | If selected, allows the context to persist for the duration of the conversation. (Default: Clear) Persisting the conversation increases token usage, which may result in exceeding the token limit. When the prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit. The message that exceeded the token limit and the context are also cleared. |
| Use response profile | If selected, adds a response profile that can be used at runtime, for example when you want responses to follow a specific word count or style. (Default: Clear) |
| Response profile | Click and provide an inline value, select a String variable (form or global), or select a form control. |
| Use seed | If selected, lets you specify the seed. (Default: Clear) A seed increases the likelihood of obtaining consistent results from the AI provider across repeated requests, though consistency is not guaranteed. |
| Seed | Provide an inline value or select a numeric variable. (Default: 0; Maximum: 32,767) |
| Include case data | If selected, includes case data for analysis when using Generative AI. This lets you obtain the information you need from the case data through natural language queries, while ensuring that the case data itself is not displayed on the screen. For Case identifier, select a variable (global variable, form variable, or form control) or provide an inline value for Case ref or Case ID. A list of case data items is displayed; by default, Key Case Details, History (full), Variables, and Notes are selected. Select items from the list as needed. Include case data is not available when using a Custom LLM provider. |
| Override default message | If selected, enables the Welcome message and Input help text boxes, where you can enter a customized welcome message and help text to be displayed in the Chat window at runtime. (Default: Clear) |
| Response format | Select either option for the response format. |
| Use MCP servers | If selected, lets you add MCP servers. (Default: Clear) |
| Audit conversation | If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear) |
| Display | Select either option for Width mode to set the width of the Copilot chat window displayed when you open the form at runtime. The default Height is 450 pixels; you can change this value, and the Copilot chat window is displayed at that height at runtime. |
| Events | Configure the following events to add actions as needed. See Configure actions for a form control event. |
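The Filter operator and filter fields combine into a single boolean filter over the index. The sketch below shows how such a filter string could be assembled using OData-style comparison syntax, which is what Azure AI Search accepts for filterable fields; the field names are hypothetical examples, and the real control builds this filter for you.

```python
def build_filter(fields, operator="and"):
    """Join (field, value) pairs into one OData-style filter expression.

    `operator` corresponds to the Filter operator setting (And / Or).
    """
    if operator not in ("and", "or"):
        raise ValueError("Filter operator must be 'and' or 'or'")
    clauses = [f"{name} eq '{value}'" for name, value in fields]
    return f" {operator} ".join(clauses)

print(build_filter([("department", "Finance"), ("year", "2024")], "or"))
# → department eq 'Finance' or year eq '2024'
```

With "and", a document must match every filter field; with "or", matching any one field is enough, which is why the operator choice changes how narrow the results are.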
At runtime, you can open the Chat window by clicking the Chat with Copilot icon. The Chat window displays the following:
- The message "Interaction with AI is being audited", if the "Audit conversation" setting is enabled.
- A Learn more about use of AI link. Click the link to learn more about Tungsten Automation's commitment to responsible AI and compliance.
- A welcome message, "Hello <username>, how can I help you today?"
In the Chat window, you can do the following:
- Enter image or text as input and request a response from the AI provider. For example, upload an image of a car registration and request the registration number. When you enter a request, the response is streamed before it is displayed. You can click the Cancel button to cancel the request.
- If MCP servers are used, you can perform MCP-related functions. For example, you can enter a prompt to retrieve content from the knowledge base and mail it to a user.
- Clear the current conversation and start a new one by clicking the New chat icon in the top-right corner of the Chat box. The context of the conversation is maintained if "Persist conversation" was selected when designing the form. However, the context is not maintained between two different forms.
- When you include case data and interact with Copilot at runtime, the case details are included in the information available for search and in the response.
- When you execute the search, all content (retrievable) fields are passed to the LLM, and vector fields are used in the knowledge base search.
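The last point — content fields feeding the LLM while vector fields drive retrieval — reflects how hybrid queries are typically issued against Azure AI Search: a keyword `search` over content fields plus a `vectorQueries` clause over vectorized fields. The sketch below follows the general shape of such a request body, but the exact wiring inside TotalAgility and all field names are assumptions for illustration.

```python
def build_kb_query(text, vector, vector_fields, content_fields, k=5, semantic=False):
    """Build a hybrid search request body: keyword search over the content
    fields plus a vector query over the vectorized fields."""
    body = {
        "search": text,                          # keyword part of the hybrid query
        "searchFields": ",".join(content_fields),
        "top": k,                                # mirrors "Max number of matches"
        "vectorQueries": [{
            "kind": "vector",
            "vector": vector,                    # embedding of the user prompt
            "fields": ",".join(vector_fields),
            "k": k,
        }],
    }
    if semantic:
        # Vector semantic hybrid: apply semantic re-ranking on top of
        # the combined keyword + vector results.
        body["queryType"] = "semantic"
    return body

body = build_kb_query(
    "invoice due dates",
    vector=[0.1, 0.2, 0.3],                      # placeholder embedding
    vector_fields=["chunk_vector"],
    content_fields=["chunk_text", "title"],
)
```

Keyword and vector scores are fused by the search service, so a document can surface either because it contains the query terms or because it is semantically close to them.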