Copilot

You can use the Dashboard, Generative AI, or AI Knowledge Base to get insights into the data on a form at runtime.

On the Insights type list, select an insights type and configure it.

When you enable Dashboard, Generative AI, or AI Knowledge Base insights for Copilot in the main form, the chat window remains open as you navigate through the child forms in a multiview form. The context remains intact, and queries continue to work.

  • In a multiview form using Generative AI insights, all child forms inherit this functionality and each child form displays the Copilot configuration regardless of the individual form settings.

  • In a multiview form that is not using Copilot, each child form displays the Copilot configuration specific to that form, but the context is not maintained between the child forms.

  • Multiple instances of a Web capture control are not supported in a multiview form.

Dashboard

Use a dashboard to get insight into form data at runtime. Ensure that you include at least one Chart, Tile, Table, Work queue, Job list, or Workload control on the form. However, do not include the Generative AI chat control.

Configure the dashboard as follows.

Name Description
Header label Enter a name for the dashboard to be displayed at runtime.
Provider Select a Generative AI provider. (Default: Tungsten)

The Provider list includes the AI providers configured in Integration > Generative AI. See Integrate Generative AI with TotalAgility.

Custom LLM and AI Agent providers are not supported.

If a Generative AI provider is not already configured, a warning message appears.

Persist conversation If selected, allows the context to persist for the duration of the conversation. (Default: Clear)
Persisting the conversation increases token usage, which may result in exceeding the token limit. When a prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit. The message that exceeded the token limit, along with its context, is also cleared.
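
For illustration only, the following sketch shows how conversation history is typically trimmed to stay within a token budget. The function names, the token counter, and the budget are assumptions for this example, not TotalAgility internals.

    def trim_history(messages, max_tokens, count_tokens):
        # messages: oldest-first list of {"role": ..., "content": ...} dicts, where
        # messages[0] is the system message and messages[-1] is the latest user prompt.
        # count_tokens: a callable that returns the token count of a message list
        # (for example, one built on a tokenizer library).
        trimmed = list(messages)
        # Drop the oldest non-system messages until the conversation fits the budget.
        while count_tokens(trimmed) > max_tokens and len(trimmed) > 2:
            del trimmed[1]
        return trimmed

If the latest prompt alone still exceeds the budget, nothing older is left to drop, and the message and its context are cleared as described above.
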
Use response profile If selected, adds a response profile that is applied at runtime, for example, when you want your responses to follow a specific word count or style. (Default: Clear)
Response profile

Provide an inline value, or select a String variable (form or global) or a form control.

Use seed If selected, lets you specify a seed value. (Default: Clear)

Provide an inline value or select a numeric variable. (Default: 0; Maximum: 32,767)

A seed increases the likelihood of obtaining consistent results from the AI provider across repeated requests, although consistency is not guaranteed.
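
As a minimal sketch of how a seed is typically passed to an OpenAI-compatible provider (the model name and prompt are placeholders, and this is not the call TotalAgility makes internally):

    from openai import OpenAI

    client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        seed=12345,       # same seed + same prompt -> more repeatable output (best effort)
        temperature=0,
        messages=[{"role": "user",
                   "content": "Summarize the open activities shown on this form."}],
    )
    print(response.choices[0].message.content)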

Override default message

If selected, enables the Welcome message and Input help text boxes where you can enter a customized welcome message and help text to be displayed in the chat window at runtime. (Default: Clear)

Option Description
Welcome message

You can provide a static or dynamic value (form control, form variable, or global variable), or a combination of both, for the welcome message.

For example, if you enter "Good day!" as the message and select a form variable called Name that contains the value "John Doe", the default "Hello <username>, how can I help you today?" message is replaced with "Good day! John Doe" at runtime.

Input help text

Enter the help text to be displayed in the Chat window at runtime.

Response format

Select either option for response format.

  • HTML (default): Returns the response with HTML formatting, such as bullet points, headings, and bold text, to present structured content clearly and enhance readability.

  • None: Returns the response as plain text with no formatting.

  • On upgrading TotalAgility, the Response format is set to None.

  • This setting is not available when using Custom LLM or AI Agent AI providers.

Audit conversation

If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear)

Display

Select either option for Width mode to set the width for the Copilot chat window displayed when you open the form at runtime.

Option Description
Fixed

When set to fixed, the width of the Copilot chat window does not change according to the form width.

Enter the Width in pixels. (Default: 450 px)

By default, this setting is applied when a form is upgraded or imported from a previous version.

Percentage (default)

When set to percentage, the width of the Copilot chat window is adjusted according to the form width.

Enter the Width as a percentage and the Min width in pixels. (Default: 450 px)

If the width (in percentage) is less than the minimum width, the width of the Copilot chat window is set to the minimum width.

The default Height is 450 pixels. You can change this value to adjust the height of the Copilot chat window displayed at runtime.
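
The effective width at runtime can be thought of as follows; this is an illustrative sketch, and the function and parameter names are not part of the product.

    def copilot_chat_width(form_width_px, mode, width_value, min_width_px=450):
        # Fixed mode: width_value is a pixel value and is used as-is.
        # Percentage mode: width_value is a percentage of the form width,
        # but the result never drops below min_width_px.
        if mode == "fixed":
            return width_value
        return max(form_width_px * width_value / 100, min_width_px)

    # Example: a 1000 px wide form with Width = 30% and Min width = 450 px
    # resolves to 450 px, because 300 px is below the minimum.
    print(copilot_chat_width(1000, "percentage", 30))  # 450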

Events

Configure the following events to add actions as needed. See Configure actions for a form control event.

  • Response returned: The action is executed when a response is returned.

  • Reset chat: The action is executed when you click the Reset chat button. For example, add the Delete row action to delete a row from a table.

  • Prompt entered: The action is executed when you enter a prompt in the Copilot chat window.

At runtime, the Chat with Copilot icon appears at the bottom-right corner of the form. When you click this icon, the Chat window opens and displays the following details:

  • "Interaction with AI is being audited" message appears if the "Audit conversation" setting is enabled.

    The Learn more about use of AI link also appears. Click the link to learn more about Tungsten Automation's commitment to responsible AI and compliance.

  • A welcome message, "Hello <username>, how can I help you today?"

You can increase or decrease the size of the Chat window, and position it anywhere on the screen.

On the Chat window, you can do the following:

  • Enter a request, such as "How many activities are listed in Process_1?" in the Copilot chat box, and then click the Return icon or press Enter to get a response from Copilot.

  • Ask about the details of the Chart, Tile, Table, Work queue, Job list, and Workload controls.

    • Large Language Models (LLMs) have limitations in performing mathematical calculations. Therefore, when using dashboard insights, you may get inaccurate results when requesting counts, totals, and similar metrics.

    • When you request the AI provider to create a chart, you should explicitly mention an HTML chart in the prompt. For example, "Create an HTML chart for the active and finished jobs created this week."

  • Clear the current conversation and start a new conversation by clicking the Start over icon in the top-right corner of the Chat box.

    The context of the conversation is maintained if "Persist conversation" is selected when designing the form. However, the context is not maintained between two different forms.

  • Close the Chat window by clicking the Collapse icon.

Generative AI

Use Generative AI to get insights from the attached image file, text file, and plain text, as well as to handle general queries at runtime.

Configure Generative AI insight type as follows.

Name Description
Header label Enter a name for the header to be displayed at runtime.
Provider Select the Generative AI provider to use, such as OpenAI or AI Agent. (Default: Tungsten)

The Provider list includes the AI providers configured in Integration > Generative AI. See Integrate Generative AI with TotalAgility.

If a Generative AI provider is not already configured, a warning message appears.

When you select an AI Agent provider, lists of the additionally added input and output variables appear. You can set a static value or select a dynamic variable (form control, form variable, global variable, or Copilot insights) for the input and output variables.

Source Select the source on which the AI provider bases its response.

Option Description

None (default)

The response is generated based on the input provided (plain text or variables) at runtime.

Image

On the Document list, select a form control (Web capture control), form variable, or global variable of type String, and on the Mime type list, select a file type.

The response is generated based on the image provided.

  • Mime type is not supported for form variables.

  • Not all Generative AI models support images.

Text

If the document has layout text, such as a semi-structured document (for example, an invoice or purchase order) or a fixed form, the document text can provide better results than those returned using a Custom LLM provider.

On the Document list, select a form control (Web capture control), form variable, or global variable of type String.

The response is generated based on the document passed as the input.

  • The file types (TIFF, JPEG, PNG, and PDF) that are supported in the Transformation Designer are supported for text source format in TotalAgility at runtime.

  • When you set the source format as Text, the standard model configuration is used.

Inline image

Click the attachment icon available at runtime to upload an image and ask a question about it. The thumbnail of the uploaded image is displayed.

The response is generated based on the image provided at runtime.

  • You do not need to save the document in the Web capture control for the Document ID to be populated.

  • Make sure the EDoc rendering option is selected in a Web capture control to be able to upload different document types, such as a Microsoft Word document.
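
For context, an image source is typically sent to a vision-capable model as base64 data together with its MIME type. The sketch below uses the OpenAI Python client as an example; the model, file name, and prompt are placeholders, not the call TotalAgility makes internally.

    import base64
    from openai import OpenAI

    client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

    with open("invoice.png", "rb") as f:  # placeholder image file
        image_b64 = base64.b64encode(f.read()).decode("ascii")

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # must be a vision-capable model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "What is the invoice total in this image?"},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ],
        }],
    )
    print(response.choices[0].message.content)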

Persist conversation If selected, allows the context to persist for the duration of the conversation. (Default: Clear)
Persisting the conversation increases token usage, which may result in exceeding the token limit. When a prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit. The message that exceeded the token limit, along with its context, is also cleared.
Use response profile If selected, adds a response profile that is applied at runtime, for example, when you want your responses to follow a specific word count or style. (Default: Clear)
Response profile

On the Response profile list, provide an inline value, or select a String variable (form or global) or a form control.

Use seed If selected, lets you specify a seed value. (Default: Clear)

Provide an inline value or select a numeric variable. (Default: 0; Maximum: 32,767)

A seed increases the likelihood of obtaining consistent results from the AI provider across repeated requests, although consistency is not guaranteed.

Include case data

If selected, includes case data for analysis when using Generative AI. This allows you to leverage the case data to obtain the information you need through natural language queries, while ensuring that the case data itself is not displayed on the screen.

For Case identifier, select a variable (global variable, form variable, or form control) or provide an inline value for Case ref or Case ID. A list of case data is displayed. By default, Key Case Details, History (full), Variables, and Notes options are selected. You can select the items from the list as needed.

Include case data is not available when using a Custom LLM provider.
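
Conceptually, the selected case items are supplied to the model as context rather than shown on screen. A rough sketch with a hypothetical case payload:

    import json

    # Hypothetical case data gathered for the configured Case ref / Case ID.
    case_data = {
        "case_ref": "CASE-000123",
        "key_details": {"status": "Active", "owner": "John Doe"},
        "variables": {"InvoiceTotal": 1250.00},
        "notes": ["Customer requested an expedited review."],
    }

    # The case data travels in the system context; only the answer is shown to the user.
    messages = [
        {"role": "system",
         "content": "Answer using only the case data below. Do not display the raw data.\n"
                    + json.dumps(case_data)},
        {"role": "user", "content": "Who owns this case and what is the invoice total?"},
    ]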

Override default message

If selected, enables the Welcome message and Input help text boxes where you can enter a customized welcome message and help text to be displayed in the Chat window at runtime. (Default: Clear)

Option Description
Welcome message

You can provide a static or dynamic value (form control, form variable, or global variable), or a combination of both, for the welcome message.

For example, if you enter "Good day!" as the message and select a form variable called Name that contains the value "John Doe", the default "Hello <username>, how can I help you today?" message is replaced with "Good day! John Doe" at runtime.

Input help text

Enter the help text to be displayed in the Chat window at runtime.

Response format

Select either option for response format.

  • HTML (default): Returns the response with HTML formatting, such as bullet points, headings, and bold text, to present structured content clearly and enhance readability.

  • None: Returns the response as plain text with no formatting.

  • On upgrading TotalAgility, the Response format is set to None.

  • This setting is not available when using Custom LLM or AI Agent AI providers.

Use MCP servers

If selected, lets you add MCP servers. (Default: Clear)

  1. To add MCP servers, click Add. The Add MCP servers dialog box appears.
  2. From the list of configured MCP servers, select the required servers. See Configure MCP server connection.

  3. Click Done.
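
To show what connecting to an MCP server generally involves, here is a sketch using the open-source mcp Python SDK; the server command and tool name are placeholders, and TotalAgility's MCP connections are configured in the designer rather than in code.

    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Placeholder: a local MCP server launched over stdio.
    server = StdioServerParameters(command="my-mcp-server", args=["--stdio"])

    async def main():
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()  # tools the LLM can call
                print([tool.name for tool in tools.tools])
                # Call a hypothetical tool exposed by the server.
                result = await session.call_tool("upload_document",
                                                 arguments={"path": "report.pdf"})
                print(result)

    asyncio.run(main())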

Audit conversation

If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear)
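
As an illustration of anonymized auditing, the record below keeps the prompt-response pair but reduces the user to a salted hash; the field names are illustrative, not the actual audit log schema.

    import hashlib
    import json
    from datetime import datetime, timezone

    def audit_record(user_id, prompt, response, salt="example-salt"):
        # Keep the prompt-response pair, but store only a hash of the user.
        anonymized_user = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
        return {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": anonymized_user,
            "prompt": prompt,
            "response": response,
        }

    print(json.dumps(audit_record("jdoe", "How many open jobs?", "There are 4 open jobs.")))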

Display

Select either option for Width mode to set the width for the Copilot chat window displayed when you open the form at runtime.

Option Description
Fixed

When set to fixed, the width of the Copilot chat window does not change according to the form width.

Enter the Width in pixels. (Default: 450 px)

By default, this setting is applied when a form is upgraded or imported from a previous version.

Percentage (default)

When set to percentage, the width of the Copilot chat window is adjusted according to the form width.

Enter the Width as a percentage and the Min width in pixels. (Default: 450 px)

If the width (in percentage) is less than the minimum width, the width of the Copilot chat window is set to the minimum width.

The default Height is 450 pixels. You can change this value to adjust the height of the Copilot chat window displayed at runtime.

Events

Configure the following events to add actions as needed. See Configure actions for a form control event.

  • Response returned: The action is executed when a response is returned.

  • Reset chat: The action is executed when you click the Reset chat button. For example, add the Delete row action to delete a row from a table.

  • Prompt entered: The action is executed when you enter a prompt in the Copilot chat window.

At runtime, you can open the chat window by clicking the Chat with Copilot icon. The Chat window displays the following:

  • "This conversation is being audited" message appears if the "Audit conversation" setting is enabled.

    The Learn more about use of AI link also appears. Click the link to learn more about Tungsten Automation's commitment to responsible AI and compliance.

  • A welcome message, "Hello <username>, how can I help you today?"

You can increase or decrease the size of the Chat window, and position it anywhere on the screen.

In the Chat window, you can do the following:

  • Enter an image or text as input and request a response from the AI provider. For example, you can type "List the countries that generate electricity from nuclear power plants". When you enter a request, the response is streamed before it is displayed. You can click the Cancel button to cancel the request (see the sketch after this list).

  • If MCP servers are used, you can perform MCP-related functions. For example, you can enter a prompt to upload the AI-generated content or a document to SharePoint or Google Drive.

  • Clear the current conversation and start a new conversation by clicking the New chat icon in the top-right corner of the Chat box.

    The context of the conversation is maintained if "Persist conversation" is selected when designing the form. However, the context is not maintained between two different forms.

  • When you include case data and interact with Copilot at runtime, the case details are included in the information available for search and in the response.

  • When you submit a prompt in the Copilot chat window while using a form configured with Copilot, an entry is recorded in the audit log.
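
The sketch below shows what streaming with cancellation typically looks like against an OpenAI-compatible provider; the model name and the cancelled flag are placeholders, not TotalAgility's runtime code.

    from openai import OpenAI

    client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set
    cancelled = False  # would be set to True when the user clicks Cancel

    stream = client.chat.completions.create(
        model="gpt-4o-mini",
        stream=True,
        messages=[{"role": "user",
                   "content": "List the countries that generate electricity "
                              "from nuclear power plants."}],
    )

    for chunk in stream:
        if cancelled:
            stream.close()  # stop consuming the response mid-stream
            break
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)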

AI Knowledge Base

Use the AI Knowledge Base to gain insights by enabling users to search across multiple documents for answers to natural language queries within their solutions. You can create a knowledge base by uploading various types of documents, including both captured documents and non-document data, which will be stored in Azure Search. This knowledge base can then be queried using natural language in conjunction with a large language model (LLM). When you perform a search, the response is streamed before it is displayed. You can click the Cancel button to cancel the search.

Access to the knowledge base is available only to Enterprise tier customers. For users on the Standard and Advanced tiers, this feature is restricted, as indicated by a lock icon.
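
At a high level this is a retrieve-then-generate (RAG) flow: the index is searched first, and the matching chunks are passed to the LLM as context. A minimal sketch, in which search_fn stands in for whatever queries the search index:

    from openai import OpenAI

    client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

    def answer_from_knowledge_base(question, search_fn, max_matches=5):
        # search_fn(question, top) is assumed to query the index and
        # return a list of matching text chunks.
        chunks = search_fn(question, top=max_matches)
        context = "\n\n".join(chunks)
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system",
                 "content": "Answer only from the provided context. If the answer "
                            "is not in the context, say you do not know.\n\n" + context},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content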

Name Description
Header label

Enter a name for the header to be displayed at runtime.

AI Knowledge Base

Select the configured AI knowledge base provider. (Default: Tungsten)

See Integrate with AI knowledge base.

Index

The list of index names in which you can add, edit, and delete index fields. You must specify at least one index.

Override max number of matches

If selected, allows you to override the maximum number of documents configured in the AI knowledge base integration. (Default: Clear)

See Integrate TotalAgility with AI Knowledge Base.

Max number of matches

Specify the maximum number of documents to use, overriding the integration-level setting. (Default: 5)

This property is available only if "Override max number of matches" is enabled.

Filter Fields

The filter fields in the AI knowledge base determine relevant sources, formulate queries to retrieve information, and rank the retrieved information based on relevance and reliability to ensure a well-informed response.

You can add one or more filter fields to create advanced search filters for more flexible and precise results.

The index fields that are configured as filterable in the AI knowledge base integration are available for use in the Chat control.

  1. Expand the Name box and select a filterable field.

    In addition to the system index fields Job ID, Case ID, and Case Ref, the index fields that are configured as filterable in the AI knowledge base integration appear in the Name box. For example, if you select Job ID, the knowledge base search is performed on documents associated with that job ID.

  2. On the Operator list, select the field operator to use. Available operators are: Equals, Not equals, Greater than, Greater than or equal, Less than, Less than or equal, All excluding, and Any.

    "All excluding" and "Any" field operators are only available if the index field type is set to "String collection" while adding indexes in the AI knowledge base integration.

  3. On the Value list, select a global variable, form variable, or form control, or provide an inline value.

  4. Click Add.

    The filter field is added. Add more fields as needed.

    To edit a filter field, select the field, click the edit icon, and modify the required fields. To delete a field, select the field and click the delete icon.

    The data object fields are not supported in an AI knowledge base search filter.

Filter operator

Select the Filter operator to use. The available options are And and Or.
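
For orientation, filter fields of this kind are commonly translated into an Azure AI Search OData filter expression; the mapping below is an illustrative sketch (with hypothetical field names), not necessarily the exact expression TotalAgility generates.

    # (field, operator, value) triples as configured in the control.
    OPERATORS = {
        "Equals": "eq", "Not equals": "ne",
        "Greater than": "gt", "Greater than or equal": "ge",
        "Less than": "lt", "Less than or equal": "le",
    }

    def build_filter(fields, filter_operator="and"):
        clauses = []
        for name, op, value in fields:
            if op == "Any":  # String collection: match any element
                clauses.append(f"{name}/any(x: x eq '{value}')")
            elif op == "All excluding":  # String collection: exclude a value everywhere
                clauses.append(f"{name}/all(x: x ne '{value}')")
            else:
                clauses.append(f"{name} {OPERATORS[op]} '{value}'")
        return f" {filter_operator} ".join(clauses)

    # Example: restrict the search to one job and exclude archived documents.
    print(build_filter([("JobId", "Equals", "J-1001"),
                        ("Tags", "All excluding", "archived")], "and"))
    # JobId eq 'J-1001' and Tags/all(x: x ne 'archived')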

Content fields

You can define the fields that are used during search and passed to the LLM. This gives you greater control over the data included in search and retrieval operations. The chunk text field is always included automatically as search input.

  • Expand the Name box and select a field.

    In addition to the system index fields Job ID, Case ID, and Case Ref, the index fields that are configured as retrievable in the AI knowledge base integration appear in the Name box.

    In the AI knowledge base integration, you must configure the retrievable field as searchable for it to appear in the Name box.

  • Click Add.

    The retrievable field is added. Add more fields as needed. You can edit or delete the fields.

Override query type

If selected, allows you to select a different query type, overriding the query type set at the integration level.

The "Override query type" setting on the control becomes available only if "Enable query type override" is selected in the AI knowledge base integration.

Query type

By default, the Query type list is populated with the query type set in the selected AI knowledge base integration.

Select a different query type to override. The following query types are available: Simple, Semantic, Vector, Vector simple hybrid, and Vector semantic hybrid.

At runtime, the query type defined at the control level takes precedence over the query type set at the integration level.

Vectorized fields

Define vectorized fields that are used during the knowledge base search.

The "Vectorized fields" section becomes available in the Chat control only when you select "Enable query type override" and set the query type as Vector, Vector simple hybrid, and Vector semantic hybrid in AI knowledge base integration.

To add vectorized fields:

  • In the Name box, click the down arrow to expand the list, and select an index field. Only the fields that are configured as vectorized, searchable, and retrievable appear in the Name box.

  • Click Add.

    The field appears under Vectorized fields. You can edit or delete the fields.

    If you import a form from an earlier version of TotalAgility or upgrade TotalAgility, the retrievable field and vector field lists are empty by default. You can configure the fields as needed.
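
For reference, this is roughly what a hybrid (text plus vector) query looks like with the azure-search-documents Python client; the endpoint, index, and field names are placeholders, and the query type and fields configured on the control determine what is actually sent.

    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient
    from azure.search.documents.models import VectorizedQuery

    client = SearchClient(endpoint="https://<service>.search.windows.net",  # placeholder
                          index_name="my-index",                            # placeholder
                          credential=AzureKeyCredential("<api-key>"))

    # query_embedding: the prompt embedded with the same model used to index the data.
    query_embedding = [0.0] * 1536  # placeholder vector

    vector_query = VectorizedQuery(vector=query_embedding,
                                   k_nearest_neighbors=5,
                                   fields="contentVector")  # a vectorized index field

    results = client.search(
        search_text="warranty period for model X",  # text half of the hybrid query
        vector_queries=[vector_query],
        select=["chunk_text", "title"],             # content (retrievable) fields
        top=5,                                      # max number of matches
    )
    for doc in results:
        print(doc["title"])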

Persist conversation If selected, allows the context to persist for the duration of the conversation. (Default: Clear)
Persisting the conversation increases token usage, which may result in exceeding the token limit. When a prompt exceeds the token limit, the earlier conversation history is removed automatically to stay within the limit. The message that exceeded the token limit, along with its context, is also cleared.
Use response profile If selected, adds a response profile that is applied at runtime, for example, when you want your responses to follow a specific word count or style. (Default: Clear)
Response profile

Provide an inline value, or select a String variable (form or global) or a form control.

Use seed If selected, lets you specify a seed value. (Default: Clear)

Provide an inline value or select a numeric variable. (Default: 0; Maximum: 32,767)

A seed increases the likelihood of obtaining consistent results from the AI provider across repeated requests, although consistency is not guaranteed.

Include case data

If selected, includes case data for analysis when using Generative AI. This allows you to leverage the case data to obtain the information you need through natural language queries, while ensuring that the case data itself is not displayed on the screen.

For Case identifier, select a variable (global variable, form variable, or form control) or provide an inline value for Case ref or Case ID. A list of case data is displayed. By default, Key Case Details, History (full), Variables, and Notes options are selected. You can select the items from the list as needed.

Include case data is not available when using a Custom LLM provider.

Override default message

If selected, enables the Welcome message and Input help text boxes where you can enter a customized welcome message and help text to be displayed in the Chat window at runtime. (Default: Clear)

Option Description
Welcome message

You can provide a static or dynamic value (form control, form variable, or global variable), or a combination of both, for the welcome message.

For example, if you enter "Good day!" as the message and select a form variable called Name that contains the value "John Doe", the default "Hello <username>, how can I help you today?" message is replaced with "Good day! John Doe" at runtime.

Input help text

Enter the help text to be displayed in the Chat window at runtime.

Response format

Select either option for response format.

  • HTML (default): Returns the response with HTML formatting, such as bullet points, headings, and bold text, to present structured content clearly and enhance readability.

  • None: Returns the response as plain text with no formatting.

  • On upgrading TotalAgility, the Response format is set to None.

  • This setting is not available when using Custom LLM or AI Agent AI providers.

Use MCP servers

If selected, lets you add MCP servers. (Default: Clear)

  1. To add MCP servers, click Add. The Add MCP servers dialog box appears.
  2. From the list of configured MCP servers, select the required servers. See Configure MCP server connection.

  3. Click Done.

Audit conversation

If selected, enables auditing of AI interactions. When auditing is enabled, prompt-response pairs are captured in the audit log while maintaining user anonymity to comply with data protection standards. (Default: Clear)

Display

Select either option for Width mode to set the width for the Copilot chat window displayed when you open the form at runtime.

Option Description
Fixed

When set to fixed, the width of the Copilot chat window does not change according to the form width.

Enter the Width in pixels. (Default: 450 px)

By default, this setting is applied when a form is upgraded or imported from a previous version.

Percentage (default)

When set to percentage, the width of the Copilot chat window is adjusted according to the form width.

Enter the Width as a percentage and the Min width in pixels. (Default: 450 px)

If the width (in percentage) is less than the minimum width, the width of the Copilot chat window is set to the minimum width.

The default Height is 450 pixels. You can change this value to adjust the height of the Copilot chat window displayed at runtime.

Events

Configure the following events to add actions as needed. See Configure actions for a form control event.

  • Response returned: The action is executed when a response is returned.

  • Document clicked: The action is executed when you click a document. For example, add a Same page action to display the document in a Web capture control.

    You must have access to the document ID of the clicked document.

  • Reset chat: The action is executed when you click the Reset chat button. For example, add the Delete row action to delete a row from a table.

  • Prompt entered: The action is executed when you enter a prompt in the Copilot chat window.

At runtime, you can open the Chat window by clicking the Chat with Copilot icon. The Chat window displays the following:

  • "Interaction with AI is being audited" message if the "Audit conversation" setting is enabled.

  • The Learn more about use of AI link. Click the link to learn more about Tungsten Automation's commitment to responsible AI and compliance.

  • A welcome message, "Hello <username>, how can I help you today?"

In the Chat window, you can do the following:

  • Enter an image or text as input and request a response from the AI provider. For example, upload an image of a car registration and request the registration number. When you enter a request, the response is streamed before it is displayed. You can click the Cancel button to cancel the request.

  • If MCP servers are used, you can perform MCP-related functions. For example, you can enter a prompt to retrieve content from the knowledge base and email it to a user.

  • Clear the current conversation and start a new conversation by clicking the New chat icon in the top-right corner of the Chat box.

    The context of the conversation is maintained if "Persist conversation" is selected when designing the form. However, the context is not maintained between two different forms.

  • When you include case data and interact with Copilot at runtime, the case details are included in the information available for search and in the response.

  • When you execute the search, all content (retrievable) fields are passed to the LLM and vector fields are used in the knowledge base search.