AI Agent

From QPR ProcessAnalyzer Wiki
Latest revision as of 14:26, 4 February 2025

The AI Agent is an innovative dashboard component designed to display textual responses generated by language models based on user-configured prompts. The component seamlessly integrates with both case-centric and object-centric Snowflake models to provide comprehensive insights.
Key features of the AI Agent include:

  • Versatile Model Support: The AI Agent is compatible with language models from both OpenAI and Snowflake Cortex, offering flexibility and a wide range of capabilities to meet diverse analytical needs.
  • Dynamic Response Generation: By configuring specific prompts, users can tailor the AI Agent to generate relevant and context-specific textual responses, enhancing decision-making processes.
  • Integration with Snowflake: The AI Agent supports Snowflake's case-centric and object-centric models, allowing users to harness the full potential of their data within the Snowflake ecosystem.

This component is ideal for users seeking to leverage advanced language models to extract meaningful insights and drive data-driven strategies directly from their dashboards.

Take AI Agent into Use

In QPR Cloud, the AI Agent is disabled by default. To take the AI Agent into use, please send a request to customercare@qpr.com and the AI Agent will be enabled for your environment.

Customers using an on-premise system can enable the AI Agent with either Snowflake Cortex or OpenAI. See the Snowflake Cortex or the OpenAI related settings, respectively.

You can also fine-tune the large language model so that it better suits your use. For example, the model can be fine-tuned with background information about the analyzed process and the broader business environment. For more information about fine-tuning large language models, please contact QPR or see the Snowflake Cortex Fine-tuning (https://docs.snowflake.com/en/user-guide/snowflake-cortex/cortex-finetuning) or OpenAI Fine-tuning (https://platform.openai.com/docs/guides/fine-tuning) technical instructions.

General Settings

  • System Prompt: Optional. Enter a system prompt for the language model if desired. This can help guide the model's responses but is not required.
  • User Prompt: Mandatory. Provide a user prompt for the language model. This field must contain valid input.
  • Input Charts: Optional. Configure an array of zero-to-many chart configurations to be used as source data for the language model. The AI Agent can function without input charts. You can copy chart settings from another chart and use them here as follows: open the Chart Settings (Editable) of the other chart, select all of the JSON, and copy it to the clipboard. Return to the AI Agent's Input Charts, paste the settings into the textbox, and click Apply. Note that if an input chart's data exceeds 20,000 characters, the data is cut at the end of the line where the limit is reached.
  • LLM Name: Select a language model from the predefined list of Snowflake Cortex models or enter a custom model name. If the OpenAI API key is configured and no LLM name is specified, the OpenAI language model is used by default. The default language model is llama3.1-70b.
  • LLM Parameters: Optional. Define parameters for the language model using a JSON object. Refer to the Snowflake (https://docs.snowflake.com/en/sql-reference/functions/complete-snowflake-cortex) and OpenAI (https://platform.openai.com/docs/api-reference/chat/create) documentation for supported parameters. Example: { "temperature": 0 }
  • Title: A custom title can be defined for the AI Agent. Note that the AI Agent does not have an automatically created title.
  • Description: Provide a description for the AI Agent. Note that this description is not used as a prompt for the language model.
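The line-boundary truncation described for Input Charts can be sketched as follows. This is an illustrative sketch only; the function name and the exact cut-off behavior are assumptions based on the description above, not the product's actual implementation.

```python
def truncate_at_line_end(data: str, limit: int = 20000) -> str:
    """Cut the data at the end of the line on which the limit is reached."""
    if len(data) <= limit:
        return data
    # Keep the whole line containing the limit position, then cut
    # at the next newline (or keep everything if none follows).
    newline = data.find("\n", limit)
    return data if newline == -1 else data[:newline]
```

For example, with a limit of 6 the string "aaaa\nbbbb\ncccc" would be cut to "aaaa\nbbbb": the limit falls inside the second line, so that line is kept in full and everything after it is dropped.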

Layout Settings

The following layout settings are available:

  • Background color: Background color of the AI Agent area. The color can also contain partial transparency (an alpha value) or even be fully transparent. When the AI Agent background has transparency, the color of the dashboard background is visible behind the AI Agent.
  • Border color: Border color of the AI Agent. Like the background, border color can also contain transparency.
  • Border width: Border width of the AI Agent in pixels. When the width is zero, the border is not visible.
  • Border corner roundness: Border corner roundness in pixels. Zero means sharp corners.
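As a hedged illustration of how such layout settings might look as a JSON object, including a semi-transparent background color, consider the sketch below. The field names and values are assumptions for illustration only, not the product's actual settings schema.

```python
import json

# Hypothetical layout settings object; field names are illustrative.
layout = {
    "backgroundColor": "rgba(255, 255, 255, 0.5)",  # 50% transparent white
    "borderColor": "#1a73e8",
    "borderWidth": 1,            # pixels; 0 hides the border
    "borderCornerRoundness": 4,  # pixels; 0 means sharp corners
}

print(json.dumps(layout, indent=2))
```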

Filtering Settings

  • Follow Dashboard Filters: When checked, the AI Agent is filtered by the filters in the dashboard. When unchecked, the AI Agent is not affected by filters in the dashboard, and thus the AI Agent feedback is based on data in the entire model.
  • Chart Filter: Apply filters to each of the input charts if they are set to Follow Dashboard Filters.
  • Find Root Causes: Rules that select the cases used by the root cause analysis to find possible root causes for the phenomena identified by those rules.
  • Require Root Causes Criteria: When enabled, the chart shows a user-friendly message if no criteria are selected for the Find Root Causes setting. Enable this setting for an AI Agent that performs root cause analysis, i.e., when the root causes selection is mandatory for the AI Agent to show relevant results.

Advanced Settings

  • Settings Available in Preview Mode: If users do not need to change the AI Agent settings when viewing the dashboard, this option can hide the settings in the preview mode. This makes the dashboard look cleaner, as there is no settings button in the AI Agent. The AI Agent settings can then be changed only in the edit mode.
  • On-Screen Settings: See On-screen settings.
  • Linked Settings: See Linked settings.
  • Linked Settings Disabled: Can be used to temporarily disable Linked Settings.
  • Model: Select a model for the input charts. This can be overridden in each chart's configuration.
  • Object-Centric Perspective Settings: Configure these settings if an object-centric model is selected. These settings can also be overridden in each chart's configuration.
  • Event Type Mapping: Select the event attribute used as the event type for the input charts. The event type selection affects the process flow and, for example, the variations and flows. When an event type mapping is defined, the input charts work as if the same event type mapping had been selected at the model level. When <model default> is selected, the event type mapping defined for the model is used. Note that filters don't contain the event type mapping information, so they use the mapping defined for the input charts; thus, charts with a different event type mapping don't work together with filtering (usually all cases appear to be filtered out due to the mismatch).
  • Chart Settings (Editable): All settings of the AI Agent are stored in a single entity, which is shown here. The settings can also be edited directly. Validations are in place, so invalid settings are not accepted.
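The validation idea behind directly edited settings can be sketched as below. This is a minimal sketch of rejecting invalid input, assuming only that the settings are a JSON object; the product applies its own, stricter checks beyond JSON syntax, and the function name is hypothetical.

```python
import json

def apply_settings(raw: str) -> dict:
    """Parse settings edited as text; reject input that is not a valid
    JSON object, so invalid settings are never applied."""
    try:
        settings = json.loads(raw)
    except json.JSONDecodeError as err:
        raise ValueError(f"Settings not accepted: {err}") from err
    if not isinstance(settings, dict):
        raise ValueError("Settings must be a JSON object")
    return settings
```

For example, `apply_settings('{"title": "My AI Agent"}')` returns the parsed object, while malformed text such as `'{not json}'` raises an error instead of being applied.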

Data Sent to LLM Providers

The AI Agent is based on the large language models (LLMs) offered by Snowflake and OpenAI through their APIs. When using Snowflake LLMs, the data is already in Snowflake. See Data Sent to OpenAI for information on what is sent to OpenAI when its LLMs are used.