Solution Development

Settings - Text2SQL

Intended audience: analysts, developers, administrators

AO Platform: 4.4


Text2SQL

This section describes the configuration settings for the Text2SQL transformation functionality. Text2SQL transformation happens when a user enters a Natural Language Question (NLQ) in the Easy Answers solution or the Easy Answers Chatbot. This includes the Easy Answers native interpretation of Natural Language Questions and the enhanced, Large Language Model-based integration for interpreting questions.

See Use of Generative AI for details.

If a commercial LLM is enabled (as opposed to an open-source LLM), each transaction (API request) will incur a transaction fee.

  • Show Next Question Suggestions in Chatbot

ON/OFF Toggle

OFF

If enabled, the Next Question Suggestions will be generated in the Chatbot based on the History list of past questions.

COMMON SETTINGS

LLM SETTINGS

TOKEN MATCHING SETTINGS

VECTOR STORE SETTINGS


Properties

Label

UI Widget

Defaults

Description

COMMON SETTINGS




Always Invoke Text2SQL

ON/OFF Toggle

OFF

If enabled, the Text2SQL translation will be performed for every question posed in Easy Answers. If disabled, the system reuses previously asked natural language questions and their SQL translations (indexed by Elasticsearch) from the cache. In a production environment, this toggle should remain disabled to improve performance.
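The cache-first behavior described above can be sketched as follows. This is an illustrative sketch only; the function and cache names are hypothetical and do not reflect the platform's internal API (the real cache is backed by Elasticsearch, not an in-memory dict).

```python
# Illustrative sketch of the "Always Invoke Text2SQL" toggle; names are
# hypothetical, and the real platform caches translations in Elasticsearch.

text2sql_cache: dict[str, str] = {}  # NLQ (normalized) -> generated SQL


def run_text2sql(nlq: str) -> str:
    # Placeholder for the expensive Text2SQL translation pipeline.
    return f"SELECT * FROM answers WHERE question = '{nlq}'"


def translate_question(nlq: str, always_invoke: bool = False) -> str:
    """Return SQL for a question, reusing the cache unless the
    'Always Invoke Text2SQL' toggle is enabled."""
    key = nlq.strip().lower()
    if not always_invoke and key in text2sql_cache:
        return text2sql_cache[key]  # cache hit: skip translation entirely
    sql = run_text2sql(nlq)         # cache miss (or toggle ON): translate
    text2sql_cache[key] = sql
    return sql
```

With the toggle OFF, a repeated question (even with different casing) is served from the cache, which is why production environments should leave it disabled.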

LLM SETTINGS




Use LLM

ON/OFF Toggle

OFF

If enabled, a Large Language Model (LLM) is used to generate the internally used SQL code from the question the user has asked in Easy Answers. Note: All remaining properties in the LLM SETTINGS section and in the CHATBOT SETTINGS section are unavailable when the Use LLM toggle is turned OFF.

  • Question Rephrasing Model

Dropdown

Use Default

Select the LLM Model to use for Question Rephrasing, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • MSO Identification Model

Dropdown

Use Default

Select the LLM Model to use for MSO Identification, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • MSO Property Identification Model

Dropdown

Use Default

Select the LLM Model to use for MSO Property Identification, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Semantic SQL Reasoning Generator Model

Dropdown

Use Default

Select the LLM Model to use for Semantic SQL Reasoning Generator, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Semantic SQL Generator Model

Dropdown

Use Default

Select the LLM Model to use for Semantic SQL Generator, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Semantic SQL Functions Model

Dropdown

Use Default

Select the LLM Model to use for Semantic SQL Functions, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Semantic SQL Post Processing Model

Dropdown

Use Default

Select the LLM Model to use for Semantic SQL Post Processing, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Semantic SQL Validator Model

Dropdown

Use Default

Select the LLM Model to use for Semantic SQL Validator, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Semantic SQL Corrector Model

Dropdown

Use Default

Select the LLM Model to use for Semantic SQL Corrector, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Question Tagging Model

Dropdown

Use Default

Select the LLM Model to use for Question Tagging, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Semantic SQL Sourcing Extractor Model

Dropdown

Use Default

Select the LLM Model to use for Semantic SQL Sourcing Extractor, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Response Summarization Model

Dropdown

Use Default

Select the LLM Model to use for Response Summarization, or leave it on Use Default to use what has been selected on the Settings - Generative AI Configurations page.

  • Use MSO Description

ON/OFF Toggle

ON

If enabled, a question using the LLM approach will look to MSO LLM Descriptions to support LLM prompt creation.

The MSO LLM Descriptions can be auto-populated using the Generate LLM Content entry in the Ontology options menu > Advanced. Viewing or manually creating MSO LLM Descriptions can be done on the Generative AI > Ontology MSO Descriptions page in the Admin solution.

Important: Using populated MSO LLM Descriptions will increase the cost of calling the LLM API.

  • Use MSO Property Description

ON/OFF Toggle

ON

If enabled, a question using the LLM approach will look to MSO Property LLM Descriptions to support LLM prompt creation.

The MSO Property LLM Descriptions can be auto-populated using the Generate LLM Content entry in the Ontology options menu > Advanced. Viewing or manually creating MSO Property LLM Descriptions can be done on the Generative AI > Ontology MSO Property Descriptions page in the Admin solution.

Important: Using populated MSO Property LLM Descriptions will increase the cost of calling the LLM API.

  • LLM Model

Dropdown

Use Default

Use the dropdown to select from available LLM Model Configurations.

  • Number of LLM Execution Threads

Number Field

1

With the LLM Validation Strategy set to Default, only one thread can be configured. If the LLM Validation Strategy is Strict, then two or more LLM Execution Threads can be configured.

  • LLM Validation Strategy

Dropdown

Default

Dropdown includes:

  • Default - only validates that SQL syntax is executable.

  • Strict - includes a dedicated LLM-based Query Validator (checking syntax correctness, proper query structure, that conditions are met, and performance efficiency) for n different variations of a question, seeking to optimize for the best possible answer.
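The difference between the two strategies can be sketched as follows. This is a minimal illustration of the control flow only; the helper names and the toy scoring function are hypothetical stand-ins for the platform's internal executability check and LLM-based validator.

```python
# Hypothetical sketch of the two LLM Validation Strategies; the real
# executability check and LLM validator are internal to the platform.

def is_executable(sql: str) -> bool:
    # Default strategy: only verify that the SQL can be executed at all
    # (here crudely approximated by checking for a SELECT statement).
    return sql.upper().startswith("SELECT")


def validate_strict(candidates: list[str]) -> str:
    """Strict strategy: score several variations of the generated query
    and keep the best one. Toy scoring: prefer shorter executable SQL."""
    executable = [s for s in candidates if is_executable(s)]
    return max(executable, key=lambda sql: -len(sql))
```

Under Strict, multiple execution threads each produce a candidate query, and the validator picks the winner; under Default, the first executable query is accepted as-is.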

  • Include MSO Schema in Prompt

ON/OFF Toggle

OFF

This property is no longer used and is likely to be removed.

  • Cache LLM Raw Prompt Responses

ON/OFF Toggle

ON

If enabled, the system will reuse a previously cached LLM raw prompt response when an identical question has already been asked.

  • Show Semantic SQL Reasoning and Sourcing

ON/OFF Toggle

ON

If enabled, it will make the Semantic SQL Reasoning and Sourcing functionality available in the Easy Answers Options menu on the Results page. See https://docs-easyanswers.apporchid.com/easy-answers-results-page/?contextKey=viewing-sourcing-information&version=latest.

  • Detect Ambiguity

ON/OFF Toggle

OFF

If enabled, Easy Answers will prompt the user to clarify uncertainties in the question. For example, if a user says “filter by materials” but doesn’t say which materials, Easy Answers will ask a clarifying question and/or suggest possible values, such as plastic, steel, or bronze. The user can also type free-form text to provide clarification, which is added to the question prompt to help better interpret the user’s question.

  • Ambiguity Model

Dropdown

Use Default

Use the dropdown to select from available LLM Model Configurations. This property only shows when the Detect Ambiguity property is enabled.

  • Use Statistics for Semantic SQL Optimization

ON/OFF Toggle

OFF

If enabled, the Statistics gathered during MSO Discovery will be used when performing semantic queries, ensuring they scale efficiently with the data.

  • Use Statistics for LLM Content Generation

ON/OFF Toggle

OFF

If enabled, the Statistics gathered during MSO Discovery will be used to generate LLM Content.

  • Use LLM Content from Parent MSO

Dropdown


Select from the following inheritance options:

  • Use Parent Content - will copy LLM Content from the referenced (parent) ontologies. The user can edit the content, but if LLM Content is changed in the referenced ontology, it will override the current ontology.

  • Ignore Parent Content - no LLM Content will be referenced or copied from the referenced (parent) ontologies. The user shall generate LLM Content for the child ontology.

  • Copy Parent Content - will copy LLM Content from the referenced (parent) ontologies. The user can edit the content independently of the LLM Content in the referenced ontologies.

Show Filter Criteria for Easy Answers

ON/OFF Toggle

OFF


Apply Default Criteria

Dropdown

As part of Post Processing as an LLM call

Selects the default criteria to be used for the LLM Strategy.

  • Test Query Through Execution

ON/OFF Toggle

OFF


  • Show Apps for Generative AI Query

ON/OFF Toggle

ON

Show additional Apps, if any, along with the AI Response table.

  • Show Data Summary

ON/OFF Toggle

OFF

If enabled, a Summary of the question will be shown in Easy Answers.

  • Cache Data Summaries

ON/OFF Toggle

ON

If enabled, the Data Summary for the Semantic SQL will be cached.

  • Use Templates

ON/OFF Toggle

ON

If enabled, Natural Language Question (NLQ) Templates will be used to answer a new NLQ whenever it matches a template.

TOKEN MATCHING SETTINGS




Use Search Store for Token Matching

ON/OFF Toggle

OFF

If enabled, the Vector Store will be used for Token Matching. If disabled, the internal AO platform logic will be utilized.

  • Valid POS Tags for Search Store

Dropdowns


Repeater section for POS Tags. Dropdown includes NOUN, ADJECTIVE, VERB, and ADVERB. Only shown if Use Search Store for Token Matching is enabled.

MSO Prompt Priority

Dropdown

Low

Dropdown includes: None, Low, Medium, High.

The MSO Prompt Priority can be auto-populated using the AO Platform REST API. Viewing or manually creating MSO Prompt Priorities can be done on the Generative AI > Ontology MSO Descriptions page in the Admin solution.

MSO Property Prompt Priority

Dropdown

Low

Dropdown includes: None, Low, Medium, High.

The MSO Property Prompt Priority can be auto-populated using the AO Platform REST API. Viewing or manually creating MSO Property Prompt Priorities can be done on the Generative AI > Ontology MSO Descriptions page in the Admin solution.

Named Entities to be Excluded

Dropdowns


Repeater section - The Named Entities to be Excluded refers to configuring how named entities are recognized. Default values: ORGANIZATION, ADDRESS, PERSON.

Custom Entities to be Included

Dropdowns


Repeater section - The Custom Entities to be Included refers to configuring how custom entities are recognized.

VECTOR STORE SETTINGS




  • Use Vector Store

ON/OFF Toggle

OFF

If enabled, the selected Vector Store will be used to identify “Did you mean” alternative question suggestions in Easy Answers.

  • Vector Store

Dropdown

Weaviate DB

Use the dropdown to select from the available Vector Store:

  • Use Default

  • Pinecone DB

  • Weaviate DB

  • Embedding Model

Dropdown

Use Default

Embedding involves mapping high-dimensional vectors into a lower-dimensional space, facilitating machine learning tasks with large inputs, such as sparse vectors that represent words. Use the dropdown to select from available Embedding Model configurations.

  • Vector Matching Algorithm

Dropdown

Dot Product

Dropdown includes: Dot Product, L1 Distance, L2 Distance, Euclidean, Cosine Similarity.
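For intuition, the listed metrics can be written out as follows. These are standard reference formulas, not the Vector Store's implementation; note that “L2 Distance” and “Euclidean” refer to the same formula.

```python
# Reference implementations of the matching metrics listed above,
# for intuition only; the Vector Store computes these internally.
import math


def dot(a, b):
    # Dot Product: large when vectors point the same way and are long.
    return sum(x * y for x, y in zip(a, b))


def l1(a, b):
    # L1 (Manhattan) Distance: sum of absolute coordinate differences.
    return sum(abs(x - y) for x, y in zip(a, b))


def l2(a, b):
    # L2 / Euclidean Distance: straight-line distance between vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def cosine(a, b):
    # Cosine Similarity: angle-based, ignores vector magnitude.
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))
```

Distances (L1, L2) are smaller for closer matches, while Dot Product and Cosine Similarity are larger, which is why the similarity thresholds below are expressed as minimum values.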

  • Enable Easy Answers Suggestions

ON/OFF Toggle

ON

If enabled, the “Did you mean…?” functionality will be enabled in Easy Answers. If disabled, then the Easy Answers alternative question suggestions will not appear for user questions that are not fully understood.

  • Single Token Similarity Threshold

Number Field

0.82

The Single Token Similarity Threshold value is used to suggest individual Words for the “Fix Unknown Words” functionality in Easy Answers. In that workflow, individual words will be suggested for words not understood by our Natural Language interpretation. The higher the threshold, the more accurate (but likely fewer) word suggestions will be shown.

  • Similarity Threshold

Number Field

0.92

The Similarity Threshold value determines the results returned by Vector Store in response to the “Did you mean…?” functionality in Easy Answers. “Did you mean…?” is a way of resolving partially understood questions based on previously asked questions. The higher the threshold, the more accurate (but likely fewer) alternative question suggestions will be shown.

  • Suggestion Count

Number Field

3

The Suggestion Count determines the maximum number of suggestions shown in Easy Answers.
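The interaction between the Similarity Threshold and the Suggestion Count can be sketched as below. This is illustrative logic only, not the platform's implementation; the function name and input shape are assumptions.

```python
# Hypothetical sketch of how Similarity Threshold and Suggestion Count
# jointly filter "Did you mean...?" suggestions.

def pick_suggestions(scored, threshold=0.92, count=3):
    """scored: list of (question, similarity) pairs from the Vector Store.
    Keep only matches at or above the threshold, best first, capped at
    the Suggestion Count."""
    kept = [(q, s) for q, s in scored if s >= threshold]
    kept.sort(key=lambda pair: pair[1], reverse=True)
    return [q for q, _ in kept[:count]]
```

Raising the threshold shrinks the candidate pool before the count cap applies, which is why a higher threshold yields fewer (but more accurate) suggestions.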

  • Use Vector Store for Token Matching

ON/OFF Toggle

OFF


  • Similarity Threshold for Token Matching

Number Field

0.85

The Similarity Threshold for Token Matching value determines which matches the Vector Store returns when matching tokens from the user’s question. The higher the threshold, the more accurate (but likely fewer) token matches will be returned.

  • Similarity Threshold for Nouns in Token Matching

Number Field

0.7

The Similarity Threshold for Nouns value determines which matches the Vector Store returns when matching noun tokens from the user’s question. The higher the threshold, the more accurate (but likely fewer) matches will be returned.

  • Match for Token Types

Text Field

mso, mso_property

Enter a comma-separated list of the token types to match. Available token types:

  • mso

  • mso_property

  • mso_property_value





