APPLIES TO: Composer v2.x
This glossary defines words, phrases, and acronyms that have a specific meaning in Composer.
Actions are the main component of a trigger. They're what enable your bot to take action, whether in response to user input or any other event that may occur. Actions are powerful, and with them you can formulate and send a response, create and assign values to properties, manipulate the conversational flow, manage dialogs, and many other activities.
An action expressed by a bot, a channel, or a client that conforms to the Bot Framework Activity Schema.
Adaptive dialogs are a new way to model conversations that take the best of waterfall dialogs and prompts in the dialogs library. Adaptive dialogs are event-based and simplify sophisticated conversation modeling tasks, such as building a dialog dispatcher and handling interruptions elegantly. Adaptive dialogs derive from dialogs and interact with the rest of the Bot Framework SDK dialog system.
Adaptive expressions are a new expressions language used with the Bot Framework SDK and other conversational AI components, like Bot Framework Composer, Language Generation, adaptive dialogs, and Adaptive Cards.
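As an illustration, here are a few expressions of the kind you might enter in a Composer property or condition field. The property names (`user.name`, `user.age`, `user.vip`) are hypothetical examples, not values defined by Composer itself:

```
concat(user.name, '!')                 // string concatenation
length(turn.activity.text)             // length of the user's latest message
user.age >= 18                         // a boolean condition
if(user.vip, 'priority', 'standard')   // choose a value conditionally
```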
Contains settings you can customize for your Composer application. Select Composer settings in the navigation pane to open this page.
A section of the Create page where users design and author their bot.
An option in Composer's navigation pane. The Bot responses page is a central way to manage bot response .lg resources. From there, users can view all the .lg templates and edit them.
A bot project is a top-level container of multiple bots. A bot project must contain a root bot, and it may contain one or more local and/or remote skills.
Every dialog that you create in Composer will be a child dialog. Dialogs can be nested multiple levels deep, with the main dialog being the root of all dialogs in Composer. Parent dialogs can have zero or more child dialogs, but each child dialog must have exactly one parent dialog.
An option in Composer's navigation pane. It navigates users to the Application settings page where users manage settings for their bot and Composer.
An option in Composer's navigation pane. It navigates users to the Create page where users design and develop their bots.
The default recognizer applies Language Understanding (LUIS) and QnA Maker to interpret the user's input.
Note
Language Understanding (LUIS) will be retired on 1 October 2025. Beginning 1 April 2023, you won't be able to create new LUIS resources. A newer version of language understanding is now available as part of Azure AI Language.
Conversational language understanding (CLU), a feature of Azure AI Language, is the updated version of LUIS. For more information about question-and-answer support in Composer, see Natural language processing.
Dialogs are the basic building blocks in Composer. Each dialog represents a portion of the bot's functionality and contains instructions for what the bot will do and how it will react to user input. Dialogs are composed of recognizers, which help understand and extract meaningful pieces of information from the user's input; a language generator, which helps generate responses to the user; triggers, which enable your bot to catch and respond to events; and actions, which help you put together the flow of conversation that occurs when a specific event is captured by a trigger. There are two types of dialogs in Composer: main dialog and child dialog. Dialogs are represented in declarative assets as .dialog files.
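To make the declarative shape concrete, here is a hand-written sketch of a minimal .dialog file. The `$kind` values are real Composer component kinds, but the intent name, pattern, and response text are illustrative placeholders:

```json
{
  "$kind": "Microsoft.AdaptiveDialog",
  "recognizer": {
    "$kind": "Microsoft.RegexRecognizer",
    "intents": [
      { "intent": "Greet", "pattern": "hi|hello" }
    ]
  },
  "triggers": [
    {
      "$kind": "Microsoft.OnIntent",
      "intent": "Greet",
      "actions": [
        { "$kind": "Microsoft.SendActivity", "activity": "Hello! How can I help?" }
      ]
    }
  ]
}
```

In practice you rarely author these files by hand; Composer generates and maintains them as you work in the authoring canvas.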
The Bot Framework Emulator is a desktop application that allows bot developers to test and debug their bots, either locally or remotely. Using the Emulator, you can chat with your bot and inspect the messages it sends and receives. The Emulator displays messages as they would appear in a web chat UI and logs JSON requests and responses as you exchange messages with your bot. Before deploying your bot, you can run it locally and test it using the Emulator.
In the Bot Connector Service, an endpoint is a programmatically addressable location where a bot or channel can receive activities.
An entity contains the important details of the user's intent. It can be anything: a location, date, time, cuisine type, and so on. An intent may have no entities, or it may have multiple entities, each providing additional details to help understand the needs of the user.
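As a sketch, entities can be labeled inline within example utterances in a .lu file. The intent name, entity names, and values below are illustrative:

```
# BookFlight
- book a flight to {city=Paris}
- fly me to {city=Seattle} on {date=tomorrow}
```

Here `city` and `date` are entities that add detail to the `BookFlight` intent.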
An option in Composer's navigation pane and the start page of Composer.
An intent is the task that the user wants to accomplish or the problem they want to solve. Intent recognition in Composer is its ability to determine what the user is requesting. This is accomplished by the recognizer using either Regular Expressions or LUIS. When an intent is detected from the user's input, an event is emitted which can be handled using the Intent recognized trigger. If the intent isn't recognized by any recognizers, another event is emitted which can be handled using the Unknown intent trigger.
Language generation is the process of producing meaningful phrases and sentences in the form of natural language. Language generation enables your bot to respond to a user with human-readable language.
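For illustration, here is a small sketch of an .lg template; the template name and the `user.name` property are placeholders. When the bot evaluates the template, one of the variants (the lines beginning with `-`) is chosen:

```
# GreetUser
- Hello ${user.name}, welcome back!
- Hi ${user.name}! Good to see you again.
```

Defining several variants like this lets the bot's responses feel less repetitive.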
Known as the response editor, this is a section of the Bot responses page. It's the editor where users can view and edit all the Language generation templates.
There's also an inline response editor in the authoring canvas where your Bot Responses for the selected trigger or action can be added or updated.
Language understanding (LU) deals with how a bot handles user input and converts it into something it can understand and respond to intelligently. It involves the use of a recognizer, such as the default recognizer or Orchestrator, along with utterances, intents, and entities.
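A minimal sketch of a .lu file, with two illustrative intents and a few example utterances under each:

```
# Greeting
- hi
- hello there
- good morning

# Help
- I need help
- what can you do
```

The recognizer is trained on these examples so it can map new, unseen user input to the closest intent.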
A recognizer type in Composer that enables you to extract intents and entities based on the LUIS service.
A local skill is a skill that doesn't need to contain a skill manifest. It can be called and shared within the Composer environment without Azure App registration resources.
A section of the User input page. It's the language understanding editor where users can view and edit all the Language understanding templates.
The main dialog is the foundation of every bot created in Composer. There's only one main dialog and all other dialogs are children of it. It gets initialized every time your bot runs and is the entry point into the bot.
A bot uses memory to store property values, in the same way that programming and scripting languages such as C# and JavaScript do. A bot's memory management is organized into the following scopes: user, conversation, dialog, and turn.
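Memory is typically written by actions such as Set a property. As a hedged sketch of the declarative form (the property name is illustrative), this action stores the text of the user's latest message in user-scoped memory, where it persists across conversations:

```json
{
  "$kind": "Microsoft.SetProperty",
  "property": "user.name",
  "value": "=turn.activity.text"
}
```

The leading `=` marks the value as an adaptive expression to evaluate rather than a literal string.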
A section of the Composer window. It enables users to navigate to different parts of Composer.
Orchestrator is an intent-only language recognition engine and recognizer. It provides a skill dispatching solution for the Bot Connector Service. Bots built using Composer or the Bot Framework SDK can use it.
A parent dialog is any dialog that has one or more child dialogs, and any dialog can have zero or more child dialogs associated with it. A parent dialog can also be a child of another dialog.
Prompts refer to a bot asking users questions to collect information of various data types (for example, text or numbers).
A property is a distinct value identified by a specific address. An address has two parts, the scope and name: scope.name. Some examples of typical properties in Composer could include: user.name, turn.activity, dialog.index, user.profile.age.
A section of the Create page where users can edit properties.
A cloud-based natural language processing service that easily creates a natural conversational layer over your data. Bot Framework Composer integrates QnA Maker knowledge base creation and management in addition to the existing LUIS integration for language understanding.
Note
Azure QnA Maker will be retired on 31 March 2025. Beginning 1 October 2022, you won't be able to create new QnA Maker resources or knowledge bases. A newer version of the question and answering capability is now available as part of Azure AI Language.
Custom question answering, a feature of Azure AI Language, is the updated version of the QnA Maker service. For more information about question-and-answer support in Composer, see Natural language processing.
QnA Maker imports your content into a knowledge base of question and answer pairs. After you publish your knowledge base, a client application such as a chat bot can send a user's question to your endpoint. Your QnA Maker service processes the question and responds with the best answer. Users can create and manage their QnA Maker knowledge bases within the context of the Composer environment.
A recognizer enables your bot to understand and extract meaningful pieces of information from a user's input. There are currently two types of recognizers in Composer: LUIS and Regular Expression, both of which emit events that are handled by [triggers](#trigger).
A regular expression is a sequence of characters that defines a search pattern. Regex provides a powerful, flexible, and efficient method for processing text. The extensive pattern-matching notation of regex enables your bot to quickly parse large amounts of text to find specific character patterns that can be used to determine user intents, validate text to ensure that it matches a predefined pattern (such as an email address or ZIP code), or extract entities from utterances.
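A minimal sketch in Python of the idea behind a regex recognizer: match a pattern against an utterance to determine an intent and extract an entity. The intent name, pattern, and entity are illustrative, not taken from Composer:

```python
import re

# Illustrative pattern: detect a "book flight" request and capture
# the destination as a named group.
BOOK_FLIGHT = re.compile(r"(?i)book (?:a )?flight to (?P<city>\w+)")

def recognize(utterance: str):
    """Return (intent, entities) if the pattern matches, else (None, {})."""
    match = BOOK_FLIGHT.search(utterance)
    if match:
        return "BookFlight", {"city": match.group("city")}
    return None, {}

intent, entities = recognize("Please book a flight to Paris")
print(intent, entities)  # BookFlight {'city': 'Paris'}
```

Composer's Regular Expression recognizer applies the same principle: each intent is paired with a pattern, and a match on the user's input emits an intent-recognized event.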
A remote skill is a skill that contains a skill manifest and is published to a remote host, such as Azure.
This is where you can add or make updates to your .lg templates.
A root bot is the first and main bot created in your bot project.
The root dialog is another name for the main dialog.
When a property is in scope, it's visible to your bot. See the memory concept article to learn more about the different scopes of memory.
A skill is a bot that can perform a set of tasks for another bot.
A skill consumer is a bot that can invoke one or more skills.
A skill manifest is a JSON file that describes the actions the skill can perform, its input and output parameters, and the skill's endpoints.
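A hedged sketch of a skill manifest's overall shape; the identifiers, URL, and app ID below are placeholders, and the exact fields depend on the manifest schema version you target:

```json
{
  "$id": "SampleSkill",
  "name": "Sample Skill",
  "version": "1.0",
  "description": "A skill that performs sample tasks for a consumer bot.",
  "endpoints": [
    {
      "name": "default",
      "description": "Default endpoint for the skill.",
      "endpoint": "https://sampleskill.example.com/api/messages",
      "msAppId": "00000000-0000-0000-0000-000000000000"
    }
  ]
}
```

A skill consumer reads this file to learn how to call the skill and what to expect in return.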
A horizontal bar at the top of the Composer window, bearing the name of the product and the name of the current bot project.
A horizontal bar under the title bar in the Composer window. It's a strip of icons used to perform certain actions to manipulate dialogs, triggers, and actions.
Triggers are the main component of a dialog and let you catch and respond to events. Each trigger has a condition and a collection of actions to execute when the condition is met.
In Composer, trigger phrases are example utterances that users define with a LUIS recognizer or a regular expression recognizer. Composer's language processing examines a user's utterance to determine the intent and extract any entities it may contain. A trigger phrase in LUIS is generally referred to as an utterance. A trigger phrase in a regular expression is generally referred to as a pattern.
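For the regular-expression case, trigger phrases are expressed as patterns. A hedged sketch of a regex recognizer in declarative form; the intent names and patterns are illustrative:

```json
{
  "$kind": "Microsoft.RegexRecognizer",
  "intents": [
    { "intent": "Help", "pattern": "(?i)help|assist" },
    { "intent": "Cancel", "pattern": "(?i)cancel|quit" }
  ]
}
```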
An option in Composer's navigation pane. The User input page is a central way to manage language understanding .lu resources. From there, users can view all the .lu templates and edit them.
An utterance can be thought of as a continuous fragment of speech that begins and ends with a clear pause. Composer's language processing examines a user's utterance to determine the intent and extract any entities it may contain.