November 2016

Volume 31 Number 11

[Bot Framework]

Solving Business Problems with the Microsoft Bot Framework

By Srikantan Sankaran

People today do everything online or on their phones—buying, selling, banking, researching—and to stay competitive, enterprises constantly need to evolve their applications to deliver the best possible experience to customers who use their services. This involves providing various self-service capabilities, with the convenience of anytime and anywhere access to their data, often from social channels, using voice and messaging. That’s a challenge, due to the variety of applications that need to be factored in, and because most of these applications were never designed to handle the scenarios that are faced today. Efforts to address these needs likely require multiple parallel development projects involving considerable resources. The Microsoft Bot Framework, however, can ease the pain.

The Microsoft Bot Framework provides a platform for organizations to build applications—bots—that consumers can interact with easily, conversationally, over voice or text, whenever convenient. Without any additional development effort, these bots can be seamlessly accessed from multiple social channels, such as Skype, Slack, Facebook Messenger and so forth. They can give users access to all of their data consolidated from disparate line-of-business (LOB) applications, using technologies like Azure Logic Apps or Microsoft Flow, which ship with connectors for the key business applications in the market today. Azure Search provides a powerful Lucene-based engine that can query that data, both structured and unstructured, in a flexible way. Customers can interact with the bots through natural-language conversations, and the Language Understanding Intelligent Service (LUIS), part of Microsoft Cognitive Services, can interpret these conversations for downstream applications to respond to.

In this article and the next, I’ll discuss a scenario that illustrates the challenges organizations face today, and how they can create a solution by taking advantage of the Microsoft Bot Framework. You’ll find the following references useful as you implement the solution covered in this scenario:

  • Indexing Documents in Azure Blob Storage with Azure Search (bit.ly/2d4yr8s)
  • Enable and Disable Change Tracking (SQL Server) (bit.ly/2d226wt)

Creating a Solution

The business scenario that forms the background to this article involves an insurance agency that offers different types of insurance policies covering vehicles, home, travel and health to consumers. Consumers can register themselves with a Web application using their Microsoft credentials, and submit their request for an insurance policy. A business process workflow takes care of registering the user in Dynamics CRM Online, where profile information is captured, and then stores the insurance policy application in Office 365 SharePoint. An internal workflow process within the organization changes the policy request status over time, finally resulting in its approval and generating a policy document that’s stored in Office 365. These steps are outside the scope of this article, which instead focuses on the downstream integration scenarios that kick in when the information is captured in the system, and on sharing the subsequent updates and status with the consumer. Figure 1 depicts the architecture of the solution for this scenario.

Figure 1 Architecture of the Solution

Creating a contact profile for the consumer in Dynamics CRM Online, creating or updating a policy request in Office 365, or uploading policy documents in Office 365 triggers events in business process flows implemented using Microsoft Flow. Through these processes, structured data is synchronized to Azure SQL Database tables, and unstructured data, such as policy documents, is replicated to Azure Blob Storage. Azure Search crawls the data at regular intervals and keeps it current for querying. A bot application is deployed as an Azure Web app, published through the Microsoft Bot Framework Connector service. Through the Connector service, the bot application connects to a Skype channel, which consumers access to retrieve the status of their insurance policy requests, download the issued policy documents, schedule site inspection visits and so on. Consumers log in with the Microsoft account they registered with when applying for an insurance policy. They can use either messaging on the Skype client or voice commands to interact with the bot. The bot application integrates with the LUIS service, one of the several services available in Microsoft Cognitive Services, to interpret the conversations from the consumer and execute queries on Azure Search.

With this context, here’s what has to be done to create the solution:

  1. Author business process flows using Microsoft Flow to synchronize and consolidate data from different LOB applications.
  2. Configure Azure Search to index the consolidated data so it can be queried from the bot.
  3. Configure the LUIS service, and create and train the model for interpreting user conversations from the bot.
  4. Build a bot application using the Microsoft Bot Framework and use it from Skype.

I’ll describe the first two steps in this article and the remaining two next time.

Insurance Policy Request Synchronization Flow

In this example, policy requests are stored in a custom list in Office 365. A Microsoft Flow process bound to this list is triggered when a policy request is inserted or updated during the approval process, invoking a stored procedure in Azure SQL Database. Separate Microsoft Flow processes are authored for the insert and update scenarios.
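As an illustration, here's a minimal sketch of what the insert stored procedure might look like. The procedure and column names here are hypothetical; the actual scripts are available in the GitHub repository listed at the end of this article:

-- Hypothetical sketch of the stored procedure the flow invokes when a
-- policy request is inserted into the Office 365 list
CREATE PROCEDURE dbo.InsertPolicyRequest
  @PolicyRequestId NVARCHAR(50),
  @CustomerId      NVARCHAR(50),
  @PolicyType      NVARCHAR(50),
  @Status          NVARCHAR(50)
AS
BEGIN
  SET NOCOUNT ON;
  INSERT INTO dbo.PolicyRequests (PolicyRequestId, CustomerId, PolicyType, Status)
  VALUES (@PolicyRequestId, @CustomerId, @PolicyType, @Status);
END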

Figure 2 shows how Microsoft Flow Designer lets you select a trigger event on the SharePoint list in Office 365 when creating or updating an item or file.

Figure 2 Microsoft Flow Trigger Events

Figure 3 shows the flow that’s triggered when a policy request is inserted into this list. The Microsoft Flow Designer reads the metadata of the stored procedure and automatically displays a form based on its input parameters. Placing the cursor on the form input fields launches the card shown on the right in Figure 3, displaying attributes from the preceding activities you can choose from to map to the stored procedure input parameters.

Figure 3 Insurance Policy Request Sync Flow

I’ll create an additional flow for syncing updates to ensure the status of the policy approval request is regularly updated from Office 365 to the target Azure SQL database.

Note that each of the connectors used in this scenario requires a connection that must be configured first, and a user account with sufficient rights to read data from a custom list in the Office 365 SharePoint Site. Likewise, the Azure SQL Database connector requires user credentials for authentication to the database.

I can test the flow I authored by inserting a record into the custom list. The data should be replicated into the Azure SQL Database.

Policy Documents Synchronization Flow

Once the insurance policy request is approved (on completion of a back-office process that's out of scope of this article), a policy document is generated and uploaded to a document library in Office 365 SharePoint. A Microsoft Flow then replicates the policy document to Azure Blob Storage. Separate Microsoft Flow processes are authored to insert a new document in Azure Storage and to replace the original document if it's updated in Office 365. The document contains certain key words, such as the registration number of the vehicle being insured. Later in the article, I'll use Azure Search to execute a full-text search within the documents based on specific key words. Figure 4 shows the Microsoft Flow implemented to synchronize documents that are uploaded for the first time in Office 365.

Figure 4 Policy Documents Synchronization Flow

I can test the flow by uploading any document to the SharePoint Document Library. The documents should get replicated to the Azure storage container that was configured in the flow.

Customer Profile Data Synchronization

When a customer first registers with the insurance provider, a contact is created in Dynamics CRM Online. A Microsoft Flow is then triggered that uses the Dynamics CRM Online Connector to pick up the contact and insert it into an Azure SQL Database. Figure 5 shows how this process can be implemented.

Figure 5 Contact Data Flow

Note that you need to configure a connection to Dynamics CRM Online first, with a user credential that has permission to connect to the organization and a security role that allows access to the data. Further, you should enable change tracking on the table in Dynamics CRM Online where the “insert events” are to be captured, which, in this case, is the Contacts table.

Microsoft Flow lets you view the results of a process execution, with information about the request and response pertaining to each activity in it.

Figure 6 shows the outcome of a contact data synchronization process. Selecting an activity on the left opens a card with the details of the input and output payload, as shown on the right.

Figure 6 Flow Execution

To test this flow, I can create a contact from the CRM Online portal. The contact data should get replicated to the Azure SQL Database.

Later, I’ll use the contact data to identify a user logged in to a Skype bot and retrieve the approval status of the user’s insurance policy application request. I’ll also use the contact information to schedule a site inspection visit from the bot.

I’ve implemented the process to consolidate the data from all the different LOB applications, but I need to complete certain steps before I consume the data in Azure Search.

Enabling Change Tracking in Azure SQL Database: Azure Search uses the built-in indexer for Azure SQL Database to crawl the data and build the index. Change tracking needs to be enabled on the database and on all the tables, so that Azure Search can crawl incrementally rather than rebuild the index from scratch on every run:

-- Enable change tracking at the database level, retaining change
-- information for two days and cleaning it up automatically
ALTER DATABASE PolicyInfoDB SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);
-- Enable change tracking on each table that Azure Search crawls
ALTER TABLE CrmCustomerData ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = OFF);
ALTER TABLE PolicyRequests ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = OFF);
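To confirm that change tracking took effect, you can query the SQL Server catalog views:

-- Lists the databases and tables that have change tracking enabled
SELECT * FROM sys.change_tracking_databases;
SELECT * FROM sys.change_tracking_tables;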

You can also track deletion of records in database tables and of documents in Azure Storage during the indexing process, through configuration.
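For example, here's a sketch of a data source definition that adds a soft-delete detection policy; the data source name and the IsDeleted column are hypothetical and assume the table flags deleted rows rather than removing them outright:

{
  "name" : "policyrequestsdatasource",
  "type" : "azuresql",
  "credentials" : { "connectionString" : "<connection string>" },
  "container" : { "name" : "PolicyRequests" },
  "dataDeletionDetectionPolicy" : {
    "@odata.type" : "#Microsoft.Azure.Search.SoftDeleteColumnDeletionDetectionPolicy",
    "softDeleteColumnName" : "IsDeleted",
    "softDeleteMarkerValue" : "true"
  }
}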

Configuring Azure Search to Index the PolicyRequests and CrmCustomerData Tables: Use the Azure Portal to create an Azure Search service, then select the option to Import Data from an Azure SQL Database. The wizard takes you through the process of connecting to the database, creating a data source for the search, sampling the data to read its metadata and selecting the necessary attributes to use, and ends with the index refresh schedule. You configure a separate index for each of the tables. Figure 7 shows the search index configuration page for the database.

Figure 7 Configure Azure Search to Index Database Table

The Azure Search configuration page detects that change tracking is enabled on the database table being indexed. I’ve set the indexing frequency to Hourly; other options include Once, Daily and Custom.

Once the indexes are configured, Azure Search immediately triggers an indexing operation on the data source. You can check the status of each index to view the number of records that have been populated into the index.

The Azure Search settings blade provides a search explorer that gives you an easy way to set search parameters, view the results and verify the configuration.
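For a quick sanity check, you can issue queries like the following from the search explorer (the values are illustrative); the first returns all indexed documents along with a count, and the second searches for a specific policy request:

search=*&$count=true
search=VehPol0101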

Configuring the Azure Search Index for Policy Documents: Configuring Azure Search to index and search Azure Storage blobs is much like the database configuration just described, though I should note that this feature is still in preview at this time.

Later in this article I’m going to use Azure Search to query for documents based on the policy document generation date and on key words like the vehicle registration number, which is stored within the documents. Azure Search supports the Apache Lucene Query Language, which provides advanced querying capabilities.

Note that blob storage has a property, metadata_storage_last_modified, which is a date-time field. The data type of its counterpart in Azure Search, which the indexer matches it to, is Edm.DateTimeOffset, which isn't searchable. Because I want to be able to retrieve policy documents based on the year they're issued, this needs to be a queryable field.

To address this, I could change the data type in the index to Edm.String, which is searchable. However, because the indexer uses this field as a flag to determine whether a file has changed since the last indexing operation and whether it should be indexed again, I’d risk breaking that functionality by changing the data type.

Instead, I can add a supplemental field to the index to store the same value using an Edm.String data type. However, this means I can’t use the Azure Portal UX to configure the index, because it doesn’t allow customizing the logic for mapping fields from Azure Storage to the index.

To get around this, I’ll use the Postman client (getpostman.com/apps), a free downloadable tool, to invoke the Azure Search Service REST APIs, and to complete the following tasks:

  • Creating the data source: This can be done either from the Azure Portal or using the REST API from the Postman client.
  • Configuring the index: Because I’m defining the fields in the index manually, I can provide names that are more user-friendly than the default Search configuration ones. The “filepath” field will be set as the key field in the index. Notice that there are two placeholder fields for the last modified date, one based on Edm.DateTimeOffset and the other on Edm.String:
{
  "name" : "policydocuments-index",
  "fields" : [
    { "name": "filepath", "type": "Edm.String", "key": true,
      "searchable": true },
    { "name": "content", "type": "Edm.String", "searchable": true },
    { "name": "filesize", "type": "Edm.String", "searchable": true },
    { "name": "author", "type": "Edm.String", "searchable": true },
    { "name": "filename", "type": "Edm.String", "searchable": true },
    { "name": "lastmoddate", "type": "Edm.String", "searchable": true },
    { "name": "contenttype", "type": "Edm.String", "searchable": true },
    { "name": "modifieddate", "type": "Edm.DateTimeOffset",
      "searchable": false }
  ]
}
  • Creating the indexer and mapping fields: This step creates the indexer, defines the index schedule and maps the fields in Azure Storage to those in the index defined in the previous step:
{
  "name" : "policydocuments-indexer",
  "dataSourceName" : "policiesrepodatasource",
  "targetIndexName" : "policydocuments-index",
  "schedule" : { "interval" : "PT2H" },
  "fieldMappings" : [
    { "sourceFieldName" : "metadata_storage_name",
      "targetFieldName" : "filename" },
    { "sourceFieldName" : "metadata_storage_size",
      "targetFieldName" : "filesize" },
    { "sourceFieldName" : "metadata_author",
      "targetFieldName" : "author" },
    { "sourceFieldName" : "metadata_storage_last_modified",
      "targetFieldName" : "modifieddate" },
    { "sourceFieldName" : "metadata_storage_last_modified",
      "targetFieldName" : "lastmoddate" },
    { "sourceFieldName" : "metadata_content_type",
      "targetFieldName" : "contenttype" },
    { "sourceFieldName" : "metadata_storage_path",
      "targetFieldName" : "filepath" }
  ],
  "parameters" : { "base64EncodeKeys": true }
}
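For reference, here's a sketch of the Postman requests these JSON bodies are posted with. The service name and the api-key value are placeholders for your own, and because blob indexing is still in preview, a preview api-version may be required in place of the one shown:

POST https://[service].search.windows.net/datasources?api-version=2016-09-01
POST https://[service].search.windows.net/indexes?api-version=2016-09-01
POST https://[service].search.windows.net/indexers?api-version=2016-09-01

Each request carries a Content-Type: application/json header, an api-key header set to an admin key of the search service, and the corresponding JSON definition as the request body.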

Now I can use the Azure Search explorer in the portal to verify that the documents have indeed been indexed, and the attribute values are mapped as expected.

Software Prerequisites for Implementing This Scenario

Apart from an Azure subscription that’s required to implement the scenario described here, you’ll also need the following development tools and additional software:

  • SQL Server Management Studio 2014 or later to create and connect to the Azure SQL Database.
  • A Dynamics CRM Online subscription, with a user account that can connect to the organization where the contacts will be created and has the necessary security roles to read the contacts table. Change tracking should be enabled on this table for its “on Create” trigger to function.
  • Access to an Office 365 SharePoint site and a user account with contribute rights to that site.
  • A work account to sign in and author the business processes for data synchronization using Microsoft Flow. Get started with it at flow.microsoft.com.
  • To use the LUIS service, sign up with your work account (or Microsoft account) at luis.ai. In Part 2 of this article, I’ll use this service to model the conversations mentioned in this article.

Note: Microsoft Flow is in preview as of this writing; it can't yet handle exceptions, implement retry logic or expose the code behind a process flow for editing. All the processes presented in this scenario can also be authored using Azure Logic Apps, which provides the features of Microsoft Flow along with additional capabilities, such as exception handling and a code editor for manipulating the flow definition.

Executing the Scenario

Running this scenario end to end requires a Web application that lets consumers register using their Microsoft credentials and submit an insurance policy request. This application would then create a contact in Dynamics CRM Online, generate a contact ID and use this value in the insurance policy request created in the list in Office 365. However, because such an application is out of scope of this article, you’ll have to perform these steps manually and separately.

Log in to the Dynamics CRM Online portal and create a contact. Ensure that you capture details like the mobile number and the Microsoft account for the customer, as these will be required later when working with the bot. The Microsoft Flow bound to the Contacts table will be triggered, invoking a stored procedure in Azure SQL Database and inserting a customer record into the CrmCustomerData table. Note down the value of the customer ID stored in the table for the next step.

Manually create an insurance policy request in the SharePoint custom list in Office 365. Ensure the customer ID generated for this customer in Dynamics CRM Online is captured in the request. Figure 8 shows a list in SharePoint where the request is captured. A Microsoft Flow process bound to this list will be triggered when the record is saved, or when it’s updated later during the approval process. Ensure that the policy request is replicated to the Azure SQL database PolicyRequests table.
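You can verify the replication quickly from SQL Server Management Studio. A minimal check (the table names follow the creation scripts in the GitHub repository; adjust if your schema differs):

-- Confirm the synchronized contact and policy request rows arrived
SELECT * FROM dbo.CrmCustomerData;
SELECT * FROM dbo.PolicyRequests;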

Figure 8 Insurance Policy Request in Office 365 List

Next, create PDF document samples and ensure they contain vehicle registration or identification numbers to be used during document search. Upload these documents to the SharePoint library that’s bound to the Microsoft Flow process for document synchronization. Verify that these documents are replicated to the Azure storage account configured in the flow.

Manually run the indexer to ensure that the data synchronized in the previous steps is indexed and available to be searched. Launch the Azure Search explorer, query for records in each of the tables and for the policy documents you've uploaded, and ensure they're returned in the search results.
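You can also trigger and monitor an indexer on demand through the REST API. Here's a sketch; the service name is a placeholder, and both calls require the api-key header:

POST https://[service].search.windows.net/indexers/policydocuments-indexer/run?api-version=2016-09-01
GET https://[service].search.windows.net/indexers/policydocuments-indexer/status?api-version=2016-09-01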

Natural Language Utterances in the Bot App

Users of the bot on Skype or other channels can type their requests in natural-language style, which the bot should interpret, identifying the user's intent and the entities in the request, and then use to query Azure Search and fetch the results. For the current scenario, the bot ought to cover all the ways its users might pose questions on the status of their insurance policy requests, download their approved policy documents or query policy documents from previous years. Here are some examples of utterances the bot should be able to interpret:

  • What is the status of my insurance policy request?
  • Is my policy request # VehPol0101 approved?
  • Is there an update to my insurance policy request # VehPol0101?
  • Show me my policy document.
  • Get me the policy document.
  • Get me last year’s policy document for Vehicle # KA 02A 8534.
  • Show me the policy document for vehicle # KA 02A 8534 issued in 2014.
  • Can you schedule a site inspection visit on Tuesday next week for vehicle # KA 02A 8534?

In my next article, I'll build a bot application with natural-language support. To do so, I'll create a LUIS model that registers these natural-language utterances, train it to identify the user intent in them, and determine the supporting parameters or entities with which to query Azure Search.

Artifacts Available for Download

The following artifacts used to implement this scenario are available for download from the GitHub repository at bit.ly/2cOfANh:

  • Azure SQL Database table creation scripts for the CrmCustomerData and PolicyRequests tables.
  • Stored procedures used in the preceding database tables.
  • Screenshots of the schema of the Office 365 list that stores the policy requests, and that of the document library where the policy documents are uploaded.

I will be uploading other artifacts pertaining to the next article to the same location.

Wrapping Up

Organizations today deal with a variety of applications that need to work together to solve their business problems. This is often a challenge in that information is distributed across the enterprise, making it difficult to consolidate and bring within reach of consumers. Microsoft Flow helps to consolidate information from across the enterprise and Azure Search provides a flexible and rich querying engine for accessing the data. There’s just one service interface to integrate with to surface all of the information, reducing the time to build and deploy your client-side applications. 

In Part 2 of this article series, I’ll consume the information exposed by Azure Search from a consumer-facing application, such as a bot for Skype. To do so, I’ll create a LUIS model to interpret user messages from the bot and translate them into structured requests for Azure Search.


Srikantan Sankaran is a technical evangelist from the DX team in India, based in Bangalore. He works with numerous ISVs in India and helps them architect and deploy their solutions on Microsoft Azure. Reach him at sansri@microsoft.com.

Thanks to the following Microsoft technical experts for reviewing this article: Sandeep Alur and Paul Bouwer
Sandeep Alur is the Lead Evangelist for DX India and is based in Bangalore.

Paul Bouwer is a Senior SDE from the Technical Evangelism and Development Team, and is based in Brisbane.

