Extract and analyze call center data


This article describes how to extract insights from customer conversations at a call center by using Azure AI services and Azure OpenAI Service. Use these real-time and post-call analytics to improve call center efficiency and customer satisfaction.


Architecture

Diagram that shows the call center AI architecture.


Workflow

  1. A phone call between an agent and a customer is recorded and stored in Azure Blob Storage. Audio files are uploaded to an Azure Storage account via a supported method, such as the UI-based tool Azure Storage Explorer or a Storage SDK or API.

  2. Azure AI Speech is used to transcribe audio files in batch mode asynchronously with speaker diarization enabled. The transcription results are persisted in Blob Storage.

  3. Azure AI Language is used to detect and redact personal data in the transcript.

    For batch mode transcription and personal data detection and redaction, use the AI services Ingestion Client tool. The Ingestion Client tool uses a no-code approach for call center transcription.

  4. Azure OpenAI is used to process the transcript and extract entities, summarize the conversation, and analyze sentiments. The processed output is stored in Blob Storage and then analyzed and visualized by using other services. You can also store the output in a datastore for keeping track of metadata and for reporting. Use Azure OpenAI to process the stored transcription information.

  5. Power BI or a custom web application that's hosted by App Service is used to visualize the output. Both options provide near real-time insights. You can store this output in a customer relationship management (CRM), so agents have contextual information about why the customer called and can quickly solve potential problems. This process is fully automated, which saves the agents time and effort.
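Step 2 above submits recordings to the Speech batch transcription REST API with diarization turned on. The following sketch builds that job request using only the standard library; the region, key, and SAS URLs are placeholders, and the v3.1 API version is an assumption based on current Speech service documentation.

```python
import json
import urllib.request

SPEECH_REGION = "eastus"          # assumption: your Speech resource region
SPEECH_KEY = "<your-speech-key>"  # placeholder credential

def build_transcription_request(audio_urls, results_container_sas):
    """Build the JSON body for a Speech batch transcription job
    with speaker diarization enabled (speech-to-text v3.1)."""
    return {
        "contentUrls": audio_urls,      # SAS URLs of recordings in Blob Storage
        "locale": "en-US",
        "displayName": "call-center-batch",
        "properties": {
            "diarizationEnabled": True,  # separate agent and customer speech
            "wordLevelTimestampsEnabled": True,
            # persist result files back to your own Blob Storage container
            "destinationContainerUrl": results_container_sas,
        },
    }

def submit_transcription(body):
    """POST the job to the batch transcription endpoint (not invoked here)."""
    url = (f"https://{SPEECH_REGION}.api.cognitive.microsoft.com"
           "/speechtotext/v3.1/transcriptions")
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": SPEECH_KEY,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # job URL is returned on success
        return json.loads(resp.read())

body = build_transcription_request(
    ["https://<account>.blob.core.windows.net/calls/call1.wav?<sas>"],
    "https://<account>.blob.core.windows.net/transcripts?<sas>",
)
```

The `destinationContainerUrl` property is what persists the transcription results in Blob Storage, as the workflow describes; without it, results stay in service-managed storage.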
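Step 3, personal data detection and redaction, maps to the Language service's `PiiEntityRecognition` task. A minimal stdlib sketch of that request follows; the endpoint, key, and API version are placeholder assumptions.

```python
import json
import urllib.request

LANGUAGE_ENDPOINT = "https://<your-language-resource>.cognitiveservices.azure.com"  # placeholder
LANGUAGE_KEY = "<your-language-key>"                                                # placeholder

def build_pii_request(transcript_lines):
    """Build an analyze-text request that detects and redacts personal data."""
    return {
        "kind": "PiiEntityRecognition",
        "parameters": {"modelVersion": "latest"},
        "analysisInput": {
            "documents": [
                {"id": str(i), "language": "en", "text": line}
                for i, line in enumerate(transcript_lines, start=1)
            ]
        },
    }

def redact(payload):
    """Call the Language service and return redacted text per document
    (not invoked here; requires a live resource)."""
    url = f"{LANGUAGE_ENDPOINT}/language/:analyze-text?api-version=2023-04-01"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": LANGUAGE_KEY,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())
    # each document result carries a redactedText field with PII masked out
    return [doc["redactedText"] for doc in result["results"]["documents"]]

payload = build_pii_request(
    ["Hi, this is Jane Doe. My card number is 4111 1111 1111 1111."]
)
```

In practice the Ingestion Client tool handles this call for you; the sketch shows what it does under the hood.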
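Step 4 sends the redacted transcript to an Azure OpenAI chat model and asks for entities, a summary, and sentiment in one call. The sketch below builds that prompt; the endpoint, key, deployment name, and API version are placeholder assumptions, and the system prompt is illustrative, not prescribed by this architecture.

```python
import json
import urllib.request

AOAI_ENDPOINT = "https://<your-openai-resource>.openai.azure.com"  # placeholder
AOAI_KEY = "<your-openai-key>"                                     # placeholder
DEPLOYMENT = "<your-deployment-name>"                              # placeholder

def build_analysis_messages(redacted_transcript):
    """Build a chat prompt that asks the model for entities, a summary,
    and sentiment in a single structured response."""
    return [
        {"role": "system",
         "content": "You analyze call center transcripts. Return JSON with "
                    "keys 'entities', 'summary', and 'sentiment'."},
        {"role": "user", "content": redacted_transcript},
    ]

def analyze(messages):
    """Call the Azure OpenAI chat completions API (not invoked here)."""
    url = (f"{AOAI_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
           "/chat/completions?api-version=2024-02-01")
    req = urllib.request.Request(
        url,
        data=json.dumps({"messages": messages, "temperature": 0}).encode("utf-8"),
        headers={"api-key": AOAI_KEY, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())
    return result["choices"][0]["message"]["content"]

messages = build_analysis_messages(
    "Agent: Thanks for calling. Customer: My invoice amount looks wrong ..."
)
```

Storing the model's JSON output in Blob Storage or a datastore, as step 4 describes, keeps the extracted entities and sentiment queryable for the Power BI layer.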


Components

  • Blob Storage is the object storage solution for raw files in this scenario. Blob Storage supports libraries for languages like .NET, Node.js, and Python. Applications can access files in Blob Storage via HTTP or HTTPS. Blob Storage provides hot, cool, and archive access tiers for storing large amounts of data, which helps optimize cost.

  • Azure OpenAI provides access to the Azure OpenAI language models, including the GPT-3, Codex, and embeddings model series, for content generation, summarization, semantic search, and natural language-to-code translation. You can access the service through REST APIs, the Python SDK, or the web-based interface in Azure OpenAI Studio.

  • Azure AI Speech is an AI-based API that provides speech capabilities like speech-to-text, text-to-speech, speech translation, and speaker recognition. This architecture uses the Azure AI Speech batch transcription functionality.

  • Azure AI Language consolidates the Azure natural-language processing services. For information about prebuilt and customizable options, see Azure AI Language available features.

  • Language Studio provides a UI for exploring and analyzing AI services for language features. Language Studio provides options for building, tagging, training, and deploying custom models.

  • Power BI is a software-as-a-service (SaaS) platform that provides visual and interactive insights for business analytics. It provides transformation capabilities and connects to other data sources.
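As the Blob Storage component notes, applications can reach blobs directly over HTTPS. A minimal sketch of the Put Blob REST operation that uploads a recording with SAS-token authorization follows; the account name, SAS token, and audio bytes are placeholders.

```python
import urllib.request

def build_upload_request(account, container, blob_name, sas_token, audio_bytes):
    """Build a Put Blob request that uploads a call recording over HTTPS,
    authorized by a SAS token appended to the URL."""
    url = (f"https://{account}.blob.core.windows.net"
           f"/{container}/{blob_name}?{sas_token}")
    return urllib.request.Request(
        url,
        data=audio_bytes,
        headers={
            "x-ms-blob-type": "BlockBlob",  # required header for Put Blob
            "Content-Type": "audio/wav",
        },
        method="PUT",
    )

# placeholder values; real code would read the bytes from a recording file
req = build_upload_request("<account>", "calls", "call1.wav", "<sas>", b"RIFF...")
# urllib.request.urlopen(req) would perform the upload
```

The Azure Storage SDKs and Azure Storage Explorer wrap this same REST operation, so any of the upload methods in step 1 of the workflow lands the file in the same container.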


Depending on your scenario, you can add other workflows to this solution.

Scenario details

This solution uses Azure AI Speech to convert audio into written text. Azure AI Language redacts sensitive information in the conversation transcription. Azure OpenAI extracts insights from customer conversations to improve call center efficiency and customer satisfaction. Use this solution to process transcribed text, recognize and remove sensitive information, and perform sentiment analysis. Scale the services and the pipeline to accommodate any volume of recorded data.

Potential use cases

This solution provides value to organizations in industries like telecommunications and financial services. It applies to any organization that records conversations. Customer-facing or internal call centers or support desks benefit from using this solution.


Considerations

These considerations implement the pillars of the Azure Well-Architected Framework, which is a set of guiding tenets that can be used to improve the quality of a workload. For more information, see Microsoft Azure Well-Architected Framework.


Reliability

Reliability ensures your application can meet the commitments you make to your customers. For more information, see Overview of the reliability pillar.


Security

Security provides assurances against deliberate attacks and the abuse of your valuable data and systems. For more information, see Overview of the security pillar.

Cost optimization

Cost optimization is about looking at ways to reduce unnecessary expenses and improve operational efficiencies. For more information, see Overview of the cost optimization pillar.

The total cost of this solution depends on the pricing tier of your services. Factors that can affect the price of each component are:

  • The number of documents that you process.
  • The number of concurrent requests that your application receives.
  • The size of the data that you store after processing.
  • Your deployment region.

To estimate the cost of this solution, use the Azure pricing calculator.

Performance efficiency

Performance efficiency is the ability of your workload to meet the demands placed on it by users in an efficient manner. For more information, see Overview of the performance efficiency pillar.

Processing high volumes of data can expose performance bottlenecks. To ensure proper performance efficiency, understand and plan for the scaling options that you use with the AI services autoscale feature.

The batch speech API is designed for high volumes, but other AI services APIs might have request limits, depending on the subscription tier. Consider containerizing AI services APIs to avoid slowing down large-volume processing. Containers provide deployment flexibility in the cloud and on-premises, and they also mitigate the side effects of new version rollouts. For more information, see Container support in AI services.


Contributors

This article is maintained by Microsoft. It was originally written by the following contributors.

Principal authors:

  • Dixit Arora | Senior Customer Engineer, ISV DN CoE
  • Jyotsna Ravi | Principal Customer Engineer, ISV DN CoE


Next steps