June 2016

Volume 31 Number 6

[Power BI]

Microsoft Azure Media Services and Power BI

By Sagar Bhanudas | June 2016

Microsoft Azure Media Services offers a rich platform for developers and independent software vendors to deliver video-on-demand and live-streaming experiences over the Web and native apps. To enrich consumer experiences and gain insights into context/application usage, it’s important to weave a robust cross-platform solution into the back end for analytics and data visualization. As of this writing, the Azure Media Services platform doesn’t offer analytics out of the box; hence, developers are often challenged when the business demands usage data from an analytics standpoint.

This article focuses on helping developers build an analytics platform on top of Azure Media Services (and Player) to surface usage trends. The solution uses an intermediate (Web API) service and a database, with the visualization ultimately handled by Power BI.

The Scenario

Most organizations creating or embedding media content need to delve into usage/analytics data in an effort to improve the end-user experience. To achieve this, developers need to record some of the key performance indicators (KPIs) related to their video/media consumption. Here are some of the most common and desirable KPIs:

  • Which is the most watched video?
  • How many people watched the video until completion?
  • How many times was the video paused and at which position?
  • Which is the most streamed format/bit rate?
  • Platform information and other demographic details.
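Once raw player events are logged, they can be rolled up into KPIs like these. The following sketch is illustrative only; the event shape and field names are assumptions for this example, not a schema from the article:

```javascript
// Sketch: aggregate raw player events into a few of the KPIs above.
// Event objects are assumed to carry a type ("loaded", "pause", "ended"),
// a media identifier and, for pauses, a playback position in seconds.
function computeKpis(events) {
  var kpis = { viewsByVideo: {}, completionsByVideo: {}, pausePositions: [] };
  events.forEach(function (e) {
    if (e.type === "loaded") {
      // Count a view each time the content is loaded
      kpis.viewsByVideo[e.mediaRef] = (kpis.viewsByVideo[e.mediaRef] || 0) + 1;
    } else if (e.type === "ended") {
      // Count completions to answer "how many watched until the end?"
      kpis.completionsByVideo[e.mediaRef] = (kpis.completionsByVideo[e.mediaRef] || 0) + 1;
    } else if (e.type === "pause") {
      // Record where in the video users pause
      kpis.pausePositions.push({ mediaRef: e.mediaRef, position: e.eventTime });
    }
  });
  return kpis;
}
```

In practice, aggregations like this would run in the reporting database or in Power BI itself, but the logic is the same.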

Deciding on the right mix of services and tools to deliver these KPIs takes a significant amount of time and energy, whether you create each component from the ground up or build on top of proven platform services.

Thankfully, the Azure Cloud Platform has a rich set of services that can be leveraged in the overall solution design to capture and address analytics requirements. The following components will be used for this scenario:

  • Azure Media Services
  • Azure Media Player
  • Azure Web Apps
  • Azure SQL Database
  • Power BI

The idea here is to capture raw data from Azure Media Player and feed it back to a middle tier (Web API), which brokers the connection with the managed Azure SQL (reporting) database. You then connect Power BI to the data sources within the reporting database to surface trends in media usage/consumption through visualizations.

Further sections in this article detail the actual implementation of enabling this scenario.

Dynamic Packaging and Preparation for Video Consumption

The best part of using Azure Media Services is that you can take your video/audio content and prepare it for consumption on various platforms. You can achieve this either by using the Azure Portal or by performing the same steps from code if you need to automate the steps/solution. This section of the scenario discusses the following tasks:

  • Identify the content (in this case, I’ll choose a demo video content) to be consumed on demand
  • Upload the video to Azure Media Services (backed by an Azure Storage account)
  • Monitor the upload progress and submit the job for dynamic packaging
  • Get the relevant URLs for consumption on various platforms

I’ll use an Azure sample to get started with the C# console app for achieving these tasks. Though this sample leverages NuGet packages and the C# program, you can use SDKs available in other languages, as well. The most important code here is to submit the video for encoding and get the URLs, as shown in Figure 1.

Figure 1 Code for Submitting an Encoding Job

static public IAsset EncodeToAdaptiveBitrateMP4s(IAsset asset,
    AssetCreationOptions options)
{
  // Prepare a job with a single task to transcode the asset
  // into a multi-bitrate MP4 set.
  IJob job = _context.Jobs.CreateWithSingleTask(
    "Media Encoder Standard",
    "H264 Multiple Bitrate 720p",
    asset,
    "Adaptive Bitrate MP4",
    options);

  Console.WriteLine("Submitting transcoding job...");
  job.Submit();

  // Track execution progress while the job runs.
  job = job.StartExecutionProgressTask(
    j =>
    {
      Console.WriteLine("Job state: {0}", j.State);
      Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
    },
    CancellationToken.None).Result;

  Console.WriteLine("Transcoding job finished.");

  IAsset outputAsset = job.OutputMediaAssets[0];
  return outputAsset;
}

If you’re new to Azure Media Services, here’s a quick list of terminologies:

  • Asset (or IAsset)—an entity representing a media package within Azure Media Services. It may contain one or more content files.
  • Job (or IJob)—an entity representing a unit of “encoding” work to be performed by the Azure Media Service. Think of it as converting a file from one format to another.
  • Adaptive Bitrate—an encoding format that adapts to CPU/network capability of the target system and delivers content matching the criteria that best suits the device. You just have to create the Adaptive Bitrate files and Azure Media Services will identify the right bitrate to be streamed to the client device.

Now, putting the pieces together from the code in Figure 1, the function submits an “asset” to the Media Services Standard Encoder for conversion from the source format to an Adaptive Bitrate MP4 asset.

The sample also shows job progress, or you can track it through the Azure Portal as shown in Figure 2 in the “jobs” section. (I have blanked out the ID field values.)

Figure 2 Azure Media Services Sample Showing Job Progress

Now that you’ve successfully created a media asset ready to be consumed across devices and platforms, you must be sure to let Azure Media Services know the platform of the target device so that it can stream the right encoding and content format. For example, Windows devices generally support playing the Silverlight Smooth Streaming format:

assetvideo.ism/Manifest

whereas iOS devices support the HLS video format, among others:

assetvideo.ism/Manifest(format=m3u8-aapl)

Notice the trailing few letters of the URL, which are appended after “Manifest.” These format notations help the Azure Media Services endpoint identify the content format to be streamed to the device. Now that you have the media ready to be streamed to the device, let’s create an HTML page for consuming the video through Azure Media Player.
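A small helper can centralize these format notations. This sketch uses the documented dynamic-packaging suffixes for Smooth Streaming, HLS and MPEG-DASH; the function name, base URL and platform keys are illustrative assumptions:

```javascript
// Sketch: build a streaming URL for a given target platform by
// appending the Azure Media Services dynamic-packaging format suffix
// after "Manifest". Suffixes follow the documented AMS conventions.
function manifestUrl(baseUrl, platform) {
  var suffixes = {
    "smooth": "",                    // Smooth Streaming (default manifest)
    "hls": "(format=m3u8-aapl)",     // Apple HLS, for iOS devices
    "dash": "(format=mpd-time-csf)"  // MPEG-DASH
  };
  var suffix = suffixes[platform];
  if (suffix === undefined) {
    throw new Error("Unknown platform: " + platform);
  }
  return baseUrl + "/assetvideo.ism/Manifest" + suffix;
}
```

For example, `manifestUrl(origin, "hls")` yields the HLS form of the URL shown earlier.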

Consuming Media and Sending Analytics Data to the Back End

Azure Media Player is a Web video player that complements the playback of Azure Media Services content on the client side. It works with an underlying browser platform to render the video content (primarily from Azure Media Services) with minimal configuration in JavaScript code. However, the goal is not only to play the media but also to derive analytical data of the usage and log it to the back end. Because the objective is to view the aggregate data toward the end of the solution for analysis, I’ll focus only on a select number of parameters to be recorded; for example, an identifier of the video content, title of the video, pause time, timestamp and extra remarks/data.

Hence, the HTML code for player looks like that in Figure 3.

Figure 3 The HTML Front-End Code

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <title>Welcome to the awesome world of Azure Media Services</title>
  <link href="//amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css"
    rel="stylesheet">
  <script src="//amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>
  <script src="https://code.jquery.com/jquery-2.2.3.min.js"></script>
</head>
<body>
  <video id="azuremediaplayer" class="azuremediaplayer
    amp-default-skin amp-big-play-centered" tabindex="0"></video>
  <script src="scripts/Player/Player.js"></script>
</body>
</html>

The HTML (design) code in Figure 3 simply declares the video element along with the Azure Media Player-specific properties that define the UI of the player. Notice the Player.js file, which performs the tasks of detecting the platform, specifying the video URL, creating an analytics data object and sending it asynchronously to the custom back end.

The simplest code in Player.js can look something like that in Figure 4.

Figure 4 Setting Video Source and Capturing Azure Media Player Attributes

var myOptions = {
  "nativeControlsForTouch": true,
  controls: true,
  autoplay: true,
  width: "640",
  height: "400"
};
var myPlayer = amp("azuremediaplayer", myOptions);

// For feature detection, you can use libraries like Modernizr and then
// construct the URL for the format the platform supports.
myPlayer.src([{
  // Placeholder URL; substitute your streaming endpoint
  "src": "//<your-streaming-endpoint>/assetvideo.ism/Manifest",
  "type": "application/vnd.ms-sstr+xml" // Smooth Streaming
}]);

// Events
myPlayer.addEventListener(amp.eventName.pause, _ampEventHandler);
// More events like:
// Content load complete
// Media completed
// Video seek
// Page unload

function _ampEventHandler(eventDetails) {
  var eventName = eventDetails.type;
  var pauseTime = eventDetails.presentationTimeInSec;
  var title = "Hello Azure Media Service!";
  var ExtraData = "None";
  var dateTime = new Date().toUTCString();
  var data = {
    'MediaRef': 9999,
    'EventTime': pauseTime,
    'MediaTitle': title,
    'TimeStamp': dateTime,
    'ExtraData': ExtraData
  };
  $(function () {
    $.ajax({
      type: "POST",
      data: JSON.stringify(data),
      url: "/api/MediaAnalytics", // Web API endpoint
      contentType: "application/json"
    });
  });
}

Figure 4 shows the simplest form of attributes that can be collected from the Azure Media Player for reference. Your solution’s media player can attach many more event handlers and capture richer metrics. Although the code references a Smooth Streaming format, you can detect the platform features and quickly prepare the media URL for the relevant format of the content.

In this example, a handler is attached to the “pause” event of the Azure Media Player, which calls a back-end Web API through an AJAX call (jQuery) to avoid any blocking interference with the video playback. Hence, whenever the user pauses the video, the back-end call pushes the data asynchronously to the analytics database, without any response being passed back to the HTML page.
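The same non-blocking pattern extends to the other events mentioned in Figure 4. A minimal sketch, assuming only that the player object exposes a standard addEventListener method (the function name and event list are illustrative):

```javascript
// Sketch: register one analytics handler for several player events.
// "pause" is the Azure Media Player event used in Figure 4; the others
// are standard HTML5 media events also surfaced by the player.
function registerAnalyticsEvents(player, handler) {
  var events = ["pause", "loadeddata", "ended", "seeked"];
  events.forEach(function (name) {
    player.addEventListener(name, handler);
  });
  return events.length; // number of handlers attached
}
```

Each handler can then post its own payload to the same Web API endpoint, keeping playback responsive.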

Because JavaScript isn’t a strongly typed language, you can create and append dynamic properties to an object and set their values. The “data” object in this example represents the instance of the attributes captured from Azure Media Player; it’s sent to the back end as a JSON string, one of the most widely used formats for Web API communication.
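For instance, attributes discovered at run time can be appended to the payload just before serialization. The property names added here (Platform, for example) are illustrative, not part of the article’s schema:

```javascript
// Sketch: start from a minimal payload and append dynamic properties
// before serializing it for the Web API call.
var data = {
  MediaRef: 9999,
  MediaTitle: "Video - 21"
};
// Properties appended dynamically at run time (names are illustrative)
data.Platform = "Windows";
data.TimeStamp = new Date().toUTCString();
var json = JSON.stringify(data); // the string posted to the back end
```

The back-end model only needs to be extended (or kept schema-flexible) to pick up the new fields.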

The JSON string looks like this:

{"MediaRef":9999,
"EventTime":54.5,
"MediaTitle":"Video - 21",
"TimeStamp":"Mon, 04 Apr 2016 16:03:36 GMT",
"ExtraData":"None"}

The Web API and the Database

The choice of Web API and database can be across the spectrum because the reporting tool for this scenario is Power BI. However, just to keep this simple, an ASP.NET Web API project has been created to quickly set up the back-end service. This helps you easily set up communication from the front end without having to worry about writing a lot of code for connecting to database systems. Unlike the front-end AJAX call, it’s important that you ensure connectivity and feedback from the database layer. Any issue with CRUD operations against the database might result in failure to capture client-side metrics. This approach also helps in testing the Web API-to-database connections without having to write complex test cases.

The code in Figure 5 shows how the back-end Web API receives the “data” JSON object from the HTML front end and pushes it to the Azure SQL Database.

Figure 5 The Web API and the Database

public class MediaAnalyticsController : ApiController
{
  async public void Post([FromBody]MediaWebAPI.Models.MediaData value)
  {
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
      await connection.OpenAsync();
      int EntryID = await
        SQLConnHelper.SQLWriteAnalyticsDataAsync(value, connection);
    }
  }
}

I’m abstracting much of the data access layer class code and the actual hit to the database, because the options vary here; this general code can stand in for many combinations of middleware plus database. Upon successful entry to the database, you receive the unique ID for the row, EntryID. There are other methods of receiving feedback from the database on a successful operation, and any of them will work here, as well.

To represent the analytics data in a structured format within the database, I’ve created a table closely matching the Model (in MVC) or the JSON object I received from the HTML front end, as shown in Figure 6.

Figure 6 A Simplistic Table Design

Tip: Another quick way to create a back end as a service is to use Azure Mobile Apps, which lets you quickly provision a Web API, as well as an Azure SQL database, during service creation.

Stitching the Workflow Together: Gaining Insights Through Power BI

All the previous phases were building blocks setting the stage for the end goal of this article: analytics/BI on top of the Azure Media Services components.

With the combination of Azure Media Services and Azure Media Player, you were able to quickly deliver the video experience without having to worry much about the encoding and streaming capabilities. However, most organizations are also interested in gaining insights into the usage pattern of the media to surface consumption trends. Azure provides a great visualization platform that enables solution developers to create engaging experiences around its cloud-powered services. Power BI is a data visualization and analytics tool that supports creation of interactive dashboards to monitor metrics/KPIs through an easy-to-configure development environment.

To create the visualizations and surface the trends in media content usage, you’ll use the Power BI Desktop tool to configure and create the reports, which will then be published as dashboards for monitoring the KPIs. In Power BI Desktop, click the Get Data button and select More from the dropdown. You should see the list of various supported database sources and, for this instance, you’d select Microsoft Azure SQL Database, as shown in Figure 7.

Figure 7 Power BI Getting Started

The Get Data wizard then connects to the database and you can specify additional options to import data into the designer. Power BI uses datasets to load data from data sources.

After the data is imported, you’ll create a report using visualization charts/artifacts to reflect the KPIs. In a multi-table scenario, there can be relationships between tables for complex user objects that can be depicted in the model, as well. The datasets and visualizations can be paired through the top-right pane of the Power BI desktop tool, as shown in Figure 8.

Figure 8 Creating Visualization Charts

Each visualization forms a tile interface in the Power BI report to reflect a particular KPI. In this scenario, video completion rates, pause times and most consumed media content should give the overall trend of the usage, which then can be used for marketing campaigns, user-engagement analysis and so on.

Once the KPIs are finalized, they can be pinned to the dashboard, which can form a single view of the product for CXOs, developers, IT, support functions and other interested parties. The tiles can be spread across different dashboards to form the views or KPIs that are of interest to particular teams. For example, CXOs might be interested in viewing media completion rates, demographic details and most consumed categories, whereas IT and engineering teams can choose to monitor other KPIs such as failure rates, most streamed bit rate, network analytics and so on.

Finally, the reports can be published over the Web through the Power BI service (powerbi.com) for broad consumption and usage, as shown in Figure 9.

Figure 9 Power BI Dashboard Displaying Reports

Wrapping Up

This is just the tip of the iceberg regarding solutions and services Microsoft Azure can provide developers and solution providers. Along with video on demand, Azure Media Services can also deliver live-streaming services for richer media experiences. There’s ample documentation around use cases and scenarios for Azure Media Services, Web apps and databases that can help developers provision and start running applications in a matter of minutes. Continuous improvements are incorporated into Azure and the Power BI platform almost every month to enable relevant scenarios for the services.

A little customization can help you address specific needs, and this article is a starting point demonstrating how usage statistics can be captured and extended to provide media-consumption analytics.

Sagar Bhanudas Joshi has worked with developers and ISVs on the Universal Windows Platform and the Microsoft Azure platform for more than six years. His role includes working with ISVs and startups to help them architect, design and onboard solutions and applications to Microsoft Azure, Windows and the Office 365 platform. Joshi lives and works in Mumbai, India. Contact him on Twitter: @sagarjms.

Thanks to the following Microsoft technical expert for reviewing this article: Sandeep J. Alur, Lead Evangelist

Sandeep Alur has over 16 years of industry experience providing technology and architectural guidance to Enterprise customers. His experience ranges from dotcom days to the next generation distributed and cloud computing technology era. At Microsoft, he leads the Technical Evangelism Charter for India, and works with the Start-up & ISV ecosystem to provide them a platform to build solutions on the Cloud Platform.

Discuss this article in the MSDN Magazine forum