
January 2018

Volume 33 Number 1

[Data Points]

Creating Azure Functions to Interact with Cosmos DB

By Julie Lerman

Read the entire EF Core 2 and UWP App Data series:

Using EF Core 2 and Azure Functions to Store UWP App Data Locally and Globally, Part 1
Using EF Core 2 and Azure Functions to Store UWP App Data Locally and Globally, Part 2
Using EF Core 2 and Azure Functions to Store UWP App Data Locally and Globally, Part 3
Using EF Core 2 and Azure Functions to Store UWP App Data Locally and Globally, Part 4

In my last column, I walked you through building a simple Universal Windows Platform (UWP) app—a game called CookieBinge—that was basically an excuse for me to explore using Entity Framework Core 2.0 (EF Core) in a device-bound Windows 10 app. The addition of providers for portable databases such as SQLite completes the picture to make this a possibility.

The app, in its current state, stores game scores locally on the device where the game is being played by using EF Core 2.0 to persist the data to a SQLite database. In that last article, I promised the next iteration of this game would allow users to share that data with other game players on the Internet. To achieve that goal, I’ll be using two cool features of Azure: Azure Cosmos DB and Azure Functions.

A Bit About Azure Cosmos DB

Azure Cosmos DB is the next generation of what began life as Azure DocumentDB—a technology I’ve written about numerous times in this column. “An Overview of Microsoft Azure DocumentDB,” published in June 2015, was followed by two more articles in which I used it as the back end of an Aurelia Web app by way of a Node.js server API.

DocumentDB evolved to become a globally distributed database with some extraordinary features. Besides the ease with which it can be distributed globally, its consistency models have been realigned so that there’s more to choose from than just strong and eventual consistency. Between those two extremes, users can now also choose bounded staleness, session or consistent prefix. There are many more important features to support the data store, and the document database is now joined by a number of other data models—documents accessible via MongoDB APIs, graph databases, table (key/value pair) and column data storage. All of these models are under the Cosmos DB umbrella. In fact, if you had an Azure DocumentDB, it was automatically switched to an Azure Cosmos DB, allowing your existing document database to benefit from all of the new features Cosmos DB brings. You can read much more about Cosmos DB in the Azure documentation.

And a Bit About Azure Functions

I’m going to use Cosmos DB to store the CookieBinge scores. However, rather than write all of the code myself using the Document DB APIs, I’ll take advantage of another relatively new feature of Azure: Azure Functions. Azure Functions is the Microsoft “serverless computing” offering. I’ve always been a skeptic about that phrase because the computing is still on a server ... just not my server. But having finally had a chance to work with Azure Functions, I now have a great respect for the concept. Azure Functions lets you focus on the actual logic you want to perform in the app, while it takes care of the cross-cutting concerns such as deployment, supporting APIs, and connections to other functionality such as data storage or sending e-mails. I didn’t totally understand this until I did it myself, so my hope is that by following my path as I prepare this feature for the CookieBinge app, you’ll also have your aha moment.

Preparation for Building the First Azure Function

There are three ways to build Azure Functions. One is with tooling in Visual Studio 2017. Another is directly in the Azure portal. You can also use Visual Studio Code in combination with the Azure command-line interface (CLI). I decided to start my learning by way of the portal because it walked me through all of the steps I needed. Another benefit is that without the help of the Visual Studio 2017 tooling, I was forced to think a little harder about all of the moving parts. I feel I got a much better understanding that way. Of course, there are lots of fabulous resources for doing this in Visual Studio 2017, as well, but the other thing I like about using the portal is that it’s Web-based and, therefore, a cross-platform option. Keep in mind, while you can deploy your function code and assets from source control into Azure, anything you build directly in the portal will have to be downloaded to your machine (a simple task) and from there, pushed to your repository.

If you want to follow along and don’t already have an Azure subscription, I’m happy to inform you that you can get a free subscription, and it’s not just for a short trial. Some Azure products will be free for a year and there are a few dozen that will always be free. Visit the Azure free account page to get set up. I’m using the account I get as part of my Visual Studio subscription, which has a monthly credit allowance for experimenting with Azure.

Before creating the functions, I need to define my goals. I want my app to be able to:

  • Store user scores on the Web, persisting not only some user information and the date along with the score, but also the type of device the game was played on.
  • Allow users to retrieve their highest scores across all of the devices on which they’re playing.
  • Allow a user to retrieve the highest scores across all users.

I’m not going to bog this lesson down with matters like creating and authenticating accounts, though of course you’d need that for the real world. My aim is to show you Azure Functions and, eventually, the interaction from the UWP app.

Creating the Azure Function in the Azure Portal

Azure functions are grouped in a function app, which allows you to define and share settings across its set of functions. So, I’ll start by creating a new function app. In the Azure portal, click New and filter on “function app” to easily find that option. Click Function App in the results list and then Create, which prompts you to fill out some metadata, such as a name for your app. I named mine, and as this is just a simple demo, I’ll accept the rest of the defaults on the Create page. For easy future access to your new function app, check the Pin to Dashboard option and then click the Create button. It took only about 30 seconds for my new function app’s deployment to be completed.

Now you can add some functions into the function app. My functions will be built to support the list of goals mentioned earlier. The Azure Functions service has a pre-defined (and quite rich) set of events it can respond to, including an HTTP request, a change in a Cosmos DB database, or an event in a blob or a queue. Because I want to call into these functions from the UWP app, I want functions that respond to requests coming over HTTP. The portal provides a slew of templates in a variety of languages: Bash, Batch, C#, F#, JavaScript, PHP, PowerShell, Python and TypeScript. I’ll use C#.

To create the first function inside the function app, click on the plus sign next to the Functions header. You’ll see buttons to create pre-defined functions, but if you scroll down below those buttons, you’ll find a link to create a custom function. Choose that option and you’ll see a scrollable grid filled with template options, as shown in Figure 1. HTTP Trigger – C# should be at the top of that list and that’s what you should select.

Figure 1 A Bit of the Template List for Creating New, Custom Azure Functions

Name the function and then click the Create button. I named mine StoreScores.

The portal will create a function with some default code so you can see how it’s structured. The function is built into a file called run.csx (see Figure 2). You can have additional logic in supporting files, as well, but that’s more advanced than needed for this first look.

Figure 2 Default Function Logic for a New HTTPTrigger

The only method in the example is called Run, which is what Azure will call in response to an HTTP request to this function. It has one parameter to capture the request and another for relaying information to a log.

In the sample, you can see that the function looks for incoming data that represents a name, and it’s flexible enough to search for it both in the query parameters and in the request body. If the name isn’t found, the function returns an HttpResponseMessage with a friendly error message; otherwise, it returns “Hello [name]” in the response.
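For reference, the default code generated by the HTTP Trigger – C# template looked roughly like the following at the time of writing; the exact template has varied across portal versions, so treat this as a sketch rather than a verbatim copy:

```csharp
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Look for "name" among the query string parameters
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Fall back to the request body if it wasn't in the query string
    dynamic data = await req.Content.ReadAsAsync<object>();
    name = name ?? data?.name;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest,
            "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}
```

This code depends on the Azure Functions v1 runtime (which supplies TraceWriter and the HttpRequestMessage extensions), so it runs only inside the Functions host, not as a standalone program.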

Customizing the Function to Interact with Cosmos DB

The goal of the function is to store the incoming data into a Cosmos DB database. Here’s where the magic begins. There’s no need to create connections and commands and other code to do this task. Azure Functions has the ability to integrate easily with a number of other Azure products—and Cosmos DB is one of them.

In the Functions list, you should see your new function and three items below it. One of those items is Integrate. Select that and you’ll see the form partially shown in Figure 3. Notice that it says the trigger is an HTTP request and that the output returns something through HTTP. Because I want to return a success or failure message, I do want to keep that HTTP output. But I also want to add an output that has a Cosmos DB collection as its destination.

Figure 3 Defining the Functions Integration Points

To do this, click on New Output, which will display a list of icons. Scroll down to the one called Azure Cosmos DB, select it and then further down on the page you’ll see a SELECT button. You know what to do. (Click that button!)

The screen to set up this integration gets pre-populated with defaults. The Document Parameter Name represents the parameter you’ll use in run.csx. I’ll leave that as the default name, outputDocument. Next are the names of the Cosmos DB database and the collection within that database, as well as the connection to the Cosmos DB account where the database lives. You’ll also see a checkbox to automatically create the database for you. I already have a few Cosmos DB accounts created, so I’m going to use one of those, but I let my function create a new database named CookieBinge with a collection called Binges in that account. Figure 4 shows how I’ve filled out this form before saving the output definition. Because I marked the checkbox to create the database and collection, those will be created for me, but not when I save this output. When the function first attempts to store data into the database and sees that it doesn’t exist, the function will create the database on the fly.

Figure 4 Defining a Cosmos DB as Output for the Function
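Behind the scenes, the trigger and output settings defined in the Integrate UI are stored in the function’s function.json file. A sketch of what that file might contain for this function follows; the binding type names reflect the Functions v1 schema, and the connection setting name is a placeholder, so treat the details as illustrative:

```json
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "authLevel": "function"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "documentDB",
      "direction": "out",
      "name": "outputDocument",
      "databaseName": "CookieBinge",
      "collectionName": "Binges",
      "createIfNotExists": true,
      "connection": "datapointscosmosdb_DOCUMENTDB"
    }
  ]
}
```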

Customizing Run.csx

Now it’s time to redefine the function code. The new version of the function expects a JSON object passed in that aligns with this BingeRequest class, which I added into the run.csx file below the Run method:

public class BingeRequest {
  public string userId {get;set;}
  public string userName {get;set;}
  public string deviceName {get;set;}
  public DateTime dateTime {get;set;}
  public int score {get;set;}
  public bool worthit {get;set;}
}

But this is not the same structure as the data I want to store because I want to capture one more property—the date and time the data is logged into the database. I’ll do that using a second class, BingeDocument, which inherits from BingeRequest, thereby inheriting all of its properties, as well as adding one more property named logged. The constructor takes a populated BingeRequest and after setting the value of logged, it transfers the BingeRequest values to its own properties:

public class BingeDocument : BingeRequest {
  public BingeDocument(BingeRequest binge, TraceWriter log) {
    logged = DateTime.UtcNow;
    // Transfer the incoming BingeRequest values to this document's properties
    userId = binge.userId; userName = binge.userName; deviceName = binge.deviceName;
    dateTime = binge.dateTime; score = binge.score; worthit = binge.worthit;
  }
  public DateTime logged {get;set;}
}

With these types in place, the Run method can take advantage of them. Figure 5 shows the modified listing for run.csx, including placeholders for the BingeRequest and BingeDocument classes described earlier.

Let’s parse the new Run method. Its signature takes a request and a TraceWriter just as the original did, but now it also has an output parameter of type IAsyncCollector&lt;object&gt; named outputDocument. Whatever gets added to that collector is what will be pushed to the Cosmos DB output I defined. Notice that its name aligns with the Document Parameter Name in the output configuration in Figure 4. The TraceWriter lets me output messages to the log window that’s below the code window. I’ll take these out eventually, but it’s like the old days before IDEs that let you debug. Don’t get me wrong, though. The code window is amazing at parsing the language you’re working in, and when you save, any compiler errors, which are very detailed, also get output to the log window. It also does things like inserting a closing brace when you type an opening brace. In fact, there are an impressive number of editor features; right-click on the editor window to see the long list you can take advantage of.

Figure 5 The New run.csx File Capturing the Binge and Storing It into the Output, Cosmos DB

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req,
  TraceWriter log, IAsyncCollector<object> outputDocument)
{
  BingeRequest bingeData = await req.Content.ReadAsAsync<BingeRequest>();
  log.Verbose("Incoming userId:" + bingeData.userId);
  var doc = new BingeDocument(bingeData, log);
  log.Verbose("Outgoing userId:" + doc.userId);
  await outputDocument.AddAsync(doc);
  if (doc.userId != "")
  {
    return req.CreateResponse(HttpStatusCode.OK, $"{doc.userId} was created");
  }
  else
  {
    return req.CreateResponse(HttpStatusCode.BadRequest,
      "The request was incorrectly formatted.");
  }
}

public class BingeRequest { ... }
public class BingeDocument { ... }

The first line of code in Run asynchronously reads the request and converts its result into a BingeRequest object.

Next, I instantiate a new BingeDocument, passing in the object I just created from the request, which results in a fully populated BingeDocument object, along with the logged property populated.

I then use the TraceWriter to show some data from the request in the logs so, when debugging, I can tell if BingeDocument did indeed get the data from the request object.

Finally, I asynchronously add the BingeDocument I just created into the asynchronous outputDocument.

The result of that outputDocument object is what gets sent to Cosmos DB by the function. It gets stored in the database as JSON—the function converts it in the background for you. Because I wired everything up with the integration settings, I don’t have to write any code to make that happen.
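To give a sense of what ends up in the collection, a stored document might look roughly like this. The first seven values come from my classes; the remaining metadata fields are generated by Cosmos DB, and the values shown here are made up for illustration:

```json
{
  "userId": "54321",
  "userName": "Julie",
  "deviceName": "XBox",
  "dateTime": "2017-10-25T15:26:00",
  "score": 5,
  "worthit": true,
  "logged": "2017-10-25T19:26:03Z",
  "id": "hypothetical-guid",
  "_rid": "hypothetical",
  "_etag": "hypothetical",
  "_ts": 1508959563
}
```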

When all is said and done, I return a message via HttpResponse, relaying the function’s success or failure.

Compiling and Testing the Function

Functions get compiled when you save them. I’m going to randomly delete something important in the code so you can see the compiler in action, with results being displayed in the log window below the code window. Figure 6 shows the compiler output, highlighting the chaos I created by deleting an open brace from line 13. Even without a debugger, I’ve found that seeing these errors helps me work through code I’ve had to write without the aid of IntelliSense or other coding tools in my IDEs. At the same time, I learned how much I depend on those tools!

Figure 6 The Log Window Displaying Compiler Information Including Errors

When the code is fixed up and the compiler is happy, the log will display the “Reloading” message followed by “Compilation succeeded.”

Now it’s time to test the function and you can do that right in the same window where you’re coding the function. To the right of the code editor are two tabbed panes. One displays the list of files related to the function. By default, there are only two, the run.csx file I’m currently looking at and a function.json file that contains all of the settings defined in the UI.  The other tab is for running tests. This built-in test UI is like a mini-Postman or Fiddler application for creating HTTP requests with a lot less effort because it’s already aware of the function to be tested. All you need to do is insert a JSON object to represent the incoming request. The test UI defaults to sending an HTTP Post, so you don’t even need to change that for this test. Enter the following JSON into the Request Body textbox. The schema is important, but you can use whatever values you want:

{
  "userId": "54321",
  "userName": "Julie",
  "deviceName": "XBox",
  "dateTime": "2017-10-25 15:26:00",
  "score": "5",
  "worthit": "true",
  "logged": ""
}

Next, click the Run button on the Test pane. The test will call the function, passing in the request body, and then display any HTTP results in the Output window. In Figure 7, you can see the output “54321 was created,” as well as the log output from the function in the Logs window.

Figure 7 The Test Pane After Running a Test on the Function

Viewing the New Data in the Cosmos DB Database

What you can’t see here is that as a result of this first successful test, the CookieBinge Cosmos DB database was created, and in it, the Binges collection where this document was stored. Let’s take a look at that before wrapping up this installment of my multi-part column.

You can do this by first opening the Cosmos DB account in the portal where you created this database. Mine is called datapointscosmosdb, so I’ll go to All Resources and type datapoints in the filter to find it. Once I open the account, I can see all of the collections and databases there, although the only one I have is my Binges collection in the CookieBinge database, as shown in Figure 8. That’s what just got created by the function.

Figure 8 The Binges Collection Listed in the datapointscosmosdb Account

Click on Binges to open up the Data Explorer for that collection. I’ve run the test twice, so you can see in Figure 9 that two documents were stored in the collection. The first seven properties in the document are the properties I defined. The rest are metadata that Cosmos DB and the relevant APIs use for tasks like indexing, searching, partitioning and more.

Figure 9 Looking at the Stored Documents in Cosmos DB in the Portal

Looking Ahead

If you look back at Figure 7, you’ll see there’s a Get Function URL link above the code window. That URL is what I’ll use in the CookieBinge app to send data to the cloud.
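As a preview of how such a call might work, a client could post a score with a plain HTTP request. The sketch below uses HttpClient from a console app; the URL is a placeholder for the value the Get Function URL link provides, so this is illustrative rather than the actual CookieBinge code:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        // Placeholder URL; substitute the value from the Get Function URL link
        var functionUrl =
          "https://yourfunctionapp.azurewebsites.net/api/StoreScores?code=yourkey";

        var json = @"{ ""userId"": ""54321"", ""userName"": ""Julie"",
                       ""deviceName"": ""XBox"", ""dateTime"": ""2017-10-25 15:26:00"",
                       ""score"": 5, ""worthit"": true }";

        using (var client = new HttpClient())
        {
            // Post the score as JSON and echo the function's response message
            var response = await client.PostAsync(functionUrl,
                new StringContent(json, Encoding.UTF8, "application/json"));
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}
```

Because the URL is a placeholder, running this as-is will fail until you substitute your own function URL and key.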

Now that you’ve seen how to create the function and hook it up with Cosmos DB, my next column will show you how to build two more functions to retrieve different views of the data. The final installment will show how to call the functions from the CookieBinge app and display their results.

Julie Lerman is a Microsoft Regional Director, Microsoft MVP, software team coach and consultant who lives in the hills of Vermont. You can find her presenting on data access and other topics at user groups and conferences around the world. She blogs and is the author of “Programming Entity Framework,” as well as a Code First and a DbContext edition, all from O’Reilly Media. Follow her on Twitter: @julielerman and see her courses on Pluralsight.

Thanks to the following Microsoft technical expert for reviewing this article: Jeff Hollan
