March 2015

Volume 30 Number 3

.NET Micro Framework - Create an IoT Device Using the Gadgeteer with Azure Blob Storage

By Benjamin Perkins

In the next five years, the number of Internet-connected devices will be more than three times the total human population. These devices won’t be restricted to the smartphones and tablets we use now, but will include additional devices embedded into home appliances, elevators, automobiles, business environments, arm bands, clothing and much more. These devices will capture information about energy consumption, speed, temperatures, the presence of various gases, blood pressure, heart rate, program or hardware exceptions, and just about anything you can imagine.

The data captured by these devices needs to be stored in a highly scalable, highly reliable and highly usable environment. By usable I mean once the data is stored, the platform on which it exists should provide services, features and processing power to analyze, learn from and act on the gathered data. In this article, I’m going to show you how to use the Gadgeteer—a rapid hardware development platform based on the Microsoft .NET Micro Framework—to build a device for capturing data; Microsoft Azure Storage to store the data; and the Azure platform to analyze and consume the data. By following along, you’ll begin your journey into the Internet of Things (IoT) generation.

I’ll discuss those three components in detail: the Gadgeteer; Azure Blob Storage; and some data capture and analysis components of the Azure platform. In addition, I’ll provide thorough code-level instructions on how you can use the Gadgeteer to insert an image into an Azure Blob Storage container.


The Gadgeteer, provided by GHI Electronics LLC, includes kits, modules and mainboards (such as the FEZ Raptor, FEZ Hydra and the FEZ Spider used for this project) for creating many different types of devices. These devices can be used to capture numerous varieties of data. For example, they might use a gas-sensing module for detecting gases in the air, a motion detector or a module for measuring temperature and relative humidity. The FEZ Spider-compatible modules necessary to complete the project discussed in this article are: a camera, an Ethernet jack and a power module. In the first section, I’ll show you how to configure the modules and develop the code required to prepare the captured image for insertion into an Azure Blob Storage container.

Figure 1 shows the configuration of the project. In addition to the camera, Ethernet jack, and power supply, there’s also a character display for displaying the IP address and date and time on the device; an LED light for visual notifications; and a button for triggering the capture of the picture and the process to upload it.

Figure 1 FEZ Spider Configuration for Taking a Picture and Storing in an Azure Blob Container

After installing the .NET Micro Framework, GHI binaries (Package 2014 R5) and the .NET Gadgeteer SDK onto your development machine, begin by creating a new project in Visual Studio 2012 (Visual Studio 2013 is also supported, although not extensively tested) and selecting the Gadgeteer template. Once you’ve given the project a name, the wizard will walk you through the selection of a mainboard and Micro Framework version. Then, using the Toolbox, add the modules to build your device, similar to what’s shown in Figure 1.

When the modules are all physically connected to the sockets on the mainboard, begin to make the code changes required to achieve the project goal—connecting to the network and taking a picture. The module used to make the network connection is the ethernetJ11D, which supports the common RJ-45 network adapter. Here’s some code to make the ethernetJ11D module perform the connection and get an allocated IP address:


Of course, this isn’t the only way to achieve the connection; however, it’s what worked best and was simplest in this context.

Once connected, you may want to execute a simple System.Net.HttpWebRequest request, similar to what’s shown in Figure 2, to make sure the connection to the Internet is working as expected.

Figure 2 Testing Your Internet Connection

void makeGenericHTTPRequest()
{
  try
  {
    string url = "";
    using (var req = System.Net.HttpWebRequest.Create(url))
    using (var res = req.GetResponse())
    {
      Debug.Print("HTTP Response length: "
        + res.ContentLength.ToString());
    }
  }
  catch (Exception ex)
  {
    Debug.Print("Request failed: " + ex.Message);
  }
}
Note that the HTTPS protocol won’t work here without some configuration. If you receive an exception such as “A first chance exception of type ‘System.NotSupportedException’ occurred in Microsoft.SPOT.Net.Security.dll,” the SSL Seed on the device must be updated. You can do this using the .NET Micro Framework Deployment Tool (MFDeploy.exe), located in the C:\Program Files (x86)\Microsoft .NET Micro Framework\v4.3\Tools directory. With the device connected to the development machine, navigate to Target | Manage Device Keys and click Update SSL Seed, as shown in Figure 3. When that’s done, HTTPS should work as expected.

Figure 3 The .NET Micro Framework Deployment Tool MFDeploy

Once you’ve gotten the connection to the network and the Internet configured and working, it’s important to set the date and time on the device. As discussed later, the date and time are required parts of the PUT Blob REST API, which you’ll use to insert the image into the Azure Blob container. To set the date and time, use the TimeServiceSettings class, which is part of the Microsoft.SPOT.Time namespace, as shown here:

TimeServiceSettings time = new TimeServiceSettings()
{
  ForceSyncAtWakeUp = true
};
IPAddress[] address = Dns.GetHostEntry("").AddressList;
time.PrimaryServer = address[0].GetAddressBytes();
TimeService.Settings = time;
TimeService.Start();

Once the TimeService is running, you can retrieve a current timestamp using the standard DateTime.UtcNow or DateTime.Now properties.
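As discussed later, the storage service expects this timestamp in the x-ms-date request header in RFC 1123 format. The format itself is language-neutral, so here’s a quick sketch of it outside the device’s C# code, in Python (the date used is arbitrary):

```python
from datetime import datetime, timezone
from email.utils import format_datetime

def ms_date(now: datetime) -> str:
    # RFC 1123 timestamp, always expressed in GMT, which is the
    # shape the x-ms-date header of the storage REST API expects
    return format_datetime(now.astimezone(timezone.utc), usegmt=True)

print(ms_date(datetime(2015, 3, 1, 12, 30, 0, tzinfo=timezone.utc)))
# -> Sun, 01 Mar 2015 12:30:00 GMT
```

The important detail is that the value is UTC, not local time; the device clock synced above is what makes this timestamp trustworthy.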

Capturing an image using a GHI FEZ Spider-compatible camera requires just a single line of code:

camera.TakePicture();
Recall that I added the Button module to the device. It’s from the Button’s Pressed event that the TakePicture method is called. When the picture is taken, the camera.PictureCaptured event is triggered and in turn calls the camera_PictureCaptured(Camera sender, GT.Picture e) method. I use standard C# code to wire up the event handlers, linking the Pressed and PictureCaptured events to their corresponding methods:

button.ButtonPressed += button_ButtonPressed;
camera.PictureCaptured += camera_PictureCaptured;

The camera_PictureCaptured method receives the GT.Picture as a parameter. The image data, converted to a byte array (byte[]) via the e.PictureData property, is then passed to the custom method that inserts it into the Azure Blob container. Reading this, the process may seem complicated, but it’s really quite simple, as shown in Figure 4. You can also download the code sample that accompanies this article to see the whole thing.

Figure 4 Gadgeteer Picture-Capturing Process Flow

Figure 5 shows the code used to consume the AzureBlob class, discussed in the next section, where I create the Authorization header and call the REST API that inserts the image into the Azure Blob container.

Figure 5 Inserting the Image into Azure Blob Storage

void insertImageintoAzureBlob(GT.Picture picture)
{
  AzureBlob storage = new AzureBlob()
  {
    Account = "ACCOUNT-NAME",
    BlobEndPoint = "",
  };
  if (ethernetJ11D.IsNetworkUp)
  {
    storage.PutBlob("CONTAINER-NAME", picture.PictureData);
  }
  else
  {
    characterDisplay.Print("NO NETWORK CONNECTION");
  }
}

From the device perspective, that’s it. The device is now assembled and connected to the Internet, and the logic to capture the image and send it in the correct format to the PUT Blob REST API is all complete. Next, I create the Azure Blob Storage account and container and configure the code required to insert the image.

Azure Blob Storage

Azure Blob Storage is a useful service that provides access to images, documents, videos, and so forth from anywhere using HTTP or HTTPS. For example, if you have an image named home.bmp, a public Azure Storage account named contosox and a container named blob, accessing the .bmp file is as simple as entering the blob’s URL into a browser or referencing that URL from HTML or source code.

Inserting, listing, downloading and deleting blobs in an Azure Blob container from a standard .NET application is accomplished using the Azure .NET Storage Client library via the Microsoft.WindowsAzure.Storage assembly. Unfortunately, this assembly isn’t available for the .NET Micro Framework or the Gadgeteer device. This isn’t a problem, though, because the Azure Blob Storage service provides publicly accessible REST APIs that support the same insert, list, download and delete capabilities. Therefore, consuming these features from an IoT device is as simple as calling a standard REST API.

I’ll show you how to create an Azure Blob Storage account and container; how to create the REST API Authorization header using the .NET Micro Framework 4.3; and how to get the image captured in the previous section uploaded into the Azure Blob container.

Create the Azure Storage account from within the Azure Management Portal and then click the + Add button to add the container. In this example, the storage account name is contosox and the container is blob, as illustrated in Figure 6.

Figure 6 Creating the Azure Storage Container for Storing the Image Taken from the IoT Device

The example in Figure 7 is a breakdown of what’s required to consume the PUT Blob REST API. Appending the name of the image to the URL shown in Figure 6 and calling the GetRequestStream method of the System.Net.HttpWebRequest class returns a System.IO.Stream object. Using the Write method of the System.IO.Stream class then saves the image to the Azure Blob container. Note that the blobContent parameter of the Write method is the byte[] content contained in the GT.Picture.PictureData property.

Figure 7 Consuming the PUT Blob REST API

try
{
  HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
  request.Method = "PUT";
  using (Stream requestStream = request.GetRequestStream())
  {
    requestStream.Write(blobContent, 0, blobLength);
  }
}
catch (WebException ex)
{
  Debug.Print("Upload failed: " + ex.Message);
}
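To make the moving parts concrete, here’s a language-neutral sketch, in Python, of how the PUT Blob URL and its minimal headers fit together. The helper name, the API version string and the endpoint host shown here are my own illustrative assumptions, not code from the article’s sample:

```python
def build_put_blob_request(endpoint, container, blob_name,
                           content, timestamp, auth_header):
    # Compose the URL the blob will live at, plus the minimal headers the
    # PUT Blob REST operation requires (header values are illustrative).
    url = "%s/%s/%s" % (endpoint.rstrip("/"), container, blob_name)
    headers = {
        "x-ms-version": "2014-02-14",    # assumed REST API version
        "x-ms-date": timestamp,          # RFC 1123 UTC timestamp
        "x-ms-blob-type": "BlockBlob",
        "Content-Length": str(len(content)),
        "Authorization": auth_header,
    }
    return url, headers

url, headers = build_put_blob_request(
    "https://contosox.blob.core.windows.net", "blob", "home.bmp",
    b"BM...", "Sun, 01 Mar 2015 12:30:00 GMT",
    "SharedKey contosox:SIGNATURE")
```

Whatever the language, the essentials are the same: the blob name appended to the container URL, the request body carrying the raw image bytes, and the headers discussed in the next section.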

That’s not too complicated; it’s very much like calling a REST API from a standard ASP.NET or Microsoft .NET Framework application. The difference—and the complexity—come from the creation of the Authorization header, defined here: 

Authorization="[SharedKey|SharedKeyLite] <AccountName>:<Signature>"

When the Azure Blob container is created, it’s made public, meaning that anyone can read from it via the previously mentioned URL. However, adding or deleting a Blob requires an Authorization header to be present in the request. The SharedKey value in the header notifies the server that a shared access key exists and should be used for request authentication. The SharedKey is associated with the Azure Storage account, in this example contosox, and is acquired by clicking Manage Access Keys for the given storage account. The value contained in the Primary Access Key textbox, illustrated in Figure 8, is the SharedKey used for accessing the container.

Figure 8 Acquiring the SharedKey Necessary for Azure Storage Authentication

Additionally, the Signature portion of the Authorization header must be a Hash-based Message Authentication Code (HMAC) constructed using a number of request attributes, computed using the System.Security.Cryptography.HashAlgorithm.ComputeHash method and encoded by the System.Convert.ToBase64String method.

I’ll break that down a little, starting with the components necessary for constructing the Authorization header. These include attributes such as x-ms-version, content-length, content-type, the shared access key and many other header values described in detail in the Azure Storage authentication documentation. One of the more important attributes is x-ms-date, which is the reason you had to set the date and time during the initialization of the device. The x-ms-date value must be a UTC timestamp, and the timestamp on the device must differ from the timestamp on the server hosting the storage service by less than 15 minutes. The storage service ensures the request isn’t older than 15 minutes; if the differential is greater, the service returns a 403 (Forbidden). The other attributes don’t change and can be hardcoded or retrieved from some other source—a configuration file, for example.
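Conceptually, those attributes are concatenated into a canonicalized string-to-sign, which is then HMAC-SHA256-hashed with the account key and Base64-encoded. Here’s a Python sketch of just that computation; note the string-to-sign below is abbreviated and the key is a dummy value, as the exact canonicalization rules live in the Azure Storage authentication documentation:

```python
import base64, hashlib, hmac

def sign(account_key_b64, string_to_sign):
    # The account key is distributed Base64-encoded; decode it, compute an
    # HMAC-SHA256 over the canonicalized request string, then Base64-encode
    # the resulting digest to produce the Signature.
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

# Abbreviated string-to-sign; the real one lists every required header
sts = ("PUT\n\nimage/bmp\n\n"
       "x-ms-date:Sun, 01 Mar 2015 12:30:00 GMT\n"
       "/contosox/blob/home.bmp")
signature = sign(base64.b64encode(b"dummy-account-key").decode("ascii"), sts)
```

The same three steps (decode key, HMAC-SHA256, Base64-encode) are what the .NET code in this article performs.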

Once the Authorization header is formatted correctly, which takes about 35 lines of code, it needs to be hashed and encoded per the requirements. Using the full version of the .NET Framework, the code required resembles this:

using (HashAlgorithm hashSHA256 = 
  new HashAlgorithm(HashAlgorithmType.SHA256))
{
  Byte[] dataToHmac = Encoding.UTF8.GetBytes(canonicalizedString);
  signature = Convert.ToBase64String(hashSHA256.ComputeHash(dataToHmac));
}

The System.Security.Cryptography.HashAlgorithm.ComputeHash method required to hash the Authorization header does exist in the .NET Micro Framework version 4.3 and is a valid option for hashing the Signature. I chose, however, to develop a WebAPI that accepts the constructed header as a parameter, encodes it, hashes it and returns it to the Gadgeteer for use with the PUT Blob REST API call. I chose this approach primarily because I wanted to test calling a WebAPI from the Gadgeteer and found this to be a logical place to do it. The following code shows how the WebAPI was called from the Gadgeteer device:

string queryString = "constructedHeader=" + constructedHeader;
Uri uri = new Uri("https://WEBAPI-URL?" + queryString);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream dataStream = response.GetResponseStream();
StreamReader reader = new StreamReader(dataStream);
string responseFromServer = reader.ReadToEnd();
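I won’t reproduce the WebAPI project itself, but its server-side logic is small. A hypothetical Python equivalent of that endpoint’s core (the key and names here are stand-ins, not the article’s actual service) might look like this:

```python
import base64, hashlib, hmac, json

# Stand-in for the storage account's Primary Access Key (Base64-encoded)
ACCOUNT_KEY = base64.b64encode(b"dummy-account-key").decode("ascii")

def hash_endpoint(constructed_header):
    # Server side of the WebAPI: HMAC-SHA256 the constructed header with
    # the account key and return the Base64 result as a JSON payload,
    # matching the "HashedValue" property the device code parses below.
    key = base64.b64decode(ACCOUNT_KEY)
    digest = hmac.new(key, constructed_header.encode("utf-8"),
                      hashlib.sha256).digest()
    return json.dumps({"HashedValue": base64.b64encode(digest).decode("ascii")})
```

A side benefit of this design is that the Primary Access Key stays on the server and never has to be stored on the device itself.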

The response from a WebAPI, as you likely already know, is a JSON object, and it becomes the contents of the responseFromServer string. There’s an open source MicroJSON library that’s useful for parsing the results of the WebAPI and getting the hashed and encoded Authorization header value:

var jdom = (JObject)JsonHelpers.Parse(responseFromServer);
var hashedValue = jdom["HashedValue"];
StringBuilder authorizationHeader = new StringBuilder("{0} {1}:{2}")
  .Replace("{0}", SharedKeyAuthorizationScheme)
  .Replace("{1}", "contosox")
  .Replace("{2}", hashedValue.ToString());

Alternatively, using the string.IndexOf method in combination with the string.Substring method can achieve the same result.

The string.Format method isn’t currently supported in the .NET Micro Framework; therefore, I used StringBuilder.Replace to construct the authorizationHeader. As noted earlier, the Authorization header consists of the SharedKeyAuthorizationScheme, which is either SharedKey or SharedKeyLite; the Azure Blob Storage account name, contosox; and the encoded and hashed Signature. When the Authorization header is correctly formed, add it to the PUT Blob REST API request object and execute the request. Once the request is authenticated, the image is added to the Blob container:

request.Headers.Add("Authorization", authorizationHeader);

In this example, the event that triggers the insertion of the image into the Azure Blob container is the press of a button. I chose this approach simply to create a proof-of-concept project. A real-world implementation might instead use a motion, gas, barometric pressure, temperature or moisture sensor. Really, any module that triggers an event when a given threshold is breached can be used to take a picture and upload it to Azure. In many cases, the requirements for the IoT device won’t include an image, only the storage of captured data such as temperature, speed, gas level, time and so on. Such data can be stored in a database via a simple WebAPI call, similar to the one I discussed previously, but without the need for the Authorization header because there’s no Blob or image.

Now that the data is captured and stored on the Azure platform, the question is what to do with it and how to analyze and learn from it. In the next section I’ll provide some ideas and concluding thoughts for continued review, development and future discussions. My intent here hasn’t necessarily been to show all the details of an end-to-end IoT solution, but rather to provide food for thought and to drive these concepts forward.

Microsoft Azure

Most IT professionals have a good idea of what the term “Big Data” means. However, how to use and harvest it are likely not as clear. When I first started learning about Big Data, the initial hurdle I had was finding a data source I could use for working with tools like HDInsight, Power BI, Event Hubs, Machine Learning or Stream Analytics. Nonetheless, I started using these services, learning the features and capabilities, but without a large source of data to run my algorithms on, I quickly lost interest. What finally dawned on me was that the IoT—those Internet-connected devices—could be used for collecting data and building large sources of information for use with these Azure services.

It’s clear that each of these services is designed with the IoT in mind: Event Hubs and Stream Analytics for real-time decision making, and HDInsight and Machine Learning for analyzing massive amounts of data in search of longer-term trends. For example, Event Hubs, which is specifically designed to ingest millions of device-triggered events per second, provides the necessary scale for large IoT solutions. For companies and enterprises that have already implemented an IoT solution or are beginning to create their IoT strategy, Event Hubs would be a great place to start for managing long-term growth and scale. Additionally, Stream Analytics integrates with Event Hubs and can provide real-time analysis of the data the devices submit to Event Hubs. Stream Analytics can compare the incoming data with historical data and proactively send an alert if the current patterns don’t match the historical ones. Imagine the possibilities with this.

Organizations and even individuals who have large data pools—not necessarily collected by IoT devices—can use the HDInsight or Machine Learning services for the analysis of that data. HDInsight is a Hadoop solution that can scale to terabytes or petabytes of data on demand, and the Azure platform provides an almost infinite amount of storage and compute resources. Use HDInsight to find hidden business opportunities together with or independently from Machine Learning to mine data to help predict future trends and behaviors. These services introduce you to the new IoT era by exposing previously unseen information in innovative ways, opening up great possibilities, all of which can be presented in a user-friendly manner via Power BI.

Wrapping Up

The goal of this article was to explain how to configure an IoT device using the Gadgeteer, connect it to the Internet and upload an image to an Azure Blob container in the cloud. Inserting data from a device into any Internet-accessible data source requires only a simple WebAPI call, and the configuration of the actual device hardware is as simple as dragging and dropping modules from a Toolbox menu to a design template. The only real complexity in this example was the creation of the PUT Blob REST API Authorization header, as it must be in a specific format and be encoded and hashed using the System.Security.Cryptography.HashAlgorithm.ComputeHash method via a WebAPI or using the .NET Micro Framework class.

I also summarized some Azure platform services and established their scaling capabilities for the storage and analysis of data captured from IoT devices. Once your devices begin to capture sufficient amounts of data, you can use Azure platform services like Event Hubs and Stream Analytics for real-time analysis, and HDInsight and Machine Learning for longer-term investigation. The discovery of the secrets hidden within the massive amount of generated, stored, processed and presented information can then be used for making business decisions, forecasting and helping your corporate strategy succeed. Let’s do this!

Benjamin Perkins is a senior support escalation engineer at Microsoft and author of three books on IIS, NHibernate and Microsoft Azure. He is currently writing a C# book to be published in parallel with the release of C# 6.0. Reach him at

Thanks to the following Microsoft technical experts for reviewing this article: Martin Grasruck and Colin Miller