Accessing Azure Tables via REST

Most of the Azure table samples that I can find use the Storage Client to access the tables, which in turn uses the ADO.NET Data Services. While this provides a rich set of functionality, I wanted to explore the simplicity and flexibility of calling the REST API directly. The Azure table storage API is documented here. Using that, and the REST helpers in the Storage Client, I was able to create a simple console app that calls the Azure table API directly. Instructions and a link to the code are at the bottom of this post.


Development Storage

Note that this code has not been tested on development storage, only against the Azure services. I decided to quit using development storage a couple of months ago and now use only Azure storage. My reasoning was simple: development storage is very similar to Azure storage, but it is not an exact match, and I didn’t want to have to understand or debug the nuances. I also didn’t want to migrate the data every time I did a build. Experience has also taught me that it is very important to understand the performance of the data subsystem early: identifying data performance issues up front beats staying up all night putting hot fixes into your latest release, and running the storage services remotely with the web server local is just about the worst-case scenario for flushing those issues out. As a remote member of the team, it is also much easier to share code that talks to the same storage; you don’t have to worry about syncing your tables every time you want to jointly debug an app. Your mileage may vary, but I am very happy with the decision.


The Code

My goal for this exercise is pretty simple: list all of the tables in the storage account, and display all of the data within any one of those tables. While writing the code, I decided that I also wanted to be able to limit the queries with a “where” clause (mainly because it was really easy to do). To keep things simple, I’m deferring table create and delete, and entity insert, update, and delete, until later.


According to the documentation, in order to get a list of tables, all one has to do is issue an HTTP GET to https://myaccount.table.core.windows.net/Tables, making sure to replace myaccount with the correct storage account. After some more digging, it turns out that you also have to add some headers to the request, including a signature header. Figuring out exactly how to construct those headers was the only difficult part of the exercise.


Here is a walkthrough of the code.


Step 1 - Create the web request

// Resource = "Tables"
// _Account = your Azure storage account name
string uri = "https://" + _Account + ".table.core.windows.net/" + Resource;

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "GET";
request.ContentLength = 0;

Step 2 – Add a date header to the request

request.Headers.Add("x-ms-date", DateTime.UtcNow.ToString("R", System.Globalization.CultureInfo.InvariantCulture));
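The “R” format specifier produces the RFC 1123 date string the service expects. For example:

```csharp
// The "R" (RFC 1123) pattern formats a UTC time the way the x-ms-date header expects.
string date = new DateTime(2009, 4, 27, 12, 0, 0, DateTimeKind.Utc)
    .ToString("R", System.Globalization.CultureInfo.InvariantCulture);
// date is "Mon, 27 Apr 2009 12:00:00 GMT"
```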


Step 3 - Sign the request

For a table, you need to use this canonical form:

VERB + "\n" +
Content-MD5 + "\n" +
Content-Type + "\n" +
Date + "\n" +
CanonicalizedResource

My first reaction to this was – huh? I pieced together the following code that implements the above.

// Verb
string signature = "GET\n";

// Content-MD5 (empty for a GET)
signature += "\n";

// Content-Type (empty for a GET)
signature += "\n";

// Date
signature += request.Headers["x-ms-date"] + "\n";

// remove the query string
int q = Resource.IndexOf("?");
if (q > 0) Resource = Resource.Substring(0, q);

// CanonicalizedResource
// Format is /{0}/{1} where 0 is the account name and 1 is the resource URI path
signature += "/" + _Account + "/" + Resource;


Note that for the table storage signature, you remove the query string from the resource but for blob storage you do not. This is documented in the API, but I missed it and wasted a lot of debug cycles.
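To keep that difference straight, the query-string handling can be pulled into a small helper. This is just a sketch – the helper name and the isTable flag are mine, not part of any API:

```csharp
// Build the CanonicalizedResource portion of the signature.
// Tables drop the query string; blobs keep it (per the note above).
static string CanonicalizedResource(string account, string resource, bool isTable)
{
    if (isTable)
    {
        int q = resource.IndexOf("?");
        if (q > 0) resource = resource.Substring(0, q);
    }
    return "/" + account + "/" + resource;
}
```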


Step 4 - Hash the signature with the secret and add the result to the SharedKey header

Replace _Secret with your Azure account secret. Table Storage also supports SharedKeyLite, but Blobs don’t, so I decided to stick with SharedKey.
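For what it’s worth, the SharedKeyLite string-to-sign for tables is much shorter – just the date and the canonicalized resource. A sketch of what that would look like (the helper name is mine):

```csharp
// Sketch: SharedKeyLite for the table service signs only Date + "\n" + CanonicalizedResource,
// and the Authorization scheme name changes to SharedKeyLite.
static string SharedKeyLiteHeader(string account, string secret, string date, string canonicalizedResource)
{
    string toSign = date + "\n" + canonicalizedResource;
    System.Security.Cryptography.HMACSHA256 hasher =
        new System.Security.Cryptography.HMACSHA256(Convert.FromBase64String(secret));
    return "SharedKeyLite " + account + ":" +
        Convert.ToBase64String(hasher.ComputeHash(System.Text.Encoding.UTF8.GetBytes(toSign)));
}
```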

// Hash-based Message Authentication Code (HMAC) using SHA256 hash
System.Security.Cryptography.HMACSHA256 hasher = new System.Security.Cryptography.HMACSHA256(Convert.FromBase64String(_Secret));

// Authorization header
string authH = "SharedKey " + _Account + ":" + Convert.ToBase64String(hasher.ComputeHash(System.Text.Encoding.UTF8.GetBytes(signature)));
request.Headers.Add("Authorization", authH);


Step 5 - Getting the results

string xml;
int ret;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (System.IO.StreamReader r = new System.IO.StreamReader(response.GetResponseStream()))
{
    xml = r.ReadToEnd();
    ret = (int)response.StatusCode;
}




Step 6 – Processing the results

According to the Azure Table API document, “The Query Tables operation returns the list of tables in the account as an ADO.NET entity set, which is an Atom feed.” The documentation shows the XML results returned. In this case, all we care about is the “d:TableName” repeating element.
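One caution before the parsing code: GetElementsByTagName("d:TableName") matches on the literal d: prefix, which works because that happens to be the prefix the service emits. Matching by namespace URI is a bit more robust; a sketch (the helper name is mine, the namespace is the ADO.NET data services one):

```csharp
// Sketch: find TableName elements by namespace URI instead of by prefix.
static XmlNodeList TableNameNodes(string xml)
{
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(xml);
    return doc.GetElementsByTagName("TableName",
        "http://schemas.microsoft.com/ado/2007/08/dataservices");
}
```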


if (ret == 200)
{
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(xml);

    XmlNodeList nodes = doc.GetElementsByTagName("d:TableName");
    Console.WriteLine("Tables in Account: " + _Account);
    foreach (XmlNode n in nodes)
    {
        Console.WriteLine("  " + n.InnerText);
    }
}

That’s all there is to it. Once you get this code working, it’s pretty easy to modify the URIs for other calls. Simply change the Resource. Here are some example queries:

Customers – returns all rows from the Customers table (assuming that table actually exists)

Customers?$filter=state eq 'TX'

Songs?$filter=rating ge 3


The general format to retrieve data from a table is “mytable?$filter=<query-expression>”. Note that string values in a filter need to be wrapped in single quotes. More information is available here.
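One practical wrinkle: filter expressions contain spaces (and sometimes quotes), which need to be percent-encoded before they go into the request URI. A minimal sketch (the helper name is mine):

```csharp
// Build a table resource string, percent-encoding the $filter expression.
static string BuildQueryResource(string table, string filter)
{
    if (string.IsNullOrEmpty(filter)) return table;
    return table + "?$filter=" + Uri.EscapeDataString(filter);
}
```

For example, BuildQueryResource("Songs", "rating ge 3") encodes the spaces as %20.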


The following code will process the results from the table, displaying each field of each row.

if (ret == 200)
{
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(xml);

    XmlNodeList nodes = doc.GetElementsByTagName("m:properties");

    // process each row
    foreach (XmlNode n in nodes)
    {
        // process each column
        foreach (XmlNode n2 in n.ChildNodes)
        {
            Console.Write(n2.Name + ": " + n2.InnerText + "  ");
        }
        Console.WriteLine();
    }
}

Sample Console App Code

If you create a new console application named AzureStorageConsole, you can replace the code in Program.cs with the code attached below. You will also need to change the “accountN” and “secretN” references to your Azure credentials. I decided to embed the storage account information in the code vs. reading it from a config file. This implementation uses a helper class to allow you to specify the account in the constructor, which makes it easy to use different accounts. Lastly, you will need to replace the Customer table queries with existing tables in your storage account.


The next steps are to add support for creating and deleting tables as well as insert, edit, and delete capabilities to individual tables. I am also going to play around with the REST APIs for Blobs and Queues. They appear to be quite a bit simpler than the Table API.