December 2010

Volume 25 Number 12

Forecast: Cloudy - Pushing Content from SharePoint to Microsoft Azure Storage

By Joseph Fultz | December 2010

I have a coauthor this month, as colleague Shad Phillips helped me out with a recent project where I was working with a customer on a proof of concept that used SharePoint 2010 as an application platform. On the side, one of the customer’s staffers asked me if I could think of a reasonable way to take approved content from SharePoint and publish it, making it available to people outside of the corporate network.

The customer’s current infrastructure didn’t support external content (downloadable documents and videos). Having done a lot of work with Azure, I immediately thought that it would be pretty simple to incorporate pushing the content to Azure Storage as part of the workflow and then, depending on need, making it publicly available or providing lease-based access for restricted content.

With that in mind, I talked with my colleague, Shad, who had previously solved a similar problem where he had implemented a sample method to archive SharePoint documents from the library to Azure Storage. Although the intent of that solution is different from my goal, the mechanics are the same. This month, Shad and I will walk through a sample implementation that pushes content from SharePoint to Azure Storage, and cover a little bit about lease-access control to the files.

Scenario and Setup

Specifically, we developed a custom feature that enables a user to selectively push a document from SharePoint to Azure Storage. For some inexplicable reason, users don’t typically like it when their documents are moved and links aren’t provided to find them, so we left a link in the document library to the cloud location that behaves the same as it would if the document were at a non-cloud location.

Needed software:

  • Visual Studio 2010
  • Microsoft SharePoint 2010
  • Microsoft Azure SDK
  • Azure Development Storage Service

SharePoint 2010 Site and Document Library Configuration

For this scenario, we created a SharePoint site using the Team Site template. In the Shared Documents library, we created a column that allows us to flag an item as archived to Azure. This is done through the Library Settings accessible via the Ribbon. Once in Library Settings, we created a column with the properties illustrated in Figure 1.

Figure 1 Column Settings on the Team Site Template

In Advanced Settings, we also selected “Yes” for the “Allow the management of content types” setting.

We used this column as part of a Content Type that we named Link to a Document. Next, we created instances of this Content Type as a means to link to the archived documents, as shown in Figure 2.

Figure 2 Our New “Link to a Document” Content Type

Once the column and the content type were added to the document library, we uploaded a sample Word document named Services SOW.docx.

SharePoint 2010 Web.config

In order to connect to the cloud, we needed the settings required to connect to Azure Storage. In this case, we used development storage and added the keys to the <appSettings> element in web.config, as shown in Figure 3.

Figure 3 Adding Keys in Web.config
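Figure 3 is a screenshot, so as a sketch the entries look something like the following; the key name is our choice, and the code-behind must read it back by the same name:

```xml
<configuration>
  <appSettings>
    <!-- Key name is an assumption for illustration.
         UseDevelopmentStorage=true points the storage client at the local
         Development Storage service; swap in a real account connection
         string for a live deployment. -->
    <add key="DataConnectionString" value="UseDevelopmentStorage=true" />
  </appSettings>
</configuration>
```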

SharePoint Project

Fortunately, for SharePoint 2010 using Visual Studio 2010, it’s a nice developer experience to create, debug and publish new features. We created a SharePoint Feature (see msdn.microsoft.com/library/bb861828(office.12) for more information on this), which adds a custom action to the items’ action drop-down menus in the document library. The user will click it to use the archive feature.

We started by creating a solution named MSSAzureArchive using the Empty SharePoint Project template (see Figure 4).

Figure 4 Project Selection in Visual Studio 2010

Next, we specified the site and security level for debugging. We chose “Deploy as a farm solution” because the code needs to make external calls, which the sandboxed solution won’t allow. We added references to Microsoft.WindowsAzure.StorageClient and System.Web to the project. Next, we added an Empty Element item to the project using the Empty Element template and named it AzureStorageElement, then added a <CustomAction/> element in order to add a new action item to the context menu for the document library items (see Figure 5).

Figure 5 Adding an AzureStorageElement via Add New Item

A new Feature called Feature1 was automatically added to the project, which we renamed to MSSAzureArchive. We replaced the contents of the Elements.xml file for the AzureStorageElement that was added with the following:

<?xml version="1.0" encoding="utf-8"?>
 <Elements xmlns="https://schemas.microsoft.com/sharepoint/">
   <CustomAction
     Id="UserInterfaceCustomActions.ECBItemToolbar"
     RegistrationType="List"
     RegistrationId="101"
     Location="EditControlBlock"
     Sequence="106"
     Title="Azure Storage">
     <UrlAction Url="~sitecollection/_layouts/MSSAzureArchive/AzureStorage.aspx?ItemUrl={ItemUrl}" />
   </CustomAction>
 </Elements>

For the uninitiated SharePoint developer, Figure 6 shows a quick description of some of the properties (more information about the element and its properties can be found at msdn.microsoft.com/library/ms460194).

Figure 6 Properties of the Element

  • Id: A unique identifier for the custom action.
  • Location: Specifies where in the SharePoint UI the element should appear. In this case, the item menu (EditControlBlock) is the desired location versus, for example, the ribbon.
  • Sequence: Specifies the ordering priority for the actions.

Note the Url property of the UrlAction element; this is the navigation that happens in order to handle the command to archive the document. Based on this configuration, SharePoint knows where to put the feature in the UI and what to do when someone clicks it. SharePoint will navigate to a page we create that handles archiving the selected document and allows the user to select a target storage container or create a new one. So we added an Application Page item to the project, again using the SharePoint 2010 templates, and named it AzureStorage.aspx (see Figure 7).

Figure 7 Adding a New Page in SharePoint 2010

Because this sample wasn’t meant to impress anyone with a genius UI design, we added only the minimal controls needed to get the job done. In the main content element of the page markup, we added the code shown in Figure 8.

Figure 8 Adding the Minimum-Needed Controls

Document to Archive:
<asp:Label ID="fileName" runat="server" ></asp:Label>   <br/>   
Choose Azure Container:
<asp:DropDownList ID="azureContainers" runat="server"  
  Visible="true"></asp:DropDownList>   
<asp:TextBox id="newContainerName" runat="server" Visible="false"></asp:TextBox>
<asp:Button ID="saveContainer" runat="server" Text="Save Container" 
  OnClick="SaveContainer_Click" Visible="false"></asp:Button>
<br />
<asp:Button ID="createContainer" runat="server" Text="Create New Container" 
  OnClick="CreateContainer_Click" />
<br/>
<asp:Button ID="archiveFile" runat="server" Text="Archive File" 
  OnClick="Archive_Click" />       
<br/>
<asp:Label ID="errMessage" runat="server" Text=""></asp:Label>

Next, we edited the code-behind and wired the UI elements up to some code to talk to Azure Storage, and rendered the information. We initialized the cloud storage client within the load event for the page and grabbed the available containers using the previous web.config settings (see Figure 9).

Figure 9 Initializing the Cloud Storage Client

protected void Page_Load(object sender, EventArgs e)
{
  this.InitializeCloudStorage();
  if (!IsPostBack)
  {
    this.GetContainers();
  }
}
private void GetContainers()
{
  IEnumerable<CloudBlobContainer> blobContainers =  
    cloudBlobClient.ListContainers();
  this.azureContainers.DataSource = blobContainers;
  azureContainers.DataTextField = "Name";
  this.azureContainers.DataBind();
  if (azureContainers.Items.Count < 1)
  {
    ListItem defaultContainer = new ListItem(defaultContainerName);
    defaultContainer.Selected = true;
    azureContainers.Items.Add(defaultContainer);
  }
}
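Figure 9 calls InitializeCloudStorage, and the markup in Figure 8 wires up CreateContainer_Click and SaveContainer_Click; none of those bodies are shown in the article (the working versions are in the code download). A minimal sketch, assuming the web.config key name from earlier, might look like this:

```csharp
// Sketch only: member names match the markup and Figure 9, but the bodies
// here are assumptions; the working versions ship with the code download.
private CloudBlobClient cloudBlobClient;
private const string defaultContainerName = "documents";

private void InitializeCloudStorage()
{
  // "DataConnectionString" is the appSettings key we assume was added to
  // web.config; UseDevelopmentStorage=true resolves to development storage.
  string connection =
    ConfigurationManager.AppSettings["DataConnectionString"];
  CloudStorageAccount account = CloudStorageAccount.Parse(connection);
  cloudBlobClient = account.CreateCloudBlobClient();
}

protected void CreateContainer_Click(object sender, EventArgs e)
{
  // Reveal the textbox and Save button so the user can name the container.
  newContainerName.Visible = true;
  saveContainer.Visible = true;
}

protected void SaveContainer_Click(object sender, EventArgs e)
{
  // Container names must be lowercase.
  string name = newContainerName.Text.Trim().ToLowerInvariant();
  CloudBlobContainer container =
    cloudBlobClient.GetContainerReference(name);
  container.CreateIfNotExist();
  this.GetContainers(); // refresh the drop-down with the new container
  newContainerName.Visible = false;
  saveContainer.Visible = false;
}
```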

Because the focus here is on the archive functionality, we concentrated on that. The other code is available via download (code.msdn.microsoft.com/mag201012Cloudy). We added a click handler for the archiveFile button and wired the Archive_Click function to it. Based on the UrlAction element, we can retrieve the path to the item. In the click function, the item is fetched from SharePoint using the object model, checked to see if it has already been archived and—if not—uploaded to the selected container (see Figure 10).

Figure 10 Code for the Click Function to Fetch the Item from SharePoint

protected void Archive_Click(object o, EventArgs e)
{
  try
  {
    webSite = SPContext.Current.Web;
    filePath = webSite.Url + Request.QueryString["ItemUrl"];
    fileToArchive = webSite.GetFile(filePath);
    string sArchived = fileToArchive.Item["IsArchived"].ToString(); 
    bool isArchived = Convert.ToBoolean(sArchived);
    if (isArchived)
    {
      errMessage.Text = "This document has already been archived!";
    }
    else
    {
      string newGuid = Guid.NewGuid().ToString();
      string uniqueBlobName = newGuid + "_" + fileToArchive.Name;
      blobContainer = cloudBlobClient.GetContainerReference(
        this.azureContainers.SelectedValue);
      blobContainer.CreateIfNotExist();
      cloudBlob = blobContainer.GetBlockBlobReference(uniqueBlobName);
      cloudBlob.UploadByteArray(fileToArchive.OpenBinary());

After the item is uploaded to storage, a new archive item of the type “Link to a Document” is created in place of the original document and the original document is deleted. If this were a publishing case instead of archival, the original item likely wouldn’t be deleted, but rather just marked as published with a link to the published version. The original item is used to get the target document library and the path of the original document. The new item is marked as archived by adding the IsArchived property and assigning the value “true.” First we did some work to get some of the values we needed, and then we created the new item and assigned the values to it, as shown here:

SPDocumentLibrary docLib = 
  fileToArchive.DocumentLibrary;
Hashtable docProperties = new Hashtable();
docProperties["IsArchived"] = true;
string docLibRelPath = 
  docLib.RootFolder.ServerRelativeUrl;
string docLibPath = string.Empty;
webSiteCollection = SPContext.Current.Site;
docLibPath = 
  webSiteCollection.MakeFullUrl(docLibRelPath);
string azureURL = cloudBlob.Uri.ToString();

The function BuildLinkToItem creates an instance of the content type “Link to a Document” using the path to the item in Azure Storage. This instance of the content type will be added to the library as the link to retrieve the item from Azure Storage via the SharePoint UI, as shown here:

string azureStub = this.BuildLinkToItem(azureURL).ToString();
// docLibPath was computed earlier; the link item keeps the original name.
string documentPath = docLibPath + "/" + fileToArchive.Name;
SPFile newFile = webSite.Files.Add(documentPath,
  UTF8Encoding.UTF8.GetBytes(azureStub), docProperties, true);
  SPListItem item = newFile.Item;
  item["Content Type"] = "Link to a Document";
  SPFieldUrlValue itemUrl = new SPFieldUrlValue();
  itemUrl.Description = fileToArchive.Name;
  itemUrl.Url = azureURL;
  item["URL"] = itemUrl;
  item["IsArchived"] = true;
  item.Update();
  fileToArchive.Delete();
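The BuildLinkToItem function itself is part of the code download. SharePoint’s stock Link to a Document content type expects the file to be a small .aspx stub whose UrlRedirector control sends the browser to the item’s URL field (which the code above sets), so a sketch along those lines might look like the following; the exact stub markup is our assumption:

```csharp
// Sketch only: this follows the pattern of SharePoint's built-in
// Link to a Document stub, but the exact markup is an assumption; the
// working version is in the code download. The url parameter isn't
// embedded because the UrlRedirector control resolves the item's URL
// field at request time.
private StringBuilder BuildLinkToItem(string url)
{
  StringBuilder stub = new StringBuilder();
  stub.Append("<%@ Assembly Name='Microsoft.SharePoint, Version=14.0.0.0, ");
  stub.Append("Culture=neutral, PublicKeyToken=71e9bce111e9429c' %>");
  stub.Append("<%@ Register TagPrefix='SharePoint' ");
  stub.Append("Namespace='Microsoft.SharePoint.WebControls' ");
  stub.Append("Assembly='Microsoft.SharePoint' %>");
  stub.Append("<html><head>");
  stub.Append("<meta name='progid' content='SharePoint.Link' />");
  stub.Append("</head><body><form id='Form1' runat='server'>");
  stub.Append("<SharePoint:UrlRedirector id='Redirector1' runat='server' />");
  stub.Append("</form></body></html>");
  return stub;
}
```

Returning a StringBuilder rather than a string matches the ToString() call made on the function’s result in the code above.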

With the code completed to save the document, move it and replace it with a link to Azure Storage, it was time to focus on the build and deployment of the solution. We double-clicked on the Package.package file to bring up the package designer and subsequently selected the Advanced tab at the bottom of the screen. This is where we added the package assemblies that we needed in order to include the Microsoft.WindowsAzure.StorageClient.dll. Keeping it simple for this sample, we set the Deployment Target to the GlobalAssemblyCache. We ensured that Development Storage was running by navigating to the Server Explorer, clicking on the Azure Storage node, then clicking on the “(Development)” node.

Throwing caution to the wind, we pressed F5 to build, deploy, attach to a process and initiate a browser session to start debugging our feature. We navigated back to the Shared Documents library mentioned earlier and opened the drop-down menu attached to the document we previously loaded. In the drop-down, we selected our new element, Azure Storage, which took us to the custom application page to select the destination container (see Figure 11).

Figure 11 Selecting the Azure Storage Element

Once on the page, we could have created a new container, but instead we used the documents container that we created and clicked the Archive File button to execute the code from earlier (see Figure 12).

Figure 12 Selecting the Container

With the file archived to Azure Storage, we navigated back to the Shared Documents library. Instead of seeing the document, we saw a Link to a Document item that replaced the Services SOW.docx Word document (see Figure 13).

Figure 13 Link to Document in the SharePoint Documents Library

When we looked at the properties of the item, we saw the fields related to the content type, and in particular the URL to where the document now resides in Azure Storage (see Figure 14).

Figure 14 Link Properties

We could open the document directly from Azure Storage by clicking Link to a Document, and we could use the URL property to access it directly or via some other code or UI. For example, if we still wanted to index these items via the SharePoint Index Service, we could create a custom IFilter that knows how to process our Link to a Document Content Type to ensure the content gets indexed properly.

With archiving content from a SharePoint Document Library to an Azure Storage container implemented, we were left with an all-or-nothing choice for unauthenticated requests: an archived document was either fully public or inaccessible.

Access Control When Publishing

As mentioned earlier, what led me to talk with Shad about his archival piece was the use of Azure Storage as a means to provide a public landing spot for content that had gone through review and approval. In the case I was considering, I didn’t have to include any access control because the documents were to be shared with everyone. However, it only took a few minutes for someone to ask the question: “What if we want to publish something and make it available only to certain people, for example, vendors, customers or employees?” Often companies accomplish this by including those people in a corporate domain or federating them so they’re identified via a username and password challenge. That wasn’t the case here, and the customer didn’t really want to set up an application layer or front end to control access; developing a front end would reduce the value of the solution by increasing implementation costs.

One solution is to use a SharedAccessPolicy on the blobs. The container and blobs in the container would have their PublicAccess set to Off with a little bit of code you would likely write doing Azure Storage development anyway. The following code sample shows how I can set the PublicAccess to Off, but allow for SharedAccess on the container should I generate a signature and hand it out:

BlobContainerPermissions permissions = new BlobContainerPermissions();
permissions.PublicAccess = BlobContainerPublicAccessType.Off;
SharedAccessPolicy accesspolicy = new SharedAccessPolicy();
accesspolicy.Permissions = SharedAccessPermissions.Read;
permissions.SharedAccessPolicies.Add("Read", accesspolicy);
blobContainer.SetPermissions(permissions);

If we ask for a resource directly in the storage container, we’ll get a 404 Page Not Found. As we upload the blobs, we do a similar bit of work for the blob itself, but we create a SharedAccessPolicy that allows read, set an expiry time for it and ask for a Shared Access Signature back, like this:

SharedAccessPolicy policy = new SharedAccessPolicy();
policy.Permissions = SharedAccessPermissions.Read;
policy.SharedAccessExpiryTime = DateTime.Now.AddDays(5);
string sharedAccessSignature = destBlob.GetSharedAccessSignature(policy);

The call to GetSharedAccessSignature returns a string like this:

?se=2010-08-26T18%3A22%3A07Z&sr=b&sp=r&sig=WdUHKvQYnbOcMwUdFavn4QS0lvhAnqBAnVnC6x0zPj8%3D

If I concatenate that query string onto the end of the URI for the blob, I should get the blob back, provided the expiry hasn’t passed. More information about the signature and Shared Access Policies can be found at msdn.microsoft.com/library/ee395415.

To solve the problem, I’d generate signatures and provide signed URIs that had a long expiry, which makes it easy to create them at the time of upload and then store a list of links to the published documents. For something a little more secure and for which I want to provide access to a single user for a short period of time, I would need a piece of UI. That UI would allow a user to request access to one or more resources and get back signed URIs that would provide access for a short duration of time.
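That request-driven flow can be sketched with the same storage API used above; the method name and the 15-minute window are our choices, not part of the sample:

```csharp
// Sketch: grant a single requester short-lived read access to one blob.
// Called from the hypothetical request UI after it authorizes the user.
public static string GrantTemporaryReadAccess(CloudBlob blob)
{
  SharedAccessPolicy policy = new SharedAccessPolicy();
  policy.Permissions = SharedAccessPermissions.Read;
  // Short window: the link stops working 15 minutes from now.
  policy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15);
  // The signature is a query string; appending it to the blob URI yields
  // a URL that works against a container with PublicAccess set to Off.
  return blob.Uri.AbsoluteUri + blob.GetSharedAccessSignature(policy);
}
```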

Mixing with the Cloud

Here Shad and I used one general implementation to address two different scenarios. This was particularly useful for both scenarios, as we needed a particular piece of functionality (scalable, reliable and expandable storage) that the cloud provided without us having to do much in the way of setup, costing only what we used. The main idea we hope to convey is that, as professionals looking to create solutions for our customers (internal or external), our solution concepts don’t have to be exclusively in the cloud or on-premises; the two can easily be mixed. As the Service Updates get applied over time, it will become easier and easier to blend the corporate network with the cloud, and I expect the two will eventually blend to the point where there’s not much of a difference. So, as you’re looking at solutions for your software or business systems, you might take a moment to pause and think, “Is there something in the cloud that will help me?”


Joseph Fultz is an architect at the Microsoft Technology Center in Dallas, where he works with enterprise customers and ISVs designing and prototyping software solutions to meet business and market demands. He’s spoken at events such as Tech·Ed and similar internal training events.

Shad Phillips is an architect at the Microsoft Technology Center in Dallas, where he works with enterprise customers and partners designing and deploying enterprise content management solutions built on Microsoft SharePoint 2010.

Thanks to the following technical expert for reviewing this article: Jasen Tenney