Loading static resources from Azure blob storage in cloud services roles
When you upload and deploy a cloud services package to Azure, the actual .cspkg file gets copied around several times: first to the portal, then to the fabric, and finally to the cloud service instance. All of this copying takes time if your .cspkg file is large, which slows down your deployments. I've seen cases where customers bundle large static files that their code uses (lookup databases, images, etc.) directly in the project, which unnecessarily increases the size of the package. Ideally these files would be loaded into your role directly from Azure blob storage so that you don't need to bundle them at all.
There are four separate things you need to do in order to make this work.
- You need a local resource in which to store your files. A local resource is a named chunk of local disk space declared in the service definition (.csdef) file of your cloud service: you allocate a certain amount of space and give it a name you can reference from code. You can optionally have Azure clear out the data on role recycle (so that you pull fresh data from blob storage each time). Inside the WorkerRole element of your .csdef, add the following lines:
<LocalResources>
  <LocalStorage name="staticResources" sizeInMB="1023" cleanOnRoleRecycle="true" />
</LocalResources>
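The download code in the next step also reads a storage connection string from the service configuration. Assuming you name the setting "StorageConnectionString" (to match what the code below expects; the account name and key shown here are placeholders), the declarations would look roughly like this:

<!-- In the .csdef, inside the WorkerRole element -->
<ConfigurationSettings>
  <Setting name="StorageConnectionString" />
</ConfigurationSettings>

<!-- In the .cscfg, inside the corresponding Role element -->
<ConfigurationSettings>
  <Setting name="StorageConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
</ConfigurationSettings>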
- You need to add the code that pulls the data out of blob storage and puts it into the local resource when the role starts. The easiest way to do this is in the OnStart method, which you override from the RoleEntryPoint class.
This example downloads all of the files stored in an Azure storage container called "staticresources" (blob container names must be lowercase) and places them in the local "staticResources" resource folder.
// Requires using directives for Microsoft.WindowsAzure.ServiceRuntime, Microsoft.WindowsAzure.Storage,
// Microsoft.WindowsAzure.Storage.Blob, and System.IO (plus the package that provides CloudConfigurationManager).

// Retrieve an object that points to the local storage resource.
LocalResource localResource = RoleEnvironment.GetLocalResource("staticResources");

// Retrieve an object that points to the cloud storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient client = storageAccount.CreateCloudBlobClient();

// Iterate through all of the blobs in the "staticresources" container and place them
// in our local resource folder. This assumes every blob sits at the root of the
// container (no virtual directories), since each item is cast to CloudBlockBlob.
CloudBlobContainer container = client.GetContainerReference("staticresources");
foreach (IListBlobItem item in container.ListBlobs(null, false))
{
    CloudBlockBlob blob = (CloudBlockBlob)item;
    string filePath = Path.Combine(localResource.RootPath, blob.Name);
    blob.DownloadToFile(filePath, FileMode.Create);
}
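To tie it together, here is a rough sketch of where that code lives in the role entry point. The DownloadStaticResources helper is just an illustrative name wrapping the snippet above:

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Pull the static files down before the role starts doing work.
        DownloadStaticResources();
        return base.OnStart();
    }

    private void DownloadStaticResources()
    {
        // ... the blob download code shown above ...
    }
}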
- You need to use a tool like Azure Storage Explorer (https://azurestorageexplorer.codeplex.com/) or AzCopy (https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/) to create the container and upload your static blobs into the storage account. See the example command after this step.
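For example, with the AzCopy version described at that link, uploading a local folder into the container might look something like this (the folder path, account name, and key are placeholders):

AzCopy /Source:C:\staticfiles /Dest:https://youraccount.blob.core.windows.net/staticresources /DestKey:youraccountkey /S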
- Now whenever you need to reference one of the static files, you can do so like this:
LocalResource localResource = RoleEnvironment.GetLocalResource("staticResources");
string filePath = Path.Combine(localResource.RootPath, "myfile.dat");
This will give you the full file system path to your file so you can load it up via whatever mechanism you need.
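For example, assuming "myfile.dat" was one of the blobs downloaded above, you could read it into memory like this:

byte[] data = File.ReadAllBytes(filePath);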