Backing up a file system to Azure Storage with PowerShell
Overview: As an IT administrator, I value backups both in the datacenter and on personal devices. While SkyDrive (or whatever it is eventually renamed to) is the answer for user data, some of us still maintain a lot of binary data and full installs that would simply fill up SkyDrive. I have historically been a big fan of solutions like IDrive, but have never liked the agents they provide. This post shows how to use your Azure benefits to safely store any data at an affordable price point.
Pricing: Azure Storage is quite affordable. At today’s prices, I can store my 100 GB of data (binaries, documents, etc.) for $9.50 per month and pay nothing to transfer data in (Azure does not charge for ingress). If I ever needed to restore everything, the bandwidth charge for a complete restore would be just $11.40. The $9.50 fits within the $140 per month of Azure credit I already get with my MSDN benefits, so there is no additional charge.
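Those numbers can be sanity-checked with a quick calculation. The per-GB rates below are back-computed from the figures above, and the free outbound allowance is an assumption about the pricing at the time, so treat them as placeholders and check the current Azure pricing page:

```powershell
# Back-of-the-envelope cost check (rates are assumptions derived from the figures quoted above)
$dataGB       = 100
$storagePerGB = 0.095   # USD per GB per month for storage
$egressPerGB  = 0.12    # USD per GB transferred out
$freeEgressGB = 5       # assumed free outbound allowance

$monthlyCost = $dataGB * $storagePerGB                  # storage only; ingress is free
$restoreCost = ($dataGB - $freeEgressGB) * $egressPerGB # one-time cost of a full restore

"Monthly storage: {0:N2} USD" -f $monthlyCost
"Full restore:    {0:N2} USD" -f $restoreCost
```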
How it works: Using the PowerShell script below, I run a monthly backup of my main workstation as a scheduled task. The script compares the MD5 hash (see the second script) of each local file to what is held in the cloud. If the hashes differ, it uploads the file; it also deletes any files from my cloud backup that no longer exist on my local machine. It is just that simple!
Setting up Azure: In Azure, simply create a storage account (either locally redundant or geo-redundant, your choice) and click “Manage Access Keys” to get the values for $storageAccount and $storageKey. Next, create a container and be sure to mark it “Private”, or you will be sharing it with the world!
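If you prefer to script that step as well, the container can be created with the same cmdlets the backup script uses. This is a sketch, with the account, key, and container names as placeholders; -Permission Off corresponds to the “Private” option in the portal:

```powershell
# Assumes the Azure PowerShell module is installed and Set-AzureSubscription has been run
$storageAccount   = "your storage account here"
$storageKey       = "your storage key here"
$storageContainer = "your storage container here"

$context = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey

# -Permission Off = no anonymous access, i.e. a "Private" container
New-AzureStorageContainer -Name $storageContainer -Permission Off -Context $context
```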
Script (version 1.0):
This version of the script works off the Archive bit set on the local file system. This is adequate for most usages, but can be affected if existing backup software also clears the bit. For this to work you will need to install the PowerShell cmdlets for Azure (https://www.windowsazure.com/en-us/manage/install-and-configure-windows-powershell/) and run the “Set-AzureSubscription” command to connect your Azure account to the local store on the machine.
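For reference, the Archive bit that the script keys off is set by Windows whenever a file is created or modified, and can be inspected and cleared by hand. A small sketch (the path is a placeholder; attrib -a clears the bit, the same thing the script does after a successful upload):

```powershell
# Inspect and clear the Archive attribute on a file (placeholder path; Windows only)
$path = Join-Path $env:TEMP "archdemo.txt"
Set-Content -Path $path -Value "demo"   # writing a file sets the Archive bit

$hasArchive = ((Get-Item $path).Attributes -band [IO.FileAttributes]::Archive) -ne 0
"Archive bit set: $hasArchive"

# Clear the bit, marking the file as backed up
attrib -a $path
$hasArchive = ((Get-Item $path).Attributes -band [IO.FileAttributes]::Archive) -ne 0
"Archive bit set: $hasArchive"
```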
# Import the modules
import-module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
#### Set these parameters as you see fit
$logfile = "C:\Temp\backup.log"
$storageAccount = "your storage account here"
$storageKey = "your storage key here"
$storageContainer = "your storage container here"
$backupDirectories = "c:\data", "c:\users"
####
# Log into Azure. You must first have run 'set-AzureSubscription' for this to work
$context = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
$startTime = get-date
"$startTime => Starting Backup" | Out-File $logfile -Append
# get each item in the directories mentioned above that have the Archive bit set, a standard for backup technologies
#for each directory in $backupDirectories
foreach ($backupDirectory in $backupDirectories) {
    $files = Get-ChildItem $backupDirectory *.* -Recurse | Where-Object {!($_.PSIsContainer) -and ($_.Attributes -match "Archive")}
    # for each file in the directory
    foreach ($file in $files) {
        Set-AzureStorageBlobContent -Blob $file.FullName -Container $storageContainer -File $file.FullName -Context $context -Force
        $endTime = Get-Date
        $fullname = $file.FullName
        "$endTime => Uploaded $fullname" | Out-File $logfile -Append
        # clear the Archive bit now that the file has been backed up
        attrib -a "$fullname"
    }
}
$startTime = get-date
"$startTime => Starting Removal of old files" | Out-File $logfile -Append
# Now we need to delete objects from Azure that no longer exist on the local computer
$existingFiles = Get-AzureStorageBlob -context $context -Container $storageContainer
foreach ($existingFile in $existingFiles) {
    if ((Test-Path -LiteralPath $existingFile.Name) -ne $True) {
        Remove-AzureStorageBlob -Context $context -Container $storageContainer -Blob $existingFile.Name
        $deleteTime = Get-Date
        "$deleteTime => Deleted $($existingFile.Name)" | Out-File $logfile -Append
    }
}
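Since the overview mentions running this monthly as a scheduled task, here is one way to register it. The task name, schedule, and script path are placeholders; adjust them to taste:

```powershell
# Run the backup script on the first day of every month at 2:00 AM
# (C:\Scripts\AzureBackup.ps1 is a placeholder for wherever you saved the script)
schtasks /Create /TN "AzureMonthlyBackup" /SC MONTHLY /D 1 /ST 02:00 `
    /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\AzureBackup.ps1" `
    /RU SYSTEM
```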
Script (version 2.0):
This version of the script:
performs an MD5 hash check of the files using a file stream
includes directory path skipping (so that .ost files and the like are not uploaded)
maintains multiple log files
has a debug mode
sets the power plan
URL encodes the file names, as certain NTFS file names do not work in URI format, specifically the ] character
includes summary information in the log files
Retrieving the MD5 hash from Azure can incur slightly more bandwidth, but it makes for a more robust solution: comparing hashes verifies the integrity of each file and guarantees the local and cloud copies are identical.
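To see that comparison in isolation: Azure stores a blob's ContentMD5 property as a Base64-encoded MD5 digest, so the local file must be hashed the same way. A standalone sketch (Get-FileMD5Base64 is a hypothetical helper name; the script below inlines the same logic):

```powershell
# Compute the Base64-encoded MD5 digest of a file, the form Azure stores in ContentMD5
function Get-FileMD5Base64([string]$path) {
    $md5 = [System.Security.Cryptography.MD5]::Create()
    $stream = New-Object System.IO.FileStream $path, "Open", "Read"
    try {
        [System.Convert]::ToBase64String($md5.ComputeHash($stream))
    } finally {
        $stream.Close()
    }
}

# Example: hash a small temporary file
$tmp = Join-Path $env:TEMP "md5demo.txt"
[System.IO.File]::WriteAllText($tmp, "hello")
Get-FileMD5Base64 $tmp
```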
# This script was written by Kevin Saye (ksaye@saye.org)
#
# This is a Powershell backup script that backs up entire directories to Azure Storage
# To get the Azure PowerShell modules, install the cmdlets from:
# https://www.windowsazure.com/en-us/manage/install-and-configure-windows-powershell/
# A special thanks to https://www.nikgupta.net/2013/09/azure-blob-storage-powershell/ for the tips!
#
# I am assuming you have one container for each machine you are backing up.
#### Set these parameters as you see fit
$logfile = "C:\Temp\backup.log"
$storageAccount = "your storage account here"
$storageKey = "your storage key here"
$storageContainer = "your storage container here"
$backupDirectories = "c:\users", "c:\data"
$skipDirectoryMatch = "\Temp", "~", ".tmp", ".ost"
$debugLevel = $False
$logfilelimit = 5
####
# Import the Azure modules
import-module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
[Reflection.Assembly]::LoadWithPartialName("System.Web") | Out-Null
function SetPowerPlan([string]$PreferredPlan) {
    $guid = (Get-WmiObject -Class win32_powerplan -Namespace root\cimv2\power -Filter "ElementName='$PreferredPlan'").InstanceID.ToString()
    $regex = [regex]"{(.*?)}$"
    $newpowerVal = $regex.Match($guid).groups[1].value
    powercfg -S $newpowerVal
}
function escapeURL($escapeString) {
    $escapedString = [web.httputility]::UrlEncode($escapeString)
    $escapedString
}
function unEscape($escapeString) {
    $escapedString = [web.httputility]::UrlDecode($escapeString)
    $escapedString
}
$totalLocalFiles=0
$totalLocalFileSize=0
$totalCloudFileSize=0
$totalCloudFileSizeOld=0
$totalCloudFileCountOLd=0
$context = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
$md5 = [System.Security.Cryptography.MD5]::Create()
# rename log files
do {
    if ($logfilelimit -eq 1) {
        $logfilebackupsource = "$logfile"
    } else {
        $logfilelimitbefore = $logfilelimit - 1
        $logfilebackupsource = "$logfile.$logfilelimitbefore"
    }
    $logfilebackuptarget = "$logfile.$logfilelimit"
    # Silently skip the first runs, before older log files exist
    Copy-Item $logfilebackupsource $logfilebackuptarget -Force -ErrorAction SilentlyContinue
    $logfilelimit--
} while ($logfilelimit -gt 0)
#set Preferred powerplan
SetPowerPlan "High Performance"
$startTime = get-date
"$startTime => Starting Backup" | Out-File $logfile
"$startTime => Backing up directories: $backupDirectories, skipping any path that matches: $skipDirectoryMatch" | Out-File $logfile -Append
#get each item in the directories mentioned above and compare the MD5 hash values with what is in the cloud
foreach ($backupDirectory in $backupDirectories) {
    $files = Get-ChildItem $backupDirectory *.* -Recurse | Where-Object {!($_.PSIsContainer)}
    # for each file in the directory
    foreach ($file in $files) {
        $totalLocalFiles++
        $fullnameEsc = escapeURL($file.FullName)
        $fullname = $file.FullName
        $startTime = Get-Date
        $totalLocalFileSize = $totalLocalFileSize + $file.Length
        # Get local and cloud MD5 hash
        $localMD5 = $null
        $cloudMD5 = $null
        $fileReader = New-Object System.IO.FileStream $fullname, "Open", "Read"
        $localMD5 = [System.Convert]::ToBase64String($md5.ComputeHash($fileReader))
        $fileReader.Close()
        # one lookup instead of two; suppress the error when the blob does not exist yet
        $cloudBlob = Get-AzureStorageBlob -Blob $fullnameEsc -Container $storageContainer -Context $context -ErrorAction SilentlyContinue
        $cloudMD5 = $cloudBlob.ICloudBlob.Properties.ContentMD5
        $cloudfile = $cloudBlob.Name
        $skipfile = $false
        $localFileSize = $file.Length / 1024
        if ($cloudMD5 -ne $localMD5) {
            foreach ($skipDirectory in $skipDirectoryMatch) {
                if ($fullname.ToLower().Contains($skipDirectory.ToLower())) {
                    $skipfile = $True
                }
            }
            if (!$skipfile) {
                if ($debugLevel) {"$startTime => Debug: $fullname ($localFileSize KB) local hash is '$localMD5'. Cloud file $cloudfile cloud hash is '$cloudMD5', uploading file now." | Out-File $logfile -Append}
                Set-AzureStorageBlobContent -Blob $fullnameEsc -Container $storageContainer -File $fullname -Context $context -Force -ConcurrentTaskCount 1
                if ($debugLevel) {
                    $cloudBlob = Get-AzureStorageBlob -Blob $fullnameEsc -Container $storageContainer -Context $context
                    $cloudfile = $cloudBlob.Name
                    $cloudMD5 = $cloudBlob.ICloudBlob.Properties.ContentMD5
                    "$startTime => Debug: After upload cloud file $cloudfile, the cloud hash is '$cloudMD5'." | Out-File $logfile -Append
                }
                $startTime = Get-Date
                "$startTime => Uploaded $fullname as $fullnameEsc" | Out-File $logfile -Append
            } else {
                if ($debugLevel) {"$startTime => Debug: we are skipping: $fullname, because the path matches: $skipDirectoryMatch." | Out-File $logfile -Append}
            }
        } else {
            if ($debugLevel) {"$startTime => Debug: Cloud and Local MD5 hash match for: $fullname, not uploading." | Out-File $logfile -Append}
        }
    }
}
$startTime = get-date
[int]$totalLocalFileSize = $totalLocalFileSize/1024/1024/1024
"$startTime => Completed Backup of files to the cloud. Processed $totalLocalFiles local files for a total size of $totalLocalFileSize GB." | Out-File $logfile -Append
# Now we need to delete objects from Azure that no longer exist on the local computer
$existingFiles = Get-AzureStorageBlob -context $context -Container $storageContainer
$startTime = get-date
$fileCount = $existingFiles.Count
"$startTime => Starting Removal of old files, if any. Found $fileCount item(s) in the cloud. Processing now." | Out-File $logfile -Append
foreach ($existingFile in $existingFiles) {
    $fullname = $existingFile.Name
    $deleteTime = Get-Date
    $skipfile = $false
    $filenameUnEsc = unEscape($fullname)
    if ((Test-Path -LiteralPath $filenameUnEsc) -ne $True) {
        foreach ($skipDirectory in $skipDirectoryMatch) {
            if ($filenameUnEsc.ToLower().Contains($skipDirectory.ToLower())) {
                $skipfile = $True
            }
        }
        if (!$skipfile) {
            Remove-AzureStorageBlob -Context $context -Container $storageContainer -Blob $fullname
            "$deleteTime => Deleted file $fullname from the cloud backup." | Out-File $logfile -Append
            $totalCloudFileSizeOld = $totalCloudFileSizeOld + $existingFile.Length
            $totalCloudFileCountOLd++
        }
    } else {
        if ($debugLevel) {"$deleteTime => Debug: File $filenameUnEsc exists in the cloud and on the local machine, skipping." | Out-File $logfile -Append}
        $totalCloudFileSize = $totalCloudFileSize + $existingFile.Length
    }
}
[int]$totalCloudFileSize = $totalCloudFileSize/1024/1024/1024
[int]$totalCloudFileSizeOld = $totalCloudFileSizeOld/1024/1024/1024
$startTime = get-date
"$startTime => Completed Removal of unmatched cloud files. Processed $totalLocalFiles cloud files for a total size of $totalLocalFileSize GB." | Out-File $logfile -Append
"$startTime => Completed Removal of unmatched cloud files. Deleted $totalCloudFileCountOLd cloud files for a total size of $totalCloudFileSizeOld GB." | Out-File $logfile -Append
#set Preferred powerplan
SetPowerPlan "Balanced"
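For completeness, since the pricing section mentions a full restore: the same cmdlets can pull everything back down. This sketch reuses the $context and $storageContainer variables defined in the script above; $restoreRoot is a placeholder target directory, and the file names are URL-decoded to reverse the encoding done at upload:

```powershell
# Restore every blob in the backup container to a local folder (placeholder paths)
$restoreRoot = "C:\Restore"
[Reflection.Assembly]::LoadWithPartialName("System.Web") | Out-Null

foreach ($blob in (Get-AzureStorageBlob -Context $context -Container $storageContainer)) {
    # Reverse the URL encoding applied at upload to recover the original NTFS path
    $originalPath = [Web.HttpUtility]::UrlDecode($blob.Name)
    # Strip the drive colon so the path can nest under $restoreRoot
    $target = Join-Path $restoreRoot ($originalPath -replace ":", "")
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    Get-AzureStorageBlobContent -Blob $blob.Name -Container $storageContainer -Destination $target -Context $context -Force
}
```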