I ended up writing a PowerShell script that uses SqlPackage to export a .bacpac of each database to my local PC and then uploads it to Azure Blob Storage; it runs on a schedule. It's not a perfect solution, but I need access to the data for development and support. I really like Managed Instance so far, with the exception of this. There should be a more integrated way to back up and restore MI databases locally, or at least to export a .bacpac directly to Azure Blob Storage.
Here's the script. I'm not very experienced with PowerShell, but it works...
#SQLPackage - your location may be different - you can find it using this in cmd: where /R c:\ SqlPackage.exe
$sqlPackageFileName = "c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\150\sqlpackage.exe"
#Database connection
$targetServerName = "TargetServerSQLManagedInstance" # MI host looks like yourinstance.<dns-zone>.database.windows.net
$username = "username"
$password = "password"
#Storage Connection
$storageAccountName = "StorageAccountName"
$storageContainerName = "StorageContainerName"
$storageAccountKey = "StorageAccountKey"
# Add all databases by name to the array
$databases = @("Database1", "Database2")
# Create the storage context once; account key auth means no Connect-AzAccount or subscription context is needed
# (requires the Az.Storage module: Install-Module Az.Storage)
$destinationContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

foreach ($database in $databases) {
    # Timestamp the file name; HH (24-hour) keeps AM and PM exports from colliding
    $Now = Get-Date -Format "MMddyyyy_HHmmss"
    $blob = $database + "_" + $Now + ".bacpac"
    $filename = "C:\Temp\" + $blob

    # Export the database to a local .bacpac
    & $sqlPackageFileName /Action:Export /ssn:$targetServerName /sdn:$database /su:$username /sp:$password /tf:$filename /p:Storage=File

    # Upload the file to blob storage
    Set-AzStorageBlobContent -File $filename -Container $storageContainerName -Blob $blob -Context $destinationContext -StandardBlobTier Hot

    # Remove the local copy
    Remove-Item $filename
}
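For the schedule, I just registered the script as a Windows scheduled task. A minimal sketch, assuming the script is saved as C:\Scripts\Export-Bacpacs.ps1 and you want a nightly run (the path, time, and task name are placeholders):

# Run the export script every night at 2 AM
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Export-Bacpacs.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "Export MI bacpacs" -Action $action -Trigger $trigger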
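And when I need the data locally, the same SqlPackage executable can import the .bacpac with /Action:Import. A minimal sketch, assuming a local default instance with Windows auth; the target database name and file name are placeholders you'd swap for an actual export:

# Import a previously exported .bacpac into a local SQL Server instance
& $sqlPackageFileName /Action:Import /tsn:"localhost" /tdn:"Database1" /sf:"C:\Temp\Database1_01012021_020000.bacpac"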