Azure Functions
An Azure service that provides an event-driven serverless compute platform.
I have a Timer Trigger Function that runs PowerShell code to fetch the UsedCapacity metric of Azure storage accounts, filtered by tag. The code works fine and returns accurate values when I trigger the function manually, but in its scheduled run it fetches a value of 0 from every storage account.
The PowerShell script uses SPN (service principal) authentication for all the storage accounts.
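The trigger itself is a standard PowerShell timer binding. Here is a small sketch of how the trigger payload could be logged at the top of run.ps1 to tell scheduled and manual invocations apart (assuming the default TimerInfo object passed in as $Timer; this logging is not part of the function code below):
# Sketch: log the timer payload at the start of run.ps1 to compare scheduled vs. manual runs.
Write-Host "IsPastDue: $($Timer.IsPastDue)"
Write-Host "Last scheduled run (UTC): $($Timer.ScheduleStatus.Last)"
Write-Host "Next scheduled run (UTC): $($Timer.ScheduleStatus.Next)"
The full run.ps1 is below: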
param($Timer)
# Get environment variables
$tenantId = [Environment]::GetEnvironmentVariable('TENANTID')
$thumb = [Environment]::GetEnvironmentVariable('THUMBID')
$appId = [Environment]::GetEnvironmentVariable('APPID')
$appName = [Environment]::GetEnvironmentVariable('APPNAME')
Write-Host "Connecting Azure through $appName SPN login to call UC API"
# Connect to Azure with SPN
try {
    Connect-AzAccount -ServicePrincipal -Tenant $tenantId -CertificateThumbprint $thumb -ApplicationId $appId
    $subscriptions = Get-AzSubscription
    Write-Host "Subscriptions fetched: $($subscriptions.Count)"
} catch {
    Write-Warning "Failed to connect to Azure with SPN login. Skipping."
    return
}
# Define the tag key
$tagKey = "Business Data Domain"
# Initialize an array to store the results
$strg = @()
$current_date = Get-Date -Format "yyyyMMdd"
foreach ($subscription in $subscriptions) {
    try {
        Set-AzContext -SubscriptionId $subscription.Id
        Write-Host "Set context to subscription: $($subscription.Name)"
        $storageAccounts = Get-AzStorageAccount | Where-Object { $_.Tags.ContainsKey($tagKey) }
        Write-Host "Storage accounts found in subscription $($subscription.Name): $($storageAccounts.Count)"
        if ($storageAccounts.Count -eq 0) {
            Write-Host "No storage accounts with tag '$tagKey' found in subscription $($subscription.Name)."
        }
    } catch {
        Write-Warning "Failed to set context for subscription: $($subscription.Name). Skipping."
        continue
    }
    foreach ($storageAccount in $storageAccounts) {
        Write-Host "Processing storage account: $($storageAccount.StorageAccountName)"
        $resourceId = "/subscriptions/$($subscription.Id)/resourceGroups/$($storageAccount.ResourceGroupName)/providers/Microsoft.Storage/storageAccounts/$($storageAccount.StorageAccountName)"
        $uri = "https://management.azure.com$resourceId/providers/Microsoft.Insights/metrics?api-version=2023-10-01&metricnames=UsedCapacity&aggregation=Average"
        try {
            $response = Invoke-AzRestMethod -Method Get -Uri $uri
            $metrics = $response.Content | ConvertFrom-Json
            Write-Host "Metrics response for $($storageAccount.StorageAccountName): $($response.Content)"
            $usedCapacityMetric = $metrics.value | Where-Object { $_.name.value -eq "UsedCapacity" }
            if ($usedCapacityMetric) {
                $averageCapacity = $usedCapacityMetric.timeseries.data.average | Measure-Object -Sum | Select-Object -ExpandProperty Sum
                Write-Host "Average Capacity for $($storageAccount.StorageAccountName): $averageCapacity"
            } else {
                Write-Host "Metric 'UsedCapacity' not found for $($storageAccount.StorageAccountName)."
                $averageCapacity = 0
            }
        } catch {
            Write-Warning "Error retrieving metrics for $($storageAccount.StorageAccountName): $($_.Exception.Message)"
            $averageCapacity = 0
        }
        # Create a custom object for each storage account
        $objresults = [PSCustomObject]@{
            SubscriptionName    = $subscription.Name
            StorageAccount      = $storageAccount.StorageAccountName
            UsedCapacityInBytes = $averageCapacity
            TagName             = $tagKey
            TagValue            = $storageAccount.Tags[$tagKey]
        }
        # Add the object to the results array
        $strg += $objresults
    }
}
<#Write-Host "Completed fetching storage account details for $($storageAccount.StorageAccountName)"
catch {
Write-Warning "Failed to process storage account: $($storageAccount.StorageAccountName). Skipping."
}#>
Write-Host "Checking if strg has data before exporting to CSV..."
if ($strg.Count -eq 0) {
    Write-Host "No data found in strg array!"
} else {
    Write-Host "strg array contains data. Proceeding to export..."
}
# Create folder to store source path
$localPath = "$(Get-Location)\strgacctmp"
If (-not (Test-Path -Path $localPath)) {
    New-Item -Path $localPath -ItemType 'Directory' -Force
    Write-Host "New Folder strgacctmp Created!"
} else {
    Write-Host "Folder already Exists!"
}
<# Create folder to store source path
$localPath = "$(Get-Location)\unitycatalogtmp"
If (Get-ChildItem -Path $localPath -Force -ErrorAction SilentlyContinue) {
Write-Host "Folder already Exists!"
}
else {
New-Item -Path "$localPath" -ItemType 'Directory' -Name unitycatalogtmp -Force -ErrorAction Stop
Write-Host "New Folder unitycatalogtemp Created!"
}#>
function uploadFileToBlob {
    param(
        [String] $LocalFilePath,
        [String] $TargetFileName
    )
    $tenantId = [Environment]::GetEnvironmentVariable('TENANTID')
    $storage_account = [Environment]::GetEnvironmentVariable('STORAGE_ACCOUNT')
    $subscription = [Environment]::GetEnvironmentVariable('SUBSCRIPTION_NAME')
    $filesystemName = [Environment]::GetEnvironmentVariable('CONTAINER_NAME')
    $destPath = [Environment]::GetEnvironmentVariable('ADLS_FOLDER_PATH')
    Write-Host " ----Retrieved App Settings from Function App--- "
    Write-Host "ADLS_STORAGE_ACCOUNT - $storage_account"
    Write-Host "ADLS_SUBSCRIPTION_NAME - $subscription"
    Write-Host "ADLS_CONTAINER_NAME - $filesystemName"
    Write-Host "Source - $LocalFilePath"
    Write-Host "Destination Path - $destPath"
    Write-Host "Path from where files will be uploaded: $LocalFilePath"
    Write-Host "-------------------------"
    $tenantId = [Environment]::GetEnvironmentVariable('TENANTID')
    $thumb = [Environment]::GetEnvironmentVariable('THUMBID')
    $appId = [Environment]::GetEnvironmentVariable('APPID')
    $appName = [Environment]::GetEnvironmentVariable('APPNAME')
    Write-Host "Connecting Azure through $appName SPN login to upload to ADLS"
    $ctx = New-AzStorageContext -StorageAccountName $storage_account -UseConnectedAccount -ErrorAction Stop
    #$ctx
    try {
        $destPath += "/" + $TargetFileName
        New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $LocalFilePath -ErrorAction Stop -Force
        Start-Sleep -Seconds 2
        # Will execute on successful completion of folder upload
        Write-Host "File uploaded successfully!"
    }
    catch {
        Write-Output " Error occurred as below"
        $_
        exit
    }
}
# Export to CSV
$CsvPath = $localPath + "\strgaccsize$current_date.csv"
$strg | Export-Csv -Path $CsvPath -NoTypeInformation
#Upload to blob
$targetFileName = "edpstrgaccsize$($current_date).csv"
uploadFileToBlob -LocalFilePath $CsvPath -TargetFileName $targetFileName
Write-Host "Uploaded Storage Details-"
#Need to delete locally stored files in folder unitycatalogtmp
Get-ChildItem -Path $localPath -Filter *.csv | Remove-Item -Recurse
Write-Host "Script Completed"
In the scheduled run's log I noticed that the "average" value is not being returned, while the manual run collects the same metric correctly (manual-run response below):
9/19/2024, 7:28:32.800 AM INFORMATION: Processing storage account azstrg12345
9/19/2024, 7:28:33.597 AM OUTPUT: Response Content:
9/19/2024, 7:28:33.597 AM OUTPUT: {
  "cost": 59,
  "timespan": "2024-09-19T06:28:33Z/2024-09-19T07:28:33Z",
  "interval": "PT1H",
  "value": [
    {
      "id": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.Storage/storageAccounts/xxxx/providers/Microsoft.Insights/metrics/UsedCapacity",
      "type": "Microsoft.Insights/metrics",
      "name": {
        "value": "UsedCapacity",
        "localizedValue": "Used capacity"
      },
      "displayDescription": "The amount of storage used by the storage account. For standard storage accounts, it's the sum of capacity used by blob, table, file, and queue. For premium storage accounts and Blob storage accounts, it is the same as BlobCapacity or FileCapacity.",
      "unit": "Bytes",
      "timeseries": [
        {
          "metadatavalues": [],
          "data": [
            {
              "timeStamp": "2024-09-19T06:28:00Z",
              "average": 55255854855
            }
          ]
        }
      ],
      "errorCode": "Success"
    }
  ],
  "namespace": "Microsoft.Storage/storageAccounts",
  "resourceregion": "northeurope"
}
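For comparison, this is roughly how I reproduce the metrics call when testing by hand. It's only a sketch: the resource ID is a placeholder, and the explicit timespan/interval plus the null check on "average" are debugging additions that are not in the function code above.
# Manual repro sketch (run after Connect-AzAccount with the same SPN as above).
# Placeholder resource ID - substitute a real storage account.
$resourceId = "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.Storage/storageAccounts/xxxx"
# Request an explicit two-hour window at one-hour grain instead of relying on the API default.
$end   = (Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ssZ")
$start = (Get-Date).ToUniversalTime().AddHours(-2).ToString("yyyy-MM-ddTHH:mm:ssZ")
$uri = "https://management.azure.com$resourceId/providers/Microsoft.Insights/metrics" +
    "?api-version=2023-10-01&metricnames=UsedCapacity&aggregation=Average" +
    "&timespan=$start/$end&interval=PT1H"
$response = Invoke-AzRestMethod -Method Get -Uri $uri
$metrics = $response.Content | ConvertFrom-Json
# Keep only data points that actually carry an "average" value before summing.
$points = @($metrics.value.timeseries.data | Where-Object { $null -ne $_.average })
if ($points.Count -gt 0) {
    $averageCapacity = ($points | Measure-Object -Property average -Sum).Sum
} else {
    $averageCapacity = 0
}
Write-Host "Manual repro UsedCapacity (bytes): $averageCapacity"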