Azure time trigger PowerShell function manual vs schedule

Sudhirkumar Karamchand 0 Reputation points
2024-09-19T09:40:48.64+00:00

I have a Timer Trigger function that runs PowerShell code to fetch the UsedCapacity metric of Azure storage accounts based on their tags. The code works fine when I trigger the job manually and gives me accurate details, but it fetches a value of 0 from all the storage accounts in its scheduled run.

The PowerShell uses SPN authentication for all the storage accounts.

param($Timer)

# Get environment variables
$tenantId = [Environment]::GetEnvironmentVariable('TENANTID')
$thumb = [Environment]::GetEnvironmentVariable('THUMBID')
$appId = [Environment]::GetEnvironmentVariable('APPID')
$appName = [Environment]::GetEnvironmentVariable('APPNAME')

Write-Host "Connecting Azure through $appName SPN login to call UC API"

# Connect to Azure with SPN
try {
    Connect-AzAccount -ServicePrincipal -Tenant $tenantId -CertificateThumbprint $thumb -ApplicationId $appId
    $subscriptions = Get-AzSubscription
    Write-Host "Subscriptions fetched: $($subscriptions.Count)"
} catch {
    Write-Warning "Failed to connect to Azure with SPN login. Skipping."
    return
}

# Define the tag key
$tagKey = "Business Data Domain"

# Initialize an array to store the results
$strg = @()
$current_date = Get-Date -Format "yyyyMMdd"

foreach ($subscription in $subscriptions) {
    try {
        Set-AzContext -SubscriptionId $subscription.Id
        Write-Host "Set context to subscription: $($subscription.Name)"
        
        # Guard against accounts with no tags at all (Tags can be null)
        $storageAccounts = Get-AzStorageAccount | Where-Object { $_.Tags -and $_.Tags.ContainsKey($tagKey) }
        Write-Host "Storage accounts found in subscription $($subscription.Name): $($storageAccounts.Count)"
        
        if ($storageAccounts.Count -eq 0) {
            Write-Host "No storage accounts with tag '$tagKey' found in subscription $($subscription.Name)."
        }
    } catch {
        Write-Warning "Failed to set context for subscription: $($subscription.Name). Skipping."
        continue
    }
    
    foreach ($storageAccount in $storageAccounts) {
        Write-Host "Processing storage account: $($storageAccount.StorageAccountName)"

        $resourceId = "/subscriptions/$($subscription.Id)/resourceGroups/$($storageAccount.ResourceGroupName)/providers/Microsoft.Storage/storageAccounts/$($storageAccount.StorageAccountName)"
        $uri = "https://management.azure.com$resourceId/providers/Microsoft.Insights/metrics?api-version=2023-10-01&metricnames=UsedCapacity&aggregation=Average"
        
        try {
            $response = Invoke-AzRestMethod -Method Get -Uri $uri
            $metrics = $response.Content | ConvertFrom-Json
            Write-Host "Metrics response for $($storageAccount.StorageAccountName): $($response.Content)"
            $usedCapacityMetric = $metrics.value | Where-Object { $_.name.value -eq "UsedCapacity" }
            
            if ($usedCapacityMetric) {
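                # Sum the per-interval 'average' values the API returned for the window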
                $averageCapacity = $usedCapacityMetric.timeseries.data.average | Measure-Object -Sum | Select-Object -ExpandProperty Sum
                Write-Host "Average Capacity for $($storageAccount.StorageAccountName): $averageCapacity"
            } else {
                Write-Host "Metric 'UsedCapacity' not found for $($storageAccount.StorageAccountName)."
                $averageCapacity = 0
            }
        } catch {
            Write-Warning "Error retrieving metrics for $($storageAccount.StorageAccountName): $($_.Exception.Message)"
            $averageCapacity = 0
        }

                
        # Create a custom object for each storage account
        $objresults = [PSCustomObject]@{
            SubscriptionName    = $subscription.Name
            StorageAccount      = $storageAccount.StorageAccountName
            UsedCapacityInBytes = $averageCapacity
            TagName             = $tagKey
            TagValue            = $storageAccount.Tags[$tagKey]
        }
        # Add the object to the results array
        $strg += $objresults
    }

}

<#Write-Host "Completed fetching storage account details for $($storageAccount.StorageAccountName)"
        catch {
            Write-Warning "Failed to process storage account: $($storageAccount.StorageAccountName). Skipping."
        }#>
    


Write-Host "Checking if strg has data before exporting to CSV..."
if ($strg.Count -eq 0) {
    Write-Host "No data found in strg array!"
} else {
    Write-Host "strg array contains data. Proceeding to export..."
}

# Create folder to store source path
$localPath = "$(Get-Location)\strgacctmp"
If (-not (Test-Path -Path $localPath)) {
    New-Item -Path $localPath -ItemType 'Directory' -Force
    Write-Host "New Folder strgacctmp Created!"
} else {
    Write-Host "Folder already Exists!"
}

function uploadFileToBlob {
    param(
        [String] $LocalFilePath,
        [String] $TargetFileName
    )
    $tenantId = [Environment]::GetEnvironmentVariable('TENANTID')
    $storage_account = [Environment]::GetEnvironmentVariable('STORAGE_ACCOUNT')
    $subscription = [Environment]::GetEnvironmentVariable('SUBSCRIPTION_NAME')
    $filesystemName = [Environment]::GetEnvironmentVariable('CONTAINER_NAME')
    $destPath = [Environment]::GetEnvironmentVariable('ADLS_FOLDER_PATH')

    Write-Host " ----Retrieved App Settings from Function App--- "
    Write-Host "ADLS_STORAGE_ACCOUNT - $storage_account"
    Write-Host "ADLS_SUBSCRIPTION_NAME - $subscription"
    Write-Host "ADLS_CONTAINER_NAME - $filesystemName"
    Write-Host "Source - $LocalFilePath"
    Write-Host "Destination Path - $destPath"
    Write-Host "Path from where files will be uploaded: $LocalFilePath"
    Write-Host "-------------------------"

    $thumb = [Environment]::GetEnvironmentVariable('THUMBID')
    $appId = [Environment]::GetEnvironmentVariable('APPID')
    $appName = [Environment]::GetEnvironmentVariable('APPNAME')

    Write-Host "Connecting Azure through $appName SPN login to upload to ADLS"

    
    $ctx = New-AzStorageContext -StorageAccountName $storage_account -UseConnectedAccount -ErrorAction Stop
    #$ctx

    try {
        $destPath += "/" + $TargetFileName
        New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $LocalFilePath -ErrorAction Stop -Force
        Start-Sleep -seconds 2

        # Will execute on successful completion of folder upload
        Write-Host "File uploaded successfully!"
    }
    catch {
        Write-Output " Error occured as below"
        $_
        exit
    }
}

# Export to CSV        
$CsvPath = $localPath + "\strgaccsize$current_date.csv"
$strg | Export-Csv -Path $CsvPath -NoTypeInformation

#Upload to blob
$targetFileName = "edpstrgaccsize$($current_date).csv"
uploadFileToBlob -LocalFilePath $CsvPath -TargetFileName $targetFileName

Write-Host "Uploaded Storage Details-"

#Delete the locally stored CSV files in folder strgacctmp
Get-ChildItem -Path $localPath -Filter *.csv | Remove-Item -Recurse

Write-Host "Script Completed"


In the scheduled run's log report I noticed "average" not being fetched, whereas the manual run collects the same metric, as shown below:

9/19/2024, 7:28:32.800 AM	INFORMATION: Processing storage account azstrg12345
9/19/2024, 7:28:33.597 AM	OUTPUT: Response Content:
9/19/2024, 7:28:33.597 AM	"OUTPUT: {
  ""cost"": 59,
  ""timespan"": ""2024-09-19T06:28:33Z/2024-09-19T07:28:33Z"",
  ""interval"": ""PT1H"",
  ""value"": [
    {
      ""id"": ""/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.Storage/storageAccounts/xxxx/providers/Microsoft.Insights/metrics/UsedCapacity"",
      ""type"": ""Microsoft.Insights/metrics"",
      ""name"": {
        ""value"": ""UsedCapacity"",
        ""localizedValue"": ""Used capacity""
      },
      ""displayDescription"": ""The amount of storage used by the storage account. For standard storage accounts, it's the sum of capacity used by blob, table, file, and queue. For premium storage accounts and Blob storage accounts, it is the same as BlobCapacity or FileCapacity."",
      ""unit"": ""Bytes"",
      ""timeseries"": [
        {
          ""metadatavalues"": [],
          ""data"": [
            {
              ""timeStamp"": ""2024-09-19T06:28:00Z"",
              ""average"": 55255854855
            }
          ]
        }
      ],
      ""errorCode"": ""Success""
    }
  ],
  ""namespace"": ""Microsoft.Storage/storageAccounts"",
  ""resourceregion"": ""northeurope""
}"


1 answer

  1. Amira Bedhiafi 41,111 Reputation points Volunteer Moderator
    2025-06-24T09:56:55.14+00:00

    Hello @Sudhirkumar Karamchand!

    Thank you for posting on Microsoft Learn.

    I think the issue is a context or environment difference between the manual and scheduled runs of your timer trigger function.

    When you run it manually, your PowerShell function has access to your interactive identity/session (especially in development or testing via VS Code / the Azure Portal). But during scheduled execution, it's fully reliant on non-interactive SPN auth (a service principal with certificate thumbprint), which is where the problem typically lies.
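
    One quick way to confirm which identity each run is actually using is to log the context right after Connect-AzAccount; a minimal diagnostic sketch (Get-AzContext and these properties are standard Az.Accounts output, nothing specific to your app):

    # Log which identity this run is actually using
    $ctx = Get-AzContext
    Write-Host "Connected as $($ctx.Account.Id) (type: $($ctx.Account.Type), tenant: $($ctx.Tenant.Id))"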

    You mentioned:

    Connect-AzAccount -ServicePrincipal -Tenant $tenantId -CertificateThumbprint $thumb -ApplicationId $appId
    

    However, later, when using:

    Invoke-AzRestMethod -Method Get -Uri $uri
    

    This may fail silently or return partial data when the token is missing scopes/permissions in the scheduled context.

    When you use Connect-AzAccount with a certificate, the session token might not have access to the Microsoft.Insights metrics APIs unless the SPN has the Monitoring Reader or Reader role.

    If the schedule runs right on the hour, there may be a delay before the metrics become available (the API may return the metric, but timeseries.data.average is missing because ingestion hasn't completed).

    The default Invoke-AzRestMethod request doesn't specify an explicit timespan, so the API uses the default 1-hour window ending at the current UTC time. During scheduled runs that may yield no data points, especially if metrics aren't yet available for that window.

    You may need to hardcode or dynamically generate a more stable window:

    $endTime = (Get-Date).ToUniversalTime().AddMinutes(-5)
    $startTime = $endTime.AddHours(-1)
    $timespan = "$($startTime.ToString("s"))Z/$($endTime.ToString("s"))Z"
    $uri = "https://management.azure.com$resourceId/providers/Microsoft.Insights/metrics?api-version=2023-10-01&metricnames=UsedCapacity&aggregation=Average&timespan=$timespan"
    
    

    The SPN used for the certificate login should have the Reader or Monitoring Reader role on the storage accounts, or you can simply assign it once at the subscription level if you're looping through many.
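
    A sketch of assigning it per subscription, reusing the $appId and $subscriptions variables from your script (run this once with an identity that has rights to create role assignments, not from the function itself):

    foreach ($subscription in $subscriptions) {
        # Grant the SPN read access to metrics across the whole subscription
        New-AzRoleAssignment -ApplicationId $appId `
            -RoleDefinitionName "Monitoring Reader" `
            -Scope "/subscriptions/$($subscription.Id)"
    }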

    The REST call might succeed with a 200 but still contain no metrics, so try adding this inside the try block:

    if (-not $metrics.value) {
        Write-Warning "No metrics returned in scheduled run for $($storageAccount.StorageAccountName)"
    }
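
    Invoke-AzRestMethod also doesn't throw on non-success HTTP status codes, so it's worth checking the status explicitly as well; a small addition (StatusCode and Content are standard properties of the response object):

    if ($response.StatusCode -ne 200) {
        Write-Warning "Metrics call returned HTTP $($response.StatusCode) for $($storageAccount.StorageAccountName): $($response.Content)"
    }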
    

    Another thing that caught my eye: scheduled runs don't support interactive login, so this line only works if a non-interactive login has already succeeded in the same session:

    $ctx = New-AzStorageContext -StorageAccountName $storage_account -UseConnectedAccount
    

    Make sure the SPN login (Connect-AzAccount) from earlier in your script has actually run before this point, or switch to the Function App's managed identity with Connect-AzAccount -Identity. Either way, the identity also needs a data-plane role such as Storage Blob Data Contributor on the destination account; a shared key context is another option if that's acceptable in your environment.
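
    A minimal sketch of the managed identity route, assuming a system-assigned identity is enabled on the Function App and granted that role (both are assumptions about your setup):

    # Assumes the Function App's system-assigned managed identity exists and
    # holds "Storage Blob Data Contributor" on the destination storage account
    Connect-AzAccount -Identity | Out-Null
    $ctx = New-AzStorageContext -StorageAccountName $storage_account -UseConnectedAccount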

