I have also added -BlobType Append, but it is still not saving the other jobs' status that was written before the current run.
Azure automation runbooks(multiple) are not appending to same blob storage
I have 5 runbooks from which I am sending log data to a common blob file in a storage account, as below:
Set-AzStorageBlobContent -Container "data" -File "file" -Blob "$Year/$Month/$Day/status.csv" -Context $context -Force
But it is removing the content written by the other runbooks; it is not appending when the PowerShell scripts run as Automation runbooks. It just deletes the content previously written by the other runbooks.
If I run the same PowerShell scripts locally (VS Code), it works fine: all the scripts append to the same storage account blob. But when I run them with runbooks, they do not append.
Thanks for your post! I have done some further investigation with our Automation team and confirmed that when you have different processes running, you should write their output to separate files and append them later. For this scenario, it appears to be a basic OS synchronization issue: the different processes all try to write to the same file at once, causing the conflict you are seeing.
Please let us know if you are able to make these changes; we are happy to help further should you have any questions or concerns.
Below is the code I have added; it is appending now for all the processes, thank you for the help.
Just one concern: it is adding the column names with every write. I want to exclude the duplicate column names on each write operation. Can you please suggest a fix?
$container = (Get-AzStorageContainer -Name "centralizedlogdata" -Context $context).CloudBlobContainer
# Reference the CSV as an append blob so each runbook can add to it
$appendBlob = $container.GetAppendBlobReference("$Year/$Month/$Day/centralized_status.csv")
# Create the append blob on the first run of the day
if (-not $appendBlob.Exists()) { $appendBlob.CreateOrReplace() }
# Open the local status file and append its contents to the blob
$contentToAdd = [System.IO.File]::OpenRead("$CentralizedStatus")
$appendBlob.AppendFromStream($contentToAdd)
$contentToAdd.Dispose()
Thank you for the update @NGaur-3476. You can use the PowerShell tools/cmdlets to achieve this more easily; use the following script as a template for testing to see if it achieves your end goal of excluding duplicate column names:
$CentralizedStatus = "C:\path\to\append\content"
# Get content to append to .\tempblob
$contentToAdd = Get-Content -Path $CentralizedStatus
# Get current content; store in .\tempblob
Get-AzStorageBlobContent -Blob "$Year/$Month/$Day/centralized_status.csv" -Container "centralizedlogdata" -Context $context -Destination .\tempblob -Force
# Append desired content to .\tempblob
Add-Content -Value $contentToAdd -Path .\tempblob
# Overwrite existing blob w/ new data
Set-AzStorageBlobContent -File .\tempblob -Container "centralizedlogdata" -Blob "$Year/$Month/$Day/centralized_status.csv" -Context $context -Force
# Remove .\tempblob
Remove-Item .\tempblob
Please let us know if this helps while testing, and reach out should you require any further assistance. The script deletes the temporary file once it has finished running, and we have also verified that it runs on our end.