No active sponsorship
I have an Azure student account, and I get "No active sponsorship" when I check my balance.
How to connect Databricks to a Storage account in Azure when I have no access to app registration
How to connect Databricks to a Storage account in Azure when I have no access to app registration. Does this mean I can only go to my administrator to fix this?
Unable to Access Microsoft Work Account due to Lost Authentication Code
How can a login to a Microsoft work account be recovered when it requires an authentication code from the Microsoft Authenticator app? Authenticator access was lost after a phone restart that broke the app.
Geo-redundancy cannot be enabled on Storage Account
I have a storage account in West US that was set up with RA-GRS redundancy. Last week, I decided to perform a manual failover to the secondary region (East US). The failover succeeded, but now I am unable to set redundancy back to RA-GRS. The option…
Storage account stuck in data sync to secondary region
I have a storage account in West US that we have enabled RA-GRS resiliency on. I wanted to perform a failover to the secondary region as part of a drill, but the storage account is stuck doing initial data sync from the primary to the secondary region…
When do lifecycle policy changes run in the background?
If I set a lifecycle policy to change the tier from Hot to Cold after 0, 5, or 10 days, at what time after hitting that criterion does the tier get updated? I understand that it can take 24 hours for the policy to take effect, but once it does, how often does it…
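For reference, a lifecycle rule of the kind described in this question might be declared in Bicep along these lines (a sketch, not the asker's actual policy; the `storageAccount` symbol, the rule name, and the 10-day threshold are assumptions, and the management policy resource must be named `default`):

```bicep
// Lifecycle management policy: move block blobs from Hot to Cold
// 10 days after their last modification.
resource lifecyclePolicy 'Microsoft.Storage/storageAccounts/managementPolicies@2023-01-01' = {
  parent: storageAccount // existing Microsoft.Storage/storageAccounts resource
  name: 'default'        // management policies must use the name 'default'
  properties: {
    policy: {
      rules: [
        {
          enabled: true
          name: 'move-to-cold'
          type: 'Lifecycle'
          definition: {
            actions: {
              baseBlob: {
                tierToCold: {
                  daysAfterModificationGreaterThan: 10
                }
              }
            }
            filters: {
              blobTypes: [ 'blockBlob' ]
            }
          }
        }
      ]
    }
  }
}
```

Note that the rule only defines the eligibility criterion; the platform decides when the background evaluation actually runs, which is what the question is asking about.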
Azure Data Factory: Copy from Blob to Blob Container with Hierarchical Namespace Fails with 409 Conflict
I am trying to copy data from one storage account to another. I have a pipeline with an activity that copies data from source storage account to sink. The pipeline runs successfully and I can see data has been copied. But, as soon as I try to copy the…
Setting defender settings for storage account via bicep does not work
I have included the following in my Bicep in order to use Microsoft Defender for Cloud for my storage account (see code below). The pipeline that deploys the resources in Azure goes through without issues and Microsoft Defender for Cloud gets enabled…
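The asker's Bicep is elided in the excerpt, but a resource-level Defender for Storage setting is typically declared along these lines (a sketch, assuming an existing `storageAccount` resource symbol; the API version and property values are illustrative, not the asker's):

```bicep
// Resource-level Microsoft Defender for Storage settings,
// scoped to an existing storage account.
resource defenderForStorage 'Microsoft.Security/defenderForStorageSettings@2022-12-01-preview' = {
  name: 'current'       // the settings resource must be named 'current'
  scope: storageAccount // existing Microsoft.Storage/storageAccounts resource
  properties: {
    isEnabled: true
    overrideSubscriptionLevelSettings: true
    malwareScanning: {
      onUpload: {
        isEnabled: true
        capGBPerMonth: 5000
      }
    }
    sensitiveDataDiscovery: {
      isEnabled: true
    }
  }
}
```

If only `isEnabled` is set without the nested settings, the per-resource configuration may silently fall back to the subscription-level plan, which could explain a deployment that succeeds without the expected effect.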
failed to initialize new pipeline [failed to authenticate credentials for azstorage]
I have created a storage account (StorageV2, general purpose v2) and a container inside it. This storage account has both .blob.core.windows.net/ and .dfs.core.windows.net/ private endpoints. Public network access is disabled. Access Key…
I need to move storage data from one region to another, but with azcopy the metadata is changing. How can I copy data fast and with the same metadata?
I want to move the storage data from one region to another, and the metadata should not change. By default it is changing to lower case. I want it the same way as it is in the primary region.
AWS DataSync S3 Bucket Copy Files To Azure Storage Account Container
Hi Expert, I had an issue with AWS DataSync copying files from an S3 bucket to an Azure Storage Account container. I set up AWS DataSync with the S3 bucket as the location/source, copying files to an Azure Storage Account container as the destination. It already has the permission to access…
While Creating Logic Apps Workflow getting error "Failed to save workflow."
While attempting to create a workflow for a Logic App (Standard), I keep getting the error message "Failed to create a workflow". The storage account associated with the Logic App has policy-related restrictions requiring it to be part of a…
Issue with Synapse Linked Service and Storage Blob Data Contributor Role
Hi, I’m encountering an issue when trying to open the Synapse Linked Service that was automatically created by Azure Data Lake Storage. The error message I’m receiving is: “Please check permission on ... to make sure you have at least 'Storage Blob Data…
Not authorized to view storage and logs for trusted signing account I created
I'm trying to get usage metrics from Trusted Signing so I can monitor usage before we hit the monthly cap and start being billed extra. I am following the instructions to store logs from Trusted Signing within the Azure storage account as defined here…
How to insert huge record sets, like 1,000 or 10,000 records, into a SQL DB using a Logic App without timeouts or server errors, using batching
I have a CSV file where I am able to read the data, and while passing the data into my SQL database I get timeout or server errors when it is a huge amount of data. How do I do batching effectively?
azcopy copy failure between storage accounts with >10MB blob size
Hi Team, any ideas how to resolve this? I am experiencing a problem when trying to copy a blob from one storage account to another when the blob is ~10MB or more. A ~1MB blob consistently works fine. I…
Azure Storage Account not accessible via portal but we can access through explorer
We are unable to access the blob containers via the Azure portal when looking at an Azure Storage account. This is not an access or networking issue, as we were able to do this last week. I can access the equivalent storage account in a different…
Data factory storage event trigger support rename event
Is it possible for the storage event trigger to support blob rename? Currently it only supports create and delete events. But we have another platform called SAP DataSphere that uses the BlobRenamed event type and the RenameFile API to push data into storage…
How to identify Azure resources that need to update to TLS 1.2 or later version?
I received a notification from Microsoft that interactions with Azure services must be secured using Transport Layer Security (TLS) 1.2 or later by October 31, 2024. I want to know which resources or services are affected and need to be updated to TLS…
I have placed a .json file in the data storage container I created, but while I try to fetch the data through inputs from a Stream Analytics job, the data is not loading
I have placed the .json file in the storage account container (Date folder --> Time folder) as per the below screenshot, and I am trying to fetch the data present in the .json file in one of the inputs of the Stream Analytics job as per the…