Azure storage blob limit

Akash Bhot 26 Reputation points
2021-08-30T19:18:09.613+00:00

Use Case - Back up database (.BAK) of around 300 GB to Azure storage blob

Issue - Currently it looks like there is a 200 GB file-size constraint on uploads to Azure Blob Storage

When I tried to back up a database of around 300 GB into a single backup file, I ran into the issue below.

Msg 3202, Level 16, State 1, Line 1
Write on "https://XXXXXXstorage.blob.core.windows.net/container/XXX_Backup_1.bak" failed: 1117(The request could not be performed because of an I/O device error.)
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.

I tried to split the backup into 2 files but still ran into the same issue. I continued splitting the backup until 9 files, at which point I was able to back up the database.

Question -
1) How would I dynamically determine the number of split files required for a database backup larger than 200 GB?
a. Is there a supporting article available online?


3 answers

  1. Sumarigo-MSFT 44,591 Reputation points Microsoft Employee
    2021-08-31T05:53:51.987+00:00

    @Akash Bhot Welcome to Microsoft Q&A Forum, Thank you for posting your query here!

    To better understand the issue: are you using SQL Server database backup to Azure Blob Storage? Can you also check that ports 80 and 443 are enabled?
    Are you using any proxy or firewall? A proxy may prevent connections or limit the number of connections.
    Which version of SQL Server are you using?

    SQL Server Managed Backup to Microsoft Azure uses the Backup to Block Blob feature. The maximum size of a block blob is 200 GB, but by using striping, the maximum size of an individual backup can be up to 12 TB. If your backup requirements exceed this, consider using compression, and test the backup file size before setting up SQL Server Managed Backup to Microsoft Azure. You can test either by backing up to a local disk or by manually backing up to Microsoft Azure Storage using the BACKUP TO URL Transact-SQL statement. For more information, see SQL Server Backup to URL.
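The stripe arithmetic behind these limits can be sketched with a small calculator. This is a back-of-the-envelope estimate, assuming (per the SQL Server Backup to URL documentation) that each write to a block blob consumes one block of MAXTRANSFERSIZE bytes and that a block blob holds at most 50,000 blocks; real backups vary with compression, so treat the result as a lower bound.

```python
import math

BLOCKS_PER_BLOB = 50_000  # a block blob can hold at most 50,000 blocks

def max_blob_bytes(maxtransfersize: int) -> int:
    """Largest backup file one block blob can hold, assuming each
    write made by BACKUP TO URL consumes one block of
    MAXTRANSFERSIZE bytes."""
    return BLOCKS_PER_BLOB * maxtransfersize

def min_stripes(backup_bytes: int, maxtransfersize: int = 4 * 2**20) -> int:
    """Minimum number of stripe files (URLs) so that no single blob
    exceeds the 50,000-block limit."""
    return math.ceil(backup_bytes / max_blob_bytes(maxtransfersize))

gib = 2**30
# 300 GiB backup with MAXTRANSFERSIZE = 4 MB (~195 GiB per blob)
print(min_stripes(300 * gib))            # 2
# same backup with a 1 MB transfer size (~48.8 GiB per blob)
print(min_stripes(300 * gib, 1 * 2**20)) # 7
```

With a small MAXTRANSFERSIZE the per-blob ceiling drops sharply, which is one plausible reason a 300 GB backup can keep failing until it is split into many files.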

    Large SQL Server database backup on an Azure VM and archiving

    Based on the error message, the issue may have occurred due to the credentials; see SQL backup and restore to Azure Blob storage service.
    Install Azure Storage Explorer and add Read/Write permissions in the Shared Access Key.

    Error message: Msg 3013, Level 16, State 1, Line 1

    SQL SERVER – Backup to URL – Script to Generate Credential and Backup using Shared Access Signature (SAS)

    Troubleshooting method 2: Create a new SAS at the storage account level

    • Backup with compression and maxtransfersize
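As a sketch of that suggestion, the statement can be assembled programmatically. The database name and URL below are placeholders; MAXTRANSFERSIZE = 4194304 (4 MB) is the largest value BACKUP accepts and raises the per-blob ceiling to roughly 195 GB.

```python
def backup_to_url_sql(db: str, url: str,
                      maxtransfersize: int = 4 * 2**20) -> str:
    """Build a BACKUP DATABASE ... TO URL statement using the
    COMPRESSION and MAXTRANSFERSIZE options suggested above.
    (db and url are caller-supplied placeholders.)"""
    return (
        f"BACKUP DATABASE [{db}]\n"
        f"TO URL = '{url}'\n"
        f"WITH COMPRESSION, MAXTRANSFERSIZE = {maxtransfersize};"
    )

print(backup_to_url_sql(
    "MyDb", "https://mystorage.blob.core.windows.net/backups/MyDb.bak"))
```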

    Additionally, I would like to share the CHECKSUM, STATS, and stripe options:

    BACKUP (Transact-SQL) - SQL Server | Microsoft Learn
    RESTORE VERIFYONLY (Transact-SQL) - SQL Server | Microsoft Learn


    CHECKSUM: Specifies that the backup operation verifies each page for checksum and torn-page errors, if enabled and available, and generates a checksum for the entire backup.

    Using backup checksums may affect workload and backup throughput.

    By performing a checksum before writing to the media, we can ensure a good SQL backup. SQL Server internally calculates the checksum value for each page; for the same page data, the recalculated value is exactly the same. Once this option is selected, SQL Server writes the CHECKSUM value for each page during the backup.

    During the database restore process, SQL Server recalculates the CHECKSUM value for each page and matches it against the CHECKSUM value written during the backup. If both values match, the database backup is valid.

    STATS: With the STATS option, you can see the percentage of the backup/restore that has completed.

    Stripe: A striped backup is simply a backup of your database spread across multiple files. It is useful when your backup file is too big: with striping, the final size of your backup stays the same, but it is spread across more files.
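The striped backup described above can be generated rather than written by hand, which also answers the "how many files" question once a stripe count has been estimated. A minimal sketch, assuming a hypothetical container URL and file-naming pattern:

```python
def striped_backup_sql(db: str, container_url: str, stripes: int) -> str:
    """Build a striped BACKUP DATABASE statement with one URL target
    per stripe file; the file-name pattern is a made-up convention."""
    urls = ",\n   ".join(
        f"URL = '{container_url}/{db}_{i}.bak'"
        for i in range(1, stripes + 1)
    )
    return f"BACKUP DATABASE [{db}]\nTO {urls}\nWITH CHECKSUM, STATS = 10;"

# e.g. the 9 stripes that eventually worked for the original poster:
print(striped_backup_sql(
    "MyDb", "https://mystorage.blob.core.windows.net/backups", 9))
```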

    Limitations for SQL Managed Instance: Max backup stripe size is 195 GB (maximum blob size). Increase the number of stripes in the backup command to reduce individual stripe size and stay within this limit.

    BACKUP (Transact-SQL) - SQL Server | Microsoft Learn

    Additional information: AzCopy does a good job of copying data in Azure. An AzCopy command can copy the backup files from the VM to a storage account of your choice.

    Hope this helps!
    Kindly let us know if the above helps or you need further assistance on this issue.

    ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.


  2. Akash Bhot 26 Reputation points
    2021-08-31T13:58:01.613+00:00

    Hello, thanks for your prompt response and the information, but my question is:

    How would you find out the total number of stripe files required for my database?

    I manually kept increasing the number of files until 9, at which point it no longer failed with the I/O error and I was able to successfully take the backup to Azure Blob Storage. How do I programmatically find the total number of files required so that the backup won't fail?

    The issue was not due to credentials.


  3. NIJSTEN Stephen 120 Reputation points
    2024-05-28T07:52:44.8066667+00:00

    Hi all,

    I had the same problem with the backup of my DB. I got the same error and fixed it by creating multiple storage containers to do a striped backup of my DB. This resulted in 10 storage containers instead of 2: the DB was 200-300 GB, and I've read that the maximum capacity of a storage container is 200 GB.

    The error described by @Akash Bhot is not caused by wrong credentials/access key/SAS token, because he succeeded after creating multiple storage containers.

    Our question is: how do we actually back up the DB data to one storage container as a single .bak file instead of multiple .bak files (a striped backup)?

    Kind regards,

    Stephen.
