Azure SQL DB reached Data Max Size

Vijay Kumar 2,036 Reputation points
2022-04-27T23:15:46.187+00:00

Hi Team,

One of my Azure SQL databases has almost reached 3.5 TB. We anticipate another 3-4 TB of data growth in the next couple of weeks.

Configuration details:
Compute tier: Provisioned
Hardware configuration: Gen5, up to 80 vCores, up to 408 GB memory

Under this configuration the data max size is 4 TB.
In this case, what are the options?

Azure SQL Database

Accepted answer
  1. Alberto Morillo 34,676 Reputation points MVP Volunteer Moderator
    2022-04-28T01:38:59.813+00:00

    One option is to move to Azure SQL Database Hyperscale, where 100 TB is the size limit.

    On Azure SQL Managed Instance you can store up to 16 TB of data, as stated here.

    You can also decide to leave Azure SQL Database (PaaS) and move back to SQL Server on Azure VMs (IaaS).
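    If you go the Hyperscale route, an existing database can be moved to the tier with a single Azure CLI update. A minimal sketch, assuming hypothetical resource-group, server, and database names (the `HS_Gen5_8` service objective is one example; pick the vCore count to match your workload):

    ```shell
    # Move an existing database to the Hyperscale service tier.
    # Resource names below are placeholders; substitute your own.
    az sql db update \
      --resource-group myResourceGroup \
      --server myserver \
      --name mydb \
      --edition Hyperscale \
      --service-objective HS_Gen5_8
    ```

    Test the migration on a non-production copy first and review the current Hyperscale documentation before converting, since moving back out of the tier has restrictions.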


3 additional answers

Sort by: Most helpful
  1. Oury Ba-MSFT 20,931 Reputation points Microsoft Employee Moderator
    2022-04-28T20:21:10.62+00:00

    @Vijay Kumar
    Agreed with @Alberto Morillo's answer.
    You can use the Hyperscale service tier, which lets a database grow beyond 4 TB and supports up to 100 TB.
    Hyperscale databases aren't created with a defined max size; a Hyperscale database grows as needed, and you're billed only for the capacity you use. For read-intensive workloads, the Hyperscale service tier provides rapid scale-out by provisioning additional replicas as needed to offload read workloads.

    Reference link: What is the Hyperscale service tier? - Azure SQL Database | Microsoft Learn
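    Before migrating, it can help to confirm how much of the allocated space is actually in use. A quick check using standard system catalog views (no tier-specific assumptions):

    ```sql
    -- Allocated vs. used data-file space for the current database.
    -- size and FILEPROPERTY(..., 'SpaceUsed') are reported in 8 KB pages.
    SELECT
        SUM(size) * 8.0 / 1024 / 1024 AS allocated_gb,
        SUM(FILEPROPERTY(name, 'SpaceUsed')) * 8.0 / 1024 / 1024 AS used_gb
    FROM sys.database_files
    WHERE type_desc = 'ROWS';
    ```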

    Regards,
    Oury


  2. G, Divyashree 6 Reputation points
    2023-02-20T06:17:15.78+00:00

    Is there any other option?

    For example, moving the old data to cold storage and retrieving it as and when required?

    Also, compressing the data before storing it in the DB?


  3. WisonHii 81 Reputation points
    2023-02-23T01:59:48.36+00:00

    I had the same problem: one database on BC_Gen5 with 20 vCores reached 3.5 TB.
    Because my database is used for BI reporting/analysis, I applied page compression to the large tables.

    After all the large tables were compressed, the size was down to 1.5 TB.

    Of course, CPU consumption is higher than before.
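    The approach above can be sketched in T-SQL. The table and index names are hypothetical placeholders; `sp_estimate_data_compression_savings` is the standard system procedure for previewing the space savings before committing to a rebuild:

    ```sql
    -- 1) Estimate savings before committing to compression.
    EXEC sp_estimate_data_compression_savings
        @schema_name      = 'dbo',
        @object_name      = 'FactSales',   -- hypothetical large table
        @index_id         = NULL,          -- all indexes
        @partition_number = NULL,          -- all partitions
        @data_compression = 'PAGE';

    -- 2) Rebuild the table (heap or clustered index) with page compression.
    ALTER TABLE dbo.FactSales
        REBUILD WITH (DATA_COMPRESSION = PAGE);

    -- 3) Compress a nonclustered index as well (hypothetical index name).
    ALTER INDEX IX_FactSales_Date ON dbo.FactSales
        REBUILD WITH (DATA_COMPRESSION = PAGE);
    ```

    On Business Critical and Premium tiers you can usually add `ONLINE = ON` to the rebuild options to keep the table available during the operation.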

