Shrink and reclaim space for a very large database

Malkesh Sheth 46 Reputation points
2022-12-22T04:54:07.207+00:00

Hello,

We have a very large OLTP database of 30 TB, of which only 8 TB is used. We want to reclaim the free space because this runs on cloud infrastructure and we want to save on cost. Since this is an Availability Group setup, we could reclaim around 60 TB in total across the replicas. This database also has a very large replication setup and acts as a publisher. A few tables are 1-4 TB in size. The plan is to get a new, smaller LUN, move all objects and their data over there, and then remove the old LUN. Can someone guide me on the best way to do this? Thanks in advance.

SQL Server | Other

Accepted answer
  1. Bjoern Peters 8,921 Reputation points
    2022-12-22T09:26:32.417+00:00

    Hi @Malkesh Sheth

    Those are heavy operations that will cause a long-running (additional) workload on your SQL Server.

    I would recommend an additional step while restructuring the database: create additional data files (in their own filegroup) for those big tables. That separates them from the other tables and makes maintenance on those files much easier; see the sketch below.
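
    A minimal sketch of that idea, assuming a database named YourDb, a new volume mounted under E:\SQLData, and a big table dbo.BigTable whose clustered index is PK_BigTable on column Id (all of these names are placeholders, not taken from the question):

        -- Add a dedicated filegroup plus a data file on the new volume
        ALTER DATABASE YourDb ADD FILEGROUP FG_BigTables;
        ALTER DATABASE YourDb
            ADD FILE (NAME = N'YourDb_BigTables1',
                      FILENAME = N'E:\SQLData\YourDb_BigTables1.ndf',
                      SIZE = 100GB, FILEGROWTH = 10GB)
            TO FILEGROUP FG_BigTables;

        -- Move a big table by rebuilding its clustered index onto the new filegroup
        -- (ONLINE = ON requires Enterprise edition)
        CREATE UNIQUE CLUSTERED INDEX PK_BigTable
            ON dbo.BigTable (Id)
            WITH (DROP_EXISTING = ON, ONLINE = ON, SORT_IN_TEMPDB = ON)
            ON FG_BigTables;

    Once the biggest tables live in their own filegroup, the original data files are mostly empty, which makes the later EMPTYFILE shrink much cheaper.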

    Recommended steps to be performed (a T-SQL sketch of these commands follows the list):

    • create FULL backup
    • pause data movement in AG
    • pause replication jobs
    • shrink datafiles with EMPTYFILE
    • reorg/rebuild indexes
    • update stats
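
    A sketch of those steps in T-SQL, assuming the database is named YourDb, the logical name of the old data file you want to drain is YourDb_Data2, and the replication agent job names are looked up on your server first (every name and path here is a placeholder, adjust before use):

        -- 1. Full backup
        BACKUP DATABASE YourDb
            TO DISK = N'X:\Backup\YourDb_Full.bak'
            WITH COMPRESSION, CHECKSUM, STATS = 5;

        -- 2. Pause AG data movement for this database (resume later with SET HADR RESUME)
        ALTER DATABASE YourDb SET HADR SUSPEND;

        -- 3. Pause replication by stopping its agent jobs (job names differ per setup)
        EXEC msdb.dbo.sp_stop_job @job_name = N'<LogReaderAgentJob>';
        EXEC msdb.dbo.sp_stop_job @job_name = N'<DistributionAgentJob>';

        -- 4. Drain the old data file, then remove it once it is empty
        DBCC SHRINKFILE (N'YourDb_Data2', EMPTYFILE);
        ALTER DATABASE YourDb REMOVE FILE YourDb_Data2;

        -- 5. Rebuild the indexes fragmented by the shrink (repeat per table as needed)
        ALTER INDEX ALL ON dbo.BigTable REBUILD WITH (ONLINE = ON, SORT_IN_TEMPDB = ON);

        -- 6. Update statistics
        EXEC sp_updatestats;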

    Regarding "blocking" you can find some advice in the documentation.
    https://learn.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-shrinkfile-transact-sql?view=sql-server-ver16#examples

    I hope my answer is helpful to you,

    Your
    Bjoern Peters

    If the reply was helpful, please upvote and/or accept it as an answer, as this helps others in the community with similar questions. Thanks!


1 additional answer

  1. NikoXu-msft 1,916 Reputation points
    2022-12-22T08:47:40.133+00:00

    Hi @Malkesh Sheth,

    It is going to take some time for shrink operations on database files this large to run. Honestly, you should consider running the DBCC command and letting it run for as long as you can, on days and at times you can afford. Just let it do its thing; one way to break the work into smaller, affordable runs is sketched below.
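
    A common way to keep each run inside a window you can afford (not spelled out above, just a sketch) is to shrink toward the target size in smaller steps; any completed work is kept if you stop a DBCC SHRINKFILE run. The logical file name YourDb_Data and the target sizes (in MB) below are placeholders:

        -- Shrink toward the final size in several smaller steps
        DBCC SHRINKFILE (N'YourDb_Data', 9500000);  -- ~9.5 TB
        DBCC SHRINKFILE (N'YourDb_Data', 9000000);  -- ~9.0 TB
        DBCC SHRINKFILE (N'YourDb_Data', 8500000);  -- ~8.5 TB

        -- Check allocated vs. used space per file between runs
        SELECT name,
               size / 128 AS size_mb,
               FILEPROPERTY(name, 'SpaceUsed') / 128 AS used_mb
        FROM sys.database_files;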

    Best regards,
    Niko

    ----------

    If the answer is the right solution, please click "Accept Answer" and kindly upvote it. If you have extra questions about this answer, please click "Comment".
    Note: Please follow the steps in our Documentation to enable e-mail notifications if you want to receive the related email notification for this thread.

