Server 2025 ReFS Dedup job consumes all memory and hangs server

micce 26 Reputation points
2025-02-20T12:56:33.94+00:00

Hi!

Server 2025 with Hyper-V role and a couple of guests.

Got a 12TB volume with ReFS. 10TB allocated.

ReFS dedup was enabled with this:

Enable-ReFSDedup -Volume L: -Type Dedup

But starting a dedup job consumes all memory and ends up hanging the server.

This is after starting the job. I stopped it shortly before it exhausted the memory, which is what prevents the hang. If I let it run, everything comes to a halt and processes start failing because of lack of memory.

Start-ReFSDedupJob -Volume L: -CpuPercentage 30
Stop-ReFSDedupJob -Volume L:

How can I limit the dedup process to work with the memory it has available, instead of taking down the server?

Memory running out with refs dedup job

Get-Volume -DriveLetter L | fl
ObjectId             : {1}\\HV\root/Microsoft/Windows/Storage/Providers_v2\WSP_Volume.ObjectId="{baef148e-8ffc-11eb-a21e-806e6f6e6963}:VO:\\?\Volume{4385d0fb-a232-444c-a2e0-8a7f06507baa}\"
PassThroughClass     :
PassThroughIds       :
PassThroughNamespace :
PassThroughServer    :
UniqueId             : \\?\Volume{4385d0fb-a232-444c-a2e0-8a7f06507baa}\
AllocationUnitSize   : 4096
DedupMode            : Disabled
DriveLetter          : L
DriveType            : Fixed
FileSystem           : ReFS
FileSystemLabel      : p02v03sim-ded
FileSystemType       : ReFS
HealthStatus         : Healthy
OperationalStatus    : OK
Path                 : \\?\Volume{4385d0fb-a232-444c-a2e0-8a7f06507baa}\
ReFSDedupMode        : Dedup
Size                 : 13194072424448
SizeRemaining        : 1460263964672
PSComputerName       :


Get-ReFSDedupStatus L: | fl
Volume           : L:
Enabled          : True
Type             : Dedup
Status           : Dedup Cancelled
Used             : 10,67 TiB
Deduped          : 45,7 GiB
ScannedOnLastRun : 11,07 GiB
DedupedOnLastRun : 953,93 MiB
FullRun          : False
LastRunTime      : 2025-02-20 13:32:22
LastRunDuration  : 01m:09.681s
NextRunTime      : N/A


Windows Server

1 answer

  1. Ian Xue-MSFT 40,901 Reputation points Microsoft External Staff
    2025-02-24T07:15:41.94+00:00

    Hi micce,

    Thanks for your post. Generally speaking, if you are using deduplication, there is a rule of thumb for how much RAM you need: roughly 1 GB of RAM for each 1 TB of data for deduplication to run smoothly. If this server has been in place for a while, I wonder if you have hit a data-to-RAM ratio that it can no longer handle.
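    Applying that guideline to your volume, the ~10.67 TiB used would call for roughly 11 GB of RAM headroom for the dedup job. A quick sketch to estimate it from the volume itself (the 1 GB-per-1 TB figure is the guideline above, not a hard limit enforced by the job):

    ```powershell
    # Estimate RAM headroom for a ReFS dedup job, using the
    # ~1 GB RAM per 1 TB of data rule of thumb.
    $vol    = Get-Volume -DriveLetter L
    $usedTB = ($vol.Size - $vol.SizeRemaining) / 1TB
    $ramGB  = [math]::Ceiling($usedTB)   # round up to whole GB
    "Used: {0:N2} TB -> recommended RAM headroom: ~{1} GB" -f $usedTB, $ramGB
    ```

    If the server cannot spare that much memory while the Hyper-V guests are running, consider scheduling the job for a window when guest memory pressure is lower.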

    Reference: Optimize storage with ReFS deduplication and compression in Azure Local - Azure Local | Microsoft Learn

    Best Regards,

    Ian Xue


    If the Answer is helpful, please click "Accept Answer" and upvote it.

