Data deduplication optimization job is being cancelled in a few seconds

kashif Nazeer 21 Reputation points
2020-11-02T07:18:31.797+00:00

Hi Everyone,

On a Windows Server 2012 machine, the Data Deduplication optimization job for a particular drive is getting cancelled within a few seconds, so no further deduplication is happening.
Below are the relevant event logs and the actions performed so far.

Event ID 10242
Optimization reconciliation has completed.

Volume: D: (\\?\Volume{0af60788-0b20-46b6-bb36-0b48bcbde9ac})
Reconciled containers: 176
Unreconciled containers: 61
Merged containers: 0
Total reconciled references: 0
Error code: 0x0
Error message: NULL

> Even though new data is being copied, the "Reconciled" and "Unreconciled" counts stay the same.


Event ID 6153

Optimization job has completed.

Volume: D: (\\?\Volume{0af60788-0b20-46b6-bb36-0b48bcbde9ac})
Error code: 0x8056533D
Error message: The operation was cancelled.

Savings rate: 49
Saved space: 10464533387556
Volume used space: 10765877268480
Volume free space: 1949236043776
Optimized file count: 10731610
In-policy file count: 10955938
Job processed space (bytes): 0
Job elapsed time (seconds): 536
Job throughput (MB/second): 0


Event ID 4143

Data Deduplication cancelled job type "Optimization" on volume "\\?\Volume{0af60788-0b20-46b6-bb36-0b48bcbde9ac}\". Memory resource is running low on the machine or in the job.

Event ID 10243
Optimization job on volume D: (\\?\Volume{0af60788-0b20-46b6-bb36-0b48bcbde9ac}) was configured with insufficient memory.

System memory percentage: 90
Available memory: 7371 MB
Minimum required memory: 8340 MB

> Earlier it was configured with 25% and I increased it to 90%, but it still complains of insufficient RAM.
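
For context, changing the memory percentage and re-running the job manually looks roughly like this with the in-box Deduplication cmdlets (a sketch, not the exact session):

    # Check current dedup status and savings on the volume
    Get-DedupStatus -Volume D: | Format-List

    # Re-run optimization manually with a higher memory percentage
    Start-DedupJob -Volume D: -Type Optimization -Memory 90

    # Watch the job until it completes or is cancelled
    Get-DedupJob -Volume D: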


Event ID 8244
Error terminating job host process for job type "Optimization" on volume "D:" (process id: 4400).
0x80070005, Access is denied.

> Still trying to figure this one out.

Any hints on how to solve this issue?


3 answers

  1. Xiaowei He 9,891 Reputation points
    2020-11-03T08:39:48.687+00:00

    Hi,

    1. Please check that the volume you want to optimize has free space.
    2. Please check that there is enough RAM for the job. By default, the server limits the RAM used by the optimization job to 50% of the total RAM. So if the server has just 4 GB of RAM, only 2 GB is available to the optimization job. You can override this manually (a short sketch follows the table below):

    Start-DedupJob <volume> -Type Optimization -Memory <50 to 80>

    Usually, 1-2 GB RAM is used per 1 TB of data per volume. For example:

    Volume                  Volume size     Memory used
    Volume 1                1 TB            1-2 GB
    Volume 2                1 TB            1-2 GB
    Volume 3                2 TB            2-4 GB
    Total for all volumes   4 TB (1+1+2)    4-8 GB RAM
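
    Putting the two checks above together, here is a rough sketch (assuming the affected volume is D: and the in-box Deduplication cmdlets are available):

    # 1) Check that the volume still has free space
    Get-Volume -DriveLetter D | Select-Object DriveLetter, Size, SizeRemaining

    # 2) Start a manual optimization job with an explicit memory percentage
    Start-DedupJob -Volume D: -Type Optimization -Memory 50

    # Watch the job's progress
    Get-DedupJob -Volume D: | Format-Table Type, State, Progress -AutoSize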

    https://aidanfinn.com/?p=15866

    (Please note: Information posted in the given link is hosted by a third party. Microsoft does not guarantee the accuracy and effectiveness of information.)

    Thanks for your time!
    Best Regards,
    Anne

    -----------------------------

    If the Answer is helpful, please click "Accept Answer" and upvote it.

    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.


  2. kashif Nazeer 21 Reputation points
    2020-11-04T05:51:56.303+00:00

    Hi Anne,

    The recommended memory allocation for the dedup operation is 25 to 50% of total RAM; I tried increasing it to 90%, but the result was the same.

    Before the dedup job starts, the system is using approximately 30% of RAM; within a few minutes of the job running it reaches 95% and the job fails.

    I can understand that the RAM is not enough, as the MS documentation says the system needs 1 GB of RAM for each 1 TB of data to dedup; by that calculation I would need about 12 GB of free RAM.

    The reason I started this thread: the server has been working with the same amount of RAM (8 GB) for several years, and for the last 6 months the data size has been almost the same. Dedup was working perfectly fine, but for the last 30-40 days it has been failing, so I am trying to figure out whether this is a memory leak issue.
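
    A simple loop like the one below can capture that memory climb to a log while the job runs (a rough sketch; the log path is just an example):

    # Log available memory and the dedup job state every 30 seconds
    while ($true) {
        $availMB = (Get-Counter '\Memory\Available MBytes').CounterSamples[0].CookedValue
        "{0}  Available MB: {1}" -f (Get-Date), [int]$availMB |
            Out-File C:\Temp\dedup-mem.log -Append
        Get-DedupJob -Volume D: | Format-Table Type, State, Progress -AutoSize
        Start-Sleep -Seconds 30
    }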


  3. Xiaowei He 9,891 Reputation points
    2020-11-05T02:08:48.933+00:00

    Hi,

    > The reason I started this thread: the server has been working with the same amount of RAM (8 GB) for several years, and for the last 6 months the data size has been almost the same. Dedup was working perfectly fine, but for the last 30-40 days it has been failing, so I am trying to figure out whether this is a memory leak issue.

    According to the error message, and in theory, this should be an insufficient-memory issue. Even though it worked before, as files are added, deleted, modified, opened and closed over time, the situation can change.

    While it is understandable that you suspect a memory leak, the easiest way to check for one is to monitor whether available memory keeps decreasing over time. If you want a definite answer about a memory leak, it's recommended to open a case with MS to troubleshoot it.
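
    A quick way to watch that without extra tools is to take periodic snapshots of available memory and the largest working sets, for example (a rough sketch, run from an elevated PowerShell prompt):

    # Available physical memory right now (MB)
    (Get-Counter '\Memory\Available MBytes').CounterSamples[0].CookedValue

    # Ten processes with the largest working sets
    Get-Process | Sort-Object WorkingSet64 -Descending |
        Select-Object -First 10 Name, @{n='WorkingSetMB'; e={[int]($_.WorkingSet64 / 1MB)}}

    If available memory keeps shrinking while no process working set grows, the leak may be in kernel pool memory, which the Memory\Pool Nonpaged Bytes counter in Performance Monitor can help confirm.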

    Below is the link to open a case with MS:

    https://support.microsoft.com/en-us/gp/customer-service-phone-numbers

    Thanks for your time!
    Best Regards,
    Anne

    -----------------------------

    If the Answer is helpful, please click "Accept Answer" and upvote it.

    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.

