PowerShell script: Fileserver report by filetype - memory usage

Ronald 1 Reputation point

I could use some advice with a report script for a fileserver; it creates a report like this:

"Extension";"Size (MB)";"Count"

Which filetype; how much disk space; how many files

I added "\\?\" and "-LiteralPath", plus the long-paths registry key, to get rid of errors about path length (the rest was written by someone else):

$directory = "\\?\D:\"  
(Get-Culture).NumberFormat.NumberDecimalSeparator = ','  
Get-ChildItem -LiteralPath $directory -Recurse | Where-Object { !$_.PSIsContainer } | Group-Object Extension | Select-Object @{n="Extension";e={$_.Name -replace '^\.'}}, @{n="Size (MB)";e={[math]::Round((($_.Group | Measure-Object Length -Sum).Sum / 1MB), 2)}}, Count | Export-CSV -Path C:\scripts\file-server-reports\viekdi16p-2022oct.csv -Delimiter ';' -NoTypeInformation  
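For reference, the long-paths registry change mentioned above can be made like this; this is the standard `LongPathsEnabled` value (assuming Windows Server 2016 / Windows 10 version 1607 or later, where Win32 long paths are supported):

```powershell
# Enable Win32 long paths system-wide (requires elevation;
# already-running processes may need a restart to pick it up)
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' `
    -Name 'LongPathsEnabled' -Value 1 -Type DWord
```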

When running that script, its size in RAM grows constantly, and it can use up all the available memory on the server.
Is there a way to optimize this? Maybe write the intermediate data to disk instead of RAM? It would take longer, but disk space is no issue compared to several GB of RAM.


1 answer

  1. Rich Matheisen 45,111 Reputation points

    Group-Object has to collect the entire output of Get-ChildItem before it can emit its groups. On a large file server, that will probably overwhelm memory.

    Also, reducing each file's size to increments of 1 MB and rounding before accumulating the total would result in dramatic under-reporting, especially when there are large numbers of files whose sizes are under 1 MB.
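    To see that rounding effect with hypothetical numbers: 10,000 files of 0.4 MB each round to 0 MB individually, so rounding per file before summing would report 0 MB instead of 4,000 MB. Summing the raw byte lengths first and rounding once keeps the total accurate:

    ```powershell
    $sizes = @(0.4MB) * 10000     # hypothetical: 10,000 files of 0.4 MB each
    # Rounding each file to whole MB before summing loses everything that rounds to 0:
    ($sizes | ForEach-Object { [math]::Round($_ / 1MB) } | Measure-Object -Sum).Sum    # 0
    # Summing raw bytes first, then rounding once, is accurate:
    [math]::Round(($sizes | Measure-Object -Sum).Sum / 1MB)                            # 4000
    ```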

    This uses a hashtable to accumulate the values and doesn't reduce the file sizes until it's time to produce the report. It should reduce the memory load by quite a bit (since the number of distinct file extensions is considerably smaller than the number of files!).

    $directory = 'c:\junk'  
    (Get-Culture).NumberFormat.NumberDecimalSeparator = ','  
    $ext = @{}  
    Get-ChildItem -LiteralPath $directory -File -Recurse |  
        ForEach-Object {  
            $e = $_.Extension -replace "^\.", ""  
            $s = $_.Length  
            if ($ext.ContainsKey($e)) {  
                # extension already seen: add to its running (size, count) pair  
                $ext[$e] = ( ($ext[$e][0] + $s), ($ext[$e][1] + 1) )  
            }  
            else {  
                # first file with this extension: (total size, file count)  
                $ext[$e] = ($s, 1)  
            }  
        }  
    $ext.GetEnumerator() |  
        ForEach-Object {  
            [PSCustomObject]@{  
                Name        = $_.Key  
                'Size (MB)' = [math]::Round($_.Value[0] / 1MB, 2)  
                Count       = $_.Value[1]  
            }  
        } | Export-CSV -Path C:\junk\viekdi16p-2022oct.csv -Delimiter ';' -NoTypeInformation  