The Desktop Files: The Truth about Defragmentation

Wes Miller

As long as there have been hard disks, there has been disk fragmentation. If you're concerned about your systems at all, you should be defragging them regularly. This probably reminds you of that flossing conversation your dentist has with you every year—you know the one. Well, defragmentation is just as important.

Unless your computer is sitting in the corner always turned off, it's getting more and more fragmented. In a moment, I'll show you how this happens, but first let's go over a little defragmentation history.

Defragmentation and Windows NT

Windows NT® didn't have a built-in defragmentation utility, and conventional wisdom at the time was that you didn't need one. After all, NTFS was built with optimization in mind, and it did not suffer from fragmentation as dramatically as a comparable FAT volume might have. But over time, even NTFS performance can suffer due to less-than-optimal file placement. This gave birth to a thriving market of third-party defragmentation tools. One of the most popular products among IT pros was the Executive Software (now Diskeeper Corporation) Diskeeper product. When Windows 2000 incorporated defragmentation functionality, Microsoft licensed the technology from Executive Software. The Windows® 2000 defragmentation functionality was available in its own Microsoft® Management Console (MMC) snap-in (dfrg.msc), which could be invoked by clicking on a volume or by manually running defrag.exe from the command line. Though no scheduling and only a constrained command line were available, many IT pros still created functional scripts to automate defragmentation using this built-in version. Defrag.exe in Windows XP and Windows Server® 2003 improved a bit upon the earlier version, but only provided limited automation—you still had to script it.

A number of vendors, including Winternals Software (my previous employer), produced software for the enterprise defragmentation space. Most of these products work the same way, with a centralized console, an agent of some type that lives on client systems (sometimes all the time, sometimes only on demand), and a structure for dictating when and how a defragmentation task runs.

Windows Vista® largely continues the trend of single-system defragmentation, while optimizing the defragmenter and including options such as defragmenting small file fragments (64MB or smaller), which speeds up defragmentation but leaves many larger file fragments across the disk. The defragmentation engine itself is more powerful in Windows Vista and Windows Server 2008 than in earlier versions of Windows. It also runs automatically and does not require any manual scheduling. But if you run a large organization or want to optimize your systems in ways that defrag may not provide by default, you may still need to license third-party software. And as in earlier versions, you may have difficulty defragmenting a volume where NTFS compression is enabled and used extensively. A third-party tool may be helpful in such a scenario as well.

How Does Fragmentation Occur?

Fragmentation occurs because files change over time. Ideally, Windows system files themselves don't change, or at least not very often: really only when service packs and software updates are installed. But after a large update such as a service pack installation, some fragmentation is inevitable, since the files being updated usually cannot be overwritten in place on disk (and some may require a reboot before they can be replaced).

User files and data (and the Windows registry), on the other hand, are subject to constant change. Continual reading, writing, editing, copying, and deleting of files causes a great deal of fragmentation, especially as the drive fills up.

To visualize this, imagine a perfectly arranged disk in which files 1 through N occupy space contiguously, with no gaps between them. Suppose you open file 2, edit it so that it grows, and save it. Windows either has to rewrite the file in its entirety somewhere else on the disk (imagine after file N) or append just the new data after file N. Either way, file 2 is now fragmented. If you then edit file 1, you'll have two fragmented files. Repeat this over time and you end up with a considerably fragmented disk, where the drive has to seek to multiple locations to read a single file. With large database files, expansive hard disk files for virtual computing technologies, and copious amounts of video and audio stored on the average disk, fragmentation is common. Of course, the ever-larger size of current hard disks makes fragmentation less of a problem, but it doesn't make it go away. And older laptops with smaller hard disks will become less and less responsive, slower to boot, and take longer to open and save files.
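To make the mechanics concrete, here is a toy Python model of that scenario. It is purely illustrative (real NTFS allocation is far more sophisticated): the "disk" is a list of blocks, files are written into the first free blocks found, and a file that grows must place its new blocks beyond the last file.

```python
# Toy model of how in-place file growth causes fragmentation.
# A "disk" is a list of block owners; growing a file forces its
# new data into whatever free blocks remain, creating a second run.

def allocate(disk, file_id, nblocks):
    """Place nblocks for file_id in the first free blocks found."""
    placed = 0
    for i, owner in enumerate(disk):
        if owner is None:
            disk[i] = file_id
            placed += 1
            if placed == nblocks:
                return
    raise RuntimeError("disk full")

def fragments(disk, file_id):
    """Count contiguous runs of blocks belonging to file_id."""
    runs, prev = 0, None
    for owner in disk:
        if owner == file_id and prev != file_id:
            runs += 1
        prev = owner
    return runs

disk = [None] * 20
allocate(disk, "file1", 5)   # blocks 0-4
allocate(disk, "file2", 5)   # blocks 5-9
allocate(disk, "fileN", 5)   # blocks 10-14

# file2 grows by 3 blocks; the only free space is after fileN,
# so file2 now exists as two separate runs on the disk.
allocate(disk, "file2", 3)   # blocks 15-17

print(fragments(disk, "file1"))  # 1 (still contiguous)
print(fragments(disk, "file2"))  # 2 (runs at 5-9 and 15-17)
```

Run the model a few more times with additional edits and the fragment counts only climb, which is exactly the pattern described above.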

The key concern isn't necessarily how fragmented the system is, but how fragmented the individual files are. Ever since the first disk defragmenters arrived on the scene with multicolored visuals, users have looked at them and thought, "I want a solid bar of color." But actually that is the last thing you want. Some products try to make their disks appear as two blocks: one colored (file data) and one generally white (free space). Unfortunately, this is one of the worst things you can do to a disk if your intention is to minimize the frequency and cost of defragmentation. If you've aggressively packed all file data to the front of the disk and removed the free space between files, then any additional edits to a file will have to be written beyond the end of that packed data. By defragmenting in that way, you've actually guaranteed new fragmentation, beginning with the very next file edit. So instead of two blocks of neatly organized file data, a good defragmentation results in an image that is not necessarily perfect. For an example, see Figure 1 for my test system before defragmentation. Then look at Figure 2 to see the same system after Windows had completed defragmentation.

Figure 1 My drive before defragging


Figure 2 My drive after defragging


Note that Windows focuses not on how file data is distributed across the disk, but on the fragmentation of individual files (see Figure 3 for the Most Fragmented Files section of the defrag report, taken before I defragmented this system).

Figure 3 Defragmentation report


As a result, even after defragmentation, Windows will not show you a completely compacted set of file data at the front of the disk. Also note the large section of green data in both Figures 1 and 2. That is the Windows pagefile, which I'll discuss shortly. If you have hibernation enabled on your system, the hibernation file will show up similarly.

Regular Defragmentation Is Essential

Just as with flossing, the main thing to remember is to defragment on a regular basis. It becomes even more important to defragment frequently on high-use systems such as servers. If you wait, the process may take so long that it leaves your server unavailable or unresponsive; defragmentation can take a considerable amount of time, and certain phases of it can be CPU-intensive as well.

Even the best online defragmenters can't defragment everything. For example, they can't always defragment open or locked files (such as the pagefile or registry files) and must work around the hibernation file if it exists (which is roughly as large as the physical RAM in your system). To optimize locked files, see the discussion of additional tools later in this column.

I think it's important to understand the myths and facts associated with defragmentation. Figure 4 and Figure 5 list some common topics I've heard brought up when discussing defragmentation and explain the real chances of these problems occurring.

Figure 5 Myths about defragmenting

Myth Likelihood
Systems will crash. Not likely—Crashes are usually due to driver or hardware issues (memory corruption, thermal issues causing reboots, or disk failure due to disk thermal abuse).
Applications will crash. Not likely—As with the system as a whole, a properly engineered application should not crash due to fragmentation; unless fragmentation can be confirmed as the cause, you shouldn't assume it is.
Registry corruption will occur. Not likely (see above)—Fragmentation does not cause or lead to registry corruption.
Defragmentation should result in a solid bar in any visual representation of the disk. Not likely—You should not generally squeeze all files on the system together unless you are specifically planning to image the system with a sector-based imaging tool that doesn't exclude free space, or to shrink or split the partitions. There is no need to pack the files to the front of the disk; aggressive file consolidation leads only to further fragmentation.

Figure 4 Common concerns about defragmenting

Concern Likelihood
System performance will suffer. High—The extent depends on how heavily the system is used.
Recovering accidentally deleted files may become difficult or impossible. High—Most file recovery tools depend on heuristics (to determine the data type) and file cluster proximity. Scattered files result in a more complex or impossible recovery operation.
It will be costly to defragment the system. Medium—There is a direct correlation between how much the system is used and how much files change on the system over time. If users are aggressively editing files but saving them only to server shares, for example, you will not see much degradation over time.
Disk damage will occur. Low—Again, the more heavily you use a system (leaving it on with disks powered up all of the time, constantly editing, modifying, and deleting files), the more likely this becomes. But with today's hardware, you should not see damage occur simply due to file editing.

Defragmentation as a Part of Your Infrastructure

You should seriously consider running a defragmenter on a regular basis on all of your systems. Your options are to use the defragmenter built into Windows or to look into third-party options. Basically, it comes down to how much engineering you want to do. Windows XP and later versions let you add defrag.exe as a Scheduled Task, and a Microsoft Knowledge Base article walks through how. Note that laptop defragmentation comes with some special considerations. You want to defragment a user's system when it is powered on, ideally on AC power, but you also don't want to consume CPU or disk cycles while a mobile user is attempting to get work done. Unfortunately, when a mobile user isn't working on his system, odds are it's disconnected from AC power and either in standby or hibernation (which is powered off from the perspective of power management and Scheduled Tasks). At Winternals (as at most vendors, I suspect), we spent quite a bit of time trying to design for the ideal mobile user scenario. If you're using Windows Vista, the built-in defragmenter combined with Scheduled Tasks, which has much better power awareness than in earlier versions of Windows, may well be sufficient. If not, a third-party solution for mobile systems may be necessary. Some of these will actually kick off a defragmentation task and then report that it ran, when it ran (and how often), and how long it took.
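As a sketch of the Scheduled Task approach on Windows XP or Windows Server 2003, the schtasks utility can register a recurring defragmentation pass. The task name, volume, day, and time below are illustrative; adjust them for your environment, and verify the exact /st time format your Windows version expects.

```shell
rem Register a weekly defragmentation of C: that runs as SYSTEM
rem every Sunday at 1:00 AM. The -f switch forces defragmentation
rem even when free space on the volume is low.
schtasks /create /tn "Weekly Defrag C" /tr "defrag.exe c: -f" /sc weekly /d SUN /st 01:00 /ru SYSTEM
```

Run `schtasks /query` afterward to confirm the task was registered and to see when it will next run.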

In addition to mobile scenarios, you will want to think about fragmentation above the file system. This isn't something you'll generally need to be concerned about on desktop or mobile systems, but server applications such as Microsoft Exchange and SQL Server™ maintain data stores that may require their own defragmentation. In such cases you'll need to decide whether that should be done before or after you've performed a file system defragmentation.

If you are using a third-party defragmenter, you'll want to ensure that it uses the Windows defragmentation API. Since Windows NT 4.0, Windows has had its own defragmentation API to prevent unexpected data loss. For example, if you're in the middle of moving a file and your system loses power, the last thing you want is an incomplete fragment move, which could result in data loss and even an unbootable system.

Finally, if you are using any virtual computing technology, you may want to consider how you manage the defragmentation of your virtual disks as well. They are just as likely to have fragmentation as a physical spindle, and in many senses may be worse due to size constraints. Plus, if your virtualization solution allows you to shrink disks or partitions, you may need to defragment first in order to free up space (or at a minimum, reduce the amount of time that a shrink operation will take).

Additional Tools

By now, you're surely convinced that defragmentation is an important and valuable function. Even if you do not use a third-party tool for defragmentation itself, you may want to consider a tool by Mark Russinovich called PageDefrag, which you can get from the Microsoft Sysinternals site. As I mentioned earlier, the Windows pagefile cannot be defragmented while Windows is online. In fact, the same is true of the Windows registry files and the event logs. Using a small driver, PageDefrag defragments these files early in the boot process, before Windows has locked them. To use it, simply run PageDefrag and set it to execute either on the next reboot or on every reboot. When the system reboots (at the point in the boot process where you may have seen a disk check run before), it will defragment the files specified in the list (see Figure 6).

Figure 6 Files to defragment in PageDefrag


Once PageDefrag has completed, the system will finish booting normally. If you're looking for more on PageDefrag, Lance Whitney covered it in the September installment of Utility Spotlight. Note that PageDefrag is only supported on Windows NT 4.0, Windows 2000, Windows XP, and Windows Server 2003; it is not currently supported on Windows Vista.

In addition to PageDefrag, Sysinternals offers another tool called Contig, which allows you to defragment a specific file manually. This can be useful if a particular file could not be defragmented during a normal defragmentation task, or if it has become fragmented since (while the remainder of the volume has not).

Contig can also defragment an entire directory tree by combining its recursive switch with wildcards. The following will defragment all files under the Windows directory and its subdirectories.

Contig -s c:\Windows\*.*

Applying the -v switch will make the operation verbose, and -q will make it run quietly.

Wes Miller is a Development Manager at Pluck in Austin, Texas. Previously, he worked at Winternals Software in Austin and at Microsoft as a Program Manager and Product Manager for Windows.

© 2008 Microsoft Corporation and CMP Media, LLC. All rights reserved; reproduction in part or in whole without permission is prohibited.