SQL CAT New Whitepaper: Data Compression: Strategy, Capacity Planning and Best Practices by Sanjay Mishra, et al.

Data Compression: Strategy, Capacity Planning and Best Practices
Hot off the presses, the SQL CAT team has just published a new whitepaper for which I had the opportunity to provide a technical review.  The author is SQL CAT best practices maven Sanjay Mishra.  Contributors include SQL CAT member Sunil Agarwal and architects Marcel van der Holst & Peter Carlin.  Besides yours truly, tech reviewers were Stuart Ozer, Lindsey Allen, Juergen Thomas, Thomas Kejser, Burzin Patel, Mike Ruthruff, & Prem Mehra of SQL CAT as well as Joseph Sack, Cameron Gardiner, MVP Glenn Berry, Paul Randal (SQLskills.com), & David P Smith (ServiceU Corporation).

Put the Big Squeeze on Your Data


The data compression feature in the Microsoft SQL Server 2008 database software can help reduce the size of the database as well as improve the performance of I/O-intensive workloads. However, extra CPU resources are required on the database server to compress and decompress the data as it is exchanged with the application. Therefore, it is important to understand the workload characteristics when deciding which tables to compress. This white paper provides guidance on the following:

  • How to decide which tables and indexes to compress

  • How to estimate the resources required to compress a table

  • How to reclaim space released by data compression

  • The performance impacts of data compression on typical workloads
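As a flavor of the kind of evaluation the paper walks through, SQL Server 2008 provides the `sp_estimate_data_compression_savings` procedure to preview the space savings before you commit CPU and maintenance time to compressing a table. A minimal sketch (the `dbo.Sales` table name is hypothetical):

```sql
-- Estimate PAGE compression savings for a hypothetical table dbo.Sales.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'Sales',
    @index_id         = NULL,   -- NULL = all indexes on the table
    @partition_number = NULL,   -- NULL = all partitions
    @data_compression = 'PAGE';

-- If the estimate looks worthwhile, rebuild the table with compression
-- enabled; the rebuild is also what releases the freed pages.
ALTER TABLE dbo.Sales REBUILD WITH (DATA_COMPRESSION = PAGE);
```

The whitepaper covers when estimates like these justify the CPU trade-off, and how to plan for the resources the rebuild itself consumes.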

For more information, see Sanjay’s post at the SQL CAT blog and refer to the whitepaper Data Compression: Strategy, Capacity Planning and Best Practices.


Jimmy May, MCDBA, MCSE, MCITP: DBA + DB Dev
Senior Performance Consultant: SQL Server
A.C.E.: Assessment, Consulting, & Engineering Services


The first thing to do in a cardiac arrest is to take your own pulse.
    —The Fat Man, in The House of God by Samuel Shem