Providing Defense in Depth for Your Desktop Deployment Projects

Viewpoint

By Jeremy Chapman
Senior Product Manager, Windows Vista Deployment


I have been writing about and evangelizing deployment solutions for about five years now, and one perception that seems to persist is that “deployment” is a finite task concerned only with automating an operating system installation. I like to think of that stage as the climax of the project rather than the project itself. Before you can deploy desktops to users on any broad level, you have to assess your current situation, identify your goal, evaluate and plan how to get there, and develop everything you need; only once everything is validated and ready can you start to deploy. Typically, these preparation steps take much longer than rolling desktop images out en masse to your users. Among the predeployment preparation steps are several security considerations and deliverables that affect both the deployment project and the entire PC life cycle.

When considering security in the context of desktop deployment, there are three primary areas of focus:

  1. What goes on the PC itself?

  2. What happens to the management infrastructure?

  3. How do you secure the automated deployment infrastructure?

These areas are highlighted by the Defense-in-Depth Security Model and the Microsoft Infrastructure Optimization Model. An effective desktop security plan should address every layer, every person, every device, and every application. The desktop is the main user touchpoint into your IT services, and for most organizations it is a primary storage place for sensitive information, so the policies securing the desktop are extremely important to the overall security of your organization and its data.

Figure 1. Defense-in-Depth Security Model

The Defense-in-Depth Security Model does an excellent job of highlighting the strategies used in each defense layer. For more information about the Defense-in-Depth Security Model, check out the Defense-in-Depth webcast series. The Infrastructure Optimization Model highlights technology capabilities along the continuum of IT organizational maturity levels (Basic, Standardized, Rationalized, and Dynamic).

To give you an idea of what each Infrastructure Optimization level brings to desktop security, I’ll summarize them briefly. A Basic organization has no security policies or standards applied to the desktop. A Standardized organization adds limited directory-based authentication, managed patching, and maintained anti-malware protection. The Rationalized level adds configuration policy enforcement, local firewalls, secure remote access to network resources, certificate services, and secure wireless networking. The Dynamic level adds network quarantine and integrated threat management.

The five fundamental practices highlighted by the Infrastructure Optimization Model with the most impact on reducing complexity and costs related to desktop deployment and service management are:

  • Standardizing desktop strategy and minimizing images

  • Implementing comprehensive security and compliance tools

  • Automating software distribution

  • Virtualizing applications and delivering as an on-demand service

  • Centrally managing PC settings and configurations

Combined with the right software tools, well-defined policies, and documented processes, these practices will help you achieve a well-managed, secure IT infrastructure and streamline your management efforts. Bringing this back to your desktop deployment project and desktop security strategy, your organization should select the level of security that its own circumstances and needs require; there is no one-size-fits-all approach to desktop security. For the purposes of this article, I will concentrate on Infrastructure Optimization attributes for organizations at the Rationalized level. Chances are, if you are reading this, you are already at or near the Rationalized level with respect to security.

What Goes on the PC Itself?

If you are familiar with the Microsoft Deployment process or Business Desktop Deployment 2007, you know there are several task areas (or feature teams) that drive or facilitate what goes into the operating system image or is installed at deploy time.

Figure 2. Microsoft Deployment Process

The Microsoft Deployment process is not a straight step-by-step routine. The spokes of the wheel above point to the feature teams responsible for the main tasks in the deployment project. These teams usually work simultaneously and collaborate across team boundaries, and that is especially true of the Security team. Starting in the center of the process wheel, the Security team begins by feeding requirements into the business case and project plan, helping decide whether to go through with the operating system deployment and how the process should be scoped and executed.

Starting from the Testing Process and moving clockwise, the Security team works with the test team to define appropriate test cases. Moving to Infrastructure Remediation, all areas of back-end, network, and client infrastructure are reviewed with the Security team. We’ll cover the deployment infrastructure later in the article, but this task involves ensuring that elements like the Active Directory Group Policy infrastructure are in place, that the public key infrastructure will support the new systems, and that the deployment infrastructure can pass user state securely and reliably from PCs to servers. And if you are planning to use Windows BitLocker Drive Encryption, you may want to add a Trusted Platform Module (TPM) version 1.2 to your client hardware specification.
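
For example, here is a minimal Python sketch of the kind of hardware check the Infrastructure Remediation team might run during inventory, using the built-in wmic tool to query the standard Win32_Tpm WMI class. The query typically requires an elevated prompt, and the output parsing shown here is illustrative rather than production-ready.

# Sketch: verify that a client has a TPM before it joins the BitLocker-capable
# hardware pool. Queries the Win32_Tpm WMI class through the built-in wmic
# tool (typically requires administrative rights). Parsing is illustrative.
import subprocess

WMIC_CMD = [
    "wmic", r"/namespace:\\root\cimv2\security\microsofttpm",
    "path", "Win32_Tpm",
    "get", "SpecVersion,IsEnabled_InitialValue", "/format:list",
]

def tpm_properties():
    """Return the key=value pairs wmic reports for the TPM, or {} if none."""
    try:
        out = subprocess.run(WMIC_CMD, capture_output=True, text=True,
                             check=True).stdout
    except (OSError, subprocess.CalledProcessError):
        return {}
    props = {}
    for line in out.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return props

if __name__ == "__main__":
    spec = tpm_properties().get("SpecVersion", "")
    if spec.startswith("1.2") or spec.startswith("2."):
        print("TPM found, spec version:", spec)
    else:
        print("No TPM 1.2 detected; BitLocker would need a USB startup key.")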

The Image Engineering and Application Management tasks probably have the most impact on what goes on the PC itself. In Image Engineering, we determine the elements of the default PC configuration across the organization, such as the local firewall, BitLocker Drive Encryption, Encrypting File System, Rights Management Services, and default Internet browser configurations. If you are using a hybrid or thick image strategy, meaning some or all of your applications are installed and captured in your standard image, you will also include anti-malware applications in the image, plus any required remote access applications or management agents. During the Image Engineering phase, you also determine how software updates, packages, and drivers are installed before the deployed PC boots into the full operating system.
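
As a simple illustration of validating one of those image defaults, the following Python sketch shells out to netsh advfirewall (available in Windows Vista and later) to confirm that the firewall is on for every profile after a test deployment. It assumes English-language output, and the parsing is only a sketch.

# Sketch: confirm that the image's default firewall policy took effect by
# parsing "netsh advfirewall show allprofiles state" (Windows Vista and later).
# Assumes English-language output; parsing is illustrative only.
import subprocess

def all_firewall_profiles_on():
    out = subprocess.run(
        ["netsh", "advfirewall", "show", "allprofiles", "state"],
        capture_output=True, text=True, check=True).stdout
    # netsh prints a line such as "State    ON" under each profile heading.
    states = [line.split()[-1].upper() for line in out.splitlines()
              if line.strip().lower().startswith("state")]
    return bool(states) and all(state == "ON" for state in states)

if __name__ == "__main__":
    print("All firewall profiles enabled:", all_firewall_profiles_on())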

Application Management is the discipline that governs your entire application portfolio, and security considerations apply to every application intended for the new operating system. For the sake of brevity, we’ll include Microsoft Office deployment in the Application Management discussion. Typically, the first step in application management is finding out what your users already have installed on their PCs. Depending on the inventory mechanism you use, there may be security considerations for how you access and query Windows Management Instrumentation (WMI).
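
If you need a quick, low-impact inventory pass, something like the following Python sketch reads the standard Uninstall registry keys rather than querying WMI's Win32_Product class, which is slow to enumerate and can trigger MSI self-repair. Treat it as an illustration, not a replacement for a real inventory tool.

# Sketch: build a rough installed-application inventory from the standard
# Uninstall registry keys -- a lightweight alternative to WMI's Win32_Product.
import winreg

UNINSTALL_PATHS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall",  # 32-bit apps on x64
]

def installed_applications():
    apps = set()
    for path in UNINSTALL_PATHS:
        try:
            root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path)
        except OSError:
            continue                      # key absent (for example, on 32-bit Windows)
        for index in range(winreg.QueryInfoKey(root)[0]):   # number of subkeys
            sub = winreg.OpenKey(root, winreg.EnumKey(root, index))
            try:
                name, _type = winreg.QueryValueEx(sub, "DisplayName")
                apps.add(name)
            except OSError:
                pass                      # entry has no DisplayName; skip it
            finally:
                sub.close()
        root.close()
    return sorted(apps)

if __name__ == "__main__":
    for app in installed_applications():
        print(app)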

Once you have your extended list of applications, the next step is to take what can in some cases be a list of more than 10,000 applications and get it down to a more manageable number. Beyond the usage metrics, costs, compatibility, and supportability of these applications on the new operating system, your Security team will apply the same standards of security review, such as ISO/IEC 17799:2005, to the existing list to decide what gets carried forward. In a thin image strategy, where little or nothing beyond the configured operating system is in the image and applications are installed at deploy time, security decisions are made about standard antivirus and anti-malware applications. Finally, when packaging applications for automated installation, it is a good idea to use installation capture mechanisms to see which file folders and registry keys are written to during application installation, and to make security changes as needed.
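
A very crude version of that capture step might look like the following Python sketch: snapshot a directory tree, run the packaged installer, snapshot again, and report what was added or changed for security review. A real capture tool also records registry writes and permission changes; this sketch covers only the file system.

# Sketch: a crude installation-capture pass over the file system.
import os
import subprocess
import sys

def snapshot(root):
    """Map each file under root to (size, modified time)."""
    state = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue                  # locked or vanished; skip it
            state[path] = (st.st_size, int(st.st_mtime))
    return state

def diff(before, after):
    added = sorted(p for p in after if p not in before)
    changed = sorted(p for p in after if p in before and after[p] != before[p])
    return added, changed

if __name__ == "__main__":
    watch_root, installer = sys.argv[1], sys.argv[2]
    before = snapshot(watch_root)
    subprocess.run([installer], check=True)   # run the packaged install
    added, changed = diff(before, snapshot(watch_root))
    print("Added:", *added, sep="\n  ")
    print("Changed:", *changed, sep="\n  ")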

The final major task related to what goes on the PC itself is Migration. Just as important as what goes onto the new PC is what information comes off the old PC, and how. In Microsoft Deployment, we use the User State Migration Tool (USMT) to capture the files and settings that constitute the “personality” of the old system and reapply them to the new system. This can happen in place for wipe-and-load scenarios, or the user state can be saved to an external location. The Security team should assess and influence what information is transferred via USMT or other tools and where that information goes. For example, you probably don’t want to allow users to store their files on an unprotected portable USB drive when receiving a new PC. Likewise, you probably want to block any known malicious files from being migrated into the user account locations of the new PC. The Migration team will work with the Security team to ensure that the right information is transferred between systems in a secure way.
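
As an illustration, the following Python sketch wraps a scanstate capture so that user state always lands on an access-controlled network share and is encrypted with a managed key. The share and folder names are hypothetical, and you should verify the /encrypt and /key switches against your USMT version.

# Sketch: run a scanstate capture to a protected, encrypted migration store.
# Paths below are hypothetical placeholders.
import os
import subprocess

USMT_DIR = r"C:\USMT"                      # hypothetical USMT install location
STORE_ROOT = r"\\migserver\usmtstore$"     # hypothetical protected network share

def capture_user_state(encryption_key):
    store = os.path.join(STORE_ROOT, os.environ["COMPUTERNAME"])
    cmd = [
        os.path.join(USMT_DIR, "scanstate.exe"), store,
        "/o",                              # overwrite any existing store
        "/c",                              # continue past non-fatal errors
        "/i:" + os.path.join(USMT_DIR, "miguser.xml"),
        "/i:" + os.path.join(USMT_DIR, "migapp.xml"),
        "/encrypt", "/key:" + encryption_key,  # encrypt the migration store
    ]
    subprocess.run(cmd, check=True)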

What Happens to the Management Infrastructure?

The management infrastructure assumes most of the security management tasks once the PC is configured and standard applications are installed. Management considerations include strategies for software updating and configuration management. The Windows Vista Security Guide focuses primarily on configuration management and enforcement using Active Directory Group Policy. The great thing about this guide is that it offers more than straight guidance about what to set in Group Policy; it also gives you two predefined Group Policy objects:

  • Enterprise Client. Client computers in this environment are located in a domain that uses Active Directory and only need to communicate with systems running Windows Server 2003. The client computers in this environment include a mixture: some run Windows Vista, others run Windows XP.

  • Specialized Security, Limited Functionality. Concern for security in this environment is so great that a significant loss of functionality and manageability is acceptable. For example, many government agency computers operate in this type of environment. The client computers in this environment run only Windows Vista.

These Group Policy objects constitute the security baseline applied to the operating system, and you can modify them according to your needs. You can also use Group Policy to enforce the use of BitLocker Drive Encryption, and use the integrated software restriction policies to control which applications can run on PCs, restrict access to specific files on multiuser computers, and prevent unapproved executable files from running.
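
One simple way to audit that these baselines actually reach deployed PCs is to check the applied GPO list on a sample machine. The Python sketch below parses gpresult /r output, with a placeholder name standing in for whatever you call your customized baseline GPO.

# Sketch: confirm the security-baseline GPO applied to a deployed PC by
# searching "gpresult /r" output. The GPO name is a hypothetical placeholder.
import subprocess

BASELINE_GPO = "VSG EC Desktop Baseline"   # hypothetical GPO name

def baseline_applied():
    out = subprocess.run(["gpresult", "/r"], capture_output=True,
                         text=True, check=True).stdout
    return BASELINE_GPO.lower() in out.lower()

if __name__ == "__main__":
    print("Baseline GPO applied:", baseline_applied())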

Once the baselines are set, you can continue working on the management side of the equation by drilling into several aspects briefly covered during the Infrastructure Remediation tasks, including Encrypting File System, Rights Management Services, and device control as part of your data protection strategy.

The Windows Vista Security Guide and the Data Encryption Toolkit for Mobile PCs provide a wealth of information on hardening the desktop environment and protecting data, and they will save you a great deal of time during overall desktop deployment and security planning.

One final thought about management infrastructure is to look at operating system deployment automation solutions like Microsoft System Center Configuration Manager 2007, where many of the features and considerations described in this article are engineered directly into the workflows automated by the product. Configuration Manager 2007 by default is designed to protect critical information, such as usernames, passwords, and BitLocker Drive Encryption keys.

How Do You Secure the Automated Deployment Infrastructure?

Deployment infrastructure security is often overlooked inside the larger deployment project. The Security team should plan for the security of the deployment infrastructure itself -- primarily, how to protect deployment servers, file servers, and license keys.

Once all of your preparations are complete, your images are tested, and everything appears to pass your defined security standards, your Security team should assess the security of the deployment servers themselves. Deployment servers often store the applications, updates, packages, and user credentials used to execute the deployment. In Microsoft Deployment, for example, a task sequence calls applications and packages by filename and location on a deployment server. If someone manages to replace an application or package with a piece of malicious software that keeps the original name and location, potentially thousands of deployed PCs could be affected. Only authorized personnel should be able to access the physical deployment servers or what is stored on them. Depending on your situation, you may want to require dual control -- two or more people working together -- when accessing deployment servers. You can also limit or prevent remote logon to deployment servers, if your situation allows it. Finally, where simple password authentication is used, requiring two-factor authentication is an effective way to reduce the risk of cracked or compromised passwords.
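
One way to detect that kind of substitution is to keep an offline manifest of file hashes for the deployment share and re-verify it before each rollout wave. The following Python sketch shows the idea; the share path and manifest format are illustrative.

# Sketch: detect tampering on a deployment share by keeping an offline
# manifest of SHA-256 hashes and re-verifying it before each rollout wave.
import hashlib
import json
import os
import sys

def hash_file(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(share_root):
    """Record a hash for every file under the deployment share."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(share_root):
        for name in files:
            path = os.path.join(dirpath, name)
            manifest[os.path.relpath(path, share_root)] = hash_file(path)
    return manifest

def verify(share_root, manifest_path):
    """Return the files whose current hash no longer matches the manifest."""
    with open(manifest_path) as handle:
        expected = json.load(handle)
    mismatched = []
    for rel, digest in expected.items():
        try:
            ok = hash_file(os.path.join(share_root, rel)) == digest
        except OSError:
            ok = False                    # file removed or unreadable
        if not ok:
            mismatched.append(rel)
    return mismatched

if __name__ == "__main__":
    share, manifest_file, mode = sys.argv[1], sys.argv[2], sys.argv[3]
    if mode == "build":
        with open(manifest_file, "w") as handle:
            json.dump(build_manifest(share), handle, indent=2)
    else:
        bad = verify(share, manifest_file)
        print("Modified or missing files:", bad or "none")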

During the user state migration phases of a network deployment, you often pass user files and settings to file servers in your deployment infrastructure. In many cases, this data is highly sensitive -- locally stored financial information, strategy documents, or customer data -- and security for this portion of the deployment infrastructure is critical. If you are using Windows servers to store this information, refer to the Windows Server 2003 Security Guide or the Windows Server 2008 Security Guide to help your team adequately lock down these servers.

Product key protection is another area where the deployment infrastructure can help, and you should take precautions to protect the related infrastructure components. Volume license keys for Windows Vista and Windows Server 2008 are usually managed using the Key Management Service (KMS), typically running on Windows Server 2003 or Windows Server 2008 infrastructure. This service helps manage and increase the protection of volume license keys compared with the old system, in which a key could be stolen or publicized and used by unintended recipients. As with the old volume key system, the process is still transparent to end users when deploying the new operating system. Even though this system is much more secure against key leaks by design, it is still a good idea to limit access to the KMS servers and protect them from external threats.
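
If you want to confirm that deployed clients are activating against your sanctioned KMS host rather than a rogue one, a sketch like the following Python snippet can parse the output of the built-in slmgr.vbs /dlv command. Field names and wording vary by Windows version and language, so treat the parsing as illustrative.

# Sketch: report which KMS host a client activated against, plus its license
# status, by parsing "slmgr.vbs /dlv" output. Parsing is illustrative only.
import os
import subprocess

def kms_details():
    slmgr = os.path.join(os.environ["SystemRoot"], "System32", "slmgr.vbs")
    out = subprocess.run(["cscript", "//nologo", slmgr, "/dlv"],
                         capture_output=True, text=True, check=True).stdout
    return [line.strip() for line in out.splitlines()
            if "KMS machine" in line or "License Status" in line]

if __name__ == "__main__":
    for line in kms_details():
        print(line)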

A Security team cannot completely eliminate all security risks. Plan for ways to identify and track attacks and perform security audits to ensure that the deployment infrastructure is as secure as it can be.

Conclusion

As with any IT service, security should be considered in every phase of the desktop deployment project -- from the initial vision and the decision to upgrade, through every aspect of planning and developing the components that will eventually be deployed to your desktops. Because the desktop is often the user’s primary conduit into most other IT services and the primary place to store files, the security implications of the desktop deployment project are particularly important. In the true sense of the Defense-in-Depth Security Model, you should examine every layer the desktop experience touches.
