Dogfooding Challenges Part One
There are a couple of issues that come up for developers who work on Visual Studio that just aren't issues for most people. The first is that working on components of VS while using VS is hard. It's hard for a couple of reasons:
- You want to have a debug build of the product to make it easier to debug, but at the same time, you want to have a release version of the product so that the performance isn't too bad.
- If you want to be able to replace a component in the editor, you need to shut down the editor, replace the component, then restart the editor, since Windows keeps the file locked.
Before we started working on VS2008, we had a single solution for both of these problems: side-by-side installs of Visual Studio. Back in the day, most developers who worked on Visual Studio didn't actually install Visual Studio through the setup process.
Batch Setup
Instead, we had a batch script which would copy the binaries from the daily build machine and then run regsvr32 on a huge list of different DLLs in order to get a "working" version of VS. This had some advantages, but also some very serious drawbacks.
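As a rough sketch of the kind of thing that script did (the build share, target directory, and DLL names here are all invented for illustration; the real list of registered DLLs was enormous):

```
@echo off
rem Placeholder sketch of a "batch setup" script: pull the day's binaries from
rem the build machine, then register the COM servers VS needs. The share,
rem target path, and DLL names are invented for illustration.

set DROP=\\buildmachine\drops\vs\latest\x86ret
set TARGET=d:\dd\dogfood

rem Copy down anything newer than what we already have.
xcopy /d /e /y /i "%DROP%\bin" "%TARGET%\bin"

rem Register the COM components so the registry points at the copied binaries.
for %%d in (editorpkg.dll debuggerpkg.dll projectsys.dll) do (
    regsvr32 /s "%TARGET%\bin\%%d"
)
```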
Pros:
- Batch Setup allowed us to have two sets of binaries in two different locations. One of these was the "dogfood" or retail build, and the other was the "debug" build.
- Batch Setup allowed us to specify alternate registry roots for the two different builds. This didn't work perfectly, as VS has some COM components that are registered in global locations, but it worked well enough that relatively few teams even had to care about which scenarios didn't work. (There's a rough illustration of the registry root idea after the cons list.)
- Batch Setup registered the binaries in their "built" locations. The output tree of our build doesn't match where the binaries are installed by setup. This means that if you have a real setup installed and you build some binaries, you also have to copy them into place in order to use them.
Cons:
- A minor con is the lack of "perfect" registry support for side-by-side installs.
- The major con should be obvious: devs didn't run setup. This meant that if a dev made a change to registration, or to the names or locations of files that needed to be installed, they usually forgot to update setup to do the right thing. This meant that most of the time, setup didn't actually work!
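To put the "alternate registry roots" pro in concrete terms, here's a hedged illustration. The registry paths, the "Dogfood" suffix, and the install path below are placeholders, and I'm only using devenv's /RootSuffix switch (the mechanism the experimental hive uses) as a stand-in for what batch setup was approximating.

```
rem Illustration only: two sets of binaries, each pointed at its own registry
rem root. A normal install registers under something like
rem     HKLM\Software\Microsoft\VisualStudio\9.0
rem while a second set of binaries gets registered under a suffixed root, e.g.
rem     HKLM\Software\Microsoft\VisualStudio\9.0Dogfood
rem The shipping devenv.exe has a /RootSuffix switch for launching against a
rem suffixed root; the "Dogfood" suffix and the path below are made up.

"d:\dd\dogfood\bin\devenv.exe" /RootSuffix Dogfood
```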
Because of the second drawback above, the division made a decision at the end of VS2005 that we wouldn't continue to use batch setup for VS2008. Instead, we invested in updating our setup technology to make it easier for developers to build setup, and to understand how setup gets built, so that it became easier to keep setup working. We also invested in some tools to make using an installed build easier. For example, we wrote a tool called "pupdate" that can automatically update the installed binaries from the locations where they were built.
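The real pupdate is an internal tool, but a minimal sketch of the idea, assuming a simple mapping file from build output to install location, might look like this (the paths and the map file format are invented):

```
@echo off
rem Hypothetical "pupdate"-style script: copy freshly built binaries over the
rem copies that setup installed. Paths and the map file format are made up;
rem the real tool knows the build-output-to-install-location mapping itself.

set BUILD_OUT=d:\dd\src\binaries\x86chk\bin
set VS_INSTALL=%ProgramFiles%\Microsoft Visual Studio 9.0

rem Each line of the (invented) map file looks like:
rem     builtFileName,relative\install\path
for /f "tokens=1,2 delims=," %%a in (pupdate.map) do (
    echo Updating %%b
    copy /y "%BUILD_OUT%\%%a" "%VS_INSTALL%\%%b"
)
```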
However, we never did get to the point where it was possible to install two VS2008 builds side by side to enable the experience above. How did devs adapt to this? There were three different ways:
- Some people basically stopped dogfooding. That is, they continued to use VS2005 as their dogfood build, since it was possible to install VS2005 side by side with VS2008.
- Some people started doing remote debugging. Most of my team falls into this category. In this solution, we took an old machine and installed the debug build on it, then used a dogfood build on our primary machine for doing work and building. When we wanted to test some changes, we would "pupdate" the binaries from the primary machine to the test machine, and then use remote debugging to debug. (There's a rough sketch of this workflow after the list.)
- Some people created a stripped down installable version of VS that could be installed Side by Side and used that for their dogfood environment. This is basically an express sku that was developed internally for developers to use.
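For what it's worth, the workflow in the second bullet boils down to something like the sketch below. The machine name, share, and DLL name are placeholders; msvsmon.exe (the remote debugging monitor that ships with VS) is real, but how you get it running on the test machine is up to you.

```
@echo off
rem Sketch of the remote-debugging setup: push a rebuilt component to the test
rem machine, then debug it from the primary machine. The machine name, share,
rem and DLL name are placeholders.

set BUILD_OUT=d:\dd\src\binaries\x86chk\bin
set TEST_IDE=\\testmachine\vsinstall\Common7\IDE

rem Copy the rebuilt component over the installed copy on the test machine.
rem VS has to be shut down over there first, since the file will be locked.
copy /y "%BUILD_OUT%\MyEditorPackage.dll" "%TEST_IDE%\"

rem On the test machine, start the remote debugging monitor (msvsmon.exe).
rem From the primary machine, use the Debug menu's Attach to Process command
rem and point the qualifier at the test machine.
```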
None of these are quite ideal; it'd be really nice to go back to the old days of being able to install two builds of VS side by side. We're thinking about investing in that for VS10, but as far as I know, it's still up in the air.
Comments
Anonymous
December 09, 2007
Welcome to the thirty-seventh edition of Community Convergence. Visual Studio 2008 has been released

Anonymous
December 20, 2007
What about using a virtual machine on the development PC to eliminate the need for an old remote PC?

Anonymous
December 20, 2007
Hi Patrick, that's a valid option too. I happen to use a lot of RAM while developing, and since I still have an old machine, I find it works better. There is also an (admittedly small) number of scenarios you can't test in a VM, like the smart device emulator.

Anonymous
August 15, 2008
Lately I've been trying to support work happening in a variety of our development branches, and that's