Wither White-Box Testing?

I was recently reading JW's blog post on Prevention vs. Cure: https://blogs.msdn.com/james_whittaker/archive/2008/07/24/prevention-v-cure-part-1.aspx and it set me off a bit.  The blog post talks about "developer testing," which got me thinking about one of my biggest gripes: the "Software Development Engineer in Test" title.  I hate this title because it implies that someone who does test development has a different skill-set than someone who does product development.

So what does this have to do with JW's blog?  Well, JW draws a distinction between "developer testing" and "tester testing" where developer testing involves things like design reviews, code reviews, unit testing, and presumably stepping through code.  In other words, it's what we traditionally have referred to as "white box" or sometimes "glass box" testing.  White-box testing is extremely valuable.  Getting chest-deep in the code, crawling memory and operating system primitives, and using tools like perfmon and Process Explorer is a really good way to find nasty bugs.  Sometimes it's the only way.  For example, consider threading bugs.  To borrow from a well-known gun slogan, IMO "Testers don't find threading bugs, customers do."  The variations in timing between thread executions are infinite and testing all cases is impossible.  You can't guarantee you don't have deadlock bugs through black-box testing--you have to analyze the code itself.
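
To make the timing point concrete, here is a minimal C++ sketch (my own illustration, not anything from JW's post or real product code) of a classic lock-ordering deadlock.  Each thread grabs the two locks in the opposite order, so whether the program hangs on any given run depends entirely on how the threads happen to interleave--exactly the kind of bug no number of black-box test passes can prove absent.

    // Minimal illustration of a lock-ordering deadlock (hypothetical example).
    // If worker1 acquires 'a' and worker2 acquires 'b' before either grabs
    // its second lock, both threads block forever waiting on each other.
    #include <mutex>
    #include <thread>

    std::mutex a;
    std::mutex b;

    void worker1() {
        std::lock_guard<std::mutex> holdA(a);  // takes a, then wants b
        std::lock_guard<std::mutex> holdB(b);
    }

    void worker2() {
        std::lock_guard<std::mutex> holdB(b);  // takes b, then wants a
        std::lock_guard<std::mutex> holdA(a);
    }

    int main() {
        std::thread t1(worker1);
        std::thread t2(worker2);
        t1.join();
        t2.join();  // may never return, depending on thread timing
        return 0;
    }

Reading the code makes the inconsistent lock order obvious in seconds; running it might pass a thousand times before it ever hangs.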

So who is going to do this work?  There is no question that the buck stops with the developer who wrote the code.  Good developers have their designs and their code reviewed.  They write unit tests and step through their code.  In effect, they do white-box testing every day.  However, there are two problems with this.  The first is that developers are psychologically invested in their code.  No developer wants to think they write buggy code; it would mean they are a bad developer.  Therefore they have a vested interest in not finding bugs.  Which isn't to suggest most developers just close their eyes and check in.  Far from it--many developers are very diligent and make every effort to test their code.  Still, the psychological disadvantage cannot be ignored; it's a bit akin to letting a student grade their own papers.  At the end of the day, developers have to believe their code is great, and at some level that prevents them from being able to take an unbiased view of their code.

The second problem is that the degree to which even the most well-intentioned developer is able to test their code tends to be dictated by the schedule.  You can "Wide Band Delphi" all you want, but if reliable estimating were possible, my landscaping would have been done last week and the plumber would have been at my house on Tuesday instead of Friday.  Good developers try to provide good estimates and leave time for testing, but the reality is that estimating is almost impossible to get right--unless it happens to be something that you have done recently and repetitively (which isn't typically the case in a software project).  And even if you get your estimate right, other things interfere, like customer issues and special projects, which somehow never seem to affect ship dates.  So at the end of the day, developers do the level of testing that they have time to do, which is often severely curtailed by the schedule.

Schedule pressures also come into play with spec, design, and code reviews.  Everybody is under the gun to get their own stuff checked in, so who has time to review anybody else's work?  Sure, many orgs require code reviews prior to checking in, but usually what that means is two developers sitting in front of a machine where the author rapidly scrolls through their changes while the reviewer sits and listens to the author explain how great their code is.  The whole thing is over in minutes.  A good code review should take a significant percentage of the time that it took to write the code in the first place and should be done without the author present, but that never happens because nobody has the time.

Well, almost nobody.  SDETs (who should theoretically have the same skills as developers, or else what is the 'D' doing in their title?) are paid to test, and they should be fully capable of white-box testing.  By employing "test developers" in a dedicated capacity as white-box testers, the two core problems that limit the effectiveness of white-box testing are removed.  First, because they are dedicated to the task, whatever time is available in the schedule goes solely to white-box testing.  Second, unleashing developers on someone else's code not only removes the psychological barriers to finding bugs, it ties their ego to doing an extremely thorough review of the code.  In addition, when experienced people do white-box testing, it becomes a learning experience for the less experienced people.

Which brings me back to why I hate the SDET title.  Development is development, product or otherwise.  Developers benefit from doing white-box testing as well as from having it done on their code.  Skilled developers may not wish to be forced to choose between product development and development-level testing.  Some developers should be required to do some of the latter for their own personal growth.  But as it stands, the SDET title gets in the way; if you want to be a dedicated tester, you must choose a different career path.  IMO, we should just have an SDE title where it is well understood that part of being a developer means spending time being solely dedicated to white-box testing.

Finally, I should mention that none of this is meant to diminish the importance of black-box testing.  It's just that most organizations get black-box testing right because it is what we tend to associate with traditional testing.

Comments

  • Anonymous
    July 24, 2008
Dude, you only read part 1 and I thought I made my facetiousness fairly clear in it. Stay tuned. I like your post. A good read. /jw

  • Anonymous
    July 25, 2008
Thanks James--yes, I knew you were being facetious & hopefully my post didn't come off as sounding critical of yours.  I liked it a lot :).  It just got me thinking about my favorite pet peeve enough to blog about it.