Probably not, but it is certainly changing rapidly, within both web content businesses and software-as-a-service firms. No doubt corporations will also follow this trend as experience with faster, more efficient, and less costly testing techniques spreads.
Following up on my last post, here is some direct feedback from Robert Johnson at Facebook, who spoke recently about their process for software development and testing.
"Facebook developers are encouraged to push code often and quickly. Pushes are never delayed and are applied directly to parts of the infrastructure. The idea is to quickly find issues and their impact on the rest of the system, and to quickly fix any bugs that result from these frequent small changes."
"Second, there are limited QA (quality assurance) teams at Facebook but lots of peer review of code. Since the Facebook engineering team is relatively small, all team members are in frequent communication. The team uses various staging and deployment tools as well as strategies such as A/B testing and gradual, targeted geographic launches. This has resulted in a site that has experienced, according to Robert, less than 3 hours of downtime in the past three years."
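The talk doesn't describe Facebook's internals, but the combination of A/B testing and gradual geographic launches usually comes down to a deterministic gating function. Here is a minimal sketch of that idea; the function names, experiment name, and thresholds are all illustrative assumptions, not Facebook's actual code:

```python
import hashlib

def bucket(user_id: str, experiment: str, buckets: int = 100) -> int:
    """Deterministically map a user to a bucket (0-99), stable across requests."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % buckets

def feature_enabled(user_id: str, country: str,
                    rollout_pct: int, launch_countries: set) -> bool:
    """Enable a feature for a percentage of users in the launch countries only."""
    if country not in launch_countries:
        return False
    return bucket(user_id, "new_feed") < rollout_pct

# Hypothetical usage: launch to 10% of users in one country first, then widen
# by raising rollout_pct and adding countries in later deploys.
feature_enabled("alice", "NZ", rollout_pct=10, launch_countries={"NZ"})
```

Because the bucket is a stable hash of the user ID rather than a random draw, each user gets a consistent experience across requests, and widening the rollout only flips on users who were previously off.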
Johnson certainly confirms some of Beck's observations about how testing changes when the business need for agility starts to dominate, specifically:
- Make the development manager responsible for coding, integration, and testing, eliminating separate QA and build/integration teams;
- Eliminate separate patch releases;
- Perform continuous beta testing with key customer participation, and stop performing up-front usability testing ahead of development;
- "Immunize" code by extensive code reviews, and by building in testability, assertions, or other self-validation techniques;
- Perform rolling deployments across subsets of the user base;
- Perform "release experiments" in lieu of testing.
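The "immunize" item above is worth a concrete illustration. The idea is that code defends itself: each function validates its own inputs and outputs with assertions, so a bad change from one of those frequent small pushes fails fast and loudly rather than silently corrupting data. This is a generic sketch of the technique, not any specific team's code; the function and its checks are hypothetical:

```python
def apply_discount(price_cents: int, discount_pct: float) -> int:
    """Return the discounted price, self-validating on the way in and out."""
    # Pre-conditions: reject impossible inputs at the boundary.
    assert price_cents >= 0, "price must be non-negative"
    assert 0.0 <= discount_pct <= 100.0, "discount must be between 0 and 100"

    discounted = round(price_cents * (1 - discount_pct / 100.0))

    # Post-condition: a discount can never raise the price or make it negative.
    assert 0 <= discounted <= price_cents, "post-condition violated"
    return discounted
```

With checks like these built in, a rolling deployment that introduces a bug tends to trip an assertion in the first small subset of the user base, which is exactly what makes frequent small pushes safe.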
The argument against agile QA is usually that such techniques have no place in truly mission-critical corporate applications, and that such methods sacrifice quality by shifting the burden to the consumer.
Well, if you haven't noticed, the quality burden has been shifting to the consumer for a couple of decades now. For example, permanent beta testing is now the norm at most large Internet content and ecommerce firms.
Traditional QA engineering won't change dramatically where the risks of poor quality are too great: medical devices and aeronautics; regulated markets, such as securities exchanges; online banking and card payment systems; core corporate accounting; and so on. But in a typical corporate application portfolio, most applications pose no such risks, and for them these new QA techniques offer greater agility and faster time to market, with significant cost savings.
The logic of capitalism is such that quality is never an absolute, but just one among many factors that influence consumer demand. When properly marketed, a shift in quality techniques that lowers costs to the consumer without significant loss of features will almost always find new customers and more demand from existing ones. I argue this will be as true for internal corporate IT as it is for external consumers, and once again, Internet development techniques will lead the way.