A big trend in software development in recent years has been so-called “componentization,” or breaking code into interchangeable pieces. Just as computer hardware is built out of components that are connected, so too can software be assembled by developers out of reusable parts, according to the web glossary Loosely Coupled.
It’s easy to see the benefits of building software out of reusable pieces. Presumably, an organization that does this can get its software to market faster and at higher quality, all while cutting costs and responding more nimbly to market demands.
Alas, as with anything, component-based development presents challenges, especially on the testing side.
In a recent piece for TechTarget, consultant Tom Nolle argued that Service Virtualization, the use of automation and simulation in testing, can help address the issues that componentization raises throughout an application’s lifecycle.
But, Nolle cautions, Service Virtualization has its limits as well.
I chatted via email about this with Nolle, who is president of CIMI Corp., a consulting firm that has specialized in telecommunications and data communications since 1982.
ServiceVirtualization.com: Why has testing software become so much harder?
Nolle: Software has become more componentized, which means testing it is much more complicated. This also makes development more of a team effort, which also increases test complexity.
ServiceVirtualization.com: Why do you believe Service Virtualization is the best way to test code?
Nolle: It provides a stable platform for representing other components while testing one component or an assembly of components. If you try to test a component system and something fails, it’s hard to know who did things right versus wrong … In addition, if component A needs input from component B, and B isn’t written yet, you can’t test A without something like Service Virtualization.
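Nolle’s point about testing component A before component B exists can be sketched with a test double. This is a minimal illustration, not a full Service Virtualization tool: the OrderProcessor and TaxService names are hypothetical, and Python’s `unittest.mock` stands in for the simulated service.

```python
from unittest.mock import Mock

# Hypothetical component A: it depends on a TaxService (component B)
# that has not been written yet.
class OrderProcessor:
    def __init__(self, tax_service):
        self.tax_service = tax_service

    def total(self, subtotal):
        # Delegates the tax calculation to component B.
        return subtotal + self.tax_service.tax_for(subtotal)

# Service Virtualization in miniature: a mock stands in for component B,
# returning a canned response so component A can be tested today.
virtual_tax_service = Mock()
virtual_tax_service.tax_for.return_value = 8.00

processor = OrderProcessor(virtual_tax_service)
print(processor.total(100.00))  # 108.0
virtual_tax_service.tax_for.assert_called_once_with(100.00)
```

The mock also records how A called B, so the interface between the two components gets exercised even though B doesn’t exist yet.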
ServiceVirtualization.com: What are the best ways to ensure that simulations prove accurate when doing Service Virtualization?
Nolle: Service Virtualization demands effective enterprise architect processes ahead of development to establish a baseline for data validity.
ServiceVirtualization.com: What is a transformation diagram, and why do you favor using it in setting up simulations for testing?
Nolle: A transformation diagram shows how data flows through components and how it’s changed along the way. It’s part of recognizing what should be presented at any point.
ServiceVirtualization.com: What are the limitations of Service Virtualization, and how can they best be dealt with?
Nolle: Simulating components with services always raises the risk you didn’t simulate accurately, in which case you’re testing against false standards. You need to use enterprise-architect process flows to validate your data and validate interfaces between components carefully … Also, if Service Virtualization testing doesn’t address the interfaces and flows of information correctly, then it won’t prepare you for the real thing.
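One way to reduce the risk of “testing against false standards” is to check every simulated response against a baseline contract before using it. The contract below is a deliberately simple field/type map of my own devising; a real project might derive something like a JSON Schema from the enterprise-architect process flows Nolle mentions.

```python
# Baseline contract for a hypothetical order-status interface:
# field names mapped to expected Python types.
CONTRACT = {"order_id": int, "status": str, "total": float}

def conforms(response, contract=CONTRACT):
    """True if the response has exactly the contracted fields,
    each with the contracted type."""
    return (set(response) == set(contract)
            and all(isinstance(response[k], t) for k, t in contract.items()))

simulated = {"order_id": 42, "status": "shipped", "total": 99.5}
assert conforms(simulated), "virtual service drifted from the baseline"
```

If the simulated component drifts from the contract, the test harness fails loudly instead of silently validating code against a wrong interface.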