This blog touches on many of the themes which test consultant Paul Gerrard and I will cover in our upcoming webcast “Transform Your Testing for Digital Assurance”. To find out more and book your place, please click here.

The Standish Group’s Chaos Report has always made for uncomfortable reading for anyone responsible for software projects. Since 1994, it has painted a consistently grim picture, and the 2016 report is no different: 52% of projects in 2015 were judged to be “challenged” and 19% failed. Among projects at large companies, just 9% were “successful”.

Moving to a value-driven delivery model

What has changed, however, is the definition of “challenged”. Projects are no longer judged solely on whether they are delivered on time and within budget, but must also deliver value to be “successful”.

This reflects a broader shift in thinking: 78% of enterprises believe that value-driven delivery models will be needed to respond to disruption over the next three years.

However, delivering value appears to be as hard as, if not harder than, delivering on time and within budget. The average large project runs 45% over budget and 7% over time, yet delivers a massive 56% less value than expected. A 2015 study published in the Harvard Business Review similarly found that 75% of cross-functional teams fail to meet customer expectations while also staying within the planned budget and on schedule.

An incompatible triad?

Fundamental changes are needed at key points within the software delivery pipeline, to eliminate bottlenecks while delivering what the user actually wants.

The first step is to understand what value looks like to your customers, which brings us back to the need for accurate requirements that establish upfront knowledge of the desired functionality.

The requirements must also be reactive, updated iteratively to reflect the latest user needs. The dedicated design phase and monolithic documents that once stood at the start of Waterfall projects have often been replaced by a constant stream of user stories and change requests.

However, these stories are still typically blocks of text, whether open text fields in project management tools or emailed requests. That returns you to ambiguous natural language, far removed from the logic of the system that needs to be developed and tested; meanwhile, much research traces the majority of defects back to the design phase.

Within the same sprint or iteration, a change captured in the requirements must be tested sufficiently to provide assurance that the desired user needs have actually been reflected in the code. However, testing is another area that can rarely keep up with the rate of change.

Typical bottlenecks in a sequential approach.

When the user needs change, the test assets must be updated and kept in alignment. This is again manual and time-consuming, as the assets are not traceable back to the design itself. Technical debt of three or more iterations is common, meaning that new functionality is essentially left untested.

A requirements-driven approach to value-driven delivery

For testing to be able to deliver value on time and within budget, the requirements gathering process must first be fundamentally improved. Model-Based approaches offer a way to capture user needs accurately, while carrying the substantial benefit that tests can be generated from the model.

Flowchart modelling provides the mathematical precision required for automated test creation, while offering the substantial benefit that flowcharts can be used by technical and non-technical stakeholders alike. BAs, for instance, can sketch functional requirements as flowcharts akin to Visio or BPMN diagrams, before developers and testers add further functional logic and link the sub-processes beneath the master flow.

Tests are equivalent to paths through the flowchart, and can be generated using coverage algorithms to test every logically distinct path in the smallest set of tests possible:

Generating the smallest set of tests needed to cover a requirements model of the system under test.
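The generation step described above can be sketched in a few lines. This is a minimal illustration, not CA Agile Requirements Designer's actual algorithm: the flowchart is assumed to be a simple directed graph whose edges carry step descriptions, and every logically distinct (acyclic) path from start to end becomes one test case.

```python
# Minimal sketch: derive test cases as paths through a flowchart model.
# The model format and the example login flow are hypothetical.
from collections import defaultdict

def all_paths(edges, start, end):
    """Enumerate every logically distinct (acyclic) path from start to end.
    Each path corresponds to one generated test case."""
    graph = defaultdict(list)
    for src, dst, label in edges:
        graph[src].append((dst, label))
    paths = []

    def walk(node, visited, labels):
        if node == end:
            paths.append(labels)
            return
        for nxt, label in graph[node]:
            if nxt not in visited:          # skip loops so paths stay finite
                walk(nxt, visited | {nxt}, labels + [label])

    walk(start, {start}, [])
    return paths

# Hypothetical login flow: Start -> Validate -> (Home on success, Error on failure)
edges = [
    ("Start", "Validate", "enter credentials"),
    ("Validate", "Home", "credentials valid"),
    ("Validate", "Error", "credentials invalid"),
    ("Home", "End", "logout"),
    ("Error", "End", "show message"),
]
for test in all_paths(edges, "Start", "End"):
    print(" -> ".join(test))
```

For this five-edge flow the routine yields two test cases, one per branch of the decision node; real coverage algorithms apply smarter path-reduction strategies than exhaustive enumeration, but the principle is the same.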

Meanwhile, with a tool like CA Agile Requirements Designer, a configuration file can be assigned to the flow to generate automated test scripts, data and virtual data at the same time as the optimized tests.

Testers therefore have everything they need for rigorous testing, derived directly from the design itself. The test assets can be maintained in step with the design, using impact analysis to identify and update any tests affected by a change, system-wide.
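Impact analysis of this kind can be pictured as a simple lookup. In this hedged sketch (the data structures are illustrative, not any tool's actual format), each generated test records the model nodes its path traverses; when a node in the flowchart changes, only the tests that visit it need to be regenerated.

```python
# Illustrative sketch of impact analysis over generated tests:
# each test maps to the list of model nodes its path traverses.
def impacted_tests(tests, changed_node):
    """Return the names of tests whose path passes through the changed node."""
    return [name for name, path in tests.items() if changed_node in path]

# Hypothetical tests generated from a login-flow model
tests = {
    "TC1_valid_login":   ["Start", "Validate", "Home", "End"],
    "TC2_invalid_login": ["Start", "Validate", "Error", "End"],
    "TC3_logout":        ["Home", "End"],
}
print(impacted_tests(tests, "Error"))     # only the invalid-login test is affected
```

A change to the "Error" node flags only TC2 for regeneration, while a change to a shared node such as "Validate" would flag every test passing through it; this is what keeps test maintenance proportional to the size of the change rather than the size of the suite.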

Changing user needs can be reflected quickly and accurately in the design, while testing verifies at the same time that those changes have been implemented correctly in the code. Rigorous testing can therefore keep pace with constantly changing user needs, validating that value has been delivered to both user and organization.

The “Transform Your Testing for Digital Assurance” webcast will take place 7th December, 4pm GMT/ 5pm CET / 11am ET. Book your place here.