4 Ways Software Can Catch Up with Henry Ford’s Processes

A number of industries have long used simulation and optimization techniques to streamline their production processes. Some of these ideas are decades old. Consider Henry Ford’s assembly line of a century ago.

But the software industry is just now laying the foundation to catch up, Akshay Rao, senior solutions architect at CA Technologies, said in a recent webcast.

One of the keys to understanding the efficiency of any process lies in comparing idle time to touch time. In software development, idle time refers to the amount of time programmers wait for test systems to become available before they can move to the next task. Touch time is the time actually spent doing software development. 

The obstacles that keep us waiting – whether it’s limited mainframe time, other teams’ schedules or the excessive costs of repeatedly querying databases – are referred to as “constraints.”

Because of constraints, the ratio of idle time to touch time is out of whack in software development, especially as compared with other industries.

“When you compare that to the manufacturing world and other engineering disciplines, it is low and ripe for innovation,” Rao says. 

Rao points to four key DevOps concepts that promise to help bridge this gap, starting with the use of simulation – Service Virtualization – to eliminate constraints. 

1. Constraint-free development – Reducing testing bottlenecks around access to legacy systems, paid services and fresh test data. Rao says application developers need access to environments and test cases that even the real systems can't provide. For example, you can't tell a production system to generate an error or respond slowly. Developers need to simulate these characteristics to do better testing. Service Virtualization techniques can help model and simulate the required systems and data sets.
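The idea can be sketched in a few lines. Below is a minimal, hypothetical virtual service (the payment API, error rate and latency values are illustrative, not from the webcast) that can be told to fail or respond slowly on demand – exactly what a production system won't do:

```python
import random
import time

class VirtualPaymentService:
    """Hypothetical stand-in for a real payment API that the test
    environment cannot reach, or cannot force into failure modes."""

    def __init__(self, error_rate=0.2, latency_seconds=1.5):
        # Unlike production, the stub's behavior is configurable.
        self.error_rate = error_rate
        self.latency_seconds = latency_seconds

    def charge(self, amount):
        time.sleep(self.latency_seconds)       # simulate a slow backend
        if random.random() < self.error_rate:  # simulate intermittent errors
            raise RuntimeError("503 Service Unavailable (simulated)")
        return {"status": "approved", "amount": amount}

# Tests can now exercise the error path deterministically:
flaky = VirtualPaymentService(error_rate=1.0, latency_seconds=0)
try:
    flaky.charge(10)
except RuntimeError as err:
    print("handled simulated outage:", err)
```

Because the stub is under the test's control, the error-handling path runs on every test run instead of waiting for a real outage.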

2. Continuous Application Delivery – Bringing agile to the deployment process. Release automation is the notion that we can use advanced orchestration to reduce the touch time required to move software into production. This can reduce the amount of time needed to push out new releases. It also can help to improve the testing environment since developers can test against specific configurations. Rao says many organizations are able to use this approach to reliably push out up to 2,400 updates per week.
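At its core, release orchestration turns a manual release into one repeatable, automated sequence. Here is a minimal sketch (the stage names and steps are illustrative assumptions, not CA's tooling): each stage is an automated step, and the pipeline halts on the first failure so a broken build never reaches production.

```python
# Run a release pipeline: each stage is a (name, callable) pair that
# returns True on success. The pipeline stops at the first failure.
def run_pipeline(stages):
    results = []
    for name, step in stages:
        ok = step()
        results.append((name, ok))
        if not ok:
            break  # halt the release on failure
    return results

# Illustrative stages; real steps would build, test and deploy.
stages = [
    ("build", lambda: True),
    ("test against staging config", lambda: True),
    ("deploy to production", lambda: True),
]
print(run_pipeline(stages))
```

Because the sequence is codified, pushing a release is a single action that behaves the same way every time – which is what makes high release frequencies sustainable.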

3. Complete Monitoring – Gathering rich, actionable information that can improve production systems. One challenge for modern enterprises is that performance data is scattered across IT systems – various servers, routers and cloud services. To create accurate simulations, it is important to gather comprehensive data from all of these systems.
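The gathering step amounts to folding scattered measurements into one comprehensive view. A minimal sketch, with hypothetical source names and response-time samples for illustration:

```python
# Per-source samples: performance data scattered across servers,
# routers and cloud services (names and numbers are illustrative).
samples = {
    "web-server-1": [120, 135, 110],   # response times, ms
    "edge-router":  [4, 6, 5],
    "cloud-db":     [40, 55, 48],
}

def summarize(samples):
    # Reduce each source's raw measurements to a summary that a
    # simulation can be calibrated against.
    return {
        source: {
            "avg_ms": sum(values) / len(values),
            "max_ms": max(values),
        }
        for source, values in samples.items()
    }

print(summarize(samples))
```

In practice the samples would come from monitoring agents rather than literals, but the shape of the result – one consolidated view across all systems – is the point.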

4. Collaborative data mining – Mining all of the performance characteristics of application data to enhance the simulations used for testing. The raw performance data does not provide insight into how the underlying applications behave. Collaborative data mining is the idea of using artificial intelligence techniques to transform this raw data into models of how the applications behave. This can help to provide highly realistic simulations based on what the environment is doing versus what you expect it will do.
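In its simplest form, this means deriving a behavioral model from raw measurements instead of hand-writing expected behavior. The sketch below (sample values and the slow-path threshold are illustrative assumptions, far simpler than the AI techniques Rao describes) turns observed response times into a model a virtual service could replay:

```python
import statistics

# Raw measured response times (ms) from production monitoring;
# hypothetical numbers for illustration.
observed_ms = [102, 98, 110, 240, 105, 99, 230, 101]

def build_latency_model(samples, slow_threshold_ms=200):
    # Derive typical latency plus an observed slow-path rate, so
    # tests reflect what the environment actually does rather than
    # what we expect it to do.
    slow = [s for s in samples if s >= slow_threshold_ms]
    return {
        "median_ms": statistics.median(samples),
        "slow_path_rate": len(slow) / len(samples),
        "slow_median_ms": statistics.median(slow) if slow else None,
    }

model = build_latency_model(observed_ms)
print(model)
```

A virtual service driven by this model would answer at the observed median most of the time and take the slow path at the observed rate – a simulation grounded in measured behavior.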

Early adopters of these principles are seeing tremendous improvements in deployment speed and quality using Service Virtualization and continuous application delivery, Rao says.

For example, one large telecommunications company was able to reduce software release cycle time by 33 percent while finding four times as many defects in development. ROI for the project was realized in just four weeks, he said.