Why is it costly if an error was discovered in later phases of software development?
For bug fixes, earlier is better (and cheaper)
They say prevention is better than a cure, and this definitely
holds true when it comes to bugs and security issues. During the
development process, it is less expensive and more efficient to
repair bugs in earlier stages than in later ones. The cost of fixing
a problem grows roughly exponentially as the software moves forward
through the SDLC.
The Systems Sciences Institute at IBM reported that it costs 6x more
to fix a bug found during implementation than to fix one identified
during design. Furthermore, according to IBM, the cost of fixing
bugs found during the testing phase can be 15x the cost of fixing
those found during design.
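To make those multipliers concrete, here is a minimal sketch that computes the relative cost of a fix by phase. Only the 1x/6x/15x ratios come from the IBM figures above; the base cost and currency are hypothetical placeholders for illustration.

    # Illustrative only: relative cost of fixing one bug, by the phase in which
    # it is discovered. The 1x/6x/15x ratios follow the IBM Systems Sciences
    # Institute figures cited above; the base cost is a hypothetical placeholder.
    BASE_FIX_COST = 500  # hypothetical cost (USD) to fix a bug caught during design

    PHASE_MULTIPLIERS = {
        "design": 1,
        "implementation": 6,
        "testing": 15,
    }

    for phase, multiplier in PHASE_MULTIPLIERS.items():
        print(f"Bug found during {phase:<15}: ~${BASE_FIX_COST * multiplier:,}")

With these assumed numbers, a $500 design-stage fix becomes roughly $3,000 if the bug is caught during implementation and $7,500 if it is caught during testing.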
Clearly, it’s harder to rectify issues as a product approaches the
end of its development life cycle. The earlier a bug is introduced
(e.g., during the design phase), the higher its potential impact,
and the more complex it can be to resolve. The changes made for a
bug fix can also affect the application’s functionality; in turn,
developers may need to make further changes to the codebase, adding
to the cost, time, and effort. So it’s important to seek out and
fix bugs during the earliest stages of development.
Consider the example of a bank that finds a security flaw after
releasing an application used by thousands of consumers. If the
bank had found the issue earlier in development, there would still
have been some cost to repair it. But now, the bank will spend
exponentially more effort, time, and money to fix it. Additionally,
the complexity of implementing changes in a live production
environment further increases the overall cost of late-stage
maintenance.