When should you not use automated testing for a web application?

Suppose I am using an interpreted language to power a medium-to-large application, and that the application is tested only manually (no unit tests, no integration tests, etc.). The product will remain supported for the foreseeable future, and new development will happen daily.

Are there situations in which it would be inappropriate to start implementing an automated test suite?

+1




4 answers


Situations in which it would be inappropriate to implement automated unit testing:

  • When the application is at the end of its life cycle and about to be retired
  • When management threatens to fire you for putting extra effort into testing
  • When the existing system is so convoluted and poorly designed that adding tests would only be practical if the software were scrapped and redesigned from scratch
  • When it would make your own position redundant


Other than that, I can't think of any reason not to start adding unit tests. Unit tests make it easier to change and refactor your system, and writing them should be part of a developer's day-to-day work, not an optional extra.
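For illustration, here is a minimal sketch of what a first test around existing code might look like, using Python's standard-library unittest module; the calculate_discount function is a hypothetical stand-in for legacy logic already in the application.

```python
# A minimal characterization test that pins down current behavior before
# any refactoring. calculate_discount is a hypothetical legacy function.
import unittest


def calculate_discount(order_total, is_member):
    # Hypothetical legacy logic whose existing behavior we want to protect.
    if is_member and order_total >= 100:
        return order_total * 0.10
    return 0.0


class TestCalculateDiscount(unittest.TestCase):
    def test_member_over_threshold_gets_ten_percent(self):
        self.assertAlmostEqual(calculate_discount(200, True), 20.0)

    def test_non_member_gets_no_discount(self):
        self.assertEqual(calculate_discount(200, False), 0.0)


if __name__ == "__main__":
    unittest.main()
```

A handful of tests like this around the code you are about to change is usually enough to start; coverage grows as day-to-day development continues.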

+3




Any time you want to guarantee continued work for a group of people doing manual error testing.



+2




Automated testing is just one testing tool. It finds a certain class of defects. Relying only on unit tests, only on automated UI tests, or only on manual testing will not give you the best results.

Automated UI tests are great for catching regressions. They should be run against any reasonably complex application that is under active maintenance.
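As a rough sketch, an automated UI regression check might look like the following, using Selenium's Python bindings. This assumes the selenium package and a browser driver are installed; the URL and element locators are hypothetical.

```python
# A simple UI regression check: log in and verify the dashboard still renders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")

    # Fill in the login form and submit it (hypothetical element IDs).
    driver.find_element(By.ID, "username").send_keys("test-user")
    driver.find_element(By.ID, "password").send_keys("test-pass")
    driver.find_element(By.ID, "submit").click()

    # Regression check: the dashboard heading should still be present.
    heading = driver.find_element(By.TAG_NAME, "h1")
    assert "Dashboard" in heading.text
finally:
    driver.quit()
```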

+1




When automated testing would trigger events or leave data behind in the system that would interfere with its correct operation.
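For context, this concern is usually addressed by pointing tests at a disposable store rather than the live system. The sketch below (Python, standard library only) gives each test its own in-memory SQLite database so no data is left behind; the schema and the insert_user helper are hypothetical.

```python
# Isolating test side effects: each test gets a throwaway in-memory database.
import sqlite3
import unittest


def insert_user(conn, name):
    # Hypothetical data-access function under test.
    conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    conn.commit()


class TestUserInsert(unittest.TestCase):
    def setUp(self):
        # Fresh, isolated database for every test run.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

    def tearDown(self):
        # The in-memory database disappears when the connection closes.
        self.conn.close()

    def test_insert_user_adds_row(self):
        insert_user(self.conn, "alice")
        count = self.conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
        self.assertEqual(count, 1)


if __name__ == "__main__":
    unittest.main()
```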

0

