Green automation? Reduce – reuse – recycle – regression
Regression testing is expensive and its value is difficult to quantify, particularly as its main objective is to find nothing and deduce that everything is all right. It can, however, be considered in terms of costs versus benefits. The primary benefit is straightforward: confidence that existing functionality is not affected by changes. Costs, though, come in three main parts: initial setup, execution and analysis, and maintenance.
Automated regression tests should provide high value during the project's lifetime, especially if some sort of iterative approach is used for development. We can probably assume that the initial set-up cost gets a good return. My primary concern is what happens to the regression suite post-project, when resources are diverted. As changes continue to be deployed and the maintenance priority is downgraded, the suite soon starts to deteriorate.
So the challenge is to keep a well-maintained regression suite that retains currency and doesn't continually drain resources. Can we reduce costs without losing too much benefit?
I believe we can utilise some ideas from environmentalist philosophy to minimise waste from regression testing and maybe even improve quality in the process...
If we think about the two extremes of regression from none to everything, what are we getting from our cost versus benefit analysis? Obviously 'no regression' is of no benefit. Nor is it a viable option unless there are alternative techniques being used (this is a different discussion). But what of the other extreme?
There is an often-heard argument ‘automate everything’ – assuming that by doing so, we will not miss any regression defects.
But even if we have a complete automated suite and therefore no additional set-up cost, the cost of execution and maintenance will likely far outweigh the benefits.
Consider that only a small proportion of regression tests are ever likely to discover a defect, yet every change will probably affect multiple tests that then require updating and re-execution. Can we justify the high cost of maintaining a large suite of tests, the majority of which have limited value?
So how about a compromise? A smaller number of tests that concentrate on the regression essentials – happy paths, common functionality, high-risk and defect-prone areas.
You don't run every test in a manual regression suite every release. Instead, you prioritise and target specific areas. An automated regression suite may have more scripts that are cheaper to run, but if a test is of too low a value to be considered for manual regression, is it still of value just because it's automated?
A reduction in the number of test scripts will clearly have a direct impact on costs. But there are other benefits too. By targeting the regression, we can provide quicker feedback, still have confidence that key functionality is unchanged and save valuable resources for other purposes.
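One lightweight way to build this kind of targeted suite is to tag tests by regression priority and select only the high-value subset on each run. Here is a minimal sketch using pytest markers; the marker names, functions and values are illustrative assumptions, not taken from any particular project.

```python
import pytest

# Hypothetical application code under test (stand-ins for the real system).
def login(user: str, password: str) -> bool:
    """Pretend authentication check used by the happy-path test."""
    return password == "correct-password"

def apply_discount(price: float, rate: float) -> float:
    """Defect-prone pricing logic worth keeping under regression."""
    return round(price * (1 - rate), 2)

# Tag tests by priority so a run can target just the essentials.
@pytest.mark.smoke
def test_login_happy_path():
    # Core journey: every release must keep this working.
    assert login("alice", "correct-password")

@pytest.mark.high_risk
def test_discount_calculation():
    # Historically defect-prone area, so it stays in the targeted suite.
    assert apply_discount(100.0, 0.2) == 80.0
```

With tags in place, the targeted regression run becomes a selection rather than "everything": `pytest -m "smoke or high_risk"` executes only the prioritised tests, giving faster feedback on the key functionality.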
In this context, reuse is about the smart use of test artefacts.
There are plenty of existing examples of reuse in automation with reusable actions and functions. Some tools bundle code into modules that perform specific business functions, simplifying and reducing the need to repeat code. We can also do this with a well-designed automation framework.
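As a sketch of what such reuse looks like in a framework, the repeated steps of a business function can live in one shared module that every test calls, rather than being copied into each script. All the names below are illustrative, not from any specific tool.

```python
from dataclasses import dataclass

@dataclass
class Order:
    """Simple stand-in for an order created by the system under test."""
    customer: str
    items: list
    total: float

def create_order(customer: str, items: dict) -> Order:
    """Reusable business action: one place encodes the 'create order'
    steps, so a change to the flow is fixed once, not in every test."""
    total = round(sum(items.values()), 2)
    return Order(customer=customer, items=list(items), total=total)

# Any test can now reuse the same step instead of repeating the code:
order = create_order("test-customer", {"widget": 9.99, "gadget": 24.50})
assert order.total == 34.49
```

The design point is the one made above: when the business flow changes, maintenance is concentrated in `create_order` rather than scattered across every script that performs the flow.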
This does require discipline from the automator and buy-in from management. Some tools in particular have a higher initial set-up cost, so when the pressure hits, there must be a willingness to continue with this approach to reap the rewards later on.
There is a second benefit. If we can make the automation easier to understand, it becomes more accessible, making handovers simpler and might even encourage the less technically adept to get involved...
One major issue with automated regression testing is that it's usually seen as a separate activity often performed by a separate team.
The functional test team commonly designs the original regression scenarios and, although they are most likely to be the first ones to be notified of changes, they often have little to do with the tests after initial creation.
The separation frequently results in poor communication which quickly leads to duplication of activity, maintenance issues, test redundancy and gaps in coverage.
One way around this is to get the 'manual' functional test team more involved in automated testing and vice versa. Alternatively, could a single team of multi-skilled testers perform both roles?
These test designers could be kept involved with regression by encouraging ownership of the test assets – requirements and scripts.
As owners, the team would be expected to keep them up-to-date and be encouraged to utilise both the scripts and automation for purposes other than strictly regression. For example:
- Selecting tests or suites to be used for smoke testing
- Using regression tests and/or data set-up scripts to help with manual testing
- Making use of common business flow tests for training and orientation purposes
- Using the time saved on maintenance for other purposes
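The data set-up idea in the list above can be sketched as a single routine recycled for two purposes: automated regression imports it, while a manual tester runs the same file as a script before an exploratory session. The account shape and names here are hypothetical.

```python
import json

def seed_test_account(name: str, balance: float) -> dict:
    """Create the canonical test account; imported by the automated
    regression suite as its data set-up step."""
    return {"name": name, "balance": balance, "status": "active"}

if __name__ == "__main__":
    # Manual-testing entry point: the same set-up, run by hand to
    # prepare data before an exploratory or ad-hoc session.
    account = seed_test_account("manual-session-user", 500.0)
    print(json.dumps(account, indent=2))
```

Because both audiences share one routine, a change to the account structure is made once, and manual and automated testing stay in step.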
Keeping in touch with what’s in the regression suites will enable the team to concentrate their own testing in other areas, give them greater confidence that nothing has been missed and prevent duplication of effort.
All this adds to the quality and saves time, helping to provide greater return on investment.
Ok, so it's not really about being environmentally friendly and some of the analogies may be a little tenuous… but there is a serious message.
Less is more. Reducing the size of the regression suite will provide faster feedback about the priority functionality and allow precious resources to be allocated elsewhere, perhaps on targeted exploratory testing.
Intelligent automation. Reuse of assets makes coding simpler, reduces waste and encourages involvement.
It’s more than just regression. Recycle those scripts and ‘share the love’ to help functional testing and get a return in reduced maintenance.
I am currently attempting to implement these ideas in one of my engagements and am already seeing some benefits from the ‘reduce’ and ‘reuse’ elements. The ‘recycle’ aspect is going to be harder and will take a little longer. But I'm looking forward to seeing far fewer red crosses and far more green ticks in my automation results.