Assessing the performance of IR's newly introduced and upgraded services presented a technical testing challenge: understanding the performance impacts, and the customer experience expectations, of its new hardware and software solutions.

Chris Hourigan, Business Transformation – Testing Lead at IR, says: "There are multiple touchpoints to IR systems from individual taxpayers, business partners and the tax agents who interact with IR. Combined, there can be hundreds of thousands of account interactions with our systems on a daily basis, generally at key calendar dates." This is why performance testing is an essential component of quality assurance. "By simulating the full loads expected at those overlapping events, IR has a clear view of how its systems will cope before they go live," he adds.
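The load simulation Hourigan describes can be illustrated with a minimal sketch. This is not IR's actual tooling; the `login` function below is a hypothetical stub standing in for a real service request, and the figures are arbitrary:

```python
# Minimal sketch of generating a peak-load event against a stubbed service.
# The login() stub is hypothetical; a real test would target the system
# under test via its actual protocol.
import time
from concurrent.futures import ThreadPoolExecutor


def login(user_id: int) -> bool:
    """Stand-in for a real login request; sleeps briefly to mimic latency."""
    time.sleep(0.001)
    return True


def run_load(n_users: int, concurrency: int) -> dict:
    """Fire n_users login attempts at the given concurrency and report results."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(login, range(n_users)))
    elapsed = time.perf_counter() - start
    return {
        "requests": n_users,
        "successes": sum(results),
        "throughput_per_s": n_users / elapsed,
    }


stats = run_load(n_users=200, concurrency=50)
print(stats)
```

In practice this shape of harness is scaled up and pointed at pre-production environments, so that capacity limits are found before customers do.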

The purpose of the business transformation, notes Patrick O'Doherty, Enterprise Architecture & Business Design Lead – Business Transformation, is to replace traditionally paper-based activities with digital, real-time interactions. The scale of performance testing these interactions is amplified by the range of IR products available to specific customers, while the underlying structure of the systems adds a further complication.

"Any failure in any of the new systems and services we're rolling out will taint public confidence. There's a substantial reputational risk every time a new service is introduced – and performance testing, as a component of a broader range of assurance services, is the insurance policy addressing that risk."

Chris Hourigan, Business Transformation Testing Lead, Inland Revenue


Assurity created a testing solution that provides 'Performance as a Service' to all delivery activities under the programme. A performance risk assessment approach ascertained actual and perceived risks across the full range of IR products within scope of each phase of the programme delivery cycle.

In fact, the information resulting from performance and customer experience monitoring was made so accessible that monitors were installed on the IR campus, allowing anyone to view the results. "It effectively replaced the water cooler for office conversations. We had real, up-to-the-minute facts, not hearsay," says O'Doherty.

Through the creation of an enduring suite of performance artefacts, IR now has ready access to proven performance testing assets for future work, accelerating delivery and reducing the risk of new products while driving down the cost of assurance.


Thanks to the performance testing strategy and assets, multiple potential issues were identified and resolved prior to reaching production. “The performance testing on the business transformation programme has supported IR in achieving significant gains in system performance for our customers,” says Hourigan.

"On peak days, IR can support over 500,000 logins to myIR – more than three times the peak volumes seen prior to transformation – while the stability and certainty provide a better experience for our customers and our business," he adds.

While some issues did surface post-production, the assets and approach in place ensured they were quickly identified and resolved.

Internally, says Hourigan, performance testing tends to be understood in terms of what happens when everyone tries to get through the front door at the same time. “It’s all peak volumes, soak tests, memory and scalability, break points and throughputs.

“But what we’re getting closer towards is the customer viewpoint into IR; a piece of information on what their experience is going to be, what happens to the system when people behave in certain ways. And not always in ways that we were expecting. It has to be 24/7, now, with digital systems, so performance testing must span it all.”
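That customer viewpoint typically translates into latency percentiles rather than raw throughput: what the slowest meaningful fraction of users actually experiences. A minimal sketch of that kind of measurement, again with a hypothetical stub in place of any real IR request:

```python
# Sketch: record per-request latency and report median and 95th-percentile
# figures, a common customer-experience metric. timed_request() is a stub
# with randomised delay, not a real service call.
import random
import statistics
import time


def timed_request() -> float:
    """Stub request; returns elapsed wall-clock seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # stand-in for real work
    return time.perf_counter() - start


latencies = sorted(timed_request() for _ in range(100))
p95 = latencies[int(0.95 * len(latencies)) - 1]
print(f"median={statistics.median(latencies)*1000:.1f}ms p95={p95*1000:.1f}ms")
```

Tracked continuously, percentiles like these are what turn "24/7 performance testing" from a slogan into a measurable service-level view.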