Determine performance testing success using SLAs

When starting a new cycle of performance testing, there should always be a goal at the end of the cycle; otherwise, how do you know whether the testing was successful? Sometimes the targets are ambiguous, such as requiring the system to support X number of users at peak load without going any deeper than that. By setting quantitative goals and using service level agreements (SLAs), the result of the performance test becomes a definitive pass or fail.

Within IT, the focus is usually on service-based SLAs, which typically relate either to the availability of the service (over time) or to its performance. It is most commonly the latter that can be verified through performance testing, e.g. by setting up SLAs around acceptable wait times for the different actions in a business process. SLAs also provide easy-to-determine benchmarks when introducing changes to the AUT, e.g. when releasing a new version, making sure that we are still within our requirements.

While performance testing is sometimes treated as a checkbox activity after functional UAT, it can be a lot more if it is tied more closely to what the business expects of the service. Again, SLAs help here: by defining SMART business goals for what is expected of the service and then translating those into suitable performance SLAs, it is possible to cover more than just the customer experience. SLAs can also relate to customer retention and revenue targets, e.g. not only whether the system can handle X checkouts per hour with reasonable response times, but whether it can handle Y checkouts without losing any orders. Can the customer search for any item without delays, and can they still do so during the holiday season's peak load?
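As a rough illustration (every goal, metric, load figure, and threshold below is invented for the example, not taken from any real system), such business goals only become testable once each one is pinned to a metric, a load condition, and a threshold:

```python
# Illustrative only: hypothetical business goals translated into
# quantified, testable performance SLAs.
BUSINESS_SLAS = [
    {
        "goal": "Customers can check out quickly, even at peak",
        "metric": "checkout response time, 90th percentile",
        "condition": "10,000 checkouts per hour",
        "threshold": "<= 3.0 seconds",
    },
    {
        "goal": "No orders are lost at peak volume",
        "metric": "failed checkout transactions",
        "condition": "10,000 checkouts per hour",
        "threshold": "0 failures",
    },
    {
        "goal": "Search stays responsive during the holiday peak",
        "metric": "search response time, 90th percentile",
        "condition": "3x normal concurrent users",
        "threshold": "<= 1.0 second",
    },
]
```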

SLA usage in Performance Center and StormRunner Load

Performance Center (LoadRunner Enterprise) and StormRunner Load (LoadRunner Cloud) enable you to define specific SLAs for your performance tests that determine whether a test passed or failed. That result can then be used as a release gate in your DevOps chain, either manually or automatically with CI/CD tools like Jenkins: if the AUT fails any of the SLAs, it won't be deployed.
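As a minimal sketch of such an automated gate (the server URL, endpoint, and JSON field names below are assumptions made for illustration; consult the product's REST API documentation for the actual interface), a CI step could look up the run's SLA status and fail the build when it is anything other than passed:

```python
# Minimal CI gate sketch. The URL and JSON field names below are
# assumptions for illustration only; check the product's REST API
# documentation for the real interface.
import sys

import requests

RUN_STATUS_URL = "https://pc.example.com/api/runs/42"  # hypothetical endpoint


def sla_gate() -> int:
    response = requests.get(RUN_STATUS_URL, timeout=30)
    response.raise_for_status()
    sla_status = response.json().get("slaStatus")  # hypothetical field name
    print(f"Run SLA status: {sla_status}")
    # CI tools treat a non-zero exit code as a failed step, which blocks
    # the deployment stages that follow.
    return 0 if sla_status == "Passed" else 1


if __name__ == "__main__":
    sys.exit(sla_gate())
```

Since CI/CD tools such as Jenkins treat a non-zero exit code as a failed step, any deployment stage placed after the gate is simply never reached.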

When setting up e.g. transaction response time SLAs in a performance test, the numbers should be reasonable. The system will often have been built with some numbers in mind, as per the above, so either the business or the system architects will be able to provide them. Failing that, choosing sensible values or even benchmarking the competition can be a good starting point.

Remember that a well-defined SLA states not only e.g. how fast a certain transaction must be, but also under what conditions, which normally means under a certain load. It should also state how the measurement is made, often either at the 90th percentile or as an average, depending on the type of measurement.

Different transactions in a business process will often have different SLAs, since a login, a search, or a simple click on a link on a web site will consume different amounts of system time and resources, so remember to keep your SLAs realistic.
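To illustrate why the measurement method matters (all response times and thresholds below are made up), this sketch checks two transactions against per-transaction 90th-percentile thresholds. Note how the Search transaction fails on the percentile because of a single slow outlier, even though its average looks healthy:

```python
# Sketch: evaluating per-transaction response-time SLAs from raw samples.
# Transaction names, sample times, and thresholds are invented.
from statistics import mean, quantiles

samples = {  # response times in seconds, per transaction
    "Login":  [0.8, 0.9, 0.9, 1.0, 1.1, 1.1, 1.2, 1.3, 1.4, 1.5],
    "Search": [0.3, 0.4, 0.5, 0.3, 0.6, 0.4, 0.5, 0.4, 0.3, 1.9],
}
thresholds = {"Login": 2.0, "Search": 1.0}  # 90th-percentile limits

for name, times in samples.items():
    p90 = quantiles(times, n=10)[-1]  # last decile cut = 90th percentile
    avg = mean(times)
    verdict = "Passed" if p90 <= thresholds[name] else "Failed"
    print(f"{name}: average={avg:.2f}s, p90={p90:.2f}s -> {verdict}")
```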

SLA setup in Performance Center

To set up an SLA in Performance Center, you select a measurement, a load criterion, and a threshold value. This is done on the Summary page of a test, in the SLA widget.

[Image: NewSLA.png]

In this example, we are setting up a response-time-based SLA, with the option of using either percentile-based or average-based measurement.

[Image: Measurement.png]

If using percentile-based transaction response time, we then select which transactions in our scripts should be used for the SLA calculations, and set the maximum threshold for each transaction.

[Image: Threshold.png]

Once the test has been executed and collated, the results must also be analyzed in order for the SLAs to be calculated. After that, we can find them in the general HTML Report and in the specific SLA Report.

[Image: ViewReport.png]

As seen above, the SLA Status for the run is Failed, and if we open the SLA report, we see that two of the selected transactions have missed the SLA goals, thus failing the test.

[Image: 90th_Percentile.png]

This is also visible in the HTML Report on the Summary page, where the transaction summary is listed.

[Image: HTML_Report.png]

So, in the context of being able to say whether a performance test was successful, using SLAs is certainly one of the easiest ways to do so: once the results of the test have been collated and analyzed, the answer is easily found on the Test Runs tab, or it can be detected automatically if the test was launched by a CI/CD tool.

