3 Key Performance Testing Metrics Every Tester Should Know | Abstracta (2024)

Dive into the core of performance testing metrics and discover the importance of accurate analysis in ensuring optimal system performance. Through simple explanations, we will guide you toward making informed decisions in your testing endeavors and identifying performance bottlenecks.


Unlocking the full potential of performance testing means delving into the performance metrics that matter. It’s not enough to run tests and gather data; the real power lies in accurate analysis, empowering you to make informed decisions and boost your system’s performance.

Embarking on performance testing unveils three crucial metrics: Average, Standard Deviation, and Percentiles. Each offers unique insights, painting a comprehensive picture of system performance.

Through a thoughtful analysis of these metrics, we lay the foundation for enhanced system responsiveness.

Looking for a Performance Testing Partner? Explore our Performance Testing Services! Our global client reviews on Clutch speak for themselves.

Making Sense of The Average, Standard Deviation, and Percentiles in Performance Testing Reports

There are certain performance testing metrics that are essential to understand properly in order to draw the right conclusions from your test results. These performance metrics require some basic understanding of math and statistics, but nothing too complicated.

The issue is that if you don’t understand well what each one means or what they represent, you’ll come to some very wrong conclusions.

In this post, we want to focus on average response time, standard deviation, and percentiles. Without going into a lot of math, we’ll discuss these metrics and their usefulness when analyzing performance results.

Want to learn all about performance testing? Don’t miss our Continuous Performance Testing Comprehensive Guide

The Importance of Analyzing Data as a Graph

The first time we thought about this subject was during a course that Scott Barber gave in 2008 (when we were just starting up Abstracta), on his visit to Uruguay. He showed us a table with values like this:

(Table: response time samples for data sets A, B, and C)

He asked us which data set we thought had the best performance, which is not quite as easy to discern as when you display the data in a graph:

(Graph: Set A response times over time)

In Set A, you can tell there was a peak, but then it recovers.

(Graph: Set B response times over time)

In Set B, it seems that it started out with a very poor response time; probably 20 seconds into the test, the system collapsed and began responding with an error page, which was then resolved within a second.

(Graph: Set C response times over time)

Finally, in Set C, it’s clear that as time passed, the system performance continued to degrade.

Barber’s aim with this exercise was to show that it’s much easier to analyze information when it’s presented in a graph. In addition, the table summarizes the information, while the graphs show every point. With more data points visible, we gain a clearer picture of what is going on.

Interested in data analysis? We invite you to read this article: Data Observability: What It Is and Why It Matters.

Understanding Key Performance Testing Metrics

Okay, now let’s go through the performance testing metrics one by one and look at why each matters for analysis. Metrics like server CPU usage or CPU capacity utilized can also provide insight into how efficiently the system is processing requests, but here we’ll concentrate on response-time statistics.

Average Response Time

To calculate the average, simply add up all the values of the samples and then divide that sum by the number of samples.

Let’s say we do this and our resulting average response time is 3 seconds. The problem is that, at face value, this gives you a false sense that all response times are about three seconds, some a little more and some a little less, but that might not be the case.

Imagine we had three samples, the first two with a response time of one second, the third with a response time of seven:

1 + 1 + 7 = 9

9/3 = 3

This simple example shows that three very different values can average out to three, even though none of the individual values is anywhere close to 3.
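The three-sample example above can be reproduced in a couple of lines of Python (a minimal sketch using the same values as in the text):

```python
# Two fast requests and one slow one, in seconds.
samples = [1, 1, 7]

average = sum(samples) / len(samples)
print(average)  # 3.0 -- looks acceptable, yet no request actually took ~3 s
```

The average alone hides the fact that one request took seven times longer than the others.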

Fabian Baptista, co-founder and member of Abstracta’s board, made a funny comment related to this:

“If I were to put one hand in a bucket of water at -100 degrees Fahrenheit and another hand in a bucket of burning lava, on average, my hand temperature would be fine, but I’d lose both of my hands.”

So, when analyzing average response time, it’s possible to have a result that’s within the acceptable level, but be careful with the conclusions you reach.

That’s why it is not recommended to define service level agreements (SLAs) using averages; instead, use something like “The service must respond in less than 1 second for 99% of cases.” We’ll see more about this later with the percentile metric.

Don’t miss this Quality Sense Podcast episode about why observability is so relevant in software testing, with Federico Toledo and Lisa Crispin.

Standard Deviation

Standard deviation is a measure of dispersion around the average: how much the values vary from their average, or how far apart from it they are.

If the standard deviation is small, the sample values are all close to the average; if it’s large, they are spread out over a greater range.

To understand how to interpret this value, let’s look at a couple of examples.

If all the values are equal, the standard deviation is 0. If the values are very scattered, for example, 9 samples with values from 1 to 9 (1, 2, 3, 4, 5, 6, 7, 8, 9), the standard deviation is ~2.6 (you can use this online calculator to compute it).
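Both cases above can be checked with Python’s standard library (a sketch using `statistics.pstdev`, the population standard deviation, which matches the ~2.6 figure):

```python
from statistics import pstdev

# All values equal: no dispersion at all.
print(pstdev([5, 5, 5, 5]))   # 0.0

# Scattered values 1..9: mean is 5, values spread widely around it.
print(pstdev(range(1, 10)))   # ~2.58
```

Note that the sample standard deviation (`statistics.stdev`) divides by n-1 instead of n, so it gives a slightly larger value (~2.74) for the same data.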

Although including the standard deviation greatly improves on the average as a metric, percentile values are more useful still.

Percentiles: p90, p95, and p99

Understanding percentiles is crucial for accurate system performance analysis.

Let’s break down what percentiles like the 90th percentile (p90), p95, and p99 mean and how they can be used effectively in performance tests.

What Are Percentiles?

A percentile is a valuable performance testing metric that gives the value under which a given percentage of the sample falls. This helps in understanding the distribution of response times and other performance metrics.

The 90th Percentile (p90)

The 90th percentile (p90) indicates that 90% of the sample values are below this threshold, while the remaining 10% are above it. This is useful for identifying the majority of user experiences and ensuring that most users have acceptable response times.

The 95th Percentile (p95)

The 95th percentile (p95) shows that 95% of the sample values fall below this threshold, with the remaining 5% above it. This provides a more stringent measure of performance, ensuring that nearly all users have a good experience.

The 99th Percentile (p99)

The 99th percentile (p99) represents the value below which 99% of the sample falls, leaving only 1% above it. This is particularly useful for identifying outliers and ensuring that even the worst-case scenarios are within acceptable limits.
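As a sketch of how such a threshold can be computed, here is the nearest-rank method in plain Python (the response times are made up for illustration; real tools such as JMeter use interpolation methods that can give slightly different values):

```python
import math

def percentile(samples, p):
    # Nearest-rank method: the smallest value such that at least
    # p percent of the samples are less than or equal to it.
    data = sorted(samples)
    rank = math.ceil(p / 100 * len(data))
    return data[rank - 1]

# Hypothetical response times in seconds for 10 requests.
times = [0.8, 0.9, 1.0, 1.0, 1.1, 1.2, 1.3, 1.5, 2.0, 6.0]

print(percentile(times, 90))  # 2.0 -- 90% of requests took 2.0 s or less
print(percentile(times, 99))  # 6.0 -- only the slowest request sits at p99
```

Notice how the single 6-second outlier dominates p99 but leaves p90 untouched, which is exactly why looking at several percentiles is informative.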

Why Use Multiple Percentiles?

Analyzing multiple percentile values, such as p90, p95, and p99, provides a more detailed view of system performance. Tools like JMeter and Gatling include these in their reports, allowing teams to calculate percentile scores using different methods. This comprehensive approach helps in identifying performance bottlenecks and understanding how the system behaves under various conditions.

Complementing Percentiles with Other Metrics

To get a complete picture, teams should complement percentiles with other metrics like minimum, maximum, and average values. For example:

  • p100: Represents the maximum value (100% of the data is below this value).
  • p50: Known as the median (50% of the data is below and 50% is above).

Establishing Acceptance Criteria

Teams often use percentiles to establish acceptance criteria. For instance, setting a requirement that 90% of the sample should be below a certain value helps in ruling out outliers and enabling consistent system performance. This is particularly useful in identifying issues related to memory utilization and other critical performance aspects.
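An acceptance check of the kind described above can be sketched in a few lines (the SLA threshold and measurements here are hypothetical):

```python
SLA_SECONDS = 1.0  # hypothetical requirement: 90% of requests under 1 second

measured = [0.4, 0.5, 0.6, 0.6, 0.7, 0.7, 0.8, 0.9, 0.95, 3.2]

# Fraction of requests meeting the SLA; a single 3.2 s outlier
# does not fail the check on its own.
within = sum(1 for t in measured if t <= SLA_SECONDS) / len(measured)
print(within >= 0.90)  # True -- 9 of 10 requests are under 1 second
```

Framing the criterion as "at least 90% of samples under the threshold" makes the pass/fail decision explicit and robust to outliers, unlike a check on the average.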

By focusing on the percentile score, teams can make more informed decisions and optimize their performance tests to achieve better results.

Need help with percentiles? Explore our Performance Testing Services! Our global client reviews on Clutch speak for themselves.

Careful with Performance Testing Metrics

Before you go analyzing your next performance test’s results,make sure to remember these key considerations:

1. Avoid Averages

Never treat the average as “the” value to pay attention to, since it can be deceiving: it often hides important information.

2. Check Standard Deviation

Consider the standard deviation to know just how useful the average is: the higher the standard deviation, the less meaningful the average becomes.

3. Use Percentile Values

Observe the percentile values and define acceptance criteria based on them, keeping in mind that if you select the 90th percentile, you’re basically saying, “I don’t care if 10% of my users experience bad response times.”

If you are interested in learning about the best continuous performance testing practices for improving your system’s performance, we invite you to read this article.

What other considerations and performance issues do you have when analyzing performance testing metrics? Let us know!

Reaching for open-source software, or a free performance load testing tool? Get to know JMeter .Net DSL, one of the leading open-source performance testing tools, bridging JMeter and .NET to enhance software quality, efficiency, and reliability.

How We Can Help You

With over 16 years of experience and a global presence, Abstracta is a leading technology solutions company specializing in end-to-end software testing services and AI software development.

We believe that actively bonding ties propels us further and helps us enhance our clients’ software. That’s why we’ve forged robust partnerships with industry leaders like Microsoft, Datadog, Tricentis, and Perforce, empowering us to incorporate cutting-edge technologies.

We craft strategies meticulously tailored to your unique needs and goals, aligning with your core values and business priorities. Our holistic approach enables us to support you across the entire software development life cycle.

Embrace agility and cost-effectiveness. Visit our Performance Testing Services page and contact us to discuss how we can help you improve your system’s performance.

Follow us on Linkedin & X to be part of our community!

Recommended for You

TOP 10 Best Performance Testing Tools

API Load Testing

Cost vs. Value: Analyzing the ROI of Outsourcing Application Testing Services
