Performance testing starts with setting a baseline and defining, together with the development team, the metrics to track. Nikolay Avramov advises executing performance tests and comparing the results frequently during development to spot degrading performance as soon as possible.
Avramov, an automation lead at Automate The Planet, spoke about performance testing at QA Challenge Accepted 2022.
A common approach Avramov sees is to develop a product, do functional testing on it, pass it through user acceptance, and only then check whether it works "fine" under the expected loads. By starting to test at the end of the development process, teams miss out on information they could have acted on earlier, Avramov stated.
According to Avramov, having a trend of results over time is crucial for performance testing:
Performance testing should be planned before development and executed during it. Knowing the types of performance tests that need to be developed, the team should identify metrics to track over the course of the project and define a baseline for their system. Throughout development, there is always a way to measure the product.
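As a minimal illustration of tracking such a trend (not from Avramov's talk; the endpoint, baseline value, and file name below are hypothetical), a small script run on every build could append response times to a log and flag regressions against the agreed baseline:

```python
# Minimal sketch: record a response-time trend and flag baseline regressions.
# The endpoint, baseline, and file name are hypothetical placeholders.
import csv
import datetime

import requests

ENDPOINT = "https://staging.example.com/api/home"  # hypothetical endpoint
BASELINE_MS = 300  # baseline agreed with the team, in milliseconds

def record_response_time(trend_file="perf_trend.csv"):
    response = requests.get(ENDPOINT, timeout=10)
    # response.elapsed measures time until the response headers arrived
    elapsed_ms = response.elapsed.total_seconds() * 1000
    # Append one data point per run so the results form a trend over time
    with open(trend_file, "a", newline="") as f:
        csv.writer(f).writerow([datetime.datetime.now().isoformat(), round(elapsed_ms)])
    if elapsed_ms > BASELINE_MS:
        print(f"Regression: {elapsed_ms:.0f} ms exceeds the {BASELINE_MS} ms baseline")

if __name__ == "__main__":
    record_response_time()
```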
During a project, bottlenecks can chain together and become harder and harder to fix. And even when we do fix them, the fixes can introduce regression issues all over the system, Avramov mentioned.
Each application has its limits and specifics, Avramov said. The first goal of performance testing, according to him, is to define what those limits are and what the "idle" performance is from the client’s standpoint:
The question is, what are we trying to improve?
- If it’s the load time of a home page, there is a set of metrics and tests that we can start running and track their results.
- If it’s the number of concurrent users the system can withstand, then we need to configure a set of load tests to perform and analyse the results after each change (see the sketch after this list).
- If it’s a business scenario that takes too much time, the problem-solving might require profiling a database query or a series of web requests.
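As a sketch of the concurrent-users case, a load test could be expressed in Locust, a Python load-testing tool. The routes and weights below are illustrative assumptions, not part of Avramov's talk:

```python
# Illustrative Locust scenario for the concurrent-users case above.
# The routes below are hypothetical; real tests should cover your key journeys.
from locust import HttpUser, task, between

class WebsiteUser(HttpUser):
    wait_time = between(1, 3)  # each simulated user pauses 1-3 s between tasks

    @task(3)  # weighted: the home page is requested three times as often
    def home_page(self):
        self.client.get("/")

    @task(1)
    def business_scenario(self):
        # A slower business flow, e.g. a search followed by a detail view
        self.client.get("/search?q=widgets")
        self.client.get("/products/42")
```

Running it with, for example, `locust -f loadtest.py --host https://staging.example.com --headless --users 200 --spawn-rate 10` applies the load and reports response-time metrics that can be compared after each change.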
Finding performance bottlenecks always involves teamwork across multiple roles, departments, and system components, Avramov said. No one has all the answers in one place, so part of the job is to connect all these people and channel their knowledge and energy into solving the problem.
Performance testing is an iterative process where constant improvements need to be made, both to the software and to the tests themselves, Avramov concluded.
InfoQ interviewed Nikolay Avramov about performance testing.
InfoQ: How would you define performance testing?
Nikolay Avramov: Performance testing is about asserting that the system works efficiently and meets expectations in terms of reliability, speed, responsiveness, and the ability to withstand peak loads.
This type of testing can be done on multiple layers of the system to uncover different problems with the setup, configuration, and client-side code, as well as bottlenecks when exposed to higher loads.
Performance testing is not only about the server response time of our managed system. It’s also about the experience of the user while working with the software. Client-side performance tests can uncover issues with third-party systems or integrations that can harm the overall look and feel of the system.
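A client-side check of that experience could, for instance, read the browser's own Navigation Timing data through Selenium. This is a hedged sketch with a hypothetical URL, not a method Avramov prescribes:

```python
# Sketch: capture client-side timings via the browser's Navigation Timing API.
# The URL is a hypothetical placeholder.
from selenium import webdriver

driver = webdriver.Chrome()
try:
    driver.get("https://staging.example.com/")
    # Ask the browser for its own measurements of the completed navigation
    timing = driver.execute_script(
        "const nav = performance.getEntriesByType('navigation')[0];"
        "return {ttfb: nav.responseStart, loaded: nav.loadEventEnd};"
    )
    print(f"Time to first byte:  {timing['ttfb']:.0f} ms")
    print(f"Load event finished: {timing['loaded']:.0f} ms")
finally:
    driver.quit()
```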
InfoQ: How are performance testing and load testing related?
Avramov: Load testing is actually performance testing under simulated load. The goal is to capture performance metrics while the system is near or over its expected load levels.
Load testing has its own sub-types, like spike, soak, and endurance testing. We can reconfigure the parameters of the load applied to the system to simulate different real-world scenarios.
We can do performance testing even without applying load. This would be capturing the performance metrics from the server and from the execution of individual web requests.
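As an illustration of reconfiguring load parameters to simulate real-world scenarios, Locust (to pick one tool) lets a test reshape its load over time. The sketch below simulates a spike; all numbers are made-up example values:

```python
# Illustrative spike profile using Locust's LoadTestShape.
# Pairs with a user class such as WebsiteUser from the earlier sketch.
from locust import LoadTestShape

class SpikeShape(LoadTestShape):
    def tick(self):
        run_time = self.get_run_time()  # seconds since the test started
        if run_time < 60:
            return (10, 10)   # normal traffic: 10 concurrent users
        if run_time < 120:
            return (200, 50)  # sudden spike: ramp quickly to 200 users
        if run_time < 300:
            return (10, 50)   # recovery: drop back to normal traffic
        return None           # end the test
```

Swapping the durations and user counts turns the same shape into a soak or endurance profile.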
InfoQ: What’s your advice to teams that want to improve the way that they are doing performance testing?
Avramov: There are many types of performance testing that can be performed, and they all could be beneficial if we know how to use them and what to expect. So my advice is to educate yourself on the purpose of each of these, sit together with the team, and decide what is applicable in your case. The implementation comes next; it can be done in many different ways, but the most important work is done within the team.
Most of the teams that have performance tests focus only on load testing. My advice is to think about the users: each point on your graph could be a potential customer lost because of lousy performance. Do not forget about client-side performance during load testing.