Abstract
Performance regression testing checks a new version of a software system for performance regressions. It is an important phase in the software development process, yet it is very time consuming and usually allotted little time. A typical test run outputs thousands of performance counters, which testers usually have to inspect manually to identify performance regressions. In this paper, we propose an approach to analyze performance counters across test runs using a statistical process control technique called control charts. We evaluate our approach using the historical data of a large software team as well as an open-source software project. The results show that our approach can accurately identify performance regressions in both software systems. Feedback from practitioners is very promising due to the simplicity and ease of explanation of the results.
Authors (6 in total) include Thanh H. D. Nguyen (Queen's University), Bram Adams (Polytechnique Montréal), and Ahmed E. Hassan (Queen's University).
Trubin et al. [18] proposed the use of control charts for in-field monitoring of software systems whose performance counters fluctuate according to the input load. A control chart can automatically detect when a counter deviates beyond its control limits, at which point the operator can be alerted. This use of control charts for monitoring inspires us to explore them for the study of performance counters in performance regression tests. A control chart built from the counters of previous test runs may be able to detect “out of control” behaviours, i.e., deviations, in the new test run.
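To make this idea concrete, the following is a minimal Python sketch of such a scheme, not the paper's implementation: the control limits are the conventional center line plus/minus three standard deviations of the baseline counters, and the function names, counter data, and sigma width are illustrative assumptions.

import numpy as np

def control_limits(baseline_runs, sigma=3.0):
    # Derive the center line and control limits from counter samples
    # collected in previous (known-good) test runs.
    # baseline_runs: list of 1D arrays, one per past run.
    # sigma: width of the limits in standard deviations (3 is the
    # conventional choice; illustrative here).
    samples = np.concatenate(baseline_runs)
    center = samples.mean()
    spread = samples.std()
    return center - sigma * spread, center, center + sigma * spread

def violation_ratio(new_run, lcl, ucl):
    # Fraction of the new run's counter samples falling outside the
    # control limits, i.e. "out of control" behaviour.
    new_run = np.asarray(new_run)
    outside = (new_run < lcl) | (new_run > ucl)
    return outside.mean()

# Hypothetical usage: CPU-utilization counters from two past test
# runs and one new run with a suspected regression.
baseline = [np.random.normal(40, 5, 500), np.random.normal(41, 5, 500)]
lcl, cl, ucl = control_limits(baseline)
ratio = violation_ratio(np.random.normal(55, 5, 500), lcl, ucl)
print(f"limits=({lcl:.1f}, {ucl:.1f}), violation ratio={ratio:.2f}")

A high violation ratio in the new run would then signal the “out of control” behaviour described above.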
...
[18] I. Trubin. Capturing workload pathology by statistical exception detection system. In Computer Measurement Group (CMG), 2005.