PerfLearner: Learning from bug reports to understand and generate performance test frames

Xue Han, Tingting Yu, David Lo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

24 Scopus citations

Abstract

Software performance is important for ensuring the quality of software products. Performance bugs, defined as programming errors that cause significant performance degradation, can lead to slow systems and poor user experience. While there has been some research on automated performance testing, such as test case generation, the main idea is to select workload values that increase program execution time. These techniques often assume the initial test cases have the right combination of input parameters and focus on evolving the values of certain input parameters. However, such an assumption may not hold for highly configurable real-world applications, in which the space of input-parameter combinations can be very large. In this paper, we manually analyze 300 bug reports from three large open-source projects: Apache HTTP Server, MySQL, and Mozilla Firefox. We found that 1) exposing performance bugs often requires combinations of multiple input parameters, and 2) certain input parameters are frequently involved in exposing performance bugs. Guided by these findings, we designed and evaluated an automated approach, PerfLearner, to extract execution commands and input parameters from the descriptions of performance bug reports and use them to generate test frames that guide actual performance test case generation.
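The abstract's notion of a "test frame" pairs an execution command with the input parameters mentioned in a bug report. As a rough illustration of that idea (not the paper's actual algorithm, which uses natural-language analysis of bug reports), the following sketch extracts a command and known parameter flags from a report description with simple pattern matching; the report text and parameter vocabulary are hypothetical.

```python
import re

# Hypothetical vocabulary of input-parameter flags mined from past bug reports.
KNOWN_PARAMS = {"--max-connections", "--query-cache-size", "--threads"}

def extract_test_frame(report_text):
    """Build a test frame (command + parameters) from a bug-report description.

    Naive heuristic sketch: treat the first backtick-quoted token as the
    execution command, and collect any known parameter flags mentioned
    anywhere in the text.
    """
    cmd_match = re.search(r"`(\w[\w./-]*)`", report_text)
    command = cmd_match.group(1) if cmd_match else None
    params = sorted(p for p in KNOWN_PARAMS if p in report_text)
    return {"command": command, "parameters": params}

# Illustrative report text, not taken from any real bug tracker.
report = ("Running `mysqld` with --max-connections set high and "
          "--query-cache-size enabled causes severe slowdown.")
frame = extract_test_frame(report)
print(frame)
# → {'command': 'mysqld', 'parameters': ['--max-connections', '--query-cache-size']}
```

A frame like this does not fix concrete workload values; it only constrains which command and parameter combination a downstream test-case generator should exercise, which matches the paper's finding that the parameter combination, not just the values, is often what exposes a performance bug.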

Original language: English
Title of host publication: ASE 2018 - Proceedings of the 33rd ACM/IEEE International Conference on Automated Software Engineering
Editors: Christian Kästner, Marianne Huchard, Gordon Fraser
Pages: 17-28
Number of pages: 12
ISBN (Electronic): 9781450359375
DOIs
State: Published - Sep 3, 2018
Event: 33rd IEEE/ACM International Conference on Automated Software Engineering, ASE 2018 - Montpellier, France
Duration: Sep 3, 2018 - Sep 7, 2018

Publication series

Name: ASE 2018 - Proceedings of the 33rd ACM/IEEE International Conference on Automated Software Engineering

Conference

Conference: 33rd IEEE/ACM International Conference on Automated Software Engineering, ASE 2018
Country/Territory: France
City: Montpellier
Period: 9/3/18 - 9/7/18

Bibliographical note

Funding Information:
This research is supported in part by the NSF grant CCF-1652149.

Publisher Copyright:
© 2018 Association for Computing Machinery.

Keywords

  • Performance bugs
  • Software mining
  • Software testing

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
