Reproducing performance bug reports in server applications: The researchers’ experiences

Xue Han, Daniel Carroll, Tingting Yu

Research output: Contribution to journal › Article › peer-review



Performance is one of the key non-functional qualities, as performance bugs can cause significant performance degradation and lead to poor user experiences. While bug reports are intended to help developers understand and fix bugs, they are also extensively used by researchers to find benchmarks for evaluating testing and debugging approaches. Although researchers spend considerable time and effort searching bug repositories for usable performance bugs, they often obtain only a few. Reproducing performance bugs is difficult even for bugs confirmed by developers with domain knowledge. The information disclosed in a bug report may not always be sufficient for researchers to reproduce the performance bug, which hinders the usability of bug repositories as a source of benchmarks. In this paper, we study the characteristics of confirmed performance bugs by reproducing them using only the information available in the bug reports, in order to examine the challenges of bug reproduction from the researchers’ perspective. We spent more than 800 hours over the course of six months studying and attempting to reproduce 93 confirmed performance bugs, randomly sampled from two large-scale open-source server applications. We (1) studied the characteristics of the reproduced performance bug reports; (2) summarized the causes of failed-to-reproduce performance bug reports from the researchers’ perspective by reproducing bugs that had been resolved in their bug reports; (3) shared our experience with suggesting workarounds to improve the bug-reproduction success rate; and (4) delivered a virtual machine image containing a set of 17 ready-to-execute performance bug benchmarks. The findings of our study provide guidance and a set of suggestions to help researchers understand, evaluate, and successfully replicate performance bugs.

Original language: English
Pages (from-to): 268-282
Number of pages: 15
Journal: Journal of Systems and Software
State: Published - Oct 2019

Bibliographical note

Funding Information:
This research is supported in part by the National Science Foundation grant CCF-1652149.

Publisher Copyright:
© 2019 Elsevier Inc.


Keywords

  • Bug characteristics study
  • Experience report
  • Performance bug reproduction

ASJC Scopus subject areas

  • Software
  • Information Systems
  • Hardware and Architecture

