Guidelines for Benchmarking Automated Software Traceability Techniques

Yonghee Shin, Jane Huffman Hayes, Jane Cleland-Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

17 Scopus citations

Abstract

To comparatively evaluate automated traceability solutions, we need to develop standardized benchmarks. However, there is currently no consensus on how a benchmark should be constructed and used to evaluate competing techniques. In this paper we discuss recurring problems in evaluating traceability techniques, identify essential properties that evaluation methods should possess, and provide guidelines for benchmarking software traceability techniques. We illustrate the properties and guidelines using an empirical evaluation of three software traceability techniques on nine data sets.

Original language: English
Title of host publication: Proceedings - 2015 IEEE/ACM 8th International Symposium on Software and Systems Traceability, SST 2015
Pages: 61-67
Number of pages: 7
ISBN (Electronic): 9780769555935
DOIs
State: Published - Aug 5 2015
Event: 8th IEEE/ACM International Symposium on Software and Systems Traceability, SST 2015 - Florence, Italy
Duration: May 17 2015 → …

Publication series

Name: Proceedings - 2015 IEEE/ACM 8th International Symposium on Software and Systems Traceability, SST 2015

Conference

Conference: 8th IEEE/ACM International Symposium on Software and Systems Traceability, SST 2015
Country/Territory: Italy
City: Florence
Period: 5/17/15 → …

Bibliographical note

Publisher Copyright:
© 2015 IEEE.

Keywords

  • Traceability
  • benchmarks
  • evaluation metrics
  • measurement

ASJC Scopus subject areas

  • Software
