Abstract
Empirical validation is an important component of sound requirements engineering research. Researchers commonly develop a gold standard, or answer set, against which to compare techniques that they themselves developed, in order to calculate common measures such as recall and precision. Because the same researchers build both the gold standard and the technique measured against it, this practice poses threats to validity. To help address these threats and reduce bias, we introduce a prototype of Multi-user Input in Determining Answer Sets (MIDAS), a web-based tool that permits communities of researchers to jointly determine the gold standard for a given research data set. To date, the tool permits community members to add items to the answer set, vote on items, comment on items, and view the latest status of community opinion on answer set items. It currently supports traceability data sets and classification data sets.
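To make concrete how recall and precision are computed against an answer set, the sketch below compares a technique's proposed items (here, hypothetical requirement-to-code trace links; the identifiers are illustrative, not from the paper) to a community-agreed gold standard:

```python
def precision_recall(retrieved, gold):
    """Compute precision and recall of a retrieved set against a gold standard.

    precision = |retrieved ∩ gold| / |retrieved|
    recall    = |retrieved ∩ gold| / |gold|
    """
    retrieved, gold = set(retrieved), set(gold)
    true_positives = len(retrieved & gold)
    precision = true_positives / len(retrieved) if retrieved else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical trace links proposed by a technique vs. the answer set.
proposed = {("R1", "C3"), ("R2", "C5"), ("R4", "C1")}
answer_set = {("R1", "C3"), ("R2", "C5"), ("R3", "C2"), ("R5", "C4")}
p, r = precision_recall(proposed, answer_set)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.67 recall=0.50
```

A biased answer set distorts both numbers in the same way for every technique evaluated against it, which is the validity threat MIDAS aims to mitigate by crowdsourcing the gold standard.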
Original language | English |
---|---|
Title of host publication | Proceedings - 2018 IEEE 26th International Requirements Engineering Conference, RE 2018 |
Editors | Daniel Amyot, Walid Maalej, Guenther Ruhe |
Pages | 496-497 |
Number of pages | 2 |
ISBN (Electronic) | 9781538674185 |
DOIs | |
State | Published - Oct 12 2018 |
Event | 26th IEEE International Requirements Engineering Conference, RE 2018 - Banff, Canada; Duration: Aug 20 2018 → Aug 24 2018 |
Publication series
Name | Proceedings - 2018 IEEE 26th International Requirements Engineering Conference, RE 2018 |
---|---|
Conference
Conference | 26th IEEE International Requirements Engineering Conference, RE 2018 |
---|---|
Country/Territory | Canada |
City | Banff |
Period | 8/20/18 → 8/24/18 |
Bibliographical note
Publisher Copyright: © 2018 IEEE.
Keywords
- Answer set
- Data set
- Evaluation
- Gold standard
ASJC Scopus subject areas
- Engineering (miscellaneous)
- Safety, Risk, Reliability and Quality