Measuring Requirement Quality to Predict Testability

Jane Huffman Hayes, Wenbin Li, Tingting Yu, Xue Han, Mark Hays, Clinton Woodson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Scopus citations

Abstract

Software bugs contribute to the cost of ownership for consumers in a software-driven society and can potentially lead to devastating failures. Software testing, including functional testing and structural testing, remains a common method for uncovering faults and assessing the dependability of software systems. To make testing effective, the developed artifacts (requirements, code) must be designed to be testable. Prior work has developed many approaches to address the testability of code as applied to structural testing, but to date no work has considered approaches for assessing and predicting the testability of requirements to aid functional testing. In this work, we address requirement testability from the perspective of requirement understandability and quality, using a machine learning and statistical analysis approach. We first use requirement measures to empirically investigate the relationship between each measure and requirement testability. We then assess which requirement measures are relevant for predicting requirement testability. We examined two datasets, each consisting of requirement and code artifacts. We found that several measures help distinguish between testable and non-testable requirements, and found anecdotal evidence that a learned model of testability can be used to guide the evaluation of requirements for other (non-trained) systems.
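The abstract's two-step approach (per-measure correlation analysis followed by supervised classification learning) can be sketched roughly as follows. This is an illustrative Python sketch, not the authors' implementation: the measure names (word_count, weak_phrase_count, readability), the toy data, and the choice of logistic regression are assumptions made for the example.

    import pandas as pd
    from scipy.stats import pointbiserialr
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Hypothetical per-requirement quality measures with a human-assigned
    # testability label (1 = testable, 0 = non-testable).
    data = pd.DataFrame({
        "word_count":        [12, 45, 18, 60, 15, 52, 20, 48, 14, 55],
        "weak_phrase_count": [0, 3, 1, 4, 0, 3, 1, 5, 0, 4],
        "readability":       [72, 40, 65, 35, 70, 42, 68, 30, 75, 38],
        "testable":          [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    })

    # Step 1: correlate each measure with the testability label
    # (point-biserial correlation, since the label is binary).
    for measure in ["word_count", "weak_phrase_count", "readability"]:
        r, p = pointbiserialr(data["testable"], data[measure])
        print(f"{measure}: r = {r:.2f}, p = {p:.3f}")

    # Step 2: train a supervised classifier on the measures and evaluate it
    # on held-out requirements.
    X = data[["word_count", "weak_phrase_count", "readability"]]
    y = data["testable"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)
    model = LogisticRegression().fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))

In the paper, the measures are computed from the requirement text of two real systems and the labels come from analyst assessment; the sketch only shows the shape of the workflow.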

Original language: English
Title of host publication: 2nd International Workshop on Artificial Intelligence for Requirements Engineering, AIRE 2015 - Proceedings
Editors: Rachel Harrison, Nelly Bencomo, Jane Cleland-Huang, Jin Guo
Pages: 1-8
Number of pages: 8
ISBN (Electronic): 9781509001255
DOIs
State: Published - Nov 25 2015
Event: 2nd International Workshop on Artificial Intelligence for Requirements Engineering, AIRE 2015 - Ottawa, Canada
Duration: Aug 24 2015 → …

Publication series

Name: 2nd International Workshop on Artificial Intelligence for Requirements Engineering, AIRE 2015 - Proceedings

Conference

Conference: 2nd International Workshop on Artificial Intelligence for Requirements Engineering, AIRE 2015
Country/Territory: Canada
City: Ottawa
Period: 8/24/15 → …

Bibliographical note

Publisher Copyright:
© 2015 IEEE.

Keywords

  • code testability
  • correlation analysis
  • human analyst
  • machine learning
  • requirement testability
  • statistical analysis
  • subjective assessment
  • supervised classification learning

ASJC Scopus subject areas

  • Safety, Risk, Reliability and Quality
  • Artificial Intelligence
