Prediction scoring of data-driven discoveries for reproducible research

Anna L. Smith, Tian Zheng, Andrew Gelman

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Predictive modeling uncovers knowledge and insights regarding a hypothesized data generating mechanism (DGM). Results from different studies on a complex DGM, derived from different data sets, and using complicated models and algorithms, are hard to quantitatively compare due to random noise and statistical uncertainty in model results. This has been one of the main contributors to the replication crisis in the behavioral sciences. The contribution of this paper is to apply prediction scoring to the problem of comparing two studies, such as can arise when evaluating replications or competing evidence. We examine the role of predictive models in quantitatively assessing agreement between two datasets that are assumed to come from two distinct DGMs. We formalize a distance between the DGMs that is estimated using cross validation. We argue that the resulting prediction scores depend on the predictive models created by cross validation. In this sense, the prediction scores measure the distance between DGMs, along the dimension of the particular predictive model. Using human behavior data from experimental economics, we demonstrate that prediction scores can be used to evaluate preregistered hypotheses and provide insights comparing data from different populations and settings. We examine the asymptotic behavior of the prediction scores using simulated experimental data and demonstrate that leveraging competing predictive models can reveal important differences between underlying DGMs. Our proposed cross-validated prediction scores are capable of quantifying differences between unobserved data generating mechanisms and allow for the validation and assessment of results from complex models.

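The abstract describes estimating a distance between two data generating mechanisms (DGMs) via cross-validated prediction scores: a predictive model is fit on folds of one data set and scored both on its own held-out folds and on the second data set. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the choice of a linear model, mean squared error as the scoring rule, and all function and variable names are assumptions made here for concreteness.

```python
# Minimal sketch of cross-validated prediction scoring for comparing two data
# sets assumed to come from distinct DGMs. Illustrative only; the scoring rule
# and model family are assumptions, not the method as published.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import KFold


def cv_prediction_scores(X_a, y_a, X_b, y_b, n_splits=5, seed=0):
    """Fit a predictive model on folds of data set A and score it on the
    held-out fold of A (within-study) and on data set B (cross-study)."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    within, across = [], []
    for train_idx, test_idx in kf.split(X_a):
        model = LinearRegression().fit(X_a[train_idx], y_a[train_idx])
        within.append(mean_squared_error(y_a[test_idx],
                                         model.predict(X_a[test_idx])))
        across.append(mean_squared_error(y_b, model.predict(X_b)))
    # A large gap between the two average scores suggests the two DGMs differ
    # along the dimension captured by this particular predictive model.
    return np.mean(within), np.mean(across)
```

Under these assumptions, comparing the within-study and cross-study scores (and repeating the exercise with competing predictive models) gives a model-specific measure of agreement between the two data sets.
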
Original language: English
Article number: 11
Journal: Statistics and Computing
Volume: 33
Issue number: 1
DOIs:
State: Published - Feb 2023

Bibliographical note

Publisher Copyright:
© 2022, The Author(s).

Funding

This material is based on research sponsored by the Defense Advanced Research Projects Agency (DARPA) under agreement number D17AC00001 and by the Office of Naval Research. The content of the information does not necessarily reflect the position or the policy of the Government, and no official endorsement should be inferred.

Funders and funder numbers:

• Office of Naval Research Naval Academy
• Defense Advanced Research Projects Agency: D17AC00001

Keywords

• Cross validation
• Experimental social science
• Model assessment
• Preregistration
• Reproducibility

ASJC Scopus subject areas

• Theoretical Computer Science
• Statistics and Probability
• Statistics, Probability and Uncertainty
• Computational Theory and Mathematics
