Irreproducibility in searches of scientific literature: A comparative analysis

Gábor Pozsgai, Gábor L. Lövei, Liette Vasseur, Geoff Gurr, Péter Batáry, János Korponai, Nick A. Littlewood, Jian Liu, Arnold Móra, John Obrycki, Olivia Reynolds, Jenni A. Stockan, Heather VanVolkenburg, Jie Zhang, Wenwu Zhou, Minsheng You

Research output: Contribution to journal › Article › peer-review

25 Scopus citations

Abstract

Repeatability is a cornerstone of science, and it is particularly important for systematic reviews. However, little is known about how researchers' choices of database and search platform influence the repeatability of systematic reviews. Here, we aim to unveil how the computing environment and the location from which a search is initiated influence hit results. We present a comparative analysis of time-synchronized searches at different institutional locations around the world and evaluate the consistency of hits obtained for each search term using different search platforms. We found large variation among search platforms: PubMed and Scopus returned consistent results for identical search strings submitted from different locations, whereas Google Scholar and Web of Science's Core Collection varied substantially, both in the number of returned hits and in the list of individual articles, depending on the search location and computing environment. The inconsistency in Web of Science results most likely arises from the different licensing packages held by different institutions. To maintain scientific integrity and consistency, especially in systematic reviews, action is needed from both the scientific community and search platform providers to increase search consistency. Researchers are encouraged to report the search location and the databases used for systematic reviews, and database providers should make search algorithms transparent and revise access rules to titles behind paywalls. Additional options for increasing the repeatability and transparency of systematic reviews include storing both search metadata and hit results in open repositories and using Application Programming Interfaces (APIs) to retrieve standardized, machine-readable search metadata.
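As a minimal sketch of the API-based approach the abstract recommends: the query below targets PubMed's public E-utilities ESearch endpoint, which returns machine-readable (JSON) search results. The endpoint and its `db`, `term`, `retmax`, and `retmode` parameters are NCBI's; the helper function name and the example search string are illustrative only, not taken from the article.

```python
from urllib.parse import urlencode

# NCBI E-utilities ESearch endpoint (PubMed's public search API)
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_pubmed_search_url(term, retmax=100):
    """Return a fully specified ESearch URL that can be archived
    alongside a systematic review's search metadata, so the exact
    query is reproducible later and from any location."""
    params = {
        "db": "pubmed",       # database searched
        "term": term,         # the verbatim search string
        "retmax": retmax,     # maximum number of IDs returned
        "retmode": "json",    # machine-readable response format
    }
    return BASE + "?" + urlencode(params)

url = build_pubmed_search_url('"systematic review" AND repeatability')
print(url)
```

Because every parameter is explicit in the archived URL, a later reviewer can re-issue exactly the same query rather than reconstructing it from a prose description of the search.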

Original language: English
Pages (from-to): 14658-14668
Number of pages: 11
Journal: Ecology and Evolution
Volume: 11
Issue number: 21
DOIs
State: Published - Nov 2021

Bibliographical note

Publisher Copyright:
© 2021 The Authors. Ecology and Evolution published by John Wiley & Sons Ltd.

Funding

The authors thank Mei Ling Huang (Brock University, St. Catharines, Canada) for her comments on the statistical analysis. This work is supported by a grant from the "111 Project" in China. G.P. is supported by a postdoctoral fellowship from the State Key Laboratory of Ecological Pest Control for Fujian and Taiwan Crops, and A.M. by grants #20765‐3/2018/FEKUTSTRAT and #TUDFO/47138/2019‐ITM.

Funders: Funder number
Higher Education Discipline Innovation Project: 20765‐3/2018/FEKUTSTRAT
Higher Education Discipline Innovation Project

Keywords

• database
• evidence synthesis methods
• information retrieval
• repeatability
• reproducibility
• search engine
• search location

ASJC Scopus subject areas

• Ecology, Evolution, Behavior and Systematics
• Ecology
• Nature and Landscape Conservation
