Evaluation of artificial intelligence systems for assisting neurologists with fast and accurate annotations of scalp electroencephalography data

Research output: Contribution to journal › Article › peer-review


Abstract

Background: Assistive automatic seizure detection can empower human annotators to shorten patient monitoring data review times. We present a proof-of-concept for a seizure detection system that is sensitive, automated, patient-specific, and tunable to maximise sensitivity while minimising human annotation times. The system uses custom data preparation methods, deep learning analytics and electroencephalography (EEG) data.

Methods: Scalp EEG data from 365 patients, containing 171,745 s of ictal and 2,185,864 s of interictal samples obtained from clinical monitoring systems, were analysed as part of a crowdsourced artificial intelligence (AI) challenge. Participants were tasked with developing an ictal/interictal classifier with high sensitivity and low false alarm rates. We built a challenge platform that prevented participants from downloading or directly accessing the data while allowing crowdsourced model development.

Findings: The automatic detection system achieved tunable sensitivities between 75.00% and 91.60%, reducing the amount of raw EEG data to be reviewed by a human annotator by factors of 142x and 22x, respectively. The algorithm enables instantaneous, reviewer-managed optimisation of the balance between sensitivity and the amount of raw EEG data to be reviewed.

Interpretation: This study demonstrates the utility of deep learning for patient-specific seizure detection in EEG data. Furthermore, deep learning in combination with a human reviewer can provide the basis for an assistive data labelling system that lowers manual review time while maintaining human expert annotation performance.

Funding: IBM employed all IBM Research authors. Temple University employed all Temple University authors. The Icahn School of Medicine at Mount Sinai employed Eren Ahsen. The corresponding authors Stefan Harrer and Gustavo Stolovitzky declare that they had full access to all the data in the study and had final responsibility for the decision to submit for publication.
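The tunable sensitivity described in the Findings can be read as sweeping a decision threshold over per-window seizure probabilities: raising the threshold flags less raw EEG for human review but also misses more ictal windows. The sketch below is illustrative only and is not the authors' code; the function and variable names (review_tradeoff, window_probs, window_labels, threshold) and the toy data are assumptions introduced here to show how such a threshold maps onto a sensitivity and a review-reduction factor.

import numpy as np

def review_tradeoff(window_probs, window_labels, threshold):
    """Return (sensitivity, review_reduction_factor) at a given threshold.

    window_probs  -- model-estimated seizure probability per EEG window
    window_labels -- 1 for ictal windows, 0 for interictal windows
    threshold     -- windows with probability >= threshold are flagged
                     for human review
    """
    flagged = window_probs >= threshold
    # Sensitivity: fraction of true ictal windows that were flagged.
    sensitivity = flagged[window_labels == 1].mean()
    # Review reduction: total windows divided by flagged windows,
    # i.e. "142x" means the annotator reviews roughly 1/142 of the raw EEG.
    reduction = len(window_probs) / max(flagged.sum(), 1)
    return float(sensitivity), float(reduction)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data mimicking a heavy interictal/ictal class imbalance.
    labels = (rng.random(100_000) < 0.01).astype(int)
    probs = np.clip(0.7 * labels + 0.1 + 0.2 * rng.random(100_000), 0, 1)
    for thr in (0.5, 0.7, 0.9):
        sens, red = review_tradeoff(probs, labels, thr)
        print(f"threshold={thr:.1f}  sensitivity={sens:.2%}  reduction={red:.0f}x")

In this toy setting a higher threshold yields a larger reduction factor at the cost of sensitivity, mirroring the 91.60%/22x versus 75.00%/142x operating points reported in the abstract (the numerical values here are synthetic and chosen only to make the trade-off visible).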

Original language: English
Article number: 103275
Journal: EBioMedicine
Volume: 66
DOIs
State: Published - Apr 2021

Bibliographical note

Funding Information:
We would like to thank Carlos Fonseca and the IBM Cloud Team for support with setting up cloud accounts for challenge participants, as well as Elise Blaese for guidance in designing the challenge launch plan, Olivia Smith for mathematical guidance, Josh Andres for help with designing the web portal, and John Cohn for sharing his experience with client data management systems. IBM employed all IBM Research authors. Temple University employed all Temple University authors. The Icahn School of Medicine at Mount Sinai employed Eren Ahsen.

Data Sharing: All data that underlie the results reported in this article (text, tables, figures, supplemental information) are available publicly, after anonymization, as open source data at https://www.isip.piconepress.com/projects/tuh_eeg/html/downloads.shtml. All data underlying the design of the developed analytical models are available publicly through the supplemental information of this article. All data are available immediately with publication. We plan to open source the challenge platform after completion of a public challenge that is ongoing at the time of publication (https://www.ibm.com/blogs/research/2020/12/object-recognition-models/).

Publisher Copyright:
© 2021 The Authors

Keywords

  • Artificial intelligence
  • Automatic labelling
  • Crowdsourcing challenges
  • Deep neural networks
  • EEG
  • Epilepsy
  • Seizure detection

ASJC Scopus subject areas

  • Biochemistry, Genetics and Molecular Biology (all)
