64th ISI World Statistics Congress - Ottawa, Canada

Validity of data extraction in evidence synthesis practice of adverse events: A reproducibility study

Conference

64th ISI World Statistics Congress - Ottawa, Canada

Format: IPS Abstract

Keywords: adverse-events, data-collection, meta-analysis

Session: IPS 94 - Advances in Research Synthesis in Healthcare Research

Tuesday 18 July 10 a.m. - noon (Canada/Eastern)

Abstract

Background: In evidence synthesis practice, data extraction is arguably one of the most important steps and is prone to errors, as it transfers the "raw data" from the original studies into the meta-analysis. In this study, we aim to investigate the validity of data extraction in systematic reviews of adverse events and the impact of data extraction errors on the results, and to develop a classification framework for such errors to support further methodological research.
Methods: We searched PubMed for systematic reviews of adverse events published between January 1, 2015 and January 1, 2020. From the eligible systematic reviews, meta-analytic data of the eligible meta-analyses were extracted by four authors. The same authors then consulted the original data sources (e.g., full texts, ClinicalTrials.gov) to replicate the data used in these meta-analyses. Data extraction errors were summarized at the study, meta-analysis, and systematic review levels. The potential impact of such errors on the results was further investigated.
Results: A total of 201 systematic reviews with 829 pairwise meta-analyses were included. In 66.8% (554/829) of the meta-analyses, at least one study had data extraction errors; 85.1% (171/201) of the systematic reviews had at least one meta-analysis with data extraction errors. Impact was analyzed for 288 meta-analyses. Data extraction errors changed the direction of the effect in 3.5% (10/288) of meta-analyses and changed the significance of the P-value in 6.6% (19/288); meta-analyses with two or more different types of errors were more susceptible to these changes. The current study focused only on systematic reviews of randomized controlled trials; since the sample sizes of such trials tend to be small, the impact may be exacerbated.
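To illustrate the mechanism behind these changes, the following minimal sketch pools hypothetical adverse-event counts with an inverse-variance fixed-effect model (one common approach; the abstract does not specify the pooling method used) and shows how a single extraction error, here a transposition of the treatment and control arms of one study, can reverse the direction of the pooled effect. All counts and the helper function are illustrative assumptions, not data from the study.

```python
import math

def pooled_log_or(studies):
    """Inverse-variance fixed-effect pooled log odds ratio and z statistic.

    Each study is (events_treatment, n_treatment, events_control, n_control).
    Applies a 0.5 continuity correction when any 2x2 cell is zero.
    """
    num = den = 0.0
    for a, n1, c, n2 in studies:
        b, d = n1 - a, n2 - c
        if 0 in (a, b, c, d):
            a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d  # Woolf variance of log OR
        num += log_or / var
        den += 1 / var
    pooled = num / den
    return pooled, pooled / math.sqrt(1 / den)  # log OR, z statistic

# Hypothetical counts: (events_trt, n_trt, events_ctl, n_ctl) per trial.
extracted = [(12, 100, 5, 100), (8, 150, 7, 150), (3, 80, 4, 80)]
# Same trials, but the first study's arms were transposed during extraction.
corrected = [(5, 100, 12, 100), (8, 150, 7, 150), (3, 80, 4, 80)]

print(pooled_log_or(extracted))  # pooled log OR > 0 (harm direction)
print(pooled_log_or(corrected))  # pooled log OR < 0 (direction reversed)
```

Because adverse-event trials often contribute few events, a single miscoded study can carry enough weight to flip the pooled direction or move the P-value across the 0.05 threshold, which is consistent with the error impact reported above.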
Conclusions: Current systematic reviews of adverse events face serious issues with the reproducibility of data extraction, and the resulting errors may mislead the conclusions. Implementation guidelines are urgently required to help future systematic review authors improve the validity of data extraction.