Influence of reported study design characteristics on intervention effect estimates from randomized, controlled trials

Jelena Savović, Hayley E. Jones, Douglas G. Altman, Ross J. Harris, Peter Jüni, Julie Pildal, Bodil Als-Nielsen, Ethan M. Balk, Christian Gluud, Lise Lotte Gluud, John P.A. Ioannidis, Kenneth F. Schulz, Rebecca Beynon, Nicky J. Welton, Lesley Wood, David Moher, Jonathan J. Deeks, Jonathan A.C. Sterne

Research output: Contribution to journal › Article › peer-review

851 Citations (Scopus)

Abstract

Published evidence suggests that aspects of trial design lead to biased intervention effect estimates, but findings from different studies are inconsistent. This study combined data from 7 meta-epidemiologic studies and removed overlaps to derive a final data set of 234 unique meta-analyses containing 1973 trials. Outcome measures were classified as 'mortality,' 'other objective,' or 'subjective,' and Bayesian hierarchical models were used to estimate associations of trial characteristics with average bias and between-trial heterogeneity. Intervention effect estimates seemed to be exaggerated in trials with inadequate or unclear (vs. adequate) random-sequence generation (ratio of odds ratios, 0.89 [95% credible interval {CrI}, 0.82 to 0.96]) and with inadequate or unclear (vs. adequate) allocation concealment (ratio of odds ratios, 0.93 [CrI, 0.87 to 0.99]). Lack of or unclear double-blinding (vs. double-blinding) was associated with an average of 13% exaggeration of intervention effects (ratio of odds ratios, 0.87 [CrI, 0.79 to 0.96]), and between-trial heterogeneity was increased for such studies (SD increase in heterogeneity, 0.14 [CrI, 0.02 to 0.30]). For each characteristic, average bias and increases in between-trial heterogeneity were driven primarily by trials with subjective outcomes, with little evidence of bias in trials with objective and mortality outcomes. This study is limited by incomplete trial reporting, and findings may be confounded by other study design characteristics. Bias associated with study design characteristics may lead to exaggeration of intervention effect estimates and increases in between-trial heterogeneity in trials reporting subjectively assessed outcomes.
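The ratio-of-odds-ratios metric reported in the abstract can be sketched as follows. This is a minimal illustration, not the study's analysis code: the trial counts and pooled odds ratios below are hypothetical, and the Bayesian hierarchical modelling the authors used is not reproduced here.

```python
def odds_ratio(events_t, total_t, events_c, total_c):
    """Odds ratio for a two-arm trial (treatment vs. control),
    computed from a 2x2 table of events and non-events."""
    a, b = events_t, total_t - events_t
    c, d = events_c, total_c - events_c
    return (a * d) / (b * c)

def ratio_of_odds_ratios(or_flawed, or_adequate):
    """ROR comparing trials with a design flaw (e.g., inadequate
    allocation concealment) against adequately conducted trials.

    For a beneficial intervention (OR < 1), an ROR below 1 means the
    flawed trials report a more extreme, i.e. exaggerated, effect."""
    return or_flawed / or_adequate

# Hypothetical example: adequate trials pool to OR 0.80, flawed
# trials to OR 0.70, so the flawed trials look more favorable.
ror = ratio_of_odds_ratios(0.70, 0.80)          # 0.875
exaggeration_pct = (1 - ror) * 100               # 12.5% average exaggeration
```

On this scale, the study's ROR of 0.87 for lack of or unclear double-blinding corresponds to the reported 13% average exaggeration of intervention effects.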

Original language: English
Pages (from-to): 429-438
Number of pages: 10
Journal: Annals of Internal Medicine
Volume: 157
Issue number: 6
DOIs
Publication status: Published - 18 Sept 2012

