John Ioannidis might be right!
In a recent review, the authors challenge the foundation of evidence-based practice (EBP), specifically on the grounds that the vast majority of research evidence is bad (Kane et al., 2016).
Methods
They analyse the 10 most recent systematic reviews of interventions published in 4 major journals (Annals of Internal Medicine, The BMJ, JAMA and Pediatrics). The sample is supplemented with a further 10 recent reviews issued in reports from the Cochrane Collaboration and 16 from the Evidence-based Practice Center (EPC).
The authors go on to report on the quality of evidence of 76 included papers; however, adding up the numbers gives a total of only 66. The authors do not reference any of the included papers, so it is not clear where the other 10 came from.
The authors extracted the reported quality of evidence score assigned to each intervention/outcome pair and categorised them by intervention type and quality level.
Results
Of the 76 reviews, 34 did not use a systematic quality of evidence rating scheme. From the remaining 42 reviews, a total of 1,472 outcomes linked to a specific intervention were abstracted.
In paragraph one the authors state that ‘of the studies that rated QOE, 39 used the methods endorsed…and 13 used GRADE’ (Grading of Recommendations Assessment, Development and Evaluation). These figures should clearly be 39% and 13% respectively.
Of the 1,472 outcomes, 1,039 included observational studies and 433 did not.
The strength of evidence (SOE) rating was moderate to high for 13.7% of outcomes where observational studies were included and 20.8% where observational studies were not included (p<0.01). The bottom line is that the SOE rating for the vast majority of outcomes in both groups was low or insufficient (86% and 79% respectively, p<0.01).
Meta-analysed interventions were less likely to have a high or moderate QOE rating.
Conclusion
The authors conclude:
“Claiming that clinical practice is evidence-based is far from justified.”
Strengths
This paper raises the important issue of how prevalent poor quality-of-evidence reporting is within systematic reviews. This is an undeniable and important problem: the utility of science to inform clinical practice depends on quality evidence.
Limitations
The authors did not assess the quality of the included papers themselves; instead, they relied on the quality ratings reported within each paper. It would have been more impactful had they also compared the two. Furthermore, this distinction gets lost throughout the paper, which can misrepresent its purpose and impact.
Additionally, there is a disconnect between the findings and the narrative of the discussion. A more direct connection could have been made by further discussing the implications of the findings and how the data bring us closer to a solution to the problems at hand.
Bloggers
Our thanks to Rachel Playforth, Sadhia Khan, Cynthia Kroeger, Dan Mayer, Paul Dijkstra and Gerd Antes who worked together at our Making #EvidenceLive workshop on 21st June to produce this blog.
Links
Primary paper
Kane RL, Butler M, Ng W. (2016) Examining the quality of evidence to support the effectiveness of interventions: an analysis of systematic reviews. BMJ Open 2016;6(5):e011051. doi:10.1136/bmjopen-2016-011051