Achieving greater openness and transparency in clinical research is critical if the basis for evidence-based practice is to be completely sound.
The fairly recent recognition of the consequences of publication bias is one part of this: if treatment decisions and clinical guidelines are based on only part of the evidence, then it is likely that poor clinical decisions will be made, albeit unknowingly.
The movement to get all trials registered and reported is clearly irking parts of Big Pharma, as this month’s court action against the UK’s Health Research Authority demonstrates (AllTrials campaign). But tackling publication bias is only one part of improving the evidence base. We also need greater openness about how research agendas are set.
Beavering away since 2004, the folks behind the James Lind Alliance (JLA) have been steadily developing methods to systematically prioritise the research that is needed. Their ethos has always been to work with the most important people in any healthcare situation, namely the patients, carers and the health professionals with whom they work.
The JLA approach has been developed, refined and replicated over many years (and published in their guidebook). It seeks to identify and then prioritise treatment uncertainties (or unanswered research questions). Each project supported by the JLA is called a Priority Setting Partnership (PSP) to signal the equal participation of and collaborative working by patients/carers and health professionals.
Mental health features prominently in the JLA’s work, with five of the 49 current or completed PSPs focusing on a mental health condition.
In their most recent article, in a new journal focusing on patient and public involvement and engagement in research (Crowe, et al, 2015), members of the JLA compared the types of treatments prioritised for research by the first 14 PSPs with the treatments being evaluated in studies registered in one of the main clinical trials registries. It makes for interesting reading!
Methods
Analysing priorities from all the JLA PSPs
The priority lists generated by the 14 PSPs were collated, giving 149 separate priorities containing 126 mentions of treatments. Two of the authors classified these mentions into treatment types, using a system developed by a third author in an earlier study (see Table 1).
Sampling concurrently registered clinical trials for comparison
The authors chose the WHO International Clinical Trials Registry Platform for their comparison data, judging it to be the most comprehensive trials database and sufficiently detailed to enable the analysis. Trial data were downloaded, and duplicates and studies not matching the inclusion criteria were removed, leaving 1,682 studies to analyse, which yielded 1,867 treatment mentions. Of these studies, 52.8% were deemed non-commercial research and the rest commercially funded. The same two authors worked together to classify the studies into treatment types.
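For readers who want to see the arithmetic behind Table 1, here is a minimal sketch (in Python) of how one might tally classified treatment mentions and turn them into percentage shares. It is purely illustrative: the group labels and the expanded list of mentions below are assumptions reconstructed from the headline counts in Table 1, not the authors’ actual data or classification code.

```python
from collections import Counter

# Illustrative only: the three broad treatment-type groupings used in Table 1.
GROUPS = [
    "drugs/vaccines/biologicals",
    "radiotherapy/surgery/devices/diagnostics",
    "other (education, service delivery, psychological, etc.)",
]

def proportions(mentions):
    """Tally classified treatment mentions and return each group's percentage share."""
    counts = Counter(mentions)
    total = sum(counts.values())
    return {group: round(100 * counts[group] / total, 1) for group in GROUPS}

# Hypothetical example: the JLA PSP column of Table 1 expanded back into individual mentions.
jla_psp_mentions = (
    ["drugs/vaccines/biologicals"] * 23
    + ["radiotherapy/surgery/devices/diagnostics"] * 29
    + ["other (education, service delivery, psychological, etc.)"] * 74
)

print(proportions(jla_psp_mentions))
# -> {'drugs...': 18.3, 'radiotherapy...': 23.0, 'other...': 58.7}
# i.e. the first column of Table 1, to within rounding.
```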
Results
The authors found marked differences between the proportions of different types of treatments identified by JLA PSPs and those being evaluated in studies registered on the Clinical Trials Registry.
Table 1: Interventions mentioned in research priorities identified by the JLA PSPs and among registered trials (2003-2012)
| Type of intervention | JLA patient–clinician PSPs: % (n) of 126 intervention mentions | Registered non-commercial trials: % (n) of 1,069 intervention mentions | Registered commercial trials: % (n) of 798 intervention mentions |
| --- | --- | --- | --- |
| Drugs, vaccines and biologicals | 18.2% (23) | 37.2% (397) | 86.3% (689) |
| Radiotherapy, surgery and perioperative, devices and diagnostic | 23.0% (29) | 29.8% (332) | 11.1% (89) |
| Education and training, service delivery, psychological therapy, physical therapies, exercise, complementary therapies, social care, mixed or complex, diet, other | 58.7% (74) | 31.9% (307) | 2.6% (20) |
Discussion
The findings from this analysis, across a wide range of health areas, add to the evidence from previous topic-specific studies showing that there are important mismatches between the treatments that patients and clinicians wish to see evaluated and the treatments that researchers actually evaluate. This study builds on the earlier work by using a larger dataset and a longer timescale.
In particular, the authors identify that the mismatch is widest between the JLA priorities and commercially funded research, with the latter dominated by drug trials. The way the data were analysed also suggests that few of the commercial studies could have been comparing drugs with non-drug interventions.
Limitations
The authors recognise that using a clinical trials registry may lead to under-representation of other types of research. This is a critical point because the authors also recognise that ‘methodological disincentives’ may lead to fewer controlled evaluations of non-drug treatments, interpersonal therapies and service delivery models. Other databases (e.g. of studies funded by the National Institute for Health Research) may have included more studies of non-drug treatments, as well as non-trial study designs that may be better suited to these types of interventions.
Conclusions
The authors concluded:
Research on drugs is preferred by researchers; evaluation of non-drug treatments is preferred by patients and clinicians.
The original research into research priorities (in osteoarthritis: Tallon et al, 2000) that highlighted this mismatch, and which led to the establishment of the JLA, was published fifteen years ago; the authors therefore argue that research culture has been slow to change in how important and relevant treatment research questions are identified and prioritised. Addressing the mismatch will need leadership and incentives, as the current research culture privileges researchers’ (by this, do they really mean commercial?) interests rather than the needs of patients and their clinicians. The critical question, posed by Crowe in a blog post written to accompany the published paper, is:
When can we expect a change in culture and [research] commissioning that reflects these sorts of interventional studies, prioritized by patients and health and social care practitioners?
Links
Primary paper
Crowe S, Fenton M, Hall M, Cowan K, Chalmers I. (2015) Patients’, clinicians’ and the research communities’ priorities for treatment research: there is an important mismatch. Research Involvement and Engagement, 1:2.
Other references
All Trials Registered, All Trials Reported campaign www.alltrials.net/
Chalmers I, Rounding C, Lock K. (2003) Descriptive survey of non-commercial randomised trials in the United Kingdom, 1980-2002. BMJ, 327:1017-9.
James Lind Alliance guidebook – www.jlaguidebook.org/
Tallon D, Chard J, Dieppe P. (2000) Relation between agendas of the research community and the research consumer. Lancet, 355:2037-40. [Abstract]
Crowe S. (2015) Do patients’ and clinicians’ research priorities really matter? BioMed Central blog.