What should the relationship be between evidence generated through research and what policy makers and social care practitioners do? This entails questions both of what we would like the relationship to be and of the practical limits on what it can be. It has long been an issue for debate and research.
Research about the use of research, if that doesn’t seem too self-absorbed for you, is an important and well-established topic of scholarship. It is also very relevant to the re-emerging debate on the quality of the social care evidence base, prompted by a National Audit Office suggestion that social care has a ‘weak’ evidence base (Brindle, 2014). This study asks vital questions about how reliable research evidence comes to be used in policy decisions. It’s no use having a strong evidence base if its use remains weak.
One possible view is that all policy and practice should follow what the research evidence tells us. This would be a neat, linear relationship and a rational process. Yet a moment’s consideration shows that this isn’t always possible. Sometimes the evidence simply isn’t clear or developed enough to light the way.
Practical and cultural divides between researchers, policy makers and practitioners can also hinder the process. Then there is the need to contextualise evidence, such as when working with an individual who doesn’t neatly fit the profile of clients found in the research. And, of course, for good or ill, there will always be some politics in the process.
If we are willing, then, to allow some movement away from the neat, rational, even simple, model of research directing policy and practice, we certainly do not end up arguing that the two can occur in evidence vacuums. We want our practitioners and policy makers to be like you, dear reader: aware of the evidence in an area and reflecting deeply on it and its implications for action.
There is an argument that we are some way from this and that there is a gap between the evidence produced by researchers and its uptake in policy and practice. Understanding this gap may help us to find ways of closing it so that we move to a more desirable research-policy/practice relationship.
This is where this article by Cherney and colleagues comes in, as they look at the access that a group of policy makers in Australia have to research evidence.
Methods
What, then, did Cherney and the team do? They surveyed employees in federal and state government agencies across Australia. Full details of how the survey was conducted are given by the authors, and this is something we’ll come back to in the discussion below.
There were many practical difficulties in conducting the survey, but the authors achieved an impressive final sample size of 2084 respondents.
Key findings
Amongst the authors’ findings are:
- Respondents said that the most important source of information for them, and the most frequently consulted, was colleagues in their organisations.
- University researchers were seen as an important source of information, but were not nearly so frequently consulted.
- 58% of respondents said that they used electronic databases of research publications. Those who did not use databases had a range of reasons for not doing so, including lack of access, lack of skills in using them, and a preference for consulting colleagues and/or using internet search engines.
- A lack of time to read relevant studies was seen by many respondents as a significant practical barrier to using research.
- The authors undertook a form of statistical analysis (logistic regression modelling) of their data to explore the relationships between variables and the relative significance of different factors (there is an illustrative sketch of the technique below this list). They found that ‘an overall organisational ethos and professional culture that value research have a bearing on the uptake of academic research among policy personnel, well above any perceived deficits in individual skills’ (page 13).
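For readers unfamiliar with the technique: logistic regression models the probability of a binary outcome (here, something like whether a respondent reports drawing on academic research) as a function of several predictors, so the relative weight of each factor can be compared. The sketch below is purely illustrative and makes no claim to reproduce the authors’ model; the variable names, data and coefficients are invented for the example.

```python
# Illustrative sketch only: a logistic regression of the general kind the
# authors describe, fitted to made-up survey data. Variable names and values
# are hypothetical, not taken from the Cherney et al. dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500  # hypothetical number of survey respondents

# Hypothetical predictors: organisational culture valuing research (0/1),
# self-rated search skills (1-5), database access (0/1), time pressure (1-5)
df = pd.DataFrame({
    "org_values_research": rng.integers(0, 2, n),
    "search_skills": rng.integers(1, 6, n),
    "db_access": rng.integers(0, 2, n),
    "time_pressure": rng.integers(1, 6, n),
})

# Simulate the binary outcome: does the respondent report using academic research?
logit = (-1.0 + 1.5 * df["org_values_research"] + 0.2 * df["search_skills"]
         + 0.6 * df["db_access"] - 0.3 * df["time_pressure"])
df["uses_research"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit the logistic regression and inspect the coefficients and odds ratios
X = sm.add_constant(df[["org_values_research", "search_skills",
                        "db_access", "time_pressure"]])
model = sm.Logit(df["uses_research"], X).fit(disp=False)
print(model.summary())
print(np.exp(model.params))  # odds ratios for each predictor
```

In a model of this kind, the authors’ headline finding would correspond to the organisational-culture predictor carrying a substantially larger effect (odds ratio) than the individual-skills predictors.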
Discussion and conclusions
Two obvious issues to consider with this study are i) that it is Australia-based, and ii) the nature of the sample of respondents to the survey.
On the first, how different is the Australian context of policy making from that of Britain? Undoubtedly there will be differences and we need to hold them in mind, but we should not dismiss the findings as irrelevant to understanding the same issues in the UK. Many of the issues raised in the paper have been raised by others researching the same theme in other countries, as the authors discuss.
On the second, the authors are clear that they were not able to draw a random sample of staff in the government agencies. For various practical reasons they had to ask the agencies to pass the questionnaire on to members of staff who fitted the criteria for the survey. The authors therefore have no way of knowing how representative the respondents are of the agencies’ actual employees. They discuss this openly, and we can note that the sample was rather large, which may go some way towards addressing the concern.
Values and culture
We could engage in a more detailed discussion than the authors do of the ways in which research evidence might be transmitted to policy makers. As the authors acknowledge, for example, colleagues could be a very important channel for finding out about evidence. The paper does, though, highlight some very important points for understanding, and closing, the gap between the generation of research findings and their use in policy making.
The most important element is that people work in an organisational culture that actively and explicitly values the use of research evidence. As the authors wrote, ‘an overall willingness to seek out academic research will be determined by whether the occupational milieu in which policy personnel work is one that values academic research and sees it as important’ (page 13).
Practical issues
The authors then highlight a whole set of practical issues that could be improved to close the gap: some can be addressed by policy organisations, some by individual policy makers and others by researchers themselves. These need to be carefully addressed, but doing so without also improving the organisational culture, and the message it transmits about the value of research evidence, will undermine the work done to remove or reduce the practical barriers.
The same point most likely applies to policy and practice in social care in the UK. We need to ensure that, where they haven’t already, organisations develop the kind of culture that Cherney and colleagues highlight as vital if research is to be widely and appropriately used by policy makers and practitioners.
Link
Cherney A, Head B, Povey J, Ferguson M & Boreham P (2014) Use of academic social research by public officials: exploring preferences and constraints that impact on research use. Evidence & Policy. Print ISSN 1744-2648. Online ISSN 1744-2656. [Abstract]
References
Brindle D (2014) Mounting pressure on social care to build evidence base. The Guardian Social Care Network, 13 March 2014.