What works now: how can we use evidence more effectively in policymaking?

Evidence use in policymaking is nothing new. It has been talked about by policymakers, academics and professionals for the best part of ten years, and has been highlighted regularly, not least on this blog. Over the years, various government initiatives have been set up to try to establish how best to use evidence and identify “what works” in relation to specific policy interventions, and “evidence-based” policymaking has become the catchphrase of policymakers across most sectors.

One of the newest books to be added to the Idox Information Service library reflects on the rise of “what works” as an approach to policy development. Building on discussions from its first edition, the book provides a sector-by-sector breakdown of how evidence is – and could be – used in policymaking across areas such as health, the environment, education and criminal justice. It also offers insight into appraising evidence and assessing its quality, as well as into how evidence is used internationally, with examples from the USA, Australia, New Zealand and Scandinavia.

As one of our key aims is to support and facilitate the sharing and use of evidence in the public sector, this book has been a welcome addition to our collection.

Making use of research across policy

In 2013, the UK government launched the What Works Network, which is now made up of 10 independent centres committed to “supporting the creation, supply and use of evidence” in specific policy areas including crime and policing, education and economic growth. The centres aim to improve the way government and other organisations create, share and use (or ‘generate, translate and adopt’) high-quality evidence for decision-making. According to the UK government, the initiative is the first time a government has taken a national approach to prioritising the use of evidence in decision making.

What Works Now? highlights research from Weiss (1979) which suggests that there are “7 types of research use”:

  • Knowledge Driven – research will be developed, applied and used once it has been produced
  • Problem Solving – research will be applied directly to a particular policy problem in order to solve it
  • Interactive – research forms part of a wider web of knowledge, policy and other research which all interact with each other
  • Political – research could (and probably will) be used to retrospectively provide support for a policy decision which has already been made
  • Tactical – research can be used as a tool to delay or deflect from decision making or action around a particular issue (i.e. “more research is needed in this area”)
  • Enlightenment – research informs policy through encouraging people to think and discuss particular ideas or concepts in a different way
  • Embedded research – research production is embedded in a wider contextual system which includes political priorities, the law and the media

Building a research base to support “what works”

Creating and disseminating research effectively have been cited as key to building a “what works” evidence base. Research institutes and think tanks contribute alongside the real-life experiences of practitioners and other stakeholders to establish the conditions which support effective interventions and lead to positive policy outcomes.

One of the big discussions currently is around the creation of academic research to support what works programmes. Exploring what sort of research is useful to practitioners and policymakers, and aligning this with the research agenda of academics and universities, can help to create an effective supply chain of evidence to inform policymaking. However, academics often do not engage with the policy process, or politicians politicise evidence, picking and choosing which findings to take notice of, which can distort the perception of what evidence is available in a particular area.

Encouraging fuller participation and a more robust appraisal of research from across the board is something which many institutions are working towards. Research impact and knowledge exchange are now integrated into research funding, and a growing number of people are working to feed research more effectively into the policy arena.

Evaluating research and evidence, and judging which to take forward to inform policy decision-making, is also important. Alongside discussions around assessing and labelling evidence, the book considers how some of the main organisations in the UK concerned with promoting evidence-informed policy have gone about appraising evidence: weighing it up, assessing quality and “fitness for purpose”, and taking account of non-research-based forms of knowledge and evidence, such as the personal experience of practitioners.

Applying “what works” in practice

Applying “what works” in practice can be a challenge, especially in a setting that is very different from the conditions of the study in which a particular intervention was shown to produce successful outcomes.

In the book, 10 guiding principles to support the use of evidence in practice are set out:

  • Translated – To be used, research must be adapted and reconstructed to fit with local contexts. Simply providing findings is not enough
  • Ownership – People should be allowed to feel a sense of ownership over the research and its development
  • Enthusiasts – Individual “champions” can be useful in ensuring that research actually gets used
  • Local context – Local context must be taken into account, particularly in relation to specific barriers and enablers which might help or hinder change
  • Credibility – Credibility of researchers and the people who support the research is key to ensuring that the research is taken seriously
  • Leadership – Strong leadership provides motivation, authority and integrity in the implementation of evidence
  • Support to implement change – Ongoing support to implement change is important; this could include financial, technical, organisational or emotional support
  • Develop integration – Activities need to be integrated with existing organisational systems and practices; changes do not happen within a bubble
  • Engage key stakeholders – To ensure effective uptake and buy-in, key stakeholders should be involved as fully as possible from the earliest possible stage
  • Capture learning/effective evaluation – Don’t forget the importance of evaluation: identify what worked and what didn’t to help share learning and support future projects

Final thoughts

In theory, using evidence to inform policy sounds straightforward. The reality can be quite different. What Works Now? highlights that the “what works” agenda remains dominant across the policy landscape, even if the application or approach to it differs from policy area to policy area.

What counts as evidence is still disputed; getting evidence “out there” and encouraging academics to be involved in the policy process is still hard to achieve (although there is good work being done in this area to try and combat this); and context is still key to making evidence work in a particular environment.

Understanding evidence, and how to use it effectively, has been a core aim of policymakers in the UK and across the world for many years. This book, and the supporting research outlined in it, highlights that while evidence is still at the fore of policymaking, actually identifying what works and putting it into practice remains more of a challenge.

Members of the Idox Information Service can log into our website to request a loan of “What Works Now?”

If you enjoyed this post, you may also be interested in:

A world of evidence … but can we trust that it is any good?

Follow us on Twitter to find out what topics are interesting our research team.

A world of evidence … but can we trust that it is any good?

What is good evidence? And how can policymakers and decisionmakers decide what is working and what isn’t, when it comes to deciding where public money is spent and how?

These are the kinds of questions that models and tools such as randomised controlled trials and cost-benefit analysis attempt to answer. Over the last five years, the government has also supported the development of the What Works Network, which now consists of 10 independent What Works Centres. When it comes to talking about impact, there has also been a move towards capturing and recognising the value of qualitative data.

As one of our key aims is to support and facilitate the sharing and use of evidence in the public sector, we were interested to read a new publication ‘Mapping the standards of evidence used in UK social policy’.

Standards of evidence

Produced by the Alliance for Useful Evidence, the research has found 18 different Standards of Evidence currently in use across UK social policy.

The report notes that over the last decade there has been increasing interest in grading effectiveness or impact against a level or scale. Typically, the higher up the scale, the more evidence is available. Theoretically this means that decision-makers can have higher confidence in deciding whether a policy or intervention is working.

While all the evidence frameworks generally aim to improve the use of evidence, the different goals of the organisations responsible can shape the frameworks in different ways. They can be used to inform funding decisions, to make recommendations to the wider sector about what works and what doesn’t, or as a resource to help providers to evaluate. And unfortunately this means that the same intervention can be assessed differently depending on which framework is used.

The Alliance for Useful Evidence concludes that while a focus on evidence use is positive, the diversity of evidence standards risks creating confusion. Suggested options for improving the situation include introducing an independent accreditation system, or having a one-stop shop which would make it easier to compare ratings of interventions.

Dissemination and wider engagement

The question of standardising evidence frameworks is just one part of a wider effort to increase transparency. As well as collecting evidence, it’s important that, when public money has been invested in carrying out evaluations and impact assessments, this evidence remains accessible over the longer term and that lessons are learned. It can often seem that government departments have very short organisational memories – especially if they’ve suffered a high churn of staff.

Two projects which we support in Scotland are focused on increasing the dissemination and awareness of evaluation and research evidence. Research Online is Scotland’s labour market information hub. Produced by ourselves and Skills Development Scotland, the portal brings together a range of statistics and research and acts as the centre of a community of practice for labour market researchers, practitioners and policy-makers.

Meanwhile Evaluations Online is a publicly accessible collection of evaluation and research reports from Scottish Enterprise. The reports cover all aspects of Scottish Enterprise’s economic development activities – some of the latest added to the site cover megatrends affecting Scottish tourism, innovation systems and the gender gap, and the commercial flower-growing sector in Scotland.

When working within the policy world it can be easy to suffer from fatigue as ideas appear to be continually recycled, rejected and then revisited as policy fashions change and political parties or factions go in and out of power. The spotlight, often driven by the media, will shine on one hot policy issue – for example, moped crime, cannabis legislation or health spending – and then move on.

Online libraries of evaluations and research reports are one tool which can help support a longer-term culture of learning and improvement within the public sector.

Evidence Week 2018

Inspired by similar objectives, Evidence Week runs from 25th to 28th June 2018 and aims to explore the work of parliamentarians in seeking and scrutinising evidence. It will bring together MPs, peers, parliamentary services and the public to talk about why evidence matters, and how to use and improve research evidence.

This may be the start of wider knowledge sharing about standards of evidence, to help those using them to improve their practice.


The Knowledge Exchange is a member of the Alliance for Useful Evidence. Our databases are used by government and the public sector, as well as private-sector consultancies, to keep abreast of policy news and research in social and public policy.

Why UK-sourced evidence matters … and why it is so often ignored

By Morwen Johnson

If you follow our blog, you’ll know that we care passionately about promoting the uptake of evidence and research by policymakers and practitioners. It’s easy to be complacent and assume that when public money is at stake, decisions are made on the basis of evaluations and reviews. Unfortunately, this is still not always the case.

The current evidence-based policy agenda in the UK encompasses initiatives such as the What Works Network, the Local Government Knowledge Navigators and independent organisations such as the Alliance for Useful Evidence. They are working on fostering demand for evidence, as well as linking up academics with those in the public sector to ensure that the research community is responsive to the needs of those making decisions and designing/delivering services.

A recent article in Health Information and Libraries Journal highlights another challenge in evidence-based policy however. A mapping exercise has found that literature reviews often ignore specialist databases, in favour of the large, well-known databases produced by major commercial publishers. Within the health and social care field (the focus of the article), literature reviews tend to use databases such as Medline, Embase and Cinahl – and overlook independent UK-produced databases, even when they are more relevant to the research question.

Why does it matter?

Research has shown that how (and why) databases are chosen for literature searching can “dramatically influence the research upon which reviews, and, in particular, systematic review, rely upon to create their evidence base”.

To generate useful evidence for the UK context (relating to UK policy issues or populations), researchers need to understand the most appropriate databases to search – but unfortunately our own experience of looking at the detail of methodologies in evidence reviews suggests that in many cases the only databases searched are those produced by American or international publishers.

Grey literature is a valuable source in evidence reviews – and again this is often overlooked in the major databases, which tend to focus only on peer-reviewed journal content. A recent Australian report, ‘Where is the evidence?’, argued that grey literature is a key part of the evidence base and is valuable for public policy, because it addresses the perspectives of different stakeholder groups, tracks changes in policy and implementation, and supports knowledge exchange between sectors (academic, government and third sector).

Another benefit of UK-produced databases is that they will make use of UK terminology in abstracts and keywords.

Social Policy and Practice – a unique resource

At this point I should declare a vested interest – The Knowledge Exchange is a member of a UK consortium which produces the Social Policy and Practice (SPP) database. The SPP database was created in 2005 after five UK organisations, each with a library focused on sharing knowledge in community health and social care, agreed to merge their individual content in order to make it available to the widest possible audience.

The current members of the SPP consortium – the National Children’s Bureau, the Idox Knowledge Exchange, the Centre for Policy on Ageing and the Social Care Institute for Excellence – have just been joined by the National Society for the Prevention of Cruelty to Children. Inclusion of the NSPCC’s bibliographic data greatly enhances the coverage of child protection research in the database. SPP has been identified by NICE, the National Institute for Health and Care Excellence, as a key resource for those involved in research into health and social care.

We want the UK research community to understand what SPP offers, and to use it when undertaking literature reviews or evidence searches. This process of awareness raising should start with students – librarians in universities and the UK doctoral training centres have a key role in this as it ties in with the development of information literacy and critical appraisal skills. Ignoring specialist sources such as SPP risks introducing bias – at a time when initiatives are attempting to embed research and analytics in local government and the wider public sector.


Information on the coverage of Social Policy and Practice is available here and the distributor Ovid is offering a free 30-day trial.

Knowledge insider… a Q&A with Jonathan Breckon

In the latest of our series of Q&As with leading advocates of the use of evidence in policymaking and practice, we talk to Jonathan Breckon. Jonathan is Head of the Alliance for Useful Evidence – a partnership which champions the need for useful evidence, providing a focal point for improving and extending the use of social research and evidence in the UK.

Jonathan, what led you to a role focused on promoting and improving knowledge development?

There are two ways in which I am interested in knowledge development: professionally, I have always worked around universities, loved doing and finding out about new research, and worked within research in the UK. I have always been conscious, however, of the gap between research and front-line services, even when research is relevant to the service, and felt this was a great loss and disadvantage to public services.

My personal interest is as a user of public services. With my kids going through services such as schools, health and sports, I have been desperately aware that things are business as usual rather than continuously striving for innovation and change. The debate is now all about money and reductions when it should be about improvement and future-proofing.

We don’t always know what it takes to bridge the gap between what we need, and what services can provide; research can actually help that. The What Works approach is really important but very hard, as it’s difficult to stop doing things we have already invested in. An evidence-based research approach can challenge and support this evaluation and we have a moral duty to do it and not continue to invest in services which don’t work.

What do you think the main benefits of developing your knowledge are?

The challenge of seeing if things work or not, why they work, where they work and who they work for – developing your knowledge is the critical aspect of improving how you do things.

It’s also important for a whole host of other benefits. I particularly like Carol Weiss’ work, which is instrumental – this ‘enlightenment’ operational research should not be dismissed. This approach can support the idea of learning continuously through research; it implies a continuous review of theory, methods and practice, and we should always be striving to improve our methods and outcomes.

When people are talking to you about evidence, research or knowledge, what do they most frequently raise as issues?

The most common one is that investing in evidence is just rhetoric; politicians, charities, parties etc. will never really be informed by the research agenda, and I agree. We aren’t in a super-rational culture; it’s about our wider culture, values and beliefs as well. But it’s a fundamental misunderstanding that research should trump everything. It is part of the mix, part of the overall democratic and rational approach to doing anything.

The Behavioural Insights Team has a massive role to play in understanding the biases in how we make decisions, whether in prisons, police, policy etc. We don’t work rationally all the time and evidence can help us understand the messiness of policy making. We just need it to be a bigger part of the mix.

Everybody has this view that they use evidence but we don’t really understand how effectively they use it.

What are the hard to spot mistakes when it comes to developing your knowledge, which you really need to avoid?

The main one is that not all evidence is equal; you have to make judgements about it. This is hard for those writing the research – it’s not just about the quality of the research, but about the point of view of demand. Users need some things and not others.

The big challenge, if you are looking at impact, is that you need different approaches, experiments and systematic reviews – one study is not enough. Take studies reported in newspapers: until replicated, we don’t really know if they are robust. You need to avoid literature reviews where you cherry-pick; go to something which is transparent and systematic. This is true from both policy makers’ and researchers’ points of view; we underestimate the challenges facing both sides.

We need more about impact. We are very good at qualitative methods – world class – but we are behind in quantitative methods. This is being addressed, but it will need to filter through.

How do you think people will be doing evidence, research and knowledge development in 5 years’ time?

What Works Centres will, I hope, be a key part of the evidence ecosystem, in the way that NICE have done, helping providers and policy makers make decisions. Although NICE doesn’t do research itself, it sucks in research and uses it well.

There will always be critics of them, even of just the name, but they will change the system. Some have been around for a while and are well established, but others are new or just about to launch. As well as synthesising research, they will commission new work. For instance, in wellbeing, we know a lot about the correlation between health and wellbeing but don’t know a lot about what will work in improving it.

Technology makes it very difficult to guess about the future – who would have predicted the work in social media research? Big data is emerging now and in 5 years’ time might be a standard tool. The fundamental principles, like statistics, will still be there, but we will have to adapt to the possibilities offered by technology.

If you had a list of ‘best-kept secrets’ about research, evidence and knowledge you would recommend, what would you include and why?

Having done a social science masters and PhD does not make you an expert in evidence, partly because people over-specialise. People need to open their minds to different methods and to how people do things in other places. The Department for International Development have an amazing range of techniques, nothing like anything you have seen, with a database of all the research they have funded or delivered.

Emerging opportunities such as social media research are still in their early days and fundamentally new, and could have a huge impact. Most people’s default, though, is to go to an expert and be frightened off by journals and academics. I don’t think you always have to commission something new; it’s about variety, breadth and developing your understanding in as many ways as possible.


 

You can also read Q&As with Tim Allen, Local Government Knowledge Navigator; Clive Grace, Local Government Knowledge Navigator; Sarah Jennings of the Knowledge Hub; and Kim Ryley, recent Past President of the Society of Local Authority Chief Executives.

 

How collecting evidence can help improve public policy: Scottish Enterprise’s approach to economic development evaluation

By Stephen Lochore

The Knowledge Exchange has helped a range of public sector organisations to collate, analyse and disseminate their evaluation and research material. One example is the Evaluations Online portal, a publicly accessible collection of evaluation and research reports from Scottish Enterprise, one of Scotland’s two regional development agencies. We’ve recently made some changes to the portal, and though these were mainly to design and layout, it got me (as Project Manager) thinking about wider issues of accountability and evidence-based policy.