A world of evidence … but can we trust that it is any good?

What is good evidence? And how can policymakers and decision-makers judge what is working and what isn't when deciding where, and how, public money is spent?

These are the kinds of questions that models and tools such as randomised controlled trials and cost-benefit analysis attempt to answer. Over the last five years the government has also supported the development of the What Works Network, which now consists of 10 independent What Works Centres. When it comes to assessing impact, there has also been a move towards capturing and recognising the value of qualitative data.

As one of our key aims is to support and facilitate the sharing and use of evidence in the public sector, we were interested to read a new publication ‘Mapping the standards of evidence used in UK social policy’.

Standards of evidence

Produced by the Alliance for Useful Evidence, the research has found 18 different Standards of Evidence currently in use across UK social policy.

The report notes that over the last decade there has been increasing interest in grading effectiveness or impact against a level or scale. Typically, the higher up the scale, the more evidence is available. Theoretically this means that decision-makers can have higher confidence in deciding whether a policy or intervention is working.

While the evidence frameworks all broadly aim to improve the use of evidence, the goals of the organisations responsible shape them in different ways. They can be used to inform funding decisions, to make recommendations to the wider sector about what works and what doesn't, or as a resource to help providers evaluate their own work. Unfortunately, this means that the same intervention can be assessed differently depending on which framework is used.

The Alliance for Useful Evidence concludes that while a focus on evidence use is positive, the diversity of evidence standards risks creating confusion. Suggested options for improving the situation include introducing an independent accreditation system, or having a one-stop shop which would make it easier to compare ratings of interventions.

Dissemination and wider engagement

The question of standardising evidence frameworks is just one part of a wider effort to increase transparency. As well as collecting evidence, it's important that, when public money has been invested in evaluations and impact assessments, the resulting evidence remains accessible over the longer term and that lessons are learned. It can often seem that government departments have very short organisational memories – especially if they've suffered a high churn of staff.

Two projects which we support in Scotland are focused on increasing the dissemination and awareness of evaluation and research evidence. Research Online is Scotland's labour market information hub. Produced by us in partnership with Skills Development Scotland, the portal brings together a range of statistics and research, and acts as the centre of a community of practice for labour market researchers, practitioners and policy-makers.

Meanwhile Evaluations Online is a publicly accessible collection of evaluation and research reports from Scottish Enterprise. The reports cover all aspects of Scottish Enterprise’s economic development activities – some of the latest added to the site cover megatrends affecting Scottish tourism, innovation systems and the gender gap, and the commercial flower-growing sector in Scotland.

When working within the policy world it can be easy to suffer from fatigue as ideas appear to be continually recycled, rejected and then revisited as policy fashions change and political parties or factions go in and out of power. The spotlight, often driven by the media, will shine on one hot policy issue – for example, moped crime, cannabis legislation or health spending – and then move on.

Online libraries of evaluations and research reports are one tool which can help support a longer-term culture of learning and improvement within the public sector.

Evidence Week 2018

Inspired by similar objectives, Evidence Week runs from 25th to 28th June 2018 and aims to explore the work of parliamentarians in seeking and scrutinising evidence. It will bring together MPs, peers, parliamentary services and the public to talk about why evidence matters, and how to use and improve research evidence.

This may be the start of wider knowledge sharing about standards of evidence, to help those using them to improve their practice.


The Knowledge Exchange is a member of the Alliance for Useful Evidence. Our databases are used by government and the public sector, as well as private-sector consultancies, to keep abreast of policy news and research in social and public policy.

Why UK-sourced evidence matters … and why it is so often ignored

By Morwen Johnson

If you follow our blog, you’ll know that we care passionately about promoting the uptake of evidence and research by policymakers and practitioners. It’s easy to be complacent and assume that when public money is at stake, decisions are made on the basis of evaluations and reviews. Unfortunately, this is still not always the case.

The current evidence-based policy agenda in the UK encompasses initiatives such as the What Works Network, the Local Government Knowledge Navigators and independent organisations such as the Alliance for Useful Evidence. These are working to foster demand for evidence and to link up academics with the public sector, so that the research community is responsive to the needs of those making decisions and designing and delivering services.

A recent article in Health Information and Libraries Journal highlights another challenge in evidence-based policy, however. A mapping exercise has found that literature reviews often ignore specialist databases in favour of the large, well-known databases produced by major commercial publishers. Within the health and social care field (the focus of the article), literature reviews tend to use databases such as Medline, Embase and Cinahl – and overlook independent UK-produced databases, even when these are more relevant to the research question.

Why does it matter?

Research has shown that how (and why) databases are chosen for literature searching can “dramatically influence the research upon which reviews, and, in particular, systematic review, rely upon to create their evidence base”.

To generate useful evidence for the UK context (relating to UK policy issues or populations), researchers need to know the most appropriate databases to search – but unfortunately our own experience of looking at the detail of methodologies in evidence reviews suggests that in many cases the only databases searched are those produced by American or international publishers.

Grey literature is a valuable source in evidence reviews – and again this is often overlooked by the major databases, which tend to focus only on peer-reviewed journal content. A recent Australian report, ‘Where is the evidence?’, argued that grey literature is a key part of the evidence base and is valuable for public policy, because it addresses the perspectives of different stakeholder groups, tracks changes in policy and implementation, and supports knowledge exchange between sectors (academic, government and third sector).

Another benefit of UK-produced databases is that they will make use of UK terminology in abstracts and keywords.

Social Policy and Practice – a unique resource

At this point I should declare a vested interest – The Knowledge Exchange is a member of a UK consortium which produces the Social Policy and Practice (SPP) database. The SPP database was created in 2005 after five UK organisations, each with a library focused on sharing knowledge in community health and social care, agreed to merge their individual content in order to make it available to the widest possible audience.

The current members of the SPP consortium – the National Children’s Bureau, the Idox Knowledge Exchange, the Centre for Policy on Ageing and the Social Care Institute for Excellence – have just been joined by the National Society for the Prevention of Cruelty to Children. Inclusion of the NSPCC’s bibliographic data greatly enhances the coverage of child protection research in the database. SPP has been identified by NICE, the National Institute for Health and Care Excellence, as a key resource for those involved in research into health and social care.

We want the UK research community to understand what SPP offers, and to use it when undertaking literature reviews or evidence searches. This process of awareness raising should start with students – librarians in universities and the UK doctoral training centres have a key role in this as it ties in with the development of information literacy and critical appraisal skills. Ignoring specialist sources such as SPP risks introducing bias – at a time when initiatives are attempting to embed research and analytics in local government and the wider public sector.


Information on the coverage of Social Policy and Practice is available here and the distributor Ovid is offering a free 30-day trial.

Knowledge insider… a Q&A with Jonathan Breckon

In the latest of our series of Q&As with leading advocates of the use of evidence in policymaking and practice, we talk to Jonathan Breckon. Jonathan is Head of the Alliance for Useful Evidence – a partnership which champions the need for useful evidence, providing a focal point for improving and extending the use of social research and evidence in the UK.

Jonathan, what led you to a role promoting and improving knowledge development?

There are two ways in which I am interested in knowledge development. Professionally, I have always worked around universities, loved doing and finding out about new research, and worked within research in the UK. I have always been conscious, however, of the gap between research and front-line services, even when research is relevant to the service, and felt this was a great loss and disadvantage to public services.

My personal interest is as a user of public services: with my kids going through services such as schools, health and sports, I have been desperately aware that things are business as usual rather than continuously striving for innovation and change. The debate is now all about money and reductions when it should be about improvement and future-proofing.

We don’t always know what it takes to bridge the gap between what we need, and what services can provide; research can actually help that. The What Works approach is really important but very hard, as it’s difficult to stop doing things we have already invested in. An evidence-based research approach can challenge and support this evaluation and we have a moral duty to do it and not continue to invest in services which don’t work.

What do you think the main benefits of developing your knowledge are?

The challenge of seeing if things work or not, why they work, where they work and who they work for – developing your knowledge is the critical aspect of improving how you do things.

It's also important for a whole host of other benefits. I particularly like Carol Weiss's work on the ways research gets used – this 'enlightenment' use of research, alongside the instrumental, should not be dismissed. It supports the idea of learning continuously through research; it implies a continuous review of theory, methods and practice, and we should always be striving to improve our methods and outcomes.

When people are talking to you about evidence, research or knowledge, what do they most frequently raise as issues?

The most common one is that investing in evidence is just rhetoric – that politicians, charities, parties and so on will never really be informed by the research agenda – and I agree. We aren't in a super-rational culture; it's about our wider culture, values and beliefs as well. But it's a fundamental misunderstanding to think that research trumps everything. It is part of the mix, part of the overall democratic and rational approach to doing anything.

The Behavioural Insights Team has a massive role to play in understanding the biases in how we make decisions, whether in prisons, police, policy etc. We don’t work rationally all the time and evidence can help us understand the messiness of policy making. We just need it to be a bigger part of the mix.

Everybody has this view that they use evidence but we don’t really understand how effectively they use it.

What are the hard to spot mistakes when it comes to developing your knowledge, which you really need to avoid?

The main one is that not all evidence is equal – you have to make judgements about it. This is hard for those writing the research: it's not just about the quality of the research, it's about the point of view of the demand side. Users need some things and not others.

The big challenge, if you are looking at impact, is that you need different approaches – experiments, systematic reviews – one study is not enough. Take studies reported in newspapers: until they are replicated we don't really know if the findings are robust. You need to avoid literature reviews that cherry-pick, and go for something which is transparent and systematic. This is true from both the policy makers' and the researchers' points of view; we underestimate the challenges facing both sides.

We also need more of a focus on impact. We are very good at qualitative methods – world class – but we are behind in quantitative methods. It is being addressed, but it will need time to filter through.

How do you think people will be doing evidence, research and knowledge development in 5 years’ time?

What Works Centres will, I hope, be a key part of the evidence ecosystem, in the way that NICE has become, helping providers and policy makers make decisions. Although NICE doesn't do research itself, it draws in research and uses it well.

There will always be critics of them, even of just the name, but they will change the system. Some have been around for a while and are well established, while others are new or just getting started. As well as synthesising research they will commission new work. For instance in wellbeing, we know a lot about what correlates with health and wellbeing, but we don't know a lot about what will work in improving it.

Technology makes it very difficult to guess about the future – who would have predicted the work in social media research? Big data is emerging now and in five years' time might be a standard tool. The fundamental principles, like statistics, will still be there, but we will have to adapt to the possibilities offered by technology.

If you had a list of ‘best-kept secrets’ about research, evidence and knowledge you would recommend, what would you include and why?

Having done a social science master's and PhD does not make you an expert in evidence, partly because people over-specialise. People need to open their minds to different methods and to how people do things in other places. The Department for International Development has an amazing range of techniques, unlike anything you will have seen, with a database of all the research it has funded or delivered.

Emerging opportunities such as social media research are still in their early days and fundamentally new, but could have a huge impact. Most people's default, though, is to go to an expert or to be frightened off by journals and academics. I don't think you always have to commission something new; it's about variety, breadth and developing your understanding in as many ways as possible.



You can also read Q&As with Tim Allen, Local Government Knowledge Navigator; Clive Grace, Local Government Knowledge Navigator; Sarah Jennings of the Knowledge Hub; and Kim Ryley, recent Past President of the Society of Local Authority Chief Executives.


How collecting evidence can help improve public policy: Scottish Enterprise’s approach to economic development evaluation

By Stephen Lochore

The Knowledge Exchange has helped a range of public sector organisations to collate, analyse and disseminate their evaluation and research material. One example is the Evaluations Online portal, a publicly accessible collection of evaluation and research reports from Scottish Enterprise, one of Scotland's two regional development agencies. We've recently made some changes to the portal, and though these were mainly to design and layout, it got me (as Project Manager) thinking about wider issues of accountability and evidence-based policy.