CES Guide to: Using Evidence

The ‘CES Guide to’ series lifts the lid on some of the approaches we use in our work with agencies, service providers, government departments, and the community and voluntary sector.

Trustworthy evidence is central to effective decision-making, improves the delivery of publicly funded services and, ultimately and most importantly, contributes to improving outcomes for people in Ireland and Northern Ireland.

Evidence is an essential foundation for our work at CES. We produce evidence as an independent evaluator of programmes, policies and services. We translate evidence into usable formats, and we support others to understand and use evidence to inform decisions on policy, services and practice.

In this guide we share our approach to evidence and our key takeaways.

What is evidence?

Our common understanding of evidence is information or facts that give us reason to believe that something is true, valid or present. Evidence is an essential component of decision-making for policy and practice, particularly when aiming to improve public services under funding constraints and time pressures.

Evidence is typically understood as a ‘good’ thing, but it is important to recognise that it can take a variety of different forms. In some contexts, evidence is understood to mean ‘empirical data’ or ‘research findings’. In other contexts it simply means ‘information we can trust’ or ‘information that can be relied upon’.  

At CES we draw on a wide range of evidence including research evidence, public evaluations, practitioner wisdom and local knowledge, good-practice examples, expert opinion, good quality service data, and the lived experience of citizens using services. We use the term ‘evidence’ to encompass all of these things.  

How do we decide what evidence to use?

Different forms of evidence are valued differently in different contexts. Rather than ranking forms of evidence against set methodological criteria, we advocate a more pragmatic approach.

When considering what evidence you need to support decisions, ask yourself: “Can the evidence bear the weight of the decision it is supporting?” For a decision about changes to how a small local service is delivered, for example, it may be enough to learn from service user feedback and local knowledge of community needs. For decisions about national policy, potentially impacting large numbers of people with significant implications for their lives and for public spending, it is likely that a robust and extensive evidence base will be needed to enable evidence-informed decisions.

To guide us, we apply a set of established principles to decide what evidence to use and how to use it.

Our guiding principles

Relevance  

We select evidence that is relevant to the question asked or decision to be made. Taking time to scope the boundaries of ‘what is’ and ‘is not’ relevant to your question is a vital step in any consideration of the evidence.  

Tip: Borrowing tools from systematic review and evidence synthesis methodologies, such as SPICE (setting, perspective, intervention/exposure/interest, comparison and evaluation) or PEO (population, exposure, outcome), can be very useful when scoping your question. For example, a question about whether home visiting improves outcomes for families with young children can be scoped by defining the population (families with young children), the exposure or intervention (home visiting) and the outcomes of interest.

Starting with a clear sense of what is and is not relevant to your question is important to guide your decisions on what evidence to pay attention to. For example, is your question about the experiences of people in rural communities only, or about families with young children? Having a clear definition of the population that may be impacted by the decision will guide you in assessing whether or not the available evidence applies to those who will be affected.

We pay particular attention to context when assessing the relevance of evidence. The research, geographical, economic, social and political contexts act as the backdrop against which evidence is generated. Ask yourself: is the context in which the evidence was generated the same as, or similar enough to, the context you want to apply it to? For example, can you reasonably apply evidence on the effect of financial support for lone-parent families in the USA to an Irish context?

Quality of evidence

We are diligent in finding and assessing evidence to make sure that we are using the best available evidence to support our work with clients and partners.  

A key question when it comes to quality is ‘how reliable is this evidence?’ Numerous concepts have been applied to assess the quality of evidence, such as the risk of bias in a study’s design and execution. We encourage you to consider whether the evidence you are using was generated in a way that makes it reliable enough for your purposes. Consider who generated the evidence and whether any conflicts of interest introduce a risk that the evidence may be biased. How trusted is the source of the evidence? For example, you might place more trust in evidence from peer-reviewed scientific publications or reports from reputable organisations than in media articles or opinion pieces with a clear agenda or conflict of interest.

Tip: There is a range of freely available tools that can support you to assess the quality of evidence, such as JBI’s critical appraisal tools or the Critical Appraisal Skills Programme (CASP) checklists.

Weight of evidence

When drawing together multiple sources of evidence, we consider the weight of the evidence. The sheer volume of evidence, that is, the number of relevant pieces you hold, matters much less than the weight of that evidence.

Weighing the evidence involves considering the quality, relevance, precision and consistency of the evidence you have gathered. All else being equal, we would place more weight on:

  • evidence that is more relevant to the question or decision in hand
  • higher-quality evidence - for example, we would place more weight on a well-conducted systematic review than on a single study
  • studies that provide more precise evidence - for example, if our question concerns the impact of a programme on two-year-olds’ development, we would prioritise evidence about two-year-olds rather than evidence about ‘under-5s’ that doesn’t distinguish between two, three and four-year-olds
  • a body of evidence that is consistent - for example, a number of different pieces or types of evidence that all support the same answer to the question would be given more weight than a single contradictory piece of evidence.

Tip: You can find more guidance on using ‘weight of evidence’ approaches from the European Food Safety Authority (EFSA, 2017).

Equity

We work with evidence that impacts people, and we recognise that different people or groups may be impacted differently by a policy or practice. So we always consider who the evidence applies to and whether or not there is ‘disaggregated’ evidence (evidence broken down by group) that can tell us if some groups are impacted more than others.

Critical and reflexive

As researchers and evaluators, we seek to be honest brokers. We examine our own beliefs, judgements and practices to ensure that the most appropriate evidence is used and that we aren’t ignoring or minimising relevant evidence that does not fit with our own world view or assumptions.

Evidence gaps

There may not always be strong, high-quality, relevant evidence available to support decision-making. So, what then? Deciding to do nothing is still a decision, so we need to be able to work with an incomplete evidence base. We support decision-makers to use innovative thinking to make the best use of the available evidence. When a decision has to be made but the evidence is sparse, it is vital that decision-makers commit to monitoring what happens and to generating new evidence to fill the gaps.

How do we use evidence?

Here are some examples of our work and how we used evidence:

  • CES’s evaluation of the Area Based Childhood Programme demonstrates that the use of evidence-informed and evidence-based practice supported service user engagement, helped to secure the buy-in of practitioners responsible for delivering services, and encouraged senior decision-makers to engage with innovation.
  • In the Nurture Programme, CES worked in partnership with the HSE and others to develop and test evidence-based resources for parents that are still in use many years later at www.mychild.ie.
  • We are supporting teams of academics in STEM to generate social impact, through training and mentoring commissioned by Science Foundation Ireland’s Challenge Fund.
  • We collaborate with academic researchers to mobilise evidence into policy and practice, for example through Bridging the gap between research and policy - Stable Lives, Safer Streets.
  • We support organisations to evaluate their work, either as an independent evaluator or as a partner in building capacity to use evidence and evaluation.  

Useful resources:

We have developed a suite of evidence-informed resources, including a series of related guides.
