CES Guide to: Evaluation

The ‘CES Guide to’ series lifts the lid on some of the approaches we use in our work with agencies, service providers and government departments.

CES regularly carries out evaluations for our partners, examining the effectiveness of policies, projects or services to inform future developments or improvements. In this ‘Guide to Evaluation’ we look at why and when you should evaluate, the different types of evaluation, and common challenges to be aware of.

What is an Evaluation?

Our definition of an evaluation is:

‘A planned investigation of an intervention, according to specific questions of interest. It is carried out in a systematic and robust way, using reliable social scientific methods, to determine an intervention’s value, merit or worth.’ 

An evaluation is an investigation. It examines or studies an aspect or aspects of an intervention in detail. In our work, an intervention usually refers to a service, project, programme or policy. Essentially, an intervention can be anything people do to try to change a problem or situation.

The aspects of the intervention to be examined or studied depend on the specific questions of interest. Evaluation questions should be specific and clearly direct the evaluator towards the information you want to find out. It is often useful to consult key evaluation stakeholders for feedback and revise your evaluation questions if required.

Why and when to carry out an evaluation

Evaluations help examine the effectiveness of policies, projects or services to inform future developments or improvements. It's best practice to plan monitoring and evaluation before implementation to ensure proper data collection systems are in place from the start. There are a range of reasons why you might wish to evaluate an intervention, service, policy, or practice.

What are some of the different types of evaluation?

  • Formative evaluations - Help improve ongoing interventions and inform future ones.
  • Summative evaluations - Assess impacts and overall merit of completed interventions; inform decisions about continuation or replication.
  • Process evaluations - Examine implementation factors and inform scaling-up decisions.
  • Outcome evaluations - Assess effects and effectiveness to determine if the intervention made a difference.
  • Retrospective evaluations - Look back at completed interventions, using questions framed in the past tense.
  • Prospective evaluations - Move forward in time with ongoing interventions, using questions framed in the present tense.                
  • Pragmatic evaluations - Apply the best feasible method within real-world constraints; useful for well-defined questions with limited time/budget.
  • Theory-based evaluations - Develop theories of how interventions work; helpful for understanding 'how, when and for whom'.
  • Realist evaluations - Theory-based approach asking "What works, in what circumstances, and for whom?".

Common Challenges and Constraints

Budget Constraints

Robust evaluations can be resource-intensive. Cost reduction strategies include:

  • Simplifying evaluation design
  • Prioritising key questions
  • Using secondary data
  • Reducing sample sizes
  • Incorporating new technology

Time Constraints

Common issues include insufficient timeframes to measure outcomes, late commissioning preventing baseline data collection, and tight deadlines. Management strategies include:

  • Simplifying design
  • Prioritising key information needs
  • Using efficient data collection methods
  • Specifying short, medium and long-term outcomes

Data Constraints

Issues with missing, incomplete or poor-quality data. Solutions include:

  • Reconstructing baseline data
  • Adapting methods for hard-to-reach groups
  • Using mixed-methods approaches
  • Planning evaluation from the beginning

Political and Organisational Constraints

These may include obstruction attempts, threats to independence, or stakeholder resistance. Effective management requires:

  • Clear evaluation boundaries
  • Good teamwork and communication
  • Collaborative stakeholder analyses

Ethical Considerations

All CES evaluations follow key ethical principles:

  • Do no harm
  • Informed consent
  • Voluntary participation
  • Confidentiality and anonymity

Knowledge and Skills Constraints

No evaluator can be skilled in everything, so it's important to assess what type of evaluation matches the expertise available.

Key Factors for Choosing Evaluation Type

1. Purpose and audience

2. Capacity and resource availability

3. Nature and scope of intervention

4. Context complexity

5. Stage of project implementation

Important Note on Limitations

Evaluators must be transparent about constraints and limitations, as these will affect the certainty of findings. It's crucial to express appropriate caution about conclusions drawn from constrained evaluations.

