Figure 1: Key Steps in an evaluation
1. Why do I need to evaluate?
- Identify evaluation purpose
- Identify evaluation audience
2. What do I need to find out?
- Identify evaluation questions
- Select evaluation design
3. What will I measure?
- Select outcomes and outputs for measurement
- Identify indicators
4. How will I measure it?
- Select data collection methods
- Ensure data is of good quality
5. Who will I collect data from?
- Determine sample
- Consider ethics
6. When will I collect data?
- Develop timeline
7. What will I do with the data?
- Data analysis
- Evaluation write-up
This diagram demonstrates how each of the columns in a program logic model translates to evaluation questions and indicators, using a parent education program as the example:

Inputs
- Key evaluation questions: Were the inputs sufficient and timely?

Activities: Design parent education curriculum; provide six interactive training sessions with handouts
- Key evaluation questions: Was the curriculum developed? Were all six sessions delivered?
- Indicators: Curriculum; # sessions held; when delivered

Outputs: Targeted parents attend
- Key evaluation questions: Did all the parents we intended attend? Who did/did not attend? Did they attend all six sessions? Why/why not?
- Indicators: #, % attended per session; certificate of completion

Short-term outcomes: Parents increase knowledge of child development; parents learn new ways to discipline
- Key evaluation questions: To what extent did knowledge increase? Did they learn new approaches? What else happened?
- Indicators: #, % with increased knowledge of...; additional outcomes

Intermediate outcomes: Parents use improved parenting skills
- Key evaluation questions: Are parents actually using improved skills? So what? What difference do these skills make?
- Indicators: #, % using skills; types of differences reported

Long-term outcomes: Reduced rates of child abuse and neglect among participants
- Key evaluation questions: Has there been a decrease in rates among participants? Were goals met?
- Indicators: #, % abusing/neglecting children before/after