Using qualitative methods in program evaluation

24 May 2016

This article outlines some key considerations for using qualitative methods in program evaluation.

Qualitative research seeks to answer questions that stress how social experience is created and given meaning, beyond the scope of numbers and statistics (Rogers & Goodrick, 2010). Qualitative research can use a variety of methods, such as interviews and observations – the method you choose should be based on what you are researching and the resources you have. Whatever the method, rigour and transparency are the keys to good qualitative research. That is, to be of high quality, your research should be supported by a series of logical and justifiable steps.

Where should I start?

To begin with, have a strong rationale for why you have chosen a qualitative approach to answer your research question(s), and identify suitable data collection methods (interviews, focus groups, observations, open-ended surveys, etc.) and any key perspectives that should be captured. Consider what techniques or concepts will guide the data analysis and interpretation stage, and what quality checks you can put in place to justify your interpretations.

Overall, being clear about the methodological process will help to strengthen the credibility of your findings.

Can qualitative methods be used to measure program impact?

If you’re considering adopting a qualitative approach to evaluate the impact of a program¹, be aware that using strictly qualitative methods may not generate the answers you need, and is generally recommended in only a handful of cases: for instance, if you have a small population, if there are no existing outcome measures that can be used with your target group, or if you are evaluating a pilot version of the program.

An alternative option is to incorporate qualitative methods into your evaluation design alongside quantitative methods, as part of a mixed-methods design. Evaluations that adopt a mixed-methods approach are well placed to establish causal relationships between program content and outcomes, and to tell us how and why any changes occurred. Mixed methods can also be used to:

  • test assumptions about how programs work in practice;
  • identify or explore unintended outcomes of the program; and
  • capture detailed and complex data about a particular issue or program, and enhance understanding of which aspects of the program have and haven’t worked.

Can anyone conduct a qualitative program evaluation?

Regardless of the techniques you choose to gather information about your program, conducting a quality qualitative evaluation relies on having time and expertise. Collecting participant insights can be time-consuming, especially if only one evaluator is involved in the project, and there is the potential to be left with a large volume of data that needs to be interpreted, synthesised and communicated. Having an expert in qualitative methods conduct or assist with the evaluation will help to ensure that these tasks are completed in a systematic and rigorous way. The evaluation write-up is a good opportunity to demonstrate that your evaluation is of high quality – but again, this requires a specific set of skills.

Need help?

If you don’t have the necessary skills to undertake your research and can’t access help from within your organisation, a member of the Industry List can help with a range of tasks, from providing advice on your research design through to conducting a program evaluation. Contact the expert panel team at fac-expert-panel@aifs.gov.au for more information.

Further reading

  • Ezzy, D. (2002). Qualitative analysis: Practice and innovation. Crows Nest: Allen & Unwin. Chapter 4.
  • Liamputtong, P., & Ezzy, D. (2009). Qualitative research methods (3rd ed.). South Melbourne: Oxford University Press.
  • Miller, E., & Daly, E. (2013). Understanding and measuring outcomes: The role of qualitative data. Glasgow, Scotland: Institute for Research and Innovation in Social Services.
  • Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks: Sage Publications.
  • Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass.
  • Trochim, W. (2006). The Research Methods Knowledge Base. Ithaca: Web Center for Social Research Methods. Qualitative measures section.

References

Rogers, P., & Goodrick, D. (2010). Qualitative data analysis. In J. Wholey, H. Hatry, & K. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed., pp. 429–453). San Francisco: Jossey-Bass.

1. For a discussion of the differences between impact and process evaluation, see the CFCA Practitioner Resource Evidence-based practice and service-based evaluation.

The feature image is by Stef Lewandowski, CC BY-NC 2.0.

Comments

I would be interested to learn about the certification programs or licences required for social work practitioners, especially in the realm of non-clinical social work and engagement with communities in the international development context.
MINTO
Hi Minto. In Australia the peak body for social workers is the Australian Association of Social Workers (https://www.aasw.asn.au/) – there may be some information there that is useful to you. It's hard to comment on what would be needed for roles in the international development context you have mentioned, as it's likely to vary according to what the role is. Some roles may require experience or qualifications in community development or similar areas; others may require more formal qualifications in social work. Searching roles at places like the World Health Organization or the United Nations may give some clues. There are also specific tertiary courses related to international development, e.g. https://government.unimelb.edu.au/degrees/14-master-of-development-studies. Please let us know if we can help with anything else. Thanks, Elly
Elly


Authors

Kathryn is a Senior Research Officer within the Practice Evidence and Engagement area at the Australian Institute of Family Studies.

Kelly is a Senior Research Fellow and Manager of Project Evaluation and Qualitative Research at the Australian Institute of Family Studies.

