Anagha Joshi is a Senior Research Officer at the Australian Institute of Family Studies.
Using research to answer practice questions
This is the second short article in a series focusing on the use of evidence in practice. This article describes how to incorporate appropriate research into decision making based on practice-relevant questions, and considers the strengths and limitations of using different types of research.
Research is a core component of an evidence-informed approach to practice. To support the best selection and use of research in decision making, an approach that maps a set of common research questions against appropriate types of research can be used as a guide. This can support you in selecting appropriate research to identify a problem in a community or to answer questions on the effectiveness of a program or appropriateness of a practice.
In this article, we describe and apply this approach when answering practice questions in the child, family and community welfare sector, highlighting the strengths and potential limitations of using different types of research.
How practice questions can be answered by different types of research
There are four common ways research can be used to answer relevant practice questions:
- understanding the problem and its context
- finding out what works and determining resource allocation
- understanding processes to deliver programs/practices
- assessing the appropriateness and acceptability for users.
We map these four ways, identifying the best types of research to answer these questions and providing research examples from our sector (see Table 1).
|Uses of research evidence|Practice questions|Rationale for using research evidence|Types of research that answer these questions|Examples relevant to the child, family and community welfare sector|
|---|---|---|---|---|
|Understanding the problem and its context| | | | |
|Finding out what works and determining resource allocation| | | | |
|Understanding processes to deliver programs/practices| | | | |
|Assessing the appropriateness and acceptability for users| | | | |
1. Understanding the problem and its context
Effective practice decisions require an understanding of the problem, its context and the needs of the community. Research can be used to inform this. Surveys can be used to determine how many people are experiencing or are affected by an issue, as they can capture information on a large number of people at once. Cohort studies can provide appropriate data about a group over a period of time (e.g. to understand the associations between infant and childhood experience and adult outcomes). However, surveys and cohort studies are not appropriate when trying to explain why a problem is occurring or why a program or intervention is not working.
2. Finding out what works and determining resource allocation
Identifying which program or intervention is best at solving a problem supports effective decisions on resource allocation. Systematic reviews provide strong evidence for whether certain programs or interventions work and why. This is because they systematically summarise the results of many studies on a topic, and screen those studies for quality, to answer a question about what works. Systematic reviews are considered high-quality evidence when making decisions.4 However, systematic reviews may not be available, particularly when the evidence on a topic is newly emerging. In these instances, other types of research can be used, including case-control or quasi-experimental designs.
In addition, randomised controlled trials (RCTs) can be helpful when answering questions on effectiveness (what works?) and cost effectiveness (is it worth it?). This is because RCTs can link causes to their impacts by randomly assigning participants to groups during interventions, which increases the trustworthiness of the findings. RCTs are less appropriate for understanding the context, processes and acceptability of programs and practices (the 'how' and 'why' things work). They are also conducted in a controlled environment that often does not reflect real-world settings.
3. Understanding processes to deliver programs or practices
After selecting a program or practice, understanding the key components of its effective implementation supports sound decision making and positive outcomes for users. Qualitative research can provide this, as it involves asking practitioners and users what they believe helped or hindered a program's success. This type of data can be helpful for understanding the reasons that lead to program or service outcomes, and it works well alongside survey data to understand participant experience and program quality.
4. Assessing the appropriateness and acceptability for users
Understanding whether programs or services are acceptable or appropriate for intended users supports effective decision making. Qualitative research is useful for answering these practice questions as it provides an open dialogue for individuals to voice their concerns, opinions and experiences. As indicated above, qualitative research supports understanding the reasons that lead to program or service outcomes and it works well alongside survey data to understand participant experience and program quality.
A limitation of qualitative research is that it can be resource intensive, which can limit the number of participants in the research. In addition, individual experiences can be subjective, which means that if only a small number of people are interviewed then findings may not represent the majority.
Choosing appropriate research when making evidence-informed decisions in practice supports improved outcomes in the child, family and community welfare sector. This article is a guide to understanding the best types of research evidence to use when asking common practice questions.
How will you use the evidence or information in this short article in your work? We would love to hear from you in the Comments field below.
Further reading and related resources
- Evidence, hierarchies and typologies: Horses for courses
This paper by Petticrew and Roberts describes an evidence matrix on how to assess different types of research and how to link practice questions to research designs.
- What counts as good evidence?
This report by the Alliance for Useful Evidence explains the use of evidence and the different approaches for identifying evidence in practice.
- Evaluation and Expert Panel project resources
This collection of resources from the Expert Panel project has been created to help service providers with research and evaluation initiatives.
1. Petticrew, M., & Roberts, H. (2003). Evidence, hierarchies, and typologies: Horses for courses. Journal of Epidemiology and Community Health, 57(7), 527–529.
2. Kumanyika, S., Brownson, R. C., & Cheadle, A. (2012). The L.E.A.D. framework: Using tools from evidence-based public health to address evidence needs for obesity prevention. Preventing Chronic Disease, 9, E125.
3. Shaxson, L. (2014). Investing in Evidence: Lessons from the UK's Department for Environment, Food and Rural Affairs (Working paper 2). Jakarta: Knowledge Sector Initiative. Retrieved from www.ksi-indonesia.org/assets/uploads/original/2020/01/ksi-1580281998.pdf
4. Blanchet, K., Allen, C., Breckton, J., Davies, P., Duclos, D., Jansen, J. et al. (2018). Research Evidence in the Humanitarian Sector: A practice guide. London: Evidence Aid, London School of Hygiene and Tropical Medicine and Nesta (Alliance for Useful Evidence).
5. Caruana, E. J., Roman, M., Hernandez-Sanchez, J., & Solli, P. (2015). Longitudinal studies. Journal of Thoracic Disease, 7(11), E537–E540.
6. Breckton, J. (2016). Using Research Evidence: A Practice Guide. London: Nesta (Alliance for Useful Evidence). Retrieved from www.alliance4usefulevidence.org/publication/using-research-evidence-a-practice-guide-january-2016
7. Hariton, E., & Locascio, J. J. (2018). Randomised controlled trials - the gold standard for effectiveness research. BJOG: International Journal of Obstetrics & Gynaecology, 125, 1716.
8. Galdas, P. (2017). Revisiting bias in qualitative research: Reflections on its relationship with funding and impact. Los Angeles, CA: SAGE Publications.
Dr Kristel Alla is the Knowledge Translation Specialist at the Australian Institute of Family Studies.
Dr Nerida Joss is the Executive Manager of Knowledge Translation and Impact at the Australian Institute of Family Studies.