Evaluation of the second phase of the Families and Children Expert Panel Project

Content type: Practice guide

Published: 26 May 2021

The Expert Panel Project is now known as AIFS Evidence and Evaluation Support. 


Summary of findings

Background

The Families and Children Expert Panel Project (referred to from here as ‘the Expert Panel Project’) was commissioned by the Department of Social Services (DSS) in 2014. The project was established to help services funded under DSS’ Families and Children Activity (FaC) to build their capacity to plan programs, implement evidence-based programs and evaluate program outcomes.

This document summarises the findings of the evaluation of the second phase of the Expert Panel Project and covers the period July 2017–June 2020. During this phase, the Australian Institute of Family Studies (AIFS) project team continued their previous work (undertaken between 2014 and 2017) to support the 52 Facilitating Partners funded under the Communities for Children Facilitating Partners (CfC FP) – a FaC sub-activity. This support primarily consisted of assisting service providers to meet DSS funding requirements for evidence-based programs (described here as the ‘evidence-based requirement’). In this period the Expert Panel Project also increased capacity building support to other services funded under the FaC Activity. This support was delivered using a ‘tiered’ approach, ranging from universal low intensity support, such as online resources and tools, to more intensive one-on-one assistance and facilitation of access to external evaluation experts.

The evaluation examined the implementation and uptake of Expert Panel Project activities in this phase. It also considered the needs of FaC service providers, their experiences of engaging with the Expert Panel Project and the effect of this engagement on program planning, implementation and evaluation.

The evaluation adopted a mixed methods approach and drew on AIFS administrative data; a FaC Activity provider survey (n = 240); webinar participant surveys; workshop participant surveys; interviews with 13 CfC FP providers; a review of 10 Facilitating Partners’ Activity Work Plans (AWPs) and consultation with the Expert Panel Project team.

Key findings

Fifteen evaluation questions were developed to guide data collection. This section summarises the evaluation findings using data to address each of these questions.

1. Was the project implemented as planned?

On the whole, the Expert Panel Project was implemented as planned. However, FaC provider participation in some activities was lower than expected. For example, the ‘Industry List’ of approved evaluation providers established for this second phase of the Expert Panel Project has been in place since July 2018 but has not been used by FaC providers. The findings of the first phase of the Expert Panel Project indicated that service providers rarely used the Industry List because they lacked the funds to purchase assistance and/or preferred to find their own non-Industry List provider. Data collected for this second phase evaluation supported these findings.

Participation in the Evaluation and Outcomes Measurement workshops was also lower than anticipated, with the majority of workshops having fewer than eight participants; two had only four participants. Only 45% of FaRS and 30% of CaPS providers – the target audience for the workshops – reported being aware of these workshops. Factors that may have contributed to this were shorter-than-ideal lead times for the earlier workshops and delayed release of promotional material due to changes taking place in the DSS delivery regions at this time.

Uptake of Expert Panel Project telephone and email support with program logic models and evaluation planning was also lower than anticipated. This support was open to all FaC providers but only 11 providers used the service.

2. Which aspects of the project were FaC providers aware of?

Awareness of the Expert Panel Project varied across FaC sub-activities. CfC FP Facilitating Partners had the greatest awareness of Expert Panel Project activities because their evidence-based funding requirement, and the Expert Panel Project’s role in assessing applications to meet this requirement, meant CfC FPs had the most contact with the Expert Panel Project team. Even so, only 66% of CfC Facilitating Partners and 37% of CfC Community Partners responding to the survey were aware that the Expert Panel Project offered telephone support regarding CfC FP assessments, and only 53% of FaC providers overall reported being aware of Expert Panel Project online resources.

3. How appropriate was the support offered by the project?

In general, the support offered by the project was appropriate for the needs of the majority of FaC providers. For example, 98% of Evaluation and Outcomes Measurement workshop attendees indicated that the topics presented at the workshops were relevant to their work and more than a quarter had been able to apply the knowledge gained in a workshop. Most FaC providers (84%) indicated that they were satisfied with the support they had received from the Expert Panel Project team, and 82% of providers who received telephone or email assistance reported being able to apply that information to their work. During interviews, CfC FP providers described the support provided by the Expert Panel Project, particularly telephone support, as helping them meet the evidence-based requirement.

When asked if this support could be improved, several FaC providers suggested that they would like more face-to-face meetings because this could improve the project team’s understanding of their services. Others commented that the project team may be able to play a role in expanding the Guidebook of Evidence-based Practices.

4. Which providers engaged with the project?

In comparison to the FaC Activity as a whole, higher proportions of CfC FP Facilitating Partners engaged in Expert Panel activities, and approximately 43 (of 52 in total) had contacted the Expert Panel Project team for advice about the CfC FP assessment process. Just over half of the Facilitating Partners responding to the survey reported accessing information from an AIFS website, compared to less than a third of CaPS, FaRS or IFSS providers.

CfC FP providers, particularly Facilitating Partners, were more likely to engage with the Expert Panel because they required support when submitting programs for assessment. Other FaC providers did not need to meet this requirement and therefore may have been less motivated to seek support. However, it is noteworthy that a slightly higher proportion of CaPS providers attended a series of Program Implementation workshops delivered by CEI (29%), compared to Facilitating Partners (27%). FaRS providers were also well represented (18%) at these workshops.

5. Which aspects of the project did providers engage with?

Providers engaged with a range of Expert Panel Project activities. These included online resources, such as online learning modules and webinars; telephone and email support; and in-person workshops. Almost half of the FaC providers who responded to the survey reported engaging with AIFS’ online resources. In 2019, four of the Expert Panel Project resources had more than 5,000 unique views and four of the templates had been downloaded more than 1,000 times.

The two online learning modules aimed at building the capacity of FaC Activity providers to measure outcomes and develop program logic models had been viewed 577 and 948 times respectively in the previous eight months.

At the time of writing, the two Expert Panel webinars on the topics of needs assessment and program implementation had been viewed 2,254 and 1,314 times respectively since being posted less than a year earlier.

CfC FP providers made 157 submissions for assessment by the Expert Panel Project team, which often involved telephone and email support from team members, sometimes over an extended period. However, when the wider FaC Activity was surveyed, just under a quarter of respondents reported accessing telephone or email assistance from the team.

As noted earlier, attendance at the Evaluation and Outcomes Measurement workshops was lower than anticipated (an average of eight attendees per workshop) but attendance at the Program Implementation workshops was much higher. The reasons for this are unclear, but it may be because the latter workshops covered a topic of more interest, were shorter (half a day versus two days) and had a much longer lead time for promotion across a broader range of promotional channels.

6. Did the project increase providers’ interest in evaluation?

Data on the impact of the Expert Panel Project on FaC providers’ interest in evaluation are limited to CfC FP, CaPS and FaRS providers; that is, the services where the project had the greatest reach.

Interview data indicated some increased interest in evaluation among CfC FP Facilitating Partners. However, it is difficult to distinguish the impact of the Expert Panel Project support from that of the evidence-based requirement. Facilitating Partners did note increased awareness of evaluation and evidence, which included listing evaluation as a standing item on meeting agendas and more dedicated funding being directed towards evaluation. The Expert Panel Project also appeared to have motivated a few Facilitating Partners to submit programs for assessment out of interest and to support future funding applications, rather than having to do so to meet current funding requirements.

However, although the CfC FP Community Partners indicated that they had positive experiences when engaging with the Expert Panel Project team, there was limited indication that this engagement in itself had led to an increased interest in evaluation.

Likewise, CaPS and FaRS providers did not appear to have a greater interest in learning more about evaluation following attendance at a workshop (mean scores: pre-workshop 4.5, post-workshop 4.0). This apparent decrease is difficult to interpret but may reflect attendees feeling less need to learn more about evaluation once they had gained knowledge at a workshop.

7. Did the project increase providers’ confidence in finding information about evaluation?

The only data on confidence in finding information about evaluation comes from attendees at the Evaluation and Outcomes Measurement workshops. The CaPS and FaRS providers who attended a workshop indicated a marked increase in confidence in knowing where to find information about evaluation (mean scores: pre-workshop 3.5, post-workshop 4.4). No data are available for other FaC providers.

8. Did the project increase providers’ confidence in developing program logic models?

The Expert Panel Project appeared to have a notable impact on improving providers’ confidence in developing program logic models; this improvement was most likely among providers who had the most intensive support from the Expert Panel Project team, namely CfC FP, CaPS and FaRS providers. In particular, a major component of the work undertaken by the Expert Panel Project team was helping CfC FP providers to develop a documented program logic, or theory of change, as part of the process of assessing their application to meet the evidence-based requirement.

A number of respondents to the FaC Activity provider survey specifically mentioned that the knowledge they had gained from the Expert Panel Project had been applied to develop program logic models. This knowledge had been gained either through telephone or email communication, workshop attendance or viewing the website, with some respondents commenting on the usefulness of the logic model templates available on the website.

The Expert Panel Project made resources about program logic widely available through their website, but the type of practitioner viewing the resources and how they were used is generally unknown. The resources, however, were popular. The web resource How to Develop a Program Logic for Planning and Evaluation was viewed 12,656 times in 2019 and the program logic template was downloaded more than 1,000 times. Likewise, the online learning module on developing logic models was viewed 948 times (although only approximately 20% of the module was viewed on average).

The CaPS and FaRS providers who attended a workshop indicated a substantial increase in confidence in developing logic models (mean scores: pre-workshop 3.0, post-workshop 4.3). A few months after the workshop a number of these providers reported that they had used the information from the workshops to develop or revise program logic models in their workplaces or had built the capacity of others to develop logic models.

9. Did the project increase providers’ knowledge of the steps needed to evaluate a program?

There was limited data to address this question, but in post-workshop surveys, participants from CaPS and FaRS providers reported greater confidence in planning an evaluation (mean scores: pre-workshop 3.3, post-workshop 4.1) and increased knowledge of the steps involved in an evaluation (mean scores: pre-workshop 3.5, post-workshop 4.3).

10. Did the project improve program planning and documentation?

There was some evidence that the project improved program planning and documentation among some FaC providers, particularly CfC FP providers who submitted program plans for assessment. A total of 81 CfC FP programs were submitted to the project for assessment and most (n = 69) were approved as meeting minimum standards in relation to program planning, implementation and evaluation.

The Evaluation and Outcomes Measurement workshops also appeared to improve program planning. Several months after attending a workshop, a number of CaPS and FaRS providers reported using the materials provided at the workshop to inform the development of future programs and to use in training others within their organisations to undertake this work.

In addition, a number of the respondents to the FaC Activity provider survey mentioned that they had used the variety of resources and information provided by the Expert Panel Project to inform program planning. These tools were used, for example, to standardise workplans across programs and in grant applications.

11. How did the project strengthen program documentation; for example, inclusion of program logic models and justification for program effectiveness?

The Expert Panel Project provided one-on-one support to CfC FP providers, often over extended periods, to assist them to strengthen program documentation. This telephone or email communication was supported by online resources such as program logic templates.

The Evaluation and Outcomes Measurement workshops encouraged participants to work on their own program logic and evaluation plans and involved one-on-one support from the workshop facilitators. A relatively high proportion of workshop participants (85% of those responding to the follow-up survey) reported applying the knowledge gained from the workshops to their work.

12. Did the project improve evaluation and outcomes measurement?

There is some evidence that the Expert Panel Project increased knowledge about evaluation and outcomes measurement, particularly among attendees of project workshops. Following a workshop, CaPS and FaRS providers reported increased understanding of program outcomes (mean scores: pre-workshop 4.1, post-workshop 4.5) and increased knowledge of how to choose an outcome measure (mean scores: pre-workshop 3.1, post-workshop 4.2).

It is difficult to demonstrate changes in regular practice in relation to evaluation and outcomes measurement as a result of the Expert Panel Project. However, a number of respondents to both the FaC Activity provider survey and workshop follow-up surveys reported applying knowledge gained from the project to improve outcomes measurement, including identifying suitable outcomes measures.

13. To what extent did the project meet the needs of providers?

The majority of providers indicated that the project was relevant to their work and that they were satisfied with the support they had received. The content of the Expert Panel workshops was seen as relevant by most attendees (e.g. 98% of those attending the Evaluation and Outcomes Measurement workshops). Eighty-four per cent of respondents to the FaC Activity provider survey who had received telephone or email support from the Expert Panel Project team were satisfied with the support they received, and 90% would contact the Expert Panel team in the future if they needed to.

The majority of CfC FP providers who were interviewed were satisfied with the support they received from the Expert Panel Project. They found telephone support extremely helpful but requested more face-to-face contact. Respondents to the FaC Activity provider survey also expressed a desire for more contact with the Expert Panel Project team.

While the project in its current format appeared to meet the needs of providers, other topics and methods of support may be preferable to some providers. For example, when FaC providers were asked how they would like to receive support in the future, the most popular methods were workshops, webinars and in-house training. Only 18% of survey respondents nominated telephone support, which is one of the main forms of support provided by the project.

14. What barriers related to program planning, implementation or evaluation did providers face?

FaC providers reported experiencing a number of barriers to undertaking program planning, implementation and evaluation. These were not all directly related to the Expert Panel Project but affected engagement with the project and the effectiveness of the support offered. A major challenge was finding an evidence-based program that met community need. The Guidebook produced by the Expert Panel Project was seen by many as providing an inadequate range of programs in this regard. The Expert Panel team acknowledged that the Guidebook provides a relatively short list of programs due to the limited number of relevant programs that have evidence for effectiveness. Finding appropriate measurement tools, particularly for specific community groups, was a related barrier reported by providers.

Other reported barriers included limited time or staffing to undertake program planning, implementation or evaluation tasks; limited staff knowledge and skills; reduced opportunities for staff training in program evaluation; limited funding for evaluation support, including payment of evaluation consultants, training expenses for off-the-shelf programs and the expenses involved in having measures translated.

In addition, limited awareness of the support available to providers to assist with program planning, implementation and evaluation may act as a barrier. For example, less than half of most FaC sub-activity providers were aware of key Expert Panel Project activities.

15. What barriers did the Expert Panel Project team face in delivering program support?

The Expert Panel Project team noted a number of barriers to the delivery of program support. The main barrier was not being able to reach a broader range of FaC providers. Currently, much of the more intensive support provided by the team is delivered to CfC FP providers and, even then, not all CfC FP providers accessed support.

Attempts at targeting other sub-activities, such as workshops for CaPS and FaRS providers, had met with limited success, although this may have been the result of problems promoting the workshops rather than a lack of interest from providers.

With current funding levels, Expert Panel team members were only occasionally able to participate in individual face-to-face meetings, despite some FaC providers preferring this method of support. Generally, the Expert Panel Project team felt that telephone or email support was adequate, especially since other types of face-to-face support, such as workshops, were also offered.

FaC providers’ engagement with the Expert Panel Project may also have been affected by providers’ limited funding, meaning they gave less priority to program evaluation and the kinds of support offered by the Expert Panel Project. Limited evaluation skills among FaC staff could mean that they needed considerable support to meet assessment requirements. In addition, the frequently long time periods that CfC FP programs were under assessment (more than a year on average) could be another indication of the low priority that providers gave to evaluation and other evidence-based work and/or their lack of confidence and skill in undertaking this work.

The Expert Panel team also acknowledged that they were only one of several factors that could influence service providers’ interest in evaluation or in using research evidence. For example, funding bodies that required programs to have an evidence base for their work could be significant motivators to improved program documentation and evaluation efforts. Likewise, organisational support could influence the priority that providers gave to this work, as well as the availability of training or resources to support it.

Conclusions

The Expert Panel Project has been successful in increasing some FaC service providers’ understanding, knowledge and skills in program planning and evaluation. This was most true for the providers that had the most contact with project activities, particularly service providers who were funded under the CfC FP sub-activity and had a specific reason to engage with the project in order to meet the evidence-based funding requirement. These providers reported increased knowledge and skills, with improved understanding of program evaluation, the evidence base for programs and how to develop program logic models. Although most of the evidence of improvement as a result of the project related to program documentation, there was also some indication of increased evaluation activity.

Provider awareness of Expert Panel activities was uneven across the FaC services, with relatively high awareness among CfC FP providers, particularly CfC FP Facilitating Partners. However, even among CfC FP providers there appeared to be limited awareness of the full suite of supports offered by the Expert Panel Project.

There is limited evidence that the Expert Panel Project has yet led to widespread, routine outcomes measurement or rigorous evaluation. However, there is evidence that the Expert Panel tools and templates have been used by some FaC services to make evaluation and outcomes measurement a more routine part of their practice and to develop more systematic approaches to program planning.

In conclusion, the Expert Panel Project has been able to improve some FaC Activity providers’ practice of evaluation and outcomes measurement. The support provided by the Expert Panel Project was more likely to be effective when there was also appropriate support from the providers’ own organisations. Without such in-house support, routine and widespread change in evaluation practice and outcomes measurement was less likely.

