The Expert Panel Project
In 2014, the Australian Government Department of Social Services commissioned the establishment of a panel of experts to help service providers deliver evidence-based programs and practices in the family support sector: the Expert Panel Project. Now that the Project is halfway through its five-year term, this article reviews progress to date and some of the benefits and challenges of the process so far. Particular reference is made to programs under the Government's Communities for Children Facilitating Partners initiative. So far, the Project has provided valuable insight into the level of skills and knowledge across the sector regarding evidence-based programs and outcomes measurement, and the effort needed to support what is, effectively, a cultural change.
The Australian Institute of Family Studies (AIFS) was commissioned by the Department of Social Services (DSS) to establish a panel of experts to assist agencies funded under the department's Families and Children Activity ("the Panel"; 2014-2019). The role of the Panel is to advise, mentor, support and train service providers to increasingly offer services and programs that are shown to improve outcomes for families and children.
The Panel was established to assist service providers to plan, implement and evaluate programs and practices. This was in response to consultations with the sector in 2013, which identified broad-based consensus on increasing the use of evidence-based programs and practices, a focus on prevention and early intervention, and a shift towards genuine outcomes measurement and reporting. There was also recognition across the sector of the need for support to achieve this goal.
The role of AIFS
AIFS works in close collaboration with the DSS National Office to implement the project and administer the Panel and Industry List. The specific activities that fall under AIFS' remit are to:
- establish a panel of experts to build sector capacity for evaluation and measuring outcomes;
- establish a high-level steering committee;
- conduct assessments of Communities for Children Facilitating Partner (CfC FP) funded programs in relation to their evidence-based requirements under their grant agreements;
- produce a guidebook that lists evidence-based programs relevant to the CfC FP sector in support of their evidence-based requirements; and
- develop resources, publications and webinars that are responsive to need and shared with the sector via the CFCA information exchange.
Our work on the Panel over the past two years has provided valuable insight into the level of skills and knowledge that the families and children sector has regarding evidence-based programs and evaluation/outcomes measurement, and the effort needed to support what is, effectively, a cultural change. This article focuses on our experiences in supporting Communities for Children Facilitating Partners,1 a sub-activity of the Families and Children Activity, to meet a contractual requirement for the delivery of evidence-based programs (EBPs).
Communities for Children EBP requirement
CfC FPs provide prevention and early intervention services to families and children aged 0-12 years in disadvantaged communities throughout Australia. The objectives of the program are focused on approaches that promote positive family functioning, safety and child development outcomes. Services support the wellbeing of children by building strong parenting skills and stronger and more sustainable families and communities.
CfC FPs are place-based, and develop and facilitate a whole-of-community approach whereby Facilitating Partners collaborate with other organisations to provide a holistic service system for children and families. As part of this role, Facilitating Partners fund other organisations (known as Community Partners) to provide services including parenting support, group peer support, case management, home-visiting services and other supports to promote child wellbeing.
New grant agreements for CfC FPs, commencing in 2014, required that:
- From 1 July 2015, at least 30% of the funding used for direct service delivery should be used to purchase evidence-based programs.
- From 1 July 2017, at least 50% of the funding used for direct service delivery should be used to purchase evidence-based programs.
To help facilitate the increased use of programs based on a greater level of evidence, a range of supports and resources were developed and implemented by AIFS. These included:
- a guidebook of evidence-based programs;
- a process by which existing programs delivered by providers can be assessed;
- a "matchmaking" database that provides a supported pathway to assistance with program planning, implementation and evaluation (the "Industry List");
- Expert Panel projects; and
- CFCA publications and resources.
Each of these supports and resources is discussed further below, in terms of the extent to which it has contributed to an increase in the adoption of good-quality programs.
What we've found
The "guidebook" programs
The first step was to develop an online "guidebook" of programs that:
- had a sufficient evidence base to be considered acceptable for inclusion in the evidence-based program requirement;2
- matched the objectives of the CfC FP program; and
- provided as many feasible options as possible for a national initiative.
At the time of writing, there were 32 program profiles available (apps.aifs.gov.au/cfca/guidebook/programs). The criteria for inclusion were purpose-built: they needed to be rigorous enough to ensure a level of program quality, but not so rigorous that providers were faced with too limited a choice. If, for example, the evaluation criteria had been restricted solely to one or more randomised controlled trials (the "gold standard" of evidence), providers would have been left with only a handful of parenting programs to choose from.
While the adoption of evidence-based programs is one way of meeting the requirement, there are several challenges associated with relying solely on the guidebook as a source of approved programs. One of the first challenges faced was the lack of diversity in the available programs. The vast majority of programs either partly or wholly focused on parenting skills, which relate to a number, but not all, of the objectives for CfC FPs.
Choosing a guidebook program also assumed that providers had existing skills, such as assessing a program's "fit" with their target group's needs, maintaining program fidelity and/or adapting the program appropriately, and the associated implementation skills. While these skills were recognised as critical to the success of adopting a guidebook program, formal implementation support has only been available to date via the Industry List. This requires individual organisations to use their funding to procure the support, which was not always available and was not always prioritised towards implementation. Promoting and supporting the effective implementation of evidence-based programs is a notable gap at present.
Access to the training and support associated with the guidebook programs has also been problematic for some providers. Often training is only provided in the eastern seaboard states, and there are pragmatic issues such as cost and proximity (particularly for rural and remote providers). Added to this, attrition of trained staff in rural and remote areas remains a challenge.
In spite of these issues, some CfC FPs that have adopted guidebook programs have provided very positive feedback in conversations with us.
Assessment of existing programs
In acknowledgement that the guidebook of evidence-based programs did not provide enough variability or depth to cover every CfC FP objective, and in recognition of the good practice already occurring in the sector, a process was set up to assess existing programs.
Programs were assessed by CFCA information exchange researchers, with at least two personnel involved in each assessment. Programs were assessed against five criteria, judged to be the minimum standards for a quality program. For the purposes of this requirement, the criteria were:
- a theoretical and/or research background to the program;
- a program logic or similar;
- activities in the program that generally matched good practice in meeting the needs of the target group;
- an evaluation (with at least 20 participants) establishing that the program has positive benefits for the target group; and
- staff members who are qualified and/or trained to run the program.
The extent to which existing programs submitted for approval met these criteria was highly variable. At times, the shortfall was a lack of program documentation detailing how the program met the criteria. Although the program "pathway" from target-group needs to outcomes was often assumed to be implicitly understood by the program provider, the documentation process was valuable in highlighting where the gaps were. In many cases, the development of a program logic provided a compact, visual representation of the program that helped to clarify how outcomes were achieved.
Many programs had been in operation for several years. Although they may originally have had a sound theoretical or research evidence base, this had not been revisited to assess whether the underlying program assumptions still held.
Overall, the program assessment phase has provided a valuable service for CfC FPs, in combination with intensive support processes, to assess the level of rigour and quality that underlies their service offerings.
Program improvements and embracing "failures"
Where it was assessed that further work was needed to meet the criteria, doing so often became a protracted effort. Several evaluations are still underway at the time of writing. It has been critically important to recognise where organisations are at and work from there, tailoring an approach that "nudges" them along the spectrum of quality service delivery and outcomes measurement. Developing program logics is a fundamental skill within this. Almost invariably, the sector has shown goodwill and a commitment to learning new skills.
It has also been critically important to create an environment where finding out that programs don't work, or are not a good fit, is just as important as finding out that they do. In other words, we needed to allow space for service providers to embrace "failure", reflect on the lessons learned, and either adapt or replace the program. Where this has occurred, it has been a challenging outcome for all involved. In these cases, contract managers, service providers and CFCA staff have worked together to agree on next steps and on flexible arrangements for contract compliance.
The Industry List
The Families and Children Activity (FaC) Industry List service connects FaC service providers with 42 research, practice and evaluation experts who were selected via a tender process for membership of the Families and Children Expert Panel. Using their existing funding, FaC service providers can access independent support and guidance to plan, implement and evaluate new and existing programs.
CFCA assists service providers to scope what particular assistance is needed and matches them with suitable Industry List members. After the project between the service provider and the Industry List member has been completed, CFCA works with both parties to help share general outcomes of the project with the broader sector.
To date, there have been ten projects between CfC FPs or Community Partners and Industry List members. Feedback on these projects indicates that the assistance has been invaluable in completing tasks associated with the evidence-based program requirements. In some cases, the relationship established between the two agencies has resulted in an ongoing partnership to complete further work beyond the Industry List project.
There have been longstanding issues, however, with the cost of help from the Industry List. Feedback indicates that Communities for Children providers perceive the cost of these services as too high, and believe that spending on service provision should take precedence. Communicating the value of evaluation and outcomes measurement in improving service delivery can, at times, be a tough sell.
This year also saw the completion of the first project between a CfC FP and an Industry List member. As mentioned previously, there is an expectation that, once a project is completed, the two parties will work together to disseminate key general findings to the sector. A webinar was conducted by Mallee Family Care and Social Ventures Australia in July last year, which outlined how the two organisations worked in partnership to establish good practice in evaluating service delivery.3
Expert Panel projects
Expert Panel projects address support needs that are common across service providers and require a national approach. Projects are funded by DSS and decided collaboratively by DSS, AIFS and the Expert Panel Steering Committee. Panel projects focus on supporting service providers funded under the Families and Children Activity. To date, a number of projects have been completed or are in progress. These include:
- the measuring outcomes project;
- program planning and implementation - Children and Parenting service providers;
- assistance for rural and remote service providers to meet the evidence-based program requirements; and
- development of a measurement tool for family dispute resolution.
Around 85 providers have been directly assisted as a result of Expert Panel projects to date, with a number of the projects expected to deliver benefits to the sector as a whole.
Resources and support
AIFS, via the CFCA information exchange, provides intensive support to stakeholders involved in the CfC FP sub-activity to help them understand what is needed for program planning, implementation and evaluation/outcomes measurement in order to meet the requirement. This has given organisations a "safe space" to gain information and resources that will help them deliver better programs, and has helped us to build trust and gather intelligence on where the skill and knowledge gaps are. We have often acted as a "quality check" and information resource for service providers considering adopting or developing programs; for example, where CfC FPs are looking to adopt well-known commercial programs that are not necessarily strongly evidence-based and often don't stand up to scrutiny in terms of their proposed outcomes.
The program assessment process has been a strong incentive for providers to adopt a more evidence-based approach. This has been particularly useful for CfC FPs who are commissioning programs for delivery via Community Partners. Many providers have expressed interest in having more programs assessed, beyond the 50% requirement, indicating an appetite for a process by which programs and services can be assessed on an ongoing basis. However, we have needed to reiterate that the program assessment process does not constitute endorsement of an "evidence-based program", as the criteria used are somewhat less rigorous than the more commonly understood meaning of "evidence-based". The difference usually relates to the rigour with which evaluations have been conducted, with the stricter definition typically requiring a randomised controlled trial.
Summary of benefits and challenges
To date, the implementation of the Expert Panel project has resulted in a strong research-policy-practice interface that draws upon key knowledge translation principles to increase the use of evidence in practice. AIFS has acted as a knowledge broker between service providers, DSS policymakers and members of the Panel to create a system that better aligns program activities with proposed outcomes. We have been able to provide tailored support to the service sector, while feeding back key issues to the department to create a dynamic policy environment that is responsive to need. The strong and highly collaborative relationship between the DSS National Office and AIFS has also been fundamental to the successful implementation of the project.
The project certainly hasn't been without its challenges. The variability of skills and knowledge in the sector at the commencement of the project has meant that tailored, and often intensive, support has been needed to plan, implement and evaluate programs. Program assessments are time-intensive and frequently arduous, and documenting evidence of meeting the criteria is often the biggest test of all. A challenge also remains in how to recognise, support and validate good practice in the area of community-level programs.
Buy-in to the process at all levels (the DSS National Office, state and territory grant agreement managers, Expert Panel members, service providers and AIFS) is important for changing practice. Good communication channels are crucial and need close attention.
The Expert Panel project has highlighted that embedding the use of evidence in practice is a journey, not a goal that suits a one-size-fits-all approach. Since the evidence-based program requirement came into effect, there has been an enormous amount of goodwill and good intention among service providers to meet what has at times been a formidable challenge. The requirement has given providers an incentive to focus on delivering quality programs, and feedback from the majority of providers indicates that the exercise has helped them to increase their focus on understanding what works (and what doesn't) to improve outcomes for families and children.
1 See <www.dss.gov.au/our-responsibilities/families-and-children/programs-services/family-support-program/family-and-children-s-services#01>.
2 For more information on the criteria for these programs, see <aifs.gov.au/cfca/expert-panel-project/information-service-providers/frequently-asked-questions-communities-children-facilitating-partners#evidence-based>.
3 For further information see <aifs.gov.au/cfca/events/building-better-outcomes-framework-families-story-mallee>.
Elly Robinson is the Executive Manager - Practice Evidence and Engagement at AIFS.
Additional details on the Expert Panel project can be found in the article "The Expert Panel project: Towards better outcomes for families" by Elly Robinson and Marian Esler, published in Family Matters 97, 2016.
For further information, email: email@example.com