Implementation and adaptation of evidence-based programs

Sophie Aitken
23 February 2016

This webinar addressed some of the challenges of implementing evidence-based programs within diverse community settings and client groups.

Audio transcript (edited)

Webinar facilitated & speaker introduced by Elly Robinson

ROBINSON

This webinar is part of a series of resources that we have developed to support the work of the Expert Panel project. And I'll just explain the work of the Expert Panel project as a starting point. The project was commissioned by the Department of Social Services for five years from 2014 to 2019. The project supports families and children activity service providers to plan, implement and evaluate quality programs and share the results with others through the Child Family Community Australia information exchange. The aim is that programs and services are built on the best available evidence of what works for children, young people and families.

There are three main elements to the Expert Panel project. One is the Expert Panel itself, which is established by the Department of Social Services (DSS) and used to fund particular programs that have a national significance. The industry list, which is run by the institute here, provides support for service providers to plan, implement and evaluate programs using their own funding. And we also offer support to service providers directly – for example, the Communities for Children program, which is around the evidence-based requirements that they have specific to their contracts.

In terms of the support that is offered to service providers directly, there are the DSS-funded projects that are administered by Expert Panel members, such as outcomes measurement. Sophie and her organisation, Drummond Street Services, are involved in this level of projects. There's also the service-funded projects that assist services to plan, implement and evaluate programs via the industry list. And face-to-face and helpdesk support, resources and information, such as webinars like the one we're presenting today and other evaluation resources. So, this is part of our suite of resources that are available to support the work of agencies to plan, implement and evaluate programs.

It's now my pleasure to introduce today's presenter, Sophie Aitken. Sophie is a general manager of Implementation and Quality at Drummond Street Services. She plays a key role in providing support to family service agencies across Australia via the Department of Social Services' Expert Panel project in the areas of program planning and implementation, and outcomes evaluation.

Sophie also works with the Centre for Family Research and Evaluation, a collaborative applied research centre comprising academic, teaching and research institutions, Deakin University and Drummond Street Services. Sophie has qualifications in psychology, criminology and public administration, and has worked across government, private, academic and community sectors in service delivery, research and policy development positions. So, without further ado, please join me in giving Sophie a very warm virtual welcome.

AITKEN

Thank you, Elly, for that lovely introduction, and welcome to the audience members. It's a bit strange to be talking into a void here, but I'll just have to assume there are people listening. As Elly mentioned, I'm currently working with the Centre for Family Research and Evaluation (CFRE), which is a partnership between Drummond Street Services – a not-for-profit family service agency – and the psychology department at Deakin University. As Elly also mentioned, we've been supporting agencies under the Expert Panel project, both in the area of program planning and implementation, and outcome measurement.

In the course of working with other agencies, they've sometimes raised with us the question of the relevance of some of the evidence-based programs for their communities. Because the programs have often been developed and evaluated with mainstream communities, sometimes it can be difficult to find ways of implementing them with the diversities of families that they're working with. So, I thought it might be helpful today to talk about some approaches to selecting suitable evidence-based programs for your communities and to share some of CFRE's learnings about how programs can be adapted to make them more relevant for the diverse communities without losing the evidence base that supports them.

Before I do that, I just want to acknowledge that in working in this space of trying to link research to practice, there can sometimes be a real tension or disconnect between the expectations of a pure research agenda, or the requirements of the program developers, and the realities of busy not-for-profit agencies working with complex families. And sometimes it can be really challenging to find a workable balance. I know that at Drummond Street, we've been moving towards evidence-based practices for the best part of 10 years now. And it has been a real process of trial and error.

Sometimes it's worked really well, but at other times we face hurdles and challenges, and struggle to make it work. So, while I do think we all have a responsibility to use the research and the evidence to get the best outcomes that we can for our families, I do also recognise that it's not always a straightforward process as some of these slides may make it appear. And that there'll always be bits and pieces that don't work as well as we'd like, and that's okay. The important thing is to keep working towards a more evidence-based approach and learn from what works well and what we could do better. 

So, having said that, the things that we'll be covering in the webinar today are how to go about selecting evidence-based programs to suit your target groups, measuring program fidelity, when you might need to think about adapting evidence-based programs and how you would go about doing this. We'll also look at the difference between good and bad adaptations, and consider what to do if adaptation isn't possible or feasible and how you can make your programs more evidence-informed by building in evidence-based components. Finally, we'll consider some of the particular issues when adapting programs for Aboriginal and Torres Strait Islander communities, because our agencies tell us that that can present a particular set of challenges.

So, the need to consider adapting programs comes about because there's often a mismatch, as we said, between the population that the program was developed for and the target population that you want to use it with. As we mentioned, in many cases, the programs have been developed with mainstream populations and differences in the target population can operate as barriers to successfully implementing the program. So, some of the differences that you might come across in your families include differences in culture and language, and differences in the ages of the children that are being targeted in the program. You might be working with families from rural or remote communities as opposed to the urban communities that they've been developed with.

There might be differences in the education level or the socioeconomic status of your families. And they may differ in gender and sexual diversity. And, of course, there might be a mixture of these differences in your groups and that can further add to the complexity of adapting programs. So, at Drummond Street we've adapted programs for many different groups over the years. For instance, we've done a lot of work with newly arrived African families in the high-rise commission flats and had to adapt programs to suit their needs. We've developed relationship groups for gay couples. We've adapted some of the parenting programs specifically for fathers. We've had to adapt anxiety groups for teenagers that were originally developed for younger children, just to name a few.

So, before you think about adaptation, you need to decide which evidence-based programs are going to be best suited to your program aims and target groups. And you'd normally identify those through doing a community needs and service gap analysis. So what you're wanting to be clear about is who are the at-risk groups in your community and which of the common risk factors are prevalent in your community. So, are there high levels of family violence or school disengagement? A lack of services? High rates of incarceration and so on? You want to be clear about the ages of the children that you're working with, which may be determined by the service gaps in your communities. And also the type of program that you're looking to deliver – so, whether you're working with parents or children's groups, or delivering programs in schools or providing intensive family support programs and so on.

Once you're clear about these things, then you can go about identifying the best program to meet those needs. There are three ways you could go about doing this. If you can find an existing evidence-based program that meets your program's aims and that's been developed and evaluated with your target groups, then you can go ahead and implement it without making any changes to the original program. If you do find a program that meets your aims but perhaps isn't well suited to your particular group, that's when you want to consider adapting the existing evidence-based program to make it more suitable. And then the third point that I'd like to look at is the idea of developing a program and enhancing it with evidence-based core components. So, we'll consider each of these options.

In thinking about which evidence-based program to adopt, you want to consider does the program meet your needs and do you have the resources needed to implement it over time? Implementing evidence-based programs is something that requires careful planning. And it may involve making changes within your agency to support the delivery of the program. This – the hexagon tool – is just one example of a tool that's been designed to help agencies to do this in a structured way by considering the six factors listed there. So, you can evaluate the suitability of particular programs for your agency. As we mentioned, you want to identify how well the program meets your needs, whether it fits well with your agency's priorities and structures, whether you've got the resources available to implement it and whether the evidence exists that the program will achieve the outcomes that you're looking for.

Do you have the expertise and support within your agency to implement the program? And will you be able to sustain the implementation over time? So, it's important to think these things through before deciding on a particular evidence-based program. A good place to start, and you'd all be familiar with this by now, I imagine, is the AIFS guidebook of evidence-based programs. The website is listed there, so you can search the evidence-based program profiles using the C for C program objectives, which are listed here. You can search by a target group – it might be the age of the children that you're trying to target or whether you're looking to target parents or at-risk and vulnerable families. Or you can search with certain key words.

So, the program profiles will give you some information about which groups the program was evaluated with, its cultural relevance, as well as a link to the research. Again, many of the programs will have been designed with mainstream populations, but some have been specifically designed for or evaluated with different minority groups. So, it's worth exploring this in selecting your program. Another option that I thought was worth mentioning, particularly if you're working at a whole of community level, such as the C for C Facilitating Partners, is to explore working with the Communities that Care project.

Communities that Care is a framework or a process to help agencies and communities develop long-term plans to promote healthy development of children and young people. And this plan is based on a risk and protective factor profile of your community. It was originally developed in America and was introduced in Australia in 2000. Professor John Toumbourou from Deakin University, who is one of the principals of CFRE, is also the CEO of Communities that Care Australia. So, just in brief, the process involves a number of stages, which includes understanding the local needs of your community by conducting community profiling, which includes youth surveys. And that's to identify the risk and protective factors in your community across individual, family, peer, school and community domains.

It also involves community engagement to get the support of key leaders and to build working partnerships between the residents, the organisations and the agencies. And most importantly, it involves the development of a community action plan, which is established to select and implement evidence-based prevention programs and intervention programs that have been designed to address the selected priorities and targets and that have relevance in an Australian context. Finally, built into it is a monitoring and evaluation process, which is implemented to assess the effectiveness of the local community plan. I just thought that was worth mentioning for those that are working across a community level because a lot of this webinar is more at the program level.

So, once we've selected our program, it's really important to think about implementing it with fidelity. Selecting the program is the first step, but you need to implement fidelity measurement to make sure that the program is being delivered as it was intended. When we're talking about fidelity, we're talking about faithfulness to the original program. Because otherwise you can't know whether the outcomes that you achieve or don't achieve are due to the program not being effective or to the program not being delivered properly.

Implementation may prove ineffective or inefficient or unsustainable if the monitoring doesn't occur. Practitioners will often change or adapt evidence-based programs as they implement them over time, so the program drifts from its original purpose or design. And then you lose the fidelity to the program. But before we can measure fidelity, we need to know what are the core components that need to be adhered to. We need to be clear about the target group that we're seeking to deliver to so that we can make sure that the right people are turning up to the program. We need to be clear about what the qualifications and training of the facilitators need to be, how many sessions are to be delivered and the core program content that's to be covered. And we also want to be clear about the key activities and teaching methods.

So, these might be things like modelling, role-plays, peer support, homework, group activities, videos, handouts. We need to be clear about all of those different components that make up the program so that we can measure whether they're being delivered as intended. And then we also want to think about participants' engagement and responsiveness because there's no point delivering the program if the participants are not engaged with it. Once you've identified these core components, you then need to think about how you can go about measuring them. So, there's just a few points here to think about in measuring fidelity. Collecting demographic data about the clients enables us to make sure that we're reaching the target group that we're aiming to reach.

So, it might be information about which cultural group they're from, the ages of their children, their socioeconomic status and so on. Because we might find that in analysing this, we're still not reaching the most vulnerable and disadvantaged groups that we were meaning to. So, then we might think about putting other measures in place to address that. Recording details of the facilitators will ensure that they have the relevant qualifications and training. Or it'll identify where the gaps are if they haven't got it and what needs to be addressed. Collecting attendance records for every session allows us to identify whether some participants are missing important sessions. You might need to then deliver those to them individually or get them to attend a different group.

And it also allows us to measure the effectiveness based on the number or the content of the sessions that they have attended. Attendance records are really useful data to collect. Another suggestion is to complete a session record, which identifies which topics were covered in that session and which activities or homework were delivered. So, this can be like a checklist of the key topics and activities and the facilitators can just tick at the end of the session which ones were delivered. And that's a way of monitoring to make sure that all the relevant content and activities have been delivered. It's also a good idea to have, particularly for new facilitators, to have an observer attend to rate their fidelity to the teaching methods. So, if they're supposed to be teaching with a strengths-based approach for example or to be modelling certain behaviours during the program, then it's a good idea especially in the early stages to have somebody that can measure that.

And then in terms of the client responsiveness, one possibility is to record which participants have completed their homework tasks. Another is to see how many sessions they've attended or if they've stopped attending sessions, and also obtaining client satisfaction surveys can be useful data to measure that. I just want to move on now to looking at when we need to adapt an existing evidence-based program. So, once you've selected the program that you might wish to implement, you might need to consider making adaptations. One question that often comes up in talking about this is: Is it appropriate to implement and adapt evidence-based programs for groups that are different to those they were originally developed and evaluated with?

There are some that argue that programs must be delivered with strict adherence to the original program and there should be no compromises. But I think this is one of those examples where a strict research agenda needs to be balanced with the reality of working with diverse families. You need to make sure that the program is going to meet the needs of your families if it's going to be effective. So, there's no point trying to force families into a program that doesn't meet their needs. If we don't adapt the program, we risk failing to engage the target groups. And there's no point delivering an evidence-based program if nobody turns up or if they stop coming or if it's not meaningful to them. And so adaptations are sometimes necessary.

We know from the research that there are certain conditions that are important for children to thrive, regardless of the culture or the community or the family group in which they're raised. So there is a rationale for adapting evidence-based programs as long as we maintain the core components. This slide highlights some of the common risk and protective factors for child wellbeing. So, for example, children are going to be at greater risk where they experience abuse, family conflict, parental substance abuse and so on regardless of their culture or the location where they're growing up. And therefore programs that are demonstrated to address these risk factors or which develop protective factors such as building school connectedness or teaching social and emotional skills to children or strengthening parent-child attachment, these programs are likely to be of benefit across different groups.

When should we think then about adapting a program? Barrera and Castro are two researchers who've done a lot of work in this field and they suggest that adaptation is justified in one of these four conditions. The first being ineffective engagement. So, if you've tried to deliver an evidence-based program but people from your target group are not turning up or they drop off after one or two sessions, then you might need to think about making changes to the program to increase participation rates. The second condition is where there are unique risk or protective factors in your target group. So, if your group had a particular experience or a particular stressor – it might be experiences of discrimination or trauma or forced relocation – you may need to consider this in the way that you deliver the program.

Sometimes as well research shows that there are differences in risk and protective factors for different cultural groups. So, for example, while assertiveness is seen as a protective factor for children in Western cultures, that's not always the case in some Asian cultures, and in fact low assertiveness in Chinese children has been found to be protective. So, it’s being mindful that there may be differences in the risk and protective factors across some cultures. The third point there, the unique symptoms of a common disorder is probably less relevant for our early intervention programs – it's more for clinical treatment programs. But it's just being aware that for some groups there may be differences in the way that a condition appears.

And then the final point there is if effectiveness is not demonstrated. So, if you do deliver an evidence-based program without making any changes and you find that the outcomes that you were seeking were not achieved, then that may be an indication that some adaptation is required to make it more relevant. If you are considering adapting an evidence-based program, there are a number of steps that you'd want to go through. The question, essentially, that you're asking is what needs to be changed and how you might go about doing it. We've already thought about the first two steps there – the importance of conducting the community needs assessment, and selecting an evidence-based program. The third point here is being able to identify what are the important differences between your families and the original population that the program was developed and evaluated with.

These differences, as we touched on before, could fall across a number of areas. So, these are some of the things that you might need to think about in adapting the program. There might be differences in language, which is obviously going to impact on the ability of the target group to understand the program content. You might want to think about changing the language in which the program is delivered or simplifying the language for families where English is a second language. When we're talking about language, we also need to think about the concepts and whether they have cultural relevance for different groups. So, for example, at Drummond Street when we're working with African populations and wanting to talk about issues of mental illness and depression and anxiety, those concepts don't translate terribly well.

The term mental illness for that population often just means crazy. And they often don't differentiate between physical and psychological symptoms. So, we consulted with members of the community to come up with more translatable terms. Rather than talking about depression and anxiety, we'd talk about things like the level of energy or the level of pressure they were experiencing, sleep problems and so on. So, in terms of those first two points there, language and ethnicity, they're the sorts of things that you'd want to think about in delivering or adapting your programs.

There might also be differences in socioeconomic status. And so what we're talking about there are the resources that the families have to draw on, whether it's transport or childcare, or it might be their education levels. So you may need to adapt the program depending on their capacity in that respect. And then, as we mentioned, urban and rural context. There's obviously going to be logistical barriers if somebody is living in a rural or remote setting, which is going to affect their participation in program activities. The next one there is often relevant for our families where they're impacted by a number of different risk factors, which may differ in severity. If a program is just addressing one risk factor – something like parenting style, perhaps – but the family is actually dealing with a whole lot of different risk factors that might be to do with housing instability or financial hardship or family violence, then just addressing that one risk factor is probably not going to have much of an impact for that family. And similarly with family stability.

So, they're the sorts of things that you want to think about where you might need to adapt your programs to make them more relevant. In deciding what changes to make, the goal is to modify the program in a way that makes it more suitable for your group without changing the program's core components. When we talk about core components, we mean the elements of the intervention that are thought to be responsible for its effectiveness. So, you want to be really clear about what the core components of the program are. And to do this, you need to understand the program really well. What is the theory of change that informs the program? It can be helpful to talk to the program developer to ascertain what the core components are that can't be changed. But essentially, there are three aspects of the program that shouldn't be changed as shown on this slide.

So, firstly, there's the core content components and that's looking at what the program is teaching in terms of changes of knowledge or changes in skills and so forth. So, it might be teaching things about the normal stages of child development or teaching children social and emotional skills or teaching parents behaviour management skills. You can't delete content from the program. Sometimes we meet with facilitators who say, "Oh, this particular session was too hard for our family so we're leaving it out." That's not the way to address it. You need to leave the core content in there, but find ways to make it more accessible for your family. So, it might be changes in the language, it might be allowing for more discussion of that topic or using visual aids and diagrams rather than words. Once you take the content out, then you're changing the intervention.

The second are the pedagogical components. So, that's talking about how we're teaching the information as opposed to the content of the information. It might be, as we mentioned before, things like modelling certain behaviours or incorporating role-plays into the sessions. Or it might be homework activities and those sorts of things. So, it's the method of teaching rather than the content. And then thirdly, the implementation components. As we mentioned before, this is about having the resources necessary, having suitably qualified staff to deliver the program, having all of the materials that you need. So, they're the things that we don't want to be changing in adapting our programs.

The things that we can look at changing, however, are listed here. So, location. We want to think about the settings where the participants are going to be most comfortable and where they're going to feel safe and the locations that are accessible to them. I know for some Indigenous communities they deliver their playgroups in a park setting because that's a more comfortable space. For some remote communities, they deliver programs in schools, which is often quite successful. It's a place where families are already familiar and that they will be congregating anyway. At Drummond Street, we deliver programs from the high-rise housing estates because that's where the families are that we're wanting to work with. So, you can certainly think about changing the location to make programs more accessible.

The second point there is about accessibility, which is not just about location, but it's also about whether your target group can get to the program. It might be about delivering the program after hours when they're not at work. It might be providing childcare for families so they can get there or thinking about transport. And for some families, it might be delivering programs online. Staffing as much as possible should reflect the diversity of the group that you're working with. So, using community members or leaders to co-facilitate or to facilitate the program can be very beneficial. We've certainly done that often with our African parenting programs. And you'd want the other facilitators to be trained in cultural competence as well.

So, we can also look at changing the language as we mentioned before or the other resources. So, adapting the images that you're using, adapting the content of the role-plays and so on to make them more familiar, more relevant for your target group. When we've delivered parenting programs for fathers – it can sometimes be challenging to engage fathers – we make sure that we're talking about fathers and that the role-plays include fathers and fathering activities. We've delivered the programs in outdoor settings and incorporated some physical elements to the programs, just as ways to make it more relevant for fathers and more engaging.

And then in terms of the activities, as we've said, you might want to minimise or eliminate written tasks if you've got low literacy levels or replace written tasks with discussion and so forth. So, returning to this slide again, once you've made the changes that you want to make, you need to pilot test the program with your community members and with facilitators. Because it won't be until you try it out that you find out what's working or whether there are more changes that are indicated. It's also an opportunity to get feedback at each session from the participants and the facilitators about how it's going. Keep in mind that adapting a program is a process that occurs over time. You might need to have a few goes until you get it right. Also remember to document any changes that you're making to the original program so that it can be replicated in future.

And the last point there is that it's really crucial to conduct outcome evaluations to make sure that our adaptations have been successful in maintaining the effectiveness of the program. There can be additional challenges conducting evaluations with different cultural groups, because, again, a lot of the standardised measures have been developed with mainstream populations and don't always translate terribly well. But we might have to leave that for the next webinar. And so then the third option there that I just wanted to touch on was the idea of developing or enhancing your own program with evidence-based core components. So, at times you might find that there isn't an evidence-based program that you feel is going to suit the needs of your target group, even with the good adaptations of the kind we've discussed.

In that case, you might think about developing or adapting your own programs by incorporating evidence-based program intervention components that have demonstrated effectiveness. This is an approach we've sometimes found to be useful at Drummond Street. Obviously, the preference is to use an existing evidence-based program because that's got all of the research behind it and has had robust evaluations done. But sometimes this can be a helpful solution. So, as many of you would be aware, there's certain conditions that you'd need to meet to receive approval through AIFS for your program to meet the 50 per cent evidence-based requirement for C for C's. And they're the same sorts of things that you would want to consider and develop in your own program. So, you'd want to make sure you've got a strong research background or theoretical framework to inform the program, that you've articulated a program logic or a theory of change, which explains how the interventions are going to lead to the outcomes that you're seeking.

You want to make sure that you've included evidence-based interventions or activities – that's what we'll have a look at in a minute – that you've got a program manual and documentation so the program can be replicated, that you've got suitably qualified facilitators, and that you've conducted a program evaluation to measure its effectiveness. So, in terms of those evidence-based interventions, once you've determined the type of program that you're seeking to deliver, you can then review the literature to find out what is known about the core components of effective programs of this type. Now, the CFCA website is a good starting point or you can conduct your own search for systematic reviews or meta-analyses, and then seek to incorporate some of these components into your own programs.

So, for instance, we had a look at the core components of effective parenting programs, based on some systematic reviews and meta-analyses that have been done recently. I won't go through all of those now, but you can see that they identify the sorts of skills you want to teach in your program, which might be parent-child play, problem solving or emotion regulation skills. They also found that programs were more effective when they were held in two-hour sessions, when between six and twenty-four sessions were delivered with two facilitators, and when fathers and grandparents were included in the programs; other examples include giving homework tasks to practise at home and making weekly phone calls to provide support and encouragement. So, you can see that once you're familiar with those, you can actually try to incorporate some of them into your own programs.

We did something similar at Drummond Street with our intensive family support program. We did some research to identify the key components of effective family support, which I've listed there, and then we developed our program around those components. So, we have one primary worker that's allocated to the whole family to hold the case for the period of their support. We work with all members of the family and work to engage – specifically engage fathers. We include home-based support and practical support so it's not just a counselling intervention. We work assertively to engage families rather than just closing the case if they don't turn up, and work collaboratively with the whole family in setting goals. And on that last point there, we work across a number of risk factors, because the evidence tells us that, particularly for high-risk families, it's going to be more effective to work across a number of risk factors rather than just focusing on one.

So we've identified six domains of family wellbeing that we work across, which include parenting and family functioning, but also social connectedness, material wellbeing and so on. And we've also worked to incorporate those fidelity measurements to ensure that these components of family support are being delivered in our programs. We've actually developed a database that can tell us how many families have received a home visit, how many sessions each member of the family has had and so on. Finally, I thought it was worth spending a little bit of time talking about the challenges of implementing evidence-based programs with Aboriginal communities, particularly those who live in more remote or more traditional Aboriginal communities, because this is an area that, some agencies tell us, presents particular challenges.

These are some of the reasons why it can be challenging to implement mainstream programs with remote Indigenous populations. The remoteness in itself can make it difficult to engage in programs. A history of trauma as a consequence of white settlement often leads to a lack of trust in authority figures, difficulty in discussing sensitive themes of family and children, and reduced confidence in parenting capacity. It's important to meet families where they're at and to set realistic outcomes. Children are often raised by the community and experience different parenting norms from mainstream communities. Many evidence-based parenting programs assume a nuclear family model, where the children live with the parents and parental responsibility for a child's behaviour is relatively clear and unambiguous.

That doesn't necessarily translate well for some traditional Aboriginal communities, which have more fluid living arrangements, where children may be unsupervised or may move from one house or one family to another; that clear parental responsibility doesn't translate terribly well. Often families are not used to meeting demands for regular attendance and adherence to timelines, and so participation rates can be inconsistent. And there aren't a lot of programs that have been evaluated with ATSI populations, but there are some. I've just listed a few there; it's not a comprehensive list. These are some of the programs that have demonstrated some success with Aboriginal communities that you might like to consider.

Alternatively, if you're looking to adapt another program for an Aboriginal population, there are certain points to consider. We've probably covered most of these before, but it's essential to engage with community leaders, to conduct consultation about the program content and the resources, and to try to engage the whole community in activities rather than just focusing on individual parents. It helps to employ local community leaders to help facilitate the program, and to ensure that non-Indigenous facilitators are trained in cultural competence. As we mentioned before, the location should be a safe place and easily accessible. It's a good idea to allow extra sessions for the sharing of personal stories and the development of trust between the participants and the facilitator. And again, think about culturally appropriate language, activities and resources, and adapt the activities to reflect the literacy competence of the group.

Finally I thought it'd be worth just considering one example of a program that was adapted for an Indigenous community. This was an adaptation of an evidence-based parenting program for Tiwi Islanders that one of our CFRE consultants Rama Pria was involved with.

Exploring Together is a parenting program for at-risk children, which was adapted for the Tiwi Islander community. The program includes three elements: there's a parent group, which teaches a behavioural approach to parenting; a children's group, which teaches emotional and psychosocial skills; and a parent and children's group, which is designed to strengthen the relationships between parents and children. There were some challenges that they experienced in delivering the program. They found that the children had a very limited feelings vocabulary and so found it difficult to articulate or to discuss feelings.

The parents also didn't readily or easily articulate their feelings or observations about one particular child. The behavioural approach to parenting didn't fit very well with the parenting style of the community, which was much more easygoing, less structured. As we mentioned before, they had the fluid living arrangements where children were moving amongst different family members. Even the basic concepts of behaviour management strategies and the styles of thinking associated with it were quite foreign to the Tiwi thinking style. And finally, the parent attendances were lower than they would have liked.

Let me mention some of the adaptations that they made. They delivered the program in a school environment so it was a familiar and safe place. They gained the support of community leaders, and four of the six program staff were Tiwi Islanders themselves. They adapted the children's activities to reflect their limited feelings vocabulary: facilitators explored the management of emotions verbally; children created collages by cutting faces from magazines and using them to illustrate emotions; and they also used role-plays and stories told by the children, as well as simple games and activities, to strengthen emotional competencies. They also adapted the behaviour management approach to parenting towards more of a group work approach with the parents, using a family systems therapy approach.

Despite the challenges, positive outcomes were still achieved. They found that parent-child relationships were enhanced, that the children did develop social and emotional skills, that they had success in engaging the families, and that teachers and parents described improvements in the children's behaviour. I guess the message from that example is that there are challenges, and not everything is going to go according to plan, but there are still benefits to implementing an evidence-based program and adapting it for different communities.

In conclusion, while there are undoubtedly challenges, positive outcomes can be achieved if you go through a careful process of adapting the original program whilst maintaining the core components. So, don't be afraid to try. The important things to keep in mind are that adaptation is often important for engagement and effectiveness, that you need to consult with members of the target population about what changes to make, that you want to maintain the core components of the program, that it's a good idea to discuss any changes that you're making with the program developer and to trial the adapted program with the target population, and that you should build in fidelity measurement and program evaluation. So, that brings me to the end of the formal presentation. Thank you for listening, and I've got some contact information there that I think will be sent around later.

WEBINAR CONCLUDED

IMPORTANT INFORMATION - PLEASE READ

The transcript is provided for information purposes only and is provided on the basis that all persons accessing the transcript undertake responsibility for assessing the relevance and accuracy of its content. Before using the material contained in the transcript, the permission of the relevant presenter should be obtained.

The Commonwealth of Australia, represented by the Australian Institute of Family Studies (AIFS), is not responsible for, and makes no representations in relation to, the accuracy of this transcript. AIFS does not accept any liability to any person for the content (or the use of such content) included in the transcript. The transcript may include or summarise views, standards or recommendations of third parties. The inclusion of such material is not an endorsement by AIFS of that material; nor does it indicate a commitment by AIFS to any particular course of action.

Slide outline: Implementation and adaptation of evidence-based programs

  1. Implementation and adaptation of evidence-based programs
    • Presenter: Sophie Aitken
    • This webinar is part of the CFCA information exchange webinar series: www.aifs.gov.au/cfca
    • Please note: The views expressed in this webinar are those of the presenter, and may not reflect those of the Australian Institute of Family Studies or the Australian Government
  2. Expert Panel Project
    • Commissioned by Department of Social Services (DSS)
    • 5 year project (2014-2019)
    • Support Families and Children Activity service providers to
      • plan, implement and evaluate quality programs
      • share the results with others
    • Aim: programs and services are built on the best available evidence of what works for children, young people and families.
  3. Elements of Expert Panel project
    • Expert Panel
    • Industry List
    • Support for service providers (e.g. Communities for Children)
  4. Support for service providers
    • DSS funded projects that are administered by Expert Panel members (e.g. outcomes measures)
    • Service-funded projects that assist services to plan, implement and evaluate programs (via Industry List)
    • Face-to-face and helpdesk support, resources and information (e.g. webinars, evaluation resources)
  5. Implementation & Adaptation of Evidence-Based Programs
    • Sophie Aitken
    • Centre for Family Research & Evaluation
    • 23rd February, 2016
  6. Centre for Family Research and Evaluation: CFRE
    • A partnership between drummond street services, a not-for-profit family services agency, and Deakin University Psychology Department, an academic, teaching and research institution
    • To promote the health and wellbeing of all Australian families by contributing to the evidence-base of family based interventions
    • To build sector capacity to strengthen evidence-based programs through expertise and collaboration
  7. Image of young child on toy phone
    • Look, I don't know why the square peg doesn't fit in the round hole. But that's not my problem. Fix it.
  8. What we will cover in this Webinar
    • Selecting evidence-based programs
    • Measuring program fidelity
    • When to consider adapting EB programs
    • Process for adaptation
    • "Good and bad" adaptations
    • Building evidence-based components into your own programs
    • Adapting for ATSI communities
  9. Image of a mother and child
  10. Differences that may warrant adaptation
    • Culturally and Linguistically Diverse
    • Aboriginal and Torres Strait Islander
    • Ages of Children targeted
    • Rural and Remote communities
    • Education Level
    • Socio-economic Status
    • Gender & Sexual Diversity
  11. Community Needs & Service Gap Analysis
    • At-risk groups
    • Risk factors
    • Ages of children
    • Program type
  12. Options
    1. Implement an existing Evidence-Based program
    2. Adapt an existing Evidence-Based program
    3. Develop a program with Evidence-Based core components
  13. The hexagon tool
    • Needs: How well the program meets identified needs
    • Fit: Overall fit with current priorities, structures and support
    • Resource availability: For training, staffing, technology, data systems and administration
    • Evidence: Indicating the outcomes that might be expected if the program is implemented well
    • Readiness for replication: Expertise, exemplars available for observation, how well the program is operationalised
    • Capacity to implement: As intended and to sustain implementation over time
    • Source: Blase, K., Kiser, L., & Van Dyke, M. (2013). The Hexagon Tool: Exploring Context. Chapel Hill, NC: National Implementation Research Network, FPG Child Development Institute, University of North Carolina at Chapel Hill.
  14. AIFS Guidebook of Evidence-based Programs
    • CfC program objectives
      • Create Strong Child-Friendly Communities
      • Early Learning & Care
      • Healthy Young Families
      • School Transition & Engagement
      • Supporting Families & Parents
    • Target groups
      • Infants (0-2 years)
      • Early Childhood (3-5 years)
      • Middle childhood (6-12 years)
      • Parents
      • At-risk or vulnerable
    • Keywords
      • https://apps.aifs.gov.au/cfca/guidebook/search
  15. Communities That Care®
    • A research-based process to enable communities to develop long-range plans for promoting healthy development of children and young people
    • Adapts to the needs of different and distinct communities
    • The process:
      • Understanding local needs - Community profiling including youth surveys
      • Identifying risk and protective factors across individual, family, peer, school and community domains
      • Community Engagement - mobilise support of key leaders and build working partnerships between residents, organisations and agencies
      • Develop Community Action plan to select and implement Evidence-Based programs and practices
      • A Monitoring & Evaluation plan to assess effectiveness
  16. Program Fidelity
    • Is the program being carried out as intended?
    • We can only know this if we put in place measures to monitor that all of the core components of the EB program are being adhered to
    • this is known as fidelity measurement
  17. Identify Core Components
    • Before we can measure fidelity, we need to identify the core components that need to be adhered to:
      • Target group we are seeking to deliver to
      • Qualifications of the facilitators
      • Number of sessions to be delivered
      • Core program content to be covered
      • Key activities and teaching methods - modelling, role plays, peer support, homework, group activities, videos, handouts
      • Participant engagement and responsiveness
  18. How can we measure fidelity?
    • Collect demographic data about clients
    • Record details of facilitators including qualifications and whether they have completed relevant training
    • Collect attendance records for every session
    • Complete session records to track which topics were covered and which activities/homework were delivered
    • Include trained facilitators to observe and provide feedback on delivery methods
    • Record which participants completed homework tasks
    • Obtain client satisfaction surveys
  19. Time to adapt
    1. Implement an existing Evidence-Based program
    2. Adapt an existing Evidence-Based program
    3. Develop a program with Evidence-Based core components
  20. Common Risk Factors and Common Protective Factors

    Common risk factor | Common protective factor
    Unhealthy lifestyle | Healthy lifestyle
    Maltreatment, neglect & abuse | Emotional competency skills
    Family conflict | Family cohesion/harmony
    Harsh parenting styles | Parental involvement in children's activities
    Parental mental illness/substance abuse | Supportive relationship with an adult
    Poverty/economic insecurity | Access to health and support services
    Lack of warmth & affection | Community & social connectedness
    Social isolation | Social skills
    Homelessness | Family stability and security
    School failure/low commitment to school | Opportunities for success at school
    Bullying & victimisation | Prosocial peer group

    Sources: Toumbourou, J. (2015). A review of therapeutic processes targeted in evidence-based parent and family interventions. Prevention Science Consulting Group, unpublished paper; Commonwealth of Australia (2000). National Mental Health Strategy: Promotion, Prevention and Early Intervention for Mental Health, A Monograph 2000.
  21. When to consider adapting…
    • Ineffective engagement
    • Unique risk or protective factors
    • Unique symptoms of a common disorder
    • Effectiveness not demonstrated
    • Source: Adapted from Barrera, M., & Castro, F. G. (2006). A heuristic framework for the cultural adaptation of interventions. Clinical Psychology: Science and Practice, 13, 311-316.
  22. Process for Program Adaptation
    1. Community Needs Analysis
    2. Choosing an Evidence Based Program
    3. Identifying Differences between Target and Original Populations
    4. Deciding what Changes to Make
    5. Pilot Testing the Adapted Program
    6. Evaluating Effectiveness
    • Adapted from Chen, E. K., Reid, M. C., Parker, S. J., & Pillemer, K. (2013). Tailoring evidence-based interventions for new populations: A method for program adaptation through community engagement. Evaluation & the Health Professions, 36(1), 73-92.
  23. Potential sources of mismatch between original validation groups and current target group

    Source of mismatch | Original validation groups | Current target group | Potential mismatch effect
    Language | English | Non-English speaking | Inability to understand program content
    Ethnicity | Anglo-Saxon, non-minority | Minority group | Conflicts in beliefs, values and/or norms
    Socioeconomic status | Middle class | Low SES | Insufficient resources and different life experiences
    Urban-rural context | Urban, inner city | Rural, remote | Logistical and environmental barriers affecting participation in program activities
    Risk factors: number & severity | Few and moderate in severity | Several and high in severity | Insufficient effect on multiple or most severe risk factors
    Family stability | Stable family systems | Unstable family systems | Limited compliance in program attendance and participation
  24. What not to change: “Bad” Adaptations
    • Core content components:
      • knowledge, attitudes, values, and skills
    • Core pedagogical components:
      • how it’s administered - methods, strategies, interactions
    • Core implementation components:
      • logistics, resources, staffing and ability to maintain fidelity
    • Source: Resource Center for Adolescent Pregnancy Prevention (ReCAPP, 2013)
  25. “Good” adaptations
    • Location
      • Participants feel comfortable and safe
    • Accessibility
      • Participants can easily get to the program
    • Staffing
      • Facilitators reflect the diversity of group
    • Language & Other resources
      • Group content is understood
    • Activities
      • Tailored to capacity and culture of group
  26. Process for Program Adaptation
    1. Community Needs Analysis
    2. Choosing an Evidence Based Program
    3. Identifying Differences between Target and Original Populations
    4. Deciding what Changes to Make
    5. Pilot Testing the Adapted Program
    6. Evaluating Effectiveness
  27. Options
    1. Implement an existing Evidence-Based program
    2. Adapt an existing Evidence-Based program
    3. Develop a program with Evidence-Based core components
  28. Developing an Evidence-Informed Program
    • Theoretical or Research Background
    • Program Logic or Theory of Change
    • Evidence-Based Interventions or Activities
    • Program Manual & Documentation
    • Qualified Facilitators
    • Program Evaluation
  29. Core Components of Effective Interventions
    • Determine what type of program you want to deliver - e.g. parenting programs, children's programs, whole family support, playgroups
    • Review the literature to find out what is known about the core components of effective programs of this type
    • CFCA website a good starting point https://aifs.gov.au/cfca/ or search for systematic reviews or meta-analyses
    • Incorporate these components into your own programs
  30. Core Components of Effective Parenting Programs
    • Teaching skills in: parent-child play; helping your child learn; limit setting; problem-solving; emotion regulation; communication
    • Understanding different stages in children's development
    • Programs held in 2 hour sessions
    • Between 6 and 24 sessions
    • 2 facilitators
    • Programs based on behavioural approaches
    • Include fathers and grandparents
    • Homework tasks to practice at home
    • Focus on strengths rather than deficits
    • Video and role play to demonstrate parenting strategies
    • Weekly phone calls to provide support and encouragement
    • Participant discussion and sharing of stories to enable peer support
  31. Key Components of Effective Family Support
    1. Dedicated Worker
    2. Whole Family
    3. Father-inclusive
    4. Home-based
    5. Includes practical support
    6. Assertive engagement
    7. Collaborative
    8. Strengths-based
    9. Goal-directed programs rather than those offering generic support
    10. Programs that target a number of risk factors either simultaneously or sequentially
  32. Challenges of Implementing Evidence-based Programs for ATSI communities
    • Remoteness
    • History of Cultural Decimation and Stolen Generations
    • Children Raised by Community rather than Nuclear Family
    • Different Parenting Norms
    • Fluid Living Arrangements
    • Demands for Regular Attendance & Adherence to Timelines
    • Programs Not Evaluated with ATSI populations
  33. Some effective programs
    • Aboriginal-specific programs that have been shown to be effective:
      • NSW Aboriginal Maternal and Infant Health Strategy (NSW Health, 2005)
    • Mainstream Programs that included Indigenous families in evaluation:
      • The Family Independence Program (Homel, R. et al, 2006)
      • The Preschool Intervention Program (Homel, R. et al, 2006)
    • Mainstream EB programs that have been adapted for Indigenous families with partial success:
      • Triple P - Positive Parenting Program (Turner et al, 2007)
      • The Resourceful Adolescent Program (Rowling et al, 2002)
      • MindMatters (Hazell, 2005)
      • Exploring Together Preschool Program (Robinson et al, 2009)
      • Family and Schools Together Galiwin’ku program (Guenther, J. 2011)
  34. Adaptations to Consider for ATSI Programs
    • Community Consultation on program content and resources
    • Engage whole Community in activities
    • Employ local community leaders to help facilitate the program
    • Non-Indigenous facilitators trained in cultural competence
    • Location must be considered a ‘safe’ place and be easily accessible
    • Extra sessions to allow for sharing of personal stories and development of trust between participants and facilitator
    • Culturally appropriate language, resources and activities
    • Adapt activities to reflect literacy competence of the group
    • Source: Mildon, R., & Polimeni, M. (2012). Parenting in the early years: effectiveness of parenting support programs for Indigenous families. Resource sheet no. 16. Produced for the Closing the Gap Clearinghouse. AIHW cat. no. IHW 77. Canberra
    • Source: Guenther, J. & Boonstra, M. (2009) Adapting Evaluation Materials for Remote Indigenous Communities and Low-Literacy Participants, Paper Presented to APCCAN 2009 Symposium 08, Perth, 17 November, 2008.
  35. Exploring Together for Tiwi Islanders
    • Exploring Together - Parenting Program for at-risk children
    • Parent group - Behavioural parenting approach
    • Children's group - Psycho-social skills
    • Parent-children group - strengthen parent-child relationship
    • Source: Robinson, G. & Tyler, B. (2003) Evaluation of an early intervention program on the Tiwi Islands, Interim Report, School for Social Policy and Research, Charles Darwin University, Australia.
  36. Exploring Together for Tiwi Islanders - Challenges
    • Children had limited “feelings” vocabulary
    • Parents did not readily articulate feelings or observations about one particular child
    • Behaviour management approach to parenting did not fit well with parenting norms and thinking styles
    • Parent attendances were lower than optimal
  37. Exploring Together for Tiwi Islanders Adaptations
    • Delivered in School Environment so familiar/safe place
    • Gained support of Community leaders
    • 4 of 6 program staff were Tiwi Islander
    • Adapted children's activities to reflect limited feelings vocabulary
    • Adapted the behaviour management approach to parenting to a more family systems therapy approach
    • Outcomes - despite challenges
      • Parent/child relationships enhanced
      • Children developed social-emotional skills
      • Success in engaging families
      • Improvements in children's behaviour
  38. Key Messages for Adapting EB programs
    • Adaptation is Important for Engagement and Effectiveness
    • Consult with Members of the Target Population
    • Maintain the Core components of the Program
    • Discuss any Changes with the Program Developer
    • Trial the Adapted Program with the Target Population
    • Build in Fidelity Measurement & Program Evaluation
  39. Thank you for listening
    • Contact Information:
      • Centre for Family Research & Evaluation
      • drummondsectorsupport@ds.org.au
      • Ph: (03) 9663 6733
  40. Further information
    • Expert Panel project website
      • https://aifs.gov.au/cfca/expert-panel-project
    • Email
      • fac-expert-panel@aifs.gov.au
    • Join the conversation
      • You can continue the conversation started here today and access a range of related resources on the CFCA website: www.aifs.gov.au/cfca/news-discussion

This webinar was held on 23 February 2016.

Drawing on the combined knowledge and experience of drummond street services and Deakin University’s School of Psychology, the webinar provided family service agencies with advice on how to select, implement and adapt evidence-based programs to meet the needs of local communities.

Topics covered included:

  • Selecting evidence-based programs to meet your community’s needs.
  • Adapting programs for diverse client groups without losing the evidence base.
  • Sourcing core evidence-based components to build into your own existing programs.
  • Implementing programs to ensure best practice.

The feature image is by Ricardo Motti, CC BY-NC-SA 2.0.

About the presenters

Sophie Aitken

Sophie Aitken is the General Manager of Implementation and Quality at drummond street services, an innovative, research-based family and community service provider. She has a key role in providing support to family service agencies across Australia via the Department of Social Services’ Expert Panel project in the areas of Program Planning & Implementation and Outcomes Evaluation.