Evidence-informed decision making

Using research evidence to inform practice

Event date

23 March 2022, 1:00 pm to 2:00 pm (AEST)


Ken Knight, Amanda Paton, Beth McCann, Joanna Schwarzman




About this webinar

This webinar was held on Wednesday 23 March 2022.

The child, family and community welfare sector is increasingly being asked to use evidence to inform its decisions about service and program design and delivery to support better outcomes for children and families. Research evidence can be integrated with other sources of evidence, such as practitioner expertise and client experience, to implement an evidence-informed approach to decision making in practice. This webinar:

  • introduced frameworks and evidence about what helps to support research use within organisations
  • discussed the barriers and enablers of integrating research into practice and provided examples of approaches to support evidence-informed practice
  • provided examples of how organisations and practitioners have successfully combined research evidence with other types of knowledge to create impact or change.

This webinar is designed for program planners and practice managers involved in practice and organisational decision making. Practitioners looking for opportunities to strengthen evidence-informed decision making in their organisation, and researchers wanting to improve implementation of research findings will also find it useful.

Joanna Schwarzman facilitated this webinar, which builds on related CFCA short articles.

Audio transcript (edited)

JOANNA SCHWARZMAN: Hi, everyone. Welcome to today’s webinar, Evidence-informed decision making: Using research evidence to inform practice. My name is Dr Joanna Schwarzman. I am a Research Fellow in the Child and Family Evidence team here at the Australian Institute of Family Studies. I’d like to start with an acknowledgement of the Boonwurrung and Wurundjeri people of the Kulin nation as the traditional owners of the lands that I’m speaking from today. I’d like to pay my respects to Elders past, present and emerging, as well as any Aboriginal and Torres Strait Islanders joining us today on the webinar.

While this webinar focuses on the use of research evidence in combination with other sources of knowledge, I’d like to recognise that Aboriginal and Torres Strait Islander knowledge, including knowledge of what works for Aboriginal families and services, is integral to decision making in practice. I’d also like to acknowledge the many Aboriginal and Torres Strait Islander researchers who have contributed to the body of research evidence as well. So, welcome everyone to what we hope will be an engaging discussion about how we can support and enhance the way we work by building an evidence base into our practice.

Joining me in the discussion today, we have a panel with experience that spans the breadth and depth of this topic. Each of our presenters has unique and shared experience of working on both sides of research and practice. So, I’d like to introduce Ken Knight from the Murdoch Children’s Research Institute. Hello, Ken. Amanda Paton from the Australian Centre for Child Protection.

AMANDA PATON: Hi, everyone.

JOANNA SCHWARZMAN: Hi, Amanda. And Beth McCann from the Centre for Family Research and Evaluation at Drummond Street Services. Hello, Beth.

BETH MCCANN: Hey, everybody. Hi.

JOANNA SCHWARZMAN: Thanks for joining us today. Okay, so setting the scene for today’s discussion, this webinar will explore how research evidence can be used to support decision making in programs and practice and organisations in the child, family and community welfare sectors. Our aim is to give you some ideas to take away about how to integrate research into decision making at the practice level or the programmatic or planning level and to help you understand some of the challenges that are commonly faced to improving evidence-informed decision making within an organisation.

In the discussion, I’ll ask questions of the panellists about frameworks for getting evidence into practice and what helps to support research use in organisations, some of the barriers and enablers of integrating research into practice and ask our panellists to provide examples of approaches that support evidence-informed practice in their settings that they’ve worked in, and give examples of how organisations and practitioners have successfully combined research evidence with other types of knowledge to create impact or change.

So, let’s get stuck into some questions with our panellists. Ken, I was really interested in starting with you because I know you’ve worked in the field of knowledge translation for many years and you’ve supported many people to think about how they create research evidence that will be useful for practice, as well as supporting practitioners to use that research evidence in their decision making. Are you able to explain for us today, what is evidence-informed decision making and what are some of the key concepts that are related to evidence-informed decision making?

KEN KNIGHT: Thank you, Jo and hi, everyone. Yes, I will try my best but that’s quite a big question that you’ve posed. I’d like to also begin by acknowledging the traditional owners of the country on which I live and work, the Wurundjeri people of the Kulin nation and pay my respects to Elders past, present and emerging.

I’d also like to provide a little bit of - I guess some broader context before we hear from some of the more applied examples and I guess the question of what is evidence-informed decision making, what is evidence-informed practice is something which is dear to my heart in the space that I’ve worked in for the past 12 years, really bringing what I would call a knowledge translation and research impact lens to making evidence more relevant, more accessible in applying evidence in quite specific contexts and I did work for many years as part of the Child Family Community Australia Information Exchange so it’s really brilliant to be back and speak as part of this event today.

So, the concept of evidence-informed decision making and evidence-informed practice is relatively straightforward: it’s about ensuring that the decisions we make and the actions that we take are informed by the best available evidence. The subsequent questions that this process raises for all of us, about the nature of evidence and the ways that we work, are, I think we have to acknowledge, quite big and complex. Today, we will probably only skim the surface of this field but hopefully begin, or continue for those of you who have been thinking about this work for some time, a much larger and sector-wide conversation about doing this important, necessary work as well as we can. I guess to set the scene, I’d like to share some slides with you.

So, if we could have the first slide please. Brilliant, thank you. So, I really like this image and I think it’s a helpful place to start because it reflects the striking reality for most practitioners: that sense of overwhelm is real and justifiable, and many of us feel, I think, at imminent risk of being swept away by the sheer scale of the available research evidence. So, just to contextualise that, in the field that I work in currently, health and medical research, we have over 2,000 research articles published every day internationally. And while this poses amazing opportunities to address problems in child and family health and wellbeing - we’ve never known so much as we do at this moment - there is no individual, organisation or system that can currently make sense of or integrate evidence at that scale of production.

So, I think the hard reality that we face in this space is that there is no simple fix and that we do need to embrace some complexity in this work. Applying research evidence in decision making and in practice is also not just a matter of following a series of procedural steps, and I think all of you will relate to that if you’ve tried it. Similarly, if you’re a parent, you know that raising a child is not a matter of reading and applying the child-rearing manual. What we’re talking about here are complex practices that require skill in situational judgement, which comes from experience as well as evidence about what works.

So, in my work, the approach that I take to try and address this complexity is using the tools and the processes provided by the field of knowledge translation, and this is about bridging that gap between what is known, the evidence, and what we do: how we make decisions and what we do in our practice. So, you may have noticed that I’m using the term ‘knowledge’ here as well as, or rather as my preferred term instead of, ‘evidence’, and I think this points to a fundamental, tricky question that we need to grapple with when we’re talking about this area.

How do we know what we know? How do we approach sharing that knowledge and, fundamentally, how do we apply that knowledge once we’re sure that it’s the right knowledge that we need to be grappling with? I think linked to that question of, “What is knowledge and how do we know what we know?” is a key awareness here that there are multiple types of knowledge and that to address challenges in child and family health and wellbeing, we do need a unified approach that brings together these key areas of what we know. So, you can see on the slide, we have research knowledge, which is held primarily by researchers, but we also have practice knowledge, which is primarily held by practitioners.

We have experiential knowledge or lived experience knowledge which is held by parents and community members. We have organisational knowledge which is held by service system organisers and we have policy knowledge which is held by policy makers. So, going back to the question you asked me around what is an evidence-based or an evidence-informed approach, at heart, it’s about an attempt to bring these different, necessary perspectives together. So, here, we can see this is the AIFS model of evidence-informed approaches.

You will likely have seen this model, or other models, or have your own model for evidence-informed approaches, evidence-informed practice or evidence-informed decision making, and again, it’s about linking these separate, often fragmented and siloed areas. I guess the fundamental question in my work, and I think in all of our work today, is, “How? How do we do this?” This is lovely to bring together conceptually but what’s our way in? Fortunately, in response to that question, there are frameworks to guide us, and they’re an excellent place to start, providing a map and a compass to navigate this tricky terrain. So, on-screen here is a diagram depicting the appropriately named knowledge to action framework, which some of you may be familiar with.

The central circle here reflects the process of knowledge and evidence creation. Importantly, this framework recognises that there is a gap that we need to bridge between what we know and what we do and that this bridging of the gap and that applied kind of work requires significant and structured effort. So, here we also see on the outer cycle, a continuous, iterative ring that involves identifying real-world problems and linking and tailoring relevant knowledge, adapting the selected knowledge to a local context, identifying local barriers to the uptake and use of that knowledge, selecting and tailoring interventions to overcome those barriers, monitoring knowledge use, evaluating outcomes and sustaining knowledge use and I think importantly in this model, this is not a one-directional way.

It goes in all directions but also, fundamentally, the research knowledge generation is informed by real-world problems, not just people cooking up good questions that are then to be applied by others. And I think, taking a step back and bringing some of this together, the promise, the proposed impact if we can do this well, if we can have a structured and meaningful approach to this work, is that the research evidence, the knowledge, will be more relevant, more accessible and more likely to be applied; in turn, policy and practice will have the capacity to apply that knowledge, and that will lead to improved outcomes for children and families. But I think that’s enough from me at the moment and, Jo, I’ll hand back to you.

JOANNA SCHWARZMAN: Thanks, Ken. You’ve given me a lot to think about there. I really liked the slide at the start, the one of the overwhelm with the big wave of papers coming down; I think that might resonate with lots of us when we think about where to start incorporating research evidence, or even where to start with accessing research and evidence relevant to our practice and sifting through, what did you say, 2,000 papers per day? That sounds outrageous. Luckily, most of us get to narrow down what we’re looking at out of those 2,000. I also really liked, from that knowledge to action cycle - I think what it spoke to me is that it gives us lots of different opportunities for intervention.

When I say intervention, I mean there’s opportunities to use evidence or incorporate those - integrate those different types of evidence, so in planning the question or planning the intervention or even checking what’s being used and I like how you finished there with ultimately, it’s to make programs and policies as effective as they can be and then assess that afterwards to create positive change for our communities and families. Thank you.

You did touch on the different kinds of evidence and I wanted to move onto Beth here because we had a few questions along similar lines as well about different types of evidence and how they can be used alongside each other for evidence-informed decision making in practice and Beth, I know this is one area that you’ve been looking at and working on to help at Drummond Street Services, so what kind of evidence do you include in your work at the Centre for Family Research and Evaluation and how does research evidence contribute to evidence-informed decision making at Drummond Street?

BETH MCCANN: Okay, thanks Jo and thanks Ken for the nice introduction which has placed us quite well. I think you’ve given us a really good grounding and framework thinking about evidence-informed decision making and evidence-informed practice. I’d like to begin by acknowledging I’m calling in from Dja Dja Wurrung country and I’d like to acknowledge the Elders past and present of these beautiful lands. So, I guess in terms of bringing in evidence-based decision making, the Centre for Family Research and Evaluation is based at Drummond Street so we’re an applied research institute within an organisation which is quite a great thing for an organisation to have I think.

Many organisations would love to have a little set of researchers based within the organisation and I guess we’ve got a remit to bring in evidence to inform practice and to ensure that we can help synthesise evidence that’s coming out. As Ken explained, there’s so much evidence out there so what do we need to help synthesise that for practitioners and for our managers so that we can make decisions based on emerging best practice and also to build evidence? So, there’s a lot of evidence building and evidence generation which happens at Drummond Street as well which we try to capture as a team.

In terms of evidence-based decision making, we have an evidence-based management framework, so if you have a look there in the handouts, you can see a copy of the evidence-based framework, and I guess it builds on the [APES] practice, research and people framework. So, we really wanted to look at how we can bring in the different forms of evidence to inform decisions. So, we have - sorry, I have a two-month-old baby in the background who’s coming to interrupt. Sorry about that. So, basically, the evidence-based decision-making framework is structured around organisational data. Sorry.

JOANNA SCHWARZMAN: All good, Beth, these things happen.

BETH MCCANN: Little bit of distraction, badly timed. So, we had the organisation - sorry.

JOANNA SCHWARZMAN: Beth, if you want to take a minute, let us know and we might - we could move on to Amanda if you -

BETH MCCANN: All good, she’s happy.

JOANNA SCHWARZMAN: She’s happy? Lovely. If you’re happy to keep going, thank you.

BETH MCCANN: Good thing about being on a video cam, I can breastfeed while I’m working.

JOANNA SCHWARZMAN: And our audience are people who work with children and families so I think you’re well placed.

BETH MCCANN: So, in terms of the evidence-based management framework, we have organisational data which is really critical to how our organisation works and functions. In our organisational data at Drummond Street, we collect really comprehensive demographic data about the individuals and families who come to Drummond Street Services, so not only demographic data but their presenting needs, their risk factors, the risk alerts, what risk alerts do we currently need to manage? This paints a really interesting picture about our clients and is something that we’ve added into this evidence-informed management framework.

We also have literature, so what literature is emerging in terms of best practice that we can integrate into particular programs and services, and how do we integrate that in a way that’s meaningful to practitioners who are busy, who are under stress, who are understaffed, particularly during COVID where we’ve seen an escalation of risk across the board? We have our practitioner wisdom and expertise. How do they go about implementing evidence-based programs? How do they deal with evidence? What are they seeing on the ground and how are the people that they’re working with actually presenting and does that fit with the evidence that we’re bringing in for them?

And then our user experience, what is it that our clients coming into the services want in terms of consuming services, in terms of the services and practices that they need? So, I guess we use those four different domains of knowledge and we try to value them each equally. I think there’s always been this traditional hierarchy of evidence coming from universities, with evidence-based programs and practice being the gold standard, but actually, when we’re working with marginalised communities, how do we bring in these different forms of knowledge from the communities themselves, from the practitioners who are on the ground and from our organisational data, which helps us analyse the particular demographics and risks in the cohorts that are actually turning up for our services?

To give equal weight to each of those four domains is kind of tricky. So, we’ve been working on it over the years and I think COVID actually presented us with an opportunity to implement it quite well. When COVID started, we knew that things were going to change, we knew that there were going to be things that we would need to respond to as an organisation, as a management team and as service providers and practitioners on the ground. So, we looked at how we could use the evidence-based management framework to learn from what was happening and to respond to it quite effectively. So, in terms of getting the organisational data, we started to pool that data on a regular basis.

What were we seeing in terms of risk factors, risk alerts and needs? Were different risks escalating? And we saw, for example, a tripling in family violence risk and a more than doubling in suicide risk. We built in evidence from overseas and national literature: what were other researchers seeing, what were other organisations seeing, what were other practitioners seeing? So, how could we use what we were seeing at Drummond Street and compare it to what was happening on an international and national level? We looked at practitioner wisdom, what were they seeing on the ground? We don’t always have the opportunity to go to every single team meeting but during the beginning of COVID, we did.

People from my team sat down at every team meeting and asked COVID-related questions to our practitioners so that we could see and capture what was happening and feed back to our executive team so that we could make decisions based on what was happening and also client voice, it’s kind of tricky to get client voice during a pandemic, particularly when everybody is consuming services online. So, we had to think a little bit creatively about how to do that.

So, we worked with my evidence unit in terms of, “What measures could we bring in?” and the University of Miami really quickly released a pandemic index scale, so we added that to our inhouse evaluation, which has got two of the best four components, so we could see how much people’s lives were impacted across different domains of wellbeing as a result of the pandemic. We added different questions to our feedback forms about telehealth and the provision of telehealth services. Never before had we provided all of our services online through telehealth, so what did that mean for people and how did they consume services in that way? We had interviews and focus groups with clients as well coming into the service.

We had staff members going down to our commission housing and seeing what we needed as a community in those collective units, and we helped to gather this information and synthesise it. We also needed to think creatively about how to share the information. So, we saw risk escalate and, if risk is escalating for practitioners who, for the first time, are working at home, isolated, on their computers, while managing everything else, the chaos that happens inside a home, then how could they get that information and read it in a way that was helpful to them? So, we actually built some interactive online reports and what these allowed people to do was to interact with information.

You didn’t have to sit and read a lengthy PDF document that was hard to consume. You could actually paint your own journey through an interactive online platform and be able to read the bits that were relevant to you or the bits that you needed to read and, in fact, I’ve never had so many practitioners email me and say, “Oh my God, I love the interactive platform. You tricked me into reading the whole thing and I’ve never read a whole one of your reports,” which I was slightly offended by, that they’d never read a whole report, but that’s okay, I know now, stick to the interactive platforms.

So, I guess it’s things like this. How can we actually use evidence? How can we synthesise evidence and how can we build evidence? What’s happening on the ground that can feed back up as well? So, this is just one example of how we tried to conceptualise, use and operationalise our evidence-based management framework, but I think it shows the audience that it can be done and the different ways that different types of knowledge can be harnessed.

JOANNA SCHWARZMAN: Beth, I’ve written lots of notes because there’s lots of things I wanted to pick up from your example there and I think I’ll go to the one right at the end, one of the impacts there, having people read a different kind of report, what a great impact that information has been shared onwards when maybe previously, people wouldn’t have. So, well done on that. I think you’re right, you did mention that it’s quite lucky to have researchers sitting within an organisation to have those close links.

I know not everyone who’s listening today will have that situation but it sounds like you’ve been able to make the most of it and really embed that evidence use or evidence generation and use alongside practitioners who are working with communities and clients and integrate all those different kinds of knowledge to come up with these reports, come up with suggestions, come up with ways to collect local data as well and respond quickly.

I think your example of the COVID times makes a lot of sense to integrate research from around the world because that was happening on a big scale but the local changes were really important when you need to make decisions about how to respond and as I was listening, it did make me think of another webinar we recorded. I think it was late last year with Amanda, which was about - was it about collecting and using data within organisations?


JOANNA SCHWARZMAN: So, I’d encourage people, if you’re interested, to have a look back for the recording on our website on that one too. Thanks so much, Beth and nice to meet your daughter on the camera too. Amanda, I’d like to throw to you now and just - I think because you can build on Beth’s experiences there, you’ve been working with organisations and practitioners to support the use of evidence in decision making and practice for a long time. You presented in our data webinar. You’ve got experience at the collection and use end. Are you able to tell us a bit more about your experiences please? And what are some of the factors maybe that help or hinder practitioners who are trying to incorporate research evidence or evidence into their decision making? What have you seen work to overcome those challenges?

AMANDA PATON: Thank you and there’s no baby here so it’s a hard act to follow from Beth and no slides either so I feel unprepared without props. I think I’m in a really fortunate position where my entire role at the moment is actually around, “How do we translate evidence into practice?” and looking at how we bridge that gap and really for the ultimate gain of making improvements in terms of the way we work with children and families.

As you mentioned, I’ve worked probably for the last 18 or so years in the child protection space, both as a practitioner but also as a manager and a leader, really looking at service design and development, so creating programs and then reporting on programs and outcomes, and then later on looking at things from a system policy perspective within government, and now more research translation. Reflecting on that for this webinar really highlighted that slide at the beginning from Ken. I think we’ve all been there where we’ve gone to look at a subject or a topic or we’re faced with a new client or a challenge and the mass of information is just so overwhelming. So, if I put my practitioner hat on, how do we manage that and how do we sift through all the noise to find what we need?

I think, for me, there are three really broad factors that block or hinder practitioners from being able to translate evidence into practice. The first one really is culture and leadership: at that organisational level, if there’s not a culture that really supports continuous improvement and a genuine quest to review our practice, to look at the outcomes and to do self-assessment and ongoing development, then that’s going to be a huge barrier. The second one is really competing demands, and this I think happens at the practice, system and organisational levels.

So, we might have specific contracts that have fixed service models or a prescribed list of things that we need to do with clients, regardless of what sector we’re working in, or tight requirements. We often have high workloads and caseloads and, picking up on a previous comment from Beth, we’ve also got this rapidly changing environment of COVID now. I remember myself when we had to pivot within a couple of days from face-to-face therapy to online therapy; it was, “Gosh, where do we get the literature for that, where do we find the best evidence base for this and how do we do this?”

So, those competing demands at all the levels are a huge factor, along with the lack of flexibility in funding and resources that sometimes exists. And then the third factor, which I think is probably the biggest barrier at a practitioner level, is that we’ve got no idea where to start sometimes. All we see is that sea of thousands of documents (Ken, I was horrified that there are 2,000 a day in your area) and it’s overwhelming, and where do you start? That’s a real challenge.

I think I’ve been fortunate enough over the years to be involved personally in those projects where you have a lovely partnership with research, so you have practitioners working alongside university-supported researchers, alongside policy and you have this lovely enmeshment and feeding in of this continuous improvement and that’s great and here in WA, I think Home Stretch is a great example where we’ve got practitioners working alongside researchers working alongside government and there’s this continuous feedback loop and changing of and translation of evidence into practice and acknowledging lived experience as well as practice wisdom and literature both nationally and internationally but that’s not always possible.

So, I think some of the best long-term change, particularly embedding that culture of valuing evidence translation into practice and overcoming some of those barriers has been through development of communities of practice and so, if I think about when I - many years ago, I established a large psychology team and then later on, a multidisciplinary team of about 60-odd staff, all from a range of different backgrounds but all focused on child abuse and trauma and assessment and responses, both in a clinical setting but then in an outreach setting as well.

I remember one of the first things that my chief executive at the time tasked me with before I could expand services was to go out and see if we were doing the right thing and see what models are out there, see what theories and what type of services we should be providing. So, I did my good university due diligence and I turned to the literature and I did lots of searches and I looked at the sector and what other people were doing and then I was able to prepare a paper that went up to the CE and made a whole bunch of recommendations on where we should go.

So, I had this lovely time that was contained to review and reflect and then implement and then of course, work happened and clients happened and as the leader, I was struck with, “How do we keep doing this when we don’t have time in our day-to-day?”

So, I realised that what we had was this huge community of practice where we had such rich knowledge, so by utilising a hybrid of a peer supervision model but bringing in research and bringing in practice wisdom and lived experience from a really, really broad range of practitioners across different geographical locations from different disciplines and different theoretical orientations - some practitioners were brand-new and they were students studying at the time and others had been in the field for 20-odd years, some had specialties in child sexual abuse and others in family violence.

So, how can we create a community of practice where I don’t have to read every journal article but I can present one that’s particularly relevant to me and share it with the team? I don’t have to attend every training on everything but I can go to one particular training that’s of interest and I can share that learning and share those resources with the team and we can have these robust conversations. So, what we were able to very successfully create was a culture of sharing and a culture of learning that was based on real-life practice, sharing of research evidence as well but sharing the load. So, if we think about that wave at the beginning from Ken, we don’t all have to manage that wave on our own.

We can actually all stand together and share the information and just take a little piece of it and really work through it. More recently in a university setting, we’ve taken that concept of a community of practice and tied it a bit more to formal training in the sector, so rather than doing a traditional face-to-face tutorial-style interaction, we’ve used this communities of practice model to look at, “Okay, how can we have highly skilled facilitators but how can we draw upon the collective practice knowledge around the room from a diverse range of practitioners to exchange evidence and to learn and reflect and influence each other’s practice?”

By coming together, I think it’s a simple, effective and relatively cost-effective way to influence one another and to support practitioners to take on that kind of review, reflect and implement practice. It’s also a way to build a culture within an organisation that can then feed up into policy and service design, but it’s something that practitioners can do very easily, drawing upon the huge wealth of information that we have as a collective sector rather than having to do it alone or in a really structured partnership with a research institute - which is lovely, but we’re not all fortunate enough to have those available to us at all times.

So, communities of practice for me have certainly been a way that I think I’ve had experience and success in that and it really has developed my practice and developed how I look at research and evidence from a whole range of different perspectives and how that’s embedded into practice but also service design and models of programs and things as well.

JOANNA SCHWARZMAN: I think that’s really interesting, Amanda. I like how - I think the way you explained it was that you had a common topic as a focus but everyone came from different perspectives, from theory, from practice and position in an organisation. So, I really like the ideas of sharing there, maybe across organisations as well which means you get to reflect in a different way. I did have a practical question. How often do you meet in community of practice?

AMANDA PATON: So, we established them originally monthly but what we found is that it spills out, out of the group. So, then you end up with ad hoc conversations or you might be having a challenging plan or an issue and you can draw on someone within the group, you can say, “I’m having this concern or this problem or I’m not quite sure where to go. Who’s read something recently on this issue that would be helpful or do you have a resource that we can exchange?” So, they kind of grow and they morph and they become really organic and self-determining which I think is the lovely thing about communities of practice.

They don’t have to be organisationally driven. It doesn’t have to be line management and it doesn’t have to be in policy or practice. It can be derived from the practitioner level and you can meet as often or as little as you like and I think in this virtual world that we’re now living in, we’re not hampered by, “Can we physically get to the one location?” and those type of things as well. So, that’s shown us a lot I think over the last couple of years too.

JOANNA SCHWARZMAN: Yeah, that’s a really good point. The transition to online has some benefits for those meetings and for creating networks. Thank you. I also really liked the starting point of the question when you were starting out: “Are we doing the right thing?” I think that’s probably a really common question for practitioners and managers and even organisations as a starting point when they’re using evidence, and really as a way in to thinking more about, ‘What do we need to know? Are we doing the right thing?’

I certainly know from my experience that’s been a big starting point. Beth, I was wondering if you had any examples to follow up with of work that you’ve been involved in that has helped overcome some of these challenges - maybe the challenges of time, or the challenges of embedding research evidence in an organisation. I know you’ve given us some really great examples already, but I wonder if there was anything else you wanted to add - things that support this process of improving or changing culture towards learning, sharing or using evidence.

BETH MCCANN: So, I think, building on what Amanda said, it’s about creating the spaces - communities of practice create spaces where people can share insight, knowledge and evidence in a way that’s really constructive, helping to build team environments around emerging evidence and practice - and that idea that you don’t have to do it all, that you can take one bit of knowledge and share that. Within Drummond Street, there’s been a real move, building on what we’ve found during COVID, to think about how the organisation can actually function going forward, and I guess we’re really lucky to have a management team that is really curious and really wants to bring in evidence and build evidence.

There’s been a lot of literature scoping that’s happened from our team in terms of what might work, what could we see working in terms of place-based interventions, how can we build on our integrated service response models or integrated program models but also then taking what we’re seeing to practitioners and having their input and insight as well. “So, these are some of the problems that we’re seeing. These are some bits of evidence that we’ve been pulling but what do you think would work on the ground? How do we create the processes and systems and structures which will actually help to enable the incorporation of evidence within the work and how do we really enhance the voice of community?” and I think that’s a really critical thing that we really want to do all the time.

So, we work with marginalised communities; we don’t want to take models that are not appropriate for them and try to implement them. We want to know what is actually going to work to support them. So, when we’re scoping models and scoping evidence, we’re always looking for evidence that has worked and been used in context with different communities. It’s not just taking norms developed with a white population in the city and trying to apply them in a regional setting, for example. So, what has worked with diverse communities? What are some ways that we can manage risks in organisations? If we’re bringing in a new model, how do we actually set up systems and structures to make sure that people who engage with that new model and service remain safe?

So, even thinking about our structures and supervision processes, so can we have a practice lead for example who oversees the new program and the work being carried out by different practitioners so that that process can help us to build evidence as well? Is the program working? Are the types of evidence we’re bringing in appropriate within the context within which we’re trying to import it?

So, I guess for me, it’s about bringing in evidence but it’s also about creating space to reflect on, “Is this the best evidence that we have brought in and does it work in this context?” and having management that are curious to know if it works and if it doesn’t work, how can we improve it or do we need to change and try something else because this approach is not working? I think creating space and having the environment for curiosity is a really great way to start to be able to find systems and structures that can work within your organisation and within your particular field.

JOANNA SCHWARZMAN: I love that word ‘curiosity.’ I think it makes it a little bit more I don’t know, easier to grasp than talking about evidence or using evidence but at the heart of it, it is a curiosity and a desire to improve or willingness to improve, learn and improve so that’s really interesting to hear. The idea of creating space and I heard some examples of creating space by giving it to someone in their role or giving it to - or embedding it as part of a process, that time for reflection and I think they’re really interesting examples of how it can be implemented within an organisation. I’m just noting the time and we do want to make sure we get questions. We want to get to some of our questions from our audience.

I actually have a great question which segues here for Ken. One of the questions - we got a few from registration and they’re starting to come through now as well - is building on from Amanda and Beth, this idea that there are challenges at that broader level, there are challenges in our funding systems and there are challenges in the way we have to report or the big drivers at the really high-level policy levels. I think you might be able to talk to this a little bit here. What are some of the factors that might be at play to support or challenge the use of research evidence more broadly in our sectors, in our fields?

KEN KNIGHT: Thanks, Jo. I have to say I wholeheartedly agree with Amanda and Beth and all of their really insightful comments and reflections. In my work, the question for me is: how are we as a community - a research community, a practice community, a policy community with an interest in improving outcomes for children and families - developing an integrated culture that brings this together in a meaningful way? Within our organisations, that’s one approach, and within our own practice, that’s another, but thinking system-wide and in terms of system-wide reform is necessary, and to what extent can we each, in our roles and as organisations, advocate with funders around expectations for doing this work?

And touching on something that Amanda said around organisational culture - this is fundamental. We have found in our work that you can’t really approach this in a meaningful way without it, and tied up with influencing organisational culture are those expectations and drivers and all of the subsequent incentives and disincentives. So, within research contexts, there is a pressure to publish in peer-reviewed journals, which produces that tsunami of evidence that probably no one except other researchers can readily engage with - and probably other researchers aren’t really engaging with that evidence either - so how do we bring that together?

And I think to turn that question around a little bit I think and to follow on with what Amanda was saying about communities of practice, that has certainly been the approach that we’re taking in the health and medical research sector that we do need a networked approach to share and learn but also to importantly fail because we will trip over when we’re doing this work.

This is complex as I’ve acknowledged a few times but one of the key ways of learning is having the safe space to not always succeed and to be able to be open and honest about that with each other and with our funders and with our communities but importantly into Beth’s point about bringing people together, so one of the other areas that I’m working with across Melbourne Children’s, so the children’s hospital, the Murdoch Children’s Research Institute and the University of Melbourne, is building capacity for coproduction and codesign of research but also of policy and practice.

So, we’re making sure that from the outset, we are bringing together all of those different custodians of knowledge, of lived experience and of different processes to address these challenges so that the research questions that we ask are relevant and appropriate, are likely to inform practice but are also informed by the most pressing policy and practice needs. So, I think that there is a whole lot there that we could unpack and this is just the beginning but I also think I’ve got a provocation for you maybe.

This notion of community of practice, we kind of have that here with the CFCA webinars. There is this kind of group of practitioners and professionals across Australia who are wanting to develop in this space and I wonder whether there’s an opportunity here in this online space to expand it. We may have our own smaller communities of practice but I think there’s a big opportunity here potentially.

JOANNA SCHWARZMAN: Challenge accepted. Ken, the network element is really important. I think like you were saying, it feeds on both what Amanda and Beth were saying about the importance of networks and having time for sharing and I really liked the idea of maybe improving or developing that - I don’t think you used these words but - permission to fail in this process and I think - my terms that I use a lot are learning and improvement, so we’re learning and people, when they’re learning, they make mistakes. So, I think including that in the way we work and not being afraid to fail and know that we’re trying to improve based on what information we have, then that’s a really nice way to approach it.

We are getting lots of questions coming through. I think I’m just going to jump around a little bit and - actually, no. I think I might pick up on some of the ones around culture because I think almost everyone has mentioned culture and we’ve got a couple of questions because culture can be a little bit nebulous, we’ve had some great examples about building culture, so starting by doing, starting by sharing and then informing and letting that filter up into different parts of the organisation or with Beth’s example of having really curious management which filters down through the organisation. I might put to the whole panel, what are some simple tips or strategies for teams to build an evidence-informed culture? Does anyone have any other tips they’d like to add?

AMANDA PATON: I think I might start by sharing from a manager and program manager level. When we develop or design a program and receive funding for a program, we have to produce outcomes. In a very simplistic way, certainly in the child protection sector, we’re moving much more towards, “If you can’t show outcomes and demonstrate that what you’re doing is actually making a genuine impact and difference, then you won’t get refunded, or you certainly won’t get seed funding for new, innovative approaches or to try something new.”

So, there’s a massive drive there, there’s a massive push and that should be a huge motivating factor which could start at that middle management level and filter up or filter down or even start from a board or chief executive and executive leadership level and filter down through the organisation but there’s a significant amount of codesign that’s now going on around outcomes and how do we look at moving away from outputs and widgets and counting how many sessions we do with a client and genuinely look at what we’re actually doing and talk to our clients?

I use the word ‘clients’ very broadly here but how can we talk to our service users and see, “Is that thing that we’re actually doing making a difference? And if it is, let’s capture it and let’s talk about it and we can report it.” Funding and growth should be a huge motivating factor for organisations at that executive level but we can push that as program managers in terms of pushing it up and pushing it down because if I can demonstrate that my team is under-resourced and overservicing or there’s not enough of their FTE, I can make a really good case for additional funding or additional supports.

So, I kind of see that motivator of money if you like, to put it bluntly, as an excellent motivator up and down in an organisation to create that culture. It’s a necessary part of funding at the moment.

JOANNA SCHWARZMAN: It’s a good point, Amanda and a lot of - it does seem to make sense to start with, what you have to do. Start with what you’ve got and start with what’s mandated. I like the idea that things are being increasingly codesigned to make more sense, they’d be more meaningful but if it is something that you have to report on, why not use it as a starting point to embed those processes, systems, culture within an organisation and build on it from there? Great example. Did anyone else want to add any other tips for building culture? I do have a similar question I might ask in a moment if not. No? Okay.

BETH MCCANN: I think for me, it’s about finding your champions. So, there’s always champions within an organisation and there are always people who see the value of bringing in evidence-based practice or evidence-informed practice so I think if we can find those champions and let them do the work or let them lead the work, that’s a really good way to get it, to kick it off.

KEN KNIGHT: I was just going to follow up and say that I think that top-down, bottom-up approach is really the only feasible way. There does need to be leadership in this space. It does need to be driven by motivated, enabled workers on the ground but I think there is that - to Amanda’s point around resourcing - is that we do need to build capacity to do this well and we do need to invest in upskilling and training. We can’t just say, “This is a great idea, now go,” we have to have a structured way of enabling this important work as well.

JOANNA SCHWARZMAN: Yeah, really important points. I’m going to ask a similar question because I think it builds on what we’ve been talking about nicely. Does anyone on the panel have advice on how to persuade peers to embrace data and embed research findings in operations? We’ve started talking about that but what are your most persuasive lines that you use maybe?

BETH MCCANN: I think bribery always helps.


KEN KNIGHT: I’m thinking about incentives - I mentioned incentives briefly, and they happen at the organisational level as well as the funding level. In health and medical research, there are a couple of incentives that we’ve been working towards. It’s sad that you have to go straight to the carrot and the stick - I think some people are inherently, intrinsically driven and see the value, so showcasing the value is an important thing - but also, in terms of promotion criteria for researchers, making sure that they are working to codesign, codevelop, coproduce and translate their work is something which is important.

But then the other thing is that in health and medical research, to be funded, you need to show that you have a translational pathway: that you will engage with lived experience representatives, and that you will be working with other practitioners or the community to ensure that what you discover will not just sit on a shelf or in a peer-reviewed journal article. So, I think a multipronged, multimodal approach is probably the best.


AMANDA PATON: I think a practice example I have comes kind of out of Ken’s point about permission to fail and not seeing failure as a failure but it’s an opportunity to learn and I think working with practitioners for many years when there is - when they’re struggling with a client or a presentation or are a bit stuck, actually supporting them at a very face level to actually return to the data, look at the evidence, look at the symptom tracking so it’s a very granular level but using that stuck-ness and not knowing where to go, using that as a point to actually go, “Let’s do something a bit different. Let’s look to the data in terms of how that client is actually tracking over time. Let’s look to the research evidence. Let’s look to other experts in the field to get some ideas.” So, that I think at a very micro level is what I would use to influence in that space.

JOANNA SCHWARZMAN: Yeah, nice one. Okay, so another question we’ve got here is - I think this one is for you, Ken. Someone would appreciate hearing thinking on the differences between evidence-based practice and practice-based evidence. Are you able to talk to that one for a minute please, Ken?

KEN KNIGHT: Yeah, definitely - it’s a great question, and my response is going to be a simple one that harks back to those different types of knowledge and that evidence-informed approach I put up on my slide. I think it really just depends on where we sit - what kind of custodian of knowledge are we? If we’re a practitioner, then we’re probably going to be looking to ensure that our practice, our decision making and our activities are informed by the best available research evidence and what we know from lived experience representatives. So, I think it really is just about shifting our perspective based on where we sit and which kind of custodian of knowledge we are at any particular point.

JOANNA SCHWARZMAN: Okay, nice one. This is a good, straightforward one. Beth, everyone is quite excited. What interactive platform did you use to present the data and the report that you were talking about?

BETH MCCANN: So, the platform is called Genially and it’s just one of the platforms that we found online. There are a number of them. This was the cheapest. We’re a not-for-profit so that fit in well with us. It is a little bit fiddly when you first start to use it. The senior research officer who pulled it all together originally does sometimes have a twitching left eye when we bring up, “Can we do another Genially report?” but once you get your head around it, it’s quite easy to use and navigate, particularly for those reading the report. So, the platform was Genially but have a Google and look at other platforms as well if you don’t feel like you’ve got tech-savvy people in the team and want to go with one that’s a little bit more straightforward.

JOANNA SCHWARZMAN: I actually had never thought there’d be multiple platforms. I’m sure that of course there are but things out there to do interactive reports. Thanks so much, Beth. This is a really good question that’s come through as well. I think we might all be able to add something to this one. How might practitioners navigate the conflicts that can arise when research takes us in different directions? And it sounds like a bit of a complex one. Amanda, do you want to start with that?

AMANDA PATON: Yeah. I think the first thing is to really acknowledge the tension that sometimes exists between practice and research. When we think about research - if I put a practitioner hat on - quite often we think about research in a clinical setting on different models or approaches or things that we should implement with a whole range of different concerns within community and within families. But quite often, the research is done in a very pure setting, not in a real-life context, because we screen out clients who have comorbid presentations, or who don’t quite fit, or who are from minority or specialty groups that need a particular focus.

So, what we end up with is something that doesn’t look like our setting that we work in, so I think as practitioners first of all, we need to be able to critically reflect and evaluate the research that we’re looking at and then really acknowledge the tension that exists. So, for example if I think about some of my earlier work looking at things like trauma-focused CBT for example and this is a very specific example but it can apply to a whole range of different settings, trauma-focused CBT and EMDR are two therapies that are really well evidenced in the literature as appropriate treatments for child trauma and child abuse and complex trauma but EMDR for example is very expensive to actually train people in so it might not work for all settings.

Trauma-focused CBT relies on a high cognitive capacity and can be really difficult if you’ve got children and families who are currently still within trauma and within complex environments. EMDR needs a supportive, secure base and caregiver for treatment to be able to be used. A lot of the families I work with don’t have that. They’re in the care system or the carers are changing or Mum and Dad are managing a whole bunch of stuff as well. So, there’s a real disconnect sometimes between what the research says in its purest form around what’s an evidence-based practice and what’s actually going to work in the real-life context.

So, as practitioners, the first thing that we can do I think is critically evaluate it and acknowledge that tension and if we can talk about that tension and we can acknowledge it, we can look to how we might need to adapt things without interrupting the fidelity of the model which is a whole other webinar, but what pieces we actually need to think about when we apply that evidence-based practice in a real-life situation and where the research might not reflect that I think is the first thing.

JOANNA SCHWARZMAN: Yeah, it’s a good point and I think it really goes back to your example about sharing with others when you’re reflecting on what’s been presented in the research but you’re also putting those ideas to others in practice and you might be in different positions and different perspectives to interpret it for your setting.

AMANDA PATON: Yes, definitely.

JOANNA SCHWARZMAN: Did anyone else want to add anything on this question? How might practitioners navigate the conflicts? Beth, yeah?

BETH MCCANN: I think it’s important as well to think about how the practitioners can then feedback to research as well. So, it’s not a linear process that goes top-down. We need to think about how we can continue to - if it’s not working within a particular context or if it’s not the right evidence base to be incorporating, how does that feedback into research, how does it feed back into the ways that we’re delivering services, the ways that service is being funded, both in terms of policy advocacy but also research advocacy?

I think there’s a real ability there if we do talk about the complexity and the issues for research and practice to work better together so that we can come up with frameworks that do actually work within a particular context that can be seen as best practice because best practice is only best practice if it works within the environments within which it’s working.

JOANNA SCHWARZMAN: That’s a really good point and you make me think of Ken’s slide on the cycle rather than the linear, so the knowledge to action cycle that involves providing feedback to researchers, definitely. Did you want to add anything there, Ken, on the cycle or on anything else?

KEN KNIGHT: Well, only to reflect back on that cycle and say it needs to be multidirectional because I do work with researchers who say exactly the same thing but from the research angle, so I think we all need to work and I guess find new ways potentially of working better together to ensure that research is informed by practice and that practice is informed by research and those connections are well facilitated and we have the capacity to do that and enable that in a productive and meaningful way.

JOANNA SCHWARZMAN: Fantastic. I wanted to finish with the panel by asking what’s one piece of advice or what’s one tip that you’d like to give our audience that can help them feel like there’s something they can do, something manageable that they can do maybe tomorrow, maybe at some other point, to start incorporating research evidence into decision making? Am I able to go alphabetical? Amanda, please.

AMANDA PATON: Sure. I think my advice would be to start small and start early and I think establish your own communities of practice. So, everyone now online could reach out to a few networks, doesn’t have to be within the same org, doesn’t even have to be within the same discipline but in a general kind of connection, make a time to meet and share cases, successes and barriers and share training and your experience and be open and really adopt that process of reviewing, reflecting and implementing with your peers and share one of those 2,000 pieces that you might have read but do that tomorrow. It’s an active step that I think people can do very simply.

JOANNA SCHWARZMAN: Thanks, Amanda. Beth please, what’s your top tip?

BETH MCCANN: I think finding what will work for your organisation. So, who works for the organisation? What steps are already in place? What already exists? So, how can you leverage off existing communities of practice for example, existing groups that are meeting? Do you need to get everybody together and buy lunch and go through synthesised research and have sessions set up?

Does bribery actually need to be part of it to get your practitioners along or is it a good way to start and then get that great buy-in from people who then want to support and continue to move and grow? So, I think your organisations will be different. It might be needed at that management level. It might be needed at a practitioner level. What’s going to work for you? Have a think about it.

JOANNA SCHWARZMAN: Thanks, Beth. Ken, what’s your final top tip for people?

KEN KNIGHT: I completely agree with Amanda and Beth but I feel like my mantra in this work is, “The perfect is the enemy of the good,” so iterative approaches are the only approaches that will be effective. You’re not going to be able to transform your practice or the organisation overnight but in terms of next steps, I would encourage everyone to acquaint themselves with the accessible information out there around these processes.

There is a wealth of information on the CFCA website and we can share some additional tools and resources if people on the webinar have particular questions around key issues but I also think that each of you will be a custodian of one of those types of knowledge that we talked about and probably multiple types of those knowledge and I guess to maybe paraphrase Amanda’s perspective, I would say what you could do after this is find another custodian of a different type of knowledge and begin a conversation and through that process, you will begin to increase your understanding and their understanding and take the first steps towards an integrated change.

JOANNA SCHWARZMAN: Amazing, thanks so much, Ken. Thanks, Amanda. Thanks, Beth. I feel like we’ve only just scratched the surface. Thank you so much for sharing your insights and thank you everyone for attending and sending in your questions. I think we might be at the end of our time here, so thank you so much for a really great discussion. It’s been a fabulous one. We’ve covered lots and at the same time, there’s much more to be covered. I think we might just jump off now. Thanks, everyone for attending and I’ll let everyone say goodbye.

AMANDA PATON: Thanks for having us.

KEN KNIGHT: Thanks, everyone.

BETH MCCANN: Thanks, bye.




The transcript is provided for information purposes only and is provided on the basis that all persons accessing the transcript undertake responsibility for assessing the relevance and accuracy of its content. Before using the material contained in the transcript, the permission of the relevant presenter should be obtained.

The Commonwealth of Australia, represented by the Australian Institute of Family Studies (AIFS), is not responsible for, and makes no representations in relation to, the accuracy of this transcript. AIFS does not accept any liability to any person for the content (or the use of such content) included in the transcript. The transcript may include or summarise views, standards or recommendations of third parties. The inclusion of such material is not an endorsement by AIFS of that material; nor does it indicate a commitment by AIFS to any particular course of action.

Slide outline

1. https://www.saragironicarnevale.com/


Illustrations for Science magazine for an article about the case of “Sato’s papers”, one of the biggest frauds in the scientific field.

Article available at https://www.science.org/content/article/researcher-center-epic-fraud-remains-enigma-those-who-exposed-him

2. What is knowledge?

Alt text: Silhouette of a head and outline of brain with the words "What is knowledge?"

  • What do we mean?
  • How is it known?
  • Who needs it?
  • Whose job is it to translate it?

3. Types of knowledge

  • Research knowledge: held by researchers
  • Practice knowledge: held by practitioners
  • Experiential knowledge: held by parents and communities
  • Organisational knowledge: held by service system organisers
  • Policy knowledge: held by policy makers

4. Graphic showing Evidence-informed approach to practice.

  • Research - currently best available research and evaluation
  • People - Children, families and other users
  • Practice - practitioner knowledge and skills

5. Knowledge creation: knowledge inquiry, synthesis, products/tools; tailoring knowledge. Action cycle:

  • Identify problem; identify, review, select knowledge
  • Adapt knowledge to local context
  • Assess barriers to knowledge use
  • Select, tailor, implement interventions
  • Monitor knowledge use
  • Evaluate outcomes
  • Sustain knowledge use

Webinar questions and answers

Questions answered during presenter Q&A

To view the presenter Q&A, go to 46:26 in the recording

  1. What are some of the factors that might be at play to support or challenge the use of research evidence more broadly in our sectors, in our fields?
  2. What are some simple tips or strategies for teams to build an evidence-informed culture?
  3. What is the difference between evidence-based and practice-based evidence?
  4. What interactive platform did you use to present the data and the report that you were talking about?
  5. How might practitioners navigate the conflicts that can arise when research takes us in different directions?
  6. What is one tip that you would like to give our audience, something manageable that they can do tomorrow, to start incorporating research evidence into decision making? 


Ken Knight | Research Impact Manager, Murdoch Children's Research Institute

Research Impact Manager at the Murdoch Children's Research Institute (MCRI) and Honorary Fellow, Department of Paediatrics at The University of Melbourne.

I currently lead an award-winning program, teach approaches to Knowledge Translation, implementation and research impact, and was part of the group that developed the first Research Impact Framework for the Australian health and medical research sector. Before working at MCRI and RCH I was Co-Manager of CFCA at AIFS - it's absolutely brilliant to return for this webinar! I'm passionate about partnering with diverse stakeholders to enable greater impact and establishing and evaluating models and processes that enable research evidence to be useful and utilised.

I am looking forward to sharing my ideas and experience and learning from the experience of others - this is complex work that we can only meaningfully advance as a collective.

Amanda Paton | Deputy Director Practice, Australian Centre for Child Protection

Clinical Psychologist and Deputy Director Practice for the Australian Centre for Child Protection, University of South Australia

My role centres on bridging the gap between research and practice in child protection. Having worked for many years in the community services sector, leading large multidisciplinary teams and working alongside government, I have first-hand experience, as both a practitioner and a manager, of using research to inform practice and service design.

I am looking forward to sharing my successes and failures (harsh learnings!) in this area.

Beth McCann | General Manager, Centre for Family Research and Evaluation

General Manager of the Centre for Family Research and Evaluation, Drummond Street Services

I work alongside an amazing team to support Drummond Street in using an evidence-based management framework to integrate research into practice and to capture learnings from practice to build knowledge. We work with organisations around Australia to promote the uptake of evidence-informed practice. This has given me insight into the barriers to, and enablers of, integrating evidence-based practice and decision making within organisations.

I am looking forward to discussing this topic and gaining new insights and perspectives.


Joanna Schwarzman | Research Fellow, Child and Family Evidence

Research Fellow, Child and Family Evidence team, AIFS

My career interests have focused on generating and using evidence in practice to more effectively address population health challenges. Over recent years, I have worked in government, non-government and research roles to help identify and overcome the challenges to evaluation and using evidence to inform program planning, decision making and organisational learning.

I am looking forward to hearing examples of how different practitioners and organisations are overcoming the challenges of embedding research evidence into their work.


Featured image: © GettyImages/monkeybusinessimages