Using data to inform therapeutic responses to child sexual abuse

Content type
Webinar
Event date

24 November 2021, 1:00 pm to 2:00 pm (AEST)

Presenters

James Herbert, Amanda Paton, Antonia Quadara

Location

Online

About this webinar

This webinar was held on Wednesday, 24 November 2021.

Therapy for children disclosing sexual abuse is important for addressing the effects of trauma and the potentially lifelong impacts of abuse. Identifying factors that influence engagement or completion of therapy allows services in criminal justice, child protection, community support and mental health systems to make informed decisions about approaches to therapy. However, there is limited research on these factors, highlighting an opportunity for data collection at the local level to contribute to informing service and program design and delivery.

This webinar explored ways in which data from research and practice can inform programs and services responding to child sexual abuse. Specifically, it:

  • outlined what the research evidence says on what influences therapy engagement and completion following a disclosure of child sexual abuse
  • explored the role of referral processes and service practices in the completion of therapy
  • described how data from research and practice can be used to inform program and service model design.

This webinar is of interest to professionals working in child protection, child and family support and mental health, particularly program and practice managers in services supporting children and families after a disclosure of child sexual abuse.

This webinar builds on two CFCA papers by presenter James Herbert.

Audio transcript (edited)

ANTONIA QUADARA: Welcome, everyone, to today's webinar on Using data to inform therapeutic responses to child sexual abuse. My name's Antonia Quadara and I'm a Senior Research Fellow here at the Australian Institute of Family Studies. I'd like to start with an acknowledgement of country, acknowledging the Bunurong and Wurundjeri people, traditional custodians of the land on which I'm speaking to you in Melbourne. I also pay my respects to Elders past, present and emerging of the Kulin Nation, and extend that respect to other Elders and Indigenous Australians who are attending this webinar.

Today we are talking about practice data, research and information, and how they can contribute to better outcomes for child victims of child sexual abuse. When we're talking about practice data, this could be service administrative data, client outcomes data or evaluation data. The question is, what role can information and data play in improving service outcomes and service design, and what is the important role that practitioners and service providers play in collecting this data? That applies whether you're working specifically as a therapist, therapeutically with victim survivors of child sexual abuse, or across the broader service systems that interact with victims and their families, like Child Protection, community support services, criminal justice and the mental health systems.

And we're also going to be talking about the barriers, challenges and opportunities for data-driven practice in responding to child sexual abuse, and we'll talk a little bit about what we mean by data before kicking that off. Joining me are two people who have been wrestling with these issues and their tensions on a daily basis: Dr James Herbert and Amanda Paton. Before introducing our presenters, I'd like to acknowledge the questions that people have sent in and the breadth of things that people are interested in, particularly wanting to know more about things like how to respond to disclosures of child sexual abuse and how to support non-offending family members. We won't be covering these issues today; however, they are vitally important ones, and we've gathered together a suite of resources around responding to disclosures, and other research that you can access, to provide some of this information.

So to our presenters. I'd like to introduce Dr James Herbert, who is a Senior Research Fellow at the Australian Centre for Child Protection at the University of South Australia. James is a social researcher who has been looking at barriers to therapeutic engagement for child victim survivors of sexual abuse, and at effective deliberation procedures for multidisciplinary team case review. It's researchers like James who rely heavily on program data and service data to help inform their work. Welcome, James.

JAMES HERBERT: Antonia, how are you?

ANTONIA QUADARA: Good, good. You're coming to us from Adelaide?

JAMES HERBERT: Perth. Westopia. Yeah, thanks so much for the invite. Really excited to be talking about the topic. It's a really ambitious title, so hopefully we can deliver on it. And maybe more conservatively I'd have gone with the title of, “I would really like to use data to inform therapeutic responses to sexual abuse” – but you've got to sell the sizzle.

ANTONIA QUADARA: I'm sure we will, James. I'd also like to welcome Amanda Paton, who's the Deputy Director of Practice at the Australian Centre for Child Protection at the University of South Australia, and who also works with James. Amanda has worked as a clinical psychologist, and oversees the research and provides policy advice to the Department of Communities Royal Commission Team and Specialist Child Protection Unit. So Amanda sits right at the intersection of where research meets practice, and brings real insight to the challenges and opportunities at that crossover point. Welcome, Amanda.

AMANDA PATON: Hi, Antonia. Thanks for having us.

ANTONIA QUADARA: Pleasure. I'm not going to hazard a guess about where you are because I think you were going to be one place and now you're not and I was already wrong with James, so we won't do that.

AMANDA PATON: I'm in WA. I'm in Perth as well. The border changes didn't like me this week, so no Adelaide for me.

ANTONIA QUADARA: Okay, so now we're getting into the nuts and bolts, where there will be sizzle, James. There will be sizzle. Before we dive into the discussion about using data and research in practice and in service planning and design, let's start by setting the scene, starting with you, James, around the research that you've been doing and that you've published for CFCA. So let's start, I guess, with a description, and then some of the challenges that you encountered. What does the evidence tell us about children accessing therapeutic services in the first instance, or getting referred to them, and about engagement and completion? What do we know about the uptake, if you like, and about the factors that influence engaging in that therapeutic process and staying engaged?

JAMES HERBERT: Yeah, cool.

ANTONIA QUADARA: Yeah, speak to that.

JAMES HERBERT: So I guess the starting point, what sort of frames some of my thinking around this, was really working on multiagency responses for the last couple of years. Which is, I guess, the meshing together of all these different disciplines and agencies that really need to be involved when child sexual abuse is occurring. And specifically working in child advocacy centre research, both in the US and in Australia. The way they think about the connections with therapy is that they're doing an integrated response. So referral and engagement with therapy is one of the key outcomes that they're really monitoring as part of the holistic response, joined in with, I guess, the police and child protection forensic and interviewing response. So there's a lot of research on child advocacy centres, and I guess what I really found was that it really emphasised the justice aspects. And there's a lot of research primarily about disclosures and interviewing practice, things like that.

And one of the gaps I really found was the contribution of that warm referral – of having that child and family advocate that's there doing that work, addressing barriers to accessing services, that sort of thing. So that got me thinking about how you might study this in an Australian context. As preparation for that, I undertook a systematic search of the literature. I found about 4,000 articles, which I then narrowed down to 49. Then I did some fancy meta-analysis stuff to be able to take similar studies and lump them together, in order to arrive at some rates that tell us something about the rate at which children engage with and complete therapy services, as well as the factors that influence those rates.
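
[Editor's note: as a rough illustration of what 'lumping similar studies together' means here, below is a minimal sketch of pooling engagement rates across studies. The counts are invented, and real meta-analysis typically uses inverse-variance or random-effects models on transformed proportions rather than this simple sum.]

```python
# Minimal sketch of pooling engagement rates across similar studies.
# The (engaged, followed) counts are hypothetical illustrations only.
studies = [
    (45, 150),   # study 1: 45 of 150 followed-up children reached a service
    (126, 210),  # study 2
    (40, 120),   # study 3
]

engaged = sum(e for e, _ in studies)
followed = sum(n for _, n in studies)
pooled_rate = engaged / followed  # sample-size-weighted pooled proportion

print(f"Pooled engagement rate: {pooled_rate:.0%} ({engaged}/{followed})")
```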

So I guess the main bits that we found – and I understand you were sent the link to the papers – start with the full sample. Of all the kids that got interviewed and were sent off into the world, in studies that were following them, only about 30% were really making it to services. Where there were specific referrals – so police or Child Protection had made a specific referral, here's a piece of paper, go to this service – they were showing up about 60% of the time. Whereas families initiating contact and going through a waitlist and an eligibility process were actually commencing the service about 80% of the time. For completions, we separated out the really controlled experimental studies, where you've got to meet the criteria to be in the trial; they're completing at a rate of about 74%. Whereas in services much more out in the community, with much less structuring and control around eligibility, about 59% complete.

So what this all adds up to is that, for children who are sent off to services, anywhere between 18% and 35% are actually getting to the point of completing those services. Those are the ones that are potentially getting the benefit of these services – higher, of course, if the family is initiating services. And yeah, it just got me thinking: in that pipeline there's a lot of holes. There's a lot of bits dripping out. As well, these are primarily US studies, so they reflect primarily child advocacy centres that are doing these really coherent, warm-referral-type responses, and not all jurisdictions in Australia have that. So potentially our rates could be worse, or they could not be. But I guess more to the point is, we don't quite know, because there's not a lot of Australian research on this that takes into account our context and our situation.
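
[Editor's note: the 18% to 35% range follows from chaining the pooled rates: the share of children reaching a service multiplied by the share completing it. A minimal sketch of that arithmetic, using the rates quoted above:]

```python
# Pipeline arithmetic behind the "18% to 35%" figure quoted above:
# share reaching a service x share completing it, in community settings.
engagement_full_sample = 0.30  # all interviewed children who were followed up
engagement_referral = 0.60     # specific police / child protection referral
completion_community = 0.59    # completion rate in community (non-trial) services

low = engagement_full_sample * completion_community   # ~18%
high = engagement_referral * completion_community     # ~35%

print(f"Estimated share completing therapy: {low:.0%} to {high:.0%}")
```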

So I guess that's what got me fired up about this topic. And thinking about it. Did you want to say something? I didn't want to monopolise the mic.

ANTONIA QUADARA: Oh no, that's fine. I'll talk to Amanda in a tick, but you said that there are a lot of holes in that pipeline. Can you just expand on that? Is that research holes, data holes, service holes? What are we talking about here?

JAMES HERBERT: Yeah, I guess from working on and studying multiagency systems, I'm thinking about it all as one system. And those holes – really what we're talking about is the systems of referral and intake to those services. So in Australian jurisdictions, you might be getting a piece of paper with a list of services, and maybe the paper is up to date, maybe it's not. Maybe it has the proper eligibility criteria and maybe it doesn't. If you're in a jurisdiction with a really well-developed connection between the therapeutic side and the investigative side, you might be getting a child and family advocate or an advocate counsellor that is sort of walking with you through that, and, I guess, addressing some of the barriers you might have. Whether they be attitudinal – reluctance to seek mental health care for your children because of your own negative experiences with mental health systems – all sorts of things.

So I guess what I mean by those leakages is that we don't have a clear idea about how some of those different systems and processes are affecting the accessibility of these services. And those leaks – I guess what I'm imagining, and what I've seen in other jurisdictions, particularly in the US where they've studied and done interventions to try and address some of these barriers, is that there are often really simple things that can be done. Whether it just be someone going with the family to establish a new therapeutic relationship, or making it less intimidating to access some of those things. My sense is that there are really easy fixes – little things that are just to do with system coherency. And I would love to see data taking a really clear role in helping to diagnose where some of these leakages are, as well as working with the sector and victim survivors to generate solutions to address some of those barriers.

ANTONIA QUADARA: Okay. So you've also covered off a little on the challenges and limitations in the data that helps us understand the pathways, if you like, into therapeutic services. Amanda, I do have some specific questions for you around what data we're actually talking about – what we mean by that. But I wondered if you had any reflections on what James has been talking about from an Australian context, noting that the research literature he was drawing on is often quite US-heavy.

AMANDA PATON: Yeah. I think that we have – you know, the obvious one is our geographical spread. And that's hugely problematic in Australia, and an ongoing issue. And we have a sparsity of population as well in certain areas. So that makes the economies of scale for service design and implementation really challenging from a policy development and design perspective. One thing, though, picking up from what James said, is that we know it makes sense having someone support families from the point of forensic review and assessment, and Child Protection, through to therapeutic support. We know, when we think about children and families who have experienced child sexual abuse, the acute trauma that they're going through even just by engaging in that disclosure process. So it makes sense having that support.

But what we don't have is data that backs up why we should invest financially in that from a design perspective. It's a really costly type of model, having support services bridging that gap and plugging those holes. And if we don't have the data to say, yes, that actually makes a difference, it's a very hard case to make to government or to Treasury that it's needed, and that it will make a short-term and a long-term impact, not just for children and families but for the broader systems as well, in terms of, maybe, justice or policing and Child Protection. So you need data to be able to demonstrate what we kind of know, from a therapeutic perspective, is going to help.

ANTONIA QUADARA: Thanks for that, Amanda. When we're talking about not having that data, and about using data and research in practice – and the description of this webinar talked about data-driven practice and service design – I just want to spend a moment at the beginning talking about what we mean by data and research in the first place. In your minds, James and Amanda, what are the different types of data and research that we're actually talking about? And Amanda, I might stay with you on this for the moment. What are the types of research and data that are relevant to the kinds of challenges or issues that you just spoke to? As you say, there's this knowing, from a therapeutic and almost intuitive sense, of what supports children and families to connect to services and remain engaged, but needing to be able to demonstrate that. What's the stuff that matters, research and data-wise? And why is it important, from your perspective?

AMANDA PATON: I think for me, starting with the data side of it first, which really informs research, I think there's service-driven data, which is outputs, outcomes, and satisfaction type data. So consumer feedback type data. And then there's community data. So data in terms of mapping need, mapping trends, prevalence rates, those type of things. And so there's – I see it as really two distinct areas where data comes from. But you actually need the complete picture. If we're drawing a picture of a house and a family, we don't just have the walls. We need the roof and we need the details and we need the family as well as the picture of the house. You can't just look at certain little individual elements. It doesn't give you the entire overview.

But then in terms of research, I think then that's split again into kind of two areas of research. So we have the evaluation research – and whether that's specifically on treatment or certain programs, but that really tells us whether something is making a difference and how much of a difference it makes and in what context. And that type of research I think relies quite heavily on service-driven data. So on service providers capturing the outputs in terms of the widgets and how many people come through and how many sessions they receive. But then also the data in terms of the outcomes and consumer satisfaction and perception.

And then there's the research that really looks at what's going on in the community. Almost that kind of descriptive, empirical kind of research. And I think they're used very differently, both by practitioners and by policy makers and those that are deciding on what type of services to put where. But again, you need both to be able to actually make a significant impact and change within this space.

ANTONIA QUADARA: Okay. Great. I want to come back to what some of the challenges might be in using these two types of data as well as the service-driven data, but James, I'm going to ask you to weigh in here. Thinking about the different types of research and data, what else do you see as being important for informing practice improvement and service planning?

JAMES HERBERT: Yeah. I guess something that came up in some of the more recent work Amanda and I have done – and it's probably very germane to the papers that I just spoke about – is the difference between some of those really controlled clinical trials, the evidence base for particular interventions, and practice. When you have these studies, they're assessing the intervention essentially in the best possible case. They're screening for current domestic violence. They're screening for mental health. They're screening for things, because they just want to test whether the thing works. So you kind of have this set of evidence that's really about the best possible implementation of the intervention, the best possible evidence, but it doesn't really reflect how it's going to be rolled out in the community.

So you kind of have this evidence that says, hey, if you do this multi-million dollar trial with really high fidelity, with people that are video recording the sessions and giving you feedback, here's the effect you're going to get. But the reality is we're not going to get that. What we're going to get in the community – and that's a whole other area of study, which Amanda's mentioned, and the stuff where we really depend on the community sector in particular for data – is really understanding what these things look like in the community. And I guess that's some of the implementation piece, which is the difference between something that's really controlled and implemented in a gold standard fashion, and what you can deliver in the community to people that are in current distress. You can't find that sort of perfect sample within that. You've got to work with the people you've got. You've got to adapt the interventions you've got to work with those clients.

You might have someone that's – it says session one, you've got to do this, and you're like, this person is screaming at me. You've got to work with what you've got. And I think that's some of the challenges for practitioners, is the gap between the really high-quality evidence that gets lots of citations, researchers I worship or whatever, and then the people looking at, “Well, how do you actually implement this in the community without multi-million dollar implementation teams?” And luckily enough we've got someone on the panel that has really wrestled with that. And really thought through about one, how do you adapt some of these evidence-based interventions to work with the people you've got? How do you address some of the limitations for the evidence base around the types of target groups that you're seeing? And I guess, combine it with some of the different interventions and approaches that practitioners have in their toolbox.

ANTONIA QUADARA: Thanks, James. You've set me up perfectly to lean in on some of the things that Amanda had touched on earlier. And as you say, there's gold standard evidence. There's also what we understand therapeutically, what comes – you know, basically practice wisdom and practice insight around what supports families and children. And I think from looking at the questions as well that have come through, there is a real hunger and interest to know about what is best practice evidence, what does the evidence tell us, for example, about how to support non-offending family members or whether the timing of therapeutic interventions matters for the outcome. So I think we can agree that data and research are important, but what you've both touched on – and I want to spend – I just want to land on it a little while longer – is that that doesn't mean it's necessarily easy to actually apply in a real world, messy context, and particularly thinking about the specificities of Australia and our geography and all those sorts of things.

So Amanda, I'm interested in, from your perspective, given that you have, as I said at the outset, really sat at that intersection of bringing research and data that's [raw] service driven data, if you like, into play in order to improve service design and planning, what are the barriers and challenges for program managers, practitioners, both in generating that evidence, but also applying the gold standard or clinical experimental research? What are the barriers and challenges that you've identified? And what are the kind of questions that practitioners and program managers and service managers can be asking themselves and should be asking themselves about particularly the more settled evidence base, if you like, around evaluation and what works? Sorry, I'm notorious for the double-barrelled question.

AMANDA PATON: Yeah. I was just going to say, I'm going to hold over here for a sec the issue of the barriers of collecting and generating data, because I think that's hugely fraught. So I'll hold that. And if I miss it, poke me again and get back to me. But in terms of actually looking at – so once you have a clear idea around what your client group is, and for today we're talking about child sexual abuse, but child sexual abuse doesn't occur in isolation. So we have children, young people, and families where there's a multitude of other issues and concerns that often are occurring at the same time. They're hugely complex.

Now, if I was to look to the research and literature, which I started to do probably about 18-20 years ago when I started working in this area, I found some gold standard evidence-based practice and I thought, “This is great. I can get training in it. And it'll be fine and I can use this with my client.” And pretty quickly, even though I'd chosen an evidence-based practice that was a gold standard, pretty quickly I realised that that absolutely was not going to work with my clients. And so you start to look at things, and from lifting that up to a service design perspective – and I spent a long time managing a very large specialist psychology team. And one of the things was, well, how do we get training en masse for already very well-trained therapists, how do we support therapists who maybe don't have certain qualifications or experience? How do we pay for that training? And how do we pay for the resources that we need to actually implement that training?

Quite often in Australia, the training is not available locally. If you're in Western Australia, you're even more disadvantaged, because it's normally on the east coast. And then you need ongoing supervision and support and peer support and networks for those types of implementations and those models as well. And that's before you even look at the client group with which you're working. And so, as a practitioner looking at research, you need to really dig down into the context in which it was actually evidenced. And what you'll often see is that they screen out 90% of the clients that come through your door every day. So it's working only for a certain group of clients, in a very tight kind of context. And so as practitioners you then have to work out, “Okay, so what elements are going to work for my client? What elements aren't?”

Then you grapple with the issue of implementation drift. And if I change it too much and if I alter the way the manual says it works, if I tweak that too much and I don't maybe have the experience and the practice wisdom to do that, what impact is that actually having on the fidelity and on the outcomes that I might be able to expect for the children that I'm working with and the families. And so it's a hugely complex issue. And I think the only way to actually do it is to make sure that you have really clear data, so you really know the demographic of the clients that you're working with. And you know the complexities and the challenges that they're coming with. Which comes from a really thorough assessment that you can then translate into some type of en masse kind of number counting, if you like.

Which sounds quite crass, but you do need to – am I dealing with a service population where 40% also have domestic violence in their backgrounds? Or also experience disadvantage or school refusal? Is this a common kind of issue that I'm dealing with? So you need to have that data. And then you need to have the outcome data, because if we're tweaking evidence-based interventions and maybe not delivering them the way the manual says, then we need to be able to look at, is that still working? Am I still making a difference for children and families? And at the end of the day, that's the most critical bit. So I think you need to interrogate the research that's there, which is often actually very, very challenging because it's easy to get swept up in the glitz and the glamour of the names of people that are producing this research.

And also particularly in Australia, I think being swept up in the latest treatment that's coming over. And that's being marketed really well. And not saying it's not evidence-based and it might really work for a certain cohort of people, but you must ask yourself, is it going to work for the clients that you work with? And what data can you actually generate to support that? Which is really going to help in terms of your program reporting and funding and things like that as well.

ANTONIA QUADARA: So it's in part around asking, or as you say, interrogating I guess what are the assumptions that, say, a particular practice model – evidence-based practice model that's being evaluated, what are the assumptions it's actually got built within it in terms of who's been screened out, so therefore who is it working with? Do we actually know who it isn't working with? Because those individuals who are likely to form the bulk of children and families that people are actually working with are not part of that research.

AMANDA PATON: Yeah, yeah.

ANTONIA QUADARA: And there are other questions around whether the context this has actually been implemented in matches my context here. And it sounds to me like those are questions that, as you say, practitioners and service providers are completely on top of, because they're in a sense best placed to ask those questions and interrogate that research – they have the client community right in front of them, front of mind.

AMANDA PATON: Absolutely. Yeah. And we did some reviews a little while ago, and from a clinical practice perspective, so looking at – so the researchers did summaries on evidence-based practices, and then we got our clinicians to kind of go, “Okay, but how does this work in actuality? How does it work within your funding model? How does it work with the clients that come through? Can you get supervision and support? And what's the cost of that?” So it does have to be translated into a very practical level. And really, the practitioners are the only ones that can do that. They're the ones that have to take the manual or take the resource and try and implement it. So it's a challenge, I think.

ANTONIA QUADARA: So, James, you've been nodding. And at the beginning you talked about the limitations of data. From your perspective as a researcher, what do you see as being the biggest challenges in – we're talking about being data-driven. What do you see as being the key challenges that you've experienced for services and practitioners?

JAMES HERBERT: Well, it's going to be a little boring, because I mostly agree with Amanda. I was hoping that there'd be a few more sparks, but yeah, I think absolutely there are some real, I guess, present challenges. And thinking about the process of evidence moving into practice, we know it's a very slow one. It's a very indirect one. It happens at a snail's pace. I think as well, you don't really want decisions being made off the basis of a single paper. You want an accumulation of evidence informing the way things change, and sort of change in movements.

I think about resources like the California Evidence-Based Clearinghouse in the US, which synthesises evidence and tries to communicate it back to people in a really clear way: if you have this problem, here are some interventions; this one is rated this way, that one's rated that way. And there are really clear things. I'd love to see something like that in Australia that does that work, not only saying, “Hey, what's the evidence for it?” but, “How do you adapt it and apply it to an Australian context, taking into account our population, our workforce, and some of the really important cultural considerations as well?”

I think the other thing – and I'm thinking about evidence-based modalities – is that I don't know if you can just expect individuals or individual services to do this stuff. I think it takes a whole service system. It's one thing to say, “Hey, here are these modalities; take them and grab them and put them into your toolbox.” The other thing is actually resourcing and putting together evidence-based interventions. Instead, we say, “Hey, we're going to fund this service. We're not going to really determine the content. It's kind of up to you guys. If you want to do an evidence-based thing, cool.” That might be part of the criteria it's judged on, but ultimately the content is up to the service provider. And that puts them in a really challenging position: do you pay for workplace training? Do you pay to have someone implement and do fidelity checks for this really rigorous intervention? Or do you deliver what you know how to deliver?

And there's a sort of devolving of responsibility for the evidence base for some of these things, where governments are sort of saying to providers, “Hey, you guys are doing evidence-based stuff, right?” And then at the service level, they're like, “Hey, individual practitioner, you're using evidence-based practices, right?” And I just think it's the wrong way to go about it. I think it doesn't show a lot of faith and trust in the power of social interventions to actually change things. The idea is that you sort of squeeze services to deliver things with the cheapest bang for buck, without saying, “Hey, how do you actually get these outcomes effectively?” Sorry, I talked in a bit of a circle there. I was hoping to find some ground to disagree with Amanda on, but I think I agree with her on just about everything.

ANTONIA QUADARA: No, we're not here to disagree. We're here to explore the complexities. But just on that point around delivering evidence-based modalities – and I know from the questions we received, and from other research projects that I've worked on – there is real interest in what the best evidence-based practice is for working with victim survivors of child sexual abuse who, I don't know, have complex trauma. Just talking off the cuff.

And leaning in on your point, James, that it's a whole-of-service-system approach: if you're talking about the specialist services who might be doing the assessment work around where this young person is at – noting that they actually might have a whole range of other complexities going on, including learning difficulties or other sorts of things – in the absence of that information coming from other services it can be really hard to get a whole, rounded assessment. So sometimes there can be that risk – not a risk, but a kind of tendency – of people doing work and improving practice in isolation from each other, when the task, right back at the point earlier where we were talking about that pipeline, is a whole-of-service-system approach. So I just wanted to reflect that back.

And Amanda, I want to come back to you just on the service-driven data. It's something we've touched on in terms of reporting outputs and outcomes, length of engagement, and service episodes or episodes of service, and also, I guess, the broader service landscape and level of need. But sticking with what we might call administrative data – service administrative data – this is typically part of accountability reporting, and it can sometimes feel like a necessary evil that doesn't have much return for the service provider or the practitioners who are inputting that information. Is this true? Both things can be true at once, I think, sometimes. And from your perspective, what are the opportunities here? If it is a necessary evil, how can that information be used and leveraged to improve service responses, in your experience, for children who have experienced sexual abuse and for supporting their families? Where do the opportunities lie?

AMANDA PATON: When it's progress reporting time, everyone who's online I'm sure will groan. And we'll be coming up to a reporting period at the end of the year as well. And it can be kind of reporting for reporting's sake. I think we do sometimes wonder, as service providers, what happens to all that data? What is government in particular doing with that data? And we don't get feedback on aggregated data across the sector. So it can be really disheartening when stuff goes into the pot and information doesn't come back out. So that's one thing.

But if we stick with the data that we collect as a service or as an agency, it can really tell us a huge amount in terms of where we've been and where we're heading, in terms of trends. If I think of years and years ago, when we were looking at where to put new locations for centres specialising in child sexual abuse, we didn't just look at a map and go, “I really like that area and there's cheap accommodation. Let's go there.” We actually went to the data. So we looked at the data in terms of, well, where were our children and families coming from? What area? What geographical location? Where was the highest kind of ratio? Then looking at the community stuff in terms of where the train lines and bus lines are around the highest concentration of people, and can we look at putting a service in there?

We looked at, okay, we're actually getting more referrals for young people. Well, do we have therapists on staff who can actually work with young people? And do we have the right treatment options for young people? Do our child practitioners understand young people as a group well enough? So we come up with other things that we need to think about there. Are we getting more adults into the service with their own child sexual abuse history? And do we have the right skillset to manage that? Or if we don't, can we link in with other services in the area and provide warm referrals to them if we don't have capacity within our team?
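
[Editor's note: a minimal sketch of the kind of interrogation Amanda describes, using hypothetical records rather than any service's real data. Even simple counts by area and age group are enough to start the questions about locations and skill mix:]

```python
# Hypothetical referral records: counting by area and age group to ground
# decisions about where to locate a service and what skill mix is needed.
from collections import Counter

referrals = [
    {"area": "Suburb A", "age": 7},
    {"area": "Suburb A", "age": 15},
    {"area": "Suburb B", "age": 16},
    {"area": "Suburb A", "age": 9},
    {"area": "Suburb C", "age": 14},
]

by_area = Counter(r["area"] for r in referrals)
by_group = Counter(
    "young person (12+)" if r["age"] >= 12 else "child (under 12)"
    for r in referrals
)

print(by_area.most_common())  # demand by area: candidate service locations
print(by_group)               # children vs young people: treatment options needed
```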

So even looking at the very basics of that level of data – that type of output-driven data – can tell us things. Let's say we've got an increase in managing suicide risk, and we collect that data. If we can see an upward trend in the number of suicide risk assessments that our therapists are having to do every week or every month, we can actually trend that and graph that. But then we can go, okay, that's an increasing trend, but now let's actually look at the clinical response to it. Are our staff actually responding in the way that we would like to see them respond, according to evidence-based practice for that area? Do we have the correct kind of connections with health services and Child Protection services? Do we need to actually train our staff and provide more support? And is there some vicarious trauma going on with this increasing trend of having to respond to suicide risk in children and young people?
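
[Editor's note: a sketch of the trending Amanda describes, again with made-up numbers. A simple least-squares slope over monthly counts flags an upward trend; the trend is a prompt to examine the clinical response, not an answer in itself.]

```python
# Made-up numbers: monthly counts of suicide risk assessments from service
# records. A positive slope flags an upward trend worth investigating.
monthly_counts = [4, 6, 5, 8, 9, 11, 10, 13]  # hypothetical assessments per month

n = len(monthly_counts)
mean_x = (n - 1) / 2                # mean of month indices 0..n-1
mean_y = sum(monthly_counts) / n

# Ordinary least-squares slope: change in assessments per month
slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(monthly_counts))
slope /= sum((x - mean_x) ** 2 for x in range(n))

print(f"Trend: {slope:+.2f} assessments per month")
if slope > 0:
    print("Upward trend: review clinical response, training, vicarious trauma supports")
```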

So there are so many ways to actually look at the data. It's very easy not to, though. Often therapists themselves, and service providers and managers, are very time poor. We're being asked to do more with less, and data goes to the bottom of the pile. And certainly, interrogating the data that we have and trying to make sense of it and pull out trends goes to the bottom of the pile after that. I think it's more complex and difficult for smaller organisations. Larger organisations sometimes have whole departments or roles that are dedicated to this – to looking at data and capturing it, and having those systems in place. Smaller organisations really struggle with the resources it takes to actually look at that, and then to give the time to it, but then also to actually implement changes because of it. Once you recognise the problem, you then actually need to take steps to resolve it as well.

ANTONIA QUADARA: Great. That issue of time poverty opens us up to a question of, well, how do we alleviate that? Which I will leave for the question section, because there are some questions that could provide an opportunity to think about that. And I am conscious of time, so I'm going to ask James just one last question on, I guess, the broader service system – thinking about the child advocacy centres and the multi-agency approach that model advocates. In terms of the conclusions you've drawn in the two papers you've done for CFCA, you've noted that engagement with and completion of therapy need to be systematically tracked to more completely capture the outcomes and impacts of sexual abuse services, and that services and systems should monitor their own data on risk factors and disengagement.

Thinking about this from a sort of integrated systems perspective, aside from the therapeutic outcomes or treatment information that specialist services might be looking to, what, from your perspective, is the critical information to draw from the broader service system to help understand where the gaps are? For instance, you talked about that leakage right at the beginning. What can we obtain from other services that would help plug the leakage, stitch the pipe together? I'm not sure about those metaphors, but you know what I mean.

JAMES HERBERT: That isn't a double-barrelled question, that's a two-bazooka question, I would say. On the first part, in terms of multi-agency responses – look, I'm a true believer that if you get people together in a room, they solve problems, they view the case more holistically. I think the challenge – and this maybe gets back to your point – is about adaptation and understanding local context. Thinking particularly about child advocacy centres, we have this evidence base for them which is primarily from the early 2000s in the US. The practice at that time that they were compared to was really crappy. Like really, really terrible. Unqualified people doing child interviews, all sorts of wild stuff happening. Versus a really well-developed, really well-resourced, holistic response that was designed and implemented really well. So it's not surprising that there's a pretty dramatic difference between the two things. So that's where we start from in terms of the evidence base.

Then when we were looking at how to roll that out in Western Australia and what outcomes we could expect as a difference, we had to sit down and say, “All right, well, what's the actual theory of change for this thing?” If you do all these things, you get these things. But if you're not doing all those things, if you already have qualified interviewers working, you already have fairly child-friendly facilities, if you take some of those things out, what can you realistically expect to see? And I guess some of our work was really about getting realistic about what difference in outcomes you're likely to see when you're only doing some of these things.

And moving to your point about therapy engagement – a lot of the existing research was very focused on the criminal justice system, because these centres were initially developed around minimising trauma and distress from the criminal justice process, as well as facilitating disclosures and things like that. I guess where I've got to, and what I'm really interested in, is thinking about things as a system, about where the gaps are. The research suggests that there may not be generalisable challenges going on. So what it suggests is that you sort of need to go back to the data, thinking from that initial point of disclosure: where are people ending up in the system? Are they making it to services? What are the barriers to getting to those services? How long is it taking?

Now, getting that data is challenging. In some of these jurisdictions that have really fragmented services, people could be going to NGOs, could be going to government, could be going to private practitioners. It's scattered all over the place. And unless we have a common start point, we don't really know. But I guess what I really wanted to talk about, and what I'm really jazzed about doing – and I would love to hear from people that are interested in working together on this – is thinking about how to use data to solve some of those problems. So I toured a really interesting intervention in Chicago. What they had done was monitor, over multiple years, who was getting to services and who was not getting to services. They had various processes for that, as well as getting consumer feedback. But you don't get to hear from everyone through feedback, whereas this was monitoring everyone.

And what that then led to was: hey, we have these problems; here are the things that are stopping people from getting to the services; here is who's getting them and who's not getting them. And they undertook a process of intervention development to solve some of those problems. Now, what that looked like there was a centralised waitlist. So the idea was, instead of being on ten waitlists, you're on one. Once you get to the top, you can try out all these different services to find one that's in your area, at a time when you want it, in the language you want it, and where you have a therapeutic alliance with the person. So that solved a lot of problems for them. They did some work with the advocates, training them in motivational interviewing, and they signed all the services in their network up to a minimum standard for therapeutic services, so they had assurance that everyone was getting the same thing, and you didn't have to send everyone to in-house services.
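
[Editor's note: a toy sketch of the centralised-waitlist idea James describes – one shared queue, with the family at the top matched against any networked service that fits their area, language and availability. All fields and names here are hypothetical, not details of the Chicago system.]

```python
# Toy sketch of a centralised waitlist: one queue instead of ten, matched
# against networked services by area, language and available slots.
from collections import deque

services = [
    {"name": "Service A", "areas": {"north"}, "languages": {"en"}, "slots": 2},
    {"name": "Service B", "areas": {"north", "south"}, "languages": {"en", "vi"}, "slots": 0},
]

waitlist = deque([{"family": "F1", "area": "north", "language": "en"}])

family = waitlist.popleft()  # family at the top of the single shared waitlist
options = [
    s["name"] for s in services
    if family["area"] in s["areas"]
    and family["language"] in s["languages"]
    and s["slots"] > 0
]
print(options)  # the family chooses among services that actually fit
```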

And the beauty of all this was that, because they were monitoring engagement and completion, they were able to say, “Hey, the interventions that we put in place have reduced wait times and improved engagement with these services.” So they're getting to everyone. Now, my point about this is not to follow the intervention development that they did, but to follow that process of getting the data to understand what the problems are in individual jurisdictions, and to use that data, ideally with victim survivors, with professionals and with government, to say, “Hey, what are the low-cost interventions we could put in place that make it so that services are accessible, and we're not just sending untreated trauma out into the community?”

ANTONIA QUADARA: I feel like you were just – like you're warming up, James. You've got a lot to say.

JAMES HERBERT: I do rant a lot.

ANTONIA QUADARA: And I appreciate it. It's great. We've got ten minutes left for this webinar and we've got some questions that have come through, so I want to switch gears. And actually, James, I'm glad that you talked about some practical examples of how data is being used to change practice. So thank you. We received a couple of questions at the point of registration, so I'm going to select some of those and some live questions that have come through. We may not be able to get through all of them right now, but people can rest assured that we will answer them. One of the questions is, “As an NGO, we may not have access to broader systems/government data that helps us identify trends or needs. Any suggestions?”

AMANDA PATON: One is more conversation and lobbying with government to actually get that data. And just picking up from James's point, there's an issue – regardless of what jurisdiction you are in in Australia – with linked data. Linking data across police, Child Protection, health, the NGO sector and the private sector, particularly within child sexual abuse, is really fraught. And it's very fractured. So I think there absolutely needs to be more lobbying on that. And I know there are certain things in national strategies and pieces now that are actually calling for linked-up data and for improved data systems. So fingers crossed.

But I think there's a lot of power in the NGO sector. And I think that is a missed opportunity sometimes. We have a huge network in the community service sector, not just within child sexual abuse but within related areas right across Australia. And deidentified aggregated data can be shared and can be used. And I think it's – you know, speaking to the agencies next door and speaking to the agencies in the suburb over and actually finding those networks and developing those networks, sometimes it's only a cup of coffee and a catch-up like that. And to be able to share that data and information I think is really valuable. And that has a lot of power, I think. And you can really create quite a big picture just from the community service sector and the data that's available there.

ANTONIA QUADARA: Great. James, did you want to add anything to that?

JAMES HERBERT: Yeah. Look, I don't know if it's totally relevant, but I just got to thinking about information sharing and some of the challenges that happen around that, specifically in child sexual abuse. Often – and I've noticed this in a few different jurisdictions – there'll be legislation or information sharing guidelines that are passed and agreed to, but not carried out in practice. And it's really odd. Really odd. I've been in situations where the person who passed the ISG walks into the room, and people are running up and they're like, “Hey, this happened and this happened and this happened. Am I allowed to share information?” And she's like, “Yes. Yes, that's why we passed the ISG.” And people are just running up to her, like, “Am I allowed to share this information?” She's like, “Almost always yes. Yes.”

ANTONIA QUADARA: Almost always yes.

JAMES HERBERT: And it just got me thinking, I guess. You pass things like ISG, Information Sharing Guidelines, in SA. Or you have one level where you kind of have this thing where you've said, “All right, here's the conditions under which you can share it, and that's fine.” And maybe within the complexity of NGOs dealing with government agencies and things, there's nervousness about that. But yeah, I just think within the agencies, some of that nervousness is odd. And I don't understand why there are barriers to I guess using information-sharing guidelines, especially when it's clearly in the best interest of children. Why it takes so long for people to be convinced that their agency isn't going to come after them for complying with ISG. Very odd don't know if I answered the question, but it sort of relates to that issue of working across agencies and having that confidence to share information about – whether it's individually or whether it's at an aggregate level.

ANTONIA QUADARA: But if we put your two responses together, what we come away with is there's both the formal mechanisms like information sharing guidelines, and understanding what they are and what they enable. As well as the power of the NGO sector and the connections and that lobbying, and putting those together can become really powerful, I think.

AMANDA PATON: Absolutely.

ANTONIA QUADARA: We've got four minutes, so we might have time for one more question. I'm sorry, I'm just reading here. That one might be a bit complex for right now. Here's a good one: is there capacity for services to collaborate with research organisations? Services may be collecting the data, and researchers have the know-how, and possibly the time, to analyse the information. In a sense, you two are a model of that, but what are the opportunities, and what are the things that make that possible and make it work?

JAMES HERBERT: It's complicated. It's gotten more complicated. I think –

ANTONIA QUADARA: Why is that? Can you talk about that?

JAMES HERBERT: Yeah. Look, I don't think this should be a sob story about researchers, because we've got great jobs. We get to do interesting things and all sorts of stuff. We don't always get to do exactly the stuff we want to do, and often we have to go begging to find salary money to do the things that we're really interested in doing and the things that are really impactful. So don't shed a tear for me or anything, but it can be challenging to get the time to do that really industry-focused work that can have really big impacts, because sometimes that's not the stuff that our incentives are built around. But in terms of NGOs wanting to seek and establish partnerships, I think that's a very good thing, and I think there should be more of it. I don't know how you facilitate it.

Definitely some academics do have time, and it has to do with the structure of how we're employed – whether we're on a 40/40/20, so whether our teaching subsidises some of our research time, or whether we're research-only and more or less having to find our salary from various pieces of work. That can set up some of the complexity of working with NGOs. The other thing I've seen in some contexts is that NGOs have this experience of researchers coming to them at the last minute of an ARC application or something like that, saying, “Hey, we've got this project, and can you be our industry partner on it and can you give us $20,000?” That sort of stuff. I think that undermines things. What you'd want to see is that enduring relationship between service and research, where you build up that quid pro quo, where it's not a problem if people are asking you to download journal articles for them or whatever else. Don't tell Springer or whoever that I'm doing that, but –

But yeah, Amanda, I guess we kind of have a pretty direct experience of research partnerships. So you can talk to that.

AMANDA PATON: Yeah. I think a research partnership between an NGO and a research centre can be a great marriage, but it can also end in divorce. And the reason I say that is as a cautionary tale for NGOs. Putting my community service sector hat on: we think what we do is great. And sometimes we might find some extra money and go out and want a researcher to research it. But if you do that, you open it up. Be careful what you wish for. You might think you are doing a great job, and I'm sure many are, but you also need to be open to what a researcher will find – a researcher will evaluate what's in front of them. There might not be enough outcome data to actually make a judgement about whether it's working. Or you might find that what you're implementing is not making a difference, or not as sizeable a difference as you think it is.

But on the flipside of that, it can be a brilliant relationship in terms of working with a researcher. I think it's an excellent opportunity for community service sectors to improve their level of outcome data and their type of outcome data in terms of their pre-assessments and post-assessments and how they do that. And upskilling their workforce. And upskilling their workforce to actually start interrogating research as well, and what the literature is saying around a whole range of different things. So I think there should be more of it. There absolutely should be more of it. But both parties, the researcher and the community sector organisation, need to go into it with a very open mind, with a true partnership and – you know, ready to kind of embark on a journey and not pre-determining outcomes. And not have pre-determined expectations around that outcome I think, is probably the biggest piece.

But I think the community service sector, if they have the funding behind it, they should be doing more of it. And governments should be funding the community service sector to actually evaluate their programs more to actually do that. And I think we are seeing a slight shift, particularly in the outcomes-driven framework and funding that is starting within government and Treasury now. Certainly, for WA. I think we'll start to see more of that inbuilt evaluation in service contracting. Fingers crossed.

ANTONIA QUADARA: So just looking at the questions that are coming through, we have a question around how do services leverage their data to advocate for their service or programs? I might throw that one to you, Amanda, in the first instance, and then James, I'll be interested in your views as well.

AMANDA PATON: I think it's a really good question and if we start with government, they are moving more towards an outcomes-based program, I suppose, in terms of funding. And also reporting back on services. But in a philanthropic space, there's a lot more philanthropic support I think for child sexual abuse and Child Protection related issues and family violence and those type of things across Australia. But they are much more particular around the types of information that they want to see. And the feedback that they want to see in terms of their value for money. So, if I'm a philanthropist and I've earned $100 and I'm giving my $100 over to you to provide a service, I don't want to know so much how many times you've given that service. I want to know the impact of that service. So, I want to know what it means to the person that you've actually provided that service to.

And so I think there's a significant piece there for organisations that collect outcome data – the right outcome data, looking at the impact of their interventions – and that can actually marry that to the cost of those interventions as well. That tells a really good story for potential supporters of ongoing funding. It also allows organisations to make a case to government and to potential donors and sponsors, and in grants and applications, that there's a gap, there's a need, there's an issue – and hey, we've got the solution, this is how much it's going to cost, and this is the benefit of that solution. Not just in abstract terms, but to children and young people. So they can actually really see it. And everyone loves a graph. Everyone likes to be able to tell a story. They say a picture tells a thousand words, and it absolutely does.

Being able to show that a group of clients you've worked with have decreased in their symptomatology, in terms of trauma-related symptoms of child sexual abuse, and to have a narrative that goes along with it – where you've got some children really talking about their experience of the service, and parents talking about that – tells a complete picture around the impact. And that's a really powerful story to lobby government and potential supporters for ongoing funding, or to fund something new where there might be a gap. I think there's a huge amount of responsibility that comes with that, in terms of service providers making sure that they use their data – whether it's output data or outcome data or consumer feedback – in a very ethical, transparent manner. So when collecting data, clients need to be aware that it's going to be used, deidentified, and that their information will go into a bigger pool of information, and that it might be used for these types of things.

And that as service providers, you explain the data in a very factual way, without shaping it. So I think that's a really critical thing to remember as well – the responsibility that comes with it.
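A minimal sketch of the kind of deidentified pre/post summary Amanda describes might look like the following. It assumes a simple CSV export of intake and exit scores from a standardised symptom checklist; the file name and column names are illustrative only, not from any particular service.

```python
import csv

def summarise_outcomes(path):
    # Read illustrative columns: intake_score, exit_score (one row per client).
    pre, post = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pre.append(float(row["intake_score"]))
            post.append(float(row["exit_score"]))
    if not pre:
        print("No records found.")
        return
    n = len(pre)
    improved = sum(1 for a, b in zip(pre, post) if b < a)
    print(f"Clients: {n}")
    print(f"Mean score at intake: {sum(pre) / n:.1f}, at exit: {sum(post) / n:.1f}")
    print(f"Clients with reduced symptoms: {improved} ({100 * improved / n:.0f}%)")

summarise_outcomes("outcomes_deidentified.csv")  # hypothetical file name
```

Figures like these, paired with the client and parent narratives Amanda mentions, are the raw material for the graphs that funders respond to.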

ANTONIA QUADARA: Okay, great. James, what's the role of researchers, do you think, in that process, I suppose? What can researchers contribute to that? Whether it's with the services or thinking about from governments as well as commissioning agencies, what can researchers do to help services leverage their data?

JAMES HERBERT: Sometimes with the funding pitch, I don't know whether you want to talk to a researcher or you want to talk to a good PR person or a policy advocate. I think we would like funding environments to be a meritocracy and to be logical, but I don't think they are. I don't think they ever will be, because there's different pots of money for different priorities. There's different, I guess, fundamental areas – you know, housing is one thing, and disability is another – and they're all different pots of money, and there's different regions. It's all, I guess, a kind of beautiful chaos. And you hope that through the discourse and arguments and evidence being provided, you do end up with things that have merit prospering, and things that are underperforming either changing or being moved off funding.

But I don't think that's how it works. How researchers can contribute to that, I don't know. On one hand, we can be a gun for hire and we can be the people that help you put your best foot forward for your program. On the other hand, like Amanda mentioned before, we can be the person that tells you that your program isn't fully defined, that it isn't ready. And I guess that's something I was thinking about before: the pressure to demonstrate outcomes straight away is huge, and potentially quite destructive for programs. Because there is that pressure to bed things down and formalise procedures and have it be fundable within a certain time period. And these things take a really long time – you know, the rubber hits the road and you're seeing real clients, and you're adapting the program for the complexities and the one-off events or the things that are unusual. Submitting your program to scrutiny very early, I think that can sometimes be kind of destructive.

I think that's where the term 'premature evaluation' comes in – programs that are still finding their way, still being formalised and turned into a program that's recognisable. The analogy is sort of like pulling a plant out to examine its roots. And it's challenging to think about what's the right place for researchers in all that, because obviously you're sort of talking about service-level research that might be helping you put your tenders or your pitches forward, and how to engage with the external researchers that might be asking uncomfortable questions sometimes.

AMANDA PATON: I think, just to add to that, there's also a balance. I mean, this is an extremely important discussion around how we use data and capture data of different types, but at the end of the day, we've also got a child and family in front of us who are deeply distressed, who have experienced potentially the most horrific thing that they will experience in their lives, and who are going through the process of disclosing and trying to work out what their new world is going to be like. Getting them to fill out questionnaires, getting them to do assessments – that's not always appropriate. It has to be a balance. And I think practitioners are the ones that know best when they can actually give that assessment to that client. Is it appropriate to get them to fill out that checklist or that consumer survey or whatever it might be?

And policy designers and service developers and those people managing programs might have a motivation to have the data, but the practitioners need to hold the client at the centre of that decision making, and exercise their practice wisdom about what they feel is appropriate and what's not. And those two sometimes don't match up. I think it's often a tension at that service delivery end.

ANTONIA QUADARA: That's a really good point, Amanda. And I do like the analogy of pulling up a plant and examining its roots to see how it's travelling. One last question that flips what we've been discussing – we've been talking about something quite meaty around leveraging the data, but there's a kind of precursor question, which is a little bit more straightforward: are there ways to reduce the manual entry of data by service providers and to help alleviate the time demands required in collecting service-level data? So how can we make that easier and more efficient?

AMANDA PATON: It's the million-dollar question, because data collection is so expensive to do. Years ago, we would have 100 Excel spreadsheets and stacks of paper that would take hours and hours. As technology has become more sophisticated, data capture of a whole range of different things has become more advanced as well. Lots of services now – you know, you go to a hospital, and sometimes on the way out there's a quick consumer feedback station where you just tap on an iPad to rate the service you received today. I know some service providers are actually doing that now as well: having their intake and exit questionnaires either given to clients on an iPad in the waiting room before their session, or sent to them via an email link. Some very basic SurveyMonkey-type surveys are being generated now too. And then it's just as simple as generating a report, all deidentified.

So, I think there's loads of ways to use the technology that's now available. But the easiest, I think, is getting clients to fill it out themselves on a device that's easily available, or in the privacy of their own home via an email or text message link. There's a flip side, though, in that you have to be extremely careful about the content you're asking them to fill in. There are security issues as well in terms of sending information via unsecured email, or in the cloud, and what security protections you've got from an IT perspective within your organisation. And you have to ensure that the content is not distressing for the client to fill out, if they're filling it out in their own home or in a waiting room before they leave. So there are huge benefits to doing that type of thing, and it definitely cuts down on staff time for data entry. But we then just have to step back and think about what we're asking of clients.

And also for clients where English may not be their first language, who may not have strong literacy skills, or may not have access to the internet – accessibility then becomes a further consideration too.
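On the deidentification Amanda mentions, one minimal sketch – assuming a service pools questionnaire responses for reporting – is to replace client identifiers with a one-way pseudonym before records enter the bigger pool. The salt and field names below are illustrative assumptions, not from any particular system, and a real service would follow its own privacy and IT security protocols.

```python
import hashlib

# Illustrative only: a real deployment would manage this secret securely,
# not hard-code it in the script.
SALT = "service-specific-secret"

def pseudonymise(client_id: str) -> str:
    # One-way hash so pooled records can be linked over time
    # without storing names in the reporting dataset.
    return hashlib.sha256((SALT + client_id).encode()).hexdigest()[:12]

# Hypothetical intake/exit questionnaire responses.
responses = [
    {"client_id": "client-0001", "exit_score": 14},
    {"client_id": "client-0002", "exit_score": 9},
]

pooled = [
    {"pid": pseudonymise(r["client_id"]), "exit_score": r["exit_score"]}
    for r in responses
]
print(pooled)
```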

ANTONIA QUADARA: Great. Thank you. James, in your travels, what did you see? Did you see any sort of magic bullet that answered the 64-million-dollar question?

JAMES HERBERT: Yeah, I think there's some clever stuff in the pipeline. I've seen some work around text mining and using technology to automate some of these things. Previously, if you wanted paper-based forms or PDFs turned into data, you'd have to hire someone to sit down and do the quite laborious work of entering it. There's people working with all sorts of clever computing approaches to address some of those problems, so hopefully we'll see that popping up in the sector – I think we are slow to adopt technology, so it's good to see. And even things like people writing code to translate an agency's database into standardised data across multiple agencies. There's all sorts of clever things going on. And maybe it gets to the need for a broad skillset across the sector, where there are people who know how to use technology and apply machine learning and all the big-picture things happening in the technology sector. But of course, as Amanda mentioned, with an eye to the sensitivities and the appropriateness of some things.
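As a minimal sketch of the field-mapping James alludes to – translating one agency's database export into a shared, standardised schema – the mapping and field names below are illustrative assumptions, not a real data standard:

```python
# Hypothetical mapping from one agency's export fields to a shared schema.
AGENCY_A_MAPPING = {
    "ClientRef": "client_id",
    "DOB": "date_of_birth",
    "ReferralSrc": "referral_source",
}

def standardise(record: dict, mapping: dict) -> dict:
    # Keep only mapped fields, renamed to the shared schema names.
    return {std: record[raw] for raw, std in mapping.items() if raw in record}

raw = {"ClientRef": "A-1042", "DOB": "2010-05-02", "ReferralSrc": "Police"}
print(standardise(raw, AGENCY_A_MAPPING))
# {'client_id': 'A-1042', 'date_of_birth': '2010-05-02', 'referral_source': 'Police'}
```

The same function could be reused with a different mapping per agency, which is the sense in which code like this 'translates' local databases into comparable data.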

ANTONIA QUADARA: Okay, thanks. Great responses. That's all we've got time for. So we're going to sign off. Thanks to James and Amanda and thanks to our listeners. I'll see you next time.

AMANDA PATON: Thank you so much.

JAMES HERBERT: Thanks, Antonia.

WEBINAR CONCLUDED

IMPORTANT INFORMATION - PLEASE READ

The transcript is provided for information purposes only and is provided on the basis that all persons accessing the transcript undertake responsibility for assessing the relevance and accuracy of its content. Before using the material contained in the transcript, the permission of the relevant presenter should be obtained.

The Commonwealth of Australia, represented by the Australian Institute of Family Studies (AIFS), is not responsible for, and makes no representations in relation to, the accuracy of this transcript. AIFS does not accept any liability to any person for the content (or the use of such content) included in the transcript. The transcript may include or summarise views, standards or recommendations of third parties. The inclusion of such material is not an endorsement by AIFS of that material; nor does it indicate a commitment by AIFS to any particular course of action.

Related resources

Webinar questions and answers

Questions answered during presenter Q&A

To view the presenter Q&A, go to 49:10 in the recording

  1. As an NGO, we may not have access to broader systems/government data that helps us identify trends or needs. Any suggestions?
  2. Is there capacity for services to collaborate with research organisations?
  3. How do services leverage their data to advocate for their service or programs?
  4. Whether it’s with services, governments or commissioning agencies, what can researchers do to help services leverage their data?
  5. Are there ways to reduce the manual entry of data by service providers and to help alleviate the time demands required in collecting service level data?

Presenters

Dr James Herbert is a Senior Research Fellow at the Australian Centre for Child Protection at the University of South Australia. He is a social researcher undertaking research and evaluation on the topic of cross-agency collaboration in responses to child sexual abuse, particularly the design and implementation of investigation and therapeutic support teams and addressing institutional barriers to effective collaboration.

James is currently undertaking research on addressing the barriers to therapy engagement for children who have disclosed sexual abuse, and on effective deliberation procedures for multi-disciplinary team case review. Working across child protection topics, he also has a key interest in the translation of research evidence into policy and practice change and building the capacity of the sector to apply evaluative thinking to complex social policy issues.

Amanda Paton is the Deputy Director, Practice at the Australian Centre for Child Protection at the University of South Australia. She is a clinical psychologist who specialises in the assessment and treatment of children and young people who have experienced complex trauma from child abuse, particularly related to sexual abuse, children living in out-of-home care and those displaying harmful sexual behaviours. Previously, she led a large specialist clinical team at the multiagency investigation and support teams in Western Australia responding to child sexual abuse and led the design and implementation of two child advocacy centres in Western Australia.

In her current role, Amanda oversees a range of projects including the research, development and implementation of evidence-based therapeutic models for responding to child abuse, neglect, trauma and harmful sexual behaviours in Western Australia. She also provides policy consultancy and advice to the Department of Communities Royal Commission team and Specialist Child Protection Unit.

Facilitator

Dr Antonia Quadara | Executive Manager, Strategy and Business Development

Dr Antonia Quadara is a Senior Research Fellow at the Australian Institute of Family Studies. She currently manages the Sexual Violence Research (SVR) team and prior to this she was the manager of the Australian Centre for the Study of Sexual Assault (ACSSA).

Antonia has been undertaking research in violence against women, women's policy and criminal justice policy since 1999 when she completed a thesis on the treatment of Aboriginal sexual assault victim/survivors by the trial process. Her PhD, completed in 2006, explored the adult entertainment industry, women's safety and public space in public policy. She was a lecturer and researcher in the Department of Criminology (University of Melbourne) from 2001 before beginning at the Institute.

Antonia has a strong background in qualitative research methods, stakeholder engagement and consultation, and extensive experience in the writing, development and production of publications and resources for policy and service sectors involved in responding to sexual violence. Antonia's specialist research areas include: criminal justice responses to violence against women; feminist frameworks, sexuality and sexual violence; sex work in Australia; sexual assault prevention; and public policy development in women's safety and violence prevention.
