What is good research evidence and how do you find it?

Content type
Webinar
Event date

4 June 2025, 1:00 pm to 1:30 pm (AEST)

Presenters

Melissa Willoughby, Kat Goldsworthy

Location

Online

About this webinar

Quality research evidence can improve service design and delivery, and professionals in child and family services often need to use this to inform their work. This can include writing grant proposals, developing program theories, or preparing program evaluations. However, people may not know how to find high-quality, relevant research.

The Evidence and Evaluation Support (EES) team at the Australian Institute of Family Studies has put together a four-part webinar series to share examples of evaluative practice and ideas for collecting and using evidence in the context of Australian child and family services.

In episode 3, Kat Goldsworthy, Research Fellow in the Evidence and Evaluation Support team, sits down with Dr Melissa Willoughby to discuss practical strategies for finding and using research evidence in a practice context.

This episode will discuss different types of research evidence, how to find them and how to assess their quality, as well as insights into how research evidence can be used alongside other forms of evidence, such as lived experience and practice expertise.

This webinar will give you:

  • an understanding of what research evidence is and how it’s relevant to service design and delivery
  • insights into how to identify quality research evidence and make judgments about whether it can be used to inform decision-making in practice
  • insights into how to find relevant research evidence.

The webinar is intended for anyone interested in understanding and using research evidence more effectively. It will be particularly useful for professionals involved in service design and delivery, especially those who write grants and funding applications, design new programs or refine existing ones, and those who support or do program evaluations.

Evidence and Evaluation Support Webinars

KAT GOLDSWORTHY: Welcome everyone to today's webinar. My name is Kat Goldsworthy, and I'm a Research Fellow here at the Australian Institute of Family Studies working in the Evidence and Evaluation Support team. I'd like to start by acknowledging the Wurundjeri, Woiwurrung and Bunurong people of the Kulin nations who are the traditional owners of the lands in Melbourne, where I am lucky enough to live and work. I also pay respects to the traditional owners of country throughout Australia and recognise their continuing connection to lands and waters. We pay our respects to Aboriginal and Torres Strait Islander cultures and to elders past and present. Today's webinar is part of a series designed to share examples of evaluative practice and ideas for collecting and using evidence in the context of Australian child and family services. The format is a little bit different to our usual webinar program, in that it'll be a brief conversation between me and a guest today. My guest is Dr Melissa Willoughby, who is a senior research officer here at the Australian Institute of Family Studies. And today, we're going to be discussing what good research evidence is and how to find it. Before we dive into the discussion, I do have a little bit of housekeeping to cover. So, this webinar is pre-recorded, which means there is no live Q&A session. For those of you who require captions, please watch the webinar via the AIFS website. There is a link in the chat and a full transcript will be available. There will be some related readings and resources given out in the handout section of the GoTo Webinar Control panel. Unfortunately, we can't cover everything, so there'll be some handy kind of tips and tools there for you. And at the end of the webinar, a short feedback survey will pop up and we'd really appreciate you taking a minute just to fill that out so that we can keep getting some feedback about how we can improve this webinar program. Okay. 
I think that's it for me with housekeeping. Welcome, Mel. I'm really, really excited to have you here.

DR MELISSA WILLOUGHBY: Thanks, Kat. Yeah, looking forward to having a chat about research evidence.

KAT GOLDSWORTHY: Yes. Well, we ran a workshop on this topic last year at the FRSA National Conference, and we received some very nice feedback about the usefulness of that workshop. We thought that other people might benefit from some of the key messages that we had shared. When I say we, I really mean you, because you are…

DR MELISSA WILLOUGHBY: Definitely a team effort.

KAT GOLDSWORTHY: But yeah, I mean, we know that people working in child and family services have to interact with and use research evidence to inform the work they're doing in a variety of different ways. As you know, whether that be for grant writing, for developing program theory, or just preparing to do some data collection, monitoring or evaluation. But we're certainly not all trained in how to find research that's right for our purposes and that's good quality. So, yeah, I've invited you here to give us some pointers in that area. So, I do have questions.

DR MELISSA WILLOUGHBY: And hopefully I’ll have some answers.

KAT GOLDSWORTHY: Oh, you've got all the answers. I mean, I keep using this term research evidence. Research evidence. It's something we use a lot here, and we kind of forget that we probably do need to define what it is. Could you just tell us: what is research evidence?

DR MELISSA WILLOUGHBY: Yeah, that's a great place to start. I think first I want to say research evidence is one type of evidence that people can use to help make different types of decisions. It can help inform what we do and how we do it. There are, of course, other types of evidence, like practitioner knowledge and skills and also lived experience. And ideally, we want to be using lots of different types of evidence when we're making decisions. This can help us make better decisions by overcoming the limitations of each type of evidence. So, research evidence has particular qualities or characteristics that separate it from other types of evidence, and these qualities make research evidence really trustworthy and really reliable. Broadly speaking, research evidence is systematic, it's documented, and it's something that's been subject to external scrutiny. This means that research evidence has been conducted in a series of specific steps, all of the steps have been recorded and transparently reported, and it has been judged by someone who was not involved in conducting the research project, which is usually referred to as peer review. So, all of the research articles that you might read in an academic journal have been reviewed by another researcher who was not involved in that project, just to check that it all makes sense and there are no issues with it. Digging down into research evidence a bit more, there are different types of research evidence, and I have a slide that I'll share that outlines them.

KAT GOLDSWORTHY: That I think will be really helpful.

DR MELISSA WILLOUGHBY: Great. So, on this slide you can see a summary of the different types of research evidence. The first one is descriptive, and this research evidence looks at how to identify or understand a problem or priority. An example of this might be research that looks at how common domestic and family violence is in a particular population or community. The second type is effectiveness research. This research tries to understand what works to address the problem: what programs might be effective at addressing a particular problem. An example might be research that asks what effect a program has on a particular parenting practice. The third type is implementation, which I think of as the 'how'. How can things be done to make a program the most effective? This could be things like how often the program should be run, where it should be run and who should run it. The last thing I think is important to touch on here is that there are different ways you can collect data. I'm sure many people will have heard of quantitative and qualitative data. Quantitative data refers to numbers and statistics, and research could use methods like surveys, or looking at administrative or routinely collected data, to collect quantitative data. The other type is qualitative, and this usually involves talking to people about their experiences. This is another place where the different types of evidence can come together: qualitative research can interview practitioners, and it can also interview people with lived experience about their experiences.

KAT GOLDSWORTHY: That is a fabulous overview. Thank you, Mel. It's nice to hear you touch on the different forms of evidence. I know that we can get stuck in hierarchies around evidence sometimes, but of course, research evidence, as you said, is really just one component in this broader lens when we're thinking about evidence. I often think about the evidence wheel that Emerging Minds, which is the project that you work on, developed with some former researchers of ours several years ago. As part of that project, the evidence wheel has three parts: research evidence, lived experience evidence and practitioner evidence. So research evidence is just one slice of the pie, but a very important slice of the pie.

DR MELISSA WILLOUGHBY: Yes, definitely. And I don't have a picture of that wheel, but it's a great reference for how to bring together the different types.

KAT GOLDSWORTHY: I was just curious, while I was looking at that slide that you had up before. We've got descriptive and effectiveness, and was the other one implementation?

DR MELISSA WILLOUGHBY: Yeah.

KAT GOLDSWORTHY: That's right. Are these terms that, if you're not familiar with research evidence, you need to be aware of when you go looking for research evidence for your purposes, and I know we'll get to this, or is this just a helpful way of thinking about the different types of evidence? Does that make sense?

DR MELISSA WILLOUGHBY: Yeah, I think it's both, which is maybe the unhelpful answer. It's useful to know that there are different types of research evidence, because the different types are aiming to achieve different things and can tell us different things. So depending on what question you're asking, you should be looking at a different type. For example, if you want to know who should be running a program that you want to run, or how often it should be run, it's not going to be helpful to look at descriptive evidence. Another way these types are useful to know is that if you're searching for particular types of research evidence, sometimes they make useful search terms. So if you're looking for the how, it might actually be useful to put the word 'implementation' and a program name into a search box and see what comes up. So that's another way you can think about using the different types.

KAT GOLDSWORTHY: Yeah, that is very helpful. I feel like even I sometimes forget these basics. I mean, you do a lot of this sort of research here at AIFS. I'm doing much more capability building these days, but it's nice for me to just have these reminders about how to conduct research. And we'll talk about that a little bit more later in the conversation. I guess I'm wondering, is all research evidence made equal? I mean, you spoke about the different qualities, like the steps, the systematic way that research evidence is conducted, having peer review and some other things. But even in that context, is all research made equal? Are there any pointers that you can give us for finding quality research evidence?

DR MELISSA WILLOUGHBY: Yeah, it's such a great question and also a really important thing to think about when we're looking at research evidence. As I said before, research evidence is very reliable and rigorous, and that's a key quality of it, but it's also not perfect, and anyone who's familiar with reading research evidence will know it can vary a lot in quality between studies. Looking at quality is really important when we're reading research evidence, because it lets us know how confident we can be in the findings and how much we should actually rely on them to make the decision that we need to make. And that doesn't mean that if something's not high quality, we throw it out. It just means we need to take that into consideration in our interpretation of the research. So this is something that's actually very tricky to do, and it's definitely one of those things where the more you do it, the easier it gets. You'll get used to thinking about things and seeing things when you're reading the research evidence.

But it is something that's hard to start doing, and researchers find this very difficult. We could spend multiple hours, multiple webinars, talking about what quality research is, how it differs and how you assess it. There are lots of different tools and scales that researchers use to try and assess quality, and again, just like the research itself, none of these tools or scales are perfect either. So because we have limited time and we don't have multiple hours or webinars, I'm going to go through two key things that I like to think about when I'm thinking about quality. The first one is considering the study type, which is also referred to as study design, and thinking about the different study designs and how well, or not so well, they can answer particular questions. The second thing is called a critical appraisal, and this involves looking at the parts of the study and how the study was conducted. So I'll start off with the first one. We've already touched on today that different questions can be answered by different types of research. So looking at quality through this lens involves understanding that different types of research have different strengths and weaknesses based on the type of question that's being asked. I've prepared another slide to share.

KAT GOLDSWORTHY: That is great. I think our audiences will very much appreciate having a visual to go by.

DR MELISSA WILLOUGHBY: I love a visual myself. For any other visual learners out there.

KAT GOLDSWORTHY: Yeah. It's hard.

DR MELISSA WILLOUGHBY: So I really like this table because it shows a great example of all the different types of questions that can be asked, as well as the different types of study design, and it gives an indication with the little crosses, with more crosses meaning a study design is better at answering that type of question. As you can see, just at a high-level view, there's great variability in the types of questions and which studies are better at answering them. One example to pull out would be randomized controlled trials. In terms of hierarchies, these are often considered to be a very high-quality type of research evidence, but as you can see from the table, they're not the best at answering every single type of question. They're usually great for evaluating a program: do we know if the program has any effect? What effect does it have? And this is because they have very rigorous and strict research methods, and there are steps that need to be followed to say that this is a randomized controlled trial. If you don't follow the steps in the right way, you can't say it's a randomized controlled trial. This involves things like randomizing people to different groups and also having a comparison group to compare the effect of the program against. However, as you can see from the table, randomized controlled trials are not the best study to use for other types of questions. So if you wanted to know something like how many people in a particular population have a mental health issue, or if you wanted to know what people involved with a program thought of that program and what their experience of it was, randomized controlled trials wouldn't be the type of study that you look at. You might look at other types of studies which have conducted surveys, or maybe qualitative studies of people who've been involved in programs.
So I guess the key thing to ask here is what type of research is best suited to answer the question that you're asking.

KAT GOLDSWORTHY: That's a really helpful way of looking at it. As you said, that quality question sounds really complex, and it's nice to hear that researchers struggle to answer it as well; it's not an easy thing to determine all the time. So I kind of love that that's the way you would look at it. And I was thinking, when I was looking at that table and hearing you talk about looking at qualitative research for some of those questions, it's a reminder to me, too, when I think about that evidence wheel we were talking about before and the different forms of evidence, including lived experience evidence, that there can be published research, through qualitative studies, that touches on the lived experience of certain groups of people in certain circumstances that you could look at. We're not always necessarily just looking for the stuff that comes out of randomized controlled trials and systematic reviews.

DR MELISSA WILLOUGHBY: Yeah, absolutely. And, you know, people get very strong opinions about whether quant or qual is better or worse. And I think it really depends on what you want to know.

KAT GOLDSWORTHY: Yeah, it comes back to that: what is your question? And then, I mean, you've already given us some really great tips on how to locate research that can answer that particular question, which is kind of half the battle sometimes, when you're faced with large numbers of studies and you're trying to find something that's useful for your context. I also thought it might be helpful, for the people who aren't super familiar with research and research terminology, to explain some of the terms that were on that table you shared before, like RCTs, randomized controlled trials, systematic reviews and quasi-experimental designs. We, the Evidence and Evaluation Support team, have various resources that define and explain what those terms are, so I will include those in the handouts as well, just in case. If you see a term and don't really know what it means, those resources can clarify it for you.

DR MELISSA WILLOUGHBY: Yeah, that'd be great. I might also touch on the second type of quality assessment we're going to talk about today. This was the critical appraisal that I mentioned before. I think looking at the design itself is a great first step, and then you can actually look at the components within the study. This is where you would question the validity of the study and its applicability to your circumstances. I feel like we're going to come back to this a bit today, but this is something that's going to be tricky and requires a lot of critical thinking and thinking things through. But it can also be very useful, and as I said, the more you get used to thinking about research in this way, the easier it will be to do. So I just want to talk through a couple of principles on how to do this, and this is definitely not a comprehensive list; it's just an example of a few different things to think about. I also have another slide for this as well.

KAT GOLDSWORTHY: Wonderful.

DR MELISSA WILLOUGHBY: So the first one is research methods, and this is looking at how the study was conducted. What steps did they take? While you're reading the method section of a piece of research, you could think about: do these steps make sense? Can the way that they're going about this research actually answer the question that the study is asking, or achieve the aim that they set out to achieve? You can also look at things like their exposures or their programs, and also their outcomes, and how they measured these things. Did they measure them in a way that's reliable, or are they accidentally measuring something else? How long is their follow-up? Is it a reasonable time between when the program finished and the outcome being measured? Another thing to think about is bias. You can look at the authors' affiliations and also how the study was funded, and think about whether either of those would have had any impact on the findings of the study, even unintentionally or subconsciously. Sometimes authors may include a little statement somewhere in the research article about whether the funding body was involved in any decision making through the research process. You can also look at things like who the people in the study are. Are they representative of the whole population or the community the study is looking at, or are they selected in a particular way? Maybe the study only looked at university students, a particular group of people who may not be generalizable to everyone in the whole population, and that could sway the findings in a particular way. It's also great, particularly for studies looking at programs, to have a comparison group. This helps us know what would have happened if the people didn't experience the program and what the outcomes would be, or, if they experienced a different program, how the programs differ.
Another thing to think about, which is perhaps a bit tricky, is chance. So this is most relevant to those quantitative studies.

And here what we're thinking about is whether the findings that we're seeing are true findings. Is it actually the program that's causing this change, or is it just chance? Would this change happen if nothing was done, or if something else was done? A good way to look at that is to see if the results are either statistically significant or clinically meaningful in some way. So do they have real-world applicability? And again, a great way to see this is whether there is some kind of comparison, either a comparison group or a comparison before and after the program. The last one I wanted to mention was generalizability. Here we're actually stepping a little bit outside of the study itself, and we want to think a bit broader about whether this study is transferable to my context. So is there something about where the study was conducted, whether it's in Australia or somewhere similar like the UK or Canada, where we might be able to pull those findings across and have learnings for Australia? Or is there something about the people in the study: are they generalizable to the community we're working in, or are they too different for whatever reason, maybe being a different age group, for example? So those are just a few things to think about when we're looking through studies, but again, obviously not everything that we could consider.

KAT GOLDSWORTHY: Yeah. You don't really realize how much goes into the decision making around assessing research evidence until you hear it talked about in that way. I remember this coming up a lot in the workshop that we ran last year, which was around really relying on our own expertise. Critical appraisal is a judgment call, right? You're looking at something using everything that you've learned, everything that you know, and your rational thinking and your logic to figure out whether everything makes sense and the conclusions are reliable based on the methodology of a particular piece of research. But at the end of the day, too, particularly if we're using this research to inform services, to either inform service delivery or design, or to use it as part of your program theory, I feel like that's where that judgment really comes into play. Is this research relevant to our context, to the people that we're working with?

And really, only the person in that space can say whether or not there is relevance there. So that's quite empowering.

DR MELISSA WILLOUGHBY: Yeah. And it's also, you know, there's no hard and fast rules here. It's not a yes no, this is good quality. This is not, this is relevant, this is not. It's all kind of on these spectrums of depending on what you want to use it for, it might be very relevant but low quality and vice versa. So it is about, you know, sitting with things and thinking them through, talking to people, doing extra reading and trying to make the best decisions that we can based on the information that we have and the knowledge and experience that we have.

KAT GOLDSWORTHY: Yeah, that's really helpful to think about. We're running out of time quickly, Mel; these conversations go quickly. I was going to ask a question about when you can't find any relevant evidence, but I'm actually going to skip over that one and just talk to you about how to find research evidence, because I think that is a question that comes up a lot. As I said in my intro, not everyone's trained in doing this stuff, and there are little tips and tricks. You've already spoken about some of them earlier in the conversation. So I guess I'm curious: are there any pointers, shortcuts maybe, to finding research evidence for your purpose? Thinking back to the questions that you put up earlier, and thinking about the context that a lot of our audiences might be working in. Yeah, are there tips for finding research?

DR MELISSA WILLOUGHBY: Yeah, definitely. I do have some tips. I think for me it's thinking about how we are doing the searching and where we are searching. In terms of how: looking for research evidence can take some time. Researchers have particular methods and strategies to find research evidence, and using these strategies can make the searching a bit more efficient, trying to get to the most relevant evidence as quickly as we can and avoiding having to trawl through a whole lot of research that's actually not what we're after. There are a lot of great videos out there on the internet, particularly from university libraries, that step through some of these search tips in great detail and are really easy to follow. The first time that I started looking for research evidence, I had a video playing and would stop and start it as I was doing it in real time, and that was really helpful. There's a video that we can share a link to in one of the handouts that people can go and watch, and it pretty much steps them through as they're doing it.

I will just talk now about some broad strategies or tips, but I definitely recommend those videos. This is something that can get a bit nitty-gritty, and we probably don't have the time to go through how to do a proper search in this webinar.

KAT GOLDSWORTHY: We're in the right age, aren't we, for finding the videos that can help us do things.

DR MELISSA WILLOUGHBY: YouTube has helped me do a lot of things. So I of course have another slide to share, just as an example. My starting place is often to think about what it is that I'm asking. The example here is: I'm looking for a program that can reduce social isolation in young people. So if I wanted to search for evidence related to this, I'll break it down into some key concepts, and for this one, I can see there are three. The first is the population group: I want research on young people; I'm not interested in adults or infants. The next one is that I'm looking for an activity: an intervention or a program that will do something. And the last thing I'm looking at is my outcome: I want to know about social isolation. So you'll see on the slide these three concepts have been broken up. The next thing I'll do is think about similar words for each concept. As you can see, for young people, other words might be youth or adolescents or teenager. For the program, it could be intervention, evaluation or prevention. And for social isolation, it could be words like lonely or loneliness. These keywords are the words that will make up my search. I'll take a couple of keywords from each of the different concepts, and you can see an example search line down the bottom of the slide. I'll put these keywords into the search box of a database or a website that I'm searching. This is a bit of trial and error, so you could try out different combinations of words and see what comes up. Some words might work better than others; some words might be used more in the research than others. If you're searching a database, another good tip is to consider the boolean operators. These are the 'ands' and 'ors', and they have been incorporated on the slide; some of the videos will go through them in a lot more detail.
But generally you want to combine each concept with an 'and', and all the different keywords within that concept with an 'or'. So you can see in the example 'young or adolescent' and 'program or intervention' and 'isolation or loneliness'. This will help narrow down the results, making sure each concept is present in the research that we're finding. So that's a little bit on the how. I'll stop sharing that. The second thing I want to talk about is the where. There are lots of different places to search, and you'll likely need to search multiple different websites or databases to find the research evidence that you're after. The best places to search really depend on the topic that you're looking at, as well as the type of research that you're after. We will provide a list of example places to search that are free to search in a handout, so definitely go through that list. The first place that you could search is academic databases, which are repositories of research evidence. On the handout you'll see examples like Google Scholar and PubMed. It's important to note here that these are free to search, so you can throw the keywords that you've developed into their search box and look at the results, all for free. But when you go to the full text, some of these might be behind a paywall, and this is a barrier to accessing research evidence that a lot of people experience. Unless you have a university affiliation, the journals will usually charge you for access. A tip here is to email the first author of the paper. Usually their email address will be included on the website that the paper comes up on, or you could just Google their name and their institution and you should be able to find an email. Email them and say, hey, I'm really interested in your research, would you mind sending me a copy? And they'll send it to you. They'll be very happy. Researchers want people to use their research.
They'll be very happy to hear someone else is interested in it and wants to use it in some way. Another place people can search is the websites of reliable bodies or news organizations that publish research or reviews of research. Again, these are free to search, but the good thing here is that they're also free to access.

So you can go directly to their websites, and they might have a search box. Again, you can throw your keywords into their search box and have a look at their resources. Where you would go depends on what topic you're interested in and what type of research you're after. Some examples: obviously, the Australian Institute of Family Studies. We have a search box on our website, and we also have some keywords that you can search by. Other places might be the Parenting Research Centre or ANROWS, which is Australia's National Research Organisation for Women's Safety. But again, check out the handout for a much longer list of different places to go. The last thing I wanted to note here is that you could also look at government data and grey literature. I know this is a webinar on research evidence, and these are technically not research evidence, because they don't have that quality of being peer reviewed or subject to external scrutiny. However, they can also be very reliable and useful sources of evidence, depending on the topic that you're looking at.

So for government data: government bodies conduct surveys, and they also publish data sets on health and social issues. These are usually freely available on their websites. You can find published reports where they've summarised the findings of their data, and you can also find some raw data on there as well. Examples could be the Australian Bureau of Statistics or the Australian Institute of Health and Welfare. This could be useful to look at when we want the descriptive kind of research evidence: how common is a particular issue in the Australian population, or in the state of Victoria?

The last one I wanted to touch on is grey literature, which can also be useful but, again, technically isn't research evidence. This includes things like reports from government and non-government organisations. Some research is published in grey literature as opposed to academic journals. Other things, like evaluations of services or programs, can be found in the grey literature, particularly implementation research. Again, this is the how we do things: what works, for who, when. So if that's the type of research you're interested in, that's a good place to search for it.

KAT GOLDSWORTHY: Thank you, Mel. What a terrific overview, given that we didn't have much time and there's a lot that could be covered. I really liked ending on those really practical ways of finding things, and the reminder that we don't have to look for the perfect type of research evidence to find the information we need. So I really appreciate you touching on that at the end there. We do have to wrap up, I'm sad to say. Maybe one day we'll run this workshop again, because it was good fun and I always learn something. So thank you so much for your time. I really appreciate you being here.

DR MELISSA WILLOUGHBY: Oh, pleasure. I feel like we could talk about this for a long time.

KAT GOLDSWORTHY: Yes, we really could. But yeah, thank you. I'll also just thank everyone who's listening in and joining us, and who's interested to hear about research evidence and learn a bit more. We really appreciate it. Big thanks to our AIFS communications team for making this possible and making us look so good. Please, if you haven't already, subscribe to the Evidence and Evaluation Support newsletter; you can sign up via the AIFS website. And stick around to complete the feedback survey if you can. It'll take about one minute, and we really appreciate that feedback. It helps us a lot. So thanks. We look forward to seeing you at the next webinar. Take care, and we'll see you soon.

Presenters


Melissa is a Senior Research Officer in the Child and Family Evidence and Evaluation Team and a Research Fellow in the Research and Evaluation, Family, Domestic and Sexual Violence Team at the Australian Institute of Family Studies. In this role, she has led and contributed to evidence reviews on various topics relevant to professionals in the child and family sector. She uses the findings of these reviews to design and develop knowledge translation products suited to practitioners and policy makers. Melissa completed her PhD in Public Health at the University of Melbourne in April 2023. She has led and contributed to multiple academic journal articles and knowledge translation products. She has also presented her research at national and international conferences.

Facilitator

Kat Goldsworthy

Kat Goldsworthy is a Research Fellow in the AIFS Evidence and Evaluation Support team. Since 2014, Kat has supported community sector organisations to collect, use and communicate evidence. She has authored a range of evaluation resources, coached individuals on applying evaluation practices, and runs regular workshops on developing program logic models and collecting evidence.
