Keynotes and Q&A session

Content type: Family Matters article
Published: November 2016


Abstract

The AIFS Conference 2016 focused on how we move from research to results, using evidence to improve outcomes for families.

Conference held at the Melbourne Exhibition Centre on 6–8 July

The AIFS Conference 2016 focused on how we move from research to results, using evidence to improve outcomes for families. Highlights of this year's conference included three challenging keynotes and the Q&A session "Show me the evidence!"

Opening the conference, AIFS Director Anne Hollonds summarised a number of questions that were the focus of this year's event:

  • What is evidence?
  • Who is the audience?
  • How do we understand the needs of policy makers and practitioners?
  • How do we achieve honest critique, reflection and cross-disciplinary dialogue?

Considering these questions, two of our keynote speakers took us on discomforting (but ultimately rewarding) journeys through our understanding of research evidence and how it is used (and misunderstood) in the name of achieving evidence-based practice. Professors Greg Duncan and John Lynch both challenged the audience to look behind the data in individual studies (and even meta-analyses) of popular programs promoted as evidence-based, and to see the broader context: how many of the expected outcomes were actually achieved, and how many studies returned null findings. They challenged us to look beyond the popularly cited instances where significant impacts had been found.

Professor Duncan explored whether "two-generation programs" (a coordinated combination of parenting-focused programs directed at parents and child-based early childhood/education programs) can best help disadvantaged children. He explained that while both types of programs are effective, the combination is, disappointingly, not synergistically effective. Although there is some complementarity, the programs may, in part, substitute for one another. Parents' take-up of services and their engagement in specific programs also pose a significant problem, one that can be compounded when trying to deliver services simultaneously to two generations.

Professor Lynch outlined how Randomised Controlled Trials (RCTs) are accepted as one of the best forms of research evidence, with the capacity to show causal relationships between interventions (e.g., programs to improve parenting or children's literacy/numeracy) and outcomes (e.g., measures of child/family wellbeing). But if they are poorly conducted (e.g., with too few participants in particular subgroups of interest to give sufficient statistical "power" for the relevant analyses, which is often the case in programs directed towards disadvantaged groups), they won't assist policy makers in their task of implementing what works and, as a result, we can fail to bring about the expected improvements.
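To make the "power" point concrete, the back-of-the-envelope sketch below (ours, not Professor Lynch's; the effect size and subgroup numbers are purely hypothetical) uses Python's statsmodels package to show how little chance a small subgroup analysis has of detecting a modest program effect.

    # Illustrative sketch: statistical power in small subgroups.
    # Assumes a two-arm RCT analysed with a two-sample t-test;
    # the effect size and subgroup sizes are hypothetical.
    from statsmodels.stats.power import tt_ind_solve_power

    effect_size = 0.3  # hypothetical standardised effect (Cohen's d)
    alpha = 0.05       # conventional significance level

    # Power achieved with only 40 participants per arm in a subgroup
    power_small = tt_ind_solve_power(effect_size=effect_size, nobs1=40,
                                     alpha=alpha)

    # Participants per arm needed to reach the conventional 80% power
    n_needed = tt_ind_solve_power(effect_size=effect_size, alpha=alpha,
                                  power=0.8)

    print(f"Power with 40 per arm: {power_small:.0%}")  # about 25-30%
    print(f"Per-arm n for 80% power: {n_needed:.0f}")   # about 175

In other words, under these assumptions a subgroup analysis with 40 participants per arm would miss a real effect of this size roughly three times out of four.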

Professor Lynch went on to argue that administrative data are often an untapped goldmine and, when they are explored, can show unexpected results. They can challenge us to think about the context of our "usual care". Many of the characteristics of early programs targeted at disadvantaged groups are not dissimilar to current universal service delivery in Australia; for example, the number of perinatal visits currently accessed or the number of hours of early childhood education available to all children.

On an even more challenging and distressing topic, the exposure of children to sexual abuse in organisational contexts, Justice Jennifer Coate painted a positive picture of the importance of research to her role as one of six Commissioners of the Royal Commission into Institutional Responses to Child Sexual Abuse. Research data will sit alongside information gained from more than 40 public hearings and over 5,500 private sessions so far, including de-identified data collected from each private session. The Royal Commission has funded 100 research projects, involving over 70 experts from more than 30 research centres and universities; 24 of these projects have already been published. These will be used to help shape the Royal Commission's final recommendations when the Commissioners deliver their final report in December 2017.

A much-anticipated highlight of the conference program was the Q&A panel session "Show me the evidence!", hosted by the inimitable Annabel Crabb, writer and broadcaster with the ABC. After treating us to some timely, acerbic political wit in the wake of the federal election the previous weekend, she introduced each of the panellists: Brian Head, Professor of Public Policy at the University of Queensland; Anna Burke, former politician and Speaker of the Commonwealth House of Representatives; John Daley, CEO of the independent public policy think tank the Grattan Institute; Penny Armytage, former senior public servant and now a lead partner in professional services firm KPMG; and keynote speaker John Lynch, Professor of Epidemiology and Public Health at the University of Adelaide.

During the discussion, two of the key themes the panel touched on were:

  • The role of research evidence: It can be valuable in defining the problem and in understanding the issue for which an intervention or policy solution is needed; however, it needs to be balanced with "practice-based evidence". Decision-makers (and researchers) also need to maintain a level of scepticism, because if new data emerge, we might need to change our minds. Synthesis of the multitude of existing research is "evidence" in its own right but, as the conference keynotes highlighted, we need to consider the context in order to understand and make sense of the data. For policy makers, alongside the challenges of using evidence to inform policy decisions sit the challenges of implementation: decision-making is happening all the time at the program or practice level.
  • The need to communicate the science in understandable language: Communication should be based on dialogue between academia, policy and practice that reflects the "real world" decision-making and operating context. Policy makers and politicians need to make decisions "on the balance of probabilities", considering all elements, to identify the best options within the constraints. Practice-based evidence from the "coal face" is a valuable complement to research evidence from RCTs; it is about putting all the pieces together in a coherent way.

The "enemies" that get in the way of the effective use of evidence was another strong theme. These include:

  • Researchers rarely taking into account the demand pressures of the service system: They often lack pragmatism and sensitivity to the constraints of funding and the reality of the service delivery context. Having data and knowing how to use the data are two different things. The "all things considered" perspective that policy makers require can be challenging for researchers. Many senior decision-makers prefer research to be delivered face to face, so that researchers can help them make sense of the options being presented. Policy makers are frustrated by the over-cautiousness of researchers (whose main conclusion is often that further research is needed), which is divorced from the real world of policy-on-the-run decisions.
  • Timeliness: There is often a timing disconnect between research and policy, so researchers need to have a "back book" of research that can be brought out and used if and when the opportunity arises; it is often too late to start a research program once an issue becomes topical. Policy makers, in turn, need to be timely in accessing new and emerging information. How do we help them get access to the "back book" and what's in the bottom drawers of academics' desks, and then to make sense of it? Purposeful collaborations or intermediary organisations and "translators" can facilitate this access to people who "know stuff" and have the policy knowledge and skills to toss around the ideas in a pragmatic fashion.
  • Forgetting who we are trying to influence: Is it the minister? The department head? Or the general public? It is not just about convincing politicians or policy makers—it is also about bringing along the community. Communicating the science to the electorate is critical to effective evidence-based policy. We need to have better narratives about what we are trying to do. Public opinion isn't a constant. If academics don't step up, someone else will fill the knowledge void.

One of the key ideas to emerge from the panel discussion was using greater collaboration and the principles of co-design as a future direction for research: developing successful strategies that involve integrated policy action, based on partnerships with a shared interest in an agreed outcome. An example of where research, policy and practice have "collided" with positive results is road safety. If we can agree on the outcomes, we can empirically test the means (though such testing should not be used as a proxy for disagreement about the ends per se). Sharing the pain and difficulty of co-designing policy responses, based on pragmatic realities, is what is needed. However, many of the "drivers" of academia in Australia don't support this (e.g., the rewards for peer-reviewed journal publications). In the US, the greater level of funding from alumni and philanthropy for universities, institutes and think tanks has shifted the incentives towards more public engagement.

In conclusion, members of the panel observed that researchers often lack the ability to translate the results of their research into concrete actions, or to link them to current policy issues and public debate. Dealing with research data is a far more secure place to be than trying to influence the public mind. Sometimes the evidence is there but the public is not. Successful alliances have been based on the confluence of advocacy, evidence and rhetorical skills … plus patience.

The world of evidence is no longer a lonely ivory tower. Making use of "the facts" is a matter of collaborating, gathering intelligence, measuring (and influencing) public appetite, and deploying the evidence usefully. So, the "next steps" for improving the take-up of evidence in policy and practice are to build the "back book" of research through purposeful collaborations, to create nimble platforms (e.g., using administrative data), and to make the process of policy co-design easier.
