Dissemination of evaluation findings

 

Content type: Practice guide
Published: November 2013
As well as letting your funders, managers and staff know about the findings of the evaluation, there is a broader issue to be considered. Other providers of similar programs may not be able to improve services to their clients if they don't know what you learned. Regardless of whether the evaluation revealed that your program had a positive, negative or little measurable impact on participants, the findings would ideally be communicated to the broader sector so that the evidence base is expanded and, more importantly, future program participants can benefit from what you found. This is referred to as "dissemination".1


Avenues of dissemination

Ideally, a dissemination strategy will have been designed as part of the overall evaluation plan. It may also have been specified as part of a funding or partnership agreement. There are a number of ways in which you can share your findings, such as:

  • reports to the management of the agency or organisation, the funding body, staff and other stakeholders, including other practitioners and service providers, and to participants via face-to-face meetings or written, plain-language summaries of findings;
  • publication of articles in a journal, newsletter or other similar resource, or on your agency's website. Summaries can also be presented on relevant social networking services, including social media and research blogs; and
  • in person at conferences, seminars, workshops and network meetings, both within the agency and to the broader professional community.

It is important to realise that you can (and perhaps should) disseminate or publish your findings as widely as possible. Publication is not restricted to a single mode, such as a report or a journal article. While a report typically covers the entire evaluation, that evaluation may yield enough material for two or more journal articles, each focusing on a different aspect of the process, a particular element of practice, or a specific group of objectives.

Reporting evaluation findings

Evaluation findings are typically reported to the key stakeholders in a document that conveys the story of the program and its evaluation. Although reports will generally follow a standard outline, they tend to be structured and focused somewhat differently depending on the target audience and how the findings are to be used.

The purpose of an evaluation is one of the key decisions made at the beginning of the process,2 and it guides the type of evaluation conducted. At this later stage, the same purpose guides the type of report you write. An evaluation report can:

  • inform future long-range program or training decisions;
  • be used as evidence in your case for (re)funding;
  • help build community or stakeholder support;
  • be used as a comparison against past or future evaluations;
  • focus attention on key issues;
  • highlight differences between intended and actual outcomes for clients;
  • guide and justify resource allocation;
  • add to the evidence base about the effectiveness (or otherwise) of your program, or parts of it; and
  • encourage referrals from other agencies (Centers for Disease Control and Prevention, 2005; Office of Planning, Research and Evaluation, 2003).

Writing evaluation reports

Who is the audience?

Depending on who the report is for, you will need to take a somewhat different approach to its preparation. For instance, an academic audience will be more interested in details of the participants, the methods, procedures, instruments and statistical analyses than will, say, other providers and practitioners. Providers and practitioners are likely to be concerned with what the findings might mean for the way in which they work or the way in which the program is run. A service manager will look for the implications of the findings for allocation of funds or other resources, while funding bodies will be keen to know how the findings relate to their policies and whether their funding of the program should continue.

When writing your report, it may help to bear in mind that your evaluation may provide another practitioner with a model for evaluating their own programs. What would they need to know in order to conduct an evaluation of their own program?

Other differences might relate to the level of detail in the content of the report, and also to its length, the formality of the language, the structure, and the way in which the recommendations are framed. A tailored dissemination strategy may need to be devised for Indigenous participants and communities or for different ethnic and cultural groups. Such a strategy may include translation of final reports into relevant languages and/or face-to-face meetings with participants in their communities to talk with them about what the evaluation findings were and what they mean.

While the report should include information about all aspects of the evaluation, different parts of your audience are likely to be particularly interested in different elements.

If you ran your evaluation in partnership with an external evaluation consultant, they are likely to have experience in writing different types of reports for different audiences. Even so, it can help to ask the various stakeholders early in the process which aspects of the evaluation they most need to know about, so that the relevant information is recorded in a form that can be easily incorporated into the report. If your report is for a funding body, ask them about the key issues on which they will be expecting your comments or recommendations. These might include areas such as:

  • ongoing funding for the program;
  • further development of the program, or particular elements of it;
  • sustainability of the program;
  • impact of the findings on policy (when added to the current evidence base);
  • training needs and standards;
  • replicability of the program in other areas or rural/remote locations;
  • allocation of resources to further development; and
  • ways in which to improve client access.

A considerable amount of the content of the report can be written up well before the data are analysed and interpreted - for example, once the evaluation is underway, sections of the report such as the background, design, method, instruments and procedures can be developed.

General structure

The structure of the report will more or less mirror the stages of the evaluation. The following headings outline a common, structured format and can be used as a guide to what could be included in a comprehensive report, but it is a good idea to think carefully about the structure that will best suit your (the client's) needs.3 The headings given below will themselves probably not vary greatly across types of report, but the specific information and the amount of detail in each section will need to be tailored to the audience (Alston & Bowles, 2003).

Executive summary - An executive summary condenses the larger report so that a busy reader can understand the essential information about the evaluation, and then decide whether to read the entire document. While it comprises a part of the report, the executive summary is more than just a cut and paste of points from the main report and should be written so that it can stand alone. It sits at the front of the report but, as it needs to accurately reflect the entire process, it is usually written last. The executive summary will include an overview of the program, why it was developed and for which client group, the goals of the evaluation, the general method and design, key results, any limitations of the evaluation, and your conclusions and recommendations. While its structure and appearance can vary, often an executive summary will include a series of dot points, each focusing on a particular aspect of the program or the evaluation (University of Wollongong, 2000).

Background to the report - This covers the rationale behind the development of the program and the context in which it is offered; key characteristics about the program and the theory and research that underpins it; how and where it is delivered and who can deliver it; and the purpose of the evaluation.

Program goals and objectives - This includes the intended outcomes for clients in the short, medium and long term.

Evaluation design and method - This describes the type of evaluation design you used and how it was implemented: who participated, how they came to be in the program; what data were collected, how and by whom; and how the data were analysed.

Findings - The findings include descriptive statistics that report on the quantitative data you have collected about the participants (age, sex, any other demographic information), their scores on the measures used to evaluate the program objectives at each data collection point (pre-test, post-test, follow-up), which inferential statistics were employed to test the objectives (i.e., were there differences between the scores at different times?) and whether they indicated that any changes were statistically significant. Basic reporting of findings based on qualitative data will include the categories of information gathered (e.g., strengths, weaknesses, concerns, experiences, suggestions, etc.) and the key patterns, themes or associations that emerged from the data as they relate to the objectives.
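If you (or your evaluator) analyse the quantitative data yourselves, the sketch below illustrates, in Python as one possible tool, the kind of descriptive and inferential statistics described above. The scores, sample size and measure are hypothetical and for illustration only; your own analysis will depend on your design and measures.

```python
# A minimal, illustrative sketch only: descriptive statistics plus a
# paired-samples t-test for hypothetical pre-test and post-test scores
# on a single outcome measure. All figures below are invented.
from statistics import mean, stdev

from scipy import stats  # SciPy's paired-samples t-test

# Hypothetical scores for the same eight participants before and after the program.
pre_scores = [12, 15, 11, 14, 13, 16, 12, 15]
post_scores = [16, 18, 14, 17, 15, 19, 15, 18]

# Descriptive statistics for the Findings section of the report.
print(f"Pre-test:  M = {mean(pre_scores):.1f}, SD = {stdev(pre_scores):.1f}")
print(f"Post-test: M = {mean(post_scores):.1f}, SD = {stdev(post_scores):.1f}")

# Inferential statistic: did scores change significantly from pre-test to post-test?
t_statistic, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_statistic:.2f}, p = {p_value:.3f}")
```

The report would then translate this output into plain language for its audience: the average score, how much it changed, and whether that change was statistically significant.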

Discussion and conclusions/recommendations - In this section, the findings are analysed and discussed. Did participants' scores change across the period of data collection? For which measures? How did they change and what does that imply about the effectiveness of the program? What actions are suggested by the findings? What do the findings mean for the various stakeholders? How do your findings sit within the general knowledge or literature about your type of program? (Are they consistent with previous studies of this group or this type of program? Do they add something new to the evidence base?) What are the limitations of the evaluation and how do they affect your conclusions about the program?

References - If you used or referred specifically to any journal articles, books, websites or other sources of information about the theory or practice underlying your program, other similar practices or programs, evaluation principles, methods or techniques, or the measures you used, the references need to be included in the report.

There are standard systems for setting out references; guidance on referencing can be found online, in particular on many university websites.

Appendices - Here you can include examples of the information you gave to participants and others who provided data, the informed consent documents, the instruments you used, any supplementary tables or charts that are referred to in the report, and any other information that might be relevant to a reader or potential evaluator.

Although you are most likely to report your findings to funding bodies, agency management and/or staff, you might want to consider reporting your findings to a broader audience in a more formal way through a journal or other publication.

Box 1: Tips for writing an evaluation report

  • Tailor it to the aspects your audience is likely to be most interested in
  • Be clear, succinct and use impersonal, non-technical language
  • Include a table of contents
  • Include a summary of stakeholders' involvement and roles (if applicable)
  • State the focus and the context of the evaluation
  • Include the evaluation plan and procedures
  • Acknowledge the strengths, weaknesses and limitations of your evaluation
  • Explain any constraints on how the evaluation could be conducted
  • Make use of charts and tables to present data. Make sure these are labelled correctly and meaningfully.
  • Acknowledge the limits of your conclusions and recommendations
  • Have the report read by at least one colleague to ensure you have been impartial in your reporting and discussion of the findings
  • Have the final version professionally typeset and bound
  • Distribute copies widely to relevant providers and agencies

Source: Centers for Disease Control and Prevention (2005)

Publishing articles

Research journals

Journal articles can reach a diverse group of readers, including practitioners and academics outside of the specific part of the sector in which you practice. While an evaluation report is submitted to its funding body or stakeholder group and is, except in very unusual circumstances, accepted as it stands, there are no guarantees that an article submitted to a journal will be accepted for publication. Journal quality (and reputation) can vary considerably, and acceptance is often contingent on meeting strict criteria.

The essential information in a journal article is largely the same as in an evaluation report, but the focus and appearance of the two are likely to differ considerably. Depending on the journal, an article may require a more formal and concise style of writing and an emphasis on the more technical details of how the evaluation was carried out. Unlike an evaluation report, where you are writing for your agency or funding body, journal articles are typically reviewed before being accepted for publication, so your work will usually be closely examined by at least one independent reviewer before it reaches its audience. Certain criteria will need to be met, including how well the evaluation was conceived, carried out and interpreted. Because of this extra scrutiny, publication in a journal lends an extra layer of credibility to the evaluation.

If you do decide to report the evaluation in a journal article, the first thing to do is familiarise yourself with other similar articles in the journal and with the journal's guidelines to authors. In this way, you can gauge the expected language, types of information to be included, and level of detail. Then you can ensure that the relevant information about, for example, the instruments you used to collect data, is available to you when writing the journal article.

Box 2: Which journal? Things to consider

  • Which journals do you go to for information relevant to your sector?
  • Which journal will allow you to get the findings most directly to your target audience? Will the article be available as an "open access"4 article - i.e., will it be accessible online and cost-free for other service providers and practitioners to access?
  • How important is it that the journal be refereed (peer-reviewed)?
  • How long will it take for the journal to actually be published? Is there a quicker way to get the information to the sector?
  • Is an e-journal an appropriate vehicle for reporting your evaluation?
  • Is there a website that could host your article?

Box 3: General style

While your report is essentially telling the story of the evaluation, its language and tone should be pitched at the target audience. As a general rule, and especially for reports to readers outside of the agency, the style of the report should be impersonal and impartial. Don't refer to yourself or the evaluation team in the first person, but state what occurred; for example, "It was decided to randomly allocate participants to control and intervention groups", rather than "I randomly allocated participants to control or intervention groups".

Other places to publish

You can extend the reach of your evaluation even further by submitting it to one or more of a range of newsletters published within the sector, by state and national peak bodies and clearinghouses such as the CFCA information exchange.5 Organisations such as Families Australia, Family Relationship Services Australia, and the various Councils of Social Services (Australian and state/territory) may also offer opportunities for dissemination. There are other online options to share your findings, such as The Conversation - an independent, open access news and commentary site - and through organisation social media sites, such as Facebook or Twitter.

Consider also those avenues that might be slightly outside of your particular part of the sector. For example, could the way in which you conducted your evaluation provide a model that could be applied in a parallel sector such as corrections, juvenile justice, youth, homelessness or family violence? The type of article you write for this purpose might focus on demonstrating the process you went through, the decisions you made, and the structures or resources that had to be put into place.

Practice profiles are another way to share information about your program with the sector. Practice profile databases6 either have a standing, open submission process or periodically issue a call for submissions, typically posted on the organisation's website. To be added to a database, a profile needs to meet a set of criteria, usually relating to the amount and quality of the evidence demonstrating the program's effectiveness, and will be assessed against those criteria by a panel of evaluation professionals. These procedures help ensure that the programs listed in the databases are indeed those likely to have positive benefits for participants.

Presentations

One effective way of letting the sector know about your program evaluation is through presentations at conferences, seminars, forums and workshops, where you can interact directly with other providers. A brief one- or two-page flyer presenting the key results and implications of the evaluation, and/or a poster presentation, can also be an effective way to disseminate your findings at these venues.

Some guidelines for putting together an engaging and effective presentation follow.

  • A useful framework for thinking about your presentation is:
    • tell them what you are going to tell them;
    • tell them; and
    • tell them what you told them.

This translates to the introduction, main body and summing up of your presentation. Each section can then be broken down into the key points you need to make.

The purpose of the presentation is to share the key messages from the evaluation clearly and succinctly, not convey every step in minute detail. Think about who the audience will be and how much they will already know about (a) programs of the type you evaluated, and (b) evaluation itself, and then tailor your material accordingly. Make your points simply, and make sure they have a logical sequence.

Decide what your aim is for the presentation and then focus on the information that will help you achieve that aim. If your aim is to share with other providers your experience of program evaluation in a service environment, then you might focus on how you put the evaluation plan into action and what you needed to do to ensure the process unfolded smoothly. On the other hand, if your aim is to inform people about the effectiveness of your program, then you would spend less time on the mechanics of doing the evaluation and focus more on the findings, what they mean, and how you responded to them.

Create an engaging presentation and use technology sparingly. By their attendance, the audience has already indicated an interest in the topic. Demonstrating your command of the intricacies of PowerPoint is unlikely to make your message any clearer, but it may make your presentation memorable for the wrong reasons (especially if the technology fails). There is plenty of information on the Internet about creating good presentations; see, for example, the Primary Health Care Research and Information Service's How to Design a Great PowerPoint Presentation.

Box 4: Finding a forum for a presentation

To locate a conference or other forum for talking about your program evaluation:

  • go to the Australian Institute of Family Studies' web page listing conferences related to family wellbeing;
  • subscribe to email newsletters or alerts of a range of organisations (including some that are not specifically related to your part of the sector) such as:
    • clearinghouses;
    • research institutions;
    • university departments and research centres;
    • government departments;
    • industry representative bodies; and
    • the larger organisations providing family & relationship services.

Other ways to share the information include:

  • arranging to give a presentation at a networking meeting;
  • inviting other providers to a meeting, seminar or webinar;
  • holding an in-service professional development session for providers of similar programs;
  • participating in webinars and videoconferences; and
  • recording your presentation and posting it on your agency website as a podcast.

Are we there yet? Continuing the cycle: Evaluation as part of the service environment

So, the evaluation is complete, the findings and recommendations applied and implemented (or not), and you have let the sector know what your evaluation found. Job done? Not yet.

An ongoing process of review, response, evaluation and, when appropriate, innovation is the most effective way to ensure that clients receive the best possible services and programs, because it identifies what works and should be retained as well as what doesn't work and should be modified or deleted. Making a commitment to evaluation at an agency or organisational level means this process becomes embedded in the service environment, so that it is part of "how we do things here". In a way, this is not too far from the principle of reflective practice that is already part of the daily activities of many practitioners, and the action research methods7 that many services employ to monitor and inform their practice.

References

  • Alston, M., & Bowles, W. (2003). Research for social workers: An introduction to methods (2nd Ed.). Crows Nest, NSW: Allen & Unwin.
  • Chambers. (1997). Chambers combined dictionary and thesaurus. Edinburgh: Chambers.
  • Centers for Disease Control and Prevention. (2005). Introduction to program evaluation for public health programs: A self-study guide. Atlanta, Georgia: United States Department of Health and Human Services. Retrieved from <www.cdc.gov/eval/guide/>.
  • Davidson, E. J. (2007). Unlearning some of our social scientist habits. Journal of Multidisciplinary Evaluation, 4(8), iii-vi. Retrieved from <journals.sfu.ca/jmde/index.php/jmde_1/article/download/68/71>.
  • Office of Planning, Research and Evaluation. (2003). The program manager's guide to evaluation. Washington, DC: US Department of Health and Human Services, Administration for Children and Families. Retrieved from <www.acf.hhs.gov/programs/opre/research/project/the-program-managers-guide-to-evaluation>.
  • University of Wollongong. (2000). Report writing: The structure of business reports. Executive summary. Wollongong: University of Wollongong, UniLearning. Retrieved from <unilearning.uow.edu.au/report/4b.html>.

Footnotes

1. Dissemination is defined here as the wide circulation of news (Chambers, 1997).

2. See also Box 2 in Evaluation and Innovation in Family Support Services.

3. For an alternative structure, see Davidson (2007).

4. See Australian Open Access Journals for further information. Other journals may offer a selection of articles as open access.

5. The CFCA information exchange publishes short articles, including summaries of and links to evaluation reports, on the news section of the website, CFCA Connect - see the author guidelines.

6. Such as the Promising Practices Network and Aboriginal and Torres Strait Islander Promising Practice Profiles (AUS) <https://apps.aifs.gov.au/ipppregister>.

Acknowledgements

This paper was first developed and written by Robyn Parker, and published in the Issues Paper series for the Australian Family Relationships Clearinghouse (now part of CFCA Information Exchange). This paper has been updated by Elly Robinson, Manager of the Child Family Community Australia information exchange at the Australian Institute of Family Studies.
