This resource offers some basic tips for reporting evaluation results so that they are relevant to your audience.
Evaluation results are used to improve programs, sustain positive outcomes, and strengthen the community’s overall plan for addressing substance abuse and promoting wellness. They can also serve other purposes, such as helping to obtain funding and building community awareness and support for prevention.
This means evaluation results need to reach the people who can use them. Keep in mind that organizations don’t use evaluation results; people do. The Department of Health, for example, isn’t going to use the results of an evaluation, but “Cathy Smith” in the Department of Health may. Unless you get the results of the program evaluation into her hands and explain how she can use them, they will sit on a shelf somewhere in the Department of Health.
Follow these general guidelines for reporting your results:
Remember that each stakeholder has his or her own interests and may need different kinds of information about the results of an evaluation. One size will not fit all when sharing evaluation results with stakeholders.
The following set of questions can guide how evaluation results are presented, in order to ensure that results are relevant to various stakeholders and community members:
Answering these questions will help you determine the presentation of your evaluation results to various stakeholders.
Several factors may influence whether and how your evaluation results get used, so keep these in mind:
— The way in which findings are reported, including layout, readability, and user-friendliness, makes a difference. Timing is also critical. If a report is needed for a legislative session but isn’t ready in time, the chances of the data being used drop dramatically.
— The quality of the evaluation and the relevance of the findings matter. If the evaluation design is logically linked to the purpose and outcomes of the project, the findings are far more likely to be put to use.
— The availability of support and technical assistance after findings are reported can sustain use. Questions of interpretation will arise over time, and people will be more likely to use the results if those questions can get answered.
— The political context or climate can have an impact. Some evaluation results will get used because of political support, and others will get squashed because of political pressure.
— Other factors, like the size of your organization or program, may matter as well. Sometimes larger programs get more press. Sometimes targeted programs do.
— Consider competing information. For example, are there results from similar programs that confirm or contradict your results? Are there other topics competing for attention?
Developed under the Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for the Application of Prevention Technologies contract. Reference #HHSS277200800004C. For training and/or technical assistance purposes only.