Reporting

Utilization of Evaluation Results

How the results of an evaluation are communicated should be informed by the original purpose of the evaluation. For example, if an evaluation was intended to generate knowledge, the findings should be written so that stakeholders can readily apply the results. Alternatively, if an evaluation was primarily for the purpose of accountability, the results should be formatted to demonstrate compliance with a funding agreement, or presented in a way that speaks directly to the public about the value or effectiveness of a program.

Rossi, Freeman, & Lipsey (2004) highlight a number of important considerations for encouraging utilization of evaluation work. They suggest evaluators include a dissemination plan as part of their overall design. The dissemination plan should consider the cognitive style of the decision makers who will review the results, and its schedule should allow decision makers sufficient time to factor evaluation results into their decision making. Rossi et al. also encourage incorporating an assessment of the utilization of evaluation activities into the evaluation itself.

Telling a Story

I spoke with Kristy Rhyason about the role of communications in presenting evaluation data. Kristy is an experienced communications professional working in the non-profit sector and private industry in Edmonton, Alberta.

Kristy emphasized the importance of “telling a story”. As you determine what format your reporting will take, consider best practices in your field. For example, below are the annual reports of three Canadian food banks. Although not strictly evaluation documents, they represent important evaluative activities for non-profits.

Each of these annual reports has a number of overlapping characteristics. What you might notice when comparing them is the differing emphasis on qualitative versus quantitative data. The Greater Vancouver Food Bank report relies heavily on text descriptions, which convey a sense of the importance of community building. The Edmonton Food Bank report has large numbers printed across its pages to emphasize quantitative information; this is particularly effective at informing the reader of the rising demand for food bank hampers. The Winnipeg Harvest report is a blend of both approaches. Consider the type of information that you want to bring to the forefront in your reporting.

Communicating Clearly

“The main difference between effective and ineffective data displays is their ability to communicate the evaluator’s key message in a clear and straightforward way such that it does not overload a viewer’s working memory capacity.”

Azzam, Evergreen, & Metzner (2013)

When presenting data it is important to make sure that it is represented fairly. Azzam et al. (2013) caution against the use of visuals that could imply unsubstantiated causation. Not all sets of data in an evaluation require visualizations. Evaluators should carefully consider the evaluation questions and what they want the audience to take away from their reporting. Naidoo & Campbell (2016) compiled a list of best practices based on a literature review:

  • Colours – Consistent colour palettes and themes support memory retention. Too many colours can be distracting; instead, opt for high contrast between important visuals.
  • Organization – Enhance memory retention by using headings, logical sequencing of information, and grouping of complementary data.
  • Infographics – Coordinate visualizations through the use of repeated symbols and shapes, readable labels, reduced clutter, clear legends, and visual cues.
  • Readability – Typography matters: choose an easy-to-read font and include white space between headlines, text, and images.

When working with tools to create visualizations, there is a temptation to incorporate exciting new features. It is important to self-edit and adhere to the guidelines listed above. Sometimes the default settings of programs like Excel will create cluttered visuals. Azzam, Evergreen, & Metzner (2013), for example, recommend deleting redundant labels or legends to streamline images for readability and clarity. They give the specific example of an auto-generated bar graph in Excel that includes a legend, grid lines, and data labels on the individual bars. Removing some of the repetitive elements creates a more readable visual.
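The same decluttering can be scripted in charting libraries as well. Below is a minimal sketch in Python with matplotlib, using made-up survey numbers, that strips the redundant elements a default chart often carries (the frame, grid lines, and an axis that duplicates the data labels) and labels each bar directly instead.

```python
import matplotlib.pyplot as plt

# Fictional survey numbers, for illustration only
skills = ["Measuring", "Mixing", "Kneading", "Proofing", "Baking"]
scores = [4.2, 3.8, 2.9, 3.1, 4.5]

fig, ax = plt.subplots(figsize=(6, 3.5))
bars = ax.barh(skills, scores, color="#2a6f97")

# Declutter: drop the frame, grid lines, and the x-axis,
# since each bar will carry its own data label
for side in ("top", "right", "bottom"):
    ax.spines[side].set_visible(False)
ax.grid(False)
ax.set_xticks([])

# Label each bar directly instead of using a legend plus an axis
for bar, score in zip(bars, scores):
    ax.text(bar.get_width() + 0.05, bar.get_y() + bar.get_height() / 2,
            f"{score:.1f}", va="center")

ax.set_title("Average self-rated skill (1 to 5)")
plt.tight_layout()
plt.show()
```

Because each bar carries its own label, neither a legend nor an x-axis is needed, which is exactly the kind of repetition Azzam, Evergreen, & Metzner (2013) suggest cutting.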

To demonstrate these best practices in action, I have created two versions of the same graph. The graph shows the results of a student survey for a fictional baking course. The first graph below is an example of what not to do: the colours carry no meaning, the font is blurry, the 3D effect is distracting, there are a number of redundant labels, and the overall appearance is cluttered.

This next graph is clean and concise. There is good use of white space, and the image is at a high enough resolution to be clear and legible. The darker the colour, the stronger the indicator; this makes it easy to pinpoint areas of weakness in a long list of skills by scanning for the lighter colours. If the colour scheme were repeated throughout the evaluation document, the reader could quickly find meaning in each visual. The skills are ordered in a logical sequence for bread making.
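For readers who build their charts in code, here is a rough approximation of that clean version in Python with matplotlib. The skill names and scores are invented stand-ins for the fictional survey, and the sequential "Blues" colormap maps darker shades to stronger results, as described above.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm

# Invented scores for the fictional baking-course survey,
# kept in the logical order of the bread-making process
skills = ["Measuring", "Mixing", "Kneading", "Proofing", "Shaping", "Baking"]
scores = np.array([4.6, 4.1, 2.7, 3.0, 3.4, 4.3])

# One sequential palette: darker shade = stronger result,
# so weak areas stand out as the lightest bars
colors = cm.Blues(scores / 5.0)

fig, ax = plt.subplots(figsize=(6, 3.5))
ax.barh(skills, scores, color=colors)
ax.invert_yaxis()  # read the process order from top to bottom
for side in ("top", "right"):
    ax.spines[side].set_visible(False)
ax.set_xlabel("Average student rating (1 to 5)")
ax.set_title("Bread-making skills: student self-assessment")
plt.tight_layout()
plt.show()
```

Keeping the bars in process order rather than sorting by score preserves the logical bread-making sequence, while the single colour ramp still makes the weak spots easy to find.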


References

Azzam, T., Evergreen, S., Germuth, A., & Kistler, S. (2013). Data visualization and evaluation. New Directions for Evaluation, 2013(139), 7–32. https://doi.org/10.1002/ev.20065

Azzam, T., Evergreen, S., & Metzner, C. (2013). Design principles for data visualization in evaluation. New Directions for Evaluation, 2013(140).

Naidoo, J., & Campbell, K. (2016). Extended abstract: Best practices for data visualization. 2016 IEEE International Professional Communication Conference (IPCC), 1–3. https://doi.org/10.1109/IPCC.2016.7740509

Rossi, P. H., Freeman, H. E., & Lipsey, M. W. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: SAGE Publications.