Telling Stories through Meaning-Making


I recently attended the 2023 American Evaluation Association Annual Conference to present on how ICH used social network analysis for one of our evaluations. The theme of the conference was “The Power of Story,” and it was full of evaluators sharing the methods they have used to tell the stories of their work and help partners and clients develop a deeper understanding of what is learned through evaluation (1).

This theme aligned well with the approach ICH takes to evaluation, as well as my own approach, which is informed by my background in anthropology. Often the most effective way to help clients reflect on and engage with what we learn is to share the story of our data with them. One of my goals as an evaluator is also to make sure that clients hear and understand the stories of the community members we work closely with during an evaluation. Social network analysis is one tool ICH has found especially effective for understanding and visualizing relationships among partners on a project, which is an important piece of the overall story of their work. I spent the entirety of my session talking with people who wanted to learn from our experience with this process. The common theme was that many felt a better understanding of the relationships among the community members and organizations they work with was critical, but they did not know how to approach collecting that information. The conference gave us a great opportunity to share our strategy with other evaluators.
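For readers curious about the mechanics, here is a minimal, illustrative sketch of the kind of analysis social network analysis makes possible. It uses the Python library networkx with made-up partner organizations and collaboration ties; it is not ICH’s actual data or code.

```python
# Illustrative sketch only: a toy partner network with hypothetical organizations,
# not ICH's actual data or analysis. Assumes networkx and matplotlib are installed.
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical collaboration ties, as might be reported in a partner survey
ties = [
    ("Health Dept", "Community Clinic"),
    ("Health Dept", "Food Pantry"),
    ("Community Clinic", "Housing Coalition"),
    ("Food Pantry", "Housing Coalition"),
    ("Faith Network", "Food Pantry"),
]

G = nx.Graph()
G.add_edges_from(ties)

# Degree centrality: a simple measure of how connected each partner is
centrality = nx.degree_centrality(G)
for org, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{org}: {score:.2f}")

# A quick visual of the network, with node size scaled by centrality
nx.draw_networkx(
    G,
    node_size=[3000 * centrality[n] for n in G.nodes()],
    node_color="lightsteelblue",
    font_size=8,
)
plt.axis("off")
plt.show()
```

Even a toy example like this shows how centrality scores and a simple network drawing can surface which partners sit at the center of a collaboration, which is the kind of relationship story an evaluation can help tell.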

Conference panels were also a great opportunity for me to improve my own evaluation skills, and I want to highlight one strategy in particular, “meaning-making,” which I learned about during a panel led by Lenka Berkowitz and Elena Kuo (2). The panel focused on the value of evaluators leading “meaning-making sessions” with program staff to facilitate “group reflection to uncover new insights from data and generate actionable next steps for program improvement.” The topic resonated with me because ICH regularly uses reflection sessions with clients to encourage them to pause and think through what our evaluation has revealed.

Since returning from the conference, I’ve already used what I learned to improve how I facilitate reflection sessions. The meaning-making structure the panelists proposed is for evaluators and program staff to review the data collected in an evaluation and then come together to work through a series of questions: What in the findings was expected? What was surprising? What questions did the data generate for you? What should be done now that we have this information?

Shortly after the conference, I met with a client for a reflection session to look back at the data we had collected over the past year with several cohorts who had participated in the program being evaluated. Influenced by what I had learned at the AEA conference, I broke the data into four sections and spent time with the client going through the first three questions for each one. At the end of the session, I focused the discussion on how what we had learned could shape the future design of the program. The client shared that they were glad they had taken the time to engage with the evaluation in this way and already had new ideas for what to do next. I found that this structure helped me facilitate a complex conversation, one that allowed the client to think constructively about a large amount of information without being overwhelmed by the data set; in short, it helped them see the story within the data. I’m excited to keep refining my approach to facilitation with this strategy, and to keep learning tools and approaches that help us tell compelling stories about what we learn through evaluation.


  1. American Evaluation Association. “The Power of Story,” Evaluation 2023 Conference Theme. https://www.eval.org/Events/Evaluation-Conference/Conference-Theme
  2. Berkowitz, Lenka, and Elena Kuo. “Working with Program Staff to Create a Compelling Narrative of the Program’s Impact Through Meaning-Making.” Panel at the AEA 2023 Annual Conference.

Amanda Robinson, PhD

Senior Research and Evaluation Project Manager