The Art of Measuring Change

Written by Jannik Kaiser on June 30, 2021
Teamwork, Impact

What are your associations with words such as “measurement”, “evaluation” or “indicator”? For many people these words sound annoying at best, and there are good reasons for that. Yet - given the fact that you clicked on this article - it’s also very likely that you are motivated to contribute to social change, and the main intention of this article is to share the beauty and collaborative power of attempting to measure it. What I am not doing is offering quick solutions, in the same way that art is not a solution to a problem. Rather, this is an invitation to step back, look at the nuanced process of change with curiosity and an inquisitive mind, and maybe discover something new. The article is also available as a dynamical systems map.

In a nutshell

There are two ways of reading this article: 

  1. Exploration mode: Use the systems map provided above as a visual representation to navigate through the article more flexibly. 
  2. Focussed mode: Stay here and follow the flow of the article.


Here is a quick overview of the main points and arguments: 

  1. There is a good reason to be skeptical of impact measurement. If the approach taken is too simplistic and/or mainly serves to validate a perspective (e.g. of the donor), it’s almost impossible to measure what truly matters. Rather, such an approach further perpetuates existing power imbalances and puts beneficiaries at risk. (section: The dangers of linearity)
  2. The measurement of social phenomena has to do justice to the intricacies and complexities of a project, a program, or any given action. This requires understanding, which puts empathy, listening and collaboration at the centre of an empowering approach to measuring change. (section: About potluck dinners and “power with”)
  3. Meaningful indicators serve as building blocks in the attempt to capture social change. An indicator is meaningful when it not only follows the SMART principles, but is also systemic and relevant, shared (understood and supported by all stakeholders) and inclusive (aware of power relationships). (section: Indicators that matter)
  4. Complexity science can provide an alternative scientific paradigm to understand and make sense of our world and, hence, to measure change. (section: Coming back to complexity)
  5. It’s time to equally distribute power, namely the ability to influence and shape “the rules of the game”, or potentially even the kind of game that is being played in the first place.  A crucial step towards this is describing and defining what desired social change is and how we go about measuring it in a collaborative way. (section: Synthesis)


The starting point

I have been working in the field of impact evaluation and measuring social change for around 6 years now. I’ve interviewed cocoa farmers in Ghana, developed surveys and frameworks, crunched Excel tables of different sizes and qualities, mapped indicators, and held workshops on tracking change in networks. Doing all of this is way more than a job to me. I have met wonderful people on this path, have been part of impactful projects and really had the feeling of being able to contribute in a meaningful way. 

 

To me, impact evaluation is an ambitious, creative and collaborative application of complexity science (more on that later). It can be a deeply empowering process that reveals hidden opportunities and structural challenges, creating empathy between groups of people. 


And it can be harmful. 


The dangers of linearity 

Let me share a definition with you to explain what I mean: 


“An impact evaluation analyses the (positive or negative, intended or unintended) impact of a project, programme, or policy on the target population, and quantifies how large that impact is.”*


I agree with that definition in general, yet when it comes to the details of the language, my opinion differs significantly. For example: 

  • Nobody is or should be a target. We don’t shoot projects, programs or policies at people; we work with people. 
  • Quantification plays a role, but we should also qualify impact. 
  • It does not say who has the power to define “positive or negative” and “intended or unintended”. 


These are not trivial semantic differences. All too often I have experienced situations where impact evaluation was used by organisations to prove and validate the effectiveness of their services, rather than to truly understand the perception and consequences of what they offer. 

Here is an example of what I mean: A couple of years back I had the opportunity to visit cocoa farmers in Ghana for an impact evaluation for an NGO. We found out that, yes, the NGO’s activities were actually having an effect and contributing positively to farmers’ income. But we did not capture in writing the fact that most farmers had stopped growing subsistence crops for their own consumption, hence making themselves more vulnerable to global trends and market prices. A few months after I left, the market price for cocoa dropped significantly. We did not capture it because it was not part of our evaluation framework or our mental model of what counts as relevant information.* 


This is what happens all too frequently: We have pre-defined and pre-conceived notions of what counts as a data point, and what doesn’t. This, again, is usually based on a linear model of thinking (aka “A” leads to “B”) that does not account for the complexities and intricacies of human interaction and social systems. I have talked to so many people in the field who were frustrated, stating something along the lines of “we don’t measure what is truly relevant”. 


To put it another way: If empathy and understanding are not at the centre of impact evaluation, both as the foundation and the goal, then it might be (unintentionally!) used against the people we want to benefit. It’s like having a knife in your hand. You can use it to prepare a delicious meal, but also to hurt yourself and others. 



About potluck dinners and “power with”

Let’s dig a bit deeper into that: We’ve learned that what we do in impact evaluation is to assess the change attributed to an intervention (a project, program, workshop etc.). You do something and then something else happens. Now, we want to understand the “somethings” and how they are connected.  This is not as simple a task as it may seem. Reasons include: 

  • We humans love to make our own meaning. You (and your organisation) probably know what the intervention is, but others perceive it from their own perspective and life reality. 
  • Change is like air: Ever-present and hard to grasp. The “mechanics” of change can’t be pinned down easily and require us to look at context & conditions. Similarly, we know how difficult predictions are. Should we trust the weather forecast? For the next 3 to 5 days probably, but beyond that? 
  • There is power in the game: Oftentimes, multiple stakeholders and their multiple opinions are involved in an evaluation. There is nothing wrong with that per se, yet it is important to acknowledge.

The art of impact evaluation, therefore, is not to publish fancy reports, but to apply it in such a way that it deepens the internal understanding of the issues at hand and strengthens the collaboration between actors. When this is the case, it builds empathy and human connection, and enables stakeholders to jointly develop shared meaning and scenarios for social change and transformation. 


In other words, impact evaluation is based on and deepens listening. Listening not only to people, but to their context and to groups of people, to underlying and invisible challenges and hopes.* 


In a more recent project I worked on, we called all stakeholders together from the very beginning. It was clear who “has the money”, but we also shared the ambition to collaborate on a level playing field. As a consequence, we co-developed the project’s vision and the evaluation framework. We made clear that the first round of data collection also serves the purpose of understanding what is relevant, and that the project goals would be further developed based on the “reality on the ground” rather than the content of the funders’ strategy paper. 

We created the space for a genuine dialogue, and the data collected further strengthened the mutual understanding and trust in the group and in the process. It revealed further challenges as well as the leverage points for creating systemic impact. It also became clear that power is not only linked to money. 


“What makes power dangerous is how it’s used. Power over is driven by fear. Daring and transformative leaders share power with, empower people to, and inspire people to develop power within.” (Brené Brown)*


What we did was ask a different question than usual. The central question was not “how can one central stakeholder prove their impact?”, but rather “how can we jointly contribute to desired change for a matter that we all care about?”. Shifting the question means that stakeholders and their individual contributions are valued differently, and that the quality of interaction changes. 


Speaking in the metaphor of our delicious meal: We did not go into a restaurant where you tell somebody what should be cooked for you. Rather, we sent out an invitation to join a potluck dinner. Organising a dinner in this way does not happen at random. It requires clarity on roles and contributions, as well as shared agreements (e.g. on when and where to meet). The fundamental difference from a restaurant visit, though, is that it values participation and what each stakeholder (dinner guest) brings to the table, which in turn requires a mindset of being curious and open to surprises.  


So far so good, but how do we actually measure change?


Indicators that matter

My experience tells me that the success of your impact evaluation endeavours can be measured in the number of “aha” and “wow” moments you experience in the process. To me, these moments capture the quality of interaction and human connection in a project. They indicate that we are listening to each other. 


Now, as we enter the terrain of indicators, let me start with a confession: I love the word “indicator”! 

Having said that, I probably have a different association with the word than 98% of the population. Its origins can be found in the Latin “indicare”, meaning to point out or show. It also means “to be a sign of”. We humans use indicators all the time, for example when our rumbling belly indicates our body’s need for nutrition or when, depending on the cultural setting we are in, a smile indicates that the other person agrees with us. I would even go so far as to say that we would not be able to cope with the complexity of life without using indicators. They give us access to phenomena we simply don’t have direct access to but which are crucial for planning our actions (like the intentions of the other people we interact with). The main question is how good the indicators are and how well they serve us. 

 

To me, thinking of and working with indicators is more creative than technical, more explorative than descriptive. 

One challenge arises when indicators and their meaning are less direct.

Interpreting a rumbling belly is quite straightforward, but what are good indicators for understanding and measuring aspects such as team culture or a shift in mindset? Here are two examples of “alternative” indicators I have pondered over the past weeks: 


  • The “going above and beyond” indicator: How often do teammates share cookies (or comparable treats) with one another - without being asked to do so? How often do clients share something with you that goes beyond what was officially agreed (e.g. an honestly written thank-you note)? These actions are an indicator of trust and appreciation, and although they are not a goal in themselves, they are valuable feedback about the conditions for collaboration. 
  • The “what’s the quality of goodbye?” indicator: People come and go, including within your organisation or project. I find it interesting to look specifically at HOW people leave. Is it in conflict or with frustration, or is it with gratitude and encouragement? The way we say goodbye says a lot about the way we worked together - and probably also about how we work with colleagues who are still around. 


Indicators are then usually packaged and used to formulate goals, such as improving team spirit. One commonly used framework to design good indicators and goals is “SMART”, which says that goals should be specific, measurable, assignable, realistic and time-bound. Similar to the definition of impact evaluation, I agree with this approach in general. Nevertheless, adhering (only) to these aspects carries a certain risk: Indicators (and goals) are never as neutral as these five characteristics might suggest. Instead, they will always be an expression of our cultural backgrounds, intentions, beliefs and the sector we are in, and therefore run the risk of imposing a certain belief system onto others. Even the intentions of “doing good” or “empowering others” are no exception. My suggestion, therefore, is to add the characteristics of systemic (measuring something that matters), shared (understood and supported by the stakeholders involved) and inclusive (sensitive to power relationships). 
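To make this a little more concrete, here is a minimal, hypothetical sketch in Python. The `Indicator` class and its field names are my own illustration (they are not part of this article’s method or of any existing tool); the idea is simply to treat each indicator as a shared checklist that all stakeholders walk through together before adopting it.

```python
# Hypothetical sketch: an indicator treated as a shared checklist, combining
# the SMART criteria with the three additional characteristics suggested
# above (systemic, shared, inclusive). Names and structure are illustrative.

from dataclasses import dataclass, field


@dataclass
class Indicator:
    name: str
    description: str
    # SMART: specific, measurable, assignable, realistic, time-bound
    smart: dict = field(default_factory=lambda: {
        "specific": False, "measurable": False, "assignable": False,
        "realistic": False, "time_bound": False,
    })
    systemic: bool = False   # does it measure something that truly matters?
    shared: bool = False     # understood and supported by all stakeholders?
    inclusive: bool = False  # sensitive to power relationships?

    def ready_to_adopt(self) -> bool:
        """Only adopt an indicator once every criterion has been reviewed jointly."""
        return all(self.smart.values()) and self.systemic and self.shared and self.inclusive


cookie_sharing = Indicator(
    name="Going above and beyond",
    description="How often teammates share treats without being asked to do so",
)
print(cookie_sharing.ready_to_adopt())  # False until every box has been ticked together
```

The value of such a checklist lies less in the code than in the conversation it forces: none of the boxes can be ticked by the donor or the evaluator alone.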


Measuring change then requires us to develop and choose these indicators wisely and jointly. To me, that oftentimes feels more like co-writing a piece of poetry than developing a checklist. 

Of course there is much more involved in setting up a rigorous evaluation framework, but to me the essence is always about developing indicators that matter. 



Coming back to complexity

At the beginning of this article I wrote that “impact evaluation is an ambitious, creative and collaborative application of complexity science”. Let’s unpack that sentence further. 


An alternative word for “ambitious” could have been “impossible”, but that sounds less motivating. What I mean is that we simply can’t measure social change in its entirety. The limitations are not only conceptual, but rooted in the science of change itself. There are systems which are dynamic and chaotic in nature (see the weather), and even the most powerful computers can’t deal with them* - and they never will be able to. We can, however, try to get as close as we can, seek to understand, choose meaningful indicators and set up an evaluation framework in such a way that it is truly empowering. 
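To make the point about chaotic systems tangible, here is a small, self-contained Python illustration (my own sketch, not part of the original article) using the logistic map, a textbook example of deterministic chaos: two starting values that differ by one part in a million quickly end up on completely different trajectories, which is exactly why long-range prediction breaks down.

```python
# A minimal illustration of sensitivity to initial conditions: the logistic
# map x_{n+1} = r * x_n * (1 - x_n) with r = 4 is fully deterministic, yet a
# measurement error of one millionth makes the two trajectories diverge until
# one tells us nothing about the other.

def logistic_trajectory(x0, r=4.0, steps=40):
    """Iterate the logistic map from x0 for the given number of steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs


true_state = logistic_trajectory(0.200000)   # the "real" starting condition
measured = logistic_trajectory(0.200001)     # our measurement, off by one millionth

for n in (1, 5, 10, 20, 30, 40):
    gap = abs(true_state[n] - measured[n])
    print(f"step {n:2d}: {true_state[n]:.6f} vs {measured[n]:.6f}  (gap {gap:.6f})")
```

No amount of computing power removes this limit; extra precision only pushes the horizon of useful prediction out by a few steps.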


Here, creativity and collaboration come into play. Creativity means starting from a place of curiosity, inquiry and not knowing, while also applying a certain structure and set of skills to the subject matter. A creative process might start with a blank canvas, but it then requires dedication, time and skill. Let’s think again about our potluck dinner: We have all been at parties or events that were rather boring, and at others where we felt truly invited, excited to contribute, show up and participate. What’s the secret of the latter? A caring host, a clear invitation, trust in the other people at the event, and so on.  

Impact evaluation is no different. Creating the conditions for stakeholders to actively participate can be done intentionally.


The beauty is that we can - and should - do this jointly. I am using the word “should” here consciously. Including beneficiaries (and other stakeholders) early on is far more than a “nice to have”. It is the most central part of making an impact evaluation meaningful and empowering. 


And what about complexity science? Well, the title of the article could have been “on measuring change from a complexity science perspective” instead of “the art of measuring change”. Less sexy, right? But equally true. 

The longer answer to that question is that you are looking down a rabbit hole called “complexity”, which encompasses a whole new scientific paradigm to understand and make sense of our world. It is the foundation of approaches such as systems thinking and network analysis and seeks nothing less than to understand the fundamental principles of life and living systems. 

Those of you who have already jumped into the rabbit hole probably know what I mean. Below I have included some links and resources that might encourage you to follow the rabbit. 



Synthesis

Let’s summarise the key points from the previous chapters: 

  1. We simply don’t know. We can’t know about the intricacies and complexities of a project, a program, or any given action. Pretending otherwise is, well, pretentious at best and harmful at worst. 
  2. Empathy, listening and collaboration are the foundation for an empowering approach to measuring change. 
  3. Meaningful indicators serve as building blocks in the attempt to capture social change. 


Following these three principles has two major advantages: (1) The results from the measurement effort are usually relevant and actionable for the involved actors* and (2) the process of collecting data in itself becomes a means to change power dynamics. 


In a more recent project we decided to dedicate time to an initial assessment and to include qualitative methods strategically. We asked questions such as “which social challenges affect your daily life?” or “how do you imagine your community in 5 to 10 years? Which role do you want to play in creating it?”. The answers not only deepened our understanding beyond quantitative measures; asking them also meant that we approached the people we interviewed as potential change agents. 

I am surprised by how simple such a process can be, and how seldom it is done. The usual knee-jerk reaction is that asking open-ended questions is a waste of time or beyond the available capacities. If this is the case, then we have to ask ourselves which capacities are needed to contribute to meaningful change and how we want to empower those we seek to empower. 


In the words of Martin Luther King: 

“Power properly understood is nothing but the ability to achieve purpose. It is the strength required to bring about social, political and economic change. … What is needed is a realization that power without love is reckless and abusive, and love without power is sentimental and anemic.”*


Instead of “love”, I feel more comfortable using the terms empathy, listening or care in professional contexts. But the main point remains as relevant as it was 60 years ago: It’s time to distribute power equally, namely the ability to influence and shape “the rules of the game”, or maybe even the kind of game that is being played in the first place.  



A crucial step towards that is describing and defining what desired social change is and how we go about measuring it. Together.



______________________________________

*Footnotes and further resources (all URLs last checked on the 28th of April 2021)


Definition of impact evaluation from the Swiss Agency for Development and Cooperation SDC:

https://ethz.ch/content/dam/ethz/special-interest/gess/nadel-dam/documents/research/Impact%20Evaluations%20EN%20170201.pdf 


Adding to my field visit in Ghana:

The NGO I worked for does a tremendous job in empowering farmers; and offers a safety net for exactly these situations. Yet the general challenge remains that wanting to support others might actually make them more vulnerable and decrease their resilience.


Further resources on the topic of listening: 

ListeningALCHEMY and the “Listen In” podcast with Raquel Ark: https://www.listeningalchemy.com/ 

Oscar Trimboli: https://www.oscartrimboli.com/ 


Definition of the word “indicator” (or better “to indicate”)

https://www.thefreedictionary.com/INDICARE 

https://www.dictionary.com/browse/indicate 


Brené Brown’s definition of “power over” and “power with”: URL: https://brenebrown.com/wp-content/uploads/2020/10/Brene-Brown-on-Power-and-Leadership-10-26-20.pdf


About weather forecasts and the impossibility of knowing the future state(s) of dynamical systems: 

YouTube video on Emergence and Complexity with Professor Robert Sapolsky, introducing the ideas of chaos (see Gleick’s book below) and the butterfly effect: https://www.youtube.com/watch?v=o_ZuWbX-CyE&t=1348s 

About Lorenz and his discovery of the “butterfly effect”: https://www.aps.org/publications/apsnews/200301/history.cfm 

About the idea of “replaying evolution” and introducing contingency as underlying force: https://www.americanscientist.org/article/replaying-evolution 



Resources on complexity science

Book recommendations: 

Chaos: Making a New Science (James Gleick)

Complexity: A Guided Tour (Melanie Mitchell) 


Websites and online resources: 

Complexity Explorer of the Santa Fe Institute. Probably the most compelling and comprehensive platform on the subject, with (free) online courses and resources. URL: https://www.complexityexplorer.org/ 

“How Wolves Change Rivers”. A four minute YouTube video on complexity science in nature, focussing on the concept of “trophic cascades”. URL: https://www.youtube.com/watch?v=ysa5OBhXz-Q 

“Being a strange attractor”. A previous blog article of mine that introduces certain terms such as dynamical system or, yes, strange attractors. URL: https://unityeffect.net/2019/09/25/being-a-strange-attractor/ 



Methods 

Here is a short list of methods that help capture the complex nature of social change: 

  • Systemic modelling
  • Network analysis and visualisation
  • Most Significant Change (MSC) 
  • Story harvesting 


About power, love and Martin Luther King: 

BBC interview with MLK (very inspiring!): https://www.youtube.com/watch?v=Df4fycfda10

Adam Kahane’s TEDx talk on “Power and Love”: https://www.youtube.com/watch?v=1nPfpepxEuE 

Quoting: “The only way to solve tough problems is to be able to work with both power and love”



Picture references: 

Thumbnail: Photo by Utopia By Cho on Unsplash

Cover picture: Photo by "My Life Through A Lens" on Unsplash 

About the author
Co-founder of Unity Effect. Striving to capture the depths and intricacies of social change, and how to shape it. Having a soft spot for complexity science, impact evaluation and cat videos.