We are thrilled to publish our first annual Impact Report. Putting it together was a project in and of itself, and gave us the first focused opportunity to reflect on what sort of impact we aim to have. As an organisation we are being openly experimental with our portfolio of projects during our first two years. We do not yet know where we can make the most change, and we are enjoying the process of learning by doing.
So how we measure and evaluate our impact is also an evolving process. Before taking on any project, we assess the extent to which it meets one or more of our charitable objectives, how it aligns with our criteria, and how likely it is to become sustainable, and we map where its intended impact lies across the character strengths, soft skills and other educational outcomes we seek to develop.
We have created a Theory of Change that we are ourselves testing, and we are willing to update and improve it as the year goes on. We are developing a series of tools that we share with our collaborators in order to track changes before, during and after projects. We are trying to do all this in a way that reduces administrative overhead and makes data collection meaningful rather than burdensome. We are keen on the Lean Data approach (as encouraged by Acumen: http://acumen.org/ideas/lean-data/) and believe in managing impact as it is happening, as opposed to waiting until it’s all over to decide what to do differently next time.
There are some things we do as standard: we work out the cost per participant, as a benchmark more than a judgment. We anticipate and then count up the number of direct beneficiaries (the children or people who participate directly in projects we support), and also the number of indirect beneficiaries (those who gain, or whom we intend to gain, indirectly from the projects). Those are three of the six KPIs (Key Performance Indicators) against which we hold ourselves to account. The others are our media reach, the number of participants who co-design projects, and our sphere of influence.
We support our partners to decide at the start what questions they want to answer, and how best to find the answers as they go along. We want to be helpful rather than prescriptive. And during 2017 we are evaluating our own evaluation process. We never want it to be stiff, but we do want it to be pertinent, meaningful and perceptive. If we keep striving to do it better, perhaps we can help others do it better too.
Please read what we have set out in our 2016 Impact Report and let us know what you think.