
Why your agency doesn’t learn from data (and how it might)

Updated: Mar 29

There are two things about “learning” which everyone in the social impact sector agrees on:

Firstly, “Learning from data/evidence is essential for any organisation trying to effect social change.”

Secondly, “Learning from data/evidence is difficult, and we are not doing it well enough.”

Every publication, conference, and staff meeting in our sector emphasises that learning is important if we are to have any chance of actually making the change we want to see, despite all the complexity and challenges. Yet there is a never-ending stream of blog posts and overbooked conference slots on topics related to “learning”, full of evaluators and funders lamenting the lack of it.

More importantly, there are far too many development interventions that don’t achieve any change for the better, and some that actively harm the people whose lives they aimed to improve.

We really can’t afford not to learn.


There are many reasons why “learning” is so difficult. I’ll illustrate them from the perspective of “Diane the Director”, who works for a medium-sized international development agency. She is responsible for strategic oversight of a portfolio of projects doing important development work in a variety of countries. Think about the variations of Diane’s problems that you have come across — and afterwards, let’s look at possible solutions.


Problem 1 — Insight framework: The organisation is not collecting the data it would need to make programming decisions.


Diane has read about organisational learning and has already set a “learning goal” for her teams. She observes her programme managers designing programmes and notices that their designs are based entirely on gut feeling and anecdotal experience. Her programme managers inform her that the kind of information they would need in order to design great programmes is simply not available. When they ask their Monitoring and Evaluation colleagues about the kinds of activities that have been impactful in similar contexts, these colleagues shrug. “It’s really not that easy to determine”, they say. “But why don’t you read our latest 15 evaluations? There should be something in there.”

Diane sits down with Michael, the Monitoring and Evaluation team leader, to look at the data currently being collected by their agency. “We only collect data the donors are interested in, really”, says Michael. “That’s why we can’t answer the questions we are interested in.” They decide to change that and hire a monitoring and evaluation consultant to help the organisation establish its data needs and monitoring framework. Colin the Consultant does an excellent job and comes up with a set of useful data points attached to each element of the organisation’s theory of change. He also develops a set of Excel and Word templates to help the agency collect this data regularly.


Problem 2 — Data tools: The organisation does not have the tools to monitor the right data at the right frequency, in the right format.

Several months later, Diane and Michael look at all the information that now comes out of these new Excel and Word templates. It has a few gaps and errors where people accidentally wrote the wrong numbers in the wrong cells or ignored essential sections, but it is much better overall. However, they now have a new problem: Diane and Michael are inundated with files from 50 programmes — which they still cannot compare across programmes or over time. In addition, their programme staff, though they see the value of the exercise, complain about how much time it takes to fill out these internal documents on top of their donor-required monitoring and reporting.

“This is really not appropriate for the 21st century”, says Diane. Together with Michael, she convinces the agency’s leadership of the need to invest in a monitoring database: a bespoke online tool that would make it easier both to collect data and to analyse it. They contract Best Software Company Ltd to build the tool, based on Colin’s templates and data points. This turns out to be a time-intensive and at times very frustrating process, because there is a real culture clash between what the software developers understand and value and what Diane and Michael want them to build. Nevertheless, after several months, they introduce their new monitoring app to all of their staff. Michael’s team runs a number of trainings to make everyone comfortable with it, and then Diane and Michael sit back to watch the data roll in.


Problem 3 — Data tools adaptation: The organisation expects to produce the right monitoring app at the first attempt. The current software market makes iterative development too costly, which leaves organisations with tools that are either imperfect or unusable.


Within weeks, they receive dozens of emails from their staff around the world, highlighting bits that seem to be missing from the new monitoring app — it turns out that capturing the real work and results of their agency is quite complex, and they were only aware of half the complexity when the software was built. As their staff demand more and more additions to the monitoring app, Diane and Michael are both encouraged and disheartened — they have run out of budget and cannot buy the developer time from Best Software Company Ltd required to make the changes. “We wanted to build an internal monitoring tool, not Facebook”, Michael tells their staff. “Just use it, even if it isn’t perfect for now.” Their staff grow frustrated because the monitoring app doesn’t capture what they need it to. They increasingly resort to workarounds, in the form of emails and yet more spreadsheets that exist in parallel to the app. Some teams stop using the monitoring app entirely. Diane and Michael worry they have lost the entire investment. Given the low engagement with the monitoring app, the agency’s leadership is not willing to invest in any further software development.


Problem 4 — Data presentation: The organisation expects humans to change behaviour and make decisions based on complex graphs and information, even though our brains are wired to remember emotions, pictures, and stories instead.



The teams that keep using the monitoring app produce enough data on a sub-section of the agency’s work to allow Michael’s monitoring team to do some analysis. They get excited at the mix of quantitative and qualitative data they now have available, and produce a few fancy graphs and broad conclusions, which they present to their programme colleagues. Everyone nods and says “oh, how interesting”. However, during day-to-day work, Diane notices that her programme managers have not changed anything about their decision-making and programme design.

Diane reaches out to her friend Vivian, who specialises in data visualisation and reporting. Vivian the visualisation expert looks at the data the agency has available, and produces a set of beautiful slides that tell a very convincing story based on what she found. Diane and Michael are thrilled. Finally some actual learning based on their own data!


Problem 5 — Insight use: The organisation expects humans to use data in idealised decision-making processes, rather than according to the real-life incentives that shape their behaviour.



When Vivian presents her slides to the programme managers, two new problems emerge: a) mismatched incentives, and b) disproved assumptions. Programme managers are very busy people, constantly racing to satisfy the demands of their donors and project stakeholders. Their performance is evaluated based on how much money they spent and how well they spent it, not on how innovative their approaches were or how much actual impact they created. Consequently, they either don’t show up to Vivian’s presentation, or they nod politely and then forget about it. Vivian knows she would ideally present this information at the precise moment a programme manager is about to make a decision based on it, but as an outsider she finds it hard to identify that point in time.

The programme managers who can spare the time to attend Vivian’s presentation react with significant scepticism: what the story concludes really doesn’t fit with what they observe on the ground. “You say X is the main problem with our programmes”, says Paul the programme manager, who has been with the agency for eight years, “but we already deal with X through Y. I don’t see Y represented on your slides?” Paul crosses his arms, clearly annoyed at how “management” have yet again disregarded his work, despite the effort he puts into monitoring every penny he spends.

Vivian turns to Diane, who blushes and reaches for her big folder containing all the iterations of this monitoring solution. “You’re right”, she says apologetically to Paul, “we don’t currently collect the right information on Y. We discussed it briefly with Colin, but he thought it would be difficult to collect. We didn’t include it in the monitoring app because that would have cost too much money. And of course Vivian could only visualise what we had data for.”

Everyone leaves the meeting feeling disappointed. The programme teams feel they have spent far too much time collecting data only to find things they already knew. Diane and Michael feel they have spent far too much effort and money without solving the problem they started with — their agency still does not use data to make programming decisions. Colin the consultant, Best Software Company Ltd, and Vivian the visualisation expert have all been told they did a good job, but their solutions did not, in fact, add up to what was really needed.


Fundamental problem: Unless an organisation approaches all of the above problems in an integrated way, it will fail to make progress on data-based decision-making.


If you work in a foundation, non-profit, social enterprise or public agency, the story above, or parts of it, will sound familiar. The good news is that it’s probably not your agency’s fault; it’s the fault of how our sector usually approaches this problem.

It really doesn’t have to be this way. There are tools and science out there to pre-empt some of Diane’s problems and increase your chances of actually making learning from data a reality. There are no quick fixes or panaceas; in practice, this will continue to be mainly trial and error. But just imagine what would be possible if we approached the question of making organisations learn from data in line with these four principles:

Data fit for purpose: when Diane and Michael decide to make data collection at their organisation strategic and purpose-driven, they work backwards from the learning needs at decision-making level. Rather than asking “what data should we collect?”, they start with “what data does which decision-maker need to see, when, how, and why?”, and then trace back from there to find the data that can fulfil that purpose. They recognise they will not be able to answer that once and for all, especially not in the beginning. Instead, they define a “good enough” first set of data points, and schedule a way to improve it after their stakeholders have had the chance to play with some real data.
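To make this “decision-first” logic tangible, here is a minimal sketch of what such a data inventory could look like if you wrote it down as a small script. The decisions, decision-makers, fields and frequencies are all invented for illustration; a real inventory would use your own decision moments and data points.

```python
# Illustrative sketch of a decision-first data inventory.
# Every data point exists only because a named decision needs it.
# All decisions, fields and frequencies below are invented examples.

decision_data_needs = [
    {
        "decision": "Which activity types to prioritise in next year's programme designs",
        "decision_maker": "Programme manager",
        "decision_moment": "Annual design workshop (Q3)",
        "data_needed": ["activity type", "outcome rating", "context tags"],
        "collection_frequency": "per activity, at completion",
        "good_enough_for_now": True,   # revisit after one design cycle
    },
    {
        "decision": "Whether to scale or stop a pilot",
        "decision_maker": "Director",
        "decision_moment": "Mid-year portfolio review",
        "data_needed": ["participant feedback score", "cost per participant"],
        "collection_frequency": "quarterly",
        "good_enough_for_now": False,  # feedback tool still being tested
    },
]

# Any proposed data point that cannot be traced back to a row like the
# ones above is a candidate for not being collected at all.
for need in decision_data_needs:
    print(f"{need['decision_maker']} needs {need['data_needed']} "
          f"by the {need['decision_moment']}")
```

The point of the exercise is the discipline, not the code: if nobody can name the decision, the decision-maker and the moment, the data point probably doesn’t earn its place in the monitoring framework.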


Tools fit for change: it’s 2020 and you no longer have to choose between juggling spreadsheets and risking a fortune on custom-built data software. You can build your dream monitoring database yourself, without needing to write any code, for about 1,000–4,000 USD per year. (Yes, no joking, no zeroes missing.) Diane and Michael look at tools such as Knack, Zoho, Caspio and related no-code web apps, and decide to bring in Sally the self-built database coach to upskill their own staff, in order to build their agency’s very own monitoring app. The result is empowering — within days, they can show a first version to their users. Within weeks, their staff have not only taken to using the much more user-friendly tool, but have established a regular feedback flow with the in-house no-code builders. The tool gets better with every user who complains, at no extra cost. Programme staff appreciate the fact that the person building their tool is their colleague, who knows their use cases as well as their struggles, and can respond to them better than any specialised software developer. Higher management is satisfied with the high value added at low cost, and grateful that this approach strengthens the agency’s data protection and privacy practices. (You can read the real story of how my current agency did this here).
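For readers who like to picture what “a first version within days” might actually contain, here is a hypothetical sketch of a deliberately small first-iteration data model. In a no-code builder such as Knack, Zoho Creator or Caspio these would be set up as tables and forms through the interface rather than written as code; the entities and fields below are invented purely for illustration.

```python
# Hypothetical first-iteration monitoring data model (illustration only).
# In a no-code tool these would be tables/forms built through the UI,
# not code; the entities and fields are invented for this example.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Programme:
    name: str
    country: str
    start_date: date

@dataclass
class Activity:
    programme: Programme
    description: str
    activity_type: str                  # e.g. "training", "advocacy"
    completed_on: Optional[date] = None

@dataclass
class OutcomeObservation:
    activity: Activity
    observed_on: date
    rating: int                         # e.g. 1-5, as agreed in the data framework
    narrative: str = ""                 # qualitative note alongside the number

# Keeping the first version this small is deliberate: users get something
# to react to within days, and new fields are added only when someone asks.
```

Because the builders sit in-house, adding a field a colleague asks for is an afternoon’s work rather than a new software contract, which is what keeps the iteration loop affordable.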


Practice fit for humans: There is considerable misunderstanding in most organisations as to how humans a) learn, and b) change their behaviour. Most organisations I have come across imagine “learning” basically as events — “lunch and learn” sessions, “peer-to-peer learning”, or formal courses. In reality, humans don’t learn primarily through formal teaching, but through doing. A more useful way to think about learning is as building habits — habits of questioning, of rethinking work you think you already know, of finding new ways of doing things. Diane and Michael start building new habits in their organisation by making the new ways of behaving easier than the old ways, and by systematically addressing barriers to the kind of learning they want to see. They find that this way of working is also very helpful for managing practical change in an organisation — once they start applying behavioural science and human-centred design to the work on the monitoring app, their programme staff find it easier to use and to integrate into their daily work.

The overarching principle helping them along is to base everything on how humans actually behave, not how an idealised process would have them behave.


Owning their own learning journey: From the start, Diane and Michael recognised the following fundamental facts about learning from data, and were open about them with people both below and above them:

  • We invest in improving how we learn because we think that is genuinely helpful for the work we do. We don’t do it because donors request it, or because it’s the new standard in the sector.

  • Learning will make aspects of our work visible that we are not proud of. That has to be not only accepted but actively encouraged by staff at all levels.

  • Learning will not happen overnight. Each of the elements — data, tools, practices — of our new learning culture will take time and many iterations to build. The key is to test, all the time, and to accept that every part is imperfect most of the time.


A year later, Diane and Michael give a presentation at their agency’s annual staff meeting. They talk through the steps of the learning journey, and the entire audience chuckles as they show a very early version of the monitoring app — it had far too many confusing buttons all over it. Colin the consultant sits at the back, proud that elements of his monitoring framework feature in all 16 versions of the data framework. Sally the self-built database coach whispers “thank you” as Diane and Michael talk about how Colin made sure they brought her in rather than Best Software Company Ltd, and she gets a round of applause for giving the agency the power to make its own tools. Vivian the visualisation specialist has become a close friend of Sally’s, after the two of them spent hours and hours iterating an impressive dashboard, which even the agency’s governors admire. Paul the programme manager presents a case study of how his team worked with Michael to draw out relevant data when they designed their latest programme.

In the end, Diane glows not with pride but with gratitude. “Thank you for believing in us all this time”, she says, “we can’t wait to hear what more you want us to change about this data and learning system.”


Curious about what your organisation's learning journey could look like? So am I. Get in touch with any ideas, critique, or stories of your own!

