What “putting learning at the centre” looks like in practice
Most development practitioners will by now be familiar with the movement towards Doing Development Differently, and with the arguments for working in a more problem-driven and iterative manner.
The hypothesis that the impact of aid, particularly when dealing with complex problems in dynamic environments, can be increased by eschewing traditional approaches and embracing an adaptive way of working has become mainstream. Donors such as DFID, USAID, the World Bank Group, DFAT and Sida are exploring how best to balance the twin needs for operational certainty and technical flexibility.
But what does a programme that aims to put learning at the centre really look like, and how does monitoring, evaluation and learning (MEL) in an adaptive programme differ from conventional monitoring and evaluation (M&E)? Bond asked the Legal Assistance for Economic Reform (LASER) programme, one of DFID’s first programmes tasked with testing an adaptive approach, to outline how it went about achieving and measuring change by incorporating real-time learning and reflection into the design and delivery of technical assistance.
_____________________
LASER, a three-year, £4.3 million programme which came to an end in May 2017, set out to support governments in eight developing countries to improve the investment climate. The design of LASER was informed by an in-depth review of the available evidence on law and justice related business environment reform and by the latest thinking on development approaches. Accordingly, LASER had a mandate to:
- Contribute to the evidence base on what works in law and justice related business environment reform
- Test new approaches to investment climate reform (specifically in fragile and conflict-affected environments)
- Document lessons, including on adaptive programming
- Share learning with, and influence the approaches of, donors, donor-funded programmes and providers of pro bono legal assistance.
LASER’s approach was to emphasise learning by doing and to adapt throughout the life of the programme. This adaptation was managed in a structured and transparent manner at both strategic and operational levels, in response to learning about 1) what works and what doesn’t, 2) changes in partner needs, and 3) the evolving political context.
Design
Instead of specifying up front what LASER assistance would look like and detailing the exact results to be delivered over the life of the programme, DFID (in partnership with The Law & Development Partnership and KPMG) developed an approach that aimed to meet a pre-agreed “level of ambition”, without tying the programme down to fixed inputs or outputs. This meant that LASER was able to support developing country government partners to identify problems they cared about and to develop solutions relevant to their context – while being accountable to UK taxpayers for delivering appropriate results and value for money.
In the design phase, three aspects were key to enabling ongoing learning and reflection:
- LASER’s design incorporated time and space to engage on an ongoing basis with the evolving evidence base. Engagement with a network of leading thinkers, the ability to reflect on new evidence and thinking as this became available, and the willingness of both DFID and the implementers to test new approaches in practice – even where this might result in failure – contributed to the success of LASER.
- LASER used a results framework which set high-level outcome targets without tying the programme down to narrow, predetermined outputs. By using an overarching logframe underpinned by nested logframes, and agreeing milestones on a rolling basis, LASER maintained the ability to focus on reform initiatives of importance to developing country government partners where the necessary political will for change existed. Simultaneously, by signing up to the delivery of a predetermined number of Stories of Change and Major and Moderate results, and linking payment to results, DFID was assured of the scale of reform the programme was working towards and was able to hold the implementers accountable for delivering it.
- A significant share of the total programme budget (28%) was allocated to ongoing lesson learning and influencing. In addition to M&E activities, outcome indicators and payment milestones were set against activities related to documenting and sharing learning. This meant that ongoing learning, reflection and the write-up of experiences and findings were viewed as legitimate programme activities rather than treated as an “add-on” to technical delivery.
Implementation
The LASER programme also benefited from a lengthy inception, or exploratory, phase during which significant in-country engagement took place. Although LASER undertook some upfront analysis (deliberately not too much), the programme emphasised “learning by doing” and stressed the need to build relationships of trust with government partners by working on the immediate problems those governments identified, even where these might not have been the “right” problems as defined by the donor. This early engagement provided opportunities to deepen understanding of the political environment, and served as the “hook” or “entry point” from which to broaden engagement. It also provided the opportunity to “fail fast”: stopping initiatives unlikely to achieve meaningful results and refocusing resources on avenues where traction had been achieved and where sustainable reform was likely, thereby increasing value for money for DFID.
Unlike in traditional inception phases, the focus was not on stakeholder engagement but on addressing real problems and delivering results. In addition, during implementation of the LASER programme:
- Monitoring and learning were viewed as integral to technical delivery, with technical delivery simultaneously presenting opportunities for learning beyond research. LASER experimented with different approaches to delivering technical support, and refined its understanding of adaptive programming and problem-driven engagement as the programme progressed. One interesting consequence of this locally led approach was that developing country partners asked LASER to provide technical advice on MEL; monitoring and evaluation capacity building initiatives thus became part of programme delivery, enhancing the sustainability of evaluation efforts and broadening the definition of accountability.
- M&E was viewed as the responsibility of all programme staff, and LASER developed tools and approaches to encourage a culture of learning and reflection. This meant that M&E was not a “stand-alone” programme component: all technical staff contributed to the continuous collection and analysis of data, and reflection became an ongoing activity for country and programme teams. Tools such as problem diaries and spider diagrams were used, and opportunities for regular reflection, such as strategy reviews, were introduced. Decisions were not made on “gut feeling”; country and output teams were expected to set out clearly the evidence behind recommendations on what to take forward or what to change, and to be able to defend this when challenged. Such evidence had to be documented (without becoming overly burdensome), and sign-off, where needed from DFID, had to be obtained quickly and efficiently.
- MEL data was not viewed merely as a way to measure progress and performance, or to meet accountability requirements; it was used to inform technical and programme management decisions on an ongoing basis. LASER operated in a manner that encouraged evidence-driven programming, and programme management processes and systems allowed the results framework, budgets, resources, milestones and other elements to be changed quickly in response to learning.
Food for thought
During implementation, LASER not only enabled ongoing learning and reflection; its systems and processes required it. Decisions on expenditure or resourcing, for example, were driven directly by in-country engagement. This way of working is, however, labour intensive for both programme and DFID staff. Robust knowledge management and administrative systems need to be in place, and realistic expectations about the level of effort and investment required must be set. LASER’s final paper on MEL provides eight guidance points for designing MEL tools and approaches to support change.
Importantly, such an approach requires a shift in mindset from both donors and implementing partners, who 1) need to be comfortable with a high level of uncertainty and risk, 2) should acknowledge that not everything will work all of the time and that some level of failure is inevitable, and 3) need to be aware that new ways of measuring change, over more realistic timeframes, are required.
To learn more about how LASER put learning at the centre, see the programme’s MEL guidance.