How DFID Learns – Terms of Reference
1. Introduction
1.1 The Independent Commission for Aid Impact (ICAI) is the independent body responsible for scrutinising UK aid. We focus on maximising the effectiveness of the UK aid budget for intended beneficiaries and on delivering value for money for UK taxpayers. We carry out independent reviews of aid programmes and of issues affecting the delivery of UK aid. We publish transparent, impartial and objective reports to provide evidence and clear recommendations to support UK Government decision-making and to strengthen the accountability of the aid programme. Our reports are written to be accessible to a general readership and we use a simple ‘traffic light’ system to report our judgement.
1.2 Our reports on DFID’s programmes typically focus on four areas of assessment: objectives, delivery, impact and learning. By learning, we mean the extent to which DFID gains knowledge in the course of its work (and from the work of others in related areas) and uses it to influence its actions and strategies. From the 26 ICAI studies published to date, we have seen that DFID staff sometimes do not use available knowledge when making decisions, thereby inhibiting the impact and effectiveness of aid. We want to understand why this happens, as we have seen how much learning can enhance the impact and effectiveness of aid when used to best effect. The International Development Committee (IDC) is also interested in the topic and has requested that we look into how DFID’s staff learn and how the organisation enables them to do so.
1.3 These Terms of Reference outline the purpose and nature of the review and the main themes that it will investigate. A detailed methodology will be developed during an inception phase.
2. Background
2.1 DFID is a diverse organisation. The skills, training, experience and nationalities of DFID’s 2,750 staff vary considerably. They are based in two sites in the UK (in London and East Kilbride) and work across 28 priority countries. Further staff may be located with international agencies, co-located with the Foreign and Commonwealth Office (FCO) or be on secondment in other organisations. They carry out a variety of technical, administrative and policy-making tasks in support of the UK’s development objectives.
2.2 DFID’s staff use various sources for learning. Individuals draw, for example, on mixtures of personal prior experience, the wisdom and experience of peers or mentors, formal training and a variety of internal and external documentary analysis. Their learning is a mixture of deliberate processes to acquire knowledge and incidental accumulation over time. The knowledge acquired will be both explicit (i.e. staff are aware that they have particular knowledge) and tacit (i.e. staff will not be aware). The knowledge will be about context, content and approaches and tackle the ‘why’, ‘what’ and ‘how’ of their activities. Not all staff will find learning easy; although they have access to information, this may not result in observable changes in behaviour or decisions.
2.3 Since its creation in 1997, DFID has implemented changes in its staff’s working practices that have sought to strengthen the use of knowledge and improve performance. These have included using technology to improve connectivity across the organisation: for example, an intranet and an electronic filing system that should allow any member of staff to access information and documents from their computer, as well as video-conferencing. Networking of staff has taken place around shared issues, interests and tasks and is significantly enabled by such technology.
2.4 In 2002-03, DFID senior management actively sought to break down previous silos of knowledge within professional advisory groups or cadres, which resulted in a wholesale restructuring of its Policy Department. Such silos were seen at the time as barriers to flexibility and the flow of knowledge. At the same time, human resource management changes sought to create more fluidity in roles between technical and administrative functions.
2.5 Over the last five years, there has been a focus on improving the quality and availability of evidence, both to inform decision-making and to secure the best results. This has resulted in changes to DFID’s project management processes (notably the introduction of business cases) and strategic decision-making (such as the Bilateral Aid Review and Multilateral Aid Review processes).
2.6 The focus on evidence has been accompanied by a significant increase in DFID’s funding of research and evaluation. The research budget alone is over £1 billion in the present 2011-15 budget allocation. DFID now provides open access to its research findings on key development issues online.1 Evaluation funding has also been decentralised to departments within DFID to improve its utilisation and relevance for learning. This move has also been accompanied by a professionalisation of DFID’s approach to using evidence and the accreditation of 131 evaluation and results advisers, based in DFID departments.2
2.7 DFID funds a series of external resource centres to generate, capture and present knowledge. These have covered key themes of work for DFID such as:
- health and education (now under the title human development);
- governance and social development;
- humanitarian and conflict;
- water, sanitation and the environment;
- climate change, technology, engineering, infrastructure and urban planning;
- livelihoods; and
- economics and private sector development.3
2.8 These resource centres supplement DFID’s internal analytical capacity and were created to provide knowledge for advisory and programme management staff. They are usually provided under contract by consortia of consultancy, research and academic institutions. Typically, most information that is generated is freely available online.
2.9 Alongside this trend, DFID has defined itself more clearly as an organisation that does not deliver aid itself. Rather, it manages others, commissioning suppliers or working with partners who do. Ensuring that DFID learns effectively from its delivery agents, particularly during an activity’s implementation, is a challenge when its delivery chains are long and − under an increasing aid budget − the coverage of particular programmes is broad. It is important to understand the relationship between DFID’s outsourcing of the delivery of aid and the outsourcing of the knowledge and insight necessary to make decisions and oversee its programmes effectively.
3. Purpose of this review
3.1 To assess how effectively DFID and its staff learn in order to improve the value for money and impact of aid programmes, taking account of DFID’s increasing focus on fragile states.
4. Relationship to other reviews
4.1 Figure 1 below sets out the ratings for learning given in our reports so far. It shows the 26 reports published to date. In 10 cases, we gave Red or Amber-Red ratings, meaning that significant or major changes to the way DFID is learning were needed (including to the way DFID undertakes monitoring and evaluation).
| Ratings | ICAI study |
|---|---|
| Green | 1. DFID's Climate Change Programming in Bangladesh |
| | 2. DFID's Support to the Health Sector in Zimbabwe |
| | 3. Evaluation of DFID's Support for Health and Education in India |
| | 4. DFID's Humanitarian Emergency Response in the Horn of Africa |
| | 5. DFID's Livelihoods Work in Western Odisha |
| Green-Amber | 1. DFID's Approach to Anti-Corruption |
| | 2. DFID's Programme Controls and Assurance in Afghanistan |
| | 3. The Effectiveness of DFID's Engagement with the World Bank |
| | 4. The Effectiveness of DFID's Engagement with the Asian Development Bank |
| | 5. Evaluation of DFID's Bilateral Aid to Pakistan |
| | 6. DFID's Oversight of the EU's Aid to Low-Income Countries |
| | 7. DFID's Water, Sanitation and Hygiene Programming in Sudan |
| | 8. DFID's work through UNICEF |
| | 9. DFID's Health Programmes in Burma |
| | 10. DFID's Support to Capital Projects in Montserrat |
| | 11. DFID's Support for Palestine Refugees through UNRWA |
| Amber-Red | 1. Girl Hub: a DFID and Nike Foundation Initiative |
| | 2. Evaluation of DFID's Electoral Support through UNDP |
| | 3. The Management of UK Budget Support Operations |
| | 4. DFID's Education Programmes in Three East African Countries |
| | 5. Evaluation of the Inter-Departmental Conflict Pool |
| | 6. DFID's Education Programmes in Nigeria |
| | 7. DFID's Use of Contractors |
| | 8. DFID's Support for Civil Society Organisations through Programme Partnership Arrangements |
| | 9. FCO and British Council Aid Response to the Arab Spring |
| Red | 1. DFID's Peace and Security Programme in Nepal |
4.2 We noted in our 2011-12 Annual Report that, ‘with DFID’s technical expertise and standing, we would expect to see better sharing and lesson learning about what is both good and poor practice’.4 In our 2012-13 Annual Report, we returned to this theme, identifying, among other things:
- the risk of suppliers to DFID holding and not sharing the detailed knowledge of particular projects;
- the large number of separate relationships DFID has with partners it uses for delivery;
- the general difficulty of sharing lessons and experience amongst country offices and central teams;
- an example of DFID finding it a challenge to adapt the direction and aims of its programming to reflect changing contexts (in Nepal); and
- the challenge of maintaining institutional memory in teams, particularly in fragile and conflict-affected states where staff postings tend to be shorter.5
4.3 Such themes are reflected elsewhere. Learning is a topic that many development agencies seek to address. A 2007 report for the Swedish Government noted that ‘despite increasingly rigorous feedback systems, development agencies continue to be criticized for their inability to incorporate past experience. They are routinely accused of learning too little, too slowly – or learning the wrong things, from the wrong sources’.6
4.4 DFID has initiated – or has been the subject of – many reviews of its organisational learning since it was created in 1997. Our review will build on that work, as well as on studies from other development agencies.
4.5 A study by the Overseas Development Institute from 2010 identified three domains for learning in DFID:7
- For research and evaluation outputs: ODI noted that ‘the question of whether lessons are learned focusses on how influential that work is, whether findings and recommendations are taken up in policy and programming and acted upon’;
- For decision-making and action: ‘the question of lesson learning becomes a matter of looking at the extent to which evidence (and in particular, that emerging from DFID’s research and evaluation) feeds into and informs the process of policy making and programming’; and
- Concerning DFID as a learning organisation: ‘the question of lesson learning focusses on how knowledge within DFID is captured, shared and used, as and where it is needed’.
4.6 The ODI study suggests that DFID is more comfortable with using the findings of research and evaluation than it is with organisational learning (bullet three above). Similarly, it is much better at using research and evaluation findings during (or as part of) a project cycle, than in more complex and emergent decision-making processes. ODI concluded (perhaps not unexpectedly) that initiatives that promote a sense of ownership of research and evaluations and those that support the development and strengthening of interpersonal learning networks were effective. We are particularly interested in DFID’s corporate learning, noting that, in 1990, Senge identified that organisational learning is ‘only successful when it is based on an understanding of how the whole organizational system is connected, rather than a focus on individual parts’.8
4.7 There is considerable interest in this topic from within DFID. It has recently undertaken staff surveys that consider the use of evidence and how much time its staff have for learning. Individuals within DFID have recently written papers on learning from failure and improving networking. We will draw on this and other internal material. Where DFID has undertaken internal surveys, we will analyse the source data. We will also draw on our own studies, synthesising lessons from all previous ICAI reports on the topic of DFID’s learning, as well as undertaking our own surveys of DFID and others related to this topic if necessary.
5. Analytical approach
5.1 Our review will examine what difference learning makes to DFID’s work. The focus will be on DFID’s staff and their experience and practice. At the same time, we wish to look at the corporate enabling environment for individuals’ learning. This will include issues such as leadership, incentives, opportunities and the way that key groups of staff (such as specialist advisers) are able to contribute to corporate learning.
5.2 We will focus our analysis on the key activities that DFID undertakes. These are:
- making programme choices;
- creating theories of change;
- choosing delivery mechanisms; and
- adapting and improving during implementation of its activities.
5.3 We will gather evidence from activities undertaken by DFID during the last three years. We also recognise that learning takes place both corporately and in countries and wish to review both these dimensions. During our review, we will examine DFID’s learning in fragile and non-fragile states. Fragile states are of interest given the increasing priority given to them by DFID and since they are contexts of higher uncertainty where delivery of aid often requires increased adaptation to changing circumstances. This reduces predictability and implies that DFID’s decision-making will often have to take place against the background of incomplete knowledge and increased risk.
5.4 When addressing the processes set out in paragraph 5.2, we wish to identify answers in relation to the following elements:
- Connectivity: this concerns how well individuals are linked together (within professional groups, vertically between policy, programme management and delivery, as well as horizontally between people doing similar tasks in different places); and how well DFID’s staff are connected to beneficiaries, delivery agents, peer organisations and other sources of knowledge outside DFID;
- Creation: this will address where and how DFID staff acquire knowledge, where new ideas are created and from where they emerge. A key issue here is identifying the relative impact of different knowledge creators (including research institutions, academics, opinion formers and communities of practice);
- Capture: this will identify how DFID captures knowledge that can be used for learning and how lesson-learning happens before, during and after activities take place. We wish in particular to identify how DFID learns from both success and failure. We will specifically look at DFID’s experience of learning from intended beneficiaries and its readiness to do so;
- Communication: this will consider whether and how the right information gets to where it is needed, including how easy and efficient it is for people to access the most relevant information. We want to know what the formal and informal processes are that move knowledge around the department; and
- Challenge: this will identify how DFID challenges staff to ensure that they apply knowledge to improve delivery and impact. We are interested in DFID’s corporate culture of learning, the examples set by managers and the corporate expectations for learning.
5.5 We will consider how these elements affect the delivery, efficiency, effectiveness and impact of UK aid. We will reference other organisations’ experience (for example, the private sector, other donors and the Foreign and Commonwealth Office) to make evidential points about DFID’s performance. We will not, however, undertake full comparative studies.
6. Indicative assessment questions
6.1 Since this is a thematic study of learning, we will not be following our usual assessment framework, although the questions below draw upon the thinking behind it. As we usually do, we wish to consider DFID’s clarity of vision for learning (its objectives, captured under incentives below), how well it undertakes learning (effectiveness), the impact on DFID’s work and how well DFID develops its own learning approaches as a result of experience. A detailed methodology will be developed during the inception phase, setting out the assessment questions and the methods to be used for answering them. The following questions are, therefore, indicative only.
6.2 Objectives: Incentives for learning
6.2.1 Are DFID’s policies and targets for learning appropriate and adequate?
6.2.2 How well do managers provide leadership to guide staff learning across DFID?
6.2.3 How well do DFID’s departments create time and opportunities for staff to learn?
6.2.4 How well are staff held to account for ensuring that learning takes place and what effect do personal targets have on improving learning?
6.2.5 How well is learning integrated into the operational processes of DFID?
6.2.6 How does the evidence system work to support Continuing Professional Development and learning?
6.3 Delivery: Effectiveness of learning
6.3.1 What is the relative importance of the different sources of knowledge that staff use in DFID? Are some sources of knowledge privileged over others?
6.3.2 How well is DFID’s knowledge maintained over time?
6.3.3 In which ways do staff prefer to learn? What are the formal and informal methods they use? How well does this match the opportunities for learning provided by DFID?
6.3.4 How well does DFID learn from intended beneficiaries?
6.3.5 How well does DFID learn from those partners who deliver its programmes?
6.3.6 How well does DFID learn from its own operations and experience, including both success and failure? How well is learning captured and communicated throughout the delivery chain?
6.3.7 How well is DFID overcoming any barriers to learning?
6.3.8 How does DFID’s learning compare with best practice and experience elsewhere?
6.4 Impact: Impact of learning
6.4.1 How effectively does individuals’ learning impact on the activities they perform?
6.4.2 What is the relative impact of the different sources of knowledge on programme delivery and effectiveness? Does this correlate with the relative importance attached to these sources (by DFID corporately or individual staff)?
6.4.3 How well do knowledge and learning support decision-making in DFID (i.e. making better programme choices, creating theories of change, choosing delivery mechanisms and adapting and improving during implementation of its activities)?
6.4.4 How well is DFID identifying the impact of learning on its performance?
6.5 Learning: Systematising Learning
6.5.1 How well does DFID build systems for learning into its operations and management?
6.5.2 How well does DFID ensure that its lessons and experience are fed back into its operations, planning and policy-making?
6.5.3 Are the levels of investment and effort in learning made by DFID sufficient to meet its needs?
6.5.4 How much does DFID change its approach to learning based upon experience and measurement of impact?
7. Methodology
7.1 We will develop a detailed methodology during the inception period. This will seek to match the key assessment questions with how they will be investigated, identifying sources of data and the processes of investigation.
7.2 Our findings will draw upon a mixture of primary and secondary data. The final sample sizes, case studies and detailed approach will be set out in the inception report. Data collection may include:
- reviewing third party literature of best practice;
- gathering documentary evidence from DFID’s own files;
- synthesising relevant lessons from ICAI’s reports to date;
- analysing data compiled from DFID’s internal staff surveys and undertaking our own qualitative survey of at least one quarter of DFID’s total staff;
- analysing at least 12 case studies of individual processes according to the four categories set out in paragraph 5.2. These will take place in at least three different departments across DFID;
- visiting one or more country offices – we will decide whether country visits are appropriate as we develop our methodology during the inception phase;
- conducting semi-structured interviews with DFID staff and external stakeholders;
- requesting submissions from DFID staff and interested parties; and
- analysing the learning capabilities of other organisations through secondary material.
8. Timing and deliverables
8.1 The review will be undertaken by a small team from ICAI’s consortium and will be overseen by the Commissioners. The lead Commissioner is Diana Good. The review will take place during the second and third quarters of 2013 and will be published in the first quarter of 2014.
9. Potential impact of our report on DFID
9.1 Our intention is that this report will help DFID to improve its effectiveness and value for money by:
- helping to sharpen the way that DFID targets its learning approaches;
- providing ideas to improve the effectiveness and value for money of the learning approaches that DFID uses; and
- identifying concrete ways that DFID can create more impact for beneficiaries from better leverage of learning.
Footnotes
1 See DFID’s open access site, Research for Development, http://r4d.dfid.gov.uk.
2 See, for instance, DFID’s How to Note, Assessing the Strength of Evidence, 2013, DFID, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/158000/HtN_-_Strength_of_Evidence.pdf.
3 For links to DFID’s current resource centres see for instance, http://www.gsdrc.org, http://hdrc.dfid.gov.uk and http://cdkn.org.
4 Paragraph 54, Independent Commission for Aid Impact: Annual Report to the House of Commons International Development Committee 2011-2012, ICAI, June 2012, http://icai.independent.gov.uk/wp-content/uploads/2011/11/ICAI-Annual-Report-2011-12-FINAL.pdf.
5 Paragraph 26 et seq., Independent Commission for Aid Impact: Annual Report to the House of Commons International Development Committee 2012-2013, ICAI, June 2013, http://icai.independent.gov.uk/wp-content/uploads/2011/11/ICAI-Annual-Report-2012-13.pdf.
6 Knowledge and Learning in Aid Organisations, Swedish Agency for Development Evaluation, 2007, http://www.oecd.org/derec/sweden/learning.pdf.
7 Strengthening learning from research and evaluation: going with the grain, Overseas Development Institute, 2010, http://www.odi.org.uk/sites/odi.org.uk/files/odi-assets/publications-opinion-files/6327.pdf.
8 Quoted by Ingie Holvand in: Knowledge management and organizational learning: an international development perspective, Overseas Development Institute, 2003, http://www.odi.org.uk/sites/odi.org.uk/files/odi-assets/publications-opinion-files/170.pdf.