Graham Ward speech at the UK Evaluation Society conference – 10 April 2014

14 Apr 2014

ICAI and Parliamentary Accountability

Good morning and thank you for inviting me here today. I have the privilege of being the Chief Commissioner of the Independent Commission for Aid Impact.

ICAI was established in 2011 to provide independent scrutiny of UK Official Development Assistance. Our approach has focussed on two priority areas:

  • maximising the effectiveness of the UK Aid budget for the intended beneficiaries; and
  • delivering value for money for UK taxpayers.

Now, in 2014, we have just published our thirty-fourth report, which looked at Learning in the Department for International Development.

ICAI was set up as an independent body and reports to Parliament, not to ministers. We report to a Sub-Committee of the International Development Committee, chaired by Fabian Hamilton MP. This arrangement ensures both independence and accountability.

The International Development Committee is the House of Commons select committee that provides oversight of the Government’s aid policy – a clear distinction from our responsibilities, since policy is outside our remit.

Our reporting is through the publication of our findings and through evidence hearings before the Select Committee. Since December last year, these hearings have been held in public, broadcast over the internet and recorded in Hansard, allowing the UK taxpayer greater access to our work.

Indeed, you may have seen yesterday’s session on our report on Livelihoods work in Afghanistan.

The aid scrutiny landscape in the UK is completed by the National Audit Office. There are broad similarities between the NAO’s work and ours; however, whilst the NAO undertakes one or two reviews of DFID a year, we undertake between ten and twelve, so our coverage is much greater.

As you would expect, we work closely with the International Development Committee and the NAO to increase complementarity and reduce overlap in our work. This has worked well: the Committee’s investigations into democracy in Burma and its visits to the Occupied Palestinian Territories are particular examples.

I will now outline our approach to assessment.

Our approach is set out in our first report, ‘ICAI’s Approach to Effectiveness and Value for Money’, where we state our view that effectiveness and value for money are inextricably linked.

Fundamentally, how can a programme be value for money if it is not effective; and if there is poor value for money, is the programme being as effective as it could be?

We recognise that aid organisations, including DFID, take a range of different approaches to defining and ensuring effectiveness and value for money.

In our view:

•           effectiveness involves achieving a sustained impact for intended beneficiaries; and

•           value for money is the best use of resources to deliver the desired impact.

Our reports are written to be accessible to a general readership and we use a simple ‘traffic light’ system to report our judgement on each programme or topic we review.

In forming our ratings for programmes we consider key stages in the aid project lifecycle:

•           Objectives – Does the programme have realistic and appropriate objectives and a clear plan as to how and why the planned intervention will have the intended impact?

•           Delivery – Does the programme have robust delivery arrangements which meet the desired objectives and demonstrate good governance and management through the delivery chain?

•           Impact – Is the programme having a transformational, positive and lasting impact on the lives of the intended beneficiaries?

•           Learning – Does the programme incorporate learning to improve future aid delivery? Is it transparent and accountable?

Whilst we use a number of tools to examine the issues, our methodology has some important consistencies, such as our work with intended beneficiaries in order to gain evidence.

We have found that one of the great strengths of our beneficiary engagement is that we gain evidence that demonstrates the real effect of aid upon people’s lives. For instance, in Afghanistan, our team went to villages to discuss the work that DFID funded and heard villagers’ concerns about sustainability post-drawdown. This was an issue that we flagged within our report.

It may seem an obvious point to some but talking to the beneficiary provided clear primary evidence.

As Commissioners, we use the evidence gained from tools such as beneficiary feedback, literature review, surveys, interviews and risk assessment to triangulate and arrive at our conclusions. Sometimes this can lead to a fine judgement on whether a programme should be rated green-amber or amber-red but, in all cases, the rating is easy for all parties to understand.

Indeed, the Permanent Secretary at DFID, Mark Lowcock, has said that he goes straight to the rating to gain a quick insight into the report.

In my view this methodology should inform both intended beneficiaries and the taxpayer as to whether aid programmes are having lasting, positive impact and are efficient. If we show that aid budgets are being used inefficiently, we make strong recommendations to address the issues and follow up at a later date to assess whether effective action has been taken.

To clarify, once a report has been published, DFID is obliged to provide a management response within three weeks, declaring whether it accepts or rejects our recommendations and what actions it will undertake.

We are currently engaged in the follow-up process for our Year Two reports. This involves examining what DFID has done in relation to our recommendations, including those which were rejected.

We do this through a combination of literature review and interviews.

In our Year One follow-up process, we found that DFID had put into place many of the actions that we had recommended and in some cases, such as in our recommendations in regard to the World Bank, had gone far beyond what we had said.

We also found that some of the recommendations that had been rejected in DFID Management Responses had actually been enacted subsequently.

We have found the follow-up process to be very useful in holding Government to account and our Year Two follow-up will be published in our next Annual Report in June.

I believe that it is increasingly important that people understand our methodology. If you wish to know more, the ICAI website contains all of our reports, and my team is in the process of uploading the Inception Reports, which set out the methodology for each individual review.

Over the past three years, we have seen DFID begin to adopt some of our methodology, particularly in the area of beneficiary involvement; both in the design of programmes and their evaluation.

In early reports, Commissioners were not as closely involved in producing reports as they are now. An early lesson for us was that we needed, as far as practically possible, to take part in country visits, even when circumstances are difficult.

By way of example, my fellow Commissioner, Mark Foster, visited the Philippines less than six weeks after the landfall of Typhoon Haiyan. He was able to witness the work of DFID and other agencies on the ground and produce not only a report with his team but also a compelling article on the ICAI website.

This review gave us the opportunity to see how our recommendations from a previous review, on Humanitarian Response in the Horn of Africa, had been put into action, and we found that DFID had indeed learned from us and was better prepared to deliver an emergency response.

That we were able to mobilise our resources and assess DFID’s work so quickly is down to our flexible model and, I believe, is a real strength.

Our assessment framework has also evolved from its original form in the light of our experience to date. The most important change is to increase the focus on women and girls, which reflects not only the importance attached to women in development in terms of policy but also the International Development (Gender Equality) Act recently passed by Parliament. Given the findings of our reports so far, we have also increased the focus on robust programme management.

Both the International Development Committee and the Triennial Review of ICAI – a Cabinet Office-required process that assesses whether Non-Departmental Public Bodies such as ours should be retained – suggested that ICAI should move to more thematic reviews.

We have carefully considered this recommendation and agree with it. This has led to our current workplan encompassing new reviews on subjects such as Private Sector Development and Scale Up in Fragile States, while retaining more specific programme level reviews. We believe that this change offers greater coverage and depth to our work.

Of course, holding the UK Government to account is only part of the story. We have also looked at how ODA can be used to hold beneficiaries’ governments to account. This was a particular focus of our report into ‘Empowerment and Accountability Programmes in Ghana and Malawi’.

There, we examined two grant-making funds for civil society organisations (CSOs) and a project that supports community monitoring of local services.

These programmes supported a wide range of activities, from helping local communities to become more engaged in the running of local schools to civil society campaigns on the management of the oil and gas sector in Ghana.

The social accountability programmes that we examined were achieving some promising results by empowering communities to engage, constructively, with government to resolve problems with the delivery of public services and of development programmes.

By contrast, support for advocacy by Civil Society Organisations at the national level has had more limited impact and seems unlikely to generate significant improvements in government accountability.

We found that the most successful initiatives involved helping communities to build on existing capacities to find solutions which benefited both the community and the government service provider. We showed that clearer and more realistic goals, with stronger criteria for delivery decisions, would help to maximise results.

We also saw good examples of Government and community accountability in Western Odisha.

The Western Orissa Livelihoods Project (WORLP) in India sought to reduce poverty by improving communities’ water resources, agriculture and incomes. It built infrastructure such as embankments, water storage ponds and irrigation channels. It also provided loans and grants to the poor for community-based businesses.  The project saw DFID work with the Government of India and local communities.

The project influenced the Government of India in its development of national guidelines. It influenced the design and operation of the £2 billion National Watershed Management Programme, implemented in 27 Indian states, and led to the delivery of projects helping the poorest to improve their incomes, with operations managed at state level.

Locally, £20.4 million of funds were transferred directly to beneficiaries or disbursed through loans. Beneficiaries were paid wages for work on the project, providing incomes.

There was also community based accountability based on full transparency for beneficiaries in villages, which was reinforced by detailed record keeping, controls and audits.

Finally, given that this is the UK Evaluation Society Annual Evaluation conference, I would like to talk briefly about our recent report on How DFID Learns.

Our report has generated a lot of interest within the M&E community and beyond and, in some cases, a bit of misunderstanding.

We found that DFID staff are highly motivated in their personal learning and are very good at it.

We also found that DFID owns and commissions an enormous amount of knowledge and can be innovative in developing new learning. We say that DFID is capable of performing excellently in terms of learning.

Our concerns were about missed opportunities and challenges in organisational learning. There is, simply put, too much information available to staff, and it is not managed adequately.

Our view is not that DFID should generate less information, but that it should develop learning that is focused, synthesised and user-friendly, so that staff can actually learn from an otherwise increasingly impenetrable body of knowledge.

I believe that DFID is capable of being a world leader in development learning, if it adopts this approach.

Finally, I would like to mention a couple of results to which our work has led.

As a result of our report on DFID’s Support to the Health Sector in Zimbabwe, the DFID team managed to negotiate significant reductions in administrative percentages paid to distribution organisations, which, in turn, has meant UK aid has reached an additional 170,000 people.

Our report on Trade Development in Southern Africa, our first Red rated report, led to the closure of a failed programme, releasing over £71 million that can now be used to alleviate poverty.

I am proud of the work that ICAI has produced and the positive impact that it has had upon the lives of intended beneficiaries.

Thank you.
