Monday, May 4, 2026

The Knowledge Bank

A $1 Billion Knowledge Portfolio. No Evaluative Framework. Twenty Years After Banerjee–Deaton.


The Paper

The World Bank spends approximately $1 billion per year on knowledge activities — Advisory Services and Analytics (ASA), country diagnostics, policy notes, sector reviews, technical assistance, and institutional advice — financed through Bank budget and trust funds. Between FY2017 and FY2024, the Bank launched 5,720 country-specific ASA activities through its formal tracking system. A significant additional volume of knowledge work, financed through trust funds, sits outside that system entirely.

None of this output is subject to systematic evaluation. Unlike lending — where every project undergoes a mandatory self-evaluation and IEG independently validates 100 percent of closed projects — ASA has no evaluative framework. No mandatory completion report. No independent validation. No portfolio-level outcome tracking. The Bank cannot tell its Board what its knowledge expenditure achieved.

This paper traces the evidence from the 2006 Banerjee–Deaton independent panel, through successive internal and external reviews, to the IEG’s forthcoming RAP 2025 — the first system-wide ASA evaluation in two decades. It documents how the Bank’s ASA governance process works, where it fails, and why the peer review system is captured from the inside. It proposes five reforms.

Download the Paper (PDF)


Key Findings

The accountability gap. The Bank evaluates 100 percent of its closed lending projects through mandatory ICRs and IEG validation. It evaluates zero percent of its ASA output through an equivalent system. IFC has systematically evaluated all client-facing advisory projects since 2008, with IEG validating a 51 percent sample annually. The World Bank has no equivalent for ASA. The RAP 2025 concept note states: the Bank “currently lacks an evaluative framework for ASA.”

The Banerjee–Deaton benchmark. In 2006, Abhijit Banerjee, Angus Deaton, Nora Lustig, and Kenneth Rogoff — supported by 25 senior academic reviewers — found that Bank research was disconnected from operations, used selectively to support institutional advocacy, and subject to insufficient quality control. They identified “a serious failure of the checks and balances that should separate advocacy and research.” Twenty years later, successive reviews have confirmed the same diagnosis. None has resolved it.

The peer review capture. The Task Team Leader (TTL) selects the peer reviewers for ASA products — the author chooses who assesses the work. In one documented case, a Harvard professor serving as an external reviewer identified fundamental weaknesses in a Global Practice report. In the next round, cosmetic changes were made and the report was pushed through and published. The formal process was followed. The quality control failed.

The trust fund dimension. Bank-Executed Trust Funds account for close to a quarter of the Bank’s work program budget, with three-quarters flowing to analytical and advisory activities. These carry no ICR requirement, no IEG validation, and no Inspection Panel coverage. The total trust fund portfolio exceeds $100 billion — comparable to IDA’s active lending portfolio. A companion paper — The Unaccounted Portfolio — documents the trust fund accountability gap in full.

RAP 2025. The IEG’s forthcoming Results and Performance report will include the first system-wide review of ASA use and influence since 2006, covering 5,720 activities. Budget: $1 million. The concept note is transparent about limitations: the system being evaluated may not contain the evidence needed to evaluate it.


Five Recommendations

1. Fix the peer review system. Assign reviewers through an independent function — not TTL selection. Document reviewer comments and the team’s responses. This is standard practice in every credible research institution.

2. Publish timeline and cost. For every ASA product: Decision Meeting date, publication date, elapsed time, and total cost including staff time, consultants, and trust fund contributions.

3. Require demonstration of utility. Independent external review panel every five years, sampling across regions and GPs, interviewing client government counterparts. Cost: less than $2 million — a rounding error against a $1 billion annual spend.

4. Publish download data. Total downloads, downloads by country, and page views for every published ASA product. Standard practice for academic publishers and think tanks.

5. Consolidate reporting. Require all knowledge output — regardless of financing source — to be registered in the ASA database. Report to the Board annually on total volume, cost, and documented utility. A data management task, not a policy reform.


Data

This paper draws on: the Banerjee–Deaton independent evaluation of World Bank research (2006); IEG evaluations including Knowledge-Based Country Programs (2013), Behind the Mirror (2016), RAP 2022, and RAP 2024; the IEG RAP 2025 Concept Note (August 2025); the World Bank Knowledge Strategy (2021); World Bank Trust Fund Annual Reports (2022–2024); Trust Fund Financial Information Summary (December 2024); European Court of Auditors Special Reports 11/2017, 32/2018, and 17/2024; and the IEG 2011 Trust Fund Portfolio Evaluation. Full data available at mdbreform.com/data/.


This Paper in the Series

This is the ninth paper in the mdbreform.com analytical series on World Bank delivery performance:


Parminder Brar is the founder of mdbreform.com and a former World Bank Country Manager and Lead Governance Specialist. Four field postings in FCV countries in Africa, 2003–2023. Contact: fcvstrategy@gmail.com
