Procuring Food for Thought: Does Centrally Coordinated Buying Get Better Meals to More Students?
Last registered on June 16, 2018


Trial Information
General Information
Procuring Food for Thought: Does Centrally Coordinated Buying Get Better Meals to More Students?
Initial registration date
June 11, 2018
Last updated
June 16, 2018 11:20 AM EDT
Primary Investigator
World Bank
Other Primary Investigator(s)
Additional Trial Information
In development
Start date
End date
Secondary IDs
This Impact Evaluation studies the extent to which procurement through PAE Framework Agreements affects the quality, value-for-money, and delivery of school meals to students in Colombia. The study will use a randomized controlled trial (RCT) to evaluate the impact of replacing the existing ad-hoc agreements between individual ETCs (subnational education districts) and ‘fundaciones’ with centrally coordinated, standardized, and credibly enforced framework agreement contracts. Additionally, because the involvement of school-level actors in the administration of contracts may play a role in determining the effectiveness of service delivery, the evaluation will also assess the impact of bottom-up accountability mechanisms that empower teachers, parents, and students to enforce the terms of contracts on suppliers. A randomized cross-design combining PAE Framework Agreements with client empowerment will attempt to determine whether supply- and/or demand-side interventions targeted at improving delivery have an impact on value-for-money and service delivery.
External Link(s)
Registration Citation
Roscitt, Michael. 2018. "Procuring Food for Thought: Does Centrally Coordinated Buying Get Better Meals to More Students?" AEA RCT Registry. June 16. https://doi.org/10.1257/rct.3076-1.0.
Former Citation
Roscitt, Michael. 2018. "Procuring Food for Thought: Does Centrally Coordinated Buying Get Better Meals to More Students?" AEA RCT Registry. June 16. http://www.socialscienceregistry.org/trials/3076/history/30794.
Experimental Details
The Impact Evaluation will consist of two primary interventions:
1) A centrally coordinated Framework Agreement designed specifically for the local purchasing of PAE;
2) A citizen information and grievance redress portal intended to provide demand-side, locally driven pressure on firms and local government to improve performance.
These interventions will focus on improving: i) the transparency and standardization of the PAE procurement process; ii) the number and quality of competing firms; and iii) the ultimate value-for-money of government funds spent on school feeding, realized through improved access and equity in meal delivery for the targeted students of Colombia. As outlined in Section 8, these two interventions will be combined in a cross-design with the objective of disentangling whether any observed effects are driven by the reforms in contract award, simply by oversight, and/or whether there are synergies between the two.
Intervention Start Date
Intervention End Date
Primary Outcomes
Primary Outcomes (end points)
The primary outcome is the value-for-money of school meals, measured by the cost, timeliness, and quality of the meals delivered.
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
This evaluation will utilize a matched-pair cluster-randomization (MPCR) experimental design to estimate the causal impact of: i) procurement of PAE through a centrally coordinated Framework Agreement; and ii) enhanced citizen oversight through circulares.
Experimental Design Details
Randomization Method
Randomization done in office by a computer.
Randomization Unit
The primary unit of randomization for the Framework Agreement intervention is the “segment”. Every ETC will be divided into smaller market segments, each of which represents a “contract” that will be put up for bid: the winning firm provides a set number of meals per day to a set number of schools. The purpose of segmentation is three-fold: (i) smaller, more specialized firms may target segments where they enjoy a competitive advantage; (ii) it lowers the financial requirements that could otherwise bar these smaller firms from the market; and (iii) larger firms can still bid for multiple segments (and in bundles, if allowed) to maximize the potential return from economies of scale. It is, therefore, important to recognize that multiple segments do not guarantee multiple awardees.
Was the treatment clustered?
Experiment Characteristics
Sample size: planned number of clusters
318 segments.
Sample size: planned number of observations
24,079 meals per day
Sample size (or number of clusters) by treatment arms
159 segments treatment, 159 segments control
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Power calculations were performed using data on historic procurement outcomes, firm performance, and stakeholder grievances collected by the research team during the 25-ETC diagnostic implemented as part of the IE. Estimated effect sizes and variances were cross-checked against historic findings in the school feeding reform literature, including Lawson (2012), Bundy et al. (2009), and Gelli (2011). Figure 5 below (blue line) reports power calculations at the cluster level (ETC), assuming: (i) an average cluster size of 1,000 schools; (ii) an estimated treatment effect of 0.5 standard deviations; and (iii) an intra-cluster correlation (ICC) of 0.1 (solid line) and a more conservative ICC of 0.2 (dotted line). To achieve 80% power, the required number of clusters (segments) is 37, well within range of the sample of 318. Using a more conservative effect size of 0.2 standard deviations (the red line) in combination with an ICC of 0.2 predicts that 174 segments will be needed for 80% power, still well within the range of comfort.
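The logic of such cluster-level power calculations can be sketched with the standard closed-form approximation for a two-arm cluster RCT on a standardized outcome. This is a textbook illustration only: the registry's own figures (37 and 174 segments) come from its own data, matched-pair design, and assumptions, so this simplified formula is not expected to reproduce them exactly, and the function name below is ours.

```python
import math
from statistics import NormalDist

def clusters_per_arm(delta, icc, m, alpha=0.05, power=0.80):
    """Approximate clusters needed per arm for a two-arm cluster RCT.

    Uses the standard design-effect formula for comparing two means of a
    standardized (SD = 1) outcome:
        k = 2 * (z_{1-alpha/2} + z_{power})^2 * DEFF / (m * delta^2)
    where DEFF = 1 + (m - 1) * icc.

    delta : minimum detectable effect in standard-deviation units
    icc   : intra-cluster correlation
    m     : units (e.g. schools) per cluster
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    deff = 1 + (m - 1) * icc                        # design effect
    k = 2 * (z_alpha + z_beta) ** 2 * deff / (m * delta ** 2)
    return math.ceil(k)

# Scenarios analogous to those in the registry text (illustrative only):
print(clusters_per_arm(delta=0.5, icc=0.1, m=1000))  # optimistic scenario
print(clusters_per_arm(delta=0.2, icc=0.2, m=1000))  # conservative scenario
```

With a large cluster size like 1,000 schools, the design effect is dominated by the ICC term, which is why the conservative ICC and smaller effect size drive the required number of segments up so sharply.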
IRB Name
IRB Approval Date
IRB Approval Number
Post Trial Information
Study Withdrawal
Is the intervention completed?
Is data collection complete?
Data Publication
Data Publication
Is public data available?
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)