Partnership Schools for Liberia (PSL) program evaluation
Last registered on July 29, 2017


Trial Information
General Information
Partnership Schools for Liberia (PSL) program evaluation
Initial registration date
August 30, 2016
Last updated
July 29, 2017 12:12 PM EDT

Primary Investigator
University of California - San Diego
Other Primary Investigator(s)
PI Affiliation
Center for Global Development
PI Affiliation
University of California - San Diego
Additional Trial Information
In development
Start date
End date
Secondary IDs
We propose a large-scale field experiment to study the effect of a public-private partnership (PPP) in Liberia. The Partnership Schools for Liberia (PSL) program will delegate management of 92 randomly assigned public schools to be operated free of charge to students by a variety of private organizations, both for-profit and non-profit. By comparing these PSL schools to regular government schools in the control group, we will test whether private management improves school management and teacher accountability as measured by absenteeism, time on task, and ultimately student performance. Complementary analysis will assess the sustainability, scalability, and relative cost-effectiveness of this PPP model, as well as its effects on equity through student composition and spillovers.
External Link(s)
Registration Citation
Romero, Mauricio, Justin Sandefur and Wayne Sandholtz. 2017. "Partnership Schools for Liberia (PSL) program evaluation." AEA RCT Registry. July 29.
Experimental Details

PSL is a contract-management public-private partnership (PPP). Specifically, the Liberian Ministry of Education (MoE) contracted multiple non-state operators to run existing public primary schools (PSL schools). Philanthropic donors provide these operators with funding on a per-pupil basis. In exchange, operators are responsible for the daily management of the schools, and can be held accountable for results.
PSL schools will continue to be free and non-selective public schools (i.e., operators are not allowed to charge fees or choose which students to enroll). PSL school buildings will remain under the ownership of the government. Teachers in PSL schools will be existing government teachers (i.e., public servants). Private providers will be accountable to the government for performance. Specifically, operators must agree to school inspections and provide the necessary data to evaluate performance. However, formal mechanisms holding operators accountable for student performance are not yet finalized.

An important feature of PSL schools, compared to traditional charter schools, is that teachers in these schools will be civil servants. This hampers the operators’ ability to hold teachers accountable for learning outcomes and raises the question of whether this type of “soft” accountability will affect teachers' behavior.
Intervention Start Date
Intervention End Date
Outcomes (end points)
Access to schooling (i.e., enrollment rates in communities with and without PSL schools)
Learning outcomes of students
Teacher behavior (e.g., absenteeism, time on task, use of corporal punishment, and teachers' job satisfaction and turnover rates)
School management (e.g., monitoring visits, support and training for teachers, investment in school infrastructure and materials, and extra-curricular activities)
Parental engagement in education (e.g., expenditure on education and involvement in school activities)
Equity, as measured by the socio-economic composition of students who access PSL schools
Equity, as measured by spillover effects on nearby non-PSL schools
Outcomes (explanation)
We hypothesize that the success of the program will hinge on its ability to maintain or improve three key accountability relationships in the education system.

A central hypothesis underlying Liberia's charter school program is that private operators with greater capacity to implement routine performance management systems, regularly monitor teacher attendance, track student performance, and provide teachers with frequent feedback and support will help to overcome teacher absenteeism and low education quality.
This is not a story about accountability through carrots and sticks. Teachers in Liberia's charter schools will be drawn from the existing pool of unionized civil servants with lifetime appointments, and will be paid directly by the Liberian government. Private operators will have limited authority to request that a teacher be re-assigned, and no authority to promote or dismiss civil service teachers. The hypothesis is that accountability can be generated through monitoring and support, rather than rewards and threats.
Note that this hypothesis stands in stark contrast to standard labor economics theories of accountability in the workplace which have dominated the economics of education literature in developing countries. These theories stress civil service protections and labor unions as impediments to accountability (Mbiti, 2016). In response, the experimental literature has focused on solutions such as payment for performance (Muralidharan and Sundararaman, 2011) and flexible labor contracts with credible threat of dismissal (Banerjee et al., 2007; Duflo, Dupas & Kremer, 2011; Duflo, Dupas & Kremer, 2012; Duflo, Hanna & Ryan, 2012).
We will measure the effectiveness of Liberia's 'softer' approach to managerial accountability through the randomized controlled trial, comparing teachers in treatment (i.e., charter) and control schools.

In the framework of the World Bank's 2004 World Development Report on public service delivery, there is a "short route" to accountability (i.e., bypassing the "long route" through elected representatives and the Ministry of Education) if parents are able to exercise "client power" in their interactions with teachers and schools. Client power emerges from freedom to choose another provider or direct control over school resources.

Internationally, the charter school movement is closely tied to policy reforms granting parents freedom of school choice. The standard argument is that charter schools will be more responsive to parents’ demands than traditional public schools, because their funding is linked directly to enrollment numbers. However, there is limited empirical evidence establishing that school choice responds to learning quality in low-income settings (Andrabi, Das & Khwaja, 2008), and this mechanism may be more relevant for schools in high-density locations like Monrovia than remote rural areas where choice is de facto limited to one or two schools within walking distance. Furthermore, since charter operators’ earnings are directly proportional to the number of enrolled children, it is in their best interest to increase enrollment and retain enrolled children in their schools.

Charter school operators' contracts can be terminated if they do not achieve certain pre-established standards. In the U.S. literature, this is generally referred to as a "results-based accountability" structure for charter schools. Operators are given a set of Key Performance Indicators (KPIs) and targets for each school. The government expects operators to meet these targets, but has not made it clear what the consequences for operators are if they do not meet these targets.

To investigate these questions, the evaluation will collect survey data from parents, teachers, and students to measure both intermediate inputs (e.g., school management, teacher behavior, parental engagement), and final outcomes (i.e., student learning outcomes). We collect data on intermediate factors to provide insights into why PSL schools did or did not have an impact.
Experimental Design
Experimental Design
There are eight partners in charge of implementing the program’s pilot, all chosen by the government. In the first pilot year (2016/2017), BRAC is managing 20 schools, Bridge International Academies 22, the Liberia Youth Network 4, More than Me 6, Omega Academies 19, Rising Academies 5, Stella Maris 4, and Street Child 12.

Based on criteria established by the evaluation team, MoE, and operators, 185 PSL-eligible schools were identified. 92 schools across 12 counties were randomly selected for treatment. Each treated school will be administered by one of the eight private operators. Since each operator has different requirements for the schools it is able to administer, each operator submitted a set of criteria for its schools. Based on these criteria, the universe of 185 experimental schools was split into 8 mutually exclusive groups corresponding to the 8 operators, with each operator’s group containing twice the number of schools that operator will manage. Within each of these groups, half of the schools were randomly chosen to be treated, with the rest serving as controls.
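The stratified assignment described above can be sketched as follows. This is a minimal illustration, not the study's actual Stata code: the school IDs are made up, the seed is arbitrary, and the pools here are exact doublings of each operator's slots (the registry reports 92 treatment and 93 control schools, so at least one real pool was not an exact doubling).

```python
import random

# Operator slot counts follow the registry's first pilot year (2016/2017);
# school IDs below are hypothetical placeholders.
operator_slots = {
    "BRAC": 20, "Bridge International Academies": 22,
    "Liberia Youth Network": 4, "More than Me": 6,
    "Omega Academies": 19, "Rising Academies": 5,
    "Stella Maris": 4, "Street Child": 12,
}

random.seed(2016)  # fix the seed so the draw is reproducible

school_id = 0
assignment = {}  # school id -> (arm, operator)
for operator, n_schools in operator_slots.items():
    # Each operator's pool holds twice the number of schools it will run.
    pool = list(range(school_id, school_id + 2 * n_schools))
    school_id += 2 * n_schools
    treated = set(random.sample(pool, n_schools))  # half of the pool treated
    for s in pool:
        arm = "treatment" if s in treated else "control"
        assignment[s] = (arm, operator)

n_treated = sum(arm == "treatment" for arm, _ in assignment.values())
print(n_treated, len(assignment) - n_treated)  # 92 treated, 92 controls in this sketch
```

Randomizing within operator-specific pools guarantees that treatment and control schools are comparable within each operator's eligible set, so each operator's effect is identified against its own counterfactual schools.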
Experimental Design Details
Not available
Randomization Method
Randomization done in office using Stata
Randomization Unit
Treatment happens at the school level; it consists of a private operator taking responsibility for the administration of a government school.
Was the treatment clustered?
Experiment Characteristics
Sample size: planned number of clusters
185 Schools
Sample size: planned number of observations
20 students per school, for a total of 3,700 students; ~8 teachers per school, for a total of 1,480 teachers; 10 households per school, for a total of 1,850 households
Sample size (or number of clusters) by treatment arms
There are 92 treatment schools and 93 control schools.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Using data from 2015 EGRA/EGMA assessments in Liberia, we estimate that the intra-cluster correlation in students' test scores ranges between 0.1 and 0.2 for different grades and skills (with most estimates between 0.15 and 0.25). For all power calculations we use a conservative estimate of ICC at 0.2. Similarly, we estimate the proportion of the variance that is explained by observable characteristics (age, gender, district, and grade) to be between 20-30% (without including baseline test scores). Thus, for all power calculations we conservatively assume that the R-squared of observable student characteristics is 30%. Therefore, the minimum detectable effect size (MDE) with a power of 90%, at a 5% significance level, testing 10 students per school (in a total of 185 schools – 92 treated) is 0.22 standard deviations (Duflo, Glennerster & Kremer, 2007). Testing 20 students per school, we have an MDE of 0.2. These MDEs are estimated under very conservative assumptions (high power, low significance level, and conservative ICC and R-squared from observable student/school characteristics). According to EGMA data from 2015, students in third grade are able to answer, on average, 33.7% of addition questions correctly. Increasing test scores by 0.2 standard deviations would be equivalent to increasing the average test score from 33.7% to 37.4%.
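A minimal sketch of this MDE calculation, using the standard cluster-RCT formula in Duflo, Glennerster & Kremer (2007). Two assumptions are mine, not the registry's: the R-squared adjustment is applied to the full outcome variance, and normal rather than t critical values are used, so the 10-students-per-school figure comes out at roughly 0.21 SD rather than the registry's 0.22.

```python
from statistics import NormalDist

def mde(n_clusters, n_per_cluster, icc, r2, p_treated, power=0.90, alpha=0.05):
    """Minimum detectable effect (in SD units) for a cluster-randomized trial,
    with covariates assumed to absorb a share r2 of the outcome variance."""
    z = NormalDist().inv_cdf
    multiplier = z(1 - alpha / 2) + z(power)   # ~1.96 + 1.28 for 5% size, 90% power
    design = icc + (1 - icc) / n_per_cluster   # cluster design effect on the variance
    residual = (1 - r2) * design               # variance left after covariate adjustment
    return multiplier * (residual / (p_treated * (1 - p_treated) * n_clusters)) ** 0.5

# Registry parameters: 185 schools, 92 treated, ICC = 0.2, R-squared = 0.3
p = 92 / 185
print(round(mde(185, 20, 0.2, 0.3, p), 2))  # ~0.20 SD with 20 students per school
print(round(mde(185, 10, 0.2, 0.3, p), 2))  # ~0.21 SD with 10 students per school
```

The design-effect term shows why the ICC dominates here: with ICC = 0.2, going from 10 to 20 students per school shrinks only the (1 - icc)/n component, so doubling the within-school sample buys just ~0.01-0.02 SD of detectable effect.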
IRB Name
Human Subjects Committee for Innovations for Poverty Action IRB-USA
IRB Approval Date
IRB Approval Number
Analysis Plan
Analysis Plan Documents
Partnership Schools for Liberia: Pre-Analysis Plan

MD5: 82d7a8562481320c377224755e2e0c79

SHA1: 5132be5059fd4ba924cd22182133bb2f4b4eedc1

Uploaded At: May 10, 2017


MD5: e54a5f9320c621afa8a7ef7abd97b997

SHA1: 0ff96b0b559b444e36d7da5bdce642aaee37f249

Uploaded At: May 13, 2017


MD5: 0ac5acb75c168da54e26e19b87447a68

SHA1: 06be8ee99100e194a71876eb17df5d99e8b7eceb

Uploaded At: May 13, 2017


MD5: 0f758273187efb9bf808a0ea0141326f

SHA1: 032dc5f8503d30dc1b17556801a2d84b86c0307a

Uploaded At: May 13, 2017


MD5: 1d92e2e0579283ab1fafb5d69555a0a1

SHA1: 0f6cca96adbd6721c9274b8508cea264699f3008

Uploaded At: May 13, 2017


MD5: a77da3dcf0e849cbce99fe75a1862649

SHA1: f11d997d491b4af1e2fb01687a891b12c9b6a1f5

Uploaded At: May 13, 2017


MD5: 3be47d2a88923e2f54f5a7a9f64e6526

SHA1: 00de623bda9f24e26e992d34515149e022f4d669

Uploaded At: May 13, 2017


MD5: f25eb8172b72d4a7596cf3e0de47d41d

SHA1: 20e2fa0ccaa6264bb54eb8ad6bb8c3d9447ece8b

Uploaded At: May 13, 2017