
Partnership Schools for Liberia (PSL) program evaluation

Last registered on September 02, 2016

Pre-Trial

Trial Information

General Information

Title
Partnership Schools for Liberia (PSL) program evaluation
RCT ID
AEARCTR-0001501
Initial registration date
August 30, 2016

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
August 30, 2016, 4:20 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
September 02, 2016, 5:49 AM EDT

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Primary Investigator

Affiliation
ITAM

Other Primary Investigator(s)

PI Affiliation
Center for Global Development
PI Affiliation
University of California, San Diego

Additional Trial Information

Status
In development
Start date
2016-09-05
End date
2019-12-31
Secondary IDs
Abstract
We propose a large-scale field experiment to study the effect of a public-private partnership (PPP) in Liberia. The Partnership Schools for Liberia (PSL) program will delegate management of 92 randomly assigned public schools, which will remain free of charge to students, to a variety of private organizations, both for-profit and non-profit. By comparing these PSL schools to regular government schools in the control group, we will test whether private management improves school management and teacher accountability, as measured by absenteeism, time on task, and ultimately student performance. Complementary analysis will assess the sustainability, scalability, and relative cost-effectiveness of this PPP model, as well as its effects on equity through student composition and spillovers.
External Link(s)

Registration Citation

Citation
Romero, Mauricio, Justin Sandefur, and Wayne Sandholtz. 2016. "Partnership Schools for Liberia (PSL) program evaluation." AEA RCT Registry. September 02. https://doi.org/10.1257/rct.1501-3.0
Former Citation
Romero, Mauricio, Justin Sandefur, and Wayne Sandholtz. 2016. "Partnership Schools for Liberia (PSL) program evaluation." AEA RCT Registry. September 02. https://www.socialscienceregistry.org/trials/1501/history/10539
Experimental Details

Interventions

Intervention(s)
THE PARTNERSHIP SCHOOLS FOR LIBERIA (PSL) PROGRAM

PSL is a contract-management public-private partnership (PPP). Specifically, the Liberian Ministry of Education (MoE) contracted multiple non-state operators to run existing public primary schools (PSL schools). Philanthropic donors provide these operators with funding on a per-pupil basis. In exchange, operators are responsible for the daily management of the schools and can be held accountable for results.
PSL schools will continue to be free and non-selective public schools (i.e., operators are not allowed to charge fees or choose which students to enroll). PSL school buildings will remain under the ownership of the government. Teachers in PSL schools will be existing government teachers (i.e., public servants). Private providers will be accountable to the government for performance. Specifically, operators must agree to school inspections and provide the necessary data to evaluate performance. However, formal mechanisms holding operators accountable for student performance are not yet finalized.

An important feature of PSL schools, compared to traditional charter schools, is that teachers in these schools will be civil servants. This hampers the operators’ ability to hold teachers accountable for learning outcomes and raises the question of whether this type of “soft” accountability will affect teachers' behavior.
Intervention Start Date
2016-09-05
Intervention End Date
2019-06-30

Primary Outcomes

Primary Outcomes (end points)
Access to schooling (i.e., enrollment rates in communities with and without PSL schools)
Learning outcomes of students
Teacher behavior (e.g., absenteeism, time on task, use of corporal punishment, and teachers' job satisfaction and turnover rates)
School management (e.g., monitoring visits, support and training for teachers, investment in school infrastructure and materials, and extra-curricular activities)
Parental engagement in education (e.g., expenditure on education and involvement in school activities)
Equity, as measured by the socio-economic composition of students who access PSL schools
Equity, as measured by spillover effects on nearby non-PSL schools
Primary Outcomes (explanation)
We hypothesize that the success of the program will hinge on its ability to maintain or improve three key accountability relationships in the education system.

MANAGERIAL ACCOUNTABILITY (OF TEACHERS TO PRIVATE OPERATORS)
A central hypothesis underlying Liberia's charter school program is that private operators, with their greater capacity to implement routine performance management systems, regularly monitor teacher attendance, track student performance, and provide teachers with frequent feedback and support, will help to overcome teacher absenteeism and low education quality.
This is not a story about accountability through carrots and sticks. Teachers in Liberia's charter schools will be drawn from the existing pool of unionized civil servants with lifetime appointments and will be paid directly by the Liberian government. Private operators will have limited authority to request that a teacher be re-assigned, and no authority to promote or dismiss civil service teachers. The hypothesis is that accountability can be generated through monitoring and support, rather than rewards and threats.
Note that this hypothesis stands in stark contrast to the standard labor economics theories of workplace accountability that have dominated the economics of education literature in developing countries. These theories stress civil service protections and labor unions as impediments to accountability (Mbiti, 2016). In response, the experimental literature has focused on solutions such as payment for performance (Muralidharan and Sundararaman, 2011) and flexible labor contracts with a credible threat of dismissal (Banerjee et al., 2007; Duflo, Dupas & Kremer, 2011; Duflo, Dupas & Kremer, 2012; Duflo, Hanna & Ryan, 2012).
We will measure the effectiveness of Liberia's 'softer' approach to managerial accountability through the randomized controlled trial, comparing teachers in treatment (i.e., charter) and control schools.

BOTTOM-UP ACCOUNTABILITY (OF TEACHERS AND OPERATORS TO PARENTS)
In the framework of the World Bank's 2004 World Development Report on public service delivery, there is a "short route" to accountability (i.e., bypassing the "long route" through elected representatives and the Ministry of Education) if parents are able to exercise "client power" in their interactions with teachers and schools. Client power emerges from freedom to choose another provider or direct control over school resources.

Internationally, the charter school movement is closely tied to policy reforms giving parents freedom of school choice. The standard argument is that charter schools will be more responsive to parents' demands than traditional public schools, because their funding is linked directly to enrollment numbers. However, there is limited empirical evidence establishing that school choice responds to learning quality in low-income settings (Andrabi, Das & Khwaja, 2008), and this mechanism may be more relevant for schools in high-density locations like Monrovia than for remote rural areas where choice is de facto limited to one or two schools within walking distance. Furthermore, since charter operators' earnings are directly proportional to the number of enrolled children, it is in their best interest to increase enrollment and retain enrolled children in their schools.

TOP-DOWN, RESULTS-BASED ACCOUNTABILITY (OF PRIVATE OPERATORS TO THE MINISTRY OF EDUCATION)
Charter school operators' contracts can be terminated if they do not achieve certain pre-established standards. In the U.S. literature, this is generally referred to as a "results-based accountability" structure for charter schools. Operators are given a set of Key Performance Indicators (KPIs) and targets for each school. The government expects operators to meet these targets, but has not made clear what the consequences for operators are if they do not.

To investigate these questions, the evaluation will collect survey data from parents, teachers, and students to measure both intermediate inputs (e.g., school management, teacher behavior, parental engagement) and final outcomes (i.e., student learning outcomes). We will collect data on intermediate factors to provide insight into why PSL schools did or did not have an impact.

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
There are eight partners in charge of implementing the program’s pilot, all chosen by the government. In the first pilot year (2016/2017), BRAC is managing 20 schools, Bridge International Academies 22, the Liberia Youth Network 4, More than Me 6, Omega Academies 19, Rising Academies 5, Stella Maris 4, and Street Child 12.

Based on criteria established by the evaluation team, the MoE, and the operators, 185 PSL-eligible schools were identified, and 92 schools across 12 counties were randomly selected for treatment. Each treated school will be administered by one of the eight private operators. Since each operator has different criteria for the schools it is able to administer, each operator submitted the criteria its schools must satisfy. Based on these criteria, the universe of 185 experimental schools was split into 8 mutually exclusive groups corresponding to the 8 operators, with each operator's group containing twice the number of schools that operator will manage. Within each of these groups, half of the schools were randomly chosen for treatment, with the rest serving as controls; a sketch of this stratified assignment appears below.
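To make the stratified assignment concrete, the following is a minimal sketch in Python; the actual randomization was done in Stata (see the Randomization Method field below), and the school IDs, seed, and stratum sizes here are purely illustrative. Each stratum is set to exactly twice the operator's pilot allocation, which gives 184 schools; the actual universe contains 185 schools (92 treated, 93 controls), so at least one real stratum is slightly larger than double.

import random

random.seed(20160905)  # arbitrary seed, for reproducibility of the illustration only

# Hypothetical strata: operator -> list of eligible school IDs
# (stratum sizes are twice each operator's pilot allocation).
strata = {
    "BRAC": [f"BRAC_{i}" for i in range(40)],
    "Bridge International Academies": [f"BIA_{i}" for i in range(44)],
    "Liberia Youth Network": [f"LYN_{i}" for i in range(8)],
    "More than Me": [f"MTM_{i}" for i in range(12)],
    "Omega Academies": [f"OMEGA_{i}" for i in range(38)],
    "Rising Academies": [f"RISING_{i}" for i in range(10)],
    "Stella Maris": [f"SM_{i}" for i in range(8)],
    "Street Child": [f"SC_{i}" for i in range(24)],
}

assignment = {}
for operator, schools in strata.items():
    shuffled = schools[:]
    random.shuffle(shuffled)             # random order within the stratum
    n_treat = len(schools) // 2          # half of each stratum is treated
    for school in shuffled[:n_treat]:
        assignment[school] = ("treatment", operator)
    for school in shuffled[n_treat:]:
        assignment[school] = ("control", operator)

n_treated = sum(1 for status, _ in assignment.values() if status == "treatment")
print(f"{n_treated} treated schools out of {len(assignment)}")  # 92 of 184 in this illustration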
Experimental Design Details
Since the composition of students may change across PSL and control schools in response to treatment assignment, we will sample students from the 2015/2016 enrollment logs, which were created before communities became aware of the PSL intervention. Each student will be evaluated as part of her/his “original” school, regardless of which school (if any) s/he attends in subsequent years. Using this panel dataset, we can recover the effect of PSL schools under an Intention-to-Treat (ITT) framework. Under some assumptions, we can use instrumental variables to recover the Treatment-on-the-Treated (TOT) effect; a sketch of the implied estimating equations follows.
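The registration does not spell out the estimating equations, so the following is only a sketch of the ITT and IV (TOT) specifications implied by this design; the notation is ours, not the registration's. For student $i$ sampled from original school $j$, with $Z_j$ an indicator for assignment to PSL, $X_{ij}$ baseline controls, and standard errors clustered at the original-school level, the ITT regression is
\[
y_{ij} = \alpha + \beta\, Z_j + X_{ij}'\gamma + \varepsilon_{ij},
\]
where $\beta$ is the ITT effect. The TOT effect can be recovered by two-stage least squares, instrumenting actual attendance at a PSL school, $D_{ij}$, with random assignment $Z_j$:
\[
D_{ij} = \pi_0 + \pi_1\, Z_j + X_{ij}'\delta + u_{ij}, \qquad
y_{ij} = \alpha + \tau\, \hat{D}_{ij} + X_{ij}'\gamma + \varepsilon_{ij}.
\]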
Randomization Method
Randomization done in office using Stata
Randomization Unit
Treatment happens at the school level; it consists of a private operator taking responsibility for the administration of a government school.
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
185 Schools
Sample size: planned number of observations
20 students per school, for a total of 3,700 students; approximately 8 teachers per school, for a total of roughly 1,480 teachers; 10 households per school, for a total of 1,850 households.
Sample size (or number of clusters) by treatment arms
There are 92 treatment schools and 93 control schools.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Using data from the 2015 EGRA/EGMA assessments in Liberia, we estimate that the intra-cluster correlation in students' test scores ranges between 0.1 and 0.2 for different grades and skills (with most estimates between 0.15 and 0.25). For all power calculations we use a conservative ICC estimate of 0.2. Similarly, we estimate that the proportion of variance explained by observable characteristics (age, gender, district, and grade) is between 20% and 30% (without including baseline test scores). Thus, for all power calculations we conservatively assume that the R-squared of observable student characteristics is 30%. The minimum detectable effect size (MDE) with 90% power at a 5% significance level, testing 10 students per school in a total of 185 schools (92 treated), is therefore 0.22 standard deviations (Duflo, Glennerster & Kremer, 2007). Testing 20 students per school yields an MDE of 0.2. These MDEs are estimated under very conservative assumptions (high power, low significance level, and conservative ICC and R-squared from observable student/school characteristics). According to EGMA data from 2015, third-grade students answer, on average, 33.7% of addition questions correctly. Increasing test scores by 0.2 standard deviations would be equivalent to raising the average score from 33.7% to 37.4%.
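As a rough check on these figures, the sketch below applies the standard clustered-design MDE formula from Duflo, Glennerster & Kremer (2007) under the stated assumptions (ICC = 0.2, covariate R-squared = 0.3, 185 schools with 92 treated, 90% power, 5% significance level), treating the R-squared as reducing total residual variance. This is our reconstruction, not the authors' original calculation; it returns roughly 0.21 SD for 10 students per school and 0.20 SD for 20, close to the registered 0.22 and 0.2, with the small gap for 10 students plausibly due to a slightly different variance adjustment in the original computation.

from scipy.stats import norm

def mde(n_clusters, n_per_cluster, p_treated, icc, r2, alpha=0.05, power=0.90):
    """Minimum detectable effect, in standard deviations of the outcome."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    design = icc + (1 - icc) / n_per_cluster       # cluster-design variance factor
    residual = (1 - r2) * design                   # covariates assumed to absorb r2 of variance
    return z * (residual / (p_treated * (1 - p_treated) * n_clusters)) ** 0.5

# 185 schools, 92 treated, ICC = 0.2, R-squared = 0.3
print(round(mde(185, 10, 92 / 185, 0.2, 0.3), 2))  # ~0.21 (registration reports 0.22)
print(round(mde(185, 20, 92 / 185, 0.2, 0.3), 2))  # ~0.20 (registration reports 0.2)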
Supporting Documents and Materials

There is information in this trial that is unavailable to the public.
IRB

Institutional Review Boards (IRBs)

IRB Name
Human Subjects Committee for Innovations for Poverty Action IRB-USA
IRB Approval Date
2016-08-31
IRB Approval Number
14227
Analysis Plan

Analysis Plan Documents

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial that is unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials